Most contracts are still written the way they were thirty years ago. Dense. Clause-heavy. Optimised for dispute resolution rather than business execution. And despite two decades of legal tech promises, the fundamental experience of contracting has barely changed.
Generative AI is changing that — not because it automates drafting (though it does), but because it makes a much deeper shift possible: from contracts as legal artefacts to contracts as strategic tools. This is the central argument of Generative AI, Contracts, Law and Design, a new collection edited by Corrales Compagnucci, Haapio, and others, published by Springer in 2025. The book's opening chapter lays out five themes that any legal professional working with contracts should be thinking about.
Here is what matters — and what it means for practice in Switzerland and across Europe.
The implementation gap is real — and expensive
There is a well-documented problem in contracting that the book calls the "implementation gap." Contracts are drafted by legal teams, signed by executives, and then handed to operational staff who need to actually perform the obligations. The problem: those operational staff frequently cannot understand what the contract requires of them.
This is not a training issue. It is a design issue. Contracts are written in a register that serves legal precision but fails practical communication. The result is non-compliance, missed deadlines, and disputes that could have been prevented — all of which cost money.
GenAI does not just simplify language. It can generate role-specific contract guides, extract obligation summaries for different stakeholders, and create visual representations of complex workflows. The chapter highlights work by Waller, Haapio, and Shone comparing GenAI's ability to produce contract guides against those created by experienced information designers. The results are promising — not perfect, but promising.
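To make the role-specific guide idea concrete, here is a minimal Python sketch. Everything in it is hypothetical: the `Obligation` records stand in for the structured output a GenAI extraction step might produce from raw contract text, and the clause references are invented. The point is that once each obligation carries a responsible role, producing an audience-specific guide is a simple filter.

```python
from dataclasses import dataclass

@dataclass
class Obligation:
    clause: str          # clause reference, e.g. "§4.2" (illustrative)
    text: str            # plain-language summary of the duty
    owner_role: str      # the role responsible for performing it
    deadline_days: int   # days from signature until due

# Hypothetical obligations, standing in for GenAI-extracted contract data.
OBLIGATIONS = [
    Obligation("§4.2", "Deliver monthly ESG report", "operations", 30),
    Obligation("§7.1", "Notify counterparty of a data breach", "it_security", 2),
    Obligation("§9.3", "Review indemnity cap annually", "legal", 365),
]

def guide_for(role: str, obligations=OBLIGATIONS) -> list[str]:
    """Return a role-specific guide: only the duties this role must act on."""
    return [f"{o.clause}: {o.text} (due within {o.deadline_days} days)"
            for o in obligations if o.owner_role == role]
```

Calling `guide_for("operations")` would return only the ESG reporting duty, while the legal team sees the indemnity review. The substance of the contract is unchanged; only the presentation is tailored to the reader.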
From reactive law to proactive contract design
The most consequential idea in the chapter is the shift from reactive to proactive law. Traditional legal practice is fundamentally reactive: you draft a contract to manage risk, and when something goes wrong, you litigate. Proactive law inverts this. You design contracts to prevent disputes and promote desired outcomes from the start.
GenAI makes proactive contract design practical in ways it was not before:
- Predictive risk identification. AI can analyse contract portfolios to identify patterns that correlate with disputes, delays, or non-performance — before they happen.
- Scenario simulation. Rather than drafting for the worst case and hoping for the best, you can model different contract structures against likely scenarios.
- Dynamic contract adaptation. Contracts can be designed with built-in flexibility, supported by AI monitoring that flags when conditions change and suggests amendments.
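The monitoring side of dynamic contract adaptation can be sketched in a few lines, assuming a structured obligation register of the same kind. The register entries, dates, and 30-day horizon below are illustrative, not from the book; in a real deployment, such flags would feed an amendment or escalation workflow rather than a simple list.

```python
from datetime import date, timedelta

# Hypothetical obligation register: (clause, description, due date).
REGISTER = [
    ("§4.2", "Monthly ESG report", date(2026, 3, 31)),
    ("§7.1", "Insurance certificate renewal", date(2026, 2, 10)),
    ("§9.3", "Annual indemnity cap review", date(2026, 12, 1)),
]

def flag_upcoming(register, today: date, horizon_days: int = 30):
    """Flag obligations falling due within the horizon, so a review or
    amendment can be triggered before a deadline is missed."""
    horizon = today + timedelta(days=horizon_days)
    return [(clause, desc) for clause, desc, due in register
            if today <= due <= horizon]
```

Run on 1 February 2026, this flags only the insurance renewal due on 10 February; the deterministic rule is the simple stand-in here for the AI monitoring the chapter describes.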
This is not theoretical. The book documents case studies where AI-assisted contract design has been applied to sustainable finance, healthcare data governance, and commercial contracting. The common thread: contracts designed with AI assistance are more comprehensible, more actionable, and more aligned with business objectives.
Consider the sustainable finance example. EU regulation related to sustainable finance aims to help achieve the objectives of the Green Deal, but economic efficiency and sustainability are often perceived as conflicting goals. The book presents a case where a team of specialised AI agents — an "AI crew" — was deployed to reconcile these perspectives for a European textile firm. Each agent handled a different aspect of financial management, from ESG reporting to cost optimisation. The results demonstrate that GenAI is not limited to text generation; it can orchestrate complex analytical workflows that support strategic decision-making across multiple dimensions simultaneously.
The EU AI Act shapes how you can use these tools
Any discussion of GenAI in legal practice in 2026 must account for the EU AI Act (Regulation 2024/1689), which is now being phased in. The Act creates a risk-based framework that directly affects how law firms and in-house teams can deploy AI tools.
For Swiss firms, the picture is nuanced. Switzerland is not an EU member state, but Swiss firms advising EU-based clients or handling contracts governed by EU law will need to comply. The Swiss Federal Council is also monitoring the EU AI Act closely, and alignment is likely over time.
The book's chapter by Metin and Kerikmae raises a pointed concern: that the rush toward AI-driven efficiency in justice systems may compromise the rule of law if transparency, accountability, and access to justice are not designed into the tools themselves. This is not anti-technology sentiment — it is a design requirement.
Responsible AI is a design problem, not a compliance checklist
The chapter introduces the concept of "Responsible AI" — the idea that AI systems should be safe, transparent, and aligned with societal values. This sounds like a platitude until you look at the specifics.
In the context of legal AI, responsible design means:
- Bias auditing. Legal AI trained on historical case law will inherit the biases of that case law. If your contract review tool was trained predominantly on US or UK contracts, it may not flag issues specific to Swiss or continental European law.
- Transparency of reasoning. When an AI tool suggests a contract clause or flags a risk, you need to understand why. Black-box recommendations are incompatible with professional responsibility obligations.
- Data governance. Contract data is sensitive. Where it is processed, who has access, and how it is retained are not optional considerations — they are professional duty requirements.
Human-centred design — in the tool and in the output
The chapter makes a distinction that is easy to overlook: human-centred design applies not only to the AI tool's interface but also to its outputs. An AI contract tool with an intuitive dashboard is worthless if the contracts it produces are still incomprehensible to the people who need to act on them.
This is where legal design thinking meets GenAI capability:
- Visual contracts. AI can generate visual representations of contract structures, timelines, and obligation flows that make complex agreements navigable.
- Interactive interfaces. Rather than a static PDF, contracts can be delivered as interactive documents where stakeholders can explore the clauses relevant to their role.
- Audience-specific communication. The same contract can be presented differently to legal teams, operational managers, and executives — each seeing the information most relevant to their decisions.
- 70% of contract disputes stem from misunderstanding, not bad faith (industry estimates)
- 3x comprehension improvement when contracts use visual design elements (legal design research)
- 45% time reduction in first-pass contract review with AI assistance (LawGeex benchmarks)
The broader point: GenAI is not just about efficiency. It is about making contracts work better as instruments of collaboration and trust. When contracts are comprehensible to all parties — not just the lawyers — they function as tools for alignment rather than sources of friction.
What this means for Swiss legal professionals
If you work in contract law, corporate advisory, or in-house legal in Switzerland or the DACH region, five practical takeaways emerge from this chapter:
- Audit your contract portfolio for the implementation gap. Identify contracts where operational teams struggle with comprehension and compliance. These are your highest-value targets for AI-assisted redesign.
- Evaluate GenAI tools against proactive criteria. Do not just ask whether a tool can draft faster. Ask whether it can identify risks, simulate scenarios, and generate audience-specific outputs.
- Build EU AI Act awareness now. Even if your firm is Swiss-based, client exposure to EU regulation is almost certain. Understanding the risk categories and compliance obligations early is a competitive advantage.
- Insist on transparency from AI vendors. Ask about training data, bias auditing, and hallucination rates. If a vendor cannot answer these questions, they are not ready for legal use.
- Start with human-centred outputs, not just tools. The measure of success is not whether your team uses AI — it is whether the contracts your team produces are more comprehensible, more actionable, and more aligned with business objectives.
The end of "legal writing as usual"
One of the most provocative arguments in the book comes from Helena Haapio, who declares "no more legal writing in contracts." This is more than a rhetorical gesture. For decades, contracts have been treated as formal legal documents written by lawyers for other lawyers. Their structure, tone, and language have remained remarkably consistent over time — reflecting inherited drafting conventions rather than the needs of the people who actually use them.
Haapio's argument is that if the purpose of contracts is to set and reach common goals, their language should not feel like a mysterious spell. The rise of visual contracts, plain language initiatives, and GenAI integration into the contracting process challenges long-standing assumptions about what contracts are for and how they should work. When you can move beyond "legal writing as usual," contracts become tools for understanding and action — not just instruments of legal protection.
This has implications for how lawyers see their own role. The traditional position — firefighter, interpreter, guardian of rules — gives way to something more ambitious: proactive solution provider and strategic enabler. A partner in designing and building a better outcome, not just documenting the terms of an existing one.
The road ahead
The convergence of GenAI, contract law, and design thinking is not a future possibility — it is happening now. The firms that will lead are not necessarily the ones with the biggest AI budgets. They are the ones that understand that the purpose of a contract is not to be legally defensible in hindsight, but to be practically useful in the present.
The intellectual property questions alone are significant. AI-generated art and content blur the lines between original and derivative works, and the replication of distinctive artistic styles raises concerns that existing copyright frameworks were not designed to address. For contract professionals, this matters because AI-generated contract clauses, templates, and designs raise analogous questions about originality, ownership, and attribution.
Generative AI, Contracts, Law and Design provides a strong intellectual foundation for this shift. The chapter reviewed here sets up the key questions. The answers will come from practitioners willing to rethink what contracts are for — and to use the tools now available to make them better.
This article draws on Chapter 1 of Generative AI, Contracts, Law and Design (Springer, 2025), edited by M. Corrales Compagnucci, H. Haapio, M. Fenwick, and others. The book is part of the Perspectives in Law, Business and Innovation series.