If you're using a US-hosted AI tool to draft client documents, you may already be in breach — and not only of GDPR. Swiss and German law firms deploying AI face overlapping documentation requirements from two distinct frameworks. Understanding which one applies to your firm, and what each requires, is the starting point for any defensible compliance posture.
Personal liability, per individual
Under the Swiss FADP, penalties target the responsible individual, not the firm: up to CHF 250,000 per person. This is a criminal sanction, not a regulatory fine.
Two Regimes, Not One
This is the most important clarification for Swiss firms: the Swiss Federal Act on Data Protection (FADP, or nFADP) is not GDPR. Swiss firms operating in Switzerland are subject to the FADP, which has been in force since 1 September 2023. GDPR applies additionally only if the firm processes personal data of EU residents — which, for firms in Basel, Zurich, or other Swiss border cantons with German or French client bases, is often the case.
The practical differences matter:
- FADP penalties are levied on the responsible individual, not the firm — up to CHF 250,000 per person. GDPR fines go to the organisation.
- FADP "high-risk profiling" (Art. 5(g) FADP) is a Swiss-specific concept with no direct GDPR equivalent. It captures automated processing that enables core attributes of a person to be assessed — relevant for any AI tool performing scoring, evaluation, or risk categorisation of clients or employees.
- DPIA threshold: Under the FADP, a Data Protection Impact Assessment (DPIA) is required when AI processing may result in a high risk to the personality or fundamental rights of data subjects (Art. 22 FADP). The test is framed around Swiss personality rights rather than GDPR's "rights and freedoms of natural persons" formulation.
German firms are squarely subject to GDPR and, as its obligations progressively take effect, to the EU AI Act.
The SME Exemption That Probably Does Not Apply to You
GDPR Article 30(5) provides that organisations with fewer than 250 employees are generally exempt from the obligation to maintain full Records of Processing Activities (RoPA). Most law firms with 5-30 lawyers fall below this threshold. The exemption looks helpful. It almost certainly does not apply.
The exemption is void where the processing is likely to result in a risk to the rights and freedoms of data subjects, is not occasional, or includes special categories of data (Article 9 GDPR) or criminal offence data (Article 10 GDPR). Legal client data is by definition sensitive: it may include health information, family law matters, criminal records, immigration status, or financial data. Processing is not occasional — it is the core of the firm's work. In practice, every law firm using AI tools that touch client files should maintain RoPA entries for those tools, regardless of headcount.
Under the FADP, the obligation is broader still: there is no equivalent SME headcount exemption for firms processing sensitive personal data.
Records of Processing Activities — What Each AI Tool Entry Must Contain
Each AI tool your firm uses that touches personal data requires a separate entry in your RoPA. For a firm with no dedicated data protection officer (DPO), the managing partner is the de facto data protection contact and owns these records.
A complete entry for a law firm AI tool must capture:
- Processing activity and tool name — e.g., "Contract analysis and review using [Vendor] AI platform"
- Purpose — be specific. "Legal services" is not sufficient. Document whether the tool is used for contract drafting, litigation document review, client intake screening, employee scheduling, or another distinct purpose
- Legal basis — under GDPR Art. 6, most client data processing will rely on performance of a contract (Art. 6(1)(b)) or legitimate interests (Art. 6(1)(f)); under FADP, the equivalent bases apply. For special category data, an additional basis under Art. 9 GDPR or FADP Art. 31 is required
- Categories of data subjects and data — list types, not every individual field: clients, opposing parties, witnesses, employees
- Recipients — this is where your AI vendor and its sub-processors must appear
- Third-country transfers — identify where data is stored and processed; document the transfer mechanism (Standard Contractual Clauses, adequacy decision)
- Retention period — or the criteria used to determine it
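The fields above map naturally onto a simple structured record. A minimal sketch in Python follows; the field names and the example entry are illustrative choices of mine, not mandated by GDPR Art. 30 or the FADP:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RopaEntry:
    """One Record of Processing Activities entry for a single AI tool."""
    activity: str                 # processing activity and tool name
    purpose: str                  # specific purpose, not just "legal services"
    legal_basis: str              # e.g. GDPR Art. 6(1)(b), plus an Art. 9 basis if needed
    data_subjects: List[str]      # categories of people, not individuals
    data_categories: List[str]    # types of data, not every field
    recipients: List[str]         # vendor and known sub-processors
    third_country_transfers: str  # storage location plus transfer mechanism
    retention: str                # period, or the criteria used to determine it

# Illustrative example entry (all values are placeholders, not recommendations)
entry = RopaEntry(
    activity="Contract analysis and review using [Vendor] AI platform",
    purpose="Litigation document review for active client mandates",
    legal_basis="GDPR Art. 6(1)(b) performance of a contract; Art. 9(2)(f) for special categories",
    data_subjects=["clients", "opposing parties", "witnesses"],
    data_categories=["contract terms", "contact details", "financial data"],
    recipients=["[Vendor]", "[Vendor's cloud sub-processor]"],
    third_country_transfers="EU data centre; Standard Contractual Clauses for support access",
    retention="Duration of mandate plus applicable record-keeping period",
)
```

A record like this can be serialised to a spreadsheet row or a YAML file; what matters is that every field is filled in and kept current.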
Maintain this record in a simple spreadsheet or document. The point is not a perfect legal instrument — it is a demonstrable, systematic record that you can produce if your supervisory authority asks.
When a DPIA Is Required
A DPIA is mandatory under GDPR Article 35 when processing is likely to result in high risk to individuals. For AI tools used by law firms, the triggers are:
- Systematic and extensive profiling with significant effects (Art. 35(3)(a)) — any AI tool that evaluates, scores, or segments clients or employees
- Large-scale processing of special category data (Art. 35(3)(b)) — legal files routinely contain health data, financial data, and criminal record information
- AI-assisted decision-making about individuals — including AI-driven evaluation of employees (performance management tools) or clients (intake screening)
Under the FADP, the equivalent obligation (Art. 22) applies where processing may pose a high risk to the personality or fundamental rights of data subjects. Swiss supervisory authorities have indicated that AI-based processing of legal and health data generally meets this threshold.
A DPIA is a living document. If the AI tool changes materially — new data types, new model versions, expanded use cases — review and update it.
Vendor Contracts: The Clause That Matters Most
When an AI vendor processes personal data on behalf of your firm, you are the controller and the vendor is a processor. A Data Processing Agreement (DPA) is mandatory under both GDPR Art. 28 and FADP Art. 9.
For firms with no compliance officer, focus your negotiation on one clause before all others: explicit prohibition on model training using client data. Many AI vendors' standard terms are ambiguous on this point. Some reserve the right to use inputs to improve their models. If a vendor refuses to include an explicit prohibition on training their model with your client data, do not use the tool. This is also a hard obligation under Swiss professional secrecy law (Art. 321 StGB): client data cannot be disclosed for purposes outside the mandate.
Additional clauses that require scrutiny:
- Sub-processor chains: AI tools rely on cloud infrastructure providers as sub-processors. Obtain the full sub-processor list and require advance notice of changes (GDPR Art. 28(2)).
- Data residency: Identify where data is stored. For highly sensitive client data, consider requiring contractually that processing remain within the EU or Switzerland.
- Breach notification: The DPA should require the vendor to notify your firm within 24-48 hours of a discovered breach — not merely "without undue delay" — to give you time to assess and report within GDPR's 72-hour window.
- Deletion on termination: Require that all personal data, including backup copies, be deleted or returned at contract end.
A Practical Workflow for a Firm Without a DPO
Most firms under 250 employees do not legally need a DPO under GDPR (the obligation arises under Art. 37 for organisations whose core activities involve large-scale systematic monitoring or large-scale special category data processing). Under the FADP, no mandatory DPO requirement exists for private firms. But someone must own data protection. For most small firms, that person is the managing partner.
A minimal defensible workflow:
- Maintain the RoPA — one entry per AI tool, kept current as tools change.
- Review vendor DPAs — verify the no-training clause before onboarding any new tool. Flag any vendor that declines.
- Run a DPIA for any tool doing profiling or special-category processing — a structured two-page document is sufficient for a small firm. Focus on: what data, what risk, what safeguards.
- Log new tool requests — when a lawyer or staff member wants to adopt a new AI tool, run it through a two-question check: (a) does the vendor's DPA prohibit model training on client data? (b) is the data hosted in the EU or Switzerland, or is there an adequate transfer mechanism? If either answer is unsatisfactory, do not deploy.
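The two-question check reduces to a few lines of logic. A hypothetical helper (the function and parameter names are my own, not drawn from any statute or standard):

```python
def clears_intake_check(dpa_prohibits_training: bool,
                        hosted_in_eu_or_ch: bool,
                        adequate_transfer_mechanism: bool) -> bool:
    """Two-question gate before deploying a new AI tool.

    (a) The vendor's DPA must explicitly prohibit model training
        on client data.
    (b) Data must be hosted in the EU or Switzerland, or be covered
        by an adequate transfer mechanism (e.g. Standard Contractual
        Clauses or an adequacy decision).
    """
    question_a = dpa_prohibits_training
    question_b = hosted_in_eu_or_ch or adequate_transfer_mechanism
    return question_a and question_b

# A vendor that trains on client data fails regardless of hosting:
assert clears_intake_check(False, True, True) is False
# EU/CH hosting with a no-training clause passes:
assert clears_intake_check(True, True, False) is True
```

Logging each request and its outcome, even as a one-line spreadsheet entry, is itself accountability evidence.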
The accountability principle does not require perfection. It requires demonstrable, systematic effort. Supervisory authorities investigating an incident will ask for documentation first. A firm that can produce a current RoPA, its vendor DPAs, and a DPIA for its highest-risk tool is in a fundamentally different position from one that cannot.
Privacy-Enhancing Technologies for Legal AI
Six privacy-enhancing technologies are directly relevant to law firm AI deployments:
- Differential privacy — adds calibrated noise to data and can support anonymisation strategies where re-identification risk is genuinely remote. It is not an automatic GDPR or FADP exemption on its own; firms still need to assess whether the resulting dataset is truly anonymous in context.
- Synthetic data — replicates the statistical properties of real client data without containing actual client records. Well-generated synthetic data carries low disclosure risk even if leaked, but generators must be validated against memorisation of the originals.
- Federated learning — trains models across multiple parties without aggregating data into a single point of failure. Relevant for multi-office firms.
- Encryption — AES-256 for data at rest, TLS 1.3 for data in transit. Non-negotiable for any legal AI deployment.
- Pseudonymisation — changes identifiers but remains GDPR-regulated (unlike full anonymisation).
- Anonymisation — irreversibly removes identifiers. Falls outside GDPR scope entirely if done correctly.
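To make the first item concrete, here is a minimal sketch of the Laplace mechanism that underlies differential privacy, applied to a simple count query. The implementation is illustrative only; a production deployment should use a vetted library rather than hand-rolled sampling:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records: list, epsilon: float) -> float:
    # A count query has sensitivity 1: adding or removing one record
    # changes the true count by at most 1. Laplace noise with scale
    # 1/epsilon therefore gives epsilon-differential privacy for
    # this single query.
    return len(records) + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means stronger privacy and noisier answers; the privacy budget across repeated queries must be tracked, which is one reason the list above stresses that these techniques are not an automatic exemption.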
The practical implication: design anonymisation and privacy-preserving analytics up front, not after a deletion request arrives. Differential privacy and related PETs can materially reduce risk, but firms should validate irreversibility before treating any output as outside GDPR or FADP scope.
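The pseudonymisation/anonymisation distinction can be illustrated with keyed hashing. Because the key holder can always re-link tokens to identities, the output remains personal data; this is a hypothetical sketch, and the key-management details are assumptions:

```python
import hashlib
import hmac

# Hypothetical key: in practice, store it separately from the data
# and rotate it under a documented key-management policy.
SECRET_KEY = b"store-me-separately-and-rotate"

def pseudonymise(client_id: str) -> str:
    # Keyed hash (HMAC-SHA256): the same input always yields the same
    # token, so records stay linkable across systems, and anyone who
    # holds SECRET_KEY can re-link tokens to identities. The output
    # therefore remains personal data under GDPR Art. 4(5); this is
    # pseudonymisation, not anonymisation.
    return hmac.new(SECRET_KEY, client_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```

Destroying the key converts the tokens into something closer to anonymised data, but only if no other route to re-identification (such as rare attribute combinations in the records themselves) survives.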
AI Incident Response: When Things Go Wrong
If a legal AI tool leaks client privileged data, the firm faces a dual obligation: data breach notification (GDPR: 72 hours; FADP: "as soon as possible") and potential Art. 321 StGB professional secrecy violation — which carries criminal consequences on top of data protection penalties.
Your incident response plan must address both simultaneously. ISO 42001 Control A.8.4 requires a documented communication plan covering data breaches involving personal information, with separate reporting requirements to supervisory authorities and affected individuals. The plan must integrate with your broader firm incident management while accounting for AI-specific risks like model data leakage and embedding exposure.
Need help building your firm's GDPR documentation for AI tools? Get in touch for a structured review.