Over-indebtedness is not a fringe problem. In Finland alone, nearly 8% of adults carry a payment default entry. Across the EU, millions of households borrowed to cover basic living expenses during the pandemic. In Switzerland, the SchKG enforcement system processes hundreds of thousands of debt collection cases annually — and the numbers keep climbing, particularly among younger adults caught in cycles of instant credit and subscription spending.
The legal infrastructure built to address this — judicial debt rehabilitation, Schuldenbereinigung, consumer bankruptcy — was designed for a world of physical banks and paper applications. It leaves large segments of the debtor population without meaningful access to relief. And the gap between who needs help and who gets it is precisely the kind of problem that generative AI could narrow.
The Justice Gap Is a Design Problem
Most European debt rehabilitation systems share a structural flaw: they filter for "deserving" debtors. In Finland, the Act on the Adjustment of Debts of a Private Individual (57/1993) requires that insolvency result from circumstances "not primarily the fault of the debtor." Consumer credit debt — the category growing fastest — has traditionally been excluded. The moralistic framing is not unique to Finland. Across the DACH region, over-indebtedness carries stigma, and formal rehabilitation pathways are gated by eligibility criteria that screen out many who need them most.
The result is a justice gap. People who don't qualify for judicial proceedings are left to handle repayment enforcement on their own, sometimes for 20 to 25 years. Public debt counseling exists — Switzerland's Schuldenberatung offices, Germany's Schuldnerberatungsstellen — but these services are under-resourced and rarely personalized. The World Justice Project's global data confirms what practitioners already know: most people seeking advice for financial and legal problems report not receiving sufficient help.
The Tech-Driven Debt Trap
The irony runs deep. Technology is both a primary driver of over-indebtedness and the most underutilized tool for addressing it. Instant loans — small amounts obtainable via mobile phone within minutes — have proliferated across Europe. Research from Finland and the Nordics documents how persuasive and aggressive marketing of high-interest consumer credit has pushed young adults in particular into reckless borrowing. But it is not only recklessness at play. For low-income consumers, borrowing may be the only available option to cover essential living expenses, particularly during economic downturns. Once caught in the cycle of predatory lending — higher commissions, higher interest rates, limited alternatives — the debt becomes self-sustaining, with documented negative effects on mental health, social participation, and intergenerational mobility.
Government digital services have not kept pace. People seeking personalized financial guidance typically encounter outdated websites, generic FAQ pages, and PDF brochures that provide no actionable path forward. When public services fail to meet the need, people turn to private operators — some of whom exploit exactly the vulnerabilities that created the problem. The arena is wide open for AI tools that could do better, if designed and regulated properly.
Why Behavioral Science Matters Here
This is not just a legal architecture problem. It is a behavioral one. Research consistently shows that over-indebtedness is driven by a complex interplay of socioeconomic conditions, psychological factors, and environmental pressures — not simply "irresponsible spending."
The COM-B model, developed by Michie, Atkins, and West, provides a useful framework. It holds that for any behavior to change, three conditions must be met simultaneously:
- Capability: The person needs knowledge and skills — financial literacy, budgeting competence, understanding of credit terms.
- Opportunity: External conditions must support the behavior — access to information, time, social support, and appropriate tools.
- Motivation: The person must have sufficient intrinsic or extrinsic drive — positive attitudes, self-regulation, emotional resilience.
Education alone does not change long-term financial behavior. The research is clear on this. What works is sustained, personalized intervention that addresses all three dimensions — and that is exactly what AI coaching could provide.
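The "all three conditions simultaneously" logic of COM-B can be made concrete in a few lines of code. This is a hypothetical illustration only: the field names and the boolean simplification are assumptions for clarity, not part of the model as Michie and colleagues formulated it.

```python
from dataclasses import dataclass

# Hypothetical sketch of the COM-B model as a readiness check.
# Field names and the boolean simplification are illustrative,
# not taken from the COM-B literature.

@dataclass
class BehaviourConditions:
    capability: bool   # knowledge and skills, e.g. budgeting competence
    opportunity: bool  # external support: access to tools, time, social support
    motivation: bool   # intrinsic/extrinsic drive: attitudes, self-regulation

    def change_possible(self) -> bool:
        # COM-B holds that all three conditions must hold simultaneously;
        # any single missing component blocks the behaviour change.
        return self.capability and self.opportunity and self.motivation

# A financially literate user with good tools but no motivation:
print(BehaviourConditions(True, True, False).change_possible())  # False
```

The point of the sketch is the conjunction: an intervention that only teaches budgeting (capability) while ignoring tools and support (opportunity) or emotional resilience (motivation) predictably fails, which is exactly why education alone does not change long-term behavior.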
What AI Financial Coaches Can Already Do
The technology is not hypothetical. A range of AI-driven financial tools already exist:
- Budgeting and spending analysis: YNAB, Empower, MoneyWiz — apps that use AI to categorize expenses and surface savings opportunities.
- Debt consolidation platforms: Anyfin, Bright — services that consolidate multiple debts and negotiate lower interest rates.
- Automated negotiation: Haggle It (Cleo) — a bot that negotiates with creditors on the user's behalf.
- Credit improvement tools: Experian Boost, DebtBusters — platforms that help users build better credit profiles.
- Financial education: Zogo, MoneyMasters — adaptive learning paths based on the user's financial situation and progress.
- Robo-advisors: Betterment (US, SEC-registered investment adviser), Simplewealth (CH) — algorithmic portfolio management.
What is missing from this picture is a thorough, government-backed or nonprofit-backed AI tool designed specifically for over-indebted individuals — one that combines budgeting, legal guidance, behavioral coaching, and integration with existing debt enforcement procedures.
The authors of the underlying research built a prototype called "Finance Friend" using OpenAI's GPT Builder. Their tests showed that even a simple GPT-4-based tool could provide holistic financial guidance, translate complex credit terms into plain language, offer personalized expenditure analysis, and create a motivating interaction environment. The tool also demonstrated sensitivity to acute distress situations, providing thorough advice covering expenditure review, income strategies, and creditor negotiation.
The Regulatory Picture: EU AI Act Meets Financial Services
For legal professionals advising clients in this space — whether financial institutions, fintech startups, or public sector agencies — the regulatory intersection is complex and consequential.
EU AI Act (2024/1689): AI systems used for creditworthiness assessment are classified as high-risk under Annex III. A financial coaching tool that evaluates a user's ability to pay or recommends debt restructuring strategies may trigger conformity assessment requirements, depending on its design. The Act requires transparency, human oversight, and documentation obligations for providers and deployers of high-risk systems.
PSD2 and Open Finance: The Second Payment Services Directive already enables third-party access to bank account data with customer consent. The proposed PSD3 and the Financial Data Access Regulation (FIDA) will expand this further. These frameworks create the technical and legal infrastructure for AI coaches to integrate with real financial data — but also raise significant data protection questions.
Swiss FADP (nFADP): For Swiss-based deployments, the revised Federal Act on Data Protection applies independently of EU rules. High-risk profiling — a concept unique to Swiss law — may be triggered by AI tools that analyze financial behavior to generate recommendations. The penalty structure targets individuals, not entities, with fines up to CHF 250,000.
Key figures and provisions at a glance:
- CHF 250,000: maximum FADP fine, levied per responsible individual rather than per entity
- Art. 53 AI Act: provider obligations (documentation, transparency, copyright compliance)
- Annex III, pt. 5 AI Act: high-risk classification for AI evaluating creditworthiness or access to essential services
Trust Is the Bottleneck
The technology to build effective AI financial coaches exists today. The behavioral science to design them well is mature. The regulatory frameworks — while complex — are workable. The real constraint is trust.
Users in financial distress are vulnerable. They are precisely the population most susceptible to manipulation through dark patterns and predatory design. An AI coach that uses behavioral insights to push people toward specific financial products, rather than genuinely supporting their recovery, would cause serious harm. The chapter authors rightly cite Dorresteijn and Verbeek's reminder: "Well-being only comes about when users of technologies have the ability to formulate their own answers to the question of the good life."
Building trustworthy AI in this domain requires several non-negotiable elements:
- Transparency about limitations. The tool must clearly state what it can and cannot do, and direct users to human professionals for complex decisions.
- No commercial conflicts of interest. If the tool recommends a financial product, the user must understand who benefits.
- Privacy by design. Financial data is among the most sensitive categories. Processing must comply with applicable data protection law, and data minimization principles must be enforced architecturally.
- Human oversight. High-stakes recommendations — restructuring proposals, negotiations with creditors — should involve human review.
- Explainability. Users must be able to understand why a recommendation was made.
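The human-oversight requirement in particular lends itself to a simple architectural pattern: route recommendations by stakes before delivery. The sketch below is a minimal illustration; the topic labels and routing rules are assumptions, not a regulatory checklist.

```python
# Hypothetical sketch of a human-oversight gate for AI-generated
# recommendations. Topic labels and routing rules are illustrative
# assumptions, not derived from the AI Act's text.

# High-stakes topics named in the article: restructuring proposals
# and negotiations with creditors.
HIGH_STAKES_TOPICS = {
    "debt_restructuring",
    "creditor_negotiation",
    "bankruptcy_filing",
}

def route_recommendation(topic: str) -> str:
    """Return the delivery path for an AI-generated recommendation."""
    if topic in HIGH_STAKES_TOPICS:
        # A human reviewer must approve before the user sees it.
        return "queue_for_human_review"
    # Low-stakes output (spending summaries, educational content)
    # can be delivered directly.
    return "deliver_to_user"

print(route_recommendation("creditor_negotiation"))  # queue_for_human_review
print(route_recommendation("spending_summary"))      # deliver_to_user
```

The gate is deliberately conservative: anything not explicitly classified as low-stakes could equally be routed to review by default, which is the safer posture for a vulnerable user base.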
What This Means for Your Practice
If you advise financial institutions, fintech companies, or public sector bodies in the DACH region, three developments deserve your attention:
Banks as natural developers. Banks hold customer financial data and have a direct commercial interest in their customers' debt recovery. Under PSD2/PSD3, third parties can also access this data with consent. The question is not whether AI financial coaches will emerge in the banking sector, but how they will be regulated and whether your clients are positioned to build or deploy them compliantly.
The public sector opportunity. No government-led AI solution currently provides full, personalized debt rehabilitation support. This is a gap that public-private partnerships could fill — but only with careful attention to the regulatory constraints outlined above.
Liability questions. When an AI coach provides advice that leads to adverse financial outcomes — a user follows a recommended budget that proves unsustainable, or negotiates with creditors based on AI-generated terms that are incorrect — the question of who bears responsibility is unresolved. Product liability, professional liability, and contractual frameworks all need updating.
From Fitness Apps to Financial Recovery
One analogy from the underlying research is worth dwelling on. Fitness applications — Strava, MyFitnessPal, Apple Health — have successfully changed behavior at scale by combining personalization, goal tracking, milestone celebrations, and adaptive coaching. Financial recovery shares the same structural challenge: it requires sustained behavior change over months or years, with progress that is often invisible in the short term. A well-designed AI financial coach could borrow the same interaction patterns: daily check-ins, spending summaries that reward discipline, short-term milestones that make a 5-year repayment plan feel achievable, and adaptive advice that responds to setbacks without judgment.
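The milestone idea is simple arithmetic, but it is the arithmetic that makes a multi-year plan feel tractable. The sketch below splits a repayment total into cumulative monthly targets; the figures are illustrative, not drawn from the research.

```python
# Minimal sketch of milestone framing for a long repayment plan: break
# the total into cumulative monthly targets so progress is visible early.
# The debt amount and horizon are illustrative assumptions.

def monthly_milestones(total_debt: float, months: int) -> list[float]:
    """Cumulative repayment target at the end of each month,
    assuming equal instalments."""
    instalment = total_debt / months
    return [round(instalment * m, 2) for m in range(1, months + 1)]

targets = monthly_milestones(18_000.0, 60)  # a 5-year plan
print(targets[0])    # first milestone: 300.0
print(targets[-1])   # final milestone equals the full debt: 18000.0
```

Sixty small, checkable targets with celebration at each one is the fitness-app pattern; a single "repay 18,000 in five years" goal is the traditional enforcement framing. The underlying numbers are identical — only the presentation changes.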
The difference, of course, is the stakes. A fitness app that miscounts calories is annoying. A financial coach that miscalculates interest rates or hallucinates loan terms can cause real financial harm. The design principles that make fitness apps engaging are transferable — but the safety requirements are categorically higher.
Richard Susskind's framework for access to justice is relevant here. He argues that justice encompasses four components: dispute resolution, dispute containment, dispute avoidance, and legal health promotion. AI financial coaches operate primarily in the last two categories — preventing debt problems before they escalate and promoting financial competence as a form of legal health. This framing matters because it positions these tools not as replacements for judicial processes, but as upstream interventions that reduce the demand on courts and enforcement systems.
The intersection of AI, behavioral science, and debt rehabilitation is not a niche topic. It touches financial regulation, data protection, consumer law, and access to justice. For legal professionals operating in this space, the opportunity is to shape how these tools are built — rather than to litigate the consequences after they fail.