26.02.2026 09:28 AM

AI in Banking — a Future Scenario for Personal Finance

Retail banking

No one wants even more products; everyone wants more convenience. That applies to money, too: personal finances should run more smoothly, with less friction, fewer clicks, and less fragmentation. This is exactly where AI comes in. Many people already use general‑purpose AI as an everyday assistant, and that naturally includes money questions. In retail banking, specialized AI personal‑finance tools are still relatively rare, but the race to deliver an AI‑powered financial day‑to‑day is well underway, especially internationally.

As Commerzbank’s innovation unit, we therefore developed a three‑stage scenario: What happens when rapid advances in AI, including agents, reach everyday retail customers? How does the management of personal finances change, and what does that mean for retail banking?

When customers use AI for their personal finances

What happens when AI isn’t used only inside banks, but becomes a tool customers themselves use for personal finance, shifting from an occasional helper to a constant in everyday life? In that world, it’s no longer menus and product pages that structure the journey, but an AI that answers questions, explains relationships, and increasingly prepares concrete options, or even takes action.

We think this development through in a forward‑looking narrative, following a fictional character: Emma. She stands in for many retail customers whose financial lives already span accounts, cards, subscriptions, and apps. Above all, they want one thing: less effort and more clarity.

We follow Emma through a future narrative of AI use in personal financial management, structured in three stages:

  • Navigator: AI helps with understanding and interpretation.
  • Co‑Pilot: AI prepares decision packages and sets up next steps.
  • Agent: AI acts independently within a defined set of guardrails.

In each stage, we explain what that means for retail banking and for customers.

Stage 1: Navigator — understanding as AI’s first job

Clarity instead of clickwork: AI as a navigator for personal finance

8:47 a.m., October 25, 2026; Frankfurt am Main.

Emma’s phone buzzes: balance low, credit‑card statement due, account statement available. Nothing dramatic, but enough to trigger the question of whether everything is still on track.

Emma types a simple question: How much did I spend on restaurants this month? Instead of exporting transactions and maintaining spreadsheets, the AI returns the number and immediately places it in context against her budget.

In this stage, AI primarily helps with understanding: it sorts transactions, detects patterns, and surfaces hints that Emma can interpret. The mental load still sits with her: remember, weigh trade‑offs, follow through.

We call this the Navigator stage because the AI supports like a GPS: it is activated by a question or instruction and needs a destination, i.e., a direction from A to B. It provides orientation and suggestions; decisions and execution remain with Emma.

Graphic titled “Navigator” showing a market overview of AI-enabled financial assistant solutions. On the left, “Established players” include DBS (Personalized Nudges), BBVA (Financial Coach), Capital One (Eno – AI Assistant), Bank of America (Erica – AI Assistant), Citi (AskWealth – Advisor AI Assistant), and RBC (Nomi – AI Assistant). On the right, “Challenger” includes bunq (Finn – Personal AI Assistant), Lightyear (Investment Intelligence), Moneybox (Aurora AI – Financial Guidance), Stash (Erica – AI Assistant), and eToro (Tori – AI Investing Companion).

When context becomes a core capability — and what that means for retail banking

Digital offers and payment methods in retail banking keep expanding, and the more accounts, merchants, and subscriptions accumulate, the harder it becomes to keep an overview. The need to track spending grows accordingly.

In the Navigator stage, financial AI can create real customer value because it doesn’t just list activity; it interprets it: Where are the outliers? Which habits run in the background? What’s normal – and what isn’t?
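The Navigator's core move can be sketched in a few lines: aggregate categorized transactions, then compare each month against the customer's own baseline to surface outliers. The records, category labels, and z-score threshold below are illustrative assumptions, not a real bank data model.

```python
from collections import defaultdict
from statistics import mean, pstdev

# Hypothetical transaction records: (month, category, amount in EUR).
TRANSACTIONS = [
    ("2026-07", "restaurants", 180.0),
    ("2026-08", "restaurants", 175.0),
    ("2026-09", "restaurants", 190.0),
    ("2026-10", "restaurants", 310.0),  # an unusually expensive month
    ("2026-10", "groceries", 260.0),
]

def monthly_totals(transactions, category):
    """Sum spending per month for one category."""
    totals = defaultdict(float)
    for month, cat, amount in transactions:
        if cat == category:
            totals[month] += amount
    return dict(totals)

def flag_outliers(totals, z_threshold=1.5):
    """Flag months whose spend deviates strongly from the category's own average."""
    values = list(totals.values())
    avg, spread = mean(values), pstdev(values)
    if spread == 0:
        return []
    return [m for m, v in totals.items() if abs(v - avg) / spread > z_threshold]

totals = monthly_totals(TRANSACTIONS, "restaurants")
print(totals["2026-10"])      # October restaurant spend, the answer to Emma's question
print(flag_outliers(totals))  # months flagged as out of the ordinary
```

The point of the sketch: the "interpretation" step is not magic, it is the customer's own history turned into a baseline, which is exactly what makes the data access discussed below so valuable.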

What this means for retail banking: AI is here and evolving rapidly, and the needs around personal finance already exist. The two will inevitably converge, because financial AI capabilities align closely with still‑unmet customer needs.

This requires data access that already exists today and is likely to improve with FiDA. For retail banking, that’s a real opportunity: banks that roll out Navigator functions early as a feature can meaningfully reduce customer effort, create orientation, and build trust in day‑to‑day money management. At the same time, banks must remain compatible when customers use third‑party Navigator solutions and want to connect bank data securely.

Stage 2: Co‑Pilot — when AI proposes solutions

A single briefing, not a barrage of notifications

Roughly two years later, in late fall 2028; Frankfurt am Main.

Instead of having to piece information together, Emma now receives a short recurring briefing on her financial situation: grocery spending has been above her usual level for three weeks, her savings goal for the next trip is progressing faster than planned, and a contract has gotten more expensive. Alternatives have been checked and a switch is prepared. Emma only has to approve or dismiss.

When Emma starts thinking about buying a home, her financial baseline is pre‑assessed: income, equity contribution, and realistic market terms quickly define a sensible price range.

The difference from the Navigator stage: signals turn into decision packages. The system explains what happened, why, and which options make sense. Emma remains the decision-maker, but her role shifts: less chasing details, more making calls.

We call this stage Co‑Pilot: the AI is continuously active, thinks along, observes, proposes, and prepares actions.

That requires a broader setup: onboarding, access to financial data, and memory that learns from decisions and preferences. Crucially, it also requires robust identity and consent logic so that prepared steps can actually be carried through to execution.

Graphic titled “Co-Pilot” presenting a market overview of AI co-pilot/concierge offerings. The “Established players” section lists Rocket Companies (Logic AI Platform – Homeownership Platform) and Capital One (Chat Concierge – Agentic Car Dealer; Velocity Black – Luxury Concierge). The “Challenger” section includes Albert (Genius – Personal Assistant), Origin (AI Financial Advisor), cleo (AI Personal Financial Assistant), and Better (Betsy – Loan Assistant).

Decision packages instead of clicking through menus

The Co‑Pilot creates decision foundations. Transactions are not executed autonomously; they happen only after approval. That makes this stage more sensitive: to generate good decision packages, the AI continuously makes pre‑decisions about relevance, options, and priorities.

To do that, it needs context about the customer (goals, preferences, patterns) and a clear framework: Which goals come first? Which rules always apply (e.g., liquidity first)? Where must the system ask explicitly? This interplay of goal hierarchy, rules, data quality, and explainability determines whether suggestions feel like support or like a black box.

For customers, the Co‑Pilot primarily means less mental load. Instead of keeping everything in mind, it reminds, prioritizes, and follows up. Everyday relief comes faster because recurring routines (contract checks, reimbursements, budget adjustments) are pre‑structured.

At the same time, the interface becomes fluid: a briefing on the phone, approval on the go, deep review on the desktop, without losing continuity. And because reliable suggestions build trust, willingness grows to delegate more tasks over time, including more sensitive ones. That’s why the Co‑Pilot is the essential intermediate step on the path to the Agent stage.

For retail banking, data and action access becomes the playing field of this stage. It echoes PSD2, extended by machine‑readable capabilities, for example via MCP servers: not just data, but standardized descriptions of executable actions. Banks therefore have to decide which role they want to play in the Co‑Pilot ecosystem: a fully integrated Co‑Pilot experience, or a highly connectable data‑and‑action provider.
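What a "standardized description of an executable action" might look like can be illustrated with a simplified descriptor, loosely modeled on MCP-style tool definitions (a name, a description, and a JSON-Schema input). The action name, fields, and consent-token mechanism below are assumptions for the sketch, not a real bank API or the actual MCP schema.

```python
import json

# Illustrative machine-readable descriptor for one executable bank action.
CANCEL_SUBSCRIPTION = {
    "name": "cancel_subscription",
    "description": "Cancel a recurring merchant subscription after customer consent.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "merchant_id": {"type": "string"},
            "effective_date": {"type": "string", "format": "date"},
            "consent_token": {"type": "string"},  # proof of explicit approval
        },
        "required": ["merchant_id", "consent_token"],
    },
}

print(json.dumps(CANCEL_SUBSCRIPTION, indent=2))
```

The design choice worth noting: consent is part of the action's required input, so a third-party Co-Pilot can discover and prepare the action, but cannot invoke it without the customer's approval artifact.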

Stage 3: Agent — autonomy within a defined framework

The family‑office principle – powered by AI

Another five years later, in spring 2033, Mainheide near Frankfurt.

Emma is 35 years old and now owns her home. She rarely opens banking apps, not out of disinterest, but because much of it runs in the background. She regularly maintains her rules: limits, categories, blocklists, approval thresholds, and exceptions. Within these guardrails, a system of specialized agents handles routines such as bills, savings contributions, subscription optimization, payments, and, when needed, portfolio adjustments; it also covers fraud prevention, detection, investigation, and loss compensation.

We call this third stage Agent because the AI doesn’t just explain or prepare; it acts: independently, within the rule set, with logs and clear approvals. Emma’s financial agent orchestrates multiple AI agents in distinct roles: cashflow/liquidity, payments, contracts, investments, or fraud.

The logic resembles a classic family office for very wealthy households, only without the human staff. In the future, you won’t have to be very wealthy to experience truly personalized financial support in action: the roles once filled by a family office’s people will be performed by AI agents in retail banking.

Emma’s agent is omnipresent: via voice at home, via messenger on the go, or on the desktop. A finance cockpit provides overview with drill‑down. And because the agent monitors continuously, it reaches out proactively when something looks off, and prioritizes so that the important things don’t get lost.

For the financial agent to truly reduce effort, the Co‑Pilot setup needs an additional agent layer: policies with limits and approval thresholds for autonomous decision and execution, signable actions, explainable logs, security/fraud handling including dispute/recovery, plus monitoring and a kill switch, so everything remains consistent.
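The agent layer described above can be sketched as a small policy object: per-category limits, an approval threshold, an append-only log for explainability, and a kill switch that blocks everything. All names and amounts are illustrative assumptions.

```python
from datetime import datetime, timezone

class AgentPolicy:
    """Minimal guardrail layer for an autonomous financial agent (illustrative)."""

    def __init__(self, limits, approval_threshold):
        self.limits = limits                    # e.g. {"subscriptions": 50.0}
        self.approval_threshold = approval_threshold
        self.kill_switch = False                # one flag stops all autonomy
        self.log = []                           # explainable audit trail

    def decide(self, category, amount, reason):
        """Return 'execute', 'ask', or 'block' and log the decision."""
        if self.kill_switch:
            verdict = "block"
        elif amount > self.limits.get(category, 0.0):
            verdict = "block"                   # outside the category limit
        elif amount > self.approval_threshold:
            verdict = "ask"                     # needs explicit approval
        else:
            verdict = "execute"                 # autonomous within guardrails
        self.log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "category": category, "amount": amount,
            "reason": reason, "verdict": verdict,
        })
        return verdict

policy = AgentPolicy(limits={"subscriptions": 50.0, "bills": 500.0},
                     approval_threshold=100.0)
print(policy.decide("subscriptions", 12.99, "monthly streaming fee"))  # execute
print(policy.decide("bills", 320.0, "electricity bill"))               # ask
policy.kill_switch = True
print(policy.decide("bills", 20.0, "parking"))                         # block
```

Note that every verdict, including blocks, lands in the log: that is the "transparency despite autonomy" the Agent stage depends on, in the form of short records of what happened and why.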

Graphic titled “Agent” showing a market overview of agentic (autonomous) solutions in finance and commerce. On the left, “Established players” include Mastercard (Shopping Muse – Conversational Commerce) and PayPal (Agentic Commerce; Cymbio – New Era of Agentic Commerce). On the right, “Challenger” includes Autonomous (Financial Agent), bitpanda.ai (AI Wealth Coach), arta ai (AI Sidekick for Investing), and Ryt Bank (AI-Powered Bank).

What shifts in retail banking when agents decide

Agents can make decisions themselves within predefined rules, limits, and approval thresholds. That’s the most powerful and most consequential stage.

It likely won’t arrive in one big leap, but step by step. Trust is built through use: first Navigator, then Co‑Pilot – and only then will people be ready to delegate decisions and execution. It starts with repetitive tasks and grows with positive experiences.

For customers, this mainly means routines run in the background. Less app‑opening, less micromanagement – more focus on outcomes. At the same time, risk management becomes proactive: anomalies are detected, prioritized, and escalated when needed before damage occurs. And because comparisons, follow‑ups, and switching processes can be triggered automatically, everyday negotiating power increases. The key requirement is transparency despite autonomy: instead of constant push notifications, there are short logs showing what happened – and why.

For retail banking, the implications are far‑reaching: when customers delegate decisions and execution to their AI agents, customer access shifts, from direct human interaction to an AI layer through which requests, decisions, and transactions increasingly flow. Banks therefore need to design for an agent channel: compatible actions, robust consent mechanisms, audit trails, and fraud and dispute processes.

That raises a strategic question: Is it advantageous for banks if the customer’s AI agent is provided by the bank itself, or if the bank primarily functions as a secure data‑and‑action provider for third‑party agents?

Conclusion: AI assistants will become the norm — the question is who provides them

Of course, the evolution of AI assistants in personal financial management isn’t governed by laws of nature. Still, we expect the benefits to be large enough that this technology will prevail. It won’t happen overnight, but likely faster than the digital transformation of the financial industry we’ve experienced so far.

If AI shapes personal financial management, one question becomes central: who provides these AI assistants?



Matrix graphic with columns “Strengths” and “Weaknesses” and five rows: Banks, Fintechs, Tech platforms, AI platform providers, and DIY solutions. Banks: strengths “customer access, trust, long data history”; weaknesses “regulatory requirements, speed of execution, distrust due to cross-selling.” Fintechs: strengths “UX focus, high speed”; weaknesses “trust/brand depth, dependence on data access, distrust due to cross-selling.” Tech platforms: strengths “reach, own ecosystems, AI capabilities”; weaknesses “trust in a financial context, regulatory friction.” AI platform providers: strengths “technology stack, speed of innovation”; weaknesses “trust/liability, regulatory embedding.” DIY solutions: strengths “immediately possible, high control”; weaknesses “effort (initial and ongoing), security, blind spots.”

Choosing the provider is potentially disruptive: if the assistant doesn’t come from the primary bank, an AI layer inserts itself between customer and institution. Relationship, context, and steerability weaken.

Once that distance grows, market dynamics change. Switching banks is still often painful for people; for agents it’s just another task. What customers avoid out of convenience today, an agent may automate tomorrow, including follow‑ups, optimizations, and price comparisons. Loyalty can be encoded as a rule, but consistently choosing the second‑best option usually comes at a cost: worse terms or weaker performance.

Our message: don’t underestimate the speed of the market, and don’t overestimate your institution’s pace to adapt. To remain relevant in retail banking, banks should roll out Navigator and Co‑Pilot functions now and, in parallel, build the agent rails: actions, consent, audit, fraud, and disputes, so the bank remains compatible in the age of agents.