Perspectives · March 27, 2026

What Is Performance Design (And Why It's Not UX)

Performance design isn't UX with better metrics. It's a different discipline. What it is, how AI changes it, and what shifts when you start doing it.

Kamil Gottwald
7 min read
Performance design flow diagram showing decision architecture across a digital product funnel

Every few months I sit across from a product lead or a founder who's just shipped a redesign. The visual is sharper. The interactions feel better. Everyone's proud. Then the numbers come in and conversion is flat. Or worse.

The diagnosis is always the same: they redesigned the surface and left the structure alone.

Performance design is the discipline that works on that structure: the decision logic underneath a digital product, and whether it's actually producing the outcome the business needs. Where UX optimises for usability and CRO optimises isolated page elements, performance design works on the end-to-end decision system.

Sonos shipped a completely rebuilt app in May 2024. New architecture, new cloud backend, new interface. What users got was an app that couldn't control speaker volume, lost alarm functionality, and broke systems people had relied on for years. Engineers inside the company had warned it wasn't ready. By January 2025: roughly $500 million wiped from market value, revenue targets missed by $200 million, CEO gone. The new app looked fine. The decision architecture underneath had been rebuilt without mapping how people actually used the product.

Snapchat had a similar moment in 2018. They reorganised how friend and publisher content were separated. 1.2 million people petitioned to revert it. $1.3 billion in market value gone in a day.

Both were structural failures: teams that optimised for how something looked without mapping how people made decisions inside the product.

Geometric steel framework structure showing interconnected hexagonal modules, the structural logic beneath what users see

What it actually is

The analogy I keep coming back to is structural engineering. An architect draws something beautiful. A structural engineer makes sure it stands up under load and performs to spec. A beautiful building that fails its structural assessment is a failed building. Nobody in architecture debates this.

Digital product design hasn't caught up. The growth team runs A/B tests. The analytics team watches the funnel. The design team ships screens. And the structural logic of the entire flow (how each step earns the right to the next) belongs to nobody.

Performance design owns that layer. Three shifts define it:

  • Decision architecture over information architecture. IA organises content. Decision architecture organises choices. The fix usually isn't making the form prettier. It's changing the sequence so value arrives before effort is demanded.

  • Friction as a structural problem, not a cosmetic one. Users don't leave because a button is the wrong colour. They leave because the underlying logic is broken: personal details demanded before a price is shown, three steps deep with no sense of progress. You can't fix that by polishing screens. You have to rewire the flow.

  • Velocity: technical and psychological. Google found that a 400ms delay reduced searches per user by 0.59%, and even after the delay was removed, those users searched less for weeks. Once momentum breaks, users recalibrate downward and don't always come back.

Performance design ≠ UX ≠ CRO

Traditional UX optimises for usability. CRO optimises isolated page elements. Both matter. But optimising one touchpoint often degrades another: a shorter form converts better but produces lower-quality leads; a more aggressive upsell increases ARPU but tanks retention.

Performance design works on the whole decision system, including the trade-offs between touchpoints that neither UX nor CRO is designed to hold.

What the work looks like

When we run a performance design engagement, the first thing that happens is not design. It's diagnosis, in three phases:

  1. Analytics. Funnel data, drop-off rates, cohort behaviour, revenue leak calculation.
  2. Behavioural research. Session recordings, heatmaps, hesitation patterns.
  3. Decision-flow analysis. Mapping the IA against the actual choices users face, and where those don't align.
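The analytics phase above can be made concrete with a small calculation. This is a minimal sketch, not the actual engagement tooling; the step names, user counts, and order value are invented for illustration:

```python
# Hypothetical funnel counts for a four-step flow (invented numbers).
funnel = [
    ("landing", 10_000),
    ("pricing", 6_200),
    ("details", 2_900),
    ("payment", 1_100),
]
avg_order_value = 80.0  # assumed revenue per completed funnel

# Step-by-step drop-off rates: the share of users lost between each pair of steps.
print(f"{'step':<10}{'users':>8}{'drop-off':>10}")
for (name, users), (_, next_users) in zip(funnel, funnel[1:]):
    drop = 1 - next_users / users
    print(f"{name:<10}{users:>8}{drop:>9.0%}")

# Revenue leak: value of users lost between the first and last step.
lost_users = funnel[0][1] - funnel[-1][1]
print(f"estimated leak: ${lost_users * avg_order_value:,.0f}")
```

Even this toy version shows the point of the diagnosis: the leak is a number with a currency attached, which is what turns a redesign debate into a prioritisation decision.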

The diagnosis is where most of the value sits, and it's the part most teams skip. They see a drop-off on step four and redesign step four. But the drop-off on step four is often caused by something on step two: a price that wasn't shown, a trust signal that was missing.

What comes out isn't a list of UI fixes. It's a categorisation across three layers:

  • Technical fixes: page speed, broken states, tracking gaps.
  • UX improvements: layout, hierarchy, interaction logic.
  • Organisational issues: team handoff failures, misaligned incentives, process gaps that produce friction the design team can't solve alone.

That third layer is the one most design processes never surface. It's also where about half the real problems live.

From there it's a loop. Prioritise by impact and effort. Fix. Measure. Repeat. If the number didn't move, the fix didn't work.
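The prioritisation step of that loop can be sketched as a simple impact-over-effort ranking. The fixes, impact estimates, and effort figures below are hypothetical, used only to show the shape of the calculation:

```python
# Hypothetical backlog of fixes surfaced by a diagnosis.
# impact: estimated monthly revenue recovered; effort: person-days to ship.
fixes = [
    {"name": "show price before form",  "impact": 40_000, "effort": 5},
    {"name": "fix mobile page speed",   "impact": 15_000, "effort": 2},
    {"name": "add progress indicator",  "impact": 9_000,  "effort": 1},
]

# Rank by impact per unit of effort, highest first.
ranked = sorted(fixes, key=lambda f: f["impact"] / f["effort"], reverse=True)
for f in ranked:
    print(f"{f['name']}: {f['impact'] / f['effort']:,.0f} per day")
```

Note that the cheapest fix, not the biggest one, can land at the top: the ranking rewards leverage, which is the whole point of fix-measure-repeat over a one-shot redesign.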

Analytics dashboard showing real-time performance metrics, the data layer that drives every performance design decision

Where AI fits

This is the part most people don't expect.

The diagnostic phase used to take two to three weeks. Most of that time was pattern recognition: scanning session recordings, cross-referencing drop-off data with heatmaps, running heuristic audits, mapping competitor flows. AI handles that now.

We run specialised agents at each stage:

  • Heuristics agent scoring every touchpoint against established conversion patterns.
  • Research agent analysing session recordings at a volume no human team could match.
  • SEO and traffic agent mapping whether landing experiences match the intent users arrived with.
  • Organisational mapping agent surfacing cross-team handoff gaps from how the product is structured.
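To make the heuristics-agent idea concrete, here is a deliberately simplified sketch of scoring a touchpoint against a weighted checklist. The checks, weights, and touchpoint fields are all invented; the article does not describe the real agents' internals:

```python
# Invented heuristic checks with weights; each check inspects a touchpoint dict.
CHECKS = [
    ("price visible before form",  3, lambda t: t["price_shown_first"]),
    ("progress indicator present", 2, lambda t: t["has_progress"]),
    ("loads under 1s",             2, lambda t: t["load_ms"] < 1000),
]

def score(touchpoint):
    """Return (points earned, points possible) for one touchpoint."""
    earned = sum(weight for _, weight, check in CHECKS if check(touchpoint))
    possible = sum(weight for _, weight, _ in CHECKS)
    return earned, possible

checkout = {"price_shown_first": False, "has_progress": True, "load_ms": 1400}
earned, total = score(checkout)
print(f"checkout scores {earned}/{total}")  # low score flags it for human review
```

The value isn't the arithmetic; it's that a machine can run a checklist like this across every touchpoint at once, leaving the consultant to interpret the low scores.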

None of these replace the consultant. All of them mean the consultant shows up to the first strategy conversation with findings that would have taken weeks to assemble.

The diagnostic used to be 80% gathering and 20% judgment. Now it's closer to the reverse. The human decides what the patterns mean, which trade-offs to make, and how to have the difficult conversations with stakeholders. That judgment layer is still entirely human.

AI also enables prediction: analysing a proposed flow structure and flagging likely friction points before anything goes live. Not perfectly. But it catches structural mistakes early enough that the fix cycle starts from a better baseline.

Abstract dark network of connected nodes, representing AI agent architecture processing performance design diagnostics

For design and product leads, this changes the team conversation. The hours your team used to spend on audits, teardowns, and annotated flow analyses are now handled faster by AI. What your team needs to be good at now is knowing which problem to solve first, understanding why the organisation keeps producing friction in the same places, and designing interventions that account for constraints the data can't see. The role shifts from thoroughness to judgment. If your team is still structured around the first, this is the moment to rethink that.

Where to start

The teams I've watched make this transition usually change three things, in this order:

  1. Look at the funnel data before opening Figma. Once you've seen where the drop-off is and calculated what it's costing, the design brief writes itself. This is the single highest-leverage habit change for a design or product team.
  2. Measure outcome, not output. Not "did we ship it" but "did the number move." Make that the first question in every review, not the last.
  3. Surface the organisational friction alongside the UX friction. The two are usually connected. You can't fix one without at least naming the other. This is the conversation most leads avoid, and it's also the one that unlocks the biggest improvements.

If your team is stuck between "the design looks good" and "the numbers aren't moving," that's where this conversation usually starts. I do talks, workshops, and team sessions on how to make this shift in practice, including the AI-accelerated parts. Always up for meetups and offsites too.

Get in touch if that's relevant to something you're working on.

Related: Why pretty websites don't pay the bills · Why we became a performance design consultancy

