Performance design isn't UX with better metrics. It's a different discipline. What it is, how AI changes it, and what shifts when you start doing it.


Every few months I sit across from a product lead or a founder who's just shipped a redesign. The visual is sharper. The interactions feel better. Everyone's proud. Then the numbers come in and conversion is flat. Or worse.
The diagnosis is always the same: they redesigned the surface and left the structure alone.
Performance design is the discipline that works on that structure: the decision logic underneath a digital product, and whether it's actually producing the outcome the business needs. Where UX optimises for usability and CRO optimises isolated page elements, performance design works on the end-to-end decision system.
Sonos shipped a completely rebuilt app in May 2024. New architecture, new cloud backend, new interface. What users got was an app that couldn't control speaker volume, lost alarm functionality, and broke systems people had relied on for years. Engineers inside the company had warned it wasn't ready. By January 2025: roughly $500 million wiped from market value, revenue targets missed by $200 million, CEO gone. The new app looked fine. The decision architecture underneath had been rebuilt without mapping how people actually used the product.
Snapchat had a similar moment in 2018. They reorganised how friend and publisher content were separated. 1.2 million people petitioned to revert it. $1.3 billion in market value gone in a day.
Both were structural failures: teams that optimised for how something looked without mapping how people made decisions inside the product.

The analogy I keep coming back to is structural engineering. An architect draws something beautiful. A structural engineer makes sure it stands up under load and performs to spec. A beautiful building that fails its structural assessment is a failed building. Nobody in architecture debates this.
Digital product design hasn't caught up. The growth team runs A/B tests. The analytics team watches the funnel. The design team ships screens. And the structural logic of the entire flow (how each step earns the right to the next) belongs to nobody.
Performance design owns that layer. Three shifts define it:
Decision architecture over information architecture. IA organises content. Decision architecture organises choices. The fix usually isn't making the form prettier. It's changing the sequence so value arrives before effort is demanded.
Friction as a structural problem, not a cosmetic one. Users don't leave because a button is the wrong colour. They leave because the underlying logic is broken: personal details demanded before a price is shown, three steps deep with no sense of progress. You can't fix that by polishing screens. You have to rewire the flow.
Velocity: technical and psychological. Google found that a 400ms delay reduced searches per user by 0.59%, and even after the delay was removed, those users searched less for weeks. Once momentum breaks, users recalibrate downward and don't always come back.
Traditional UX optimises for usability. CRO optimises isolated page elements. Both matter. But optimising one touchpoint often degrades another: a shorter form converts better but produces lower-quality leads, a more aggressive upsell increases ARPU but tanks retention.
Performance design works on the whole decision system, including the trade-offs between touchpoints that neither UX nor CRO is designed to hold.
When we run a performance design engagement, the first thing that happens is not design. It's diagnosis, in three phases.
The diagnosis is where most of the value sits, and it's the part most teams skip. They see a drop-off on step four and redesign step four. But the drop-off on step four is often caused by something on step two: a price that wasn't shown, a trust signal that was missing.
What comes out isn't a list of UI fixes. It's a categorisation across three layers.
That third layer is the one most design processes never surface. It's also where about half the real problems live.
From there it's a loop. Prioritise by impact and effort. Fix. Measure. Repeat. If the number didn't move, the fix didn't work.

This is the part most people don't expect.
The diagnostic phase used to take two to three weeks. Most of that time was pattern recognition: scanning session recordings, cross-referencing drop-off data with heatmaps, running heuristic audits, mapping competitor flows. AI handles that now.
We run specialised agents at each stage of the diagnostic.
None of these replace the consultant. All of them mean the consultant shows up to the first strategy conversation with findings that would have taken weeks to assemble.
The diagnostic used to be 80% gathering and 20% judgment. Now it's closer to the reverse. The human decides what the patterns mean, which trade-offs to make, and how to have the difficult conversations with stakeholders. That judgment layer is still entirely human.
AI also enables prediction: analysing a proposed flow structure and flagging likely friction points before anything goes live. Not perfectly. But it catches structural mistakes early enough that the fix cycle starts from a better baseline.

For design and product leads, this changes the team conversation. The hours your team used to spend on audits, teardowns, and annotated flow analyses are now handled faster by AI. What your team needs to be good at now is knowing which problem to solve first, understanding why the organisation keeps producing friction in the same places, and designing interventions that account for constraints the data can't see. The role shifts from thoroughness to judgment. If your team is still structured around the first, this is the moment to rethink that.
The teams I've watched make this transition usually change the same three things, in the same order.
If your team is stuck between "the design looks good" and "the numbers aren't moving," that's where this conversation usually starts. I do talks, workshops, and team sessions on how to make this shift in practice, including the AI-accelerated parts. Always up for meetups and offsites too.
Get in touch if that's relevant to something you're working on.
Related: Why pretty websites don't pay the bills · Why we became a performance design consultancy

Let's Talk
First call is always diagnostic. You describe where the numbers feel wrong; most of the time, we can identify the cause before we've seen the product.
Not a pitch. A look at the problem together.