NN/G's State of UX 2026 gets the diagnosis right but the prescription wrong. The designers who survive this shift won't go deeper. They'll go wider.


Nielsen Norman Group published their State of UX 2026 last month. The headline: UI is no longer a differentiator, and "surface-level design won't be enough to stay competitive." Their prescription is to design deeper. Go upstream. Develop judgment, strategy, discernment.
They're right about the diagnosis. But the direction they're pointing in doesn't match what actually moves outcomes.
UI is commoditising fast. Design systems, component libraries, and AI-assisted production have made competent interfaces cheap to produce. The old argument, "we're worth more because our design is better," runs into a question most design teams can't answer: better by what measure, and does that difference show up anywhere the business actually tracks?
Usually it doesn't. Not because the work isn't good; most of it is. The problem is that good interface work and business outcomes are only loosely connected, and the gap between them is where most of the value gets lost.
NN/G is right that the answer isn't more craft. But "design deeper" sends people in the wrong direction.

Deeper into what, though?
The natural read is deeper into UX practice itself. More rigorous research. More systems thinking. More strategic framing. More involvement at the start of product definition. Designers already doing this work will nod along. Designers who aren't will add it to the list of things they should be better at.
But this is still depth inside the same box. And staying in the design lane, however far down you go, doesn't change what you actually contribute to revenue.
Look at the decisions that actually moved the numbers on any project you've worked on. They almost never fit cleanly into one discipline. They crossed the boundary between design and the functions next to it: growth, product, data, copy, conversion. The indispensable designers aren't the ones who are best at UX practice. They're the ones who stopped waiting for a brief and started investigating the problem behind it.
We worked on a fintech comparison platform across four APAC markets. Traffic was strong. Conversion was the actual problem.
The surface reading pointed to interface issues: long product listings, unclear hierarchy, typical design work. But when we actually looked at the data, the drop-off wasn't happening because the interface was bad. It was happening upstream.
Users were arriving from ad campaigns with declared intent. A user searching for "balance transfer credit cards" would land on a page showing forty cards with identical visual weight and no filtering applied. What the ad promised and what the page showed were completely disconnected. The acquisition team, the data team, the product team, and our design team each had partial visibility into a different piece of this gap. None of us had looked at the full flow together until this project.
Connecting that gap was the work. The visual design changes that came after were just execution.
That kind of contribution doesn't come from going deeper into UX practice. It comes from being willing to follow the problem wherever it lives, even into territory that technically belongs to someone else.

So what does "going wider" actually mean in practice?
Going wider doesn't mean becoming a generalist without depth. It means refusing the artificial boundary around what design is allowed to touch.
Read the funnel data before opening Figma or Sketch. Not because the brief asked you to. Because the data shows you which problem is actually worth solving. The brief is always someone else's interpretation of the data, usually compressed through three internal approvals. Most designers wait for that interpretation. Going wider means going to the raw data first.
Own the copy at decision points. Not as a copywriter. As the person responsible for what a user actually reads at the moment they're deciding whether to keep going. The boundary between design and copy at those moments is completely arbitrary. The outcome isn't.
Participate in the growth conversation. If the acquisition funnel is losing people before they reach the product, that's a design problem. It just has a growth label attached. If it moves revenue, it should be a design conversation.
Push back on the brief when the brief is wrong. Not about how something looks. About which problem is actually being solved. I once worked with a team that wanted to redesign their entire onboarding flow. The data showed the drop-off was happening at step one, because users didn't understand what the product actually did. The onboarding flow wasn't the problem. Having that conversation before they committed two months to a redesign saved them those months, and it came from looking at the data before opening the design tool.
Going wider requires being measured on outcomes, not craft. That accountability is uncomfortable. You can spend years going deeper into UX practice, produce work that ships, gets approved, and never see how it connected to a business result. There's safety in that. Going wider removes the safety.
When you work wider, the design decision gets connected to a number, and you'll know whether it moved that number or it didn't. You're working in territory where other teams have opinions and authority. You're visibly wrong sometimes, and the data shows it, not just stakeholder feedback.
Most designers don't pursue this. Not because they can't. Because the incentive structure doesn't reward it. The brief rewards delivery. Performance reviews reward stakeholder satisfaction. Portfolios reward craft. None of those reward "I identified the actual problem and the metric moved."
There's a real constraint worth naming. Going wider is easier in some contexts than others. An embedded design team with direct data access operates completely differently from a studio working on fixed-scope briefs for clients. This isn't an excuse. It's context. I don't have a clean solution for that tension, and I'm not sure one exists. The instinct can develop anywhere, but the full application of it requires an organisation that allows it. Not all of them do.
That's probably why "design deeper" is the more comfortable recommendation. Deeper is available to everyone. Wider depends on where you sit in the organisation.
Performance design changes that equation. The measure of whether design worked isn't the design itself. It's what actually happened after the design shipped.
The NN/G report is right that junior and generalist roles are under the most pressure. What it misses is why. It isn't depth that's missing. It's that these designers haven't yet developed the instinct to work on the actual problem rather than the problem they were handed.
The designers who are going to be fine are the ones who treat the funnel like it's theirs to understand. Who read data before opening a tool. Who think of the revenue line as something they can influence rather than something that happens to them. Who stopped treating the brief as the problem they need to solve.
That's not design practice going deeper. It's design practice going wider. And that's where the real problems are sitting.
If your design team is producing work that looks competent but isn't moving the numbers, the gap is rarely craft. The design itself is usually fine; the connection between the work and the outcome is what's missing.
Let's look at where the work is actually landing.
Let's Talk
First call is always diagnostic. You describe where the numbers feel wrong; most of the time, we can identify the cause before we've seen the product.
Not a pitch. A look at the problem together.