The Part AI Can't Prompt For

By
Tim Huey
April 8, 2026


When Netflix started competing seriously for original content, something unexpected happened to the people who make television. Not the writers: the showrunners. I'd never heard of a showrunner until recently. It's the person responsible for holding an entire season together as a coherent experience for a real audience. That role had always existed, always mattered, always been the difference between a show that works and one that doesn't. But it was largely invisible to anyone outside the industry.

Then streaming made producing content cheap and fast, and suddenly that judgment, the ability to understand an audience deeply enough to make hundreds of creative decisions in service of their experience, became the scarcest thing in the room. Netflix signed Shonda Rhimes for $150 million and Ryan Murphy for $300 million in what Fast Company described as a "billion-dollar arms race" for showrunner talent. Once volume stopped being the constraint, the judgment to make shows that work for people became the thing that actually differentiated one platform from another.

The work hadn't changed. What changed was how visible it became.

We're Watching a Similar Shift Happen in Product Design

We've been watching something similar happen in product design, and it's worth naming, because it has real implications for how product teams are built right now.

AI tools have made producing interfaces remarkably fast. A founder or product owner can go from concept to something that looks and functions like a real product in a matter of hours. Getting a proof of concept in front of stakeholders faster, pressure-testing an idea before committing resources, moving quickly through early validation: these are legitimate gains, and teams should use them.

Yet there's a specific moment we keep seeing, somewhere after the prototype exists and before the product ships, where the nature of the problem changes. The screens look right. The flows are plausible. And then real users encounter it.

What AI Handles Well, and What It Doesn't

Nielsen Norman Group recently drew a careful distinction between vibe coding (where a user describes what they want and AI builds it) and professional design work, arguing that the line between them determines what you're holding AI accountable for: execution fidelity or design judgment. Execution fidelity is something AI handles well. Design judgment is something else: it's the work of understanding that a first-time user reads your product differently than the team that built it. It's knowing where people hesitate, where they misread a label, where a flow that feels obvious to an insider creates friction for a stranger. It's the questions a prompt can't ask on your behalf.

What Remains When AI Handles the Execution

We're now seeing that when AI handles the execution layer (generating screens, producing layouts, building functional interfaces), what remains is exactly the work that product design has always been about. Intuition is part of that, yes, but intuition built on theoretical frameworks you've studied and an educated eye you've developed over time. User research, behavioral analysis, usability testing, pattern recognition earned from working in the space and watching real people interact with real products. Mapping where a user's mental model diverges from the team's assumptions. Understanding where that gap lives, not because it seems right but because you looked. Making the hundreds of small decisions that determine whether an experience holds together for someone who didn't build it.

Nielsen Norman Group's State of UX 2026 observed that users are increasingly fatigued by "AI slop." When everything gets that AI sparkle, it can easily become noise, not novelty. The products standing out right now aren't the ones that moved fastest through the generative phase. They're the ones where someone asked harder questions after the prototype existed.

The Screens Were Never the Hard Part

The showrunner's job didn't become less valuable when Netflix could produce more content; it became more visible, because volume exposed the thing volume couldn't solve. Product design is in the same moment. The screens were never the hard part. Now, finally, that's easier for everyone to see.

Sources:

Fast Company, "How Netflix Created a $1 Billion Arms Race for TV Writers," 2019; Nielsen Norman Group, "GenUI vs. Vibe Coding: Who's Designing?," March 2026; Nielsen Norman Group, "State of UX 2026," January 2026.

