
Vibe coding got you an MVP. Now what?

Luke Moody · 10 March 2026 · 7 min read

Something has shifted in how early-stage products get built. A founder with a clear problem and a weekend can now produce something that looks, feels, and mostly behaves like a real product — without writing much code themselves.

Cursor. Bolt. v0. The tools are genuinely impressive and the speed is real.

But there's a moment that happens after the demo. A real user touches it. And things start to break in ways that didn't show up when you were the one using it.

This is the moment nobody's writing about. Everyone's celebrating the build. Fewer people are talking about what comes next.

Why AI-generated MVPs break in specific ways

The tools are optimised for output speed. They produce working code — often remarkably good code — for the happy path. The problem is that real products aren't experienced along the happy path.

Real users misunderstand interfaces. They click things in the wrong order. They arrive in states the model didn't anticipate. They read error messages. They wait for things to load and form an opinion about your product in that gap.

AI tools don't have opinions about what loading feels like. They don't make judgements about whether an error message communicates trust or destroys it. They produce functional states, not considered ones.

The result is a product that works in a demo and feels unfinished the moment someone uses it independently.

The three things that consistently need attention

Having built and worked with enough AI-generated MVPs, I see the same problems surface in roughly the same order.

Component architecture

AI tools optimise for getting something on screen. The component structure they produce often reflects that priority — components that are too large, too coupled, or structured in ways that make sense for a single use case but become obstacles the moment you want to extend anything.

This isn't a criticism. It's the right trade-off for validation speed. But it does mean that before you build further on top of the scaffold, you need to understand it well enough to know what to refactor first.

The test: can you read the codebase and understand what each component is responsible for? If the answer is no, that's the first thing to address. Building on top of architecture you don't understand creates compounding technical debt faster than almost anything else.
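As a deliberately small illustration, here's the kind of logic AI tools tend to inline straight into JSX, extracted into a pure, nameable function. `formatPrice` is a hypothetical example, not from the tools themselves: the point is that it now has one readable responsibility and can be tested without rendering anything.

```typescript
// Hypothetical example: pricing logic that a generated component would
// typically inline into its JSX. Pulled out, it reads as one clear
// responsibility and can be unit tested in isolation.
function formatPrice(cents: number, currency: string = "GBP"): string {
  return new Intl.NumberFormat("en-GB", {
    style: "currency",
    currency,
  }).format(cents / 100);
}
```

Extractions like this are cheap individually, but each one makes the component that used to contain it shorter and easier to reason about, which is exactly what the readability test above is asking for.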

State management and edge cases

The model tested the flow it was asked to implement. It didn't test what happens when:

  • A user submits a form twice before the first response returns
  • An API call fails mid-session and the UI doesn't know what state it's in
  • A user navigates away and back mid-flow
  • The data doesn't match the shape the component expected

These aren't exotic edge cases. They're the first ten minutes of any real user session. Finding and handling them properly is the difference between something that feels like a product and something that feels like a prototype.

Interface emotion

This is the hardest one to describe but the most important.

The way your product feels is determined almost entirely by the UI, not the model or the logic underneath it. A slow loading state that communicates nothing makes a fast API feel slow. An error message written by the model ("An unexpected error occurred. Please try again.") destroys trust that took the rest of the interface several screens to build.

Interface emotion is about the micro-decisions: what does the loading state say? How long before it says something different? What does success feel like? What does failure communicate?

These decisions weren't made when the tool built your MVP. They need to be made now.
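One way to make the "how long before it says something different" decision explicit is to put the loading copy and its timing thresholds in data. This is a sketch, and the messages and thresholds are placeholders rather than recommendations:

```typescript
type LoadingCopy = { afterMs: number; message: string };

// Placeholder copy and thresholds: the point is that these are now
// explicit, reviewable decisions rather than whatever the tool emitted.
const LOADING_COPY: LoadingCopy[] = [
  { afterMs: 0, message: "Loading…" },
  { afterMs: 2000, message: "Still working. This can take a few seconds." },
  { afterMs: 8000, message: "Taking longer than usual. Hang tight." },
];

// Return the message for the last threshold the elapsed time has passed.
function loadingMessage(elapsedMs: number): string {
  let current = LOADING_COPY[0].message;
  for (const step of LOADING_COPY) {
    if (elapsedMs >= step.afterMs) current = step.message;
  }
  return current;
}
```

A component can then call `loadingMessage` with the elapsed time from a timer, and the copy becomes something a designer or founder can review and change without touching the flow.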

What I'd actually do

This isn't abstract — here's the specific sequence I'd follow after receiving an AI-generated MVP.

Week one: understand before touching. Read the codebase. Don't change anything yet. Map what exists, what it does, and where the obvious structural problems are. Write it down. Resist the urge to start fixing immediately — you'll make better decisions once you have the full picture.

Identify the critical path. What is the one flow that the product absolutely must get right? The thing that, if it breaks or feels wrong, a user won't come back from? That flow gets the most attention first.

Fix the interface states on the critical path. Every loading state, every error state, every empty state on that path. These have an outsized impact on perceived quality and they're often the fastest wins available.

Then address the component architecture. Once the critical path feels right, refactor the components that serve it. Don't refactor everything — just the parts you're about to build on top of.

Then extend. Only now, with a codebase you understand and a critical path that feels considered, start adding the next thing.

The reframe that helps

Stop thinking of the AI output as your product. Think of it as an incredibly detailed wireframe that happens to run in a browser.

A wireframe is enormously useful. It proves the flow makes sense. It gives stakeholders something to react to. It validates the idea faster than any other method.

But you wouldn't ship a wireframe to users and call it done. You'd use it as the foundation for the real design work.

The AI-generated MVP is the same thing at a higher fidelity. Use it for what it's brilliant at — fast validation, quick iteration, proving the concept — and then bring the engineering craft to the parts that deserve it.

The vibe coding got you to the starting line. The engineering gets you over the finish line.

A note on the tools themselves

None of this is a reason not to use them. The speed advantage is real and material. Getting from idea to testable product in a weekend changes what's possible for a solo founder or a small team.

The trap is treating the output as more finished than it is. The founders who use these tools well are the ones who are clear-eyed about what they've produced and what still needs to happen.

That clarity is what separates the MVPs that become products from the ones that stay MVPs.

Work with me

Got something worth building?

I take on a small number of React / Next.js projects each year. If it's interesting, let's talk.