Eight weeks from kickoff to a real MVP that real customers are using. This is the plan we run when a funded startup comes to us needing to test a hypothesis fast. It's not theoretical. It's the actual week-by-week shape of an engagement that has worked for us repeatedly.
The eight-week plan
From kickoff to customer beta
Eight weeks. One on discovery, one on foundations, two on the core flow, and one each on polish, adjacent flows, beta, and launch.
Week 1: Discovery and scope lock
The most important week.
Outputs:
- One-page product brief. What this MVP is testing. Who the user is. The single most important thing it must do.
- One-page architecture document. Stack, hosting, data model, third-party services.
- A "we will not build this" list. Everything explicitly out of scope.
- Risk register. The 3 to 5 things most likely to slow us down.
- Working repo with auth and a "hello world" page deployed to a staging URL.
What we cut: Anything that isn't on the path from "user signs up" to "user completes the core action." A typical week-1 cut list:
- Multi-tenancy. (One workspace per user for now.)
- Admin panel. (Inspect the database directly.)
- Notifications and email beyond auth flows.
- Mobile app. (Web first, mobile in v2.)
- Advanced analytics. (PostHog or Plausible event firing only.)
- Custom roles. (Two roles maximum: owner and member.)
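"Event firing only" can be as small as one thin wrapper around the provider's capture call. A minimal sketch, assuming a PostHog-style `capture(event, properties)` function; the event names and helpers here are illustrative, not part of the plan:

```typescript
// Minimal "event firing only" analytics: no dashboards, no funnels, just
// named events. The capture function is injected so the provider (PostHog,
// Plausible, etc.) can be swapped without touching call sites.
type CaptureFn = (event: string, props?: Record<string, unknown>) => void;

// Stand-in sink for illustration; in production this would be something
// like posthog.capture bound to the client.
const events: Array<{ event: string; props?: Record<string, unknown> }> = [];
const capture: CaptureFn = (event, props) => {
  events.push({ event, props });
};

// One helper per product event keeps names consistent across the codebase.
function trackSignup(plan: string): void {
  capture("user_signed_up", { plan });
}

function trackCoreAction(durationMs: number): void {
  capture("core_action_completed", { durationMs });
}
```

A handful of helpers like these is all the analytics an eight-week MVP needs; the question they must answer is "did the user reach the core action," nothing more.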
The cut list is more important than the build list. The team that says yes to too much fails at week 6.
Week 2: Auth, account, core data model
Goal: Users can sign up, log in, and see an empty version of the core feature.
Done means:
- Auth flow works (sign up, sign in, password reset, email verification).
- Account page exists. User can update their name.
- The core data model is in Postgres. Tables, foreign keys, indexes, RLS.
- The "empty state" of the main view renders.
- CI runs, deploys, observability hooks in.
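To make the week-2 foundation concrete, here is a hypothetical shape of the core data model for a generic project-management MVP, expressed as application types. Every table and column name is illustrative; the real schema lives in Postgres with foreign keys, indexes, and row-level security:

```typescript
// Illustrative core data model. Two roles maximum and one workspace per
// user, per the week-1 cut list.
type Role = "owner" | "member";

interface User { id: string; email: string; name: string }
interface Workspace { id: string; ownerId: string }
interface Membership { userId: string; workspaceId: string; role: Role }
interface Task { id: string; workspaceId: string; title: string; done: boolean }

// Application-side mirror of the RLS rule: a user only reads rows in
// workspaces they belong to. Postgres enforces this at the row level too.
function canRead(memberships: Membership[], userId: string, task: Task): boolean {
  return memberships.some(
    (m) => m.userId === userId && m.workspaceId === task.workspaceId
  );
}
```

Keeping the model this small is the point: four tables and one access rule are enough to carry the core flow through week 4.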
Architecture decisions made here don't get revisited. Pick once, commit, move.
Weeks 3 to 4: The core flow
Goal: A user can perform the core action end-to-end and see the result.
If the MVP is a project management tool, weeks 3 to 4 are "create a project, add tasks, view them, complete them." If it's an AI feature, weeks 3 to 4 are "the AI feature works on real input and produces useful output." Whatever the product is, this is the stretch where the team nails it.
Done means:
- The happy path works for a logged-in user.
- The data persists correctly across sessions.
- Edge cases for the happy path are handled (empty, max, invalid input).
- A senior engineer can demo the feature confidently.
This is where the team spends most of the engineering hours. Everything before it sets the stage; everything after it polishes.
Week 5: Polish and edge cases
Goal: The core flow holds up under non-happy-path usage.
Done means:
- Error states handled gracefully.
- Loading states real, not placeholders.
- Mobile responsiveness on the core flow.
- Accessibility check on the core flow (keyboard nav, screen reader basics).
- Performance acceptable on a slow connection.
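One way to make error and loading states unskippable is to model them as first-class values. A common TypeScript pattern (a discriminated union for request state) sketched here as an assumption about how the week-5 work might be structured:

```typescript
// One request-state type used across the UI: loading and error are values,
// not afterthoughts, so every view has to handle them.
type RequestState<T> =
  | { status: "idle" }
  | { status: "loading" }
  | { status: "success"; data: T }
  | { status: "error"; message: string };

// Exhaustive rendering: the compiler flags any state a view forgets.
function renderLabel<T>(state: RequestState<T>): string {
  switch (state.status) {
    case "idle":
      return "";
    case "loading":
      return "Loading…";
    case "success":
      return "Ready";
    case "error":
      return `Something went wrong: ${state.message}`;
  }
}
```

With this shape, "loading states real, not placeholders" stops being a review checklist item and becomes a type error when it's missed.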
This week is unglamorous and absolutely necessary. Skipping it means customers find these issues themselves.
Week 6: Adjacent flows and onboarding
Goal: A first-time user knows what to do.
Done means:
- First-run experience guides the user to the core action.
- One or two adjacent flows around the core feature work (settings, sharing, basic search).
- Customer support knows how to handle the most likely questions.
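A first-run experience that "guides the user to the core action" is often just a linear checklist: given what the user has already done, point them at the next step. A sketch with hypothetical step names; the actual steps depend on the product:

```typescript
// First-run onboarding as a linear checklist. Step names are illustrative.
const ONBOARDING_STEPS = [
  "verify_email",
  "create_project",
  "add_first_task",
  "complete_first_task",
] as const;
type Step = (typeof ONBOARDING_STEPS)[number];

// Return the first unfinished step, or null once onboarding is done.
function nextStep(completed: Set<Step>): Step | null {
  for (const step of ONBOARDING_STEPS) {
    if (!completed.has(step)) return step;
  }
  return null; // user has reached the core action
}
```

The last step in the list should always be the core action itself; everything before it exists only to get the user there.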
We deliberately don't build "the full product" here. Just enough scaffold around the core that a new user can succeed.
Week 7: Internal beta and bug bash
Goal: The team uses the product daily for the use case it claims to solve, followed by a structured bug bash run by the QA engineer.
Done means:
- Bug bash output is reviewed and triaged.
- Anything blocker-tier is fixed.
- Anything non-blocker is logged but explicitly punted.
- Staging is solid.
This week separates teams that ship products that work from teams that ship products that compile.
Week 8: Customer beta and production launch
Goal: Real customers using the product in production.
Done means:
- Production deployment from a real domain with SSL.
- Auth in production with real email delivery.
- Customer onboarding emails sent.
- Monitoring and alerts wired.
- First two to ten customers using the product daily.
- A clear backlog of v2 work based on customer feedback in the first week.
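"Monitoring and alerts wired" starts with a health check the alerting can poll. A minimal sketch; the dependency names and probe shape are illustrative assumptions:

```typescript
// Minimal health check: probe each dependency and report overall status.
// An uptime monitor polls this and alerts when healthy flips to false.
type Probe = () => boolean;

function healthCheck(probes: Record<string, Probe>): {
  healthy: boolean;
  checks: Record<string, boolean>;
} {
  const checks: Record<string, boolean> = {};
  for (const [name, probe] of Object.entries(probes)) {
    try {
      checks[name] = probe();
    } catch {
      checks[name] = false; // a throwing probe counts as unhealthy
    }
  }
  return { healthy: Object.values(checks).every(Boolean), checks };
}
```

For an MVP at this stage, one endpoint like this plus error tracking is usually enough; per-service dashboards can wait for the post-launch backlog.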
The launch isn't the end of the engagement. The first week post-launch is when the most important learnings come in. Plan to be available for support fixes through week 9 at minimum.
What this plan optimises for
- Speed to first signal. The hypothesis the MVP is testing gets tested in week 8, not month 6.
- Operational simplicity. One stack, one hosting target, one database, one auth provider.
- Senior judgement at every decision. No ramp-up time for the team.
- Customer learning over feature completeness. Eight weeks ships a working slice; the next quarter expands it based on what customers actually do.
What this plan deliberately doesn't optimise for
- A polished product. The first version looks rough.
- Coverage of every customer segment. Pick one segment, ship for them.
- Scalability beyond ~100 customers. Address that after you have customers.
- Enterprise features. Save for after product-market fit.
The eight-week milestone calendar
- Week 1: Scope locked. Brief, architecture doc, cut list, and risk register signed off; repo deployed to staging.
- Week 2: Auth, account page, and core data model live; CI and observability in place.
- Weeks 3 to 4: Core flow works end-to-end on the happy path.
- Week 5: Core flow polished: errors, loading, mobile, accessibility, performance.
- Week 6: Onboarding plus one or two adjacent flows.
- Week 7: Internal beta and bug bash; blockers fixed.
- Week 8: Production launch; first customers in daily use.
Common mistakes
- Spending week 1 on architecture nirvana. Week 1 is for decisions, not perfect decisions.
- Letting the "we will not build this" list grow back. Discipline matters. New items belong in v2.
- Skipping week 5 polish to get to week 6 features. Polish is the difference between a product and a prototype.
- Hiring juniors to make the team bigger. On an eight-week timeline, ramp-up time costs more than extra hands add.
How Hashorn delivers MVPs
Hashorn's MVP development engagement is exactly this eight-week plan. Senior engineers, one QA, one delivery lead, the brief signed off in week 1, the customer beta running by week 8. We pair it with AI software development workflows so the team ships at the pace AI-augmented engineering allows. For teams that want to keep going past week 8, we transition into a dedicated team engagement seamlessly.
Conclusion
A real MVP in eight weeks is achievable for the right team with the right scope. The discipline is in week 1 (lock scope hard), in weeks 3 to 4 (nail the core), and in week 5 (polish before features). The teams that fail try to build too much. The teams that succeed cut more than they thought they could.
Need help building AI-powered software, QA automation, or secure cloud systems?
Talk to Hashorn's engineering team. Dedicated senior engineers, QA, and security with same-week ramp.