Marcus Williams spent four years as a product manager at a B2B software company. He understood product development, knew how to run a roadmap, and had watched enough SaaS launches to have strong opinions about what makes them succeed and fail.
What he didn’t have was experience running one himself, alone, with no team, no dedicated budget, and a full-time job he hadn’t yet left.
In January, Marcus gave himself 90 days to launch his first independent SaaS product — a lightweight project status reporting tool for small agencies. He documented his process in detail, including how he used AI milestone generation to plan and navigate the launch. Here’s what that looked like.
The Starting Point
Marcus’s situation when he began:
- Product concept validated through 11 conversations with potential customers
- Technical stack decided (Next.js, Supabase, Stripe)
- 15–20 hours per week available while still employed
- Zero launch experience as a solo founder
- Target launch: 90 days with at least 5 paying customers on day 1
His initial instinct was to plan from the front: “Build the product, then market it.” He’d seen this approach fail repeatedly in his PM career — the “build it and they’ll come” assumption that sidesteps the distribution problem by deferring it.
Instead, he decided to use AI milestone generation to work backwards from the launch, with one specific instruction to himself: don’t let AI’s output be generic.
The Initial AI Milestone Generation Session
Marcus spent 25 minutes on his initial milestone generation session. Before prompting AI, he wrote out his goal description:
“Launch a project status reporting tool for small agencies (5–20 person teams) by April 15. Paying customers: at least 5 on launch day. I’m building solo, available 15–20 hours/week. Tech stack is decided. I have 11 validated customer conversations. I don’t have a marketing audience — no newsletter, no significant social following. Budget: $500 for the 90 days including tools and any paid distribution.”
He then asked AI to reverse-engineer milestones from day 90 backward, with explicit instructions to flag dependencies and identify milestones most likely to take longer than expected.
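He ran the session in a chat window, but the same request reduces to a single API call. A minimal sketch, assuming the OpenAI Node SDK; the model name and system-prompt wording here are illustrative, not a record of what Marcus actually used:

```typescript
import OpenAI from "openai";

// Assumes OPENAI_API_KEY is set in the environment.
const client = new OpenAI();

// Marcus's goal description, as written before the session.
const goal = `Launch a project status reporting tool for small agencies
(5-20 person teams) by April 15. Paying customers: at least 5 on launch day.
I'm building solo, available 15-20 hours/week. Tech stack is decided.
I have 11 validated customer conversations. I don't have a marketing
audience. Budget: $500 for the 90 days including tools and paid distribution.`;

const response = await client.chat.completions.create({
  model: "gpt-4o", // illustrative; any capable model works
  messages: [
    {
      role: "system",
      content:
        "Reverse-engineer a milestone plan from day 90 backward. " +
        "Flag dependencies between milestones, and identify which " +
        "milestones are most likely to take longer than estimated.",
    },
    { role: "user", content: goal },
  ],
});

console.log(response.choices[0].message.content);
```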
The output surprised him.
The Milestone Plan AI Generated
Week 12 (Launch — April 15):
- 5 paying customers signed up
- Product live and stable
- Onboarding flow tested by at least 3 non-technical users
Week 10–11 (Pre-launch):
- Beta users (10 target) actively using the product for two weeks
- Critical bugs resolved from beta feedback
- Pricing page and checkout flow live and tested
- Launch announcement drafted and reviewed by someone in the target audience
Week 8–9 (Beta):
- Core feature set complete (status reporting, client view, automated weekly digest)
- 10 beta users recruited and onboarded
- Payment infrastructure fully functional (not just “set up” — tested end-to-end)
Week 6–7 (Build Phase 2):
- Core product functionality complete
- First beta user invited
- Email notifications working
Week 4–5 (Build Phase 1):
- Auth and core data model complete
- Basic status report creation/editing functional
- Development environment stable
Week 2–3 (Foundation):
- Legal: terms of service, privacy policy complete
- Payment infrastructure: Stripe account set up, test transactions working
- Domain, hosting, email established
- Landing page live (pre-product)
Week 1 (Pre-build):
- ICP (ideal customer profile) tightened based on customer conversations
- Competitive positioning documented
- Landing page copy drafted
- Distribution plan for beta recruitment defined
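A plan like this is also easy to represent as data, which matters later for tracking and recalibration: each milestone is a target week, a completion criterion, and a set of dependencies. A hypothetical TypeScript model (the field names are mine, not from Marcus’s system):

```typescript
type MilestoneStatus = "pending" | "in_progress" | "done" | "slipped";

interface Milestone {
  id: string;
  week: number;                // target week, counted from day 1
  description: string;
  completionCriterion: string; // an observable "done" test, not just a task name
  dependsOn: string[];         // ids of milestones that must finish first
  status: MilestoneStatus;
}

// Two entries from the plan above, expressed as records.
const plan: Milestone[] = [
  {
    id: "payments",
    week: 2,
    description: "Payment infrastructure",
    completionCriterion: "Test transaction succeeds end-to-end",
    dependsOn: [],
    status: "pending",
  },
  {
    id: "launch",
    week: 12,
    description: "Public launch",
    completionCriterion: "At least 5 paying customers on day 1",
    dependsOn: ["payments"],
    status: "pending",
  },
];
```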
What AI Caught That Marcus Had Missed
Three additions from AI’s output caught Marcus off guard.
The landing page in week 1. Marcus had planned to build the landing page after the product. AI flagged this as a dependency for beta recruitment — you can’t recruit beta users without something to point them to. Placing the landing page in week 1 meant Marcus could be building his waitlist while building the product, rather than scrambling to recruit beta users with nothing to show.
The payment infrastructure in week 2. Marcus’s original plan had payment setup in week 7. AI specifically flagged this as a “high-risk late dependency” — its reasoning: payment integration bugs discovered in week 7 of a solo project, two weeks before beta launch, have no buffer for resolution. Moving payment setup to week 2 built in a five-week buffer for anything unexpected. This turned out to be the single most valuable intervention.
The “tested end-to-end” qualifier on payment. AI didn’t just flag that payment needed to be set up earlier — it added a completion criterion Marcus hadn’t specified: “payment infrastructure fully functional (tested end-to-end, not just configured).” Marcus credits this distinction between “set up” and “tested” as the reason a Stripe Connect configuration issue was resolved in week 3 rather than week 9.
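The article doesn’t show Marcus’s actual test, but in Stripe’s test mode “tested end-to-end” has a concrete meaning: create a charge against a built-in test card and assert that it settles. A rough sketch using the official stripe Node library; the amount and environment variable name are illustrative, and this covers only the basic charge path, not the Stripe Connect flows Marcus also had to verify:

```typescript
import Stripe from "stripe";

// Test-mode key: no real money moves.
const stripe = new Stripe(process.env.STRIPE_TEST_KEY!);

async function verifyPaymentEndToEnd() {
  // Create and confirm a PaymentIntent with Stripe's built-in test card.
  const intent = await stripe.paymentIntents.create({
    amount: 8000, // $80.00 in cents
    currency: "usd",
    payment_method: "pm_card_visa", // Stripe test payment method
    confirm: true,
    automatic_payment_methods: { enabled: true, allow_redirects: "never" },
  });

  if (intent.status !== "succeeded") {
    throw new Error(`Payment did not complete: ${intent.status}`);
  }
  console.log("End-to-end test charge succeeded:", intent.id);
}

verifyPaymentEndToEnd().catch(console.error);
```

Running a script like this in week 2 is what turns “Stripe account set up” into a verifiable milestone rather than a checked box.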
Week-by-Week: What Actually Happened
Weeks 1–3: On track. Landing page went live in week 1 with an email capture form. Stripe issue emerged in week 2 — Stripe Connect configuration for handling agency-client billing required a verification process that took 10 days. Resolved in week 3 with no impact on the broader plan.
Weeks 4–6: Slightly behind. Auth and the core data model took a week longer than estimated due to a data architecture change in week 4 (Marcus decided to restructure how “projects” related to “clients” after feedback from two customer conversations). AI recalibration in week 5 absorbed the delay by deferring a phase 2 feature: moving email notification configuration to post-launch instead of pre-launch.
Weeks 7–9: On track. Ten beta users recruited through personal outreach to contacts from his PM network and two targeted LinkedIn posts. Beta feedback was more positive than expected — most of it concerned UI polish rather than core functionality. Six users said they’d pay.
Weeks 10–11: Behind on one milestone. The “launch announcement drafted and reviewed” milestone slipped by a week because Marcus underestimated how long it would take to get feedback from a target-audience reviewer. AI recalibration shifted the plan to a softer launch on April 16 (one day after the target), with a follow-up push on April 22. Marcus accepted the one-day slip and held the April 22 date as a secondary milestone.
Week 12 (Launch): 7 paying customers on day 1. Three came from the beta group. Four came from the LinkedIn posts in the two weeks leading up to launch. Monthly recurring revenue on day 1: $560, an average of $80 per customer.
The Recalibration That Made the Difference
Marcus ran four AI recalibration sessions during the 90 days: at weeks 5, 8, 10, and 12.
The week 5 recalibration was the most significant. The data architecture change in weeks 4–5 had created a real problem: Marcus was running about a week behind on core product development. An unrevised plan would have had him launching beta with incomplete functionality.
His recalibration prompt: “Here’s my original milestone plan: [paste]. Here’s what happened: I changed the data architecture in week 4, which cost me about 6 days. I’m behind on the phase 1 build. What needs to change to hit the April 15 launch date? What can I cut without compromising a minimum viable launch?”
AI’s response identified two features in the original scope that didn’t affect the core value proposition — email notification configuration and a client permissions system — and recommended deferring both to a post-launch update. This reduced scope by approximately 20 hours of development work, absorbing the delay without shifting the launch date.
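The recalibration loop is mechanical enough to script the same way as the initial generation. A sketch under the same assumptions as the earlier example (OpenAI Node SDK, illustrative model name), with the prompt wording taken from Marcus’s session:

```typescript
import OpenAI from "openai";

const client = new OpenAI();

// Feed the original plan plus a plain-language account of what changed,
// and ask for a revised forward plan.
async function recalibrate(
  originalPlan: string,
  whatHappened: string
): Promise<string | null> {
  const response = await client.chat.completions.create({
    model: "gpt-4o", // illustrative
    messages: [
      {
        role: "user",
        content:
          `Here's my original milestone plan: ${originalPlan}\n\n` +
          `Here's what happened: ${whatHappened}\n\n` +
          `What needs to change to hit the April 15 launch date? ` +
          `What can I cut without compromising a minimum viable launch?`,
      },
    ],
  });
  return response.choices[0].message.content;
}
```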
Marcus’s reflection: “The recalibration wasn’t about AI being smarter than me. It was about having a structured way to ask ‘given what’s actually happened, what’s the most rational path forward?’ I would have figured it out eventually, but it would have taken me two days of anxious mental churn to get to the same answer AI gave me in a ten-minute session.”
What Marcus Would Do Differently
Two lessons from the process that Marcus was candid about:
He underspecified the beta recruitment milestone. “10 beta users recruited” sounds specific, but it isn’t. It says nothing about who those 10 users should be — whether they’re target customers or just willing contacts. Three of Marcus’s beta users weren’t really in his ICP, which diluted the feedback value. The milestone should have read “10 beta users recruited from the target ICP (small agencies with 5–20 person teams).”
He didn’t use AI to stress-test the plan before starting. A useful practice he discovered mid-process: after generating an initial milestone plan, ask AI to play devil’s advocate. Prompt: “Assume this plan fails to meet the April 15 launch date. What are the most likely reasons? Which milestones are most likely to be the cause?” Running that stress-test upfront might have surfaced the data architecture risk before it became a week-4 problem.
Using Beyond Time for the Same Process
Marcus ran his milestone plan manually — AI for generation and recalibration, a spreadsheet for tracking, Google Calendar for milestone dates. It worked, but it required consistent self-discipline to maintain the system across 90 days.
Beyond Time is designed to streamline exactly this process: AI generates the milestone plan, milestones live on your calendar automatically, and the recalibration loop is built into a weekly review rather than something you have to remember to schedule.
For solo founders managing the cognitive load of a product launch, the difference between a system that requires active maintenance and one that prompts you when it’s time to recalibrate is meaningful.
The Takeaway That Matters
Marcus launched. Seven paying customers, $560 MRR, a working product. Not a home run — but a real, shipped, revenue-generating product from someone who had never done it before, in 90 days, while employed.
The milestone plan didn’t guarantee success. What it did was reduce the number of decisions Marcus had to make under stress. When the architecture issue appeared in week 4, he didn’t have to think from scratch — he had a plan to update, and he knew how to update it.
“The plan was wrong from the moment I made it,” Marcus said. “But having something to be wrong gave me something to work from. That’s the whole point.”
Action step: If you’re planning a launch, product, or project in the next 90 days, spend 20 minutes generating your milestone plan with AI using full context — deadline, starting point, hours available, known constraints. Run the reverse-engineering prompt and look specifically for milestones AI adds that you hadn’t included in your mental model of the plan.
Frequently Asked Questions
How did Marcus use AI throughout the 90-day launch process?
Marcus used AI in three distinct phases: initial milestone generation (reverse-engineering the launch plan from day 90 backward), weekly recalibration (feeding progress updates and asking AI to revise the forward plan), and problem-solving (when he hit unexpected obstacles, he described them to AI and asked for adjusted approaches). The recalibration loop was what he credited most for keeping the launch on track.
What was the single most valuable thing AI caught in Marcus's launch plan?
The payment and legal infrastructure milestone. Marcus's original plan had him setting up Stripe and creating his terms of service in week 7 — after building the core product. AI flagged this as a common failure point, noting that payment integration bugs discovered late in a solo project often delay launches by two to three weeks. Moving payment setup to week 2 gave Marcus time to resolve an unexpected Stripe Connect issue without it affecting the launch date.