Case Study · 20 February 2026 · 10 min read

How I Built Scorafy: From Concept to Live SaaS Product Using AI-Assisted Development

The full build story of Scorafy - from concept to production SaaS product. Architecture decisions, AI-assisted development workflows, and the lessons learned along the way.


Adam Broons

Founder, Cognitiv

There's a particular kind of itch that comes from solving the same problem manually, over and over, and knowing that software could do it better. That's how Scorafy started.

I'd spent months working with coaches, educators, and HR professionals who were drowning in assessment data. They'd collect feedback through surveys and questionnaires, then spend hours - sometimes days - turning raw scores into meaningful reports. The analysis wasn't complicated. It was just tedious, repetitive, and time-consuming. The kind of work that AI handles brilliantly.

So I built Scorafy. A SaaS platform that takes assessment data and generates personalised, AI-powered reports in minutes instead of hours. And I built the entire thing using AI-assisted development.

Here's how it actually went.

The stack

Every architecture decision started with the same question: what lets me move fastest without creating technical debt I'll regret in six months?

Next.js 16 + TypeScript for the frontend and API routes. Server Components for performance, App Router for clean routing, TypeScript because I refuse to ship JavaScript without type safety in 2026.

Supabase for the database and authentication. PostgreSQL under the hood, Row Level Security for data isolation, built-in auth that handles email/password and OAuth. I evaluated Firebase and a custom Postgres setup, but Supabase hit the sweet spot of developer experience and production readiness.

Claude AI (Anthropic) for report generation. I tested GPT-4, Claude, and Gemini for the core AI feature. Claude consistently produced the most nuanced, well-structured assessment reports. The writing quality matters when your output is a professional document that coaches hand to their clients.

Stripe for payments. REST API for checkout session creation, webhook SDK for event handling. Three pricing tiers ($39, $99, $199 per month) with a 25% annual discount.
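The annual discount arithmetic is simple enough to sketch. The tier prices come from the post; the tier names and helper function are mine, purely for illustration:

```typescript
// Monthly tier prices in USD (from the post). Tier names are hypothetical.
const TIERS = { starter: 39, pro: 99, business: 199 } as const;

// 25% off when billed annually.
const ANNUAL_DISCOUNT = 0.25;

// Total cost for a year on the annual plan, e.g. annualPrice(99) -> 891.
function annualPrice(monthly: number): number {
  return monthly * 12 * (1 - ANNUAL_DISCOUNT);
}
```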

Vercel for deployment. Git push to deploy, preview URLs for every PR, edge functions, analytics. The developer experience is unmatched for Next.js projects.

Tailwind CSS v4 for styling. Utility-first CSS that keeps the codebase clean and the design system consistent.

What AI-assisted development actually looks like

I want to be clear about something: AI didn't write Scorafy. I built Scorafy with AI as an accelerator.

Here's the difference. When I needed to set up Stripe webhook handling, I didn't ask an AI to "build me a payment system." I designed the payment flow, decided on the webhook events I needed to handle, structured the database schema for subscriptions, and then used AI to help me write the implementation faster.
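To make the "designed the flow first" point concrete: the core of webhook handling is verifying that an event really came from Stripe. In production you'd use the SDK's `stripe.webhooks.constructEvent`, but the scheme underneath is an HMAC-SHA256 over `timestamp.payload`, which can be sketched with Node's crypto module (function names are mine; the real `Stripe-Signature` header also carries the timestamp and can contain multiple signatures):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Simplified sketch of Stripe-style webhook verification: the signature is
// an HMAC-SHA256 over "<timestamp>.<raw body>" keyed by the webhook secret.
// Use stripe.webhooks.constructEvent in production; it also checks the
// timestamp against a tolerance window to block replay attacks.
function verifySignature(
  payload: string,
  timestamp: string,
  signature: string,
  secret: string,
): boolean {
  const expected = createHmac("sha256", secret)
    .update(`${timestamp}.${payload}`)
    .digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signature);
  // Constant-time comparison to avoid leaking the expected digest.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

The important design detail is verifying against the raw request body, not a re-serialised copy: any whitespace difference changes the digest.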

AI was most valuable for:

Boilerplate acceleration. Setting up database schemas, API route handlers, form validation, error handling patterns. The stuff that's well-understood but takes time to type out.

Problem-solving conversations. When I hit an architectural decision point - like how to handle concurrent report generation without blocking the UI - I'd discuss the trade-offs with Claude. Not "tell me what to do" but "here are my options, what am I missing?"

Code review and bug hunting. Paste in a component that's not behaving right, describe the expected vs actual behaviour, and get targeted debugging suggestions.

Documentation and testing. Writing TypeScript interfaces, JSDoc comments, test cases. The mechanical parts of good engineering that often get skipped when you're moving fast.
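The concurrent report generation question mentioned above has a common shape: kick the slow AI call off as a background job, return a job id immediately, and let the client poll for status. This is a generic in-memory illustration of that pattern, not Scorafy's implementation (all names are mine; a real deployment would persist jobs in the database so they survive restarts):

```typescript
// Minimal fire-and-forget job store for slow work like AI report generation.
type JobStatus = "pending" | "done" | "failed";

interface Job {
  id: string;
  status: JobStatus;
  result?: string;
}

const jobs = new Map<string, Job>();
let nextId = 0;

// Start the work without awaiting it, so the request handler can return
// the job id to the UI straight away.
function startReportJob(generate: () => Promise<string>): string {
  const id = String(++nextId);
  jobs.set(id, { id, status: "pending" });
  generate()
    .then((result) => jobs.set(id, { id, status: "done", result }))
    .catch(() => jobs.set(id, { id, status: "failed" }));
  return id;
}

// The UI polls this (e.g. from a GET /api/jobs/:id route).
function getJob(id: string): Job | undefined {
  return jobs.get(id);
}
```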

AI was least valuable for:

Architecture decisions. The big choices - which database, how to structure the subscription model, what the user flow should look like - those required human judgment, domain knowledge, and business context that AI simply doesn't have.

UI/UX design. AI can generate components, but designing a user experience that feels right requires taste, user empathy, and iteration that you can only get by using the thing yourself.

Edge cases and security. Row Level Security policies, webhook signature verification, rate limiting, input sanitisation. These are areas where getting it wrong has real consequences, so I wrote and tested them carefully.
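Of the security items above, rate limiting is the easiest to illustrate in isolation. A minimal token bucket looks like this (names and parameters are mine, not Scorafy's actual middleware; in production you'd key a bucket per user or IP and back it with something shared like Redis):

```typescript
// Token bucket: starts full at `capacity` and refills continuously at
// `refillPerSec` tokens per second. Each allowed request spends one token.
class TokenBucket {
  private capacity: number;
  private refillPerSec: number;
  private tokens: number;
  private last: number;

  constructor(capacity: number, refillPerSec: number, now: number = Date.now()) {
    this.capacity = capacity;
    this.refillPerSec = refillPerSec;
    this.tokens = capacity;
    this.last = now;
  }

  // Returns true if the request is allowed, false if rate-limited.
  // `now` is injectable (milliseconds) to keep the logic testable.
  allow(now: number = Date.now()): boolean {
    const elapsedSec = (now - this.last) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSec);
    this.last = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```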

The timeline

From first commit to live product: roughly six weeks of focused work.

Week 1-2: Core architecture. Database schema, authentication flow, project management CRUD. Getting the foundation right so everything built on top of it would be solid.

Week 3: AI integration. Building the assessment ingestion pipeline, prompt engineering for report generation, output formatting and PDF export.

Week 4: Payments. Stripe integration, subscription management, usage tracking, billing portal.

Week 5: Polish. Landing page, onboarding flow, email notifications (via Resend), error handling, loading states.

Week 6: Launch prep. SEO, analytics, directory submissions, social assets, documentation.

Six weeks is fast for a production SaaS product. Without AI assistance, I'd estimate the same build at 14-16 weeks. The acceleration wasn't from AI writing code I didn't understand - it was from AI removing the friction at every step.

The hard parts

Prompt engineering is real engineering. Getting Claude to generate consistently high-quality assessment reports required dozens of iterations on the system prompt. Small changes in wording would produce dramatically different outputs. I spent more time refining prompts than I spent on the entire payment integration.
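One practice that helps tame that sensitivity is assembling the system prompt from explicit, named parts rather than one hand-edited blob, so each iteration changes exactly one thing. The sketch below is entirely hypothetical, not Scorafy's actual prompt; it just shows the structure:

```typescript
// Hypothetical structured system prompt for report generation. Pinning down
// role, required sections, and tone as separate fields makes prompt
// iterations diffable and keeps small wording tweaks from swinging output.
interface ReportPromptConfig {
  audience: string;   // e.g. "executive coaching clients"
  sections: string[]; // required report sections, in order
  tone: string;       // e.g. "professional, encouraging"
}

function buildSystemPrompt(cfg: ReportPromptConfig): string {
  return [
    `You are an assessment analyst writing for ${cfg.audience}.`,
    `Produce a report with exactly these sections, in order: ${cfg.sections.join(", ")}.`,
    `Tone: ${cfg.tone}. Ground every claim in the supplied scores; do not invent data.`,
  ].join("\n");
}
```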

Stripe in live mode is nerve-wracking. Testing with test keys is comfortable. The moment you flip to live keys and real money starts flowing, every webhook handler feels like it needs one more review.

Authentication edge cases multiply. Email verification flows, password reset, session management across tabs, OAuth callback handling. Each one seems simple in isolation. Together, they form a web of states that all need to work correctly.

Pricing is harder than building. I changed the pricing model three times before settling on the current tiers. Technical problems have technical solutions. Pricing problems require understanding your market, your costs, and the perceived value of what you've built.

What I'd do differently

Start with the landing page, not the product. I built the product first and the marketing second. In retrospect, I'd validate demand with a landing page and waitlist before writing a single line of application code.

Set up analytics from day one. I added Google Analytics and event tracking late in the process. I wish I'd had usage data from the first internal testing sessions.

Write the onboarding flow before the features. The best feature in the world is worthless if users can't figure out how to use it. I rebuilt the onboarding twice.

The result

Scorafy is live at scorafy.com. Coaches, educators, and HR professionals can upload assessment data and get AI-generated reports that would have taken hours to produce manually.

It's early days. The product is live, the infrastructure is solid, and now the real work begins: finding users, gathering feedback, and iterating.

Building Scorafy proved something I'd been telling my consulting clients for months: AI-assisted development doesn't replace the need for good engineering decisions. It amplifies the speed at which a competent developer can execute them. The developer still matters. The decisions still matter. AI just removes the busywork between having an idea and shipping it.

If you're considering building a SaaS product and want to talk through the architecture, the AI tooling, or the build process, get in touch. This is exactly the kind of project I help clients with at Cognitiv.

Want to discuss this further?

I'm always up for a conversation about AI, product development, or technology strategy.

Get in Touch