Spec-Driven Development
Imagine two developers starting the same project on the same day.
Developer A opens their IDE and starts coding. They build a user authentication system, adding features as they think of them. Two weeks later, they realize they forgot password reset functionality. They refactor. Then they discover the session handling doesn't work on mobile. They refactor again. Each discovery means rewriting code. Three months later, they're still debugging edge cases.
Developer B spends day one writing a specification. They define exactly what authentication means in their context: what security requirements matter, what edge cases exist, what success looks like. They clarify ambiguities before writing a single line of code. Then they hand that specification to an AI agent. Two weeks later, they have a complete, tested implementation. They spend months two and three building features, not fixing bugs.
Both developers are skilled. Both worked hard. But Developer B practiced Spec-Driven Development (SDD)—a methodology that prioritizes clear thinking before implementation.
The difference isn't coding ability. It's process.
What Is Spec-Driven Development?
Spec-Driven Development (SDD) is a methodology where you write complete specifications before writing code. AI agents then implement against those specifications while you focus on design, architecture, and validation.
This isn't documentation written after the fact. It's not a vague product requirements document. It's a precise specification that serves as the source of truth for implementation.
The Core Equation
When you provide AI with a clear specification, you eliminate the guesswork. You tell it exactly what to build, why it matters, what constraints exist, and what success looks like. The AI can then execute precisely.
When you provide a vague idea, the AI must guess. Each guess is an opportunity for misalignment. Five iterations later, you've wasted hours fixing things you could have specified upfront.
Why SDD Matters Now
SDD wasn't practical twenty years ago. Writing specifications took as long as writing code. But AI changes the equation:
- AI generates code faster than humans write it—if the requirements are clear
- AI handles implementation details—syntax, libraries, frameworks
- You focus on what humans do best—design, architecture, business logic
The bottleneck shifted from implementation to specification. Your primary skill is no longer writing code—it's writing specifications that guide AI implementation.
The SDD Workflow: Six Phases
SDD provides a systematic workflow from idea to validated implementation. Each phase removes ambiguity before the next phase begins.
Phase 1: Specify (Define What)
Question: What are we building and why does it matter?
Output: A specification document with four elements:
- Intent: Why does this feature exist? What user problem does it solve?
- Success Criteria: What does correct implementation look like? How do we measure success?
- Constraints: What limits exist? Performance, security, compliance, scale, technical constraints
- Non-Goals: What are we explicitly NOT building? (prevents scope creep)
Example Specification:
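A hypothetical specification for the registration feature used throughout this lesson might look like this. Every concrete number here (rate limits, latency budget, password length) is an illustrative assumption, not a requirement from this lesson:

```text
Feature: User Registration with Email Verification

Intent:
  New users need a way to create accounts so they can access
  personalized features. Solves: "I can't save my work without an account."

Success Criteria:
  - User registers with email + password and receives a verification
    email within 30 seconds
  - Unverified accounts cannot log in
  - 95th percentile registration response time < 200ms

Constraints:
  - Passwords hashed with bcrypt (no plaintext storage)
  - Rate limit: 5 registration attempts per IP per hour
  - GDPR compliance (email addresses are PII)

Non-Goals:
  - Social login (OAuth): deferred to Phase 2
  - Multi-factor authentication: deferred to Phase 2
```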
Phase 2: Clarify (Remove Ambiguity)
Question: What's underspecified or ambiguous?
Output: A list of clarification questions with answers encoded back into the specification.
Before planning, you must identify what you don't know. Common ambiguities:
- Edge cases: What happens when the email service is down? When a user enters emoji in a password?
- Integration points: Does this connect to an existing user database, or does it create a new one?
- Error handling: What error message for "email already registered" vs "invalid email format"?
- Business logic: Can users register multiple accounts with the same email? Is there a trial period?
Example Clarification:
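A clarification list for the registration feature could look like the following. The questions and answers are illustrative; your answers will depend on your product:

```text
Q: What happens if the verification email fails to send?
A: Queue for retry (3 attempts, exponential backoff); account stays
   in "pending" state.

Q: Can a user re-register with an email that is pending verification?
A: Yes. Re-registration resends the verification email and invalidates
   the previous token.

Q: What characters are allowed in passwords?
A: Any Unicode, 12-128 characters; no composition rules beyond length.
```

Each answer is then encoded back into the specification so the AI never has to guess.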
Phase 3: Plan (Design How)
Question: How will we approach building this?
Output: A plan showing architecture, dependencies, testing strategy, and tradeoffs.
Example Plan:
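A plan for the same feature might look like this sketch. The specific technology choices (FastAPI, a queued email worker) are assumptions for illustration; PostgreSQL and Python come from the constraints discussed later in this lesson:

```text
Architecture:
  API layer (FastAPI) -> service layer (registration logic) -> PostgreSQL
  Email sending handled by a queued background worker

Dependency Order:
  1. Database schema (users, verification_tokens)
  2. Registration endpoint
  3. Email worker
  4. Verification endpoint (depends on 1 and 3)

Testing Strategy:
  Unit tests for validation logic; integration tests against a test
  database; edge-case tests for duplicate emails and expired tokens.

Tradeoffs:
  Queued email over synchronous send: registration stays under the
  200ms budget even when the email provider is slow, at the cost of
  eventual (not immediate) delivery.
```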
Phase 4: Tasks (Break Down Work)
Question: What are the concrete work items?
Output: A task list with dependencies and acceptance criteria.
Example Tasks:
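A task breakdown for the plan above might look like this. Task names, endpoints, and acceptance criteria are hypothetical; note that each task is sized to fit the two-hour rule discussed later:

```text
T1. Create users and verification_tokens tables (no dependencies)
    Accept: migrations run cleanly up and down
T2. Implement POST /register with validation (depends on T1)
    Accept: valid input returns 201; duplicate email returns 409
T3. Implement email worker with retry (depends on T1)
    Accept: failed sends retried 3 times with exponential backoff
T4. Implement GET /verify?token=... (depends on T1, T3)
    Accept: valid token activates account; expired token returns 410
```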
Phase 5: Implement (AI Executes)
Question: How do we execute the plan?
Output: Working code that matches the specification and passes acceptance criteria.
Implementation Strategy:
- Provide AI with the specification, plan, and tasks
- Review code before committing (human-in-the-loop)
- Run tests and validate against success criteria
- Iterate only if implementation doesn't match spec
Example Prompt:
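An implementation prompt built on this strategy might read as follows. The task ID and directory path are hypothetical placeholders:

```text
Implement task T2 from the attached task list, following the attached
specification and plan. Use Python 3.11 and the existing project
patterns in src/api/. Do not add features beyond the spec. When done,
list which success criteria your implementation satisfies and which
tests cover each edge case from the Clarify phase.
```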
Phase 6: Validate (Verify Quality)
Question: Did we build what we specified?
Output: Validation report confirming implementation matches specification.
Validation Checklist:
- All success criteria met
- All constraints satisfied (performance, security)
- All edge cases tested
- Code follows project patterns
- Tests pass (unit, integration, edge cases)
- Documentation updated
Example Validation:
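A validation report for the registration feature might summarize the checklist like this. All numbers are illustrative:

```text
Success criteria:  3/3 met (registration, verification, p95 = 142ms)
Constraints:       bcrypt hashing verified; rate limit tested at 5/hour
Edge cases:        duplicate email, expired token, email outage: all pass
Tests:             24 unit, 9 integration, all passing
Docs:              API reference updated for /register and /verify
Result:            PASS. Ready for stakeholder sign-off.
```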
What Makes a Good Specification?
A good specification has four characteristics: clarity, completeness, constraints, and testability.
1. Clarity: No Ambiguity
Bad: "Build a registration system"
Good: "Build a user registration system with email verification, password requirements, and rate limiting"

Bad: "Make it fast"
Good: "Response time < 200ms for 95th percentile of requests"

Bad: "Handle errors gracefully"
Good: "Return user-friendly error messages, never stack traces. Log errors for debugging."
2. Completeness: Cover All Scenarios
Use this checklist to ensure completeness:
Functional Requirements:
- What are all the inputs? (data types, formats, validation)
- What are all the outputs? (success responses, error cases)
- What are all the edge cases? (null, empty, invalid, unexpected)
- What are all the states? (initial, processing, success, failure)
Non-Functional Requirements:
- Performance: Response time, throughput, concurrent users
- Security: Authentication, authorization, encryption, rate limiting
- Compliance: GDPR, HIPAA, SOC2, industry regulations
- Scalability: Expected load, growth projections, caching strategy
Integration Requirements:
- What external services does this connect to? (databases, APIs, third-party services)
- What happens when those services are slow or unavailable?
- What data formats do we use? (JSON, protobuf, CSV)
- What authentication do we need? (API keys, OAuth, tokens)
3. Constraints: Define Boundaries
Constraints prevent "just add this feature" scope creep. Explicitly state:
Technical Constraints:
- Must use Python 3.11+ (company standard)
- Must support PostgreSQL and MySQL (customer requirement)
- Must work offline for 30 days (PWA requirement)
Business Constraints:
- Must launch by Q2 (marketing deadline)
- Budget: $500/month for cloud services
- No external dependencies beyond approved list
Design Constraints:
- Must follow existing design system
- Must be accessible (WCAG 2.1 AA)
- Must support mobile and desktop
4. Testability: Can We Verify Success?
Every success criterion must be measurable:
Bad: "User-friendly interface"
Good: "New users can complete registration in < 60 seconds without documentation"

Bad: "Good performance"
Good: "95th percentile response time < 200ms under 1,000 concurrent users"

Bad: "Secure implementation"
Good: "Passes OWASP Top 10 security checklist, no critical vulnerabilities"
SDD vs Vibe Coding
"Vibe Coding" is writing code based on intuition—trying things, seeing what works, iterating reactively. SDD is thinking systematically—specifying first, then implementing.
| Aspect | Vibe Coding | Spec-Driven Development |
|---|---|---|
| Starting Point | Open IDE, start coding | Write specification first |
| Decision Making | Figure it out as you code | Make decisions upfront |
| Iteration | 5-10 cycles of "fix what I forgot" | 1-2 cycles of refinement |
| Edge Cases | Discovered in production | Planned in advance |
| AI Collaboration | "Build me a thing" (guesses) | "Implement this spec" (precision) |
| Time Distribution | 80% coding, 20% fixing | 20% specifying, 80% building |
| Scalability | Falls apart beyond 1,000 lines | Scales to complex systems |
| Team Coordination | "Read the code" | "Read the spec" |
When Vibe Coding Works:
- Learning a new framework (exploration phase)
- Prototyping throwaway code (proof-of-concept)
- Simple scripts with no edge cases (< 50 lines)
When SDD Is Essential:
- Production features with business impact
- Systems with multiple components or integrations
- Projects where requirements matter (security, compliance, performance)
- Work involving AI agents or multiple developers
When to Use SDD
Not every project needs full SDD. Use this decision framework:
Use Full SDD When:
- Production features: User-facing functionality that impacts business metrics
- Complex systems: Multiple components, integrations, or workflows
- Security-critical: Authentication, payments, data processing
- Team projects: Multiple developers need shared understanding
- AI-assisted development: You're using AI agents for implementation
Example: Building a payment processing system—use full SDD. Security matters, edge cases are critical, and errors cost money.
Use Lightweight SDD When:
- Simple utilities: Internal tools, scripts, automation
- Prototype code: Exploratory work that will be discarded
- Well-understood patterns: CRUD APIs, basic web pages
Example: Building a CSV parser for a one-time data migration—use lightweight SDD. Write down input format, output format, and error handling, then implement.
Skip SDD When:
- Learning experiments: You're exploring a new technology
- Throwaway prototypes: Code that won't reach production
- Trivial changes: Fixing a typo, updating a color
Example: Updating button color from blue to green—just make the change.
Validation Practices and Quality Gates
SDD includes validation at every phase. Each phase has quality gates that must pass before proceeding.
Phase Quality Gates
Specify Phase Gate:
- Intent is clear (why this exists)
- Success criteria are measurable
- Constraints are explicit
- Non-goals are defined
- Stakeholders approve (if team project)
Clarify Phase Gate:
- All ambiguous terms defined
- Edge cases identified
- Integration points specified
- Error handling defined
Plan Phase Gate:
- Architecture diagram exists
- Dependencies identified
- Testing strategy defined
- Tradeoffs documented
Tasks Phase Gate:
- Each task has acceptance criteria
- Dependencies between tasks explicit
- No task exceeds 2 hours
- Tasks ordered correctly
Implement Phase Gate:
- Code follows specification
- Code follows project patterns
- Tests pass (unit, integration)
- Code review approved
Validate Phase Gate:
- All success criteria met
- All constraints satisfied
- Edge cases tested
- Documentation updated
- Stakeholder sign-off
Automated Quality Checks
Where possible, automate quality gates:
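As a sketch of what an automated gate can look like, the script below checks one measurable criterion, the p95 latency budget used in this lesson's examples, against a list of recorded response times. The threshold and the sample data are assumptions for illustration; a real gate would read measurements from your load-test output:

```python
P95_BUDGET_MS = 200  # from the spec: "95th percentile response time < 200ms"

def p95(samples_ms: list[float]) -> float:
    """Return the 95th-percentile value of a list of latency samples."""
    ordered = sorted(samples_ms)
    index = max(0, int(len(ordered) * 0.95) - 1)
    return ordered[index]

def latency_gate(samples_ms: list[float]) -> bool:
    """Quality gate: pass only if p95 latency is within the spec's budget."""
    return p95(samples_ms) < P95_BUDGET_MS

# Illustrative data: 100 samples, mostly fast with a slow tail
samples = [50.0] * 90 + [150.0] * 5 + [300.0] * 5
print(latency_gate(samples))  # prints True: p95 is 150ms, under budget
```

Wiring a check like this into CI turns a success criterion from a sentence in a document into a pass/fail signal on every commit.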
SDD and AI Collaboration
SDD transforms AI from a chatbot into an implementation partner. The workflow looks different:
Without SDD
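An exchange without a specification typically plays out like this (illustrative dialogue):

```text
You:  "Build me a user registration system."
AI:   (guesses the framework, password rules, and error handling)
You:  "No, passwords need 12+ characters."            <- iteration 1
You:  "No, duplicate emails should return an error."  <- iteration 2
You:  "No, it has to rate-limit by IP."               <- iteration 3
...each correction is a requirement you could have specified upfront.
```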
With SDD
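With a specification in hand, the same exchange collapses to something like this (illustrative dialogue; the plus-addressing question is a made-up example of a Clarify-phase question):

```text
You:  "Here is the specification, plan, and task list. Implement task T2."
AI:   "Before I start: the spec says 'valid email'. Should I accept
       plus-addressing (user+tag@example.com)?"
You:  "Yes, accept it."
AI:   (implements once, matching the spec)
```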
The key difference: AI asks clarifying questions during planning, not during implementation.
Common SDD Mistakes
Mistake 1: Writing the Spec After the Code
Anti-pattern: Build the feature, then document what you built.
Why it fails: You're documenting decisions, not making them. The spec becomes a retrospective, not a guide.
Fix: Write the spec first. Revise it only if you discover something truly unknowable upfront.
Mistake 2: Vague Success Criteria
Anti-pattern: "User-friendly interface", "Good performance", "Secure implementation"
Why it fails: These aren't testable. You can't verify if you succeeded.
Fix: Make every criterion measurable. "95th percentile response time < 200ms", "Passes OWASP Top 10 checklist", "New users complete registration in < 60s without documentation"
Mistake 3: Skipping Non-Goals
Anti-pattern: No explicit statement of what you're NOT building.
Why it fails: Scope creeps. Every conversation becomes "should we add X?"
Fix: Explicitly list non-goals. When someone asks for feature X, say "That's in our non-goals list for Phase 1. We'll consider it for Phase 2."
Mistake 4: Treating Specs as Static
Anti-pattern: Write spec, never update it, even when requirements change.
Why it fails: Spec becomes outdated. Implementation drifts from spec.
Fix: Treat specs as living documents. Update them when requirements change. Keep spec and implementation in sync.
Try With AI
Prompt 1: Write a Specification
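A prompt along these lines works well; the bracketed part is for you to fill in:

```text
I want to practice Spec-Driven Development. Help me write a specification
for this feature idea: [describe a feature you want to build]. Walk me
through the four elements one at a time: Intent, Success Criteria,
Constraints, and Non-Goals. Push back when my criteria are vague or
unmeasurable, and suggest edge cases I have not considered.
```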
What you're learning: How to think systematically about requirements before implementation. You're practicing moving from vague ideas to precise specifications.
Prompt 2: Evaluate Your Current Workflow
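You might use a prompt like this one:

```text
Here is how I built my last feature: [describe your process]. Compare my
workflow to the six SDD phases (Specify, Clarify, Plan, Tasks, Implement,
Validate). Which phases did I skip, and what did skipping them cost me
in rework or bugs? Where would adding a phase have the most impact?
```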
What you're learning: Self-awareness about your development process. Understanding your current workflow helps you identify where SDD would have the most impact.
Prompt 3: SDD vs Vibe Coding Scenarios
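One way to phrase this exercise:

```text
Give me five short project scenarios, mixing production features,
internal tools, prototypes, and trivial changes. For each, I will choose
Full SDD, Lightweight SDD, or Skip SDD, and explain why. Then evaluate
my choices against the decision framework in this lesson.
```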
What you're learning: Decision-making skills. You're learning to recognize when SDD is essential vs when it's overkill. This judgment is as important as knowing how to write specs.
What's Next
You now understand the SDD methodology. In upcoming lessons, you'll practice writing specifications for real features and learn to use AI agents to implement them.
The core insight: In the agentic era, how clearly you think before you code determines how quickly you ship.
SDD isn't bureaucracy. It's acceleration. By thinking systematically upfront, you eliminate the iterations that slow you down. You ship faster, with fewer bugs, and more confidence.
Your new role: specification engineer and system architect. AI's role: implementation partner. Together, you build what matters—faster than ever before.
Core Concept
Spec-Driven Development (SDD) is a methodology where you write complete specifications before writing code, then AI agents implement against those specifications while you focus on design, architecture, and validation. The core equation: vague idea + AI = 5+ iterations of misalignment; clear specification + AI = 1-2 iterations of refinement. The bottleneck has shifted from implementation to specification quality.
Key Mental Models
- SDD Six-Phase Workflow: Specify (define what/why) -> Clarify (remove ambiguity) -> Plan (design how) -> Tasks (break down work) -> Implement (AI executes) -> Validate (verify quality). Each phase removes ambiguity before the next begins.
- Four Specification Qualities: Clarity (no ambiguity -- measurable, not vague), Completeness (all scenarios covered -- functional, non-functional, integration), Constraints (explicit boundaries -- technical, business, design), Testability (every criterion verifiable -- quantified, not subjective).
- SDD vs Vibe Coding: SDD invests 20% time specifying and 80% building; Vibe Coding spends 80% coding and 20% fixing. SDD scales to complex systems; Vibe Coding falls apart beyond 1,000 lines.
- AI Asks During Planning, Not Implementation: With SDD, clarifying questions happen in the Clarify phase. Without SDD, AI must guess requirements during implementation, causing misalignment iterations.
- Decision Framework for SDD Depth: Full SDD (production features, complex systems, security-critical), Lightweight SDD (simple utilities, prototypes, well-understood patterns), Skip SDD (learning experiments, throwaway code, trivial changes).
Key Facts
- Specification has four elements: Intent (why this exists), Success Criteria (what correct looks like), Constraints (limits that exist), Non-Goals (what we are NOT building)
- Quality gate phases: Each of the six phases has explicit pass/fail criteria before proceeding to the next
- Developer A vs B comparison: Developer A (code-first) spends 3 months debugging edge cases; Developer B (spec-first) has complete tested implementation in 2 weeks and builds features in months 2-3
- Vibe Coding works for: Learning new frameworks, prototyping throwaway code, simple scripts under 50 lines
- SDD is essential for: Production features, multi-component systems, security/compliance requirements, AI-assisted development, team projects
- Task sizing rule: No single task should exceed 2 hours of work
Critical Patterns
- The specification document structure: Intent (user problem solved) -> Success Criteria (measurable outcomes) -> Constraints (performance, security, compliance, scale) -> Non-Goals (explicit scope boundaries preventing creep)
- The Clarify phase catches unknowns before they become expensive: edge cases, integration points, error handling specifics, business logic ambiguities
- The Plan includes architecture, dependency sequence, testing strategy, and documented tradeoffs with rationale
- Validation confirms implementation matches specification across all success criteria, constraints, edge cases, and quality gates
Common Mistakes
- Writing the spec after the code (turns specification into retrospective documentation rather than a guide that drives implementation quality)
- Using vague success criteria like "user-friendly" or "good performance" instead of measurable criteria like "95th percentile response time < 200ms" or "new users complete in < 60 seconds"
- Skipping Non-Goals (without explicit scope boundaries, every conversation becomes "should we add X?" leading to scope creep)
- Treating specs as static documents rather than living artifacts that update when requirements change (spec-implementation drift makes both unreliable)
Connections
- Builds on: The orchestrator role (Lesson 2) where specification writing is the primary skill; the Nine Pillars (Lesson 6) where SDD is Pillar 7 orchestrating all others; AIDD characteristics (Lesson 6) including Specification-Driven, Quality-Gated, and Human-Verified
- Leads to: The Synthesis lesson (Lesson 8) where SDD is positioned as the methodology enabling reliable Digital FTE delegation; practical SDD workflows in later chapters where students execute the six phases on real features