Tetrifox
Portfolio Advisory · AI Adoption

AI-Assisted
Development
Framework

A practical guide for engineering teams and business leaders on working with AI on existing codebases — from a single service to one million lines.

4 Phases
Orient
Understand
Contribute
Communicate
Part 1 · Mental Model

Treat the AI as a
Senior Hire

The most common failure mode: treating the model as a search engine. A better frame: an exceptionally talented engineer who arrived this morning with no prior knowledge of your company, codebase, or product.

“The quality of AI output is determined by the quality of context you provide. Prompting is onboarding.”

You would not hand a new hire a ticket and say “go fix this bug.” You would invest time in orientation — business context, system architecture, team conventions, past decisions and their reasons. The quality of their work would be directly proportional to the quality of that onboarding.

The AI has no memory between sessions. Every session is Day One for a new hire. This means context investment must be reusable, structured, and maintained like code — not rebuilt from scratch each time.

This also means that a highly capable model like Opus rewards the approach more than lighter models do. Its capability ceiling is genuinely high, so the bottleneck is almost always context quality, not model quality. Giving it a thin prompt is like hiring a consultant and refusing to brief them.

1
Layer One
Business Intent
What problem does this software solve? Who are its users? What is the company's current strategic priority? Without this, the AI produces technically correct but business-irrelevant solutions.
2
Layer Two
Architectural Reality
How is the system structured? What are its key constraints, dependencies, and integration points? Without this, solutions break the system or ignore its design patterns.
3
Layer Three
Codebase Conventions
What patterns, idioms, and standards does this team use? What is explicitly avoided and why? Without this, output isn't mergeable or maintainable by the team.
4
Layer Four
Task Intent
What is the real goal of this task — not just its surface description? What does "done" actually look like? Without this, the model gives literal but wrong answers.

Note for business leaders: Layer 1 is your layer. Documenting business intent in plain language — in a shared location — multiplies developer and AI productivity. This is not documentation overhead. It is leverage.

Part 2 · The Framework

Four Phases of
Effective Collaboration

These phases apply whether the codebase has 10,000 lines or 10 million. They are not sequential checkboxes — they are orientations your thinking moves through. Experienced practitioners do all four instinctively. This framework makes that instinct teachable.

01
Phase One

Orient

Map the territory before you move

Core Practices

  • Generate a structural overview: directory layout, major modules, entry points, and data flows
  • Identify the “weight-bearing walls” — components everything else depends on
  • Surface the technology stack: languages, frameworks, databases, external APIs
  • Note what is missing or confusing — orphaned files and inconsistent naming are risk flags

Prompt Templates

Here is the top-level directory structure. Narrate what kind of system this appears to be and identify the major domains or modules.
What appears to be most tightly coupled? What changes would likely have cascading effects?
02
Phase Two

Understand

Build shared mental models of intent

Core Practices

  • Read key modules for intent, not syntax — what problem is each solving?
  • Identify business rules encoded in the software (often invisible to new developers and AI)
  • Map the “why” behind architectural decisions, even ones that seem poor
  • Create or update the Context Document with what you've learned

Prompt Templates

Here is the authentication module. Explain what business rules appear to be enforced here, in plain language.
This pattern appears repeatedly [paste example]. Why might the team have adopted it? What are its tradeoffs?
03
Phase Three

Contribute

Execute with maintained context

The Perfect Prompt — 5 Elements

  • System Context — relevant architecture facts for this task
  • Business Intent — why this task exists in the first place
  • Scope Constraints — what must not change under any circumstances
  • Task Definition — what specifically needs to happen
  • Definition of Done — what success concretely looks like
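The five elements can be assembled mechanically. A minimal TypeScript sketch of that assembly — the `buildPrompt` helper, its field names, and the example values are hypothetical, not part of any library or of this framework's tooling:

```typescript
// Hypothetical helper: assembles the five prompt elements into one
// structured prompt. Field names are illustrative, not a standard.
interface TaskPrompt {
  systemContext: string;    // relevant architecture facts for this task
  businessIntent: string;   // why this task exists in the first place
  scopeConstraints: string; // what must not change under any circumstances
  taskDefinition: string;   // what specifically needs to happen
  definitionOfDone: string; // what success concretely looks like
}

function buildPrompt(p: TaskPrompt): string {
  return [
    `## System Context\n${p.systemContext}`,
    `## Business Intent\n${p.businessIntent}`,
    `## Scope Constraints\n${p.scopeConstraints}`,
    `## Task\n${p.taskDefinition}`,
    `## Definition of Done\n${p.definitionOfDone}`,
  ].join("\n\n");
}

// Example values are invented for illustration.
const prompt = buildPrompt({
  systemContext: "Orders service; event-driven; reporting reads via replicas.",
  businessIntent: "Finance needs refunds reflected in the daily report.",
  scopeConstraints: "Do not touch the shared auth service or write to the reporting DB.",
  taskDefinition: "Emit a RefundProcessed event and handle it in the reporting consumer.",
  definitionOfDone: "Refunds appear in the daily report; existing report tests pass.",
});
```

The point of the structure is not the helper itself but the discipline: a prompt missing any one of the five sections is an incomplete briefing.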

Iterative Deepening

  • Round 1: Ask the AI to explain its approach before writing code. Review and agree on the approach.
  • Round 2: Implement with the agreed approach, referencing specific files and functions.
  • Round 3: Review against conventions and edge cases. Refine.
04
Phase Four

Communicate

Keep business stakeholders in the loop

Three Practices

  • Progress in plain language — weekly updates describing user impact and business risk, not lines of code
  • Decision logs — what was chosen, what was rejected, and why. Prevents future misunderstandings and builds trust
  • Early risk surfacing — when AI uncovers unexpected complexity, that reaches stakeholders in time to adjust plans

Translation Examples

“We refactored the repository layer to use dependency injection.”
becomes
“We restructured the data access code so it's easier to test and safer to change when we add new features.”
Part 3 · Context Document

The Single Highest-
Leverage Practice

A living reference file checked into the repository. Loaded at the start of every AI session. Updated when the system changes. Without it, every session is starting cold.

Most teams already have several context documents — none of them labeled as such. A README, architecture decision records, inline comments on non-obvious logic, API documentation. The problem is they're scattered, partially stale, and written for different audiences.

The practice is simply making this intentional: one file — a CONTEXT.md at the repository root — that you know is the thing you load at the start of every AI session.

What to include: What this system does and who uses it. The tech stack. Key architectural patterns and why. What the team treats as sacred (never break these things). Pointers to where the interesting complexity lives.

Target length: 200–400 lines of plain prose and bullet points. An afternoon to write the first version. 10 minutes to update when something significant changes.

For large or multi-service systems, you'll eventually have one system-level document and one per major domain. But that's a maturity step, not a starting point. Start with one.

Opening Every AI Session

1
Load the Context Document
Paste it directly or reference it explicitly. The model has no memory of previous sessions — treat it as Day One, every time.
2
Describe what has changed
Any updates since the last session. New dependencies, recent refactors, decisions made.
3
State the session goal
One sentence. What you are trying to accomplish in this session specifically.
4
State the constraints
What must not change. This is as important as the goal itself.

This takes 2–3 minutes and dramatically improves output quality for the entire session.

CONTEXT.md
# SYSTEM CONTEXT DOCUMENT
# Keep this updated. Load at session start.

## System Overview
Multi-tenant SaaS platform for supply chain visibility.
~180k lines. Node/React/Postgres.
~2,400 active enterprise customers.

## Architecture
- API gateway → domain services (6 total)
- Event-driven via internal message bus
- Shared auth service (DO NOT TOUCH)
- Read replicas for all reporting queries

## Sacred Rules
Never write directly to reporting DB
Payments service: any change needs 2 reviews
All external calls must use retry wrapper

## Conventions
- Repository pattern throughout (no raw queries)
- Errors: never throw, always return Result type
- Feature flags via LaunchDarkly before shipping

## Known Risk Areas
- /legacy/importer — untested, fragile
- Notification service — tech debt sprint Q3

## Decision Log
2024-11 — Chose Postgres over MongoDB for audit trail requirements
2025-02 — Moved to monorepo, kept separate deploy pipelines per service
Example CONTEXT.md — adapt to your system
Part 4 · At Scale

Works at Any Size.
Changes at Large.

The four-phase framework holds for codebases of 1 million lines or more. What changes is the investment required and the strategies needed to manage scope — not the underlying principles.

Domain Segmentation

Never treat a large codebase as a single unit. Decompose into bounded domains — areas of responsibility that can be oriented and understood independently. Each domain gets its own Context Document section. AI sessions are scoped to one domain at a time wherever possible.

Blast Radius Management

In tightly coupled systems, it's harder to know what a change affects. Before any significant change, explicitly ask: “Identify every component, interface, or data contract this modification might affect — in order of likelihood and severity.” This surfaces risks before they become failures.

Strangler Fig Pattern

For large legacy systems: build new capabilities alongside the old system, gradually replacing legacy behavior. No risky big-bang rewrites. AI assists this pattern exceptionally well — interfaces, adapters, consistency across boundaries — but the pattern must be established by human architects first.
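One common way to hold the seam together is a routing facade: both implementations satisfy the same interface, and a predicate (often a feature flag) decides per call which one handles the request. A minimal TypeScript sketch — the `InvoiceService` interface, the classes, and the routing rule are invented for illustration:

```typescript
// Strangler-fig seam: legacy and replacement share one interface.
interface InvoiceService {
  total(orderId: string): number;
}

class LegacyInvoiceService implements InvoiceService {
  total(orderId: string): number {
    return 99; // placeholder for the old, fragile calculation
  }
}

class NewInvoiceService implements InvoiceService {
  total(orderId: string): number {
    // Returns a distinguishable value here only to show routing;
    // in practice the new path must match legacy behavior first.
    return 101;
  }
}

// The facade is the only caller-visible entry point. As confidence
// grows, the predicate widens until the legacy class can be deleted.
class InvoiceFacade implements InvoiceService {
  constructor(
    private legacy: InvoiceService,
    private next: InvoiceService,
    private useNext: (orderId: string) => boolean, // e.g. a feature flag
  ) {}

  total(orderId: string): number {
    return this.useNext(orderId)
      ? this.next.total(orderId)
      : this.legacy.total(orderId);
  }
}

const facade = new InvoiceFacade(
  new LegacyInvoiceService(),
  new NewInvoiceService(),
  (id) => id.startsWith("2025-"), // migrate new orders first
);
```

Because callers only ever see the facade, retiring the legacy path is a one-line change to the predicate, not a system-wide migration.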

The Context Hierarchy

For large systems, context operates at three levels. You never start from scratch; you narrow progressively.

Level | Scope | Contains | Updated When
System-level | The entire codebase | Architecture, business rules, global conventions, tech stack | Major architectural changes
Domain-level | One module or service | Module purpose, local patterns, key interfaces, known risks | Domain is meaningfully changed
Session-level | This specific task | Specific goal, relevant files, current constraints, definition of done | Every AI session
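Progressive narrowing can be made concrete as simple composition: load the system document, append the relevant domain document if one exists, then the session brief. A minimal TypeScript sketch — the file layout and field names are hypothetical:

```typescript
// Hypothetical sketch: compose the three context levels into one
// session preamble. The shape and example values are illustrative.
interface ContextLevels {
  system: string;  // e.g. CONTEXT.md — whole-codebase facts
  domain?: string; // e.g. docs/context/billing.md — one service
  session: string; // this task: goal, relevant files, constraints, done
}

function composeContext(c: ContextLevels): string {
  const parts = [`# System Context\n${c.system}`];
  if (c.domain) parts.push(`# Domain Context\n${c.domain}`);
  parts.push(`# Session Context\n${c.session}`);
  return parts.join("\n\n");
}

const preamble = composeContext({
  system: "Multi-tenant SaaS, Node/Postgres, event-driven.",
  domain: "Billing service: repository pattern, Result types, no raw SQL.",
  session: "Goal: add proration to plan upgrades. Do not change invoice schema.",
});
```

The ordering matters: broad, stable context first, volatile task-specific context last, so the session brief is always read against the system-level ground truth.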
Part 5 · Anti-Patterns

What Not to Do

Recognising failure modes is as important as knowing best practices. These patterns are common, plausible-looking, and consistently produce bad outcomes.

For Developers

Vibe Prompting
Giving the AI a task with no architectural context and hoping for the best. The output will be technically plausible but contextually wrong — and the errors will be subtle enough to pass initial review.
Context Amnesia
Starting each session fresh without loading the Context Document. The model will make inconsistent decisions — different patterns, different assumptions — across sessions on the same codebase.
Prompt-and-Merge
Accepting AI output without reviewing it as a diff. Subtle errors compound. The AI will sometimes make plausible-looking changes in areas you did not intend to modify.
Scope Creep by Delegation
Asking the AI to "improve" large sections of code without specific goals. Produces churn, not progress — and diffs that are impossible to review meaningfully.

For Business Stakeholders

Speed Theatre
Measuring AI adoption by code volume or ticket velocity rather than quality, maintainability, and reduced rework. Fast is only good if the output doesn't create three new problems.
Context Starvation
Expecting developers to write great AI prompts without investing in shared business documentation. Layer 1 context — business intent — is a business responsibility. Starving it produces bad AI output.
Visibility Abandonment
Assuming AI adoption means less communication is needed. The opposite is true. Faster delivery requires faster feedback loops. The translation layer between technical and business is more important, not less.
Big Bang Context
Attempting to document the entire system's context before starting any AI-assisted work. The Context Document is built iteratively as you orient and understand. Perfect is the enemy of useful.
Quick Reference

The Framework
at a Glance

A summary of all four phases — developer focus, stakeholder signal, and the key question each phase answers.

Phase | Developer Focus | Business Signal | Key Question
01 · Orient | Map structure, identify weight-bearing components, surface unknowns. Build the initial map before any changes. | "We understand the system. Here is what we found, and here is what requires care." | What is this system and how does it work?
02 · Understand | Decode intent, document business rules, build the Context Document. Move from structure to meaning. | "Here is what this code is actually doing, in plain language — and here is what we must preserve." | Why does this code exist and what does it protect?
03 · Contribute | Structured prompts with full context, iterative deepening, review as diff. Make changes with precision. | "Here is what changed, why it was the right approach, and what we verified before merging." | How do we change things without breaking what matters?
04 · Communicate | Translate technical decisions into business language, surface risks early, maintain the decision log. | "Here is our progress, here is what we decided and why, and here is what you need to act on." | How do we keep everyone aligned as things move fast?

Context is not overhead — it is the product. The quality of your AI output is determined by the quality of context you invest in building. This investment compounds: a well-maintained Context Document makes every future session faster, more accurate, and more aligned with what the business actually needs.