GPS for Your Codebase.
Not Pins on a Map.
OCXP Engine builds an infrastructure-aware graph of your entire codebase, then gives your AI agent turn-by-turn directions instead of random file fragments.
RAG gives your AI a phone book. You need a GPS.
Your AI agent asks "how does authentication work?" RAG returns 10 file chunks based on keyword similarity. Maybe some are relevant. Maybe they're from a deprecated module. Maybe they miss the database connection that's the actual bottleneck.
OCXP maps your infrastructure. Then navigates it.
OCXP Engine indexes your project and builds a graph that understands not just your code, but your databases, queues, API endpoints, and service connections. When your AI asks a question, it gets a direct route to the answer — including every downstream system that could be affected.
"How does auth work?"
"How does auth work?"
What makes OCXP different.
Not a better search engine. A fundamentally different approach to AI context.
Infrastructure Awareness
Traces function calls through databases, message queues, and downstream services. When you change userService.createUser, OCXP knows it writes to DynamoDB, triggers an SQS queue, and affects billing-service.
Progressive Zoom
Five detail levels from L0 (module names, 10 tokens) to L4 (full source code). Your AI agent requests exactly the depth it needs for the current task. No more dumping entire files.
Team Intelligence
The graph learns from every query. Common patterns surface faster. Team knowledge persists across sessions. Your AI gets smarter the more your team uses it.
Task-Aware Routing
"Add rate limiting to POST /api/users" returns only the route handler, existing rate limiter, and config file. Sniper rifle precision instead of shotgun scatter.
Blast Radius Detection
Before you ship, know exactly what a change will affect. Function-level impact analysis that traces through your entire infrastructure stack.
53% Fewer Tokens
Structured context blocks instead of scattered file chunks: 4,062 tokens instead of 8,563. Cut your AI agent's costs roughly in half while lifting accuracy from ~60% to ~90%.
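The 53% figure follows directly from the two token counts quoted above; a quick sanity check:

```python
# Token counts quoted on this page (OCXP's own benchmark numbers).
rag_tokens = 8_563
ocxp_tokens = 4_062

savings = 1 - ocxp_tokens / rag_tokens
print(f"{savings:.1%}")  # → 52.6%, i.e. roughly 53% fewer tokens
```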
See the difference.
Impact Analysis
$ ocxp-engine impact "userService.createUser"
{
"function": "userService.createUser",
"writes_to": "DynamoDB:users-table",
"triggers": "SQS:user-created-queue",
"consumers": ["billing-service", "notification-service"],
"warning": "Schema change breaks billing-service"
}
One command. Full downstream visibility. No guessing.
Progressive Zoom
// L0: Module overview (10 tokens)
{ "modules": ["auth", "oauth", "session"] }
// L1: Entry points and dependencies (85 tokens)
{ "endpoints": ["POST /login", "POST /refresh"],
"deps": ["bcrypt", "jsonwebtoken", "redis"] }
// L2: Function signatures and call graphs (340 tokens)
{ "functions": [...], "call_graph": [...],
"complexity": { "cyclomatic": 12 } }
// L4: Full source with line-by-line context
// Only when your agent actually needs it
From 10 tokens to full source. Your agent picks the right level.
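A minimal sketch of how an agent might pick a zoom level against a remaining token budget. The level names and token costs mirror the example above (L3 is omitted there, so it is omitted here too); the selection logic is illustrative, not OCXP's actual API.

```python
# Approximate token cost per zoom level, taken from the example above.
ZOOM_COSTS = {"L0": 10, "L1": 85, "L2": 340}

def pick_zoom(budget: int) -> str:
    """Return the deepest zoom level whose cost fits the token budget."""
    best = "L0"
    for level, cost in ZOOM_COSTS.items():
        if cost <= budget:
            best = level
    return best

print(pick_zoom(100))   # → L1 (L2 at 340 tokens would blow the budget)
print(pick_zoom(1000))  # → L2
```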
RAG vs OCXP Engine
Same question. Fundamentally different approach.
| | RAG | OCXP Engine |
|---|---|---|
| How it finds context | Keyword search across file chunks | Graph traversal across infrastructure |
| What it returns | ~10 random file fragments | Precise module with dependencies |
| Infrastructure awareness | None - files only | DB, queues, services, endpoints |
| Accuracy | ~60% (40% failure rate) | ~90% precision |
| Token usage | ~8,563 tokens per query | ~4,062 tokens per query |
| Hops to answer | ~12 hops (scattered) | 2-3 hops (direct) |
| Learning | Stateless - starts fresh every time | Learns from team usage patterns |
| Detail control | All or nothing | Progressive zoom (L0-L4) |
Three steps. Zero configuration.
Index
OCXP parses your source code, IaC configs, API routes, database schemas, and queue configurations. Builds a complete infrastructure graph in under 30 seconds.
Query
Your AI agent asks a question via MCP. OCXP navigates the graph, finds the precise context, and returns it at the right zoom level. 2-3 hops, not 12.
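The "2-3 hops" claim is plain graph traversal. Here is a toy illustration as a breadth-first search over an infrastructure graph; the node names are hypothetical, modelled on the createUser example elsewhere on this page, and the real OCXP graph and traversal are far richer.

```python
from collections import deque

# Toy infrastructure graph: edges point from callers to downstream systems.
GRAPH = {
    "POST /api/users": ["userService.createUser"],
    "userService.createUser": ["DynamoDB:users-table", "SQS:user-created-queue"],
    "SQS:user-created-queue": ["billing-service", "notification-service"],
}

def hops(start: str, target: str) -> int:
    """Breadth-first search: number of edges from start to target (-1 if unreachable)."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in GRAPH.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return -1

print(hops("POST /api/users", "billing-service"))  # → 3
```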
Learn
Every query refines the graph. Common patterns surface faster. Team knowledge compounds. The more you use it, the smarter it gets.
MCP is your AI's hands. OCXP is its memory.
MCP (Model Context Protocol)
- Runtime tool execution
- Gives AI the ability to take actions
- File operations, API calls, shell commands
- Stateless per session
OCXP Engine (via MCP)
- Precision context delivery
- Gives AI the knowledge to decide what to do
- Infrastructure graph, impact analysis, zoom levels
- Learns and improves over time
OCXP runs as an MCP server. Works with Claude Code, Cursor, Windsurf, and any MCP-compatible client. No plugins, no extensions — just add the config and go.
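MCP clients typically register servers through a JSON config file. A sketch of what that might look like for OCXP — the command name and arguments below are placeholders, not confirmed by this page; check the OCXP docs for the real invocation:

```json
{
  "mcpServers": {
    "ocxp": {
      "command": "ocxp-engine",
      "args": ["serve"]
    }
  }
}
```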
No vendor lock-in. No black boxes.
View Source (coming soon)
Every line of code is on GitHub. Read it, fork it, contribute.
Report Issues (coming soon)
Found a bug? Open an issue. We track everything publicly.
Contribute (coming soon)
PRs welcome. Check the contributing guide to get started.
Stop guessing. Start knowing.
Install the CLI, index your project, and give your AI agent GPS-grade context in 30 seconds.
Free forever for individual developers. Open source. Apache 2.0 licensed.