Context beats parameters. Graph beats prompt.
Oxagen graphs your business ontology and codebase into a typed, queryable knowledge graph. Agents get richer context, run on cheaper models, and ship evals that prove the accuracy delta.
npx @oxagen/mcp-install init --agent cursor
MCP-native · Neo4j-backed · Workspace-scoped · SOC 2 Type II in progress.
Connect & Forget
Wire in your data. Forget about it.
Universal connectors, MCP ingest, and OpenAPI. Your ontology keeps ingesting new data and improving itself without you lifting a finger — shared memory for every agent you ship.
- Universal connector framework: Gmail, Outlook, Plaid, Drive, Calendar, Photos — and the MCP protocol as an ingest surface
- OAuth + encrypted-at-rest token vault — we never see credentials
- Continuous embeddings, entity extraction, and edge discovery — the graph keeps sharpening
- Expose the ontology as an MCP server in Claude, ChatGPT, Cursor, or any MCP-capable agent
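In most MCP-capable clients, exposing a server like this is a single config entry. A hypothetical sketch for a Claude Desktop- or Cursor-style config (the package name `@oxagen/mcp` and the `OXAGEN_WORKSPACE` variable are illustrative assumptions — the installer above wires this up for you):

```json
{
  "mcpServers": {
    "oxagen": {
      "command": "npx",
      "args": ["-y", "@oxagen/mcp"],
      "env": { "OXAGEN_WORKSPACE": "my-workspace" }
    }
  }
}
```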
Context layer
The context your agents should have been reading
Business ontology and code graph in one typed, traversable layer. Agents stop hallucinating. Models get cheaper. Evals prove the delta.
Business ontology + code graph
Two graphs. One context layer.
Your business has a schema — entities, relationships, domain types. Your codebase has structure — classes, functions, call paths. Oxagen maps both into one typed, traversable graph so agents stop guessing and start knowing.
- Business ontology: entities, relationships, and domain schema auto-discovered from connected sources
- Code graph: classes, functions, data models, and call paths — agents understand your implementation, not just docs
- Neo4j-canonical graph + pgvector hybrid search — sub-50ms multi-hop traversal at agent speed
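A typed, traversable graph is nodes with labels and edges with types; multi-hop retrieval walks those edges outward from an entity. A minimal in-memory sketch of the idea — the entity IDs and edge types are illustrative stand-ins for what the ontology would auto-discover, and the real store is Neo4j:

```python
from collections import defaultdict

# Typed edges: (source_id, edge_type, target_id).
# IDs and edge types here are hypothetical examples.
edges = [
    ("customer:acme", "PLACED", "invoice:1042"),
    ("invoice:1042", "BILLED_VIA", "plaid:acct-7"),
    ("invoice:1042", "HANDLED_BY", "code:billing.charge"),
]

adjacency = defaultdict(list)
for src, etype, dst in edges:
    adjacency[src].append((etype, dst))

def traverse(start, max_hops):
    """Breadth-first multi-hop traversal; yields (hop, edge_type, node)."""
    frontier, seen, out = [start], {start}, []
    for hop in range(1, max_hops + 1):
        nxt = []
        for node in frontier:
            for etype, dst in adjacency[node]:
                if dst not in seen:
                    seen.add(dst)
                    out.append((hop, etype, dst))
                    nxt.append(dst)
        frontier = nxt
    return out

# Two hops from a business entity reaches both a financial account
# and the code path that implements the billing behavior.
print(traverse("customer:acme", 2))
```

Two hops is enough to cross from the business ontology into the code graph, which is the point of keeping both in one layer.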
Model selection
Run Haiku. Get Opus-level results.
Context does the work the model was compensating for. When agents traverse the graph instead of guessing, you swap expensive stateless models for fast cheap ones — and accuracy goes up, not down.
- 95% inference cost reduction: graph context replaces model size as the quality lever
- Deterministic entity IDs — every agent refers to the same node across runs and sessions
- Built-in evals: side-by-side accuracy + cost comparison across model tiers, on every workspace update
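Deterministic entity IDs can be derived by hashing an entity's type together with its stable natural keys, so every agent resolves the same real-world entity to the same node across runs and sessions. A sketch of one plausible scheme — the field choices and ID format are assumptions for illustration, not Oxagen's published algorithm:

```python
import hashlib

def entity_id(entity_type: str, *natural_keys: str) -> str:
    """Derive a stable node ID from an entity's type and natural keys.

    The same inputs always hash to the same ID, so two agents (or two
    runs of one agent) refer to the same graph node for the same entity.
    """
    material = "\x1f".join((entity_type, *natural_keys))
    digest = hashlib.sha256(material.encode("utf-8")).hexdigest()[:16]
    return f"{entity_type}:{digest}"

a = entity_id("customer", "acme.com", "Acme Corp")
b = entity_id("customer", "acme.com", "Acme Corp")
assert a == b  # deterministic: identical inputs, identical node ID
print(a)
```

Because the ID is a pure function of the entity's identifying fields, no coordination or shared counter is needed between agents.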
Security
Tenant-Isolated. Agent-Safe.
Multi-tenant architecture enforces isolation at the row level. AES-256 at rest, TLS 1.3 in transit. Your ontology is never used to train third-party models. Ship agent workflows without leaking context across tenants.
- PostgreSQL RLS — tenant isolation enforced at the database level
- OAuth tokens encrypted with AES-256-GCM — never plaintext
- Pluggable auth (JWT, Auth0, custom) — no vendor lock-in
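Row-level security of this kind is expressed directly in PostgreSQL, so isolation holds no matter what SQL the application (or an agent) issues. A generic sketch — table, column, and setting names are illustrative, not Oxagen's actual schema:

```sql
-- Illustrative schema: every row carries its tenant.
ALTER TABLE graph_nodes ENABLE ROW LEVEL SECURITY;

-- Only rows whose tenant_id matches the connection's tenant setting
-- are visible or writable, enforced by the database itself.
CREATE POLICY tenant_isolation ON graph_nodes
  USING (tenant_id = current_setting('app.tenant_id')::uuid);
```

The application sets the tenant for each connection, and every query is filtered by the policy before any application code runs.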
95%
Inference cost reduction
<50ms
p95 3-hop graph retrieval
2 graphs
Business ontology + codebase
Evals
Built in, not bolted on
Ingest
Universal Connectors + MCP Ingest Surface
Plug your data sources in once. Or let other MCP servers feed the graph. The ontology stays in sync either way.
Plaid
Connect bank and financial accounts securely. Oxagen uses Plaid to map your spending, income, and cash flow.
Google Calendar
Surface what matters, automate scheduling, and connect how you spend your time with the rest of your ontology.
Gmail
Manage your email, draft replies, and connect communications with the rest of your ontology.
Google Photos
Integrate your image memories, albums, and photo libraries across devices.
Outlook Calendar
Surface what matters, automate scheduling, and connect your Outlook calendar.
Outlook Mail
Manage your Outlook inbox and connect communications with your ontology.
Microsoft Office 365
Connect documents, spreadsheets, emails, and more for complete context.
Google Drive
Access your cloud files and documents for unified search and productivity insights.
Microsoft OneDrive
Access and connect your OneDrive files, documents, and assets.
Run faster models. Get better results.
Graph your context. Let agents traverse it. Ship evals that prove the delta — in accuracy, in cost, in agent performance.
No sales call required. Self-serve from install to first query.