Port of Context

Replace LLM tool calling with Code Mode in secure sandboxes.
Connect AI agents to the real world with a robust, open source framework.

Code Mode with MCP improves context efficiency and accuracy

Presenting MCP servers as code APIs rather than direct tool calls reduces token usage and improves task completion success rates.

Traditional MCP
Sequential tool calling
~150,000 tokens
• Load all tool definitions upfront
• Pass large datasets through context
• Sequential processing
Code Mode
Load only what you need
~2,000 tokens
• Discover tools on-demand
• Process data in sandbox
• Parallel execution
98.7%
Token Reduction
From 150,000 tokens to 2,000 tokens
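The token savings above come from keeping bulk data out of the model's context. The sketch below illustrates the idea with mock data; the `listIssues` function and its shape are illustrative stand-ins for an MCP call, not pctx's actual API. The agent writes code that filters a large result set inside the sandbox, and only a small summary re-enters the context window.

```typescript
// Hypothetical Code Mode script an agent might write. Instead of
// streaming thousands of rows through the model's context, the code
// processes them in the sandbox and returns only a summary.

type Issue = { id: number; state: "open" | "closed"; title: string };

// Stand-in for an authenticated MCP tool call returning a large dataset.
async function listIssues(): Promise<Issue[]> {
  return Array.from({ length: 5000 }, (_, i) => ({
    id: i,
    state: i % 4 === 0 ? "open" : "closed",
    title: `Issue ${i}`,
  }));
}

async function main() {
  const issues = await listIssues(); // stays inside the sandbox
  const open = issues.filter((i) => i.state === "open");
  // Only this tiny summary goes back into the model's context.
  console.log(JSON.stringify({ open: open.length, total: issues.length }));
}

main();
```

With a traditional tool call, all 5,000 rows would be serialized into the context; here the model only ever sees the two-field summary.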

pctx makes Code Mode simple

An open source framework that brings the power of Code Mode to your AI agents

Run Locally

No cloud dependency. Full control over your environment

Bring Any LLM

Works with Claude, GPT, Gemini, or any model you choose

Deploy Anywhere

Docker, AWS, GCP, Azure—deploy to any cloud platform

Architecture

AI agents write code that executes in pctx sandboxes. Code is type-checked, compiled, and run securely, with authenticated MCP connections.

AI Agents (bring any LLM)
↓
pctx

TypeScript Compiler Sandbox
• Type checking
• Rich error feedback
• No network access

Execution Sandbox
• Authenticated MCP client connections
• Restricted network
• Tool call execution
↓
MCP servers: Local • Slack • GitHub • Custom
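The execution sandbox's "parallel execution" step can be sketched as follows. The `slack` and `github` client objects are mocks standing in for the authenticated MCP clients the sandbox would expose; pctx's real client API may differ.

```typescript
// Illustrative sketch of sandboxed parallel tool execution. The client
// objects are mocks, not pctx's actual MCP client interface.

type ToolResult = { server: string; ok: boolean };

// Mock authenticated MCP clients as the execution sandbox might expose them.
const slack = {
  postMessage: async (text: string): Promise<ToolResult> =>
    ({ server: "slack", ok: text.length > 0 }),
};
const github = {
  createIssue: async (title: string): Promise<ToolResult> =>
    ({ server: "github", ok: title.length > 0 }),
};

async function main() {
  // Parallel execution: both tool calls are in flight at once,
  // instead of one sequential round trip per call through the LLM.
  const results = await Promise.all([
    slack.postMessage("Deploy finished"),
    github.createIssue("Track deploy follow-ups"),
  ]);
  console.log(results.every((r) => r.ok)); // prints: true
}

main();
```

Because the fan-out happens in code rather than in the model loop, N tool calls cost one sandbox execution instead of N sequential LLM round trips.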