Code Mode: The Better Way to Use MCP - Code Execution with MCP Framework by Port of Context


Replace LLM tool calling with Code Mode in secure sandboxes.
Connect AI Agents to the real world with a robust and open source framework.

Improve context efficiency with Code Mode

Following Anthropic and Cloudflare's approach to Code Mode, pctx presents MCP servers as code APIs rather than direct tool calls, dramatically improving context efficiency and task completion success rates.

Traditional MCP
Sequential tool calling
150,000 tokens
• Load all tool definitions upfront
• Pass large datasets through context
• Sequential processing
Code Mode
Load only what you need
2,000 tokens
• Discover tools on-demand
• Process data in sandbox
• Parallel execution
98.7%
Token Reduction
From 150,000 tokens to 2,000 tokens
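The contrast above can be sketched in a few lines. In the sketch below, listChannels and fetchMessages are hypothetical stand-ins for MCP tool calls, not pctx's actual API; the point is the shape of the code an agent writes, not the specific tools.

```typescript
// Hypothetical sketch: listChannels and fetchMessages stand in for
// MCP tool calls; they are not pctx's actual API.
type Message = { channel: string; text: string };

// Mocked tool calls, standing in for authenticated MCP server connections.
async function listChannels(): Promise<string[]> {
  return ["general", "random", "releases"];
}

async function fetchMessages(channel: string): Promise<Message[]> {
  return [{ channel, text: `latest update in #${channel}` }];
}

async function summarize(): Promise<number> {
  // On-demand discovery: the agent only touches the tools this task needs,
  // instead of loading every tool definition into context upfront.
  const channels = await listChannels();

  // Parallel execution: all channels are fetched at once, rather than one
  // sequential tool call per channel.
  const perChannel = await Promise.all(channels.map(fetchMessages));

  // The raw messages stay in the sandbox; only this small count re-enters
  // the model's context.
  return perChannel.flat().length;
}
```

With traditional MCP, every raw message would pass through the model's context; here only the final count does, which is where the token savings come from.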

pctx makes Code Mode MCP simple

An open source framework for code execution with MCP that brings the power of Code Mode to your AI agents

Run Locally

No cloud dependency. Full control over your environment

Bring Any LLM

Works with Claude, GPT, Gemini, or any model you choose

Deploy Anywhere

Docker, AWS, GCP, Azure—deploy to any cloud platform

Architecture

AI agents write code that executes in pctx sandboxes: type-checked, compiled, and run securely over authenticated MCP connections.

AI Agents (bring any LLM) → pctx

TypeScript Compiler Sandbox
• Type checking
• Rich error feedback
• No network access

Execution Sandbox
• Authenticated MCP client connections
• Restricted network
• Tool call execution

Connected MCP servers: Local, Slack, GitHub, Custom
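The two-sandbox pipeline can be illustrated as follows. This is a sketch only: typeCheck, runInSandbox, and execute are hypothetical names mocking the flow, not pctx's actual API, and the real compiler sandbox uses the TypeScript compiler rather than a string check.

```typescript
// Illustrative sketch of the two-sandbox pipeline; all names here are
// hypothetical, not pctx's actual API.
type CompileResult = { ok: boolean; diagnostics: string[] };

// Stand-in for the TypeScript compiler sandbox (no network access).
function typeCheck(code: string): CompileResult {
  return code.includes("unknownTool")
    ? { ok: false, diagnostics: ["TS2304: Cannot find name 'unknownTool'."] }
    : { ok: true, diagnostics: [] };
}

// Stand-in for the execution sandbox, where authenticated MCP
// connections and restricted networking apply.
function runInSandbox(code: string): string {
  return `executed (${code.length} bytes)`;
}

function execute(code: string): string {
  const check = typeCheck(code);
  if (!check.ok) {
    // Rich error feedback: diagnostics return to the model so it can
    // repair its code before anything runs.
    return `compile error: ${check.diagnostics.join("; ")}`;
  }
  // Only type-checked code reaches the execution sandbox.
  return runInSandbox(code);
}
```

Separating compilation from execution means an agent's mistakes are caught as type errors with no side effects, and only validated code ever touches a connected MCP server.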

Frequently Asked Questions

Everything you need to know about Code Mode execution with MCP

What is Code Mode?

Code Mode is an approach to AI tool execution where MCP servers are presented as code APIs rather than direct tool calls. Instead of loading all tool definitions upfront and passing large datasets through context, Code Mode enables on-demand tool discovery, processes data in sandboxes, supports parallel execution, and dramatically reduces token usage.

When should I use pctx?

Use pctx when you need better performance, lower costs, or are working with complex multi-tool workflows. pctx reduces token usage by 98.7% (from 150K to 2K tokens), enables parallel execution instead of sequential tool calls, and provides sandbox security. It's especially valuable in production environments where token costs and context limits are concerns.

How does pctx compare to Anthropic and Cloudflare's Code Mode?

pctx implements the same Code Mode principles pioneered by Anthropic and Cloudflare: presenting MCP servers as code APIs rather than direct tool calls. The key difference is that pctx is open source (MIT license), self-hostable, and works with any LLM (Claude, GPT, Gemini, etc.), giving you full control without vendor lock-in.

Can I use pctx with my existing LLM and MCP servers?

Absolutely. pctx is designed to work with any LLM (Claude, GPT-4, Gemini, local models, etc.) and integrates with all existing MCP servers. You can connect internal tools, third-party APIs, or custom services. The framework is LLM-agnostic and follows the standard Model Context Protocol, ensuring broad compatibility.

What performance improvements can I expect?

pctx dramatically improves performance. You'll see a 98.7% reduction in token usage (150K → 2K tokens for complex workflows), 5-10x faster execution through parallel processing, and corresponding cost savings. For example, a complex request that costs $0.45 with traditional MCP costs only $0.006 with pctx using Claude Sonnet pricing.

How do I migrate from traditional MCP or direct API integrations?

Migration is straightforward. For traditional MCP: install pctx, configure your existing MCP servers, and update prompts to generate code instead of tool calls. For direct APIs: wrap them as MCP servers (many popular services already have MCP implementations) or create simple adapters. Most teams complete migration in a few hours with immediate performance benefits.
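As a rough illustration of the first path, pointing pctx at existing MCP servers might look like the following. The schema shown is the common mcpServers convention used by MCP clients; treat the exact file name and keys as assumptions rather than pctx's documented format.

```json
{
  "mcpServers": {
    "slack": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-slack"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
```

Because the servers themselves are unchanged, the same configuration that drives direct tool calling can drive Code Mode.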

Does pctx require a cloud service?

No. pctx runs locally on your development machine, in Docker containers, or on any cloud platform (AWS, GCP, Azure, Vercel, Railway, etc.). There's no cloud dependency or required subscription. You have full control over where and how pctx runs, making it ideal for both local development and production deployments.

How does pctx relate to MCP?

pctx is built on top of MCP and enhances it with Code Mode execution. While MCP defines how tools communicate with AI systems, pctx optimizes this communication by presenting MCP servers as code APIs, reducing token usage by 98.7%, and enabling sandbox execution. Think of pctx as the production-ready Code Mode framework for MCP.

Ready to get started with Code Mode?