Python Package
What is Hypercontext?¶
Hypercontext is a standalone Python SDK for building AI agents that are aware of their own context. Unlike conventional agent frameworks that operate with fixed prompts and static tool sets, Hypercontext agents:
- Observe their own reasoning in real time
- Rewrite their own context based on performance feedback
- Archive successful strategies in an evolutionary knowledge base
- Transfer learned behaviors across tasks and sessions
The framework is inspired by the Hyperagents paper's vision of meta-cognitive AI systems — agents that don't just use context, but reason about and modify their own context to improve over time.
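The self-referential loop described above can be pictured with a minimal, stdlib-only sketch. Every name below is invented for illustration; this is not the Hypercontext API, just the shape of the idea: run a task, score the outcome, and let the agent rewrite its own context when the score is poor.

```python
# Minimal illustration of a self-referential context loop: the agent
# scores its own output and rewrites its system prompt when it fails.
# All names here are illustrative; this is not the Hypercontext API.

def run_task(prompt: str, task: str) -> str:
    # Stand-in for an LLM call: a "concise" instruction changes behavior.
    return task.upper() if "concise" in prompt else task

def score(output: str) -> float:
    # Stand-in for performance feedback on the task outcome.
    return 1.0 if output.isupper() else 0.0

context = "You are a helpful agent."
for _ in range(3):
    output = run_task(context, "summarize the report")
    if score(output) >= 1.0:
        break
    # Meta-cognitive step: the agent rewrites its own context.
    context += " Be concise."

print(context)  # the evolved prompt
print(output)   # the output produced under it
```

The real framework replaces the stand-in functions with provider-backed LLM calls and persists the winning context, but the control flow is the same: observe, evaluate, rewrite.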
Features¶
- Self-Referential Context Loop — agents read and rewrite their own system prompts, tool descriptions, and memory at runtime
- Meta-Cognitive Self-Modification — built-in reflection and context evolution based on task outcomes
- Evolutionary Archive — persistent store of proven context configurations, ranked by fitness
- Transfer Learning — reuse context patterns across different tasks, domains, and agent instances
- Context Fission — intelligent decomposition of complex contexts into specialized sub-contexts (from ContextFission)
- Zero External Dependencies — pure Python core; TypeScript SDK with minimal deps
- Dual SDK — first-class Python and TypeScript/Node.js support
- Framework Agnostic — works with any LLM provider (OpenAI, Anthropic, local models, etc.)
- Operational Provider Recipes — practical setup guides and runnable demos for Claude, OpenAI, Ollama, OpenAI-compatible servers, and local models
- Dedicated TUI — a curses-based terminal dashboard for browsing, pinning, and executing CLI commands in the shell, with `--workdir` support for project-root workflows
- MCP Everywhere — a stdio MCP daemon for Claude Desktop, Claude Code, Codex, and other terminal or desktop agents, plus the existing HTTP server for the browser dashboard
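The "Evolutionary Archive" feature reduces to a fitness-ranked store of context configurations. A stdlib-only sketch of that idea (the class and function names here are invented for illustration, not the shipped `ArchiveStore` API):

```python
# Toy fitness-ranked archive: record context configurations with a
# score and keep them sorted best-first. Illustrative only.
from dataclasses import dataclass, field

@dataclass(order=True)
class ArchivedContext:
    fitness: float
    prompt: str = field(compare=False)  # ranking ignores the text itself

archive: list[ArchivedContext] = []

def record(prompt: str, fitness: float) -> None:
    archive.append(ArchivedContext(fitness, prompt))
    archive.sort(reverse=True)  # highest fitness first

record("You are terse.", 0.62)
record("You are thorough.", 0.80)
record("You cite sources.", 0.71)

best = archive[0]
print(best.prompt, best.fitness)
```

Transfer learning then becomes a lookup: seed a new agent's context from the top-ranked entries instead of starting from scratch.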
What You Get¶
Installing hypercontext from PyPI gives you:
- the Python SDK
- the `python -m hypercontext` CLI
- the dedicated terminal UI
- the bundled stdio MCP daemon
- the bundled HTTP MCP server
- the browser dashboard launcher
- the example-friendly provider runtime and agent APIs
You do not need to install a separate mcp package.
This is the full Hypercontext runtime for Python users, not a thin SDK-only
package.
Coverage At A Glance¶
| Ships | Does Not Ship |
|---|---|
| Python SDK, CLI, TUI, stdio MCP daemon, HTTP server, browser launcher, provider runtime, agents, memory, scoring, archive, and examples | README.md, usage.md, or the npm SDK |
Prerequisites¶
- Python 3.10 or newer
- `pip`
- A virtual environment is strongly recommended
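If you want to check the interpreter before installing, a generic version gate (not part of the package) is enough:

```python
import sys

def meets_minimum(version=None) -> bool:
    """True when the interpreter is Python 3.10 or newer."""
    version = sys.version_info if version is None else version
    return version >= (3, 10)

print(meets_minimum())
```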
Create a virtual environment if you want to keep the install isolated:
python3 -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip
Install Hypercontext¶
Install the published package:
pip install hypercontext
If you want the development extras that match this repository’s toolchain:
pip install "hypercontext[dev]"
The package is designed to be used without needing a local workspace copy.
Verify The Install¶
Confirm that the package imports correctly:
python - <<'PY'
import hypercontext
print("hypercontext version:", hypercontext.__version__)
print("Loaded from:", hypercontext.__file__)
PY
Or use the CLI:
python -m hypercontext version
Package Configuration¶
Hypercontext reads runtime settings from the process environment.
You only need a .env file when you want to set provider credentials,
output directories, token budgets, or other runtime flags. If you do want
one, copy the example template and populate it for your environment:
cp .env.example .env
Key settings include:
- `HYPERCONTEXT_PROVIDER`
- `HYPERCONTEXT_MODEL`
- `HYPERCONTEXT_API_KEY`
- `HYPERCONTEXT_BASE_URL`
- `HYPERCONTEXT_ORGANIZATION`
- `HYPERCONTEXT_TOKEN_BUDGET`
- `HYPERCONTEXT_TEMPERATURE`
- `HYPERCONTEXT_MAX_TOKENS`
- `HYPERCONTEXT_OUTPUT_DIR`
- `HYPERCONTEXT_MAX_GENERATIONS`
- `HYPERCONTEXT_PARENT_SELECTION`
- `HYPERCONTEXT_CAVEMAN_MODE`
- `HYPERCONTEXT_DOCKER_IMAGE`
- `HYPERCONTEXT_SANDBOX_ENABLED`
- `HYPERCONTEXT_CONVERGENCE_CEILING`
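How such `HYPERCONTEXT_*` settings are typically consumed can be sketched with the stdlib. The defaults below are invented for the example and are not the package's real defaults:

```python
# Sketch of reading HYPERCONTEXT_* settings from the environment.
# Default values here are made up for illustration.
import os
from dataclasses import dataclass

@dataclass
class RuntimeSettings:
    provider: str
    model: str
    token_budget: int
    output_dir: str

def load_settings(env=None) -> RuntimeSettings:
    env = os.environ if env is None else env
    return RuntimeSettings(
        provider=env.get("HYPERCONTEXT_PROVIDER", "mock"),
        model=env.get("HYPERCONTEXT_MODEL", "demo"),
        token_budget=int(env.get("HYPERCONTEXT_TOKEN_BUDGET", "8192")),
        output_dir=env.get("HYPERCONTEXT_OUTPUT_DIR", "./hypercontext_output"),
    )

settings = load_settings({"HYPERCONTEXT_PROVIDER": "ollama",
                          "HYPERCONTEXT_MODEL": "llama3"})
print(settings.provider, settings.model, settings.token_budget)
```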
Claude / Anthropic Setup¶
For Anthropic, use the API root URL:
HYPERCONTEXT_PROVIDER=anthropic
HYPERCONTEXT_MODEL=claude-sonnet-4-20250514
ANTHROPIC_API_KEY=your-key-here
HYPERCONTEXT_BASE_URL=https://api.anthropic.com
ANTHROPIC_BASE_URL=https://api.anthropic.com
The provider helpers normalize an accidental trailing /v1 suffix, but the
cleanest setup is to keep the root API URL in your environment file.
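The normalization mentioned above amounts to stripping a trailing `/v1` segment. A plausible sketch of such a helper, not the SDK's exact implementation:

```python
def normalize_base_url(url: str) -> str:
    """Drop a trailing slash and an accidental /v1 suffix."""
    url = url.rstrip("/")
    if url.endswith("/v1"):
        url = url[: -len("/v1")]
    return url

print(normalize_base_url("https://api.anthropic.com/v1/"))
```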
Ollama Setup¶
For a fully local backend, start Ollama and point Hypercontext at the local server:
ollama serve
ollama pull llama3
HYPERCONTEXT_PROVIDER=ollama
HYPERCONTEXT_MODEL=llama3
OLLAMA_BASE_URL=http://localhost:11434
Then the same package commands, TUI, MCP daemon, browser UI, and agent APIs all work against the Ollama model you selected. For a full walkthrough, see the Ollama guide.
Named Provider Presets¶
If you want multiple backends in one project, use named provider presets in a
YAML config file and resolve them with Settings.create_provider() or
LLMClient.from_settings().
Example:
provider_name: claude-test
provider_presets:
  claude-test:
    provider: anthropic
    model: claude-sonnet-4-20250514
    api_key: ${ANTHROPIC_API_KEY}
    base_url: https://api.anthropic.com
  local-mock:
    provider: mock
    model: demo
    extra_kwargs:
      response_delay: 0.0
Hypercontext expands ${VAR} values from the process environment when loading
YAML config, so you can keep secrets out of the file.
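The `${VAR}` expansion behaves like simple environment substitution. A stdlib approximation (here, unset variables are left intact, which may differ from the SDK's behavior):

```python
import os
import re

_VAR_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env(value: str, env=None) -> str:
    """Replace ${VAR} placeholders with values from the environment."""
    env = os.environ if env is None else env
    return _VAR_PATTERN.sub(lambda m: env.get(m.group(1), m.group(0)), value)

print(expand_env("${ANTHROPIC_API_KEY}", {"ANTHROPIC_API_KEY": "sk-test"}))
```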
CLI Command Reference¶
The package ships a standard-library CLI:
- `version` — prints the package version
- `providers` — lists registered provider backends
- `run` — executes the evolutionary loop
- `compress` — reduces text size
- `validate` — checks compression fidelity
- `evaluate` — runs benchmark-style evaluation scaffolds
- `archive` — inspects archive content
- `benchmark` — runs benchmark helpers
- `mcp` — launches the stdio MCP daemon for desktop and terminal agents
- `serve` — launches the HTTP MCP server for browser integrations
- `ui` — launches the browser dashboard
- `tui` — opens the dedicated curses terminal UI
- `docker` — exercises sandbox-oriented workflows
Examples:
python -m hypercontext providers
python -m hypercontext run --generations 5 --output-dir ./runs/demo --workdir .
python -m hypercontext mcp --workdir /path/to/project
python -m hypercontext tui --workdir /path/to/project
python -m hypercontext serve --port 8080 --workdir /path/to/project
Use --workdir whenever you want Hypercontext to operate against a specific
project root instead of the current shell directory.
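The flag behaves like a standard project-root option; a generic `argparse` sketch of how such a flag resolves a path (this is not the package's actual parser):

```python
import argparse
from pathlib import Path

parser = argparse.ArgumentParser(prog="hypercontext")
parser.add_argument("--workdir", type=Path, default=Path.cwd(),
                    help="project root to operate against")

args = parser.parse_args(["--workdir", "/path/to/project"])
print(args.workdir)
```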
Use In Your Own Python Project¶
Typical imports:
from hypercontext import (
    HyperContext,
    TaskAgent,
    MetaAgent,
    LLMClient,
    ArchiveStore,
    LineageTracker,
    ContextCompressor,
    ContextRetriever,
)
Direct Orchestration¶
from hypercontext import HyperContext
hc = HyperContext(output_dir="./hypercontext_output")
summary = hc.run(max_generations=3)
print(summary)
Provider-Backed Calls¶
from hypercontext import LLMClient
from hypercontext.providers import ProviderRegistry
registry = ProviderRegistry.instance()
provider = registry.create(
    "anthropic",
    model="claude-sonnet-4-20250514",
    api_key="your-key-here",
    base_url="https://api.anthropic.com",
)
client = LLMClient(provider=provider)
text, history, metadata = client.complete("Summarize Hypercontext in one sentence.")
print(text)
Agent Workflows¶
Use TaskAgent when you want a repeatable task wrapper, and MetaAgent when
you want repository-aware tool use or self-modification workflows.
See the example gallery in the main docs for broader usage patterns.
Assistant Integrations¶
For Claude Desktop, Claude Code, Codex, or other MCP-aware assistants, use the native stdio daemon:
python -m hypercontext mcp --workdir /path/to/project
For browser-style or custom web integrations, use the HTTP server:
python -m hypercontext serve --port 8080 --workdir /path/to/project
For a full guide with install-method-specific integration examples, see Integrations.
Troubleshooting¶
If installation fails:
- Confirm that you are using Python 3.10 or newer
- Upgrade `pip`
- Try `pip install hypercontext --no-cache-dir`
- If you are behind a locked-down network, make sure access to PyPI is available
- If you are testing a provider-backed workflow, confirm that the provider SDK and credentials are installed and exported in your shell