
Roll your own

Run agents and sandboxes locally or in your private cloud

Bring Your Own Keys

Use the AI model that works best for you, lock it down, and even run it on your own hardware

Your Choice, Your Control

Don't get locked into a single AI provider. Bring your own keys and use the model that best fits your needs, budget, and requirements.

Experiment Freely

Switch between different LLM providers effortlessly. Our platform supports multiple providers, so you can use different models for different tasks, compare performance, or switch providers without changing your workflows or code.

Easy configuration for running sandboxes locally, in your home lab, or in your cloud

Add files and install packages. Use Node.js, Python, or Rust. Secure micro-VMs provide an isolated execution environment for agents.

heyo.conf
[server]
host = "127.0.0.1"
port = 4444
log_level = "info"

[llm]
default_provider = "anthropic"
default_model = "claude-sonnet-4-20250514"

[llm.api_keys]
openai = "your-openai-api-key-here"
anthropic = "your-anthropic-api-key-here"
exa = "your-exa-api-key-here"
inceptionlabs = "your-inceptionlabs-api-key-here"
mistral = "your-mistral-api-key-here"
gemini = "your-gemini-api-key-here"

[llm.models]
openai = "gpt-4"
anthropic = "claude-haiku-4-5-20251001"
ollama = "qwq:latest"
mistral = "magistral-medium-2509"
inceptionlabs = "mercury"
gemini = "gemini-1.5-pro"

[mcp]
default_tools = ["websearch", "webresource"]
bearer_token = "your_auth_token_here"

[[mcp.servers]]
name = "heymcp"
url = "http://localhost:8080/sse"
enabled = true
tools = ["websearch", "webresource"]

[notifications]

[notifications.twilio]
account_sid = "your-twilio-account-sid-here"
auth_token = "your-twilio-auth-token-here"
from_number = "your-twilio-phone-number-here"

[notifications.email]
smtp_server = "smtp.gmail.com"
smtp_port = 587
username = "your-email@gmail.com"
password = "your-app-password"
from_email = "your-email@gmail.com"

[auth]
use_authentication = true
jwt_secret = "your-super-secret-jwt-key-here"
jwt_expiration = "24h"
multitenant = true

[storage]
driver = "postgres"
database_url = "postgresql://postgres:your-password@localhost:5444/heyo_db"

[storage.s3]
bucket = "your-s3-bucket-name"
region = "us-east-1"
profile = "your-aws-profile"

[memory]
summarization_provider = "anthropic"
summarization_model = "claude-sonnet-4-20250514"
summarization_temperature = 0.3
summarization_max_tokens = 5000

Multiple Providers

Support for Anthropic, Mistral, OpenAI, Google Gemini, Inception, and Ollama. Use the models that work best for your specific use cases.

Local & Cloud

Run models locally with Ollama for complete privacy, or use cloud providers for more powerful models. Mix and match as needed.

Easy Switching

Change providers or models with a simple configuration change. No need to rewrite your agents or workflows.
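For instance, using the sample heyo.conf above, moving from Anthropic to a locally hosted Ollama model is a two-line change (the model name here is just the one from the sample config; any model you have pulled locally would work):

```toml
# In heyo.conf: swap the default provider and model.
# The [llm.models] table in the sample config already maps
# each provider to a model, so nothing else needs to change.
[llm]
default_provider = "ollama"
default_model = "qwq:latest"
```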

Cost Control

Choose providers based on cost, performance, and features. Optimize your AI spend by using the right model for each task.

Supported Providers

Connect with your preferred AI provider

Anthropic

Mistral

OpenAI

Inception Labs

Ollama

Gemini

Build Your Own Sandbox

An isolated virtual machine for your agents.

Your Sandbox

Add files, scripts, data, etc. Run standard terminal commands. Have agents build scripts and webpages.

Host Anywhere

Linux and macOS support. Micro-VMs provide isolation on almost any platform.

Customizable Environment

Install and configure any tools, libraries, or dependencies you need. Your sandbox is tailored to your specific workflows and requirements.
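As a sketch, a session inside a sandbox might look like the following (the exact image contents are an assumption; this only presumes a shell with Python preinstalled):

```shell
# Hypothetical sandbox session: drop a file into the VM,
# then run it with a preinstalled interpreter.
cat > hello.py <<'EOF'
print("hello from the sandbox")
EOF
python3 hello.py   # prints: hello from the sandbox
```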

Deploy Anywhere

Run on your own hardware, cloud providers, or any infrastructure you choose. No vendor lock-in—you decide where your sandbox lives.

Isolated & Secure

Each sandbox runs in its own isolated VM, ensuring security and preventing conflicts between different projects or environments.

Easy Management

Simple setup and management tools make it easy to create, configure, and maintain your sandbox environments without complex infrastructure knowledge.

Headless Agents

Agents that can be triggered remotely and communicate with your systems

Remote Control, Local Power

Headless agents give you the flexibility to trigger and control your automations from anywhere, while maintaining direct communication with your systems. Whether you need to start a process from a webhook, schedule a task, or integrate with external services, headless agents provide the bridge between remote triggers and your local infrastructure.

Seamless Integration

Connect your agents to APIs, webhooks, and external services without compromising on security or performance. Headless agents run independently, allowing you to build complex workflows that respond to events, process data, and communicate with your existing systems—all while keeping your data and processes under your control.

Remote Triggers

Trigger your agents from anywhere using webhooks, APIs, or scheduled events. Perfect for integrating with external services and automating workflows.

System Communication

Agents communicate directly with your systems, databases, and services. Maintain full control over your data and infrastructure.

Always Available

Headless agents run independently, ready to respond to triggers at any time. No need to keep applications open or maintain active sessions.

Secure & Private

All communication is secure and private. Your agents run on your infrastructure, keeping your data local and under your control.

Log in to download the app