Dedalus helps you ship agent workflows that are:
  • Provider-agnostic: Use OpenAI, Anthropic, Google, xAI, DeepSeek, and more with one API.
  • Tool- and MCP-native: Let models call local functions and hosted MCP servers.
  • Production-ready: Streaming, structured outputs, routing/handoffs, and runtime policies.

What are you trying to build?

Chat with a model

Send a prompt and get a response from any provider/model.

Equip a model with tools

Let the model call typed Python/TS functions that you implement.

Stream agent output

Print responses as they’re generated (great for UIs/CLIs).

Add MCP servers

Connect to hosted MCP servers with one line.

Get reliable JSON

Validate model output against schemas (Pydantic/Zod).

Route across models

Provide multiple models and let the agent route or hand off between them by phase.

Installation

uv pip install dedalus-labs

Set Your API Key

Get your API key from the dashboard and set it as an environment variable:
export DEDALUS_API_KEY="your-api-key"
Or use a .env file:
DEDALUS_API_KEY=your-api-key
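If you want to fail fast when the key is missing, a small stdlib check works; `require_api_key` below is our own helper for illustration, not part of the SDK:

```python
import os

def require_api_key(name: str = "DEDALUS_API_KEY") -> str:
    """Return the API key from the environment, or raise a clear error."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"{name} is not set; export it or add it to a .env file")
    return key
```

Calling this once at startup gives a readable error instead of a failed request later.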

Your First Request

Let’s build this incrementally.

1) Chat with a model

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    response = await runner.run(
        input="What are the key factors that influence weather patterns?",
        model="anthropic/claude-opus-4-6",
    )

    print(response.final_output)

if __name__ == "__main__":
    asyncio.run(main())

2) Add an MCP server

Here we connect a hosted MCP server and let the model use it.

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    response = await runner.run(
        input="What's the weather forecast for San Francisco this week?",
        model="anthropic/claude-opus-4-6",
        mcp_servers=["windsornguyen/open-meteo-mcp"],  # Weather forecasts via Open-Meteo
    )

    print(response.final_output)

if __name__ == "__main__":
    asyncio.run(main())

3) Add a local tool

Define a function with type hints and a docstring, then pass it to runner.run(). The SDK extracts the tool schema automatically and executes the function when the model decides to call it.

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

def as_bullets(items: list[str]) -> str:
    """Format items as a bulleted list."""
    return "\n".join(f"• {item}" for item in items)

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    response = await runner.run(
        input=(
            "Get the 7-day weather forecast for San Francisco "
            "and format the daily conditions as bullets using as_bullets."
        ),
        model="anthropic/claude-opus-4-6",
        mcp_servers=["windsornguyen/open-meteo-mcp"],
        tools=[as_bullets],
    )

    print(response.final_output)

if __name__ == "__main__":
    asyncio.run(main())
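To see roughly what "extracts the schema automatically" means, here is an illustrative sketch of deriving a tool schema from type hints and a docstring; this is not the SDK's actual extraction code, just the general idea:

```python
import inspect
from typing import get_type_hints

def tool_schema(fn):
    """Sketch: turn a type-hinted function into a tool schema dict.
    (Illustrative only; the SDK's real extraction is more complete.)"""
    hints = get_type_hints(fn)
    hints.pop("return", None)  # only parameters go into the schema
    params = {name: {"type": str(hint)} for name, hint in hints.items()}
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": params,
    }

def as_bullets(items: list[str]) -> str:
    """Format items as a bulleted list."""
    return "\n".join(f"• {item}" for item in items)

schema = tool_schema(as_bullets)
```

The function name, docstring, and parameter types are all the model needs to decide when and how to call the tool.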

4) Stream output

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dedalus_labs.utils.stream import stream_async
from dotenv import load_dotenv

load_dotenv()

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    stream = runner.run(
        input="Explain how weather forecasting works in one paragraph, streaming as you write.",
        model="anthropic/claude-opus-4-6",
        stream=True,
    )

    await stream_async(stream)

if __name__ == "__main__":
    asyncio.run(main())
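Under the hood, stream_async consumes an async iterator and prints chunks as they arrive. The pattern can be sketched with plain asyncio; fake_stream and print_stream below are illustrative stand-ins, not SDK code:

```python
import asyncio

async def fake_stream():
    """Stand-in for a streamed model response (illustration only)."""
    for chunk in ["Weather ", "forecasting ", "works..."]:
        await asyncio.sleep(0)  # yield control, as real network chunks would
        yield chunk

async def print_stream(stream) -> str:
    """Print chunks as they arrive and return the assembled text."""
    parts = []
    async for chunk in stream:
        print(chunk, end="", flush=True)
        parts.append(chunk)
    print()
    return "".join(parts)

text = asyncio.run(print_stream(fake_stream()))
```

The same consumer works for any async iterator of text chunks, which is what makes streaming easy to wire into a CLI or UI.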

Next steps

Use Cases

Start from common agent patterns and templates.

Cookbook

End-to-end implementations and working recipes.
Go deeper: Tools · MCP Servers · Structured Outputs · Streaming
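The Structured Outputs guide covers validating model output against schemas; as a taste, here is a stdlib-only sketch of the idea. Real projects would use Pydantic or Zod as noted above, and the Forecast type and its fields are invented for illustration:

```python
import json
from dataclasses import dataclass

@dataclass
class Forecast:
    city: str
    high_f: int
    low_f: int

def parse_forecast(raw: str) -> Forecast:
    """Validate a model's JSON reply against the expected fields.
    (Stdlib sketch; Pydantic/Zod give richer validation and errors.)"""
    data = json.loads(raw)
    return Forecast(
        city=str(data["city"]),
        high_f=int(data["high_f"]),
        low_f=int(data["low_f"]),
    )

reply = '{"city": "San Francisco", "high_f": 68, "low_f": 54}'
forecast = parse_forecast(reply)
```

Parsing into a typed object up front means malformed model output fails loudly at the boundary instead of deep inside your application.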

Get the latest SDKs

Python SDK

dedalus-labs/dedalus-sdk-python

TypeScript SDK

dedalus-labs/dedalus-sdk-typescript
Connect these docs programmatically to Claude, VSCode, and more via MCP for real-time answers.
Last modified on April 11, 2026