Getting Started with Weave AI

Weave AI is Loom's module for building AI systems visually — conversational agents and Server Graphs with inline tool definitions and agent orchestration. You design your system on a canvas and Loom compiles it to clean, production-ready Python using your chosen provider's SDK (Anthropic, OpenAI, Google, or xAI) and the MCP Python SDK. There is no Loom dependency at runtime. You own the output.

What You Can Build

Weave AI has two project types:

  • Agent: a conversational AI agent with tool use and memory. Output: agent.py
  • Server Graph: an MCP server with inline tool definitions, imported agents, and orchestration logic. Output: server.py + agent subfolders

You can build each type independently, or compose them: define agents in separate projects, then import them into a Server Graph that defines tools inline and compiles everything into a single deployable package.

Installation

Install the required SDKs before running generated code:

# For Agent projects (install the SDK for your chosen provider)
pip install anthropic      # Claude models
pip install openai         # GPT / o-series / Grok (xAI) models
pip install google-genai   # Gemini models

# For Server Graph projects
pip install "mcp[cli]" anyio anthropic

Each generated project includes a requirements.txt with the correct dependencies.

Quickstart: Your First Server Graph

  1. Create a new project and select Weave AI as the module, then choose Server Graph
  2. Add a Gateway node — choose a transport (stdio, http, or sse)
  3. Add a Tool Definition node — set the tool name, description, and parameters
  4. Add Get Tool Arg and Tool Return nodes
  5. Wire exec flow: Gateway → Tool Definition → Tool Return
  6. Wire data flow: Tool Definition input → Get Tool Arg, Get Tool Arg value → Tool Return value
  7. Click Generate, then run:
python server.py

Add it to Claude Desktop's MCP config and your tool is live.
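The generated server.py wires your canvas nodes into the MCP Python SDK. As a rough sketch of what the output for the steps above might look like (the server name "demo" and the tool "add" are placeholders, not names Weave AI necessarily emits):

```python
def add(a: int, b: int) -> int:
    """Tool Definition node: add two numbers and return the sum (Tool Return)."""
    return a + b

def build_server():
    # MCP SDK imported lazily here; assumed installed via: pip install "mcp[cli]"
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo")  # Gateway node: server name is a placeholder
    mcp.tool()(add)        # register the tool with the server
    return mcp

if __name__ == "__main__":
    # Gateway transport: FastMCP defaults to stdio, matching step 2's default choice
    build_server().run()
```

Running `python server.py` starts the server on stdio, which is the transport Claude Desktop expects in its MCP config.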

Quickstart: Your First Agent

  1. Create a new project, select Weave AI, then choose Agent
  2. Add a persona in the left panel — set a name and write a system prompt
  3. On the canvas, add Agent Start → User Message → Append Message → LLM Call → On Tool Call → Agent End
  4. Wire exec flow through the nodes in order
  5. Wire data flow: initial_input → User Message content, message history into LLM Call messages, LLM response to On Tool Call, text output to Agent End
  6. Click Generate and run:
python agent.py
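The node chain above maps onto a conventional message-history loop against the provider SDK. A minimal sketch of the shape the generated agent.py could take with the Anthropic SDK (the model id and the helper names here are illustrative assumptions, not guaranteed Weave AI output):

```python
def append_message(history, role, content):
    """Append Message node: add one turn to the running message history."""
    return history + [{"role": role, "content": content}]

def run_agent(initial_input, system_prompt="You are a helpful assistant."):
    # Anthropic SDK imported lazily; assumes `pip install anthropic`
    # and ANTHROPIC_API_KEY set in the environment.
    import anthropic

    client = anthropic.Anthropic()
    # User Message node: initial_input becomes the first user turn
    history = append_message([], "user", initial_input)
    # LLM Call node: send the persona's system prompt plus the history
    response = client.messages.create(
        model="claude-sonnet-4-20250514",  # placeholder model id
        max_tokens=1024,
        system=system_prompt,
        messages=history,
    )
    # On Tool Call node would branch here when response.stop_reason == "tool_use";
    # this sketch skips tools and returns the text output (Agent End).
    return response.content[0].text
```

The persona you defined in step 2 supplies the `system` argument; the generated loop would also feed assistant turns back through `append_message` on each round.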

The Editor

Agent projects open a canvas with a Personas panel on the left. Server Graph projects show an Agents panel for importing Agent projects and a Server Config panel for gateway settings.

Right-click the canvas to open the node palette. Nodes are organized by category under Weave AI and Core.

Generated Output

All generated code:

  • Uses the provider's native SDK directly (Anthropic, OpenAI, or Google GenAI SDK; MCP Python SDK for servers)
  • Has no dependency on Loom at runtime
  • Can be modified by hand after export
  • Can be pushed directly to a GitHub repo via the Push to GitHub button