Getting Started with Weave AI
Weave AI is Loom's module for building AI systems visually — conversational agents and Server Graphs with inline tool definitions and agent orchestration. You design your system on a canvas and Loom compiles it to clean, production-ready Python using your chosen provider's SDK (Anthropic, OpenAI, Google, or xAI) and the MCP Python SDK. No Loom dependency at runtime. You own the output.
What You Can Build
Weave AI has two project types:
| Project Type | What You Build | Output |
|---|---|---|
| Agent | A conversational AI agent with tool use and memory | agent.py |
| Server Graph | An MCP server with inline tool definitions, imported agents, and orchestration logic | server.py + agent subfolders |
You can build each type independently, or compose them: define agents in separate projects, then import them into a Server Graph that defines tools inline and compiles everything into a single deployable package.
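As an illustration, a composed Server Graph export might produce a layout like the following (the folder and agent names are hypothetical, not what Loom necessarily emits):

```text
my_server/
├── server.py            # MCP server with inline tools and orchestration
├── requirements.txt
└── agents/
    └── research_agent/  # imported Agent project, compiled alongside
        └── agent.py
```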
Installation
Install the required SDKs before running generated code:
```shell
# For Agent projects (install the SDK for your chosen provider)
pip install anthropic     # Claude models
pip install openai        # GPT / o-series / Grok (xAI) models
pip install google-genai  # Gemini models

# For Server Graph projects
pip install "mcp[cli]" anyio anthropic
```

Each generated project includes a requirements.txt with the correct dependencies.
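For example, a generated Server Graph project's requirements.txt might look roughly like this (the exact entries and any version pins depend on what your project uses):

```text
mcp[cli]
anyio
anthropic
```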
Quickstart: Your First Server Graph
- Create a new project and select Weave AI as the module, then choose Server Graph
- Add a Gateway node — choose a transport (stdio, http, or sse)
- Add a Tool Definition node — set the tool name, description, and parameters
- Add Get Tool Arg and Tool Return nodes
- Wire exec flow: Gateway → Tool Definition → Tool Return
- Wire data flow: Tool Definition `input` → Get Tool Arg, Get Tool Arg `value` → Tool Return `value`
- Click Generate, then run:
```shell
python server.py
```

Add it to Claude Desktop's MCP config and your tool is live.
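As a sketch, registering the generated server with Claude Desktop (stdio transport) looks roughly like this in `claude_desktop_config.json` — the server name and path are placeholders:

```json
{
  "mcpServers": {
    "my-weave-server": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```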
Quickstart: Your First Agent
- Create a new project, select Weave AI, then choose Agent
- Add a persona in the left panel — set a name and write a system prompt
- On the canvas, add Agent Start → User Message → Append Message → LLM Call → On Tool Call → Agent End
- Wire exec flow through the nodes in order
- Wire data flow: Agent Start `initial_input` → User Message `content`, message history into LLM Call `messages`, LLM response into On Tool Call, text output into Agent End
- Click Generate and run:

```shell
python agent.py
```

The Editor
Agent projects open a canvas with a Personas panel on the left. Server Graph projects show an Agents panel for importing Agent projects and a Server Config panel for gateway settings.
Right-click the canvas to open the node palette. Nodes are organized by category under Weave AI and Core.
Generated Output
All generated code:
- Uses the provider's native SDK directly (Anthropic, OpenAI, or Google GenAI SDK; MCP Python SDK for Server Graphs)
- Has no dependency on Loom at runtime
- Can be modified by hand after export
- Can be pushed directly to a GitHub repo via the Push to GitHub button
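The agent flow from the quickstart compiles down to an ordinary message loop. Below is a minimal pure-Python sketch of that loop with a stubbed LLM call — the generated code calls the real provider SDK instead, and every name here is illustrative, not Loom's actual output:

```python
def llm_call(messages):
    """Stub standing in for a provider SDK call (e.g. a messages.create request)."""
    last = messages[-1]["content"]
    return {"type": "text", "text": f"Echo: {last}"}

def run_agent(initial_input):
    # Agent Start -> User Message -> Append Message: seed the history
    messages = [{"role": "user", "content": initial_input}]
    # LLM Call: send the history to the model
    response = llm_call(messages)
    # On Tool Call: branch on the response type; plain text falls through
    if response["type"] == "tool_use":
        pass  # dispatch to the wired tool, append the result, call the LLM again
    # Agent End: record and return the final text output
    messages.append({"role": "assistant", "content": response["text"]})
    return response["text"]

print(run_agent("Hello"))  # prints "Echo: Hello"
```

The stub makes the wiring visible: the data-flow edges on the canvas correspond to the dictionary fields threaded through this loop.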