LiteLLM
Package: litellm · 10 nodes · Provider-agnostic LLM calls
Call OpenAI, Anthropic, Ollama, Groq, Mistral, and other providers through a unified interface. Switch models (or providers) by changing a single dropdown on the Model Select node.
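Under the hood, litellm routes each request by the model string alone: an optional `provider/` prefix selects the backend. The Model Select node presumably emits one such string; the dropdown contents below are illustrative examples, not the node's actual list.

```python
# Example litellm model strings; the "provider/" prefix picks the backend.
EXAMPLE_MODELS = [
    "gpt-4o-mini",                  # OpenAI (no prefix needed)
    "claude-3-haiku-20240307",      # Anthropic
    "ollama/llama3",                # local Ollama server
    "groq/llama3-8b-8192",          # Groq
    "mistral/mistral-large-latest", # Mistral
]

def provider_prefix(model: str) -> str:
    """Return the explicit provider prefix, or "" if litellm infers it."""
    return model.split("/", 1)[0] if "/" in model else ""
```

Because routing lives entirely in this string, swapping `"gpt-4o-mini"` for `"ollama/llama3"` changes provider without touching the rest of the graph.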
Node Reference
| Node | Type | Inputs | Outputs |
|---|---|---|---|
| Completion | statement | Messages (list<any>), System Prompt (str) | Response (litellm.Response) |
| Stream | statement | Messages (list<any>), System Prompt (str) | Chunk Text (str), Full Text (str) |
| Set Api Key | statement | API Key (str) | - |
| Model Select | expression | - | Model (str) |
| Fallback | expression | Primary (str), Fallback (str) | Models (list<any>) |
| Embed | statement | Text (str) | Embedding (litellm.Embedding) |
| Get Text | expression | Response (litellm.Response) | Text (str) |
| Get Usage | expression | Response (litellm.Response) | Tokens (int) |
| Build Message | expression | Content (str) | Message (litellm.Message) |
| Build Messages | expression | Msg 0-3 (litellm.Message) | Messages (list<any>) |
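The message-building nodes presumably assemble OpenAI-style chat message dicts, which is the format `litellm.completion()` accepts. A minimal stdlib sketch (the function names are illustrative, not the package's API; the None-dropping behavior for unconnected Msg inputs is an assumption):

```python
def build_message(content: str, role: str = "user") -> dict:
    """What the Build Message node emits: one OpenAI-style chat message."""
    return {"role": role, "content": content}

def build_messages(*msgs) -> list:
    """What Build Messages emits: up to four messages collected into a list.
    Unconnected inputs (None) are assumed to be dropped."""
    return [m for m in msgs if m is not None]

messages = build_messages(
    build_message("What does LiteLLM do?"),
    None, None, None,  # Msg 1-3 left unconnected
)
# `messages` is ready to wire into the Completion node's Messages input.
```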
Typical Pipeline
Build Message → Build Messages → Completion → Get Text → Print. Use Fallback to pair a primary model with a backup that is tried when the primary fails.