LiteLLM

Package: litellm · 10 nodes · Provider-agnostic LLM calls

Call OpenAI, Anthropic, Ollama, Groq, Mistral, and other providers through a unified interface. Switch models by changing a single Model Select dropdown.
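Provider switching works because litellm infers the backend from a prefix on the model string. A minimal sketch, assuming the dropdown simply swaps that string; the model names below are illustrative examples, not an exhaustive or guaranteed list, and `complete` is a hypothetical helper (running it needs litellm installed plus a provider API key):

```python
# Illustrative model strings: litellm routes to a provider based on the
# model-name prefix, so swapping the string swaps the backend.
MODELS = {
    "OpenAI": "gpt-4o-mini",
    "Anthropic": "anthropic/claude-3-haiku-20240307",
    "Ollama": "ollama/llama3",
    "Groq": "groq/llama3-8b-8192",
    "Mistral": "mistral/mistral-small-latest",
}

def complete(model: str, prompt: str):
    # Defined but not executed here: requires litellm and a provider API key.
    import litellm
    return litellm.completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
```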

Node Reference

Node            Type         Inputs                                       Outputs
Completion      statement    Messages (list<any>), System Prompt (str)    Response (litellm.Response)
Stream          statement    Messages (list<any>), System Prompt (str)    Chunk Text (str), Full Text (str)
Set Api Key     statement    API Key (str)                                -
Model Select    expression   -                                            Model (str)
Fallback        expression   Primary (str), Fallback (str)                Models (list<any>)
Embed           statement    Text (str)                                   Embedding (litellm.Embedding)
Get Text        expression   Response (litellm.Response)                  Text (str)
Get Usage       expression   Response (litellm.Response)                  Tokens (int)
Build Message   expression   Content (str)                                Message (litellm.Message)
Build Messages  expression   Msg 0-3 (litellm.Message)                    Messages (list<any>)
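Assuming these nodes wrap the litellm Python library, the message-building and response-reading nodes map onto plain chat-message dicts and OpenAI-shaped response fields. A hedged sketch of that mapping (function names are illustrative, not the package's actual internals):

```python
# Sketch of the message-building and response-reading nodes, assuming
# they wrap the litellm Python library. Names here are illustrative.

def build_message(content: str, role: str = "user") -> dict:
    # Build Message: wrap text in the chat-message shape litellm expects.
    return {"role": role, "content": content}

def build_messages(*msgs: dict) -> list:
    # Build Messages: collect up to four messages (Msg 0-3) into a list,
    # skipping unconnected (None) inputs.
    return [m for m in msgs if m is not None]

def get_text(response) -> str:
    # Get Text: litellm responses follow the OpenAI response shape.
    return response.choices[0].message.content

def get_usage(response) -> int:
    # Get Usage: total token count from the response's usage block.
    return response.usage.total_tokens

msgs = build_messages(build_message("You are terse.", role="system"),
                      build_message("Hello!"))
```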

Typical Pipeline

Build Message → Build Messages → Completion → Get Text → Print. Use Fallback to chain a primary model with a backup.
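The fallback step above amounts to trying models in order until one succeeds. A minimal sketch of that loop; the call function is injected so it can run without network (in practice it would be litellm.completion), and all names here are hypothetical:

```python
# Sketch of the Fallback pattern: try the primary model, then the backup.
def complete_with_fallback(models, messages, call):
    last_err = None
    for model in models:
        try:
            return call(model=model, messages=messages)
        except Exception as err:  # providers surface errors as exceptions
            last_err = err
    raise last_err

def flaky(model, messages):
    # Stand-in caller for the sketch: the primary "fails", the backup answers.
    if model == "gpt-4o-mini":
        raise RuntimeError("rate limited")
    return f"{model} says hi"

out = complete_with_fallback(["gpt-4o-mini", "ollama/llama3"],
                             [{"role": "user", "content": "Hello"}], flaky)
# out == "ollama/llama3 says hi"
```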