
# Use other models (LiteLLM)

Klisk supports any model provider available through LiteLLM.

## Installation

```bash
pip install 'klisk[litellm]'
```
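To confirm the extra was installed, you can check for the `litellm` package from Python. This assumes the `klisk[litellm]` extra pulls in the `litellm` distribution, which is the usual convention for extras:

```python
# Check whether the LiteLLM extra is installed, without importing it.
# Assumes the `klisk[litellm]` extra provides the `litellm` package.
import importlib.util

if importlib.util.find_spec("litellm") is None:
    print("LiteLLM is missing; run: pip install 'klisk[litellm]'")
else:
    print("LiteLLM is available")
```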

## Configuration

### Anthropic (Claude)

```sh
# .env
ANTHROPIC_API_KEY=sk-ant-...
```

```python
agent = define_agent(
    name="MyAgent",
    instructions="...",
    model="anthropic/claude-sonnet-4-20250514",
    tools=get_tools("my_tool"),
)
```

### Google Gemini

```sh
# .env
GEMINI_API_KEY=...
```

```python
agent = define_agent(
    name="MyAgent",
    instructions="...",
    model="gemini/gemini-2.5-pro",
    tools=get_tools("my_tool"),
)
```

### Mistral

```sh
# .env
MISTRAL_API_KEY=...
```

```python
agent = define_agent(
    name="MyAgent",
    instructions="...",
    model="mistral/mistral-large-latest",
    tools=get_tools("my_tool"),
)
```
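The `.env` entries above work because LiteLLM reads provider credentials from environment variables. If a `.env` file doesn't fit your setup, you can set the key directly in the process environment before defining the agent. A minimal sketch (the key value is a placeholder):

```python
# Set the provider key in the process environment instead of a .env file.
# The value here is a placeholder, not a real key.
import os

os.environ["MISTRAL_API_KEY"] = "placeholder-key"

# Any provider key works the same way, e.g. ANTHROPIC_API_KEY or GEMINI_API_KEY.
print(os.environ["MISTRAL_API_KEY"])
```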

## Model format

The format is always `provider/model-name`:

| Provider  | Prefix       | Example                               |
| --------- | ------------ | ------------------------------------- |
| OpenAI    | (no prefix)  | `gpt-5.2`                             |
| Anthropic | `anthropic/` | `anthropic/claude-sonnet-4-20250514`  |
| Google    | `gemini/`    | `gemini/gemini-2.5-pro`               |
| Mistral   | `mistral/`   | `mistral/mistral-large-latest`        |

> **TIP**
>
> For OpenAI models you don't need a prefix or LiteLLM. Install `klisk[litellm]` only if you plan to use other providers.
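The table above implies a simple rule: a bare model name is an OpenAI model, while anything with a `provider/` prefix is routed through LiteLLM. The helper below is illustrative only, not part of Klisk's API:

```python
# Illustrative helper, not part of Klisk's API: decide whether a model
# string needs the LiteLLM extra, based on the provider/ prefix rule.
def needs_litellm(model: str) -> bool:
    # OpenAI models have no prefix; every other provider uses "provider/model".
    return "/" in model

print(needs_litellm("gpt-5.2"))                             # no prefix: OpenAI
print(needs_litellm("anthropic/claude-sonnet-4-20250514"))  # prefixed: LiteLLM
```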
