# Use other models (LiteLLM)
Through LiteLLM, Klisk can use models from any provider that LiteLLM supports.
## Installation

```bash
pip install 'klisk[litellm]'
```

## Configuration
### Anthropic (Claude)

```sh
# .env
ANTHROPIC_API_KEY=sk-ant-...
```

```python
agent = define_agent(
    name="MyAgent",
    instructions="...",
    model="anthropic/claude-sonnet-4-20250514",
    tools=get_tools("my_tool"),
)
```

### Google Gemini
```sh
# .env
GEMINI_API_KEY=...
```

```python
agent = define_agent(
    name="MyAgent",
    instructions="...",
    model="gemini/gemini-2.5-pro",
    tools=get_tools("my_tool"),
)
```

### Mistral
```sh
# .env
MISTRAL_API_KEY=...
```

```python
agent = define_agent(
    name="MyAgent",
    instructions="...",
    model="mistral/mistral-large-latest",
    tools=get_tools("my_tool"),
)
```

## Model format
The format is always `provider/model-name`:
| Provider | Prefix | Example |
|---|---|---|
| OpenAI | (no prefix) | gpt-5.2 |
| Anthropic | anthropic/ | anthropic/claude-sonnet-4-20250514 |
| Google | gemini/ | gemini/gemini-2.5-pro |
| Mistral | mistral/ | mistral/mistral-large-latest |
> [!TIP]
> For OpenAI models you don't need a prefix or LiteLLM. Only install `klisk[litellm]` if you're going to use other providers.
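To make the prefix rule concrete, here is a small sketch of how a `provider/model-name` string can be split into its two parts. Note that `parse_model` is a hypothetical helper written for illustration, not part of Klisk's or LiteLLM's API; it assumes the convention from the table above, where a missing prefix means OpenAI.

```python
def parse_model(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model name).

    Hypothetical helper for illustration only. A string without a
    '/' prefix is treated as an OpenAI model, matching the table above.
    """
    provider, sep, name = model.partition("/")
    if not sep:
        # No prefix: OpenAI model, no LiteLLM needed.
        return "openai", model
    return provider, name

print(parse_model("anthropic/claude-sonnet-4-20250514"))
# → ('anthropic', 'claude-sonnet-4-20250514')
print(parse_model("gpt-5.2"))
# → ('openai', 'gpt-5.2')
```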
