# Providers
OpenSploit supports 75+ LLM providers through the AI SDK and Models.dev. You can use cloud providers or run models locally.
> **Tip:** For offline, private operation, use Ollama with local models. No data leaves your machine.
## Adding a Provider
- Run the `/connect` command in the TUI
- Select your provider
- Enter your API key

Credentials are stored in `~/.local/share/opensploit/auth.json`.
> **Note:** Use environment variables or file references in your config to avoid storing keys in plain text. See Configuration.
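As a sketch of that approach, a config can reference a key indirectly instead of embedding it. This example assumes the `{env:...}` interpolation syntax used throughout this page; the `{file:...}` form and its path are illustrative assumptions:

```json
{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    },
    "openai": {
      "options": {
        "apiKey": "{file:~/.secrets/openai-key}"
      }
    }
  }
}
```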
## Local Models (Ollama)
For fully local, offline operation, use Ollama:

- Install Ollama from [ollama.com](https://ollama.com)
- Pull a model: `ollama pull llama3.2`
- Select Ollama as your provider in OpenSploit
Recommended models for security tasks:

- `llama3.2` - Good balance of speed and capability
- `codellama` - Better for code analysis
- `mixtral` - Strong reasoning for complex tasks
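Ollama can also be declared explicitly in the config file. A minimal sketch, assuming Ollama's default OpenAI-compatible endpoint on port 11434; the `ollama` provider key and option names follow the pattern of the cloud examples below and are not confirmed OpenSploit defaults:

```json
{
  "provider": {
    "ollama": {
      "options": {
        "baseURL": "http://localhost:11434/v1"
      }
    }
  },
  "model": "ollama/llama3.2"
}
```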
## Cloud Providers
### Anthropic (Claude)
```json
{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    }
  },
  "model": "anthropic/claude-sonnet-4-5"
}
```
### OpenAI
```json
{
  "provider": {
    "openai": {
      "options": {
        "apiKey": "{env:OPENAI_API_KEY}"
      }
    }
  },
  "model": "openai/gpt-4o"
}
```
### Google (Gemini)
```json
{
  "provider": {
    "google": {
      "options": {
        "apiKey": "{env:GOOGLE_API_KEY}"
      }
    }
  },
  "model": "google/gemini-pro"
}
```
### Amazon Bedrock
```json
{
  "provider": {
    "bedrock": {
      "options": {
        "region": "us-east-1"
      }
    }
  },
  "model": "bedrock/anthropic.claude-3-sonnet"
}
```
Bedrock uses AWS credentials from your environment.
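For example, Bedrock would pick up the standard AWS environment variables or a named profile (the values below are placeholders, and the profile name is an assumption):

```shell
# Static credentials (placeholder values)
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
export AWS_SECRET_ACCESS_KEY="example-secret"
export AWS_REGION="us-east-1"

# Or point at a named profile from ~/.aws/credentials
export AWS_PROFILE="opensploit"
```

Any credential source supported by the AWS SDK default chain (environment variables, shared credentials file, instance roles) should work the same way.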
## Custom Base URL
You can customize the base URL for any provider:
```json
{
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://your-proxy.com/v1"
      }
    }
  }
}
```
## Disabling Providers
Prevent providers from loading even if credentials are available:
```json
{
  "disabled_providers": ["openai", "gemini"]
}
```
## Enabling Specific Providers
Allow only specific providers:
```json
{
  "enabled_providers": ["anthropic", "ollama"]
}
```