Cohere
Overview
Protect your Cohere API integration with LockLLM's transparent proxy layer. Route all Cohere requests through LockLLM for automatic prompt injection scanning without modifying your application logic. The proxy is compatible with Command, Embed, and every other Cohere model variant built for enterprise applications.
The integration maintains Cohere's complete API interface, including generation, embedding, classification, and summarization endpoints. Your existing code works exactly as before, with added security protection.
How it Works
Update your Cohere client's base URL to route through the LockLLM proxy. Every request is intercepted, its prompt is scanned for security threats in real time, and safe requests are forwarded to Cohere unchanged. Malicious prompts are blocked before they reach the model.
The proxy operates transparently and adds 150-250ms of overhead per request. All standard Cohere features, including streaming, tool use, and RAG connectors, work without modification.
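The request lifecycle above can be sketched as a simple decision flow. This is an illustrative sketch only: `handle_request`, `scan`, `forward`, and the response shapes are our assumptions for explanation, not LockLLM's actual API.

```python
# Illustrative sketch of the proxy's decision flow. `scan` and `forward`
# stand in for LockLLM's scanner and the upstream Cohere call; the dict
# shapes below are assumptions, not the real API contract.
def handle_request(prompt, scan, forward):
    verdict = scan(prompt)          # real-time prompt injection scan
    if verdict["safe"]:
        return forward(prompt)      # safe: forwarded to Cohere unchanged
    # malicious: blocked before the prompt ever reaches the model
    return {"error": "prompt_blocked", "detections": verdict["detections"]}
```

Your application only ever sees the forwarded response or a block error; the scan itself is invisible to your code.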
Supported Models
All Cohere models and endpoints are supported through the proxy:
- Command models for generation
- Embed models for embeddings
- Classify endpoints
- Summarize endpoints
- Rerank capabilities
- Custom fine-tuned models
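Because every endpoint shares one proxy base URL, routing differs only in the request path. A minimal sketch (the path names mirror the Cohere endpoints listed above; the `endpoint_url` helper is ours for illustration, not part of either SDK):

```python
# Shared proxy base URL from this guide; every Cohere endpoint hangs off it.
PROXY_BASE = "https://api.lockllm.com/v1/proxy/cohere"

def endpoint_url(path: str) -> str:
    """Build the proxied URL for any Cohere endpoint path."""
    return f"{PROXY_BASE}/{path.lstrip('/')}"

# generate, embed, classify, summarize, and rerank all resolve the same way
urls = {p: endpoint_url(p) for p in ("generate", "embed", "classify", "summarize", "rerank")}
```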
Quick Start
Update your Cohere client configuration to route through the proxy:
import { CohereClient } from 'cohere-ai'

const client = new CohereClient({
  baseURL: 'https://api.lockllm.com/v1/proxy/cohere',
  token: process.env.COHERE_API_KEY,
  additionalHeaders: {
    'X-LockLLM-Key': process.env.LOCKLLM_API_KEY
  }
})

// All requests are automatically scanned
const response = await client.generate({
  prompt: userInput,
  model: 'command'
})

Python Example
import cohere
import os

client = cohere.Client(
    api_key=os.environ["COHERE_API_KEY"],
    base_url="https://api.lockllm.com/v1/proxy/cohere",
    headers={
        "X-LockLLM-Key": os.environ["LOCKLLM_API_KEY"]
    }
)

# Automatic security scanning
response = client.generate(
    prompt=user_input,
    model='command'
)

Features
- Automatic Scanning: All prompts scanned without code changes
- Full API Support: Works with all Cohere endpoints and models
- Streaming Support: Compatible with streaming responses
- Tool Use: Preserves tool calling capabilities
- RAG Connectors: Works with Cohere's retrieval features
- Enterprise Ready: Supports enterprise security requirements
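Streaming in particular needs no changes: event chunks arrive through the proxy exactly as they would from Cohere directly. A hedged sketch of consuming a stream follows; the `event_type`/`text` chunk shape is modeled on Cohere's v1 streaming events and should be treated as an assumption, with plain dicts standing in for SDK objects.

```python
# Collect generated text from a stream of event chunks. In real code the
# iterable would come from the SDK's streaming call routed through the
# proxy; here chunks are plain dicts with an assumed Cohere-style shape.
def collect_stream(events):
    parts = []
    for event in events:
        if event.get("event_type") == "text-generation":
            parts.append(event["text"])
    return "".join(parts)
```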
Configuration
Configure security behavior using request headers:
X-LockLLM-Key: Your LockLLM API key (required)
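For example, when building request headers yourself, attach the required key like this (reading from the environment keeps the key out of source control; the `lockllm_headers` helper and its `extra` parameter are illustrative, not part of any SDK):

```python
import os

# X-LockLLM-Key is the only required LockLLM header per this guide; your
# Cohere key is still passed through the client's normal auth parameter.
def lockllm_headers(extra=None):
    headers = {"X-LockLLM-Key": os.environ["LOCKLLM_API_KEY"]}
    if extra:
        headers.update(extra)
    return headers
```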
Getting Started
Generate API keys in the dashboard, update your client configuration with the proxy URL, and start making secure requests. Visit the documentation for complete setup guides, examples, and best practices.