OpenRouter

Overview

Protect your OpenRouter integration with LockLLM's transparent proxy layer. Route all OpenRouter requests through LockLLM for automatic prompt injection scanning without modifying your application logic. OpenRouter provides unified access to multiple LLM providers, and LockLLM adds consistent security across all of them.

The integration maintains OpenRouter's complete API interface, preserving access to all supported models from OpenAI, Anthropic, Google, and dozens of other providers. Your existing code works exactly as before, with added security protection.

How it Works

Update your OpenRouter client's base URL to route through the LockLLM proxy. All requests are intercepted, prompts are scanned for security threats in real-time, and safe requests are forwarded to OpenRouter. Malicious prompts are blocked before reaching any provider.

The proxy adds roughly 150-250ms of latency per request and operates transparently. All standard OpenRouter features, including model selection, fallbacks, and provider routing, work without modification.
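
Because enabling scanning is only a base-URL change, it can be toggled with a simple flag. A minimal sketch (the `base_url` helper and the flag are illustrative, not part of any SDK; `https://openrouter.ai/api/v1` is OpenRouter's native endpoint):

```python
# The only change needed to enable scanning is the base URL; the model IDs,
# request shape, and OpenRouter headers all stay the same.
OPENROUTER_BASE = "https://openrouter.ai/api/v1"                     # native endpoint
LOCKLLM_PROXY_BASE = "https://api.lockllm.com/v1/proxy/openrouter"   # from this guide

def base_url(scanning_enabled: bool = True) -> str:
    """Pick the base URL: route through LockLLM when scanning is enabled."""
    return LOCKLLM_PROXY_BASE if scanning_enabled else OPENROUTER_BASE
```

Pass the result as the client's `base_url` so the rest of your code is unaware of which path requests take.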

Supported Models

All OpenRouter models are supported through the proxy:

  • OpenAI models (GPT-5.2, GPT-5, GPT-4)
  • Anthropic models (Claude Opus, Sonnet, Haiku)
  • Google models (Gemini Pro, Flash)
  • Open-source models (Llama, Mistral, etc.)
  • All other providers supported by OpenRouter
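
Every model above is addressed with OpenRouter's `provider/model` ID format, and the proxy forwards these IDs unchanged, so switching providers is only a string change. A small illustrative helper (`split_model_id` is hypothetical, not an SDK function):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split an OpenRouter 'provider/model' ID into its two parts."""
    provider, _, slug = model_id.partition("/")
    return provider, slug

# e.g. split_model_id("anthropic/claude-opus-4.5")
#      -> ("anthropic", "claude-opus-4.5")
```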

Quick Start

Update your OpenRouter client configuration to route through the proxy:

import OpenAI from 'openai'

const client = new OpenAI({
  baseURL: 'https://api.lockllm.com/v1/proxy/openrouter',
  apiKey: process.env.OPENROUTER_API_KEY,
  defaultHeaders: {
    'X-LockLLM-Key': process.env.LOCKLLM_API_KEY,
    'HTTP-Referer': 'https://yourapp.com',
    'X-Title': 'Your App Name'
  }
})

// All requests are automatically scanned
const response = await client.chat.completions.create({
  model: 'anthropic/claude-opus-4.5',
  messages: [{ role: 'user', content: userInput }]
})

Python Example

import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.lockllm.com/v1/proxy/openrouter",
    api_key=os.environ["OPENROUTER_API_KEY"],
    default_headers={
        "X-LockLLM-Key": os.environ["LOCKLLM_API_KEY"],
        "HTTP-Referer": "https://yourapp.com",
        "X-Title": "Your App Name"
    }
)

# Automatic security scanning
response = client.chat.completions.create(
    model="anthropic/claude-opus-4.5",
    messages=[{"role": "user", "content": user_input}]
)

Features

  • Unified Security: Consistent protection across all providers
  • Multi-Provider Support: Works with all OpenRouter models
  • Model Fallbacks: Preserves OpenRouter's fallback mechanisms
  • Provider Routing: Compatible with provider preferences
  • Streaming Support: Works with streaming responses
  • Cost Optimization: Maintains OpenRouter's cost routing
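
Since scanning happens on the request, streamed responses pass through the proxy unchanged. A sketch of consuming a stream, with a pure helper to assemble the text (the helper name is illustrative; the commented call assumes the proxied client from the examples above):

```python
def collect_stream_text(chunks) -> str:
    """Assemble the assistant's text from streamed chat-completion chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks carry no content delta
            parts.append(delta)
    return "".join(parts)

# In real use, chunks come from a streaming call through the proxy:
# stream = client.chat.completions.create(
#     model="anthropic/claude-opus-4.5",
#     messages=[{"role": "user", "content": user_input}],
#     stream=True,
# )
# print(collect_stream_text(stream))
```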

Configuration

Configure the proxy using request headers:

  • X-LockLLM-Key: Your LockLLM API key (required)
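
Because X-LockLLM-Key is required on every request, it helps to fail fast at startup when the key is missing rather than hit an auth error mid-request. A minimal sketch (the `lockllm_headers` helper and the fail-fast check are illustrative):

```python
import os

def lockllm_headers() -> dict[str, str]:
    """Build the required proxy headers from the environment.

    X-LockLLM-Key is the only documented header; raise early if the
    key is not configured.
    """
    key = os.environ.get("LOCKLLM_API_KEY")
    if not key:
        raise RuntimeError("LOCKLLM_API_KEY is not set")
    return {"X-LockLLM-Key": key}
```

Merge the result into the client's `default_headers` alongside the OpenRouter headers shown in the examples above.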

Getting Started

Generate API keys in the dashboard, update your client configuration with the proxy URL, and start making secure requests. Visit the documentation for complete setup guides, examples, and best practices.