Azure OpenAI
Overview
Protect your Azure OpenAI Service deployments with LockLLM's transparent proxy layer. Route all Azure OpenAI requests through LockLLM for automatic prompt injection scanning without modifying your application logic. The proxy is compatible with all GPT models hosted on Microsoft Azure infrastructure, providing enterprise-grade security for Azure AI deployments.
The integration maintains Azure OpenAI's complete API interface, including all deployment-specific features, private networking, and compliance requirements. Your existing code works exactly as before, with an added layer of security protection.
How it Works
Update your Azure OpenAI client's endpoint URL to route through the LockLLM proxy. All requests are intercepted, prompts are scanned for security threats in real time, and safe requests are forwarded to your Azure deployment. Malicious prompts are blocked before they reach the model.
The proxy adds 150-250ms overhead and operates transparently. All standard Azure OpenAI features including streaming, function calling, content filtering, and private endpoints work without modification.
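Because blocking happens at the proxy, a rejected prompt surfaces in your application as a failed request rather than a model response. The proxy's exact error format is not documented in this section, so the sketch below assumes a blocked prompt comes back as a standard HTTP error, which the openai SDK raises as an APIError; it also shows that streaming needs no special handling:
import OpenAI, { AzureOpenAI } from 'openai'

const client = new AzureOpenAI({
  endpoint: 'https://api.lockllm.com/v1/proxy/azure/your-resource',
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  apiVersion: '2024-02-15-preview',
  deployment: 'your-deployment',
  defaultHeaders: {
    'X-LockLLM-Key': process.env.LOCKLLM_API_KEY
  }
})

const userInput = 'Summarize our Q3 incident report.'

try {
  // Streaming passes through the proxy unchanged; the prompt is
  // scanned before the request is forwarded to your deployment
  const stream = await client.chat.completions.create({
    model: 'your-deployment',
    messages: [{ role: 'user', content: userInput }],
    stream: true
  })
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? '')
  }
} catch (err) {
  // Assumption: blocked prompts are rejected with an HTTP error
  // status, which the SDK surfaces as an APIError
  if (err instanceof OpenAI.APIError) {
    console.error(`Request blocked or failed (${err.status}): ${err.message}`)
  } else {
    throw err
  }
}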
Supported Models
All Azure OpenAI Service models are supported through the proxy; switching between them is a configuration change, as sketched after this list:
- GPT-5.2 and GPT-5 series
- GPT-4 Turbo and GPT-4
- GPT-3.5 Turbo
- All vision-enabled models
- Custom fine-tuned deployments
- Regional deployment variants
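Because Azure selects the model through the deployment name, moving between any of the models above, including fine-tuned deployments, requires no code changes beyond configuration. A minimal sketch; the helper function and deployment names are hypothetical placeholders:
import { AzureOpenAI } from 'openai'

// Hypothetical helper: every client shares the proxy endpoint, and
// only the Azure deployment name (which selects the model) changes
function makeClient(deployment: string): AzureOpenAI {
  return new AzureOpenAI({
    endpoint: 'https://api.lockllm.com/v1/proxy/azure/your-resource',
    apiKey: process.env.AZURE_OPENAI_API_KEY,
    apiVersion: '2024-02-15-preview',
    deployment,
    defaultHeaders: {
      'X-LockLLM-Key': process.env.LOCKLLM_API_KEY
    }
  })
}

// Placeholder deployment names; use the ones configured in your resource
const gpt4Turbo = makeClient('gpt-4-turbo-prod')
const fineTuned = makeClient('my-fine-tuned-deployment')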
Quick Start
Update your Azure OpenAI client configuration to route through the proxy:
import { AzureOpenAI } from 'openai'

const client = new AzureOpenAI({
  endpoint: 'https://api.lockllm.com/v1/proxy/azure/your-resource',
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  apiVersion: '2024-02-15-preview',
  deployment: 'your-deployment',
  defaultHeaders: {
    'X-LockLLM-Key': process.env.LOCKLLM_API_KEY
  }
})
// All requests are automatically scanned
const response = await client.chat.completions.create({
  model: 'your-deployment',
  messages: [{ role: 'user', content: userInput }]
})
Python Example
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://api.lockllm.com/v1/proxy/azure/your-resource",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-15-preview",
    default_headers={
        "X-LockLLM-Key": os.environ["LOCKLLM_API_KEY"]
    }
)
# Automatic security scanning
response = client.chat.completions.create(
    model="your-deployment",
    messages=[{"role": "user", "content": user_input}]
)
Features
- Automatic Scanning: All prompts scanned without code changes
- Enterprise Security: Compatible with Azure security requirements
- Deployment Support: Works with all Azure deployment configurations
- Content Filtering: Preserves Azure's built-in content filters
- Private Endpoints: Compatible with Azure private networking
- Compliance Ready: Maintains Azure compliance certifications
Configuration
Configure security behavior using request headers:
X-LockLLM-Key: Your LockLLM API key (required)
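The Quick Start examples attach this header client-wide through defaultHeaders. If you prefer to scope the key to individual calls instead, the openai SDK also accepts per-request headers; a minimal sketch reusing the proxy endpoint from Quick Start:
import { AzureOpenAI } from 'openai'

const client = new AzureOpenAI({
  endpoint: 'https://api.lockllm.com/v1/proxy/azure/your-resource',
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  apiVersion: '2024-02-15-preview',
  deployment: 'your-deployment'
})

// Per-request alternative to defaultHeaders: the SDK's second
// argument takes request options, including extra headers
const response = await client.chat.completions.create(
  {
    model: 'your-deployment',
    messages: [{ role: 'user', content: 'Hello' }]
  },
  { headers: { 'X-LockLLM-Key': process.env.LOCKLLM_API_KEY } }
)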
Getting Started
Generate API keys in the dashboard, update your client configuration with the proxy endpoint, and start making secure requests. Visit the documentation for complete setup guides, examples, and Azure-specific best practices.