xAI (Grok)

Overview

Protect your xAI Grok integration with LockLLM's transparent proxy layer. Route all Grok requests through LockLLM for automatic prompt injection scanning. Compatible with Grok 4 and all xAI model variants.

Quick Start

import OpenAI from 'openai'

// Point the OpenAI SDK at LockLLM's xAI proxy endpoint
const client = new OpenAI({
  baseURL: 'https://api.lockllm.com/v1/proxy/xai',
  apiKey: process.env.LOCKLLM_API_KEY // Your LockLLM API key
})

// The prompt is scanned for injection attempts before being forwarded to xAI
const response = await client.chat.completions.create({
  model: 'grok-4',
  messages: [{ role: 'user', content: userInput }]
})

Features

  • Automatic Scanning: All prompts scanned without code changes
  • Grok 4 Support: Compatible with latest xAI models
  • Real-Time Protection: Instant threat detection
  • Streaming Support: Works with streaming responses

Getting Started

Generate an API key in the LockLLM dashboard, then see the documentation for full setup guides.
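Once you have a key, export it as the environment variable the quick start reads (the key value shown is a placeholder):

```shell
# Make the key available to the SDK as process.env.LOCKLLM_API_KEY
export LOCKLLM_API_KEY="your-lockllm-api-key"
```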