Google Vertex AI
Overview
Protect your Google Cloud Vertex AI integration with LockLLM's transparent proxy layer. Secure PaLM, Gemini, and custom models on GCP with automatic prompt injection scanning.
Quick Start
```ts
import { VertexAI } from '@google-cloud/vertexai'

// Configure with LockLLM proxy endpoint
const vertex = new VertexAI({
  project: process.env.LOCKLLM_API_KEY, // Your LockLLM API key
  location: 'us-central1',
  apiEndpoint: 'https://api.lockllm.com/v1/proxy/vertex-ai'
})

const model = vertex.preview.getGenerativeModel({ model: 'gemini-3-pro' })
const response = await model.generateContent(userInput)
```
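The `generateContent` call resolves to a result object whose `response` property holds the candidates returned through the proxy. Continuing the snippet above, a minimal sketch of reading the generated text back (the optional chaining guards against an empty candidate list):

```ts
// Unwrap the first candidate's text from the result of the call above
const text = response.response.candidates?.[0]?.content?.parts?.[0]?.text
console.log(text ?? 'No candidates returned')
```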
Features
- GCP Integration: Compatible with Google Cloud Platform
- Multiple Models: PaLM, Gemini, and custom models
- Enterprise Ready: Supports GCP security requirements
- Automatic Scanning: Every request routed through the proxy is scanned for prompt injection in real time (see the sketch after this list)
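As a rough sketch of what automatic scanning looks like from the client side: a request that LockLLM flags would surface to your code as an ordinary API error. The error handling below is an assumption about how a blocked request is reported, not LockLLM's documented behavior; check the LockLLM documentation for the exact status codes and error payloads.

```ts
// Hypothetical handling of a request rejected by the proxy's injection scan;
// the exact error LockLLM returns for a blocked prompt is an assumption here.
try {
  const result = await model.generateContent(userInput)
  console.log(result.response.candidates?.[0]?.content?.parts?.[0]?.text)
} catch (err) {
  // Treat a rejected request as a security signal rather than a hard failure
  console.warn('Request blocked or failed at the LockLLM proxy:', err)
}
```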
Getting Started
Generate an API key in the LockLLM dashboard, expose it to your application as the LOCKLLM_API_KEY environment variable used in the Quick Start, and see the documentation for full configuration details.
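A minimal sketch of failing fast when the key is missing, assuming it is supplied via the LOCKLLM_API_KEY environment variable shown above:

```ts
// Fail fast if the LockLLM API key has not been configured
const apiKey = process.env.LOCKLLM_API_KEY
if (!apiKey) {
  throw new Error('LOCKLLM_API_KEY is not set; generate a key in the LockLLM dashboard')
}
```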