vLLM

API key: None (self-hosted).

```python
# Server-side (TapPass stores the key in its encrypted vault)
from tappass import Agent

agent = Agent("https://tappass.example.com", "tp_...")
response = agent.chat("Hello", model="vllm/your-model")
```
```shell
export OPENAI_BASE_URL=https://tappass.example.com/v1
export OPENAI_API_KEY=tp_...
# Any OpenAI-compatible tool now routes through TapPass
```

Point to your vLLM server via LiteLLM configuration.
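A sketch of what that configuration might look like as a LiteLLM `config.yaml` entry; the `hosted_vllm/` provider prefix is LiteLLM's convention for OpenAI-compatible vLLM servers, and the model name and local URL here are assumptions for a typical self-hosted deployment:

```yaml
model_list:
  - model_name: vllm/your-model        # name clients request through TapPass
    litellm_params:
      model: hosted_vllm/your-model    # LiteLLM's vLLM provider prefix
      api_base: http://localhost:8000/v1  # your vLLM server's OpenAI-compatible endpoint
```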

All requests pass through the full governance pipeline regardless of provider.