
Quickstart

Get Cencurity running in VS Code in under two minutes.

Steps

  1. Install the Cencurity extension from the VS Code Marketplace.
  2. Open the Command Palette (Ctrl+Shift+P, or Cmd+Shift+P on macOS) and run Cencurity: Enable Protection.
  3. If Roo Code is not installed, Cencurity installs it automatically and reloads the window when needed.
  4. Select your LLM provider and enter the provider URL (e.g. https://api.x.ai).
  5. Enter your API key.
  6. Select a model.
  7. Run Cencurity: Open Security Center from the Command Palette to open the security dashboard inside VS Code.

Done. Once protection is active, Cencurity routes traffic through a local gateway (127.0.0.1:38180) and applies local security scanning to LLM responses before they reach Roo Code.
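The Cencurity: Test Protection command checks that this local gateway is reachable. As a rough illustration of what such a connectivity check does, here is a minimal standalone sketch using the port from the docs; the script itself is illustrative and not part of the extension:

```python
import socket

GATEWAY_HOST = "127.0.0.1"
GATEWAY_PORT = 38180  # default Cencurity gateway port from the quickstart


def gateway_reachable(host: str = GATEWAY_HOST, port: int = GATEWAY_PORT,
                      timeout: float = 2.0) -> bool:
    """Return True if something is listening on the local gateway port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    status = "reachable" if gateway_reachable() else "not reachable"
    print(f"Cencurity gateway {GATEWAY_HOST}:{GATEWAY_PORT} is {status}")
```

If the check fails, protection is likely not enabled yet; run Cencurity: Enable Protection first.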

How it works

AI Agent → Cencurity Security Gateway → LLM Provider
  • Your API key stays inside the IDE — it is never sent to an external server.
  • Requests are routed through the local security gateway.
  • Responses are scanned against local security policies before reaching the editor.
  • Only policy violations are logged. Normal traffic is not stored.

Supported providers

  • OpenAI
  • Anthropic
  • Gemini
  • OpenRouter
  • Any OpenAI-compatible LLM
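Since any OpenAI-compatible endpoint is accepted, provider selection reduces to picking a base URL. A small sketch of that mapping, using publicly documented default endpoints (verify them against your provider's docs; the function itself is illustrative, not Cencurity's API):

```python
# Illustrative default endpoints for the listed providers.
PROVIDER_BASE_URLS = {
    "openai": "https://api.openai.com",
    "anthropic": "https://api.anthropic.com",
    "gemini": "https://generativelanguage.googleapis.com",
    "openrouter": "https://openrouter.ai/api",
}


def resolve_base_url(provider: str, custom_url: str = "") -> str:
    """Pick a provider's base URL, allowing any OpenAI-compatible override."""
    if custom_url:
        return custom_url
    try:
        return PROVIDER_BASE_URLS[provider.lower()]
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None
```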

Command Palette commands

  Command                               Description
  Cencurity: Enable Protection          Activate protection and select an LLM provider
  Cencurity: Disable Protection         Deactivate protection and restore previous routing
  Cencurity: Open Security Center       Open the security dashboard inside VS Code
  Cencurity: Test Protection            Verify local proxy connectivity
  Cencurity: Show Runtime Info          Check runtime and protection status
  Cencurity: Install or Update Core     Install or update the local core runtime

Next steps