# Quickstart

Get Cencurity running in VS Code in under two minutes.

## Steps
- Install the Cencurity extension from the VS Code Marketplace.
- Open the Command Palette (`Ctrl+Shift+P`, or `Cmd+Shift+P` on macOS) and run `Cencurity: Enable Protection`.
- If Roo Code is not installed, Cencurity installs it automatically and reloads the window when needed.
- Select your LLM provider and enter the provider URL (e.g. `https://api.x.ai`).
- Enter your API key.
- Select a model.
- Run `Cencurity: Open Security Center` from the Command Palette to open the security dashboard inside VS Code.
Done. Once protection is active, Cencurity routes traffic through a local gateway (`127.0.0.1:38180`) and applies local security scanning to LLM responses before they reach Roo Code.
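One quick way to confirm that the local gateway is actually listening is a plain TCP probe against the `127.0.0.1:38180` address above. This is only a minimal sketch of that check; the port is taken from this page, and everything else here is generic socket code:

```python
import socket

GATEWAY_HOST = "127.0.0.1"  # local-only: traffic never leaves the machine
GATEWAY_PORT = 38180        # gateway port from the paragraph above

def is_gateway_listening(host: str = GATEWAY_HOST,
                         port: int = GATEWAY_PORT,
                         timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        # connect_ex returns 0 on success instead of raising
        return sock.connect_ex((host, port)) == 0
```

`Cencurity: Test Protection` performs an equivalent connectivity check from inside VS Code, so the snippet is only useful for debugging from outside the editor.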
## How it works
AI Agent → Cencurity Security Gateway → LLM Provider
- Your API key stays inside the IDE — it is never sent to an external server.
- Requests are routed through the local security gateway.
- Responses are scanned against local security policies before reaching the editor.
- Only policy violations are logged. Normal traffic is not stored.
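The routing idea above can be sketched as a client request addressed to the local gateway rather than to the provider directly. The endpoint path and payload shape below assume an OpenAI-compatible API; Cencurity's actual internal request rewriting is not documented here, so treat this purely as an illustration:

```python
# Sketch only: the editor's LLM client talks to the local gateway, which
# scans and forwards traffic to the real provider. Endpoint path and
# payload shape are assumptions based on the OpenAI-compatible API style.
GATEWAY_BASE_URL = "http://127.0.0.1:38180"  # local security gateway
PROVIDER_BASE_URL = "https://api.x.ai"       # example provider from the steps

def build_chat_request(prompt: str, model: str) -> dict:
    """Build an OpenAI-compatible chat request addressed to the gateway."""
    return {
        "url": f"{GATEWAY_BASE_URL}/v1/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        # The API key is attached locally; the gateway forwards the request
        # to PROVIDER_BASE_URL, never to an external Cencurity server.
    }
```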
## Supported providers
- OpenAI
- Anthropic
- Gemini
- OpenRouter
- Any OpenAI-compatible LLM
## Command Palette commands
| Command | Description |
|---|---|
| `Cencurity: Enable Protection` | Activate protection and select an LLM provider |
| `Cencurity: Disable Protection` | Deactivate protection and restore the previous routing |
| `Cencurity: Open Security Center` | Open the security dashboard inside VS Code |
| `Cencurity: Test Protection` | Verify local proxy connectivity |
| `Cencurity: Show Runtime Info` | Check runtime and protection status |
| `Cencurity: Install or Update Core` | Install or update the local core runtime |
