diff --git a/README.md b/README.md index 9c745a47..301653f3 100644 --- a/README.md +++ b/README.md @@ -1,23 +1,35 @@ -![image](https://github.com/user-attachments/assets/ab37063d-039d-4857-be88-231047a7b282) - + + + CodeGate logo + [![CI](https://github.com/stacklok/codegate/actions/workflows/ci.yml/badge.svg)](https://github.com/stacklok/codegate/actions/workflows/ci.yml) -Codegate is a local gateway that makes AI coding assistants safer. Codegate ensures AI-generated recommendations adhere to best practices, while safeguarding your code's integrity, and protecting your individual privacy. With Codegate, you can confidently leverage AI in your development workflow without compromising security or productivity. Codegate is designed to work seamlessly with coding assistants, allowing you to safely enjoy all the benefits of AI code generation. +CodeGate is a local gateway that makes AI coding assistants safer. CodeGate +ensures AI-generated recommendations adhere to best practices, while +safeguarding your code's integrity, and protecting your individual privacy. With +CodeGate, you can confidently leverage AI in your development workflow without +compromising security or productivity. CodeGate is designed to work seamlessly +with coding assistants, allowing you to safely enjoy all the benefits of AI code +generation. -Codegate is developed by [Stacklok](https://stacklok.com), a group of security experts with many years of experience building developer friendly open source security software tools and platforms. +CodeGate is developed by [Stacklok](https://stacklok.com), a group of security +experts with many years of experience building developer friendly open source +security software tools and platforms. ## Experimental 🚧 -Codegate is **experimental** and **undergoing fast iterations of development**. +CodeGate is **experimental** and **undergoing fast iterations of development**. 
- Features may change frequently -- Expect possible bugs. +- Expect possible bugs - Contributions, feedback, and testing are highly encouraged and welcomed! -## ✨ Why Codegate? +## ✨ Why CodeGate? -In today's world where AI coding assistants are becoming ubiquitous, security can't be an afterthought. Codegate sits between you and AI, actively protecting your development process by: +In today's world where AI coding assistants are becoming ubiquitous, security +can't be an afterthought. CodeGate sits between you and AI, actively protecting +your development process by: - 🔒 Preventing accidental exposure of secrets and sensitive data - 🛡️ Ensuring AI suggestions follow secure coding practices @@ -26,25 +38,32 @@ In today's world where AI coding assistants are becoming ubiquitous, security ca ## 🌟 Features -### Supported AI Providers -Codegate works seamlessly with leading AI providers: +### Supported AI providers + +CodeGate works seamlessly with leading AI providers: + - 🤖 Anthropic (Claude) - 🧠 OpenAI - ⚡ vLLM - 💻 Local LLMs (run AI completely offline!) - 🔮 Many more on the way! -### AI Coding Assistants -We're starting with Continue VSCode extension support, with many more AI coding assistants coming soon! +### AI coding assistants + +We're starting with Continue VS Code extension support, with many more AI coding +assistants coming soon! + +### Privacy first + +Unlike E.T., your code never phones home! 🛸 CodeGate is designed with privacy +at its core: -### Privacy First -Unlike E.T., your code never phones home! 
🛸 Codegate is designed with privacy at its core: - 🏠 Everything stays on your machine - 🚫 No external data collection - 🔐 No calling home or telemetry - 💪 Complete control over your data -## 🚀 Quick Start +## 🚀 Quickstart ### Prerequisites @@ -52,41 +71,47 @@ Make sure you have these tools installed: - 🐳 [Docker](https://docs.docker.com/get-docker/) - 🛠️ [jq](https://stedolan.github.io/jq/download/) -- 💻 [VSCode](https://code.visualstudio.com/download) +- 💻 [VS Code](https://code.visualstudio.com/download) -### One-Command Setup +### One-command setup ```bash chmod +x install.sh && ./install.sh ``` This script will: -1. Install the Continue VSCode extension + +1. Install the Continue VS Code extension 2. Set up your configuration 3. Create and start necessary Docker services ## 🎯 Usage -### VSCode Integration with Continue +### VS Code integration with Continue + +Simply tap the Continue button in your VS Code editor to start chatting with +your AI assistant - now protected by CodeGate! -Simply tap the Continue button in your VSCode editor to start chatting with your AI assistant - now protected by Codegate! 
+![Continue chat interface](./static/image.png) -![Continue Chat Interface](./static/image.png) +### Manual configuration -### Manual Configuration +#### Basic server start -#### Basic Server Start ```bash codegate serve ``` -#### Custom Settings +#### Custom settings + ```bash codegate serve --port 8989 --host localhost --log-level DEBUG ``` -#### Using Config File +#### Using config file + Create a `config.yaml`: + ```yaml port: 8989 host: "localhost" @@ -94,11 +119,13 @@ log_level: "DEBUG" ``` Then run: + ```bash codegate serve --config config.yaml ``` -#### Environment Variables +#### Environment variables + ```bash export CODEGATE_APP_PORT=8989 export CODEGATE_APP_HOST=localhost @@ -108,7 +135,8 @@ codegate serve ## 🛠️ Development -### Local Setup +### Local setup + ```bash # Get the code git clone https://github.com/stacklok/codegate.git @@ -124,9 +152,9 @@ pip install -e ".[dev]" ### Running locally with several network interfaces -By default weaviate is picking the default route as the ip for the cluster nodes. It may cause -some issues when dealing with multiple interfaces. To make it work, localhost needs to be the -default route: +By default, Weaviate picks the default route as the IP for the cluster +nodes. This may cause issues when dealing with multiple interfaces. 
To make +it work, localhost needs to be the default route: ```bash sudo route delete default @@ -136,48 +164,62 @@ sudo route add -net 128.0.0.0/1 ``` ### Testing + ```bash pytest ``` -## 🐳 Docker Deployment +## 🐳 Docker deployment + +### Build the image -### Build the Image ```bash make image-build ``` -### Run the Container +### Run the container + ```bash # Basic usage with local image docker run -p 8989:8989 -p 9090:80 codegate:latest # With pre-built pulled image -docker pull ghcr.io/stacklok/codegate/codegate:latest -docker run --name codegate -d -p 8989:8989 -p 9090:80 ghcr.io/stacklok/codegate/codegate:latest +docker pull ghcr.io/stacklok/codegate:latest +docker run --name codegate -d -p 8989:8989 -p 9090:80 ghcr.io/stacklok/codegate:latest # It will mount a volume to /app/codegate_volume # The directory supports storing llama.cpp models under the subdirectory /models # A SQLite DB with the messages and alerts is stored under the subdirectory /db -docker run --name codegate -d -v /path/to/volume:/app/codegate_volume -p 8989:8989 -p 9090:80 ghcr.io/stacklok/codegate/codegate:latest +docker run --name codegate -d -v /path/to/volume:/app/codegate_volume -p 8989:8989 -p 9090:80 ghcr.io/stacklok/codegate:latest ``` ### Exposed parameters -- CODEGATE_VLLM_URL: URL for the inference engine (defaults to [https://inference.codegate.ai](https://inference.codegate.ai)) -- CODEGATE_OPENAI_URL: URL for OpenAI inference engine (defaults to [https://api.openai.com/v1](https://api.openai.com/v1)) -- CODEGATE_ANTHROPIC_URL: URL for Anthropic inference engine (defaults to [https://api.anthropic.com/v1](https://api.anthropic.com/v1)) -- CODEGATE_OLLAMA_URL: URL for OLlama inference engine (defaults to [http://localhost:11434/api](http://localhost:11434/api)) -- CODEGATE_APP_LOG_LEVEL: Level of debug desired when running the codegate server (defaults to WARNING, can be ERROR/WARNING/INFO/DEBUG) -- CODEGATE_LOG_FORMAT: Type of log formatting desired when running the codegate server 
(default to TEXT, can be JSON/TEXT) + +- CODEGATE_VLLM_URL: URL for the inference engine (defaults to + [https://inference.codegate.ai](https://inference.codegate.ai)) +- CODEGATE_OPENAI_URL: URL for OpenAI inference engine (defaults to + [https://api.openai.com/v1](https://api.openai.com/v1)) +- CODEGATE_ANTHROPIC_URL: URL for Anthropic inference engine (defaults to + [https://api.anthropic.com/v1](https://api.anthropic.com/v1)) +- CODEGATE_OLLAMA_URL: URL for Ollama inference engine (defaults to + [http://localhost:11434/api](http://localhost:11434/api)) +- CODEGATE_APP_LOG_LEVEL: Log level to use when running the CodeGate + server (defaults to WARNING, can be ERROR/WARNING/INFO/DEBUG) +- CODEGATE_LOG_FORMAT: Log format to use when running the CodeGate + server (defaults to TEXT, can be JSON/TEXT) ```bash -docker run -p 8989:8989 -p 9090:80 -e CODEGATE_OLLAMA_URL=http://1.2.3.4:11434/api ghcr.io/stacklok/codegate/codegate:latest +docker run -p 8989:8989 -p 9090:80 -e CODEGATE_OLLAMA_URL=http://1.2.3.4:11434/api ghcr.io/stacklok/codegate:latest ``` ## 🤝 Contributing -We welcome contributions! Whether it's bug reports, feature requests, or code contributions, please feel free to contribute to making Codegate better. +We welcome contributions! Whether it's bug reports, feature requests, or code +contributions, please feel free to contribute to making CodeGate better. ## 📜 License -This project is licensed under the terms specified in the [LICENSE](LICENSE) file. +This project is licensed under the terms specified in the [LICENSE](LICENSE) +file. 
+ + diff --git a/docs/cli.md b/docs/cli.md index 6be81b79..83c3d6aa 100644 --- a/docs/cli.md +++ b/docs/cli.md @@ -1,19 +1,19 @@ -# CLI Commands and Flags +# CLI commands and flags -Codegate provides a command-line interface through `cli.py` with the following +CodeGate provides a command-line interface through `cli.py` with the following structure: -## Main Command +## Main command ```bash codegate [OPTIONS] COMMAND [ARGS]... ``` -## Available Commands +## Available commands -### serve +### `serve` -Start the Codegate server: +Start the CodeGate server: ```bash codegate serve [OPTIONS] @@ -21,49 +21,52 @@ codegate serve [OPTIONS] #### Options -- `--port INTEGER`: Port to listen on (default: 8989) +- `--port INTEGER`: Port to listen on (default: `8989`) - Must be between 1 and 65535 - Overrides configuration file and environment variables - -- `--host TEXT`: Host to bind to (default: localhost) + +- `--host TEXT`: Host to bind to (default: `localhost`) - Overrides configuration file and environment variables - -- `--log-level [ERROR|WARNING|INFO|DEBUG]`: Set the log level (default: INFO) + +- `--log-level [ERROR|WARNING|INFO|DEBUG]`: Set the log level (default: `INFO`) - Optional - Case-insensitive - Overrides configuration file and environment variables - -- `--log-format [JSON|TEXT]`: Set the log format (default: JSON) + +- `--log-format [JSON|TEXT]`: Set the log format (default: `JSON`) - Optional - Case-insensitive - Overrides configuration file and environment variables - + - `--config FILE`: Path to YAML config file - Optional - Must be a valid YAML file - - Configuration values can be overridden by environment variables and CLI options + - Configuration values can be overridden by environment variables and CLI + options - `--prompts FILE`: Path to YAML prompts file - Optional - Must be a valid YAML file - Overrides default prompts and configuration file prompts -- `--vllm-url TEXT`: vLLM provider URL (default: http://localhost:8000) +- `--vllm-url TEXT`: 
vLLM provider URL (default: `http://localhost:8000`) - Optional - Base URL for vLLM provider (/v1 path is added automatically) - Overrides configuration file and environment variables -- `--openai-url TEXT`: OpenAI provider URL (default: https://api.openai.com/v1) +- `--openai-url TEXT`: OpenAI provider URL (default: + `https://api.openai.com/v1`) - Optional - Base URL for OpenAI provider - Overrides configuration file and environment variables -- `--anthropic-url TEXT`: Anthropic provider URL (default: https://api.anthropic.com/v1) +- `--anthropic-url TEXT`: Anthropic provider URL (default: + `https://api.anthropic.com/v1`) - Optional - Base URL for Anthropic provider - Overrides configuration file and environment variables -- `--ollama-url TEXT`: Ollama provider URL (default: `http://localhost:11434`) - Optional - Base URL for Ollama provider (/api path is added automatically) - Overrides configuration file and environment variables @@ -74,11 +77,12 @@ codegate serve [OPTIONS] - `--embedding-model TEXT`: Name of the model used for embeddings - Optional -- `--db-path TEXT`: Path to a SQLite DB. It will create one if it doesn't exist. (default: ./codegate_volume/db/codegate.db) +- `--db-path TEXT`: Path to a SQLite DB. Will be created if it doesn't exist. + (default: `./codegate_volume/db/codegate.db`) - Optional - Overrides configuration file and environment variables -### show-prompts +### `show-prompts` Display the loaded system prompts: @@ -91,9 +95,9 @@ codegate show-prompts [OPTIONS] #### Options - `--prompts FILE`: Path to YAML prompts file - Optional - Must be a valid YAML file - - If not provided, shows default prompts from prompts/default.yaml + - If not provided, shows default prompts from `prompts/default.yaml` -### generate_certs +### `generate-certs` Generate certificates for the CodeGate server. 
@@ -103,23 +107,28 @@ codegate generate-certs [OPTIONS] #### Options -- `--certs-out-dir PATH`: Directory path where the certificates are going to be generated. (default: ./codegate_volume/certs) +- `--certs-out-dir PATH`: Directory path where the certificates are generated + (default: `./codegate_volume/certs`) - Optional - Overrides configuration file and environment variables -- `--ca-cert-name TEXT`: Name that will be given to the created CA certificate. (default: ca.crt) +- `--ca-cert-name TEXT`: Name that will be given to the created CA certificate + (default: `ca.crt`) - Optional - Overrides configuration file and environment variables -- `--ca-key-name TEXT`: Name that will be given to the created CA key. (default: ca.key) +- `--ca-key-name TEXT`: Name that will be given to the created CA key (default: + `ca.key`) - Optional - Overrides configuration file and environment variables -- `--server-cert-name TEXT`: Name that will be given to the created server certificate. (default: server.crt) +- `--server-cert-name TEXT`: Name that will be given to the created server + certificate (default: `server.crt`) - Optional - Overrides configuration file and environment variables -- `--server-key-name TEXT`: Name that will be given to the created server key. (default: server.key) +- `--server-key-name TEXT`: Name that will be given to the created server key + (default: `server.key`) - Optional - Overrides configuration file and environment variables @@ -127,15 +136,16 @@ codegate generate-certs [OPTIONS] - Optional - Case-insensitive - Overrides configuration file and environment variables - + - `--log-format [JSON|TEXT]`: Set the log format (default: JSON) - Optional - Case-insensitive - Overrides configuration file and environment variables -## Error Handling +## Error handling The CLI provides user-friendly error messages for: + - Invalid port numbers - Invalid log levels - Invalid log formats @@ -148,51 +158,63 @@ All errors are output to stderr with appropriate exit codes. 
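The validation rules above (port range, case-insensitive log levels) can be sketched in plain Python. This is a hypothetical illustration of the checks the CLI performs, not CodeGate's actual implementation; the function names are invented for the example:

```python
def validate_port(value: int) -> int:
    # Ports outside the valid TCP range are rejected; the CLI surfaces
    # such errors on stderr and exits with a nonzero status.
    if not 1 <= value <= 65535:
        raise ValueError(f"invalid port: {value} (must be between 1 and 65535)")
    return value


def validate_log_level(value: str) -> str:
    # Log levels are matched case-insensitively, so "debug" and "DEBUG"
    # are both accepted and normalized to upper case.
    allowed = {"ERROR", "WARNING", "INFO", "DEBUG"}
    normalized = value.upper()
    if normalized not in allowed:
        raise ValueError(f"invalid log level: {value}")
    return normalized
```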
## Examples Start server with default settings: + ```bash codegate serve ``` Start server on specific port and host: + ```bash codegate serve --port 8989 --host localhost ``` Start server with custom logging: + ```bash codegate serve --log-level DEBUG --log-format TEXT ``` Start server with configuration file: + ```bash codegate serve --config my-config.yaml ``` Start server with custom prompts: + ```bash codegate serve --prompts my-prompts.yaml ``` Start server with custom vLLM endpoint: + ```bash codegate serve --vllm-url https://vllm.example.com ``` Start server with custom Ollama endpoint: + ```bash codegate serve --ollama-url http://localhost:11434 ``` Show default system prompts: + ```bash codegate show-prompts ``` Show prompts from a custom file: + ```bash codegate show-prompts --prompts my-prompts.yaml ``` Generate certificates with default settings: + ```bash codegate generate-certs -``` \ No newline at end of file +``` + + diff --git a/docs/configuration.md b/docs/configuration.md index 9daf66de..67058151 100644 --- a/docs/configuration.md +++ b/docs/configuration.md @@ -1,37 +1,44 @@ -# Configuration System +# Configuration system -The configuration system in Codegate is managed through the `Config` class in `config.py`. It supports multiple configuration sources with a clear priority order. +The configuration system in CodeGate is managed through the `Config` class in +`config.py`. It supports multiple configuration sources with a clear priority +order. -## Configuration Priority (highest to lowest) +## Configuration priority + +Configuration sources are evaluated in the following order, from highest to +lowest priority: 1. CLI arguments 2. Environment variables -3. Config file (YAML) -4. Default values (including default prompts from prompts/default.yaml) +3. Configuration file (YAML) +4. Default values (including default prompts from `prompts/default.yaml`) + +Values from higher-priority sources take precedence over lower-priority values. 
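The priority order above amounts to a layered merge: start from defaults, then let each higher-priority source overwrite any values it actually sets. A minimal sketch (a hypothetical illustration; the real `Config` class in `config.py` may implement this differently):

```python
def resolve_config(defaults, config_file, env_vars, cli_args):
    # Apply sources from lowest to highest priority; later sources
    # override earlier ones, and unset (None) values are skipped.
    merged = dict(defaults)
    for source in (config_file, env_vars, cli_args):
        merged.update({k: v for k, v in source.items() if v is not None})
    return merged


settings = resolve_config(
    defaults={"port": 8989, "host": "localhost"},
    config_file={"port": 8990},
    env_vars={"host": "0.0.0.0"},
    cli_args={"port": 9000},
)
# The CLI port wins over the file's port; the env var host wins over the default.
```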
-## Default Configuration Values +## Default configuration values -- Port: 8989 -- Proxy Port: 8990 -- Host: "localhost" -- Log Level: "INFO" -- Log Format: "JSON" -- Prompts: Default prompts from prompts/default.yaml +- Port: `8989` +- Proxy port: `8990` +- Host: `"localhost"` +- Log level: `"INFO"` +- Log format: `"JSON"` +- Prompts: default prompts from `prompts/default.yaml` - Provider URLs: - - vLLM: "http://localhost:8000" - - OpenAI: "https://api.openai.com/v1" - - Anthropic: "https://api.anthropic.com/v1" - - Ollama: "http://localhost:11434" -- Certificate Configuration: - - Certs Directory: "./certs" - - CA Certificate: "ca.crt" - - CA Key: "ca.key" - - Server Certificate: "server.crt" - - Server Key: "server.key" + - vLLM: `"http://localhost:8000"` + - OpenAI: `"https://api.openai.com/v1"` + - Anthropic: `"https://api.anthropic.com/v1"` + - Ollama: `"http://localhost:11434"` +- Certificate configuration: + - Certs directory: `"./certs"` + - CA certificate: `"ca.crt"` + - CA private key: `"ca.key"` + - Server certificate: `"server.crt"` + - Server private key: `"server.key"` -## Configuration Methods +## Configuration methods -### From File +### Configuration file Load configuration from a YAML file: @@ -40,6 +47,7 @@ config = Config.from_file("config.yaml") ``` Example config.yaml: + ```yaml port: 8989 proxy_port: 8990 @@ -58,51 +66,54 @@ server_cert: "server.crt" server_key: "server.key" ``` -### From Environment Variables +### Environment variables Environment variables are automatically loaded with these mappings: -- `CODEGATE_APP_PORT`: Server port -- `CODEGATE_APP_PROXY_PORT`: Server proxy port -- `CODEGATE_APP_HOST`: Server host -- `CODEGATE_APP_LOG_LEVEL`: Logging level -- `CODEGATE_LOG_FORMAT`: Log format -- `CODEGATE_PROMPTS_FILE`: Path to prompts YAML file +- `CODEGATE_APP_PORT`: server port +- `CODEGATE_APP_PROXY_PORT`: server proxy port +- `CODEGATE_APP_HOST`: server host +- `CODEGATE_APP_LOG_LEVEL`: logging level +- `CODEGATE_LOG_FORMAT`: log 
format +- `CODEGATE_PROMPTS_FILE`: path to prompts YAML file - `CODEGATE_PROVIDER_VLLM_URL`: vLLM provider URL - `CODEGATE_PROVIDER_OPENAI_URL`: OpenAI provider URL - `CODEGATE_PROVIDER_ANTHROPIC_URL`: Anthropic provider URL - `CODEGATE_PROVIDER_OLLAMA_URL`: Ollama provider URL -- `CODEGATE_CERTS_DIR`: Directory for certificate files +- `CODEGATE_CERTS_DIR`: directory for certificate files - `CODEGATE_CA_CERT`: CA certificate file name - `CODEGATE_CA_KEY`: CA key file name -- `CODEGATE_SERVER_CERT`: Server certificate file name -- `CODEGATE_SERVER_KEY`: Server key file name +- `CODEGATE_SERVER_CERT`: server certificate file name +- `CODEGATE_SERVER_KEY`: server key file name ```python config = Config.from_env() ``` -## Configuration Options +## Configuration options -### Network Settings +### Network settings Network settings can be configured in several ways: -1. In Configuration File: +1. Configuration file: + ```yaml - port: 8989 # Port to listen on (1-65535) - proxy_port: 8990 # Proxy port to listen on (1-65535) - host: "localhost" # Host to bind to + port: 8989 # Port to listen on (1-65535) + proxy_port: 8990 # Proxy port to listen on (1-65535) + host: "localhost" # Host to bind to ``` -2. Via Environment Variables: +2. Environment variables: + ```bash export CODEGATE_APP_PORT=8989 export CODEGATE_APP_PROXY_PORT=8990 export CODEGATE_APP_HOST=localhost ``` -3. Via CLI Flags: +3. CLI flags: + ```bash codegate serve --port 8989 --proxy-port 8990 --host localhost ``` @@ -111,16 +122,18 @@ Network settings can be configured in several ways: Provider URLs can be configured in several ways: -1. In Configuration File: +1. 
Configuration file: + ```yaml provider_urls: - vllm: "https://vllm.example.com" # /v1 path is added automatically + vllm: "https://vllm.example.com" # /v1 path is added automatically openai: "https://api.openai.com/v1" anthropic: "https://api.anthropic.com/v1" - ollama: "http://localhost:11434" # /api path is added automatically + ollama: "http://localhost:11434" # /api path is added automatically ``` -2. Via Environment Variables: +2. Environment variables: + ```bash export CODEGATE_PROVIDER_VLLM_URL=https://vllm.example.com export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1 @@ -128,20 +141,25 @@ Provider URLs can be configured in several ways: export CODEGATE_PROVIDER_OLLAMA_URL=http://localhost:11434 ``` -3. Via CLI Flags: +3. CLI flags: + ```bash codegate serve --vllm-url https://vllm.example.com --ollama-url http://localhost:11434 ``` -Note: -- For the vLLM provider, the /v1 path is automatically appended to the base URL if not present. -- For the Ollama provider, the /api path is automatically appended to the base URL if not present. +Note: + +- For the vLLM provider, the `/v1` path is automatically appended to the base + URL if not present. +- For the Ollama provider, the `/api` path is automatically appended to the base + URL if not present. -### Certificate Configuration +### Certificate configuration Certificate files can be configured in several ways: -1. In Configuration File: +1. Configuration file: + ```yaml certs_dir: "./certs" ca_cert: "ca.crt" @@ -150,7 +168,8 @@ Certificate files can be configured in several ways: server_key: "server.key" ``` -2. Via Environment Variables: +2. Environment variables: + ```bash export CODEGATE_CERTS_DIR=./certs export CODEGATE_CA_CERT=ca.crt @@ -159,12 +178,13 @@ Certificate files can be configured in several ways: export CODEGATE_SERVER_KEY=server.key ``` -3. Via CLI Flags: +3. 
CLI flags: + ```bash codegate serve --certs-dir ./certs --ca-cert ca.crt --ca-key ca.key --server-cert server.crt --server-key server.key ``` -### Log Levels +### Log levels Available log levels (case-insensitive): @@ -173,22 +193,24 @@ Available log levels (case-insensitive): - `INFO` - `DEBUG` -### Log Formats +### Log formats Available log formats (case-insensitive): - `JSON` - `TEXT` -### Prompts Configuration +### Prompts configuration Prompts can be configured in several ways: -1. Default Prompts: - - Located in prompts/default.yaml +1. Default prompts: + + - Located in `prompts/default.yaml` - Loaded automatically if no other prompts are specified -2. In Configuration File: +2. Configuration file: + ```yaml # Option 1: Direct prompts definition prompts: @@ -199,40 +221,49 @@ Prompts can be configured in several ways: prompts: "path/to/prompts.yaml" ``` -3. Via Environment Variable: +3. Environment variable: + ```bash export CODEGATE_PROMPTS_FILE=path/to/prompts.yaml ``` -4. Via CLI Flag: +4. CLI flag: + ```bash codegate serve --prompts path/to/prompts.yaml ``` -### Prompts File Format +### Prompts file format -Prompts files should be in YAML format with string values: +Prompts files are defined in YAML format with string values: ```yaml prompt_name: "Prompt text content" + another_prompt: "More prompt text" + +multiline_prompt: | + This is a multi-line prompt. + It can span multiple lines. 
``` Access prompts in code: + ```python config = Config.load() prompt = config.prompts.prompt_name ``` -## Error Handling +## Error handling -The configuration system uses a custom `ConfigurationError` exception for handling configuration-related errors, such as: +The configuration system uses a custom `ConfigurationError` exception for +handling configuration-related errors, such as: - Invalid port numbers (must be between 1 and 65535) - Invalid proxy port numbers (must be between 1 and 65535) -- Invalid log levels -- Invalid log formats +- Invalid [log levels](#log-levels) +- Invalid [log formats](#log-formats) - YAML parsing errors - File reading errors - Invalid prompt values (must be strings) -- Missing or invalid prompts files +- Missing or invalid [prompts files](#prompts-file-format) diff --git a/docs/development.md b/docs/development.md index 791b54a1..f1ccb0f8 100644 --- a/docs/development.md +++ b/docs/development.md @@ -1,43 +1,52 @@ -This guide provides comprehensive information for developers working on the Codegate project. +# Development guide -## Project Overview +This guide provides comprehensive information for developers working on the +CodeGate project. + +## Project overview + +CodeGate is a configurable generative AI gateway designed to protect developers +from potential AI-related security risks. Key features include: -Codegate is a configurable Generative AI gateway designed to protect developers from potential AI-related security risks. 
Key features include: - Secrets exfiltration prevention - Secure coding recommendations - Prevention of AI recommending deprecated/malicious libraries - Modular system prompts configuration - Multiple AI provider support with configurable endpoints -## Development Setup +## Development setup ### Prerequisites - Python 3.11 or higher -- [Poetry](https://python-poetry.org/docs/#installation) for dependency management -- [Docker](https://docs.docker.com/get-docker/) (for containerized deployment) -- or -- [PodMan](https://podman.io/getting-started/installation) (for containerized deployment) -- [VSCode](https://code.visualstudio.com/download) (recommended IDE) +- [Poetry](https://python-poetry.org/docs/#installation) for dependency + management +- [Docker](https://docs.docker.com/get-docker/) or + [Podman](https://podman.io/getting-started/installation) (for containerized + deployment) +- [Visual Studio Code](https://code.visualstudio.com/download) (recommended IDE) -### Initial Setup +### Initial setup 1. Clone the repository: + ```bash git clone https://github.com/stacklok/codegate.git cd codegate ``` -2. Install Poetry following the [official installation guide](https://python-poetry.org/docs/#installation) +2. Install Poetry following the + [official installation guide](https://python-poetry.org/docs/#installation) 3. Install project dependencies: + ```bash poetry install --with dev ``` -## Project Structure +## Project structure -``` +```plain codegate/ ├── pyproject.toml # Project configuration and dependencies ├── poetry.lock # Lock file (committed to version control) @@ -61,9 +70,9 @@ codegate/ └── docs/ # Documentation ``` -## Development Workflow +## Development workflow -### 1. Environment Management +### 1. 
Environment management Poetry commands for managing your development environment: @@ -75,21 +84,24 @@ Poetry commands for managing your development environment: - `poetry show`: List all installed packages - `poetry env info`: Show information about the virtual environment -### 2. Code Style and Quality +### 2. Code style and quality The project uses several tools to maintain code quality: -- **Black** for code formatting: +- [**Black**](https://black.readthedocs.io/en/stable/) for code formatting: + ```bash poetry run black . ``` -- **Ruff** for linting: +- [**Ruff**](https://docs.astral.sh/ruff/) for linting: + ```bash poetry run ruff check . ``` -- **Bandit** for security checks: +- [**Bandit**](https://bandit.readthedocs.io/) for security checks: + ```bash poetry run bandit -r src/ ``` @@ -97,85 +109,92 @@ The project uses several tools to maintain code quality: ### 3. Testing Run the test suite with coverage: + ```bash poetry run pytest ``` -Tests are located in the `tests/` directory and follow the same structure as the source code. +Tests are located in the `tests/` directory and follow the same structure as the +source code. -### 4. Make Commands +### 4. 
Make commands The project includes a Makefile for common development tasks: -- `make install`: Install all dependencies -- `make format`: Format code using black and ruff -- `make lint`: Run linting checks -- `make test`: Run tests with coverage -- `make security`: Run security checks -- `make build`: Build distribution packages -- `make all`: Run all checks and build (recommended before committing) +- `make install`: install all dependencies +- `make format`: format code using black and ruff +- `make lint`: run linting checks +- `make test`: run tests with coverage +- `make security`: run security checks +- `make build`: build distribution packages +- `make all`: run all checks and build (recommended before committing) -## Configuration System +## Configuration system -Codegate uses a hierarchical configuration system with the following priority (highest to lowest): +CodeGate uses a hierarchical configuration system with the following priority +(highest to lowest): 1. CLI arguments 2. Environment variables 3. Config file (YAML) 4. Default values (including default prompts) -### Configuration Options +### Configuration options -- Port: Server port (default: 8989) -- Host: Server host (default: "localhost") -- Log Level: Logging level (ERROR|WARNING|INFO|DEBUG) -- Log Format: Log format (JSON|TEXT) -- Prompts: System prompts configuration +- Port: server port (default: `8989`) +- Host: server host (default: `"localhost"`) +- Log level: logging verbosity level (`ERROR`|`WARNING`|`INFO`|`DEBUG`) +- Log format: log format (`JSON`|`TEXT`) +- Prompts: system prompts configuration - Provider URLs: AI provider endpoint configuration -See [Configuration Documentation](configuration.md) for detailed information. +See [Configuration system](configuration.md) for detailed information. + +## Working with providers -## Working with Providers +CodeGate supports multiple AI providers through a modular provider system. 
-Codegate supports multiple AI providers through a modular provider system.
+### Available providers

-### Available Providers
+1. **vLLM provider**

-1. **vLLM Provider**
-   - Default URL: http://localhost:8000
-   - Supports OpenAI-compatible API
-   - Automatically adds /v1 path to base URL
-   - Model names are prefixed with "hosted_vllm/"
+   - Default URL: `http://localhost:8000`
+   - Supports OpenAI-compatible APIs
+   - Automatically adds `/v1` path to base URL
+   - Model names are prefixed with `hosted_vllm/`

-2. **OpenAI Provider**
-   - Default URL: https://api.openai.com/v1
+2. **OpenAI provider**
+
+   - Default URL: `https://api.openai.com/v1`
   - Standard OpenAI API implementation

-3. **Anthropic Provider**
-   - Default URL: https://api.anthropic.com/v1
+3. **Anthropic provider**
+
+   - Default URL: `https://api.anthropic.com/v1`
   - Anthropic Claude API implementation

-4. **Ollama Provider**
-   - Default URL: http://localhost:11434
+4. **Ollama provider**
+   - Default URL: `http://localhost:11434`
   - Endpoints:
-     * Native Ollama API: `/ollama/api/chat`
-     * OpenAI-compatible: `/ollama/chat/completions`
-     ```
+     - Native Ollama API: `/ollama/api/chat`
+     - OpenAI-compatible: `/ollama/chat/completions`

-### Configuring Providers
+### Configuring providers

Provider URLs can be configured through:

1. Config file (config.yaml):
+
   ```yaml
   provider_urls:
     vllm: "https://vllm.example.com"
     openai: "https://api.openai.com/v1"
     anthropic: "https://api.anthropic.com/v1"
-    ollama: "http://localhost:11434"  # /api path added automatically
+    ollama: "http://localhost:11434" # /api path added automatically
   ```

2. Environment variables:
+
   ```bash
   export CODEGATE_PROVIDER_VLLM_URL=https://vllm.example.com
   export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
@@ -184,11 +203,12 @@ Provider URLs can be configured through:
   ```

3. CLI flags:
+
   ```bash
   codegate serve --vllm-url https://vllm.example.com --ollama-url http://localhost:11434
   ```

-### Implementing New Providers
+### Implementing new providers

To add a new provider:

@@ -199,6 +219,7 @@ To add a new provider:
   - `__init__.py`: Export provider class

Example structure:
+
   ```python
   from codegate.providers.base import BaseProvider
@@ -221,35 +242,39 @@ class NewProvider(BaseProvider):
        pass
   ```

-## Working with Prompts
+## Working with prompts

-### Default Prompts
+### Default prompts

-Default prompts are stored in `prompts/default.yaml`. These prompts are loaded automatically when no other prompts are specified.
+Default prompts are stored in `prompts/default.yaml`. These prompts are loaded
+automatically when no other prompts are specified.

-### Creating Custom Prompts
+### Creating custom prompts

1. Create a new YAML file following the format:
+
   ```yaml
   prompt_name: "Prompt text content"
   another_prompt: "More prompt text"
   ```

2. Use the prompts file:
+
   ```bash
   # Via CLI
   codegate serve --prompts my-prompts.yaml

-  # Or in config.yaml
+  # Via config.yaml
   prompts: "path/to/prompts.yaml"

-  # Or via environment
+  # Via environment
   export CODEGATE_PROMPTS_FILE=path/to/prompts.yaml
   ```

-### Testing Prompts
+### Testing prompts

1. View loaded prompts:
+
   ```bash
   # Show default prompts
   codegate show-prompts
@@ -259,13 +284,14 @@ Default prompts are stored in `prompts/default.yaml`. These prompts are loaded a
   ```

2. Write tests for prompt functionality:
+
   ```python
   def test_custom_prompts():
       config = Config.load(prompts_path="path/to/test/prompts.yaml")
       assert config.prompts.my_prompt == "Expected prompt text"
   ```

-## CLI Interface
+## CLI interface

The main command-line interface is implemented in `cli.py`. Basic usage:

@@ -283,4 +309,4 @@ codegate serve --prompts my-prompts.yaml
codegate serve --vllm-url https://vllm.example.com
```

-See [CLI Documentation](cli.md) for detailed command information.
\ No newline at end of file
+See [CLI commands and flags](cli.md) for detailed command information.
diff --git a/docs/logging.md b/docs/logging.md
index 885a6d7e..a88cb212 100644
--- a/docs/logging.md
+++ b/docs/logging.md
@@ -1,17 +1,18 @@
-# Logging System
+# Logging system

-The logging system in Codegate (`codegate_logging.py`) provides a flexible and structured logging solution with support for both JSON and text formats.
+The logging system in CodeGate (`codegate_logging.py`) provides a flexible and
+structured logging solution with support for both JSON and text output.

-## Log Routing
+## Log routing

Logs are automatically routed based on their level:

- **stdout**: INFO and DEBUG messages
- **stderr**: ERROR, CRITICAL, and WARNING messages

-## Log Formats
+## Log formats

-### JSON Format
+### JSON format

When using JSON format (default), log entries include:

@@ -27,26 +28,27 @@ When using JSON format (default), log entries include:
}
```

-### Text Format
+### Text format

When using text format, log entries follow this pattern:

-```
+```plain
YYYY-MM-DDThh:mm:ss.mmmZ - LEVEL - NAME - MESSAGE
```

## Features

-- **Consistent Timestamps**: ISO-8601 format with millisecond precision in UTC
-- **Automatic JSON Serialization**: Extra fields are automatically serialized to JSON
-- **Non-serializable Handling**: Graceful handling of non-serializable values
-- **Exception Support**: Full exception and stack trace integration
-- **Dual Output**: Separate handlers for error and non-error logs
-- **Configurable Levels**: Support for ERROR, WARNING, INFO, and DEBUG levels
+- **Consistent timestamps**: ISO-8601 format with millisecond precision in UTC
+- **Automatic JSON serialization**: extra fields are automatically serialized to
+  JSON
+- **Non-serializable handling**: graceful handling of non-serializable values
+- **Exception support**: full exception and stack trace integration
+- **Dual output**: separate handlers for error and non-error logs
+- **Configurable
+  levels**: support for ERROR, WARNING, INFO, and DEBUG levels

-## Usage Examples
+## Usage examples

-### Basic Logging
+### Basic logging

```python
import structlog

@@ -60,7 +62,7 @@ logger.error("This is an error message")
logger.warning("This is a warning message")
```

-### Logging with Extra Fields
+### Logging with extra fields

```python
logger.info("Server starting", extra={
@@ -70,7 +72,7 @@ logger.info("Server starting", extra={
})
```

-### Exception Logging
+### Exception logging

```python
try:
@@ -85,31 +87,36 @@ except Exception as e:

The logging system can be configured through:

1. CLI arguments:
+
   ```bash
   codegate serve --log-level DEBUG --log-format TEXT
   ```

2. Environment variables:
+
   ```bash
   export APP_LOG_LEVEL=DEBUG
   export CODEGATE_LOG_FORMAT=TEXT
   ```

3. Configuration file:
+
   ```yaml
   log_level: DEBUG
   log_format: TEXT
   ```

-## Best Practices
+## Best practices

1. Always use the appropriate log level:
-   - ERROR: For errors that need immediate attention
-   - WARNING: For potentially harmful situations
-   - INFO: For general operational information
-   - DEBUG: For detailed information useful during development
+
+   - `ERROR` for errors that need immediate attention
+   - `WARNING` for potentially harmful situations
+   - `INFO` for general operational information
+   - `DEBUG` for detailed information useful during development

2. Include relevant context in extra fields:
+
   ```python
   logger.info("User action", extra={
       "user_id": "123",
@@ -118,6 +125,7 @@ The logging system can be configured through:
   })
   ```

-3. Use structured logging with JSON format in production for better log aggregation and analysis
+3. Use structured logging with [JSON format](#json-format) in production for
+   better log aggregation and analysis.

-4. Enable DEBUG level logging during development for maximum visibility
+4. Enable `DEBUG` level logging during development for maximum visibility.
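The log routing rule described above (INFO and DEBUG to stdout, WARNING and above to stderr) can be approximated with Python's standard `logging` module. This is an illustrative sketch under stated assumptions, not CodeGate's actual `codegate_logging.py` implementation; `build_demo_logger` is a hypothetical helper name.

```python
import logging
import sys


def build_demo_logger(name: str = "demo") -> logging.Logger:
    """Sketch of dual-output routing: INFO/DEBUG -> stdout, WARNING+ -> stderr."""
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.handlers.clear()

    # stdout handler: a filter passes only DEBUG and INFO records
    stdout_handler = logging.StreamHandler(sys.stdout)
    stdout_handler.addFilter(lambda record: record.levelno <= logging.INFO)

    # stderr handler: its level threshold admits WARNING, ERROR, CRITICAL
    stderr_handler = logging.StreamHandler(sys.stderr)
    stderr_handler.setLevel(logging.WARNING)

    # text-format pattern resembling the documented one
    formatter = logging.Formatter(
        "%(asctime)s - %(levelname)s - %(name)s - %(message)s"
    )
    for handler in (stdout_handler, stderr_handler):
        handler.setFormatter(formatter)
        logger.addHandler(handler)
    return logger
```

With such a logger, `logger.info("ready")` would land on stdout while `logger.error("boom")` goes to stderr, matching the routing table in the docs.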
diff --git a/static/codegate-logo-dark.svg b/static/codegate-logo-dark.svg
new file mode 100644
index 00000000..dec54fcc
--- /dev/null
+++ b/static/codegate-logo-dark.svg
@@ -0,0 +1,22 @@
diff --git a/static/codegate-logo-white.svg b/static/codegate-logo-white.svg
new file mode 100644
index 00000000..417bce7b
--- /dev/null
+++ b/static/codegate-logo-white.svg
@@ -0,0 +1,22 @@