Danube Connect provides a batteries-included connector ecosystem for Danube Messaging, enabling seamless integration with external systems without compromising the safety, stability, or binary size of the core broker.
- 🔌 Plug-and-Play Connectors - Ready-to-use integrations for popular systems
- 🦀 Pure Rust - Memory-safe, high-performance connector framework
- 🔄 Bidirectional - Support for both source and sink connectors
- 📦 Modular - Clean separation between framework and connector implementations
- 🚀 Cloud Native - Docker-first with Kubernetes support
- 📊 Observable - Built-in metrics, tracing, and health checks
- ⚡ High Performance - Batching, connection pooling, and parallel processing
External Systems ↔ Connectors ↔ danube-connect-core ↔ danube-client ↔ Danube Broker
Connectors run as standalone processes, communicating with Danube brokers via gRPC. This ensures:
- Isolation: Connector failures don't impact the broker
- Scalability: Horizontal scaling of connectors
- Flexibility: Mix and match connectors as needed
Run a connector using Docker:
docker run -e DANUBE_SERVICE_URL=http://localhost:6650 \
-e DANUBE_TOPIC=/default/events \
-e SUBSCRIPTION_NAME=my-sink \
danube-connect/sink-http:latest
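
The quick start above passes all connector settings as environment variables. As a rough sketch of that pattern (the struct and field names below are illustrative only, not the danube-connect-core configuration API), a connector binary could read them like this:

```rust
use std::env;

/// Connector settings mirroring the environment variables from the
/// `docker run` example above (the variable names are the same; this
/// struct is illustrative, not the framework's configuration type).
#[derive(Debug)]
struct ConnectorConfig {
    service_url: String,
    topic: String,
    subscription_name: String,
}

impl ConnectorConfig {
    /// Read settings from the environment, falling back to local
    /// development defaults when a variable is unset.
    fn from_env() -> Self {
        Self {
            service_url: env::var("DANUBE_SERVICE_URL")
                .unwrap_or_else(|_| "http://localhost:6650".to_string()),
            topic: env::var("DANUBE_TOPIC")
                .unwrap_or_else(|_| "/default/events".to_string()),
            subscription_name: env::var("SUBSCRIPTION_NAME")
                .unwrap_or_else(|_| "my-sink".to_string()),
        }
    }
}

fn main() {
    let config = ConnectorConfig::from_env();
    println!("starting connector with {config:?}");
}
```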
Create a new connector:

# Clone the repository
git clone https://github.com/danube-messaging/danube-connect
cd danube-connect
# Create a new connector
cd connectors
cargo new --bin sink-mydb
# Implement the SinkConnector trait
# See info/connector-development-guide.md for details
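
The `SinkConnector` trait itself is defined in danube-connect-core and documented in info/connector-development-guide.md. The sketch below only illustrates the general shape a `sink-mydb` implementation might take; the trait definition, method names (`open`, `write_batch`, `close`), and helper types are assumptions for illustration, not the real API:

```rust
/// A message handed to the sink (payload plus minimal metadata).
/// Hypothetical type; the framework provides its own record type.
struct SinkRecord {
    topic: String,
    payload: Vec<u8>,
}

/// Error type a connector can surface back to the framework (hypothetical).
#[derive(Debug)]
struct SinkError(String);

/// Assumed trait shape: open a connection, write batches, shut down.
trait SinkConnector {
    fn open(&mut self) -> Result<(), SinkError>;
    fn write_batch(&mut self, records: &[SinkRecord]) -> Result<(), SinkError>;
    fn close(&mut self) -> Result<(), SinkError>;
}

/// Skeleton for the hypothetical `sink-mydb` connector created above.
struct MyDbSink {
    connection_string: String,
}

impl SinkConnector for MyDbSink {
    fn open(&mut self) -> Result<(), SinkError> {
        // Establish the database connection / connection pool here.
        println!("connecting to {}", self.connection_string);
        Ok(())
    }

    fn write_batch(&mut self, records: &[SinkRecord]) -> Result<(), SinkError> {
        // Map Danube messages to database writes and insert them as a batch.
        for record in records {
            println!("writing {} bytes from {}", record.payload.len(), record.topic);
        }
        Ok(())
    }

    fn close(&mut self) -> Result<(), SinkError> {
        // Flush any buffered records and release resources.
        Ok(())
    }
}

fn main() -> Result<(), SinkError> {
    let mut sink = MyDbSink {
        connection_string: "postgres://localhost/mydb".to_string(),
    };
    sink.open()?;
    sink.write_batch(&[SinkRecord {
        topic: "/default/events".to_string(),
        payload: b"hello".to_vec(),
    }])?;
    sink.close()
}
```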
Sink connectors:

| Connector | Status | Description | Documentation |
|---|---|---|---|
| Qdrant | ✅ Production | Vector embeddings for RAG/AI | README |
| SurrealDB | ✅ Production | Multi-model database (documents, time-series) | README |
| Delta Lake | 🚧 Planned | Zero-JVM data lake ingestion (S3/Azure/GCS) | - |
| LanceDB | 🚧 Planned | Serverless vector DB for RAG pipelines | - |
| ClickHouse | 🚧 Planned | Real-time analytics and feature stores | - |
| GreptimeDB | 🚧 Planned | Unified observability (metrics/logs/traces) | - |
Source connectors:

| Connector | Status | Description | Documentation |
|---|---|---|---|
| MQTT | ✅ Production | IoT device integration (MQTT 3.1.1) | README |
| HTTP/Webhook | 🚧 Planned | Universal webhook ingestion from SaaS platforms | - |
| PostgreSQL CDC | 🚧 Planned | Change Data Capture from Postgres | - |
See Connector Roadmap for detailed implementation plans and timelines.
Complete documentation is available in the info/ directory:
- Start Here: Documentation Index - Overview and navigation
- Architecture Document - Design philosophy and specifications
- Development Guide - Build your first connector
- Configuration Guide - Configuration patterns and best practices
- Message Patterns - Message handling strategies
# Build all crates
cargo build --release
# Run tests
cargo test
# Build a specific connector
cargo build --release -p danube-sink-http

We welcome contributions! Here's how you can help:
- New Connectors: Implement connectors for popular systems
- Documentation: Improve guides and examples
- Testing: Add test coverage
- Bug Reports: Open issues with detailed information
Please read our Development Guide before contributing.
Apache License 2.0 - See LICENSE for details.
- GitHub Issues: Report bugs or request features
- Danube Docs: Official Documentation
- Main Project: Danube Messaging
