A scalable, real-time event logging and processing system built with Go, Kafka, and various supporting services.
```
.
├── api-gateway/            # API Gateway service
├── event-producer/         # Event Producer service
├── event-consumer/         # Event Consumer service
├── analytics-worker/       # Analytics Worker service
├── deduplication-service/  # Deduplication service
├── retry-handler/          # Retry Handler service
├── internal/               # Shared internal packages
│   └── pkg/
│       ├── config/         # Configuration management
│       ├── kafka/          # Kafka utilities
│       └── models/         # Shared data models
└── docker-compose.yml      # Docker composition file
```
- Receives events via HTTP endpoints
- Validates and processes incoming events
- Produces events to Kafka (see the sketch after this list)
- Exposes metrics for monitoring
- Health check endpoint at `/health`
- Metrics endpoint at `/metrics`
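A minimal sketch of this produce path is shown below. It assumes the `segmentio/kafka-go` client and the environment variables from the Configuration section; the real implementation lives in `event-producer/`, and its client library and handler layout may differ.

```go
// Illustrative sketch of the HTTP-to-Kafka produce path (not the actual service code).
package main

import (
	"encoding/json"
	"log"
	"net/http"
	"os"

	"github.com/segmentio/kafka-go"
)

// Event mirrors the request body accepted by POST /events.
type Event struct {
	ID        string                 `json:"id"`
	Type      string                 `json:"type"`
	Timestamp string                 `json:"timestamp"`
	Payload   map[string]interface{} `json:"payload"`
}

func main() {
	// Writer targets the raw-events topic on the broker from KAFKA_BROKERS.
	writer := &kafka.Writer{
		Addr:     kafka.TCP(os.Getenv("KAFKA_BROKERS")),
		Topic:    os.Getenv("KAFKA_TOPIC_RAW_EVENTS"),
		Balancer: &kafka.LeastBytes{},
	}
	defer writer.Close()

	http.HandleFunc("/events", func(w http.ResponseWriter, r *http.Request) {
		var ev Event
		// Basic validation: reject malformed JSON and events without an ID.
		if err := json.NewDecoder(r.Body).Decode(&ev); err != nil || ev.ID == "" {
			http.Error(w, "invalid event", http.StatusBadRequest)
			return
		}
		data, _ := json.Marshal(ev)
		// Produce the raw event to Kafka, keyed by event ID.
		if err := writer.WriteMessages(r.Context(), kafka.Message{
			Key:   []byte(ev.ID),
			Value: data,
		}); err != nil {
			http.Error(w, "failed to produce event", http.StatusInternalServerError)
			return
		}
		w.WriteHeader(http.StatusAccepted)
	})

	// Health check endpoint used by Docker and the compose health checks.
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	log.Fatal(http.ListenAndServe(":"+os.Getenv("PORT"), nil))
}
```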
- Uses Confluent's Kafka and Zookeeper images
- Single broker setup for development
- Configurable through environment variables
- Health checks ensure service availability
- Docker and Docker Compose
- Go 1.22 or later (for local development)
- Make (optional, for using Makefile commands)
The system is configured through environment variables and a `.env` file. Create a `.env` file in the root directory with the following variables:
```env
# Service Configuration
SERVICE_NAME=event-logger
PORT=8081
LOG_LEVEL=info

# Kafka Configuration
KAFKA_BROKERS=kafka:29092
KAFKA_CLIENT_ID=event-producer
KAFKA_GROUP_ID=event-producer-group
KAFKA_TOPIC_RAW_EVENTS=raw-events
KAFKA_TOPIC_PROCESSED_EVENTS=processed-events
KAFKA_TOPIC_RETRY_EVENTS=retry-events
KAFKA_TOPIC_ANALYTICS_EVENTS=analytics-events

# PostgreSQL Configuration
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
POSTGRES_DB=eventlogger
POSTGRES_USER=postgres
POSTGRES_PASSWORD=your_password

# Redis Configuration
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PASSWORD=your_redis_password
```
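Services read these variables at startup. The snippet below is an illustrative sketch of environment-based loading with development defaults; the actual loader lives in `internal/pkg/config` and may be structured differently.

```go
// Illustrative sketch of environment-based configuration loading.
package config

import (
	"os"
	"strings"
)

// Config holds the subset of settings an individual service needs.
type Config struct {
	ServiceName    string
	Port           string
	LogLevel       string
	KafkaBrokers   []string
	RawEventsTopic string
}

// getenv returns the value of key, or fallback when the variable is unset.
func getenv(key, fallback string) string {
	if v, ok := os.LookupEnv(key); ok {
		return v
	}
	return fallback
}

// Load reads configuration from the environment (populated from the .env
// file by Docker Compose) and applies development-friendly defaults.
func Load() Config {
	return Config{
		ServiceName:    getenv("SERVICE_NAME", "event-logger"),
		Port:           getenv("PORT", "8081"),
		LogLevel:       getenv("LOG_LEVEL", "info"),
		KafkaBrokers:   strings.Split(getenv("KAFKA_BROKERS", "localhost:9092"), ","),
		RawEventsTopic: getenv("KAFKA_TOPIC_RAW_EVENTS", "raw-events"),
	}
}
```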
- Build all services: `docker-compose build`
- Start the entire stack: `docker-compose up -d`
- View service logs: `docker-compose logs -f [service_name]`
- Stop all services: `docker-compose down`
- Install dependencies: `go mod download`
- Run the Event Producer service locally: `go run event-producer/main.go`
- `POST /events` - Submit a new event (a curl example follows this list). Request body:

  ```json
  {
    "id": "unique-event-id",
    "type": "event-type",
    "timestamp": "2024-02-25T12:00:00Z",
    "payload": {
      "key": "value"
    }
  }
  ```

- `GET /health` - Health check endpoint
- `GET /metrics` - Prometheus metrics endpoint
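For example, an event can be submitted with `curl`, assuming the service's port 8081 is mapped to the host:

```bash
curl -X POST http://localhost:8081/events \
  -H "Content-Type: application/json" \
  -d '{
    "id": "unique-event-id",
    "type": "event-type",
    "timestamp": "2024-02-25T12:00:00Z",
    "payload": { "key": "value" }
  }'
```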
The system includes Prometheus for metrics collection. Access the Prometheus UI at `http://localhost:9090`.

Available metrics include:

- `events_produced_total` - Total number of events produced
- `event_production_latency_seconds` - Event production latency (a registration sketch follows this list)
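The sketch below shows one way these two metrics could be registered and exposed with the Prometheus Go client (`github.com/prometheus/client_golang`); it is illustrative only, and the services' actual instrumentation may differ.

```go
// Sketch of registering and exposing the metrics listed above.
package main

import (
	"net/http"
	"time"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

var (
	// Counter incremented once per successfully produced event.
	eventsProduced = promauto.NewCounter(prometheus.CounterOpts{
		Name: "events_produced_total",
		Help: "Total number of events produced",
	})
	// Histogram observing how long each produce call takes, in seconds.
	productionLatency = promauto.NewHistogram(prometheus.HistogramOpts{
		Name:    "event_production_latency_seconds",
		Help:    "Event production latency",
		Buckets: prometheus.DefBuckets,
	})
)

// produceEvent stands in for the real Kafka produce call and records metrics.
func produceEvent() {
	start := time.Now()
	// ... write the message to Kafka here ...
	productionLatency.Observe(time.Since(start).Seconds())
	eventsProduced.Inc()
}

func main() {
	// Record one sample observation so the metrics show up immediately.
	produceEvent()

	// Expose the /metrics endpoint scraped by Prometheus.
	http.Handle("/metrics", promhttp.Handler())
	http.ListenAndServe(":8081", nil)
}
```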
- Base image: `golang:1.22-alpine` (build stage)
- Final image: `alpine:latest`
- Exposes port 8081
- Includes health checks
- Multi-stage build for minimal image size (see the Dockerfile sketch below)
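Put together, those points correspond to a multi-stage Dockerfile along these lines. This is a sketch rather than the repository's actual file; the build path for the service binary and the health-check command are assumptions.

```dockerfile
# Build stage: compile a static binary with the Go toolchain.
FROM golang:1.22-alpine AS builder
WORKDIR /app
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN CGO_ENABLED=0 go build -o /event-producer ./event-producer

# Final stage: copy only the binary into a minimal Alpine image.
FROM alpine:latest
COPY --from=builder /event-producer /event-producer
EXPOSE 8081
HEALTHCHECK --interval=30s --timeout=3s \
  CMD wget -qO- http://localhost:8081/health || exit 1
ENTRYPOINT ["/event-producer"]
```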
- Uses official Confluent images
- Kafka exposed on ports 9092 (external) and 29092 (internal)
- Zookeeper exposed on port 2181
- Includes health checks for both services (see the compose sketch below)
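The compose sketch below illustrates a single-broker Confluent setup with those port mappings and health checks; `docker-compose.yml` in the repository is the source of truth, and the listener and health-check details here are assumptions.

```yaml
# Sketch of the Kafka/Zookeeper portion of docker-compose.yml (illustrative only).
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    healthcheck:
      test: ["CMD-SHELL", "echo srvr | nc localhost 2181"]
      interval: 10s
      retries: 5

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"   # external listener; 29092 stays internal to the compose network
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
    healthcheck:
      test: ["CMD", "kafka-topics", "--bootstrap-server", "localhost:9092", "--list"]
      interval: 15s
      retries: 10
```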
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.