Background
CNCF projects currently lack a standardized way to communicate their level of observability integration. This makes it difficult for users to understand what observability capabilities are available and for projects to benchmark their observability maturity against others.
Proposal
Create an Observability Integration Rubric that CNCF projects can use to self-assess and communicate their observability capabilities. The rubric defines clear levels of integration with modern observability tooling and standards.
Rubric Levels
The levels below are a starting point for discussion, not a finished standard.
Level 0: Basic Logging
- Only basic logging capabilities
- No structured logging
- No standardized logging format
- No integration with modern observability tools/standards
- Manual log collection and analysis required
Level 1: Standard Instrumentation
Builds on Level 0 by adding:
- OpenTelemetry SDK integration for at least one signal type:
  - Structured logging with a consistent format
  - Basic metrics exposure
  - Distributed tracing support
  - Continuous profiling capability
- Documentation on how to access and collect telemetry data
- Support for common collector agents (e.g., OpenTelemetry Collector)
Level 2: Complete Instrumentation
Must meet all Level 1 requirements plus:
- OpenTelemetry SDK integration for at least three signal types
- Support for context propagation across service boundaries
- Exemplars linking metrics to traces
- Pre-configured dashboards for common use cases
- Basic alerting rules/templates
- Documentation on recommended collection/storage solutions
Level 3: Advanced Observability
Must meet all Level 2 requirements plus:
- OpenTelemetry SDK integration for all signal types
- Custom instrumentation for project-specific components
- Advanced correlation between signals
- Rich set of pre-built dashboards
- SLO/SLI definitions and recording rules
- Comprehensive alerting rules with runbooks
- Integration examples with popular observability backends
- Regular testing of observability features
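For the SLO/SLI recording-rules criterion, a Prometheus recording rule is one common shape such definitions take. This fragment is a sketch, not a rubric requirement: the metric name `http_requests_total` and its labels are assumptions standing in for whatever request metric a project actually exposes.

```yaml
# Illustrative only: records a 30-day availability SLI as the ratio of
# non-5xx requests. `http_requests_total` and the `code` label are
# assumed names; substitute the project's real request metric.
groups:
  - name: slo.rules
    rules:
      - record: job:http_request_availability:ratio_rate30d
        expr: |
          1 - (
            sum(rate(http_requests_total{code=~"5.."}[30d]))
            /
            sum(rate(http_requests_total[30d]))
          )
```

Pre-computing the SLI as a recording rule keeps SLO dashboards and burn-rate alerts cheap to evaluate, which is why rubric Level 3 pairs the definitions with recording rules rather than ad-hoc queries.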
Implementation Details
Projects should:
- Self-assess against the rubric levels
- Document their current level in project documentation
- Create an observability integration roadmap if desired
- Update assessment when adding new capabilities
Benefits
- Clear standards for observability integration
- Easy comparison between projects
- Guidance for improvement
- Better user experience through predictable capabilities
- Encourages adoption of modern observability practices
Next Steps
- Gather community feedback on rubric levels and criteria
- Create assessment template/checklist
- Document process for self-assessment
- Begin collecting initial project assessments
Discussion Points
- Are the levels appropriately scoped?
- What additional criteria should be included?
- How to handle partial compliance with levels?
- Process for evolving rubric over time?
- How to verify self-assessments?
/cc @cncf/tag-observability