[processor/filter] Panic when writing debug logs after dropping a datapoint #44705

@TylerHelmuth

Component(s)

processor/filter

What happened?

Description

I found a panic today when trying to print some debug logs in the filter processor. The issue occurs when writing a debug log after having dropped a datapoint.

The issue occurs because the filterprocessor uses .RemoveIf

dps.RemoveIf(func(datapoint pmetric.NumberDataPoint) bool {
    skip, err := fmp.skipDataPointExpr.Eval(ctx, ottldatapoint.NewTransformContext(datapoint, metric, metrics, is, resource, pmetric.NewScopeMetrics(), pmetric.NewResourceMetrics()))
    if err != nil {
        errors = multierr.Append(errors, err)
        return false
    }
    return skip
})

to loop through all the datapoints. If the condition returns true, the datapoint is removed from the slice.
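
For illustration, here is a minimal standalone sketch of the failure mode (a simplified reproduction, not the actual filterprocessor code). Touching the slice from inside the RemoveIf callback panics as soon as an earlier element has been removed:

package main

import "go.opentelemetry.io/collector/pdata/pmetric"

func main() {
    dps := pmetric.NewNumberDataPointSlice()
    for i := 0; i < 3; i++ {
        dps.AppendEmpty().SetIntValue(int64(i))
    }

    dps.RemoveIf(func(dp pmetric.NumberDataPoint) bool {
        // Walk the slice we are currently mutating, the same way the
        // TransformContext marshaling does. On the second callback
        // invocation, dps.At(0) wraps a nil entry and Attributes() panics.
        for i := 0; i < dps.Len(); i++ {
            _ = dps.At(i).Attributes()
        }
        return true // drop every datapoint
    })
}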

Then, on the next iteration of RemoveIf, while processing the next datapoint, OTTL writes its next debug log:

c.telemetrySettings.Logger.Debug("condition evaluation result", zap.String("condition", condition.origText), zap.Bool("match", match), zap.Any("TransformContext", tCtx))

The TransformContext includes the Metric, and when the Metric is marshaled, all of its datapoints are printed:

switch mm.Type() {
case pmetric.MetricTypeSum:
    encoder.AddString("aggregation_temporality", mm.Sum().AggregationTemporality().String())
    encoder.AddBool("is_monotonic", mm.Sum().IsMonotonic())
    err = encoder.AddArray("datapoints", NumberDataPointSlice(mm.Sum().DataPoints()))
case pmetric.MetricTypeGauge:
    err = encoder.AddArray("datapoints", NumberDataPointSlice(mm.Gauge().DataPoints()))
case pmetric.MetricTypeHistogram:
    encoder.AddString("aggregation_temporality", mm.Histogram().AggregationTemporality().String())
    err = encoder.AddArray("datapoints", HistogramDataPointSlice(mm.Histogram().DataPoints()))
case pmetric.MetricTypeExponentialHistogram:
    encoder.AddString("aggregation_temporality", mm.ExponentialHistogram().AggregationTemporality().String())
    err = encoder.AddArray("datapoints", ExponentialHistogramDataPointSlice(mm.ExponentialHistogram().DataPoints()))
case pmetric.MetricTypeSummary:
    err = encoder.AddArray("datapoints", SummaryDataPointSlice(mm.Summary().DataPoints()))
}

The problem is that when this log is written, the datapoint slice contains a nil datapoint: RemoveIf nils out removed entries in place before truncating the slice at the end. When that nil datapoint is marshaled, its accessor calls panic because the underlying orig pointer is nil:

err := encoder.AddObject("attributes", Map(ndp.Attributes()))
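
The same panic can be reproduced in isolation with a zero-value wrapper, whose internal orig pointer is nil just like a removed slice entry (a minimal sketch, separate from the actual crash path):

package main

import "go.opentelemetry.io/collector/pdata/pmetric"

func main() {
    // A zero-value NumberDataPoint holds a nil internal orig pointer, the
    // same state a removed slice entry is left in mid-RemoveIf.
    var dp pmetric.NumberDataPoint
    _ = dp.Attributes() // panic: invalid memory address or nil pointer dereference
}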

Ultimately this is a result of referencing the slice that RemoveIf is currently iterating over. One fix is to update the logging implementation so that it does not print the datapoints of the TransformContext's metric when marshaling the datapoint TransformContext.
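
To sketch what that could look like (the type and method names below are illustrative, not the actual pkg/ottl logging API), the datapoint context could marshal the metric's scalar fields and skip the datapoints array entirely:

package logging

import (
    "go.opentelemetry.io/collector/pdata/pmetric"
    "go.uber.org/zap/zapcore"
)

// metricWithoutDataPoints is a hypothetical marshaler that logs a metric's
// scalar fields only, so marshaling can never touch a datapoint slice that
// is mid-RemoveIf.
type metricWithoutDataPoints pmetric.Metric

func (m metricWithoutDataPoints) MarshalLogObject(encoder zapcore.ObjectEncoder) error {
    mm := pmetric.Metric(m)
    encoder.AddString("name", mm.Name())
    encoder.AddString("description", mm.Description())
    encoder.AddString("unit", mm.Unit())
    encoder.AddString("type", mm.Type().String())
    // "datapoints" is deliberately omitted here.
    return nil
}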

But that would only solve the issue for logging. The issue would still arise if someone wrote a custom Condition function that used the TransformContext's metric and then referenced its datapoints slice. OTTL lets you access all of a context's parent properties, so it is always at risk of touching something that is mid-RemoveIf.
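
For example, a hypothetical custom function for the datapoint context (the name and wiring are illustrative) hits the same nil entry without any logging involved:

// sumOfSiblings walks the parent metric's datapoints from inside a
// datapoint-context evaluation. If it runs inside RemoveIf it can hit a
// nil'd entry and panic, exactly like the debug log marshaling does.
func sumOfSiblings(tCtx ottldatapoint.TransformContext) (int64, error) {
    dps := tCtx.GetMetric().Sum().DataPoints()
    var total int64
    for i := 0; i < dps.Len(); i++ {
        total += dps.At(i).IntValue() // panics on a removed (nil) entry
    }
    return total, nil
}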

What I'd like to see is a way to check "is my pdata type nil?".
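
As a purely illustrative sketch (no IsNil method exists in pdata today), such a check could live on the generated wrapper types:

// Inside the pmetric package (hypothetical): report whether the wrapper's
// internal pointer is nil so callers can guard before using accessors.
func (ms NumberDataPoint) IsNil() bool {
    return ms.orig == nil
}

Callers could then skip removed entries instead of panicking:

for i := 0; i < dps.Len(); i++ {
    if dps.At(i).IsNil() { // hypothetical guard
        continue
    }
    _ = dps.At(i).Attributes()
}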

Steps to Reproduce

Collector config

receivers:
  hostmetrics:
    collection_interval: 1s
    scrapers:
      memory:

processors:
  filter:
    error_mode: ignore
    metrics:
      metric:
        - 'type == METRIC_DATA_TYPE_SUM'
        - 'resource.attributes["service.name"] == "my_service_name"'

exporters:
  debug:

service:
  pipelines:
    metrics:
      receivers:
        - hostmetrics
      processors:
        - filter
      exporters:
        - debug

Expected Result

Collector operates as normal

Actual Result

❯ ./bin/otelcontribcol_darwin_arm64 --config ./local/config.yaml
2025-12-02T09:32:30.983-0700    debug   builders/builders.go:24 Alpha component. May change in the future.      {"resource": {"service.instance.id": "49b7247d-0edf-4db7-9de1-959bdc373272", "service.name": "otelcontribcol", "service.version": "0.141.0-dev"}, "otelcol.component.id": "debug", "otelcol.component.kind": "exporter", "otelcol.signal": "metrics"}
2025-12-02T09:32:30.984-0700    debug   builders/builders.go:24 Alpha component. May change in the future.      {"resource": {"service.instance.id": "49b7247d-0edf-4db7-9de1-959bdc373272", "service.name": "otelcontribcol", "service.version": "0.141.0-dev"}, "otelcol.component.id": "filter", "otelcol.component.kind": "processor", "otelcol.pipeline.id": "metrics", "otelcol.signal": "metrics"}
2025-12-02T09:32:30.985-0700    debug   builders/builders.go:24 Beta component. May change in the future.       {"resource": {"service.instance.id": "49b7247d-0edf-4db7-9de1-959bdc373272", "service.name": "otelcontribcol", "service.version": "0.141.0-dev"}, "otelcol.component.id": "hostmetrics", "otelcol.component.kind": "receiver", "otelcol.signal": "metrics"}
2025-12-02T09:32:30.985-0700    info    [email protected]/service.go:224 Starting otelcontribcol...      {"resource": {"service.instance.id": "49b7247d-0edf-4db7-9de1-959bdc373272", "service.name": "otelcontribcol", "service.version": "0.141.0-dev"}, "Version": "0.141.0-dev", "NumCPU": 10}
2025-12-02T09:32:30.985-0700    info    extensions/extensions.go:40     Starting extensions...  {"resource": {"service.instance.id": "49b7247d-0edf-4db7-9de1-959bdc373272", "service.name": "otelcontribcol", "service.version": "0.141.0-dev"}}
2025-12-02T09:32:30.986-0700    info    [email protected]/service.go:247 Everything is ready. Begin running and processing data. {"resource": {"service.instance.id": "49b7247d-0edf-4db7-9de1-959bdc373272", "service.name": "otelcontribcol", "service.version": "0.141.0-dev"}}
2025-12-02T09:32:31.991-0700    debug   [email protected]/parser.go:475     condition evaluation result     {"resource": {"service.instance.id": "49b7247d-0edf-4db7-9de1-959bdc373272", "service.name": "otelcontribcol", "service.version": "0.141.0-dev"}, "otelcol.component.id": "filter", "otelcol.component.kind": "processor", "otelcol.pipeline.id": "metrics", "otelcol.signal": "metrics", "condition": "metric.type == METRIC_DATA_TYPE_SUM", "match": true, "TransformContext": {"resource": {"attributes": {}, "dropped_attribute_count": 0}, "scope": {"attributes": {}, "dropped_attribute_count": 0, "name": "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/hostmetricsreceiver/internal/scraper/memoryscraper", "version": "0.141.0-dev"}, "metric": {"description": "Bytes of memory in use.", "name": "system.memory.usage", "unit": "By", "type": "Sum", "metadata": {}, "aggregation_temporality": "Cumulative", "is_monotonic": false, "datapoints": [{"attributes": {"state": "used"}, "exemplars": [], "flags": 0, "start_time_unix_nano": 1763499331000000000, "time_unix_nano": 1764693151989891000, "value_int": 25447432192}, {"attributes": {"state": "free"}, "exemplars": [], "flags": 0, "start_time_unix_nano": 1763499331000000000, "time_unix_nano": 1764693151989891000, "value_int": 461930496}, {"attributes": {"state": "inactive"}, "exemplars": [], "flags": 0, "start_time_unix_nano": 1763499331000000000, "time_unix_nano": 1764693151989891000, "value_int": 8450375680}]}, "datapoint": {"attributes": {"state": "used"}, "exemplars": [], "flags": 0, "start_time_unix_nano": 1763499331000000000, "time_unix_nano": 1764693151989891000, "value_int": 25447432192}, "cache": {}}}
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x2 addr=0x0 pc=0x101a8cbf8]

goroutine 181 [running]:
go.opentelemetry.io/collector/pdata/pmetric.NumberDataPoint.Attributes(...)
        go.opentelemetry.io/collector/[email protected]/pmetric/generated_numberdatapoint.go:53
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl/contexts/internal/logging.NumberDataPoint.MarshalLogObject({0x0?, 0x1400174c000?}, {0x10e04ada0, 0x14000d10140})
        github.com/open-telemetry/opentelemetry-collector-contrib/pkg/[email protected]/contexts/internal/logging/logging.go:208 +0x38
go.uber.org/zap/zapcore.(*jsonEncoder).AppendObject(0x14000d10140, {0x10de7c280, 0x14000ab50a0})
        go.uber.org/[email protected]/zapcore/json_encoder.go:225 +0x234
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl/contexts/internal/logging.NumberDataPointSlice.MarshalLogArray({0x140004e0280?, 0x1400174c000?}, {0x10e043820, 0x14000d10140})
        github.com/open-telemetry/opentelemetry-collector-contrib/pkg/[email protected]/contexts/internal/logging/logging.go:199 +0xb8
go.uber.org/zap/zapcore.(*jsonEncoder).AppendArray(0x14000d10140, {0x10de930c0, 0x14000ab5090})
        go.uber.org/[email protected]/zapcore/json_encoder.go:213 +0x214
go.uber.org/zap/zapcore.(*jsonEncoder).AddArray(0x14000d10140, {0x10931812a?, 0x14000d10140?}, {0x10de930c0, 0x14000ab5090})
        go.uber.org/[email protected]/zapcore/json_encoder.go:102 +0x3c
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl/contexts/internal/logging.Metric.MarshalLogObject({0x14000f01e60?, 0x1400174c000?}, {0x10e04ada0, 0x14000d10140})
        github.com/open-telemetry/opentelemetry-collector-contrib/pkg/[email protected]/contexts/internal/logging/logging.go:177 +0x580
go.uber.org/zap/zapcore.(*jsonEncoder).AppendObject(0x14000d10140, {0x10de7c260, 0x14000ab5050})
        go.uber.org/[email protected]/zapcore/json_encoder.go:225 +0x234
go.uber.org/zap/zapcore.(*jsonEncoder).AddObject(0x14000d10140, {0x1092fdbed?, 0xc243e307fb1555f0?}, {0x10de7c260, 0x14000ab5050})
        go.uber.org/[email protected]/zapcore/json_encoder.go:107 +0x3c
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl/contexts/ottldatapoint.TransformContext.MarshalLogObject({{0x10dab7d20, 0x14000ab4f70}, {0x14000f01e60, 0x1400174c000}, {0x14001a7c350, 0x1400174c000}, {0x14001a7c310, 0x1400174c000}, {0x14000fac080, 0x1400174c000}, ...}, ...)
        github.com/open-telemetry/opentelemetry-collector-contrib/pkg/[email protected]/contexts/ottldatapoint/datapoint.go:52 +0x1ac
go.uber.org/zap/zapcore.(*jsonEncoder).AppendObject(0x14000d10140, {0x13f8e4d78, 0x14000fac500})
        go.uber.org/[email protected]/zapcore/json_encoder.go:225 +0x234
go.uber.org/zap/zapcore.(*jsonEncoder).AddObject(0x14000d10140, {0x109356ee5?, 0x140000ccb68?}, {0x13f8e4d78, 0x14000fac500})
        go.uber.org/[email protected]/zapcore/json_encoder.go:107 +0x3c
go.uber.org/zap/zapcore.Field.AddTo({{0x109356ee5, 0x10}, 0x2, 0x0, {0x0, 0x0}, {0x10db2dc20, 0x14000fac500}}, {0x10e04ada0, 0x14000d10140})
        go.uber.org/[email protected]/zapcore/field.go:121 +0x86c
go.uber.org/zap/zapcore.addFields(...)
        go.uber.org/[email protected]/zapcore/field.go:210
go.uber.org/zap/zapcore.consoleEncoder.writeContext({0x10de7b1c0?}, 0x1400062b180, {0x14000269080, 0x3, 0x1?})
        go.uber.org/[email protected]/zapcore/console_encoder.go:141 +0xcc
go.uber.org/zap/zapcore.consoleEncoder.EncodeEntry({0x101300904?}, {0xff, {0xc243e307fb58d348, 0x52c8bec7, 0x114a93d40}, {0x0, 0x0}, {0x1093f38c4, 0x1b}, {0x1, ...}, ...}, ...)
        go.uber.org/[email protected]/zapcore/console_encoder.go:119 +0x530
go.uber.org/zap/zapcore.(*ioCore).Write(0x140008660c0, {0xff, {0xc243e307fb58d348, 0x52c8bec7, 0x114a93d40}, {0x0, 0x0}, {0x1093f38c4, 0x1b}, {0x1, ...}, ...}, ...)
        go.uber.org/[email protected]/zapcore/core.go:95 +0x54
go.uber.org/zap/zapcore.(*CheckedEntry).Write(0x14001c841a0, {0x14000269080, 0x3, 0x3})
        go.uber.org/[email protected]/zapcore/entry.go:258 +0xd4
go.uber.org/zap.(*Logger).Debug(0x109356ee5?, {0x1093f38c4?, 0x10db2dc20?}, {0x14000269080, 0x3, 0x3})
        go.uber.org/[email protected]/logger.go:239 +0x4c
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl.(*ConditionSequence[...]).Eval(0x10df26aa0, {0x10df5b550, 0x14000da06f0}, {{0x10dab7d20, 0x14000ab4f70}, {0x14000f01e60, 0x1400174c000}, {0x14001a7c350, 0x1400174c000}, {0x14001a7c310, ...}, ...})
        github.com/open-telemetry/opentelemetry-collector-contrib/pkg/[email protected]/parser.go:475 +0x2c8
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).handleNumberDataPoints.func1({0x14000f00900, 0x1400174c000})
        github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:274 +0x138
go.opentelemetry.io/collector/pdata/pmetric.NumberDataPointSlice.RemoveIf({0x140004e0280?, 0x1400174c000?}, 0x140000cd520)
        go.opentelemetry.io/collector/[email protected]/pmetric/generated_numberdatapointslice.go:129 +0x7c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).handleNumberDataPoints(0x30ab7984747a5?, {0x10df5b550?, 0x14000da06f0?}, {0x140004e0280?, 0x1400174c000?}, {0x14000f01e60?, 0x1400174c000?}, {0x14001a7c350?, 0x1400174c000?}, {0x14001a7c310?, ...}, ...)
        github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:273 +0x88
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1.1({0x14000f01e60?, 0x1400174c000?})
        github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:162 +0x38c
go.opentelemetry.io/collector/pdata/pmetric.MetricSlice.RemoveIf({0x14001a7c350?, 0x1400174c000?}, 0x140000cd770)
        go.opentelemetry.io/collector/[email protected]/pmetric/generated_metricslice.go:129 +0x7c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1({0x14001a7c310?, 0x1400174c000?})
        github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:147 +0x74
go.opentelemetry.io/collector/pdata/pmetric.ScopeMetricsSlice.RemoveIf({0x14000fac0b8?, 0x1400174c000?}, 0x140000cd8c8)
        go.opentelemetry.io/collector/[email protected]/pmetric/generated_scopemetricsslice.go:129 +0x7c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1({0x14000fac080, 0x1400174c000})
        github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:145 +0x154
go.opentelemetry.io/collector/pdata/pmetric.ResourceMetricsSlice.RemoveIf({0x14000df6060?, 0x1400174c000?}, 0x140000cdab0)
        go.opentelemetry.io/collector/[email protected]/pmetric/generated_resourcemetricsslice.go:129 +0x7c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics(0x14000d36600, {0x10df5b550, 0x14000da06f0}, {0x14000df6060?, 0x1400174c000?})
        github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/metrics.go:130 +0xa0
go.opentelemetry.io/collector/processor/processorhelper.NewMetrics.func1({0x10df5b550, 0x14000da06f0}, {0x14000df6060?, 0x1400174c000?})
        go.opentelemetry.io/collector/processor/[email protected]/metrics.go:59 +0x100
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics(...)
        go.opentelemetry.io/collector/[email protected]/metrics.go:27
go.opentelemetry.io/collector/service/internal/refconsumer.refMetrics.ConsumeMetrics({{0x13fc5e848?, 0x140007c0100?}}, {0x10df5b550?, 0x14000da06f0?}, {0x14000df6060?, 0x1400174c000?})
        go.opentelemetry.io/collector/[email protected]/internal/refconsumer/metrics.go:29 +0x94
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics(...)
        go.opentelemetry.io/collector/[email protected]/metrics.go:27
go.opentelemetry.io/collector/internal/fanoutconsumer.(*metricsConsumer).ConsumeMetrics(0x14000addb60, {0x10df5b550, 0x14000da06f0}, {0x14000df6060?, 0x1400174c000?})
        go.opentelemetry.io/collector/internal/[email protected]/metrics.go:60 +0x1dc
go.opentelemetry.io/collector/scraper/scraperhelper.scrapeMetrics(0x14001c98360, {0x10decba70, 0x14000addb60})
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:265 +0x2c8
go.opentelemetry.io/collector/scraper/scraperhelper.NewMetricsController.func1(0x3b9aca00?)
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:228 +0x20
go.opentelemetry.io/collector/scraper/scraperhelper.(*controller[...]).startScraping.func1()
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:171 +0x11c
created by go.opentelemetry.io/collector/scraper/scraperhelper.(*controller[...]).startScraping in goroutine 1
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:152 +0x78

Collector version

v0.141.0
