[elasticsearchexporter] exporter panics when encoding non-string value scope attributes #37701

Closed
axw opened this issue Feb 5, 2025 · 7 comments · Fixed by #40098
Assignees
Labels
bug Something isn't working exporter/elasticsearch good first issue Good for newcomers

Comments

@axw
Contributor

axw commented Feb 5, 2025

Component(s)

exporter/elasticsearch

What happened?

Description

The Elasticsearch exporter assumes all scope attribute values are strings, and will panic if they're not.
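For illustration, the failing pattern boils down to an unchecked type assertion. The sketch below is a hypothetical plain-Go simplification (the real code operates on pdata attribute maps in model.go, not plain Go maps), but it reproduces the same panic message seen in the log output:

```go
package main

import "fmt"

// scopeToAttributesUnsafe mimics the failing pattern (a hypothetical
// simplification, not the exporter's actual code): it assumes every
// attribute value is a string.
func scopeToAttributesUnsafe(attrs map[string]any) map[string]string {
	out := make(map[string]string, len(attrs))
	for k, v := range attrs {
		// Unchecked assertion: panics with
		// "interface conversion: interface {} is bool, not string"
		// as soon as a non-string value appears.
		out[k] = v.(string)
	}
	return out
}

func main() {
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("panic:", r)
		}
	}()
	// "example.flag" is a made-up key; any non-string value trips the assertion.
	scopeToAttributesUnsafe(map[string]any{"example.flag": true})
}
```

The comma-ok form `s, ok := v.(string)` would avoid the panic, but (as discussed below) the better fix is to not coerce to string at all.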

Steps to Reproduce

Run the collector with the config below. Yes, a logs pipeline with the hostmetrics receiver -- originally an accident on my part, but it triggers the bug.

Expected Result

The exporter should encode scope attribute values with the original type rather than coercing to a string.

Actual Result

The exporter panics.

Collector version

v0.119.0

Environment information

Environment

N/A

OpenTelemetry Collector configuration

receivers:
  hostmetrics:
    scrapers:
      cpu: {}

exporters:
  elasticsearch:
    endpoint: <snip>
    api_key: <snip>

service:
  pipelines:
    logs:
      receivers: [hostmetrics]
      processors: []
      exporters: [elasticsearch]

Log output

panic: interface conversion: interface {} is bool, not string                                                                                                                                               
                                                                                                                                                                                                            
goroutine 1 [running]:                                                                                                                                                                                      
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.scopeToAttributes({0xc00075bab0?, 0xc001b03a4c?})                                                                  
        /home/andrew/projects/opentelemetry-collector-contrib/exporter/elasticsearchexporter/model.go:337 +0x27d                                                                                            
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*encodeModel).encodeLogDefaultMode(0xc001b02690, {0xc000dd6f78?, 0xc001b03a4c?}, {0xc000d91300?, 0xc001b03a4c?}, {0xc00075bab0?, 0xc001b03a4c?}, {{0x10a0d4fe, 0x14}, {0x0, ...}, ...})
        /home/andrew/projects/opentelemetry-collector-contrib/exporter/elasticsearchexporter/model.go:136 +0x8e5
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*encodeModel).encodeLog(0xc001b02690, {0xc000dd6f78?, 0xc001b03a4c?}, {0x0, 0x0}, {0xc000d91300?, 0xc001b03a4c?}, {0xc00075bab0?, 0xc001b03a4c?}, {0x0, ...}, ...)
        /home/andrew/projects/opentelemetry-collector-contrib/exporter/elasticsearchexporter/model.go:113 +0x166
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*elasticsearchExporter).pushLogRecord(0xc000bc8540, {0x120d3368, 0xc00075bb20}, {0xc000dd6f78?, 0xc001b03a4c?}, {0x0, 0x0}, {0xc000d91300?, 0xc001b03a4c?}, {0xc00075bab0, ...}, ...)
        /home/andrew/projects/opentelemetry-collector-contrib/exporter/elasticsearchexporter/exporter.go:187 +0x54b
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*elasticsearchExporter).pushLogsData(0xc000bc8540, {0x120d3368, 0xc00075bb20}, {0xc0011a3f08?, 0xc001b03a4c?})
        /home/andrew/projects/opentelemetry-collector-contrib/exporter/elasticsearchexporter/exporter.go:138 +0xb37                    
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export(0xc000d81d60, {0x120d3368, 0xc00075bb20})        
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/exporterhelper/logs.go:64 +0x83            
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*TimeoutSender).Send(0xc000f645a0, {0x120d32c0, 0xc000c17980}, {0x120ac350, 0xc000d81d60})                                              
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/timeout_sender.go:53 +0xee                        
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*obsReportSender[...]).Send(0x1200f660, {0x120d32f8, 0xc000594dc0}, {0x120ac350, 0xc000d81d60})                                         
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/obs_report_sender.go:28 +0x21e                         
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*BaseExporter).Send(0xc000685a40, {0x120d32f8, 0xc000594dc0}, {0x120ac350, 0xc000d81d60})
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/base_exporter.go:129 +0xa4                             
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsRequest.func1({0x120d32f8, 0xc000594dc0}, {0xc0011a3f08?, 0xc001b03a4c?})                    
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/exporterhelper/logs.go:142 +0x3c5                                        
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)                                                                
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/logs.go:26                                                                
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)                                                               
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/logs.go:26                                             
github.com/open-telemetry/opentelemetry-collector-contrib/receiver/hostmetricsreceiver.(*hostEntitiesReceiver).sendEntityEvent(0xc000ad79b0, {0x120d32f8, 0xc000594dc0})                                
        /home/andrew/projects/opentelemetry-collector-contrib/receiver/hostmetricsreceiver/receiver.go:75 +0x484                                                        
github.com/open-telemetry/opentelemetry-collector-contrib/receiver/hostmetricsreceiver.(*hostEntitiesReceiver).Start(0xc000ad79b0, {0x120d16d0, 0x1a3d1860}, {0x4153a5?, 0x10?})                        
        /home/andrew/projects/opentelemetry-collector-contrib/receiver/hostmetricsreceiver/receiver.go:33 +0xba                                                                 
go.opentelemetry.io/collector/service/internal/graph.(*Graph).StartAll(0xc000dd6360, {0x120d16d0, 0x1a3d1860}, 0xc00092ac60)                                           
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/internal/graph/graph.go:419 +0x3fd                                             
go.opentelemetry.io/collector/service.(*Service).Start(0xc0014cbcb0, {0x120d16d0, 0x1a3d1860})                                                           
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/service.go:267 +0x51a                                                    
go.opentelemetry.io/collector/otelcol.(*Collector).setupConfigurationComponents(0xc0007810e0, {0x120d16d0, 0x1a3d1860})                
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/collector.go:231 +0xd25                           
go.opentelemetry.io/collector/otelcol.(*Collector).Run(0xc0007810e0, {0x120d16d0, 0x1a3d1860})                                       
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/collector.go:285 +0x6b                                 
go.opentelemetry.io/collector/otelcol.NewCommand.func1(0xc0004cc308, {0x10996e15?, 0x7?, 0x1098af58?})                           
        /home/andrew/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/command.go:36 +0x13c                          
github.com/spf13/cobra.(*Command).execute(0xc0004cc308, {0xc0000b7ea0, 0x2, 0x2})                                          
        /home/andrew/go/pkg/mod/github.com/spf13/[email protected]/command.go:985 +0x10f4                                          
github.com/spf13/cobra.(*Command).ExecuteC(0xc0004cc308)                                                                    
        /home/andrew/go/pkg/mod/github.com/spf13/[email protected]/command.go:1117 +0x656                                       
github.com/spf13/cobra.(*Command).Execute(0xc0004cc308)                                                                    
        /home/andrew/go/pkg/mod/github.com/spf13/[email protected]/command.go:1041 +0x27                                        
main.runInteractive({0x10dc7ce8, {{0x109cd5ab, 0xe}, {0x10bbc707, 0x3b}, {0x109b2886, 0xb}}, 0x0, {{{0x0, 0x0, ...}, ...}}, ...})                                                                       
        /home/andrew/projects/opentelemetry-collector-contrib/cmd/otelcontribcol/main.go:61 +0x5d                                                                               
main.run(...)                                                                                                                                                           
        /home/andrew/projects/opentelemetry-collector-contrib/cmd/otelcontribcol/main_others.go:10                                                             
main.main()                                                                                                                                               
        /home/andrew/projects/opentelemetry-collector-contrib/cmd/otelcontribcol/main.go:54 +0x778                                                       
exit status 2

Additional context

No response

@axw axw added bug Something isn't working needs triage New item requiring triage labels Feb 5, 2025
Contributor

github-actions bot commented Feb 5, 2025

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@lahsivjar
Member

/label -needs-triage

@github-actions github-actions bot removed the needs triage New item requiring triage label Feb 5, 2025
@carsonip
Contributor

This seems to only affect the raw and none mapping modes, as the ECS and OTel mapping modes use other code paths.

@carsonip
Contributor

/label good-first-issue

@github-actions github-actions bot added the good first issue Good for newcomers label Mar 27, 2025
@AndersonQ
Contributor

Hey folks, I'll take this if it's ok.

@AndersonQ
Contributor

I needed to change the config to reproduce the issue, specifically the mapping mode. For the record, I used the following:

receivers:
  hostmetrics:
    metadata_collection_interval: 1s
    scrapers:
      cpu: {}

exporters:
  elasticsearch:
    mapping:
      mode: none
    endpoint: <redacted>
    user: <redacted>
    password: <redacted>
    flush:
      interval: 1s

service:
  pipelines:
    logs:
      receivers: [hostmetrics]
      processors: []
      exporters: [elasticsearch]

@axw
Contributor Author

axw commented May 19, 2025

Thanks @AndersonQ !

dragonlord93 pushed a commit to dragonlord93/opentelemetry-collector-contrib that referenced this issue May 23, 2025
…en-telemetry#40098)

#### Description

Previously, the code attempted to cast scope attribute values to
strings, leading to a panic when incompatible types were encountered.

When mapping the scope to attributes, instead of casting values to string, the code now creates a new Map for the attributes and copies the scope entries into it. This avoids any cast and preserves the original types.
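In spirit, the fix replaces the per-value string cast with a plain copy that keeps each value's original type. The sketch below is a hypothetical simplification using plain Go maps; the actual change operates on pdata's `pcommon.Map`:

```go
package main

import "fmt"

// scopeToAttributes copies entries into a fresh map without any type
// assertion, preserving each value's original type. This mirrors the
// fix's approach in spirit; it is not the exporter's actual code.
func scopeToAttributes(attrs map[string]any) map[string]any {
	out := make(map[string]any, len(attrs))
	for k, v := range attrs {
		out[k] = v // no cast: bools, ints, etc. survive unchanged
	}
	return out
}

func main() {
	// "example.flag" and "scope.name" are made-up keys for illustration.
	attrs := scopeToAttributes(map[string]any{
		"example.flag": true, // a non-string value no longer panics
		"scope.name":   "hostmetrics",
	})
	fmt.Printf("%T %T\n", attrs["example.flag"], attrs["scope.name"]) // prints "bool string"
}
```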

#### Link to tracking issue
Fixes open-telemetry#37701

#### Testing
Run the exporter with the config that originally caused the panic (see below).
Running it on main panics; running it on this PR does not.

```
./bin/otelcontribcol_linux_amd64 --config otel-test.yaml
```

```
receivers:
  hostmetrics:
    metadata_collection_interval: 1s
    scrapers:
      cpu: {}

exporters:
  elasticsearch:
    mapping:
      mode: none
    endpoint: <redacted>
    user: <redacted>
    password: <redacted>
    flush:
      interval: 1s

service:
  pipelines:
    logs:
      receivers: [hostmetrics]
      processors: []
      exporters: [elasticsearch]

```

#### Documentation

N/A