[receiver/SQLServer] Error scraping metrics failed to parse valueKey #39124


Closed

rafaelrodrigues3092 opened this issue Apr 2, 2025 · 4 comments · Fixed by #39905
Labels
bug, receiver/sqlserver

Comments

@rafaelrodrigues3092

Component(s)

receiver/sqlserver

What happened?

Description

While trying to scrape all possible metrics with the receiver, I am seeing this error:

2025-04-02T11:32:19.067-0400    error   [email protected]/obs_metrics.go:61        Error scraping metrics  {"error": "failed to parse valueKey for row 330: strconv.ParseInt: parsing \"2.082944e+06\": invalid syntax in Free Space in tempdb (KB)"}
go.opentelemetry.io/collector/scraper/scraperhelper.wrapObsMetrics.func1
        go.opentelemetry.io/collector/scraper/[email protected]/obs_metrics.go:61
go.opentelemetry.io/collector/scraper.ScrapeMetricsFunc.ScrapeMetrics
        go.opentelemetry.io/collector/[email protected]/metrics.go:24
go.opentelemetry.io/collector/scraper/scraperhelper.scrapeMetrics
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:256
go.opentelemetry.io/collector/scraper/scraperhelper.NewMetricsController.func1
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:228
go.opentelemetry.io/collector/scraper/scraperhelper.(*controller[...]).startScraping.func1
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:171

When running the query from `sqlServerPerformanceCountersQuery` directly and filtering on "Free Space in tempdb (KB)", I see this:

[Image: query output filtered on "Free Space in tempdb (KB)"]
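
For context on the failure: Go's `strconv.ParseInt` rejects strings in scientific notation outright, while `strconv.ParseFloat` accepts them. A minimal standalone sketch (not receiver code) reproducing the failing parse with the value from the error above:

package main

import (
	"fmt"
	"strconv"
)

func main() {
	raw := "2.082944e+06" // value reported for "Free Space in tempdb (KB)" in the error above

	// ParseInt with base 10 cannot handle scientific notation, so this fails.
	if _, err := strconv.ParseInt(raw, 10, 64); err != nil {
		fmt.Println("ParseInt:", err) // strconv.ParseInt: parsing "2.082944e+06": invalid syntax
	}

	// ParseFloat handles the same string without issue.
	f, err := strconv.ParseFloat(raw, 64)
	fmt.Println("ParseFloat:", f, err) // 2.082944e+06 <nil>
}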

Steps to Reproduce

Run the attached configuration with otel-contrib agent v0.123.0 locally on the target system

Collector version

v0.123.0

Environment information

Environment

  • Windows Server 2019
  • SQL Server 2019 (RTM-CU29-GDR 15.0.4410.1)

OpenTelemetry Collector configuration

exporters:
  debug:
receivers:  
  sqlserver:
    collection_interval: 60s
    server: ${env:COMPUTERNAME}
    port: 1433    
    username: ${env:SECURE_LOGIN}  
    password: ${env:SECURE_PASSWORD}
    resource_attributes:
      server.address:
        enabled: true
      server.port:
        enabled: true
      sqlserver.computer.name:
        enabled: true
      sqlserver.database.name:
        enabled: true
      sqlserver.instance.name:
        enabled: true      
    metrics:
      sqlserver.batch.request.rate:
        enabled: true
      sqlserver.batch.sql_compilation.rate:
        enabled: true
      sqlserver.batch.sql_recompilation.rate:
        enabled: true
      sqlserver.database.backup_or_restore.rate:
        enabled: true
      sqlserver.database.count:
        enabled: false
      sqlserver.database.execution.errors:
        enabled: true
      sqlserver.database.full_scan.rate:
        enabled: true
      sqlserver.database.io:
        enabled: true
      sqlserver.database.latency:
        enabled: true
      sqlserver.database.operations:
        enabled: true
      sqlserver.database.tempdb.space:
        enabled: true
      sqlserver.database.tempdb.version_store.size:
        enabled: true
      sqlserver.deadlock.rate:
        enabled: true
      sqlserver.index.search.rate:
        enabled: true
      sqlserver.lock.timeout.rate:
        enabled: true
      sqlserver.lock.wait.rate:
        enabled: true
      sqlserver.lock.wait_time.avg:
        enabled: true
      sqlserver.login.rate:
        enabled: true
      sqlserver.logout.rate:
        enabled: true
      sqlserver.memory.grants.pending.count:
        enabled: true
      sqlserver.memory.usage:
        enabled: true
      sqlserver.page.buffer_cache.free_list.stalls.rate:
        enabled: true
      sqlserver.page.buffer_cache.hit_ratio:
        enabled: true
      sqlserver.page.checkpoint.flush.rate:
        enabled: true
      sqlserver.page.lazy_write.rate:
        enabled: true
      sqlserver.page.life_expectancy:
        enabled: true
      sqlserver.page.lookup.rate:
        enabled: true
      sqlserver.page.operation.rate:
        enabled: true
      sqlserver.page.split.rate:
        enabled: true
      sqlserver.processes.blocked:
        enabled: true
      sqlserver.replica.data.rate:
        enabled: true
      sqlserver.resource_pool.disk.throttled.read.rate:
        enabled: true
      sqlserver.resource_pool.disk.throttled.write.rate:
        enabled: true
      sqlserver.table.count:
        enabled: true
      sqlserver.transaction.delay:
        enabled: true
      sqlserver.transaction.mirror_write.rate:
        enabled: true
      sqlserver.transaction.rate:
        enabled: true
      sqlserver.transaction.write.rate:
        enabled: true
      sqlserver.transaction_log.flush.data.rate:
        enabled: true
      sqlserver.transaction_log.flush.rate:
        enabled: true
      sqlserver.transaction_log.flush.wait.rate:
        enabled: true
      sqlserver.transaction_log.growth.count:
        enabled: true
      sqlserver.transaction_log.shrink.count:
        enabled: true
      sqlserver.transaction_log.usage:
        enabled: true
      sqlserver.user.connection.count:
        enabled: true  
    query_sample_collection:
      enabled: false
      max_rows_per_query: 1450
    top_query_collection:
      enabled: false
      lookback_time: 60
      max_query_sample_count: 1000
      top_query_count: 200
service:
  pipelines:
    metrics/sqlserver:
      receivers:
      - sqlserver      
      exporters:
      - debug
  telemetry:
    metrics:
      level: normal
      readers:
      - pull:
          exporter:
            prometheus:
              host: 0.0.0.0
              port: 8888

Log output

2025-04-02T11:32:17.283-0400    info    [email protected]/service.go:197 Setting up own telemetry...
2025-04-02T11:32:17.283-0400    info    builders/builders.go:26 Development component. May change in the future.
2025-04-02T11:32:17.283-0400    info    [email protected]/service.go:264 Starting otelcol-contrib...     {"Version": "0.123.1", "NumCPU": 8}
2025-04-02T11:32:17.283-0400    info    extensions/extensions.go:41     Starting extensions...
2025-04-02T11:32:17.967-0400    info    [email protected]/service.go:287 Everything is ready. Begin running and processing data.
2025-04-02T11:32:19.067-0400    error   [email protected]/obs_metrics.go:61        Error scraping metrics  {"error": "failed to parse valueKey for row 330: strconv.ParseInt: parsing \"2.082944e+06\": invalid syntax in Free Space in tempdb (KB)"}
go.opentelemetry.io/collector/scraper/scraperhelper.wrapObsMetrics.func1
        go.opentelemetry.io/collector/scraper/[email protected]/obs_metrics.go:61
go.opentelemetry.io/collector/scraper.ScrapeMetricsFunc.ScrapeMetrics
        go.opentelemetry.io/collector/[email protected]/metrics.go:24
go.opentelemetry.io/collector/scraper/scraperhelper.scrapeMetrics
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:256
go.opentelemetry.io/collector/scraper/scraperhelper.NewMetricsController.func1
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:228
go.opentelemetry.io/collector/scraper/scraperhelper.(*controller[...]).startScraping.func1
        go.opentelemetry.io/collector/scraper/[email protected]/controller.go:171

Additional context

No response

rafaelrodrigues3092 added the bug and needs triage labels on Apr 2, 2025
github-actions bot (Contributor) commented Apr 2, 2025

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

crobert-1 (Member) commented Apr 2, 2025

This looks like the same problem we were seeing in #38823. We'll have to update the type to double here.

Thanks for filing and letting us know, @rafaelrodrigues3092!

crobert-1 removed the needs triage label on Apr 2, 2025
rafaelrodrigues3092 (Author) commented

> This looks like the same problem we were seeing in #38823. We'll have to update the type to double here.
>
> Thanks for filing and letting us know, @rafaelrodrigues3092!

Thank you @crobert-1 .
I tried looking for any related issues before I opened this one. Clearly I did not do a good job!

crobert-1 (Member) commented

> I tried looking for any related issues before I opened this one. Clearly I did not do a good job!

No worries, thanks for checking!

atoulme closed this as completed in d3f92cb on May 8, 2025
crobert-1 marked this as a duplicate of #39989 on May 9, 2025
dragonlord93 pushed a commit to dragonlord93/opentelemetry-collector-contrib that referenced this issue May 23, 2025
…ts (open-telemetry#39905)

#### Description
SQL Server stores large integers in scientific e notation.
`strconv.ParseInt` fails to parse this and raises an error, as seen in
the original filed issue. For our use case, we still want these to be
considered valid integers, so before failing entirely, attempt to parse
to float and convert to integer.

This change also reuses the existing `retrieveInt` and `retrieveFloat` methods so the parsing functionality is shared. The original issue was only about `sqlserver.database.tempdb.space`, but I believe the problem is relevant for all metrics of type `int`.
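
A minimal sketch of that fallback, under the assumption of a helper name of my own choosing (`parseIntWithFallback` is illustrative, not the function added in the PR):

package main

import (
	"fmt"
	"strconv"
)

// parseIntWithFallback is a hypothetical helper showing the approach described
// above: try ParseInt first, then fall back to ParseFloat and truncate to int64.
func parseIntWithFallback(s string) (int64, error) {
	if v, err := strconv.ParseInt(s, 10, 64); err == nil {
		return v, nil
	}
	f, err := strconv.ParseFloat(s, 64)
	if err != nil {
		return 0, fmt.Errorf("failed to parse %q as int or float: %w", s, err)
	}
	return int64(f), nil
}

func main() {
	for _, s := range []string{"42", "2.082944e+06"} {
		v, err := parseIntWithFallback(s)
		fmt.Println(s, "->", v, err) // 42 -> 42 <nil>, then 2.082944e+06 -> 2082944 <nil>
	}
}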

#### Link to tracking issue
Fixes open-telemetry#39124

#### Testing
Changed test values of a couple integers to ensure new functionality
works. The changed values broke tests on `main` in the same way the
filed bug shows, but tests are passing with this change.