Is your feature request related to a problem? Please describe.
Hi!
I would like to use the AWS CloudWatch Metric Streams feature to export metrics. I tested the new SQS support successfully, but I ran into issues with the receiver. Basically, AWS generates a file (with the output format set to OTLP v1.0) and drops it into the S3 bucket. The collector picks it up without problems, but then fails with an error.
I may not be configuring it correctly, but my first comment is that being forced to rely on the file extension to select the encoding can be limiting. In the AWS case, the objects have no extension at all, so the receiver fails to select a format.
In any case, even with the proper extension configured, the receiver still fails to process the AWS Metric Streams file. It throws: "error": "proto: illegal wireType 7". If I'm not mistaken, this means the protobuf parsing failed because the input is not a single valid message.
I verified this by running protoc --decode_raw < aws_metrics_file.binpb, which also produced a parsing error.
After digging deeper, it looks like AWS appends multiple messages to the same file. This doesn't surprise me: they do something similar with JSON logs, concatenating JSON records in one file so that the result is not itself valid JSON. They simply append records to the file.
I confirmed this with a Python script (sorry, I'm not that familiar with Go) that reads each message's size from the binary file and processes the messages separately. That worked, and I was finally able to parse the metric data. As a double check, I split the file into separate protobuf files, one per message, dropped those into S3, and the receiver picked them up via SQS and processed them correctly. The protoc command worked on them too.
So to sum up: AWS Metric Streams frames multiple messages into a single file, and each message needs to be extracted before it can be processed.
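For reference, here is a minimal Python sketch of the de-framing my script did, assuming each record uses the standard varint length-prefix convention for delimited protobuf messages (the function names are my own, and the framing assumption should be verified against the actual files):

```python
import io

def read_varint(stream):
    """Read a protobuf base-128 varint from a binary stream.
    Returns None on a clean end-of-file."""
    shift, result = 0, 0
    while True:
        byte = stream.read(1)
        if not byte:
            if shift == 0:
                return None  # clean EOF, no partial varint
            raise EOFError("truncated varint length prefix")
        b = byte[0]
        result |= (b & 0x7F) << shift
        if not (b & 0x80):  # high bit clear: last byte of the varint
            return result
        shift += 7

def split_messages(data: bytes):
    """Yield each length-delimited message as raw bytes."""
    stream = io.BytesIO(data)
    while True:
        size = read_varint(stream)
        if size is None:
            break
        msg = stream.read(size)
        if len(msg) != size:
            raise EOFError("truncated message body")
        yield msg

# Example with two fake "messages" framed by varint length prefixes:
framed = b"\x05hello\x03abc"
print(list(split_messages(framed)))  # [b'hello', b'abc']
```

Each extracted chunk could then be handed to a protobuf parser (e.g. decoded as an OTLP ExportMetricsServiceRequest) on its own, which is essentially what made my test work.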
Describe the solution you'd like
I think Metric Streams for getting AWS metrics (and perhaps logs as well, now that Lambda can deliver logs directly to S3) is a really interesting feature for anyone consuming AWS signals outside the AWS ecosystem.
So ideally, perhaps via a config option, the receiver could read the file, iterate over all the messages framed within it, and process each message separately. It would also be great to be able to set the encoding explicitly rather than based on the file extension; AWS doesn't seem to use extensions for these files (the names are just UUIDs).
I hope this makes sense!
Describe alternatives you've considered
No response
Additional context
No response
I think it should be feasible to add a new option to the receiver to specify the format as an AWS CloudWatch Metric Stream, plus some code to extract the individual OTLP messages.
@GGonzalezGomez I've just realised that this should already be supported!
An encoding already exists for the AWS CloudWatch Metric Stream format, so it's just a matter of configuring the receiver.
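For anyone landing here later, a rough sketch of what that configuration might look like. I haven't verified the exact keys, so treat the extension name, queue URL, and field names below as assumptions and check the component READMEs:

```yaml
extensions:
  awscloudwatchmetricstreams_encoding:  # hypothetical name, see the encoding extension's README

receivers:
  awss3:
    sqs:
      queue_url: https://sqs.us-east-1.amazonaws.com/123456789012/my-queue  # hypothetical
      region: us-east-1
    encodings:
      - extension: awscloudwatchmetricstreams_encoding
        suffix: ""  # AWS writes these objects without a file extension

service:
  extensions: [awscloudwatchmetricstreams_encoding]
  pipelines:
    metrics:
      receivers: [awss3]
      exporters: [debug]
```

Mapping the encoding by (empty) suffix rather than file extension is the part that would need confirming, given that the S3 object names are just UUIDs.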
Component(s)
receiver/awss3