Description
Component(s)
exporter/otlp, exporter/exporterhelper
Describe the issue you're reporting
I found that the exporterhelper does not have a standalone sizer for batching; batch and queue_size share the same sizer. From the documentation:
- queue_size (default = 1000): Maximum size the queue can accept. Measured in units defined by sizer.
- batch: disabled by default if not defined
  - flush_timeout: time after which a batch will be sent regardless of its size. Must be a non-zero value.
  - min_size: the minimum size of a batch.
  - max_size: the maximum size of a batch; enables batch splitting. The maximum size of a batch should be greater than or equal to the minimum size of a batch.
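For reference, a minimal sketch of how these options fit together, assuming the sending_queue layout described in the exporterhelper README (the exporter name and endpoint are illustrative only):

```yaml
exporters:
  otlp:
    endpoint: gateway.example.com:4317  # illustrative endpoint
    sending_queue:
      sizer: requests   # one sizer, shared by queue_size and batch below
      queue_size: 1000  # measured in units of the shared sizer
      batch:
        flush_timeout: 200ms
        min_size: 1000
        max_size: 10000
```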
Sharing one sizer between the queue and the batcher limits the use cases and makes the configuration less flexible.
For instance, I want to limit the body to a certain size, say 10 MB, when exporting; any body bigger than this limit should be split into smaller pieces. I don't want to touch queue_size, since the default of 1000 requests seems good enough.
But if queue_size shares its sizer with batch, I am forced to come up with a new value for queue_size, and that value can be strange and unrealistic. Taking the default of 1000 requests as an example, I cannot simply multiply the original 1000 requests by the 10 MB body limit, because that gives 10 GB, which is not realistic.
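Concretely, getting the 10 MB split today means switching the shared sizer to bytes and re-deriving queue_size in bytes. A sketch of the forced configuration, under the same assumed layout as above:

```yaml
sending_queue:
  sizer: bytes             # switched only to get byte-based splitting
  queue_size: 10000000000  # 1000 requests x 10 MB = 10 GB, unrealistic
  batch:
    flush_timeout: 200ms
    max_size: 10000000     # 10 MB body limit
```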
The semantic meaning of queue_size changes when the sizer is switched: the requests sizer limits the rate of requests, while the bytes sizer limits the total size of all queued requests. The ability to limit QPS is gone.
There is no way to express "limit the QPS of requests, while splitting any single request whose size exceeds a certain limit." That body limit is useful when an upstream gateway enforces a maximum HTTP body size as a protection.
I don't know what a good value for queue_size would be here, and I don't want to touch it.
Please bring back a standalone sizer for batch.
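For illustration, the configuration I would like to be able to write looks like this; the sizer field nested under batch is hypothetical and does not exist today:

```yaml
sending_queue:
  sizer: requests       # queue still bounded by request count
  queue_size: 1000      # the default, untouched
  batch:
    sizer: bytes        # hypothetical: standalone sizer for batching only
    flush_timeout: 200ms
    max_size: 10000000  # split any batch larger than 10 MB
```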
Tip: React with 👍 to help prioritize this issue. Please use comments to provide useful context, avoiding "+1" or "me too", to help us triage it. Learn more here.