
StoreAPI batching #7929

Open
GiedriusS opened this issue Nov 19, 2024 · 2 comments

Comments

@GiedriusS
Member

Is your proposal related to a problem?

Sending responses one-by-one from StoreAPI is not ideal:

  • Compression suffers because compression is per-message
  • We have to allocate many small objects, which increases GC pressure

Since each response is small, my proposal is to pack multiple responses into one message.

Describe the solution you'd like

A tunable batch-size parameter at the StoreAPI level (the request carries the batch size and the response is batched accordingly) and inside the querier.
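A minimal sketch of what response batching could look like on the server side, in Go. All names here (`Series`, `batchSeries`) are hypothetical stand-ins for illustration, not the Thanos storepb API:

```go
package main

import "fmt"

// Series stands in for storepb.Series; the real type lives in
// pkg/store/storepb. This is a simplified placeholder.
type Series struct{ Labels string }

// batchSeries groups the input into slices of at most batchSize, so a
// server could send one gRPC message per batch instead of one message
// per series.
func batchSeries(in []Series, batchSize int) [][]Series {
	if batchSize <= 0 {
		batchSize = 1
	}
	var out [][]Series
	for len(in) > 0 {
		n := batchSize
		if n > len(in) {
			n = len(in)
		}
		out = append(out, in[:n])
		in = in[n:]
	}
	return out
}

func main() {
	series := []Series{{"a"}, {"b"}, {"c"}, {"d"}, {"e"}}
	for _, b := range batchSeries(series, 2) {
		fmt.Println(len(b)) // batch sizes: 2, 2, 1
	}
}
```

With a larger batch size, each gRPC message carries more data, so per-message compression sees more redundancy and far fewer response objects are allocated per series.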

Describe alternatives you've considered

N/A

Additional context

N/A

@harry671003
Contributor

Is the proposal to batch multiple Series in the Series response?

I'm wondering if gRPC under the hood automatically batches multiple SeriesResponse messages into one TCP request.

message SeriesResponse {
  oneof result {
    // Note: proto3 does not allow repeated fields directly inside a
    // oneof, so a batch would need a wrapper message:
    SeriesBatch batch = 1; // Batch?
    string warning = 2;
    google.protobuf.Any hints = 3;
  }
}

message SeriesBatch {
  repeated Series series = 1;
}


@GiedriusS
Member Author

It's not about the transport layer but about reducing GC pressure by allocating far fewer storepb.SeriesResponse objects on both sides. It's also a prelude to further possible improvements, like adding a symbol table; with a batch size of 1, something like that doesn't make sense.

2 participants