Describe the bug

When using parameterised queries the message size is limited, so inserting with more than ~3000 boundRows throws the following error in the API:

```
java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.RecordTooLargeException:
The message is 2584882 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
```
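For context, the limit comes from the Kafka producer's max.request.size setting (1 MB by default), so the ~2.5 MB serialized batch above is rejected before it reaches the broker. A minimal sketch of how that limit could be raised on the producer side, assuming direct access to the producer configuration; the property names are from the standard Kafka Java client, but the surrounding setup is illustrative, not this project's actual wiring:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerSizeConfig {
    public static KafkaProducer<String, String> buildProducer(String bootstrapServers) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Raise the per-request size limit above the ~2.5 MB batch seen in the error.
        // Note: the broker-side limits (message.max.bytes / max.message.bytes) must
        // also allow this size, otherwise the broker rejects the record instead.
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 4 * 1024 * 1024); // 4 MB
        return new KafkaProducer<>(props);
    }
}
```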
Workaround
Either split the queries so that each uses at most ~3000 boundRows (see the sketch below), or use the api/noAnalytics/update endpoint.
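As an illustration of the first workaround, here is a minimal client-side chunking sketch. The MAX_BOUND_ROWS constant and the executeBatch callback are hypothetical names for illustration, not part of the project's API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class BoundRowChunker {
    // Hypothetical limit, derived from the observed failure threshold.
    private static final int MAX_BOUND_ROWS = 3000;

    /** Splits boundRows into chunks and hands each chunk to the batch executor. */
    public static <T> void insertInChunks(List<T> boundRows, Consumer<List<T>> executeBatch) {
        for (int start = 0; start < boundRows.size(); start += MAX_BOUND_ROWS) {
            int end = Math.min(start + MAX_BOUND_ROWS, boundRows.size());
            // subList is a view; copy it so the executor can safely retain the chunk.
            executeBatch.accept(new ArrayList<>(boundRows.subList(start, end)));
        }
    }
}
```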
Maybe this should also be looked at on the API side @Stratidakos
Hi @MarieSaphira, there is no plan to fix this issue at the moment. It has to do with very large batch inserts, which are mostly done to accommodate fast data ingestion. We believe that as long as it supports some thousands of queries per batch, this should be acceptable.