Chunk logs to avoid hitting 1MB limit in putLogEvents #28
For reference, here is how I handle it in cwlogs-writable, which additionally handles the maximum of 10,000 log events per PutLogEvents call: https://github.com/amekkawi/cwlogs-writable/blob/38ed37ab16aca9c249fa5747e574363508afc5d7/lib/index.js#L420

The AWS docs are accurate but could be a little more specific about what they mean by "sum of all event messages". I tested this out, and it means just the "message" property of each log event, plus 26 bytes. So you can ignore all the other JSON that makes up the PutLogEvents call, including the "timestamp".

Note: I notice that I'm simply using the string length in getMessageSize(). I'll probably change that to multiply the length by 4 (the maximum bytes per UTF-8 character), since measuring the exact byte count is likely to be expensive.
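A minimal sketch of that batching logic, assuming plain `{ timestamp, message }` log events (helper names like `chunkEvents` are illustrative, not the actual cwlogs-writable code):

```js
// Limits from the PutLogEvents documentation.
var MAX_BATCH_BYTES = 1048576; // 1 MiB per request
var MAX_BATCH_COUNT = 10000;   // max log events per request
var EVENT_OVERHEAD = 26;       // bytes counted per event on top of the message

// Conservative estimate: 4 bytes is the UTF-8 worst case per character,
// which avoids the cost of measuring the exact encoded byte length.
function getMessageSize(message) {
  return message.length * 4 + EVENT_OVERHEAD;
}

// Split events into batches that stay under both the byte and the
// event-count limits. A single message larger than the byte limit
// would still need truncation, which this sketch does not handle.
function chunkEvents(events) {
  var batches = [];
  var batch = [];
  var batchBytes = 0;

  events.forEach(function (event) {
    var size = getMessageSize(event.message);
    var full = batch.length >= MAX_BATCH_COUNT ||
      batchBytes + size > MAX_BATCH_BYTES;
    if (full && batch.length > 0) {
      batches.push(batch);
      batch = [];
      batchBytes = 0;
    }
    batch.push(event);
    batchBytes += size;
  });
  if (batch.length > 0) batches.push(batch);
  return batches;
}
```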
I can confirm this bug is there.
Just ran into this bug in a production environment 🤕 as well. Does anybody have a workaround or a pull request?
Just in case anybody else runs into the process crashing when this error happens, I was able to prevent it by properly defining an onError function on the CloudWatchStream after it's created.
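For anyone else hitting the crash, a sketch of that workaround (the group/stream names are placeholders, and it assumes the usual bunyan raw-stream setup; the key part is assigning onError on the stream instance, as described above):

```js
var bunyan = require('bunyan');
var createCWStream = require('bunyan-cloudwatch');

var stream = createCWStream({
  logGroupName: 'my-group',   // placeholder
  logStreamName: 'my-stream', // placeholder
  cloudWatchLogsOptions: { region: 'us-east-1' }
});

// Without a handler, a failed PutLogEvents call (e.g. a request over
// the 1MB limit) surfaces as an unhandled error and kills the process.
stream.onError = function (err) {
  console.error('CloudWatch logging error:', err);
};

var log = bunyan.createLogger({
  name: 'app',
  streams: [{ type: 'raw', stream: stream }]
});
```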
From the docs (http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/CloudWatchLogs.html#putLogEvents-property): "The maximum batch size is 1,048,576 bytes. This size is calculated as the sum of all event messages in UTF-8, plus 26 bytes for each log event."

It would be nice to limit/chunk the amount of data uploaded, since a request over the limit returns an error and crashes Node.js.
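As a sketch of what that chunked upload could look like, each batch is sent in sequence with the nextSequenceToken from one response threaded into the next request (this reuses the illustrative chunkEvents() helper from the comment above; it is not the module's actual code):

```js
// Upload events in batches, carrying the sequence token from one
// putLogEvents response into the next request (AWS SDK for JavaScript).
function putAllLogEvents(cwlogs, params, events, callback) {
  var batches = chunkEvents(events);
  var sequenceToken = params.sequenceToken;

  (function next(i) {
    if (i >= batches.length) return callback(null);
    cwlogs.putLogEvents({
      logGroupName: params.logGroupName,
      logStreamName: params.logStreamName,
      sequenceToken: sequenceToken,
      logEvents: batches[i]
    }, function (err, res) {
      if (err) return callback(err);
      sequenceToken = res.nextSequenceToken;
      next(i + 1);
    });
  })(0);
}
```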