Logstash-output-s3 can't save logs to multiple buckets #37
Comments
I haven't tested this personally, but can you try adding a different
I've also run into this bug; it took quite a while to track down. In retrospect, the "S3: Cannot delete the temporary file since it doesn't exist on disk" messages in the log were pointing to this issue. If you don't set tags, the files start overwriting each other.
@codewaffle - regarding the OP: did you have two s3 outputs (with different buckets) in if blocks, the same as the original post? And was it solved by adding tags?
It was the same bucket, different prefixes. I'm a little fuzzy on the rest of the details after so much time. Here's the config I'm using now; IIRC, all I did to go from 'not working' to 'working' was add the tags:
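The config itself wasn't captured in this thread. A minimal sketch of an s3 output with the plugin's `tags` option set, along the lines described (bucket name, prefix, and thresholds are placeholders):

```
output {
  s3 {
    bucket => "my-logs-bucket"   # placeholder; credentials come from the usual AWS settings
    region => "us-east-1"
    prefix => "app1/"
    tags => ["app1"]             # distinct tags per output; they are appended to the uploaded file names
    size_file => 2048            # rotate after 2 KB
    time_file => 5               # or after 5 minutes
  }
}
```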
I can confirm the problem, and that adding tags resolves the issue. I had multiple s3 outputs, and without the tags the files overwrote each other. My symptoms manifested as each bucket getting the same log file, and the temporary-file error quoted above appeared in my Logstash logs.
With the support of dynamic prefixes and UUIDs, that situation should be fixed; see #102.
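Assuming a plugin version with dynamic prefix support as referenced above, one output can route events by field values instead of duplicating outputs per bucket path; a sketch (bucket name and field names are illustrative):

```
output {
  s3 {
    bucket => "my-logs-bucket"       # placeholder bucket
    region => "us-east-1"
    prefix => "%{logtype}/%{env}/"   # dynamic prefix: one output, per-event paths
    time_file => 5
  }
}
```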
I am trying to send my Kubernetes logs to S3 using Logstash (v5.4.3) and logstash-output-s3 (5.0.7). Logstash is creating logstash-programmatic-access-test-object but not the actual log files. Below are my input and output configurations. Am I using the wrong input plugin? Please do suggest.
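The input and output configurations mentioned above weren't captured in this thread. A minimal sketch of the kind of setup being described, assuming container logs are read from the node's filesystem (all paths, bucket names, and regions are placeholders):

```
input {
  file {
    path => "/var/log/containers/*.log"  # hypothetical node path for container logs
    codec => "json"
  }
}

output {
  s3 {
    bucket => "my-k8s-logs"              # placeholder bucket
    region => "us-east-1"
    prefix => "k8s/"
    codec => "json_lines"
    size_file => 2048
    time_file => 5
  }
}
```

The logstash-programmatic-access-test-object is the plugin's write-access check, so its presence suggests credentials and bucket permissions are fine; if no log files follow, likely causes are events never reaching the output or the size_file/time_file rotation thresholds not yet being hit.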
Hi all,
I made a config for Logstash. I want logs that match logtype == request and env == prd to be saved to Bucket1, and logs that match logtype == request and env == stg to be saved to Bucket2. However, all logs are saved to Bucket1.
At first I thought there was something wrong with my configuration, or that the Logstash conditionals weren't working properly. So I removed the part of the config that saves logs to Bucket2 to check the conditional, and it worked correctly.
That's why I think logstash-output-s3 doesn't allow saving logs to multiple buckets.
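The config itself wasn't included here; a sketch of the two-bucket setup being described, with the tags workaround from the earlier comments applied (bucket names and prefixes are placeholders):

```
output {
  if [logtype] == "request" and [env] == "prd" {
    s3 {
      bucket => "bucket1"
      region => "us-east-1"
      prefix => "request/"
      tags => ["prd-request"]   # distinct tags per output, per the workaround above
      time_file => 5
    }
  }
  if [logtype] == "request" and [env] == "stg" {
    s3 {
      bucket => "bucket2"
      region => "us-east-1"
      prefix => "request/"
      tags => ["stg-request"]
      time_file => 5
    }
  }
}
```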