Redis streams support #82
Comments
I think this would be a great addition 👍 Redis Streams are a perfect match for Logstash, IMHO. Would love to see it implemented 😄
@moonglum I am not working on this (I also don't have any experience with Ruby), so PRs are certainly welcome. While I think the cleanest solution would be to wait until elastic/logstash#8514 is implemented for proper acknowledgment of the stream, having the stream structure available with acknowledgments similar to how the sqs and kafka plugins do it now would certainly add value for a lot of people right away. For people that don't use ephemeral Logstash instances it is not a problem to depend on the persistent queue mechanism. Another option to add "proper" acknowledgments would be a "companion" plugin that users add themselves as the last output in the output section, alongside this plugin in the input section. That would only work for single-pipeline setups, though that is what most people are using anyway.
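To make the acknowledgment discussion concrete, here is a minimal sketch of reading a batch from a stream via a consumer group and acknowledging each entry only after it has been handed to the pipeline. This is an illustration, not the plugin's real API: the method name `read_stream_batch` is invented, and the client is only assumed to behave like redis-rb (`xreadgroup`/`xack`).

```ruby
# Sketch: at-least-once consumption of a Redis Stream with a consumer group.
# Assumes a redis-rb style client; names here are illustrative only.
def read_stream_batch(client, stream:, group:, consumer:)
  # ">" asks for entries never delivered to this consumer group before
  reply = client.xreadgroup(group, consumer, stream, ">", count: 50)
  (reply[stream] || []).each do |id, fields|
    yield id, fields               # hand the entry to the pipeline as an event
    client.xack(stream, group, id) # acknowledge only after it was processed
  end
end
```

If Logstash crashes between the read and the XACK, the entry stays in the group's pending list and can be re-delivered, which is the same at-least-once guarantee the sqs and kafka inputs provide.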
+1!!!! Would be fantastic to have this addition. We are using Debezium Server for change data capture, and the change messages are sent to Redis (Stream). If this plugin supported the Stream data type, it would make it a lot easier to ship the messages to our Elastic stack.
+1. We need messages sent to a Stream to be saved in Elasticsearch.
Redis 5.0 introduced an interesting new data type, Streams, "which models a log data structure" (https://redis.io/topics/streams-intro).
This data type seems perfect for Logstash.
Currently the redis input plugin supports reading from the following data_types:
- list
- channel
- pattern_channel

This issue is a feature request for support of a fourth data_type: stream.
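A hypothetical configuration for the requested data_type might look like the following. Note that `stream` is not an accepted value of `data_type` today; this only sketches how it could fit alongside the existing options:

```
input {
  redis {
    host      => "localhost"
    key       => "mystream"   # name of the stream to read from
    data_type => "stream"     # hypothetical value, not implemented yet
  }
}
```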
The problem of when to send XACK can be solved the way other queue plugins (sqs, kafka) do it: by acknowledging each entry directly, or in batches, until elastic/logstash#8514 provides a better way to do this.
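The batched variant mentioned above could be sketched as follows. This is a hedged example, not the plugin's actual code: `StreamAcker` and `flush_size` are invented names, and the client is only assumed to expose a redis-rb style `xack`.

```ruby
# Sketch: collect stream entry ids and acknowledge them in batches,
# similar to how the sqs and kafka inputs batch their acknowledgments.
# All names here are illustrative assumptions.
class StreamAcker
  def initialize(client, stream, group, flush_size: 100)
    @client = client
    @stream = stream
    @group = group
    @flush_size = flush_size
    @pending = []
  end

  # Record an entry id after its event has been pushed into the pipeline;
  # a single XACK per batch cuts round trips to Redis.
  def ack_later(entry_id)
    @pending << entry_id
    flush if @pending.size >= @flush_size
  end

  def flush
    return if @pending.empty?
    # XACK stream group id [id ...] removes the entries from the pending list
    @client.xack(@stream, @group, *@pending)
    @pending.clear
  end
end
```

The trade-off is the usual one: larger batches mean fewer round trips but more entries that would be re-delivered after a crash, since unacknowledged ids stay in the consumer group's pending list.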