
Redis streams support #82

Open
rv2673 opened this issue May 11, 2020 · 4 comments

rv2673 commented May 11, 2020

Redis 5.0 introduced an interesting new data type, Streams, "which models a log data structure" (https://redis.io/topics/streams-intro).
This data type seems perfect for Logstash.

Currently the redis input plugin supports reading from the following data_types:
list, channel, pattern_channel

This issue is a feature request for support of a new data_type: stream.

The problem of when to send XACK can be solved the way other queue plugins (sqs, kafka) do it: by acknowledging directly or in batches, at least until elastic/logstash#8514 provides a better way to do this.
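As an illustration of the "acknowledge in batches" idea, here is a minimal sketch in Ruby. It assumes a `client` object responding to `xack(stream, group, *ids)` (the `redis` gem's interface); the class and parameter names (`BatchAcker`, `flush_size`) are illustrative and not taken from any existing plugin.

```ruby
# Buffers stream entry IDs and acknowledges them in batches,
# so the plugin issues one XACK per batch instead of one per event.
class BatchAcker
  def initialize(client, stream, group, flush_size: 100)
    @client = client       # redis client (responds to #xack)
    @stream = stream       # stream key
    @group = group         # consumer group name
    @flush_size = flush_size
    @pending = []          # entry IDs not yet acknowledged
  end

  # Call after an event has been handed to the pipeline queue.
  def ack(entry_id)
    @pending << entry_id
    flush if @pending.size >= @flush_size
  end

  # Acknowledge everything buffered so far in a single XACK call.
  def flush
    return if @pending.empty?
    @client.xack(@stream, @group, *@pending)
    @pending.clear
  end
end
```

A real plugin would also call `flush` on shutdown and perhaps on a timer, so acknowledgments are not held indefinitely on a quiet stream.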

@moonglum

I think this would be a great addition 👍 Redis Streams are a perfect match for logstash IMHO. Would love to see it implemented 😄
If you're not planning to work on this: Would PRs be welcome for this topic?

@rv2673
Author

rv2673 commented Nov 21, 2020

@moonglum I am not working on this (I also don't have any experience with Ruby), so PRs are certainly welcome.

While I think the cleanest solution would be implementing elastic/logstash#8514 for proper acknowledgment of the stream, having the stream structure available with acknowledgments similar to how sqs and kafka do it now would certainly add value for a lot of people right now. For people that don't run ephemeral Logstash instances it is no problem to depend on the persistent queue mechanism.

Another option to add "proper" acknowledgments would be a "companion" plugin that users add themselves as the last output in the output section, alongside this plugin in the input section. That would only work for single-pipeline setups (but that is what most people are using anyway).
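To make the companion-plugin idea concrete, a pipeline might look like the sketch below. Both plugin names (`redis_stream`, `redis_stream_ack`) and all option names are hypothetical; neither plugin exists today. The input would read entries with XREADGROUP, and the output would XACK them only after they have traversed the whole pipeline.

```
input {
  redis_stream {
    host     => "localhost"
    stream   => "logstash-events"
    group    => "logstash"
    consumer => "logstash-1"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Must come last: acknowledges the entry only once all
  # preceding outputs have handled the event.
  redis_stream_ack {
    host   => "localhost"
    stream => "logstash-events"
    group  => "logstash"
  }
}
```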

@namoshizun

+1! It would be fantastic to have this addition. We are using Debezium Server for change data capture, and the change messages are sent to Redis (Stream). If this plugin supported the Stream data type, it would make it a lot easier to ship the messages to our Elastic stack.

@sanasz91mdev

+1. We need messages sent to a Stream to be saved in Elasticsearch.


4 participants