
bug in input fluend codec(decode) / MsgPack decoder #7102 #11

Open
jsvd opened this issue May 26, 2017 · 2 comments

@jsvd
Member

jsvd commented May 26, 2017

moved from elastic/logstash#7102
created by @gasparch


When trying to send logs from Docker to Logstash using fluent, I discovered that Logstash is not able to parse the logs:

11:14:11.739 [Ruby-0-Thread-193: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-4.1.0/lib/logstash/inputs/tcp.rb:124] ERROR logstash.inputs.tcp - An error occurred. Closing connection {:client=>"xxxx.3:36058", :exception=>#<TypeError: no implicit conversion from nil to integer>, :backtrace=>["org/jruby/RubyTime.java:1073:in `at'", "org/logstash/ext/JrubyTimestampExtLibrary.java:216:in `at'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-fluent-3.0.2-java/lib/logstash/codecs/fluent.rb:46:in `decode'", "org/msgpack/jruby/MessagePackLibrary.java:195:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-fluent-3.0.2-java/lib/logstash/codecs/fluent.rb:42:in `decode'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-4.1.0/lib/logstash/inputs/tcp.rb:182:in `handle_socket'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-4.1.0/lib/logstash/inputs/tcp.rb:153:in `server_connection_thread'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-4.1.0/lib/logstash/inputs/tcp.rb:151:in `server_connection_thread'"]}

The same Docker container can send logs to Fluentd and they are processed normally. After some debugging, here is a minimal case that triggers the error.

  • Version:
    Logstash 1:5.4.0-1
  • Operating System:
    CentOS 7, kernel 3.10.0-514.6.1.el7.x86_64
  • Config File (if you have sensitive info, please remove it):

The only file changed from the default installation is /etc/logstash/conf.d/config.conf:

input {
        tcp {
                codec => fluent
                port => 4000
        }
}

output {
        stdout { 
                codec => rubydebug 
        }
}
  • Sample Data:

docker run --rm --name 'containName' --log-driver fluentd --log-opt fluentd-address=xxxxx.4:4000 --log-opt tag="test" busybox /bin/sh -c 'while true; do echo "test"; sleep 1; break; done'

  • Steps to Reproduce:

I managed to write minimal code that reproduces the error:

  1. On CentOS 7, install Ruby (ruby 2.0.0p648 (2015-12-16) [x86_64-linux]) and run gem install fluent-logger
  2. Run irb and issue the following commands:
require 'fluent-logger'
logger = Fluent::Logger::FluentLogger.new(nil, :host => 'xxxxx.4', :port => 4000)

The following command produces a log event in Logstash:

logger.post("some_tag", {"container_id" => "1111111111111111111111111111111" })

The following command causes an error in Logstash:

logger.post("some_tag", {"container_id" => "1111111111111111111111111111111X" })

Adding just one character completely breaks the MsgPack decoder in Logstash.
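One plausible explanation for this one-character threshold (my assumption, not confirmed in this thread): the 31-byte string still fits MessagePack's fixstr format, while the 32-byte string requires the str 8 format introduced in the newer MessagePack spec, which an older decoder may not understand. The two headers can be sketched without any gems:

```ruby
# Hedged sketch of the MessagePack string headers involved (assumption:
# the 31/32-byte length boundary is what changes the wire format).
def msgpack_str_header(len)
  if len <= 31
    [0xa0 | len].pack("C")    # fixstr (101XXXXX): length in the low 5 bits
  elsif len <= 255
    [0xd9, len].pack("C2")    # str 8: added in the 2013 MessagePack spec
  else
    raise ArgumentError, "sketch only covers lengths up to 255"
  end
end

puts msgpack_str_header(31).unpack1("H*")  # 31-byte container_id -> "bf"
puts msgpack_str_header(32).unpack1("H*")  # 32-byte container_id -> "d920"
```

If the decoder in Logstash's msgpack-jruby only understood the pre-2013 raw formats, a 0xd9 byte in the stream would throw its framing off for everything that follows.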

In the Docker logging case, Logstash wrongly decodes the first half of the packet (incorrectly decoding container_id), then tries to decode the remainder of the message as a MsgPack packet and fails there, because the remaining junk has no tag/epochtime/etc. fields.
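The TypeError in the backtrace is consistent with that reading: fluent.rb:46 unpacks each event as [tag, epochtime, record] and calls Timestamp.at(epochtime), so a misparsed stream that yields nil for epochtime raises exactly this error. A minimal sketch of the failure mode, using plain Time.at to stand in for LogStash::Timestamp.at (the event shape is assumed from the fluent forward protocol):

```ruby
# Sketch of the decode step at fluent.rb:46 (simplified): a junk event
# whose second element is nil instead of an epoch time.
tag, epochtime, record = ["some_tag", nil, { "junk" => true }]

begin
  Time.at(epochtime)  # stand-in for LogStash::Timestamp.at(epochtime)
rescue TypeError => e
  puts "decode failed: #{e.class}"  # the codec raises instead of skipping
end
```

This matches the behavior in the TCP input log above: the connection is closed on the first bad event rather than resynchronizing on the stream.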

@jsvd jsvd added the bug label May 26, 2017
@jalberto

jalberto commented Jul 27, 2017

Is there any progress on this?

I am trying to send Docker logs to Logstash using the Docker fluentd logging driver, but the logs get parsed incorrectly by Logstash on their way into Elasticsearch.

@gasparch

We just installed Fluentd and made it feed data directly to ES :)
