SQL transaction log reader #7
@lbradstreet I would like to work on this task, but I need some help with this. |
I think @lbradstreet is referring to this type of stream construction from DB replication logs: the project I referred to, by Martin Kleppmann and team, has implemented parsing for both MySQL and PostgreSQL logs. Very exciting stuff, but not as elegant as Onyx. :-) |
Is the project still running? |
This might be a good watch if you're interested in looking at reading the binlog https://www.percona.com/live/data-performance-conference-2016/sessions/mining-mysqls-binary-log-apache-kafka-and-kafka-connect |
thanks |
Yes, I think we would implement it with https://github.com/shyiko/mysql-binlog-connector-java. I'm not sure how we would go about making it generic yet. I'd start by making it specific to MySQL and hopefully we can generalise later. You may wish to read through the onyx-kafka code, since it also reads from a "log" of sorts. https://github.com/onyx-platform/onyx-kafka/blob/0.9.x/src/onyx/plugin/kafka.clj I think implementing it in this plugin would be fine. It'd also be ok to make it a separate plugin to keep the dependencies down though. Maybe start with it in this plugin? |
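For orientation, here is a purely illustrative sketch of what a catalog entry for a binlog input task might look like if it were modelled on onyx-kafka's read-messages entry. The :onyx/* keys are standard Onyx catalog keys; the :onyx/plugin keyword and everything under :mysql/* are hypothetical names, not an existing plugin API.

```clojure
;; Illustrative only: a binlog reader slotted in as an ordinary :onyx/type :input task.
(def catalog
  [{:onyx/name :read-binlog
    :onyx/plugin :onyx.plugin.mysql-binlog/read-log ; hypothetical plugin entry point
    :onyx/type :input
    :onyx/medium :mysql-binlog
    :onyx/max-peers 1
    :onyx/batch-size 100
    :mysql/host "127.0.0.1"                         ; hypothetical connection keys
    :mysql/port 3306
    :mysql/user "replicator"
    :mysql/password "secret"
    :onyx/doc "Reads row change events from the MySQL binary log"}])
```

The actual key names would be decided when the plugin is written; the point is only that the binlog reader would behave like any other input task in the catalog.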
I ran a simple example to pull data from the MySQL binlog file on our production server:

```clojure
(ns onyx-binlog.core
  (:require [clojure.core.async :as async :refer [>!! <!! timeout chan thread close!]]
            [clojure.java.data :as data :refer [from-java]])
  (:import [com.github.shyiko.mysql.binlog BinaryLogClient BinaryLogClient$EventListener]))

;; Buffered channel decoupling the binlog listener thread from consumers.
(def binlog-ch (chan 1000))

(let [client   (BinaryLogClient. "10.0.146.10" 3306 "bigdatatongbu" "bigdatatongbu")
      ;; Convert each raw binlog Event into a Clojure map and put it on the channel.
      listener (reify BinaryLogClient$EventListener
                 (onEvent [this event]
                   (>!! binlog-ch (from-java event))))
      _        (doto client
                 (.registerEventListener listener)
                 (.connect (* 1000 5)))] ; connect with a 5 second timeout
  (while true
    (prn (<!! binlog-ch))))
```

The related event data is:
I think it's hard to integrate into the current plugin as the data model is not consistent. Can you suggest something? |
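One way to make the data model more uniform, sketched below under the assumption of row-based replication, is to convert each raw event from mysql-binlog-connector-java into flat per-row segments before it enters the workflow. The namespace, the event->segments function, and the segment shape ({:op ... :db ... :table ... :row ...}) are illustrative choices, not part of any existing plugin; the event classes are the library's real ones.

```clojure
;; A minimal sketch of normalising raw binlog events into flat Onyx-style segments.
;; Assumes row-based replication and the shyiko mysql-binlog-connector-java event classes.
(ns onyx-binlog.normalize
  (:import [com.github.shyiko.mysql.binlog.event
            Event TableMapEventData
            WriteRowsEventData UpdateRowsEventData DeleteRowsEventData]))

;; TABLE_MAP events carry the database/table names for the table-id used by the
;; row events that follow, so keep a small table-id -> names cache.
(def table-map (atom {}))

(defn event->segments
  "Turns a single binlog Event into zero or more segment maps of the form
   {:op :insert/:update/:delete :db ... :table ... :row ...}."
  [^Event event]
  (let [data (.getData event)]
    (cond
      (instance? TableMapEventData data)
      (do (swap! table-map assoc (.getTableId ^TableMapEventData data)
                 {:db    (.getDatabase ^TableMapEventData data)
                  :table (.getTable ^TableMapEventData data)})
          [])

      (instance? WriteRowsEventData data)
      (let [{:keys [db table]} (@table-map (.getTableId ^WriteRowsEventData data))]
        (for [row (.getRows ^WriteRowsEventData data)]
          {:op :insert :db db :table table :row (vec row)}))

      (instance? DeleteRowsEventData data)
      (let [{:keys [db table]} (@table-map (.getTableId ^DeleteRowsEventData data))]
        (for [row (.getRows ^DeleteRowsEventData data)]
          {:op :delete :db db :table table :row (vec row)}))

      (instance? UpdateRowsEventData data)
      (let [{:keys [db table]} (@table-map (.getTableId ^UpdateRowsEventData data))]
        (for [entry (.getRows ^UpdateRowsEventData data)] ; Map.Entry of before/after row arrays
          {:op :update :db db :table table
           :before (vec (.getKey entry)) :after (vec (.getValue entry))}))

      :else [])))
```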
I also checked the mypipe/maxwell library (https://github.com/mardambey/mypipe) and its mechanism. I will try running the related DDL and DML to test the replication mechanism, but I also need a way to get the updated data within a window.
|
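For the "updated data within a window" part, Onyx's windowing support could group the binlog-derived segments by time. A heavily hedged sketch of a window catalog entry is below; the key names follow Onyx's windowing documentation, while the task name and the :binlog/timestamp window key are assumptions about how the segments would be shaped.

```clojure
;; Illustrative window catalog entry: collect row-change segments into fixed
;; five-minute windows, keyed on an assumed :binlog/timestamp field.
(def windows
  [{:window/id :collect-row-changes
    :window/task :process-binlog           ; assumed downstream task name
    :window/type :fixed
    :window/aggregation :onyx.windowing.aggregation/conj
    :window/window-key :binlog/timestamp   ; assumed segment key
    :window/range [5 :minutes]}])
```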
Interesting. Thanks for getting onto this. It looks like you're on the right track.
|
I think the event action is not consistent with the data, but I can switch the log mode to a lower level. I will ask the DBA to switch the database to row mode tomorrow. |
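For reference, switching a MySQL server to row-based logging can be done at runtime or in the server configuration; both of the following are standard MySQL settings, shown here only as a reminder of what the DBA would change:

```sql
-- At runtime (takes effect for new sessions; needs an admin-level privilege):
SET GLOBAL binlog_format = 'ROW';

-- Or persistently in my.cnf, under the [mysqld] section:
-- binlog_format = ROW
```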
I will develop the plugin based on row mode for MySQL; it is consistent with the current read-log logic now. But I think it's a good idea to build another plugin for the log reader. Can you suggest something? |
I already wrote a simple example with messy, hardcoded code. I will adapt it to the plugin, so can you give me some suggestions about integration with the existing plugin, since the binlog is database-level data? |
This could mostly copy code from the datomic plugin's read-log.
Unfortunately this would have to be per database.
A reasonable looking Java library to help is:
https://github.com/shyiko/mysql-binlog-connector-java
I'd love to help a newbie with this one. Otherwise it may have to wait for client demand.
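To mirror the datomic plugin's read-log behaviour of restarting from an acknowledged offset, the binlog reader would need to resume from a checkpointed binlog filename and position. A rough sketch follows: setBinlogFilename, setBinlogPosition, and connect are real BinaryLogClient methods, while resume-client and the checkpoint map shape are hypothetical.

```clojure
;; Sketch: resume a BinaryLogClient from a previously checkpointed position,
;; analogous to a read-log task restarting from an acked offset.
(defn resume-client
  [^com.github.shyiko.mysql.binlog.BinaryLogClient client
   {:keys [binlog-filename binlog-position] :as checkpoint}]
  (when checkpoint
    (doto client
      (.setBinlogFilename binlog-filename)
      (.setBinlogPosition binlog-position)))
  ;; Blocks until connected or the 5 second timeout expires.
  (.connect client (* 5 1000))
  client)
```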