
canal: upstream database metadata larger than 64 MB causes an OutOfMemoryError in fastjson #5389

Open
hygugb opened this issue Jan 23, 2025 · 1 comment

hygugb commented Jan 23, 2025

  • I have searched the issues of this repository and believe that this is not a duplicate.
  • I have checked the FAQ of this repository and believe that this is not a duplicate.

environment

  • canal version 1.1.7 and 1.1.8
  • mysql version MySQL 8.0.40 (used as tsdb storage)

Issue Description

We use canal to subscribe to the binlog of a MySQL business database and deliver events to Kafka, with the canal cluster's tsdb backed by MySQL. On the first subscription to the business database, a java.lang.OutOfMemoryError is thrown.

Analysis showed that because the business database contains a very large number of tables, when canal fetches the metadata on first subscription and stores it in the meta_snapshot table, the metadata exceeds 64 MB. This causes fastjson's JSON.toJSONString method to throw an OOM, so the subscription task cannot start and keeps restarting.
After changing "snapshotDO.setData(JSON.toJSONString(schemaDdls));" in the com.alibaba.otter.canal.parse.inbound.mysql.tsdb.DatabaseTableMeta.applySnapshotToDB method to "snapshotDO.setData(JSON.toJSONString(schemaDdls, JSONWriter.Feature.LargeObject));", recompiling, and replacing canal.parse-1.1.8.jar, the problem was resolved.

I hope the next release can support large objects like this.
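The one-line change described above can be sketched as follows. This is a minimal, self-contained illustration of fastjson2's `JSONWriter.Feature.LargeObject` flag; the map contents here are placeholders, not the actual canal schema DDLs, which in the reported setup exceed 64 MB:

```java
import com.alibaba.fastjson2.JSON;
import com.alibaba.fastjson2.JSONWriter;

import java.util.LinkedHashMap;
import java.util.Map;

public class LargeObjectDemo {
    public static void main(String[] args) {
        // Placeholder for the schema DDL map that canal serializes in
        // DatabaseTableMeta.applySnapshotToDB; real payloads can exceed 64 MB.
        Map<String, String> schemaDdls = new LinkedHashMap<>();
        schemaDdls.put("db.table1", "CREATE TABLE table1 (id BIGINT PRIMARY KEY)");

        // Default serialization: fastjson2 enforces a safety limit on the
        // writer buffer and throws OutOfMemoryError when the output grows
        // past roughly 64 MB (not reached by this tiny placeholder map).
        String plain = JSON.toJSONString(schemaDdls);

        // With the LargeObject feature the 64 MB limit is lifted, which is
        // the fix applied to canal.parse in the report above. For small
        // inputs the output is identical.
        String large = JSON.toJSONString(schemaDdls, JSONWriter.Feature.LargeObject);

        System.out.println(plain.equals(large));
    }
}
```

Note that lifting the limit trades safety for capacity: a corrupt or runaway object graph can then consume arbitrary heap, so the feature is best applied only at call sites known to produce legitimately large payloads, as done here.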

If there is an exception, please attach the exception trace:

2025-01-22 19:41:02.142 [destination = wms4-shard05-118 , address = r-3306-xxxxxx-prod.service.consul/10.xx.xx.xxx:3306 , EventParser] ERROR c.a.o.c.p.inbound.mysql.rds.RdsBinlogEventParserProxy - dump address r-3306-xxxxxx-prod.service.consul/10.xx.xx.xxx:3306 has an error, retrying. caused by 
java.lang.OutOfMemoryError: null
	at com.alibaba.fastjson2.JSONWriterUTF16.ensureCapacity(JSONWriterUTF16.java:1250)
	at com.alibaba.fastjson2.JSONWriterUTF16.writeStringEscape(JSONWriterUTF16.java:366)
	at com.alibaba.fastjson2.JSONWriterUTF16JDK8UF.writeString(JSONWriterUTF16JDK8UF.java:165)
	at com.alibaba.fastjson2.writer.ObjectWriterImplMap.write(ObjectWriterImplMap.java:481)
	at com.alibaba.fastjson2.JSON.toJSONString(JSON.java:2373)
	at com.alibaba.otter.canal.parse.inbound.mysql.tsdb.DatabaseTableMeta.applySnapshotToDB(DatabaseTableMeta.java:372)
	at com.alibaba.otter.canal.parse.inbound.mysql.tsdb.DatabaseTableMeta.rollback(DatabaseTableMeta.java:176)
	at com.alibaba.otter.canal.parse.inbound.mysql.AbstractMysqlEventParser.processTableMeta(AbstractMysqlEventParser.java:144)
	at com.alibaba.otter.canal.parse.inbound.AbstractEventParser$1.run(AbstractEventParser.java:192)
	at java.lang.Thread.run(Thread.java:748)
2025-01-22 19:41:02.145 [destination = wms4-shard05-118 , address = r-3306-xxxxxx-prod.service.consul/10.xx.xx.xxx:3306 , EventParser] ERROR com.alibaba.otter.canal.common.alarm.LogAlarmHandler - destination:wms4-shard05-118[java.lang.OutOfMemoryError
	at com.alibaba.fastjson2.JSONWriterUTF16.ensureCapacity(JSONWriterUTF16.java:1250)
	at com.alibaba.fastjson2.JSONWriterUTF16.writeStringEscape(JSONWriterUTF16.java:366)
	at com.alibaba.fastjson2.JSONWriterUTF16JDK8UF.writeString(JSONWriterUTF16JDK8UF.java:165)
	at com.alibaba.fastjson2.writer.ObjectWriterImplMap.write(ObjectWriterImplMap.java:481)
	at com.alibaba.fastjson2.JSON.toJSONString(JSON.java:2373)
	at com.alibaba.otter.canal.parse.inbound.mysql.tsdb.DatabaseTableMeta.applySnapshotToDB(DatabaseTableMeta.java:372)
	at com.alibaba.otter.canal.parse.inbound.mysql.tsdb.DatabaseTableMeta.rollback(DatabaseTableMeta.java:176)
	at com.alibaba.otter.canal.parse.inbound.mysql.AbstractMysqlEventParser.processTableMeta(AbstractMysqlEventParser.java:144)
	at com.alibaba.otter.canal.parse.inbound.AbstractEventParser$1.run(AbstractEventParser.java:192)
	at java.lang.Thread.run(Thread.java:748)
]
@hygugb changed the title on Jan 23, 2025 to fix a typo ("fasjson" → "fastjson").

agapple (Member) commented Jan 24, 2025

You can submit a PR to me.
