
[Bug] [Clickhouse to localfile] exported row count wrong: expected 1,000,000 but got 2,000,000 #8803

zhaohongyan133 opened this issue Feb 24, 2025 · 1 comment
Search before asking

  • I had searched in the issues and found no similar issues.

What happened


I want to export 100 or 1,000,000 rows, but 200 or 2,000,000 rows are actually exported. This happens roughly once every 10-20 runs.

SeaTunnel Version

2.3.9

SeaTunnel Config

env {
  parallelism = 4
  job.mode = "BATCH"
  checkpoint.interval = 100000
}

source {
  Clickhouse {
    host = "192.168.0.22:8123"
    database = "default"
    sql = "select * from argus.test limit 100"
    username = "default"
    password = "xxx"
    clickhouse.config = {
      "socket_timeout": "300000"
    }
  }
}

sink {

  LocalFile {
    path = "/home/zhaohy/seatunnel/output/"
    file_format_type = "csv"
    field_delimiter = ","
    row_delimiter = "\n"
  }

}

Running Command

./bin/seatunnel.sh --config config/argus_clickhouse_2_csv_100.conf -m local

Error Exception

no error

Zeta or Flink or Spark Version

zeta

Java or Scala Version

1.8

Screenshots

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

@liunaijie (Member)

Try setting parallelism to 1.
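The suggestion above amounts to one change in the job's env block (the rest of the config stays as posted). A minimal sketch: with a single parallel reader, the source SQL is executed by only one task, which should avoid the duplicated output rows.

env {
  parallelism = 1          # was 4; one reader, one execution of the source query
  job.mode = "BATCH"
  checkpoint.interval = 100000
}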
