I want to archive the topic `_schemas` in BigQuery (it contains only events with a JSON key and a JSON value, not serialized with a JSON schema), so I tried to use the schemaless functionality of the connector, but it fails with:
```
com.wepay.kafka.connect.bigquery.exception.BigQueryConnectException: Failed to unionize schemas of records for the table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=schema_metadata, tableId=_schemas}}
Caused by: Could not convert to BigQuery schema with a batch of tombstone records.
    at com.wepay.kafka.connect.bigquery.SchemaManager.getTableInfo(SchemaManager.java:297)
    at com.wepay.kafka.connect.bigquery.SchemaManager.createTable(SchemaManager.java:240)
    at com.wepay.kafka.connect.bigquery.write.row.AdaptiveBigQueryWriter.attemptTableCreate(AdaptiveBigQueryWriter.java:161)
    at com.wepay.kafka.connect.bigquery.write.row.AdaptiveBigQueryWriter.performWriteRequest(AdaptiveBigQueryWriter.java:102)
    at com.wepay.kafka.connect.bigquery.write.row.BigQueryWriter.writeRows(BigQueryWriter.java:112)
```
Has anyone found a way to send Kafka events without a schema to BigQuery?

Thanks
@raphaelauv, the connector does not support table creation or schema updates for JSON input. If you create the table manually with an appropriate schema and send a payload of type `Map`, data ingestion should not fail.
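For reference, a hedged sketch of that workaround. The connector name, project ID, and property names below are illustrative and may differ across connector versions; the topic and dataset come from the stack trace. The idea is to create the BigQuery table yourself, disable automatic table creation, and use `JsonConverter` with `schemas.enable=false` so each record is deserialized into a `Map`:

```json
{
  "name": "bq-sink-schemas",
  "config": {
    "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
    "topics": "_schemas",
    "project": "my-gcp-project",
    "defaultDataset": "schema_metadata",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "autoCreateTables": "false"
  }
}
```

With `autoCreateTables` off, the failing `SchemaManager.createTable` path from the stack trace should never be hit; the pre-created table's schema must match the fields present in the JSON payloads.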
has anyone found a way to solve this?
The only thing I could get working is to use a `StringConverter` and the `HoistField` SMT to wrap the whole payload in a single value, but then the table just has a single column containing the entire JSON as a string. I would love a way to have the schema inferred from the JSON and flattened.
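A hedged sketch of that `StringConverter` + `HoistField` setup, in case it helps others; the field name `payload` is an arbitrary choice:

```json
{
  "value.converter": "org.apache.kafka.connect.storage.StringConverter",
  "transforms": "hoist",
  "transforms.hoist.type": "org.apache.kafka.connect.transforms.HoistField$Value",
  "transforms.hoist.field": "payload"
}
```

One way to make the single string column usable downstream is to parse it inside BigQuery with its JSON functions (e.g. `JSON_VALUE(payload, '$.some_field')`), rather than trying to get the connector to infer a schema.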