sqlalchemy.exc.ProgrammingError: (snowflake.connector.errors.ProgrammingError) 100069 (22P02): 01b7c685-0002-1d43-0004-fb6a0dcbae1a: Error parsing JSON: document is too large, max size 16777216 bytes
Ran into this today and I don't know how to resolve it. I thought at first that it was related to our tap-mongodb source extractor (since a MongoDB BSON document also has a 16 MB size limit), but some googling turned up this SO post, which suggests the error comes from Snowflake loading data from a stage more generally.
Someone shared that they were running into this same problem in a past Office Hours, and it was the motivation for this short exploration of excluding large JSON values: MeltanoLabs/meltano-map-transform#300.
One option folks could try to implement is adding ON_ERROR=CONTINUE to the COPY INTO statement that loads the staged files. Of course, ideas and PRs are welcome.
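For anyone who wants to experiment with that option by hand, here's a minimal sketch of what it looks like in a plain Snowflake statement. The table name, stage path, and file format below are hypothetical, and the loader generates its own COPY statement internally, so this only illustrates the ON_ERROR behavior rather than the target's actual SQL:

```sql
-- Hypothetical example: load staged JSON files but skip any row that fails to parse,
-- including documents over Snowflake's 16 MB VARIANT limit (error 100069).
COPY INTO raw.my_schema.my_table            -- hypothetical target table
FROM @my_internal_stage/my_prefix/          -- hypothetical stage path
FILE_FORMAT = (TYPE = 'JSON')
ON_ERROR = 'CONTINUE';                      -- skip failing rows instead of aborting the load

-- Afterwards, VALIDATE can report which rows were skipped in that load:
SELECT * FROM TABLE(VALIDATE(raw.my_schema.my_table, JOB_ID => '_last'));
```

The trade-off is that oversized documents would be silently dropped rather than failing the sync, so the target would probably also need some way of surfacing the skipped rows (e.g. via VALIDATE, as above).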