JSON "document is too large" error #275

Open
menzenski opened this issue Oct 18, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@menzenski

```
sqlalchemy.exc.ProgrammingError: (snowflake.connector.errors.ProgrammingError) 100069 (22P02): 01b7c685-0002-1d43-0004-fb6a0dcbae1a: Error parsing JSON: document is too large, max size 16777216 bytes
```

I ran into this today and I'm not sure how to resolve it. At first I thought it was related to our tap-mongodb source extractor (a MongoDB BSON document also has a 16 MB size limit), but some googling turned up this SO post, which suggests the error comes from Snowflake loading data from a stage and isn't specific to MongoDB.
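
For anyone hitting the same thing, here is a minimal sketch of how you might spot which records would blow past the limit before they reach the target. The 16777216-byte threshold comes straight from the error message; the `records` iterable here is hypothetical:

```python
import json

# Snowflake's limit from the error message: 16777216 bytes (16 MiB).
MAX_JSON_BYTES = 16 * 1024 * 1024

def oversized(record: dict) -> bool:
    """Return True if the serialized JSON for this record exceeds the limit."""
    payload = json.dumps(record, default=str, separators=(",", ":"))
    return len(payload.encode("utf-8")) >= MAX_JSON_BYTES

# Hypothetical batch of extracted records: flag the ones that would trigger
# "document is too large" when loaded.
records = [{"_id": "abc", "body": "x" * 20_000_000}, {"_id": "def", "body": "ok"}]
for rec in records:
    if oversized(rec):
        print(f"record {rec.get('_id')} exceeds {MAX_JSON_BYTES} bytes")
```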

@edgarrmondragon
Member

Someone shared that they were running into this same problem in a past Office Hours session, and it motivated this short exploration of excluding large JSON values: MeltanoLabs/meltano-map-transform#300.
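
In the same spirit as that exploration, here is a rough sketch (not what the linked PR does, just an illustration with a hypothetical per-property cap) of nulling out individual properties whose serialized JSON is too large to load:

```python
import json

# Hypothetical per-property cap; pick a threshold that suits your data.
MAX_PROPERTY_BYTES = 1 * 1024 * 1024

def drop_large_values(record: dict) -> dict:
    """Replace any property whose JSON serialization exceeds the cap with None."""
    cleaned = {}
    for key, value in record.items():
        size = len(json.dumps(value, default=str).encode("utf-8"))
        cleaned[key] = None if size > MAX_PROPERTY_BYTES else value
    return cleaned
```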

One option folks could try to implement is adding ON_ERROR=CONTINUE to the insert statement. Of course, ideas and PRs are welcome.
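
For reference, ON_ERROR is a Snowflake COPY INTO option. A rough sketch of what a stage load using it could look like, with hypothetical connection, table, stage, and file names (this is not the statement the target currently generates):

```python
import snowflake.connector

# Hypothetical connection parameters; substitute your own account details.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="my_wh", database="my_db", schema="my_schema",
)

# ON_ERROR = CONTINUE makes Snowflake skip rows that fail to parse
# (e.g. JSON documents over the 16777216-byte limit) instead of
# aborting the whole load.
copy_sql = """
COPY INTO my_table
FROM @my_stage/my_file.json.gz
FILE_FORMAT = (TYPE = 'JSON')
ON_ERROR = CONTINUE
"""

with conn.cursor() as cur:
    cur.execute(copy_sql)
    # Each result row describes one staged file, including how many rows
    # loaded and how many errors were seen.
    for row in cur.fetchall():
        print(row)

conn.close()
```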

@edgarrmondragon added the bug label on Oct 18, 2024