Hi. Many thanks for all the work you are doing.
I have a question about getting data from SQLite into DuckDB. In SQLite, I have tables/views with a massive number of rows (around 10 billion rows of transactional data). What is the easiest way to copy those tables into DuckDB to perform some analytics?
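For concreteness, this is the kind of copy I have in mind, sketched with DuckDB's sqlite extension (the file name `transactions.db` and table name `tx` are placeholders):

```python
import duckdb

con = duckdb.connect("analytics.duckdb")  # persistent DuckDB database file

# Load the sqlite extension and attach the SQLite file.
con.execute("INSTALL sqlite;")
con.execute("LOAD sqlite;")
con.execute("ATTACH 'transactions.db' AS src (TYPE sqlite);")

# Copy the table into native DuckDB storage in a single statement.
con.execute("CREATE TABLE tx AS SELECT * FROM src.tx;")
```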
I am using a Python Arrow UDF to process results from a query against the sqlite3 source. I'm seeing many small vectors of 1 or 2 records going through (as opposed to the 2048 you would expect at full capacity). My UDF has an expensive per-call setup, so this kills performance.
I wonder whether this behavior also affects internal DuckDB operations.
I get larger vectors when accessing the SQLite db in order (with respect to the rowid), so maybe a copy wouldn't suffer from the performance penalty.
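For reference, a minimal sketch of the kind of Arrow UDF I mean (the function name, column type, and added constant are made up for illustration), which logs the size of each chunk it receives:

```python
import duckdb
import pyarrow as pa
import pyarrow.compute as pc

def expensive_udf(values: pa.lib.ChunkedArray) -> pa.lib.ChunkedArray:
    # The expensive setup cost is paid once per invocation,
    # so many tiny chunks are much slower than a few full ones.
    print(f"got a chunk of {len(values)} rows")  # often 1-2 instead of ~2048
    return pc.add(values, 1)

con = duckdb.connect()
con.create_function(
    "expensive_udf", expensive_udf,
    [duckdb.typing.BIGINT], duckdb.typing.BIGINT,
    type="arrow",  # vectorized UDF that receives/returns Arrow arrays
)

print(con.execute("SELECT expensive_udf(41)").fetchone())  # (42,)
```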