Open
Description
Hey there again,
your library does a great job importing data line by line, which keeps memory usage low.
But it seems like there is no way to process the imported lines in batches. Correct me if I overlooked this feature.
In my case the CSV file is huge, so adding all the data to a single array exceeds the memory limit of some devices.
It would be good to have an option to set a batch size.
Once the importer has read as many rows as the batch size defines, a callback should fire so I can process that batch (e.g. save it to a database) and free the memory before the importer continues.
There should also be an onFinish callback that doesn't pass the whole result array, because the full result might not fit into memory.
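To illustrate, here is a minimal sketch of what I have in mind, written in plain Kotlin against the standard library only. All names (`importCsvInBatches`, `onBatch`, `onFinish`, the batch-size parameter) are placeholders for the proposed option, not your library's actual API:

```kotlin
import java.io.File

// Hypothetical sketch of the proposed behaviour: read a CSV line by line,
// collect rows into a fixed-size batch, hand each full batch to a callback,
// then clear it so memory stays bounded.
fun importCsvInBatches(
    file: File,
    batchSize: Int,
    onBatch: (List<List<String>>) -> Unit,
    onFinish: (totalRows: Int) -> Unit   // summary only, not the whole result array
) {
    val batch = mutableListOf<List<String>>()
    var total = 0
    file.bufferedReader().useLines { lines ->
        for (line in lines) {
            batch.add(line.split(","))   // naive split; a real CSV parser handles quoting
            total++
            if (batch.size == batchSize) {
                onBatch(batch.toList())  // e.g. insert the rows into the database
                batch.clear()            // free memory before reading further lines
            }
        }
    }
    if (batch.isNotEmpty()) onBatch(batch.toList())  // flush the last partial batch
    onFinish(total)
}

fun main() {
    importCsvInBatches(File("data.csv"), batchSize = 500,
        onBatch = { rows -> println("persisting ${rows.size} rows") },
        onFinish = { total -> println("imported $total rows in total") })
}
```

The important part is that only one batch is ever held in memory at a time, and onFinish receives a summary (here just the row count) instead of the accumulated data.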