Add option for processing data in batches #50

Open
@fl034

Description

Hey there again,
your library does a great job importing data line by line to keep memory usage low.
But there seems to be no way to process the imported lines in batches. Correct me if I overlooked this feature.

In my case, my CSV file is huge, so collecting all the data into a single array exceeds the memory limit on some devices.

It would be good to have an option to set a batch size.
Once the importer has imported that many elements, a callback would fire with the batch of processed elements so I can handle them (e.g. save them to a database) and free the memory, allowing the importer to continue.

There would also need to be an onFinish callback that doesn't pass the whole array, because the whole array might not fit into memory.
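To illustrate the requested behaviour, here is a minimal sketch of the batching logic, independent of this library's actual API. All names (`BatchCollector`, `append`, `flush`, `onBatch`) are hypothetical: a collector buffers imported records, fires a callback whenever `batchSize` records have accumulated, and a final `flush` (called from the finish callback) emits the remaining partial batch, so peak memory stays bounded by the batch size rather than the file size.

```swift
// Hypothetical sketch, not the library's API: buffers records and
// delivers them in fixed-size batches to keep memory usage bounded.
final class BatchCollector<Record> {
    private var buffer: [Record] = []
    private let batchSize: Int
    private let onBatch: ([Record]) -> Void

    init(batchSize: Int, onBatch: @escaping ([Record]) -> Void) {
        self.batchSize = batchSize
        self.onBatch = onBatch
    }

    // Called once per imported line by the importer's per-line callback.
    func append(_ record: Record) {
        buffer.append(record)
        if buffer.count >= batchSize {
            flush()
        }
    }

    // Called from the importer's finish callback: emits any remaining
    // partial batch, so no records are lost at the end of the file.
    func flush() {
        guard !buffer.isEmpty else { return }
        onBatch(buffer)
        buffer.removeAll(keepingCapacity: true)
    }
}
```

With this shape, the importer's per-line callback would forward each record to `append(_:)`, and its finish callback would call `flush()` and then continue without ever holding the full result array.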
