Fetch multiple rows #977
Comments
@rodrigolive
The application does not need to fetch 1000 rows. It needs 20M rows in batches of 1000, to avoid OOM errors and to reduce the network round trips from 20M down to 20K. If one fetches 20M rows (or 100K, for that matter) with `stmt.fetch()` one row at a time, the per-row round trips dominate.

To request more rows in a batch, one needs to set the `SQL_ATTR_ROW_ARRAY_SIZE` statement attribute:

```js
const sql = "SELECT rownum FROM myhugetable";
const stmt = await db.prepare(sql);
await stmt.setAttr(ibmdb.SQL_ATTR_ROW_ARRAY_SIZE, 1000);
const cursor = await stmt.execute();

let row;
row = await cursor.fetch();
console.log(row.ROWNUM); // rownum == 1
row = await cursor.fetch();
console.log(row.ROWNUM); // rownum == 1000
```

If you try the above code, the remaining 999 rows of each batch are swallowed by the driver. Only the first is returned.
Using the ODBC (node-odbc) driver I'm able to fetch more than one row at a time with `stmt.fetch()`, using `fetchSize`, against a DB2 iAccess Connect driver. How can I accomplish the same with node-ibm_db directly, without going through the ODBC driver? This is the ODBC driver code I'm looking to reproduce directly with the ibm_db driver:

The motivation is that `fetchAll()` is unusable for large tables, and `stmt.fetch()` for single rows is excruciatingly slow on large tables (i.e. 20M rows). That makes the driver unusable for real-world large-table processing. The equivalent feature is available in the DB2 JDBC driver, OTOH.