
Slow for more than a certain number of records #21

Open
kjfdebruin opened this issue Dec 4, 2017 · 1 comment

kjfdebruin commented Dec 4, 2017

Thanks for a great library - does exactly what I need!

I have an interesting issue: when I call insertIgnore, insertReplace or insertOnDuplicateKey with more than roughly 7,200 records of 9 columns each, the query suddenly starts timing out.

Anything below that number of rows or columns completes very quickly (under 5 seconds). Above it, the query times out, and raising the PHP max execution time to 10 minutes made no difference (it still timed out), so I figured something else must be wrong.

I solved it in this way:

$videoLogArrayChunks = array_chunk($videoLogArray, 3000);
foreach ($videoLogArrayChunks as $chunk) {
    VideoLog::insertIgnore($chunk);
}

Not sure if this is an issue with the library or something I did incorrectly.


n0099 commented Feb 24, 2019

7200 × 9 = 64,800, which is close to PDO's maximum of 65,535 bound parameters per prepared statement. Please use array_chunk to split the rows into smaller inserts so you don't hit the PDO limit; see also #18.
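
A minimal sketch of how the chunk size could be derived from the column count so the total number of bound placeholders stays below the 65,535 limit. The helper name insertIgnoreChunked and the assumption that every row has the same columns are mine, not part of the library:

<?php

// Hypothetical helper: picks a chunk size so that rows-per-chunk × columns-per-row
// stays under PDO's 65,535 bound-parameter limit, then inserts each chunk.
function insertIgnoreChunked(string $model, array $rows, int $maxPlaceholders = 65000): void
{
    if (empty($rows)) {
        return;
    }

    // Assumes every row has the same number of columns as the first row.
    $columnsPerRow = count(reset($rows));
    $chunkSize = max(1, intdiv($maxPlaceholders, $columnsPerRow));

    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $model::insertIgnore($chunk);
    }
}

// Usage (assuming VideoLog uses the library's insertIgnore method):
// insertIgnoreChunked(VideoLog::class, $videoLogArray);

For 9 columns this yields chunks of about 7,200 rows, which matches the threshold observed above; a smaller fixed chunk size like the 3,000 used in the workaround works just as well.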
