High "Batch items size" breaks CiviCRM Queue Runner #20
Comments
Looks like class
@sluc23 You are right. The issue is that the `civicrm_queue_item` table stores the data in a column of type `text`, which has a limit of 65,535 bytes.
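For illustration, the `text` limit described above can be checked against a payload size. This is a hedged Python sketch, not CiviCRM code; the constants come from the MySQL reference manual, and JSON merely stands in for whatever serialization the queue uses:

```python
import json

# MySQL string-column capacities in bytes (per the MySQL reference manual):
TEXT_MAX = 2**16 - 1        # TEXT:       65,535
MEDIUMTEXT_MAX = 2**24 - 1  # MEDIUMTEXT: 16,777,215

def fits_in_text(payload: str) -> bool:
    """True if the serialized payload fits in a MySQL TEXT column."""
    return len(payload.encode("utf-8")) <= TEXT_MAX

# A large import batch serialized in one queue item easily exceeds the limit:
batch = [{"first_name": "Ana", "last_name": "Example"}] * 2000
payload = json.dumps(batch)
print(len(payload), fits_in_text(payload))
```

A schema-side fix along these lines would be widening the column (e.g. to `MEDIUMTEXT`); the application-side fix is keeping the batch size small enough that each serialized item stays under the limit.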
Interesting - I'd expect it affects built-in imports too? Or do they not use the queue runner like that?
Built-in imports don't use the queue runner at all (which would be very nice to have), which leads to classic PHP timeout errors when importing relatively big files.
OK, this is weird and took me some time to find. Very likely it's related to a CiviCRM Queue / PHP limitation rather than to the CSV_Import extension.
When setting a relatively high number of items to be processed in each queue task run, the queue gets stuck and an error is returned in the AJAX call.
After some research I found that the `data` field in the `civicrm_queue_item` table gets truncated: it doesn't store more than ~65,500 characters, so the serialized data object is broken.
I could replicate that behavior in an internal extension of ours that also uses the CiviCRM Queue, and got the same result: when the `data` field is big, it doesn't store more than approximately 65,520 characters.
`data` is of type `text` in MySQL, so I assumed there were no limitations there.
I'll keep investigating; I just wanted to point this out in case anybody has faced the same issue.
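The breakage described above can be reproduced outside MySQL entirely. A minimal Python sketch, with JSON standing in for the PHP serialization CiviCRM uses, and the cut-off constant being the documented TEXT column capacity:

```python
import json

TEXT_MAX = 65535  # MySQL TEXT column capacity in bytes

# A queue item whose serialized payload exceeds the column capacity:
item = {"rows": ["x" * 100 for _ in range(1000)]}
payload = json.dumps(item)
print(len(payload))  # well above 65,535

# In non-strict SQL mode MySQL silently keeps only the first 65,535 bytes,
# so what comes back out of the table is a truncated, unparseable blob:
stored = payload[:TEXT_MAX]
try:
    json.loads(stored)
except json.JSONDecodeError:
    print("truncated payload no longer deserializes")
```

This matches the observed symptom: the queue runner reads back a mid-stream-cut serialized object, fails to deserialize it, and the run gets stuck.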