
fail on large tables #194

Open
mehdi-zarrin opened this issue Jan 31, 2015 · 3 comments
Comments

@mehdi-zarrin

Hi,

I have the following code in my controller:

$users = User::select(['id' , 'username', 'first_name', 'last_name', 'email']);
return Datatables::of($users)->make(true);

It works fine for 5 records :) but when I insert 100,000 records with Faker I get PHP Fatal error: Allowed memory size of 134217728 bytes exhausted.

Thanks.

@gokigoks

PHP has a memory_limit setting that can be changed in php.ini, but you can also change it from within a running PHP script:

ini_set('memory_limit', '256M'); // must be higher than the current limit (128M here) to help

Increase it at your own risk.
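If you're not sure what the active limit is, you can check it first (plain PHP, nothing package-specific assumed):

var_dump(ini_get('memory_limit')); // prints the limit in effect for this request, e.g. "128M"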

@Remo

Remo commented Apr 21, 2015

That's the wrong approach. I'm quite sure @mehdi-zarrin meant that this package should never consume that much memory unless you display all records at once. With server-side pagination you shouldn't have to increase your memory limit like that; it should fetch the rows with a limit and only return those that are actually displayed.
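For illustration, a minimal sketch of that approach, assuming a plain Laravel controller and the standard DataTables server-side protocol (start/length request parameters, recordsTotal/recordsFiltered/data response keys); the App\User namespace and route wiring are assumptions, and this bypasses the package to show the principle rather than its internals:

use App\User;
use Illuminate\Http\Request;

public function index(Request $request)
{
    $start  = (int) $request->input('start', 0);   // row offset sent by DataTables
    $length = (int) $request->input('length', 10); // page size sent by DataTables

    $query = User::select(['id', 'username', 'first_name', 'last_name', 'email']);

    $total = $query->count();       // COUNT(*) runs in the database
    $rows  = $query->skip($start)   // OFFSET
                   ->take($length)  // LIMIT
                   ->get();         // hydrates one page, never the whole table

    return response()->json([
        'recordsTotal'    => $total,
        'recordsFiltered' => $total,
        'data'            => $rows,
    ]);
}

Because the limit is applied in SQL, memory use stays flat no matter how many rows the users table holds.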

@gokigoks

@Remo you're right. I guess your queries should be specific? I doubt anyone needs all 10k rows at once.
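For example, a sketch of narrowing the result set before handing the builder to the package (the date constraint here is purely illustrative):

use Carbon\Carbon;

$users = User::select(['id', 'username', 'first_name', 'last_name', 'email'])
    ->where('created_at', '>=', Carbon::now()->subDays(30)); // only recent users

return Datatables::of($users)->make(true);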
