Unfortunately, pagination won't work for me. I need to process all the data, not display it. Normally I'd just fetch the data row by row, keeping only one row in memory at a time. Surprisingly, Laravel doesn't seem to allow that. Do I need to use plain PDO?
No. I need to get all records that meet complex criteria, complex enough that I can't do it with an SQL query alone.
More than one table or just a single table? What are you doing with the records, displaying them?
What are these questions for? As I said, I need to fetch the records one by one to avoid keeping a large array in memory. It doesn't matter if it's one table or not, and it doesn't matter if I'm displaying or not.
kokokurak said:
What are these questions for? As I said, I need to fetch the records one by one to avoid keeping a large array in memory. It doesn't matter if it's one table or not, and it doesn't matter if I'm displaying or not.
I would say it's to better understand what you're trying to do, and maybe be able to offer some ideas.
I have some questions too:
Are you fetching all the data from the table and then doing your parsing?
I would hope you don't have to pull 2.5M rows each time... Depending on what you're doing with the data, you might have to figure out some kind of preload cache, or some kind of staged processing, but that all depends on what your "processing" is and what the data consists of: int, str, blob, etc.
Do you really need to use Eloquent? If not, you could just pull the PDO handle and run it old style, with a while/fetch loop...
I'm guessing it's: run query, loop results, do something with each row... then it times out? Or is it just really slow? Is it the DB query or the processing that's making it slow? Will pulling the data differently really fix the slowness?
There isn't enough info provided to put together test code of some kind.
I don't mean to insult, just tossing out all the things I can think of that might be related to the issue.
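For reference, the "old style" PDO while/fetch loop mentioned above could look something like this. This is a minimal sketch using an in-memory SQLite table as a stand-in for the real connection; the `dogs` table and its columns are made up for illustration. In Laravel you'd grab the handle with `DB::connection()->getPdo()` instead of creating one.

```php
<?php
// Stand-in connection; in a Laravel app you'd use DB::connection()->getPdo().
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE dogs (name TEXT, age INTEGER)');
$pdo->exec("INSERT INTO dogs VALUES ('Rex', 10), ('Fido', 10), ('Spot', 3)");

$stmt = $pdo->prepare('SELECT name FROM dogs WHERE age = ?');
$stmt->execute(array(10));

// Only one row is held in memory at a time.
$names = array();
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $names[] = $row['name']; // process the row here instead of collecting it
}
```

The point is that `fetch()` pulls a single row per call, so memory stays flat no matter how many rows the query matches.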
TerrePorter,
no - I'm parsing on the fly. I need to filter out some results. It's fine to do it on every run, because it's happening on the admin side.
There's no doubt that Laravel's lack of real row-by-row fetching is a performance nightmare. I tried to use Eloquent and waited a few minutes - nothing. Same with the chunk() method, even though I tried it with several different chunk sizes. This is just a wrong, memory-killing way of obtaining database results in many cases.
I prepared an example illustrating what I did now:
$sql = Dog::select('name')->whereAge(0)->toSql(); // 0 is just a placeholder; toSql() turns it into "?" anyway
$db = DB::connection()->getPdo();
$query = $db->prepare($sql);
$query->execute(array(10)); // here's our puppies' real age
while ($name = $query->fetchColumn()) {
    // process data
}
I used Eloquent to build the SQL statement for me, then passed it to pure PDO as a prepared statement.
Result?
1.5 seconds.
Now, THIS IS how big sets of data should be processed.
I ran into the same problem multiple times.
Thanks for your solution.
Building my param array isn't really beautiful, but it works.
$requests->whereIn('requests.id', $preselectedRequestIds);
foreach ($preselectedRequestIds as $index => $value) {
    $sumSqlParams[] = $value;
}
If you execute get(), everything is collected at once; there's no Eloquent way around it.
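For what it's worth, instead of copying the parameter array by hand, the query builder's getBindings() method should hand back the bindings the builder collected, in order, ready to pass straight to the PDO statement. A sketch, reusing the hypothetical Dog model from the earlier post; this assumes a Laravel app context:

```php
$query = Dog::select('name')->where('age', 10);

$pdo  = DB::connection()->getPdo();
$stmt = $pdo->prepare($query->toSql());

// getBindings() returns the values the builder would have bound,
// in the same order as the "?" placeholders in toSql().
$stmt->execute($query->getBindings());

while ($name = $stmt->fetchColumn()) {
    // process data, one row at a time
}
```

This way a whereIn() with hundreds of IDs doesn't need a hand-built parameter array at all.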
Laravel should provide an iterator over query results; then PHP could handle all of it. Laravel should provide a way to fetch rows one by one, like mysqli_fetch_row().
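Depending on the Laravel version, there may already be something close to this: the cursor() method on the query builder, which yields one model at a time via a generator instead of loading the whole result set. A sketch, again with the hypothetical Dog model; check whether your version ships it:

```php
// cursor() runs a single query but hydrates one model per iteration,
// so memory use stays roughly constant regardless of result size.
foreach (Dog::where('age', 10)->cursor() as $dog) {
    // process $dog->name here
}
```

Unlike chunk(), which issues repeated queries, cursor() keeps one result set open and streams it.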