This worked!
DB::connection()->disableQueryLog();
But what do you think about my code for importing CSV files that way?
Not on topic, but if you are the woman in the picture, you are really beautiful.
eriktisme said:
Not on topic, but if you are the woman in the picture, you are really beautiful.
Smooth, man. Really smooth. Help her or him (it's irrelevant, this is a coding forum, not POF) with their issue and win her or his love!
merkabaphi said:
But what do you think about my code for importing CSV files that way?
I prefer the query builder or raw queries over Eloquent when working with large amounts of data.
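For example, a rough query builder version of your insert-or-update step could look something like this (the column names are my guesses based on the thread, not your actual schema):

// Sketch using the query builder instead of Eloquent models;
// 'sku', 'price' and 'old_price' are assumed column names.
foreach ($products as $product)
{
    $existing = DB::table('products')->where('sku', $product['sku'])->first();

    if ($existing)
    {
        DB::table('products')
            ->where('id', $existing->id)
            ->update([
                'price'     => $product['productPrice'],
                'old_price' => $product['productOldPrice'],
            ]);
    }
    else
    {
        DB::table('products')->insert([
            'sku'       => $product['sku'],
            'price'     => $product['productPrice'],
            'old_price' => $product['productOldPrice'],
        ]);
    }
}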
If it's a one-off thing, you can use Sequel Pro / phpMyAdmin (or another interface, depending on which database you are using) to import it into the database directly.
But if not, I would go with what mgsmus said and just query the database directly.
Also, I think you are reassigning the $product variable on each loop iteration (array -> Eloquent model), and doing an unnecessary Product::find in the else branch, because the model should already have been found if the if condition failed.
eriktisme said:
Not on topic, but if you are the woman in the picture, you are really beautiful.
Keep it professional, please.
Why don't you run the import as a queued job? And if the import only runs locally, you can write your own Artisan command and trigger it in the console instead of hitting the server.
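As a rough sketch (the command and job names below are made up, and the exact syntax depends on your Laravel version; older versions use Queue::push() and a different command signature style):

// Hypothetical Artisan command; the class and job names are placeholders.
namespace App\Console\Commands;

use Illuminate\Console\Command;
use App\Jobs\ImportProducts; // assumed queued job wrapping your import loop

class ImportProductsCommand extends Command
{
    protected $signature = 'products:import {path : Path to the CSV file}';
    protected $description = 'Import products from a CSV file';

    public function handle()
    {
        // Push the heavy work onto the queue so a worker handles it outside
        // the web request (or just run the import code right here if you
        // only ever trigger it from the console).
        ImportProducts::dispatch($this->argument('path'));

        $this->info('Import queued.');
    }
}

Then run it with php artisan products:import /path/to/products.csv and let a queue worker (php artisan queue:work) do the actual import.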
Two things that will help:
1. Save memory by loading the CSV file one row at a time instead of loading the whole thing into one big array in memory. goodby/csv would work; personally I like ddeboer/data-import.
2. Speed up the database by using prepared statements. Laravel will not help you here, but it's easy enough to just use PDO for something this simple (unless your Product model is doing something complicated).
// Prepare the three statements once, outside the loop
$select_stmt = DB::getPdo()->prepare('SELECT id FROM products WHERE sku = ?');
$insert_stmt = DB::getPdo()->prepare('INSERT INTO products(sku, price, old_price) VALUES(?, ?, ?)');
$update_stmt = DB::getPdo()->prepare('UPDATE products SET price = ?, old_price = ? WHERE id = ?');

foreach ($products as $product)
{
    // Look up the existing product by SKU
    $select_stmt->execute([ $product['sku'] ]);
    $id = $select_stmt->fetchColumn();

    if ($id === false)
    {
        // Not found: insert a new row (include the SKU so it can be found next time)
        $insert_stmt->execute([ $product['sku'], $product['productPrice'], $product['productOldPrice'] ]);
    }
    else
    {
        // Found: update the existing row
        $update_stmt->execute([ $product['productPrice'], $product['productOldPrice'], $id ]);
    }
}
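And if you combine that with reading the file one row at a time (point 1 above), you never hold the whole CSV in memory. A rough sketch with plain fgetcsv(), assuming the file name and a sku / price / old-price column order (adjust to your actual columns):

$select_stmt = DB::getPdo()->prepare('SELECT id FROM products WHERE sku = ?');
$insert_stmt = DB::getPdo()->prepare('INSERT INTO products(sku, price, old_price) VALUES(?, ?, ?)');
$update_stmt = DB::getPdo()->prepare('UPDATE products SET price = ?, old_price = ? WHERE id = ?');

$handle = fopen('products.csv', 'r');
fgetcsv($handle); // skip the header row, if your file has one

while (($row = fgetcsv($handle)) !== false)
{
    // Assumed column order: sku, price, old price
    list($sku, $price, $oldPrice) = $row;

    $select_stmt->execute([ $sku ]);
    $id = $select_stmt->fetchColumn();

    if ($id === false)
    {
        $insert_stmt->execute([ $sku, $price, $oldPrice ]);
    }
    else
    {
        $update_stmt->execute([ $price, $oldPrice, $id ]);
    }
}

fclose($handle);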
About 100 times faster is to use MySQL's "LOAD DATA INFILE" :)
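Roughly, from Laravel, it could look like the sketch below. Table and column names are assumptions, local_infile has to be enabled on both the MySQL server and the PDO connection (PDO::MYSQL_ATTR_LOCAL_INFILE), and plain LOAD DATA only inserts; the REPLACE keyword plus a unique index on sku (or a staging table) is needed if you also want to update existing rows.

// Sketch only: requires local_infile enabled on server and client.
$pdo  = DB::connection()->getPdo();
$path = $pdo->quote(storage_path('products.csv'));

$pdo->exec("
    LOAD DATA LOCAL INFILE {$path}
    REPLACE INTO TABLE products
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (sku, price, old_price)
");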
merkabaphi said: It takes too long and the memory of the server gets full. Does anyone know how to solve this? Thanks!
I believe this could be a culprit contributing to your out-of-memory issue:
$products = csvToArray('products.csv');
It appears this method parses the whole CSV file into an array. That means reading and loading the entire file, which, if it is a large file, could consume too much memory and cause your out-of-memory issue. It would be better to process each row of the file as it is being read. See the PHP docs for an idea of how you might do that: http://php.net/fgetcsv
Good luck!
John Madson
Looks like https://github.com/goodby/csv will do this for you, which was already recommended by somebody else. I second that recommendation, as it will execute a callback on each line of the CSV file, preventing the issue I described above.
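From memory of its README (so double-check against the library's own docs), the usage looks roughly like this; the file name and column order are assumptions:

use Goodby\CSV\Import\Standard\Lexer;
use Goodby\CSV\Import\Standard\LexerConfig;
use Goodby\CSV\Import\Standard\Interpreter;

$interpreter = new Interpreter();

// The observer is invoked once per row, so only one row is in memory at a time.
$interpreter->addObserver(function (array $row) {
    // $row[0] = sku, $row[1] = price, $row[2] = old price (assumed order)
    // ...run your insert/update logic here...
});

$lexer = new Lexer(new LexerConfig());
$lexer->parse('products.csv', $interpreter);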
Another important thing is to use database transactions if the MySQL engine is InnoDB. This will speed up the process a lot!
DB::transaction(function () use ($products) {
    foreach ($products as $key => $product)
    {
        // save here.
    }
});