A 60-second timeout is an expected error when exporting or processing large amounts of data loaded with all() or get(). To avoid it, Laravel has a chunk() method that processes large datasets in smaller batches.
Consider the following example:
Flight::chunk(1000, function ($flights) {
    foreach ($flights as $flight) {
        // process each record here
    }
});
Official doc reference: https://laravel.com/docs/11.x/eloquent#chunking-results
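One caveat from that same docs page: if you update records inside the chunk callback, plain chunk() can skip rows, so the docs recommend chunkById() instead. A quick sketch (the departed column is just illustrative):

Flight::where('departed', false)->chunkById(1000, function ($flights) {
    foreach ($flights as $flight) {
        // Updating the rows you are chunking over is safe with chunkById
        $flight->update(['departed' => true]);
    }
});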
From the command line, you can export the whole database in one go with mysqldump -u your_username -p your_database_name > backup.sql, and import it the same way with mysql -u your_username -p your_database_name < backup.sql.
Handling 8 lakh (800,000) products efficiently in Laravel requires optimizing both the import/export logic and the server configuration. Exporting such a large dataset in one go can lead to timeouts or memory exhaustion, so exporting in chunks is the best solution.
Here's how you can handle this:
Exporting in chunks avoids loading all 8 lakh records into memory at once. You can use Laravel's chunking feature, a queue, or both.
a. Use Laravel's chunk() method
The chunk() method processes the data in batches, avoiding memory overload.
public function exportProducts()
{
    $filePath = storage_path('exports/products.csv');

    // Make sure the export directory exists before opening the file
    if (! is_dir(dirname($filePath))) {
        mkdir(dirname($filePath), 0755, true);
    }

    $handle = fopen($filePath, 'w');

    // Add CSV header
    fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

    // Process products in chunks
    Product::chunk(1000, function ($products) use ($handle) {
        foreach ($products as $product) {
            fputcsv($handle, [
                $product->id,
                $product->name,
                $product->price,
                $product->stock,
                $product->description,
            ]);
        }
    });

    fclose($handle);

    return response()->download($filePath);
}
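If you'd rather not write a temporary file to disk at all, Laravel's streamDownload() response can stream the CSV straight to the browser. A sketch of the same export, assuming the same Product columns:

public function exportProducts()
{
    return response()->streamDownload(function () {
        // Write straight to the output stream instead of a file
        $handle = fopen('php://output', 'w');

        fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

        Product::chunk(1000, function ($products) use ($handle) {
            foreach ($products as $product) {
                fputcsv($handle, [
                    $product->id,
                    $product->name,
                    $product->price,
                    $product->stock,
                    $product->description,
                ]);
            }
        });

        fclose($handle);
    }, 'products.csv');
}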
b. Use a Queue for Background Processing
For large exports, use Laravel queues to run the export process in the background.
Create an export job:
php artisan make:job ExportProductsJob
Handle the export logic in the job:
public function handle()
{
    $filePath = storage_path('exports/products.csv');
    $handle = fopen($filePath, 'w');

    fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

    Product::chunk(1000, function ($products) use ($handle) {
        foreach ($products as $product) {
            fputcsv($handle, [
                $product->id,
                $product->name,
                $product->price,
                $product->stock,
                $product->description,
            ]);
        }
    });

    fclose($handle);

    // Notify the user or send an email with the download link
}
Dispatch the job:
ExportProductsJob::dispatch();
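For reference, the handle() method above lives inside the generated job class; the skeleton make:job scaffolds in Laravel 11 looks roughly like this (the job must implement ShouldQueue to run on the queue):

namespace App\Jobs;

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Queue\Queueable;

class ExportProductsJob implements ShouldQueue
{
    use Queueable;

    public function handle(): void
    {
        // Export logic from above goes here
    }
}

Remember to keep a queue worker running (php artisan queue:work), otherwise the dispatched job never executes.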
c. Use the Laravel Excel Package
Laravel Excel is a robust package for importing and exporting large datasets efficiently. Install it via Composer:
composer require maatwebsite/excel
Create an export class:
php artisan make:export ProductsExport --model=Product
Define the export logic:
namespace App\Exports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;

class ProductsExport implements FromQuery
{
    use Exportable;

    public function query()
    {
        return Product::query();
    }
}
Use the export class:
public function exportProducts()
{
    return (new ProductsExport)->download('products.xlsx');
}
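For a dataset this size, you can also let Laravel Excel queue the export itself: the Exportable trait provides a queue() method that processes the query in chunked, queued jobs. A sketch:

public function exportProducts()
{
    // Queues the export; the file is written to the default disk
    (new ProductsExport)->queue('exports/products.xlsx');

    return response()->json(['message' => 'Export queued']);
}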
If you're facing timeouts even after chunking, you might need to adjust server configurations:
Increase Execution Time and Memory Limit: update your php.ini file:
max_execution_time = 300
memory_limit = 512M
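If you can't edit php.ini (shared hosting, for example), the same limits can often be raised at runtime at the top of the export code, though some hosts disable this:

// Raise limits for the current request only; hosts may block these calls
set_time_limit(300);
ini_set('memory_limit', '512M');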
Use CLI Instead of HTTP Requests
Run exports via Artisan commands to avoid web server timeouts:
php artisan export:products
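Note that export:products is not a built-in command; you would create it yourself with php artisan make:command. A sketch of what it might look like, reusing the ExportProductsJob from earlier:

namespace App\Console\Commands;

use App\Jobs\ExportProductsJob;
use Illuminate\Console\Command;

class ExportProducts extends Command
{
    // This signature is what makes `php artisan export:products` work
    protected $signature = 'export:products';

    protected $description = 'Export all products to a CSV file';

    public function handle(): void
    {
        // Reuse the same job so CLI and web share one code path
        ExportProductsJob::dispatch();

        $this->info('Product export dispatched.');
    }
}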
For importing, process the file in chunks to avoid memory overload:
Create an import class:
php artisan make:import ProductsImport --model=Product
Define the import logic:
namespace App\Imports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductsImport implements ToModel, WithChunkReading
{
    public function model(array $row)
    {
        // Assumes the file has no header row; if it does, implement
        // WithStartRow to skip it, or WithHeadingRow to key rows by column name
        return new Product([
            'name' => $row[0],
            'price' => $row[1],
            'stock' => $row[2],
            'description' => $row[3],
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
Use the import class:
public function importProducts(Request $request)
{
    Excel::import(new ProductsImport, $request->file('file'));

    return response()->json(['message' => 'Import successful']);
}
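To cut down on insert queries during import, Laravel Excel also offers the WithBatchInserts concern, which groups the model inserts. A sketch extending the class above:

namespace App\Imports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductsImport implements ToModel, WithChunkReading, WithBatchInserts
{
    public function model(array $row)
    {
        return new Product([
            'name' => $row[0],
            'price' => $row[1],
            'stock' => $row[2],
            'description' => $row[3],
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }

    // Insert 1000 models per query instead of one query per row
    public function batchSize(): int
    {
        return 1000;
    }
}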
Let me know if you'd like further assistance!