
My 2 cents,

Option 1 - Split your inserts into batches of, say, 10,000 rows at a time.
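
A minimal sketch of option 1 with Laravel's query builder, assuming the rows are already collected in a PHP array $rows; the products table name and the batch size are just placeholders:

<?php

// Option 1 sketch: one multi-row INSERT per batch instead of one query per row.
// Very large batches can exceed MySQL's max_allowed_packet, so tune the size.
$batchSize = 10000;

foreach (array_chunk($rows, $batchSize) as $batch) {
    // $batch is an array of column => value arrays
    DB::table('products')->insert($batch);
}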

Option 2 - I sometimes loop through the data, echo out the insert statements (fewer than 1,000 rows or so), and run them directly in MySQL. I don't think this is a good idea for 75k rows though...

insert into table (...) values (...); 
insert into table (...) values (...); 
insert into table (...) values (...); 

Option 3 - If you are importing from a .sql file, you should really try the following; it is much faster than going through PHP:

mysql -u username -p -h localhost DATA-BASE-NAME < data.sql

Option 4 - It looks like you have an XML document. I have never tried this myself, but you could take a look at http://dev.mysql.com/doc/refman/5.0/en/string-functions.html#function_load-file
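
For what it's worth, MySQL 5.5+ also has a LOAD XML statement (related to the LOAD_FILE page linked above) that can read an XML file straight into a table. A rough sketch of calling it from Laravel; the file path, table name and row tag are made up, and the MySQL user needs the FILE privilege (or local_infile enabled for the LOCAL variant):

<?php

// Rough sketch of loading an XML file straight into MySQL with LOAD XML.
// Requires MySQL 5.5+; the path, table and row tag below are placeholders.
DB::unprepared("
    LOAD XML INFILE '/path/to/data.xml'
    INTO TABLE products
    ROWS IDENTIFIED BY '<product>'
");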

But ultimately, for importing big data, PHP is not a very good choice. :-(


Thank you awsp. I will try to increase the number of values inserted at once. My last attempt was with 25 values per insert, because I read an article claiming that number performs well. Option 3 is not a choice; you are correct, I'm importing data from XML. Option 4 also doesn't seem very helpful.

Any other ideas?


I have now tested it with 1,000 rows per insert, but I get the same degrading performance. When I stop the import and start the script again, the first few thousand inserts are fast and then slow down again, no matter how many products are already in the table before the script starts. So indexes should not be the problem.

So what could I do? One idea is to stop after 10,000 inserts and then run the script again. But I want to set this up as a cronjob, and I only have shared webspace without SSH access where the shell_exec function is disabled, so I can't use that to start and stop scripts in a loop.
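
A rough sketch of that stop-and-resume idea without shell_exec: keep the current offset in a small state table and let each cron run process one slice. The import_state table, its offset column and the readProductsFromXml() helper are all hypothetical:

<?php

// Sketch of a resumable import: each cron run handles one slice and records
// its progress, so no shell_exec or SSH is needed. Names below are made up.
$limit = 10000;

$state  = DB::table('import_state')->first();
$offset = $state ? (int) $state->offset : 0;

// Hypothetical helper that parses the next $limit products from the XML feed.
$rows = readProductsFromXml('/path/to/products.xml', $offset, $limit);

if (count($rows) > 0) {
    DB::table('products')->insert($rows);
    DB::table('import_state')->update(array('offset' => $offset + count($rows)));
} else {
    // Finished; reset so the next cron run starts a fresh import.
    DB::table('import_state')->update(array('offset' => 0));
}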

Any other idea?


It looks like you are doing the inserts directly on your web host. (?)

Maybe you should first import the XML file on your local machine, export it to a .sql file, and then use option 3 I mentioned above to insert it into your web host's database.
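
A rough sketch of that local conversion step, streaming the XML with XMLReader and writing plain INSERT statements to a .sql file; the <product> element and the column names are invented, and real code should escape values properly (e.g. via PDO::quote):

<?php

// Rough sketch: stream a big XML file and write one INSERT per row into a .sql
// file that can later be loaded with "mysql ... < products.sql".
$reader = new XMLReader();
$reader->open('products.xml');

$out = fopen('products.sql', 'w');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'product') {
        $node  = new SimpleXMLElement($reader->readOuterXml());
        $name  = addslashes((string) $node->name);   // use proper escaping in real code
        $price = (float) $node->price;

        fwrite($out, "INSERT INTO products (name, price) VALUES ('$name', $price);\n");
    }
}

fclose($out);
$reader->close();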


Like I wrote, I want a cronjob that runs every day to import the products from a web service. So it seems I will need my own webserver with SSH, because it doesn't end at 75,000 products; it could end up being over 1,000,000 products.


I just worked on an importer which had to import data from 8 MSSQL tables with 180k-450k rows each. At the moment we can do the full import in about 20 minutes.

What we have learned:

  • Do not edit data on the fly.
  • Select the data in blocks of 500 rows (see the sketch at the end of this post).
  • If you want to edit the data, do it directly after the import.
  • We needed 4 GB of memory just for the importer (that is a lot!).
  • Test for the fastest solution; it is all about milliseconds, but you save minutes in the end.

We used Zend but it shouldn't make a difference.
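
Not the code we used (that was Zend), but a Laravel-flavoured sketch of reading the source data in blocks of 500; the source_products table and its auto-increment id column are just placeholders:

<?php

// Sketch of selecting the source data in blocks of 500 rows instead of all at once.
DB::table('source_products')->orderBy('id')->chunk(500, function ($rows) {
    foreach ($rows as $row) {
        // transform / insert $row here; keep per-row work minimal
    }
});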


@Kompas: Thanks for your advice. :) Can you tell me what webserver (dedicated or virtual, how much RAM, number of cores, etc.) you are using for your application?


And don't forget to disable the query log (\DB::disableQueryLog();)
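
This can matter a lot here: Laravel 4 keeps every executed query in its query log in memory by default, which would fit the "fast at first, then slower and slower" behaviour described above. A minimal sketch of where to call it (table name and batch size are placeholders):

<?php

// Disable the query log before the import loop, otherwise every INSERT is kept
// in memory for the whole run and the script slows down over time.
DB::disableQueryLog();

foreach (array_chunk($rows, 1000) as $batch) {
    DB::table('products')->insert($batch);
}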


You should see if switching to the mysqli driver speeds things up. It is supposed to be faster, but I have never benchmarked it.

