Have you seen chunk in the docs? (Rough sketch below.)
Another way I've done it is to dump the data to a local computer, set up an ODBC source, and use Microsoft Access as a frontend to do various reporting and things of that nature. That may or may not be a solution for you. If you have the old Visual FoxPro, that will also work.
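The chunk sketch, for reference - the model name and batch size are just placeholders; the point is that it pulls the table in fixed-size batches instead of loading everything at once:

use App\Registration; // placeholder model

Registration::orderBy('id')->chunk(1000, function ($registrations) {
    foreach ($registrations as $registration) {
        // work on one batch of 1,000 records at a time,
        // so the full table never sits in memory
    }
});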


The whole thing is running on a Linux server, so FoxPro / Access / ODBC aren't options. It's running out of memory just pulling from the API to write to our local database. I've got chunk (or a variant thereof) set up on the "our data to Salesforce" part, but I haven't even got to the point where I can integrate that with the process because of this problem.


No, you dump the data to a Windows machine; if you have Microsoft Access you can manipulate the data on the local machine it's dumped to. It's not live data though. Meanwhile, did you look up chunk in the docs?
You definitely need to pull the data in increments - for example, with 1 million records, pull 1 through 100,000, then 100,001 through 200,000, and so on.
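Something along these lines, as a sketch - the table name, column and batch size are only examples, and this uses the query builder rather than Eloquent models:

use Illuminate\Support\Facades\DB;

$batchSize = 100000;
$offset = 0;

do {
    // pull one slice of the table: rows 1-100,000, then 100,001-200,000, ...
    $rows = DB::table('registrations')
        ->orderBy('id')
        ->skip($offset)
        ->take($batchSize)
        ->get();

    foreach ($rows as $row) {
        // process / export this slice
    }

    $offset += $batchSize;
} while (count($rows) > 0);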


The records I'm pulling from the API (during the part where I'm having the memory issue) are accessed one at a time. When I send them to Salesforce I am using chunking (or, as I said, a variation thereof - I can't use chunk because of the way the data is updated post-pull) on the SELECT queries against the local data.
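For anyone reading this later, one common way (in later Laravel versions) to chunk while updating the same rows is chunkById, which pages on the primary key rather than offsets, so rows flagged as you go don't get skipped - a rough sketch with made-up model and column names, not my actual code:

use App\LocalRecord; // placeholder model

LocalRecord::where('sent_to_salesforce', false)
    ->chunkById(500, function ($records) {
        foreach ($records as $record) {
            // push this batch to Salesforce, then flag each row so the
            // next page (keyed on id, not offset) doesn't re-read or skip it
            $record->update(['sent_to_salesforce' => true]);
        }
    });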


MikkyX said:

The records I'm pulling from the API (during the part where I'm having the memory issue) are accessed one at a time. When I send them to Salesforce I am using chunking (or, as I said, a variation thereof - I can't use chunk because of the way the data is updated post-pull) on the SELECT queries against the local data.

How many are sent to Salesforce at one time? If it's too many, that may be the problem - send fewer at a time to Salesforce if possible.


As I've already mentioned, I haven't integrated the Salesforce side of things yet. The problem is occurring with part 1 of the process - fetching the data from the other API (N200, if I need to be specific) into our local database. One record at a time (they don't do batching) via an HTTP request through Guzzle - and this is then inserted / updated as appropriate using Eloquent and a Model.
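To give a picture of the shape of it - the endpoint, response fields and model below are placeholders rather than the real N200 API, but the loop is essentially fetch one record over HTTP, parse it, then insert or update:

use GuzzleHttp\Client;
use App\Delegate; // placeholder model

$client = new Client(['base_uri' => 'https://api.example.com/']); // placeholder URL

foreach ($recordIds as $id) { // $recordIds: the list of ids to fetch
    // one HTTP request per record - the API has no batch endpoint
    $response = $client->get('records/' . $id);
    $xml = simplexml_load_string((string) $response->getBody());

    // insert a new row or update the existing one for this external id
    Delegate::updateOrCreate(
        ['external_id' => (string) $xml->id],
        ['name' => (string) $xml->name]
    );
}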


MikkyX said:

One record at a time (they don't do batching) via an HTTP request through Guzzle - and this is then inserted / updated as appropriate using Eloquent and a Model.

How can memory run low if you are retrieving one record at a time?
Can you download the data in a file? You should be able to.


jimgwhit said:

MikkyX said:

One record at a time (they don't do batching) via an HTTP request through Guzzle - and this is then inserted / updated as appropriate using Eloquent and a Model.

How can memory run low if you are retrieving one record at a time?
Can you download the data in a file? You should be able to.

That question is what I've been trying to figure out :) I'm wondering if it's a memory leak in one of the components I'm using - I know Guzzle had one but it should be fixed now. The data can be downloaded in a CSV or Excel file through the Flash-based interface, but I have to do this in real time through their XML API.


What is the issue with using their XML API? Also, I gather there is no way to gain access to phpMyAdmin on their site and export the data that way? I'm also thinking the admins at that site / service should be able to assist you with an efficient way of getting the data you require - if not, don't use them, period.


There is no issue - I have no issue - with using their XML API. I have had a CodeIgniter version of this project running for several clients without a problem, including some with larger datasets than this. On one of them, the initial fetch of all the data from the XML API took a day and a half running on an Amazon EC2 instance - hundreds of thousands of records - and it completed.

I've now moved to Laravel for my development, which meant porting this project, and now I have these memory issues. Something, somewhere, is being left behind in memory and growing until it's all gone.

Also, I'm fairly certain companies that provide access to data through an API don't generally make a habit of allowing their customers access to the underlying databases via phpMyAdmin! It's not up to me which company the client uses to handle their event registration.


Why don't you just continue to use CodeIgniter for this? Obviously from your post Laravel has some kind of memory issue. Have you considered filing a bug report with Laravel? Also, do a search on Stack Overflow for "laravel 5 memory leaks" - something may come up that helps. It definitely sounds like a Laravel problem. Or is there a way to do this part of it using regular PHP and PDO? Try that and port those results to Laravel. There has to be a solution here.


Laravel historically uses more memory than CodeIgniter.

I used to have a project that pulled large amounts of data from an MLS and inserted it into my local MySQL database.

CodeIgniter had no problem doing it, but the problems started when I made the switch to Laravel (version 3 at the time).

I actually ended up writing a shell script, using cURL to fetch the data at first, and vanilla raw SQL queries to do the data manipulation.

So maybe try using Laravel's raw queries to see if that works?
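For what it's worth, a raw insert-or-update with Laravel's DB facade looks something like this (MySQL-specific upsert syntax; table, columns and variables are placeholders):

use Illuminate\Support\Facades\DB;

// bypasses Eloquent model hydration entirely - just SQL and bound parameters
DB::insert(
    'insert into registrations (external_id, name) values (?, ?)
     on duplicate key update name = values(name)',
    [$externalId, $name]
);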


Like I said, if that doesn't work, write a simple one-page script using regular PHP and PDO, and that should at least get the data you need.
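Something as bare-bones as this, no framework at all (credentials, table and columns are obviously placeholders):

<?php

// plain PHP + PDO: one connection, one prepared statement, reused in a loop
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$stmt = $pdo->prepare(
    'insert into registrations (external_id, name) values (:external_id, :name)'
);

foreach ($records as $record) { // $records: whatever the API fetch returned
    $stmt->execute([
        ':external_id' => $record['id'],
        ':name'        => $record['name'],
    ]);
}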


Sorry for the delay in replying to this. In answer to the question about just using my CI version - the client asked for Laravel and I was already porting it. So it's got to be Laravel.

I'll probably be looking at queuing. I tried taking all the non-essential database stuff out and just letting the API fetching run, and it still ramped up. The weird thing, which I don't think I've mentioned here, is that calls to memory_get_usage and memory_get_peak_usage while the script is running report a steady 10-11 MB of usage; it's only in external process monitors that you can see it running away - as if the leak is outside my scripting, but still tied to the process running it.
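In case it's useful to anyone, the two numbers can be logged side by side from inside the loop - memory_get_usage() only sees PHP's own allocator, while the VmRSS line in /proc/self/status (Linux-specific) is roughly what an external process monitor sees. The helper below is just an illustration:

// log PHP's internal usage next to the process RSS the OS reports
function logMemory()
{
    $internal = round(memory_get_usage(true) / 1048576, 1);

    $rss = 'n/a';
    $status = @file_get_contents('/proc/self/status');
    if ($status !== false && preg_match('/VmRSS:\s+(\d+) kB/', $status, $m)) {
        $rss = round($m[1] / 1024, 1);
    }

    error_log("PHP reports {$internal} MB, process RSS is {$rss} MB");
}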


Have you tried sleeping in between requests?


No. I could try it if you definitely think it will help though?


I use it on a long-running script to stop it hogging all the resources, although this may not be the issue if you are actually crashing the server.

The other thing is: are there any resources you can free up? I do an unset at the end of the loop once I've finished with any vars.
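In the fetch loop that would look something like this (fetchRecord / storeRecord are just stand-ins for whatever the real API call and Eloquent write are):

foreach ($recordIds as $id) {
    $payload = fetchRecord($id);   // stand-in for the real HTTP call
    storeRecord($payload);         // stand-in for the Eloquent insert / update

    unset($payload);               // free the response before the next pass
    sleep(1);                      // breathe between requests
}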


I'll see where I could use a sleep then, thanks - I thought about the unset() or $var = null approaches but aren't variables meant to be freed up etc. when the function / method they were in ends anyway?


Well, for anyone still wondering, converting the script to use queues with multiple workers via supervisord did the trick. Thanks to those who suggested this approach.
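In rough terms that means something like one queued job per record, with supervisord keeping a handful of queue:work processes alive. A sketch with placeholder class, model and endpoint names rather than the real project:

namespace App\Jobs;

use App\Delegate; // placeholder model
use GuzzleHttp\Client;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class FetchRecord implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    public $recordId;

    public function __construct($recordId)
    {
        $this->recordId = $recordId;
    }

    public function handle()
    {
        // each job fetches and stores a single record, then the worker moves
        // on - memory is released job by job instead of piling up in one run
        $client = new Client(['base_uri' => 'https://api.example.com/']); // placeholder

        $xml = simplexml_load_string(
            (string) $client->get('records/' . $this->recordId)->getBody()
        );

        Delegate::updateOrCreate(
            ['external_id' => (string) $xml->id],
            ['name' => (string) $xml->name]
        );
    }
}

// dispatched from a console command, one job per id:
// foreach ($recordIds as $id) {
//     dispatch(new \App\Jobs\FetchRecord($id));
// }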


This is a little late, but to remind me when I google this issue again in the future... :)

I also added Eloquent to a CodeIgniter project using this method, including wiring its query events into the CodeIgniter profiler: http://jamieonsoftware.com/post/90299647695/using-eloquent-orm-inside-codeigniter-with-added

Later I had what looked like a memory leak while iterating through a large data set. While I had turned off query logging in Eloquent, I had forgotten about the CodeIgniter profiler getting updated every time an Eloquent query event fired. It turns out that was where the memory was getting chewed up; disabling it stabilized the memory usage.
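For the record, the two things to switch off in that kind of long-running loop are Eloquent's query log and anything still listening to its query events (here, the CodeIgniter profiler). With Eloquent bootstrapped via Capsule and set as global, as in the linked article, that looks roughly like:

use Illuminate\Database\Capsule\Manager as Capsule;

// stop the connection keeping every executed query in memory
Capsule::connection()->disableQueryLog();

// and, for long imports, simply don't register the profiler's listener on the
// query event, so each Eloquent query no longer appends to the CI profiler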

