Requests Session
Last updated 2 years ago.

Break up the data: for example, one section A-F, another run G-M, that sort of thing.

Why JSON? Just foreach your results with pagination. Is it a report? What? If it's a report, I SQL-dump the data to my local Windows machine, restore it to my local MySQL server, connect MS Access via ODBC, and can run great reports via MS Access. Some PHP developers seem to make things harder than necessary. I've run reports that are hundreds of pages long this way. Forget that JSON junk and look for efficient solutions.
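The same chunking idea maps onto Eloquent's `chunk()` helper, which slices the result set for you instead of splitting by letter ranges. A minimal sketch, assuming the `Orders` model shown later in this thread:

	// Sketch only: walk the table in fixed-size slices so memory stays flat,
	// instead of loading every row into one array. Assumes the Orders model
	// from the code further down.
	Orders::orderBy('orderNum')->chunk(500, function ($orders) {
		foreach ($orders as $order) {
			// handle one record at a time
		}
	});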


I am querying a remote site via REST with basic auth in order to get data. The query returns a response, which I loop over to manipulate the data and do something with it. Here is my code so you have a clear idea:

	public function store()
	{
		$source = Request::input('source');

		if ($source == 'download') {
			// Get Request Variables
			$datestart = Request::input('shipdate');
			$cp = 1;
			$shipments_count = 0; // running total across all pages

			$ifexist = Logs::find($datestart);
			if ($ifexist !== null) {
				// toArray() returns a copy; capture the result rather than discarding it
				$ifexist = $ifexist->toArray();
				if ($ifexist['status'] == "200") {
					$ds = new FormatDate($datestart, 'm-d-Y');
					$dstart = $ds->get_date();

					Session::flash('message', 'Shipments for shipment date ' . $dstart . ' were already downloaded successfully.');
					return Redirect::to('home');				
				} else {
					$ds = new FormatDate($datestart, 'm-d-Y');
					$dstart = $ds->get_date();

					Session::flash('message', 'A previous download of shipments for shipment date ' . $dstart . ' failed.<br>Please truncate all shipments from that date and try again.');
					return Redirect::to('home');
				}
			}
			
			do {

				// Initiate a new client
				$client = new Client();

				// Disable SSL certificate verification
				$client->setDefaultOption('verify', false);

			    // Create Request and Include Headers
				$request = $client->createRequest(
					'GET','https://api.remotesite.com/shipments', [
						'auth' => [env('API_USER'), env('API_PASS')]
				]);

				// Query Parameters
				$query = $request->getQuery();
				$query->set('shipDateStart', $datestart);
				$query->set('shipDateEnd', $datestart);
				$query->set('pageSize', '500');
				$query->set('includeShipmentItems', 'true');
				$query->set('page', $cp);
				
				// Submit Request
				$response = $client->send($request);

				// Parse Response Into JSON
				$json = $response->json();

				// Paging & Count Variables
				$currentpage = $json['page'];
				$totalpages = $json['pages'];
				$shipments = $json['shipments'];
				$shipments_count += count($shipments);

				// Loop through each order
				foreach ($shipments as $k => $v) {
					// Get line items
					$items = $v['shipmentItems'];

					// Check if line items exist, then proceed with the loop
					if ($items) {
						// Loop through each line item for the order and assign fields to variables
						foreach ($items as $kk => $vv) {
							$orderid = $v['orderId'];
							$ordernum = $v['orderNumber'];
							$shiptoname = $v['shipTo']['name'];
							$sku = $vv['sku'];
							$name = $vv['name'];
							$quantity = $vv['quantity'];
							$unitprice = $vv['unitPrice'];
							$trackingnumber = $v['trackingNumber'];
							$isreturnlabel = $v['isReturnLabel'];
							$voided = $v['voided'];				

							// Database insert
							$order = Orders::create(array(
								'orderId' => $orderid,
								'orderNum' => $ordernum,
								'shipToName' => $shiptoname,
								'shipDate' => $datestart,
								'sku' => $sku,
								'name' => $name,
								'unitPrice' => $unitprice,
								'quantity' => $quantity,
								'voided' => $voided,
								'isReturnLabel' => $isreturnlabel,
								'tracking' => $trackingnumber
							));
						}
					}
				}

				// Advance to the next page
				$cp++;

			} while ( $currentpage < $totalpages );

			// Check if order line items exist in database and update flag
			$fixed = 0;

			// Query my_item_table Then Store To Array
			$qb_items = My_Item_Table::get(array('name'))->toArray();

			// Flatten $qb_items Array
			$qb_items_flat = array_flatten($qb_items);		

			// Query orders whose SKUs are not yet flagged as existing in QB
			$notexist = Orders::where('qbExist', '=', '0')
				->where('voided', '=', '0')
				->where('isReturnLabel', '=', '0')
				->where('qbSubmitted', '=', '0')
				->where('qbImported', '=', '0')
				->get();

			// If there are no results then return a message
			if (count($notexist) == 0) {
				$message = 'Nothing to fix.';
			} else {
				foreach ($notexist as $k => $v) {
					$ifexist = Aliases::where('alias', '=', $v->sku)->first();

					if ($ifexist || in_array($v->sku, $qb_items_flat)) {
						$replace = Orders::where('sku', '=', $v->sku)
							->update(array(
								'qbExist' => 1
							));

						$fixed++;
					}
				}		
			}

			// Log Status Code Then Update Logs Database
			$code = $response->getStatusCode();
			$logcode = Logs::create(array(
				'requestDate' => $datestart,
				'status' => $code
			));

			// Format date before passing to session message
			$ds = new FormatDate($datestart, 'm-d-Y');
			$dstart = $ds->get_date();

			// Flash session message, then redirect back to home
			Session::flash('message', 'Date: ' . $dstart . '<br>Download Shipments Success<br>Total Shipments: ' . $shipments_count . '<br />' . $fixed . ' SKUs auto-fixed');
			return Redirect::to('home');
		}
	}

Basically there's a 500-row limit per request, so I need to take the total page count from the response and loop until the current page reaches it.

So all of this is one single running request, going way over 600 seconds of execution time. I'm not sure yet how to break this up.


Is this data for a large report, or what? You said 4) return to view. Who is viewing all of this? If you're just updating, what exactly is being updated that you have to go through all the records? Is this MySQL, or what database? Can't you connect to the database, loop through the records, and update one at a time? It may take a while, but go have some coffee while it runs. I have run large accounts-receivable reports that weren't a rigmarole like this. Have the folks you are doing this for let you re-design the flow so it makes more sense. Arrays are fine, but I wouldn't use arrays for large amounts of data. Start with record 1, loop... record 500,000. Take Bandag tire: some of their large transaction updates run overnight.
Re-design this stuff to be more efficient. Why are you messing with a REST API? For goodness' sake, connect directly to that database. I have worked with databases with thousands of records but never had a problem like this. Short of that, sorry if I don't understand.

Last updated 9 years ago.

I do appreciate your honest response.

To clarify, this is a REST API from a commercial shipping service. They are not a client or a friend, and as with most commercial companies, they only make certain aspects of their resources available, usually via OAuth2 and REST API combinations. If it were for a partner, friend, or client, I'd certainly have more flexibility and could connect to their database directly to do as I wish, without any restrictions or limitations on request rows.

To answer your question: after this data is processed and stored in the database, it is simply displayed in a view table. That part isn't giving me any issues; it's mainly the process before it, which I was trying to make more efficient.

By the way, I'm not sure what their database is, as it's not accessible anyway, but mine is MySQL.

Last updated 9 years ago.

If they're not a client, what is your part in accessing this data, and why, and for whom? I was the database manager for a trucking co. and I designed it all and did it the way I saw fit; too bad you can't do that. Is there any way to loop through the REST data in a regular way?

dnamotoring said: I often reach my 600 execution time limit and resort to upping it even more.

What is this 600 execution time limit? I guess I am confused about why this limit exists, when some large companies run updates all night on hundreds of thousands of records.

Last updated 9 years ago.
Solution

There are a few ways to tackle your problem.

First, you should profile your script and figure out which part takes how much time. If the bottleneck is database-related (which I suspect it is, because neither fetching a dozen API responses, nor parsing 3-4k objects worth of JSON should take this long), maybe an index or a dozen in the right place is all it takes to cut down on the run time.
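For example, if profiling shows the repeated `Orders::where(...)` lookups are the slow part, a migration along these lines could add indexes on the filtered columns (a sketch only; the `orders` table and column names are assumed from the code above):

	use Illuminate\Database\Schema\Blueprint;
	use Illuminate\Support\Facades\Schema;

	// Sketch: assumes an "orders" table backing the Orders model above
	Schema::table('orders', function (Blueprint $table) {
		$table->index('sku'); // speeds up the per-SKU lookups and updates
		$table->index(array('qbExist', 'voided', 'isReturnLabel')); // covers the multi-column filter
	});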

Second, this is actually the textbook case for queues and asynchronous jobs. Your controller shouldn't do the actual work of fetching data from the API and processing it; it should just set up a job, push it onto the queue, and maybe return some sort of job id which you could then use to poll the server and see if it's done yet, and if it is, what the results are.

Meanwhile a listener/worker (which is just a simple class with a single method containing all the code from your controller) running in the background would pick up this job from the queue, do the actual heavy lifting, and log the results (to the database or wherever).
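A rough sketch of how that could look with the `Queue::push` handler convention of the Laravel versions of that era, where a handler pushed by class name gets its `fire($job, $data)` method called (the `DownloadShipments` class here is hypothetical):

	// Hypothetical handler; the real one would contain the download/insert
	// logic currently living in store().
	class DownloadShipments {

		public function fire($job, $data)
		{
			// ... fetch every page from the API and insert the shipments ...

			// Remove the job from the queue once it has run successfully
			$job->delete();
		}
	}

	// In the controller: queue the work instead of doing it inline
	Queue::push('DownloadShipments', array('shipdate' => $datestart));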


What is this 600 execution time limit? I guess I am confused about why this limit exists, when some large companies run updates all night on hundreds of thousands of records.

Sorry, I simply meant that it was exceeding my php.ini setting of 600 seconds of execution time, and it seems like bad practice to have a client waiting at the browser for that long. Of course jobs can run much longer; the way I was approaching it was just not right.
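For reference, that limit is PHP's standard `max_execution_time` directive; the value mentioned above would look like this in php.ini:

	; php.ini -- caps how long a single request may run before PHP aborts it
	max_execution_time = 600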

Last updated 9 years ago.

Xum said:

There are a few ways to tackle your problem.

First, you should profile your script and figure out which part takes how much time. If the bottleneck is database-related (which I suspect it is, because neither fetching a dozen API responses, nor parsing 3-4k objects worth of JSON should take this long), maybe an index or a dozen in the right place is all it takes to cut down on the run time.

Second, this is actually the textbook case for queues and asynchronous jobs. Your controller shouldn't do the actual work of fetching data from the API and processing it; it should just set up a job, push it onto the queue, and maybe return some sort of job id which you could then use to poll the server and see if it's done yet, and if it is, what the results are.

Meanwhile a listener/worker (which is just a simple class with a single method containing all the code from your controller) running in the background would pick up this job from the queue, do the actual heavy lifting, and log the results (to the database or wherever).

Thanks, I think this is definitely the Laravel way to go about it. I'm assuming that with this, instead of having the user who clicked the button initiate the API call from their browser and wait, it would now simply say "order download requested; you will be notified when it has completed".
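Something like this, perhaps (a sketch reusing the hypothetical `DownloadShipments` handler from the answer above):

	public function store()
	{
		$datestart = Request::input('shipdate');

		// Queue the download instead of running it inside the web request
		Queue::push('DownloadShipments', array('shipdate' => $datestart));

		// Return immediately; a worker does the heavy lifting in the background
		Session::flash('message', 'Order download requested. You will be notified when it has completed.');
		return Redirect::to('home');
	}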


Interestingly enough, this is the exact reason I'm picking up Laravel at this time.

We had a kiosk / live upload system at an event, and ended up getting into all sorts of problems when we were busy.

The new architecture I've planned out offloads all processing to a queue / job system, and we're planning to leverage Laravel's Beanstalkd / Queue implementation, with the local server processing footage to be uploaded as there's capacity, and the remote system emailing the user once it's been received and processed.
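If it helps anyone following along, pointing Laravel 5 at Beanstalkd is a one-line driver switch (assuming the stock queue config and the `pda/pheanstalk` package installed):

	# .env -- select the Beanstalkd queue driver
	QUEUE_DRIVER=beanstalkd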

Last updated 9 years ago.
