[SOLVED] recursively check a url until condition is met?


HaLo2FrEeEk


I have a url that points to a picture.  There is a value in the url that can be changed to show a different picture, but there is a limit to how high the value can go.  Here is an example:

 

http://z2-ec2.images-amazon.com/R/1/a=0756655498+d=_SCR(3,0,0)_+o=01+s=RMTILE+va=MAIN+e=.jpg

 

d=_SCR(3,0,0)_

 

The format for that is:

 

(zoom level, column, row)

 

It's the url for the zoom images on Amazon.  This particular image has 5 columns and 6 rows.  I would like to be able to download the full-size tiles and construct them into one image, instead of having 30 individual smaller images.  What I want to do is repeatedly check the url, increasing the column value until the image no longer returns an "HTTP/1.1 200 OK" status, then do the same with the rows.  For this image it's kinda pointless since I already know how many rows and columns it has, but for other images I won't know without checking manually.  Here's what I have so far:

 

<?php
$asin = $_REQUEST['asin'];
$col = 0;
$row = 0;

$url = "http://z2-ec2.images-amazon.com/R/1/a=".$asin."+d=_SCR(3,".$col.",".$row.")_+o=01+s=RMTILE+va=MAIN+e=.jpg";
$mainheaders = get_headers($url);
if($mainheaders[0] != "HTTP/1.1 200 OK") {
  die("The image does not exist");
  }

$col++;
$row++;

do {
  $rowheaders = get_headers($url);
  $row++;
  } while($rowheaders[0] == "HTTP/1.1 200 OK");
?>

 

When I run this it gives me a 500 Server Error, and I don't know why.  So I tried changing it to this:

 

<?php
$asin = $_REQUEST['asin'];
$col = 0;
$row = 0;

$url = "http://z2-ec2.images-amazon.com/R/1/a=".$asin."+d=_SCR(3,".$col.",".$row.")_+o=01+s=RMTILE+va=MAIN+e=.jpg";
$mainheaders = get_headers($url);
if($mainheaders[0] != "HTTP/1.1 200 OK") {
  die("The image does not exist");
  }

$col++;
$rowheaders = get_headers($url);
echo $url."<br><br>";
print_r($rowheaders);
?>

 

I increase the $col variable by 1, then get the headers again and print the new url along with those headers...this is my result:

 

http://z2-ec2.images-amazon.com/R/1/a=0756655498+d=_SCR(3,0,0)_+o=01+s=RMTILE+va=MAIN+e=.jpg

Array ( [0] => HTTP/1.1 200 OK [1] => Date: Wed, 04 Nov 2009 01:15:37 GMT [2] => Last-Modified: Wed, 02 Sep 2009 18:38:22 GMT [3] => Server: Server [4] => Content-Type: image/jpeg [5] => X-Cache: MISS from cdn-images.amazon.com [6] => X-Cache: MISS from cdn-images.amazon.com [7] => Content-Length: 15902 [8] => Connection: close )

 

Why did it not increase the $col variable in the url?  The new url should be:

 

http://z2-ec2.images-amazon.com/R/1/a=0756655498+d=_SCR(3,1,0)_+o=01+s=RMTILE+va=MAIN+e=.jpg

 

How can I achieve what I want, to keep checking the url until I reach the maximum col and row values?  The way I check whether I'm at the max: since there are 5 columns in this image (numbered from 0), if $col were set to 5 then the first element returned by get_headers() would be "HTTP/1.1 404 Not Found", so I can use that to confirm whether the url exists.

 

Is there some loop I can do?  I don't need it to print anything until it gets to the maximum values of $col and $row.
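For anyone reading along, the reason $col never showed up in the url is worth spelling out: PHP evaluates the concatenation at the moment $url is assigned, so the value $col had right then gets baked into the string, and incrementing $col afterwards never touches $url.  That's also why the first version looped forever on the same url (and presumably hit the execution time limit, hence the 500).  A minimal demo:

```php
<?php
// Concatenation is evaluated once, at assignment time; later changes
// to $col do not rebuild the string.
$col = 0;
$url = "tile(" . $col . ")";     // the value 0 is baked in here
$col++;                          // $url is unaffected
echo $url . "\n";                // prints "tile(0)"

// Rebuilding from a template on every check picks up the new value:
$tpl = "tile(%d)";
echo sprintf($tpl, $col) . "\n"; // prints "tile(1)"
```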

Nevermind, I figured it out:

 

<?php
$asin = $_REQUEST['asin'];

$baseurl = "http://z2-ec2.images-amazon.com/R/1/a=".$asin."+d=_SCR(3,0,0)_+o=01+s=RMTILE+va=MAIN+e=.jpg";
$mainheaders = get_headers($baseurl);
if($mainheaders[0] != "HTTP/1.1 200 OK") {
  die("File does not exist");
  }

$col = 0;
$row = 0;
// %d placeholders let sprintf() rebuild the url with the current values.
$url = "http://z2-ec2.images-amazon.com/R/1/a=".$asin."+d=_SCR(3,%d,%d)_+o=01+s=RMTILE+va=MAIN+e=.jpg";

// Walk the columns until the first 404, then step back to the last good one.
do {
  $col++;
  $colheaders = get_headers(sprintf($url, $col, $row));
  } while($colheaders[0] != "HTTP/1.1 404 Not Found");
$col--;

// Same again for the rows.
do {
  $row++;
  $rowheaders = get_headers(sprintf($url, $col, $row));
  } while($rowheaders[0] != "HTTP/1.1 404 Not Found");
$row--;

echo "Max column index: ".$col."<br>Max row index: ".$row;
?>

 

Simple, and it works.
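From here, the original goal was stitching the tiles into one image.  A sketch of that step using the GD extension (assuming GD is loaded, and that $maxcol/$maxrow are the max indices found by the probe loops above); edge tiles may be smaller than the rest, which is fine because each tile is copied at its own measured size:

```php
<?php
// Stitch a grid of GD images into one. $tiles[$row][$col] holds the
// tile images, e.g. loaded with imagecreatefromjpeg() from the %d
// url template used in the probe script.
function stitch_tiles(array $tiles) {
  // Total width from the first row, total height from the first column.
  $width = 0;
  foreach($tiles[0] as $t) $width += imagesx($t);
  $height = 0;
  foreach($tiles as $r) $height += imagesy($r[0]);

  $full = imagecreatetruecolor($width, $height);
  $y = 0;
  foreach($tiles as $r) {
    $x = 0;
    foreach($r as $t) {
      // Copy each tile at its own size so ragged edges line up.
      imagecopy($full, $t, $x, $y, 0, 0, imagesx($t), imagesy($t));
      $x += imagesx($t);
    }
    $y += imagesy($r[0]);
  }
  return $full;
}

// Usage with the Amazon tiles (network access required):
// $tpl = "http://z2-ec2.images-amazon.com/R/1/a=".$asin."+d=_SCR(3,%d,%d)_+o=01+s=RMTILE+va=MAIN+e=.jpg";
// for($row = 0; $row <= $maxrow; $row++)
//   for($col = 0; $col <= $maxcol; $col++)
//     $tiles[$row][$col] = imagecreatefromjpeg(sprintf($tpl, $col, $row));
// imagejpeg(stitch_tiles($tiles), $asin."_full.jpg", 90);
```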

