
Multiple URLs with SimpleXMLElement


lilgezuz


I'm trying to get some data from multiple URLs using one PHP file and one user ID. I have tried a bunch of different ways and still can't get it to work. In the example below, the first URL pulls the data and then the second fails, but if I remove the first URL and just do the second, it works. I'll also post the error messages I'm getting.

 

// Desired address 
$url = "http://www.myurl.com/api?user=$userid"; 
// Retrieve the URL contents 
$page = file_get_contents($url); 
// Parse the returned XML file 
$xml = new simplexmlelement($page,null,true); 
// Parse the coordinate string

$image= $xml->image;
$title= $xml->title;


// Desired address 
$url1 = "http://www.mysecondurl.com/apitwo?user=$userid"; 
// Retrieve the URL contents 
$page1 = file_get_contents($url1); 
// Parse the returned XML file 
$xml1 = new simplexmlelement($page,1null,true); 
// Parse the coordinate string

$loc= $xml1->location;
$pos= $xml1->position;

 

Fatal error: Uncaught exception 'Exception' with message 'String could not be parsed as XML' in
(13): SimpleXMLElement->__construct('', 0, false) #1 {main} thrown

Your second example is screwed up in a few ways: you pass $page (not $page1) to the constructor, and "1null" isn't valid PHP at all.

 

Did anybody notice that SimpleXMLElement can take a URL as its first argument? Or that the third argument should only be true if $data is a URL?

$url = "http://www.myurl.com/api?user=$userid";
$xml = new SimpleXMLElement($url, null, true);
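If it's still failing, wrap the constructor in try/catch so you can see exactly which URL hands back something that isn't XML. A quick sketch; the message comes straight from the exception SimpleXMLElement throws:

try {
    $xml = new SimpleXMLElement($url, 0, true);
} catch (Exception $e) {
    // e.g. "String could not be parsed as XML" when the response is empty or HTML
    echo "Could not parse $url: " . $e->getMessage();
}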


Still can't get it to work correctly. Is there a way to do arrays with SimpleXML, meaning have an array with multiple URLs in it and then loop over them to get the data I want? Something like the sketch below is what I'm picturing.
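(A sketch with the placeholder URLs from my first post; each parsed document gets collected into $results:)

$urls = array(
    "http://www.myurl.com/api?user=$userid",
    "http://www.mysecondurl.com/apitwo?user=$userid",
);

$results = array();
foreach ($urls as $url) {
    // let SimpleXMLElement fetch each URL itself (third argument true)
    $results[] = new SimpleXMLElement($url, 0, true);
}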

 

I even tried doing the first URL and, if there was a result, sending it to another PHP file to process the second URL, but I get the same kind of fatal error message when I do it like that too.


What's your exact code, and are you sure that the stuff is real XML?

 

Here's my exact code. The links work if I access them directly.

<?php $rootBase = $_SERVER["DOCUMENT_ROOT"];
  include ($rootBase .'/db.php');
  $id = $_GET['id']; ?>
  <head>
	<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
	<meta http-equiv="X-UA-Compatible" content="IE=9" >
</head>
<?php




// Desired address 
$url = "http://www.psnapi.com.ar/ps3/api/psn.asmx/getPSNID?sPSNID=$id"; 
$url1 = "http://www.psnapi.com.ar/ps3/api/psn.asmx/getGames?sPSNID=$id";
// Retrieve the URL contents 

$page = file_get_contents($url);
$page1 = file_get_contents($url1);

// Parse the returned XML file 
$xml  = new SimpleXMLElement($page);
$xml1 = new SimpleXMLElement($page1);

// Parse the coordinate string

    
$id1 = $xml->ID; 
$avatar = $xml->Avatar;
$level = $xml->Level;
$progress = $xml->Progress;
$ttrophies = $xml->Trophies->Total;
$plat = $xml->Trophies->Platinum;
$gold = $xml->Trophies->Gold;
$silver = $xml->Trophies->Silver;
$bronze = $xml->Trophies->Bronze;
$country = $xml->Country->Descripcion;
$flag = $xml->Country->Bandera;
$tpoints = $xml->LevelData->Points; 
$plus = $xml->Plus; 



//mysql_query("SET NAMES utf8"); 

//$addinfo="INSERT INTO users SET psnid='$id1', avatar ='$avatar', level='$level', progress='$progress', trophies='$ttrophies', plat='$plat', gold='$gold', silver='$silver', bronze='$bronze', country='$country', flag='$flag', points='$tpoints', plus='$plus', da= NOW() "; 

//$addres=mysql_query($addinfo);

// Output the coordinates 

echo "ID: $id1<br>
Image: <img src=\"$avatar\"><br>
Level: $level<br>
Progress: $progress<br>
Trophies: $ttrophies<br>
Platinum: $plat<br>
Gold: $gold<br>
Silver: $silver<br>
Bronze: $bronze<br>
Country: $country<br>
Flag: $flag<br>
Points: $tpoints<br>
Plus: $plus<br>

"; 



?>


It looks like there's some kind of throttle in place to limit how frequently you can request the URLs.  When I try in rapid succession, at least one of my requests always gets met with a 403 (Forbidden).
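If you want to see that from PHP rather than the browser: file_get_contents() returns false on the 403, and the http:// wrapper fills in $http_response_header, so you can log the status line. A sketch:

$page = @file_get_contents($url);
if ($page === false) {
    // first element is the status line, e.g. "HTTP/1.1 403 Forbidden"
    echo 'Request failed: ' . $http_response_header[0];
}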

 

The following seems to work ok so you may need to cache responses if you're planning on working with them a lot:

 

<?php
$xml1 = 'http://www.psnapi.com.ar/ps3/api/psn.asmx/getGames?sPSNID=Lilgezus';
$xml2 = 'http://www.psnapi.com.ar/ps3/api/psn.asmx/getPSNID?sPSNID=Lilgezus';

$xml1 = simplexml_load_string(file_get_contents($xml1));
sleep(10);  # wait 10 seconds before loading the next URL
$xml2 = simplexml_load_string(file_get_contents($xml2));

echo "The Game Title from the XML in URL #1 is: {$xml1->Game->Title} <br>";
echo "The ID pulled from the XML at URL #2 is: {$xml2->ID}";
exit;
?>



This works well; it's a little slow, but it pulls what I want. Got one other question: the first URL pulls multiple games, so I tried to do a foreach loop, but it didn't echo the results:

 

foreach ($xml1 as $xml1) {
    $title = $xml1->Game->Title;
    echo $title;
}

 

And what exactly do you mean by caching responses?


This works well; it's a little slow, but it pulls what I want...

 

Well, a 10+ second loading time for a page is unreasonable; that was just an example to illustrate the throttling that seems to be in place.
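As for the foreach: you're overwriting $xml1 with each child element and then looking for a Game child underneath it, so nothing matches. Iterate the <Game> elements themselves. A sketch, assuming the same getGames feed as my example above:

foreach ($xml1->Game as $game) {
    // each $game is one <Game> element from the getGames response
    echo $game->Title . '<br>';
}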

 

And what exactly do you mean by caching responses?

 

Caching is basically just storing the result of an expensive (complex/slow) operation so that you can access the result directly at a later stage without having to redo the slow/expensive operation. In your XML example, if the data at those URLs doesn't change all that often, or your application doesn't require the absolute latest data, you could fetch the data once and save a copy of it somewhere else (e.g. in a database, a file, or memcached) along with an expiry of a few minutes/hours. Then, when the data is needed, your application can check your cached copy of the data before resorting to fetching fresh data via the URLs you provided (slow) and updating your cache.
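A minimal file-based version of that idea (a sketch; the temp-file prefix and 10-minute TTL are just placeholders):

function fetch_xml_cached($url, $ttl = 600) {
    // cache file keyed by URL; only re-fetch when the copy is older than $ttl seconds
    $cacheFile = sys_get_temp_dir() . '/psn_' . md5($url) . '.xml';
    if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
        $fresh = file_get_contents($url);
        if ($fresh !== false) {
            file_put_contents($cacheFile, $fresh);
        }
    }
    return simplexml_load_string(file_get_contents($cacheFile));
}

$xml = fetch_xml_cached("http://www.psnapi.com.ar/ps3/api/psn.asmx/getPSNID?sPSNID=$id");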


I am storing the info in my database and using it later, but 1k+ people could be accessing these URLs from my site. Once they do it the first time, the next update uses the data from the last time they updated, so it doesn't pull as many results.
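That last-update check is essentially the cache test from the previous post. A sketch of how it might gate the API call, reusing the users table and da column from the commented-out INSERT earlier, in the same old mysql_* style as that code (assumes db.php has opened the connection; the 10-minute cutoff is a placeholder):

$res = mysql_query("SELECT da FROM users WHERE psnid = '" . mysql_real_escape_string($id) . "'");
$row = mysql_fetch_assoc($res);

// only hit the API when the row is missing or older than 10 minutes
if (!$row || strtotime($row['da']) < time() - 600) {
    $xml = new SimpleXMLElement("http://www.psnapi.com.ar/ps3/api/psn.asmx/getPSNID?sPSNID=$id", 0, true);
    // ... refresh the row and set da = NOW() ...
}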
