
Well, what I'm trying to do is get the RSS feed from Digg, but it's just returning a blank page.

 

<?php
$curl = curl_init();
curl_setopt($curl, CURLOPT_USERAGENT, 'Digg');
$data = curl_exec($curl);

$xmlFileData = file_get_contents("http://digg.com/rss/index.xml");
$xmlData = new SimpleXMLElement($xmlFileData);

foreach ($xmlData->item as $item) {
    echo $item->title;
}
?>

 

 

https://forums.phpfreaks.com/topic/111398-using-curl-and-simplexml/

cURL is not necessary. This is not perfect, but it should get you headed in the right direction:

 

<?php
$feed = simplexml_load_file('http://digg.com/rss/index.xml');
// This shows SimpleXML how to parse the custom namespace elements.
$digg = $feed->channel->item->children('http://digg.com/docs/diggrss/');
// This may not work without being looped so that it parses the entire XML document.
$title = $feed->channel->item->title;
echo "<h2>$title</h2>\n";
// This should get you at the special digg:diggCount element, which shows
// an example of retrieving info out of custom namespaces.
echo "$digg->diggCount";
?>

 

There is still some work for you to do with looping through the XML, so let me know if you get stuck again.

I tried the code you posted, and for some reason Digg's XML doesn't read in like other XML I have used. I played around with it, and this is what I came up with (and it works):

 

<?php
$diggFeed = file_get_contents('http://www.digg.com/rss/index.xml');
$feed = new SimpleXMLElement($diggFeed);
$namespaces = $feed->getNamespaces(true);

foreach ($feed->channel->item as $item) {
    $title = $item->title;
    $guid = $item->guid;
    $description = $item->description;
    $digg = $item->children($namespaces['digg']);
    echo "<p><a href='$guid'>$title</a> by " . $digg->submitter->username . "<br />$description</p><br />\n";
}
?>

In terms of simplicity, file_get_contents() is the simplest solution. But if you are requesting information from more than one source, I would highly recommend cURL, as it allows you to make requests to any number of URLs in parallel. You can fetch as many remote files as you want and the total wait is roughly that of the slowest request, whereas with file_get_contents() (or fopen(), fsockopen(), etc.) you have to make the requests one after the other.
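To make that concrete, here is a minimal sketch of parallel fetching with PHP's curl_multi functions. The second feed URL is just an example placeholder; substitute whatever sources you are actually pulling from:

```php
<?php
// Example URLs only; replace with your own feeds.
$urls = array(
    'http://digg.com/rss/index.xml',
    'http://rss.slashdot.org/Slashdot/slashdot',
);

$multi = curl_multi_init();
$handles = array();

foreach ($urls as $url) {
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
    curl_setopt($curl, CURLOPT_USERAGENT, 'MyFeedReader/1.0'); // hypothetical agent string
    curl_multi_add_handle($multi, $curl);
    $handles[$url] = $curl;
}

// Drive all the transfers at once until every one has finished.
$running = null;
do {
    curl_multi_exec($multi, $running);
    curl_multi_select($multi); // wait for activity rather than busy-looping
} while ($running > 0);

// Collect each response; each body can then go straight into SimpleXMLElement.
foreach ($handles as $url => $curl) {
    $body = curl_multi_getcontent($curl);
    echo $url . ': ' . strlen($body) . " bytes\n";
    curl_multi_remove_handle($multi, $curl);
    curl_close($curl);
}
curl_multi_close($multi);
?>
```

Both requests are in flight at the same time, so the loop finishes when the slowest one does rather than after the sum of all of them.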
