refined Posted June 5, 2008

Alright, so basically, I need a function that takes the cURL output from a page and grabs the links inside it. The links are all from rapidshare.com and generally follow this format:

http://rapidshare.com/files/3459349/etcetcetc.r00
http://rapidshare.com/files/81027576/blahblahblah234.rar
http://www.rapidshare.com/files/81027576/etcetcetc.rar
http://www.rapidshare.com/files/81027576/etdasflkjas.r00

That's essentially how the links look. I know how to code everything except the regex that grabs the links from the page. I've tried several things, including the classic:

// Returns every substring found between $openingMarker and $closingMarker
// (the markers themselves are not included in the result).
function returnSubstrings($text, $openingMarker, $closingMarker) {
    $openingMarkerLength = strlen($openingMarker);
    $closingMarkerLength = strlen($closingMarker);
    $result = array();
    $position = 0;
    while (($position = strpos($text, $openingMarker, $position)) !== false) {
        $position += $openingMarkerLength;
        if (($closingMarkerPosition = strpos($text, $closingMarker, $position)) !== false) {
            $result[] = substr($text, $position, $closingMarkerPosition - $position);
            $position = $closingMarkerPosition + $closingMarkerLength;
        }
    }
    return $result;
}

$cURLoutput = str_replace("www.rapidshare.com", "rapidshare.com", $cURLoutput);
$linkarray = returnSubstrings($cURLoutput, "http://rapidshare.com", ".rar");
$linkarray2 = returnSubstrings($cURLoutput, "http://rapidshare.com", ".[0-9]");

Obviously, I left the cURL stuff out. This is clearly not the most effective way to do it, so what is the best method? Yeah, it's very ugly.
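For reference, here's a minimal preg_match_all() sketch. It is not part of the original post; the helper name and pattern are illustrative, and it assumes the fetched page body is already in $cURLoutput. It collects both the .rar links and the .rNN split-archive parts in one pass:

<?php
// Sketch only: pull rapidshare.com file links (.rar and .r00-style parts)
// out of a blob of HTML fetched with cURL.
function grabRapidshareLinks($cURLoutput)
{
    // Match http(s)://rapidshare.com/files/<digits>/<filename>.rar or .rNN,
    // with or without the www. prefix; stop at whitespace, quotes, or tag brackets.
    $pattern = '#https?://(?:www\.)?rapidshare\.com/files/\d+/[^\s"\'<>]+\.(?:rar|r\d{2})#i';
    preg_match_all($pattern, $cURLoutput, $matches);

    // Normalize www.rapidshare.com to rapidshare.com (mirroring the
    // str_replace() step in the original attempt) and drop duplicates.
    $links = str_replace('www.rapidshare.com', 'rapidshare.com', $matches[0]);
    return array_values(array_unique($links));
}

// Usage (assuming $cURLoutput already holds the fetched page):
// $links = grabRapidshareLinks($cURLoutput);
// print_r($links);
?>

A single alternation like (rar|r\d{2}) avoids the two separate returnSubstrings() calls, and the full URL comes back intact instead of only the text between the two markers.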