
Retrieving a URL's content


pepsimax123


There are many options available for downloading a URL, but I'm stuck.

 

I've looked through all the ones I know, but none seem to pay attention to partial content.

 

I'm trying to retrieve a URL that gives the following header:

HTTP/1.1 206 Partial Content

Content-Range: bytes 0-100000/631723

 

As you can see it dishes out the file in 100,000 byte chunks.

 

Trouble is, when I use any method in PHP (i.e. file_get_contents, fopen, or even cURL), none of them continues after retrieving the first 100,000 bytes.

 

The end result is a 100,000-byte file.

 

What I need is for the PHP script to grab all the data: in the example above, all 631,723 bytes.

 

How can I do this?
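For later readers, here's a rough sketch of the kind of thing involved: fetching a single ranged chunk with cURL and reading the total file size out of the Content-Range header. The URL is a hypothetical placeholder, and this is only a first step, not a complete solution.

```php
<?php
// Sketch: request the first 100,000-byte chunk with cURL and parse the
// total size from the Content-Range header of the 206 response.
$url = 'http://example.com/file.dat'; // hypothetical placeholder URL

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);          // keep headers in the response
curl_setopt($ch, CURLOPT_RANGE, '0-99999');      // ask for the first chunk
$response   = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

$headers = substr($response, 0, $headerSize);
$body    = substr($response, $headerSize);

// e.g. "Content-Range: bytes 0-100000/631723" -> total is 631723
if (preg_match('#Content-Range:\s*bytes\s+\d+-\d+/(\d+)#i', $headers, $m)) {
    $total = (int)$m[1];
    echo "Total size: $total bytes; received " . strlen($body) . " so far\n";
}
```

Knowing the total, you could then repeat the request with further Range offsets until you have the whole file.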


I have indeed, but this is not something I had hoped to do.

 

The server it retrieves data from is set up in such a poor way that I don't know if this is purposeful or not, but it does make things a pain!

 

Sending a HEAD request rather than a GET returns the correct Content-Length, so I now retrieve the total length first, then loop through in 100,000-byte parts and merge them, as you say.

 

This is necessary, sadly, because the server does not pay attention to any other ranges you give it: you can only go in 100,000-byte blocks, and requesting HTTP 1.0 still gives you HTTP 1.1 headers!
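The workaround described above can be sketched roughly as follows. This assumes the server honours fixed 100,000-byte ranges as in the thread; the URL is a hypothetical placeholder.

```php
<?php
// Sketch of the workaround: a HEAD request to learn the real
// Content-Length, then GETs in fixed 100,000-byte blocks, merged
// into a single file.
$url       = 'http://example.com/file.dat'; // hypothetical placeholder URL
$chunkSize = 100000;                        // the only block size the server honours

// 1) HEAD request to learn the total length.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD instead of GET
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$total = (int)curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);

// 2) GET each 100,000-byte block and append it.
$data = '';
for ($offset = 0; $offset < $total; $offset += $chunkSize) {
    $end = min($offset + $chunkSize - 1, $total - 1);
    $ch  = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_RANGE, $offset . '-' . $end);
    $data .= curl_exec($ch);
    curl_close($ch);
}

file_put_contents('download.dat', $data); // all 631,723 bytes in the example
```

For very large files you would append each chunk straight to disk rather than accumulate it in a string, but the structure is the same.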
