FTP size limit for downloads?


rainjam


I'm trying to create a script which will download a given file from our ftp site without needing a username and password. The background to this is that we have to send a lot of large files (often 500MB+) to clients, and having tried various existing HTTP transfer options (wetransfer, ge.tt, YouSendIt etc), we haven't yet found one which is reliable enough to stick with: they're either stuffed with ads, get blocked by various companies' firewalls, or fall over during large uploads. The most reliable way seems to be to send things via ftp, but this creates problems for some people when we send a link like this:

 

ftp://username:password@ftpserver.com/link/to/file.zip

 

as they often get asked for a password (not realising it's in the link already), or the link gets blocked by their firewall, or whatever. It also doesn't easily tell us if they've actually downloaded the file. Most of the people we're sending these links to don't have FTP client software, either, so specifying the server name, username & password isn't generally an option.

 

So, as a solution I want to be able to send people a link like:

 

http://oursite.com/dl?f=ZToxJnU6MjY=

 

This will then look up the relevant file, download it from our ftp for them without needing them to log in, and notify us. I've tried a couple of different methods to achieve this, but for some reason, whichever way I use (so far I've tried both @readfile($file) and print(@fread($file, 1024 * 8))), the download gets to 63.6 MB (66,711,732 bytes, or within 1 KB of this) and then cuts off. If I try two downloads concurrently, it cuts them off when the total size has reached 63.6 MB.
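
To give an idea of what I mean, here's a stripped-down sketch of the fread() version (not the exact script - the ftp URL, filename and size below are just placeholders standing in for values pulled from our database):

<?php
// Simplified sketch only - the real script builds these values from the db
$url  = 'ftp://username:password@ftpserver.com/link/to/file.zip';
$size = 66711732; // file size looked up beforehand

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.zip"');
header('Content-Length: ' . $size);

$fp = fopen($url, 'rb');
if ($fp === false) {
    die('Could not open file');
}
while (!feof($fp)) {
    print(fread($fp, 1024 * 8));
    flush();
}
fclose($fp);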

 

I've tried going through php.ini and have increased the limits for default_socket_timeout, max_execution_time and memory_limit, but this doesn't seem to make any difference. The script is based on Armand Niculescu's at http://www.richnetapps.com/php-download-script-with-resume-option/. Any ideas why this could be happening?


There are other possibilities for a timeout, such as a CGI timeout if you're set up in CGI mode. phpinfo() would tell you what SAPI you're using.
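
For example, a quick check (php_sapi_name() reports the same value as the "Server API" line in phpinfo()):

<?php
// Prints the SAPI in use, e.g. "cgi-fcgi" or "apache2handler"
echo 'SAPI: ' . php_sapi_name() . "\n";
phpinfo(INFO_GENERAL); // the General section also shows which php.ini was loaded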

 

Also remove the @ from in front of your function calls. That operator hides any error messages that may come up, and should be used very rarely.
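
While you're debugging, it also helps to turn reporting all the way up so nothing gets swallowed, e.g.:

<?php
// Temporarily surface every notice and warning instead of suppressing them
error_reporting(E_ALL);
ini_set('display_errors', '1');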

 

After the download stops, open the resulting file in a text editor or hex editor and check if there are any error messages printed out (after you've fixed the @ problem).


BTW, the details for the system this is running on are -

 

PHP version 5.3.14

MySQL 5.1.36 (file info is stored in a MySQL db)

SERVER_PROTOCOL: HTTP/1.1

 

This happens in exactly the same way on Firefox 22, Chrome 28 and IE 10, all on Windows 7 x64. IE helpfully explains that the download was interrupted. All the progress bars look fine while downloading (i.e. the file size is reported correctly).



Hi Kicken

 

Thanks. I've removed the @s but the files still get cut off. It's running CGI/1.1.

 

I also opened the resulting file in PSPad, and got a "Warning: fseek(): stream does not support seeking in [xxx] on line 346..."

 

So I commented out the fseek line (it's only there as part of a section that's supposed to allow resuming a download), but I still get a truncated file. This one is a few bytes smaller than the ones I've downloaded previously (66,711,552 bytes as against 66,711,732 bytes), presumably because it no longer has the error message in it.
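
In case it's useful to anyone else: the warning seems to be because ftp:// streams aren't seekable, so a rough guard around the resume section (just a sketch, with a placeholder URL) would be something like:

<?php
// Placeholder URL - the real one is built from the db record
$fp = fopen('ftp://username:password@ftpserver.com/link/to/file.zip', 'rb');

$range_start = 0; // would be set from the client's Range header, if any
$meta = stream_get_meta_data($fp);

if ($range_start > 0 && $meta['seekable']) {
    // only try to honour a resume request on streams that support seeking
    fseek($fp, $range_start);
}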

 

I don't know if it's an actual timeout (in seconds), as it always seems to get to the same size no matter how quickly the download is going.


Why don't you just do it like you said:

 

http://oursite.com/dl?f=ZToxJnU6MjY=

 

Look up the path/to/file based on the "f" param (I assume that's what you're doing), but instead of trying to open and output the file with PHP, just redirect to the actual file.

 

example:

 

<?php

// grab the "f" param and look up the path/to/filename, put it into $file
$f    = isset($_GET['f']) ? $_GET['f'] : '';
$file = ''; // set this to the real URL/path you look up for $f

// execute whatever code you want that tracks that the user downloaded it

header("Location: $file");
exit();
 

@Josh - good idea - I could, but I suspect that won't solve the problem some people are having, where they'll occasionally get asked for a username and password even though these are explicitly in the link (and it's difficult to get details from them about why, what they're doing, or how their systems are set up - they're non-technical and just want the file...). You're right, we're storing the full ftp://user:pass@somewhere.com/file.zip link in the db, then pulling that apart to retrieve the username & password etc.
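
For what it's worth, the "pulling apart" can be done with parse_url() on the stored link - something like this (placeholder link, not one of our real ones):

<?php
// Placeholder link - the real one comes out of the MySQL db
$link  = 'ftp://username:password@ftpserver.com/link/to/file.zip';
$parts = parse_url($link);

$user = $parts['user'];   // "username"
$pass = $parts['pass'];   // "password"
$host = $parts['host'];   // "ftpserver.com"
$path = $parts['path'];   // "/link/to/file.zip"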

 

I've stripped out a lot of extraneous code from the script and tried a simplified version running on WAMP on my laptop, and that works fine - I've tested it with a 2 GB file, which took 15 minutes but worked perfectly. Running exactly the same script on the server again only downloads 63.6 MB of the file. So it's some kind of server config issue - I'm just not sure what!

 

The following three php.ini variables don't seem to have any bearing on the result, but for reference they're currently set to:

 

memory_limit: (server) 256M, (laptop) 512M [the server has 2GB RAM, the laptop 16GB, so I can't bump this up on the server too much more]

max_execution_time: (server) 4800, (laptop) 10000 [both these seem to me to be massively more than necessary anyway]

default_socket_timeout: (server) 1200, (laptop) 60
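
One quick way to confirm the values actually in effect at runtime (in case a different php.ini, or a per-directory override, is being loaded) is to dump them from the script itself, e.g.:

<?php
// Print the values PHP is actually using for this request
$settings = array('memory_limit', 'max_execution_time', 'default_socket_timeout');
foreach ($settings as $name) {
    echo $name . ' = ' . ini_get($name) . "\n";
}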


Well, it may not solve the problem, but it takes the problem out of PHP's hands entirely by removing PHP from the equation.

 

Even if you find that the problem is somewhere else (e.g. maybe your server's Apache Timeout directive, or the CGI timeout if you're on IIS, is too low), PHP is still acting as an unnecessary middleman.

 

But if you wanna keep poking at PHP, for shits and grins I would set max_execution_time to 0 (make it unlimited) and see if the downloads still get cut off.
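
e.g. right at the top of the download script:

<?php
// Lift PHP's execution time limit for this request only
// (same effect as max_execution_time = 0)
set_time_limit(0);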


OK, cool. Of the two things I'm trying to achieve by doing it with PHP, getting people a link that just works every time, and isn't blocked or stripped out by their firewall, is the more important; download tracking is useful but not essential.

 

I'm going to keep plugging away for now because I have a feeling this'll be one of those things that's solved by something really tiny and stupid :/

