I'm trying to create a script that will download a given file from our FTP site without needing a username and password. The background to this is that we have to send a lot of large files (often 500MB+) to clients, and having tried various existing HTTP transfer options (WeTransfer, ge.tt, YouSendIt etc.), we haven't yet found one reliable enough to stick with: they're either stuffed with ads, get blocked by various companies' firewalls, or fall over during large uploads. The most reliable way seems to be to send things via FTP, but this creates problems for some people when we send a link like this:
ftp://username:password@oursite.com/link/to/file.zip
as they often get asked for a password (not realising it's in the link already), or the link gets blocked by their firewall, or whatever. It also doesn't easily tell us if they've actually downloaded the file. Most of the people we're sending these links to don't have FTP client software, either, so specifying the server name, username & password isn't generally an option.
So, as a solution I want to be able to send people a link like:
http://oursite.com/dl?f=ZToxJnU6MjY=
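Roughly, the `dl` endpoint would do something like this (a sketch only; the token format, lookup, and helper functions here are illustrative, not my actual code):

```php
<?php
// dl.php -- illustrative sketch of the lookup endpoint.
// The ?f= parameter is a base64-encoded lookup key.
$token = base64_decode($_GET['f'], true);
if ($token === false) {
    http_response_code(404);
    exit;
}

// Map the token to a file record (lookup_file_for_token is a
// hypothetical helper standing in for the real database query).
$file = lookup_file_for_token($token);
if ($file === null) {
    http_response_code(404);
    exit;
}

// Record the download so we know the client actually fetched it
// (log_download is likewise a hypothetical helper).
log_download($file['id'], $_SERVER['REMOTE_ADDR']);

// ...then stream the file from the FTP server to the client.
```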
This will then look up the relevant file, download it from our FTP server for them without needing them to log in, and notify us. I've tried a couple of different methods to achieve this, but whichever way I use (so far both @readfile($file) and a loop of print(@fread($file, 1024*8))) the download gets to 63.6 MB (66,711,732 bytes, or within 1 KB of this) and then cuts off. If I try two downloads concurrently, both get cut off when the combined total reaches 63.6 MB.
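For reference, a simplified version of the fread loop I'm using (based on the richnetapps script; the URL, filename, and $filesize are placeholders):

```php
<?php
// Stream a file from the FTP server to the client in chunks.
$url = 'ftp://username:password@oursite.com/link/to/file.zip';

set_time_limit(0);                         // no script timeout
while (ob_get_level()) { ob_end_clean(); } // drop any PHP output buffers

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.zip"');
header('Content-Length: ' . $filesize);    // size obtained beforehand, e.g. via ftp_size()

$fp = @fopen($url, 'rb');                  // PHP's ftp:// stream wrapper
if ($fp === false) {
    http_response_code(500);
    exit;
}
while (!feof($fp)) {
    print(@fread($fp, 1024 * 8));          // 8 KB chunks
    flush();                               // push each chunk out to the client
}
fclose($fp);
```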
I've tried going through php.ini and have increased the limits for default_socket_timeout, max_execution_time and memory_limit, but this doesn't seem to make any difference. The script is based on Armand Niculescu's at http://www.richnetapps.com/php-download-script-with-resume-option/. Any ideas why this could be happening?
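In case it matters, I've also tried setting the same limits per-request rather than in php.ini (values here are just the ones I experimented with):

```php
<?php
// Per-request equivalents of the php.ini settings I increased.
ini_set('default_socket_timeout', '600');  // seconds for the FTP read socket
ini_set('memory_limit', '512M');           // shouldn't matter when streaming in chunks
set_time_limit(0);                         // same as max_execution_time = 0 (no limit)
```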