
creating large zip files... any way to prevent running out of memory?


scooter41

Recommended Posts

Hey There,

 

I am using a PHP system that creates zips on the fly, adding files before delivering the zip to the user via HTTP.

 

Everything works fine when adding smaller MP3 files, but as soon as you add, say, 4 or 5 WAV files of around 12 MB each, the script soon runs out of memory. It appears to keep adding the files to RAM rather than flushing and writing the temporary archive to disk.
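
For contrast, here is an untested sketch of a disk-based approach using PHP's bundled ZipArchive extension (not the Archive_Zip package linked below); all paths and entry names are placeholders:

<?php
// Untested sketch: build the archive on disk with the ZipArchive extension
// instead of holding everything in memory. Paths and entry names are placeholders.
$zipPath = tempnam(sys_get_temp_dir(), 'zip');

$zip = new ZipArchive();
if ($zip->open($zipPath, ZipArchive::OVERWRITE) !== true) {
    die('could not create archive');
}

// addFile() only records a reference to the source file; the data is read
// and compressed when close() runs, so a 12 MB WAV never has to sit in RAM
// alongside all the others.
$zip->addFile('/path/to/take1.wav', 'take1.wav');
$zip->addFile('/path/to/take2.wav', 'take2.wav');
$zip->close();

Because the archive is assembled on disk when close() is called, peak memory should stay roughly constant no matter how many WAVs are added.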

 

I am using the following package to create the zip files, but the same thing happens with others I have tried:

 

http://pear.php.net/package/Archive_Zip/docs/latest/Archive_Zip/Archive_Zip.html

 

Does anyone have any suggestions? Many thanks in advance for your help.

 

Rgds

 

Scott

 

 

Right, I have set it to 256 MB already, but the things that concern me are:

 

Firstly, someone will probably be downloading 30 WAV files at a time, so that's on average 300 MB, which is over the limit. Plus, this is likely to be a hard-hitting site, so if, say, 10 people are on it at once, that's just going to nuke the RAM...
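
One way to keep the per-request footprint down, at least on the delivery side, is to stream the finished archive to the browser in small chunks instead of loading it all before output. A rough, untested sketch ($zipPath and the download name are made up):

<?php
// Untested sketch: send the finished archive in small chunks so the download
// itself never pulls the whole file into PHP's memory.
$zipPath = '/tmp/order.zip'; // placeholder for wherever the archive was written

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="files.zip"');
header('Content-Length: ' . filesize($zipPath));

// (make sure output buffering is off, e.g. ob_end_clean(), or the chunks pile up)
$fh = fopen($zipPath, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192);  // 8 KB at a time keeps peak memory flat
    flush();
}
fclose($fh);
unlink($zipPath);           // remove the temporary archive once it has been sent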

I will try having a look at that... the trouble is that the user is able to choose which files to zip, rather than selecting everything from one folder, and the files are also renamed before being added to the archive... so the PHP solution was ideal.
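
For what it's worth, if ZipArchive is an option, the second argument to addFile() renames the entry inside the archive, so per-user file picking and renaming don't force a pure-PHP build in memory. A rough sketch with made-up paths and names:

<?php
// Untested sketch: user-selected files, each given a friendlier name inside
// the archive. The $selection array stands in for whatever the form posts back.
$selection = array(
    '/media/store/8f3a91.wav' => 'My Song - Take 1.wav',
    '/media/store/91bc02.wav' => 'My Song - Take 2.wav',
);

$zip = new ZipArchive();
$zip->open('/tmp/custom.zip', ZipArchive::OVERWRITE);
foreach ($selection as $sourcePath => $displayName) {
    // the second argument is the entry name the user sees after unzipping
    $zip->addFile($sourcePath, $displayName);
}
$zip->close();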

 

I'm not sure how graceful that would be from the command line; I will take a look.
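
In case it helps, here is a rough sketch of what the command-line route might look like from PHP. The zip binary, the staging directory, and all the paths are assumptions, not part of the current setup; since the CLI zip stores entries under their on-disk names, renamed copies are staged in a scratch directory first:

<?php
// Rough sketch: shell out to the system `zip` binary (assumes it is installed
// and on PATH). Everything here is a made-up path.
$stage = sys_get_temp_dir() . '/zipstage_' . uniqid();
mkdir($stage);
copy('/media/store/8f3a91.wav', $stage . '/My Song - Take 1.wav');

$zipPath = sys_get_temp_dir() . '/order.zip';
$files   = glob($stage . '/*');
$cmd     = 'zip -j ' . escapeshellarg($zipPath) . ' '
         . implode(' ', array_map('escapeshellarg', $files));

exec($cmd, $out, $status);   // the zip binary does the work outside PHP's memory limit
if ($status !== 0) {
    die('zip command failed');
}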

