ganast Posted June 18, 2007

I am using FPDF (fpdf.org) to generate a PDF from any page on a site. One page is about 200 letter-size pages long, and generating it fails unless I set the PHP memory_limit directive to about 1400M. Obviously I don't want to do this. I also don't want to rewrite the FPDF classes so that they use the filesystem more aggressively, although that is probably the best answer.

I am wondering if it is possible to force PHP to use the filesystem instead of RAM in certain scripts. In the case of this particular page, I know it is going to take forever to generate the file. I will just have the server generate the PDF at 3 AM each day and cache the actual PDF for retrieval throughout the day, so I don't care that hitting the filesystem slows down the PDF creation. I just need some means of generating large documents while keeping a reasonable memory_limit of probably 32M in php.ini or the Apache directive.

Is this at all possible, or do I need to re-code part of the script?

--gabe
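The "generate nightly, serve from cache" idea described above could be sketched roughly like this. The cache path and the makeBigPdf() helper are hypothetical placeholders, not part of FPDF; the point is that only the generation step needs the raised limit, while serving the file with readfile() streams from disk and stays within a small memory_limit.

```php
<?php
// Minimal sketch of the cache-then-serve approach (paths are hypothetical).

$cacheFile = '/var/cache/myapp/big-report.pdf'; // hypothetical cache location
$maxAge    = 24 * 60 * 60;                      // regenerate at most once a day

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $maxAge) {
    // Heavy work happens only when the cache is stale. In practice this
    // branch would run from a 3 AM cron job, not inside a user request.
    ini_set('memory_limit', '1400M');  // raise the limit for this script only
    makeBigPdf($cacheFile);            // hypothetical helper: builds the PDF
                                       // and saves it with FPDF's Output('F', $cacheFile)
}

// Serving the cached file does not load it into PHP memory all at once.
header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($cacheFile));
readfile($cacheFile);
```

Because readfile() streams the file in chunks, the serving path should comfortably fit under a 32M memory_limit regardless of the PDF's size.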
btherl Posted June 19, 2007

If you configure PHP to use that much memory, and configure your operating system to use disk (swap) when physical memory runs out, then you can do the processing. It may be slow, but as long as FPDF doesn't need to touch all 1400M at once, it ought to work.

By the way, you can raise the memory limit for one script only, either on the command line ( php -d memory_limit=1400M ) or at the start of your script ( ini_set('memory_limit', '1400M'); ).
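Combining the per-script limit above with the original poster's 3 AM plan, a crontab entry along these lines would do it (the script path is a hypothetical example):

```shell
# Hypothetical crontab entry: regenerate the PDF at 3 AM with a raised
# memory limit for this one invocation, leaving the global php.ini
# memory_limit (e.g. 32M) untouched for web requests.
0 3 * * * /usr/bin/php -d memory_limit=1400M /var/www/scripts/generate_pdf.php
```

The -d flag overrides an ini directive only for that CLI invocation, so the web server's configuration is unaffected.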