
Memory and the GD library


Drongo_III


Hello

 

I have recently been working with the GD library as part of an image upload system. I noticed that one of my scripts was failing because it exceeded the available memory.

 

Anyway, while researching this I discovered that the GD library can be quite memory hungry: the pixel width and height of even a small image can require a lot of memory, even when the file size is quite low.
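For anyone who runs into the same thing, here is a rough sketch of estimating GD's decode cost from the pixel dimensions before actually loading the image. The multipliers are rule-of-thumb figures (not official numbers), and 'upload.jpg' is just a placeholder path.

<?php
// Rough estimate of what GD needs to decode an image, based on its pixel
// dimensions rather than its file size. Multipliers are rule-of-thumb values.
$file = 'upload.jpg';                        // placeholder path
$info = getimagesize($file);                 // reads only the header, cheap
if ($info === false) {
    die('Not a valid image');
}
list($width, $height) = $info;
$channels = isset($info['channels']) ? $info['channels'] : 4;

// pixels * (bytes per pixel + 1) * overhead factor
$estimatedBytes = $width * $height * ($channels + 1) * 1.7;

if ($estimatedBytes > 64 * 1024 * 1024) {    // compare against your memory_limit
    die('Image too large to process safely');
}

$img = imagecreatefromjpeg($file);           // only decode once we know it fits
// ... process the image, then imagedestroy($img) to free the buffer ...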

 

So my question is: how do really big sites deal with this? If you have a high-volume upload form, for instance, I could imagine the drain on memory being quite massive.

 

So do they just throw hardware at it, or are there more efficient image libraries than GD?

 

Incidentally, I corrected the memory issue by changing the ini setting for the script, just in case anyone else runs into that problem.
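A minimal sketch of that kind of fix, assuming the setting in question is memory_limit (the value and the upload field name are just examples):

<?php
// Raise PHP's memory ceiling for this script only, rather than globally
// in php.ini. Pick a value that suits the server.
ini_set('memory_limit', '256M');

$img = imagecreatefromjpeg($_FILES['photo']['tmp_name']);   // hypothetical upload field
// ... resize / crop / watermark with GD ...
imagedestroy($img);   // release GD's buffer as soon as the work is done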


So my question is: how do really big sites deal with this? If you have a high-volume upload form, for instance, I could imagine the drain on memory being quite massive.

Handle the image operations external to PHP. For instance, exec() out to ImageMagick or similar. PHP's memory limits and other restrictions do not apply to external processes. Such programs may also have better optimizations and do the job quicker.
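A minimal sketch of that approach, shelling out to ImageMagick's convert command (the paths and the 800x800 size are placeholders):

<?php
// Hand the resize off to ImageMagick's command-line tool instead of GD.
// escapeshellarg() guards against shell injection from user-supplied names.
$src = '/tmp/upload_abc123.jpg';
$dst = '/var/www/images/thumb_abc123.jpg';

$cmd = sprintf(
    'convert %s -resize 800x800 %s 2>&1',
    escapeshellarg($src),
    escapeshellarg($dst)
);

exec($cmd, $output, $status);
if ($status !== 0) {
    // The external process failed; $output holds convert's error text
    error_log('ImageMagick failed: ' . implode("\n", $output));
}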


Handle the image operations external to PHP. For instance, exec() out to ImageMagick or similar. PHP's memory limits and other restrictions do not apply to external processes. Such programs may also have better optimizations and do the job quicker.

 

 

I am interested in understanding this better, so apologies if I'm being slow. When you say use exec(), do you mean to execute a program on an external server? How would using exec() on the same server get around the issue of using up memory, since the server's resources will be used just the same? Sorry, I realise this is all a bit abstract, but I can foresee a time in the future when I'll need to build something similar for a much higher-volume website, so I'd like to understand a good approach.


You could farm the processing out to a dedicated machine (or multiple machines) if the need was there. Until you hit a massive scale, though, just running the process on the same machine would suffice. PHP limits its own memory consumption; just because you hit a limit in PHP does not mean that the system cannot do more. If you do reach a point at which the system is overloaded, then yes, the answer is just "throw hardware at it" by adding additional servers, either as dedicated image processors or just additional web servers to distribute the load.
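An easy way to see that the limit belongs to PHP rather than the machine is to compare the configured memory_limit with what a script actually peaks at (the canvas size below is arbitrary):

<?php
// memory_limit is a per-process ceiling PHP enforces on itself; the box
// usually has far more RAM available than this value suggests.
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";

$img = imagecreatetruecolor(3000, 2000);   // roughly 24 MB of pixel data in GD
echo 'peak usage:   ' . round(memory_get_peak_usage(true) / 1048576) . " MB\n";
imagedestroy($img);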
