Drongo_III Posted September 12, 2013

Hello,

I have recently been working with the GD library as part of an image upload system, and I noticed that one of my scripts was failing because it exceeded the available memory. Researching this, I discovered that GD can be quite memory hungry: the pixel dimensions of even a small image can require a lot of memory, even when the file size on disk is low, because GD works on the decompressed bitmap.

So my question is: how do really big sites deal with this? With a high-volume upload form, I could imagine the memory drain being quite massive. Do they just throw hardware at it, or are there image libraries more efficient than GD?

Incidentally, I fixed the memory issue by changing the memory_limit ini setting for that script, just in case anyone else hits the same problem.
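For anyone hitting the same wall, the file size is misleading because GD decodes the image into an uncompressed bitmap. Here is a rough sketch of the arithmetic and the ini workaround (the dimensions and the 5-bytes-per-pixel factor are hypothetical rule-of-thumb figures, not exact values):

```php
<?php
// GD allocates the *decompressed* bitmap, so memory use depends on pixel
// dimensions, not on the JPEG/PNG file size on disk.
$width  = 3000;   // hypothetical photo dimensions
$height = 2000;
$approx = $width * $height * 5;   // ~4 bytes/pixel RGBA plus GD overhead
echo round($approx / 1048576) . " MB\n";   // roughly 29 MB for one image

// The fix mentioned above: raise the limit for this one script only,
// instead of editing php.ini globally.
ini_set('memory_limit', '256M');
```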
kicken Posted September 12, 2013

> So my question is: how do really big sites deal with this? With a high-volume upload form, I could imagine the memory drain being quite massive.

Handle the image operations externally to PHP, for instance by using exec() to run ImageMagick or a similar tool. PHP's memory limit and other restrictions do not apply to external processes, and such programs often have better optimizations and get the job done faster.
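As a sketch of what that might look like (the file paths and the 800x800 geometry are made up, and it assumes ImageMagick's convert binary is installed and on the server's PATH):

```php
<?php
// Resize an upload with ImageMagick rather than GD. The external process
// runs in its own address space, so PHP's memory_limit does not apply to it.
$src = '/tmp/upload_abc123.jpg';      // hypothetical paths
$dst = '/var/www/images/thumb.jpg';

$cmd = sprintf(
    'convert %s -resize 800x800 %s 2>&1',
    escapeshellarg($src),             // always escape anything user-supplied
    escapeshellarg($dst)
);
exec($cmd, $output, $exitCode);

if ($exitCode !== 0) {
    // convert writes its errors to stderr, captured above via 2>&1
    error_log('ImageMagick failed: ' . implode("\n", $output));
}
```

Escaping the arguments with escapeshellarg() matters here, since uploaded filenames are attacker-controlled.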
Drongo_III Posted September 12, 2013 (Author)

> Handle the image operations externally to PHP, for instance by using exec() to run ImageMagick or a similar tool. PHP's memory limit and other restrictions do not apply to external processes.

I am interested in understanding this better, so apologies if I'm being slow. When you say use exec(), do you mean executing a program on an external server? How would using exec() on the same server get around the memory issue, since the same server's resources would be used either way?

Sorry, I realise this is all a bit abstract, but I can foresee a time when I'll need to build something similar for a much higher-volume website, so I'd like to understand a good approach now.
kicken Posted September 13, 2013

You could farm the processing out to a dedicated machine (or several) if the need arose, but until you hit massive scale, running the process on the same machine is fine. PHP limits its own memory consumption through the memory_limit setting; hitting that limit in PHP does not mean the system as a whole cannot do more, and a process launched with exec() is not bound by it.

If you do reach the point where the system is overloaded, then yes, the answer is to throw hardware at it: add servers, either as dedicated image processors or as additional web servers to distribute the load.
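One way to act on this in code is a pre-flight check: estimate the decode cost from the image header and only fall back to an external tool when GD would blow the limit. A sketch under stated assumptions (the 5-bytes-per-pixel factor is a rule of thumb, and the memory_limit parsing assumes a plain "NNNM" value):

```php
<?php
// Decide whether GD can decode this image within the current memory_limit.
function fitsInGd(string $file): bool
{
    [$width, $height] = getimagesize($file); // reads only the header, cheap
    $needed = $width * $height * 5;          // rough decoded-bitmap estimate

    $limit = ini_get('memory_limit');        // e.g. "128M"; -1 means unlimited
    if ((int)$limit === -1) {
        return true;
    }
    $limitBytes = (int)$limit * 1048576;     // crude parse, assumes an "M" suffix

    return memory_get_usage() + $needed < $limitBytes;
}
```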
Drongo_III Posted September 13, 2013 (Author)

Thanks mate, that's brilliant.