rbarnett Posted February 27, 2008

I have a problem with a PHP page whose output is very large, about 25 MB. Its purpose is to run a SELECT query against a PostgreSQL database and display the records. The page loads successfully in the browser, but when users try to POST (submit), it usually fails with a white screen; in the best case it takes 10 minutes to process. There are 20,000+ items on the page, so it is a very big post. We tried raising memory_limit in php.ini from the default 8 MB to as high as 80 MB, with no change. Any help would be much appreciated. Thanks
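For reference, memory_limit is not the only directive that bounds a POST of this size. A sketch of the php.ini settings that typically matter here (values are illustrative, not recommendations):

```ini
; memory_limit alone is not enough; these also cap a large POST.
post_max_size = 32M        ; maximum size of the entire POST body
upload_max_filesize = 32M  ; relevant only to file uploads
memory_limit = 80M         ; per-script memory ceiling
max_execution_time = 300   ; seconds before the script is killed
max_input_time = 300       ; seconds allowed for parsing request input
```

If the POST body exceeds post_max_size, PHP silently drops it, which matches the white-screen symptom described above.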
cooldude832 Posted February 27, 2008

So you're telling me you wrote 25 MB of PHP code? Think about this: a 25 MB file would be around 500 thousand lines of PHP, which is probably close to the entire PHP codebase itself, so you might want to rethink your post a bit.
PFMaBiSmAd Posted February 27, 2008

Whatever your application is, it is not suitable for a web server/browser environment or any other user display. There is no need to output and then submit that amount of information at one time. It sounds like you need to filter and search for only the relevant information, instead of operating on every row even when it is not affected.
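The filter-and-paginate advice above can be sketched as a small helper. The function name and column are hypothetical; the LIMIT/OFFSET pattern itself is standard PostgreSQL:

```php
<?php
// Hypothetical helper: build a paged query instead of selecting all
// 20,000+ rows at once. $page is 1-based.
function paged_query($table, $page, $perPage = 100) {
    $offset = ($page - 1) * $perPage;
    // LIMIT/OFFSET keeps each request (and each POST) small.
    return sprintf('SELECT * FROM %s ORDER BY id LIMIT %d OFFSET %d',
                   $table, $perPage, $offset);
}

// Page 3 at 100 rows per page starts at row 200:
echo paged_query('items', 3), "\n";
?>
```

Each page of 100 rows is then a small form whose POST is a few kilobytes, rather than one 25 MB submission.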
killsite Posted February 27, 2008

The best and most cost-effective approach is to chop up the code and optimize the queries, as the gurus have suggested. However, if you insist, you should look into things outside of PHP, such as:

Fine-tuning MySQL (taken from http://www.gossamer-threads.com/lists/mythtv/users/91034):

key_buffer = 48M
max_allowed_packet = 8M
table_cache = 128
sort_buffer_size = 48M
net_buffer_length = 8M
thread_cache_size = 4
query_cache_type = 1
query_cache_size = 4M

query_cache_size - caches the results of repeated SQL queries.
key_buffer - used for caching primary key indexes.
table_cache - tells MySQL how many table file handles to keep open simultaneously.
thread_cache_size - tells MySQL to keep worker threads around; they are expensive to start up but cheap to maintain.
sort_buffer_size - used during queries to hold results in memory; otherwise MySQL creates temporary result tables on disk.
net_buffer_length - should improve throughput on larger network-based queries.

MySQL optimization - from dev.mysql: http://dev.mysql.com/doc/refman/5.0/en/optimization.html

Server optimization - from dev.mysql: if you're on a shared environment, upgrade to a managed or dedicated server where you control, or at least get to choose, the hardware.

Things to consider that will help with SQL queries:

Disk seeks. It takes time for the disk to find a piece of data. With modern disks, the mean time for this is usually lower than 10 ms, so in theory we can do about 100 seeks a second. This time improves slowly with new disks and is very hard to optimize for a single table. The way to optimize seek time is to distribute the data onto more than one disk.

Disk reading and writing. When the disk is at the correct position, we need to read the data. With modern disks, one disk delivers at least 10-20 MB/s throughput. This is easier to optimize than seeks because you can read in parallel from multiple disks.

CPU cycles.
When we have the data in main memory, we need to process it to get our result. Having large tables compared to the amount of memory is the most common limiting factor; with small tables, speed is usually not the problem.

Memory bandwidth. When the CPU needs more data than can fit in the CPU cache, main memory bandwidth becomes a bottleneck. This is an uncommon bottleneck for most systems, but one to be aware of.

Best of luck, and I hope some of the information was helpful.
rbarnett Posted February 27, 2008 Author

Sorry, I was not clear in my initial post. The PHP file itself is only 180 KB. In the rare instance that the post actually succeeds and you download the resulting page with 'save as', it is 25 MB.
cooldude832 Posted February 27, 2008

You should be compressing it and then streaming it.
rbarnett Posted February 27, 2008 Author

Forgive my ignorance, but is it possible to compress and stream when posting data to the server, as is the case here? I'm not sure how I would do this with a PHP page that posts data to the server and is not uploading a file. Thanks
cooldude832 Posted February 27, 2008

It's all in the headers: you build a .zip file and set headers to force a download as streamed content. So you do something like

<?php
// Build the zip on disk first, then stream it to the browser.
$zip = new ZipArchive();
$zipPath = tempnam(sys_get_temp_dir(), 'export');
$zip->open($zipPath, ZipArchive::OVERWRITE);
$q = "SELECT * FROM `table`";
$r = mysql_query($q) or die(mysql_error());
$csv = '';
while ($row = mysql_fetch_assoc($r)) {
    $csv .= implode(',', $row) . "\n"; // one CSV line per row
}
$zip->addFromString('export.csv', $csv);
$zip->close();
// Force the browser to download the archive.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="export.zip"');
readfile($zipPath);
?>
rbarnett Posted February 27, 2008 Author

Thanks for explaining that. I understand, but I'm not trying to download; I'm trying to post the data to the server. In my earlier post I was only saying that the size of the post is 25 MB (which I know to be true because when the post does go through, a 'save as' of the resulting page is 25 MB). Usually, though, the post fails and I get a white screen. I really appreciate your response. Thanks
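One way to confirm the white screen is the POST being dropped: when the request body exceeds post_max_size, PHP leaves $_POST empty. A sketch of a check for that, with a hypothetical helper to convert php.ini shorthand like "8M" to bytes:

```php
<?php
// Hypothetical helper: convert php.ini shorthand values to bytes.
function ini_bytes($val) {
    $val = trim($val);
    $unit = strtoupper(substr($val, -1));
    $num = (float) $val;
    switch ($unit) {
        case 'G': return (int) ($num * 1073741824);
        case 'M': return (int) ($num * 1048576);
        case 'K': return (int) ($num * 1024);
        default:  return (int) $num;
    }
}

// In the receiving script: an empty $_POST plus a large Content-Length
// means PHP silently discarded the request body.
$limit = ini_bytes(ini_get('post_max_size'));
if (empty($_POST) && isset($_SERVER['CONTENT_LENGTH'])
        && (int) $_SERVER['CONTENT_LENGTH'] > $limit) {
    echo "POST of {$_SERVER['CONTENT_LENGTH']} bytes exceeded post_max_size\n";
}
?>
```

If this check fires, raising post_max_size (and max_input_time) is the relevant fix, not memory_limit.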