Gotharious Posted June 1, 2011

Hello everyone,

I made this website using CakePHP, and it was working great, but then suddenly it started giving me this error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 38 bytes) in /cake/libs/model/datasources/dbo/dbo_mysql.php on line 766

Here is the code that contains line 766:

/**
 * Fetches the next row from the current result set
 *
 * @return unknown
 */
function fetchResult() {
    if ($row = mysql_fetch_row($this->results)) {
        $resultRow = array();
        $i = 0;
        foreach ($row as $index => $field) {
            list($table, $column) = $this->map[$index];
            $resultRow[$table][$column] = $row[$index];
            $i++;
        }
        return $resultRow;
    } else {
        return false;
    }
}
QuickOldCar Posted June 1, 2011

You most likely are running out of memory. Either raise the limit in php.ini or, better yet, write a more efficient function. Judging by your code, you are getting all the results from a table and displaying them, and the table is getting large.
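As a rough way to confirm that diagnosis, the request's actual memory use can be dumped somewhere on the failing page. This is generic PHP, nothing CakePHP-specific, and where you echo or log it is up to you:

// Generic check of how much memory the current request is really using.
echo 'memory_limit:  ' . ini_get('memory_limit') . "\n";
echo 'current usage: ' . memory_get_usage(true) . " bytes\n";
echo 'peak usage:    ' . memory_get_peak_usage(true) . " bytes\n";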
Gotharious Posted June 1, 2011

The memory allocation is set to 128 MB in php.ini.

Should I split it into multiple tables? And to clarify, do you mean the number of rows in the table is large, or the number of columns?
QuickOldCar Posted June 2, 2011

Well, since the failed allocation is just 38 bytes, maybe it's stuck in an infinite loop, so it keeps running until your memory limit is reached.

Do you even need the $i++; in your foreach? Especially since it is not a for or a while loop and no limit is set; foreach itself will iterate over all the results.
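For reference, the posted function with that unused counter removed would look roughly like this; it is otherwise the same code from the first post, just trimmed:

function fetchResult() {
    if ($row = mysql_fetch_row($this->results)) {
        $resultRow = array();
        // Map each value in the row back to its table/column pair.
        foreach ($row as $index => $field) {
            list($table, $column) = $this->map[$index];
            $resultRow[$table][$column] = $row[$index];
        }
        return $resultRow;
    }
    return false;
}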
teynon Posted June 2, 2011

You are definitely running out of memory: 134217728 bytes is 128 MB. Your script is most likely processing too much information. You either need to increase the memory limit, usually by doubling it, or fix your program to use less. I would try 256 MB.
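If the host does allow raising the limit, a sketch of the two usual ways; 256M here is just the doubled value suggested above:

// In php.ini (requires access to the server configuration):
//     memory_limit = 256M

// Or at runtime, e.g. near the top of the bootstrap, if the host permits it:
ini_set('memory_limit', '256M');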
Gotharious Posted June 3, 2011

OK, I removed the $i++; line and I'm still getting the same error. I can't increase the memory over 128 MB as it's a shared server.

How can I stop the infinite loop if there is one? The code was working just fine, and everything else is working great except for this table; it suddenly just keeps giving errors.
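One way to see whether the loop really never terminates, or whether it simply walks an enormous result set, is to log memory use as rows come back. This is a debugging sketch only; $db standing for the DboMysql datasource instance is an assumption, not something from the thread:

// Debugging sketch: count rows and watch memory growth while fetching.
// $db is assumed to be the DboMysql datasource instance; names are illustrative.
$rows = 0;
while (($resultRow = $db->fetchResult()) !== false) {
    $rows++;
    if ($rows % 1000 === 0) {
        error_log("Fetched $rows rows, memory: " . memory_get_usage(true) . ' bytes');
    }
}
error_log("Done: $rows rows, peak memory: " . memory_get_peak_usage(true) . ' bytes');

If the "Done" line never appears, that points to a loop; if it does appear but the row count is huge, the query is simply returning too much data.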
QuickOldCar Posted June 3, 2011

Maybe what you should do is run normal MySQL SELECT queries with pagination to split up the data, and forget that function. As I said, either the data you are trying to fetch is too large to display, or maybe your shared host no longer has the full 128 MB available for you. But even if it did use 128 MB, I would do something different to get the results.
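A minimal sketch of the pagination idea with plain mysql_* calls, in the same style as the rest of the thread. The hotels table and its columns are made-up names for illustration, and an open mysql_connect() connection is assumed:

// Fetch one small page at a time instead of the whole table.
$perPage = 5;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$result = mysql_query(sprintf(
    'SELECT id, name FROM hotels ORDER BY id LIMIT %d OFFSET %d',
    $perPage,
    $offset
));

while ($row = mysql_fetch_assoc($result)) {
    echo $row['id'] . ': ' . $row['name'] . "\n";
}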
Gotharious Posted June 3, 2011

Well, the page that gives the error is the Add Hotel page, where I should be adding data, not viewing it. I mean, when I click "List Hotels" in the admin menu, it lists around 100,000 hotels, 5 hotels per page, but when I click "Add New Hotel" it gives me this error.

So actually I'm trying to add data, not fetch results.
QuickOldCar Posted June 3, 2011

The function you posted creates an array of all the results. I assume that after this function you are doing something else that adds another set of results to the same array. I don't really know why this function is used to insert data; it has to do a lot of work just to find the next row.

What are you using for a database? Let's say you're using MySQL: usually everyone uses auto increment as the primary key on an id field, so you can just insert into the next available id, which is very fast and uses hardly anything. Or you can do checks: if the row exists, either do nothing or update the existing data; if it does not exist, insert it at the next id. Adding indexes on any columns used in the WHERE and AND clauses of your SELECT statements also speeds up the entire fetching process.
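A hedged sketch of the insert-or-update idea plus an index, again with the made-up hotels table and column names and an assumed open mysql_connect() connection. Note that ON DUPLICATE KEY UPDATE only kicks in if there is a unique key on the column being checked (name, in this sketch):

// Insert, or update the existing row, in one statement.
// Assumes hotels.id is AUTO_INCREMENT PRIMARY KEY and hotels.name has a UNIQUE key.
mysql_query(sprintf(
    "INSERT INTO hotels (name, city) VALUES ('%s', '%s')
     ON DUPLICATE KEY UPDATE city = VALUES(city)",
    mysql_real_escape_string($name),
    mysql_real_escape_string($city)
));

// An index on columns used in WHERE clauses speeds up the SELECTs.
mysql_query('CREATE INDEX idx_hotels_city ON hotels (city)');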
Gotharious Posted June 3, 2011

Yes, I'm using MySQL, and the ID is set to auto increment. I'm a total noob at this, I know, and I'm not really good at English, so I don't quite get what you want me to do.

I understand you to mean that instead of searching for the next available row, I should just insert at the next id right away, since the id is set to auto increment, but I'm not really sure how to do it.
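For the auto increment part, a minimal sketch of "just insert at the next id": leave the id column out of the INSERT entirely and MySQL fills in the next value. The table and columns are the same illustrative names as above, and the values would come from the Add Hotel form:

// No id in the column list; AUTO_INCREMENT supplies the next one.
// Assumes an open mysql_connect() connection; names are illustrative.
mysql_query(sprintf(
    "INSERT INTO hotels (name, city) VALUES ('%s', '%s')",
    mysql_real_escape_string($name),
    mysql_real_escape_string($city)
));

echo 'New hotel id: ' . mysql_insert_id() . "\n";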