chaiwei Posted January 21, 2010

Hi, I'm trying to run a sync between two databases: one is on localhost on my PC, and the other is on a remote web server.

Let's say I have 500,000 rows in a table, and I only want to sync this one table. I run a SELECT * FROM table, then use a while loop to build a SQL statement like:

INSERT INTO xx VALUES (xxx), (xxx), (xxx)

After that, I make an AJAX call to a script, pass the SQL through the AJAX POST method, and use mysql_query to insert it into the database on my PC. However, I keep getting "memory exhausted" errors. Here is my query wrapper:

function query($sql) {
    $resource = mysql_query($sql, $this->dbconnection);
    if ($resource) {
        if (is_resource($resource)) {
            // SELECT: buffer every row of the result set into a PHP array
            $i = 0;
            $data = array();
            while ($result = mysql_fetch_assoc($resource)) {
                $data[$i] = $result;
                $i++;
            }
            mysql_free_result($resource);

            $query = new stdClass();
            $query->row = isset($data[0]) ? $data[0] : array();
            $query->rows = $data;
            $query->num_rows = $i;
            unset($data);
            return $query;
        } else {
            // INSERT/UPDATE/DELETE etc. return TRUE, not a result resource
            return TRUE;
        }
    } else {
        exit('Error: ' . mysql_error($this->dbconnection) . '<br />Error No: ' . mysql_errno($this->dbconnection) . '<br />' . $sql);
    }
}

I'm not sure whether this is what's causing the memory exhaustion. Is the problem this line?

$query = new stdClass();

I also wonder how BigDump handles its AJAX calls: even when it imports a million rows into a table, it doesn't hit memory issues.

Link to comment: https://forums.phpfreaks.com/topic/189287-ajax-will-help-reduce-php-memory-exhausted/
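For context on where the memory actually goes: the wrapper above buffers the entire result set into $data, so a SELECT over 500,000 rows holds every row in PHP memory at once, on top of the giant INSERT string built afterwards. A minimal sketch of one way around that, using the same old mysql extension as the post: mysql_unbuffered_query streams rows one at a time, and send_batch() is a hypothetical helper standing in for however each INSERT gets handed to the remote side (e.g. the AJAX endpoint):

// Stream the source table and emit INSERTs in batches of $batchSize rows,
// so memory use stays flat no matter how large the table is.
// Assumes $link is an open mysql connection to the source database.
$batchSize = 1000;
$values = array();

$resource = mysql_unbuffered_query('SELECT * FROM xx', $link);
while ($row = mysql_fetch_assoc($resource)) {
    // Escape every column value before it goes into the SQL string
    $escaped = array();
    foreach ($row as $value) {
        $escaped[] = ($value === null)
            ? 'NULL'
            : "'" . mysql_real_escape_string($value, $link) . "'";
    }
    $values[] = '(' . implode(', ', $escaped) . ')';

    if (count($values) >= $batchSize) {
        send_batch('INSERT INTO xx VALUES ' . implode(', ', $values));
        $values = array(); // drop the finished batch so it can be garbage-collected
    }
}
if (count($values) > 0) {
    // flush the final partial batch
    send_batch('INSERT INTO xx VALUES ' . implode(', ', $values));
}

One caveat with mysql_unbuffered_query: you can't issue another query on the same connection until all rows have been fetched, which is fine here since the INSERTs go to a different server.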
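As for BigDump: as I understand it, it never holds the whole import in memory. Each request (re-triggered from the browser) executes only a few thousand statements from the dump file, records the byte offset where it stopped, and the client calls the script again with that offset until the file is finished. A rough sketch of that staggered pattern, with hypothetical names (import_chunk.php, dump.sql) and the simplifying assumption that each line of the dump is one complete statement; the real BigDump also handles multi-line and extended inserts:

// import_chunk.php: called repeatedly by the client. Each request runs at
// most $linesPerSession statements, then reports where to resume.
$linesPerSession = 3000;
$offset = isset($_POST['offset']) ? (int) $_POST['offset'] : 0;

$fp = fopen('dump.sql', 'rb');
fseek($fp, $offset);

$executed = 0;
while ($executed < $linesPerSession && ($line = fgets($fp)) !== false) {
    $line = trim($line);
    if ($line === '' || substr($line, 0, 2) === '--') {
        continue; // skip blank lines and SQL comments
    }
    mysql_query(rtrim($line, ';')); // one statement per line (simplifying assumption)
    $executed++;
}

// Tell the client where to resume; it POSTs this value back as 'offset'.
echo json_encode(array('offset' => ftell($fp), 'done' => feof($fp)));
fclose($fp);

Because no single request ever touches more than a small slice of the file, PHP's memory limit and execution time limit never come into play, however big the table is.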
Archived: This topic is now archived and is closed to further replies.