goodespeler Posted December 2, 2008

I have a CSV with about 8600 rows, and I'm trying to load it into an array where the first row supplies the column headings. The problem I keep running into is that PHP runs out of memory when this function is used on too big a file.

function buildStock($File)
{
    $handle = fopen($File, "r");

    // First row: the column headings.
    $fields = fgetcsv($handle, 5000, ",");

    // Read every remaining row into $detail.
    while ($data = fgetcsv($handle, 5000, ",")) {
        $detail[] = $data;
    }

    // Re-key each row by its column heading.
    $x = 0;
    $y = 0;
    foreach ($detail as $i) {
        foreach ($fields as $z) {
            $stock[$x][$z] = $i[$y];
            $y++;
        }
        $y = 0;
        $x++;
    }

    return $stock;
}

Any thoughts on how to rework this so that I won't have a memory problem? Thank you.
PFMaBiSmAd Posted December 2, 2008

You would need to page through the file, one or a few lines at a time, and process it in smaller pieces. Is there a reason you must get the entire contents at once instead of searching through the file and retrieving only the row(s) you are interested in? And why not use a database?

You are also building a $detail[] array and then making a $stock[][] array out of it, which consumes twice the memory. By doing everything inside the while() loop and skipping the creation of the $detail[] array, you cut the memory needed in half.
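A minimal sketch of the combined approach described above: the $stock rows are built directly inside the while() loop with array_combine(), so the intermediate $detail[] array never exists. The function name, the $File parameter, and the 5000-byte line limit are carried over from the original post; the count() guard against ragged rows is an assumption added here, not something from the thread.

<?php
function buildStock($File)
{
    $stock = array();

    $handle = fopen($File, "r");
    if ($handle === false) {
        return $stock; // could not open the file
    }

    // First row: the column headings.
    $fields = fgetcsv($handle, 5000, ",");

    // Comparing against false (rather than relying on truthiness)
    // avoids stopping early on a row that evaluates as empty.
    while (($data = fgetcsv($handle, 5000, ",")) !== false) {
        // Skip ragged rows, since array_combine() requires both
        // arrays to have the same number of elements.
        if (count($data) === count($fields)) {
            // Pair each heading with its value in one step.
            $stock[] = array_combine($fields, $data);
        }
    }

    fclose($handle);
    return $stock;
}
?>

If even the final array is too large to hold, the same loop can process and discard each row as it goes, or insert the rows into a database table as suggested above, so that only one line of the file is in memory at a time.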
Archived
This topic is now archived and is closed to further replies.