
fputcsv stops writing after output file reaches size 848661


ballhogjoni


I'm trying to write a new CSV file using the data from a large 1.4GB CSV file. The problem is that it's not working with the code I have below: it just hangs once the size of the output ($tempfile) reaches 848661 bytes. Any ideas?

    $file = "file.csv";
    $file_number = 1;
    // $handle (the 1.4GB input CSV), $delimiter, and $tempfile are set up
    // earlier in the script; $tempfile comes from tempnam(".", "tmp").
    if (!$output = fopen($tempfile, 'w')) {
        unlink($tempfile);
        die('could not open temporary output file');
    }

    $in_data = array();
    $header = true;
    $row = 0;
    while (($data = fgetcsv($handle, 0, $delimiter)) !== FALSE) {
        if ($header) {
            $headers = $data;               // first record is the header row
            $in_data[] = $headers;
            fputcsv($output, $headers, "\t");
            $header = false;
            continue;
        }
/*
        if (($row % 10) == 0) {
            error_log("sleeping on row: $row");
            sleep(3);
        }
        error_log("checking row: $row");
*/
        if (!($val = getProductRow($data))) // this returns an array
            continue;

        $stat = fstat($output);
        error_log("Stat is: " . print_r($stat['size'], true));
        if ($stat['size'] > 9437184) {
            error_log("saving the $file");
            fputcsv($output, $val, "\t");
            fclose($output);
            if (file_exists($file))
                unlink($file);
            rename($tempfile, $file);
            chmod($file, 0777);
            $final_file_name = $file = "file$file_number.csv";
            $tempfile = tempnam(".", "tmp"); // produce a temporary file name in the current directory
            error_log("creating the $file");
            if (!$output = fopen($tempfile, 'w')) {
                unlink($tempfile);
                die('could not open temporary output file');
            }
            fputcsv($output, $headers, "\t");
            $file_number++;
            continue;
        }

        fputcsv($output, $val, "\t");
    }

My guess is PHP is running out of memory, since you are putting every row into $in_data. Try increasing the allowed memory in your php.ini file, or better yet, restructure your code so it never holds more than one row in memory at a time.
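A minimal sketch of that restructuring: read one record, transform it, write it, and discard it, so nothing accumulates. The getProductRow() stub and the file names here are placeholders, since the real transform isn't shown in the thread; the tab delimiter matches the posted code.

```php
<?php
// Hypothetical stand-in for the poster's transform:
// keep rows whose first field is non-empty.
function getProductRow(array $data) {
    return $data[0] !== '' ? $data : false;
}

// Tiny sample input so the sketch is self-contained.
file_put_contents('in.csv', "sku,price\nA1,10\n,0\nB2,20\n");

$in  = fopen('in.csv', 'r');
$out = fopen('out.csv', 'w');

$headers = fgetcsv($in);            // first record is the header row
fputcsv($out, $headers, "\t");

// Stream row by row: transform, write, move on -- no $in_data array.
while (($data = fgetcsv($in)) !== false) {
    if (!($val = getProductRow($data))) {
        continue;                   // skip rows the transform rejects
    }
    fputcsv($out, $val, "\t");      // written immediately, then discarded
}

fclose($in);
fclose($out);
```

With this shape, memory use stays flat no matter how large the input file is, because only the current row is ever held.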

My guess is PHP is running out of memory, since you are putting every row into $in_data.

That's what I thought as well, but I don't get any errors, and I had set ini_set('memory_limit', '1024M');. So to double-check the memory usage I ran memory_get_usage(), and it was sitting at 585728 bytes.
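A single memory_get_usage() call only shows one moment in time; a way to see whether memory is actually climbing is to log it periodically inside the processing loop. This is an illustrative sketch with a counter loop standing in for the real fgetcsv() loop:

```php
<?php
// Log memory use every N iterations to see whether it grows over the run.
// (Illustrative loop standing in for the while (fgetcsv(...)) loop.)
$rows = 0;
while ($rows < 200000) {            // stands in for reading the big CSV
    $rows++;
    if ($rows % 100000 === 0) {
        error_log(sprintf('row %d: %d bytes in use (peak %d)',
            $rows, memory_get_usage(), memory_get_peak_usage()));
    }
}
echo $rows;
```

memory_get_peak_usage() is the more telling number here: if the peak keeps rising row after row, something is accumulating; if it stays flat, memory isn't the problem.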

So i thought it could be that I was running out of physical memory. I am running macbook with 8gb physical memory. With everything running, chrome, spotify, etc it was taking about 7.5gb of memory. I shutdown everything and it still didn't work.

Do you have PHP's error_reporting set to E_ALL and display_errors set to On (or log_errors set to On), so that PHP would be reporting and displaying/logging the errors it detects?

 

Edit: after looking at your program logic, I'm going to guess that at some point your getProductRow() function just starts returning false (or an empty array) every time, so the code hits the continue; statement on every iteration and loops over the rest of the input file without writing anything to the output.
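One way to confirm that guess is to count consecutive rejected rows and log when the count gets suspicious. The getProductRow() below is a hypothetical stub wired to always fail, to simulate the failure mode being described; the real function isn't shown in the thread:

```php
<?php
// Hypothetical stub simulating the suspected failure: every row rejected.
function getProductRow(array $data) {
    return false;
}

$skipped = 0;
foreach ([['a'], ['b'], ['c'], ['d']] as $data) {   // sample rows
    if (!($val = getProductRow($data))) {
        $skipped++;     // another consecutive rejection
        continue;
    }
    $skipped = 0;       // reset whenever a row succeeds
}
if ($skipped > 0) {
    error_log("last $skipped rows were all rejected by getProductRow()");
}
echo $skipped;   // 4: every sample row was rejected
```

In the real loop, a steadily climbing counter with no writes to $output would match the symptom exactly: the script keeps reading the 1.4GB input but the output file stops growing.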

Edit: after looking at your program logic, I'm going to guess that at some point your getProductRow() function just starts returning false (or an empty array) every time, so the code hits the continue; statement on every iteration and loops over the rest of the input file without writing anything to the output.

 

Thanks! This got me thinking, and I went digging into code that was working for everything except one case I wasn't handling. It was causing an infinite loop. I fixed that code and now all is good.

Archived

This topic is now archived and is closed to further replies.
