PHP not able to copy files larger than 4GB


Abrar


Hi Friends,

I am trying to write a small program to copy files automatically on a regular basis using PHP. I have written the code and it works well, but for any file larger than 4GB it only copies the first 4GB; the rest is ignored.

For example, with an 8GB video file, the code below only copies 4GB, and as a result the video is corrupt.

Please help. Thanks!

 

<?php
$source = 'd:/';    // source folder
$destPath = 'C:/';  // destination folder
$files = scandir($source);
$archiveFiles = array();
foreach ($files as $file) {
  if (!in_array($file, array(".", ".."))) {
    // only copy files modified today
    $lastModified = date('d/m/y', filemtime($source . $file));
    if ($lastModified == date("d/m/y")) {
      array_push($archiveFiles, $file);
      echo $file . '<br>';
      if (copy($source . $file, $destPath . $file)) {
        echo "File copied";
      } else {
        echo "Cannot copy file";
      }
    }
  } else {
    echo "Not copied" . '<br>';
  }
}
?>

 


Hi Kicken,

Thank you so much for your reply and support!

Yes, I am trying to copy DB backup files on a regular basis. The destination does support large files, since we copy those files manually; I am just trying to automate the process with PHP and am running into this issue.

I am using the same program for the backup ARC files and it works like a charm.

Please guide me.

Thank you.


Start by looking at phpinfo(), /etc/php.ini (or the equivalent on Windows), and your PHP-FPM settings if used.  If there are no culprits there, maybe your web server?  Also, I often do reality checks such as the following.

 

      printf('filesize: %s   copy(%s, %s)<br>'.PHP_EOL, filesize($source . $file), $source . $file, $destPath . $file);
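
If it helps, here is a slightly longer reality check you can run from the CLI. This is only a sketch, and which ini settings actually matter depends on your setup, but PHP_INT_SIZE will at least tell you whether you are running a 32-bit build, which has historically been a problem for files over 4GB:

      // reality check: build architecture, PHP version, and a couple of ini settings
      printf('PHP %s, %d-bit build (PHP_INT_MAX = %d)' . PHP_EOL, PHP_VERSION, PHP_INT_SIZE * 8, PHP_INT_MAX);
      printf('memory_limit = %s, max_execution_time = %s' . PHP_EOL, ini_get('memory_limit'), ini_get('max_execution_time'));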

 


13 hours ago, Abrar said:

Yes, I am trying to copy DB backup files on a regular basis. The destination does support large files, since we copy those files manually; I am just trying to automate the process with PHP and am running into this issue.

We need more information:

  • What OS are you running on?
  • Is this a CLI program or something happening via webserver integration?
  • What version of PHP are you using?
  • Copying from where to where?  Is this from one directory to another on the same server? From one server to another?  From a local directory to an NFS mounted one?
  • A snippet of the code that actually does the copying would be helpful

Possibly you hit a limitation rectified in a more recent version of PHP.  Most likely there are several alternatives you can use to get around the problem.


Hi Friends,

Please find the details:

1. Windows 10

2. It's a basic .php script that is triggered on a regular basis using Task Scheduler in Windows 10.

3. PHP version 8.

4. I am trying to copy from one server to another using a third server.

5. Below is the code I am using:

<?php
$source = 'xxx.xxx.xxx.xxx/sr/';      // source folder
$destPath = 'xxx.xxx.xxx.xxx/dest/';  // destination folder
$files = scandir($source);
$archiveFiles = array();
foreach ($files as $file) {
  if (!in_array($file, array(".", ".."))) {
    // only copy files modified today
    $lastModified = date('d/m/y', filemtime($source . $file));
    if ($lastModified == date("d/m/y")) {
      array_push($archiveFiles, $file);
      echo $file . '<br>';
      if (copy($source . $file, $destPath . $file)) {
        echo "File copied";
      } else {
        echo "Cannot copy file";
      }
    }
  } else {
    echo "Not copied" . '<br>';
  }
}
?>

 


You have hit a bug:  https://bugs.php.net/bug.php?id=81145

The bug has been fixed, but apparently it has not been merged into a production release yet.

For now you can try this workaround: replace your copy() call with the copyByChunk() function below and see if it solves your issue. Increasing the buffer size will probably help if the performance hit is substantial, but fair warning: test this function out before you use it, as I did no real testing of it.

 

function copyByChunk($srcFile, $dstFile) {
   // use 1 MB chunks; a larger buffer may help if the performance hit is substantial
   $bufferSizeBytes = 1048576;
   $bytes = 0;

   // open both files in binary mode so Windows does not mangle binary data
   $src = fopen($srcFile, "rb");
   $dst = fopen($dstFile, "wb");

   while (!feof($src)) {
      $bytes += fwrite($dst, fread($src, $bufferSizeBytes));
   }

   fclose($src);
   fclose($dst);

   // return bytes written
   return $bytes;
}
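
In your script that would mean swapping the copy() call inside the loop for copyByChunk(), roughly like this (an untested sketch using the same $source, $destPath, and $file variables from your code):

// inside the foreach loop, instead of copy(...):
$bytesCopied = copyByChunk($source . $file, $destPath . $file);
echo $file . ': copied ' . $bytesCopied . ' bytes<br>';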

 

