
Accessing files whilst they are being written


nmc


I'd like to work with several files at the same time.

The first file is a large WMV video file that's being downloaded and saved by another PHP program.

 

This file, let's call it testfile.wmv, is currently being written to disk in a filestore cache directory, and will continue to be written there for 15-20 minutes at most.

 

Whilst the huge WMV file is still being written, it will be write-locked, preventing other processes from writing to it, but it won't be read-locked, so other processes will be allowed to read from it.

 

So I would like to set up two or three separate programs which will start reading data from the large file and stream it over HTTP to other users who have requested it as a download; they don't want to wait until the large file has finished downloading to the server.

 

What I'm trying to achieve here is a kind of caching, origin-pull style CDN system where the main system receives a request and starts downloading the large file (which could be 500 MB or larger). The other 'downloader' processes will need to read from the file while it is still being written to, and then output the data to the end users.

 

I haven't done much PHP programming, so I'm not fully familiar with the PHP language and all of its specifics regarding file I/O.

 

Here are a couple of questions for people who will most likely be able to tell me if I'm going about this the wrong way.

 

Can I safely start reading the file which is currently being written, in chunks of say 8 KB, and send them to the user?

There will likely be several users attempting to download the same file while it is still being written to.
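For reference, here is a minimal sketch of the chunked read-and-stream idea. The function name `streamFile` and the chunk size are illustrative, not from the post; in a real script you would also send appropriate HTTP headers before calling it.

```php
<?php
// Minimal sketch: read a file in 8 KB chunks and send each chunk
// straight to the client. streamFile() is a hypothetical helper name.
function streamFile(string $path, int $chunkSize = 8192): int
{
    $sent = 0;
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return 0;                 // file missing or unreadable
    }
    while (!feof($fp)) {
        $chunk = fread($fp, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;                // reached the *current* end of file
        }
        echo $chunk;              // send to the client
        flush();                  // push data out immediately
        $sent += strlen($chunk);
    }
    fclose($fp);
    return $sent;
}
```

Note that this plain loop stops as soon as it reaches the current end of the file, which is exactly the catch-up problem the post goes on to describe.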

 

I can foresee a problem if the initial download of the large file slows down in such a way that one or two of the end-user downloaders catch up with the end of the large testfile.wmv. At this stage I could slow down the end-user download programs using a sleep function to make them wait maybe 1 second each time it happens.

 

The question is: what do I do to avoid a problem should this scenario happen?

 

I will most likely hit an end-of-file during the read processes unless I specifically avoid it somehow.

 

Any suggestions?

 

Here are some thoughts I came up with:

 

Before I read in each 'chunk' from the file I could call a file size function on it. Will this work while the file is open for writing? If the file size check does work, then I can work out how many bytes to read into the next 'chunk' and read again, managing to avoid an EOF.

 

I will then keep repeating this process until the file size doesn't change any more and we reach the end of the file.
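The size-polling loop described above can be sketched in PHP roughly as follows. The function name, path, and timing constants are illustrative; one detail worth noting is that PHP caches `filesize()` results, so `clearstatcache()` must be called before each check to see the file grow.

```php
<?php
// Sketch of the idea above: follow a growing file, reading only up to
// its currently reported size (avoiding a premature EOF), and sleeping
// for 1 second when we catch up with the writer, as the post suggests.
// followFile() and $maxIdleSeconds are hypothetical names, not from the post.
function followFile(string $path, int $chunkSize = 8192,
                    int $maxIdleSeconds = 30): int
{
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return 0;
    }
    $sent = 0;
    $idle = 0;
    while ($idle < $maxIdleSeconds) {
        clearstatcache(true, $path);       // filesize() is cached otherwise
        $size = filesize($path);
        if ($size !== false && $sent < $size) {
            // read only the bytes we know exist right now
            $toRead = min($chunkSize, $size - $sent);
            $chunk = fread($fp, $toRead);
            if ($chunk === false) {
                break;
            }
            echo $chunk;
            flush();
            $sent += strlen($chunk);
            $idle = 0;                     // the file is still growing
        } else {
            // caught up with the writer: wait and re-check the size
            sleep(1);
            $idle++;
        }
    }
    fclose($fp);
    return $sent;
}
```

The `$maxIdleSeconds` cut-off is one way to decide the file has stopped growing; a more robust design would have the writer signal completion explicitly (e.g. by renaming the file or writing a marker), since a stalled writer and a finished writer look the same from the reader's side.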

 

To complicate it further, there will be multiple processes all reading the same large file and doing the same kind of loop, each checking the file size prior to fread'ing more data from the large WMV file.

 

Does anyone think this will work?

 

Any thoughts appreciated.

 

Thanks

 

Neil

 
