silvercover Posted September 21, 2011

Hi, I have a website, and part of it needs to read/write a plain text file on the server. Most of the time we have between 50 and 100 concurrent online users, and their actions on the site result in reads and writes to a certain file, which serves as a temporary placeholder for some other pages. So I want to know about concurrent read/write conflicts and the best practice for avoiding them if they happen. How does my server manage these requests? Thanks.
thehippy Posted September 21, 2011

fopen and flock are what you need to read up on. Reading a file isn't so much the problem: if 50 people want to read a file at the same time, nearly all operating systems will allow a shared open. Writing is where you run into problems. If 50 people want to write to a file at the same time, they all want to open the file, move the pointer to some position, insert data, and close the file; if that actually happened, your file would become corrupt. So the course of action for your application is to acquire a lock on the file, so that it and only it can write to it, then release the lock and close the file so others can do the same. You can Google around for more general info by searching for 'atomic file operations' or 'thread safe file operations'.
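A minimal sketch of that pattern in PHP might look like the following (the filename data.txt and the line being written are just placeholders, not anything from the site in question): an exclusive lock (LOCK_EX) for the writer, a shared lock (LOCK_SH) for readers.

<?php
// Writer: take an exclusive lock so only this process writes at a time
$fp = fopen('data.txt', 'c+');            // open read/write, create if missing, don't truncate
if ($fp === false) {
    die('Could not open file');
}
if (flock($fp, LOCK_EX)) {                // blocks until the exclusive lock is held
    fseek($fp, 0, SEEK_END);              // move the pointer to the end of the file
    fwrite($fp, "new line of data\n");    // write while no other locker can
    fflush($fp);                          // flush output before releasing the lock
    flock($fp, LOCK_UN);                  // release the lock
}
fclose($fp);

// Reader: a shared lock lets many readers hold it at once,
// but waits if a writer currently holds the exclusive lock
$fp = fopen('data.txt', 'r');
if ($fp !== false && flock($fp, LOCK_SH)) {
    $contents = stream_get_contents($fp);
    flock($fp, LOCK_UN);
    fclose($fp);
}
?>

Note that flock is advisory: it only protects you if every script that touches the file goes through the same locking calls.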