RobertP Posted September 22, 2011

So PHP has a built-in function that is, of course, faster than mine, but I want your opinions. I use lots of files with my software; should I consider using file_get_contents, or is there a better solution? And what else can I do with file_get_contents?

function getFileContents($file) {
    if (($size = filesize($file)) > 0) {
        $handler = fopen($file, 'r');
        $contents = fread($handler, $size);
        fclose($handler);
        return $contents;
    }
    return null;
}

vs file_get_contents

Times executed (each): 1000
Fastest to slowest:
Array
(
    [orThis] => 0.00049706306306306306
    [getFileContents] => 0.000632337
)
Biggest time difference: 0.00013527393693693694
The fastest is 27.2146% faster than the slowest.

PS: orThis is a wrapper for file_get_contents.

[attachment deleted by admin]

https://forums.phpfreaks.com/topic/247653-benchmarking/
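A minimal sketch of the kind of timing harness described above (the sample file, iteration count, and output layout are assumptions, since the original benchmark script was in the deleted attachment):

```php
<?php
// The hand-rolled reader from the post above.
function getFileContents($file) {
    if (($size = filesize($file)) > 0) {
        $handler = fopen($file, 'r');
        $contents = fread($handler, $size);
        fclose($handler);
        return $contents;
    }
    return null;
}

// Write a small sample file to benchmark against.
$file = tempnam(sys_get_temp_dir(), 'bench');
file_put_contents($file, str_repeat("hello world\n", 1000));

$times = array();
foreach (array('file_get_contents', 'getFileContents') as $fn) {
    $start = microtime(true);
    for ($i = 0; $i < 1000; $i++) {
        $fn($file); // variable-function call: both names are global functions
    }
    $times[$fn] = (microtime(true) - $start) / 1000; // average seconds per call
}

asort($times); // sort fastest to slowest, keeping function names as keys
print_r($times);
```

Both functions return the identical string, so the comparison is purely about call overhead; averaging over many iterations smooths out filesystem-cache noise.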
cs.punk Posted September 22, 2011

Perhaps use a database.
Buddski Posted September 22, 2011

I've been using file_get_contents for a while now (I used a similar method to the one you described) and have found it a lot easier (it's one line!). Also, from the manual: file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.
RobertP Posted September 22, 2011 (Author)

Thank you so much. I think I will update to use file_get_contents.
.josh Posted September 23, 2011

If you just want to open up and grab the entire contents of a file in one go, then yes, use file_get_contents. But for large files this isn't a good thing to do, as it puts the entire content of the file into memory. What you would do instead depends on what you are trying to do in general. If you are parsing lines in a file, use fgets to read and process one line at a time. If you are trying to search or update lines, or otherwise use the file as a flat-file database, consider using a real database instead. Or if that's not an option, consider SQLite.
xyph Posted September 23, 2011

"If you are parsing lines in a file, use fgets to read and process one line at a time."

file is great for that. Unless you're dealing with large files, file_get_contents is the way to go. From what I understand, file_get_contents() is just a wrapper for fopen()/fread()/fclose().
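For completeness, here is a small sketch of the file() approach mentioned above, which returns the whole file as an array of lines in one call (the sample file is an assumption; the two flags are standard options of file()):

```php
<?php
// file() reads an entire file into an array, one element per line.
$path = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($path, "one\n\ntwo\nthree\n");

// FILE_IGNORE_NEW_LINES strips the trailing newline from each element;
// FILE_SKIP_EMPTY_LINES drops blank lines entirely.
$lines = file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
print_r($lines);

unlink($path);
```

Like file_get_contents, this loads everything into memory at once, so it suits small-to-medium files where per-line access is convenient.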