clown[NOR] Posted January 27, 2008
i'm working on a project for a gaming server. they have all kills stored in a txt file. my project is to sort the kills from the last hour, from highest on down. i've tried using file() but reading every line into an array every time takes way too long. is there another way to read this file faster? Thanks In Advance - Clown
ziv Posted January 27, 2008
file(), file_get_contents() and so on are a family of functions that read the entire file into memory, and you should not use them on large files. use fopen() instead and iterate over the data in relatively small chunks.
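A minimal sketch of the chunked read ziv describes, assuming a hypothetical local path in $file and an arbitrary 8 KB chunk size:

<?php
// open the log and read it in small chunks instead of loading it all at once
$fp = fopen($file, "r");
if ($fp === false) {
    die("could not open $file");
}
while (!feof($fp)) {
    $chunk = fread($fp, 8192); // read the next 8 KB
    // ... process $chunk here ...
}
fclose($fp);
?>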
flappy_warbucks Posted January 27, 2008
$fp = fopen($file, "r");
$contents = fread($fp, filesize($file));
On my server that will read a 5mb file in seconds.
clown[NOR] Posted January 27, 2008 Author
well this file is on a remote server, and it contains anything from 100 to over 100,000 lines =) so fopen() is the one i should use?
ziv Posted January 27, 2008
$fp = fopen($file, "r");
$contents = fread($fp, filesize($file));
On my server that will read a 5mb file in seconds.
this is the same as using file_get_contents():
<?php $contents = file_get_contents($file); ?>
but if you have a memory (or performance) problem you should not load all that data into memory.
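If memory is the worry, a sketch along the lines ziv suggests would read one line at a time with fgets() rather than pulling the whole file in; $file and the per-line handling are placeholders:

<?php
// process the kill log line by line so only the current line sits in memory
$fp = fopen($file, "r");
if ($fp === false) {
    die("could not open $file");
}
while (($line = fgets($fp)) !== false) {
    // parse/tally the kills in $line here instead of keeping the whole file in an array
}
fclose($fp);
?>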
ziv Posted January 27, 2008
well this file is on a remote server, and it contains anything from 100 to over 100,000 lines =) so fopen() is the one i should use?
i'm not sure... because in that case, if you use the http protocol, there is no benefit to fopen().
clown[NOR] Posted January 27, 2008 Author
thanks.. if i get a memory issue, what should i use instead? should i start looking into sockets or something? <-- just a wild guess