spiderwell Posted July 27, 2012

Hi all, I just discovered the image directory at my work and it has 145k images in it. That seems to me like too many to have in one place. Does anyone know the implications of having so many images in a single directory? For example, would it be slower to fetch images from a directory like that compared to one with, say, 1,000 images? With no real knowledge of this, I'm assuming it might, but my Google searches didn't turn up anything concrete. I know that listing every file would obviously take longer, but when just showing a few images on a per-image basis, would speed be affected? The server setup is PHP 5.2 on IIS 5 (and, god, I need to get them to change this). Thanks
scootstah Posted July 27, 2012

The only thing I can see being a problem is running utilities that need to fetch all of the files in the directory, like ls.
kicken Posted July 28, 2012

I don't think a ton of files in a directory is much of an issue on any modern file system. I recall reading about it being a problem on some older systems like FAT16/32, but I'd guess you're not using those. The only issue would be, as scootstah mentioned, using tools that operate on the list of files. An older system I used to work on stored all uploads in a single directory. I have no idea how many files it held in total, but occasionally someone would accidentally try to open that folder in Windows Explorer and it would lock up their system for a good 10 minutes or so trying to load and render the list.

If I expect a folder to contain a lot of files, I try to set up a small tier system where they get split into different directories, mainly so that if someone does have to go in and find a particular file manually, it's easier to do. The way I manage the setup is usually to prepend a fixed-length prefix to the filename and then use the first x characters as directories. Generally I'll use the user's ID (zero-padded if needed) as a prefix, but sometimes I just generate a random number.

E.g., say user #37 uploads avatar.jpg to the site and it should be stored in /Uploads/avatars using a tier setting of 3 levels. The script would take their ID, pad it to at least length 3 (so 037), and prefix it to the filename, giving 037avatar.jpg. Then the script would take the first 3 characters of the name and use them as directories, creating the path /Uploads/avatars/0/3/7. The image then gets saved as /Uploads/avatars/0/3/7/037avatar.jpg.

It seems to work pretty well and keeps the directory sizes reasonable by spreading out the files. There are certainly other ways one could accomplish the same thing if desired.
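A minimal PHP sketch of the tiering scheme described above. The function name and base directory are made up for illustration; only the scheme itself (zero-padded ID prefix, first N characters as nested directories) comes from the post.

```php
<?php
// Build a tiered storage path from a user ID and filename.
// $baseDir and the function name are hypothetical; the layout follows
// the scheme described in the post above.
function tieredPath($baseDir, $userId, $filename, $levels = 3)
{
    // Zero-pad the ID to at least $levels digits and prepend it to the name.
    $prefixed = str_pad((string) $userId, $levels, '0', STR_PAD_LEFT) . $filename;

    // Use the first $levels characters of the name as nested directories.
    $dirs = implode('/', str_split(substr($prefixed, 0, $levels)));

    return $baseDir . '/' . $dirs . '/' . $prefixed;
}

// User #37 uploads avatar.jpg:
echo tieredPath('/Uploads/avatars', 37, 'avatar.jpg'), "\n";
// /Uploads/avatars/0/3/7/037avatar.jpg
```

In a real upload script you would `mkdir()` the directory chain (with the recursive flag) before moving the file into place.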
spiderwell Posted July 29, 2012 (Author)

Cheers for the feedback. I am thinking of splitting the images for just the reasons you mention: Windows maintenance, for want of a better phrase. When I tried to copy the whole folder it took several attempts to not lock up, and when it eventually did work the process took many, many hours!
QuickOldCar Posted August 5, 2012

I store around 3 million images so far in a single folder on a Windows server and see no ill effect from it. You definitely do not want to open that folder, though; I write scripts using glob and pagination to cycle through them. Linux has limitations: ext3, for example, has a 32,000 subdirectory limit, while ext4 has 64,000. Windows is limited only by its storage size. As others said above, if you have a choice, it is best not to store everything in a single folder. If you have no way to logically separate the files into folders, you could also disable 8.3 file name generation: http://support.microsoft.com/kb/121007 http://en.wikipedia.org/wiki/8.3_filename
.josh Posted August 5, 2012

Geez, why so many pics? You work for a pr0n site?
Adam Posted August 6, 2012

Sounds like his girlfriend has been on holiday..
Christian F. Posted August 6, 2012

Quoting QuickOldCar: "Linux has limitations, for example ext3 has a 32,000 subdirectory limit, while ext4 has 64,000"

Subdir limit != file num limit, as evidenced by this snippet:

for ($run = 1; $run < 65000; $run++) {
    file_put_contents($run, 'test');
}

The result:

test$ ls -l | wc -l
65000