Hi all,

 

So I just discovered the image directory at my work, and it has 145k images in it. That seems to me like too many to have in one place.

Does anyone know the implications of having that many images in one place? For example, would it be slower to fetch images from a directory like that compared to, say, one with 1,000 images?

I'm assuming, with no real knowledge, that it might well be slower, but I don't know, and my Google searches didn't turn up anything concrete. I know that if I were to list every file it would obviously take longer, but if I'm just showing a few images, fetched individually, would speed be affected?

 

The server setup is (and god, I need to get them to change this) PHP 5.2 on IIS 5.

 

thanks

 

https://forums.phpfreaks.com/topic/266337-images-per-directory/

I don't think a ton of files in a directory is much of an issue on any modern file system. I recall reading about it being a problem in some older systems like FAT16/32, but I'd guess you're not using those.

 

The only issue would be, like scootstah mentioned, using tools that operate on the list of files. An older system I used to work on would store all uploads in a single directory. I have no idea how many files it had in total, but occasionally someone would accidentally try to open that folder in Windows Explorer, and it would lock up their system for a good 10 minutes or so trying to load and render the list.

 

If I expect a folder to contain a lot of files, I try to set up a small tier system where they get split into different directories, mainly so that if someone does have to go in and find a particular file manually, it's easier to do. The way I manage the setup is usually to prepend some fixed-length prefix to the filename and then use the first x characters as directories. Generally I'll use the user's ID (zero-padded if needed) as the prefix, but sometimes I just generate a random number.

 

E.g., say user #37 uploads avatar.jpg to the site, it should be stored in /Uploads/avatars, and we're using a tier setting of 3 levels.

 

The script would take their ID, pad it to at least length 3 (so 037), and prefix it to the filename, giving 037avatar.jpg.

 

Then the script would take the first 3 characters from the name and use them as directories, creating the path /Uploads/avatars/0/3/7. The image then gets saved as /Uploads/avatars/0/3/7/037avatar.jpg.
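That logic can be sketched in a few lines of PHP. This is my own illustration of the scheme described above; tieredPath() and its parameters are made-up names, not code from the post:

```php
<?php
// Sketch of the tiered-directory scheme described above.
// tieredPath() and its signature are illustrative, not the original code.
function tieredPath($baseDir, $userId, $filename, $levels = 3)
{
    // Zero-pad the ID and prepend it to the filename: 37 -> 037avatar.jpg
    $name = str_pad((string) $userId, $levels, '0', STR_PAD_LEFT) . $filename;
    // Use the first $levels characters as nested directory names: 0/3/7
    $dirs = implode('/', str_split(substr($name, 0, $levels)));
    return $baseDir . '/' . $dirs . '/' . $name;
}

echo tieredPath('/Uploads/avatars', 37, 'avatar.jpg');
// /Uploads/avatars/0/3/7/037avatar.jpg
```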

 

It seems to work pretty well and keeps the directory sizes reasonable by spreading out the files.  There are certainly other ways one could accomplish the same thing if desired.

 

Cheers for the feedback. I'm thinking of splitting the images for just the reasons you mention: Windows maintenance, for want of a better phrase. When I tried to copy the whole folder it took several attempts before it didn't lock up, and when it eventually did work the process took many, many hours!

 

I store around 3 million images so far in a single folder on a Windows server and see no ill effect from it.

 

You definitely do not want to open that folder, though. I write scripts using glob() and pagination to cycle through them.
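For illustration, a paginated glob() loop might look like the sketch below; imagePage() and the directory path are my own assumptions, not the poster's actual code:

```php
<?php
// Hypothetical sketch of the "glob and pagination" approach:
// fetch one page of filenames at a time instead of listing everything.
function imagePage($dir, $page, $perPage = 50)
{
    $files = glob($dir . '/*.jpg');
    if ($files === false) {
        $files = array();
    }
    // Return just one page of names so no script ever has to hold or
    // render the full listing at once.
    return array_slice($files, $page * $perPage, $perPage);
}

// First page of a (hypothetical) image directory:
foreach (imagePage('/path/to/images', 0) as $file) {
    echo $file, "\n";
}
```

Note that glob() still reads the whole directory listing into memory; for truly huge folders, readdir() or DirectoryIterator streams entries one at a time instead.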

 

Linux has limitations; for example, ext3 has a 32,000 subdirectory limit per directory, while ext4 allows 64,000.

 

Windows is just limited by its storage size.

 

As others said above, if you have a choice not to store everything in a single folder, it's best to do that.

 

If you have no way to logically separate the files into folders, you could also disable 8.3 filename generation:

http://support.microsoft.com/kb/121007

http://en.wikipedia.org/wiki/8.3_filename
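Per the KB article above, one way to turn off 8.3 short-name generation on NTFS is the command below (an assumption on my part that this fits your setup; it needs an elevated prompt and only affects files created afterwards):

```shell
REM Disable 8.3 short-name generation on NTFS (run as Administrator).
REM Only affects files created after the change; existing short names remain.
fsutil behavior set disable8dot3 1

REM Check the current setting:
fsutil behavior query disable8dot3
```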

Linux has limitations; for example, ext3 has a 32,000 subdirectory limit per directory, while ext4 allows 64,000.

 

Subdir limit != file num limit, as evidenced by this snippet:

for ($run = 1; $run < 65000; $run++) {
    file_put_contents($run, 'test'); // creates files named 1 .. 64999
}

 

The result:

test$ ls -l | wc -l

65000

(That's 64,999 files, plus the "total" header line that ls -l prints.)

