
Best way to store image data in a DB?


keldorn


Say you have an image, 20091212_81986783.jpg, that is stored locally on your server at:

 

/home/website/public_html/static/uploads/

 

You serve these images from static.example.com/uploads/20091212_81986783.jpg

 

Would you store the whole URL in the database, or just the file name? And in the long term, what scales better if you ever have to move the files to many servers and serve them off URLs like static1.example.com and static2.example.com?

 

If you stored the whole URL in the database and you ever dispersed the files, you would have to find every row in the database that references the image and update its URL.
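
For example (the "images" table and "url" column here are hypothetical, just to illustrate the maintenance cost), dispersing the files would mean a bulk rewrite along these lines:

<?php
// If the full URL is stored, moving images to a new host means rewriting every affected row:
$sql = "UPDATE images
        SET url = REPLACE(url, 'static.example.com', 'static1.example.com')
        WHERE url LIKE 'http://static.example.com/%'";

// Whereas if only the file name is stored, the host lives in code/config and changes in one place:
$filename  = '20091212_81986783.jpg';                          // value read from the db
$image_url = 'http://static.example.com/uploads/' . $filename;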

 

Storing the file name only:

How the heck would you know what server it's on once it's beyond local storage?

 

Just something I've been thinking about for the last few months. :?


Well, you could save the server info as part of the site configuration. That would allow you to save only the filename in the db, keeping things a bit simpler on both ends. This could work with multiple storage locations if you save their paths in an array: loop through them, concatenating the filename to each, to check whether the file exists.
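
A minimal sketch of that idea (the second storage path is made up purely for illustration):

<?php
// Storage locations live in the site configuration; the db only holds the file name.
$storage_paths = array(
    '/home/website/public_html/static/uploads/',
    '/mnt/static2/uploads/',                    // hypothetical second location
);

$filename = '20091212_81986783.jpg';            // value read from the db

foreach ($storage_paths as $path) {
    if (file_exists($path . $filename)) {
        echo 'Found at ' . $path . $filename;
        break;
    }
}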


Do you mean storing an associative array with the servers in a config, like

 

<?php
$servers = array(
    '1' => array(
        'ip_address' => '67.159.0.1',
        'host'       => 'static1.example.com',
    ),
    '2' => array(
        'ip_address' => '67.159.0.2',
        'host'       => 'static2.example.com',
    ),
);

 

 

Then when you upload an image, you pick one from there at random, connect to it via FTP, upload the image, and then store the full image URL (static2.example.com/uploads/) in the database. Okay, so you'll know where they are stored, and if you ever need to scale more and those servers are full, you can just remove them from the server array and add fresh new servers.
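
A rough sketch of that upload step (the FTP credentials and the local temp path are placeholders):

<?php
// Pick a storage server at random from the config array above.
$key    = array_rand($servers);
$server = $servers[$key];

// Push the file to it over FTP.
$conn = ftp_connect($server['ip_address']);
ftp_login($conn, 'ftp_user', 'ftp_pass');       // placeholder credentials
ftp_pasv($conn, true);

$filename = '20091212_81986783.jpg';
ftp_put($conn, '/uploads/' . $filename, '/tmp/' . $filename, FTP_BINARY);
ftp_close($conn);

// Store the full URL so you know which host serves the file.
$image_url = 'http://' . $server['host'] . '/uploads/' . $filename;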

I may never have this problem, but I am thinking ahead about scalability.

 

I even realize that FTP probably won't scale. I think the fastest way might be to send a POST to a server on some obscure port with the image data, and have some multi-threaded, non-blocking application written in C waiting on that port to take it and write it to disk quickly, or hold it in memory until it can be written to disk.

Of course there would be no auth, so you would have to firewall the server, kind of like you do if you're using memcached.

On port 80 you might have something like Varnish or Nginx serving the images.
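
As a rough sketch of the sending side only (the port and upload path are made up, and the receiving daemon isn't shown), the web server could push the raw bytes with cURL:

<?php
// POST the raw image bytes to the custom daemon listening on an obscure, firewalled port.
$image = file_get_contents('/tmp/20091212_81986783.jpg');

$ch = curl_init('http://static1.example.com/upload');
curl_setopt($ch, CURLOPT_PORT, 8137);            // made-up port, reachable only from the web servers
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $image);    // raw image bytes as the request body
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/octet-stream'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);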

 


I would think that if you had enough static content to warrant static content servers, you'd invest the time and money in some sort of managed file share solution that hides the details of which server the file is actually on and still allows you to access it off of a regular old file path.

 

Does such a thing not exist?

