bad_gui Posted April 16, 2008

I'm a PHP amateur and have been working on a journal article database: the descriptive info lives in the DB and each article sits on the local file system as a PDF. Users can search for an article and read it in their browser, and they can also upload new PDFs or download ones they don't have. Some of the PDFs are quite large (images, etc.), so I'm wondering whether it's worth storing them gzipped, using the PHP zlib functions to compress each article when it's stored and gunzip it when a user asks to view it.

This is an internal application for a tiny company. Our servers aren't bad, but they're not cutting edge either; sorry, I don't have specs. Compressing a sample PDF reduced its size by about 15%. So the trade-off is extra CPU load and my code-modification effort in exchange for saving server disk space and I/O. Anyone have any experience with this?
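In case it helps, here's a rough sketch of the kind of thing I had in mind using the zlib functions. The function names, paths, and chunk size are just placeholders I made up for illustration, not my actual code.

```php
<?php
// Sketch: gzip a PDF when it is stored, and stream it back decompressed
// when a user views it. Paths and function names are hypothetical.

// --- On upload: compress the PDF before writing it to disk ---
function store_pdf_compressed($tmpUploadPath, $destPath)
{
    $in  = fopen($tmpUploadPath, 'rb');
    $out = gzopen($destPath . '.gz', 'wb9');   // level 9 = maximum compression
    while (!feof($in)) {
        gzwrite($out, fread($in, 64 * 1024));  // 64 KB chunks keep memory use low
    }
    fclose($in);
    gzclose($out);
}

// --- On view: decompress and send the PDF to the browser ---
function send_pdf($gzPath, $displayName)
{
    header('Content-Type: application/pdf');
    header('Content-Disposition: inline; filename="' . $displayName . '"');
    readgzfile($gzPath);   // reads the .gz file and outputs it uncompressed
}
```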
roopurt18 Posted April 17, 2008

I would say the file goes in uncompressed when it's originally added. Then run a cron job every 7 days and compress any files that have not been active in that period, or use some other criteria. Basically, aim for a middle ground: if you compress everything you'll waste CPU and bog the system down, and if you never compress the old stuff you'll waste disk space unnecessarily.

Then again, a few hundred dollars will get you hundreds of gigs of storage, so just adding a HDD might be cheaper than the hours you'd spend writing this new code.
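Something along these lines, run from cron, would do it. The directory path is made up, and checking fileatime() assumes your filesystem actually records access times (many are mounted noatime); tracking "last viewed" in your DB instead would be more reliable.

```php
<?php
// Sketch of a weekly cron script, e.g.:
//   0 3 * * 0  php /path/to/compress_idle.php
// Gzips any PDF in the archive directory that has not been accessed in 7 days.

$dir    = '/var/www/articles';       // hypothetical storage directory
$cutoff = time() - 7 * 24 * 3600;    // anything untouched for 7 days is "inactive"

foreach (glob($dir . '/*.pdf') as $pdf) {
    if (fileatime($pdf) > $cutoff) {
        continue;                    // still active, leave it uncompressed
    }
    $in  = fopen($pdf, 'rb');
    $out = gzopen($pdf . '.gz', 'wb9');
    while (!feof($in)) {
        gzwrite($out, fread($in, 64 * 1024));
    }
    fclose($in);
    gzclose($out);
    unlink($pdf);                    // keep only the compressed copy
}
```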