
[SOLVED] deleting a file after 24 hours


HaLo2FrEeEk


I have a script set up (run by cron) that transfers a picture from another server to my own every 10 minutes.  The script uses an external cURL-based class to retrieve the file on systems where file_get_contents() is disabled.  What I want is for only 24 hours' worth of pictures to be on the server at any one time (give or take one; I'm not too picky).  The files are saved with a timestamp as the filename, so nothing gets overwritten, but the intervals are irregular: the script runs every 10 minutes, but the remote server can take longer to respond, so the timestamps aren't exactly 10 minutes apart (usually a difference of 3 or 4 seconds either way at most).  How would I go about deleting the 24-hour-old file after a new one is written?  I suppose I could make a rather complicated script that names each file based on which 10-minute chunk of the day it is, but that would require a lot of math, and I'd prefer a simpler method.  Can anyone give me any pointers?  I've been racking my brain trying to figure out something simple, but I can't think of anything.


Do you have access to a database? I'm thinking you could have a table that keeps track of the images, storing each image's path and the time it was uploaded. By comparing the stored date with the current date, it should be fairly simple to delete the files that are more than 24 hours old.


If you don't have access to a database:

I would scan the files in the directory, loop through each one, call filemtime() on it, and if it's older than 24 hours, unlink() it.

I'd probably be more inclined to run it every hour or so if there are many files to check.

A database would still be your best bet, though, for sure.
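A minimal sketch of that scan-and-delete loop, translated into shell (assuming GNU stat, with /path/to/images as a stand-in for the real directory):

```shell
# Remove files whose modification time is more than 24 hours ago --
# the shell equivalent of looping with filemtime() and unlink().
cutoff=$(( $(date +%s) - 86400 ))      # 24 hours ago, in epoch seconds
for f in /path/to/images/*.jpg; do
    [ -e "$f" ] || continue            # glob matched nothing; skip
    mtime=$(stat -c %Y "$f")           # GNU stat: mtime as epoch seconds
    if [ "$mtime" -lt "$cutoff" ]; then
        rm -- "$f"
    fi
done
```

With 144 files this loop is cheap; the per-file stat call is the only work done.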


I do have a database, yes, but I was looking for something really simple.  I was afraid I'd have to scan the files, though.  Since I transfer a picture every 10 minutes, there are 6 per hour, or 144 per day; scanning all of those would most likely take a long time.  The problem is that even with a database there's no way to regulate the timestamps.  Like I said, one image might transfer faster than another, so a timestamp can be a few seconds off either way.  Is there a way to regulate the timestamps so that the difference between each is exactly 600 seconds?  For example, a file transferred at 11:10 pm on 08/03/08 would be 1217830200, +600 would be 1217830800, and +600 again would be 1217831400 (that's 11:10 pm, 11:20 pm, and 11:30 pm).  Could I get the time of day with some fancy math, round down to the 10th minute, and run strtotime()?  I can't really figure out how I'd do that.  It might actually be more convenient to use a database, because I plan on making a gallery to display these images up to date, and on using an external class to generate a dynamic animated GIF from the images in the folder.  Anyway, what in your opinion would be easier: using a database, or scanning the folder?
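On the rounding question: snapping a Unix timestamp down to the nearest 10-minute boundary doesn't need strtotime() or fancy math; subtracting the remainder modulo 600 does it. A shell sketch:

```shell
# Round the current epoch timestamp down to the nearest 600-second
# (10-minute) boundary by subtracting the remainder of dividing by 600.
ts=$(date +%s)
rounded=$(( ts - ts % 600 ))   # e.g. 1217830215 becomes 1217830200
echo "$rounded"
```

The same integer arithmetic works on time() in PHP, so files fetched a few seconds late would still get timestamps exactly 600 seconds apart.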


The easiest way would be to add this to your crontab....

 

0 * * * * /usr/bin/find /path/to/images -mtime +24 -exec rm {} \;

 

This will run once an hour, on the hour, checking the /path/to/images directory; any files within it older than 24 hrs will be removed.


Ok, I put in the code you posted for the cron job. This is what I'm using:

 

0 * * * * /usr/bin/find /home/halo2freeek/infectionist.com/misc/bungie_webcam/images -mtime +24 -exec rm {} \;

 

That's the path to my images, but what's with the 0 and the * * * * at the beginning, and the /usr/bin/find?  Is that something I have to change, and what do I need to point it to?  I have it set up to email me the output every time it runs, and this is the email I got:

 

sh: line 1: 0: command not found

 

What does that mean?


The 0 * * * * is the cron schedule: run on minute 0, every hour, every day of the month, every month of the year, every day of the week.
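Laid out field by field (the command is the one from earlier in the thread, with /path/to/images standing in for whatever directory you're cleaning):

```
# ┌───────── minute (0-59)
# │ ┌─────── hour (0-23)
# │ │ ┌───── day of month (1-31)
# │ │ │ ┌─── month (1-12)
# │ │ │ │ ┌─ day of week (0-6, Sunday = 0)
# │ │ │ │ │
  0 * * * * /usr/bin/find /path/to/images -mtime +24 -exec rm {} \;
```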

 

man crontab for more details.

 

I'm guessing you aren't installing the cronjob correctly.

 

from shell, type in:

 

crontab -e <enter>

 

This opens the crontab.  Paste that text in there and exit (with save).

 

/usr/bin/find is the path to the find binary which is searching the folder for the files and removing them.

 

If cron still isn't working, just paste this directly to a command line:

 

/usr/bin/find /home/halo2freeek/infectionist.com/misc/bungie_webcam/images -mtime +24 -exec rm {} \;

 

If you still get an error, post it.

 


I don't have to use the shell for crontab; my host provides a nice little interface for adding new cron jobs when needed.  It won't let me put in the scheduling information, since I set that separately through the interface.  I do have it set up to email me the output, though, so I removed the schedule info and put in this:

 

/usr/bin/find /home/halo2freeek/infectionist.com/misc/bungie_webcam/images -mtime +24 -exec rm {} \;

 

And I haven't gotten an email.  I set it to run on the 42nd minute, since it was 1:40 when I tried it and I wanted to see the result quickly.  I'll set it to run every 10 minutes and see if that makes a difference, though I doubt it will.  I'm thinking it might just not email me the result if there is nothing to delete.  I changed my server around a little and renamed my domain's root folder, but didn't update the path in the cron job; I fixed that last night, and it's copying the pictures again, so hopefully in a few hours I'll get an email telling me it was successful.


Sorry for the double post, but I can't get the cron job to work.  I put it in, it's active, and it's scheduled to run every 10 minutes, but for some reason it's just not deleting the files.  Is there something I have to do to set the modification time on the images, or is that automatic?  This is the cron command I have in there now:

 

/usr/bin/find /home/halo2freeek/infectionist.com/misc/bungie_webcam/images -mtime +24 -exec rm {} \;

 

And it's not deleting the files.  I don't want to end up with hundreds of images on the server; there are only supposed to be 144 (give or take 1) at any given time.


So I'd use:

 

/usr/bin/find /home/halo2freeek/infectionist.com/misc/bungie_webcam/images -mtime +24 -exec bin/rm {} \;

 

or

 

/usr/bin/rm /home/halo2freeek/infectionist.com/misc/bungie_webcam/images -mtime +24 -exec rm {} \;

 

Which one?

 

No question is ridiculous for someone in my position; I just want to get this resolved quickly.


Ok, I ran exactly that command under PuTTY from the root folder (the root folder is /home/halo2freeek, but it shouldn't matter where I run it from, since the command has the full path to the images).  It didn't return an error, but it didn't remove any files either.

 

What I learned is that -mtime +24 removes files more than 24 *days* old.  Since I want to remove files more than 0 whole days old (meaning at least 86,400 seconds, i.e. a full day), I needed to change +24 to +0, and it works like a charm.  I ran that command and it removed all the older files, leaving the 144 current ones.
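To pin that down: find's -mtime measures age in whole 24-hour periods with the fraction discarded, so -mtime +0 matches anything a full day old or more, and -mmin +1440 expresses the same cutoff in minutes. A quick sketch with throwaway files (GNU find and touch assumed):

```shell
# Show that -mtime +0 and -mmin +1440 both match only the day-old file.
dir=$(mktemp -d)
touch -d '2 days ago' "$dir/old.jpg"   # simulated stale snapshot
touch "$dir/new.jpg"                   # simulated fresh snapshot
find "$dir" -type f -mtime +0          # prints only old.jpg
find "$dir" -type f -mmin +1440        # same file, minute granularity
rm -r "$dir"
```

-mmin +1440 is the safer spelling when you mean "older than 24 hours" exactly, since it avoids the whole-day rounding of -mtime.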

 

Thank you everyone for your help, I really appreciate it!
