nirvanaforu Posted January 15, 2009

Hi All,

I would like to write a small shell script, run from cron, whose purpose is to delete outdated files and outdated subdirectories under a certain folder, without deleting the top folder itself. For example, with the top folder /proj/tmp, the command I wrote was:

find /proj/tmp/ -mtime +10 | xargs rm -rf

The problem: once some files inside satisfy the find condition, "/proj/tmp/" is returned as well. I can't add -type f, because I want to remove the subfolders too.

If I revise the command to:

find /proj/tmp/* -mtime +10 | xargs rm -rf

it works as I expected. However, whenever the top folder is empty, it reports an error like: "find: /proj/tmp/*: no such file or directory". From a cron job, that error gets emailed to root again and again.

Any suggestions? Thanks a lot!

btw: I tried with >/dev/null, which didn't help with the error message (the error goes to stderr, not stdout). Someone suggested tmpwatch, but I don't know which files or directories tmpwatch would remove, so I can't test it safely.
trq Posted January 15, 2009

find /proj/tmp/ -mindepth 1 -mtime +10 | xargs rm -rf
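trq's one-liner can be sketched end to end as follows. This is an illustrative demo, not from the thread: it assumes GNU find/coreutils, substitutes a throwaway temp directory for /proj/tmp, and adds -print0 / xargs -0 -r so filenames with spaces survive and an empty folder produces no error for cron to mail.

```shell
#!/bin/sh
# Sketch (assumes GNU find, xargs, touch): prune entries older than
# 10 days under a top folder without removing the top folder itself.
TOP=$(mktemp -d)              # stand-in for /proj/tmp in the post
mkdir "$TOP/old sub"          # a name with a space, to show -print0
touch -d '20 days ago' "$TOP/old sub" "$TOP/old.log"
touch "$TOP/fresh.log"

# -mindepth 1 excludes $TOP itself from the results;
# -print0 | xargs -0 keeps odd filenames intact;
# xargs -r skips running rm entirely when find matches nothing,
# so an empty top folder stays silent (no cron mail).
find "$TOP" -mindepth 1 -mtime +10 -print0 | xargs -0 -r rm -rf

ls "$TOP"
```

After the run, only fresh.log should remain and $TOP itself is untouched.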
mpiekarski Posted January 16, 2009

Hello,

The command above will likely work, but it is not necessarily optimized and can take much longer to run than a more specific command would. As written, it finds every file and folder and recursively force-deletes anything whose modification time is more than 10 days old.

find /proj/tmp/ -mindepth 1 -type f -mtime +10 -iname "sess_*" -exec rm -f {} \;

That is just an example. By searching by type (-type f for files, -type d for directories) you can cut the search time down considerably, and the more specific you get, the faster it should go. With more flags, such as -iname, you can get even more specific (I used "sess_*" just as an example, to find old PHP session files). Finally, I used -exec to run whatever command you want on each match.

In short: with the find command, the more specific you get, the better your result. Also, -exec does the same job you are trying to do with xargs. The issue with piping to xargs rm -rf is that you may hit filenames containing spaces or other awkward characters. Pairing find's -print0 with xargs -0 handles those safely (note that xargs -0 alone is not enough; find must also emit NUL-separated names with -print0). However, there is no point in using the pipe at all if you can use -exec.

(Note: the {} placeholder stands for each pathname that find would otherwise print to stdout, i.e. the files you are finding, and a -exec clause must be terminated with an escaped semicolon, \; .)
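The -exec form described above can be sketched like this. The temp directory and sess_* filenames are made up for illustration (assuming GNU find/coreutils); it also uses "-exec ... {} +", which batches many pathnames into a single rm invocation, giving xargs-like efficiency without a pipe.

```shell
#!/bin/sh
# Sketch: delete old PHP-session-style files with find -exec.
# "{} +" hands find's matches to one rm call in large batches,
# instead of forking rm once per file as "{} \;" would.
TOP=$(mktemp -d)              # hypothetical stand-in for /proj/tmp
touch -d '20 days ago' "$TOP/sess_abc" "$TOP/sess_def"
touch "$TOP/sess_live"        # recent file, must survive

# -type f restricts to regular files; -iname matches case-insensitively.
find "$TOP" -mindepth 1 -type f -mtime +10 -iname 'sess_*' -exec rm -f {} +

ls "$TOP"
```

The trade-off versus "{} \;" is purely about fork/exec overhead; the matching logic is identical.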