One of my home-made (web) applications running on Debian Sarge is using the /tmp folder to store temporary files in (they are not needed anymore after being processed). Do I need to create a cron job to empty the /tmp directory, or is this being taken care of by itself (if so, I still see some files in it from several days ago!)?
I edited the sysklogd script to clean the /tmp dir on every reboot. I added the following to it. It has to be at a specific line, and I'll have to refer to my box when I get home.
Code:
rm -fr /tmp/* /tmp/.??*
I don't think it's a wise idea to clean the directory during production use.
Hi domino, Hmm, clean the /tmp dir on every reboot... Not sure if this is an option for me. My servers do not normally reboot often! (one of the servers has been up over 100 days now) I guess I need to change some code in my application and let it use another "temp" directory that I can clean every 6 hrs or so.
Yeah, kinda figured that, but I first thought it was a home server and not a dedicated box. I think I remember reading something a few years back about using a cron job to clean a user's /tmp directory, but I think it was on a RH 9 box. I'll post it when I find it. Sorry for the misunderstanding.
To be sure, you can create a cron job. Create a shell script like this one:
Code:
#!/bin/sh
# delete all files in /tmp older than two days
# note: no quotes around $( ... ) -- quoting it would turn the whole
# find output into a single word and the loop would run only once
for file in $( /usr/bin/find /tmp -type f -mtime +2 )
do
    rm -f "$file"
done
exit 0
Make it executable, and put it in a cron job. It deletes all files in /tmp that are older than two days.
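To hook it into cron, you'd add a line like this to root's crontab (via `crontab -e`); the script path here is just an example, adjust it to wherever you saved the script:

```
# m  h    dom mon dow  command
0    */6  *   *   *    /usr/local/bin/clean-tmp.sh
```

This runs the cleanup every six hours, which matches the "clean every 6hrs or so" idea above.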
Why doesn't this work here? I try to remove all files in the backup folder that are older than xx days... Instead of running just the rm command, I first wanted to echo it to see if it works fine... however, nothing is being returned...
I think you will need to set the var $file first to echo it. Now it's showing $file, but it's empty! Maybe something like this:
Code:
#!/bin/sh
for file in "$( /usr/bin/find /backup/ -type d -mtime +3 )"
do
    showIt=$file
    # rm -Rf $file
    echo $showIt
done
exit 0
It lists all folders and subfolders... However, I want to rm -Rf all folders that are older than xxx days (and in /backup/ only).
But only from /backup, right? What happens if you run this:
Code:
for file in "$( /usr/bin/find /backup/ -type d -mtime +3 )"
do
    echo $file
done
directly on the shell (not from a script)?
Nothing happens. Well, the process is running, but nothing is being echoed... I'll let it run for a while now...
At first it took a while, then it started outputting... but the problem remains: I only want to search /backup/ for folders older than xxx days, but not recursively...
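Part of what's going on here is shell quoting, not find itself. With quotes around `$( ... )`, the entire find output becomes one single word, so the loop body runs exactly once after find finishes (hence the long silence, then everything at once). A small sketch, using printf to stand in for find's output:

```shell
# Quoted: the whole output is one word -> a single loop iteration
count=0
for file in "$( printf '/backup/a\n/backup/b\n' )"; do
    count=$((count + 1))
done
echo "quoted: $count iteration(s)"      # quoted: 1 iteration(s)

# Unquoted: the shell word-splits on whitespace -> one iteration per path
# (this in turn breaks on paths containing spaces, which is why
# find's own -exec is usually the better tool)
count=0
for file in $( printf '/backup/a\n/backup/b\n' ); do
    count=$((count + 1))
done
echo "unquoted: $count iteration(s)"    # unquoted: 2 iteration(s)
```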
This is old, I know, but for those who want to know: there is no need to script this to get the same function, as find is quite powerful enough to do the job on its own, and the last post is correct... 'man' is your friend.
Code:
find /backup/ -maxdepth 1 -type f -mtime +3 -exec rm -f {} \;
(-maxdepth 1) stay at this directory depth only, don't recurse
(-type f) match regular files
(-mtime +3) older than 3 days
(-exec rm -f {} \;) and execute a rm for each file found.
Note that GNU find wants -maxdepth before the other tests, otherwise it warns.
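Since the original question was about removing *folders* under /backup/ (not files), here's a variant wrapped in a function; the function name is just an example. `-mindepth 1` matters, because without it `-maxdepth 1 -type d` would also match /backup/ itself and rm -rf it:

```shell
# clean_old_dirs DIR DAYS -- remove the subdirectories of DIR that were
# last modified more than DAYS days ago (name is just an example)
# -mindepth 1 keeps DIR itself out of the matches,
# -maxdepth 1 stops find from descending into subfolders
clean_old_dirs() {
    find "$1" -mindepth 1 -maxdepth 1 -type d -mtime +"$2" -exec rm -rf {} \;
}
```

Preview what would be deleted first by swapping `rm -rf` for `echo`, then call it as e.g. `clean_old_dirs /backup 3` from a cron job.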