Quick and dirty automatic old file deletion

I’ve started using my VPS to store stills from my IP cameras. It’s easy and quick – I’ve got a script running on my OpenWrt router that fetches the stills periodically from 8 different cameras. Whilst some support FTP upload, others are less well connected (hence the need for the script).

Unfortunately, whilst I was away without Internet access last month, the storage on the VPS filled up with these images, taking down most of the services running on it. This is the quick and dirty fix I managed to implement from my phone:
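A minimal sketch of that fix – the path and schedule here are hypothetical (adjust /var/camera-stills and the time to suit your setup), and the demo directory below just shows the find invocation doing its job:

```shell
# The crontab line, run daily at 03:00, deletes stills older than 10 days:
#
#   0 3 * * * find /var/camera-stills -type f -mtime +10 -exec rm {} \;
#
# Demonstrate the find invocation itself against a throwaway directory
# (touch -d is GNU coreutils, as found on most Linux systems):
mkdir -p /tmp/stills-demo
touch -d '15 days ago' /tmp/stills-demo/old.jpg   # older than 10 days: deleted
touch /tmp/stills-demo/new.jpg                    # recent: kept
find /tmp/stills-demo -type f -mtime +10 -exec rm {} \;
ls /tmp/stills-demo                               # only new.jpg remains
```

You can append the cron line to root's crontab with `crontab -e`, or non-interactively with `( crontab -l; echo '…' ) | crontab -`.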

This code adds a daily crontab entry that uses the ubiquitous find command to simply delete old files. In this case I’ve used 10 days, but you can get flexible with the -mtime option if you check the man page. We use -exec, as this doesn’t expand the entire list of found files on the command line and so avoids “argument list too long” issues.
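To illustrate the flexibility of -mtime mentioned above, here is a quick sketch (the directory is hypothetical). find measures file age in whole 24-hour blocks, and the sign of the argument selects older-than, exactly, or newer-than:

```shell
mkdir -p /tmp/mtime-demo
touch -d '5 days ago'  /tmp/mtime-demo/recent.jpg
touch -d '20 days ago' /tmp/mtime-demo/ancient.jpg

find /tmp/mtime-demo -type f -mtime +10   # more than 10 days old: ancient.jpg
find /tmp/mtime-demo -type f -mtime -10   # less than 10 days old: recent.jpg
find /tmp/mtime-demo -type f -mtime 10    # exactly 10-11 days old: neither
```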

A couple of caveats:

  • I ran it as root, because that’s what I had to hand. Many users have an ingrained fear of root; I consider it a personal preference, best debated elsewhere.
  • You should use crontab -e to modify your cron files, as it verifies the syntax before installing them.

Being able to do these quick fixes is what I love about Linux. I highly recommend following Bash One-Liners on Twitter – https://twitter.com/bashoneliners – who also have a QDB-style website.