Got a log directory that keeps getting stuffed with files? Want a quick way to clean out the old ones? Here is a simple snippet that scans one directory and removes any file more than 30 days old.
<!--- directory to clean out --->
<cfset logdir = "/Applications/ColdFusion8/logs">
<!--- list files only - type="file" skips subdirectories --->
<cfdirectory action="list" directory="#logdir#" name="files" type="file">
<!--- cutoff date: anything last modified before this gets removed --->
<cfset thirtydaysago = dateAdd("d", -30, now())>
<!--- get older files --->
<cfquery name="oldfiles" dbtype="query">
	select name
	from files
	where datelastmodified < <cfqueryparam cfsqltype="cf_sql_timestamp" value="#thirtydaysago#">
</cfquery>
<cfoutput>Out of #files.recordCount# files, there are #oldfiles.recordCount# to delete.</cfoutput>
<cfif oldfiles.recordCount>
	<!--- delete each file that is past the cutoff --->
	<cfloop query="oldfiles">
		<cffile action="delete" file="#logdir#/#name#">
	</cfloop>
</cfif>
Not much going on here. I will point out the type="file" attribute on cfdirectory, which was added in ColdFusion 8; it tells the tag to return only files, not subdirectories. Outside of that, the code just uses a query of queries to gather the files of a certain age (I won't say old, let's be polite) and then loops over them to delete them.
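If you're curious about what the query of queries has to work with, here is a quick sketch (using the same log directory as above) that just dumps the result set cfdirectory hands back; the dateLastModified column is what the WHERE clause filters on.
<!--- the list action returns a query with columns such as name, size,
      type, dateLastModified, attributes, and mode --->
<cfdirectory action="list" directory="/Applications/ColdFusion8/logs" name="files" type="file">
<cfdump var="#files#">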
In my next post I'll show an alternative - archiving to a zip.
Edit: I had my < and > mixed up above. It is correct now.
Archived Comments
Nice.
If you have a lot of big files, I believe you'll get better performance using the new fileDelete function.
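For reference, here is a minimal sketch of what that might look like in script form; it assumes ColdFusion 9 or later, where directoryList() and fileDelete() are available, and reuses the same path and 30-day cutoff from the example above.
<cfscript>
	// Assumes ColdFusion 9+, where directoryList() and fileDelete() exist.
	logDir = "/Applications/ColdFusion8/logs";
	cutoff = dateAdd("d", -30, now());

	// listInfo="query" returns a query with name, type, dateLastModified, etc.
	files = directoryList(logDir, false, "query");

	for (row = 1; row <= files.recordCount; row++) {
		// only delete files (not directories) older than the cutoff
		if (files.type[row] == "File" && dateCompare(files.dateLastModified[row], cutoff) < 0) {
			fileDelete(logDir & "/" & files.name[row]);
		}
	}
</cfscript>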
Just an FYI, I had my < and > wrong. Fixed.
Ray: Why the <cfif oldfiles.recordCount>?
Shouldn't the cfloop be enough?
It is me being anal. No need to run the loop if no results.
See, we have a clustered environment running round robin, so code like that wouldn't work without a bit of "fudging" with CFHTTP requests to make sure all the servers are dealt with.
Personally I'd use a scheduled task on the OS that runs batch files to deal with this kind of job. You can then target each server individually and don't have to hassle CF with a task the OS could do instead. A lot less time-consuming to write, test, and set up as well :-)