The cron task you proposed is the most efficient approach. Do you really want to slow down a user's request by forcing it to glob() through your cache?
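For illustration, here is a minimal sketch of what such a cron-run sweep could look like; the cache directory, file pattern, and max age are placeholders, not anything from your setup:

```python
#!/usr/bin/env python3
"""Minimal sketch of a cron-run cache sweeper (paths and ages are placeholders)."""

import glob
import os
import time

CACHE_DIR = "/var/www/app/cache"   # hypothetical cache directory
MAX_AGE = 60 * 60                  # delete entries older than one hour


def sweep_cache() -> None:
    """Remove cache files whose mtime is older than MAX_AGE."""
    now = time.time()
    for path in glob.glob(os.path.join(CACHE_DIR, "*.cache")):
        try:
            if now - os.path.getmtime(path) > MAX_AGE:
                os.remove(path)
        except OSError:
            # Another process may have removed the file already; ignore.
            pass


if __name__ == "__main__":
    sweep_cache()
```

Scheduled with something like `*/15 * * * * /usr/bin/python3 /path/to/sweep_cache.py`, this runs entirely outside any user request.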
Alternatively, you could run a hook that checks whether the user agent is a robot and delete the old files on those requests, but be careful which bots you do this for, or you might end up with some "this page loads slowly" results from random sites. :)
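A rough sketch of that idea, reusing the `sweep_cache` helper from above; the bot pattern is purely illustrative and would need tuning against your own access logs:

```python
import re

# Rough bot detection by User-Agent substring; illustrative, not exhaustive.
BOT_PATTERN = re.compile(r"googlebot|bingbot|crawler|spider", re.IGNORECASE)


def maybe_sweep_cache(user_agent: str) -> None:
    """Run the (potentially slow) cache sweep only when the requester
    looks like a crawler, so human visitors never pay the cost."""
    if user_agent and BOT_PATTERN.search(user_agent):
        sweep_cache()  # the glob-based cleanup sketched above
```

The downside the answer hints at: if a crawler that measures page speed happens to be the one paying for the sweep, its reports will make your pages look slower than they are for real users.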