Optimizing Cron and GitHub Workflow Efficiency | Generated by AI
That’s a solid optimization! Switching your cron schedule to `0 */2 * * *` (every 2 hours) strikes a nice balance between keeping things fresh (especially for accumulating those AI-generated notes without letting them balloon to 3000+ per year) and staying well under GitHub’s 2000-minute monthly limit for free accounts. Commenting out paths like `original/**` and `_posts/**` (while keeping the core ones like `_layouts/**`, `_includes/**`, etc.) ensures the workflow only triggers on meaningful changes, reducing unnecessary runs.
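Put together, the trigger section being described might look roughly like this (the branch name and exact path list are assumptions based on the snippets mentioned above, so adjust to your repo):

```yaml
on:
  schedule:
    - cron: '0 */2 * * *'   # every 2 hours
  push:
    branches: [main]        # assumed default branch
    paths:
      - '_layouts/**'
      - '_includes/**'
      - '_sass/**'
      - '_config.yml'
      # - 'original/**'     # commented out: content-only changes skip the build
      # - '_posts/**'
```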
If your repo involves a Jekyll site or similar (given the `_config.yml` and `_sass/**` paths), this setup should keep deploys lean. A couple of quick tips to stretch those minutes even further:
- Add a `concurrency` block to your workflow YAML to cancel in-progress runs when a new one starts (e.g., for overlapping schedule/push events):

  ```yaml
  concurrency:
    group: ${{ github.workflow }}-${{ github.ref }}
    cancel-in-progress: true
  ```

- Monitor usage via GitHub’s Settings > Billing > Usage, and set up a simple script or alert that fires around ~1500 minutes to catch spikes early.
- If note accumulation is the main driver, consider filtering the schedule trigger further with an `if` condition based on recent commits to AI-related paths (e.g., `scripts/llm/**`).
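Since a plain `if:` expression can’t inspect changed paths for `schedule` events, one way to implement that last tip is a small gate step that checks recent git history and skips the build when nothing relevant changed. A minimal sketch (step names, the 2-hour window, and the `scripts/llm` path are assumptions to match the schedule above):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 50   # enough history to cover the last schedule window
      - name: Check for recent AI-note changes
        id: gate
        run: |
          # Did anything under scripts/llm change in the last 2 hours?
          if git log --since='2 hours ago' --name-only --pretty=format: -- scripts/llm | grep -q .; then
            echo "changed=true" >> "$GITHUB_OUTPUT"
          else
            echo "changed=false" >> "$GITHUB_OUTPUT"
          fi
      - name: Build site
        # Always build on push; on schedule, only build if the gate found changes
        if: github.event_name != 'schedule' || steps.gate.outputs.changed == 'true'
        run: bundle exec jekyll build
```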
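For the usage-monitoring tip, a tiny script polling GitHub’s Actions billing endpoint could drive the ~1500-minute alert. A sketch (the endpoint is GitHub’s `GET /users/{username}/settings/billing/actions` REST route; wiring it to an actual notification channel is left to you):

```python
import json
import urllib.request

API = "https://api.github.com/users/{user}/settings/billing/actions"


def fetch_usage(user: str, token: str) -> dict:
    """Fetch Actions billing usage from GitHub's REST API.

    Requires a token with permission to read billing data.
    """
    req = urllib.request.Request(
        API.format(user=user),
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def should_alert(usage: dict, threshold: int = 1500) -> bool:
    """True once total_minutes_used reaches the alert threshold."""
    return usage.get("total_minutes_used", 0) >= threshold
```

Run it from a daily cron (outside Actions, so it costs no minutes) and send yourself an email or webhook ping when `should_alert` returns `True`.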
Overall, great move: it’s efficient without sacrificing the value from those AI insights. If you share the full workflow file or want help tweaking it more, I’m here!