Automated pfSense Git Backups with Kestra (config.xml Version Control)

Why You Must Backup pfSense Regularly

If you're running your network on pfSense, your entire infrastructure depends on a single file:

/cf/conf/config.xml

This file contains:

  • Firewall rules
  • NAT rules
  • VLAN configuration
  • VPN settings
  • Certificates
  • Interface assignments
  • User accounts

If this file is lost or corrupted, rebuilding your firewall manually can take hours, sometimes days. A single bad upgrade can leave you reinstalling the firewall from scratch.

Now combine that file with the automation power of Kestra and GitLab (any Git server will work), and you get:

✅ Automated firewall backups
✅ Version-controlled configuration history
✅ Scheduled offsite storage
✅ Rapid disaster recovery

Let’s build it.


Important Prerequisites on pfSense

Before this automation works, you must prepare your pfSense box properly.

1️⃣ Install sudo

pfSense does not ship with sudo by default.

Install it via:

pkg install sudo

Then configure /usr/local/etc/sudoers (carefully!) to allow your automation user to run Git commands.

Example minimal rule:

ansible ALL=(ALL) NOPASSWD: /usr/local/bin/git

Keep sudo access restricted to only what is necessary.


2️⃣ Install Git

pfSense also does not include Git by default.

Install it:

pkg install git

Without Git installed locally, your Kestra SSH command will fail.

Once installed, initialize the repository:

cd /cf/conf
sudo git init
sudo git add *
sudo git remote add origin https://gitlab/<owner of repo>/<repo>
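Two things commonly bite on the very first commit: git refuses to commit without a committer identity, and a fresh git init may name its branch master while the push later targets main (git branch -M main fixes that). A minimal identity setup, run as the same user that performs the commits (root, if everything goes through sudo) - the name and address here are placeholders:

```shell
# git commit needs an identity; set one for whichever user runs the commands.
# If the Kestra flow runs git via sudo, this writes root's global config.
git config --global user.name  "pfsense-backup"
git config --global user.email "backup@example.invalid"   # placeholder address
```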

It's also a good idea to set up a Git credential store so HTTPS pushes don't prompt for credentials. By default this is a file called .git-credentials in the user's home directory, with one line per remote in this format:

https://{{ gitlab_username|urlencode }}:{{ gitlab_password|urlencode }}@gitlab

Check which user the commands actually run as: if you invoke git via sudo, the credential file must live in root's home directory, not yours.
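A minimal sketch of setting that up, assuming an HTTPS remote on a host called gitlab and placeholder credentials (URL-encode any special characters, as in the template above):

```shell
# Create the credential file for the user that runs git (root's home
# directory if the commands run under sudo).
CRED_FILE="$HOME/.git-credentials"
printf 'https://%s:%s@gitlab\n' 'gitlab_username' 'gitlab_password' > "$CRED_FILE"
chmod 600 "$CRED_FILE"   # the password is stored in plain text - restrict access
# Tell git to read/write that file for HTTPS authentication.
git config --global credential.helper store
```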


First-Time Force Push (Important)

When you connect the local pfSense repo to an existing remote repository, you may encounter:

  • Non-fast-forward errors
  • "fatal: refusing to merge unrelated histories"

In my case, the first push required:

git push -u origin main --force

Why?

Because:

  • The remote repo already existed
  • The local pfSense repo was freshly initialized
  • Git histories did not match

⚠ After the first force push, do NOT keep using --force.

It should only be required once during initial setup.

After that, normal pushes work correctly.
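You can reproduce the situation safely with two throwaway local repos; because their histories share no common commit, a normal push is refused and only a force push succeeds (a sketch, nothing pfSense-specific):

```shell
set -e
WORK="$(mktemp -d)"
# Stand-in "remote": a bare repo that already has its own history.
git init -q --bare "$WORK/remote.git"
git init -q "$WORK/seed"
git -C "$WORK/seed" -c user.name=demo -c user.email=demo@example.invalid \
    commit -q --allow-empty -m "existing remote history"
git -C "$WORK/seed" push -q "$WORK/remote.git" HEAD:main
# Freshly initialised local repo, as on the pfSense box.
git init -q "$WORK/local"
git -C "$WORK/local" -c user.name=demo -c user.email=demo@example.invalid \
    commit -q --allow-empty -m "fresh local history"
git -C "$WORK/local" remote add origin "$WORK/remote.git"
# A normal push is refused: the two histories share no commits...
git -C "$WORK/local" push -q origin HEAD:main 2>/dev/null || echo "push rejected"
# ...so the one-time fix is a force push.
git -C "$WORK/local" push -q --force origin HEAD:main && echo "force push ok"
```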


Backup Strategy Overview

We will:

  1. Use Kestra to securely connect to pfSense
  2. Pull the config.xml
  3. Store it temporarily
  4. Push it into a Git repository
  5. Run it automatically on a schedule

We’ll use GitLab as the remote repository, but this works with any Git server.


Your Production Kestra Flow

Below is a production-ready example flow:

id: pfsense_backup
namespace: ansible

tasks:

  - id: fetch_config
    type: io.kestra.plugin.fs.ssh.Command
    host: "pfsense"
    authMethod: PUBLIC_KEY
#    authMethod: PASSWORD
    username: "ansible"
    strictHostKeyChecking: no
#    privateKey: "{{ secret('SSH_RSA_PRIVATE_KEY') }}"
    privateKey: "{{ kv('SSH_KEY') }}"
#    password: "{{ secret('GITLAB_PASSWORD') }}"
    commands: 
      - cd /cf/conf/ && sudo git config --global --add safe.directory /cf/conf && sudo git config credential.helper store && sudo git add * && sudo git commit -m "kestra update"; sudo git push origin main

triggers:
  - id: schedule_backup
    type: io.kestra.plugin.core.trigger.Schedule
    cron: 0 12 * * *
    disabled: true   # set to false once you are happy with manual runs

concurrency:
  limit: 1
  behavior: FAIL

How This Design Works

Instead of pulling the file back to Kestra, the flow:

  1. SSHes into pfSense
  2. Runs Git locally on the box
  3. Commits any changes
  4. Pushes to GitLab

This reduces complexity and avoids unnecessary file transfers.
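The long chained command in the flow is easier to follow broken into steps. The sketch below runs the same sequence against a throwaway local repo standing in for /cf/conf, with sudo dropped so it can run anywhere:

```shell
set -e
WORK="$(mktemp -d)"
git init -q --bare "$WORK/origin.git"          # stand-in for the GitLab repo
git clone -q "$WORK/origin.git" "$WORK/conf"   # stand-in for /cf/conf
cd "$WORK/conf"
echo '<pfsense/>' > config.xml                 # stand-in for the real config
# The flow's chained command, one step at a time:
git config --global --add safe.directory "$WORK/conf"  # needed when sudo runs git as a different user than the repo owner
git config credential.helper store                      # reuse ~/.git-credentials for HTTPS pushes
git add -A                                              # stage the config (the flow uses 'git add *')
git -c user.name=demo -c user.email=demo@example.invalid \
    commit -q -m "kestra update" || true                # the ';' in the flow: push even if nothing changed
git push -q origin HEAD                                 # ship the new revision
```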


Why This Is a Strong Backup Pattern

This approach provides:

  • Immutable version history
  • Timestamped commits
  • Change diffs
  • Off-device storage
  • Audit trail of firewall modifications

If your firewall hardware fails:

  1. Install fresh pfSense
  2. Clone repo
  3. Replace /cf/conf/config.xml
  4. Reboot

Recovery takes minutes instead of hours.
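The recovery steps above can be rehearsed locally. This sketch simulates them with a throwaway "remote" repo standing in for GitLab and a scratch directory standing in for /cf/conf; on a real rebuild, substitute your repository URL and the real path:

```shell
set -e
WORK="$(mktemp -d)"
# Stand-in for the GitLab repo that holds the backed-up config.
git init -q --bare "$WORK/backup.git"
git clone -q "$WORK/backup.git" "$WORK/seed"
echo '<pfsense><version>demo</version></pfsense>' > "$WORK/seed/config.xml"
git -C "$WORK/seed" add config.xml
git -C "$WORK/seed" -c user.name=demo -c user.email=demo@example.invalid \
    commit -q -m "nightly backup"
git -C "$WORK/seed" push -q origin HEAD
# --- Recovery on a fresh install ---
git clone -q "$WORK/backup.git" "$WORK/restore"      # 2. clone the repo
TARGET="$WORK/cf/conf"                               # stand-in for /cf/conf
mkdir -p "$TARGET"
cp "$WORK/restore/config.xml" "$TARGET/config.xml"   # 3. drop the config in place
# 4. reboot so pfSense loads the restored config.xml
```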


Security Considerations

config.xml contains:

  • VPN credentials
  • Certificate data
  • Password hashes
  • Shared secrets

You must:

  • Keep the repository private
  • Restrict GitLab access
  • Enable 2FA
  • Consider encryption for high-security environments

For enterprise environments, encrypt before pushing. Alternatively, back up the entire /cf/conf directory with tar, encrypt the archive, and commit that instead. The trade-off: an encrypted archive no longer gives you per-file diffs over time.
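A sketch of that tar-and-encrypt variant, using openssl for the encryption (gpg --symmetric is an equally valid choice); the scratch directory and literal passphrase are demo stand-ins for /cf/conf and a proper secret store:

```shell
set -e
cd "$(mktemp -d)"
SRC="$(mktemp -d)"; echo '<pfsense/>' > "$SRC/config.xml"  # stand-in for /cf/conf
export BACKUP_PASSPHRASE='change-me'   # in practice, inject from a secret store
# Archive the whole directory, then encrypt the archive.
tar -czf conf-backup.tar.gz -C "$(dirname "$SRC")" "$(basename "$SRC")"
openssl enc -aes-256-cbc -pbkdf2 -salt -pass env:BACKUP_PASSPHRASE \
    -in conf-backup.tar.gz -out conf-backup.tar.gz.enc
rm conf-backup.tar.gz                  # commit only the encrypted archive
# Restore: openssl enc -d -aes-256-cbc -pbkdf2 -pass env:BACKUP_PASSPHRASE \
#          -in conf-backup.tar.gz.enc | tar -xzf -
```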


Concurrency Protection Explained

concurrency:
  limit: 1
  behavior: FAIL

This block prevents overlapping backup runs.

This ensures:

  • No race conditions
  • No partial commits
  • No corrupted backup history

Common Mistakes to Avoid

❌ Forgetting to install sudo
❌ Forgetting to install git
❌ Leaving repo public
❌ Using password-based SSH
❌ Repeated force pushing
❌ Never testing restore


Testing Your Restore Process (Critical)

Backups are worthless if you never test them.

At least quarterly:

  1. Install a test pfSense VM
  2. Restore a saved config.xml
  3. Confirm rules load properly

This validates your disaster recovery plan.


So why not Ansible?

I use Ansible a lot, so why not use it here?

Well, I tried. Ansible's built-in git module only clones and pulls; it doesn't add, commit or push. To get that working I would have needed an extra collection (I was looking at git_acp), but it didn't want to play nicely with the Docker container.

In the end, this was the simplest way to get the configuration into GitLab.


Final Thoughts

By combining:

  • pfSense
  • Kestra
  • Git automation
  • Secure SSH keys
  • Scheduled triggers
  • Proper first-time setup

You’ve created a lightweight but enterprise-grade firewall backup system.

Set it up once.
Protect your network permanently.


About the author

Tim Wilkes is a UK-based security architect with over 15 years of experience in electronics, Linux, and Unix systems administration. Since 2021, he's been designing secure systems for a telecom company while indulging his passions for programming, automation, and 3D printing. Tim shares his projects, tinkering adventures, and tech insights here - partly as a personal log, and partly in the hopes that others will find them useful.

Want to connect or follow along?

LinkedIn: [phpsytems]
Twitter / X: [@timmehwimmy]
Mastodon: [@timmehwimmy@infosec.exchange]


If you've found a post helpful, consider supporting the blog - it's a part-time passion that your support helps keep alive.

⚠️ Disclaimer

This post may contain affiliate links. If you choose to purchase through them, I may earn a small commission at no extra cost to you. I only recommend items and services I’ve personally read or used and found valuable.

As an Amazon Associate I earn from qualifying purchases.