I do something similar: it's a relatively small but important SQLite database, so every five minutes I take a backup using that method and check the generated SQL files into remote git repositories.
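The backup step can be sketched roughly like this, assuming the sqlite3 CLI's `.dump` command is "that method" (the database name, table, and commit message are stand-ins, not the author's actual setup):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the real database: a small SQLite db with one table.
sqlite3 app.db "CREATE TABLE notes(id INTEGER PRIMARY KEY, body TEXT);
                INSERT INTO notes(body) VALUES ('hello');"

# Serialize the whole database to plain SQL with .dump.
sqlite3 app.db .dump > backup.sql

# Check the generated SQL into a git repo (local here; the author pushes to remotes).
git init -q backup-repo
cp backup.sql backup-repo/
cd backup-repo
git add backup.sql
git -c user.name=backup -c user.email=backup@example.com \
    commit -qm "backup $(date -u +%FT%TZ)"
```

Dumping to SQL rather than copying the binary db file keeps the backups diff-friendly, which is what makes the git-history approach on the second remote practical.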
One is on GitHub and is just a snapshot: a single commit in an otherwise empty repo, force-pushed. This is for recovery purposes; I don't need the history there, and would probably run afoul of their service limits if I kept it.
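A minimal sketch of that single-commit force-push, using a local bare repo to stand in for the GitHub remote (the `snapshot` helper and all names are hypothetical):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the GitHub remote.
git init -q --bare remote.git

snapshot() {
  # Fresh repo each time, so the pushed history is always exactly one commit.
  rm -rf snap && git init -q snap && cd snap
  cp ../backup.sql .
  git add backup.sql
  git -c user.name=backup -c user.email=backup@example.com commit -qm "snapshot"
  # Force-push the unrelated single-commit history, overwriting the old snapshot.
  git push -q --force ../remote.git HEAD:main
  cd ..
}

echo "-- dump 1" > backup.sql; snapshot
echo "-- dump 2" > backup.sql; snapshot
```

Because each push replaces the branch wholesale, the remote never accumulates history and stays well under any size limits.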
The other, on Azure DevOps, has the entire commit history for the past few years. This one is a bit trickier because the pack files end up exhausting disk space if they aren't cleaned up, and garbage collection interrupts the backups. So the script clones just the latest commit (grafted), pushes the next commit, and wipes the local repo. I have no idea how this looks on the remote backend, but it's still working without any size complaints, and it's good to know there's an entire snapshotted history there if needed, as well as being able to clone the most recent snapshot for recovery if GitHub fails.
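The clone-commit-push-wipe cycle could look something like the sketch below, again with a local bare repo standing in for the Azure DevOps remote (the `file://` URL forces a real transport so `--depth 1` takes effect; all paths and messages are illustrative):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in remote, seeded with one prior backup commit on main.
git init -q --bare remote.git
git --git-dir=remote.git symbolic-ref HEAD refs/heads/main
git clone -q remote.git seed && cd seed
echo "-- dump 1" > backup.sql
git add backup.sql
git -c user.name=b -c user.email=b@example.com commit -qm "backup 1"
git push -q origin HEAD:main
cd .. && rm -rf seed

# The cycle: shallow (grafted) clone of only the latest commit...
git clone -q --depth 1 "file://$workdir/remote.git" work
cd work
echo "-- dump 2" > backup.sql
git add backup.sql
git -c user.name=b -c user.email=b@example.com commit -qm "backup 2"
# ...push the one new commit (the remote already has the rest of the history)...
git push -q origin HEAD:main
# ...and wipe the local copy, so pack files never accumulate and no gc is needed.
cd .. && rm -rf work
```

The key property is that the push from a shallow clone only transfers the new commit, while the remote quietly keeps the full chain, so local disk usage stays constant no matter how long the history grows.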