Stumbled on this and am curious if @din ever finished his/her backup (and wanted to leave some additional thoughts for the next bandwidth-constrained Kopia user).
Also curious why it wasn’t recommended to use filtering, either by file type or directory, to (1) ensure a completed snapshot and (2) prioritize the most important data first.
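For anyone landing here later, here’s roughly what I mean, sketched with `kopia policy set` ignore rules and separate snapshot sources. The paths are hypothetical and the flags should be double-checked against `kopia policy set --help` on your version:

```shell
# Snapshot the critical data as its own source so it can finish first
# (path is hypothetical -- substitute your own important directory):
kopia snapshot create /home/me/documents

# For the big catch-all source, ignore the huge low-priority trees
# so the initial snapshot can actually complete on a thin pipe:
kopia policy set /home/me --add-ignore "Videos/" --add-ignore "*.iso"
kopia snapshot create /home/me
```

Once the important stuff is safely offsite, you can relax the ignore rules and let the rest trickle up.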
And what about compression? Why not benchmark the various compression algorithms to figure out which is the best fit for your data? Given no mention of CPU bottlenecks (only bandwidth), the best option might be whatever smashes your data down the most. 10 GB files are often things like compressed media, which don’t compress much further, so you could flag those as exceptions to avoid unnecessary CPU work. But then again, when you’re that constrained for bandwidth, I’d probably just squeeze those blocks as tight as I can before sending them out. Someone with more under-the-hood knowledge should probably chime in here if I’m coming to the wrong conclusions.
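Concretely, Kopia has a built-in compression benchmark, and per-extension exceptions can be set via policy. A sketch (the sample path is made up; verify the flags with `--help` on your build):

```shell
# Benchmark the available compression algorithms on a representative
# chunk of your data before committing to one:
kopia benchmark compression --data-file=/home/me/sample-data.bin

# Set the winner as the policy for the source (zstd is just an example),
# and skip recompressing already-compressed media extensions:
kopia policy set /home/me --compression=zstd
kopia policy set /home/me --add-never-compress=.mkv --add-never-compress=.mp4
```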
Also, I know this has been said elsewhere, but with that small a pipe, if you really care about that data and need it offsite, I’d bum a friend’s network, a coffee shop’s, a library’s, …or basically anywhere’s, to at least get the important stuff up, and then let the rest trickle up over time.
Maybe the old “sneaker-net” is a better fit? …Have an offsite drive (or two that you keep in rotation) at your work or someone’s house, bring it home to back up over a wire (e.g. USB 3, where even a slow spinner will be somewhere around 2,000 times faster than your outbound pipe), and then it lives offsite.
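Back-of-envelope on that “~2,000 times faster” figure, with assumed numbers (a slow spinning disk over USB 3 at ~100 MB/s, and an uplink around 0.4 Mbit/s, i.e. roughly 50 KB/s):

```python
# Rough check of the USB-3-vs-uplink speedup claim.
# Assumptions (not from the thread): slow USB 3 spinner ~100 MB/s,
# constrained outbound pipe ~0.4 Mbit/s (~50 KB/s).
usb3_spinner_mbit = 100 * 8   # 100 MB/s -> 800 Mbit/s
uplink_mbit = 0.4             # ~50 KB/s upload

speedup = usb3_spinner_mbit / uplink_mbit
print(round(speedup))         # ratio of local wire speed to uplink
```

So the 2,000x figure holds for something like an old-DSL-class uplink; a faster pipe shrinks the ratio proportionally.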
And then there are the many strange combinations of all of the above that might be best depending on your specific data situation (how fast your data changes over time, the correlation between important files and large files, etc.).
Interested to hear how things went / are going!