Backups to GCS frequently fail with 'context canceled', then won't re-run and can't exit gracefully; a kill is required. New snapshots get stuck at 'listing manifest contents' until kill & relaunch

For about a month now, I have frequently been getting a failure backing up to GCS, which ends in ‘context canceled’.

Once this happens, I can’t take any further snapshots; they just get stuck at ‘listing manifest contents’.

Here is part of a failure log from Kopia UI:

23:52:49.168 snapshotted directory path:"." error:"processing subdirectories: unable to process directory \"Server backup\": error writing dir manifest: Server backup: unable to write directory: unable to write content chunk 0 of DIR:Server backup: error writing previously failed pack: error writing pack: can't save pack data blob p83a0a48538b172bc385c3d79576f5dde-sa6e3e7c61d7b245610d: error writing pack file: unable to complete PutBlob(p83a0a48538b172bc385c3d79576f5dde-sa6e3e7c61d7b245610d) despite 10 retries, last error: unexpected GCS error: Post \"https://storage.googleapis.com/upload/storage/v1/b/ourCompany_veam_backup/o?alt=json&name=p83a0a48538b172bc385c3d79576f5dde-sa6e3e7c61d7b245610d&prettyPrint=false&projection=full&uploadType=resumable\": context canceled: error writing previously failed pack: error writing pack: can't save pack data blob p83a0a48538b172bc385c3d79576f5dde-sa6e3e7c61d7b245610d: error writing pack file: unable to complete PutBlob(p83a0a48538b172bc385c3d79576f5dde-sa6e3e7c61d7b245610d) despite 10 retries, last error: unexpected GCS error: Post \"https://storage.googleapis.com/upload/storage/v1/b/ourCompany_veam_backup/o?alt=json&name=p83a0a48538b172bc385c3d79576f5dde-sa6e3e7c61d7b245610d&prettyPrint=false&projection=full&uploadType=resumable\": context canceled" dur:1h2m51.802867083s

And if I try to run another snapshot later, or even the next day, it just hangs at ‘listing manifest contents’:

08:32:18.866 uploading root@backup:/mnt/backup/Veeam-latest-snapshot
08:32:18.867 reloading committed manifest contents: rev=9 last=0
08:32:18.867 listing manifest contents

I am on Fedora 35 and receive Kopia updates via the kopia yum repository; I am currently on version v0.10.4.

I have kopia running in tmux. When I reattach the tmux session and try to stop it with Ctrl-C, it just hangs at ‘stopping all source managers’:

[root@backup ~]# kopia server --ui --insecure --address="http://192.168.1.50:51515" --server-password="some-password"

Server will allow connections from users whose accounts are stored in the repository.
User accounts can be added using 'kopia server user add'.

SERVER ADDRESS: http://192.168.1.50:51515
Open the address above in a web browser to use the UI.
upload triggered via API: root@backup:/mnt/backup/Veeam-latest-snapshot
umount: /mnt/backup/Veeam-latest-snapshot: not mounted.
upload triggered via API: root@backup:/mnt/backup/Veeam-latest-snapshot
^CShutting down...
stopping all source managers

I open another terminal and run ‘killall -9 kopia’ to stop kopia, then restart it, after which I can usually take a snapshot OK.

Because I have a before-action script that mounts the latest snapper btrfs snapshot into a ‘Veeam-latest-snapshot’ directory, this failure causes a knock-on problem: the directory is still busy even after killing the dead kopia, so my script can’t umount the previous latest snapshot in order to mount the newest one.
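
In case it helps anyone hitting the same thing, here is a sketch of a small helper I could use to see what is still holding the mountpoint before retrying the umount (this assumes fuser from psmisc is installed; the lsof fallback is the same idea as in my script further down):

```shell
# Sketch: list processes still holding a mountpoint open.
# Assumes fuser (from psmisc); falls back to lsof otherwise.
holders() {
    local mnt="$1"
    if command -v fuser >/dev/null 2>&1; then
        fuser -vm "$mnt"
    else
        lsof | grep -F "$mnt"
    fi
}
# As a last resort, `umount -l "$mnt"` (lazy unmount) detaches the
# mountpoint immediately and finishes the cleanup once the holders exit.
```

Running `holders /mnt/backup/Veeam-latest-snapshot` would then show the offending PIDs.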

I’ll look into that; this appears to be GCS-provider specific.

This seems to be happening more often than not, if not all the time; it has certainly happened on each of my last few attempts. I keep missing my window to try again, but I just did now and hit the same failure.
Let me know if there’s anything I can share that could help.
cheers :slight_smile:

They all seem to be failing at the same point (259.5 GB).

When I look inside the snapshot via the UI, I see a large ‘checkpointed’ file in the directory where it got stuck. Not sure if this helps at all.

Oh, so you have a giant file there… How are you triggering the snapshot?

Can you try snapshotting from command line?

Yes, lots of big files: bare-metal disk image files.
I have been running from the ‘Snapshot now’ button in the web UI.
I’ll try from the command line now :slight_smile:

So far so good. It is certainly beyond the 259.5 GB point at which it had been dying before.
It’s a shame I can’t see the same sort of friendly log output as the web UI shows, though.
All I see is:

Snapshotting root@backup:/mnt/backup/Veeam-latest-snapshot ...
 | 3 hashing, 23 hashed (876.7 GB), 65 cached (650.5 GB), uploaded 698.6 GB, estimating...

The cli-logs and content logs don’t show the same friendly directory-by-directory ‘snapshotted’ entries that the web UI does.

It’s uploading to GCS at ~500 Mbps, so I should have a completed snapshot in another few hours.

And do you have this giant file in a directory by itself? If so, you may benefit from upcoming parallelization improvements, which will allow processing that one file in parallel with files in other directories.

It likely won’t fix this particular issue, though; for that we need another fix.

I am using Veeam to back up to this Linux box over SMB. I have set some options in Veeam that appear to work well with Kopia’s hashing for incrementals. Not sure what to do over the very long term, mind you (one would think the incrementals must be reset at some point).

So I have a few directories, one per machine, each containing one very large file and then many fairly large files.

It seems to be working fine again; it does look like my problem could have been related to initiating the snapshot from the web UI.
I am running via a cron job now and it seems fine. I just can’t get the same kind of useful, clear logging that the web UI gives, but I’ll come back to that another time.

The parallelisation sounds interesting! Looking forward to that :slight_smile:

FYI here is the current directory that is being cloned to GCS by Kopia:

[root@backup Veeam-latest-snapshot]# tree --du -h
.
├── [  17G]  OpenOTP & NetScaler
│ ├── [ 8.6G]  OpenOTP & NetScalerD2021-12-26T200431_54F3.vbk
│ ├── [ 180M]  OpenOTP & NetScalerD2021-12-27T201423_0311.vib
│ ├── [ 184M]  OpenOTP & NetScalerD2021-12-28T201419_9E69.vib
│ ├── [ 216M]  OpenOTP & NetScalerD2021-12-29T201413_57F2.vib
│ ├── [ 197M]  OpenOTP & NetScalerD2021-12-30T201408_A10D.vib
│ ├── [ 194M]  OpenOTP & NetScalerD2021-12-31T201404_1AD8.vib
│ ├── [ 184M]  OpenOTP & NetScalerD2022-01-01T200510_705D.vib
│ ├── [ 182M]  OpenOTP & NetScalerD2022-01-02T200443_4931.vib
│ ├── [ 184M]  OpenOTP & NetScalerD2022-01-03T201413_2D9C.vib
│ ├── [ 521M]  OpenOTP & NetScalerD2022-01-20T210751_A568.vib
│ ├── [ 186M]  OpenOTP & NetScalerD2022-01-21T205547_4EA6.vib
│ ├── [ 186M]  OpenOTP & NetScalerD2022-01-22T200545_AAB5.vib
│ ├── [ 191M]  OpenOTP & NetScalerD2022-01-23T200427_49BF.vib
│ ├── [ 181M]  OpenOTP & NetScalerD2022-01-24T201553_24F1.vib
│ ├── [ 192M]  OpenOTP & NetScalerD2022-01-25T201655_D211.vib
│ ├── [ 213M]  OpenOTP & NetScalerD2022-01-26T201647_2E82.vib
│ ├── [ 197M]  OpenOTP & NetScalerD2022-01-27T201703_66C8.vib
│ ├── [ 191M]  OpenOTP & NetScalerD2022-01-28T201656_2879.vib
│ ├── [ 178M]  OpenOTP & NetScalerD2022-01-29T200538_812D.vib
│ ├── [ 179M]  OpenOTP & NetScalerD2022-01-30T200507_7508.vib
│ ├── [ 179M]  OpenOTP & NetScalerD2022-01-31T205823_1B06.vib
│ ├── [ 184M]  OpenOTP & NetScalerD2022-02-01T203951_D305.vib
│ ├── [ 206M]  OpenOTP & NetScalerD2022-02-02T202118_22AB.vib
│ ├── [ 194M]  OpenOTP & NetScalerD2022-02-03T202003_9A55.vib
│ ├── [ 191M]  OpenOTP & NetScalerD2022-02-04T203909_31FC.vib
│ ├── [ 177M]  OpenOTP & NetScalerD2022-02-05T202902_5A6A.vib
│ ├── [ 181M]  OpenOTP & NetScalerD2022-02-06T202858_A968.vib
│ ├── [ 178M]  OpenOTP & NetScalerD2022-02-07T203938_C301.vib
│ ├── [ 188M]  OpenOTP & NetScalerD2022-02-08T203631_B10A.vib
│ ├── [ 216M]  OpenOTP & NetScalerD2022-02-09T202928_43F0.vib
│ ├── [ 203M]  OpenOTP & NetScalerD2022-02-10T203022_0BC3.vib
│ ├── [ 198M]  OpenOTP & NetScalerD2022-02-11T204259_F63F.vib
│ ├── [ 178M]  OpenOTP & NetScalerD2022-02-12T203209_CBA2.vib
│ ├── [ 189M]  OpenOTP & NetScalerD2022-02-13T203120_944E.vib
│ ├── [ 180M]  OpenOTP & NetScalerD2022-02-14T204357_D6B9.vib
│ ├── [ 193M]  OpenOTP & NetScalerD2022-02-15T204355_B8C8.vib
│ ├── [ 216M]  OpenOTP & NetScalerD2022-02-16T202244_C177.vib
│ ├── [ 194M]  OpenOTP & NetScalerD2022-02-17T202108_7EAE.vib
│ ├── [ 197M]  OpenOTP & NetScalerD2022-02-18T204213_D2C0.vib
│ ├── [ 176M]  OpenOTP & NetScalerD2022-02-19T203212_3D71.vib
│ ├── [ 182M]  OpenOTP & NetScalerD2022-02-20T203209_541B.vib
│ ├── [ 179M]  OpenOTP & NetScalerD2022-02-21T204339_830C.vib
│ ├── [ 185M]  OpenOTP & NetScalerD2022-02-22T204447_9D65.vib
│ ├── [ 215M]  OpenOTP & NetScalerD2022-02-23T204356_9376.vib
│ ├── [ 194M]  OpenOTP & NetScalerD2022-02-24T202025_4994.vib
│ └── [ 639K]  OpenOTP & NetScaler.vbm
├── [ 1.5T]  AmiExpress backup
│ ├── [ 225G]  AmiExpress backupD2021-12-26T200018_E60B.vbk
│ ├── [  34G]  AmiExpress backupD2021-12-27T200033_688A.vib
│ ├── [  37G]  AmiExpress backupD2021-12-28T200029_6B60.vib
│ ├── [  38G]  AmiExpress backupD2021-12-29T200023_B5C1.vib
│ ├── [  37G]  AmiExpress backupD2021-12-30T200017_6469.vib
│ ├── [  37G]  AmiExpress backupD2021-12-31T200014_E71E.vib
│ ├── [ 5.0G]  AmiExpress backupD2022-01-01T200033_4917.vib
│ ├── [ 2.9G]  AmiExpress backupD2022-01-02T200029_9B1E.vib
│ ├── [  35G]  AmiExpress backupD2022-01-03T200023_E99B.vib
│ ├── [  81G]  AmiExpress backupD2022-01-20T200045_67B6.vib
│ ├── [  37G]  AmiExpress backupD2022-01-21T200028_7526.vib
│ ├── [ 5.0G]  AmiExpress backupD2022-01-22T200023_88F3.vib
│ ├── [ 2.9G]  AmiExpress backupD2022-01-23T200013_A3C2.vib
│ ├── [  36G]  AmiExpress backupD2022-01-24T200030_FE6A.vib
│ ├── [  39G]  AmiExpress backupD2022-01-25T200024_840A.vib
│ ├── [  39G]  AmiExpress backupD2022-01-26T200016_7436.vib
│ ├── [  39G]  AmiExpress backupD2022-01-27T200032_FF29.vib
│ ├── [  39G]  AmiExpress backupD2022-01-28T200025_9FDD.vib
│ ├── [ 5.0G]  AmiExpress backupD2022-01-29T200016_B357.vib
│ ├── [ 2.9G]  AmiExpress backupD2022-01-30T200031_D3F1.vib
│ ├── [  37G]  AmiExpress backupD2022-01-31T200022_ECDE.vib
│ ├── [  40G]  AmiExpress backupD2022-02-01T200017_44C4.vib
│ ├── [  39G]  AmiExpress backupD2022-02-02T200034_4EA7.vib
│ ├── [  40G]  AmiExpress backupD2022-02-03T200026_4FCF.vib
│ ├── [  39G]  AmiExpress backupD2022-02-04T200020_3F5A.vib
│ ├── [ 5.1G]  AmiExpress backupD2022-02-05T200036_1872.vib
│ ├── [ 2.9G]  AmiExpress backupD2022-02-06T200031_EDC6.vib
│ ├── [  38G]  AmiExpress backupD2022-02-07T200026_EC28.vib
│ ├── [  62G]  AmiExpress backupD2022-02-09T200047_DEC9.vib
│ ├── [  39G]  AmiExpress backupD2022-02-10T200023_5140.vib
│ ├── [  40G]  AmiExpress backupD2022-02-11T200019_87EA.vib
│ ├── [ 5.2G]  AmiExpress backupD2022-02-12T200014_EEA0.vib
│ ├── [ 3.0G]  AmiExpress backupD2022-02-13T200035_D080.vib
│ ├── [  37G]  AmiExpress backupD2022-02-14T200031_1E4F.vib
│ ├── [  40G]  AmiExpress backupD2022-02-15T200029_05DF.vib
│ ├── [  40G]  AmiExpress backupD2022-02-16T200027_5F53.vib
│ ├── [  40G]  AmiExpress backupD2022-02-17T200022_8C62.vib
│ ├── [  40G]  AmiExpress backupD2022-02-18T200019_99BF.vib
│ ├── [ 5.3G]  AmiExpress backupD2022-02-19T200018_D6E9.vib
│ ├── [ 3.1G]  AmiExpress backupD2022-02-20T200015_7B0A.vib
│ ├── [  38G]  AmiExpress backupD2022-02-21T200013_8F31.vib
│ ├── [  41G]  AmiExpress backupD2022-02-22T200035_CE44.vib
│ ├── [  43G]  AmiExpress backupD2022-02-23T202122_841E.vib
│ ├── [  38G]  AmiExpress backupD2022-02-24T200027_B6D5.vib
│ └── [ 403K]  AmiExpress backup.vbm
├── [ 1.6T]  Server backup
│ └── [ 1.6T]  server.company.local
│     ├── [ 1.1T]  Server backup - server.company.localD2021-12-26T200740_0C53.vbk
│     ├── [ 3.9G]  Server backup - server.company.localD2021-12-27T201729_43A8.vib
│     ├── [ 4.0G]  Server backup - server.company.localD2021-12-28T201724_73C9.vib
│     ├── [ 3.8G]  Server backup - server.company.localD2021-12-29T201717_1AF5.vib
│     ├── [  23G]  Server backup - server.company.localD2021-12-30T201713_B99E.vib
│     ├── [  24G]  Server backup - server.company.localD2021-12-31T201711_882D.vib
│     ├── [ 4.3G]  Server backup - server.company.localD2022-01-01T200838_417F.vib
│     ├── [  23G]  Server backup - server.company.localD2022-01-02T200812_24E1.vib
│     ├── [ 4.0G]  Server backup - server.company.localD2022-01-03T201744_9C6B.vib
│     ├── [  56G]  Server backup - server.company.localD2022-01-20T211144_F4D1.vib
│     ├── [ 9.5G]  Server backup - server.company.localD2022-01-21T205918_CE4C.vib
│     ├── [  29G]  Server backup - server.company.localD2022-01-22T200916_D480.vib
│     ├── [  23G]  Server backup - server.company.localD2022-01-23T200755_AD77.vib
│     ├── [ 6.3G]  Server backup - server.company.localD2022-01-24T201922_4A13.vib
│     ├── [ 6.9G]  Server backup - server.company.localD2022-01-25T202025_0F09.vib
│     ├── [ 6.1G]  Server backup - server.company.localD2022-01-26T202014_5B97.vib
│     ├── [ 8.3G]  Server backup - server.company.localD2022-01-27T202032_6201.vib
│     ├── [  24G]  Server backup - server.company.localD2022-01-28T202027_63D3.vib
│     ├── [ 4.5G]  Server backup - server.company.localD2022-01-29T202202_81ED.vib
│     ├── [ 3.6G]  Server backup - server.company.localD2022-01-30T200837_02D7.vib
│     ├── [  11G]  Server backup - server.company.localD2022-01-31T210347_0737.vib
│     ├── [ 6.4G]  Server backup - server.company.localD2022-02-01T204432_BEDE.vib
│     ├── [ 8.5G]  Server backup - server.company.localD2022-02-02T202557_877A.vib
│     ├── [  28G]  Server backup - server.company.localD2022-02-03T202504_2C5B.vib
│     ├── [ 8.7G]  Server backup - server.company.localD2022-02-04T204349_938D.vib
│     ├── [  22G]  Server backup - server.company.localD2022-02-05T203402_5862.vib
│     ├── [ 4.3G]  Server backup - server.company.localD2022-02-06T203336_80DB.vib
│     ├── [ 6.2G]  Server backup - server.company.localD2022-02-07T204441_6A34.vib
│     ├── [ 9.9G]  Server backup - server.company.localD2022-02-08T204133_FC26.vib
│     ├── [  12G]  Server backup - server.company.localD2022-02-09T203446_CF91.vib
│     ├── [  25G]  Server backup - server.company.localD2022-02-10T203523_81D2.vib
│     ├── [ 8.6G]  Server backup - server.company.localD2022-02-11T204803_2E2E.vib
│     ├── [ 3.9G]  Server backup - server.company.localD2022-02-12T203709_AF13.vib
│     ├── [ 3.8G]  Server backup - server.company.localD2022-02-13T203621_FF7D.vib
│     ├── [ 5.5G]  Server backup - server.company.localD2022-02-14T204900_79B1.vib
│     ├── [ 9.0G]  Server backup - server.company.localD2022-02-15T204859_CFBA.vib
│     ├── [ 8.4G]  Server backup - server.company.localD2022-02-16T202747_96F1.vib
│     ├── [  25G]  Server backup - server.company.localD2022-02-17T202610_D4CE.vib
│     ├── [ 8.6G]  Server backup - server.company.localD2022-02-18T204714_E970.vib
│     ├── [ 3.9G]  Server backup - server.company.localD2022-02-19T203716_36C3.vib
│     ├── [ 3.8G]  Server backup - server.company.localD2022-02-20T203713_41C3.vib
│     ├── [ 8.5G]  Server backup - server.company.localD2022-02-21T204903_1D24.vib
│     ├── [ 5.8G]  Server backup - server.company.localD2022-02-22T205010_31AD.vib
│     ├── [ 7.6G]  Server backup - server.company.localD2022-02-23T204858_BF99.vib
│     ├── [  26G]  Server backup - server.company.localD2022-02-24T202527_39F0.vib
│     └── [ 4.1M]  Server backup - server.company.local.vbm
├── [  30G]  Webserver (Fedora box)
│ ├── [  15G]  Webserver (Fedora box)D2021-12-26T200603_A3FD.vbk
│ ├── [ 289M]  Webserver (Fedora box)D2021-12-27T201555_1FEC.vib
│ ├── [ 282M]  Webserver (Fedora box)D2021-12-28T201551_EB8E.vib
│ ├── [ 276M]  Webserver (Fedora box)D2021-12-29T201545_C365.vib
│ ├── [ 277M]  Webserver (Fedora box)D2021-12-30T201540_5295.vib
│ ├── [ 281M]  Webserver (Fedora box)D2021-12-31T201536_4A94.vib
│ ├── [ 345M]  Webserver (Fedora box)D2022-01-01T200705_D028.vib
│ ├── [ 340M]  Webserver (Fedora box)D2022-01-02T200638_F5CE.vib
│ ├── [ 339M]  Webserver (Fedora box)D2022-01-03T201608_7318.vib
│ ├── [ 1.9G]  Webserver (Fedora box)D2022-01-20T210946_1CBE.vib
│ ├── [ 290M]  Webserver (Fedora box)D2022-01-21T205742_9778.vib
│ ├── [ 267M]  Webserver (Fedora box)D2022-01-22T200740_DDA0.vib
│ ├── [ 285M]  Webserver (Fedora box)D2022-01-23T200622_472B.vib
│ ├── [ 279M]  Webserver (Fedora box)D2022-01-24T201748_C47C.vib
│ ├── [ 340M]  Webserver (Fedora box)D2022-01-25T201850_D853.vib
│ ├── [ 344M]  Webserver (Fedora box)D2022-01-26T201842_3171.vib
│ ├── [ 343M]  Webserver (Fedora box)D2022-01-27T201858_2639.vib
│ ├── [ 338M]  Webserver (Fedora box)D2022-01-28T201851_7468.vib
│ ├── [ 337M]  Webserver (Fedora box)D2022-01-29T200733_9608.vib
│ ├── [ 340M]  Webserver (Fedora box)D2022-01-30T200703_07FD.vib
│ ├── [ 344M]  Webserver (Fedora box)D2022-01-31T210104_2652.vib
│ ├── [ 345M]  Webserver (Fedora box)D2022-02-01T204210_232D.vib
│ ├── [ 336M]  Webserver (Fedora box)D2022-02-02T202337_8D63.vib
│ ├── [ 341M]  Webserver (Fedora box)D2022-02-03T202244_F7D4.vib
│ ├── [ 314M]  Webserver (Fedora box)D2022-02-04T204126_6F9D.vib
│ ├── [ 322M]  Webserver (Fedora box)D2022-02-05T203144_8852.vib
│ ├── [ 327M]  Webserver (Fedora box)D2022-02-06T203116_2C9D.vib
│ ├── [ 318M]  Webserver (Fedora box)D2022-02-07T204219_DDDC.vib
│ ├── [ 321M]  Webserver (Fedora box)D2022-02-08T203912_971A.vib
│ ├── [ 321M]  Webserver (Fedora box)D2022-02-09T203209_9619.vib
│ ├── [ 307M]  Webserver (Fedora box)D2022-02-10T203303_3371.vib
│ ├── [ 309M]  Webserver (Fedora box)D2022-02-11T204540_B738.vib
│ ├── [ 302M]  Webserver (Fedora box)D2022-02-12T203450_475B.vib
│ ├── [ 302M]  Webserver (Fedora box)D2022-02-13T203401_FF52.vib
│ ├── [ 306M]  Webserver (Fedora box)D2022-02-14T204638_097B.vib
│ ├── [ 293M]  Webserver (Fedora box)D2022-02-15T204636_2099.vib
│ ├── [ 281M]  Webserver (Fedora box)D2022-02-16T202525_B577.vib
│ ├── [ 281M]  Webserver (Fedora box)D2022-02-17T202349_8289.vib
│ ├── [ 283M]  Webserver (Fedora box)D2022-02-18T204454_03D7.vib
│ ├── [ 341M]  Webserver (Fedora box)D2022-02-19T203453_54CF.vib
│ ├── [ 361M]  Webserver (Fedora box)D2022-02-20T203451_D4D5.vib
│ ├── [ 347M]  Webserver (Fedora box)D2022-02-21T204621_A7D0.vib
│ ├── [ 344M]  Webserver (Fedora box)D2022-02-22T204728_EF62.vib
│ ├── [ 376M]  Webserver (Fedora box)D2022-02-23T204638_33AC.vib
│ ├── [ 350M]  Webserver (Fedora box)D2022-02-24T202307_4271.vib
│ └── [ 417K]  Webserver (Fedora box).vbm
└── [  30G]  Z billing system
    ├── [  15G]  Z billing systemD2021-12-26T200259_D4A2.vbk
    ├── [ 324M]  Z billing systemD2021-12-27T201251_0E5C.vib
    ├── [ 314M]  Z billing systemD2021-12-28T201246_434B.vib
    ├── [ 350M]  Z billing systemD2021-12-29T201240_55AA.vib
    ├── [ 348M]  Z billing systemD2021-12-30T201235_78C6.vib
    ├── [ 343M]  Z billing systemD2021-12-31T201231_50BE.vib
    ├── [ 334M]  Z billing systemD2022-01-01T200337_548F.vib
    ├── [ 335M]  Z billing systemD2022-01-02T200310_2B10.vib
    ├── [ 337M]  Z billing systemD2022-01-03T201241_29A0.vib
    ├── [ 1.4G]  Z billing systemD2022-01-20T210555_EDD1.vib
    ├── [ 371M]  Z billing systemD2022-01-21T205414_2101.vib
    ├── [ 342M]  Z billing systemD2022-01-22T200412_2393.vib
    ├── [ 341M]  Z billing systemD2022-01-23T200254_8DF5.vib
    ├── [ 379M]  Z billing systemD2022-01-24T201419_FF81.vib
    ├── [ 376M]  Z billing systemD2022-01-25T201522_5340.vib
    ├── [ 378M]  Z billing systemD2022-01-26T201514_D39C.vib
    ├── [ 368M]  Z billing systemD2022-01-27T201530_5186.vib
    ├── [ 368M]  Z billing systemD2022-01-28T201523_8667.vib
    ├── [ 342M]  Z billing systemD2022-01-29T200406_DAB5.vib
    ├── [ 329M]  Z billing systemD2022-01-30T200334_80DC.vib
    ├── [ 428M]  Z billing systemD2022-01-31T205542_C779.vib
    ├── [ 302M]  Z billing systemD2022-02-01T203733_F4DC.vib
    ├── [ 304M]  Z billing systemD2022-02-02T201900_0C6A.vib
    ├── [ 301M]  Z billing systemD2022-02-03T201744_92A8.vib
    ├── [ 295M]  Z billing systemD2022-02-04T203651_03FB.vib
    ├── [ 257M]  Z billing systemD2022-02-05T202644_F6D2.vib
    ├── [ 265M]  Z billing systemD2022-02-06T202639_D617.vib
    ├── [ 296M]  Z billing systemD2022-02-07T203719_D7B6.vib
    ├── [ 300M]  Z billing systemD2022-02-08T203413_CF88.vib
    ├── [ 301M]  Z billing systemD2022-02-09T202709_29B4.vib
    ├── [ 300M]  Z billing systemD2022-02-10T202803_D19E.vib
    ├── [ 306M]  Z billing systemD2022-02-11T204040_0345.vib
    ├── [ 272M]  Z billing systemD2022-02-12T202950_13DC.vib
    ├── [ 269M]  Z billing systemD2022-02-13T202902_8B9B.vib
    ├── [ 295M]  Z billing systemD2022-02-14T204117_A5DF.vib
    ├── [ 302M]  Z billing systemD2022-02-15T204137_2724.vib
    ├── [ 300M]  Z billing systemD2022-02-16T202026_452C.vib
    ├── [ 298M]  Z billing systemD2022-02-17T201849_5EA1.vib
    ├── [ 299M]  Z billing systemD2022-02-18T203954_BD6C.vib
    ├── [ 272M]  Z billing systemD2022-02-19T202954_B27C.vib
    ├── [ 276M]  Z billing systemD2022-02-20T202951_FABF.vib
    ├── [ 302M]  Z billing systemD2022-02-21T204121_1667.vib
    ├── [ 304M]  Z billing systemD2022-02-22T204228_6AA2.vib
    ├── [ 358M]  Z billing systemD2022-02-23T204115_5C89.vib
    ├── [ 312M]  Z billing systemD2022-02-24T201807_FE84.vib
    └── [ 419K]  Z billing system.vbm

  3.2T used in 6 directories, 229 files

I take a snapshot of the dir with snapper before starting. I am using btrfs snapshots for extra safety, so that there are read-only snapshots that can’t be deleted over the network; I then let kopia clone one of those snapshots.
Because I don’t want kopia to back up a different directory name each time (.snapshots/3434, then .snapshots/3435, and so on, since the number increments with each snapshot), I have a script (run via --before-folder-action) that mounts the latest snapshot onto a fixed temporary directory. As far as Kopia is concerned it is always snapshotting the same files, which it is; but with btrfs the snapshots are all already mounted all the time, so I have to re-mount the latest one to a static path. I tried symlinks at first, but that didn’t work as intended. I am still monitoring and refining this, especially the umount and remount steps; I have hit a few unexpected ‘device is busy’ errors when trying to umount…

[root@backup ~]# cat ./update-btrfs-snapshot-for-kopia.sh
#!/bin/bash
if grep -qs '/mnt/backup/Veeam-latest-snapshot ' /proc/mounts
then
        for i in {1..5}
        do
                echo "Trying to unmount /mnt/backup/Veeam-latest-snapshot"
                if umount /mnt/backup/Veeam-latest-snapshot
                then
                        # Unmounted successfully; stop retrying.
                        break
                elif [ $i -lt 5 ]
                then
                        echo "Unmount attempt ${i} failed. Sleeping for 30s before retrying."
                        sleep 30s
                else
                        echo "Unmount failed after ${i} attempts. Giving up."
                        echo "Open files:"
                        lsof | grep latest-snapshot
                        exit 1
                fi
        done
fi
/usr/bin/snapper -c Veeam create -c timeline
if SNAPSHOTS=$(/usr/sbin/btrfs subvol list -o /mnt/backup/Veeam/.snapshots/ -t)
then
        LATEST_SS=$(echo "$SNAPSHOTS"|awk 'END {print "/mnt/backup/"$4}')
        echo "Mounting snapshot ${LATEST_SS}"
        mount --bind "$LATEST_SS" /mnt/backup/Veeam-latest-snapshot
fi

I should probably change the above script so that the final test actually checks whether the mount succeeded, rather than testing the btrfs subvol list command, which is a fairly pointless thing to test.
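
A sketch of how that could look, using the same paths as above but split into small functions so that the mount itself is what gets checked (treat it as a starting point, not a finished script):

```shell
#!/bin/bash
# Sketch: same paths as the script above, but verify the bind mount itself.

# Print the newest snapshot path, given `btrfs subvol list -t` output on stdin.
latest_snapshot() {
    awk 'END {print "/mnt/backup/" $4}'
}

# Bind-mount a snapshot onto the fixed path and fail loudly if it doesn't work.
mount_latest() {
    local ss="$1"
    if mount --bind "$ss" /mnt/backup/Veeam-latest-snapshot; then
        echo "Mounted ${ss}"
    else
        echo "ERROR: bind mount of ${ss} failed" >&2
        return 1
    fi
}
```

The main body would then be `mount_latest "$(/usr/sbin/btrfs subvol list -o /mnt/backup/Veeam/.snapshots/ -t | latest_snapshot)" || exit 1`.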

I also have an --after-folder-action to umount now; we’ll see how it goes. I’m still not sure why the device was busy yesterday when the cron job ran. The original snapper timeline timer might still be creating hourly snapshots, which I don’t want: the snapshot is supposed to happen just once in the morning, when there is no writing activity. The systemd timer for snapper timeline creation reverted to hourly after a system upgrade, so that could be the source of the problem. It’s also possible there was samba/smbd access to the directory (it is within the shared path, but only the Veeam box and a server have access to it).

My cron job, which I am now using after hitting the failures described at the start of this thread when running from the web UI, looks like this. As I said, I wish I could get logging similar to what the web UI shows: some sort of nice per-subfolder progress.

[root@backup ~]# cat run-kopia.sh
if kopia snapshot create /mnt/backup/Veeam-latest-snapshot --no-progress &> /tmp/kopia-log.txt
then
        cat /tmp/kopia-log.txt | mutt -s "Kopia Success" -- administrator@ourcompany.co.uk
else
        cp /root/.cache/kopia/cli-logs/latest.log /tmp/kopia-fail-log.txt
        cat /tmp/kopia-log.txt | mutt -s "Kopia FAIL" -a /tmp/kopia-fail-log.txt -- administrator@ourcompany.co.uk
        rm /tmp/kopia-fail-log.txt
fi
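
For completeness, here is a sketch of a slightly hardened version of that wrapper (same paths and addresses as above; the only real change is a unique mktemp log file, so overlapping or failed runs can’t clobber each other’s logs):

```shell
#!/bin/bash
# Sketch: same behaviour as run-kopia.sh above, but with a unique,
# always-cleaned-up log file instead of a fixed /tmp/kopia-log.txt.
run_and_mail() {
    local log
    log=$(mktemp /tmp/kopia-log.XXXXXX) || return 1
    if kopia snapshot create /mnt/backup/Veeam-latest-snapshot --no-progress &> "$log"
    then
        mutt -s "Kopia Success" -- administrator@ourcompany.co.uk < "$log"
    else
        mutt -s "Kopia FAIL" -a /root/.cache/kopia/cli-logs/latest.log \
            -- administrator@ourcompany.co.uk < "$log"
    fi
    rm -f "$log"
}
```

The cron entry would then just call `run_and_mail`.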