After upgrading to 0.9.7, Kopia UI displays the following error.
In an effort to see what is going on, I have tried to connect via the CLI:
kopia repo connect rclone --remote-path=onedrive-personal:backup --log-level=debug --log-dir=c:\temp\kopia\
…which generates the following output:
generating new TLS certificate
adding alternative IP to certificate: 127.0.0.1
starting C:\Program Files\Rclone\rclone.exe
detected webdav address: https://127.0.0.1:20767/
Enter password to open repository:
Creating cache directory 'C:\Users\Chris\AppData\Local\kopia\34920e150ee5b318' with max size 5242880000
generating new TLS certificate
adding alternative IP to certificate: 127.0.0.1
starting C:\Program Files\Rclone\rclone.exe
detected webdav address: https://127.0.0.1:20780/
finished sweeping content cache in 831.2µs and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 37.7296ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 611.5µs and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 34.0402ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.008ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 34.0696ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 646.3µs and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 39.2453ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 962.3µs and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 41.5607ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.264ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 35.7785ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.3697ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 33.3384ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.168ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 37.4939ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.0684ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 34.2747ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.0632ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 33.3595ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.0343ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 32.1488ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.1721ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 45.0539ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 1.218ms and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 34.5785ms and retained 2023429987/5242880000 bytes (38 %)
finished sweeping content cache in 561.2µs and retained 490860/5242880000 bytes (0 %)
finished sweeping metadata cache in 43.0925ms and retained 2023429987/5242880000 bytes (38 %)
Attempting to repair the repo has no obvious effect. It isn't the end of the world for me if I must destroy the repo and rebuild it from scratch, but naturally this has me wondering whether the issue will recur.
I would very much appreciate any input. Is this recoverable?