I'm slowly seeding a NAS backup to Google Cloud Storage with rclone. The standard process is to "rclone sync" a big directory and then run "rclone check" before moving on to the next one.
As of yesterday, "rclone check" started reporting missing files in the Google Cloud Storage bucket, and the list of missing files was different on each invocation.
I was sure the problem was with rclone, but then I ran gsutil on a VM instance in the cloud, and "gsutil ls" exhibited the same behavior!
Here's an example (the data in the bucket has not been modified in over 24 hours):
vitaly@data-exporter:~$ gsutil ls -lR gs://bucketname/Photography/2016 |wc
8817 31667 1034417
vitaly@data-exporter:~$ gsutil ls -lR gs://bucketname/Photography/2016 |wc
8810 31643 1033605
vitaly@data-exporter:~$ gsutil ls -lR gs://bucketname/Photography/2016 |wc
8813 31656 1033965
vitaly@data-exporter:~$ gsutil ls -lR gs://bucketname/Photography/2016 |wc
8818 31671 1034544
vitaly@data-exporter:~$ gsutil ls -lR gs://bucketname/Photography/2016 |wc
8816 31664 1034294
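One way to narrow this down is to capture two consecutive listings, sort them, and diff them with "comm" to see exactly which objects come and go between runs, rather than just counting lines with wc. A sketch (the printf lines fake two listings that differ by one object, standing in for real gsutil output; the commented gsutil commands use the bucket path from above):

```shell
# On the VM, the two captures would be:
#   gsutil ls -R gs://bucketname/Photography/2016 | sort > listing1.txt
#   gsutil ls -R gs://bucketname/Photography/2016 | sort > listing2.txt
# For illustration only, fake two listings that differ by one object:
printf 'a.jpg\nb.jpg\nc.jpg\n' > listing1.txt
printf 'a.jpg\nc.jpg\n' > listing2.txt

# comm -3 suppresses lines common to both files, leaving only objects
# that appear in one listing but not the other (inputs must be sorted)
comm -3 listing1.txt listing2.txt
```

If the flickering objects are always different, that points at the listing itself being inconsistent; if it's always the same files, those specific objects are worth inspecting.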
I am using the Nearline storage class. Regardless of the storage class, shouldn't listing the bucket contents return the same number of files every time?
I would appreciate some ideas. This was working as expected last week.
P.S. I enabled versioning recently, but did not see these issues last week with versioning enabled.