Comment on When using rsync to backup my /home folder to an external 1TB SSD, I run out of space, how??
bleistift2@sopuli.xyz 15 hours ago
You checked 385GiB of files by hand? Is that size made up of a few humongously large files?
I suggest using uniq to check if you have duplicate files in there. (uniq’s input must be sorted first). If you still have the output file from the previous step, and it’s called rsync-output.txt, do sort rsync-output.txt | uniq -dc. This will print the duplicates and the number of their occurrences.
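Something like this, if you also want the most-repeated entries listed first (just a sketch; rsync-output.txt stands for whatever you named that file):

    sort rsync-output.txt | uniq -dc | sort -rn
    # uniq -d keeps only repeated lines and -c prefixes each with its count;
    # the trailing sort -rn puts the most frequent duplicates on top.
    # No output at all means no line appears twice.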
sbeak@sopuli.xyz 15 hours ago
When using uniq, nothing is printed (I’m assuming that means no duplicates?)
bleistift2@sopuli.xyz 14 hours ago
I’m sorry. I was stupid. If you had duplicates due to a file system loop or symlinks, they would all be under different names. So you wouldn’t be able to find them with this method.
sbeak@sopuli.xyz 14 hours ago
Running du with --count-links, as another user suggested, returns 384G (so it seems that isn’t the problem)
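(Roughly this comparison, though my exact flags may have differed:)

    du -sh --count-links /home   # every hard link counted separately
    du -sh /home                 # hard-linked files counted once (the default)
    # if these two totals match, hard links aren't what's inflating the size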
bleistift2@sopuli.xyz 14 hours ago
du --count-links only counts hard-linked files multiple times. I assumed you had a symlink loop that rsync would have tried to unwrap. For instance:
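Suppose two directories each contain a symlink pointing back at the other (a sketch; rsync only follows such links if asked to, e.g. with -L/--copy-links):

    mkdir foo bar
    ln -s ../bar foo/bar    # foo/bar points back at the bar directory
    ln -s ../foo bar/foo    # bar/foo points back at the foo directory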
If you tried to rsync that, you’d end up with the directories
foo, bar, foo/bar, bar/foo, foo/bar/foo, bar/foo/bar, foo/bar/foo/bar, ad infinitum, in the target directory.
sbeak@sopuli.xyz 14 hours ago
Ok then, that makes sense