If you don’t spot any recursion issues, I’d suggest looking for other problems rather than spending too much time here. At least now you have some troubleshooting knowledge going forward. Best of luck figuring out the issue.
Comment on: When using rsync to backup my /home folder to an external 1TB SSD, I run out of space, how??
sbeak@sopuli.xyz 16 hours ago
Having a quick scroll through the output file (neat tip with the > to get a text file, thanks!), nothing immediately jumps out at me. There aren’t any repeated folders or anything like that at a glance. Anything I should look out for?
confusedpuppy@lemmy.dbzer0.com 16 hours ago
bleistift2@sopuli.xyz 16 hours ago
You checked 385GiB of files by hand? Is that size made up of a few humongously large files?
I suggest using uniq to check if you have duplicate files in there (uniq’s input must be sorted first). If you still have the output file from the previous step, and it’s called rsync-output.txt, do sort rsync-output.txt | uniq -dc. This will print the duplicates and the number of their occurrences.
sbeak@sopuli.xyz 16 hours ago
when using uniq nothing is printed (I’m assuming that means no duplicates?)
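For anyone following along, empty output from that pipeline does indeed mean no duplicated lines. Here is a self-contained sketch of the same check against a made-up sample file, so you can see what duplicates would look like (the file name and paths below are invented for illustration):

```shell
# Build a small sample "rsync output" file with one repeated path.
printf '%s\n' home/a.txt home/b.txt home/a.txt > rsync-output.txt

# Sort so identical lines become adjacent, then print only the
# duplicated lines (-d) together with their occurrence counts (-c).
sort rsync-output.txt | uniq -dc
# Prints a line like "      2 home/a.txt"; if every line were
# unique, the pipeline would print nothing at all.
```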
bleistift2@sopuli.xyz 16 hours ago
I’m sorry. I was stupid. If you had duplicates due to a file system loop or symlinks, they would all be under different names. So you wouldn’t be able to find them with this method.
sbeak@sopuli.xyz 16 hours ago
Running du with --count-links, as suggested by another user, also returns 384G (so hard links aren’t the problem, it seems)
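That’s a sound conclusion: du normally counts hard-linked blocks once, while --count-links counts them once per link, so identical numbers mean hard links aren’t inflating the tree. A throwaway demo of the difference (paths and sizes are invented; --count-links is GNU du):

```shell
# Create a 1 MiB file and a second hard link to the same inode.
demo=$(mktemp -d)
dd if=/dev/zero of="$demo/data" bs=1024 count=1024 2>/dev/null
ln "$demo/data" "$demo/data-hardlink"

du -sk "$demo"                 # shared blocks counted once (~1 MiB)
du -sk --count-links "$demo"   # counted once per link (~2 MiB)

rm -r "$demo"
```

If both numbers match on the real tree, as they did here, hard links are not the cause of the apparent size.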
sbeak@sopuli.xyz 16 hours ago
Ok then, that makes sense