On 10/27/18 8:10 PM, Carlos E. R. wrote:
Hi,
I just did a backup of my main laptop using rsync:
rsync --archive --acls --xattrs --hard-links \
      --sparse --stats --human-readable --checksum \
      --link-dest=/mnt/backup/previous/home \
      /mnt/home/ \
      /mnt/backup/current/home
The idea is that files that exist in the previous backup are hard-linked into the current backup, saving space. But how much did I save? How many files and how many bytes are hardlinks?
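One way to answer that directly is to compare inodes between the two trees: every file that --link-dest hard-linked into the current backup shares its inode with the file in the previous one. A rough sketch (assuming GNU findutils/coreutils, the paths from the rsync command above, and both trees on the same filesystem; the temporary file name is arbitrary):

prev=/mnt/backup/previous/home
curr=/mnt/backup/current/home

# inode numbers of all regular files in the previous backup
find "$prev" -xdev -type f -printf '%i\n' | sort -u > /tmp/prev.inodes

# files in the current backup whose inode also appears in the previous
# backup, i.e. files that were hard-linked instead of copied
find "$curr" -xdev -type f -printf '%i %s\n' | sort -k1,1 \
    | join /tmp/prev.inodes - \
    | awk '{n++; b+=$2} END {printf "%d hard-linked files, %.1f GiB shared\n", n, b/2^30}'

(Names that are multiply hard-linked within the current backup itself get counted once per name, so treat the result as an estimate.)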
'du' doesn't count hardlinked files twice, so counting e.g. my rolling snapshots gives me:

$ du -shxc monthly.{11..0} weekly.{3..0} daily.{7..0} hourly.{23..0}
75G     monthly.11
2.0G    monthly.10
2.3G    monthly.9
2.2G    monthly.8
2.4G    monthly.7
14G     monthly.6
2.6G    monthly.5
3.9G    monthly.4
2.6G    monthly.3
7.8G    monthly.2
2.4G    monthly.1
3.0G    monthly.0
48M     weekly.3
2.0G    weekly.2
49M     weekly.1
1.5G    weekly.0
108M    daily.7
1.7G    daily.6
1.2G    daily.5
48M     daily.4
48M     daily.3
49M     daily.2
50M     daily.1
1.6G    daily.0
43M     hourly.23
44M     hourly.22
554M    hourly.21
1.7G    hourly.20
1.5G    hourly.19
1.5G    hourly.18
45M     hourly.17
1.7G    hourly.16
1.6G    hourly.15
51M     hourly.14
1.7G    hourly.13
1.7G    hourly.12
49M     hourly.11
1.8G    hourly.10
1.7G    hourly.9
48M     hourly.8
1.6G    hourly.7
48M     hourly.6
48M     hourly.5
48M     hourly.4
48M     hourly.3
49M     hourly.2
50M     hourly.1
51M     hourly.0
145G    total

And given that I also create an *.idx file for each of the above snapshots (which are 860M in total), the output of 'df' matches quite well:

$ df -h --o=used .
Used
146G
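As a small demonstration of that hard-link accounting (a made-up demo/ directory with one 10 MiB file, assuming GNU coreutils):

mkdir -p demo/a demo/b
dd if=/dev/zero of=demo/a/file bs=1M count=10 status=none
ln demo/a/file demo/b/file       # second hard link to the same inode

du -sh demo/a demo/b             # demo/b shows ~0: inode already counted under demo/a
du -sh demo/a ; du -sh demo/b    # ~10M each: separate runs count the file twice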
(I also have doubts about which GB units the tools are using.)
$ man df | grep -A1 -- --human-readable
       -h, --human-readable
              print sizes in powers of 1024 (e.g., 1023M)

Have a nice day,
Berny