[geeks] listing identical files
Phil Stracchino
alaric at caerllewys.net
Fri Nov 19 13:33:00 CST 2004
On Fri, Nov 19, 2004 at 02:24:22PM -0500, Phil Stracchino wrote:
> This is a little involved and messy, but:
>
> md5sum * 2>/dev/null | sort > foo
> cat foo | cut -d' ' -f1 > foo2
> cat foo2 | sort -u > foo3
> for f in `diff foo2 foo3 | cut -d' ' -f2`; do grep $f foo; done
>
> should show you all duplicates. It's up to you to determine from there
> which are hardlinks, which are symlinks, and which are duplicates.
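For what it's worth, once you have a group of files with matching checksums,
comparing inode numbers is a quick way to tell hardlinks from separate
copies. A rough sketch (the file and directory names are just placeholders):

    ls -li file1 file2                          # same inode in column 1 = hardlinks
    find <starting-directory> -samefile file1   # list every hardlink to file1 (GNU find)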
To apply this to more than a single directory at a time, by the way,
change the first line to:
find <starting-directory> -type f 2>/dev/null | xargs md5sum | sort > foo
It's still quick and dirty, of course.
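If you have GNU coreutils handy, the whole thing can also be collapsed into
one pipeline by letting uniq compare only the 32-character checksum column
and print every repeated line. Just a sketch, not tested here:

    find <starting-directory> -type f -print0 2>/dev/null \
        | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate

That prints each group of files with identical checksums separated by blank
lines, and the -print0/-0 pair keeps filenames with spaces from getting
mangled.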
--
========== Fight Back! It may not be just YOUR life at risk. ==========
alaric at caerllewys.net : phil-stracchino at earthlink.net : phil at novylen.net
phil stracchino : unix ronin : renaissance man : mystic zen biker geek
2000 CBR929RR, 1991 VFR750F3 (foully murdered), 1986 VF500F (sold)
Linux Now! ...Friends don't let friends use Microsoft.