Finding (and removing) duplicate files on your hard drive

I generally hold to the philosophy that hard drive space is cheap, and your time is too valuable to waste on optimising hard drive space.

But one of those fun holiday activities, reserved for times when procrastination is at its peak, is to thoroughly clean up a hard drive and make extra room available.

My usual technique is to use SpaceSniffer (found courtesy of Scott Hanselman's tool list) but this time around I suspected that the biggest waste of space was caused by duplicate files (particularly music and photos) taking up a lot of space.

When confronted with a simple problem, the smart guys look for pre-existing solutions. But not me.

I like to employ something I call the 'my way is the best way' philosophy. Other people call it 'not invented here' syndrome, but I prefer to call it 'my way is the best way' because... well, my way is the best way.

(Image: thinking about duplicate files)

Analysis is more fun than Action

Most of the duplicate-finding tools in this category have a feature where they will automatically delete all but one copy of each duplicate file found. That's not something I'm willing to do, at least not automatically. What I wanted was to create the full list of files and then analyse it, for example in NimbleText. I wanted to create the list of files and then stand back, thoughtfully stroking my long beard, just like Pai Mei from Kill Bill.

So I embarked on a special project, codenamed Dinomopabot, a name recommended by my 5-year-old daughter, who is very clever at these things. The final result is now named 'Dupes.exe': a command line tool for finding duplicate files on your hard drive.

You can browse, clone or fork the source code at Bitbucket:

'Dupes' source code

Or download the executable, ready for use:

Download 'dupes.exe'

Here's the built-in help text:

Dupes Find duplicate files, by calculating checksums.

Usage: Dupes.exe [options]
Tip: redirect output to a .csv file, and manipulate with NimbleText.

  -p, --path=VALUE           the folder to scan
  -s, --subdirs              include subdirectories
  -f, --filter=VALUE         search filter (defaults to *.*)
  -a, --all                  show ALL checksums of files, even non-copies
  -?, -h, --help             show this message and exit

For each file it encounters, Dupes generates a SHA-256 checksum, with which to compare files. They're short and catchy; they look like this (this one happens to be the SHA-256 of an empty file):

    e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855

Cute hey? Almost adoption-worthy.
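Dupes itself is a .NET command line tool, but the checksumming idea translates to almost any language. Here's a rough Python sketch of the same idea (the function name and chunk size are my own, not taken from Dupes' source):

```python
import hashlib

def file_checksum(path, chunk_size=65536):
    """Compute the SHA-256 hex digest of a file, reading in chunks
    so large music/photo files don't have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
    return digest.hexdigest()
```

Two files with the same digest are, for all practical purposes, duplicates.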

And for every member of a duplicate file set that the tool encounters, it spits out a row with four columns, separated by bar symbols ('|').

The four columns are:

CheckSum       Sha256 checksum of the file. (Hint: sort by this to get all duplicates together)
DuplicateNum   0 for the first file in the duplicate set, 1 for the second file, etc.
Filesize       In bytes. (Hint: sort by this, if you want to tackle big files first)
Path           Full path and filename for this duplicate.
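If you wanted to script the same output shape yourself, the grouping logic is straightforward. Here's a hedged Python sketch that mirrors the four-column format described above (it follows the description, not Dupes' actual source, and the names are my own):

```python
import hashlib
import os
from collections import defaultdict

def emit_duplicate_rows(paths, show_all=False):
    """Group files by SHA-256 checksum and yield
    'CheckSum|DuplicateNum|Filesize|Path' rows."""
    groups = defaultdict(list)
    for path in paths:
        with open(path, "rb") as f:
            checksum = hashlib.sha256(f.read()).hexdigest()
        groups[checksum].append(path)
    for checksum, members in groups.items():
        # By default only emit sets with 2+ members; show_all mimics
        # the --all switch, which lists every checksum.
        if len(members) < 2 and not show_all:
            continue
        for dup_num, path in enumerate(members):
            yield f"{checksum}|{dup_num}|{os.path.getsize(path)}|{path}"
```

Sorting the resulting rows by the first column gathers each duplicate set together, exactly as the hint above suggests.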

So you run dupes.exe and redirect the output into a text file (using > [filename]). From there you can manipulate it (with NimbleText, for example) to create a batch file that carefully deletes all the hand-picked, unwanted duplicates of your choice.

Here's an example of a NimbleText pattern you could use with the output of Dupes. This will create a batch file that deletes all but the first copy of each file:

<% if ($1 > 0) { 'del "' + $3 + '"' } %>

That pattern is just a piece of embedded javascript (you can embed javascript in NimbleText patterns) that says "if column 1 is greater than zero, then output the text 'del ' plus the quoted text from column 3." NimbleText columns are zero-based, so column 1 ($1) is the duplicate number, which will be greater than zero for all but the first instance of the file. And column 3 ($3) is the full path and filename of the duplicate, wrapped in double quotes so that paths containing spaces survive.
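And if NimbleText isn't to hand, the same filter is a few lines of script over the saved output. A sketch in Python, assuming the four-column format described above (the function name is mine, purely illustrative):

```python
def dupes_to_batch(lines):
    """Turn Dupes' pipe-separated rows into 'del' commands for every
    copy after the first (DuplicateNum > 0), quoting each path so
    filenames with spaces don't break the batch file."""
    commands = []
    for line in lines:
        checksum, dup_num, size, path = line.split("|", 3)
        if int(dup_num) > 0:
            commands.append(f'del "{path}"')
    return commands
```

Review the resulting commands by eye before running them; deleting files is the one step that shouldn't be fully automated.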

Thank you. I hope someone finds this thing useful. Also, please imagine suitably gigantic and terrifying disclaimers attached to this code. I wrote it after all.

