[Techtalk] interesting photo management problem-- weeding out duplicates

John Sturdy jcg.sturdy at gmail.com
Tue Apr 29 12:05:08 UTC 2014

I have a stalled attempt at doing something similar, with the
complication that I used to resize photos to save space when I had a
smaller disk, but left the originals around to delete later if I
actually ran out of space; now I want to find the best version of
each.  So, working on the assumptions that I don't take more than one
photo per second, and that resizing a photo doesn't change the
"Date and Time (Original)" field of the EXIF data, I'm basing it on
applying this:

   exif -m "$1" | grep -i original

to each jpg file, collecting all the results, sorting them by time,
and then picking the largest file for each timestamp.

However, I haven't got as far as a complete working script for this.
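The selection step (keep only the largest file per timestamp) could be roughed out as below. This is only a sketch of the approach described above, not a tested script: the tab-separated "timestamp, size, path" input format is my own invention, and `stat -c%s` is GNU-specific (BSD/macOS `stat` uses `-f%z`).

```shell
#!/bin/sh
# Sketch: keep the largest file for each EXIF timestamp.
#
# Assumes one line per photo on stdin, tab-separated:
#   <timestamp> TAB <size-in-bytes> TAB <path>
# which might be produced by something like (hypothetical, GNU stat):
#   for f in *.jpg; do
#     ts=$(exif -m "$f" | grep -i original | head -1 | cut -f2)
#     printf '%s\t%s\t%s\n' "$ts" "$(stat -c%s "$f")" "$f"
#   done
#
# Sort by timestamp, then by size descending; awk then prints only the
# first (i.e. largest) path it sees for each distinct timestamp.
sort -t"$(printf '\t')" -k1,1 -k2,2nr |
awk -F'\t' '!seen[$1]++ { print $3 }'
```

The `!seen[$1]++` idiom relies on awk arrays defaulting to zero: the condition is true only the first time a given timestamp appears, so after the size-descending sort that first occurrence is the largest file.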
