[Techtalk] interesting photo management problem-- weeding out duplicates

Carla Schroder carla at bratgrrl.com
Wed Apr 30 16:42:22 UTC 2014


On Mon, 28 Apr 2014 08:44:24 +0200
Miriam Ruiz <miriam at debian.org> wrote:

> 2014-04-28 5:55 GMT+02:00 Carla Schroder <carla at bratgrrl.com>:
> > On Mon, 28 Apr 2014 00:56:00 +0200
> > Miriam Ruiz <miriam at debian.org> wrote:
> >
> >> 2014-04-28 0:40 GMT+02:00 Carla Schroder <carla at bratgrrl.com>:
> >>
> >> > Thoughts? Brainstorms? I also looked at the Organize command
> >> > included in Exiv2, but I couldn't get it to build on my system.
> >>
> >> I kind of do more or less the same, but using the program fdupes:
> >>
> >> b=""
> >> fdupes --recurse tmp/ | \
> >>   while IFS= read -r f
> >>   do
> >>     if [ "$f" = "" ]
> >>     then
> >>       b=""
> >>     elif [ "$b" = "" ]
> >>     then
> >>       b="$f"
> >>     else
> >>       rm "$f" && echo "Removed \"$f\""
> >>       ln -v "$b" "$f"
> >>     fi
> >>   done
> >>
> >
> > What is the purpose of creating the hardlink?
> 
> Well, that was my particular choice. It could be a symlink instead;
> the file could simply be removed; it could be moved somewhere else
> and symlinked back into your file structure; or the duplicates could
> be indexed in a database. My purpose was to reclaim the space
> occupied by the redundant files (which happened to be mostly songs,
> films and photos) while keeping them where they were. Of course, they
> were all on the same filesystem, since hardlinks cannot cross
> filesystems.
> 
> Greetings,
> Miry

Thank you, now I get it!
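For the archives, here is a minimal sketch of why the loop works. fdupes prints each group of duplicate files as filenames separated by a blank line; the loop keeps the first file in each group and replaces the rest with hardlinks to it. The snippet below simulates that output with printf against a throwaway temp directory (all paths here are made up for the demo), then checks the link count with GNU stat:

```shell
# Simulate one fdupes group: two identical files, newline-separated,
# with a trailing blank line terminating the group.
tmpdir=$(mktemp -d)
printf 'photo data' > "$tmpdir/a.jpg"
cp "$tmpdir/a.jpg" "$tmpdir/b.jpg"

printf '%s\n%s\n\n' "$tmpdir/a.jpg" "$tmpdir/b.jpg" | \
  while IFS= read -r f
  do
    if [ "$f" = "" ]; then
      b=""                        # blank line: group finished, reset
    elif [ "${b:-}" = "" ]; then
      b="$f"                      # first file of the group: keep it
    else
      rm "$f" && ln "$b" "$f"     # duplicate: replace with a hardlink
    fi
  done

stat -c %h "$tmpdir/a.jpg"        # link count: both names, one inode
rm -rf "$tmpdir"
```

After the loop, both names refer to a single inode, so the data is stored only once; that is the space saving Miriam describes, and it is also why everything must live on one filesystem.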

Carla

-- 
++++++++++++++++++++++++++++++++++++++++
Ace Linux guru                         +
carlaschroder.com                      +
Everything mortal has moments immortal +
++++++++++++++++++++++++++++++++++++++++
