So, I’m self-hosting Immich. The issue is that we tend to take a lot of pictures of the same scene or subject so we can pick the best one later, which means we often end up with 5–10 photos that are basically duplicates, but not quite.
Some duplicate-finding programs rate those images at 95% similarity or higher.
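(For reference, here’s roughly the kind of perceptual-hash check those programs do; this is just a sketch using the Pillow and imagehash libraries, and the filenames are placeholders.)

```python
# Sketch: score two "almost duplicate" photos the way many
# duplicate finders do, with a perceptual hash (pHash).
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

h1 = imagehash.phash(Image.open("IMG_0001.jpg"))  # placeholder names
h2 = imagehash.phash(Image.open("IMG_0002.jpg"))

# Hamming distance between the 64-bit hashes; 0 = visually identical
distance = h1 - h2
similarity = 1 - distance / h1.hash.size  # rough 0..1 score
print(f"distance={distance}, similarity={similarity:.0%}")
```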

I’m wondering if there’s any way, probably at the filesystem level, for those near-identical images to be compressed together.
Maybe deduplication?
Have any of you guys handled a similar situation?

  • Admiral Patrick · 2 months ago

    Not sure if a de-duplicating filesystem would help with that or not. It depends, I guess, on whether the similar images actually share any data at the block level. You can get a rough feel for that with the sketch below.
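    Something like this, if you’re curious: chunk two of the photos into fixed-size blocks (128 KiB, ZFS’s default recordsize) and count how many are bit-identical, since block-level dedup can only reclaim those. Just a sketch, not exactly how ZFS accounts for it:

    ```python
    # Sketch: what fraction of fixed-size blocks do two files share?
    # Block-level dedup only saves space on bit-identical blocks.
    import hashlib
    import sys

    BLOCK = 128 * 1024  # 128 KiB, ZFS's default recordsize

    def block_hashes(path):
        hashes = set()
        with open(path, "rb") as f:
            while chunk := f.read(BLOCK):
                hashes.add(hashlib.sha256(chunk).digest())
        return hashes

    a = block_hashes(sys.argv[1])
    b = block_hashes(sys.argv[2])
    print(f"{len(a & b)} of {len(a | b)} unique blocks are identical")
    ```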

    Maybe try setting up a small test ZFS pool, enabling dedup, adding some similar images, and then checking the dedup ratio? If that works, you can plan a more permanent setup on ZFS (or another filesystem that supports de-duplication) to hold your images. Something like the sketch below.
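    (Pool name, backing file, and photo path here are placeholders; needs root and the ZFS userland tools installed.)

    ```python
    # Sketch: throwaway file-backed ZFS pool with dedup enabled.
    # Copy some near-duplicate photos in, then read the dedup ratio.
    import shutil
    import subprocess
    from pathlib import Path

    IMG = "/tmp/zfs-test.img"                 # backing file (placeholder)
    POOL = "testpool"                         # placeholder pool name
    PHOTOS = Path("/path/to/similar/photos")  # placeholder source dir

    def run(*cmd):
        subprocess.run(cmd, check=True)

    run("truncate", "-s", "2G", IMG)     # sparse 2 GiB backing file
    run("zpool", "create", POOL, IMG)    # pool mounts at /testpool
    run("zfs", "set", "dedup=on", POOL)  # enable block-level dedup

    for photo in PHOTOS.iterdir():       # copy the sample images in
        shutil.copy(photo, f"/{POOL}/")
    run("sync")                          # flush writes before measuring

    # dedupratio > 1.00x means dedup actually reclaimed space
    ratio = subprocess.run(
        ["zpool", "list", "-H", "-o", "dedupratio", POOL],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"dedup ratio: {ratio}")

    run("zpool", "destroy", POOL)        # tear down the test pool
    Path(IMG).unlink()
    ```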

      • Admiral Patrick · 2 months ago

        That’s what I was thinking, but I wasn’t sure enough to say more than “give it a shot and see”.

        There might also be some savings to be had by enabling compression, though that depends on what format the images are in to start with. If they’re already in a compressed format like JPEG, trying to compress them further at the filesystem level would probably just waste CPU.
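        Quick sanity check for that (filename is a placeholder, and zlib just stands in for the filesystem’s compressor): run a general-purpose compressor over one photo and see how little it shrinks.

        ```python
        # Sketch: estimate whether filesystem-level compression would help.
        # Already-compressed formats (JPEG, HEIC) usually land near 1.0x.
        import zlib
        from pathlib import Path

        data = Path("IMG_0001.jpg").read_bytes()  # placeholder filename
        compressed = zlib.compress(data, level=6)
        ratio = len(data) / len(compressed)
        print(f"{len(data)} -> {len(compressed)} bytes ({ratio:.2f}x)")
        ```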