Whenever I wipe my PC, I use tar to make an archive of the whole system. This works, but having to decompress the whole archive to pull files out is very annoying. Is there another archive format that:

  • Preserves permissions (i.e., is Unix-y)
  • Supports strong compression (I use either zstd or xz depending on how long I can be bothered to wait)
  • Supports pulling out individual files quickly
  • @[email protected]

Borg might be a possibility. However, I have not yet backed up an entire system with it, only selected files.

    • The file permissions have always been correct when restoring files in my case.
    • Which compression (LZ4, zlib, LZMA or zstd) and which compression level is used can be specified when creating a backup.
• Backups can be mounted via FUSE, so you can restore individual files with a file manager or a terminal emulator, for example.
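A minimal Borg workflow along those lines might look like this (repository path, archive name, and mount point are placeholders; requires root for a full-system backup):

```shell
# Create a repository once. --encryption=none skips passphrase handling;
# use repokey in practice.
borg init --encryption=none /mnt/backup/repo

# Back up the system root with zstd level 9; the compression algorithm
# and level are chosen per backup, as the comment above notes.
borg create --compression zstd,9 --one-file-system \
    /mnt/backup/repo::system-{now} /

# Mount the archive via FUSE and copy out individual files.
mkdir -p /tmp/borgmnt
borg mount /mnt/backup/repo::system-2024-01-01 /tmp/borgmnt
cp /tmp/borgmnt/etc/fstab /tmp/restored-fstab
borg umount /tmp/borgmnt
```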
    • @dbrand666

      Look at Restic too. Similar feature set. Really simple to set up (I think Borg is too but I haven’t tried it).

  • @[email protected]

    At least on the Mac (bsdtar) you can extract single files out of a tar file.

    E.g.,

    Create the tar file:

    tar cvzf pseudo.tgz pseudo/

Move to another directory:

    cd /tmp/tt

    Extract a single file:

    tar -xf ../pseudo.tgz pseudo/10481_2017.1069.png

You say PC, so you might want to check the tar version you are using and see whether it needs extra parameters for single-file extraction.
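If you don't know the exact member path inside the archive, you can list it first; both GNU tar and bsdtar support this (archive and file names below follow the example above):

```shell
# List archive members, filtering for the file you want.
tar -tzf pseudo.tgz | grep 10481

# Extract just that member, dropping the leading directory if desired.
tar -xzf pseudo.tgz --strip-components=1 pseudo/10481_2017.1069.png
```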

  • @[email protected]

    Take a look at squashfs. This creates a compressed archive that can be mounted as a read-only filesystem to pull out individual files. It is very fast and likely already installed on your system.
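A sketch of that workflow, assuming a recent squashfs-tools with zstd support (paths and exclude list are illustrative; a full-system image needs root):

```shell
# Build a compressed image of the root filesystem, excluding virtual filesystems.
mksquashfs / /mnt/backup/system.sqsh -comp zstd -Xcompression-level 19 \
    -e proc sys dev run tmp

# Mount it read-only and pull out individual files.
mount -t squashfs -o loop,ro /mnt/backup/system.sqsh /mnt/sqsh
cp /mnt/sqsh/etc/fstab /tmp/restored-fstab

# Or extract a single file without mounting.
unsquashfs -dest /tmp/out /mnt/backup/system.sqsh etc/fstab
```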

  • @[email protected]

You don’t need to extract the whole thing if you use tar. The reason you have to here is that you compress it with zstd/xz on top.

    Use tar as is. It’s what it’s made for.

    • @dbrand666

      Tar has to scan the whole archive to find the file you want to extract. That’s why it’s slow. Compression doesn’t really change that.

      As for what tar is made for, that would be archiving directly to tape.

  • @[email protected]
• mksquashfs - result can be mounted as a read-only fs. Automatic deduplication (files with exactly the same content occupy the same data block). Uses smaller block sizes for compression, so ratios might not be as good.
    • fsarchiver - kind of like tar but indexes and stores fs info, and I’m pretty sure it allows picking specific files out
  • calm.like.a.bomb

    When I wipe my PC I always use Clonezilla. I have a separate /home partition and I usually copy /etc inside my user’s home directory just before the cloning. I’d say you should give it a try.

  • @[email protected]

    Dar

    Tar = Tape ARchive

    DAR = Disk Archive

It is supposed to replace tar when storing on random-access media, since tar isn’t random access. It offers compression and encryption options.
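A basic dar session might look like this (archive basename and paths are placeholders; dar writes numbered slices like home-backup.1.dar):

```shell
# Create a gzip-compressed archive of /home.
dar -c home-backup -R /home -z

# List the catalogue; dar keeps an index, so this does not scan all the data.
dar -l home-backup

# Extract only one subtree (relative to the -R root) into the current directory.
dar -x home-backup -g user/Documents
```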

  • @[email protected]

    Borg or restic since they do deduplication.

    My biggest data regret is rsync-ing or tar-ing up my systems to my fileserver as a backup mechanism. So much wasted space. Extremely difficult to find anything. Impossible to properly organize. These backup solutions improve the situation tremendously.
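For comparison, a minimal restic session looks like this (repository path and restore target are placeholders):

```shell
# Initialise a repository (prompts for a password).
restic init --repo /mnt/backup/restic

# Back up; unchanged chunks are deduplicated against earlier snapshots.
restic -r /mnt/backup/restic backup /home

# Browse snapshots and restore a single file.
restic -r /mnt/backup/restic snapshots
restic -r /mnt/backup/restic restore latest \
    --target /tmp/out --include /home/user/.bashrc
```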

  • @[email protected]

    fsarchiver is very nice. Not fast on pulling out files, but, I mean, it’s infinitely faster than tar.

    Only quit using it so much because zfs-send is the real big hammer.

    Best part is it can regenerate partitions, or whatever, or you can restore a larger partition to a smaller one, all the cool permutations assuming the files actually fit. Can re-write users and permissions if you like, all the bells.
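A sketch of that save/restore cycle (device names and archive path are placeholders; run as root):

```shell
# Save a filesystem (not just files) into a compressed, checksummed archive,
# using xz level 7 with 4 threads.
fsarchiver savefs -z 7 -j 4 /mnt/backup/root.fsa /dev/sda2

# Restore into a different (even smaller) partition, as long as the data fits.
fsarchiver restfs /mnt/backup/root.fsa id=0,dest=/dev/sdb2
```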

    https://www.fsarchiver.org/

    Support for basic file attributes (permissions, ownership, …)
    Support for basic file-system attributes (label, uuid, block-size) for all linux file-systems
    Support for multiple file-systems per archive
    Support for extended file attributes (they are used by SELinux)
    Support for all major Linux filesystems (extfs, xfs, btrfs, reiserfs, etc)
    Support for FAT filesystems (in order to backup/restore EFI System Partitions)
    Experimental support for cloning ntfs filesystems
    Checksumming of everything which is written in the archive (headers, data blocks, whole files)
    Ability to restore an archive which is corrupt (it will just skip the current file)
    Multi-threaded lzo, gzip, bzip2, lzma/xz compression: if you have a dual-core / quad-core it will use all the power of your cpu
    Lzma/xz compression (slow but very efficient algorithm) to make your archive smaller.
    Support for splitting large archives into several files with a fixed maximum size
    Encryption of the archive using a password. Based on blowfish from libgcrypt.
    

    Oh, also you can always copy it over to an iso image and mount it, or a qcow or raw image of some kind for loop mount.

    Hey, didn’t know about this: https://www.linux.com/news/mounting-archives-fuse-and-archivemount/
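archivemount usage is about as simple as it gets (archive name and mount point are placeholders):

```shell
# Mount a plain tarball as a filesystem and copy out a single file.
mkdir -p /tmp/arch
archivemount backup.tgz /tmp/arch
cp /tmp/arch/etc/fstab /tmp/restored-fstab
fusermount -u /tmp/arch
```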