TL;DR: no archive formats like tar, zip, … but how would you theoretically represent a symlink in a manner that can be stored in the cloud and retrieved back to the system as a symlink?

Backstory

I heavily use symlinks to organise my media and even wrote an application that helps me do so (it’s in Python and being rewritten in Rust). But I also use stuff like home-manager and nix which makes heavy use of symlinks.

My goal is to back up my media and /home to the cloud at regular intervals. There are services that cost just about 60-100€ yearly for limitless cloud storage. So keeping part of my library purely in the cloud and using terabytes of space would cost less than a single 15TB HDD (500+€). For a local backup I’d even need at least a second one, which would put me at >1000€ - the equivalent of at least 10 years of cloud storage.

Options explored

rclone

It is pretty sweet, as it supports mounting a cloud drive as a folder and has transparent encryption! However, there are multiple open issues on uploading symlinks, and I don’t know Go. I wouldn’t mind trying to learn it if I had an idea how to upload a symlink without following it (following symlinks breaks them).
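For what it’s worth, reading a link without following it is exactly what `lstat()`/`readlink()` are for. In Python terms (a minimal sketch, not rclone’s Go internals; the paths are made up):

```python
import os
import tempfile

# lstat()/readlink() operate on the link itself instead of its target --
# the primitive any upload tool needs in order to preserve symlinks.
d = tempfile.mkdtemp()
link = os.path.join(d, "mylink")
os.symlink("/some/target", link)  # a dangling target is fine for a link

print(os.path.islink(link))  # True
print(os.readlink(link))     # /some/target
```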

git-annex etc.

git-annex, or a bare git repo with a remote worktree, is great, but I don’t need to make diffs of stuff and follow how things moved around, etc. I just need to replace backups with a view of what’s there. Plus, storing all that history will probably take enormous amounts of space, which is wasteful.

Ideas

store a blob of stat() call for every file

I’m not sure about this. The stat struct does contain the file type (regular file, directory, symlink, …; hard links aren’t a distinct type, they just show up as a link count > 1), but my knowledge of Linux internals is limited and maybe that’s too complicated for this use case.
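It’s less scary than it sounds; in Python, for example, the `stat` module can decode the file type straight out of an `lstat()` call. A minimal sketch (temp-dir contents are made up):

```python
import os
import stat
import tempfile

# lstat() is stat() without following symlinks; the file type lives in
# st_mode. A hard link is not a file type at all -- it is just a
# regular file whose st_nlink happens to be > 1.
d = tempfile.mkdtemp()
os.symlink("somewhere", os.path.join(d, "ln"))
open(os.path.join(d, "reg"), "w").close()

for name in ("ln", "reg"):
    mode = os.lstat(os.path.join(d, name)).st_mode
    print(name, "symlink" if stat.S_ISLNK(mode) else "regular")
```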

a db of links

Instead of storing the links themselves, I store a DB (sqlite? CSV?) of links, upload that DB, and use it to restore the links after pulling it back down. 🤔 Actually this might be the simplest thing to do, but maybe y’all have better ideas.

  • @[email protected]
    link
    fedilink
    English
    7
    edit-2
    1 year ago

    This is the very reason I dumped Dropbox. Used to be great - I just created symlinks to all my normal data folders in the main Dropbox folder and all of those folders would then be real-time (ish) backed up to the cloud without me ever having to touch anything or move all my stuff into the Dropbox folder. Then they decided to drop support for symlinks…

    • Dataprolet · 1 year ago

      If your files are only in the cloud it’s not a backup.

      • @[email protected]
        link
        fedilink
        English
        41 year ago

They weren’t - they were on my file system, and using symlinks in my Dropbox folder pushed them up to my Dropbox account as well.

  • key · 1 year ago

    I’d think the simplest option is to replace the symlink with a text file that contains the target path. Then add a special unique extension so you can easily detect which files are meant to be symlinks.
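    Something like this sketch, say (the extension name is made up):

    ```python
    import os

    EXT = ".symlink.txt"  # made-up marker extension

    def freeze(link_path):
        """Swap a symlink for a plain text file holding its target path."""
        target = os.readlink(link_path)
        os.remove(link_path)
        with open(link_path + EXT, "w") as f:
            f.write(target)

    def thaw(text_path):
        """Swap the placeholder text file back for a real symlink."""
        with open(text_path) as f:
            target = f.read()
        os.remove(text_path)
        os.symlink(target, text_path[: -len(EXT)])
    ```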

    I haven’t used it but this script looks like it does most of that https://github.com/nbeaver/toggle-symlink

    Though really I would question the need to do such a thing. Backup is a well-solved problem, so needing to be creative is a bad sign. I’d default to a pre-built backup tool like borg. If you really need to avoid wrappers, you can always back up to an actual file system via rsync, which handles symlinks normally.

  • @cm0002 · 1 year ago

    I have no comment on the syncing problem, but could you lmk what the cheap cloud services are? I’m reaching the end of my rope with Google Workspace drive no longer being unlimited and Dropbox ended their “as much as you need” policy

    • @YoorWeb · 1 year ago

      Nextcloud. It’s open source; you can go with one of their providers or host it yourself if you don’t mind playing with config.

      • @cm0002 · 1 year ago

        Oh. Yea I’ve been in the Google Workspace Unlimited alternative thread on the rclone forums; apparently jotta will eventually hit a point in its “gradual slowdown” where it’s practically worthless (IIRC it was around 10TB, so for jotta “unlimited” is functionally 10TB)

        1fichier was also talked about, but I guess you have to reupload any given file every 30 days or it will expire