• u/lukmly013 💾 (lemmy.sdf.org)
      54 days ago

      NGINX autoindex + Wget + SSH fuckery (a.k.a.: “Lazy turd solution”)

      Idea:
      You can put files into a selected directory for file sharing, which will be used as the root directory for NGINX. When you enable autoindex, you'll get the classic directory listing you see on places like Linux ISO mirrors.
      That will be the file source.
      To download, you simply pull files off that autoindex page.
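      A minimal Wget sketch for the download side (address taken from the example config below; the exact flags are just one way to do it):

      # Recursively mirror everything the autoindex page lists
      # -np: don't climb to the parent dir, -nH: skip the hostname directory
      wget -r -np -nH http://192.168.34.217:8080/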
      Uploading is, uuuhh, creative.
      You also have to run an NGINX server the same way on the upload side, either with both machines on the same network or via reverse SSH forwarding, and then SSH into the machine you wish to upload to and download the files into it with Wget (or at least I use Wget) from the locally running server. See the sketch after the config below.

      Example config I last used on my phone as the upload side:

      daemon off;   # run in the foreground instead of daemonizing
      events {}
      http {
          server {
              listen 192.168.34.217:8080;                # phone's LAN address
              root   /storage/emulated/0/LibreTorrent/;  # directory being shared
              location / {
                  autoindex on;                          # enable directory listing
              }
          }
      }
      

      Yes, it's a lazy config, I know.
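      And the reverse-SSH upload trick itself, roughly (user@target stands in for the machine you're uploading to):

      # On the phone: make the local NGINX reachable on the target's port 8080
      ssh -R 8080:192.168.34.217:8080 user@target
      # Then, inside that SSH session on the target, pull the files down
      wget -r -np -nH http://127.0.0.1:8080/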

      • @[email protected]
        64 days ago

        I read all that then had the thought:

        Rclone does all this with like one command, doesn't it? I was recently looking into syncing my seedbox with my home media server, and every guide was "use rclone and a script that detects when a file is added/removed to trigger the sync".
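        Assuming you've set up an SFTP remote called seedbox with rclone config (remote name and paths invented for this example), it's roughly:

        # One-shot sync of the seedbox downloads into the local media dir
        rclone sync seedbox:downloads /mnt/media --progress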

        • @[email protected]
          14 days ago

          Even Unison with cron is pretty painless.

          Or git for some stuff.

          Even rsync.

          With WireGuard or Tailscale there are heaps of options.
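          For example, a cron entry along these lines (host and paths made up, assuming SSH access to the seedbox):

          # Every 15 minutes, mirror new seedbox downloads over SSH
          */15 * * * * rsync -az --delete seedbox:downloads/ /mnt/media/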