Sorry but I can’t think of another word for it right now. This is mostly just venting but also if anyone has a better way to do it I wouldn’t hate to hear it.

I’m trying to set up a home server for all of our family photos. We’re on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way to do it is through Google Takeout. First you have to order the export. Then you have to wait anywhere from a few hours to a day or two for Google to “prepare” the download. Then you have one week before the takeout “expires” - and that’s one week to the minute from the time of the initial request.

I don’t have some kind of fancy California internet, just normal home internet, and there is no way to download a 50 gig (or even 2 gig) file in one go - there are always interruptions that force me to restart the download. But if you try to download the files too many times, Google gives you yet another error and you have to start over and request a new takeout. Google doesn’t let you download the entire archive in one piece either; you have to select each file part individually.

I can’t tell you how many weeks I’ve spent trying to download all of the files before they expire or Google throws yet another error.

  • Eager Eagle · 3 points · 5 months ago

    A 50GB download takes less than 12h on a 10Mbps internet. And I had a 10Mbps link 10 years ago in a third world country, so maybe check your options with your ISP. 50GB really should not be a problem nowadays.
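    For reference, the back-of-envelope arithmetic (a rough sketch that ignores protocol overhead):

    ```shell
    # 50 GB ≈ 50 * 8 * 1000 = 400,000 megabits; at 10 Mbps:
    echo "$(( 50 * 8 * 1000 / 10 / 3600 )) hours"   # prints "11 hours"
    ```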

    • @gedaliyah (OP) · 4 points · 5 months ago

      It’s not the speed - it’s the interruptions. If I could guarantee an uninterrupted download for 12 hours, then I could do it over the course of 3-4 days. I’m looking into some of the download management tools that people here have suggested.

      • Eager Eagle · 3 points · 5 months ago

        that might work; I don’t know if you live in a remote area, but I’d also consider a coffee shop, library, university, or hotel lobby with wifi. You might be able to download it within an hour.

        • @gedaliyah (OP) · 2 points · 5 months ago

          Great, any suggestions?

          • @aulin · 3 points · 5 months ago

            I would recommend Aria2. It can download several chunks of a file in parallel, resume downloads automatically with a set number of retries, and download over many different protocols. It also supports mirrors - maybe not an option for Google Takeout, but useful in other cases.
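            Something along these lines, for example (a sketch only - the URL is a placeholder, since real Takeout links are signed and expire, and the flag values are just reasonable starting points):

            ```shell
            # Wrap the download in a function so it's easy to rerun.
            # --continue resumes a partial file, --max-tries=0 retries
            # indefinitely, --retry-wait pauses between attempts, and
            # --split/--max-connection-per-server fetch chunks in parallel.
            takeout_fetch() {
              aria2c \
                --continue=true \
                --max-tries=0 \
                --retry-wait=30 \
                --split=8 \
                --max-connection-per-server=8 \
                "$1"
            }
            # Usage (placeholder URL):
            # takeout_fetch "https://example.com/takeout-001.zip"
            ```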

          • @[email protected] · 2 points · 5 months ago

            I believe JDownloader is able to.
            If I am not mistaken, wget and curl can resume a download as well, but that may require a small script with error catching that auto-loops until finished.
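            For wget, that loop could look something like this (a sketch; the URL is a placeholder you'd swap for a fresh signed Takeout link, and the 30-second delay is arbitrary):

            ```shell
            # Keep resuming with wget until the file is complete.
            # --continue resumes a partial file; the until-loop restarts
            # wget after any failure instead of giving up.
            resume_until_done() {
              until wget --continue --tries=5 --timeout=30 "$1"; do
                echo "interrupted, retrying in 30s..." >&2
                sleep 30
              done
            }
            # Usage (placeholder URL):
            # resume_until_done "https://example.com/takeout-001.zip"
            # curl can resume a single attempt too: curl -C - -O <url>
            ```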