I’m planning to migrate my email to a different provider, but they don’t give much storage, so I was wondering what people would recommend for this kind of setup: basically I’d like to use the new provider as something like a relay. I’d want it to only hold an email or two at a time, with some kind of self-hosted solution that grabs the emails from the provider, stores them locally, and deletes them off the provider, so it never holds my entire email history. It should also keep my sent emails somewhere so that I have a copy of them. Ideally I’d wanna be able to set this up with a mail client like Nextcloud’s.

  • @TCB13
    link
    English
    5
    edit-2
    1 year ago

The good old fetchmail is probably what you’re looking for. Run your local/self-hosted email server and then use fetchmail as described here to fetch the email from the email provider and deliver it into the local accounts. There’s also getmail (does the same but is written in Python), guide here, or go-getmail.
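As a rough sketch of what a fetchmail setup can look like (the hostnames, account names, and polling interval below are made up — check the fetchmail man page and your provider’s docs for the real values):

```
# ~/.fetchmailrc -- hypothetical hosts/credentials; must be chmod 600
set daemon 300                       # poll every 5 minutes

poll mail.provider.example.org protocol IMAP
    user "you@example.org" password "secret"
    ssl
    fetchall                         # grab everything, read or unread
    no keep                          # delete from the provider after download
    smtphost localhost               # hand off to the local MTA for delivery
```

The `no keep` line is what gives you the "provider only ever holds an email or two" behavior the OP asked for.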

    Alternatively, and probably way better:

Postfix has a feature called the ETRN service, documented here. It lets a server queue incoming email and deliver it to another server once a connection is available:

    The SMTP ETRN command was designed for sites that have intermittent Internet connectivity. With ETRN, a site can tell the mail server of its provider to “Please deliver all my mail now”. The SMTP server searches the queue for mail to the customer, and delivers that mail by connecting to the customer’s SMTP server.

    From what I know about it you might be able to:

    1. Configure just a SMTP/Postfix server on the cloud provider;
    2. Configure a full IMAP/SMTP server on the self-hosted / local machine;
3. Configure the “cloud” Postfix to deliver all incoming email into your local / self-hosted Postfix using relay_domains (see here and here);
4. Set up ETRN on the “cloud” server to deal with your local server being offline / unavailable;
    5. On the local machine create a simple bash script + systemd timer / cron like this:
    nc -c 'echo "ehlo selfhosted.example.org";sleep 1;echo "etrn your-domain.example.org";sleep 1;echo "quit"' remote-cloud-server.example.org 25
    

This command will connect to the cloud server and ask it to deliver all queued email to the self-hosted instance. This can be set up to run every x minutes or, if you want to get fancy, when the network goes up, using the network-online.target target as described here. Note that the script isn’t strictly necessary; it just guarantees that if the connection between the servers goes down, you’ll get all the queued email delivered right away when it comes back.
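One way to run that nudge periodically is to wrap the one-liner above in a systemd service + timer pair on the local machine. A sketch (unit names, script path, and the 10-minute interval are all placeholders):

```
# /etc/systemd/system/etrn-poll.service -- hypothetical unit name
[Unit]
Description=Ask the cloud relay to flush queued mail via ETRN
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
# the script just contains the nc one-liner from above
ExecStart=/usr/local/bin/etrn-poll.sh

# /etc/systemd/system/etrn-poll.timer
[Unit]
Description=Run the ETRN poll every 10 minutes

[Timer]
OnBootSec=2min
OnUnitActiveSec=10min

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now etrn-poll.timer`. Tying the service to network-online.target also covers the "run when the network comes back up" case.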

    The following links may also be of interest so your local / self-hosted email server can send email:

Now a note about Nextcloud: their webmail is the worst possible solution; I wrote a very detailed description of the issues here. Do yourself a favor and use Roundcube.

    • @[email protected]
      link
      fedilink
      English
2
1 year ago

Agree about the Nextcloud client, but it’s easy enough to replace it with the SnappyMail plug-in, which works a treat.

      • @TCB13
        link
        English
2
1 year ago

        Or simply run RoundCube without NC.

    • chimay
      link
      English
2
1 year ago

I’ve been using them with neomutt for years, and am happy with it.

    • @[email protected]OP
      link
      fedilink
      English
      2
      edit-2
      1 year ago

Wow, thanks for the very detailed info! I’ll look into all of these. I read your post about the NC webmail and yeah, I might just go for Roundcube lol. I’ve had performance issues with the file part of NC, but it just works better for me than other solutions, so I figured I may as well just tack it on; but it seems I’ll have more performance/resource concerns if I do.

      • @TCB13
        link
        English
1
1 year ago

You’re welcome. Well, Syncthing is great: setup is easy and it does get the job done. My favorite way of running it is having a central “server” (like your NAS or so) and having all devices connect to it, but NOT to each other. This way your NAS acts as a single source of truth for the files and conflicts are close to none. Another advantage of running it like that is that you can plug other things into the file storage, like WebDAV, SMB, or Filebrowser, in order to support accessing files from any browser and from iOS devices.

  • @mattaw
    link
    English
2
1 year ago

The older POP3 mail protocol downloads emails to your local program and deletes them from the mailbox, so if you can get Nextcloud’s client to use that as a mail source, it will start to work the way you want.

  • Admiral Patrick
    link
    fedilink
    English
2
1 year ago

    That sounds like POP3.

Unlike IMAP, where your inbox lives on the mail server, POP stores messages on the server only until you download them.

    So you should be able to look for a provider that allows you to connect with POP3 and set your client up to fetch them periodically.
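As a rough sketch of what that periodic fetch could look like without a full mail client (assuming the provider supports POP3 over TLS and curl is available; the host, credentials, and maildir path are all placeholders you’d substitute):

```shell
#!/bin/sh
# pop3_drain -- hypothetical helper: download every message over POP3S,
# save each one locally, then delete it from the provider so the provider
# never keeps your full mail history.
pop3_drain() {
    host="$1"                        # e.g. pop.provider.example.org
    creds="$2"                       # user:password
    maildir="${3:-$HOME/mail/new}"   # where to drop the .eml files

    if [ -z "$host" ] || [ -z "$creds" ]; then
        echo "usage: pop3_drain <host> <user:password> [maildir]"
        return 1
    fi

    mkdir -p "$maildir"

    # curl speaks POP3 natively: listing the mailbox URL returns one
    # "msgnum size" line per message.
    count=$(curl -s -u "$creds" "pop3s://$host/" | wc -l)

    i=1
    while [ "$i" -le "$count" ]; do
        # RETR message i into a uniquely named local file...
        curl -s -u "$creds" "pop3s://$host/$i" > "$maildir/$(date +%s).$i.eml"
        # ...then DELE it from the server.
        curl -s -u "$creds" -Q "DELE $i" "pop3s://$host/" -o /dev/null
        i=$((i + 1))
    done
}
```

You’d call it as `pop3_drain pop.provider.example.org 'you@example.org:secret'` from cron or a systemd timer. This is essentially what fetchmail/getmail do for you, so this is only worth it if you want zero extra software.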

  • stown
    link
    fedilink
    English
    1
    edit-2
    1 year ago

I’m doing exactly this, using a cheap server from IONOS as my SMTP relay. I followed this guide but had to modify it slightly. (I.e., it doesn’t explain that you need two MX records: the primary one points to your actual self-hosted mail server so messages from other servers will go there; the secondary MX record points to your SMTP relay so that DKIM and other DNS-based security features will recognize your relay as a valid sender for the domain.)
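For reference, a zone fragment with both MX records might look like this (hostnames and preference values are hypothetical; the lower preference number is tried first):

```
; example.org zone fragment -- placeholder names
example.org.   IN  MX  10  mail.home.example.org.     ; primary: self-hosted server
example.org.   IN  MX  20  relay.cloud.example.org.   ; secondary: SMTP relay
```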

    • @[email protected]OP
      link
      fedilink
      English
1
1 year ago

Cool, thanks for the guide! Glad to know I’m not just crazy for trying to do things this way.