Hey there!
I’m thinking about starting a blog about privacy guides, security, self-hosting, and other shenanigans, just for my own pleasure. I have my own server running Unraid and have been looking at self-hosting Ghost as the blog platform. However, I am wondering how “safe” it is to use one’s own homelab for this. If you have any experience regarding this topic, I would gladly appreciate some tips.
I understand that it’s relatively cheap to get a VPS, and that is always an option, but it is always more fun to self-host on one’s own bare metal! :)
Could someone please point me to a “self-hosting beginner tutorial”? I have pretty good general ICT knowledge, but when it comes to self-hosting, my knowledge ends…
Yes, I have a few Rust-framework-based sites, mostly for personal use.
I host mine just like you want to do. Ghost running in a docker container on my homelab, with reverse proxy and domain pointing to it.
Haven’t had any issues so far.
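For anyone wanting to try the same setup, a minimal sketch of Ghost plus MySQL in Docker Compose might look like this (the domain, passwords, and port mapping are placeholders you'd change; your reverse proxy would point at port 2368):

```yaml
# docker-compose.yml sketch -- hostnames and credentials are examples, not real values
services:
  ghost:
    image: ghost:5
    restart: always
    ports:
      - "2368:2368"                       # reverse proxy forwards to this port
    environment:
      url: https://blog.example.com       # your public domain
      database__client: mysql
      database__connection__host: db
      database__connection__user: ghost
      database__connection__password: changeme
      database__connection__database: ghost
    volumes:
      - ghost_content:/var/lib/ghost/content
  db:
    image: mysql:8
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: changeme
      MYSQL_DATABASE: ghost
      MYSQL_USER: ghost
      MYSQL_PASSWORD: changeme
    volumes:
      - db_data:/var/lib/mysql
volumes:
  ghost_content:
  db_data:
```

The named volumes keep your posts and database across container upgrades, which matters once you start tinkering.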
Yes, I host everything public-facing with Cloudflare Tunnels. Anything heavier goes over VPN with DDNS, shared on an invite basis with friends and family. For the former it’s hassle-free: HTTPS with no reverse proxy, no firewall rules, no nonsense.
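For reference, a cloudflared ingress config for this kind of setup is only a few lines. This is a sketch with example hostnames and a placeholder tunnel ID; the tunnel dials out to Cloudflare, so nothing is exposed on your home firewall:

```yaml
# ~/.cloudflared/config.yml sketch -- hostname and tunnel ID are placeholders
tunnel: my-homelab-tunnel
credentials-file: /home/user/.cloudflared/<tunnel-id>.json

ingress:
  - hostname: blog.example.com
    service: http://localhost:2368   # e.g. a Ghost container on the same box
  - service: http_status:404         # required catch-all rule
```

Cloudflare terminates TLS at their edge, which is where the “hassle-free HTTPS” comes from.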
No, for these reasons:
- My bandwidth is limited
- My “uptime” at home isn’t great
- No redundant hardware, even a simple mainboard defect would take a while to replace
I have a VPS for these tasks, and I host a few sites for friends and family.
A VPS still counts as self-hosting :)
I host my sites on a VPS. Better internet connection and uptime, and you can get pretty good VPSes for less than $40/year.
The approach I’d take these days is to use a static site generator like Eleventy, Hugo, etc. These generate static HTML files. You can then store those files on literally any host. You could upload them to a static file hosting service like BunnyCDN storage, GitHub Pages, Netlify, Cloudflare Pages, etc. Even Amazon S3 and CloudFront if you want to pay more for the same thing. Note that GitHub Pages is extremely feature-poor, so I’d usually recommend one of the others.
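The whole workflow fits in a few commands. This sketch assumes Hugo is installed and uses a made-up site name and destination; the point is that the output is just a folder of files you can put anywhere:

```
# sketch: build a static site and copy it to any host (names are examples)
hugo new site myblog
cd myblog
hugo                    # renders static HTML into ./public
rsync -av public/ user@host:/var/www/myblog/   # or push to Netlify/Pages/etc.
```

Because there’s no server-side code, the attack surface on whatever host you choose is close to zero.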
I have hosted a WordPress site on my Unraid box before, but ended up moving it to a VPS. I moved it primarily because a VPS is just going to have more uptime, since I tinker with my homelab too often. So any service that I expect other people to use, I usually end up moving to a VPS (mostly wikis for different things). The one exception is anything related to media delivery (Plex, Jellyfin, the *arr stack), because I don’t want to make that publicly accessible and it needs close integration with the storage array in Unraid.
Good points here, uptime is a factor I had not taken into consideration. Probably better to get a VPS as you say.
I wonder sometimes if the advice against pointing DNS records to your own residential IP amounts to a big scare. Like you say, if it’s just a static page served on an up to date and minimal web server, there’s less leverage for an attacker to abuse.
I’ve found that ISPs too often block port 80 and 443. Did you luck out with a decent one?
> I wonder sometimes if the advice against pointing DNS records to your own residential IP amounts to a big scare. Like you say, if it’s just a static page served on an up to date and minimal web server, there’s less leverage for an attacker to abuse.
That advice is a bit old-fashioned in my opinion. There are many tools nowadays that will get you a very secure setup without much effort:
- Using a reverse proxy with automatic SSL certs like Caddy.
- Sandboxing services with Podman.
- Mitigating DoS attacks by using a WAF such as Bunkerweb.
And of course, besides all these tools, the simplest way of securing public services is to keep them updated.
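To show how little effort the Caddy part takes: this is a complete Caddyfile sketch (domain and upstream port are placeholders) that gets you automatic HTTPS with cert renewal:

```
# Caddyfile sketch -- blog.example.com and port 2368 are placeholders
blog.example.com {
    reverse_proxy localhost:2368   # Caddy obtains and renews the TLS cert itself
}
```

Point your DNS A record at your IP, forward 80/443 to the box running Caddy, and that’s essentially the whole reverse-proxy setup.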
> I’ve found that ISPs too often block port 80 and 443. Did you luck out with a decent one?
Rogers has been my ISP for several years, and I have no issue receiving HTTP/S traffic. The only issue, as with most providers, is that they block port 25 (SMTP). That’s the only thing keeping me from self-hosting my own email server, so I have to rely on a VPS for that.
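If you want to check whether your ISP blocks a port before building anything, a quick probe from the command line works. This is a sketch; the target host is just an example, and for inbound 80/443 you’d need to test from outside your network (a phone off Wi-Fi, or an online port checker):

```
# check outbound port 25 reachability (smtp.gmail.com is an example target)
nc -zv -w 5 smtp.gmail.com 25 && echo "outbound 25 open" || echo "outbound 25 blocked"
```

If outbound 25 is blocked, self-hosted mail delivery is off the table regardless of your server setup.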
I have a Hugo site hosted on GitHub and I use Cloudflare Pages to put it on my custom domain. You don’t have to use GitHub to host the repo. Except for the cost of the domain, it’s free.
You don’t really need Cloudflare to have your own domain, you can do everything directly with GitHub.
I self host a Grav site among other things on a 15 Euro VPS.
Also, I started with Ghost, but the fact that they locked the newsletter side of the business to a single provider and were unwilling to rework things at the time made me walk away. Yes, I know you could go the code route and add others, but that was a complicated setup in itself. Grav works perfectly for me.
Static site hosted by someone else for free is the way to go. I wouldn’t invite that sort of pain upon my network.
Fair point
My self-hosted stuff is intranet-only, apart from the VPN I use for remote access. My blog is a Hugo site currently hosted on GitHub.
Yeah, it depends on your website bandwidth/uptime requirements. I use a VPS running nginx and WireGuard, and tunnel into that from a VM in my homelab, so no ports are open on my home firewall. nginx drops all random traffic at the VPS that isn’t destined for a preconfigured service; expected traffic is forwarded through the WireGuard tunnel to the right VMs, segregated from the rest of my home network by VLANs. I host a bit of web content where I’m not really concerned with bandwidth or uptime, as well as Home Assistant, File Browser, a few dedicated game servers, etc.
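The VPS side of that tunnel is a short WireGuard config. This is a sketch with placeholder keys and addresses; the homelab VM is the peer that dials out, so no inbound ports are needed at home:

```
# /etc/wireguard/wg0.conf on the VPS -- keys and IPs are placeholders
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <vps-private-key>

[Peer]
# the homelab VM; it initiates the connection outbound
PublicKey = <homelab-vm-public-key>
AllowedIPs = 10.8.0.2/32
```

nginx on the VPS then just `proxy_pass`es each expected hostname to the tunnel address (e.g. 10.8.0.2 and the service’s port), and everything else gets dropped.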
I self-hosted many websites for about 20 years, but sadly I had to take it all down this year, as I’m in the process of moving to another state. I’m also really going to miss my 1 Gbps unlimited fiber connection.
I hosted my websites from Windows Server 2003 and 2008, virtual machines, Linux, and other setups. It was fun times. I had very good uptime using two servers and UPS battery backups.
I self-host a Ghost blog. It’s about as safe as any other service exposed to the internet.