I’m trying to fix this annoying slowness when posting to larger communities. (Just try replying here…) I’ll be doing some restarts of the docker stack and nginx.
Sorry for the inconvenience.
Edit: Well, I’ve moved nginx from running in a Docker container to running on the host, but that hasn’t solved the posting slowness…
Thank you Ruud for hosting! Your work is much appreciated.
2 restarts done already :-)
Hmm. I guess the delay in posting is not related to nginx. I now have the same conf as a server that doesn’t have this issue.
I’m only familiar with the high-level Lemmy architecture, but could it be related to database indices being rebuilt?
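If you want to rule that out quickly, Postgres 12+ exposes in-progress index builds; a minimal sketch using a plain system view (nothing Lemmy-specific):

```sql
-- Any CREATE INDEX / REINDEX currently running (empty result = no rebuilds)
SELECT pid,
       index_relid::regclass AS index_name,
       command,
       phase,
       blocks_done,
       blocks_total
FROM pg_stat_progress_create_index;
```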
Good luck today lol
Godspeed to you over the coming days man. Really appreciate you putting this together and the extra work it takes when tackling something like this (both being new to the platform and the tech still being in relative infancy) - not to mention the crazy scaling happening. I will definitely be pitching in to help make sure the server stays up!!
Thanks for putting in the time to make this run smoothly
Keep up the good work!
I joined this instance and also mastodon.social, first time using the Fediverse and as excited as confused lol

Thanks for your work on this server!
Hehe, the joys of troubleshooting and profiling. Isn’t it fun?
Hmm if it takes too long the fun disappears… ;-)
You got this. <3
I don’t have experience scaling Lemmy, but I do have experience scaling stuff in general. I’m sure you’ve got a few people here who’d be willing to talk things through with you if you get too frustrated.
And don’t forget to breathe and step back if you have to. Your well being is more important.
Thank you so much for this amazing instance!
You’re welcome!
@[email protected] have you seen https://lemmy.world/post/72136 ?
Thanks! Hadn’t seen it, but I changed that (also to 512!) yesterday.
That is definitely a good lead
Something is weird.
I opened this post from the main page “subscribed” listing, but the title showed “I can’t find any cannabis cultivation community”, while the comments were the same as here. I initially thought I had opened the wrong post, but the comments were mentioning “Good work Ruud”, so I refreshed and that fixed the post’s title.
Have you noticed the issue?
I’ve noticed a couple oddities as well.
- I refresh a page and a completely different page loads instead
- An autorefresh hits the community tab, but it loads up 10 posts from a single community

I’m sure it’ll get sorted out eventually lol
It’s happened to me a few times as well (not just on this instance, I think it’s a bug in Lemmy itself). So far I’ve not found a reproducible pattern though, so it’s a tricky one to bug-report effectively.
Yeah, I tried opening it again a few times, no luck yet. Will see if I can figure out any pattern.
I had something similar happen yesterday.
I opened a thread about pokemon, browsed it for a bit, did some stuff in other tabs, and clicked back to the pokemon tab maybe an hour later to browse some more.
The post had changed to one where a user was asking for relaxing game recommendations and it was loading in new comments that seemed to be from that post, but I could still see the comments that had already loaded from the pokemon post when I scrolled down.
When I refreshed it changed back to the pokemon post and only showed comments from that.
Thanks a bunch! I’ll be donating for what it’s worth. I really like it here.
Since I have you here, if I start my own instance do I absolutely have to use docker? I’ve never had good experiences with it and would rather just install programs the old-fashioned way
Well, if they can create a Docker image out of it, you should be able to install it on a VM… but I run it in Docker because it makes everything so easy to manage…
Docker is not necessary; lemm.ee, for example, is running without Docker!
Here is documentation for setting it up: https://join-lemmy.org/docs/en/administration/from_scratch.html
Of course you can fully adapt it to your own use case. The Lemmy backend is a single binary, you don’t even need to build it on the same machine which will run it. There’s no hard requirement to use nginx or anything like that either - if you understand what this guide is doing, you can replace all the unimportant parts as needed.
Awesome!!! Gonna work on it this weekend. Thank you!
Interesting, thanks for posting
It is possible to do it without docker… but nobody recommends it :)
There is a how-to for setting up your own instance without Docker, using Ansible: https://join-lemmy.org/docs/en/administration/install_ansible.html
Note that this is basically just a script to deploy Lemmy on a remote server. And it does use Docker, it just sets everything up for you. (Mostly)
Oh, oof. Didn’t look into it much further as the docker solution would have suited me best also. Thanks for the heads-up
This is also only for Debian AFAIK
Technically no, but all their update info and support is geared toward Docker.
Any progress on this? I’ve been thinking about it too. Couple of ideas:
- Too many indexes needing updates when an insert occurs?
- Are there any triggers running on insert? (One way to check both is sketched below.)
- Unlikely, but is there a disk write bottleneck? Might be worth running some benchmarks from the VM shell.
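For the first two points, a quick sketch against the Postgres system catalogs, assuming Lemmy’s default table name `comment` (swap in `post` as needed):

```sql
-- Every index listed here must be updated on each insert into "comment"
SELECT indexname, indexdef
FROM pg_indexes
WHERE tablename = 'comment';

-- User-defined triggers firing on "comment" (internal FK triggers excluded)
SELECT tgname, pg_get_triggerdef(oid) AS definition
FROM pg_trigger
WHERE tgrelid = 'comment'::regclass
  AND NOT tgisinternal;
```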
I was thinking that as well; it’s like the post gets “checked” or something like that, and that hits a timeout of 20 seconds. It could be the API or the database, but somehow my spidey sense says this could well be in the code. Some extra calls to filter things, maybe? Using an external server? Or even the propagation to the other instances? (I don’t know how this federation thing connects to the others; it could be just that, maybe another server is the bottleneck.) I just found the 20 seconds suspicious, given that it’s the default timeout.
Didn’t know about the timeout but that makes sense. Would be easy to test by changing the nginx timeout.
Another thought: how many db connections do you have? Could it be starved because there are so many selects happening and it needs to wait for them to finish first?
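A quick way to check whether the connection pool is saturated, as a sketch using plain Postgres views:

```sql
-- Connection usage by state (watch for piles of "idle in transaction")
SELECT state, count(*)
FROM pg_stat_activity
GROUP BY state
ORDER BY count(*) DESC;

-- Compare against the configured ceiling
SHOW max_connections;
```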
pg_locks shows alarming periods when lots of locking is holding up activity. Inserts take a pretty long time on tables like the ones for comments and posts.
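To see exactly who is blocking whom during those periods, a sketch using `pg_blocking_pids()` (Postgres 9.6+):

```sql
-- Sessions currently blocked, plus the PIDs holding them up
SELECT pid,
       pg_blocking_pids(pid) AS blocked_by,
       wait_event_type,
       state,
       left(query, 80) AS query_snippet
FROM pg_stat_activity
WHERE cardinality(pg_blocking_pids(pid)) > 0;
```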