this post was submitted on 13 Jun 2023
167 points (97.2% liked)

Lemmy.World Announcements


We're still working to find a solution for the posting slowness in large communities.

We have seen that a post does get submitted right away, yet the page keeps 'spinning'.

So right after you clicked 'Post' or 'Reply' you can refresh the page and the post should be there.

(But to be sure, you could copy the contents of your post first, so you can paste it again if anything goes wrong.)

top 50 comments
[–] bdonvr@lemmy.rogers-net.com 16 points 1 year ago (3 children)

Keep in mind that the upcoming Lemmy update (replacing websockets) will probably fix this.

[–] ruud@lemmy.world 7 points 1 year ago

Yes I really hope so!!

[–] RoyalEngineering@lemmy.world 5 points 1 year ago (1 children)

Maybe this is a dumb question, but why would replacing websockets speed things up? I read the Wikipedia page on it, but I guess I don’t understand it fully.

[–] phiresky@lemmy.world 9 points 1 year ago (2 children)

In general websockets scale badly because the server has to keep open a connection and a fair amount of state. You also can't really cache websocket messages like you can normal HTTP responses. Not sure which reasons apply to Lemmy though.
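A minimal sketch of the difference (illustrative only, not Lemmy's actual code): one rendered HTTP response can be cached and reused for any number of clients, while every WebSocket client forces the server to hold an open connection and per-client state.

```python
# HTTP model: the first request renders the page; every later request
# for the same URL is served from a shared cache at O(1) cost.
http_cache = {}

def http_get(url, render):
    if url not in http_cache:          # only the first request renders
        http_cache[url] = render(url)
    return http_cache[url]

# WebSocket model: the server must keep state for every connected
# client, so memory and bookkeeping grow with the client count.
ws_connections = []

def ws_connect(client_id):
    state = {"client": client_id, "subscriptions": set()}
    ws_connections.append(state)
    return state

renders = []
def render(url):
    renders.append(url)
    return f"<html>{url}</html>"

for _ in range(1000):
    http_get("/post/1", render)
assert len(renders) == 1               # 1000 requests, one render

for i in range(1000):
    ws_connect(i)
assert len(ws_connections) == 1000     # state held for every client
```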

[–] RoyalEngineering@lemmy.world 3 points 1 year ago

Thanks for the explanation!

[–] LocustOfControl@reddthat.com 3 points 1 year ago

The caching problem is definitely part of it from conversations on GitHub.

[–] slashzero@lemmy.world 2 points 1 year ago

I really hope someone is doing some level of performance testing on those changes to make sure they actually fix the performance issues.

[–] Scaldart@lemmy.world 15 points 1 year ago

Just hopping into the chain to say that I appreciate you and all of your hard work! This place—Lemmy in general, but specifically this instance—has been so welcoming and uplifting. Thank you!

[–] mykl@lemmy.world 15 points 1 year ago (1 children)

At least the "reply" button goes away so I don't end up double- triple- or even duodecuple-posting! Thanks for all the hard work that must be going on behind the scenes right now!

[–] cascadingsymmetry@lemmy.world 15 points 1 year ago (4 children)

I kept getting a timeout message from Jerboa which led me to think my post hadn't gone through. So I ended up submitting the same joke to the Dad Jokes sub three times. Which, actually, is how a dad might tell that joke.

[–] mykl@lemmy.world 5 points 1 year ago

Lemmy is now your digital dadlife assistant.

[–] AlmightySnoo@lemmy.world 5 points 1 year ago

I think in that case it's a feature not a bug.

[–] FartsWithAnAccent@lemmy.world 2 points 1 year ago (2 children)

I get this occasionally with Jerboa too. I had assumed it was because I'm on Mint and the connection is shoddy, but maybe it's an issue with the client.

[–] tallwookie@lemmy.world 15 points 1 year ago (2 children)

maybe related, but I've noticed that upvoting/downvoting has similar lag delays

[–] kaxora@lemmy.world 8 points 1 year ago

Same. It would be good to fix this

[–] ChaosAD@lemmy.world 5 points 1 year ago

I can't up/down vote at all.

[–] gkd@lemmy.world 10 points 1 year ago

Been noticing this in the app I’m working on. Pretty much all POST requests fail to return a response and just timeout after 60 seconds. A quick refresh shows that the new items do successfully get created though.

[–] vepro@lemmy.world 9 points 1 year ago (1 children)

I assume there is something that is O(N), which would explain why wait time scales with community size (number of posts and comments).
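To illustrate the guess (hypothetical, not Lemmy's actual queries): an O(N) step touches every row in a community on each submission, so it slows down as the community grows, while an incrementally maintained counter or indexed lookup does not.

```python
# 100k posts spread across 10 communities (synthetic data).
posts = [{"id": i, "community": i % 10} for i in range(100_000)]

# O(N): scan all posts to recompute a count on every submission.
def count_in_community_scan(community):
    return sum(1 for p in posts if p["community"] == community)

# O(1) amortized: maintain the count incrementally instead of scanning.
counts = {}
for p in posts:
    counts[p["community"]] = counts.get(p["community"], 0) + 1

# Both agree, but the scan's cost grows linearly with total posts.
assert count_in_community_scan(3) == counts[3] == 10_000
```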

[–] slashzero@lemmy.world 7 points 1 year ago (1 children)

Oh, Big-O notation? I never thought I’d see someone else mention big O notation out in the wild!

:high-five:

[–] manitcor@lemmy.intai.tech 7 points 1 year ago (4 children)

You are going to meet a lot of OG redditors in the next few weeks. Old Reddit had Big O in every post, even posts with cute animals.

[–] wit@lemmy.world 9 points 1 year ago

Again, thank you for the outstanding work! You are awesome!

Also, the new icon for lemmy world is great!

[–] slashzero@lemmy.world 7 points 1 year ago* (last edited 1 year ago) (1 children)

Have you tried enabling the slow query logs @ruud@lemmy.world? I went through that exercise yesterday to try to find the root cause but my instance doesn’t have enough load to reproduce the conditions, and my day job prevents me from devoting much time to writing a load test to simulate the load.

I did see several queries taking longer than 500ms (up to 2000ms) but they did not appear related to saving posts or comments.
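For anyone wanting to repeat the exercise, this is the standard PostgreSQL approach (settings below are plain Postgres features; the 500 ms threshold is just an example, not Lemmy's config):

```sql
-- Log any query slower than 500 ms, then reload the config.
ALTER SYSTEM SET log_min_duration_statement = 500;
SELECT pg_reload_conf();

-- With the pg_stat_statements extension loaded, list the slowest
-- statements by average execution time.
SELECT calls, mean_exec_time, query
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 10;
```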

[–] lhx@lemmy.world 1 points 1 year ago

These things are taking 15-20 seconds though.

[–] Oxff@lemmy.world 7 points 1 year ago

Thank you for your hard work and keeping us up to date.

[–] neighbourbehaviour@lemmy.world 7 points 1 year ago (2 children)

Does this behaviour appear on other big instances? E.g. lemmy.ml?

[–] slashzero@lemmy.world 3 points 1 year ago (1 children)

Yes. Absolutely does happen on other instances that have thousands of users.

Great, so it's reproducible and Lemmy-the-app related, not instance-specific. Should be fixable across the board once it's identified and resolved.

[–] hydra@lemmy.world 3 points 1 year ago

Yes it does, tried this workaround before.

[–] lethalfeline@lemmy.world 6 points 1 year ago

Thanks for your and the other Lemmy devs' work on this. These growing pains are a good thing, as frustrating as they can be for users and maintainers alike. Lemmy will get bigger, and this optimization treadmill is really just starting.

[–] kaxora@lemmy.world 5 points 1 year ago (2 children)

In my case, the page keeps spinning but the post is not submitted, regardless of reloading the page or waiting for a long time. There was one case where I cut down significantly on the amount of characters in the post and then it posted, but I have been unable to replicate this.

[–] Moghul@lemmy.world 5 points 1 year ago

I have the same issue with image posts. If I submit them through the app the posts counter on my profile goes up, but there's no post. I also can't retrieve any posts for my own account. It says I have 3 but it shows none.

Comments work OK so I'm not sure what the problem is. I was worried I got restricted or something.

[–] Moghul@lemmy.world 1 points 1 year ago

Oh my god I'm so fucking stupid. If you hide posts you've seen it'll also hide your own posts...

[–] Sunforged@lemmy.world 5 points 1 year ago (1 children)

@ruud@lemmy.world Yo dude, first off huge props and a big thank you for what you have set up. I'll be donating monthly while I am here. I appreciate that we have an alternative to Reddit at this critical moment in time.

I do have a question about your long-term plans: do you want to continue to expand and upgrade the server as funding allows, or is there a cap at which you will close the server to new members? Or perhaps make joining more of a process?

[–] ruud@lemmy.world 10 points 1 year ago (2 children)

Well, if all the Reddit users came over to Lemmy, I guess all servers would need to scale up... but I think the server we have now is powerful enough to grow quite a lot, as long as the software gets tuned.

[–] lenninscjay@lemmy.world 4 points 1 year ago

ok, so it's not just me. Hope it gets resolved soon!

[–] CaptObvious@lemmy.world 4 points 1 year ago

Thanks for posting the workaround and for working to resolve the issue. Lemmy is a great place, and a real breath of fresh air after Reddit.

[–] manitcor@lemmy.intai.tech 4 points 1 year ago

It's def more hung up today; oddly it's only first-level replies for some reason.

[–] V4uban@beehaw.org 3 points 1 year ago

Thank you so much

[–] ulu_mulu@lemmy.world 3 points 1 year ago

I noticed that too, page keeps spinning but comments are posted immediately anyway.

[–] TragicNotCute@lemmy.world 2 points 1 year ago

I’ve done this twice in the last 20 minutes and the content is not there. This workaround was working earlier today though.

[–] kapeeshy@lemmy.world 2 points 1 year ago

I noticed this, thanks for the clarification

[–] MyOpinion@lemmy.world 2 points 1 year ago

This is the biggest issue I have run into. Thanks for looking into it.

[–] deathworlder@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

One of the large applications I was working on had the same issue. To solve it, we ended up creating multiple smaller instances and hosting a set of related APIs on each server.

For example, read operations like listing posts and comments could live on one server, while write operations are clustered on another.

Later, whichever server is getting overloaded can be split up again. In our case, 20% of the APIs used around 3/4 of the server resources, so we spread those 20% across 4 large servers and kept the remaining 80% of the APIs on 3 small servers.

This worked for us because the DBs were maintained on separate servers.

I wonder if a quasi micro-services approach would solve the issue here.

Edit 1: If done properly this approach can be cost-effective. In some cases it might cost 10 to 20 percent more in server costs, but it leads to a visible improvement in performance.
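A rough sketch of that read/write split at the proxy layer (hypothetical: the pool names, addresses, and port are illustrative, not Lemmy's deployment):

```nginx
# Route read traffic (GET/HEAD) to one pool and writes to another.
map $request_method $backend {
    default write_pool;
    GET     read_pool;
    HEAD    read_pool;
}

upstream read_pool  { server 10.0.0.11:8536; server 10.0.0.12:8536; }
upstream write_pool { server 10.0.0.21:8536; }

server {
    listen 80;
    location /api/ {
        proxy_pass http://$backend;
    }
}
```

Either pool can then be scaled independently as its load grows.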

[–] LargeHardonCollider@lemmy.world 2 points 1 year ago (1 children)

Is the slowdown that the instance has to send out updates about the comment to every other instance before returning a successful response? If so, is anyone working on moving this to an async queue?

Sending out updates seems like something that’s fine being eventually consistent
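A sketch of that decoupling (hypothetical, not Lemmy's code): persist the write, enqueue the federation fan-out, and return to the client before delivery finishes.

```python
import queue
import threading

outbox = queue.Queue()
delivered = []

def deliver_worker():
    # Background worker drains the outbox; in reality this would
    # HTTP-POST the activity to each peer instance.
    while True:
        item = outbox.get()
        if item is None:
            break
        delivered.append(item)
        outbox.task_done()

threading.Thread(target=deliver_worker, daemon=True).start()

def create_comment(text, peer_instances):
    comment = {"text": text}        # 1. persist locally (fast)
    for peer in peer_instances:     # 2. enqueue fan-out, don't wait
        outbox.put((peer, comment))
    return comment                  # 3. respond before delivery finishes

create_comment("hello", ["lemmy.ml", "beehaw.org"])
outbox.join()                       # demo only: wait for the worker
assert [p for p, _ in delivered] == ["lemmy.ml", "beehaw.org"]
```

Peers then converge eventually, which is exactly the consistency model federation already implies.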

[–] ruud@lemmy.world 2 points 1 year ago (1 children)

Ooh that’s a good remark ! I’ll see if that’s the cause

[–] LargeHardonCollider@lemmy.world 5 points 1 year ago* (last edited 1 year ago) (1 children)

Reading more about how this works, sending out updates to each instance shouldn’t block the request from returning unless you have a config flag set to debug source.

It might be due to poorly optimized database queries. Check out this issue for more info. Sounds like there are problems with updating the rank of posts and probably comments too
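For context, the relevant switch lives in Lemmy's config.hjson (fragment from memory; check the defaults shipped with your version): with debug on, activities are sent synchronously, which blocks the request until federation completes.

```hjson
federation: {
  enabled: true
  debug: false   # keep this off in production; true makes sends blocking
}
```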

[–] ruud@lemmy.world 3 points 1 year ago (1 children)

So it looks like YOU SOLVED THE ISSUE with this reply! It led me to check the debug mode, and it was on! I turned that on when I had just started the server and federation had issues...

We no longer seem to have the slowness!!

That’s awesome! Thanks for hosting the server!
