
High resource usage over time (memory leak?) #3446

Open
TruncatedDinoSour opened this issue Nov 28, 2024 · 8 comments

@TruncatedDinoSour

Background information

  • Dendrite version or git SHA: 0.13.8+79b87c7
  • SQLite3 or Postgres?: Postgres
  • Running in Docker?: No
  • go version: go version go1.23.0 linux/amd64
  • Client used (if applicable): SchildiChat, Element, Hydrogen, or Cinny is what most people on the HS use, I believe

Description

  • What is the problem: Dendrite's RAM usage grows steadily the longer it runs. That makes it not just a RAM hog but also a CPU hog, since I run zRAM (which may also explain the CPU usage, I'm not sure). Regardless, every week or so I have to restart Dendrite because it keeps eating more and more RAM, making overall server performance worse.
  • Who is affected: The server.
  • How is this bug manifesting: As Dendrite runs long-term, it slowly eats more and more RAM and/or swap.
  • When did this first appear: I can't recall. I don't remember needing to restart Dendrite before 1.18, I think.

Steps to reproduce

  • Run dendrite
  • Use it for a week or so
  • Watch the resource usage, mainly RAM, grow over time

This is its resource usage only 2 days after its most recent restart, and it only creeps up over time until I restart it:

[screenshot: resource usage graph]

It's weird.

@TruncatedDinoSour
Author

Had to restart it again. Could only last 3 days.

@TruncatedDinoSour
Author

s7evink/fetch-auth-events fixed it

@TruncatedDinoSour
Author

TruncatedDinoSour commented Dec 9, 2024

nvm lmao, it was fine for like 6 days and now it's not anymore

@neilalexander
Contributor

No idea if zRAM is a setup that should be supported, but without a memory profile it will be difficult to tell what's going on.

https://element-hq.github.io/dendrite/development/profiling
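
For anyone following along: Dendrite's profiling support is presumably wired up via Go's standard net/http/pprof package, as in the minimal sketch below. The localhost:65432 address is just an illustrative assumption here, not necessarily Dendrite's actual configuration; see the linked docs for the real setup.

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // blank import registers the /debug/pprof/* handlers on the default mux
)

func main() {
	// Serve the pprof endpoints on a loopback-only address so they are not
	// exposed publicly. 65432 is an arbitrary example port, not a Dendrite default.
	log.Fatal(http.ListenAndServe("localhost:65432", nil))
}
```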

@TruncatedDinoSour
Author

TruncatedDinoSour commented Dec 11, 2024

a setup that should be supported but without a memory profile it will be difficult

it is zram, yeah
but is profiling a good choice over like a week? the report would be huge, no?
and even so, wouldn't it severely impact performance for the week? is there a way to check this with minimal disruption?

edit :

[screenshot: memory usage graph, still growing]

it's only growing :')

actually, since it's clearly growing a lot within a single day, I'm down to set up profiling tomorrow and report the day after :) I'll do that

@neilalexander
Contributor

neilalexander commented Dec 11, 2024

You just need a single memory profile captured when the memory usage is high. That should contain enough info and the files are small.

Having profiling enabled has next to no runtime cost so it’s fine to have it switched on for a long time.
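
Once the endpoint is up, that one-off snapshot is just an HTTP GET against /debug/pprof/heap (`go tool pprof http://localhost:65432/debug/pprof/heap` does the same from a shell). A minimal sketch in Go, assuming the same illustrative localhost:65432 address as above:

```go
package main

import (
	"io"
	"log"
	"net/http"
	"os"
)

func main() {
	// One-off heap snapshot: GET the heap profile from the pprof endpoint.
	// The address is an assumption; use whatever your profiling listener binds to.
	resp, err := http.Get("http://localhost:65432/debug/pprof/heap")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Save the gzipped protobuf profile to disk; it is typically small,
	// so it's easy to attach to an issue.
	out, err := os.Create("dendrite-heap.pprof")
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	if _, err := io.Copy(out, resp.Body); err != nil {
		log.Fatal(err)
	}
}
```

The saved file can then be inspected locally with `go tool pprof dendrite-heap.pprof`.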

@TruncatedDinoSour
Author

TruncatedDinoSour commented Dec 11, 2024

You just need a single memory profile captured when the memory usage is high. That should contain enough info and the files are small.

Having profiling enabled has next to no runtime cost so it’s fine to have it switched on for a long time.

oh nice, okay then, I'll enable the profiler tomorrow since for today I consider myself done and want the rest of the day/evening off-ish xD

I'll post a memory profile capture in this thread in 1-3 days :)

@TruncatedDinoSour
Author

TruncatedDinoSour commented Dec 11, 2024

ok, since it was just 1 environment variable, I enabled it now; thought maybe it'd be more complicated, but nope. I'll post the profile when the resource usage is high :)
