Increase node heap size
# 🌱|help-and-getting-started
w
Hello, Is there a way to overwrite the node heap size? I am getting JavaScript heap out of memory issues
q
Hello @wonderful-table-85939, do you mean the `garden` binary itself is giving out-of-memory messages?
w
right why is it hardcoded instead of configurable?
q
@polite-fountain-28010 could you answer this one?
p
I think since we need to override it to be larger by default, it needed to be baked into the executable. I tried finding out if it can be overridden again from the outside, but so far I wasn't successful. It's possible that the packed `pkg` executable always forces the flags to be overridden even if `NODE_OPTIONS='--max-old-space-size=XXX'` is set. What I'm wondering is how we even get to consume more than 4GB of memory. Would you mind sharing some details about your Garden setup so we can take a look at whether there are any potential memory leaks we haven't found yet?
q
Thanks, Tim! Pinging @wonderful-table-85939 for visibility
w
It's the same issue as https://github.com/garden-io/garden/issues/4638. A coworker thinks they tracked it down to https://github.com/garden-io/garden/blob/main/core/src/tasks/base.ts#L163C13-L163C13. I believe they will put in a PR soon.
But could it be possible to pull from an environment variable first and default to some number?
p
This is done by the `pkg` bundler, and so far I haven't seen a way to override this. I believe command line arguments take precedence over the environment variables, so it would have to be changed in `pkg`, which is a bit difficult since the project seems to no longer be a priority for Vercel. Regarding the cause of the memory leak, I've also suspected that memoization could be to blame. Feel free to assign me as a reviewer for the PR once you open it. I'll see if I can find a reproduction for the out-of-memory issue to validate any fixes there. Could you share how large or complex your project is that triggers the OOM?
I've managed to reproduce the memory leak and validated that changing the decorator to

```typescript
@Memoize((params: ResolveProcessDependenciesParams<O>) => params.status ? params.status.state : null)
```

indeed fixes it. This is amazing, thank you so much for spotting this. I can open a PR for this right away, but I can also approve your coworker's PR if they prefer to have that contribution on record 🙂
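For context, the reason keying the memoize on `params.status.state` bounds the cache can be sketched without the real decorator. The `memoize` helper and `resolve` function below are hypothetical stand-ins, not Garden code:

```javascript
// Minimal sketch of a memoize keyed by a hash function, similar in spirit to
// typescript-memoize's @Memoize(hashFn). Keying the cache by a small derived
// value (e.g. params.status.state) keeps it bounded to the number of distinct
// states, whereas keying by each distinct params object grows it without limit.
function memoize(fn, hashFn) {
  const cache = new Map();
  return function (params) {
    const key = hashFn(params);
    if (!cache.has(key)) {
      cache.set(key, fn(params));
    }
    return cache.get(key);
  };
}

// Hypothetical resolver: many distinct params objects, few distinct states.
const resolve = memoize(
  (params) => ({ resolvedFor: params.status ? params.status.state : null }),
  (params) => (params.status ? params.status.state : null)
);

resolve({ status: { state: 'ready' } });
resolve({ status: { state: 'ready' } }); // second call hits the cache: same key 'ready'
```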
I keep forgetting that simply answering doesn't actually ping the thread members... @wonderful-table-85939
w
Fix sounds good, thanks for looking into this so quickly!
p
I opened a PR for this and the release should be happening very soon. If you share your and your colleague's GitHub handles before the release, we can also give you a shoutout in the release notes for tracking this down. @wonderful-table-85939
w
Sure, mine is https://github.com/marklester/. I am asking my coworker if he is okay with sharing his GitHub handle.
Ok yeah, he is cool with it: @yyc
I'll monitor the releases and post back results.
a
@wonderful-table-85939 This fix has been released in garden 0.13.15. Feel free to give it a try and see if it resolves the memory issue. https://github.com/garden-io/garden/releases/tag/0.13.15
w
Already did, and yeah, it looks to have fixed our issue. Thanks for the quick response!
a
Great! Thanks for the contribution 🙂
w
@polite-fountain-28010 looks like I can still reproduce it: `garden delete environment` and `garden get config` die with this message:
```
(node:279114) MaxListenersExceededWarning: (node) warning: possible EventEmitter memory leak detected. 5001 listeners added. Use emitter.setMaxListeners() to increase limit.
(Use `garden --trace-warnings ...` to show where the warning was created)
(node:279114) MaxListenersExceededWarning: (node) warning: possible EventEmitter memory leak detected. 5001 listeners added. Use emitter.setMaxListeners() to increase limit.
```
p
That shouldn't directly be responsible for any crashes; it's just a warning about possibly having too many listeners on an `EventEmitter` instance. Did the process actually crash there?
w
Yeah, had to `kill -9` it.
We are using kind, so our current approach is to kill the cluster itself.