
x/tools/gopls: memory usage problems have returned in gopls 0.5 #41890

Closed
trapgate opened this issue Oct 9, 2020 · 8 comments
Labels
FrozenDueToAge
gopls: Issues related to the Go language server, gopls.
Tools: This label describes issues relating to any tools in the x/tools repository.
WaitingForInfo: Issue is not actionable because of missing required information, which needs to be provided.

Comments

@trapgate

trapgate commented Oct 9, 2020

What version of Go are you using (go version)?

$ go version
go version go1.15.2 linux/amd64

Does this issue reproduce with the latest release?

Yes

What operating system and processor architecture are you using (go env)?

go env Output
$ go env
GO111MODULE="off"
GOARCH="amd64"
GOBIN=""
GOCACHE="/home/geoff/.cache/go-build"
GOENV="/home/geoff/.config/go/env"
GOEXE=""
GOFLAGS="-mod=vendor"
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GOMODCACHE="/home/geoff/do/cthulhu/docode/pkg/mod"
GONOPROXY="*.internal.digitalocean.com,github.com/digitalocean"
GONOSUMDB="*.internal.digitalocean.com,github.com/digitalocean"
GOOS="linux"
GOPATH="/home/geoff/do/cthulhu/docode"
GOPRIVATE="*.internal.digitalocean.com,github.com/digitalocean"
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/home/geoff/go"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/home/geoff/go/pkg/tool/linux_amd64"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD=""
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build528678853=/tmp/go-build -gno-record-gcc-switches"

What did you do?

Started vscode.

What did you expect to see?

After experiencing lower memory usage with our monorepo on gopls 0.4.4, I expected that to continue with version 0.5. Instead, gopls is back to using 10GB of memory. This was previously reported in an earlier issue.

What did you see instead?

Instead of using 800MB - 1GB of memory when running against our monorepo, gopls is back to using about 10GB.

gopls.197817-9GiB-nonames.zip

gopherbot added the gopls label Oct 9, 2020
gopherbot added this to the Unreleased milestone Oct 9, 2020
stamblerre changed the title from "x/Tools/gopls: Memory usage problems have returned in gopls 0.5" to "x/tools/gopls: memory usage problems have returned in gopls 0.5" Oct 9, 2020
gopherbot added the Tools label Oct 9, 2020
@stamblerre
Contributor

Is this with gopls/v0.5.1? Also, can you please share your gopls settings?

/cc @heschik

stamblerre modified the milestones: Unreleased, gopls/v0.5.2 Oct 9, 2020
@heschi
Contributor

heschi commented Oct 9, 2020

This is troubling. It certainly looks like the return of the memory leak issues. But the mechanism must be completely different, since the cache is structured completely differently. I'll try to find some time to re-run the memory tests and maybe add some diagnostic pages that will be more helpful.

In the meantime, it might be interesting to see a series of zip files, rather than just the biggest one. In particular I'm interested in whether the list of loaded packages is changing/growing over time.
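If the --debug web server is running, it should also expose the standard net/http/pprof handlers (worth double-checking on your build); grabbing a raw heap profile at a few of those sizes could help too. A rough sketch, assuming the debug server is on localhost:6060 and using placeholder file names:

$ curl -so heap-7gib.pprof http://localhost:6060/debug/pprof/heap
$ go tool pprof -top heap-7gib.pprof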

@trapgate
Author

trapgate commented Oct 9, 2020

I first noticed it after upgrading to gopls 0.5.0, but I'm running 0.5.1 now and it's still happening. It's not a gradual increase, either - I just reloaded gopls, and within about 30s the new process reached 7.1GB. It seems to increase gradually after that, but it's not unbounded - it was using 10GB after running for a couple of weeks.

For a gopls launched with: -mode=stdio serve -rpc.trace --debug=localhost:6060, here's a set of zip files.

gopls.241070-1GiB-nonames.zip
gopls.241070-2GiB-nonames.zip
gopls.241070-3GiB-nonames.zip
gopls.241070-4GiB-nonames.zip
gopls.241070-5GiB-nonames.zip
gopls.241070-6GiB-nonames.zip

@heschi
Contributor

heschi commented Oct 13, 2020

I was able to spend a little time today running through the tests I used to guide the optimizations that landed in 0.4.4. They all look good, which is nice but also completely unhelpful. I'm not aware of any intended changes that would have increased memory usage.

The behavior you're describing sounds like increased memory usage during the initial workspace load, where we discover, type-check, and diagnose all the packages in the workspace. The zips you attached above are consistent with that. Given that my tests look the same, the most obvious explanation is that the IWL is finding and loading more packages than it used to for some reason. Comparing the contents of http://localhost:6060/cache/1 for 0.4.4 vs. 0.5.1 might show interesting results. Unfortunately it's very hard to figure out what's going on with the redacted logs. Can you compare them yourself, or maybe you would be willing to send some to me directly?
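For example, something like this would capture both pages for a side-by-side comparison (assuming curl is available, the debug server is on the default localhost:6060 used here, and the file names are just placeholders):

$ curl -s http://localhost:6060/cache/1 > cache-v0.4.4.txt
(restart the language server on gopls 0.5.1)
$ curl -s http://localhost:6060/cache/1 > cache-v0.5.1.txt
$ diff cache-v0.4.4.txt cache-v0.5.1.txt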

stamblerre added the WaitingForInfo label Oct 14, 2020
@trapgate
Author

trapgate commented Nov 4, 2020

I finally had a chance to get back to this, but what I'm seeing now isn't illuminating.

I started with the current stable version, 0.5.2. It's using about 7GB of resident memory when started on our monorepo, and about 12GB of virtual memory. As before, this is pretty consistent even after it's been running continually for a couple of weeks.

Then I rolled back to v0.4.4 (GO111MODULE=on go get golang.org/x/tools/gopls@v0.4.4), verified the rollback with gopls version, and restarted the language server from within vscode. The last time I was running v0.4.4, I saw memory usage of around 1.5GB, but this time it was almost identical to v0.5.2: 7GB resident, 12GB virtual.
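For reference, the rollback and verification steps were roughly:

$ GO111MODULE=on go get golang.org/x/tools/gopls@v0.4.4
$ gopls version
(then restart the language server from within vscode)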

Of course I wondered if I had rolled back gopls in the wrong directory, but I checked that at localhost:6060/info: golang.org/x/tools/gopls 0.4.4.

Another thing I'm noticing: about half the time when I restart the language server, it doesn't come up listening on port 6060. I've made sure that it's being started with --debug=localhost:6060. Restarting the language server again without making any other changes fixes this. I'm seeing this with both v0.4.4 and v0.5.2. This problem doesn't seem to affect the functionality or the memory usage of gopls otherwise.
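A quick way to check whether the debug server actually came up after a restart (assuming curl is available and the port matches the --debug flag):

$ curl -sf http://localhost:6060/info > /dev/null && echo "debug server up" || echo "nothing listening on 6060"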

I also diff'ed the contents of /cache/1 from both versions. The stats I get back are remarkably similar, but I don't think that's too surprising, given how close the memory usage is between the two versions now.

I also wondered if something about our repository might have changed since August 20th, when I reported that 0.4.4 was consuming much less memory than previous versions. So I checked out a SHA of our repo from a few days before that, and restarted vscode completely. I still see gopls v0.4.4 using about 8GB.

So, I'm now unable to reproduce the lower memory usage I was seeing with v0.4.4 back in August.

Maybe it would be helpful to look at the cache again. I've redacted the names of our internal packages from this list, and this is only the first 15 entries. The first one seems to consist of some embedded javascript - presumably that accounts for the large file size in the cache, with very small ast + types + types info values.

Name | total = file + ast + types + types info
github.com/swaggo/files (1) | 35.68 = 35.63 + 0.02 + 0.01 + 0.02
--- (2) | 27.42 = 0.99 + 10.72 + 1.17 + 14.55
--- (1) | 26.68 = 3.45 + 12.75 + 2.52 + 7.96
--- (2) | 26.47 = 0.95 + 10.22 + 1.73 + 13.58
--- (2) | 24.40 = 0.84 + 9.29 + 1.50 + 12.78
--- (2) | 22.16 = 0.77 + 8.40 + 1.31 + 11.68
--- (2) | 21.76 = 0.89 + 8.49 + 1.31 + 11.07
--- (2) | 20.93 = 0.86 + 8.15 + 1.16 + 10.76
--- (1) | 20.88 = 4.43 + 9.40 + 1.39 + 5.67
--- (2) | 20.43 = 0.82 + 8.21 + 1.43 + 9.97
--- (2) | 20.30 = 0.81 + 8.16 + 1.42 + 9.91
--- (2) | 18.10 = 0.73 + 7.31 + 1.22 + 8.84
--- (2) | 16.78 = 0.60 + 6.56 + 1.22 + 8.40
--- (2) | 16.42 = 0.65 + 6.63 + 1.10 + 8.04
github.com/elastic/go-elasticsearch/v7/esapi (1) | 15.81 = 1.81 + 7.08 + 1.82 + 5.10

@heschi
Contributor

heschi commented Nov 4, 2020

In some ways, that's good news: I had no idea what would cause a regression, so it's nice that there isn't one. Not much help to you, though. I'm not sure what the next steps should be; I'll have to think about it. Right now it seems like you just have a huge project. Do you need to work on all of it at once? Would a package inclusion/exclusion filter be useful?

Yes, the first package is probably a set of large files that have had their unexported fields dropped and are therefore very cheap in terms of AST.

@trapgate
Author

trapgate commented Nov 6, 2020

Starting the editor from a subdirectory in our monorepo does reduce the memory footprint to usable levels, and in our case at least I haven't had that cause me any problems - I spend nearly all my time in a subset of our subprojects. I ran that way for months when I had less memory in my work machine, and it was fine. A package in-/exclusion filter would be useful though, because it would decouple the editor's view of the repo from the set of projects gopls loads. I'd like the editor to have a view of the whole repo, even if gopls doesn't.

It's quite possible that this is a problem caused by some characteristic of our repo - nearly all the top offenders in the cache are our own packages. But if this is happening with other monorepos, I think this is a big problem. Certainly everyone who uses the language server at my company is aware of the memory consumption, and it causes us a fair bit of pain, since the majority of our dev machines have 16GB of memory and gopls is using up to 12GB by itself.

Thank you for your efforts; if I can help by providing more information, let me know.

stamblerre removed the WaitingForInfo label Nov 6, 2020
stamblerre added this to Needs Triage in vscode-go: gopls by default Nov 10, 2020
stamblerre moved this from Needs Triage to In progress in vscode-go: gopls by default Nov 10, 2020
stamblerre moved this from In progress to Critical in vscode-go: gopls by default Nov 10, 2020
stamblerre moved this from Critical to Needs Triage in vscode-go: gopls by default Nov 11, 2020
@heschi
Contributor

heschi commented Nov 11, 2020

The problem is that I really don't know what the problem is. Right now all I know is that the heap is much bigger than the packages you're loading would seem to justify, and I have no idea why. Without the ability to reproduce, or even just look at fully-detailed logs, I'm unfortunately stumped.

If you're willing to share more information privately, maybe I'll be able to figure out more. Otherwise I think we should probably close this and plan to implement the package inclusion/exclusion.

stamblerre added the WaitingForInfo label Nov 11, 2020
stamblerre moved this from Needs Triage to In progress in vscode-go: gopls by default Nov 11, 2020
stamblerre moved this from In progress to Waiting for Info in vscode-go: gopls by default Nov 18, 2020
heschi closed this as completed Nov 18, 2020
vscode-go: gopls by default automation moved this from Waiting for Info to Done Nov 18, 2020
stamblerre removed this from the gopls/vscode-go milestone Nov 18, 2020
golang locked and limited conversation to collaborators Nov 18, 2021