
x/tools/gopls: high memory usage in sigs.k8s.io/cluster-api #37076

Closed
vincepri opened this issue Feb 6, 2020 · 15 comments
Labels: FrozenDueToAge, gopls (Issues related to the Go language server, gopls.), Tools (This label describes issues relating to any tools in the x/tools repository.), WaitingForInfo (Issue is not actionable because of missing required information, which needs to be provided.)
Milestone: Unreleased

Comments

@vincepri commented Feb 6, 2020

What version of Go are you using (go version)?

$ go version
go version go1.13.7 darwin/amd64

Does this issue reproduce with the latest release?

Yes

What operating system and processor architecture are you using (go env)?

go env Output
$ go env
GO111MODULE="on"
GOARCH="amd64"
GOBIN=""
GOCACHE="/Users/vince/Library/Caches/go-build"
GOENV="/Users/vince/Library/Application Support/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="darwin"
GONOPROXY=""
GONOSUMDB=""
GOOS="darwin"
GOPATH="/Users/vince/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org"
GOROOT="/usr/local/Cellar/go/1.13.7/libexec"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/usr/local/Cellar/go/1.13.7/libexec/pkg/tool/darwin_amd64"
GCCGO="gccgo"
AR="ar"
CC="clang"
CXX="clang++"
CGO_ENABLED="1"
GOMOD="/Users/vince/go/src/sigs.k8s.io/cluster-api/go.mod"
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=/var/folders/8r/hwtdskms33z018dpqfl8__mc0000gn/T/go-build025802019=/tmp/go-build -gno-record-gcc-switches -fno-common"

What did you do?

Running VS Code with the latest gopls while developing Cluster API, a Kubernetes sub-project.

What did you expect to see?

Reasonable memory usage in Activity Monitor.

What did you see instead?

The gopls process attached to the cluster-api codebase was using about 30GB after developing on the codebase.

@gopherbot added this to the Unreleased milestone Feb 6, 2020
@gopherbot added the Tools and gopls labels Feb 6, 2020
@gopherbot

Thank you for filing a gopls issue! Please take a look at the Troubleshooting guide, and make sure that you have provided all of the relevant information here.

@stamblerre (Contributor)

Is this with gopls/v0.3.1? Are you able to consistently reproduce this? If possible, can you provide a heap profile? This can be done by adding the following settings to your VS Code settings.json:

"go.languageServerFlags": [
    "-rpc.trace",
    "serve",
    "--debug=localhost:6060"
]

and running go tool pprof -http :8080 http://localhost:6060/debug/pprof/heap.
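
Once the --debug server is up, the raw heap profile can also be saved to a file for attaching here (a sketch; the output filename is just illustrative):

$ curl -sS -o gopls-heap.pprof http://localhost:6060/debug/pprof/heap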

I just tried typing quickly in a random file in that repo, and I could push memory usage up pretty high, but it all went back down after a GC.

@stamblerre (Contributor) commented Feb 6, 2020

Also, if you could provide the output of gopls -rpc.trace -v check path/to/file.go, that will help confirm that I am reproducing correctly.
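
For example, the output can be captured to a file to attach here (a sketch; the log filename is just illustrative, and stderr is redirected as well in case the trace is logged there):

$ gopls -rpc.trace -v check path/to/file.go > gopls-check.log 2>&1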

@mfischr commented Feb 6, 2020

I'm running gopls v0.3.0, go 1.13.3, and VS Code on Ubuntu 18.04, and I'm seeing memory usage grow to 10GB within a few minutes. It happens when editing some files but not others. pprof profile attached.

Unfortunately this is private code so I can't share it. The repo is quite large, maybe 60k lines not including vendor stuff, but the package I'm working on has fewer than 1,000 lines. I can make edits for about 10 minutes before my system hard freezes and I have to power cycle. Workaround is to kill gopls every few minutes.

This is new in v0.3.0. I can't remember what version I was running before upgrading a couple days ago, but the issue wasn't this severe.

pprof.gopls.alloc_objects.alloc_space.inuse_objects.inuse_space.001.pb.gz

@stamblerre (Contributor)

@cbd32: gopls/v0.3.0 had a memory leak, which we believe we fixed in gopls/v0.3.1. Can you try upgrading and see if that fixes the issue for you?

@vincepri (Author) commented Feb 6, 2020

@stamblerre I tried both v0.3.1 and the master version of gopls. I'm in meetings all morning, but later this afternoon I should be able to paste the trace.

@heschi (Contributor) commented Feb 6, 2020

@cbd32 I looked at your profile and I don't see any obvious signs of the autocomplete memory leak, but it'd still be good to rule it out. Once you've reproduced with v0.3.1, please file a new issue so that we can keep separate problems from getting confused with each other.

@vincepri I was also unable to reproduce your issue. I did notice that cluster-api has a number of nested modules, which gopls currently does not support well, but even messing around in those didn't immediately cause a problem. It might be good to pay attention to whether you're working in those when you reproduce it.

@stamblerre added the WaitingForInfo label Feb 7, 2020
@heschi (Contributor) commented Feb 10, 2020

All: I just merged https://golang.org/cl/218858, which will automatically write profiles when gopls uses more than 5GiB of memory. If you'd like, you can update to master with go get golang.org/x/tools/gopls@master golang.org/x/tools@master and then attach the heap and goroutine profiles for us to look at.
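
For reference, the upgrade plus a quick check that the new build is in use might look like this (a sketch, assuming the go get install directory is on your PATH):

$ go get golang.org/x/tools/gopls@master golang.org/x/tools@master
$ gopls version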

@vincepri (Author)

Thanks @heschik, I'll post them here if I encounter the issue again

@jvburnes commented Feb 11, 2020

I'm using 0.3.1, and with a modest-size codebase (dexidp) my copy of nvim / vim-go / gopls starts at about 40MB of RAM. Within a minute gopls is consuming almost 2GB of RAM. A second nvim session on the same codebase will lock my machine as memory usage climbs to about 95% of available RAM, before the machine starts swapping with a vengeance. This is on Linux 5.4.15.

@heschi (Contributor) commented Feb 12, 2020

@jvburnes Loading stuff takes time, so I'm not sure what you mean by it starting at 40MiB. When I open https://github.com/dexidp/dex, I see gopls stabilize at an RSS of about 600MiB. If you're seeing dramatically more, please provide repro instructions and/or profiles.

@jvburnes

Thanks @heschik.

I just reproduced it in my VM. Before I started, the guest was consuming about 768M and was very stable. I opened nvim on one of the dex source files, and within a minute all available VM memory (about 3GB) was consumed, right before it crashed the VM. I know you want details, and they will be forthcoming; it's getting late, so I'll post them in the morning.

@stamblerre (Contributor)

@heschik has made a few fixes on master that should address the memory consumption issues, so I would recommend upgrading to master (GO111MODULE=on go get golang.org/x/tools/gopls@master golang.org/x/tools@master) to see if that addresses this issue.

@vincepri (Author)

I think we can close this for cluster-api; I'm not noticing any more large memory usage.

@stamblerre (Contributor)

Thanks for following up! Closing.

@golang locked and limited conversation to collaborators Feb 25, 2021