
x/tools/gopls: HUGE memory leak in gopls. Manual GC works. #72919


Closed
hlpmenu opened this issue Mar 18, 2025 · 7 comments
Labels
  • BugReport: Issues describing a possible bug in the Go implementation.
  • gopls: Issues related to the Go language server, gopls.
  • Tools: This label describes issues relating to any tools in the x/tools repository.

@hlpmenu

hlpmenu commented Mar 18, 2025

gopls version

Build info

golang.org/x/tools/gopls v0.18.1
    golang.org/x/tools/gopls@v0.18.1 h1:2xJBNzdImS5u/kV/ZzqDLSvlBSeZX+pWY9uKVP7Pask=
    github.com/BurntSushi/toml@v1.4.1-0.20240526193622-a339e1f7089c h1:pxW6RcqyfI9/kWtOwnv/G+AzdKuy2ZrqINhenH4HyNs=
    github.com/google/go-cmp@v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
    golang.org/x/exp/typeparams@v0.0.0-20241210194714-1829a127f884 h1:1xaZTydL5Gsg78QharTwKfA9FY9CZ1VQj6D/AZEvHR0=
    golang.org/x/mod@v0.23.0 h1:Zb7khfcRGKk+kqfxFaP5tZqCnDZMjC5VtUBs87Hr6QM=
    golang.org/x/sync@v0.11.0 h1:GGz8+XQP4FvTTrjZPzNKTMFtSXH80RAzG+5ghFPgK9w=
    golang.org/x/telemetry@v0.0.0-20241220003058-cc96b6e0d3d9 h1:L2k9GUV2TpQKVRGMjN94qfUMgUwOFimSQ6gipyJIjKw=
    golang.org/x/text@v0.22.0 h1:bofq7m3/HAFvbF51jz3Q9wLg3jkvSPuiZu/pD1XwgtM=
    golang.org/x/tools@v0.30.1-0.20250221230316-5055f70f240c h1:Ja/5gV5a9Vvho3p2NC/T2TtxhHjrWS/2DvCKMvA0a+Y=
    golang.org/x/vuln@v1.1.3 h1:NPGnvPOTgnjBc9HTaUx+nj+EaUYxl5SJOWqaDYGaFYw=
    honnef.co/go/tools@v0.5.1 h1:4bH5o3b5ZULQ4UrBmP+63W9r7qIkqJClEA9ko5YKx+I=
    mvdan.cc/gofumpt@v0.7.0 h1:bg91ttqXmi9y2xawvkuMXyvAA/1ZGJqYAEGjXuP0JXU=
    mvdan.cc/xurls/v2@v2.5.0 h1:lyBNOm8Wo71UknhUs4QTFUNNMyxy2JEIaKKo0RWOh+8=
go: go1.24.1

go env

AR='ar'
CC='gcc'
CGO_CFLAGS='-O2 -g'
CGO_CPPFLAGS=''
CGO_CXXFLAGS='-O2 -g'
CGO_ENABLED='1'
CGO_FFLAGS='-O2 -g'
CGO_LDFLAGS='-O2 -g'
CXX='g++'
GCCGO='gccgo'
GO111MODULE='on'
GOAMD64='v1'
GOARCH='amd64'
GOAUTH='netrc'
GOBIN=''
GOCACHE='[redacted].cache/go-build'
GOCACHEPROG=''
GODEBUG=''
GOENV='[redacted].config/go/env'
GOEXE=''
GOEXPERIMENT=''
GOFIPS140='off'
GOFLAGS=''
GOGCCFLAGS='-fPIC -m64 -pthread -Wl,--no-gc-sections -fmessage-length=0 -ffile-prefix-map=/tmp/go-build1586616623=/tmp/go-build -gno-record-gcc-switches'
GOHOSTARCH='amd64'
GOHOSTOS='linux'
GOINSECURE=''
GOMOD='/dev/null'
GOMODCACHE='[redacted]go/pkg/mod'
GONOPROXY='github.com/hlpmenu'
GONOSUMDB='gopkg.hlmpn.dev'
GOOS='linux'
GOPATH='[redacted]go'
GOPRIVATE='[redacted]'
GOPROXY='direct'
GOROOT='/usr/local/go'
GOSUMDB='sum.golang.org'
GOTELEMETRY='local'
GOTELEMETRYDIR='[redacted].config/go/telemetry'
GOTMPDIR=''
GOTOOLCHAIN='auto'
GOTOOLDIR='/usr/local/go/pkg/tool/linux_amd64'
GOVCS=''
GOVERSION='go1.24.1'
GOWORK=''
PKG_CONFIG='pkg-config'

What did you do?

It seems to make no difference what you do to trigger it: after startup, memory starts creeping up by about 1-2 MB per second until it gets close to the OOM limit.

Note
Manually triggering a GC does actually collect, but memory keeps climbing until the next manual GC.
At the point where I had just restarted it, it was around 650 MB; manually triggering a GC pushed it down to 166 MB, after which it kept climbing again, and the next manual GC brought it back down to exactly 166 MB.

What did you see happen?

System etc

  • OS: Ubuntu 24.10
  • Kernel: 6.13.7-x64v3-t2-oracular-xanmod1

Memory starts climbing at 1-2 MB/s right from startup; manually triggering a GC frees the memory as you would expect to happen automatically.

Malloc calls are consistently far higher than frees. Example:

With no manual GC:
Malloc calls: 4,772,278
Frees: 212,506

With a manual GC, same instance:
Malloc calls: 5,251,943
Frees: 4,491,204

The process is idle while this is happening, and the logs show nothing unusual at all.
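
For context, the Malloc/Frees counters quoted above correspond to the Mallocs and Frees fields of runtime.MemStats, and their difference is roughly the number of live heap objects. A minimal sketch of reading them (an illustration, not part of the original report):

    package main

    import (
    	"fmt"
    	"runtime"
    )

    func main() {
    	var m runtime.MemStats
    	runtime.ReadMemStats(&m)
    	// Mallocs and Frees are cumulative counts of heap objects allocated
    	// and freed; a large, growing gap between them means garbage is not
    	// being collected.
    	fmt.Printf("mallocs=%d frees=%d live objects=%d heap=%d MB\n",
    		m.Mallocs, m.Frees, m.Mallocs-m.Frees, m.HeapAlloc>>20)
    }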

What did you expect to see?

No memleak lol

Editor and settings

The same thing occurs in both VS Code and Cursor.

code:

Version: 1.98.0
Commit: 6609ac3d66f4eade5cf376d1cb76f13985724bcb
Date: 2025-03-04T21:06:18.612Z (1 wk ago)
Browser: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Code/1.98.0 Chrome/132.0.6834.196 Electron/34.2.0 Safari/537.36

cursor:

Version: 0.47.8
Commit: 82ef0f61c01d079d1b7e5ab04d88499d5af500e0
Date: 2025-03-18T05:39:44.386Z (8 hrs ago)
Browser: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Cursor/0.47.8 Chrome/128.0.6613.186 Electron/32.2.6 Safari/537.36

And through the CLI.

VS Code Go-related settings

  "[go]": {

        "editor.defaultFormatter": "golang.go",
        "editor.codeLens": false,
        "go.diagnostic.vulncheck": "Imports",
    },
    "go.enableCodeLens": {
        "runtest": false,
    },

    // ## Linting
    "go.lintOnSave": "package",
    "go.lintTool": "golangci-lint",

    "go.playground": {
        "openbrowser": false,
        "share": false,
        "run": false
    },
    "go.showWelcome": false,
    "go.testExplorer.showOutput": false,
    "go.testExplorer.enable": false,
    "go.survey.prompt": false,
    "go.lintFlags": [
        "--fast"
    ],

    // ## Gopls
    "gopls": {
        "ui.semanticTokens": true,
        "build.directoryFilters": [
            "-node_modules",
            "-.git",
        ]
    },
    "go.toolsEnvVars": {
        "GOGC": "off"
    },

    // ## Formatting
    "go.formatTool": "default",

    // ## Go vet 
    "go.vetOnSave": "off",
    "go.vetFlags": [ ],

Logs

Attachments:

@hlpmenu added the gopls and Tools labels on Mar 18, 2025
@gopherbot added this to the Unreleased milestone on Mar 18, 2025
@gabyhelp added the BugReport label on Mar 18, 2025
@findleyr
Member

When gopls starts up, it type-checks and analyzes the workspace. This involves a lot of allocation, which becomes garbage once the analysis is complete. GC'ing collects this garbage.

It used to be the case that all of this memory was held indefinitely, so in your case the process would have sat at 650 MB as its steady state. We have since done a lot of work to make that memory collectable, so that it can be GC'ed and the steady-state memory is lower (it sounds like around 150 MB in your case).

There are also some asynchronous processes in gopls that will allocate in the background:

  • Updating the in-memory index of GOMODCACHE data.
  • Performing a GC of the file-based cache it uses for derived data.
  • Running a timed runtime GC.

All of these will also produce some garbage that must be GC'ed. I'm not sure it's typical for them to produce 1-2 MB of allocations per second; that seems a bit high, but note that loading the debug page also causes allocations. If this is causing problems, we could grab a timed profile to see what's going on.

I'm not following why this is a memory leak. If manual GC works, where is the leak? The garbage collector should eventually run (https://tip.golang.org/doc/gc-guide#GOGC for some description of the scheduling). Are you actually experiencing OOMs? If so, what is your memory limit? Can you please confirm that the GOGC environment variable is unset in your environment?

Also, for medium-sized repos, a 650 MB high-water mark with a 200 MB low-water mark is not abnormal. Syntax and type information consumes a significant amount of memory, many times the size of the source code. As indicated above, we've done a lot of work so that not all of this information needs to be in memory at once, but there is still a lot of allocation when this data is invalidated.
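
To make the GC scheduling referred to above concrete: with the collector enabled at the default GOGC=100, a collection runs roughly every time the live heap doubles, so background garbage cannot pile up indefinitely. A minimal standalone sketch (an illustration only, not gopls code):

    package main

    import (
    	"fmt"
    	"runtime"
    )

    // retained simulates long-lived data (like a server's caches); everything
    // else allocated below is short-lived garbage.
    var retained [][]byte

    func main() {
    	var m runtime.MemStats
    	for i := 0; i < 20; i++ {
    		// ~10 MB of short-lived garbage per round.
    		tmp := make([][]byte, 0, 10_000)
    		for j := 0; j < 10_000; j++ {
    			tmp = append(tmp, make([]byte, 1024))
    		}
    		// ~1 MB that stays live, so the GC target keeps moving.
    		retained = append(retained, make([]byte, 1<<20))

    		runtime.ReadMemStats(&m)
    		fmt.Printf("round %2d: heap=%3d MB, next GC at %3d MB, GCs so far=%d\n",
    			i, m.HeapAlloc>>20, m.NextGC>>20, m.NumGC)
    	}
    }

Run with GOGC at its default, the "GCs so far" counter keeps ticking up and the heap stays bounded; with GOGC=off it would not.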

@hlpmenu
Author

hlpmenu commented Mar 18, 2025

Maybe I was a bit unclear; I just used 650 MB as an example, as I had just pkill'd it.

It's not 600 MB, it's 30 GB... It does not GC at all, ever.

It grows by about 1-2 MB a second until it reaches the OOM point. Just now it was at 16 GB; I ran a manual GC and it went down to roughly 200 MB.

We're not talking about it jumping to 30 GB right away; it keeps growing at an almost constant rate. It also doesn't matter whether anything is being done on the system: even with the process sitting "sleeping", it still grows by the second.

Edit: Regarding OOM, yes, it goes on until either the OOM killer kicks in or, if you are using the computer, you notice when it completely freezes.
Also worth mentioning: if you look at the stats, it shows all the mallocs but 0 frees.

@prattmic
Member

    "go.toolsEnvVars": {
        "GOGC": "off"
    },

This sounds like it disables automatic GC, assuming go.toolsEnvVars applies to gopls.
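
For illustration, GOGC=off corresponds to debug.SetGCPercent(-1), which disables automatic collection entirely (unless a memory limit is reached). A small sketch of the resulting behavior, which matches what was reported here: the heap only shrinks when something calls runtime.GC explicitly (my own example, not gopls code):

    package main

    import (
    	"fmt"
    	"runtime"
    	"runtime/debug"
    )

    var sink [][]byte

    func heapMB() uint64 {
    	var m runtime.MemStats
    	runtime.ReadMemStats(&m)
    	return m.HeapAlloc >> 20
    }

    func main() {
    	debug.SetGCPercent(-1) // what GOGC=off does: no automatic collections

    	for i := 1; i <= 5; i++ {
    		// Allocate ~100 MB, then drop all references to it.
    		for j := 0; j < 100_000; j++ {
    			sink = append(sink, make([]byte, 1024))
    		}
    		sink = nil
    		fmt.Printf("after burst %d: heap=%d MB (garbage keeps accumulating)\n", i, heapMB())
    	}

    	runtime.GC() // a manual GC still reclaims everything, as observed above
    	fmt.Printf("after manual GC: heap=%d MB\n", heapMB())
    }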

@findleyr
Member

@prattmic thanks for spotting that! That's the problem.

(Aside: perhaps we should include GOGC in go env output; I did look there :) )

@findleyr
Member

Closing as WAI (working as intended).

@findleyr closed this as not planned (won't fix, can't repro, duplicate, or stale) on Mar 18, 2025
@prattmic
Member

prattmic commented Mar 18, 2025

(Aside: perhaps we should include GOGC in go env output; I did look there :) )

I think that would be nice, though I don't think it would have helped here, since go env was probably run outside of the go.toolsEnvVars context.

Since gopls has a memory diagnostics web UI, which @hlpmenu used, it may be helpful to include the GOGC and GOMEMLIMIT settings there.
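
If that were added, one way such values could be read at runtime (a sketch under my own assumptions, not the gopls implementation) is via runtime/debug: SetGCPercent only reports the previous value, so it has to be set and restored, while SetMemoryLimit with a negative argument reports the current limit without changing it.

    package main

    import (
    	"fmt"
    	"runtime/debug"
    )

    func main() {
    	// There is no direct getter for GOGC: SetGCPercent returns the previous
    	// value, so read it by setting a temporary value and restoring it.
    	gogc := debug.SetGCPercent(100)
    	debug.SetGCPercent(gogc)

    	// A negative argument queries the current memory limit (GOMEMLIMIT)
    	// without modifying it.
    	limit := debug.SetMemoryLimit(-1)

    	fmt.Printf("GOGC=%d (-1 means the collector is disabled)\n", gogc)
    	fmt.Printf("GOMEMLIMIT=%d bytes\n", limit)
    }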
