
go/parser,go/types: memory usage doubled in 1.17 #49035

Closed
danmillwood opened this issue Oct 18, 2021 · 14 comments

Labels
FrozenDueToAge, NeedsInvestigation (Someone must examine and confirm this is a valid issue and not a duplicate of an existing one.)

Comments

@danmillwood

What version of Go are you using (go version)?

$ go version
go version go1.17.2 linux/amd64

Does this issue reproduce with the latest release?

Yes (assuming 1.17.2 is the latest release).

What operating system and processor architecture are you using (go env)?

go env Output
$ go env
GO111MODULE=""
GOARCH="amd64"
GOBIN=""
GOCACHE="/root/.cache/go-build"
GOENV="/root/.config/go/env"
GOEXE=""
GOEXPERIMENT=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GOMODCACHE="/root/go/pkg/mod"
GONOPROXY=""
GONOSUMDB=""
GOOS="linux"
GOPATH="/root/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/usr/local/go"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/usr/local/go/pkg/tool/linux_amd64"
GOVCS=""
GOVERSION="go1.17.2"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD="/dev/null"
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build2626739605=/tmp/go-build -gno-record-gcc-switches"

What did you do?

The Kubernetes open source code includes a verification step that performs cross-compile validation for a number of target systems. I'm not familiar with the code, but the README states that it does the following:

Typecheck does cross-platform typechecking of source code for all Go build
platforms.

The primary benefit is speed: a full Kubernetes cross-build takes 20 minutes
and >40GB of RAM, while this takes under 2 minutes and <8GB of RAM.

It uses Go's built-in parsing and typechecking libraries (go/parser and
go/types)

https://github.com/kubernetes/kubernetes/blob/master/test/typecheck/README
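
For context, here is a minimal sketch of what a typecheck step built on those libraries looks like; this is not the Kubernetes tool itself, and the source snippet and populated Info maps are illustrative:

package main

import (
    "fmt"
    "go/ast"
    "go/importer"
    "go/parser"
    "go/token"
    "go/types"
    "log"
)

const src = `package example

func Add(a, b int) int { return a + b }
`

func main() {
    fset := token.NewFileSet()
    // Parse one source file; the real tool walks every package for
    // every GOOS/GOARCH pair it checks.
    f, err := parser.ParseFile(fset, "example.go", src, 0)
    if err != nil {
        log.Fatal(err)
    }

    // The Info maps are where go/types records a type for every expression
    // and an object for every identifier; the recordTypeAndValue and
    // recordUse entries in the heap profiles later in this thread are
    // writes into these maps.
    info := &types.Info{
        Types: make(map[ast.Expr]types.TypeAndValue),
        Defs:  make(map[*ast.Ident]types.Object),
        Uses:  make(map[*ast.Ident]types.Object),
    }
    conf := types.Config{Importer: importer.Default()}
    pkg, err := conf.Check("example", fset, []*ast.File{f}, info)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("type-checked", pkg.Path(), "with", len(info.Types), "typed expressions")
}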

With Go 1.16 and below, these checks could be run on a system with 8GB of RAM. With Go 1.17.2, more than double that amount of RAM is needed to avoid out-of-memory failures. It appears something in Go 1.17 is using significantly more memory.

The shell script to run the check is here https://github.com/kubernetes/kubernetes/blob/master/hack/verify-typecheck.sh

The OOM can be recreated by cloning the Kubernetes source from https://github.com/kubernetes/kubernetes and running hack/verify-typecheck.sh with a Go 1.17.2 toolchain on a Linux VM with 8GB of memory, as sketched below.
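
A sketch of the repro under those assumptions (directory names are illustrative):

# On a Linux VM with 8GB of RAM; the OOM appears partway through the run.
git clone https://github.com/kubernetes/kubernetes
cd kubernetes
go version                  # expect go1.17.2 linux/amd64
hack/verify-typecheck.sh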

What did you expect to see?

I modified hack/lib/golang.sh to set minimum_go_version=go1.16.0 and ran the test with go 1.16.9. The result was

root@kubernetes1804:~/go/src/k8s.io/kubernetes# hack/verify-typecheck.sh
type-checking windows/arm64
type-checking windows/386
ERROR(windows/arm64): failed to verify: err: exit status 2: stderr: cmd/go: unsupported GOOS/GOARCH pair windows/arm64

type-checking darwin/amd64
type-checking linux/amd64
type-checking linux/386
type-checking linux/arm
type-checking windows/amd64
type-checking linux/arm64
type-checking linux/ppc64le
type-checking linux/s390x
type-checking darwin/arm64
exit status 1

What did you see instead?

With go 1.17.2 I get

root@kubernetes1804:~/go/src/k8s.io/kubernetes# hack/verify-typecheck.sh
Generating prerelease lifecycle code for 6 targets
Generating deepcopy code for 36 targets
Generating defaulter code for 13 targets
Generating conversion code for 19 targets
Generating openapi code for KUBE
Generating openapi code for AGGREGATOR
Generating openapi code for APIEXTENSIONS
Generating openapi code for CODEGEN
Generating openapi code for SAMPLEAPISERVER
type-checking windows/arm64
type-checking linux/arm
fatal error: runtime: out of memory

runtime stack:
runtime.throw({0x6205fa, 0x800000})
	/usr/local/go/src/runtime/panic.go:1198 +0x71
runtime.sysMap(0xc2c5800000, 0x428c40, 0xc1688bbe90)
	/usr/local/go/src/runtime/mem_linux.go:169 +0x96
runtime.(*mheap).grow(0x7bb680, 0x242)
	/usr/local/go/src/runtime/mheap.go:1393 +0x225
runtime.(*mheap).allocSpan(0x7bb680, 0x242, 0x0, 0x0)
	/usr/local/go/src/runtime/mheap.go:1179 +0x165
runtime.(*mheap).alloc.func1()
	/usr/local/go/src/runtime/mheap.go:913 +0x69
runtime.systemstack()
	/usr/local/go/src/runtime/asm_amd64.s:383 +0x49


I'd like to be able to let the Kubernetes community know whether this is a performance regression that can be fixed, or whether folks verifying a Kubernetes build will need significantly more memory.

@danmillwood
Author

full-stack-trace.txt

@seankhliao
Member

#45580 indicated there shouldn't be such an increase.

@seankhliao changed the title from "Memory usage more than doubled when using go/parser and go/types in 1.17" to "go/parser,go/types: memory usage more than doubled in 1.17" on Oct 18, 2021
@seankhliao
Member

cc @findleyr

@seankhliao added the NeedsInvestigation label on Oct 18, 2021
@findleyr
Contributor

CC @griesemer

Thanks for reporting, and for the repro. I can investigate.

I don't think go/types memory usage more than doubled in 1.17 generally -- we'd have known by now if that were the case. But it seems likely based on this report that there is a significant regression that this repro exercises.

#45580 was a best-effort audit of the changes, but certainly could have missed something.

@findleyr self-assigned this on Oct 18, 2021
@danmillwood
Author

danmillwood commented Oct 20, 2021

Hello @findleyr, thanks for taking a look at this. To provide more insight into what is going on, I increased the memory of my VM to 16GB and used the top command to watch memory usage while running the same script with Go 1.16.9 and Go 1.17.2. Go 1.16.9 peaked at 50.6% of memory and Go 1.17.2 at 97.6%. I will attach the two screenshots I took.
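
For what it's worth, a sketch of capturing the peak more directly with GNU time (assuming /usr/bin/time is GNU time; the top percentages above depend on total VM memory):

# Reports "Maximum resident set size (kbytes)" for the run.
/usr/bin/time -v hack/verify-typecheck.sh 2>&1 | grep "Maximum resident set size"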

@danmillwood
Author

go1 16 9-top
go1 17 2-top

@danmillwood changed the title from "go/parser,go/types: memory usage more than doubled in 1.17" to "go/parser,go/types: memory usage doubled in 1.17" on Oct 20, 2021
@cuonglm
Member

cuonglm commented Oct 20, 2021

Here's the go tool pprof result when running hack/verify-typecheck.sh -cross=false.

go1.16.8:

      flat  flat%   sum%        cum   cum%
  566.93MB 34.47% 34.47%   566.93MB 34.47%  go/types.(*Checker).recordTypeAndValue
  134.81MB  8.20% 42.66%   134.81MB  8.20%  go/types.(*Checker).recordUse (inline)
   95.59MB  5.81% 48.47%   102.08MB  6.21%  go/parser.(*parser).parseIdent
   50.56MB  3.07% 51.55%    50.56MB  3.07%  go/types.(*Scope).Insert
   38.59MB  2.35% 53.89%    38.59MB  2.35%  go/ast.NewObj
   36.97MB  2.25% 56.14%   116.40MB  7.08%  go/parser.(*parser).parseOperand
   35.23MB  2.14% 58.28%    54.09MB  3.29%  go/types.(*Checker).recordSelection
   34.42MB  2.09% 60.37%    34.42MB  2.09%  go/types.NewScope
      33MB  2.01% 62.38%       33MB  2.01%  go/types.(*Checker).recordDef
   31.76MB  1.93% 64.31%    75.47MB  4.59%  go/parser.(*parser).parseCallOrConversion

go1.17.1:

      flat  flat%   sum%        cum   cum%
 1025.87MB 26.35% 26.35%  1025.87MB 26.35%  go/types.(*Checker).recordTypeAndValue
  924.97MB 23.76% 50.11%   924.97MB 23.76%  go/types.sanitizer.typ
  277.28MB  7.12% 57.24%   277.28MB  7.12%  go/types.(*Checker).recordUse (inline)
  179.31MB  4.61% 61.84%   187.85MB  4.83%  go/parser.(*parser).parseIdent
  109.08MB  2.80% 64.65%   109.08MB  2.80%  go/types.(*Scope).Insert
   80.95MB  2.08% 66.72%   117.22MB  3.01%  go/types.(*Checker).recordSelection
   79.46MB  2.04% 68.77%    79.46MB  2.04%  go/types.NewScope
   69.92MB  1.80% 70.56%    69.92MB  1.80%  go/ast.NewObj
   60.17MB  1.55% 72.11%    60.17MB  1.55%  go/types.(*Checker).recordDef
   56.57MB  1.45% 73.56%   139.27MB  3.58%  go/parser.(*parser).parseCallOrConversion
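
For reference, heap profiles like the above can be produced by having the typecheck program write one out at the end of its run; a sketch, assuming a snippet like this is added to the tool (memprofile.out is an arbitrary name):

package main

import (
    "log"
    "os"
    "runtime"
    "runtime/pprof"
)

// writeHeapProfile dumps the current heap profile to memprofile.out so it
// can be inspected later with `go tool pprof memprofile.out`.
func writeHeapProfile() {
    f, err := os.Create("memprofile.out")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    runtime.GC() // run a GC so the profile reflects recent allocation statistics
    if err := pprof.WriteHeapProfile(f); err != nil {
        log.Fatal(err)
    }
}

func main() {
    // In the real tool this would run after type checking completes.
    writeHeapProfile()
}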

@cuonglm
Member

cuonglm commented Oct 20, 2021

And here's the result for current tip 4e565f7:

      flat  flat%   sum%        cum   cum%
  605.48MB 31.88% 31.88%   605.48MB 31.88%  go/types.(*Checker).recordTypeAndValue
  145.96MB  7.69% 39.57%   145.96MB  7.69%  go/types.(*Checker).recordUse (inline)
  110.91MB  5.84% 45.41%   119.25MB  6.28%  go/parser.(*parser).parseIdent
   60.17MB  3.17% 48.58%    60.17MB  3.17%  go/types.(*Scope).insert
   50.02MB  2.63% 51.21%    50.02MB  2.63%  go/types.NewScope
   41.54MB  2.19% 53.40%    42.23MB  2.22%  os.ReadFile
   41.18MB  2.17% 55.57%    41.18MB  2.17%  go/ast.NewObj
   39.82MB  2.10% 57.66%   121.19MB  6.38%  go/parser.(*parser).parseOperand
   37.86MB  1.99% 59.66%    37.86MB  1.99%  go/token.(*File).AddLine
   36.97MB  1.95% 61.60%    59.53MB  3.13%  go/types.(*Checker).recordSelection

So it sounds like the sanitizer is the problem, which we removed in https://go-review.googlesource.com/c/go/+/335929.

cc @findleyr

@findleyr
Contributor

Thanks for looking into this @danmillwood and @cuonglm! I confess I haven't gotten to debugging this yet.

The sanitizer sounds like a highly likely culprit. It will produce a lot of garbage, which in the context of something like gopls' steady-state memory might not be noticeable, but could easily spike the high-water mark in a script.

This is probably a big enough regression to fix for the next point release. We can just remove the sanitization pass in 1.17, as it's not necessary for non-generic code, and no longer applies at tip 😞.

Before patching, I'll need to confirm that this is the problem. I can do this, but @cuonglm or @danmillwood if you're interested you can try commenting out the sanitization pass here: https://cs.opensource.google/go/go/+/refs/tags/go1.17.2:src/go/types/check.go;l=278;drc=1607c2817241bd141af9331a3e6c3148e5cd5d8b
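
For anyone trying that experiment, a sketch of the change against go1.17.2's src/go/types/check.go; the exact surrounding lines are an assumption based on the linked source, not a verified diff:

// In (*Checker).checkFiles, the sanitization pass is guarded by an Info
// check; the call to comment out for this experiment is, roughly:
//
//    if check.Info != nil {
//        sanitizeInfo(check.Info)
//    }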

@griesemer
Contributor

In the sanitizer, is it the maps used to detect duplicates? Otherwise, without generics, no allocations should happen (except perhaps implicit allocations due to careless coding). Would be good to verify.

@findleyr
Contributor

@griesemer makes a good point. I assumed it was the map writes, but it would be good to confirm via pprof list.
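
A sketch of that check, assuming a heap profile was written as above (the profile and binary names are illustrative):

# Annotated per-line allocation for the sanitizer, to see whether the map
# writes dominate.
go tool pprof -alloc_space -list 'sanitize' ./typecheck memprofile.out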

@cuonglm
Member

cuonglm commented Oct 21, 2021

@findleyr Here's the result with go1.17.2 with the sanitizer pass commented out:

      flat  flat%   sum%        cum   cum%
  637.58MB 31.27% 31.27%   637.58MB 31.27%  go/types.(*Checker).recordTypeAndValue
  152.11MB  7.46% 38.73%   152.11MB  7.46%  go/types.(*Checker).recordUse
  125.16MB  6.14% 44.87%   134.94MB  6.62%  go/parser.(*parser).parseIdent
   64.78MB  3.18% 48.04%    64.78MB  3.18%  go/types.(*Scope).Insert
   50.21MB  2.46% 50.50%    50.21MB  2.46%  go/token.(*File).AddLine
   46.36MB  2.27% 52.78%    46.36MB  2.27%  go/ast.NewObj
   45.39MB  2.23% 55.00%    45.39MB  2.23%  go/types.NewScope
   40.30MB  1.98% 56.98%   130.80MB  6.41%  go/parser.(*parser).parseOperand
   39.86MB  1.95% 58.93%    96.80MB  4.75%  go/parser.(*parser).parseCallOrConversion
   38.57MB  1.89% 60.83%    38.57MB  1.89%  go/types.(*Checker).recordDef

But even with this patch, running hack/verify-typecheck.sh for all platforms still reports ~12GB of memory usage (via Activity Monitor on my M1).

@findleyr
Contributor

Sorry for the delay on this; it got pushed back by work toward the 1.18 freeze. This still warrants investigation, and possibly a cherry-picked fix, particularly considering other reports like #49883.

I will try to get to this by the end of next week.

@findleyr
Contributor

findleyr commented Aug 9, 2022

Unfortunately this didn't get done, sorry.

We're now on Go 1.19, so I expect this is now obsolete as both supported Go versions have the fix.

@findleyr closed this as not planned (won't fix, can't repro, duplicate, stale) on Aug 9, 2022
@golang locked and limited conversation to collaborators on Aug 9, 2023