
runtime: 2-3x memory usage increase when running in container, 1.16 vs 1.15 (MADV_DONTNEED related) #44554

Open
pitr opened this issue Feb 23, 2021 · 21 comments
Labels: compiler/runtime (Issues related to the Go compiler and/or runtime), NeedsInvestigation (Someone must examine and confirm this is a valid issue and not a duplicate of an existing one)
Comments


pitr commented Feb 23, 2021

What version of Go are you using (go version)?

$ go version
go version go1.16 linux/amd64

Does this issue reproduce with the latest release?

Yes, in 1.16

What operating system and processor architecture are you using (go env)?

go env Output
$ go env
GO111MODULE=""
GOARCH="amd64"
GOBIN=""
GOCACHE="/root/.cache/go-build"
GOENV="/root/.config/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GOMODCACHE="/go/pkg/mod"
GONOPROXY=""
GONOSUMDB=""
GOOS="linux"
GOPATH="/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/usr/local/go"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/usr/local/go/pkg/tool/linux_amd64"
GOVCS=""
GOVERSION="go1.16"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD="/dev/null"
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build457535914=/tmp/go-build -gno-record-gcc-switches"

What did you do?

After upgrading our application to Go 1.16, it started using 2-3x more memory. Unfortunately, I cannot provide steps to reproduce this, as it is an internal piece of software.

What did you expect to see?

Expected to see the same memory usage profile or better (due to MADV_DONTNEED being enabled).

What did you see instead?

Two series with low (~50 MB) memory usage are generated by binaries built with Go 1.15; the others are all 1.16. The last series is built with GODEBUG=madvdontneed=0 (which doesn't seem to make any difference).

[Screenshot: memory usage graph, 2021-02-23 19:35]

$ cat go.mod
module ...

go 1.16

require (
    github.com/go-ole/go-ole v1.2.5 // indirect
    github.com/go-redis/redis/v7 v7.4.0
    github.com/golang/protobuf v1.4.3 // indirect
    github.com/gorilla/mux v1.8.0
    github.com/magefile/mage v1.11.0 // indirect
    github.com/matryer/is v1.4.0
    github.com/opentracing-contrib/go-stdlib v1.0.0
    github.com/opentracing/opentracing-go v1.2.0
    github.com/rcrowley/go-metrics v0.0.0-20201227073835-cf1acfcdf475
    github.com/shirou/gopsutil v3.21.1+incompatible // indirect
    github.com/sirupsen/logrus v1.8.0
    golang.org/x/net v0.0.0-20210220033124-5f55cee0dc0d // indirect
    golang.org/x/oauth2 v0.0.0-20210220000619-9bb904979d93
    golang.org/x/sys v0.0.0-20210220050731-9a76102bfb43 // indirect
    golang.org/x/text v0.3.5 // indirect
    google.golang.org/appengine v1.6.7 // indirect
    google.golang.org/genproto v0.0.0-20210219173056-d891e3cb3b5b // indirect
    google.golang.org/grpc v1.35.0 // indirect
    gopkg.in/yaml.v2 v2.2.5 // indirect
)
@ALTree ALTree changed the title 2-3x memory usage increase in 1.16 vs 1.15 runtime: 2-3x memory usage increase in 1.16 vs 1.15 Feb 23, 2021
ALTree (Member) commented Feb 23, 2021

cc @mknyszek

@ALTree ALTree added the NeedsInvestigation Someone must examine and confirm this is a valid issue and not a duplicate of an existing one. label Feb 23, 2021
seankhliao (Member) commented:

@pitr have you looked at a memory profile to see where the extra memory is being used?

Note that GODEBUG=madvdontneed is a run-time setting, not a build-time one.
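Since GODEBUG is read when the process starts, the variable has to be set in the environment of the running binary (e.g. GODEBUG=madvdontneed=0 ./server), not when invoking go build. A minimal, hypothetical sketch for observing the effect from inside the program: allocate and drop a large buffer, force a scavenge, and read runtime.MemStats.HeapReleased, while watching RSS from outside (e.g. /proc/<pid>/status) under each GODEBUG value:

```go
// Hypothetical sketch: report how much heap the Go runtime has
// returned to the OS. Run it twice, once with GODEBUG=madvdontneed=1
// and once with GODEBUG=madvdontneed=0, and compare externally
// observed RSS; the madvise policy only affects how the kernel
// accounts for the released pages, not HeapReleased itself.
package main

import (
	"fmt"
	"runtime"
	"runtime/debug"
)

// heapReleasedAfterFree allocates ~64 MB, drops it, forces the runtime
// to return freed pages to the OS, and reports the cumulative bytes
// released (runtime.MemStats.HeapReleased).
func heapReleasedAfterFree() uint64 {
	buf := make([]byte, 64<<20)
	for i := range buf {
		buf[i] = byte(i) // touch pages so they are really backed
	}
	buf = nil
	runtime.GC()
	debug.FreeOSMemory() // GC plus an immediate scavenge of free spans

	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	return m.HeapReleased
}

func main() {
	fmt.Printf("HeapReleased: %d MB\n", heapReleasedAfterFree()>>20)
}
```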

mknyszek (Contributor) commented:

It's very hard for me to tell whether this is a purely runtime thing just from the information you've provided. My first question is what (in very precise terms) is that graph showing? RSS? RSS+swap? RSS+swap+cache? Some runtime.MemStats field?

After knowing that, the next thing to determine is whether that extra memory used is actually just stale free memory or fragmentation or something (that's more likely a runtime issue) or whether it's that more objects are alive (this could be a runtime issue, but also maybe a compiler issue, or an issue with a standard library package, or something else).

For this purpose, running your application with the environment variable GODEBUG=gctrace=1 for both runs would be very helpful. It would be nice to compare that to RSS if that's not what your graph is showing (if it is, then great!).

As @seankhliao suggested, looking at a memory profile (specifically one for each of the Go 1.15 and Go 1.16 runs) would be helpful too, because if it is actually the case that more objects are live, this will tell you which ones they are. go tool pprof has a --diff_base flag that will allow you to get a diff between the two.

Lastly, the most effective thing here (since the difference is so significant) is to actually just start bisecting the Go repository between 1.15 and 1.16 to pinpoint the exact change that caused the issue. I would normally just do that last one myself but alas I need steps to reproduce. I can help you with building the Go toolchain and then building your application with it if it comes to that.

pitr (Author) commented Feb 24, 2021

Thanks for the great suggestions; I will try pprof next. In the meantime, I changed GODEBUG=madvdontneed to be a runtime setting, and it seems to help, restoring pre-1.16 memory usage:

[Screenshot: memory usage graph, 2021-02-24 10:44]

Regarding the numbers shown, the metric comes from container_memory_working_set_bytes, which is the "current working set in bytes".
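Since the working-set metric is an OS-level (cgroup) view, it can help to log the Go runtime's own view alongside it. A rough, hypothetical sketch: runtime.MemStats.Sys is everything the runtime obtained from the OS, and HeapReleased is what it has already given back, so Sys minus HeapReleased approximates what the runtime still considers mapped and in use:

```go
// Hypothetical sketch: approximate the memory the Go runtime believes
// it is holding from the OS, for side-by-side comparison with an
// external metric such as container_memory_working_set_bytes. This is
// only an approximation: how the kernel accounts for madvised pages
// (MADV_FREE vs MADV_DONTNEED) affects the external number too.
package main

import (
	"fmt"
	"runtime"
)

// runtimeRetainedBytes returns Sys minus HeapReleased: bytes obtained
// from the OS that the runtime has not yet returned.
func runtimeRetainedBytes() uint64 {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	return m.Sys - m.HeapReleased
}

func main() {
	fmt.Printf("runtime retained ~= %d MB\n", runtimeRetainedBytes()>>20)
}
```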


szuecs commented Feb 24, 2021

To add a bit of environment information to @pitr's report: container_memory_working_set_bytes is recorded by cAdvisor v0.34.0 metrics.


AZ-X commented Feb 24, 2021

I can confirm the same phenomenon with repique.
Repique is a DNS stub released on Feb 18. It ships with builds against the two major Go versions (1.15, 1.16) of the standard libraries.
When switching to the Go 1.16 standard library, memory cost increases by about 1-2 MB (primary usage: TLS 1.3 + HTTP/2).
You can download it here.
This usually wouldn't be worth an issue; nevertheless, I am testing dry-run releases for OpenWrt on a MIPS device with mem=64M.
repique dnscrypt-proxy-r2-legacy

pitr (Author) commented Feb 26, 2021

I was able to look into both gctrace and pprof (inuse_space). Since I already pinpointed that this issue can be reproduced simply by switching madvdontneed, I am comparing 1.16 binaries run with madvdontneed=1 against madvdontneed=0, and no longer looking at 1.15.

First, the gctrace=1 data:

madvdontneed=1 (go1.16): average container_memory_working_set_bytes is 434 MB

Feb 26 12:01:09.658 gc 1 @0.018s 0%: 0.017+0.38+0.020 ms clock, 0.14+0.10/0.41/0.47+0.16 ms cpu, 4->4->0 MB, 5 MB goal, 8 P
Feb 26 12:01:09.685 gc 2 @0.046s 0%: 0.017+0.55+0.003 ms clock, 0.14+0.29/0.44/0.95+0.024 ms cpu, 4->4->1 MB, 5 MB goal, 8 P
Feb 26 12:01:09.749 gc 3 @0.110s 0%: 0.029+0.37+0.004 ms clock, 0.23+0/0.51/1.3+0.032 ms cpu, 4->4->1 MB, 5 MB goal, 8 P
Feb 26 12:01:09.777 gc 4 @0.137s 0%: 0.029+0.75+0.17 ms clock, 0.23+0/0.58/1.6+1.3 ms cpu, 5->5->4 MB, 6 MB goal, 8 P
Feb 26 12:01:09.810 gc 5 @0.169s 0%: 0.024+2.1+0.024 ms clock, 0.19+0/0.58/3.0+0.19 ms cpu, 9->11->9 MB, 10 MB goal, 8 P
Feb 26 12:01:09.872 gc 6 @0.230s 0%: 0.020+3.5+0.032 ms clock, 0.16+0/3.7/1.3+0.25 ms cpu, 23->23->17 MB, 24 MB goal, 8 P
Feb 26 12:01:09.990 gc 7 @0.348s 0%: 0.031+2.4+0.028 ms clock, 0.25+0/0.69/3.4+0.22 ms cpu, 34->34->23 MB, 35 MB goal, 8 P
Feb 26 12:01:10.047 gc 8 @0.401s 0%: 0.058+7.6+0.018 ms clock, 0.47+0/0.96/7.1+0.14 ms cpu, 56->64->53 MB, 57 MB goal, 8 P
Feb 26 12:01:10.296 gc 9 @0.643s 0%: 0.59+13+0.021 ms clock, 4.7+0.32/14/28+0.17 ms cpu, 132->133->101 MB, 133 MB goal, 8 P
Feb 26 12:01:10.780 gc 10 @1.127s 0%: 0.044+13+0.019 ms clock, 0.35+0.40/27/67+0.15 ms cpu, 193->195->92 MB, 202 MB goal, 8 P
Feb 26 12:02:06.125 gc 11 @56.469s 0%: 0.33+16+0.003 ms clock, 2.6+3.0/32/91+0.030 ms cpu, 179->179->93 MB, 184 MB goal, 8 P
Feb 26 12:02:23.699 gc 12 @74.037s 0%: 0.13+22+0.021 ms clock, 1.0+0.36/45/113+0.17 ms cpu, 182->185->127 MB, 187 MB goal, 8 P
Feb 26 12:03:23.655 gc 13 @133.995s 0%: 0.089+21+0.020 ms clock, 0.71+0.52/42/106+0.16 ms cpu, 249->251->119 MB, 255 MB goal, 8 P
Feb 26 12:04:23.241 gc 14 @193.583s 0%: 0.15+18+0.019 ms clock, 1.2+0/37/94+0.15 ms cpu, 233->233->99 MB, 239 MB goal, 8 P
Feb 26 12:04:23.757 gc 15 @194.090s 0%: 0.059+27+0.020 ms clock, 0.47+0.38/53/116+0.16 ms cpu, 194->198->152 MB, 199 MB goal, 8 P
Feb 26 12:05:23.481 gc 16 @253.821s 0%: 0.10+20+0.019 ms clock, 0.87+0/41/107+0.15 ms cpu, 303->303->105 MB, 304 MB goal, 8 P
Feb 26 12:06:17.127 gc 17 @307.462s 0%: 0.27+24+0.064 ms clock, 2.2+91/48/0+0.51 ms cpu, 206->211->98 MB, 211 MB goal, 8 P
Feb 26 12:06:23.711 gc 18 @314.050s 0%: 0.18+21+0.018 ms clock, 1.5+6.9/42/88+0.14 ms cpu, 189->189->134 MB, 196 MB goal, 8 P
Feb 26 12:06:24.332 gc 19 @314.663s 0%: 0.063+29+0.017 ms clock, 0.50+0.65/59/144+0.14 ms cpu, 262->267->183 MB, 269 MB goal, 8 P
Feb 26 12:07:23.695 gc 20 @374.031s 0%: 0.55+24+0.023 ms clock, 4.4+15/47/108+0.18 ms cpu, 372->373->175 MB, 373 MB goal, 8 P
Feb 26 12:08:23.598 gc 21 @433.939s 0%: 0.14+20+0.004 ms clock, 1.1+0/39/100+0.033 ms cpu, 348->348->118 MB, 350 MB goal, 8 P
Feb 26 12:09:21.126 gc 22 @491.464s 0%: 0.35+22+0.003 ms clock, 2.8+32/45/74+0.026 ms cpu, 230->231->96 MB, 236 MB goal, 8 P
Feb 26 12:09:24.135 gc 23 @494.466s 0%: 0.18+29+0.018 ms clock, 1.5+5.7/58/129+0.14 ms cpu, 187->187->126 MB, 192 MB goal, 8 P
Feb 26 12:10:23.632 gc 24 @553.969s 0%: 0.13+23+0.034 ms clock, 1.0+0.31/47/93+0.27 ms cpu, 246->249->131 MB, 252 MB goal, 8 P
Feb 26 12:10:37.119 gc 25 @567.462s 0%: 0.16+18+0.003 ms clock, 1.3+1.6/35/102+0.024 ms cpu, 255->256->93 MB, 262 MB goal, 8 P
Feb 26 12:11:23.511 gc 26 @613.849s 0%: 0.090+23+0.025 ms clock, 0.72+0.48/44/97+0.20 ms cpu, 186->186->122 MB, 187 MB goal, 8 P
Feb 26 12:11:24.240 gc 27 @614.573s 0%: 0.052+28+0.021 ms clock, 0.42+0.70/57/139+0.16 ms cpu, 238->243->167 MB, 244 MB goal, 8 P
Feb 26 12:12:23.612 gc 28 @673.948s 0%: 0.15+23+0.17 ms clock, 1.2+0/47/95+1.3 ms cpu, 328->328->134 MB, 334 MB goal, 8 P
Feb 26 12:12:31.381 gc 29 @681.709s 0%: 0.071+32+0.019 ms clock, 0.57+0/64/175+0.15 ms cpu, 263->263->150 MB, 269 MB goal, 8 P
Feb 26 12:13:38.124 gc 30 @748.463s 0%: 0.69+20+0.005 ms clock, 5.5+31/40/67+0.043 ms cpu, 292->294->96 MB, 300 MB goal, 8 P
Feb 26 12:14:23.452 gc 31 @793.789s 0%: 0.14+23+0.018 ms clock, 1.1+0/46/98+0.14 ms cpu, 191->191->122 MB, 192 MB goal, 8 P
Feb 26 12:14:24.223 gc 32 @794.556s 0%: 0.059+27+0.015 ms clock, 0.47+0.49/55/135+0.12 ms cpu, 239->244->163 MB, 245 MB goal, 8 P
Feb 26 12:15:23.674 gc 33 @854.011s 0%: 0.10+24+0.019 ms clock, 0.85+0.33/45/115+0.15 ms cpu, 319->322->126 MB, 327 MB goal, 8 P
Feb 26 12:16:23.368 gc 34 @913.703s 0%: 0.26+25+0.018 ms clock, 2.0+1.0/50/99+0.14 ms cpu, 246->248->112 MB, 252 MB goal, 8 P
Feb 26 12:16:23.704 gc 35 @914.034s 0%: 0.54+30+0.045 ms clock, 4.3+7.2/60/103+0.36 ms cpu, 231->234->191 MB, 232 MB goal, 8 P
Feb 26 12:17:01.168 gc 36 @951.479s 0%: 0.14+49+0.003 ms clock, 1.1+51/67/38+0.028 ms cpu, 373->378->101 MB, 383 MB goal, 8 P
Feb 26 12:17:23.759 gc 37 @974.093s 0%: 0.088+26+0.019 ms clock, 0.70+0.35/52/130+0.15 ms cpu, 198->201->139 MB, 203 MB goal, 8 P
Feb 26 12:18:23.664 gc 38 @1033.997s 0%: 0.17+27+0.039 ms clock, 1.3+0.29/54/96+0.31 ms cpu, 271->275->134 MB, 278 MB goal, 8 P
Feb 26 12:18:24.394 gc 39 @1034.713s 0%: 0.065+41+0.026 ms clock, 0.52+1.5/83/137+0.20 ms cpu, 261->270->193 MB, 268 MB goal, 8 P
Feb 26 12:19:23.935 gc 40 @1094.267s 0%: 0.11+28+0.019 ms clock, 0.88+0/55/140+0.15 ms cpu, 402->402->172 MB, 403 MB goal, 8 P
Feb 26 12:20:23.457 gc 41 @1153.793s 0%: 0.15+24+0.017 ms clock, 1.2+0/46/94+0.14 ms cpu, 338->338->126 MB, 345 MB goal, 8 P
Feb 26 12:20:24.207 gc 42 @1154.542s 0%: 0.10+25+0.028 ms clock, 0.81+0.29/51/124+0.22 ms cpu, 246->249->152 MB, 252 MB goal, 8 P
Feb 26 12:20:26.141 gc 43 @1156.462s 0%: 0.17+40+0.018 ms clock, 1.4+6.7/79/184+0.14 ms cpu, 296->297->182 MB, 304 MB goal, 8 P
Feb 26 12:21:23.731 gc 44 @1214.060s 0%: 0.17+31+0.027 ms clock, 1.4+13/62/122+0.21 ms cpu, 362->364->180 MB, 365 MB goal, 8 P
Feb 26 12:22:23.654 gc 45 @1273.992s 0%: 0.15+23+0.020 ms clock, 1.2+0.35/46/114+0.16 ms cpu, 352->355->119 MB, 361 MB goal, 8 P
Feb 26 12:23:23.431 gc 46 @1333.769s 0%: 0.11+22+0.024 ms clock, 0.90+3.7/45/93+0.19 ms cpu, 234->234->108 MB, 239 MB goal, 8 P
Feb 26 12:23:23.732 gc 47 @1334.066s 0%: 0.056+27+0.026 ms clock, 0.45+0.33/53/122+0.21 ms cpu, 211->214->163 MB, 216 MB goal, 8 P
Feb 26 12:24:23.312 gc 48 @1393.652s 0%: 0.13+21+0.021 ms clock, 1.1+0/40/107+0.16 ms cpu, 323->323->106 MB, 327 MB goal, 8 P
Feb 26 12:24:23.718 gc 49 @1394.051s 0%: 0.33+27+0.024 ms clock, 2.6+0.33/54/108+0.19 ms cpu, 207->210->157 MB, 212 MB goal, 8 P
Feb 26 12:25:23.424 gc 50 @1453.760s 0%: 0.10+24+0.021 ms clock, 0.83+0/44/103+0.17 ms cpu, 307->307->100 MB, 314 MB goal, 8 P
Feb 26 12:25:59.121 gc 51 @1489.463s 0%: 0.19+18+0.003 ms clock, 1.5+2.0/37/106+0.026 ms cpu, 195->195->94 MB, 200 MB goal, 8 P
Feb 26 12:26:23.797 gc 52 @1514.134s 0%: 0.21+22+0.17 ms clock, 1.6+0.45/45/111+1.3 ms cpu, 183->186->121 MB, 188 MB goal, 8 P
Feb 26 12:27:23.367 gc 53 @1573.704s 0%: 0.10+24+0.003 ms clock, 0.83+6.9/49/88+0.027 ms cpu, 239->241->116 MB, 242 MB goal, 8 P
Feb 26 12:27:23.776 gc 54 @1574.105s 0%: 0.14+32+0.032 ms clock, 1.1+11/64/113+0.26 ms cpu, 243->246->196 MB, 244 MB goal, 8 P
Feb 26 12:28:17.141 gc 55 @1627.469s 0%: 0.18+32+0.013 ms clock, 1.4+88/65/2.3+0.11 ms cpu, 381->388->103 MB, 392 MB goal, 8 P
Feb 26 12:28:23.600 gc 56 @1633.935s 0%: 0.17+24+0.034 ms clock, 1.4+0/49/95+0.27 ms cpu, 208->209->151 MB, 209 MB goal, 8 P
Feb 26 12:28:23.988 gc 57 @1634.315s 0%: 0.049+34+0.020 ms clock, 0.39+0.31/68/154+0.16 ms cpu, 294->298->212 MB, 302 MB goal, 8 P
Feb 26 12:29:23.526 gc 58 @1693.862s 0%: 0.10+25+0.027 ms clock, 0.84+0/50/92+0.21 ms cpu, 421->421->141 MB, 424 MB goal, 8 P
Feb 26 12:29:23.918 gc 59 @1694.249s 0%: 0.059+29+0.17 ms clock, 0.47+0.20/58/134+1.3 ms cpu, 276->278->192 MB, 283 MB goal, 8 P
Feb 26 12:30:23.610 gc 60 @1753.949s 0%: 0.15+21+0.020 ms clock, 1.2+0.28/41/103+0.16 ms cpu, 374->377->114 MB, 384 MB goal, 8 P
Feb 26 12:31:23.469 gc 61 @1813.810s 0%: 0.11+19+0.019 ms clock, 0.88+0/39/102+0.15 ms cpu, 225->225->105 MB, 229 MB goal, 8 P
Feb 26 12:32:15.119 gc 62 @1865.461s 0%: 0.16+19+0.003 ms clock, 1.3+10/38/95+0.027 ms cpu, 205->206->94 MB, 210 MB goal, 8 P
Feb 26 12:32:23.604 gc 63 @1873.942s 0%: 0.14+23+0.027 ms clock, 1.1+0.34/46/97+0.21 ms cpu, 184->186->130 MB, 188 MB goal, 8 P
Feb 26 12:32:28.328 gc 64 @1878.652s 0%: 0.091+37+0.024 ms clock, 0.73+0.54/74/174+0.19 ms cpu, 253->253->148 MB, 260 MB goal, 8 P
Feb 26 12:33:23.702 gc 65 @1934.037s 0%: 0.11+24+0.026 ms clock, 0.93+0.32/49/115+0.20 ms cpu, 290->293->144 MB, 297 MB goal, 8 P
Feb 26 12:34:17.135 gc 66 @1987.467s 0%: 0.54+28+0.062 ms clock, 4.3+93/53/0+0.50 ms cpu, 282->288->101 MB, 289 MB goal, 8 P
Feb 26 12:34:24.135 gc 67 @1994.461s 0%: 0.32+34+0.019 ms clock, 2.5+51/68/75+0.15 ms cpu, 195->199->127 MB, 202 MB goal, 8 P
Feb 26 12:35:23.885 gc 68 @2054.220s 0%: 0.086+25+0.020 ms clock, 0.69+0.35/50/126+0.16 ms cpu, 247->251->135 MB, 255 MB goal, 8 P
Feb 26 12:36:23.650 gc 69 @2113.988s 0%: 0.21+22+0.018 ms clock, 1.7+0.34/43/109+0.15 ms cpu, 264->267->122 MB, 271 MB goal, 8 P
Feb 26 12:37:27.830 gc 70 @2178.169s 0%: 0.083+21+0.020 ms clock, 0.66+0/41/106+0.16 ms cpu, 243->243->118 MB, 244 MB goal, 8 P
Feb 26 12:38:23.420 gc 71 @2233.762s 0%: 0.16+18+0.020 ms clock, 1.3+0/36/105+0.16 ms cpu, 230->230->99 MB, 236 MB goal, 8 P
Feb 26 12:39:01.133 gc 72 @2271.465s 0%: 0.24+26+0.069 ms clock, 1.9+92/52/0+0.55 ms cpu, 194->200->99 MB, 199 MB goal, 8 P
Feb 26 12:39:23.772 gc 73 @2294.108s 0%: 0.10+24+0.019 ms clock, 0.82+0/49/107+0.15 ms cpu, 219->220->150 MB, 220 MB goal, 8 P
Feb 26 12:40:17.136 gc 74 @2347.467s 0%: 0.21+29+0.024 ms clock, 1.7+89/55/0+0.19 ms cpu, 291->297->101 MB, 300 MB goal, 8 P
Feb 26 12:40:23.659 gc 75 @2353.995s 0%: 0.13+24+0.043 ms clock, 1.0+0.52/49/106+0.34 ms cpu, 195->198->140 MB, 203 MB goal, 8 P
Feb 26 12:40:37.119 gc 76 @2367.462s 0%: 0.26+17+0.003 ms clock, 2.1+2.6/35/100+0.026 ms cpu, 273->273->93 MB, 280 MB goal, 8 P
Feb 26 12:41:23.677 gc 77 @2414.015s 0%: 0.084+22+0.021 ms clock, 0.67+0.40/44/110+0.17 ms cpu, 182->185->119 MB, 187 MB goal, 8 P
Feb 26 12:42:23.585 gc 78 @2473.926s 0%: 0.15+19+0.020 ms clock, 1.2+0/38/104+0.16 ms cpu, 244->244->117 MB, 245 MB goal, 8 P
Feb 26 12:43:23.484 gc 79 @2533.825s 0%: 0.10+19+0.019 ms clock, 0.84+0/37/106+0.15 ms cpu, 236->236->105 MB, 237 MB goal, 8 P
Feb 26 12:44:10.136 gc 80 @2580.467s 0%: 0.17+28+0.21 ms clock, 1.4+93/54/0+1.7 ms cpu, 206->211->99 MB, 211 MB goal, 8 P
Feb 26 12:44:23.720 gc 81 @2594.056s 0%: 0.12+25+0.020 ms clock, 1.0+0.39/50/124+0.16 ms cpu, 191->194->132 MB, 199 MB goal, 8 P
Feb 26 12:45:23.722 gc 82 @2654.057s 0%: 0.089+26+0.065 ms clock, 0.71+0.44/45/116+0.52 ms cpu, 257->261->128 MB, 264 MB goal, 8 P
Feb 26 12:46:23.766 gc 83 @2714.105s 0%: 0.19+21+0.022 ms clock, 1.5+0.29/42/105+0.18 ms cpu, 251->253->117 MB, 257 MB goal, 8 P
Feb 26 12:47:23.412 gc 84 @2773.751s 0%: 0.11+21+0.003 ms clock, 0.93+0.28/43/99+0.031 ms cpu, 230->230->104 MB, 235 MB goal, 8 P
Feb 26 12:47:23.722 gc 85 @2774.055s 0%: 0.10+28+0.025 ms clock, 0.82+6.6/56/112+0.20 ms cpu, 210->212->177 MB, 211 MB goal, 8 P
Feb 26 12:48:23.610 gc 86 @2833.949s 0%: 0.11+21+0.025 ms clock, 0.92+0.31/42/106+0.20 ms cpu, 345->347->113 MB, 355 MB goal, 8 P
Feb 26 12:49:23.372 gc 87 @2893.714s 0%: 0.12+19+0.019 ms clock, 0.97+0.32/37/102+0.15 ms cpu, 221->223->95 MB, 227 MB goal, 8 P
Feb 26 12:49:36.536 gc 88 @2906.879s 0%: 0.081+17+0.003 ms clock, 0.65+0.38/35/102+0.031 ms cpu, 186->186->93 MB, 191 MB goal, 8 P
Feb 26 12:50:23.598 gc 89 @2953.938s 0%: 0.13+21+0.003 ms clock, 1.0+0/41/98+0.028 ms cpu, 189->189->121 MB, 190 MB goal, 8 P
Feb 26 12:50:23.905 gc 90 @2954.234s 0%: 0.045+31+0.020 ms clock, 0.36+0.48/63/143+0.16 ms cpu, 237->240->186 MB, 243 MB goal, 8 P
Feb 26 12:52:09.119 gc 91 @3059.461s 0%: 0.18+18+0.003 ms clock, 1.4+9.0/36/92+0.031 ms cpu, 364->365->94 MB, 373 MB goal, 8 P
Feb 26 12:52:23.445 gc 92 @3073.781s 0%: 0.16+24+0.21 ms clock, 1.3+0/48/94+1.6 ms cpu, 192->192->138 MB, 193 MB goal, 8 P
Feb 26 12:52:23.767 gc 93 @3074.095s 0%: 0.074+32+0.036 ms clock, 0.59+0.37/64/113+0.29 ms cpu, 269->274->204 MB, 276 MB goal, 8 P
Feb 26 12:53:23.649 gc 94 @3133.988s 0%: 0.084+21+0.014 ms clock, 0.67+0.47/42/103+0.11 ms cpu, 397->400->117 MB, 408 MB goal, 8 P
Feb 26 12:54:23.468 gc 95 @3193.809s 0%: 0.16+19+0.019 ms clock, 1.3+0/38/104+0.15 ms cpu, 236->236->105 MB, 237 MB goal, 8 P
Feb 26 12:55:06.130 gc 96 @3236.462s 0%: 0.23+28+0.003 ms clock, 1.8+95/56/0+0.029 ms cpu, 206->211->98 MB, 211 MB goal, 8 P
Feb 26 12:55:23.854 gc 97 @3254.189s 0%: 0.076+25+0.019 ms clock, 0.61+0.23/50/138+0.15 ms cpu, 189->191->137 MB, 197 MB goal, 8 P
Feb 26 12:56:23.592 gc 98 @3313.929s 0%: 0.21+23+0.030 ms clock, 1.7+0/46/97+0.24 ms cpu, 271->272->134 MB, 275 MB goal, 8 P
Feb 26 12:56:24.334 gc 99 @3314.663s 0%: 0.059+31+0.020 ms clock, 0.47+0.69/61/150+0.16 ms cpu, 263->269->185 MB, 269 MB goal, 8 P
Feb 26 12:58:17.129 gc 100 @3427.463s 0%: 0.48+25+0.079 ms clock, 3.8+92/51/0+0.63 ms cpu, 361->366->99 MB, 370 MB goal, 8 P
Feb 26 12:58:23.755 gc 101 @3434.085s 0%: 0.12+31+0.014 ms clock, 0.99+0.35/57/137+0.11 ms cpu, 192->196->141 MB, 198 MB goal, 8 P
Feb 26 12:59:23.756 gc 102 @3494.089s 0%: 0.12+27+0.018 ms clock, 0.98+0.25/54/130+0.15 ms cpu, 276->280->136 MB, 283 MB goal, 8 P
Feb 26 13:00:23.826 gc 103 @3554.157s 0%: 0.13+29+0.033 ms clock, 1.0+0.32/58/130+0.26 ms cpu, 266->270->133 MB, 273 MB goal, 8 P
Feb 26 13:01:23.736 gc 104 @3614.072s 0%: 0.11+24+0.024 ms clock, 0.94+0.34/49/123+0.19 ms cpu, 259->262->129 MB, 266 MB goal, 8 P
Feb 26 13:02:23.708 gc 105 @3674.045s 0%: 0.14+23+0.020 ms clock, 1.1+0.41/46/115+0.16 ms cpu, 252->255->122 MB, 259 MB goal, 8 P
Feb 26 13:03:23.460 gc 106 @3733.800s 0%: 0.11+20+0.003 ms clock, 0.89+0.25/40/103+0.026 ms cpu, 239->239->111 MB, 245 MB goal, 8 P
Feb 26 13:03:23.914 gc 107 @3734.239s 0%: 0.056+35+0.029 ms clock, 0.45+0.67/70/139+0.23 ms cpu, 217->223->174 MB, 222 MB goal, 8 P
Feb 26 13:04:23.648 gc 108 @3793.982s 0%: 0.14+26+0.026 ms clock, 1.1+0/52/111+0.21 ms cpu, 341->343->135 MB, 348 MB goal, 8 P
Feb 26 13:04:24.656 gc 109 @3794.984s 0%: 0.066+32+0.019 ms clock, 0.53+0.26/64/161+0.15 ms cpu, 263->270->185 MB, 270 MB goal, 8 P
Feb 26 13:06:17.143 gc 110 @3907.463s 0%: 0.25+30+0.11 ms clock, 2.0+85/57/0+0.91 ms cpu, 361->366->98 MB, 370 MB goal, 8 P
Feb 26 13:06:23.719 gc 111 @3914.054s 0%: 0.15+25+0.052 ms clock, 1.2+0.36/51/127+0.42 ms cpu, 192->195->136 MB, 197 MB goal, 8 P
Feb 26 13:07:23.758 gc 112 @3974.093s 0%: 0.093+25+0.020 ms clock, 0.75+0.37/51/127+0.16 ms cpu, 265->268->134 MB, 272 MB goal, 8 P
Feb 26 13:08:23.700 gc 113 @4034.035s 0%: 0.14+26+0.027 ms clock, 1.1+0.36/52/124+0.22 ms cpu, 262->266->131 MB, 269 MB goal, 8 P
Feb 26 13:09:23.483 gc 114 @4093.823s 0%: 0.11+20+0.024 ms clock, 0.95+0.29/41/101+0.19 ms cpu, 257->257->121 MB, 263 MB goal, 8 P
Feb 26 13:09:23.913 gc 115 @4094.245s 0%: 0.054+29+0.018 ms clock, 0.43+0.27/58/147+0.15 ms cpu, 237->239->169 MB, 243 MB goal, 8 P
Feb 26 13:10:23.691 gc 116 @4154.024s 0%: 0.15+27+0.036 ms clock, 1.2+0.86/53/114+0.28 ms cpu, 330->334->144 MB, 339 MB goal, 8 P
Feb 26 13:11:21.128 gc 117 @4211.462s 0%: 0.40+26+0.003 ms clock, 3.2+53/52/47+0.031 ms cpu, 281->283->96 MB, 288 MB goal, 8 P
Feb 26 13:11:24.214 gc 118 @4214.546s 0%: 0.088+27+0.018 ms clock, 0.71+0/55/133+0.15 ms cpu, 187->187->123 MB, 192 MB goal, 8 P
Feb 26 13:12:23.717 gc 119 @4274.052s 0%: 0.14+25+0.024 ms clock, 1.1+0.66/51/126+0.19 ms cpu, 240->244->135 MB, 246 MB goal, 8 P
Feb 26 13:13:23.455 gc 120 @4333.792s 0%: 0.10+23+0.032 ms clock, 0.85+0/46/97+0.25 ms cpu, 270->270->137 MB, 271 MB goal, 8 P
Feb 26 13:13:23.823 gc 121 @4334.152s 0%: 0.056+32+0.032 ms clock, 0.45+0.41/64/115+0.25 ms cpu, 268->275->205 MB, 275 MB goal, 8 P
Feb 26 13:14:23.441 gc 122 @4393.779s 0%: 0.14+23+0.025 ms clock, 1.1+0/45/104+0.20 ms cpu, 410->410->121 MB, 411 MB goal, 8 P
Feb 26 13:14:23.905 gc 123 @4394.234s 0%: 0.060+31+0.018 ms clock, 0.48+0.28/61/154+0.14 ms cpu, 237->240->164 MB, 243 MB goal, 8 P
Feb 26 13:15:23.677 gc 124 @4454.009s 0%: 0.079+28+0.029 ms clock, 0.63+0.30/56/120+0.23 ms cpu, 319->323->137 MB, 328 MB goal, 8 P
Feb 26 13:16:14.204 gc 125 @4504.545s 0%: 0.061+20+0.003 ms clock, 0.48+0.37/40/116+0.031 ms cpu, 267->267->93 MB, 274 MB goal, 8 P
Feb 26 13:16:23.469 gc 126 @4513.804s 0%: 0.24+25+0.040 ms clock, 1.9+0.31/50/92+0.32 ms cpu, 197->197->137 MB, 198 MB goal, 8 P
Feb 26 13:16:23.786 gc 127 @4514.112s 0%: 0.076+35+0.032 ms clock, 0.60+0.29/71/111+0.26 ms cpu, 268->274->203 MB, 275 MB goal, 8 P
Feb 26 13:17:23.441 gc 128 @4573.776s 0%: 0.10+26+0.024 ms clock, 0.87+0/52/92+0.19 ms cpu, 412->412->129 MB, 413 MB goal, 8 P
Feb 26 13:17:23.771 gc 129 @4574.096s 0%: 0.12+35+0.18 ms clock, 1.0+10/70/122+1.4 ms cpu, 275->279->217 MB, 276 MB goal, 8 P
Feb 26 13:18:23.722 gc 130 @4634.061s 0%: 0.20+21+0.018 ms clock, 1.6+15/43/84+0.15 ms cpu, 453->453->149 MB, 454 MB goal, 8 P
Feb 26 13:19:08.123 gc 131 @4678.464s 0%: 0.40+20+0.003 ms clock, 3.2+14/40/93+0.030 ms cpu, 289->290->94 MB, 299 MB goal, 8 P
Feb 26 13:19:23.740 gc 132 @4694.076s 0%: 0.10+25+0.027 ms clock, 0.83+0.70/50/124+0.22 ms cpu, 184->187->132 MB, 188 MB goal, 8 P
Feb 26 13:20:23.692 gc 133 @4754.028s 0%: 0.13+24+0.022 ms clock, 1.0+0.36/47/118+0.17 ms cpu, 258->261->127 MB, 265 MB goal, 8 P
Feb 26 13:21:23.600 gc 134 @4813.933s 0%: 0.080+27+0.019 ms clock, 0.64+1.9/53/109+0.15 ms cpu, 264->265->134 MB, 265 MB goal, 8 P
Feb 26 13:21:35.528 gc 135 @4825.855s 0%: 0.096+33+0.017 ms clock, 0.76+0.34/66/164+0.14 ms cpu, 261->261->147 MB, 268 MB goal, 8 P
Feb 26 13:22:23.724 gc 136 @4874.043s 0%: 0.12+30+0.026 ms clock, 1.0+4.3/59/118+0.20 ms cpu, 287->291->148 MB, 295 MB goal, 8 P
Feb 26 13:23:23.417 gc 137 @4933.757s 0%: 0.11+20+0.038 ms clock, 0.88+0/39/112+0.31 ms cpu, 291->291->99 MB, 296 MB goal, 8 P
Feb 26 13:24:06.131 gc 138 @4976.463s 0%: 0.55+28+0.004 ms clock, 4.4+63/57/24+0.034 ms cpu, 193->197->97 MB, 198 MB goal, 8 P
Feb 26 13:24:23.664 gc 139 @4994.001s 0%: 0.13+23+0.019 ms clock, 1.0+0.40/47/117+0.15 ms cpu, 189->192->123 MB, 195 MB goal, 8 P
Feb 26 13:25:23.710 gc 140 @5054.049s 0%: 0.10+21+0.024 ms clock, 0.85+1.3/42/101+0.19 ms cpu, 241->243->114 MB, 247 MB goal, 8 P
Feb 26 13:26:23.410 gc 141 @5113.751s 0%: 0.21+19+0.018 ms clock, 1.7+0/38/111+0.15 ms cpu, 226->226->99 MB, 229 MB goal, 8 P
Feb 26 13:27:01.135 gc 142 @5151.464s 0%: 0.48+30+0.12 ms clock, 3.8+102/55/0+1.0 ms cpu, 194->200->100 MB, 199 MB goal, 8 P
Feb 26 13:27:23.723 gc 143 @5174.048s 0%: 0.082+35+0.014 ms clock, 0.65+0.46/69/142+0.11 ms cpu, 191->195->134 MB, 200 MB goal, 8 P
Feb 26 13:28:23.697 gc 144 @5234.033s 0%: 0.12+24+0.020 ms clock, 0.96+0.35/49/120+0.16 ms cpu, 259->262->129 MB, 268 MB goal, 8 P
Feb 26 13:29:23.625 gc 145 @5293.963s 0%: 0.085+22+0.018 ms clock, 0.68+0.32/45/112+0.14 ms cpu, 252->255->117 MB, 258 MB goal, 8 P

madvdontneed=0 (go1.16): average container_memory_working_set_bytes is 167 MB, stabilized at 130 MB after 1 hour

Feb 26 13:30:20.055 gc 1 @0.041s 0%: 0.021+0.40+0.028 ms clock, 0.17+0.22/0.46/0.38+0.22 ms cpu, 4->4->0 MB, 5 MB goal, 8 P
Feb 26 13:30:20.088 gc 2 @0.072s 2%: 1.1+0.96+0.046 ms clock, 8.9+1.2/0.76/0+0.36 ms cpu, 4->4->1 MB, 5 MB goal, 8 P
Feb 26 13:30:20.174 gc 3 @0.160s 1%: 0.039+0.73+0.065 ms clock, 0.31+0/0.71/1.1+0.52 ms cpu, 4->4->2 MB, 5 MB goal, 8 P
Feb 26 13:30:20.191 gc 4 @0.171s 1%: 0.038+3.1+0.043 ms clock, 0.30+0/3.1/0.97+0.34 ms cpu, 4->5->4 MB, 5 MB goal, 8 P
Feb 26 13:30:20.231 gc 5 @0.213s 1%: 0.033+3.9+0.26 ms clock, 0.26+0/4.1/1.0+2.1 ms cpu, 12->13->9 MB, 13 MB goal, 8 P
Feb 26 13:30:20.289 gc 6 @0.271s 1%: 0.027+5.0+0.039 ms clock, 0.22+0/0.75/5.7+0.31 ms cpu, 21->21->17 MB, 22 MB goal, 8 P
Feb 26 13:30:20.420 gc 7 @0.399s 1%: 0.077+6.8+0.035 ms clock, 0.61+0/7.4/0.98+0.28 ms cpu, 50->50->37 MB, 51 MB goal, 8 P
Feb 26 13:30:20.589 gc 8 @0.569s 0%: 0.029+6.8+0.24 ms clock, 0.23+0/7.1/2.7+1.9 ms cpu, 78->78->58 MB, 79 MB goal, 8 P
Feb 26 13:30:20.704 gc 9 @0.684s 0%: 0.086+5.7+0.062 ms clock, 0.68+0.27/8.3/16+0.49 ms cpu, 113->114->80 MB, 116 MB goal, 8 P
Feb 26 13:30:20.969 gc 10 @0.945s 0%: 0.046+10+0.020 ms clock, 0.37+0.20/21/50+0.16 ms cpu, 157->157->108 MB, 161 MB goal, 8 P
Feb 26 13:31:01.927 gc 11 @41.885s 0%: 0.93+27+0.005 ms clock, 7.4+11/54/82+0.044 ms cpu, 212->212->95 MB, 217 MB goal, 8 P
Feb 26 13:31:33.382 gc 12 @73.345s 0%: 0.090+22+0.036 ms clock, 0.72+0/44/86+0.29 ms cpu, 189->189->132 MB, 191 MB goal, 8 P
Feb 26 13:31:34.177 gc 13 @74.133s 0%: 0.055+29+0.030 ms clock, 0.44+0.41/59/147+0.24 ms cpu, 259->264->183 MB, 265 MB goal, 8 P
Feb 26 13:33:06.916 gc 14 @166.869s 0%: 0.29+32+0.005 ms clock, 2.3+81/65/17+0.047 ms cpu, 358->362->97 MB, 367 MB goal, 8 P
Feb 26 13:33:33.629 gc 15 @193.586s 0%: 0.058+29+0.030 ms clock, 0.46+0.80/57/136+0.24 ms cpu, 189->192->135 MB, 194 MB goal, 8 P
Feb 26 13:34:33.442 gc 16 @253.405s 0%: 0.14+22+0.031 ms clock, 1.1+0.51/45/108+0.25 ms cpu, 264->267->120 MB, 271 MB goal, 8 P
Feb 26 13:35:33.353 gc 17 @313.320s 0%: 0.086+19+0.030 ms clock, 0.68+0/38/102+0.24 ms cpu, 237->237->116 MB, 240 MB goal, 8 P
Feb 26 13:36:33.190 gc 18 @373.156s 0%: 0.22+19+0.029 ms clock, 1.7+0/38/99+0.23 ms cpu, 228->228->95 MB, 232 MB goal, 8 P
Feb 26 13:37:01.917 gc 19 @401.869s 0%: 0.20+34+0.058 ms clock, 1.6+99/59/0.89+0.46 ms cpu, 186->193->99 MB, 191 MB goal, 8 P
Feb 26 13:37:33.534 gc 20 @433.499s 0%: 0.065+21+0.046 ms clock, 0.52+0.33/42/105+0.36 ms cpu, 188->190->117 MB, 198 MB goal, 8 P
Feb 26 13:38:33.251 gc 21 @493.217s 0%: 0.15+20+0.036 ms clock, 1.2+0/39/106+0.29 ms cpu, 230->230->104 MB, 234 MB goal, 8 P
Feb 26 13:39:08.907 gc 22 @528.871s 0%: 0.45+22+0.005 ms clock, 3.6+22/44/83+0.046 ms cpu, 204->205->94 MB, 209 MB goal, 8 P
Feb 26 13:39:33.501 gc 23 @553.462s 0%: 0.12+24+0.028 ms clock, 0.98+0.37/49/118+0.22 ms cpu, 183->186->128 MB, 188 MB goal, 8 P
Feb 26 13:40:33.410 gc 24 @613.374s 0%: 0.12+22+0.030 ms clock, 1.0+0.32/44/107+0.24 ms cpu, 249->252->115 MB, 256 MB goal, 8 P
Feb 26 13:41:33.252 gc 25 @673.219s 0%: 0.10+19+0.005 ms clock, 0.85+0/38/100+0.044 ms cpu, 230->230->105 MB, 231 MB goal, 8 P
Feb 26 13:42:17.925 gc 26 @717.878s 0%: 0.23+33+0.099 ms clock, 1.9+82/66/0+0.79 ms cpu, 205->210->101 MB, 210 MB goal, 8 P
Feb 26 13:42:33.441 gc 27 @733.402s 0%: 0.13+25+0.029 ms clock, 1.0+0.48/50/103+0.23 ms cpu, 195->197->136 MB, 202 MB goal, 8 P
Feb 26 13:43:06.916 gc 28 @766.869s 0%: 0.42+32+0.007 ms clock, 3.4+77/64/13+0.056 ms cpu, 266->271->98 MB, 273 MB goal, 8 P
Feb 26 13:43:33.381 gc 29 @793.344s 0%: 0.071+23+0.050 ms clock, 0.57+0/46/96+0.40 ms cpu, 190->191->134 MB, 196 MB goal, 8 P
Feb 26 13:43:45.562 gc 30 @805.528s 0%: 0.10+19+0.006 ms clock, 0.84+0.36/39/114+0.050 ms cpu, 261->261->93 MB, 268 MB goal, 8 P
Feb 26 13:44:33.232 gc 31 @853.193s 0%: 0.13+25+0.028 ms clock, 1.0+5.0/46/95+0.22 ms cpu, 189->197->129 MB, 190 MB goal, 8 P
Feb 26 13:44:34.041 gc 32 @853.992s 0%: 0.050+35+0.030 ms clock, 0.40+0.49/68/156+0.24 ms cpu, 248->252->170 MB, 259 MB goal, 8 P
Feb 26 13:45:33.509 gc 33 @913.467s 0%: 0.091+27+0.039 ms clock, 0.73+0.29/54/135+0.31 ms cpu, 331->334->137 MB, 340 MB goal, 8 P
Feb 26 13:46:33.379 gc 34 @973.342s 0%: 0.19+22+0.025 ms clock, 1.5+0.53/44/92+0.20 ms cpu, 267->269->127 MB, 274 MB goal, 8 P
Feb 26 13:46:34.171 gc 35 @974.124s 0%: 0.056+32+0.033 ms clock, 0.45+0.79/64/159+0.27 ms cpu, 249->254->181 MB, 255 MB goal, 8 P
Feb 26 13:48:15.902 gc 36 @1075.868s 0%: 0.27+19+0.008 ms clock, 2.1+11/38/92+0.067 ms cpu, 354->355->93 MB, 363 MB goal, 8 P
Feb 26 13:48:33.479 gc 37 @1093.440s 0%: 0.14+24+0.040 ms clock, 1.1+0.31/48/121+0.32 ms cpu, 183->185->129 MB, 187 MB goal, 8 P
Feb 26 13:49:33.604 gc 38 @1153.564s 0%: 0.088+26+0.029 ms clock, 0.71+0.30/51/113+0.23 ms cpu, 252->255->118 MB, 259 MB goal, 8 P
Feb 26 13:50:33.385 gc 39 @1213.350s 0%: 0.15+21+0.029 ms clock, 1.2+0/42/104+0.23 ms cpu, 242->243->117 MB, 243 MB goal, 8 P
Feb 26 13:51:33.251 gc 40 @1273.217s 0%: 0.10+20+0.034 ms clock, 0.83+0/40/105+0.27 ms cpu, 232->232->105 MB, 234 MB goal, 8 P
Feb 26 13:52:17.926 gc 41 @1317.878s 0%: 0.28+33+0.10 ms clock, 2.2+100/66/0+0.83 ms cpu, 205->210->101 MB, 210 MB goal, 8 P
Feb 26 13:52:33.556 gc 42 @1333.512s 0%: 0.13+29+0.026 ms clock, 1.1+0.52/58/144+0.20 ms cpu, 194->198->140 MB, 202 MB goal, 8 P
Feb 26 13:53:33.546 gc 43 @1393.503s 0%: 0.068+29+0.079 ms clock, 0.54+0.34/57/139+0.63 ms cpu, 273->274->139 MB, 281 MB goal, 8 P
Feb 26 13:54:33.804 gc 44 @1453.764s 0%: 0.12+26+0.031 ms clock, 1.0+0.29/51/128+0.25 ms cpu, 272->275->135 MB, 279 MB goal, 8 P
Feb 26 13:55:33.645 gc 45 @1513.604s 0%: 0.087+26+0.038 ms clock, 0.70+0.53/53/131+0.30 ms cpu, 263->266->132 MB, 270 MB goal, 8 P
Feb 26 13:56:33.456 gc 46 @1573.417s 0%: 0.18+24+0.029 ms clock, 1.4+0.27/49/121+0.23 ms cpu, 259->262->125 MB, 265 MB goal, 8 P
Feb 26 13:57:33.452 gc 47 @1633.415s 0%: 0.081+22+0.031 ms clock, 0.65+0.39/44/109+0.25 ms cpu, 245->248->118 MB, 251 MB goal, 8 P
Feb 26 13:58:33.201 gc 48 @1693.160s 0%: 0.13+26+0.017 ms clock, 1.0+0/52/92+0.13 ms cpu, 232->232->107 MB, 237 MB goal, 8 P
Feb 26 13:58:33.530 gc 49 @1693.483s 0%: 0.12+33+0.033 ms clock, 1.0+5.0/66/127+0.27 ms cpu, 214->217->180 MB, 215 MB goal, 8 P
Feb 26 13:59:33.292 gc 50 @1753.243s 0%: 0.13+35+0.037 ms clock, 1.0+22/60/106+0.29 ms cpu, 357->358->129 MB, 360 MB goal, 8 P
Feb 26 13:59:33.608 gc 51 @1753.544s 0%: 0.74+48+0.056 ms clock, 5.9+50/97/133+0.45 ms cpu, 270->274->219 MB, 271 MB goal, 8 P
Feb 26 14:00:33.280 gc 52 @1813.213s 0%: 0.14+53+0.031 ms clock, 1.1+19/55/79+0.24 ms cpu, 425->441->137 MB, 438 MB goal, 8 P
Feb 26 14:00:33.575 gc 53 @1813.514s 0%: 0.19+47+0.037 ms clock, 1.5+23/92/116+0.30 ms cpu, 274->277->215 MB, 275 MB goal, 8 P
Feb 26 14:01:33.254 gc 54 @1873.213s 0%: 0.079+27+0.028 ms clock, 0.63+15/43/92+0.23 ms cpu, 412->420->129 MB, 431 MB goal, 8 P
Feb 26 14:01:34.192 gc 55 @1874.146s 0%: 0.078+31+0.035 ms clock, 0.62+0.23/62/155+0.28 ms cpu, 250->253->171 MB, 259 MB goal, 8 P
Feb 26 14:02:33.415 gc 56 @1933.374s 0%: 0.15+26+0.046 ms clock, 1.2+0.36/52/115+0.37 ms cpu, 333->336->133 MB, 342 MB goal, 8 P
Feb 26 14:03:01.928 gc 57 @1961.878s 0%: 0.29+35+0.006 ms clock, 2.3+64/70/26+0.051 ms cpu, 261->266->100 MB, 267 MB goal, 8 P
Feb 26 14:03:33.401 gc 58 @1993.366s 0%: 0.082+21+0.038 ms clock, 0.66+0.38/43/106+0.30 ms cpu, 195->198->130 MB, 201 MB goal, 8 P
Feb 26 14:03:34.221 gc 59 @1994.174s 0%: 0.052+34+0.030 ms clock, 0.42+0.29/67/168+0.24 ms cpu, 254->258->186 MB, 260 MB goal, 8 P
Feb 26 14:04:33.532 gc 60 @2053.488s 0%: 0.23+30+0.024 ms clock, 1.8+21/59/125+0.19 ms cpu, 395->396->182 MB, 396 MB goal, 8 P
Feb 26 14:05:33.273 gc 61 @2113.239s 0%: 0.10+20+0.045 ms clock, 0.83+0/40/102+0.36 ms cpu, 361->361->137 MB, 365 MB goal, 8 P
Feb 26 14:05:33.652 gc 62 @2113.599s 0%: 0.10+39+0.044 ms clock, 0.84+0.59/77/117+0.35 ms cpu, 268->273->208 MB, 275 MB goal, 8 P
Feb 26 14:06:33.454 gc 63 @2173.414s 0%: 0.19+26+0.044 ms clock, 1.5+0.31/52/125+0.35 ms cpu, 407->410->122 MB, 417 MB goal, 8 P
Feb 26 14:07:33.221 gc 64 @2233.184s 0%: 0.10+22+0.014 ms clock, 0.82+0/44/103+0.11 ms cpu, 245->245->121 MB, 246 MB goal, 8 P
Feb 26 14:07:33.990 gc 65 @2233.944s 0%: 0.053+32+0.029 ms clock, 0.42+0.29/60/141+0.23 ms cpu, 237->241->164 MB, 243 MB goal, 8 P
Feb 26 14:08:33.290 gc 66 @2293.250s 0%: 0.12+26+0.021 ms clock, 0.98+6.7/51/88+0.17 ms cpu, 329->329->133 MB, 330 MB goal, 8 P
Feb 26 14:08:33.596 gc 67 @2293.538s 0%: 0.15+44+0.054 ms clock, 1.2+17/83/121+0.43 ms cpu, 274->278->219 MB, 275 MB goal, 8 P
Feb 26 14:09:33.373 gc 68 @2353.338s 0%: 0.11+20+0.031 ms clock, 0.91+0/41/101+0.24 ms cpu, 430->431->117 MB, 439 MB goal, 8 P
Feb 26 14:10:33.207 gc 69 @2413.172s 0%: 0.12+21+0.027 ms clock, 1.0+0.069/41/104+0.22 ms cpu, 230->230->99 MB, 235 MB goal, 8 P
Feb 26 14:11:01.932 gc 70 @2441.877s 0%: 0.24+40+0.015 ms clock, 1.9+85/77/0+0.12 ms cpu, 193->199->101 MB, 198 MB goal, 8 P
Feb 26 14:11:33.422 gc 71 @2473.388s 0%: 0.076+20+0.039 ms clock, 0.61+0.53/41/101+0.31 ms cpu, 195->197->128 MB, 202 MB goal, 8 P
Feb 26 14:11:40.919 gc 72 @2480.870s 0%: 0.14+35+0.037 ms clock, 1.1+1.0/70/168+0.30 ms cpu, 251->251->148 MB, 257 MB goal, 8 P
Feb 26 14:13:01.915 gc 73 @2561.869s 0%: 0.28+31+0.026 ms clock, 2.3+109/57/0+0.20 ms cpu, 288->294->98 MB, 296 MB goal, 8 P
Feb 26 14:13:33.490 gc 74 @2593.452s 0%: 0.072+23+0.040 ms clock, 0.58+0.39/47/118+0.32 ms cpu, 190->192->126 MB, 196 MB goal, 8 P
Feb 26 14:14:33.406 gc 75 @2653.368s 0%: 0.13+24+0.028 ms clock, 1.1+0.32/47/104+0.22 ms cpu, 246->250->112 MB, 253 MB goal, 8 P
Feb 26 14:15:33.178 gc 76 @2713.144s 0%: 0.091+20+0.039 ms clock, 0.72+0.29/39/102+0.31 ms cpu, 220->220->96 MB, 225 MB goal, 8 P
Feb 26 14:16:06.915 gc 77 @2746.869s 0%: 0.27+32+0.029 ms clock, 2.1+89/63/8.6+0.23 ms cpu, 188->193->98 MB, 193 MB goal, 8 P
Feb 26 14:16:33.652 gc 78 @2773.614s 0%: 0.19+24+0.032 ms clock, 1.5+0.49/48/119+0.26 ms cpu, 189->191->121 MB, 196 MB goal, 8 P
Feb 26 14:17:33.375 gc 79 @2833.340s 0%: 0.096+21+0.031 ms clock, 0.77+0/41/105+0.25 ms cpu, 238->238->117 MB, 243 MB goal, 8 P
Feb 26 14:18:33.049 gc 80 @2893.011s 0%: 0.14+24+0.038 ms clock, 1.1+0.25/47/101+0.30 ms cpu, 230->230->100 MB, 235 MB goal, 8 P
Feb 26 14:18:33.460 gc 81 @2893.421s 0%: 0.057+25+0.071 ms clock, 0.46+0.47/49/103+0.57 ms cpu, 196->199->150 MB, 201 MB goal, 8 P
Feb 26 14:18:33.995 gc 82 @2893.938s 0%: 0.044+43+0.042 ms clock, 0.35+0.44/85/199+0.34 ms cpu, 293->298->186 MB, 300 MB goal, 8 P
Feb 26 14:19:33.374 gc 83 @2953.336s 0%: 0.13+24+0.034 ms clock, 1.1+0.32/47/99+0.27 ms cpu, 365->366->134 MB, 372 MB goal, 8 P
Feb 26 14:19:46.903 gc 84 @2966.870s 0%: 0.38+19+0.005 ms clock, 3.0+0.74/38/109+0.046 ms cpu, 262->262->93 MB, 268 MB goal, 8 P
Feb 26 14:20:33.226 gc 85 @3013.179s 0%: 0.14+32+0.064 ms clock, 1.1+13/63/101+0.51 ms cpu, 191->193->128 MB, 192 MB goal, 8 P
Feb 26 14:20:33.569 gc 86 @3013.519s 0%: 0.15+36+0.052 ms clock, 1.2+9.0/66/107+0.41 ms cpu, 260->264->205 MB, 261 MB goal, 8 P
Feb 26 14:21:33.371 gc 87 @3073.336s 0%: 0.11+21+0.038 ms clock, 0.89+0/42/105+0.30 ms cpu, 409->410->118 MB, 411 MB goal, 8 P
Feb 26 14:22:17.925 gc 88 @3117.878s 0%: 0.27+32+0.056 ms clock, 2.2+102/61/0+0.45 ms cpu, 229->234->101 MB, 236 MB goal, 8 P
Feb 26 14:22:33.457 gc 89 @3133.419s 0%: 0.15+23+0.23 ms clock, 1.2+0.30/46/109+1.9 ms cpu, 195->198->136 MB, 203 MB goal, 8 P
Feb 26 14:23:02.902 gc 90 @3162.869s 0%: 0.13+19+0.007 ms clock, 1.0+1.4/38/107+0.058 ms cpu, 266->266->93 MB, 273 MB goal, 8 P
Feb 26 14:23:33.508 gc 91 @3193.465s 0%: 0.082+28+0.031 ms clock, 0.65+0.48/56/136+0.24 ms cpu, 182->185->131 MB, 187 MB goal, 8 P
Feb 26 14:24:33.418 gc 92 @3253.381s 0%: 0.12+23+0.030 ms clock, 1.0+0.44/45/113+0.24 ms cpu, 256->258->120 MB, 262 MB goal, 8 P
Feb 26 14:25:14.605 gc 93 @3294.572s 0%: 0.090+19+0.006 ms clock, 0.72+2.9/37/104+0.054 ms cpu, 234->234->94 MB, 240 MB goal, 8 P
Feb 26 14:25:33.205 gc 94 @3313.167s 0%: 0.072+24+0.036 ms clock, 0.58+0.49/47/99+0.29 ms cpu, 183->199->123 MB, 188 MB goal, 8 P
Feb 26 14:25:33.632 gc 95 @3313.588s 0%: 0.053+30+0.028 ms clock, 0.43+0.30/60/143+0.23 ms cpu, 235->237->160 MB, 247 MB goal, 8 P
Feb 26 14:26:17.941 gc 96 @3357.888s 0%: 0.49+38+0.038 ms clock, 3.9+81/75/24+0.30 ms cpu, 311->315->105 MB, 320 MB goal, 8 P
Feb 26 14:26:38.921 gc 97 @3378.870s 0%: 0.34+36+0.029 ms clock, 2.7+27/71/104+0.23 ms cpu, 205->207->126 MB, 210 MB goal, 8 P
Feb 26 14:27:33.425 gc 98 @3433.388s 0%: 0.10+23+0.040 ms clock, 0.85+0.32/45/108+0.32 ms cpu, 246->248->133 MB, 252 MB goal, 8 P
Feb 26 14:28:06.910 gc 99 @3466.875s 0%: 0.84+21+0.007 ms clock, 6.7+4.9/42/102+0.059 ms cpu, 260->260->95 MB, 267 MB goal, 8 P
Feb 26 14:28:33.407 gc 100 @3493.372s 0%: 0.13+21+0.038 ms clock, 1.1+0.33/41/102+0.31 ms cpu, 185->187->112 MB, 190 MB goal, 8 P
Feb 26 14:29:25.902 gc 101 @3545.869s 0%: 0.27+18+0.007 ms clock, 2.2+6.6/37/101+0.056 ms cpu, 219->220->93 MB, 225 MB goal, 8 P
Feb 26 14:29:33.733 gc 102 @3553.694s 0%: 0.056+24+0.028 ms clock, 0.45+0.53/48/116+0.23 ms cpu, 182->184->142 MB, 187 MB goal, 8 P
Feb 26 14:30:17.929 gc 103 @3597.885s 0%: 0.20+29+0.006 ms clock, 1.6+13/57/86+0.049 ms cpu, 277->278->98 MB, 284 MB goal, 8 P
Feb 26 14:30:33.545 gc 104 @3613.506s 0%: 0.12+25+0.042 ms clock, 0.96+0.35/49/120+0.33 ms cpu, 192->194->126 MB, 197 MB goal, 8 P
Feb 26 14:31:33.250 gc 105 @3673.215s 0%: 0.092+21+0.006 ms clock, 0.73+0.46/42/109+0.052 ms cpu, 252->252->105 MB, 253 MB goal, 8 P
Feb 26 14:31:43.118 gc 106 @3683.073s 0%: 0.12+31+0.032 ms clock, 1.0+2.3/61/143+0.25 ms cpu, 206->206->124 MB, 211 MB goal, 8 P
Feb 26 14:32:17.916 gc 107 @3717.876s 0%: 0.23+26+0.096 ms clock, 1.9+100/51/0+0.77 ms cpu, 242->246->99 MB, 249 MB goal, 8 P
Feb 26 14:32:33.577 gc 108 @3733.528s 0%: 0.15+35+0.039 ms clock, 1.2+0.084/70/135+0.31 ms cpu, 192->197->138 MB, 199 MB goal, 8 P
Feb 26 14:33:33.402 gc 109 @3793.365s 0%: 0.084+23+0.035 ms clock, 0.67+0.22/47/106+0.28 ms cpu, 269->271->133 MB, 276 MB goal, 8 P
Feb 26 14:33:34.197 gc 110 @3794.151s 0%: 0.054+32+0.033 ms clock, 0.43+0.48/64/160+0.26 ms cpu, 260->264->190 MB, 267 MB goal, 8 P
Feb 26 14:34:38.918 gc 111 @3858.871s 0%: 0.25+33+0.029 ms clock, 2.0+21/66/118+0.23 ms cpu, 371->372->125 MB, 380 MB goal, 8 P
Feb 26 14:35:33.413 gc 112 @3913.375s 0%: 0.12+24+0.097 ms clock, 1.0+0.30/47/100+0.78 ms cpu, 245->247->132 MB, 251 MB goal, 8 P
Feb 26 14:36:06.912 gc 113 @3946.874s 0%: 0.82+24+0.006 ms clock, 6.6+31/47/69+0.052 ms cpu, 257->259->95 MB, 264 MB goal, 8 P
Feb 26 14:36:33.382 gc 114 @3973.343s 0%: 0.22+24+0.040 ms clock, 1.7+0.63/47/106+0.32 ms cpu, 201->202->134 MB, 202 MB goal, 8 P
Feb 26 14:36:43.923 gc 115 @3983.870s 0%: 0.22+39+0.028 ms clock, 1.7+2.2/78/187+0.22 ms cpu, 261->261->148 MB, 268 MB goal, 8 P
Feb 26 14:37:59.107 gc 116 @4059.073s 0%: 0.10+20+0.004 ms clock, 0.86+0.32/40/102+0.038 ms cpu, 289->290->94 MB, 297 MB goal, 8 P
Feb 26 14:38:21.912 gc 117 @4081.869s 0%: 0.25+28+0.008 ms clock, 2.0+60/56/34+0.070 ms cpu, 183->186->96 MB, 188 MB goal, 8 P
Feb 26 14:38:33.355 gc 118 @4093.316s 0%: 0.089+24+0.24 ms clock, 0.71+0/48/91+1.9 ms cpu, 198->199->134 MB, 199 MB goal, 8 P
Feb 26 14:38:36.570 gc 119 @4096.519s 0%: 0.097+37+0.028 ms clock, 0.78+0.36/74/181+0.22 ms cpu, 262->262->147 MB, 269 MB goal, 8 P
Feb 26 14:39:16.107 gc 120 @4136.073s 0%: 0.10+20+0.005 ms clock, 0.81+2.0/40/110+0.045 ms cpu, 288->288->94 MB, 295 MB goal, 8 P

Heap size at GC start, heap size at GC end, and the live heap are all more or less the same; with madvdontneed=1 they are perhaps slightly higher, but still 4x smaller than the reported working set.

Looking at inuse_space in pprof, including with --diff_base, shows very little difference between the two. So from Go's perspective it seems that there is no difference.

I am now convinced that the issue is with how I am measuring memory (cadvisor's container_memory_working_set_bytes) and not with Go itself. I am not sure where to go from here, and I will keep using madvdontneed=0 for the time being.

@ulikunitz
Contributor

Cadvisor's container_working_set_bytes appears to be based on the cgroup memory.usage_in_bytes counter, adjusted by subtracting inactive_file. In the Linux kernel, memory.usage_in_bytes for a non-root control group is computed from the mem_cgroup field memory; a container runs inside such a control group.

I guess that MADV_DONTNEED and MADV_FREE affect the memory control group counter differently. Note that neither actually removes the memory mapping; they only tell the kernel that it can reclaim the memory of the area if required. So it is not really obvious what the memory counters should do. MADV_FREE should be more efficient, but it doesn't move the RSS counter, which I guess is why MADV_DONTNEED is now the default. But if I'm guessing correctly, we now have the opposite situation with the control group memory counter.

@pitr pitr changed the title runtime: 2-3x memory usage increase in 1.16 vs 1.15 runtime: 2-3x memory usage when running in container increase in 1.16 vs 1.15 (MADV_DONTNEED related) Mar 1, 2021
@mknyszek
Contributor

mknyszek commented Mar 3, 2021

It is odd that memory.usage_in_bytes is reporting more with MADV_DONTNEED turned on.

The process that returns memory to the OS (the "scavenger") paces itself according to how long the madvise syscall takes (well, approximately; it also includes the time it takes to find a page to release). Generally speaking, MADV_DONTNEED is around 2x slower (in my experiments) than MADV_FREE, and it's possible the switch means the scavenger is operating less aggressively.

Though, there are two holes in this theory:

  1. MADV_FREE'd pages aren't counted by Linux unless the system is under memory pressure. I believe this is defined as being over the cgroups soft memory limit. Is that the case here?
  2. That difference in the working set size is quite large, and I wouldn't think a 0.5x reduction in scavenging rate would affect that, but it might. Based on the gctrace your live heap size is somewhat spiky so this could be plausible.

Alternatively, there's a bug in the scavenger's pacing causing this.

@pitr Could you also please provide a log of GODEBUG=madvdontneed=1,gctrace=1,scavtrace=1?

@szuecs

szuecs commented Mar 3, 2021

@mknyszek 1. is unlikely, because @pitr runs the workload in our Kubernetes infrastructure and I just checked: soft limits do not seem to be set. For example, in one container I see this:

/ $ cat /sys/fs/cgroup/memory/memory.soft_limit_in_bytes
9223372036854771712
/ $ cat /sys/fs/cgroup/memory/memory.limit_in_bytes
1073741824

@sganapathy1312

sganapathy1312 commented Mar 24, 2021

We also ran into this issue. We resolved it by moving back to MADV_FREE. I'm happy to provide more data if it is helpful.

@JohnRusk

JohnRusk commented Jun 24, 2021

I'm not sure if this is relevant, but I'll share it in case it is. I'm seeing situations in containerized apps where the RAM usage reported by the app itself (by Golang's gctrace) is significantly higher than the working set reported by container_working_set_bytes. That makes me think the working set is under-reporting the "true" RAM usage. That's certainly possible in some cases kubernetes-sigs/metrics-server#187 (comment)

So my tentative theory is that, for some kinds of apps, working-set under-reports "true RAM usage". I can imagine how that's possible, since apparently working set only includes pages that the kernel thinks are recently used. If an app has some pages that it hasn't used in a while, they won't be counted towards the working set.

The bit I don't understand, and the reason I'm not 100% sure of this theory, is that switching to MADV_DONTNEED makes the under-reporting go away. (I.e. it appears as if MADV_DONTNEED didn't make the app use more memory; it just made container_working_set_bytes report the usage more accurately.) But I have no idea why switching to MADV_DONTNEED would make that happen. Is it somehow remedying the under-reporting? Or is it actually causing additional allocations that just happen to approximately balance out the under-reporting (even though the under-reporting continues)? Or is it just introducing a new and different reporting error, as suggested by some of the replies above? I don't know.

@howardjohn
Contributor

In istio/istio#33569 (comment), which may or may not be the same issue as here, I am seeing that with go 1.16 container_working_set_bytes==container_usage_bytes

@mknyszek
Contributor

mknyszek commented Jul 1, 2021

I'm not sure if this is relevant, but I'll share it in case it is. I'm seeing situations in containerized apps where the RAM usage reported by the app itself (by Golang's gctrace) is significantly higher than the working set reported by container_working_set_bytes. That makes me think the working set is under-reporting the "true" RAM usage. That's certainly possible in some cases kubernetes-sigs/metrics-server#187 (comment)

gctrace only counts bytes of objects; in general it does not include fragmentation (except for a very specific kind of internal fragmentation), space for heap metadata, or memory that is mapped and committed but unused ("idle"), so I expect it to be an underestimate. The OS-reported "working set" or "RSS" is indeed the true memory footprint.

So my tentative theory is that, for some kinds of apps, working-set under-reports "true RAM usage". I can imagine how that's possible, since apparently working set only includes pages that the kernel thinks are recently used. If an app has some pages that it hasn't used in a while, they won't be counted towards the working set.

That's a bit of an oversimplification. "Recently used" just means "the application needs it" and the kernel tends to be pretty conservative about this, unless you have some very aggressive swapping policy turned on (if that's an option that exists, even). On Linux, what this looks like is:

  1. Map anonymous memory as read-write, adds to VSS or virtual memory footprint.
  2. Read from or write to that memory the first time, and the OS backs that with physical memory, adding to RSS or the working set.
  3. Use an madvise syscall to hint to the kernel that it is free to take the physical memory corresponding to some memory back, i.e. it's unused. On next access, the kernel will back this memory again (like (2)).
  4. Unmap the memory altogether, which both evicts any physical memory backing it, and removes that virtual memory mapping. This can be slow, so in general it is not used by allocators and runtimes.

The bit I don't understand, and the reason I'm not 100% sure of this theory, is that switching to MADV_DONTNEED makes the under-reporting go away. (I.e. it appears as if MADV_DONTNEED didn't make the app use more memory; it just made container_working_set_bytes report the usage more accurately.) But I have no idea why switching to MADV_DONTNEED would make that happen. Is it somehow remedying the under-reporting? Or is it actually causing additional allocations that just happen to approximately balance out the under-reporting (even though the under-reporting continues)? Or is it just introducing a new and different reporting error, as suggested by some of the replies above? I don't know.

This makes sense to me. MADV_DONTNEED on Linux causes the kernel to update RSS numbers immediately, since it evicts any physical memory backing the virtual mapping right away. MADV_FREE, on the other hand, is more of a hint: the kernel is free to take back that physical memory lazily. However, the application cannot assume that data stored in that memory is still usable, because it can be evicted at any time. The runtime (via MemStats.HeapReleased) thus reports that it has released that memory to the OS, but RSS metrics reported by the OS may lag until the OS actually takes that memory back.

In general, MADV_FREE is more efficient. However, it leads to non-trivial monitoring problems, which is why we switched back to MADV_DONTNEED by default in Go 1.16.

@mknyszek
Contributor

mknyszek commented Jul 1, 2021

@sganapathy1312 That's somewhat surprising to me. If you can acquire a GODEBUG=scavtrace=1 trace for me comparing Go 1.15 and Go 1.16, that would be great. I'm still thinking that the underlying cause here is what I mentioned in #44554 (comment).

@JohnRusk

JohnRusk commented Jul 5, 2021

gctrace only counts bytes of objects, in general, it does not include fragmentation (except for a very specific kind of internal fragmentation), space for heap metadata, memory that is mapped and committed, but unused ("idle") so I expect it to be an underestimate.

So do I. So it's interesting that in my tests the gctrace figures are higher than container_working_set_bytes.

@mknyszek
Contributor

mknyszek commented Jul 6, 2021

@JohnRusk Ah. I see. My bad for misreading.

That can happen for similar reasons. If you make a large allocation but never actually touch that memory on Linux, then it'll never get paged in. Linux might not*** count that in RSS.

*** It really depends. If it contains pointers such that the GC touches that memory, it could get paged in. If the Go memory allocator allocates space for it using memory that's already paged in, then it will of course count. If you want to run an experiment, try a small program that creates a 1 GiB byte slice (e.g. make([]byte, 1<<30)) and then sits there (maybe running a GC every couple seconds in a background goroutine). gctrace will report a 1 GiB heap but your RSS will be tiny if the application isn't doing anything else.

@zhixinwen

zhixinwen commented Feb 28, 2022

I am seeing similar issues in our k8s application. The application shows memory-leak-like behavior: memory usage increases gradually and never goes down according to GKE's pod memory graph. I cannot tell which metric GKE uses, because it is not documented. But after I added GODEBUG=madvdontneed=0, memory usage stabilized.

One interesting observation is that alloc_bytes reported by Go's MemStats is actually smaller while the GKE graph indicates a memory leak.

This is the graph from GKE memory usage before and after enabling the flag:
[screenshot: GKE memory usage graph]
This is the memstats data:
[screenshot: MemStats data]

@gopherbot gopherbot added the compiler/runtime Issues related to the Go compiler and/or runtime. label Jul 7, 2022
@seankhliao seankhliao added this to the Unplanned milestone Aug 20, 2022
@cedricve

I am seeing similar issues in our k8s application. The application shows memory-leak-like behavior: memory usage increases gradually and never goes down according to GKE's pod memory graph. I cannot tell which metric GKE uses, because it is not documented. But after I added GODEBUG=madvdontneed=0, memory usage stabilized.

One interesting observation is that alloc_bytes reported by Go's MemStats is actually smaller while the GKE graph indicates a memory leak.

This is the graph from GKE memory usage before and after enabling the flag: [screenshot] This is the MemStats data: [screenshot]

This is actually reassuring, as I have been chasing a memory leak in one of our Golang applications. When built as a static binary and run directly, memory remained consistent; however, when running in Docker or a Kubernetes deployment, the same behaviour shows up as what you illustrated.

We've currently added that environment variable, curious what it will do to the runtime behaviour.

    - name: GODEBUG
      value: madvdontneed=0

@zhixinwen

I am seeing similar issues in our k8s application. The application shows memory-leak-like behavior: memory usage increases gradually and never goes down according to GKE's pod memory graph. I cannot tell which metric GKE uses, because it is not documented. But after I added GODEBUG=madvdontneed=0, memory usage stabilized.

One interesting observation is that alloc_bytes reported by Go's MemStats is actually smaller while the GKE graph indicates a memory leak.

This is the graph from GKE memory usage before and after enabling the flag: [screenshot] This is the MemStats data: [screenshot]

We have had it set for the past few months, and we have noticed some other issues.
The GKE dashboard shows memory usage dropping, but:

  1. MemStats sys_bytes shows much higher RAM usage.
  2. The container can OOM even when the GKE dashboard shows memory usage well below the request. Also, when the OOM happens, we don't see an OOMKill event.

I suspect the flag hides the issue rather than fixes it.

@truong-hua

truong-hua commented Mar 11, 2024

I have the same issue with TiDB running in a Docker container, but memory only increases quickly up to the first 4GB (the system has 16GB of memory); after that, usage is quite stable at 4GB, while pprof/heap reports only hundreds of MB. I'm testing with GODEBUG=madvdontneed=0, and at first glance, RSS reported by top seems more stable and lower than with MADV_DONTNEED enabled.
