
log: use sync.Pool to reduce allocations #46285

Closed
andig opened this issue May 20, 2021 · 3 comments
Labels
FeatureRequest, FrozenDueToAge, NeedsInvestigation, Performance
Milestone
Backlog

Comments

@andig
Contributor

andig commented May 20, 2021

What version of Go are you using (go version)?

$ go version
go version go1.16.3 darwin/arm64

Does this issue reproduce with the latest release?

yes

What operating system and processor architecture are you using (go env)?

go env Output
$ go env
GO111MODULE=""
GOARCH="arm64"
GOBIN=""
GOCACHE="/Users/andig/Library/Caches/go-build"
GOENV="/Users/andig/Library/Application Support/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="arm64"
GOHOSTOS="darwin"
GOINSECURE=""
GOMODCACHE="/Users/andig/go/pkg/mod"
GONOPROXY=""
GONOSUMDB=""
GOOS="darwin"
GOPATH="/Users/andig/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/opt/homebrew/Cellar/go/1.16.3/libexec"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/opt/homebrew/Cellar/go/1.16.3/libexec/pkg/tool/darwin_arm64"
GOVCS=""
GOVERSION="go1.16.3"
GCCGO="gccgo"
AR="ar"
CC="clang"
CXX="clang++"
CGO_ENABLED="1"
GOMOD="/Users/andig/htdocs/evcc/go.mod"
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -arch arm64 -pthread -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=/var/folders/sv/rs_453y57xj86xsbz3kw1mbc0000gn/T/go-build87690164=/tmp/go-build -gno-record-gcc-switches -fno-common"

What did you do?

While investigating a server application with a large number of active log.Loggers (each with a different prefix), I noticed that memory consumption kept growing, and pprof showed the memory being consumed in log.Output. The reason appears to be the logger's allocated, grow-only buffer in combination with extensive logging of e.g. HTTP traffic; see

l.buf = l.buf[:0]

in log.Logger.Output.

What did you expect to see?

Flat memory consumption for logging, almost flat total.

What did you see instead?

Growing memory consumption as more live objects holding their own (now large) loggers are added.
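
For illustration, here is a minimal sketch (not part of the original report) of how the effect can be observed with the log implementation as of Go 1.16: every log.Logger keeps its own grow-only buffer, so one long line per logger is enough to pin the corresponding memory. The logger count and line size are arbitrary, and io.Discard is used only to isolate the effect.

package main

import (
	"fmt"
	"io"
	"log"
	"runtime"
	"strings"
)

func heapAlloc() uint64 {
	runtime.GC()
	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	return m.HeapAlloc
}

func main() {
	// Many long-lived loggers with different prefixes, as in the report.
	loggers := make([]*log.Logger, 1000)
	for i := range loggers {
		loggers[i] = log.New(io.Discard, fmt.Sprintf("dev-%d ", i), log.LstdFlags)
	}

	before := heapAlloc()

	// One long line per logger, e.g. a dumped HTTP response body.
	line := strings.Repeat("x", 64<<10) // 64 KiB
	for _, l := range loggers {
		l.Print(line)
	}

	// Each logger's internal buffer has grown to hold the long line and
	// is retained for as long as the logger itself is alive.
	fmt.Printf("heap grew by roughly %d MiB\n", (heapAlloc()-before)>>20)
}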

I would like to suggest making log.Logger use a sync.Pool like this:

var bufPool = sync.Pool{
	New: func() interface{} {
		return []byte{}
	},
}

where Output becomes:

	// l.buf = l.buf[:0]
	buf := bufPool.Get().([]byte)

	// as before
	l.formatHeader(&buf, now, file, line)
	buf = append(buf, s...)
	if len(s) == 0 || s[len(s)-1] != '\n' {
		buf = append(buf, '\n')
	}
	_, err := l.out.Write(buf)

	// put small buffers back
	if cap(buf) < 256 {
		bufPool.Put(buf[:0])
	}
	return err

That way, particularly long log lines would no longer pin memory permanently, at the cost of some additional allocations. One might also choose a higher size limit for putting buffers back into the pool.
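
A side note on that extra allocation: putting a plain []byte into a sync.Pool boxes the slice header into an interface{}, which itself allocates on every Put (this is what staticcheck's SA6002 check warns about). A common workaround is to pool pointers to slices instead. The following is only a sketch of that variant, not part of the proposal above; getBuf and putBuf are made-up helper names, and the 256-byte initial capacity and 64 KiB drop threshold are illustrative.

var bufPool = sync.Pool{
	New: func() interface{} {
		b := make([]byte, 0, 256)
		return &b // pool a *[]byte so Put does not allocate for the interface conversion
	},
}

func getBuf() *[]byte { return bufPool.Get().(*[]byte) }

func putBuf(b *[]byte) {
	if cap(*b) > 64<<10 {
		return // drop oversized buffers instead of keeping them alive in the pool
	}
	*b = (*b)[:0]
	bufPool.Put(b)
}

Inside Output, the body sketched above would then look like:

	bp := getBuf()
	buf := *bp
	l.formatHeader(&buf, now, file, line)
	buf = append(buf, s...)
	if len(s) == 0 || s[len(s)-1] != '\n' {
		buf = append(buf, '\n')
	}
	_, err := l.out.Write(buf)
	*bp = buf // write back in case append reallocated
	putBuf(bp)
	return err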

I would be happy to provide a PR.

@andig andig changed the title make log.Logger use sync.Pool make log.Logger use sync.Pool to reduce memory consumption May 20, 2021
@seankhliao
Member

cc @robpike

@seankhliao seankhliao changed the title make log.Logger use sync.Pool to reduce memory consumption log: use sync.Pool to reduce allocations May 20, 2021
@ianlancetaylor
Contributor

It's quite unusual for a single program to have so many loggers that this will make any difference. And using a pool will tend to make ordinary logging slightly slower. This seems like a case where it may be preferable to use your own logging package.
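
For reference, the "own logging package" route can be quite small. Below is a rough sketch (not from the thread) of a wrapper that formats into a pooled bytes.Buffer and writes the result in one call, so no per-logger buffer is retained between calls. The package name, the 64 KiB limit, and the simplified header format are all illustrative choices, not an established API.

package plog

import (
	"bytes"
	"fmt"
	"io"
	"sync"
	"time"
)

var msgPool = sync.Pool{
	New: func() interface{} { return new(bytes.Buffer) },
}

// Logger formats into a pooled buffer instead of a per-logger one.
type Logger struct {
	mu     sync.Mutex
	out    io.Writer
	prefix string
}

func New(out io.Writer, prefix string) *Logger {
	return &Logger{out: out, prefix: prefix}
}

func (l *Logger) Printf(format string, args ...interface{}) {
	b := msgPool.Get().(*bytes.Buffer)
	b.Reset()
	b.WriteString(time.Now().Format("2006/01/02 15:04:05 "))
	b.WriteString(l.prefix)
	fmt.Fprintf(b, format, args...)
	if n := b.Len(); n == 0 || b.Bytes()[n-1] != '\n' {
		b.WriteByte('\n')
	}

	l.mu.Lock()
	l.out.Write(b.Bytes())
	l.mu.Unlock()

	// Recycle only reasonably sized buffers; very long lines are dropped
	// so they do not stay resident via the pool.
	if b.Cap() <= 64<<10 {
		msgPool.Put(b)
	}
}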

@dmitshur dmitshur added this to the Backlog milestone May 21, 2021
@dmitshur dmitshur added the NeedsInvestigation and Performance labels May 21, 2021
@andig
Contributor Author

andig commented May 22, 2021

Agreed, this can be worked around. It still felt surprising that logging larger amounts of data effectively adds that data to the program's resident memory. If there were a real-life application test I could validate against, I'd be happy to try the proposed change and see whether it makes a difference.
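
Not a real-life application test, but as a possible starting point, a micro-benchmark along these lines could compare the current implementation with a pooled variant (run with go test -bench=. -benchmem). The prefix, the mix of line sizes, and the 32 KiB body size are arbitrary.

package logbench

import (
	"io"
	"log"
	"strings"
	"testing"
)

func BenchmarkLoggerMixedLineSizes(b *testing.B) {
	l := log.New(io.Discard, "bench ", log.LstdFlags)
	short := "GET /api/state 200"
	long := strings.Repeat("x", 32<<10) // e.g. a dumped HTTP body

	b.ReportAllocs()
	for i := 0; i < b.N; i++ {
		if i%100 == 0 {
			l.Print(long) // the occasional long line grows the logger's internal buffer
		}
		l.Print(short)
	}
}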
