runtime: panic: runtime error: hash of unhashable type [2]string #67608
Comments
CC @golang/runtime. This does seem impossible. Have you tried running the program under the race detector?
Added -race to the build (though I'm unsure whether additional steps are required); the panic remained and no race was reported.
Passing on gotip.
If you have a passing and a failing state, then a binary search might reveal which CL changed things. Does your program use plugins? I noticed the …
👍 - is there perhaps a guide we could follow, or do we just list the git hashes and have at it?
We do use plugins; however, no plugins are loaded as part of the test (no .so files, no plugin.Open). We can likely disable CGO (and thereby plugins) and see whether the issue persists.
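For context, "loading a plugin" here means a run-time call along the lines of the sketch below. This is a generic illustration of the standard plugin package, not code from this project; the plugin path "example.so" and the symbol "DoSomething" are placeholders.

```go
package main

import (
	"fmt"
	"plugin"
)

func main() {
	// Open a previously built plugin (.so) and look up an exported symbol.
	// "example.so" and "DoSomething" are hypothetical names for illustration.
	p, err := plugin.Open("example.so")
	if err != nil {
		panic(err)
	}
	sym, err := p.Lookup("DoSomething")
	if err != nil {
		panic(err)
	}
	fn, ok := sym.(func())
	if !ok {
		panic("unexpected symbol type")
	}
	fn()
	fmt.Println("plugin call completed")
}
```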
You can use git bisect: https://git-scm.com/docs/git-bisect
@randall77 one of our brilliant SRE guys managed to do as you suggested:
In light of the traced CL, is there some workaround for the behaviour, or some indication of what may be causing it? As mentioned, a direct build from source seems to be passing, so there should be some wider environment difference, either at build or at run time, that results in the panic, meaning there should be some way to avoid it...
Excellent, thanks. That CL certainly looks related.
Never mind, that CL fixes the problem. So I guess we could reconsider backporting (which we chose not to do).
I'm afraid I don't understand this. What other build is there? You mention …
Maybe. Using the type …
Even stranger is that it only fails when built via goreleaser (release tooling). If built from source without that indirection, via a Dockerfile, the failure doesn't appear. It also doesn't appear on 1.21 with the same goreleaser release pipeline. I've asked if we could pinpoint the breaking CL as well.
We have two different build processes, as I tried to describe in the issue:
goreleaser is, in essence, release build tooling that wraps go build, creates deb and rpm packages, and ultimately produces a Docker image where those packages are installed. Our release build is failing those CI tests, while the very minimal Dockerfile that skips all of those steps and just uses go build passes the same CI tests.
I didn't catch you there, the …
Hm, I'm not sure what goreleaser might do differently then. Certainly trying to match what goreleaser does in your simple docker build, or paring back what goreleaser does to match the simple docker build, might illuminate things. One thing I would check: make sure that you're actually getting the right Go version in both cases. You can print …
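The rest of that suggestion was cut off above; a minimal sketch of one way to confirm the toolchain at run time, assuming logging the version at startup is acceptable:

```go
package main

import (
	"fmt"
	"runtime"
)

func main() {
	// Report which Go toolchain this binary was actually compiled with,
	// to compare the goreleaser build against the plain `go build` one.
	fmt.Println("built with:", runtime.Version())
}
```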
I mean doing, anywhere in the package, something like:
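The snippet itself did not survive in this thread; below is a plausible sketch of what such a declaration could look like, assuming the intent is to introduce a compile-time use of a map keyed by [2]string (the bool element type is my own placeholder):

```go
// Hypothetical reconstruction of the suggested workaround: a package-level
// declaration that forces a compile-time use of a map keyed by [2]string,
// so the compiler emits the hash/equality metadata for that map type.
var _ = map[[2]string]bool{}
```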
This just introduces a use of …
This would be the breaking CL: cf68384
Verified with …
Just want to add that we are seeing this as well: the same runtime panic within the OpenTelemetry code. Go 1.22.4 on macOS 14.5 Sonoma with an M1 Max chip. go env output:
Timed out in state WaitingForInfo. Closing. (I am just a bot, though. Please speak up if this is a mistake or you have the requested information.)
This isn't actually done yet, reopening.
Go version
go1.22.3 linux/amd64
Output of `go env` in your module/workspace:

What did you do?
I'm running some integration tests trying to upgrade to 1.22.3 and am encountering a panic which seems impossible.
What did you see happen?
I get the following panic during the execution of the integration test.
So far it's particular to building the binary with goreleaser. I've verified that `go version -m pkg` matches between the breaking and the passing build. Both builds are built from the same source tree, but the build from source doesn't trigger the panic. I've tried various debugging approaches. The exact same goreleaser pipeline is used with recent 1.21 versions (1.21.8-1.21.10), and the resulting build doesn't trigger the panic. The panic is reliably triggered with 1.22.3 and doesn't seem racy; however, I haven't been able to reproduce it with a direct build from source, using this Dockerfile or this one using the 1.22-bookworm base. It issues `make build`, which is essentially just a wrapper for `go build -tags=goplugin -trimpath .`.

One thing that made a difference was running the binary under the delve debugger; in that case, the panic doesn't occur. Additionally, the panic itself is strange, because `[2]string` seems to be a valid map key, via playground: https://go.dev/play/p/tm_uKffqff0. Looking at the source code, it feels impossible to trigger the exact panic; the `[2]string` may be coming from …

The build environment has been breaking with `golang:1.22-bullseye` and `golang:1.22-bookworm` (1.22.3).

What did you expect to see?
no panic
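For reference, a minimal sketch along the lines of the playground link above (not the exact playground contents), showing that `[2]string` is a comparable array type and therefore normally a valid map key:

```go
package main

import "fmt"

func main() {
	// [2]string is a comparable array type, so it is a valid (hashable) map key.
	m := map[[2]string]int{}
	m[[2]string{"foo", "bar"}] = 1
	fmt.Println(m[[2]string{"foo", "bar"}]) // prints 1
}
```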