runtime: stack grow panic tracing back through sigpanic from signal handler #23484
Interestingly, this looks like something that @aclements was just talking about: stack growth in the signal handler. I think the traceback is failing to correctly trace past the injected sigpanic frame. If I'm right, this is not a regression, and it's hard to fix, so although I just set the milestone to 1.10 I'm going to redirect it to 1.11.
Thanks. If you're right, the issue title is misleading. Feel free to update it to whatever makes more sense.
I agree with Ian that the traceback past sigpanic is what's failing here. Given that the traceback print is failing, it's no surprise that stack growth is panicking. This is eerily similar to #21431, which almost certainly also involves traceback failing around a sigpanic, this time on mipsle. Maybe we're just setting up our injected sigpanic calls slightly wrong on LR machines? I thought surely we'd have a test for this, but I can't actually find one.
Change https://golang.org/cl/89016 mentions this issue: |
Currently, if anything goes wrong when printing a traceback, we simply cut off the traceback without any further diagnostics. Unfortunately, right now, we have a few issues that are difficult to debug because the traceback simply cuts off (#21431, #23484).

This is an attempt to improve the debuggability of traceback failure by printing a diagnostic message plus a hex dump around the failed traceback frame when something goes wrong. The failures look like:

goroutine 5 [running]:
runtime: unexpected return pc for main.badLR2 called from 0xbad
stack: frame={sp:0xc42004dfa8, fp:0xc42004dfc8} stack=[0xc42004d800,0xc42004e000)
000000c42004dea8:  0000000000000001  0000000000000001
000000c42004deb8:  000000c42004ded8  000000c42004ded8
000000c42004dec8:  0000000000427eea <runtime.dopanic+74>  000000c42004ded8
000000c42004ded8:  000000000044df70 <runtime.dopanic.func1+0>  000000c420001080
000000c42004dee8:  0000000000427b21 <runtime.gopanic+961>  000000c42004df08
000000c42004def8:  000000c42004df98  0000000000427b21 <runtime.gopanic+961>
000000c42004df08:  0000000000000000  0000000000000000
000000c42004df18:  0000000000000000  0000000000000000
000000c42004df28:  0000000000000000  0000000000000000
000000c42004df38:  0000000000000000  000000c420001080
000000c42004df48:  0000000000000000  0000000000000000
000000c42004df58:  0000000000000000  0000000000000000
000000c42004df68:  000000c4200010a0  0000000000000000
000000c42004df78:  00000000004c6400  00000000005031d0
000000c42004df88:  0000000000000000  0000000000000000
000000c42004df98:  000000c42004dfb8  00000000004ae7d9 <main.badLR2+73>
000000c42004dfa8: <00000000004c6400  00000000005031d0
000000c42004dfb8:  000000c42004dfd0 !0000000000000bad
000000c42004dfc8: >0000000000000000  0000000000000000
000000c42004dfd8:  0000000000451821 <runtime.goexit+1>  0000000000000000
000000c42004dfe8:  0000000000000000  0000000000000000
000000c42004dff8:  0000000000000000
main.badLR2(0x0)
	/go/src/runtime/testdata/testprog/badtraceback.go:42 +0x49

For #21431, #23484.
Change-Id: I8718fc76ced81adb0b4b0b4f2293f3219ca80786
Reviewed-on: https://go-review.googlesource.com/89016
Run-TryBot: Austin Clements <austin@google.com>
TryBot-Result: Gobot Gobot <gobot@golang.org>
Reviewed-by: Cherry Zhang <cherryyz@google.com>
I'm not sure this is the same issue, but here's another with hexdump included: https://build.golang.org/log/cdc05ec757bc1a3625a1ae212dd827691c26a3be
And another: https://build.golang.org/log/6063e6e67248b01c9eb0100afc611ac23e203797
Bumping to 1.13, as this has already made its way from 1.10. @ianlancetaylor @aclements If you think this should be a release blocker for 1.12, please let us know.
I ran a search over the builder logs. It's been months since we've seen anything like this on the bots, except for two very recent failures which look similar but could be unrelated to whatever happened last year.
The next builds that match are from November and are mainly from windows-arm and windows-amd64-race. They don't really seem relevant. Those are the only failures going back to September.
@aclements @mknyszek Is this still a blocker for 1.13?
There haven't been any other crashes outside of the failures linked above. Maybe we should just retire this issue and file a new one for the May 8th failure (and maybe for the plan9 failures?)? The builders which were originally failing here no longer exist.
From https://build.golang.org/log/6864350004c318139a5516a5b65d5099a88a0272:
The stack looks interesting to me. @aclements?