
x/playground: easily runs into build timeouts by downloading modules #56977

Open
mvdan opened this issue Nov 29, 2022 · 5 comments
Labels: NeedsInvestigation (Someone must examine and confirm this is a valid issue and not a duplicate of an existing one.)
Milestone: Unreleased

Comments

mvdan (Member) commented Nov 29, 2022

For example, see https://go.dev/play/p/b4S-VWK-yWR, which needs to download https://pkg.go.dev/cuelang.org/go@v0.5.0-beta.1 and its dependencies. It currently times out at build time, presumably while still downloading the module zips:

timeout running go build
go: downloading cuelang.org/go v0.5.0-beta.1
go: downloading github.com/cockroachdb/apd/v2 v2.0.2
go: downloading github.com/mpvl/unique v0.0.0-20150818121801-cbe035fff7de
go: downloading golang.org/x/text v0.3.8
go: downloading golang.org/x/net v0.0.0-20220722155237-a158d28d115b
go: downloading github.com/google/uuid v1.2.0
go: downloading github.com/pkg/errors v0.8.1
go: downloading gopkg.in/yaml.v3 v3.0.1

Go build failed.

It appears that the timeout happens after about ten seconds.

I wrote a small program a while ago to estimate how big all the zips in go list -m all are. The results, starting with the largest in bytes, are as follows:

$ go-mod-size | sort -k2 -n -r
golang.org/x/text@v0.3.8 8615981
golang.org/x/tools@v0.1.12 3946738
cuelang.org/go@v0.5.0-beta.1 2062942
golang.org/x/sys@v0.0.0-20220722155257-8c9f86f7a55f 1797601
golang.org/x/net@v0.0.0-20220722155237-a158d28d115b 1582191
github.com/cockroachdb/apd/v2@v2.0.2 320918
github.com/protocolbuffers/txtpbfmt@v0.0.0-20220428173112-74888fd59c2b 271386
github.com/pkg/diff@v0.0.0-20210226163009-20ebb0f2a09e 219375
github.com/rogpeppe/go-internal@v1.9.0 210263
golang.org/x/mod@v0.6.0-dev.0.20220818022119-ed83ed61efb9 169515
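
A minimal sketch of that kind of program (not the exact tool I used, just the same idea: let `go mod download -json` report where each module zip lives, then stat the files) could look like this:

```go
// go-mod-size: print "module@version zipSizeInBytes" for every module in
// the build list, so the output can be piped through `sort -k2 -n -r`.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os"
	"os/exec"
)

func main() {
	// Ask the go command to download (or locate) every module zip and
	// describe each one as a JSON object on stdout.
	cmd := exec.Command("go", "mod", "download", "-json", "all")
	out, err := cmd.StdoutPipe()
	if err != nil {
		log.Fatal(err)
	}
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}
	dec := json.NewDecoder(out)
	for dec.More() {
		var m struct {
			Path, Version, Zip string
		}
		if err := dec.Decode(&m); err != nil {
			log.Fatal(err)
		}
		if m.Zip == "" {
			continue // e.g. the main module, which has no zip
		}
		info, err := os.Stat(m.Zip)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s@%s %d\n", m.Path, m.Version, info.Size())
	}
	// go mod download may exit non-zero for modules it cannot download;
	// the sizes printed above are still useful, so only log the error.
	if err := cmd.Wait(); err != nil {
		log.Printf("go mod download: %v", err)
	}
}
```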

So a clean go build does download a fair bit - at least 15MiB or so. The total number of modules is 32, per go list -m all | wc -l. I think the playground should be able to download these modules by doing any of:

  1. Talking to a fast GOPROXY. GOMODCACHE=$PWD/tmp time go build reports a total runtime of about 3s on my laptop, with a 500Mbps ethernet connection and an 11ms ping to proxy.golang.org. I'm sure that a playground server could do better.

  2. Reusing a GOMODCACHE on a best-effort basis. The same could be said about GOCACHE, to cache builds and help with compile timeouts. (A sketch combining this with option 4 follows the list.)

  3. Increasing the timeout for multi-file playground inputs with go.mod and go.sum files. For example, given the number of lines in either go.mod or go.sum, we could estimate that this example needs a 20s timeout rather than 10s. Presumably this shouldn't be too harmful, as the playground should cache the result once it runs without a timeout. (A rough heuristic is sketched right after the list.)

  4. Implementing a more sophisticated timeout. For example, the playground could have very generous timeouts for downloading modules, given that network traffic is presumably cheap. The timeout for compiling and linking could similarly be more generous than the timeout for running the program, as there's no chance of arbitrary code burning CPU or abusing free compute.
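
For option 3, the heuristic could be as dumb as scaling the deadline with the size of go.sum. A rough sketch; the base value, per-line increment, cap, and package name below are made up for illustration, not measured or taken from the playground:

```go
package sandbox // hypothetical package name, not the playground's real code

import (
	"bytes"
	"time"
)

// estimateBuildTimeout grows the build deadline with the number of lines
// in go.sum, on the assumption that more requirements roughly means more
// downloading and compiling. The constants are illustrative only.
func estimateBuildTimeout(goSum []byte) time.Duration {
	lines := bytes.Count(goSum, []byte("\n"))
	d := 10*time.Second + time.Duration(lines)*100*time.Millisecond
	if d > 60*time.Second {
		d = 60 * time.Second // still enforce a hard upper bound
	}
	return d
}
```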
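
And a minimal sketch of options 2 and 4 combined: point module downloads at a persistent, best-effort GOMODCACHE and give `go mod download` a much more generous deadline than `go build` itself. The paths, durations, and function name here are assumptions for illustration, not the playground's actual implementation:

```go
package sandbox // hypothetical; not the playground's actual build code

import (
	"context"
	"os"
	"os/exec"
	"time"
)

// buildSnippet builds the user's program in dir in two phases, with a
// generous network deadline and a tighter compile deadline.
func buildSnippet(dir string) error {
	env := append(os.Environ(),
		// Shared across builds on a best-effort basis (option 2).
		"GOMODCACHE=/var/cache/playground/gomodcache",
	)

	// Phase 1: downloading modules is network-bound and cheap to allow,
	// so give it a long deadline (option 4).
	dlCtx, cancelDL := context.WithTimeout(context.Background(), 60*time.Second)
	defer cancelDL()
	dl := exec.CommandContext(dlCtx, "go", "mod", "download")
	dl.Dir = dir
	dl.Env = env
	if err := dl.Run(); err != nil {
		return err
	}

	// Phase 2: compiling and linking burn CPU, so keep this deadline
	// tighter; actually running the binary would be stricter still.
	buildCtx, cancelBuild := context.WithTimeout(context.Background(), 20*time.Second)
	defer cancelBuild()
	build := exec.CommandContext(buildCtx, "go", "build", "-o", "a.out", ".")
	build.Dir = dir
	build.Env = env
	return build.Run()
}
```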

cc @toothrot per https://dev.golang.org/owners

gopherbot added this to the Unreleased milestone Nov 29, 2022
mvdan (Member, Author) commented Nov 29, 2022

Worth noting that, since go build prints which modules it's downloading but not which packages it's building, it could be that my timeout above happened during the compilation or linking of the program; I don't think I can know for sure. If it is indeed not a "module download" timeout, I think that should be made clearer.

seankhliao added the NeedsInvestigation label Nov 30, 2022
findleyr (Contributor) commented:

We discussed this in the tools call yesterday, and I feel like I may have prematurely determined that this problem is not easily solvable.

If timeouts are during the build, not the download, we should consider increasing the size of the playground App Engine instances:
https://cs.opensource.google/go/x/playground/+/master:app.yaml;l=8;drc=9bfb5bdef13596a261db44e1c28a9fc8ddd7bc77

Right now, they appear to be at the minimum number of CPUs:
https://cloud.google.com/appengine/docs/flexible/reference/app-yaml?tab=go#resource-settings

It may be worth just 2x'ing or 4x'ing the instance shape to see if it improves anything.
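
For reference, the relevant resources stanza in an App Engine flexible app.yaml looks roughly like the snippet below; the numbers are only an illustration of what a bumped-up shape could be, not the playground's current or proposed values:

```yaml
resources:
  cpu: 4          # illustrative only
  memory_gb: 8    # illustrative only
  disk_size_gb: 10
```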

CC @golang/release

mvdan (Member, Author) commented Jan 12, 2023

Two CPUs and two gigabytes of memory don't sound like enough to build some non-trivial examples with dependencies in a reasonable amount of time :) so I would agree with that. Though I would also like to see better caching and longer build timeouts.

gopherbot commented Jan 12, 2023

Change https://go.dev/cl/461795 mentions this issue: playground: 4x resources for the tip playground

gopherbot pushed a commit to golang/playground that referenced this issue Jan 12, 2023
Experiment whether increasing instance shape can help mitigate build
timeouts.

For golang/go#56977

Change-Id: Ib516b97b9729151542331f8bae8ad5725914d2f2
Reviewed-on: https://go-review.googlesource.com/c/playground/+/461795
Run-TryBot: Robert Findley <rfindley@google.com>
TryBot-Result: Gopher Robot <gobot@golang.org>
Reviewed-by: Heschi Kreinick <heschi@google.com>
mvdan (Member, Author) commented Apr 7, 2023

For what it's worth, the higher resources don't seem to get us out of the timeouts. Relatively light examples like https://go.dev/play/p/qPDCgnJgO39 still trigger a timeout. Some dependencies like x/text and x/tools still weigh megabytes, but surely that shouldn't be out of the norm :)
