Go version

go version go1.22.6 darwin/arm64

Output of go env in your module/workspace:

What did you do?

https://go.dev/play/p/1awTlRO9FrO

package main

func f[T interface{ ~[2]int | ~[4]int }]() {
	println(len(T{}))     // Zero alloc, build failure
	println(len(*new(T))) // Build ok, but allocates
}

func main() {
	f[[2]int]()
}
What did you see happen?
Only the version that allocates at runtime works. The Go compiler has a nifty optimization that evaluates len([2]byte), for example, at compile time, but it doesn't seem to apply to generic types. I can allocate in the generic version, but then I pay for a useless alloc and GC cost.
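(Illustrative aside, not from the original report: for ordinary, non-generic array types the spec does make len a compile-time constant, which is the behaviour being asked for here.)

package main

// For non-generic array types, len is a compile-time constant as long as the
// expression contains no function calls or channel receives.
const fromLiteral = len([2]byte{}) // 2, evaluated at compile time

var buf [4]int

const fromVar = len(buf) // 4, also a constant; buf itself is not evaluated

func main() {
	println(fromLiteral, fromVar)
}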
What did you expect to see?
I expect a zero-alloc way to figure out the length of the array.
Edit: For reference, my use case is a generic encoder that can encode a bunch of different statically sized byte arrays (e.g. maybe 10 different sizes). The input is provided as a pointer to the array (I have a non-pointer variant too, but this one needs a pointer). A nil pointer encodes as a fixed-size byte array of all zeroes, hence why I need to know what size it would be without allocating, since my lib is zero-alloc (for now :P).
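A minimal sketch of that use case (hypothetical names, not the actual library API; it only illustrates where the array length is needed even when there is no value to read from):

package main

// encodePtr is a hypothetical sketch of the encoder described above.
// A nil *T must still produce a fixed-size run of zero bytes, which is why
// the array length is needed without having a value in hand.
func encodePtr[T interface{ ~[2]byte | ~[4]byte }](dst []byte, p *T) []byte {
	if p == nil {
		n := len(*new(T)) // builds, but allocates a T (the problem reported here)
		for i := 0; i < n; i++ {
			dst = append(dst, 0)
		}
		return dst
	}
	for i := 0; i < len(*p); i++ {
		dst = append(dst, (*p)[i])
	}
	return dst
}

func main() {
	v := [2]byte{1, 2}
	out := encodePtr(nil, &v)          // encodes 1, 2
	out = encodePtr[[4]byte](out, nil) // nil pointer: four zero bytes
	println(len(out))                  // 6
}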
This is impossible to fix because of how constants and generics work in Go. The only way to make it work would be to require any generic function that calls len to be compiled separately for each possible type argument. We aren't going to make that choice. Sorry.
I'm also happy to be able to compute the length of a type parameter without instantiating one. Any idea how to do that? Whilst I can accept that the len(T{}) compiler trick doesn't work for generics, IMO the use case should be solvable without allocating.
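Two directions that might be worth benchmarking (not suggested in this thread, and neither gives a compile-time constant; whether they actually avoid the heap allocation needs to be verified with an allocation benchmark): declaring a zero value of T as a local variable, or asking reflect for the array length.

package main

import "reflect"

// arrayLen returns len(T) via a stack-local zero value. It is not a
// compile-time constant, but a local array that does not escape is normally
// not heap-allocated (verify with an allocation benchmark).
func arrayLen[T interface{ ~[2]int | ~[4]int }]() int {
	var zero T
	return len(zero)
}

// arrayLenReflect reads the length from the type descriptor instead of a value.
// reflect.TypeFor was added in Go 1.22; Len panics for non-array kinds.
func arrayLenReflect[T interface{ ~[2]int | ~[4]int }]() int {
	return reflect.TypeFor[T]().Len()
}

func main() {
	println(arrayLen[[2]int]())        // 2
	println(arrayLenReflect[[4]int]()) // 4
}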