encoding/json: improve json.Decoder when the io.Reader is a bytes.Buffer #9706
Comments
What's the context? I'm all for reducing allocations, but I want to make sure that this is worthwhile before we investigate adding special cases for this.
Reducing copies and making json.Decoder re-usable are separate issues. I don't think the copies (which don't contribute to GCs, only CPU) will matter as much as a Reset(io.Reader) method on json.Decoder, which is probably worth doing.
My example was bad; my idea is simply to reduce memory copies and allocations. When the data is large (say 1 MB), readValue() performs many copies and allocations, and I hope we can avoid that:

```go
buffer := bytes.NewBuffer(make([]byte, 0, 1024))
// ... read bytes into buffer from the network
dec := json.NewDecoder(buffer) // buffer is a *bytes.Buffer
dec.Decode(&m)                 // copies all bytes into the internal buf
// ... do some things
```
I hope we can avoid or improve this when the reader is a bytes.Buffer, bytes.Reader, or strings.Reader.
I understand the desire to improve things, but we're not going to add special cases if they don't address a real issue. |
json.Decoder could be improved when the reader is a bytes.Buffer. json.Decoder.Decode() copies all bytes into the internal buf in readValue(), which is what I want to avoid: I want to reuse the buffer with zero copies, reducing the number of GC cycles. See also #7709.