
archive/zip: reject certain invalid archives #33026

Open
MichaelTJones opened this issue Jul 10, 2019 · 16 comments
@MichaelTJones
Contributor

This is not a defect report, but a suggestion to review the zip archive reader with this new and authoritative zip bomb treatise in mind:

https://www.bamsoftware.com/hacks/zipbomb/

@ianlancetaylor
Contributor

CC @dsnet

@ianlancetaylor changed the title from "security review of zip decoder" to "archive/zip: security review of zip decoder" on Jul 10, 2019
@ianlancetaylor added the NeedsInvestigation label on Jul 10, 2019
@ianlancetaylor added this to the Go1.13 milestone on Jul 10, 2019
@dsnet
Member

dsnet commented Jul 10, 2019

Hi @MichaelTJones, thanks for the report. This attack depends on two properties (called "insights" in the paper).

I believe Go's zip.Writer refuses to encode zip archives like this. However, Go's zip.Reader is probably susceptible to both of the properties described in insights 1 and 2 (I have not verified this). Insight 2 seems to obviously produce an invalid ZIP archive, and our reader should error on such files. However, it's not clear to me that the directly overlapping files of insight 1 are invalid. I personally believe this should be an error as well, but I'm concerned that there may be people actually depending on this property.

@MichaelTJones
Contributor Author

MichaelTJones commented Jul 10, 2019 via email

@FiloSottile
Contributor

I read that article when it came out, and while it's extremely interesting in its techniques, I agree with the point it makes that unpacking archives from untrusted sources requires CPU, time, and memory limits.

We should fix archive/zip if it accepts invalid archives, but I don't think we can make it safe to run on arbitrary archives without resource limits. Offering an API to easily apply limits on the unpacking process would be interesting, but that's a separate issue and definitely not for Go 1.13.
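For illustration, a minimal sketch (not an API from this thread; the 10 MiB maxPerEntry cap is an arbitrary example) of the kind of caller-side limit being described, using io.LimitReader to bound how much each entry is allowed to decompress:

// Minimal sketch: cap the number of decompressed bytes read from each entry
// so a zip bomb cannot expand without bound. maxPerEntry is an arbitrary
// example value; the limit, names, and error handling are illustrative only.
package main

import (
	"archive/zip"
	"fmt"
	"io"
	"log"
	"os"
)

const maxPerEntry = 10 << 20 // 10 MiB per entry, example limit

func main() {
	r, err := zip.OpenReader(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}
	defer r.Close()

	for _, f := range r.File {
		rc, err := f.Open()
		if err != nil {
			log.Fatal(err)
		}
		// Read at most maxPerEntry+1 bytes; if the extra byte shows up, the
		// entry decompresses past the limit and the archive is rejected.
		n, err := io.Copy(io.Discard, io.LimitReader(rc, maxPerEntry+1))
		rc.Close()
		if err != nil {
			log.Fatal(err)
		}
		if n > maxPerEntry {
			log.Fatalf("%s: exceeds per-entry decompression limit", f.Name)
		}
	}
	fmt.Println("all entries within limit")
}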

Aside from the insights, the table also points out other invalid zips that we decode, so we should probably also fix those.

/cc @rsc since he's credited on that article, and might have relevant insight.

@FiloSottile changed the title from "archive/zip: security review of zip decoder" to "archive/zip: reject certain invalid archives" on Jul 10, 2019
@MichaelTJones
Contributor Author

MichaelTJones commented Jul 10, 2019 via email

@qbradq
Contributor

qbradq commented Jul 16, 2019

Is anyone working on this? If not I'd like to take it.

@MichaelTJones
Contributor Author

MichaelTJones commented Jul 16, 2019 via email

@AxbB36

AxbB36 commented Jul 19, 2019

A couple of things to watch out for here: reports of in-the-wild zip files that could no longer be processed after Debian merged a patch to reject overlapping files (Debian bug #931433: unzip: CVE-2019-13232).

@qbradq
Contributor

qbradq commented Jul 22, 2019

Do we want to support Firefox's non-standard zip layout? The current implementation of zip.Reader searches for a CDE in the last 64KB of the file. This means that although small test files encoded as described in the linked blog post might work, once the blob of data following the CDE grows to 64KB-(size of CDE) it will stop working.

Regarding duplicate CDEs, they can be scrubbed prior to scanning for overlapping files.

I have started toying with this. I note that when I add duplicate header detection, we have two test cases that choke on it (due to their inputs having duplicates).

Regarding adding a test case for those janky JAR files, the only example I have been able to track down and get a copy of is 1MB. That seems too large.

@MichaelTJones
Contributor Author

MichaelTJones commented Jul 22, 2019 via email

@qbradq
Contributor

qbradq commented Jul 22, 2019

I am just about ready to submit the CL for the zip bomb detection. I am going to go ahead and push the CL with full tests including the large binary files for review. If they need to be removed I can take care of it on the CL.

@AxbB36

AxbB36 commented Jul 23, 2019

Do we want to support Firefox's non-standard zip layout? The current implementation of zip.Reader searches for a CDE in the last 64KB of the file. This means that although small test files encoded as described in the linked blog post might work, once the blob of data following the CDE grows to 64KB-(size of CDE) it will stop working.

No, zip.Reader searches for the EOCD in the last 64 KB. The EOCD contains a pointer to the central directory, which may be anywhere in the file. zip.Reader does not currently have any problem with the Firefox layout, even if the zip file is larger than 64 KB.
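For reference, a minimal sketch of that lookup, assuming the classic non-Zip64 record only (the findEOCD helper and its error handling are illustrative): scan the last 64 KB + 22 bytes for the EOCD signature 0x06054b50 and read the 4-byte central directory offset stored at byte 16 of the record.

// findEOCD scans the tail of a zip file for the end-of-central-directory
// record (signature 0x06054b50) and returns the central directory offset
// stored at bytes 16..19 of that record. Zip64 is ignored in this sketch.
package main

import (
	"encoding/binary"
	"errors"
	"fmt"
	"log"
	"os"
)

func findEOCD(name string) (cdOffset uint32, err error) {
	b, err := os.ReadFile(name)
	if err != nil {
		return 0, err
	}
	// The EOCD is 22 bytes plus a comment of at most 65535 bytes, so it must
	// start somewhere in the last 64 KB + 22 bytes of the file.
	start := 0
	if len(b) > 65557 {
		start = len(b) - 65557
	}
	for i := len(b) - 22; i >= start; i-- {
		if binary.LittleEndian.Uint32(b[i:]) == 0x06054b50 {
			return binary.LittleEndian.Uint32(b[i+16:]), nil
		}
	}
	return 0, errors.New("EOCD not found")
}

func main() {
	off, err := findEOCD(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("central directory starts at offset %d\n", off)
}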

@qbradq
Contributor

qbradq commented Jul 23, 2019

Do we want to support Firefox's non-standard zip layout? The current implementation of zip.Reader searches for a CDE in the last 64KB of the file. This means that although small test files encoded as described in the linked blog post might work, once the blob of data following the CDE grows to 64KB-(size of CDE) it will stop working.

No, zip.Reader searches for the EOCD in the last 64 KB. The EOCD contains a pointer to the central directory, which may be anywhere in the file. zip.Reader does not currently have any problem with the Firefox layout, even if the zip file is larger than 64 KB.

Thank you for the clarification!

@gopherbot

Change https://golang.org/cl/187357 mentions this issue: archive/zip: detect and reject Better Zip Bomb

@AxbB36

AxbB36 commented Jul 25, 2019

Change https://golang.org/cl/187357

// Check for overlapping file bodies (Better Zip Bomb)
for il := range z.File {
	l := z.File[il]
	lstart := uint64(l.headerOffset)
	lend := uint64(l.headerOffset) + l.CompressedSize64
	for ir := il + 1; ir < len(z.File); ir++ {
		r := z.File[ir]
		rstart := uint64(r.headerOffset)
		rend := uint64(r.headerOffset) + r.CompressedSize64
		if lend <= rstart || lstart >= rend {
			continue
		}
		return ErrInvalid
	}
}

This looks like an O(n²) algorithm. You should test whether its performance is adequate on a zip file containing many files, such as the ffff.zip from thejoshwolfe/yauzl#108. You may be better off first sorting the array of Files (or indices into the array) by their headerOffset. After that, you only need a linear scan to check whether File n overlaps File n+1.

lend := uint64(l.headerOffset) + l.CompressedSize64 is the wrong calculation because it ignores the size of the local file header itself, which is variable because of the filename and extra field. I think what you want is lend := l.DataOffset() + l.CompressedSize64 (except that you need to check for error from DataOffset).

The additions probably have integer overflow issues. You should test with some Zip64 files that are hacked to have sizes close to 2⁶⁴.
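A minimal sketch of the suggested alternative, written against the exported API rather than the CL's internals (so it uses DataOffset instead of headerOffset; the span type and error handling are illustrative): sort the file bodies by start offset, check adjacent pairs in one pass, and guard the addition against overflow.

// Sketch of the sort-then-scan suggestion: collect each file body's
// [start, end) span using DataOffset (which accounts for the variable-length
// local header), sort by start, and reject the archive if any adjacent
// spans overlap. Sorting makes the check O(n log n) instead of O(n²).
package main

import (
	"archive/zip"
	"fmt"
	"log"
	"os"
	"sort"
)

type span struct{ start, end uint64 }

func main() {
	r, err := zip.OpenReader(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}
	defer r.Close()

	spans := make([]span, 0, len(r.File))
	for _, f := range r.File {
		off, err := f.DataOffset()
		if err != nil {
			log.Fatal(err)
		}
		start := uint64(off)
		end := start + f.CompressedSize64
		if end < start {
			// Guard against overflow from hostile Zip64 sizes near 2⁶⁴.
			log.Fatalf("%s: compressed size overflows", f.Name)
		}
		spans = append(spans, span{start, end})
	}
	sort.Slice(spans, func(i, j int) bool { return spans[i].start < spans[j].start })
	for i := 1; i < len(spans); i++ {
		if spans[i].start < spans[i-1].end {
			log.Fatal("overlapping file bodies")
		}
	}
	fmt.Println("no overlapping file bodies detected")
}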

There's no need to include the entire zbsm.zip, and doing so is probably a bad idea, as its hash has no doubt been added to malware detection engines by now. You can generate your own smaller test case using the source code.

$ ./zipbomb --mode=quoted_overlap --num-files=3 --compressed-size=234 > test.zip
$ wc -c test.zip
500 test.zip
$ ./ratio test.zip
test.zip	678120 / 500	1356.24	+31.323 dB

I will say, though, that in my opinion the motivation for this patch is misguided. Its most likely outcome is breaking compatibility with some genuine zip files, without doing much to protect callers against zip bombs. Even a naive non-overlapped DEFLATE bomb can have a compression ratio over 1000:1, which will be too much for some applications.
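To put a rough number on that last point, a small sketch (the sizes are an example, not a measurement from this thread) compressing 10 MiB of zero bytes with compress/flate; DEFLATE tops out at roughly 1032:1 on such input.

// Sketch: compress 10 MiB of zero bytes with DEFLATE and print the ratio.
// Highly repetitive input compresses at roughly 1000:1 without any zip tricks.
package main

import (
	"bytes"
	"compress/flate"
	"fmt"
	"log"
)

func main() {
	in := make([]byte, 10<<20) // 10 MiB of zeros

	var out bytes.Buffer
	w, err := flate.NewWriter(&out, flate.BestCompression)
	if err != nil {
		log.Fatal(err)
	}
	if _, err := w.Write(in); err != nil {
		log.Fatal(err)
	}
	if err := w.Close(); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%d -> %d bytes (%.0f:1)\n", len(in), out.Len(), float64(len(in))/float64(out.Len()))
}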

@andybons modified the milestone from Go1.13 to Go1.14 on Jul 30, 2019
@jorangreef

Just in case it helps with crafting this fix, https://github.com/ronomon/zip has mitigations for overlapping local file headers, as well as more "obviously bad data".

@rsc modified the milestone from Go1.14 to Backlog on Oct 9, 2019