GMS-2024-327: Scrapy decompression bomb vulnerability
Impact
Scrapy limits allowed response sizes by default through the DOWNLOAD_MAXSIZE and DOWNLOAD_WARNSIZE settings.
However, those limits were only enforced while downloading the raw, usually compressed response bodies, not during decompression, leaving Scrapy vulnerable to decompression bombs.
A malicious website being scraped could send a small compressed response that, on decompression, exhausts the memory available to the Scrapy process, potentially affecting any other process sharing that memory, and it could also inflate disk usage if responses are cached uncompressed.
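For context, both settings are configured in the project's settings module. The values below are illustrative only, and on unpatched versions they apply solely to the size of the downloaded (compressed) body:

```python
# settings.py -- illustrative values, not recommendations.

# DOWNLOAD_MAXSIZE: downloads larger than this are aborted (default: 1 GiB).
DOWNLOAD_MAXSIZE = 128 * 1024 * 1024   # 128 MiB

# DOWNLOAD_WARNSIZE: downloads larger than this only trigger a warning (default: 32 MiB).
DOWNLOAD_WARNSIZE = 16 * 1024 * 1024   # 16 MiB
```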
Patches
Upgrade to Scrapy 2.11.1.
If you are using Scrapy 1.8 or a lower version, and upgrading to Scrapy 2.11.1 is not an option, you may upgrade to Scrapy 1.8.4 instead.
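A minimal sketch for flagging affected installs, assuming the packaging library is available; the version thresholds follow this advisory (2.11.1 and 1.8.4):

```python
# check_scrapy_version.py -- sketch: warn if the installed Scrapy predates the fix.
from packaging.version import Version

import scrapy

installed = Version(scrapy.__version__)
patched = installed >= Version("2.11.1") or (
    Version("1.8.4") <= installed < Version("1.9")
)
if not patched:
    raise SystemExit(f"Scrapy {installed} is affected by GMS-2024-327; upgrade.")
```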
Workarounds
There is no easy workaround.
Disabling HTTP decompression altogether is impractical, as HTTP compression is a rather common practice.
However, it is technically possible to manually backport the 2.11.1 or 1.8.4 fix by replacing the corresponding components of an unpatched Scrapy version with patched copies maintained in your own code.
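As a sketch of that approach, the DOWNLOADER_MIDDLEWARES setting can disable the bundled HttpCompressionMiddleware and enable a locally maintained copy of the patched class instead; the my_project.middlewares path and the PatchedHttpCompressionMiddleware name are placeholders, and the class body itself must be copied from the patched Scrapy source:

```python
# settings.py -- sketch of swapping in a locally patched middleware copy.
DOWNLOADER_MIDDLEWARES = {
    # Disable the bundled (unpatched) middleware...
    "scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware": None,
    # ...and enable your patched copy at the same default priority (590).
    "my_project.middlewares.PatchedHttpCompressionMiddleware": 590,
}
```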
Acknowledgements
This security issue was reported by @dmandefy through huntr.com.