
docs: mark SEC_REVIEW F49 as fixed in 6580a5b

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
chiappa committed 3 days ago
parent
commit
8210d3ee58
1 file changed, 23 additions and 1 deletion
  1. doc/SEC_REVIEW.md

+ 23 - 1
doc/SEC_REVIEW.md

@@ -11,7 +11,7 @@
 >
 > Each finding is referenced as **F<N>** for later citation.
 >
-> **Findings rolled up:** 5 sev-3 (5 fixed, 0 open), 27 sev-2 (27 fixed, 0 open), 42 sev-1 (16 fixed, 26 open).
+> **Findings rolled up:** 5 sev-3 (5 fixed, 0 open), 27 sev-2 (27 fixed, 0 open), 42 sev-1 (17 fixed, 25 open).
 
 ---
 
@@ -1612,6 +1612,28 @@
   is multi-GB, OOM-killing the api. Stream via `gzopen`/`gzread` and
   bail past a threshold.
 - **Severity: 1**
+- **Status:** Fixed. `DbipDownloader::gunzip` now streams via
+  `gzopen` / `gzread` 64 KiB at a time (peak memory = the chunk, not
+  the file) and aborts with `DownloaderException` once the running
+  total exceeds `MAX_DECOMPRESSED_BYTES = 400 MiB`. The cap matches
+  the MaxMind tarball total cap from F48 so both downloaders agree
+  on what "too big" looks like; real `dbip-country-lite-*.mmdb` is
+  ~10 MiB, so the cap is generous against future growth. On cap
+  breach (or any other gunzip error), the partial output file is
+  unlinked so the caller never sees a half-decoded MMDB on disk;
+  the gz input is left in place so the operator can see what was
+  attempted. The gunzip helper is split into a private `gunzip()`
+  with the production cap and a `public @internal gunzipWithCap()`
+  that takes the cap as an argument so the unit test can drive it
+  with small fixtures instead of building 400 MiB of test data.
+  Regression tests in
+  `api/tests/Unit/Enrichment/DbipDownloaderTest.php`:
+  `testNormalGunzipPasses`, `testOutputOverCapIsRejectedAndCleanedUp`
+  (4 KiB plaintext under a 1 KiB cap → exception + no partial output
+  file), `testEmptyGzipIsRejected`, `testMissingInputIsRejected`,
+  and `testLargeInputStreamsCorrectly` (256 KiB through the chunked
+  loop, ensuring chunked accumulation works correctly across multiple
+  reads).
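
The chunked-streaming-with-cap approach the entry describes can be sketched roughly as below. This is an illustrative assumption, not the actual diff: the names `gunzipWithCap` and `DownloaderException` come from the entry, but the signature, error messages, and cleanup details are guesses.

```php
<?php

// Hypothetical sketch: stream-gunzip in 64 KiB chunks, abort past a
// byte cap, and unlink any partial output so callers never see a
// half-decoded file. The real DbipDownloader may differ in detail.

class DownloaderException extends \RuntimeException {}

function gunzipWithCap(string $gzPath, string $outPath, int $maxBytes): void
{
    $in = @gzopen($gzPath, 'rb');
    if ($in === false) {
        throw new DownloaderException("cannot open $gzPath");
    }
    $out = fopen($outPath, 'wb');
    $total = 0;
    try {
        while (!gzeof($in)) {
            // Peak memory is one chunk, regardless of input size.
            $chunk = gzread($in, 64 * 1024);
            if ($chunk === false) {
                throw new DownloaderException("gzread failed on $gzPath");
            }
            $total += strlen($chunk);
            if ($total > $maxBytes) {
                throw new DownloaderException(
                    "decompressed output exceeds cap of $maxBytes bytes"
                );
            }
            fwrite($out, $chunk);
        }
        if ($total === 0) {
            throw new DownloaderException("empty gzip input: $gzPath");
        }
    } catch (\Throwable $e) {
        fclose($out);
        gzclose($in);
        @unlink($outPath);   // never leave a partial MMDB on disk
        throw $e;            // the .gz input is deliberately kept
    }
    fclose($out);
    gzclose($in);
}
```

Passing the cap as an argument is what lets a unit test drive the over-cap path with a few KiB of fixture data instead of 400 MiB; a production wrapper would call this with the real `MAX_DECOMPRESSED_BYTES` constant.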
 
 ### F50 — Guzzle client used by GeoIP downloaders allows redirects without host filtering
 - **File:** `api/src/App/Container.php:279-288`