Description
When given a crafted input lewton attempts to allocate enormous amounts of memory. This presents several issues:
- This has DoS potential: opening several ogg files for decoding at once may exhaust the memory address space and cause a crash
- This prevents any binary depending on lewton from being analyzed with LLVM sanitizers, which do not support processes that allocate terabytes of memory
Most decoding libraries face this issue at some point. This is usually solved by limiting the amount of allocated memory to some sane default, and letting people override it if they're really dealing with enormous amounts of data. In Rust we can easily allow the API user to override these limits via the builder pattern.
See https://libpng.sourceforge.io/decompression_bombs.html for more info on how a similar issue was solved in libpng. See also the Limits struct from the flif crate.
We briefly discussed this in #32, but I'm now filing it as a separate issue because it has become a blocker for me. I'm seeing intermittent crashes under the fuzzer that I cannot reproduce, and the inability to use sanitizers prevents me from pinpointing them.