Unbounded memory allocations #34

@Shnatsel

Description

When given a crafted input, lewton attempts to allocate enormous amounts of memory. This presents several issues:

  1. It has DoS potential: opening several ogg files for decoding at once may exhaust the memory address space and cause a crash.
  2. It prevents any binary depending on lewton from being analyzed with LLVM sanitizers, which do not support binaries that allocate terabytes of memory.

Most decoding libraries face this issue at some point. This is usually solved by limiting the amount of allocated memory to some sane default, and letting people override it if they're really dealing with enormous amounts of data. In Rust we can easily allow the API user to override these limits via the builder pattern.
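A minimal sketch of what such a builder-style limit could look like. This is not lewton's actual API; the `Limits` struct, its default cap, and the `check_alloc` helper are all hypothetical names chosen for illustration:

```rust
// Hypothetical sketch of an allocation-limit builder; not lewton's real API.
#[derive(Debug, Clone, Copy)]
struct Limits {
    max_memory: usize,
}

impl Default for Limits {
    fn default() -> Self {
        // A sane default cap (64 MiB here, purely as an example).
        Limits { max_memory: 64 * 1024 * 1024 }
    }
}

impl Limits {
    // Builder-style override for callers genuinely handling huge streams.
    fn max_memory(mut self, bytes: usize) -> Self {
        self.max_memory = bytes;
        self
    }

    // Called before any allocation whose size comes from untrusted input.
    fn check_alloc(&self, requested: usize) -> Result<(), String> {
        if requested > self.max_memory {
            Err(format!(
                "allocation of {} bytes exceeds limit of {} bytes",
                requested, self.max_memory
            ))
        } else {
            Ok(())
        }
    }
}

fn main() {
    // A crafted header claiming a terabyte-sized buffer is rejected
    // up front instead of being allocated.
    let limits = Limits::default();
    assert!(limits.check_alloc(1 << 40).is_err());

    // Callers that really need more memory opt in explicitly.
    let big = Limits::default().max_memory(1 << 41);
    assert!(big.check_alloc(1 << 40).is_ok());
}
```

The key point is that the cap is consulted before the allocation happens, so a malicious size field fails fast with an error rather than exhausting the address space.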

See https://libpng.sourceforge.io/decompression_bombs.html for more info on how a similar issue was solved in libpng. See also the Limits struct from the flif crate.

We briefly discussed this in #32, but I'm now filing it as a separate issue because it has become a blocker for me. I'm seeing intermittent crashes under the fuzzer that I cannot reproduce, and the inability to use sanitizers prevents me from pinpointing them.
