
Batch-Processing for Noise Tunnel #497

@yvesrychener

Description


🚀 Feature

Batch-Processing for Noise Tunnel

Motivation

Methods like IntegratedGradients, which perform multiple forward/backward passes per input, have an internal_batch_size argument. However, NoiseTunnel lacks such an argument as far as I can see. I think it would be quite useful to have one, since underlying methods like Saliency do not provide any internal batching themselves. As a result, OOM errors arise with NoiseTunnel-wrapped Saliency (and other methods).
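
To make the asymmetry concrete, here is a minimal sketch (the model and input shapes are placeholders, not from this report): IntegratedGradients can cap how many expanded samples go through the model per pass via internal_batch_size, while NoiseTunnel pushes all nt_samples noisy copies through at once.

```python
import torch
from captum.attr import IntegratedGradients, NoiseTunnel, Saliency

# Placeholder model and inputs, only to show the two call signatures.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 224 * 224, 10))
inputs = torch.randn(8, 3, 224, 224)

# IntegratedGradients: 50 steps, but at most 16 expanded samples per pass.
ig = IntegratedGradients(model)
ig_attr = ig.attribute(inputs, target=0, n_steps=50, internal_batch_size=16)

# NoiseTunnel over Saliency: all nt_samples noisy copies go through the
# model as one expanded batch, so peak memory grows with nt_samples.
nt = NoiseTunnel(Saliency(model))
nt_attr = nt.attribute(inputs, nt_type="smoothgrad", nt_samples=50, target=0)
```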

Pitch

Add an argument batch_size_nt, controlling the NoiseTunnel-internal batch size, i.e. how many noisy samples are processed by the underlying method at a time (see the sketch below).
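
For illustration, a minimal sketch of what batch_size_nt would do, written as a user-side workaround (the helper name smoothgrad_in_chunks and the chunking strategy are mine, not part of Captum): split nt_samples into chunks of batch_size_nt, attribute each chunk separately, and average the chunk results. For nt_type="smoothgrad" the mean of equal-sized chunk means equals the mean over all samples; other nt_types (e.g. vargrad) would need a different reduction inside NoiseTunnel itself.

```python
import torch
from captum.attr import NoiseTunnel, Saliency

def smoothgrad_in_chunks(model, inputs, target, nt_samples=50, batch_size_nt=10):
    # Assumes nt_samples is divisible by batch_size_nt for an unbiased mean.
    nt = NoiseTunnel(Saliency(model))
    chunk_attrs = [
        nt.attribute(inputs, nt_type="smoothgrad",
                     nt_samples=batch_size_nt, target=target)
        for _ in range(nt_samples // batch_size_nt)
    ]
    # Equal-sized chunks: mean of chunk means == mean over all noisy samples.
    return torch.stack(chunk_attrs).mean(dim=0)
```

A built-in batch_size_nt would do the same chunking inside NoiseTunnel.attribute, keeping peak memory at roughly batch_size * batch_size_nt expanded samples.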

Alternatives

Add an internal_batch_size argument to all attribution methods, but this would be a lot of work.

Additional context

None
