This issue keeps track of which Flux layers from the model reference have LRP implementations (a short usage sketch follows the list below).
Basic layers
- Dense
- flatten

Convolution
- Conv
- DepthwiseConv
- ConvTranspose
- CrossCor

Pooling layers
- AdaptiveMaxPool
- MaxPool
- GlobalMaxPool
- AdaptiveMeanPool
- MeanPool
- GlobalMeanPool

General purpose
- Maxout
- SkipConnection
- Chain (Support nested Flux Chains in LRP #119)
- Parallel (Add LRP support for `Parallel` layer #10)
- Bilinear
- Diagonal
- Embedding

Normalisation & regularisation
- normalise
- BatchNorm (Enable `BatchNorm` layers in LRP #129)
- dropout
- Dropout
- AlphaDropout
- LayerNorm
- InstanceNorm
- GroupNorm

Upsampling layers
- Upsample
- PixelShuffle

Recurrent layers
- RNN
- LSTM
- GRU
- Recur
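
For reference, here is a minimal usage sketch built only from layers in the list above. It assumes the `LRP` analyzer and `analyze` function exported by ExplainableAI.jl; the model, input shape, and dimensions are illustrative only, not part of this tracking issue:

```julia
using Flux, ExplainableAI

# Toy model composed only of layers from the list above.
model = Chain(
    Conv((3, 3), 1 => 8, relu),   # Convolution
    MaxPool((2, 2)),              # Pooling layer
    Flux.flatten,                 # Basic layer
    Dense(13 * 13 * 8, 10),       # Basic layer
)

x = rand(Float32, 28, 28, 1, 1)   # WHCN: one 28x28 grayscale image

analyzer = LRP(model)             # wrap the supported Chain in an LRP analyzer
expl = analyze(x, analyzer)       # explanation holding input-space relevances
```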