Add missing loaders for the ReLU and ELU activation layers #78

@zaleslaw

Description

Each layer should have an implementation for export to and import from the JSON configuration (Keras-compatible format); see ModelLoader.kt and ModelSaver.kt.

The saving functions for the ReLU and ELU activation layers are missing.
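A minimal sketch of what the two saving functions could look like, following the pattern of the other layer converters in ModelSaver.kt. The `KerasLayer`/`LayerConfig` field names, the `LAYER_RELU`/`LAYER_ELU` constants, and the properties read from the `ReLU`/`ELU` layers are assumptions modelled on the Keras layer config format, not verified KotlinDL declarations:

```kotlin
// Sketch only: KerasLayer, LayerConfig, DATATYPE_FLOAT32 and the LAYER_* constants
// are assumed to mirror the existing converters in ModelSaver.kt; exact names may differ.
private fun createKerasReLULayer(layer: ReLU): KerasLayer {
    val configX = LayerConfig(
        dtype = DATATYPE_FLOAT32,
        max_value = layer.maxValue?.toDouble(),          // Keras "max_value"
        negative_slope = layer.negativeSlope.toDouble(), // Keras "negative_slope"
        threshold = layer.threshold.toDouble(),          // Keras "threshold"
        name = layer.name
    )
    return KerasLayer(class_name = LAYER_RELU, config = configX)
}

private fun createKerasELULayer(layer: ELU): KerasLayer {
    val configX = LayerConfig(
        dtype = DATATYPE_FLOAT32,
        alpha = layer.alpha.toDouble(), // Keras "alpha"
        name = layer.name
    )
    return KerasLayer(class_name = LAYER_ELU, config = configX)
}
```

Both functions would also need to be wired into the `when (layer)` dispatch that ModelSaver.kt uses to convert layers, and symmetric cases added to ModelLoader.kt for import.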

As an integration example, add a convolutional neural network to the examples package (for example, in the CNN package). It could be based on an improved LeNet model; train it on "mnist" or "fashion mnist" and add it as an integration test to the "examples" module tests.
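A rough sketch of such a network, using standalone ReLU and ELU activation layers so that the new saving functions are exercised when the model is exported. The layer constructors and parameter names below follow the shape of the existing KotlinDL LeNet examples but are illustrative assumptions, not verified signatures:

```kotlin
// Illustrative LeNet-style model with explicit ReLU/ELU activation layers.
// KotlinDL imports (Sequential, Input, Conv2D, MaxPool2D, ReLU, ELU, Flatten,
// Dense, Activations) are omitted; constructor parameters are assumptions.
val lenetWithActivationLayers = Sequential.of(
    Input(28, 28, 1),
    Conv2D(filters = 32, kernelSize = longArrayOf(5, 5), activation = Activations.Linear),
    ReLU(),            // standalone activation layer -> exercises the new ReLU converter
    MaxPool2D(poolSize = intArrayOf(1, 2, 2, 1), strides = intArrayOf(1, 2, 2, 1)),
    Conv2D(filters = 64, kernelSize = longArrayOf(5, 5), activation = Activations.Linear),
    ELU(alpha = 1.0f), // standalone activation layer -> exercises the new ELU converter
    MaxPool2D(poolSize = intArrayOf(1, 2, 2, 1), strides = intArrayOf(1, 2, 2, 1)),
    Flatten(),
    Dense(outputSize = 120, activation = Activations.Relu),
    Dense(outputSize = 84, activation = Activations.Relu),
    Dense(outputSize = 10, activation = Activations.Linear)
)
```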

  • add a saving function for the ReLU activation layer
  • add a saving function for the ELU activation layer
  • add an example
  • convert this example to an integration test (a test sketch follows this list)
  • train the network and tune it to reach an accuracy above 70%
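One way to turn the example into an integration test in the "examples" module is to train for a few epochs and assert on the evaluated accuracy. The sketch below reuses the hypothetical `lenetWithActivationLayers` model from above; the dataset loading, compile/fit/evaluate calls, and the epoch/batch-size values follow the pattern of the existing KotlinDL tests but are assumptions:

```kotlin
import org.junit.jupiter.api.Assertions.assertTrue
import org.junit.jupiter.api.Test

// KotlinDL imports (mnist, Adam, Losses, Metrics) are omitted; the exact package
// layout at the time of this issue may differ.
class ReluEluLeNetTest {

    @Test
    fun lenetWithReluAndEluLayersReachesTargetAccuracy() {
        val (train, test) = mnist()

        lenetWithActivationLayers.use {
            it.compile(
                optimizer = Adam(),
                loss = Losses.SOFT_MAX_CROSS_ENTROPY_WITH_LOGITS,
                metric = Metrics.ACCURACY
            )
            it.fit(dataset = train, epochs = 3, batchSize = 500)

            val accuracy = it.evaluate(dataset = test, batchSize = 1000).metrics[Metrics.ACCURACY]

            // Target from this issue: accuracy above 70%.
            assertTrue(accuracy != null && accuracy > 0.7)
        }
    }
}
```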
