
Add loaders for ELU, RELU activation layers (#78) #128


Merged
merged 3 commits into Kotlin:master on Jun 22, 2021

Conversation

kokorins
Contributor

  • missing activation layer loaders added (a sketch of the loader shape follows below)
  • two distinct examples with save/load added to the examples folder, trying to reach 0.7 accuracy (see the end-to-end sketch below)

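In KotlinDL terms, each missing loader boils down to a small factory that maps a parsed Keras JSON entry onto the corresponding layer class. A minimal sketch, assuming a hypothetical `ActivationConfig` holder in place of KotlinDL's internal config classes (the `ReLU`/`ELU` constructor parameters mirror the public layer API; the factory names here are illustrative, not necessarily the PR's):

```kotlin
import org.jetbrains.kotlinx.dl.api.core.layer.Layer
import org.jetbrains.kotlinx.dl.api.core.layer.activation.ELU
import org.jetbrains.kotlinx.dl.api.core.layer.activation.ReLU

// Hypothetical stand-in for the parsed Keras JSON layer config;
// the PR itself works against KotlinDL's internal config classes.
data class ActivationConfig(
    val alpha: Double? = null,
    val maxValue: Double? = null,
    val negativeSlope: Double? = null,
    val threshold: Double? = null
)

// Map a parsed ReLU config entry to the KotlinDL ReLU layer,
// falling back to the Keras defaults for absent fields.
fun createReLULayer(config: ActivationConfig, name: String): Layer =
    ReLU(
        maxValue = config.maxValue?.toFloat(),
        negativeSlope = (config.negativeSlope ?: 0.0).toFloat(),
        threshold = (config.threshold ?: 0.0).toFloat(),
        name = name
    )

// Same idea for ELU, whose only hyperparameter is alpha (Keras default 1.0).
fun createELULayer(config: ActivationConfig, name: String): Layer =
    ELU(alpha = (config.alpha ?: 1.0).toFloat(), name = name)
```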
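The examples exercise the loaders end to end: train a model containing the new layers, save it, then reload the configuration, which would fail to deserialize the ReLU/ELU entries without the loaders. A hedged sketch of that flow against KotlinDL's public API; the dataset helper, import paths, model directory, and hyperparameters are illustrative and taken from later KotlinDL releases, not the PR's actual example code:

```kotlin
import org.jetbrains.kotlinx.dl.api.core.SavingFormat
import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.WritingMode
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.activation.ELU
import org.jetbrains.kotlinx.dl.api.core.layer.activation.ReLU
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input
import org.jetbrains.kotlinx.dl.api.core.layer.reshaping.Flatten
import org.jetbrains.kotlinx.dl.api.core.loss.Losses
import org.jetbrains.kotlinx.dl.api.core.metric.Metrics
import org.jetbrains.kotlinx.dl.api.core.optimizer.Adam
import org.jetbrains.kotlinx.dl.dataset.embedded.mnist
import java.io.File

fun main() {
    val (train, test) = mnist() // embedded MNIST; package path varies across releases
    val modelDir = File("savedModels/elu_relu") // illustrative path

    val model = Sequential.of(
        Input(28, 28, 1),
        Flatten(),
        Dense(256, activation = Activations.Linear),
        ReLU(),            // one of the newly loadable activation layers
        Dense(128, activation = Activations.Linear),
        ELU(alpha = 1.0f), // the other newly loadable activation layer
        Dense(10, activation = Activations.Linear)
    )

    model.use {
        it.compile(
            optimizer = Adam(),
            loss = Losses.SOFT_MAX_CROSS_ENTROPY_WITH_LOGITS,
            metric = Metrics.ACCURACY
        )
        it.fit(dataset = train, epochs = 5, batchSize = 100)
        it.save(
            modelDirectory = modelDir,
            savingFormat = SavingFormat.JSON_CONFIG_CUSTOM_VARIABLES,
            writingMode = WritingMode.OVERRIDE
        )
    }

    // Reloading the JSON configuration is what exercises the new loaders.
    Sequential.loadDefaultModelConfiguration(modelDir).use {
        it.compile(
            optimizer = Adam(),
            loss = Losses.SOFT_MAX_CROSS_ENTROPY_WITH_LOGITS,
            metric = Metrics.ACCURACY
        )
        it.loadWeights(modelDir)
        val accuracy = it.evaluate(dataset = test, batchSize = 100).metrics[Metrics.ACCURACY]
        println("Restored accuracy: $accuracy") // the PR's examples aim for ~0.7
    }
}
```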
@kokorins
Contributor Author

For some reason I can't reopen the old one, #115.

@kokorins kokorins reopened this Jun 18, 2021
@zaleslaw zaleslaw added the "Review" label (This PR is under review) on Jun 21, 2021
@zaleslaw zaleslaw linked an issue Jun 21, 2021 that may be closed by this pull request
@zaleslaw
Collaborator

zaleslaw commented Jun 21, 2021

So, @kokorins, I like your PR, especially the custom callback!

Could you please update your PR by adding the trainability property (like in the PR)?
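The requested trainability property presumably means propagating the standard Keras `trainable` flag onto the loaded layer via KotlinDL's `Layer.isTrainable` property. A minimal sketch under that assumption; the config holder and factory name are hypothetical:

```kotlin
import org.jetbrains.kotlinx.dl.api.core.layer.Layer
import org.jetbrains.kotlinx.dl.api.core.layer.activation.ELU

// Hypothetical config holder; `trainable` mirrors the standard Keras flag.
data class LayerMeta(val alpha: Double? = null, val trainable: Boolean? = null)

// Sketch of the requested follow-up: copy the config's trainable flag
// onto the freshly constructed layer (Keras treats it as true by default).
fun createELULayer(config: LayerMeta, name: String): Layer =
    ELU(alpha = (config.alpha ?: 1.0).toFloat(), name = name)
        .apply { isTrainable = config.trainable ?: true }
```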

@zaleslaw zaleslaw added the "LGTM" label (PR reviewed and is ready to merge) and removed the "Review" label (This PR is under review) on Jun 21, 2021
@zaleslaw zaleslaw merged commit 03c1890 into Kotlin:master Jun 22, 2021
@zaleslaw
Collaborator

Congrats, @kokorins! Thanks for the contribution.

Labels
"LGTM" (PR reviewed and is ready to merge)
Development

Successfully merging this pull request may close these issues:

Add missed loaders for the ReLU and ELU activation layers