
Feed Forward Neural Network from Scratch

In pure Python and NumPy, one derivative at a time.

This is a from-scratch, deliberately verbose implementation of a Feed Forward Neural Network (FFN), used as a small framework for learning and for reimplementing research papers at a lower level than PyTorch or TF/Keras.


Thanks to multiple loss and activation functions, different tasks were implemented:

  • Categorical (multi-class) & image classification
  • Binary classification
  • Regression

Contents

  • Forward/Backward Propagation, with optional residual connections (sketch after this list)

  • Weights Initialization (sketch below):

    • Xavier Glorot Initialization
    • Kaiming He Initialization
  • Activation Functions (sketch below):

    • ReLU
    • Sigmoid
    • Softmax
  • Loss Functions (sketch below):

    • MAE
    • MSE
    • RMSE
    • Cross-Entropy/NLL
    • Binary Cross-Entropy
  • Optimizers (sketch below):

    • AdamW
    • Adam
    • SGD
  • Regularizations (sketch below):

    • L2
    • Dropout
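
The forward/backward pass with an optional residual connection can be illustrated with a minimal NumPy sketch. This is illustrative only, not this repo's actual API: the function names and the single-hidden-layer shape are assumptions.

```python
import numpy as np

def forward(x, W1, b1, W2, b2, residual=False):
    """One hidden ReLU layer; optional skip connection from input to output."""
    z1 = x @ W1 + b1
    a1 = np.maximum(z1, 0.0)         # ReLU
    out = a1 @ W2 + b2
    if residual:                     # requires x and out to have the same shape
        out = out + x
    return out, (x, z1, a1)

def backward(dout, cache, W1, W2, residual=False):
    """Gradients of the pass above, one derivative at a time."""
    x, z1, a1 = cache
    dW2 = a1.T @ dout
    db2 = dout.sum(axis=0)
    dz1 = (dout @ W2.T) * (z1 > 0)   # chain rule through ReLU
    dW1 = x.T @ dz1
    db1 = dz1.sum(axis=0)
    dx = dz1 @ W1.T
    if residual:                     # the skip routes dout straight into dx
        dx = dx + dout
    return dx, dW1, db1, dW2, db2
```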
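
A sketch of the two initialization schemes, assuming normal-distributed draws (both schemes also have uniform variants); `xavier_init` and `kaiming_init` are hypothetical names, not the repo's:

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Glorot & Bengio (2010): Var(W) = 2 / (fan_in + fan_out), suits sigmoid/tanh
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def kaiming_init(fan_in, fan_out):
    # He et al. (2015): Var(W) = 2 / fan_in, suits ReLU
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))
```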
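
The three activation functions, sketched in NumPy for 2-D batch inputs, with the usual max-subtraction trick for softmax stability (names are illustrative):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # subtract the row-wise max so exp() cannot overflow
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)
```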
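
Minimal NumPy versions of the listed losses, assuming `y_idx` holds integer class indices for cross-entropy and `p` holds probabilities for BCE; these are sketches, not the repo's implementations:

```python
import numpy as np

def mse(y_pred, y_true):
    return np.mean((y_pred - y_true) ** 2)

def mae(y_pred, y_true):
    return np.mean(np.abs(y_pred - y_true))

def rmse(y_pred, y_true):
    return np.sqrt(mse(y_pred, y_true))

def cross_entropy(probs, y_idx, eps=1e-12):
    # NLL of the true class, averaged over the batch
    n = probs.shape[0]
    return -np.mean(np.log(probs[np.arange(n), y_idx] + eps))

def binary_cross_entropy(p, y, eps=1e-12):
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
```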
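
A sketch of an SGD step and an AdamW step: Adam moments with bias correction plus decoupled weight decay, so `weight_decay=0` recovers plain Adam. The state handling (`m`, `v`, `t` passed in and returned) is a simplification assumed here:

```python
import numpy as np

def sgd_step(w, grad, lr=1e-2):
    return w - lr * grad

def adamw_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    # first and second moment estimates, then bias correction; t starts at 1
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # weight decay is applied outside the adaptive scaling (the AdamW idea)
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v
```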
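
The two regularizers, sketched as an L2 penalty over all weight matrices and inverted dropout (kept units are rescaled at train time, so inference needs no change); illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_penalty(weights, lam=1e-4):
    # added to the loss; contributes lam * W to each layer's gradient
    return 0.5 * lam * sum(np.sum(W ** 2) for W in weights)

def dropout(a, p_drop=0.5, training=True):
    # inverted dropout: zero out units with probability p_drop,
    # scale survivors by 1 / (1 - p_drop)
    if not training or p_drop == 0.0:
        return a, None
    mask = (rng.random(a.shape) >= p_drop) / (1.0 - p_drop)
    return a * mask, mask   # keep the mask for the backward pass
```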

Acknowledgments

This implementation wouldn't have been possible without the following resources. Special thanks to:

  • The "Deep Learning Book" By Ian Goodfellow, Yoshua Bengio and Aaron Courville
    Used for the algorithms, math details and hyperparameters tuning

  • nnfs.io/nnfs by Harrison Kinsley and Daniel Kukieła
    For the book: general structure, math details, and the quick datasets for reproducibility

  • Various research papers referenced and cited throughout the notebook

  • Zalando Research: for the Fashion-MNIST dataset
