
# Neural Network

This is my attempt at coding a neural network from scratch after studying how deep learning works.

## Activation functions

- Linear (identity)
- Sigmoid (logistic)
- ReLU (rectified linear unit)
- Tanh
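As a sketch of how these activations might look in code (the function names and NumPy-array signatures are my assumptions, not the module's actual API):

```python
import numpy as np

def linear(x):
    # Identity: passes inputs through unchanged
    return x

def sigmoid(x):
    # Logistic function: squashes values into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit: element-wise max(0, x)
    return np.maximum(0.0, x)

def tanh(x):
    # Hyperbolic tangent: squashes values into (-1, 1)
    return np.tanh(x)
```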

## Loss functions

- MSE (mean squared error)
- Binary cross-entropy
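A minimal sketch of the two losses (again, the names and signatures are my assumptions; the clipping constant in the cross-entropy guards against `log(0)` and is a common convention, not necessarily what the module does):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error averaged over all outputs
    return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 to keep the logs finite
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))
```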

## Example: XOR

To test the module, I trained a network to predict the output of the XOR gate.
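As a self-contained sketch of this experiment (a 2-4-1 sigmoid network trained with MSE and plain gradient descent; the layer sizes, learning rate, and epoch count here are my assumptions, not the repository's actual settings):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR truth table: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 1.0
for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of mean((out - y)^2) through the sigmoids
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    # Gradient-descent update
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(pred.round(2))
```

Tracking the MSE each epoch produces the cost curve shown below, and evaluating the trained network over a grid of input points gives the decision boundary.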

### Cost over epochs


### Decision boundary output
