
Vanilla backpropagation in Python

A simple Python script showing how backpropagation works.

  • NumPy implementation
  • Basic MLP (multilayer perceptron) neural network

XOR Problem

The backpropagation algorithm begins by comparing the actual output produced by the forward pass to the expected value, then moves backward through the network, adjusting each weight slightly in the direction that reduces the error. Forward and backward propagation are re-run thousands of times over the input combinations until the network can accurately predict the expected output for every possible input using forward propagation alone.
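The loop described above can be sketched as follows. This is a minimal NumPy illustration, not the code from backprop.py; the hidden-layer size, learning rate, and iteration count are illustrative assumptions:

```python
import numpy as np

# Minimal 2-4-1 MLP trained on XOR with vanilla backpropagation.
# Hidden-layer size, learning rate, and iteration count are
# illustrative assumptions, not values taken from backprop.py.

rng = np.random.default_rng(0)

# All four XOR input pairs and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 4))  # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))  # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 1.0                      # learning rate

losses = []
for _ in range(10000):
    # Forward pass: compute the network's actual output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(((out - y) ** 2).mean()))

    # Backward pass: propagate the error back through the network.
    d_out = (out - y) * out * (1 - out)  # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)   # gradient at the hidden layer

    # Adjust each weight slightly in the direction that reduces the error.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

predictions = (out > 0.5).astype(int)
print(predictions.ravel())
```

After training, thresholding the network's output at 0.5 typically recovers the XOR truth table for all four inputs.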

For the XOR problem, 100% of the possible data examples are available for training. We can therefore expect the trained network to be 100% accurate in its predictions, and there is no need to be concerned with issues such as bias and variance in the resulting model. [1]
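Unlike most learning problems, XOR's entire input space fits in the training set; there are only four possible input pairs, enumerable directly:

```python
# Every possible XOR example: four input pairs and their targets,
# computed with Python's bitwise XOR operator.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [a ^ b for a, b in X]
print(y)  # [0, 1, 1, 0]
```

Because the training set covers the full input space, training accuracy and test accuracy are the same quantity here.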

How to use

git clone https://github.com/alm4z/python-backprop.git
cd python-backprop
python backprop.py

Contact

If you have any further questions or suggestions, please do not hesitate to contact me by email at a.sadenov@gmail.com.

References

  1. A well-described treatment of the XOR problem by Jayesh Bapu Ahire

License

MIT
