
BackProp with Matrix Derivatives - Part II
In the first part of this series, we began with a brief review of essential mathematical concepts to help us build a foundation for understanding backpropagation using matrix derivatives. In this post we will express our neural network model in matrix form and symbolically derive its backpropagation equations using gradient descent.

A simple neuron model

In the first part, we observed that our neuron model is represented as follows:

$\displaystyle \hat{y} = f(wx + b)$

...
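To ground the notation before moving to matrix form, here is a minimal sketch of this single-neuron forward pass in Python. The scalar inputs and the choice of a sigmoid for the activation $f$ are assumptions for illustration only; the post has not fixed a particular activation at this point.

```python
import numpy as np

def sigmoid(z):
    # A common choice for the activation f; assumed here for illustration.
    return 1.0 / (1.0 + np.exp(-z))

def neuron_forward(x, w, b, f=sigmoid):
    # y_hat = f(w*x + b): weighted input plus bias, passed through f.
    return f(w * x + b)

# Example: one scalar input through the simple neuron model
y_hat = neuron_forward(x=0.5, w=2.0, b=-0.3)
print(y_hat)
```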