
Automatic Differentiation with Computational Graph, Part III: Application
Backward propagation, or more precisely error backward propagation, is a technique used to optimize a neural network: an output error is propagated backward through the network, and as the error propagates, the network parameters are adjusted with the aim of reducing the resulting output error.

Our Model:

Consider the following graph of our model of the neuron:

```mermaid
graph LR
    A@{shape: circ, label: "w"} --> B@{shape: rect, label: "g(.)"} --> C@{shape: circ, label: "y"} --> D@{shape: rect, label: "h(.)"} --> E@{shape: circ, label: "f"}
    F@{shape: circ, label: "b"} --> B
```

In this model: ...
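Before working through `g(.)` and `h(.)` symbolically, a minimal runnable sketch may help fix ideas. Everything concrete below is an assumption, not from the model above: I take `g(w, b) = w*x + b` with a fixed hypothetical input `x`, and `h(y) = (y - t)**2` as a squared-error output node with target `t`, then propagate the error backward along the graph's edges via the chain rule.

```python
# A minimal sketch of backward propagation on the graph above.
# Assumptions (hypothetical, not fixed by the source): the node g is
# g(w, b) = w * x + b for a fixed input x, and the node h is a
# squared-error output f = h(y) = (y - t)**2 against a target t.

x, t = 2.0, 1.0          # hypothetical input and target
w, b = 0.5, 0.1          # initial parameters
lr = 0.01                # learning rate

for step in range(100):
    # Forward pass along the graph: w, b -> g(.) -> y -> h(.) -> f
    y = w * x + b        # y = g(w, b)
    f = (y - t) ** 2     # f = h(y), the output error

    # Backward pass: propagate the error back along the same edges.
    df_dy = 2.0 * (y - t)    # dh/dy at the output node
    df_dw = df_dy * x        # chain rule: df/dw = (dh/dy) * (dg/dw)
    df_db = df_dy * 1.0      # chain rule: df/db = (dh/dy) * (dg/db)

    # Adjust the parameters to reduce the resulting output error.
    w -= lr * df_dw
    b -= lr * df_db

print(f"w = {w:.4f}, b = {b:.4f}, error = {f:.6f}")
```

The key point the sketch illustrates is that each backward edge multiplies in one local derivative, so the gradients at `w` and `b` are products of the local derivatives along the paths from those nodes to `f`.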