Artificial Neural Networks
Backpropagation Convergence
Backpropagation performs gradient descent, so it will converge to a local minimum (perhaps not the global one). To alleviate this you can (see the sketch after this list):
add momentum, as before,
use stochastic gradient descent,
train multiple nets with different initial weights.
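The following is a minimal sketch, not from these slides, that combines all three remedies on the XOR problem: a tiny sigmoid net trained by backpropagation with stochastic gradient descent plus momentum, restarted from several different random initial weights. The network size, learning rate lr, momentum coefficient beta, and epoch count are illustrative assumptions.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_once(X, y, hidden=4, lr=0.5, beta=0.9, epochs=5000, seed=0):
    # Tiny 2-layer sigmoid net trained by backpropagation with
    # stochastic gradient descent (one example per step) plus momentum.
    # All hyperparameters here are illustrative assumptions.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    v1, v2 = np.zeros_like(W1), np.zeros_like(W2)  # momentum (velocity) terms
    for _ in range(epochs):
        i = rng.integers(len(X))             # stochastic: pick one example
        x, t = X[i:i+1], y[i:i+1]
        h = sigmoid(x @ W1)                  # forward pass
        out = sigmoid(h @ W2)
        d_out = (out - t) * out * (1 - out)  # backward pass, squared-error loss
        d_h = (d_out @ W2.T) * h * (1 - h)
        v2 = beta * v2 + lr * (h.T @ d_out)  # momentum: blend in the old step
        v1 = beta * v1 + lr * (x.T @ d_h)
        W2 -= v2
        W1 -= v1
    out = sigmoid(sigmoid(X @ W1) @ W2)
    return 0.5 * float(np.sum((out - y) ** 2)), (W1, W2)

# XOR is a classic case where a small net can get stuck in a local minimum.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Train multiple nets with different initial weights; keep the best one.
runs = [train_once(X, y, seed=s) for s in range(5)]
error, weights = min(runs, key=lambda r: r[0])
print("best final error:", error)

The momentum update v ← βv + η∇E lets the weights keep moving in the previous direction, which can carry them through small local dips, while the restarts give several independent chances to land in a good basin.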
As we saw in the pictures, convergence is slow at first, then fast.