Aug 18, 2020
Hugo Larochelle, Neural Networks
Backpropagation Algorithm:
Validate the gradients computed by backpropagation against a finite-difference approximation, using the limit definition of the derivative as epsilon tends to zero.
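A minimal sketch of such a gradient check, assuming a scalar loss over a flat weight vector (the function and variable names are illustrative): compare the backprop gradient to the centered difference (L(w + εe_i) − L(w − εe_i)) / 2ε for a small ε.

```python
import numpy as np

def numerical_gradient(loss, w, eps=1e-5):
    """Centered finite-difference approximation of dL/dw."""
    grad = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = eps
        grad[i] = (loss(w + e) - loss(w - e)) / (2 * eps)
    return grad

# Toy check: for L(w) = ||w||^2 / 2, backprop gives dL/dw = w.
w = np.array([0.5, -1.2, 3.0])
loss = lambda w: 0.5 * np.sum(w ** 2)
analytic = w
numeric = numerical_gradient(loss, w)
print(np.max(np.abs(analytic - numeric)))  # should be near 0
```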
Regularization:
- Generally applied only to weights, not biases
L2 Regularization:
- Penalizes the square of the weights
- Corresponds to a Gaussian prior; in probabilistic modelling, the prior says how the weights are generated (see the sketch below)
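A small sketch of the L2 penalty and the gradient term it contributes, assuming a weight matrix `W` and regularization strength `lam` (both names illustrative); minimizing this penalty matches maximizing a zero-mean Gaussian prior over the weights:

```python
import numpy as np

def l2_penalty(W, lam):
    """Omega(W) = lam * sum(W**2); the -log of a Gaussian prior, up to constants."""
    return lam * np.sum(W ** 2)

def l2_grad(W, lam):
    # Term added to the loss gradient for the weights; biases are left out.
    return 2 * lam * W

W = np.random.randn(3, 4)
print(l2_penalty(W, lam=0.01))
print(l2_grad(W, lam=0.01))
```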
L1 Regularization:
- Penalizes the absolute values of the weights
- Corresponds to a Laplacian prior
- Makes some weights exactly = 0, thus pruning connections
- Makes NNs less complex and less likely to overfit the data (see the sketch below)
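A matching sketch for the L1 penalty, under the same illustrative names; its subgradient has constant magnitude, which is what pushes small weights all the way to zero:

```python
import numpy as np

def l1_penalty(W, lam):
    """Omega(W) = lam * sum(|W|); the -log of a Laplacian prior, up to constants."""
    return lam * np.sum(np.abs(W))

def l1_subgrad(W, lam):
    # Subgradient: lam * sign(W), taken as 0 at W == 0. Unlike L2, the pull
    # does not shrink with the weight, so small weights are driven exactly to 0.
    return lam * np.sign(W)

W = np.array([[0.3, 0.0, 1.5], [-2.0, 0.001, 0.7]])
print(l1_penalty(W, lam=0.1))
print(l1_subgrad(W, lam=0.1))
```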
Generalization:
Bias:
- How close the average model is to the TRUE solution
Variance:
- Spread of the possible models (θ) across different training sets of a given size
Trade-off:
- Both low variance and low bias at the same time is not possible.
- We have to trade off using λ so that we get acceptable variance and bias in our model (see the simulation sketch below).
- High λ => high bias, low variance
- Low λ => low bias, but high variance
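A rough simulation of this trade-off, assuming a ridge-regression setup (linear model with the L2 penalty, which has a closed-form solution) so bias and variance can be estimated by refitting on many resampled training sets; all names and values here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def sample_data(n=20):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=1.0, size=n)
    return X, y

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X^T X + lam * I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

for lam in (0.0, 1.0, 100.0):
    fits = np.array([ridge_fit(*sample_data(), lam) for _ in range(500)])
    bias = np.linalg.norm(fits.mean(axis=0) - true_w)   # average model vs TRUE solution
    var = fits.var(axis=0).sum()                        # spread across training sets
    print(f"lambda={lam:6.1f}  bias={bias:.3f}  variance={var:.4f}")
```

Running this shows the pattern from the notes: as λ grows, the variance of the fitted θ collapses while the bias away from the true weights grows.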