We have:
total-noise = var (overfitting-noise?) + bias (deterministic-noise) + stochastic-noise

Questions:

1. Is overfitting-noise the var part alone? From the Prof's lecture, I tend to conclude that it is the variance caused by the attempt to fit the stochastic-noise, i.e. overfitting-noise really is an interplay (stochastic-noise -> variance). I need help interpreting it.

2. When we try to arrest overfitting, using brakes (regularization) and/or validation, are we really working on overfitting alone? In the case of validation, we get a measure of the total error: is it that comparing total errors across choices of model complexity (e.g. H2 vs H10) gives us an estimate of the relative amount of overfitting across hypothesis complexities? In the case of brakes (regularization): will the brake really be applied to overfitting alone, and not to other parts of the total error, especially the bias part?

3. Consider a case in which the target complexity is a 2nd order polynomial and we choose a 2nd order (H2) and a 10th order (H10) polynomial to fit it. How will the overfitting and the bias vary for the two hypotheses as N grows on the x-axis? Specifically, will H10 overfit (with or without stochastic noise)? Also, should H10 have higher bias compared to H2? (A simulation sketch of this setup follows at the end of the post.)

4. Is there a notion of underfitting with respect to the target function? When we try to fit a 10th order polynomial target function with a 2nd order polynomial hypothesis, are we not underfitting? If so, can we then associate underfitting with bias? If not, with what else?

Thanks
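P.S. To make questions 1 and 3 concrete: the decomposition I am referring to is (in the lecture's notation, as I understand it), for a noisy target y = f(x) + epsilon with E[epsilon] = 0 and Var(epsilon) = sigma^2,

E_D[E_out(g^(D))] = sigma^2 + bias + var, with bias = E_x[(g_bar(x) - f(x))^2] and var = E_x[E_D[(g^(D)(x) - g_bar(x))^2]],

where g_bar is the average hypothesis. The sketch below is only my own illustration of the question 3 setup, not anything from the lecture: the quadratic target, the noise level sigma, the sample sizes N, and the number of datasets are arbitrary choices of mine. It estimates bias and var empirically for H2 and H10 as N grows.

```python
# Minimal bias-variance simulation for question 3 (my own illustrative setup:
# the quadratic target, sigma, the N values, and n_datasets are arbitrary).
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # The 2nd order target function.
    return 1.0 - 2.0 * x + 3.0 * x**2

sigma = 0.5                      # stochastic noise level; try 0.0 as well
x_test = np.linspace(-1.0, 1.0, 200)
n_datasets = 500                 # number of datasets D to average over

def bias_var(degree, N):
    """Empirical bias and var for the hypothesis set H<degree> at sample size N."""
    preds = np.empty((n_datasets, x_test.size))
    for d in range(n_datasets):
        x = rng.uniform(-1.0, 1.0, N)
        y = f(x) + sigma * rng.standard_normal(N)        # noisy target
        g = np.polynomial.Polynomial.fit(x, y, degree)   # final hypothesis g^(D)
        preds[d] = g(x_test)
    g_bar = preds.mean(axis=0)                  # average hypothesis g_bar(x)
    bias = np.mean((g_bar - f(x_test)) ** 2)    # E_x[(g_bar(x) - f(x))^2]
    var = np.mean((preds - g_bar) ** 2)         # E_x[E_D[(g^(D)(x) - g_bar(x))^2]]
    return bias, var

for N in (15, 30, 60, 120):
    for degree in (2, 10):
        b, v = bias_var(degree, N)
        print(f"N={N:4d}  H{degree:<2d}  bias={b:.4f}  var={v:.4f}  "
              f"bias+var+sigma^2={b + v + sigma**2:.4f}")
```

Setting sigma = 0.0 isolates the "with or without stochastic noise" part of question 3.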