Thread: Noise
#1, 06-11-2013, 01:56 PM
Elroch
Invited Guest
Join Date: Mar 2013
Posts: 143

One of the questions in the final drew my attention to the fact that I had somehow failed to get one topic straight from the lectures. This is the issue of noise. Perhaps a little discussion would help?

First, are we on the same page if I understand that noise in this context always means the discrepancy between a learned hypothesis and the target function it attempts to approximate? The latter is the true signal, and the former is some (usually somewhat inaccurate) approximation to it.

The notion of deterministic noise seems very clear from the presentation in the books and lectures: it is the difference between the mean hypothesis and the target function (under the assumptions of some probability distribution on the set of possible samples and some fixed learning machine that converts samples to hypotheses, which can then be compared pointwise to the target). But given that the mean hypothesis can be very different from any of the hypotheses in the hypothesis set, I am not sure it helps to think of it as the "best" hypothesis in the set. Especially since, even when it is in the set, it is sometimes not the best hypothesis! I suspect the fact that, when it does lie in the hypothesis set, it is often close to the best approximation has something to do with certain derived probability distributions typically being not very asymmetric. [Everything's approximately Gaussian, right?]
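
To make the "mean hypothesis" concrete to myself, I tried a small simulation along the lines of the sin(pi x) example from the lectures: fit a line to two random points many times and average the fits. The code below is entirely my own sketch (the names, the number of data sets and so on are mine, not from the course), and in this particular case the mean hypothesis happens to be a line itself, so it does lie in the hypothesis set.

    import numpy as np

    rng = np.random.default_rng(0)
    f = lambda x: np.sin(np.pi * x)          # target function on [-1, 1]

    n_datasets = 10000
    coeffs = np.empty((n_datasets, 2))       # (slope, intercept) of each learned line

    for i in range(n_datasets):
        x = rng.uniform(-1, 1, size=2)       # one data set: two random points
        y = f(x)
        a = (y[1] - y[0]) / (x[1] - x[0])    # exact line through the two points
        b = y[0] - a * x[0]
        coeffs[i] = (a, b)

    a_bar, b_bar = coeffs.mean(axis=0)       # mean hypothesis: gbar(x) = a_bar*x + b_bar

    xs = np.linspace(-1, 1, 1001)
    g_all = coeffs[:, :1] * xs + coeffs[:, 1:]   # every learned line on the test grid
    g_bar = a_bar * xs + b_bar
    bias = np.mean((g_bar - f(xs)) ** 2)         # pointwise bias, averaged over x
    var = np.mean((g_all - g_bar) ** 2)          # pointwise variance, averaged over D and x
    print(f"gbar(x) = {a_bar:.2f} x + {b_bar:.2f},  bias = {bias:.2f},  var = {var:.2f}")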

Anyhow, to me, the concept of deterministic noise seems to be essentially the same as the bias term in the bias-variance decomposition. Would you agree with this statement?
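
For reference, this is the decomposition as I have it in my notes (please correct me if the notation is off), for a noisy target y = f(x) + \epsilon with \mathbb{E}[\epsilon] = 0 and variance \sigma^2:

\bar{g}(x) = \mathbb{E}_{\mathcal{D}}\!\left[g^{(\mathcal{D})}(x)\right], \qquad \mathrm{bias}(x) = \left(\bar{g}(x) - f(x)\right)^2, \qquad \mathrm{var}(x) = \mathbb{E}_{\mathcal{D}}\!\left[\left(g^{(\mathcal{D})}(x) - \bar{g}(x)\right)^2\right]

\mathbb{E}_{\mathcal{D}}\!\left[E_{\mathrm{out}}\!\left(g^{(\mathcal{D})}\right)\right] = \mathbb{E}_{x}\!\left[\mathrm{bias}(x) + \mathrm{var}(x)\right] + \sigma^2

If deterministic noise is the bias term, then the \sigma^2 term would be the stochastic noise, and the variance term is the "sampling noise" I discuss below.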

In hindsight, this next point is where I am sure I was guilty of muddled understanding. The selection of a particular sample is presumably a random thing, resulting from some probability distribution on the set of all possible samples, and is thus stochastic in nature. But only now have I come to the conclusion that this is not part of what is called stochastic noise.

It would help here if a separate term were used (wouldn't "sampling noise" be clearer than "variance", a very general term that is, in my opinion, more prone to being misinterpreted than "bias"?). Unfortunately, since it is a source of error resulting from the random selection of a particular sample, it is stochastic in nature, which is how I explain my earlier misinterpretation of the term.

So, am I right in thinking that stochastic noise is limited to noise in the target that is completely independent of the information contained in x? This is certainly a key practical concept, with no distinction between noise that is purely random in nature and noise that arises from missing information in the inputs (whether this is still true in the case of quantum entanglement is a diversion from machine learning ...). For example, if you have a deterministic function of 3 variables and are given only 2 of them to learn from, the dependence on the 3rd variable may look exactly like stochastic noise, right?
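
To convince myself of that last point, here is a toy sketch (the target f, its coefficients and all the names are made up by me purely for illustration): a deterministic function of three variables, of which the learner sees only two.

    import numpy as np

    rng = np.random.default_rng(1)

    def f(x1, x2, x3):
        # purely deterministic target of three variables (coefficients made up)
        return 2.0 * x1 - x2 + 0.8 * np.sin(3.0 * x3)

    N = 5000
    x1, x2, x3 = rng.uniform(-1, 1, size=(3, N))
    y = f(x1, x2, x3)                      # no randomness in y given (x1, x2, x3)

    # learn a linear model from the two visible inputs only
    X_visible = np.column_stack([x1, x2, np.ones(N)])
    w, *_ = np.linalg.lstsq(X_visible, y, rcond=None)
    residuals = y - X_visible @ w

    # the residuals come entirely from the hidden x3, but to the learner they
    # look like zero-mean noise, roughly uncorrelated with the inputs it can see
    print("residual mean:", round(residuals.mean(), 4))
    print("residual std: ", round(residuals.std(), 4))
    print("corr with x1: ", round(np.corrcoef(residuals, x1)[0, 1], 4))
    print("corr with x2: ", round(np.corrcoef(residuals, x2)[0, 1], 4))

The residuals here are generated entirely deterministically by the hidden x3, yet from the learner's point of view they behave just like zero-mean noise on the target.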

Thanks, Yaser, for getting us to think about an important issue. Hopefully I am in a better position now to use it in practice. Please do point out anything that I still have wrong or incomplete.