#1




Deterministic Noise & Gibbs Phenomenon/Godunov's Theorem
I think deterministic noise represents the error that arises when a lower-order 'polynomial' hypothesis attempts to model/forecast a much higher-order target function. This error shrinks as the order of the hypothesis polynomial is increased, though it never vanishes as long as the target lies outside the hypothesis set. Also, from the data alone, it is indistinguishable from stochastic noise.
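To make that concrete, here is a small numerical sketch (my own toy setup, not anything from the lectures): fit polynomials of increasing degree to noise-free samples of a target outside every finite-degree set, so the residual error is purely deterministic noise.

```python
# Toy illustration of deterministic noise: the best-in-class error of a
# polynomial hypothesis set fitting a fixed target f(x) = sin(pi*x),
# which lies outside every finite-degree polynomial set.
import numpy as np

def deterministic_noise(degree, n=2001):
    """RMS error of the best degree-`degree` polynomial fit to sin(pi*x)
    on [-1, 1]. The data are noise-free, so the residual is purely
    deterministic noise (bias), not stochastic noise."""
    x = np.linspace(-1.0, 1.0, n)
    y = np.sin(np.pi * x)                 # target outside the hypothesis set
    coeffs = np.polyfit(x, y, degree)     # least-squares best fit in the set
    return np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))

errs = [deterministic_noise(d) for d in (1, 3, 5, 7)]
# the error shrinks as the hypothesis set grows richer, but never hits zero
```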
I find this similar to the Gibbs phenomenon and Godunov's theorem. Specifically, in Computational Fluid Dynamics, one is required to use a lower-order (first-order) representation near a discontinuity (shock wave); otherwise, the scheme becomes unstable, with the magnitude of the oscillations near and across the discontinuity (the deterministic noise in this case) growing with every iteration. To address this, limiters are used. Godunov's theorem states: "Linear numerical schemes for solving partial differential equations (PDEs), having the property of not generating new extrema (monotone schemes), can be at most first-order accurate." I just wanted to know whether my understanding of deterministic noise and its similarity to the Gibbs phenomenon/Godunov's theorem is correct. If so, are concepts such as limiters of any use in formulating hypotheses in Machine Learning? 
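For readers who have not met limiters before, here is a minimal sketch of the minmod slope limiter in its standard textbook form (nothing course-specific): it is exactly the mechanism that drops a second-order reconstruction to first order at a shock, the trade-off Godunov's theorem describes.

```python
# Minmod slope limiter: keeps a piecewise-linear (second-order)
# reconstruction from creating new extrema near a discontinuity.
import numpy as np

def minmod(a, b):
    """Pick the smaller-magnitude slope when the signs agree, else zero."""
    return np.where(a * b > 0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def limited_slopes(u):
    """Cell-wise limited slopes for a piecewise-linear reconstruction of u
    (interior cells only)."""
    du_left = u[1:-1] - u[:-2]            # backward differences
    du_right = u[2:] - u[1:-1]            # forward differences
    return minmod(du_left, du_right)      # flattens to zero at extrema/jumps

u = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])  # a discontinuity (shock)
slopes = limited_slopes(u)
# the slopes vanish around the jump: the scheme locally falls back to a
# first-order (piecewise-constant) representation, suppressing oscillations
```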
#2




Re: Deterministic Noise & Gibbs Phenomenon/Godunov's Theorem
I am not familiar enough with the Gibbs phenomenon to give precise comments, but from my reading of the phenomenon on Wikipedia, there are indeed similarities:

(0) Deterministic noise: for a fixed target function outside the hypothesis set, the deterministic noise decreases with higher-order polynomials, but does not vanish.

(1) Gibbs: for a fixed (discontinuous) function, the overshoot narrows with higher-order Fourier series, but its height does not vanish.

Just my two cents.
__________________
When one teaches, two learn. 
Tags 
deterministic noise, gibbs, godunov, limiters 