Deterministic Noise & the Gibbs Phenomenon/Godunov's Theorem
I think deterministic noise represents the error that arises when a hypothesis polynomial attempts to model/forecast a much higher-order target function, and that this error is reduced when the order of the hypothesis polynomial is reduced. Also, to the learning algorithm, deterministic noise is indistinguishable from stochastic noise.
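To make my picture concrete, here is a minimal sketch (my own illustration, using a 10th-order Chebyshev polynomial as the target; all names are my choices): the target has no stochastic noise at all, yet a low-order fit leaves a structured residual, which is what I understand deterministic noise to be.

```python
import numpy as np

x = np.linspace(-1, 1, 200)
# Noiseless high-order target: the 10th Chebyshev polynomial T_10(x).
target = np.polynomial.chebyshev.chebval(x, [0] * 10 + [1])

# Deliberately low-order hypothesis polynomial (degree 3).
coeffs = np.polynomial.polynomial.polyfit(x, target, deg=3)
fit = np.polynomial.polynomial.polyval(x, coeffs)

# The residual is the part of the target the hypothesis cannot capture:
# deterministic noise, even though no stochastic noise was added.
residual = target - fit
print(f"RMS deterministic noise (deg 3): {np.sqrt(np.mean(residual**2)):.3f}")
```

If the hypothesis order is raised to match the target, this residual vanishes, which is why I think of it as a property of the hypothesis set rather than of the data.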
I find this similar to the Gibbs phenomenon and Godunov's theorem. Specifically, in Computational Fluid Dynamics, one is required to use a lower-order (first-order) representation near a discontinuity (shock wave); otherwise the scheme becomes unstable, with the magnitude of the oscillations near and across the discontinuity (the deterministic noise in this case) growing with every iteration. To address this, limiters are used. Godunov's theorem states: "Linear numerical schemes for solving partial differential equations (PDEs), having the property of not generating new extrema (monotone schemes), can be at most first-order accurate."
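For reference, the Gibbs phenomenon I have in mind can be reproduced with a truncated Fourier series of a square wave (a standard illustration, not from any particular CFD code): the overshoot next to the jump does not shrink as more terms are added, staying at roughly 9% of the jump height.

```python
import numpy as np

# Evaluate on the open half-period (0, pi) of a square wave of height +/-1.
x = np.linspace(1e-4, np.pi - 1e-4, 4001)

def partial_sum(x, n_terms):
    """Truncated Fourier series of sign(sin(x)): (4/pi) * sum sin((2k+1)x)/(2k+1)."""
    k = np.arange(n_terms)
    return (4 / np.pi) * np.sum(np.sin(np.outer(x, 2 * k + 1)) / (2 * k + 1), axis=1)

for n in (10, 50, 200):
    overshoot = partial_sum(x, n).max() - 1.0
    # Overshoot stays near 0.179 (~9% of the jump of 2) no matter how many terms are used.
    print(f"{n:4d} terms: overshoot = {overshoot:.4f}")
```

This persistence of the overshoot near the discontinuity is what makes the analogy with oscillations around a shock seem natural to me.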
I just wanted to know if my understanding of deterministic noise and its similarity to the Gibbs phenomenon/Godunov's theorem is correct. If so, are concepts such as limiters of any use in formulating hypotheses in machine learning?
