LFD Book Forum: Deterministic Noise & Gibbs Phenomenon / Godunov's Theorem

#1
12-18-2012, 05:00 PM
 jk.aero Junior Member Join Date: Jul 2012 Posts: 1
Deterministic Noise & Gibbs Phenomenon/Godunov's Theorem

I think deterministic noise represents the error that arises when a polynomial hypothesis attempts to model/forecast a target function of much higher order. The overfitting driven by this noise is reduced when the order of the hypothesis polynomial is reduced. Also, to the learning algorithm, this noise is indistinguishable from stochastic noise.
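As a small numerical sketch of this idea (my own toy example, not from the course or the book): if the target is not a polynomial, no polynomial hypothesis set can capture it exactly, and the best-achievable fit error within each set is a proxy for the deterministic noise. It shrinks as the order grows, but never reaches zero.

```python
import numpy as np

# Toy example: cos(4x) stands in for a target of "much higher order"
# than any hypothesis set we try. The residual of the best least-squares
# polynomial fit is the part no hypothesis in the set can capture.
x = np.linspace(-1, 1, 200)
target = np.cos(4 * x)

for degree in (1, 3, 5, 9):
    coeffs = np.polyfit(x, target, degree)      # best fit within H_degree
    residual = target - np.polyval(coeffs, x)
    rms = np.sqrt(np.mean(residual ** 2))
    print(f"degree {degree}: best-fit RMS error = {rms:.5f}")
```

The printed errors decrease with the degree but stay strictly positive, which is the "decreases but does not vanish" behavior discussed below.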

I find this similar to the Gibbs phenomenon and Godunov's theorem. Specifically, in Computational Fluid Dynamics, one is required to use a lower-order (first-order) representation near a discontinuity (a shock wave); otherwise the scheme becomes unstable, with the magnitude of the oscillations near and across the discontinuity (the deterministic noise in this case) growing with every iteration. To address this, limiters are used. Godunov's theorem states: "Linear numerical schemes for solving partial differential equations (PDEs), having the property of not generating new extrema (monotone schemes), can be at most first-order accurate."

I just wanted to know whether my understanding of deterministic noise and its similarity to the Gibbs phenomenon/Godunov's theorem is correct. If so, are concepts such as limiters of any use in formulating hypotheses in machine learning?
#2
12-19-2012, 07:47 AM
 htlin NTU Join Date: Aug 2009 Location: Taipei, Taiwan Posts: 601
Re: Deterministic Noise & Gibbs Phenomenon/Godunov's Theorem

I am not familiar enough with the Gibbs phenomenon to give precise comments. From my reading about the phenomenon on Wikipedia, there are indeed similarities:

(0) deterministic noise: for a fixed target function that is outside the hypothesis set, the deterministic noise decreases with higher-order polynomials, but does not vanish.

(1) Gibbs: for a fixed (discontinuous) function, the level of overshooting decreases with higher-order Fourier series, but does not vanish (the peak overshoot tends to roughly 9% of the jump).

Just my two cents.
__________________
When one teaches, two learn.

