 LFD Book Forum What about residual analysis in linear regression?
#1
 jlaurentum Member Join Date: Apr 2013 Location: Venezuela Posts: 41 What about residual analysis in linear regression?

I've been kind of saving this question, but decided to ask at this point.

Why is there no mention of residual analysis in any of the linear regression topics the course has covered? How does residual analysis fit into the data learning picture (if it fits in at all)?

Specifically: starting with this week's topic of regularization, we've seen how weight decay softens the weights, but in doing so, changes them from the weights you'd obtain in ordinary linear regression. I would imagine that with weight decay, it would no longer hold that the mean of the errors (as in the linear regression residuals e_n = y_n - w^T x_n) is equal to zero, so the residuals would not be normally distributed with equal variance and zero mean. In other words, with weight decay, at least one of the Gauss-Markov assumptions does not hold?
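To make the concern concrete, here is a small sketch (my own illustration, not from the course) comparing ordinary least squares against weight decay on synthetic data. With an intercept column, OLS residuals sum to exactly zero by the normal equations; once the weight-decay penalty is added, that property is generally lost:

```python
import numpy as np

# Synthetic data: intercept column plus two features (hypothetical example).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.5, size=50)

# Ordinary least squares: w = (X^T X)^{-1} X^T y
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Weight decay (ridge): w = (X^T X + lambda I)^{-1} X^T y,
# here penalizing all weights, including the intercept.
lam = 10.0
w_reg = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print("mean OLS residual:  ", np.mean(y - X @ w_ols))   # ~0 up to floating point
print("mean ridge residual:", np.mean(y - X @ w_reg))   # generally nonzero
```

The regularized fit shrinks the intercept along with the other weights, which shifts the predictions and leaves a nonzero mean residual; this is exactly the sense in which one of the Gauss-Markov-style properties of the OLS fit no longer applies.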

Does that matter?

In general, are the standard tools of linear regression analysis we were taught in school (looking at the coefficient of determination, hypothesis testing on the significance of the coefficients, and residual analysis to check whether the assumptions backing up those tools hold) entirely pointless when you're doing machine learning?
#2
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,476 Re: What about residual analysis in linear regression?

Quote:
 Originally Posted by jlaurentum ... In general, are the standard tools of linear regression analysis we were taught in school ... entirely pointless when you're doing machine learning?
Residual analysis and other details of linear regression are worthy topics. They are regularly covered in statistics, but often not covered in machine learning. If you recall, in Lecture 1 we alluded briefly to the contrast between statistics and machine learning (which do have a substantive overlap) in terms of mathematical assumptions and level of detailed analysis. Linear regression is a case in point for that contrast.
__________________
Where everyone thinks alike, no one thinks very much
#3
 jlaurentum Member Join Date: Apr 2013 Location: Venezuela Posts: 41 Re: What about residual analysis in linear regression?

Thank you for the quick reply, Professor. I'll review Lecture 1 more closely.

