LFD Book Forum Recency weighted regression

#11
08-26-2012, 03:00 AM
 itooam Senior Member Join Date: Jul 2012 Posts: 100
Re: Recency weighted regression

Suppose I could just transform the weight matrix into a vector, do a transpose, then do a cross product (not sure how to present that in algebraic form, but I think that is the solution)!?
#12
08-26-2012, 03:09 AM
 itooam Senior Member Join Date: Jul 2012 Posts: 100
Re: Recency weighted regression

* I meant "inner" product above NOT cross product.
#13
08-27-2012, 04:41 AM
 itooam Senior Member Join Date: Jul 2012 Posts: 100
Re: Recency weighted regression

Scrap what I wrote above about large datasets causing havoc for the weight matrix. I found that Octave already knows about such problems and has support for sparse matrices... very useful
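For anyone following along in Python rather than Octave, here is a minimal sketch of the same idea (made-up data and a hypothetical geometric recency weighting): the dense N x N weight matrix costs O(N^2) memory, while a sparse diagonal costs O(N), and both give the same regression weights.

```python
import numpy as np
from scipy.sparse import diags

rng = np.random.default_rng(0)
N, d = 1000, 3
X = rng.standard_normal((N, d))
y = rng.standard_normal(N)
alpha = 0.99 ** np.arange(N)[::-1]   # hypothetical recency weights: newest point weighted most

# Dense N x N weight matrix: O(N^2) memory
A_dense = np.diag(alpha)
# Sparse diagonal weight matrix: O(N) memory
A_sparse = diags(alpha)

# Weighted normal equations: w = (X^T A X)^{-1} X^T A y
w_dense = np.linalg.solve(X.T @ A_dense @ X, X.T @ (A_dense @ y))
w_sparse = np.linalg.solve(X.T @ (A_sparse @ X), X.T @ (A_sparse @ y))
```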
#14
08-27-2012, 01:08 PM
 magdon RPI Join Date: Aug 2009 Location: Troy, NY, USA. Posts: 597
Re: Recency weighted regression

Yes, there is a closed form solution, which is obtained by taking the \alpha_n into the square:

E(w) = \sum_{n=1}^{N} \alpha_n (w^T x_n - y_n)^2 = \sum_{n=1}^{N} (w^T \sqrt{\alpha_n}\, x_n - \sqrt{\alpha_n}\, y_n)^2

This is exactly an ordinary (unweighted) linear regression problem where you have rescaled each data point (x_n, y_n) by \sqrt{\alpha_n}. So, after you rescale your data in this way, you can just run your old regression algorithm without the weightings.
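As a quick numerical sanity check (a NumPy sketch with made-up data and hypothetical weights \alpha), the weighted normal equations and plain least squares on the \sqrt{\alpha_n}-rescaled data give the same w:

```python
import numpy as np

rng = np.random.default_rng(1)
N, d = 200, 4
X = rng.standard_normal((N, d))
y = rng.standard_normal(N)
alpha = np.linspace(0.1, 1.0, N)          # hypothetical recency weights

# Route 1: weighted normal equations, w = (X^T A X)^{-1} X^T A y with A = diag(alpha)
w_weighted = np.linalg.solve(X.T @ (alpha[:, None] * X), X.T @ (alpha * y))

# Route 2: rescale each data point by sqrt(alpha_n), then ordinary least squares
s = np.sqrt(alpha)
Xs, ys = s[:, None] * X, s * y
w_rescaled, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
```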

Quote:
 Originally Posted by itooam Thank you for all your help, it has been really appreciated. I have one final question: do you know if there is a closed form solution to \sum_{n=1}^{N} \alpha_n (w^T x_n - y_n)^2 (assuming \alpha is a vector with the same number of rows as X)? i.e., like the closed form solution as used for linear regression with regularization - copied from the lecture notes it is this: w_reg = (Z^T Z + \lambda I)^{-1} Z^T y. I am not sure where \alpha would end up in the above; the derivation is beyond me mathematically?
__________________
Have faith in probability
#15
08-28-2012, 04:15 AM
 itooam Senior Member Join Date: Jul 2012 Posts: 100
Re: Recency weighted regression

Thanks Magdon, I always manage to make things so much more complicated than they need to be. The equation you posted would have saved me hours - and it is so simple - why didn't I think of it? Instead I went the long way round; not a total loss though, as it has been a great learning curve for me.

I tried your approach and compared it to my workings (in one of my previous posts), where A = diag(\alpha) is the diagonal weight matrix:

w = (X^T A X + \lambda I)^{-1} X^T A y

and for all my tests I am getting the same w. So this is great news as it confirms my formula was correct too.
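In NumPy terms the comparison looks like this (a sketch with synthetic data standing in for my real dataset, and a made-up \lambda): the weight-matrix form with regularization matches ridge regression on the \sqrt{\alpha_n}-rescaled data.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d, lam = 150, 3, 0.5
X = rng.standard_normal((N, d))
y = rng.standard_normal(N)
alpha = 0.95 ** np.arange(N)[::-1]        # hypothetical recency weights

# Weight-matrix form: w = (X^T A X + lam*I)^{-1} X^T A y, A = diag(alpha)
w_matrix = np.linalg.solve(X.T @ (alpha[:, None] * X) + lam * np.eye(d),
                           X.T @ (alpha * y))

# Rescaled-data form: ridge regression on sqrt(alpha_n)-scaled points
s = np.sqrt(alpha)
Xs, ys = s[:, None] * X, s * y
w_rescaled = np.linalg.solve(Xs.T @ Xs + lam * np.eye(d), Xs.T @ ys)
```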

Many thanks, I can't say enough how much your help is appreciated.
