#1
Solutions to exercises
Hello,
The course and book are great! I've been going through a number of ML courses and books, but this is definitely the best. Thank you, Prof. Yaser! Can I find the solutions to the exercises in the book somewhere? It'd be super useful to check my answers against them as I go through the book. Thanks again for all this, Prof. Yaser! LR
#2
Re: Solutions to exercises
There is no solution manual PDF for distribution, but you are very welcome to discuss your answers with the community here.
__________________
When one teaches, two learn. 
#3
Re: Solutions to exercises
Hi,
You could also check out my solutions to the book problems on GitHub: https://github.com/ppaquay/LearningfromDataSolutions
Best regards!
#4
Re: Solutions to exercises
In Problem 1.1, the bags and the suggested use of Bayes' theorem are a bit of a red herring. You have 4 balls, 3 of which are black. You pick one ball, which is black. There are now 3 balls, 2 of which are black. The chance that you pick a black one is 2 out of 3. It doesn't matter how many bags there are (as long as there is at least one ball in each bag) or from which bag you choose the second ball (as long as there is a ball in it).
So, for example, if I have 10 balls spread over 3 bags (e.g. 4, 4, 2), one of which is white, and the first ball I pick is black, then the chance that I pick another black ball from any particular bag is 8/9. Try solving this with Bayes' theorem :^). You can think of the bags as groupings of the balls on a table in the dark: if you select a ball from one group, it doesn't matter which group you select the second ball from.
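For what it's worth, the 2/3 answer is easy to check by simulation. A minimal sketch, assuming the setup is two bags, one holding two black balls and the other holding one black and one white ball (the function name `estimate` is just for illustration):

```python
import random

def estimate(trials=200_000, seed=1):
    """Monte Carlo estimate of P(second ball black | first ball black),
    drawing both balls from the same randomly chosen bag."""
    rng = random.Random(seed)
    first_black = both_black = 0
    for _ in range(trials):
        # Pick a bag at random: {black, black} or {black, white}.
        bag = rng.choice([["black", "black"], ["black", "white"]])[:]
        rng.shuffle(bag)
        if bag[0] == "black":           # condition on the first draw being black
            first_black += 1
            both_black += bag[1] == "black"
    return both_black / first_black

print(estimate())  # close to 2/3, matching the counting argument above
```

The same harness, with the bag contents changed, can be used to test the 10-ball, 3-bag variant described above.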
#5
Re: Solutions to exercises
Quote:
In the step ||w(t-1) + y(t-1)x(t-1)||^2 = ||w(t-1)||^2 + ||x(t-1)||^2 + 2 y(t-1) w^T(t-1) x(t-1), what lemma are you basing this step on?

You can prove it by simple algebraic manipulation (so no need for a lemma), using the update rule w(t) = w(t-1) + y(t-1)x(t-1), the definition of ||w||^2, namely w^T w, and the fact that y(t-1)^2 = 1 since y is either +1 or -1.