
View Full Version : Homework 6


  1. Using TeX in your post
  2. Including a lecture video segment
  3. Hw 6 q1
  4. doubt in lecture 11, deterministic noise
  5. Minimizing Eaug- where did the 2 go?
  6. Hw 6 Q1: "In general"
  7. Questions on lecture 12
  8. Homework 6 #10
  9. Homework#6 Q2
  10. Usage of test data in early stopping
  11. Homework#6 Q3
  12. Clarification on HW6-Q8
  13. backpropagation at the final layer
  14. Classification Error - HW6 Q2
  15. Subset & deterministic noise
  16. Out of syllabus question on Regularization vs Priors
  17. Q9 clarification
  18. Discussion of Lecture 11 "Overfitting"
  19. Question on regularization for logistic regression
  20. weight decay and data normalization *not a homework question*
  21. Restricted Learner's Rule of Thumb (Lecture 11)
  22. Question 9 - minimum # of weights
  23. Lec-11: Overfitting in terms of (bias, var, stochastic-noise)
  24. What about residual analysis in linear regression?
  25. *ANSWER* questions w linear regression & weight decay
  26. HW6 - Q 2 thru 6 - make sure you apply abs() to the last 2 features!
  27. Visualizations (qs 2 to 6) - Regularization with weight decay
  28. misclassified error and Euclidean distance
  29. HW6-Q1 ambiguous/contradictory
  30. Doubt in Lecture 11
  31. *ANSWER* hw6 q8
  32. Related to Legendre polynomials
  33. *ANSWER* Hw6 q10
  34. Homework materials