
View Full Version : The Final


  1. Question 9
  2. Question 14 - bias term
  3. Questions on the Bayesian Prior
  4. Question 14 Lloyd's algorithm - empty clusters
  5. Questions 16-18 How to find Ein?
  6. RBF using a library package?
  7. Question 2 - Meaning of /g
  8. Question 10
  9. Question 20
  10. Question 20 - Average of two hypotheses
  11. Question about cluster centers RBF basic
  12. Questions 7-10
  13. Clarification Request On Problem 13
  14. Question 12
  15. P14-17 Out-Of-Sample Data Set Size?
  16. Q7 No feature transform but do we add an x_0
  17. Question 13 - RBF with hard margin unable to accurately classify
  18. Q19
  19. Question 6
  20. Binary Classification Error
  21. Question 11
  22. Lecture 18 and Shapley value
  23. P13 Question
  24. Question 7-9: all overwhelms one, should we use weights
  25. How Many Iterations to Pick Best K-Cluster?
  26. Problem 14 and 15: What Does "Beat" Mean?
  27. Final Question No. 4
  28. Q19: What mathematical object is the posterior
  29. On Bayesian learning
  30. True Caltech final experience
  31. Q6 clarification required
  32. Q1: clarification on the meaning of polynomial transform of order Q
  33. Question 2 - Method of averaging
  34. One-class SVMs
  35. on the right track?
  36. Q21
  37. March Madness
  38. All things considered...
  39. Data snooping and science
  40. Thanks
  41. Q12 - SVMs
  42. Clarification on Q4
  43. Q3 clarifications
  44. Question 20 - option d
  45. Thank you for an excellent course!
  46. Clarification on the Radial Basis Function Problems
  47. Q7-10
  48. Q9 Clarification
  49. Q13 - on using LIBSVM: C parameter?
  50. Failed to find the bias term for Q14
  51. Q13 bizarre results
  52. Clarification – Data Snooping and Tukey’s EDA
  53. Q16 - clarification needed
  54. Q16 and Q17 clarification
  55. *ANSWER* Q13 about linearly separable by SVM
  56. Q13, how to calculate b, and final predicted solution?
  57. Q13-15 RBF SVM Eout=0?
  58. *ANSWER* Q2