LFD Book Forum: Homework for Neural Network
#1
02-06-2015, 02:25 PM
 mach_learn Junior Member Join Date: Feb 2015 Posts: 2
Homework for Neural Network

Professor Yaser,

In the Lecture 9 Q&A, you mentioned that you used to have a homework programming assignment on creating a neural network, but that it was removed.

Could you please post that homework assignment? It would be very helpful.

Thank you
#2
02-15-2015, 08:52 PM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,478
Re: Homework for Neural Network

Here are the old homework problems (not required in this course, and there is no technical support).

Backpropagation:

Following the class notes, implement the backpropagation algorithm
that takes as input a network architecture (the number of layers and
the number of neurons in each layer) and a set of examples (x_n, y_n),
where x_n is the input vector and y_n is the target output,
and produces as output the network weights.
The algorithm should perform gradient descent on one example at a time,
but should also keep track of the average error for all the examples in
each epoch. Try your algorithm on the data set in

http://work.caltech.edu/train.dat

(the first two columns are the input and the third column is the output).
Test the convergence behavior for architectures with one hidden layer
and 1 to 5 neurons in that layer, with combinations of the following
parameters:

(i) The initial weight values chosen independently and randomly
from the range (-0.02,0.02), the range (-0.2,0.2), or the range (-2,2).

(ii) The learning rate fixed at each of a few different values.

(iii) Sufficient number of epochs to get the training error
to converge (within reason).

Turn in your code and a single parameter combination that resulted
in good convergence for the above architectures.
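Not part of the original assignment, but the steps above can be sketched roughly as follows: a minimal NumPy implementation assuming tanh units and squared error, gradient descent on one example at a time, and the average error tracked per epoch. The synthetic data here is only a stand-in for train.dat, and all names are illustrative:

```python
import numpy as np

def init_weights(arch, scale, rng):
    # One weight matrix per layer; row 0 holds the bias weights,
    # drawn uniformly from (-scale, scale) as in part (i).
    return [rng.uniform(-scale, scale, size=(arch[l] + 1, arch[l + 1]))
            for l in range(len(arch) - 1)]

def forward(x, weights):
    # Forward pass with tanh units; returns every layer's activations.
    acts = [np.asarray(x, dtype=float)]
    for W in weights:
        z = np.concatenate(([1.0], acts[-1])) @ W  # prepend the bias input
        acts.append(np.tanh(z))
    return acts

def backprop_epoch(X, y, weights, eta):
    # One epoch of stochastic gradient descent (one example at a time),
    # returning the average squared error over the epoch.
    total = 0.0
    for xn, yn in zip(X, y):
        acts = forward(xn, weights)
        err = acts[-1] - yn
        total += float(err @ err)
        delta = 2.0 * err * (1.0 - acts[-1] ** 2)  # output sensitivity
        for l in range(len(weights) - 1, -1, -1):
            grad = np.outer(np.concatenate(([1.0], acts[l])), delta)
            if l > 0:  # back-propagate before updating this layer's weights
                delta = (weights[l][1:] @ delta) * (1.0 - acts[l] ** 2)
            weights[l] -= eta * grad
    return total / len(X)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))          # stand-in for train.dat inputs
y = np.sign(X[:, 0] + X[:, 1]).reshape(-1, 1)  # stand-in for the output column
weights = init_weights((2, 3, 1), scale=0.2, rng=rng)
errors = [backprop_epoch(X, y, weights, eta=0.1) for _ in range(100)]
```

Setting scale to 0.02, 0.2, or 2 matches the three initialization ranges in part (i), and looping the hidden-layer size from 1 to 5 covers the architectures in the problem.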

Generalization:

Using your backpropagation program and the data from the above problem, train different
neural networks (an input layer, one hidden layer, and an
output layer) where the number of neurons in the hidden layer is 1, 2, 3, 4, or 5.
Use the following out-of-sample data to test your networks:

http://work.caltech.edu/test.dat

Plot the training and test errors for each network as a function of
the epoch number (hence the intermediate networks are evaluated using the
test data, but the test data is not used in the backpropagation).
Repeat the experiment by reversing the roles of the training and test
sets (you may need to readjust the parameter combination from the previous problem), and plot
the training and test errors again. Briefly analyze the results you get.
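Not part of the original post; one way the per-epoch bookkeeping might look, assuming a NumPy setup where `train_fn` runs one epoch of your backpropagation and `predict_fn` evaluates the current network on a set of inputs (both names are hypothetical placeholders for your own routines):

```python
import numpy as np

def load_dat(path):
    # Per the post: the first two columns are the input, the third the output.
    data = np.loadtxt(path)
    return data[:, :2], data[:, 2:3]

def epoch_curves(train_fn, predict_fn, X_tr, y_tr, X_te, y_te, epochs):
    # After each training epoch, record in-sample and out-of-sample mean
    # squared error. The test set is only evaluated, never trained on.
    curves = []
    for _ in range(epochs):
        train_fn(X_tr, y_tr)  # one epoch of backpropagation
        e_in = float(np.mean((predict_fn(X_tr) - y_tr) ** 2))
        e_out = float(np.mean((predict_fn(X_te) - y_te) ** 2))
        curves.append((e_in, e_out))
    return curves
```

Reversing the roles of the training and test sets is then just a matter of swapping the two (X, y) pairs in the call.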
__________________
Where everyone thinks alike, no one thinks very much
#3
02-16-2015, 03:09 PM
 mach_learn Junior Member Join Date: Feb 2015 Posts: 2
Re: Homework for Neural Network

Thank you Professor Yaser.
