LFD Book Forum  

Go Back   LFD Book Forum > Course Discussions > Online LFD course > Homework 5

#1 | 02-06-2015, 01:25 PM
mach_learn
Junior Member
Join Date: Feb 2015
Posts: 2

Homework for Neural Network

Professor Yaser,

In the Lecture 9 Q&A, you mentioned that you used to have a homework programming assignment for creating a neural network, but that it was removed.

Could you please post that homework assignment? It would be very helpful.

Thank you
#2 | 02-15-2015, 07:52 PM
yaser
Caltech
Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,472

Re: Homework for Neural Network

Here are the old homework problems (not required in this course, and there is no technical support).

\bullet Backpropagation:

Following the class notes, implement the backpropagation algorithm
that takes as input a network architecture (d^{(0)} = d , d^{(1)} , d^{(2)} , ... , d^{(L)} =1)
and a set of examples ({\bf x}_1 , y_1) , ... , ({\bf x}_N , y_N) where
{\bf x}_n \in {\mathbb R}^d and y_n \in {\mathbb R}, and produces as output the network weights.
The algorithm should perform gradient descent on one example at a time,
but should also keep track of the average error for all the examples in
each epoch. Try your algorithm on the data set in

http://work.caltech.edu/train.dat

(the first two columns are the input and the third column is the output).
Test the convergence behavior for architectures with one hidden layer
(L=2) and 1 to 5 neurons (d^{(1)}=1,2,3,4,5), with combinations of the following
parameters:

(i) The initial weight values chosen independently and randomly
from the range (-0.02,0.02), the range (-0.2,0.2), or the range (-2,2).

(ii) The learning rate \eta fixed at 0.01, 0.1 or 1.

(iii) Sufficient number of epochs to get the training error
to converge (within reason).

Turn in your code and a single parameter combination that resulted
in good convergence for the above architectures.
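(Not part of the original assignment.) The algorithm described above can be sketched as follows. This is a minimal stand-alone implementation, assuming a tanh hidden layer and a linear output unit with squared error; the architecture list, update rule, and per-epoch average error match the problem statement, but all function names are my own, and the synthetic data in the usage note stands in for train.dat.

```python
import math
import random

def init_network(dims, w_range):
    # dims = [d(0), d(1), ..., d(L)]; weights[l][i][j] connects node i of
    # layer l (index 0 is the bias) to node j of layer l+1
    return [[[random.uniform(-w_range, w_range) for _ in range(dims[l])]
             for _ in range(dims[l - 1] + 1)]
            for l in range(1, len(dims))]

def forward(weights, x):
    # returns the activations of every layer, each hidden layer with a
    # bias term 1.0 prepended; the output layer is linear
    activations = [[1.0] + list(x)]
    for l, W in enumerate(weights):
        prev = activations[-1]
        signals = [sum(prev[i] * W[i][j] for i in range(len(prev)))
                   for j in range(len(W[0]))]
        if l < len(weights) - 1:
            activations.append([1.0] + [math.tanh(s) for s in signals])
        else:
            activations.append(signals)  # linear output unit
    return activations

def backprop_epoch(weights, data, eta):
    # one epoch of gradient descent on one example at a time; returns the
    # average squared error over the epoch, as the problem asks
    total_err = 0.0
    for x, y in data:
        acts = forward(weights, x)
        out = acts[-1][0]
        total_err += (out - y) ** 2
        deltas = [[2.0 * (out - y)]]          # delta at the linear output
        for l in range(len(weights) - 1, 0, -1):
            W, prev_delta, a = weights[l], deltas[0], acts[l]
            # tanh'(s) = 1 - tanh(s)^2; skip the bias entry a[0]
            deltas.insert(0, [(1.0 - a[i + 1] ** 2) *
                              sum(W[i + 1][j] * prev_delta[j]
                                  for j in range(len(prev_delta)))
                              for i in range(len(W) - 1)])
        for l, W in enumerate(weights):        # update after each example
            for i in range(len(W)):
                for j in range(len(W[0])):
                    W[i][j] -= eta * acts[l][i] * deltas[l][j]
    return total_err / len(data)
```

For example, `init_network([2, 3, 1], 0.2)` builds the (d^{(0)}=2, d^{(1)}=3, d^{(2)}=1) architecture with initial weights from (-0.2, 0.2), and repeated calls to `backprop_epoch` give the per-epoch error curve whose convergence the problem asks you to examine.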

\bullet Generalization:

Using your backpropagation program and data from the above problem, train different
neural networks with L=2 (an input layer, one `hidden' layer, and an
output layer) where the number of neurons in the hidden layer is 1, 2, 3, 4, or 5.
Use the following out-of-sample data to test your networks:

http://work.caltech.edu/test.dat

Plot the training and test errors for each network as a function of
the epoch number (hence the `intermediate' networks are evaluated using the
test data, but the test data is not used in the backpropagation).
Repeat the experiment by reversing the roles of the training and test
sets (you may need to readjust the parameter combination from the previous problem), and plot
the training and test errors again. Briefly analyze the results you get.
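(Not part of the original assignment.) The evaluation protocol above — score every intermediate model on the test set, but never let the test set drive the updates — can be sketched with a stand-in model. A plain linear unit replaces the neural network here purely to keep the sketch short; all names are my own, and the two data files would replace the synthetic sets in practice.

```python
import random

def sq_err(w, data):
    # mean squared error of the linear model w[0] + w[1]*x on a data set
    return sum((w[0] + w[1] * x - y) ** 2 for x, y in data) / len(data)

def track_errors(train, test, eta=0.05, epochs=30):
    # after each epoch the intermediate model is scored on BOTH sets,
    # but only the training examples ever enter the update step
    w = [0.0, 0.0]
    history = []
    for _ in range(epochs):
        for x, y in train:
            e = w[0] + w[1] * x - y      # residual on one example
            w[0] -= eta * 2 * e          # gradient step, one example at a time
            w[1] -= eta * 2 * e * x
        history.append((sq_err(w, train), sq_err(w, test)))
    return history
```

Plotting the two components of `history` against the epoch number gives the requested curves, and swapping the `train` and `test` arguments reverses the roles of the two sets.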
__________________
Where everyone thinks alike, no one thinks very much
#3 | 02-16-2015, 02:09 PM
mach_learn
Junior Member
Join Date: Feb 2015
Posts: 2

Re: Homework for Neural Network

Thank you, Professor Yaser.
The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.