LFD Book Forum Homework 6 #10
#1
05-14-2012, 10:48 AM
 mjbeeson Member Join Date: Apr 2012 Posts: 25
Homework 6 #10

Counting weights. Just to clarify if I am counting right: suppose we have 2 input nodes and one hidden layer with 2 nodes and one output node. So then we have 4 weights, right? According to the problem statement the constant input nodes are counted in the "2". Similarly if every input or hidden layer has an even number of nodes then we must get an even number for the total number of weights. Do you agree? Or do I have an off-by-one error in this count somehow?
#2
05-14-2012, 10:59 AM
 cassio Member Join Date: Apr 2012 Location: Florianopolis - Brazil Posts: 16
Re: Homework 6 #10

Quote:
 Originally Posted by mjbeeson Counting weights. Just to clarify if I am counting right: suppose we have 2 input nodes and one hidden layer with 2 nodes and one output node. So then we have 4 weights, right?
Maybe I'm wrong, but I think that in your example we have 6 weights: 4 to fully connect the input layer with the hidden layer, plus 2 to connect the hidden layer with the output layer.
#3
05-14-2012, 11:05 AM
 rohanag Invited Guest Join Date: Apr 2012 Posts: 94
Re: Homework 6 #10

Quote:
 Originally Posted by cassio Maybe I'm not right, but I think that in your example we have 6 weights. 4 to fully connect the input layer with the hidden layer, plus 2 for the hidden layer with the output layer
4 weights are not needed to connect the input layer with the hidden one if one of the 2 hidden-layer nodes is constant.

Last edited by rohanag; 05-14-2012 at 11:09 AM. Reason: grammar
#4
05-14-2012, 11:08 AM
 rohanag Invited Guest Join Date: Apr 2012 Posts: 94
Re: Homework 6 #10

So if you did count the constant nodes in your specification, I think 4 is correct for the final answer.
#5
05-14-2012, 11:09 AM
 cassio Member Join Date: Apr 2012 Location: Florianopolis - Brazil Posts: 16
Re: Homework 6 #10

Quote:
 Originally Posted by rohanag Not 4 weights to connect the input layer with the hidden one, if he wants one of the 2 hidden layer nodes to be constant. Do you, mjbeeson?
But even the constant is multiplied by a weight that connects it to the next layer, isn't it?
#6
05-14-2012, 11:10 AM
 rohanag Invited Guest Join Date: Apr 2012 Posts: 94
Re: Homework 6 #10

Quote:
 Originally Posted by cassio But even the constant is multiplied by a weight that connects it to the next layer, isn't it?
Yes, so that's why there are 2 weights from the input to the hidden layer, and 2 weights from the hidden layer to the final output node.
#7
05-14-2012, 11:17 AM
 cassio Member Join Date: Apr 2012 Location: Florianopolis - Brazil Posts: 16
Re: Homework 6 #10

Quote:
 Originally Posted by rohanag Yes, so that's why there are 2 weights from the input to the hidden layer, and 2 weights from the hidden layer to the final output node.
Yes, now I understand what you meant. The constant unit does not receive connections from earlier layers.
#8
05-14-2012, 11:46 AM
 Yellin Member Join Date: Apr 2012 Posts: 26
Re: Homework 6 #10

Quote:
 Originally Posted by mjbeeson Counting weights. Just to clarify if I am counting right: suppose we have 2 input nodes and one hidden layer with 2 nodes and one output node. So then we have 4 weights, right? According to the problem statement the constant input nodes are counted in the "2". Similarly if every input or hidden layer has an even number of nodes then we must get an even number for the total number of weights. Do you agree? Or do I have an off-by-one error in this count somehow?
There may be some confusion here between "nodes", which are the little circles in the neural net diagrams, and "units", which are the x's that enter and leave the circles. So mjbeeson's example may mean there are 2 input units, of which one is constant, with two nodes, each receiving the two input units and producing one unit each. Adding one constant unit to the two units from those two nodes makes for 3 hidden units; the three go to one output node, which produces the output unit. Each node needs weights for each of its incoming units, so there are 4 weights into the first layer of nodes plus 3 for the three units entering the output node. Is this reasonable, or am I the one who is confused?
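A quick sketch of this count (my assumption: full connectivity, and constant units receive no incoming weights):

```python
# Yellin's reading: 2 input units (one of them constant) feed 2 hidden
# nodes; their outputs plus one constant unit give 3 hidden units, all
# feeding 1 output node.
input_units = 2                   # includes the constant input unit
hidden_nodes = 2                  # each receives all input units
hidden_units = hidden_nodes + 1   # node outputs plus a constant unit
output_nodes = 1

# Each node has one weight per incoming unit.
weights = input_units * hidden_nodes + hidden_units * output_nodes
print(weights)  # 2*2 + 3*1 = 7
```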
#9
05-14-2012, 12:23 PM
 cassio Member Join Date: Apr 2012 Location: Florianopolis - Brazil Posts: 16
Re: Homework 6 #10

Quote:
 Originally Posted by Yellin There may be some confusion here between "nodes", which are the little circles in the neural net diagrams, and "units", which are the x's that enter and leave the circles. So mjbeeson's example may mean there are 2 input units, of which one is constant, with two nodes, each receiving the two input units and producing one unit each. Adding one constant unit to the two units from those two nodes makes for 3 hidden units; the three go to one output node, which produces the output unit. Each node needs weights for each of its incoming units, so there are 4 weights into the first layer of nodes plus 3 for the three units entering the output node. Is this reasonable, or am I the one who is confused?
Yellin, in my understanding you are right in the example you gave. I would like to add the other interpretation of mjbeeson's problem, where rohanag opened my eyes: I have 2 input units (one of them the constant) and one node that receives both connections from the input units. In the next layer, I have the unit value from that node plus one constant unit (which results in two units in this hidden layer), both connecting to the output node. Thus I have two weights in the first part and two more in the last. Am I right?

Last edited by cassio; 05-14-2012 at 12:37 PM. Reason: grammar
#10
05-14-2012, 12:48 PM
 mjbeeson Member Join Date: Apr 2012 Posts: 25
Re: Homework 6 #10

I don't think that's the way we are to read the homework. Notice problem 8, where it says,
"10 input units (including the constant value)...and 36 hidden units (include the constant inputs of each hidden layer)."
The "including" part is important in the counting.

