LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 6 (http://book.caltech.edu/bookforum/forumdisplay.php?f=135)
-   -   Homework 6 #10 (http://book.caltech.edu/bookforum/showthread.php?t=494)

mjbeeson 05-14-2012 11:48 AM

Homework 6 #10
 
Counting weights. Just to clarify if I am counting right: suppose we have 2 input nodes and one hidden layer with 2 nodes and one output node. So then we have 4 weights, right? According to the problem statement, the constant input nodes are counted in the "2". Similarly, if every input or hidden layer has an even number of nodes, then we must get an even number for the total number of weights. Do you agree? Or do I have an off-by-one error in this count somehow?

cassio 05-14-2012 11:59 AM

Re: Homework 6 #10
 
Quote:

Originally Posted by mjbeeson (Post 2108)
Counting weights. Just to clarify if I am counting right: suppose we have 2 input nodes and one hidden layer with 2 nodes and one output node. So then we have 4 weights, right?

Maybe I'm not right, but I think that in your example we have 6 weights: 4 to fully connect the input layer to the hidden layer, plus 2 to connect the hidden layer to the output layer.

rohanag 05-14-2012 12:05 PM

Re: Homework 6 #10
 
Quote:

Originally Posted by cassio (Post 2110)
Maybe I'm not right, but I think that in your example we have 6 weights: 4 to fully connect the input layer to the hidden layer, plus 2 to connect the hidden layer to the output layer.

You don't need 4 weights to connect the input layer to the hidden one if he wants one of the 2 hidden-layer nodes to be constant.

rohanag 05-14-2012 12:08 PM

Re: Homework 6 #10
 
So if you did count the constant nodes in your specification, I think 4 is correct for the final answer.

cassio 05-14-2012 12:09 PM

Re: Homework 6 #10
 
Quote:

Originally Posted by rohanag (Post 2111)
You don't need 4 weights to connect the input layer to the hidden one if he wants one of the 2 hidden-layer nodes to be constant. Do you, mjbeeson?

But even the constant is multiplied by a weight that connects it to the next layer, isn't it?

rohanag 05-14-2012 12:10 PM

Re: Homework 6 #10
 
Quote:

Originally Posted by cassio (Post 2113)
But even the constant is multiplied by a weight that connects it to the next layer, isn't it?

Yes, and that's why there are 2 weights from the input to the hidden layer and 2 weights from the hidden layer to the final output node.
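
A minimal sketch of this count in Python, assuming, as in the problem statement quoted above, that each layer size already includes its constant node and that constant nodes receive no incoming weights:

Code:

layers = [2, 2]   # input layer (constant + one input), hidden layer (constant + one node)
output_nodes = 1  # single output node; the output layer has no constant unit

# Every non-constant node takes one weight per unit in the previous layer;
# constant nodes receive no incoming weights.
weights = sum(prev * (nxt - 1) for prev, nxt in zip(layers, layers[1:]))
weights += layers[-1] * output_nodes  # all hidden units feed the output node

print(weights)  # 2*(2-1) + 2*1 = 4, matching the count above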

cassio 05-14-2012 12:17 PM

Re: Homework 6 #10
 
Quote:

Originally Posted by rohanag (Post 2114)
Yes, and that's why there are 2 weights from the input to the hidden layer and 2 weights from the hidden layer to the final output node.

Yes, yes, now I understand what you meant: the constant unit does not receive connections from earlier layers.

rohanag 05-14-2012 12:19 PM

Re: Homework 6 #10
 
Quote:

Originally Posted by cassio (Post 2115)
Yes, yes, now I understand what you meant: the constant unit does not receive connections from earlier layers.

exactly :)

cassio 05-14-2012 12:23 PM

Re: Homework 6 #10
 
@mjbeeson, I am sorry for misleading you in my first post.

Yellin 05-14-2012 12:46 PM

Re: Homework 6 #10
 
Quote:

Originally Posted by mjbeeson (Post 2108)
Counting weights. Just to clarify if I am counting right: suppose we have 2 input nodes and one hidden layer with 2 nodes and one output node. So then we have 4 weights, right? According to the problem statement, the constant input nodes are counted in the "2". Similarly, if every input or hidden layer has an even number of nodes, then we must get an even number for the total number of weights. Do you agree? Or do I have an off-by-one error in this count somehow?

There may be some confusion here between "nodes", which are the little circles in the neural net diagrams, and "units", which are the x's that enter and leave the circles. So mjbeeson's example may mean there are 2 input units, of which one is constant, feeding two hidden nodes, each of which receives the two input units and produces one unit. Adding one constant unit to the two units from those two nodes makes for 3 hidden units; the three go to one output node, which produces the output unit. Each node needs one weight for each of its incoming units, so there are 4 weights into the first layer of nodes plus 3 for the three units entering the output node, for a total of 7. Is this reasonable, or am I the one who is confused?
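
Under Yellin's reading, where the "2" counts hidden nodes rather than units, the hidden layer contributes 3 units (two node outputs plus a constant), and the same counting rule gives 7. A minimal sketch, taking the unit counts from the post above as the assumed sizes:

Code:

units = [2, 3]    # 2 input units (one constant); 3 hidden units (2 node outputs + constant)
output_nodes = 1  # the output node produces the final output unit

# Each non-constant node takes one weight per incoming unit.
weights = sum(prev * (nxt - 1) for prev, nxt in zip(units, units[1:]))  # 2*2 = 4
weights += units[-1] * output_nodes                                     # 3*1 = 3

print(weights)  # 7, as counted in the post above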

