Thread: Homework 6 #10
Posted by Yellin, 05-14-2012, 12:46 PM
Re: Homework 6 #10

Quote:
Originally Posted by mjbeeson
Counting weights. Just to clarify if I am counting right: suppose we have 2 input nodes and one hidden layer with 2 nodes and one output node. So then we have 4 weights, right? According to the problem statement the constant input nodes are counted in the "2". Similarly if every input or hidden layer has an even number of nodes then we must get an even number for the total number of weights. Do you agree? Or do I have an off-by-one error in this count somehow?
There may be some confusion here between "nodes", which are the little circles in the neural net diagrams, and "units", which are the x's that enter and leave those circles. So mjbeeson's example may mean there are 2 input units, of which one is constant, feeding 2 hidden nodes, each of which receives both input units and produces one unit. Adding one constant unit to the two units coming out of those nodes makes 3 hidden units; the three go to one output node, which produces the output unit. Each node needs a weight for each of its incoming units, so there are 4 weights into the two hidden nodes plus 3 for the three units entering the output node, for a total of 7, which is odd rather than even. Is this reasonable, or am I the one who is confused?
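If it helps, here is a minimal Python sketch of that counting rule, assuming the layer sizes are given as unit counts that include the constant units (the function name and the [2, 3, 1] example list are my own illustration, not notation from the homework):

def count_weights(units_per_layer):
    """Count weights when every node takes all units (including the
    constant unit) leaving the previous layer.

    units_per_layer lists the units *leaving* each layer, constant unit
    included, e.g. [2, 3, 1] for the example above: 2 input units
    (one constant), 3 hidden units (2 nodes + 1 constant), 1 output unit.
    """
    total = 0
    last = len(units_per_layer) - 1
    for l in range(1, len(units_per_layer)):
        incoming = units_per_layer[l - 1]  # units entering each node of layer l
        # every layer except the output carries a constant unit that is
        # not produced by a node, so it receives no incoming weights
        nodes = units_per_layer[l] - (0 if l == last else 1)
        total += incoming * nodes
    return total

print(count_weights([2, 3, 1]))  # 2*2 + 3*1 = 7

Running it on the example gives 7, matching the count above.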