Quote:
Originally Posted by Yellin
There may be some confusion here between "nodes", which are the little circles in the neural net diagrams, and "units", which are the x's that enter and leave the circles. So mjbeeson's example may mean there are 2 input units, of which one is constant, with two nodes, each receiving the two input units and producing one unit each. Adding one constant unit to the two units from those two nodes makes for 3 hidden units; the three go to one output node, which produces the output unit. Each node needs weights for each of its incoming units, so there are 4 weights from the first layer of nodes plus 3 from the three units entering the output node. Is this reasonable, or am I the one who is confused?

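Yellin's counting can be sketched in a few lines of NumPy. This is just an illustration of the bookkeeping, with made-up weight values; the variable names are mine, not from the thread:

```python
import numpy as np

# Yellin's reading: 2 input units (one of them the constant "bias" unit),
# 2 hidden nodes, 1 output node.
x = np.array([0.5, 1.0])          # one input unit plus the constant unit

W_hidden = np.random.rand(2, 2)   # 2 hidden nodes x 2 incoming units = 4 weights
h = np.tanh(W_hidden @ x)         # each hidden node produces one unit

h_aug = np.append(h, 1.0)         # add a constant unit -> 3 hidden units
w_out = np.random.rand(3)         # output node needs 3 weights
y = w_out @ h_aug                 # the single output unit

total_weights = W_hidden.size + w_out.size
print(total_weights)              # 4 + 3 = 7
```

So under this reading the network has 7 weights in total.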
Yellin, in my understanding you are right about the example you gave. I would also like to add the other interpretation of the problem proposed by mjbeeson, to which rohanag opened my eyes: suppose I have 2 input units (one of them the constant) and a single node that receives connections from both input units. In the next layer, I then have the unit value from that node plus one constant unit (which results in two units in this hidden layer), both connecting to the output node. Thus, I have two weights in the first part and two more in the last, for four weights in total. Am I right?
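The second interpretation can be sketched the same way. Again, the weight values are arbitrary and the names are my own; only the counts matter:

```python
import numpy as np

# Alternative reading: 2 input units (one constant), a single hidden node,
# then one constant unit added in the hidden layer, and one output node.
x = np.array([0.5, 1.0])           # one input unit plus the constant unit

w_hidden = np.random.rand(2)       # single hidden node: 2 incoming weights
h = np.tanh(w_hidden @ x)          # the hidden node's one output unit

h_aug = np.array([h, 1.0])         # hidden unit plus constant -> 2 units
w_out = np.random.rand(2)          # output node: 2 weights

y = w_out @ h_aug                  # the output unit

total_weights = w_hidden.size + w_out.size
print(total_weights)               # 2 + 2 = 4
```

Under this reading there are only 4 weights, matching the "two weights in the first part and two more in the last" count above.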