LFD Book Forum  

#1
05-14-2012, 10:48 AM
mjbeeson
Member
Join Date: Apr 2012
Posts: 25
Homework 6 #10

Counting weights. Just to clarify whether I am counting right: suppose we have 2 input nodes, one hidden layer with 2 nodes, and one output node. So then we have 4 weights, right? According to the problem statement, the constant input nodes are counted in the "2". Similarly, if every input or hidden layer has an even number of nodes, then we must get an even number for the total number of weights. Do you agree? Or do I have an off-by-one error in this count somehow?
#2
05-14-2012, 10:59 AM
cassio
Member
Join Date: Apr 2012
Location: Florianopolis - Brazil
Posts: 16
Re: Homework 6 #10

Quote:
Originally Posted by mjbeeson
Counting weights. Just to clarify whether I am counting right: suppose we have 2 input nodes, one hidden layer with 2 nodes, and one output node. So then we have 4 weights, right?
Maybe I'm not right, but I think that in your example we have 6 weights: 4 to fully connect the input layer to the hidden layer, plus 2 to connect the hidden layer to the output layer.
#3
05-14-2012, 11:05 AM
rohanag
Invited Guest
Join Date: Apr 2012
Posts: 94
Re: Homework 6 #10

Quote:
Originally Posted by cassio
Maybe I'm not right, but I think that in your example we have 6 weights: 4 to fully connect the input layer to the hidden layer, plus 2 to connect the hidden layer to the output layer.
4 weights are not needed to connect the input layer to the hidden one, if he wants one of the 2 hidden layer nodes to be the constant node.

Last edited by rohanag; 05-14-2012 at 11:09 AM. Reason: grammar
#4
05-14-2012, 11:08 AM
rohanag
Invited Guest
Join Date: Apr 2012
Posts: 94
Re: Homework 6 #10

So if you did count the constant nodes in your specification, I think 4 is correct as the final answer.
#5
05-14-2012, 11:09 AM
cassio
Member
Join Date: Apr 2012
Location: Florianopolis - Brazil
Posts: 16
Re: Homework 6 #10

Quote:
Originally Posted by rohanag
Not 4 weights to connect the input layer to the hidden one, if he wants one of the 2 hidden layer nodes to be constant. Do you, mjbeeson?
But even the constant is multiplied by a weight that connects it to the next layer, is it not?
#6
05-14-2012, 11:10 AM
rohanag
Invited Guest
Join Date: Apr 2012
Posts: 94
Re: Homework 6 #10

Quote:
Originally Posted by cassio
But even the constant is multiplied by a weight that connects it to the next layer, is it not?
Yes, and that's why there are 2 weights from the input to the hidden layer and 2 weights from the hidden layer to the final output node.
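
A quick sketch of that count (my own illustration, assuming the convention the thread settles on, namely that both the "2 input nodes" and the "2 hidden nodes" include their constant node):

Code:
input_units = 2    # constant x0 = 1 plus one real input
hidden_units = 2   # constant node plus one computation node
output_units = 1

# The constant hidden node receives no incoming weights.
weights_in_to_hidden = input_units * (hidden_units - 1)    # 2 * 1 = 2
weights_hidden_to_out = hidden_units * output_units        # 2 * 1 = 2
print(weights_in_to_hidden + weights_hidden_to_out)        # 4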
#7
05-14-2012, 11:17 AM
cassio
Member
Join Date: Apr 2012
Location: Florianopolis - Brazil
Posts: 16
Re: Homework 6 #10

Quote:
Originally Posted by rohanag
Yes, and that's why there are 2 weights from the input to the hidden layer and 2 weights from the hidden layer to the final output node.
Yes yes, now I understand what you meant. The constant unit does not receive connections from the previous layer.
#8
05-14-2012, 11:19 AM
rohanag
Invited Guest
Join Date: Apr 2012
Posts: 94
Re: Homework 6 #10

Quote:
Originally Posted by cassio
Yes yes, now I understand what you meant. The constant unit does not receive connections from the previous layer.
Exactly.
#9
05-14-2012, 11:23 AM
cassio
Member
Join Date: Apr 2012
Location: Florianopolis - Brazil
Posts: 16
Re: Homework 6 #10

@mjbeeson, I am sorry for misleading you in my first post.
#10
05-14-2012, 11:46 AM
Yellin
Member
Join Date: Apr 2012
Posts: 26
Re: Homework 6 #10

Quote:
Originally Posted by mjbeeson
Counting weights. Just to clarify whether I am counting right: suppose we have 2 input nodes, one hidden layer with 2 nodes, and one output node. So then we have 4 weights, right? According to the problem statement, the constant input nodes are counted in the "2". Similarly, if every input or hidden layer has an even number of nodes, then we must get an even number for the total number of weights. Do you agree? Or do I have an off-by-one error in this count somehow?
There may be some confusion here between "nodes", which are the little circles in the neural net diagrams, and "units", which are the x's that enter and leave the circles. So mjbeeson's example may mean there are 2 input units, of which one is constant, feeding two hidden nodes, each of which receives the two input units and produces one unit. Adding one constant unit to the two units from those two nodes makes for 3 hidden units; the three go to one output node, which produces the output unit. Each node needs a weight for each of its incoming units, so there are 4 weights into the first layer of nodes plus 3 for the three units entering the output node, 7 in total. Is this reasonable, or am I the one who is confused?
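
A small sketch (my own, not from the homework or this thread; the function name and flag are made up) that reproduces the two counts discussed above for the 2-2-1 example: 4 weights if the "2" hidden nodes already include the constant node, and 7 if they do not:

Code:
def count_weights(layer_sizes, hidden_includes_constant=True):
    # layer_sizes = [input, hidden_1, ..., hidden_k, output]; the input size is
    # taken to include the constant x0 = 1, as the problem statement says.
    # If hidden_includes_constant is True, each hidden size already counts its
    # constant node, which receives no incoming weights; if False, a constant
    # unit is appended to every hidden layer before it feeds the next layer.
    total = 0
    last = len(layer_sizes) - 1
    for l in range(1, last + 1):
        senders = layer_sizes[l - 1]
        receivers = layer_sizes[l]
        if l > 1 and not hidden_includes_constant:
            senders += 1        # the hidden layer's appended constant unit
        if l < last and hidden_includes_constant:
            receivers -= 1      # the constant node takes no incoming weights
        total += senders * receivers
    return total

print(count_weights([2, 2, 1], hidden_includes_constant=True))   # 4 (the count settled on earlier)
print(count_weights([2, 2, 1], hidden_includes_constant=False))  # 7 (the count in this post)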