LFD Book Forum  

LFD Book Forum > General > General Discussion of Machine Learning

  #1  
Old 05-30-2013, 03:13 AM
Elroch
Invited Guest

Join Date: Mar 2013
Posts: 143
Try this!

I'd like to share a very interesting problem that I've had a lot of fun exploring, and that has taught me a few things about the behaviour of neural networks and how to train them efficiently. It can be studied with any general-purpose neural network software, of which there are many.

The data set simply consists of identical inputs and outputs over all combinations of n bits. For example, with n=4 a typical input/output pair would be {1101, 1101}. So there are only 2^n points in the data set. One nice thing is that this makes training very fast, which permits a lot of experimentation with little waiting. Typical problems with thousands of data points can involve much more waiting, unless you have access to a cloud or a rack of GPUs!
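
As a concrete illustration (my own sketch, not from the original post), the full data set for n=4 can be generated in a few lines of Python:

```python
# Enumerate all 2^n bit patterns; each one serves as both input and target.
import itertools

n = 4
patterns = [list(bits) for bits in itertools.product([0, 1], repeat=n)]
data = [(x, x) for x in patterns]  # identical input/output pairs

print(len(data))  # 2^4 = 16 training points
```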

With that data set, the problem would be a bit trivial without some constraint on the design. The constraint is that one of the hidden layers must have only a single neuron in it! Any number of other hidden layers of any size are permitted between the inputs and the single neuron, and between the single neuron and the outputs.

The first issue is how to design the network. It clearly has n input neurons and n output neurons, with a single neuron in one hidden layer, and possibly further hidden layers between the inputs and that bottleneck and between the bottleneck and the outputs.
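
One way to set such a network up (the layer sizes here are my own choice for illustration, not the author's design) is a 4-8-1-8-4 stack of sigmoid layers. A minimal NumPy forward pass through such a stack, assuming random initial weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
sizes = [4, 8, 1, 8, 4]  # the middle layer is the single-neuron bottleneck
Ws = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(b) for b in sizes[1:]]

def forward(x):
    a = np.asarray(x, dtype=float)
    for W, b in zip(Ws, bs):
        a = sigmoid(a @ W + b)
    return a

out = forward([1, 1, 0, 1])
print(out.shape)  # one sigmoid output per bit
```

Training should drive each output toward the corresponding input bit; before training, the outputs just hover somewhere in (0, 1).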

Without giving away too much, I will point out some of the things that I found interesting with the 4-bit version:
  1. One side of the network can be implemented much more simply than the other. You can find out which.
  2. Local minima can be a problem. Seeing very clear examples of this was enlightening. A local minimum in this problem means that some sets of bits are getting corrupted in the network. Some choices of training algorithm get stuck permanently at different local minima, depending on the initial weights.
  3. This problem led me to explore various training algorithms. Suffice it to say, back propagation does not work well here, and the superior algorithm I had mainly been using up to now was not the best either. This was also very interesting and will affect my future use of neural networks.
  4. The choice of training algorithm may come close to removing the problem of local minima here. This is very exciting, especially as local minima are so starkly visible in this problem.
  5. Convergence time varies a lot with the choice of random weights. Some training algorithms can get almost stuck at local minima, but eventually escape, rather than just getting stuck permanently.
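
To see points 2 and 5 for yourself, a bare-bones experiment (my own sketch: plain full-batch gradient descent on squared error with sigmoid layers, which the post suggests is not the strongest choice here) is to train from several random initialisations and compare the final errors:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(seed, n=4, hidden=8, steps=300, lr=1.0):
    rng = np.random.default_rng(seed)
    # All 2^n bit patterns, used as both inputs and targets.
    X = np.array([[(i >> k) & 1 for k in range(n)] for i in range(2 ** n)], float)
    sizes = [n, hidden, 1, hidden, n]  # single-neuron bottleneck
    Ws = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
    bs = [np.zeros(b) for b in sizes[1:]]
    for _ in range(steps):
        acts = [X]                      # forward pass, keeping activations
        for W, b in zip(Ws, bs):
            acts.append(sigmoid(acts[-1] @ W + b))
        delta = (acts[-1] - X) * acts[-1] * (1 - acts[-1])  # output-layer error
        for i in range(len(Ws) - 1, -1, -1):
            grad_W = acts[i].T @ delta / len(X)
            grad_b = delta.mean(axis=0)
            if i > 0:                   # propagate before overwriting Ws[i]
                delta = (delta @ Ws[i].T) * acts[i] * (1 - acts[i])
            Ws[i] -= lr * grad_W
            bs[i] -= lr * grad_b
    out = X                             # final training error
    for W, b in zip(Ws, bs):
        out = sigmoid(out @ W + b)
    return float(np.mean((out - X) ** 2))

# Final error varies with the initial weights -- different runs can settle
# at different local minima or take very different times to make progress.
losses = [train(seed) for seed in (0, 1, 2)]
print(losses)
```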

Have fun! Seriously, I think it's well worth the time.
  #2  
Old 05-30-2013, 03:39 AM
yaser
Caltech

Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,472
Re: Try this!

Thank you. This would make a nice homework problem/project in a Machine Learning class.
__________________
Where everyone thinks alike, no one thinks very much
  #3  
Old 04-08-2017, 04:15 PM
diethealthcare
Banned

Join Date: Apr 2017
Posts: 14
Re: Try this!

Thanks for your post.
  #4  
Old 06-12-2017, 02:48 AM
Haeraa-ran
Junior Member

Join Date: Jun 2017
Posts: 1
Re: Try this!

Very good information.
  #5  
Old 06-13-2017, 09:39 AM
nidhi
Junior Member

Join Date: Jun 2017
Posts: 3
Re: Try this!

Thanks for this information
__________________
Blog is my passion Geek World News
  #6  
Old 07-28-2017, 01:43 AM
Alakey
Junior Member

Join Date: Jul 2017
Posts: 2
Re: Try this!

Lol! Thanks, gonna try tomorrow.
__________________
More skulls for the skull throne!
https://www.bestadvisor.com/
  #7  
Old 08-05-2017, 09:30 AM
Khalid
Junior Member

Join Date: Aug 2017
Posts: 5
Re: Try this!

Thanks for sharing this with us.
A dream interpretation book
  #8  
Old 08-16-2017, 04:32 AM
pdsubraa
Junior Member

Join Date: Aug 2017
Location: Singapore
Posts: 1
Re: Try this!

Thank you - I shall try!
  #9  
Old Today, 06:25 AM
john511
Junior Member

Join Date: Aug 2017
Posts: 6
Re: Try this!

Thank you!


Powered by vBulletin® Version 3.8.3
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.