LFD Book Forum


Elroch 05-30-2013 03:13 AM

Try this!
 
I'd like to share a very interesting problem that I've had a lot of fun exploring, and from which I learnt a few things about the behaviour of neural networks and how to train them efficiently. It can be studied with any general-purpose neural network software, of which there are many.

The data set simply consists of input/output pairs in which the output is identical to the input, over all combinations of n bits. For example, with n=4 a typical input/output pair would be {1101, 1101}. So there are only 2^n points in the data set. One nice thing is that this makes training very fast, which permits a lot of experimentation with little waiting. Typical problems with thousands of data points can involve a lot more waiting, unless you have access to a cloud or a rack of GPUs!
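As an illustration, generating the data set takes only a few lines. This is a minimal sketch in Python (the variable names are mine, and nothing here depends on a particular neural network package):

[CODE]
from itertools import product

n = 4
# every n-bit pattern appears exactly once; input and target are the same pattern
patterns = [list(bits) for bits in product([0, 1], repeat=n)]
X = patterns                      # inputs
Y = [row[:] for row in patterns]  # targets, identical to the inputs
print(len(X))                     # 2**n = 16 pairs for n = 4
[/CODE]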

With that data set, the problem would be a bit trivial without some constraint on the design. The constraint is that one of the layers must have only a single neuron in it! Any number of other hidden layers of any size are permitted between the inputs and the single neuron, and between the single neuron and the outputs.

The first issue is how to design the network. It clearly has n input neurons and n output neurons, a single neuron in some hidden layer, and one or more hidden layers both between the inputs and the single neuron and between the single neuron and the outputs.
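For concreteness, here is one possible layout sketched in PyTorch. The hidden-layer width h = 8 and the tanh/sigmoid activations are my own illustrative assumptions; the problem does not fix either:

[CODE]
import torch.nn as nn

n, h = 4, 8  # h is an assumed hidden-layer width, not part of the problem statement
model = nn.Sequential(
    nn.Linear(n, h), nn.Tanh(),     # hidden layer between the inputs and the bottleneck
    nn.Linear(h, 1), nn.Tanh(),     # the single-neuron layer
    nn.Linear(1, h), nn.Tanh(),     # hidden layer between the bottleneck and the outputs
    nn.Linear(h, n), nn.Sigmoid(),  # n outputs, one per bit
)
print(model)
[/CODE]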

Without giving away too much, I will point out some of the things that I found interesting with the 4-bit version:
  1. One side of the network can be implemented much more simply than the other. You can find out which. :)
  2. Local minima can be a problem. Seeing very clear examples of this was very enlightening. A local minimum in this problem means that some sets of bits are getting corrupted in the network. Some choices of training algorithm get stuck permanently at different local minima depending on the initial weights.
  3. This problem led me to explore various training algorithms (see the sketch after this list). Suffice it to say, backpropagation does not work well here, and the superior algorithm I had mainly been using up to now was not the best either. This was also very interesting and will affect my future use of neural networks.
  4. The choice of training algorithm may come close to removing the problem of local minima here. This is a very exciting fact, especially as local minima are so starkly visible in this problem.
  5. Convergence time varies a lot with the choice of random weights. Some training algorithms can get almost stuck at local minima, but eventually escape, rather than just getting stuck permanently.
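To make points 2-5 concrete, here is a hedged training sketch in PyTorch. The post does not name the algorithms that were tried, so plain gradient descent (SGD) stands in for straightforward backpropagation and Adam stands in for a more adaptive alternative; all hyperparameters are illustrative.

[CODE]
from itertools import product
import torch
import torch.nn as nn

n, h = 4, 8
X = torch.tensor(list(product([0, 1], repeat=n)), dtype=torch.float32)

def make_model():
    # n -> h -> 1 (bottleneck) -> h -> n; activations are illustrative choices
    return nn.Sequential(
        nn.Linear(n, h), nn.Tanh(),
        nn.Linear(h, 1), nn.Tanh(),
        nn.Linear(1, h), nn.Tanh(),
        nn.Linear(h, n), nn.Sigmoid(),
    )

def train(model, opt, steps=5000):
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(X), X)  # reproduce the input at the output
        loss.backward()
        opt.step()
    return loss.item()

torch.manual_seed(0)  # different seeds start from different initial weights
m1, m2 = make_model(), make_model()
print("SGD  final loss:", train(m1, torch.optim.SGD(m1.parameters(), lr=0.5)))
print("Adam final loss:", train(m2, torch.optim.Adam(m2.parameters(), lr=0.01)))
[/CODE]

Running this across several random seeds and comparing the final losses is one simple way to see the local-minimum behaviour described above.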

Have fun! Seriously, I think it's well worth the time.

yaser 05-30-2013 03:39 AM

Re: Try this!
 
Thank you. This would make a nice homework problem/project in a Machine Learning class.

diethealthcare 04-08-2017 04:15 PM

Re: Try this!
 
Thanks for your post. :)

Haeraa-ran 06-12-2017 02:48 AM

Re: Try this!
 
Very good information.

Alakey 07-28-2017 01:43 AM

Re: Try this!
 
Lol! Thanks, gonna try tomorrow. :D

pdsubraa 08-16-2017 04:32 AM

Re: Try this!
 
Thank you - I shall try!

john511 08-18-2017 06:25 AM

Re: Try this!
 
Thank you!

amrsaber 10-02-2017 04:03 PM

Re: Try this!
 
Must try this, thanks!

JennPendic 05-01-2018 02:30 PM

Re: Try this!
 
God I have to try this!!!

