
This multi-file project is part of my M.Sc. dissertation thesis. The repository contains the background theory of Deep Belief Networks and of machine learning with Artificial Neural Networks, illustrated through the included examples.

kmonachopoulos/Deep-Learning-DBN-From-the-ground-up

This M.Sc. thesis deals with the study and implementation of Deep Belief Networks, covering both their theoretical and practical background. Our aim is to investigate and analyze the theory behind Deep Belief Networks, starting from machine learning theory in the field of Artificial Neural Networks and finishing with an algorithmic implementation. The training procedure combines Greedy Layer-Wise Unsupervised Pre-Training with Semi-Supervised Fine-Tuning; these techniques initialize and then optimize the synaptic weights using a small part of the training patterns in the database.

In studying Deep Belief Networks, we analyze all the methods that contribute to the deep network structure and the individual techniques they rely on. In particular, we develop the theory behind the Metropolis-Hastings, Gibbs Sampling, and Simulated Annealing techniques, as well as their origins. Starting from the unsupervised Hopfield Network and applying these techniques, we derive the stochastic form of the Boltzmann Machine. By simplifying the network's connectivity we obtain the Restricted Boltzmann Machine, which captures high-order regularities of the probability density function of the input patterns, and by stacking Restricted Boltzmann Machines we arrive at the final structure of a Deep Belief Network.

As a proof of concept, we develop a Deep Belief Network and evaluate it on the binarized MNIST handwritten digit database. We summarize the recent techniques applied to this database together with their recognition error rates. Finally, we present our algorithm's results obtained in the MATLAB IDE, with final In-Sample and Out-Of-Sample errors of 0.19% and 1.7%, respectively.
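The pre-training stage described above amounts to training one Restricted Boltzmann Machine per layer with contrastive divergence and feeding each layer's hidden activations to the next RBM. The following MATLAB sketch is only a minimal illustration of that idea, not the thesis code: the function names, the CD-1 update, the omission of bias terms, and the hyper-parameters (learning rate, epochs, layer sizes) are simplifying assumptions.

    % Minimal sketch of greedy layer-wise unsupervised pre-training with CD-1.
    % X is a patterns-by-features matrix of binary inputs (e.g. binarized MNIST).
    function weights = pretrain_dbn(X, layer_sizes, epochs, lr)
      data = X;
      weights = cell(1, numel(layer_sizes));
      for l = 1:numel(layer_sizes)
        weights{l} = train_rbm(data, layer_sizes(l), epochs, lr);
        % The hidden-layer probabilities become the next RBM's training data.
        data = sigmoid(data * weights{l});
      end
    end

    function W = train_rbm(V, num_hidden, epochs, lr)
      [num_patterns, num_visible] = size(V);
      W = 0.01 * randn(num_visible, num_hidden);   % small random initialization
      for e = 1:epochs
        % Positive phase: hidden probabilities and a binary sample given the data.
        h_prob   = sigmoid(V * W);
        h_sample = h_prob > rand(size(h_prob));
        % Negative phase: one Gibbs step (CD-1) down to the visible layer and back up.
        v_recon = sigmoid(h_sample * W');
        h_recon = sigmoid(v_recon * W);
        % Contrastive divergence update of the synaptic weights.
        W = W + lr * (V' * h_prob - v_recon' * h_recon) / num_patterns;
      end
    end

    function y = sigmoid(x)
      y = 1 ./ (1 + exp(-x));
    end

A supervised fine-tuning stage (e.g. backpropagation through the stacked layers with a label layer on top) would then adjust the pre-trained weights, as described in the thesis.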

=== Topics Covered ===

- Linear Regression
- Polynomial Regression
- Logistic Regression
- Simple Perceptron (Binary Classification)
- Multilayer Perceptron (MLP)
- Self-Organizing Map (SOM)
- Hopfield Network
- Boltzmann Machines
- Restricted Boltzmann Machines
- Monte Carlo Simulation (a Metropolis-Hastings sketch follows this list)
	- Monte Carlo Integration
	- Metropolis-Hastings	(Random-Walk Markov Chain Monte Carlo)
	- Gibbs Sampling	(Markov Chain Monte Carlo)
	- Simulated Annealing	(Random-Walk Markov Chain Monte Carlo)
- Deep Belief Networks (MNIST)
	- Greedy Layer-Wise Unsupervised Pre-Training
	- Linear Mapping
	- Fine Tuning
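As noted in the Monte Carlo items above, the sampling techniques are central to the Boltzmann Machine derivation. The sketch below is a minimal random-walk Metropolis-Hastings sampler in MATLAB; the 1-D target density, the proposal step size, and the number of samples are illustrative assumptions, not settings from the thesis.

    % Minimal sketch: random-walk Metropolis-Hastings for an unnormalized 1-D target p(x).
    p = @(x) exp(-0.5 * x.^2) .* (1 + 0.5 * sin(3 * x)).^2;   % any non-negative target
    num_samples = 10000;
    step = 0.8;                              % std of the Gaussian random-walk proposal
    samples = zeros(num_samples, 1);
    x = 0;                                   % initial state of the Markov chain
    for t = 1:num_samples
        x_prop = x + step * randn();         % propose a move around the current state
        alpha  = min(1, p(x_prop) / p(x));   % acceptance probability (symmetric proposal)
        if rand() < alpha
            x = x_prop;                      % accept the proposal
        end                                  % otherwise stay at the current state
        samples(t) = x;
    end
    % hist(samples, 50) approximates the (normalized) target density.

Gibbs sampling replaces the accept/reject step with exact draws from each conditional distribution, and simulated annealing gradually lowers a temperature parameter in the acceptance rule; both are developed in the corresponding thesis chapters.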

DBN M.Sc. Dissertation Thesis:

http://nemertes.lis.upatras.gr/jspui/bitstream/10889/9473/6/Monahopoulos(phys).pdf
