# Acharya-MachineLearning

This repo contains contributions to Project Acharya, an Open Smart Education Initiative. Submissions in the domain of Machine Learning are welcome. Your contributions will be made available on the Acharya E-Learning platform.

## How to make contributions

You can submit Jupyter notebooks that illustrate how machine learning algorithms work or how they are applied. Please check the list of topics to be covered; however, you do not have to restrict your submissions to that list. Make sure to include figures and videos in your notebook.

## Topics to be covered

Introduction to Machine Learning, Examples of machine learning applications - Learning associations, Classification, Regression, Unsupervised learning, Reinforcement learning. Supervised learning - Input representation, Hypothesis class, Version space, Vapnik-Chervonenkis (VC) dimension

Probably Approximately Correct (PAC) learning, Noise, Learning multiple classes, Model selection and generalization, Dimensionality reduction - Subset selection, Principal Component Analysis
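A contributed notebook for this module might, for example, demonstrate dimensionality reduction with a short code cell. The sketch below shows Principal Component Analysis from first principles; NumPy, the random toy dataset, and the choice of two components are illustrative assumptions, not requirements of the syllabus.

```python
# Minimal PCA sketch on toy data (assumed: NumPy, a random 200x5 dataset).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # toy data: 200 samples, 5 features

X_centered = X - X.mean(axis=0)          # PCA operates on mean-centered data
cov = np.cov(X_centered, rowvar=False)   # 5x5 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: the covariance matrix is symmetric

order = np.argsort(eigvals)[::-1]        # rank components by explained variance
components = eigvecs[:, order[:2]]       # keep the top-2 principal components
X_reduced = X_centered @ components      # project the data onto the components

print(X_reduced.shape)                   # (200, 2)
```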

Classification - Cross-validation and resampling methods - K-fold cross-validation, Bootstrapping, Measuring classifier performance - Precision, Recall, ROC curves. Bayes' theorem, Bayesian classifier, Maximum likelihood estimation, Density functions, Regression
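As an illustration of the kind of notebook cell this module could include, the sketch below runs K-fold cross-validation and reports precision, recall, and ROC AUC; scikit-learn, the synthetic dataset, and the logistic-regression classifier are illustrative assumptions only.

```python
# Minimal sketch: 5-fold cross-validation and classifier metrics
# (assumed: scikit-learn, a synthetic binary-classification dataset).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import precision_score, recall_score, roc_auc_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
clf = LogisticRegression(max_iter=1000)

# Every sample is predicted by a model that never saw it during training.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=cv)
y_score = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]

print("precision:", precision_score(y, y_pred))
print("recall:   ", recall_score(y, y_pred))
print("ROC AUC:  ", roc_auc_score(y, y_score))
```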

Decision Trees - Entropy, Information gain, Tree construction, ID3, Issues in decision tree learning - Avoiding overfitting, Reduced-error pruning, The problem of missing attributes, Gain ratio, Classification and Regression Trees (CART), Neural Networks - The Perceptron, Activation functions, Training feed-forward networks by backpropagation.
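A notebook on this module might illustrate the entropy and information-gain computation that drives ID3-style tree construction. The sketch below is a minimal example; NumPy and the tiny "play tennis"-style dataset are illustrative assumptions.

```python
# Minimal sketch of entropy and information gain for ID3-style splitting
# (assumed: NumPy, a tiny hand-made categorical dataset).
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    """Reduction in entropy from splitting the labels on a discrete feature."""
    total = entropy(labels)
    weighted = 0.0
    for value in np.unique(feature):
        mask = feature == value
        weighted += mask.mean() * entropy(labels[mask])
    return total - weighted

outlook = np.array(["sunny", "sunny", "overcast", "rain", "rain", "overcast"])
play    = np.array(["no",    "no",    "yes",      "yes",  "no",   "yes"])
print(information_gain(outlook, play))
```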

Kernel Machines - Support Vector Machine - Optimal separating hyperplane, Soft-margin hyperplane, Kernel trick, Kernel functions. Discrete Markov processes, Hidden Markov Models, Three basic problems of HMMs - Evaluation problem, Finding the state sequence, Learning model parameters. Combining multiple learners, Ways to achieve diversity, Model combination schemes, Voting, Bagging, Boosting
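For the kernel-machines part of this module, a notebook cell like the one below could show a soft-margin SVM with the RBF kernel; scikit-learn, the moons dataset, and the C / gamma values are illustrative assumptions, not prescribed by the syllabus.

```python
# Minimal sketch of a soft-margin SVM with the RBF kernel
# (assumed: scikit-learn, the two-moons toy dataset).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C controls the softness of the margin; the RBF kernel applies the
# kernel trick, so no explicit feature map is ever computed.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```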

Unsupervised Learning - Clustering methods - K-means, Expectation-Maximization algorithm, Hierarchical clustering methods, Density-based clustering
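A notebook covering clustering might include a short K-means cell such as the sketch below; scikit-learn, the blob dataset, and the choice of three clusters are illustrative assumptions.

```python
# Minimal K-means sketch (assumed: scikit-learn, synthetic blob data).
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# K-means alternates between assigning points to their nearest centroid and
# recomputing each centroid as the mean of its assigned points.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)
print(kmeans.cluster_centers_)
```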