Machine Learning example project using images of hand gestures.


Hand Recognition

Filipe Borba

Franklin W. Olin College of Engineering

Data Science, Prof. Allen Downey


Disclaimer

This project is not maintained and may contain conceptual errors. Feel free to use it and contribute, but I'm not responsible for any damage if you just copy things from here. Also, I highly recommend taking a look at this repo https://github.com/ageron/handson-ml2 before starting your own Machine Learning project.


Machine Learning problems and algorithms have been getting more and more attention over the years. Not only is Machine Learning very useful for a variety of real-life problems, it is also very efficient at automating processes that use data. It is commonly used for tasks such as classification, recognition, detection, and prediction. The basic idea is to use data to produce a model capable of returning an output: given a new input, the model should give the right answer, or produce predictions consistent with the known data.
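As a minimal illustration of that idea (using data to produce a model, then asking the model for an output on a new input), here is a sketch with NumPy; the training data is synthetic and only for demonstration:

```python
import numpy as np

# Synthetic training data: outputs follow y = 2x + 1.
x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = 2.0 * x_train + 1.0

# "Training" produces a model (here, just the slope and intercept of a line).
slope, intercept = np.polyfit(x_train, y_train, deg=1)

# The model can now return an output for a new, unseen input.
x_new = 10.0
y_pred = slope * x_new + intercept
print(round(y_pred, 2))  # close to 21.0
```

A deep neural network follows the same pattern, just with far more parameters than a slope and an intercept.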

The goal of this project is to train a Machine Learning algorithm capable of classifying images of different hand gestures, such as a fist, a palm, or a thumbs up. Along the way, I'll learn more about this field and create my own program that fits the data I have. The method I'll be using is Deep Learning.

Deep Learning is part of a broader family of Machine Learning methods. It is based on layers that process the input data, extracting features from them and producing a mathematical model. How this 'model' is created will become clearer in the next section. In this specific project, we'll be aiming to classify different images of hand gestures, which means that the computer will have to "learn" the features of each gesture and classify them correctly. For example, given an image of a hand doing a thumbs-up gesture, the model's output needs to be "the hand is doing a thumbs-up gesture".
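To make the layer idea concrete, here is a hedged NumPy-only sketch of how stacked layers turn a flattened image into class probabilities. The gesture labels, layer sizes, and weights here are all hypothetical (the weights are random, i.e. untrained); a real model would learn the weights from the dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gesture classes; the actual dataset's labels may differ.
classes = ["palm", "fist", "thumb"]

def relu(x):
    # Non-linearity applied between layers.
    return np.maximum(0.0, x)

def softmax(x):
    # Turn raw scores into probabilities that sum to 1.
    e = np.exp(x - x.max())
    return e / e.sum()

# A flattened 8x8 grayscale "image" stands in for a real hand-gesture photo.
image = rng.random(64)

# Two layers with random (untrained) weights: each layer transforms its
# input into a new representation; training would adjust these weights.
w1, b1 = rng.standard_normal((16, 64)), np.zeros(16)
w2, b2 = rng.standard_normal((3, 16)), np.zeros(3)

hidden = relu(w1 @ image + b1)       # feature extraction
probs = softmax(w2 @ hidden + b2)    # one probability per gesture class

prediction = classes[int(np.argmax(probs))]
print(prediction, probs.round(3))
```

With random weights the prediction is meaningless; training consists of adjusting w1, b1, w2, b2 so that, for labeled images, the highest probability lands on the correct gesture.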

Note: this project was developed using the Google Colab environment.
