This repository is responsible for the machine learning model. We currently use two models: a binary classifier and a multi-label classifier. The binary classifier handles the `Real_Player` & `Unkown_bot` classification; if the `Real_Player` prediction is less than 50%, the predictions of the multi-label classifier are used instead.
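The decision rule can be summarised with a small sketch like the one below. Everything in it (the function name, the scikit-learn-style `predict_proba` interface, the `bot_labels` list) is illustrative rather than the repository's actual code; it only reflects the 50% threshold described above.

```python
def predict_account(features, binary_clf, multi_clf, bot_labels):
    """Two-stage prediction sketch: check Real_Player first, fall back to
    the multi-label classifier when the Real_Player score is below 50%."""
    # Probability that this account is a real player (assumes a fitted,
    # scikit-learn-style binary classifier with Real_Player as the
    # positive class).
    real_player_proba = binary_clf.predict_proba([features])[0][1]

    if real_player_proba >= 0.5:
        return {"label": "Real_Player", "confidence": float(real_player_proba)}

    # Below 50%: let the multi-label classifier decide which bot category
    # fits best (simplified here to the single most likely category).
    bot_probas = multi_clf.predict_proba([features])[0]
    best = int(bot_probas.argmax())
    return {"label": bot_labels[best], "confidence": float(bot_probas[best])}
```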
- Docker
- GitHub Desktop
- Git for Windows (Git for Unix will also work).
- An integrated development environment (IDE). We recommend VSCode, but any IDE will work.
- Open a terminal (`cmd`).
- Navigate (`cd`) to where you want to save our code.
- The command below will create a folder `bot-detector` with two sub folders, `remote` & `local`, and download the remote repositories into the `remote` folder.
- To add the repositories in GitHub Desktop, select `File` on the top left, then click `Add local repository`, and navigate to the cloned repositories.
Windows:

```
mkdir bot-detector\remote bot-detector\local && cd bot-detector\remote
git clone https://github.com/Bot-detector/Bot-Detector-Core-Files.git
git clone https://github.com/Bot-detector/bot-detector-mysql.git
git clone https://github.com/Bot-detector/bot-detector-ML.git
```
Linux:

```
mkdir -p bot-detector/{remote,local} && cd bot-detector/remote
git clone https://github.com/Bot-detector/Bot-Detector-Core-Files.git
git clone https://github.com/Bot-detector/bot-detector-mysql.git
git clone https://github.com/Bot-detector/bot-detector-ML.git
```
- Now you can start the project. The command below will create the necessary Docker containers; the first time this might take a couple of minutes. Make sure Docker Desktop is running!

```
cd Bot-Detector-Core-Files
docker-compose up -d
```
- Once you see `/usr/sbin/mysqld: ready for connections.` in the logs (`docker-compose logs -f` will show them), the database is ready.
- Test the APIs:
  - Core API: http://localhost:5000/
  - Machine learning: http://localhost:8000/
  - Adding `/docs` to the end of either URL will return the Swagger documentation for that component.
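As a quick sanity check, a small script like the one below can confirm that both services are up by requesting their Swagger pages. This is only an illustrative sketch using the `requests` package (not part of the repositories); it assumes nothing beyond the ports and `/docs` paths mentioned above.

```python
import requests

# Ports taken from the steps above; /docs serves the Swagger documentation.
SERVICES = {
    "Core API": "http://localhost:5000/docs",
    "Machine learning API": "http://localhost:8000/docs",
}

for name, url in SERVICES.items():
    try:
        response = requests.get(url, timeout=5)
        print(f"{name}: HTTP {response.status_code} from {url}")
    except requests.ConnectionError:
        print(f"{name}: not reachable at {url} (are the containers running?)")
```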