Note that this is only tested on Ubuntu and probably does NOT run on Windows.
!!!Do NOT update the models repository here without testing it first!!!
Look at guides here.
Sources are included in the tf folder and can be installed by following this guide.
Install from sources and add --config=mkl to the bazel build.
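The full source build with MKL might look like the sketch below, assuming a checked-out TensorFlow source tree (TF 1.x layout) that has already been run through ./configure; the /tmp output directory is an arbitrary choice:

```shell
# Build a pip package from source with MKL enabled.
bazel build --config=mkl --config=opt //tensorflow/tools/pip_package:build_pip_package
# Assemble the wheel into /tmp/tensorflow_pkg and install it.
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
sudo pip install /tmp/tensorflow_pkg/tensorflow-*.whl
```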
Please see tensorflow.org for more information. MKL can be installed by adding
deb https://apt.repos.intel.com/mkl all main
to /etc/apt/sources.list.d/intelproducts.list
(Note that /intelpython, /ipp, /tbb, and /daal also exist on apt.repos.intel.com, although /intelpython is down as of April 16th, 2018; see intel.com)
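Putting the repository setup together, a sketch of the apt steps (the package name is an assumption; the repository's GPG key must also be added first, and its URL is deliberately not guessed here):

```shell
# Add the Intel MKL apt repository entry.
# NOTE: fetch and apt-key add the current GPG key from Intel's apt
# repository documentation before running apt-get update.
echo 'deb https://apt.repos.intel.com/mkl all main' | \
  sudo tee /etc/apt/sources.list.d/intelproducts.list
sudo apt-get update
# Package name below is an assumption; check `apt-cache search intel-mkl`.
sudo apt-get install intel-mkl
```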
- run
sudo apt-get install protobuf-compiler python-pil python-lxml python-tk
sudo pip install matplotlib jupyter
- run
protoc models/research/object_detection/protos/*.proto --python_out=./models/research
- run the following (or add the equivalent absolute paths to your .bashrc)
cd models/research;export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim;cd ../..
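If you go the .bashrc route, the lines might look like this sketch, where MODELS_DIR is a placeholder to point at your own clone of the models repository:

```shell
# Sketch of equivalent absolute-path exports for ~/.bashrc.
# MODELS_DIR is a placeholder; set it to where the models repo is checked out.
MODELS_DIR="$PWD/models"
export PYTHONPATH="$PYTHONPATH:$MODELS_DIR/research:$MODELS_DIR/research/slim"
```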
- test by running
python models/research/object_detection/builders/model_builder_test.py
Just run npm start inside the LabelMeAnnotationTool folder.
This starts a webserver on localhost:8080.
python make_model/make_model.py
python models/research/object_detection/train.py --train_dir=./temp --pipeline_config_path=`pwd`/make_model/embedded_ssd_mobilenet_v1_coco.config
tensorboard --logdir=./temp
Note that to see this, open a browser to localhost:6006.
python models/research/object_detection/eval.py --eval_dir=./tempEval --pipeline_config_path=`pwd`/make_model/embedded_ssd_mobilenet_v1_coco.config