This repository has been archived by the owner on Jul 17, 2024. It is now read-only.

Merged PR 533: Merge review/fixesForV2InitialRelease to master
Fix broken links
Byron Changuion authored and cjacobs committed Oct 2, 2017
1 parent 111bd25 commit 6408745
Showing 8 changed files with 11 additions and 10 deletions.
2 changes: 1 addition & 1 deletion INSTALL-Windows.md
@@ -18,7 +18,7 @@ ELL requires a C++ compiler. On Windows, we support *Visual Studio 2015 update 3

ELL uses the [*CMake*](https://cmake.org/) build system, version 3.8 or newer. Download and install it from <https://cmake.org/download/>.

- ### LLVM 3.9, SWIG 3.0.12, OpenBlas, and Doxygen via NuGet
+ ### LLVM 3.9, SWIG 3.0.12, OpenBLAS, and Doxygen via NuGet

ELL depends on the [*LLVM*](http://llvm.org/) compiler framework, version 3.9 or newer.

4 changes: 2 additions & 2 deletions docs/index.md
@@ -16,13 +16,13 @@ While the goal of ELL is to deploy software onto resource constrained platforms
## Installation and Setup
Install ELL on a
[Windows](https://github.com/Microsoft/ELL/blob/master/INSTALL-Windows.md), [Ubuntu Linux](https://github.com/Microsoft/ELL/blob/master/INSTALL-Ubuntu.md), or [Mac](https://github.com/Microsoft/ELL/blob/master/INSTALL-Mac.md)
- laptop or desktop computer. If you intend to deploy models onto a Raspberry Pi, follow our instructions on [setting up the Raspberry Pi](/ELL/tutorials/Setting-Up-your-Raspberry-Pi).
+ laptop or desktop computer. If you intend to deploy models onto a Raspberry Pi, follow our instructions on [setting up the Raspberry Pi](/ELL/tutorials/Setting-up-your-Raspberry-Pi).

## Getting Started
A great place to start is our [tutorials section](/ELL/tutorials). As we develop and release new functionality in ELL, we publish new tutorials that showcase that functionality. Currently, our tutorials are focused on simple embedded computer vision tasks on Raspberry Pi, but we expect the scope to grow with time. Have fun!

## The ELL Gallery
- Our [gallery](/ELL/gallery) is a collection of bits and pieces that you can download and use in your projects. Currently, the gallery includes a handful of pre-trained computer vision models and instructions for 3D printing an [active cooling attachment](/ELL/gallery/RPi-cooling) for your Raspberry Pi.
+ Our [gallery](/ELL/gallery) is a collection of bits and pieces that you can download and use in your projects. Currently, the gallery includes a handful of pre-trained computer vision models and instructions for 3D printing an [active cooling attachment](/ELL/gallery/Raspberry-Pi-3-Fan-Mount) for your Raspberry Pi.

## License
The ELL code and sample code in our tutorials are released under the [MIT Open Source License](https://github.com/Microsoft/ELL/blob/master/LICENSE.txt). Some of the other content on this website, such as the 3D models in our gallery, are released under the [Creative Commons Attribution 4.0 license](https://creativecommons.org/licenses/by/4.0/).
@@ -16,7 +16,7 @@ In this tutorial, we will download two models from the [ELL gallery](/ELL/galler
* Laptop or desktop computer (Windows, Linux, or Mac)
* Raspberry Pi
* Raspberry Pi camera or USB webcam
- * *optional* - Active cooling attachment (see our [tutorial on cooling your Pi](/ELL/tutorials/Active-Cooling-your-Raspberry-Pi-3/))
+ * *optional* - Active cooling attachment (see our [tutorial on cooling your Pi](/ELL/tutorials/Active-cooling-your-Raspberry-Pi-3/))

#### Prerequisites

@@ -292,7 +292,7 @@ source activate py34
python sideBySide.py
```

- If you have a camera and display connected to your Pi, you should see a window similar to the screenshot at the top of this page. Point your camera at different objects and see how the model classifies them. If you downloaded the full source for [tutorial.py](/ELL/tutorials/shared/tutorial.py), you will also see the average time in milliseconds it takes each model to process a frame. Try to get a sense of the relative accuracy and speed of each model.
+ If you have a camera and display connected to your Pi, you should see a window similar to the screenshot at the top of this page. Point your camera at different objects and see how the model classifies them. If you downloaded the full source for [sideBySide.py](/ELL/tutorials/Comparing-Image-Classification-models-side-by-side-on-the-Raspberry-Pi/sideBySide.py), you will also see the average time in milliseconds it takes each model to process a frame. Try to get a sense of the relative accuracy and speed of each model.
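The per-frame average time mentioned above can be approximated with a simple running-average helper. This is an illustrative sketch only; `FrameTimer` and its methods are hypothetical names, not part of ELL's API or the tutorial scripts:

```python
import time

class FrameTimer:
    """Tracks the average time, in milliseconds, a model takes per frame."""

    def __init__(self):
        self.total_ms = 0.0
        self.frames = 0

    def time_frame(self, fn, *args):
        """Run fn(*args), record its elapsed time, and return its result."""
        start = time.time()
        result = fn(*args)
        self.total_ms += (time.time() - start) * 1000.0
        self.frames += 1
        return result

    def average_ms(self):
        """Average milliseconds per frame so far (0.0 before any frames)."""
        return self.total_ms / self.frames if self.frames else 0.0
```

In a capture loop, each model's `predict` call would be wrapped in `time_frame`, and `average_ms()` printed on the frame overlay.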

## Troubleshooting

@@ -22,7 +22,7 @@ In this tutorial, we will download a pretrained image classifier from the [ELL g

* Install ELL on your computer ([Windows](https://github.com/Microsoft/ELL/blob/master/INSTALL-Windows.md), [Ubuntu Linux](https://github.com/Microsoft/ELL/blob/master/INSTALL-Ubuntu.md), [Mac](https://github.com/Microsoft/ELL/blob/master/INSTALL-Mac.md)). Specifically, this tutorial requires ELL, CMake, and Python 3.6. Note that Python is required to run a tool named `wrap.py`, which makes compilation easy. If you prefer not to use `wrap.py`, you can perform the compilation steps manually, as described in the [wrap tool documentation](https://github.com/Microsoft/ELL/blob/master/tools/wrap/README.md).
* Follow the instructions for [setting up your Raspberry Pi](/ELL/tutorials/Setting-up-your-Raspberry-Pi).
- * *optional* - Read through the instructions in [Getting Started with Image Classification on the Raspberry Pi](/ELL/tutorials/Getting-Started-with-Image-Classification-on-the-Raspberry-Pi/).
+ * *optional* - Read through the instructions in [Getting started with image classification on the Raspberry Pi](/ELL/tutorials/Getting-started-with-image-classification-on-the-Raspberry-Pi/).

## Step 1: Create a tutorial directory

@@ -118,6 +118,7 @@ First, import a few dependencies and add directories to our path, to allow Pytho
```python
import sys
import os
+ import cv2
import numpy as np

scriptPath = os.path.dirname(os.path.abspath(__file__))
2 changes: 1 addition & 1 deletion docs/tutorials/Importing-models/index.md
@@ -148,7 +148,7 @@ curl --location -o categories.txt https://raw.githubusercontent.com/pjreddie/dar

## Using the model

- Once the model is in the ELL format, it no longer matters whether it came from CNTK or Darknet, and the only difference is the categories file. To test the model from Python, follow the steps in [Getting started with image classification on the Raspberry Pi](/ELL/tutorials/Getting-Started-with-Image-Classification-on-the-Raspberry-Pi/), but replace the model suggested in that tutorial with the model you just imported. Alternatively, to test the model from C++, follow the steps in [Getting started with image classification in C++](/ELL/tutorials/Getting-Started-with-Image-Classification-in-Cpp/).
+ Once the model is in the ELL format, it no longer matters whether it came from CNTK or Darknet, and the only difference is the categories file. To test the model from Python, follow the steps in [Getting started with image classification on the Raspberry Pi](/ELL/tutorials/Getting-started-with-image-classification-on-the-Raspberry-Pi/), but replace the model suggested in that tutorial with the model you just imported. Alternatively, to test the model from C++, follow the steps in [Getting started with image classification on the Raspberry Pi in C++](/ELL/tutorials/Getting-started-with-image-classification-in-cpp/).

## Troubleshooting

2 changes: 1 addition & 1 deletion docs/tutorials/Setting-Up-your-Raspberry-Pi.md
@@ -19,7 +19,7 @@ Most of our tutorials follow a common workflow. The first steps involve authorin
AI workloads guzzle power, so we recommend using a high quality USB power adapter and micro USB cable. An adapter rated for 12 Watts (2.4 Amps) per USB port works best. We've had a good experience with 12W-per-port USB power adapters from Apple, Anker, and Amazon Basics. Long and thin cables will often result in a noticeable voltage drop and fail to provide sufficient power to your Raspberry Pi. Generic unbranded cables are hit-and-miss. For a few extra dollars you can get a nice name-brand cable, like the Anker PowerLine, and save yourself a lot of frustration.

### Operating System
- Our tutorials assume that the operating system running on your Pi is [*Raspbian Jessie*](https://downloads.raspberrypi.org/raspbian/images/raspbian-2017-07-05/2017-07-05-raspbian-jessie.zip), not the more recent *Raspian Stretch*.
+ Our tutorials assume that the operating system running on your Pi is *Raspbian Jessie* ([NOOBS](https://downloads.raspberrypi.org/NOOBS/images/NOOBS-2017-07-05/) or [image](https://downloads.raspberrypi.org/raspbian/images/raspbian-2017-07-05/)), not the more recent *Raspbian Stretch*.

### CMake
We use `CMake` on the Raspberry Pi to create Python modules that can be called from our tutorial code. To install `CMake` on your Pi, connect to the network, open a terminal window, and type
4 changes: 2 additions & 2 deletions tools/utilities/pythonlibs/demoHelper.py
@@ -256,12 +256,12 @@ def report_times(self, node_level=True):
if hasattr(self.compiled_module, self.model_name + "_PrintNodeProfilingInfo"):
getattr(self.compiled_module, self.model_name + "_PrintNodeProfilingInfo")()

- def get_top_n_predictions(self, predictions, N = 5, threshold = 0.20):
+ def get_top_n_predictions(self, predictions, N = 5):
"""Return at most the top N predictions as a list of tuples that meet the threshold.
The first element of each tuple represents the index or class of the prediction and the second
element represents the probability or confidence value.
"""
- map = [(i,predictions[i]) for i in range(len(predictions)) if predictions[i] >= threshold]
+ map = [(i,predictions[i]) for i in range(len(predictions)) if predictions[i] >= self.threshold]
map.sort(key=lambda tup: tup[1], reverse=True)
result = map[:N]
return result
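The change above moves the confidence cutoff from a hardcoded `threshold` keyword argument to the instance's `self.threshold`, so callers configure it once when constructing the helper. A minimal sketch of the resulting behavior, using a hypothetical stand-in class rather than the actual class in `demoHelper.py`:

```python
class PredictionHelper:
    """Illustrative stand-in: threshold is set once per instance."""

    def __init__(self, threshold=0.20):
        self.threshold = threshold  # confidence cutoff used by all calls

    def get_top_n_predictions(self, predictions, N=5):
        """Return at most the top N (index, probability) tuples whose
        probability meets the instance threshold, sorted descending."""
        scored = [(i, p) for i, p in enumerate(predictions) if p >= self.threshold]
        scored.sort(key=lambda tup: tup[1], reverse=True)
        return scored[:N]
```

For example, `PredictionHelper(threshold=0.5).get_top_n_predictions([0.1, 0.7, 0.6])` returns `[(1, 0.7), (2, 0.6)]`: the 0.1 entry is filtered out and the rest are ordered by confidence.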
