This is sample source code for a Sushi Detection demo.
This demo uses the following devices.
- Nvidia Jetson Nano
- Robot Hand
- Servo Motor
- Arduino Uno Rev3
- USB Web camera
This demo has two devices connected to Greengrass. In this step you need to create three certificates, one for each of the following:
- Greengrass Core
- Robot Controller
- Rail Controller
Follow these steps to create the certificates. Don't forget to activate each certificate and download the generated certificate, private key, and public key.
Note the ARN of each certificate on its detail page. The ARNs are used in the next step.
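If you prefer to script this step, the certificates can also be created and activated with boto3. This is only a sketch; the output file names are illustrative, not files expected by the rest of the demo.

```python
# Sketch: create and activate one certificate per device with boto3.
# Output file names are illustrative; keep the printed ARNs for the CloudFormation parameters.
import boto3

iot = boto3.client("iot")

for device in ["GreengrassCore", "RobotController", "RailController"]:
    cert = iot.create_keys_and_certificate(setAsActive=True)
    print(device, "certificate ARN:", cert["certificateArn"])
    with open(f"{device}-certificate.pem.crt", "w") as f:
        f.write(cert["certificatePem"])
    with open(f"{device}-private.pem.key", "w") as f:
        f.write(cert["keyPair"]["PrivateKey"])
    with open(f"{device}-public.pem.key", "w") as f:
        f.write(cert["keyPair"]["PublicKey"])
```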
Download the Lambda function zip file and the machine learning model from the following links and upload them to your S3 bucket. The bucket must be in the same region as the CloudFormation stack you create in the next step.
- Lambda Function
- ML Model
- Sign in to the AWS Management Console and open the AWS CloudFormation console at https://console.aws.amazon.com/cloudformation.
- If this is a new AWS CloudFormation account, click Create New Stack. Otherwise, click Create Stack. If Create Stack is a drop-down list, choose With new resources (standard).
- In the Template section, select Template is ready.
- In the Specify template section, select Upload a template file and upload demo_CFn.yml.
- Click Next.
- Enter a stack name.
- Fill in the parameters:
parameter | value |
---|---|
GreengrassCoreCertificateARN | certificate ARN created in the previous step |
RailControllerCertificateARN | certificate ARN created in the previous step |
RobotControllerCertificateARN | certificate ARN created in the previous step |
GreengrassGroupName | your Greengrass group name |
InferenceLambdaSourceKey | S3 object key of the Lambda function zip |
LambdaSourceBucket | S3 bucket that holds the Lambda function zip |
MLModelURI | S3 URL of the ML model |
- Click Next, then click Next again.
- Review the information for the stack. When you're satisfied with the settings, check I acknowledge that AWS CloudFormation might create IAM resources, and click Create.
- It takes a few minutes to finish creating the Greengrass resources.
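If you prefer to create the stack from a script rather than the console, a minimal boto3 sketch follows. All parameter values below are placeholders; use the ARNs, bucket, key, and model URL from the earlier steps.

```python
# Sketch: create the CloudFormation stack programmatically (values are placeholders).
import boto3

cfn = boto3.client("cloudformation")

with open("demo_CFn.yml") as f:
    template_body = f.read()

cfn.create_stack(
    StackName="sushi-demo",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_IAM"],  # the template may create IAM resources
    Parameters=[
        {"ParameterKey": "GreengrassCoreCertificateARN", "ParameterValue": "arn:aws:iot:...:cert/..."},
        {"ParameterKey": "RailControllerCertificateARN", "ParameterValue": "arn:aws:iot:...:cert/..."},
        {"ParameterKey": "RobotControllerCertificateARN", "ParameterValue": "arn:aws:iot:...:cert/..."},
        {"ParameterKey": "GreengrassGroupName", "ParameterValue": "sushi_demo"},
        {"ParameterKey": "InferenceLambdaSourceKey", "ParameterValue": "lambda/inference.zip"},
        {"ParameterKey": "LambdaSourceBucket", "ParameterValue": "your-bucket"},
        {"ParameterKey": "MLModelURI", "ParameterValue": "https://your-bucket.s3.amazonaws.com/model.tar.gz"},
    ],
)
cfn.get_waiter("stack_create_complete").wait(StackName="sushi-demo")
```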
This demo works with Greengrass version 1.9.x.
- Download the AWS IoT Greengrass Core software from here. Choose Armv8 (AArch64), Ubuntu 18.04 for the Nvidia Jetson Nano.
Follow the steps on the official setup page:
https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit#write
After writing the image to the SD card, boot your Jetson Nano and follow Setup and First Boot:
https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit#setup
Use a terminal on the Jetson Nano to install the required software.
sudo apt-get install -y git build-essential libatlas-base-dev libopencv-dev graphviz vim curl
Use a terminal on the Jetson Nano to install the DLR runtime.
curl https://s3-us-west-2.amazonaws.com/neo-ai-dlr-release/v1.0/jetsonnano-aarch64-cu10-ubuntu18_04-glibc2_27-libstdcpp3_4/dlr-1.0-py2.py3-none-any.whl -o dlr-1.0-py2.py3-none-any.whl
pip install dlr-1.0-py2.py3-none-any.whl
sudo adduser --system ggc_user
sudo groupadd --system ggc_group
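To confirm that the DLR runtime installed above works, you can run a quick check from Python. This is only a sketch; the model directory, device type, and input shape are placeholders for the compiled model deployed later.

```python
# Sketch: load a Neo-compiled model with DLR and run one inference (paths and shape are placeholders).
import numpy as np
from dlr import DLRModel

model = DLRModel("/path/to/compiled-model-dir", dev_type="gpu")  # Jetson Nano GPU
dummy = np.random.rand(1, 3, 224, 224).astype("float32")        # shape assumed for the image classifier
out = model.run({"data": dummy})                                  # "data" is the usual MXNet input name
print("output shape:", np.asarray(out[0]).shape)
```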
- Copy the Greengrass Core software to the Jetson Nano.
- Copy the certificate and private key for the Greengrass Core to the Jetson Nano.
- Extract Greengrass
sudo tar -zxvf greengrass-linux-aarch64-1.9.4.tar.gz -C /
- Copy the certificate and private key to the Greengrass directory
sudo cp /path/to/your/certificate /greengrass/certs/
- Download Root CA certificate
sudo curl https://www.amazontrust.com/repository/AmazonRootCA1.pem -o /greengrass/certs/root.ca.pem
- Edit the Greengrass config
Replace each placeholder with values for your environment.
part | value |
---|---|
[ROOT_CA_PEM_HERE] | root.ca.pem |
[CLOUD_PEM_CRT_HERE] | your certificate file name. ex) abcd1234-certificate.pem.crt |
[CLOUD_PEM_KEY_HERE] | your private key file name. ex) afcf52c6b2-private.pem.key |
[THING_ARN_HERE] | Greengrass Core thing ARN. ex) arn:aws:iot:us-east-1:1234567890:thing/sushi_Core |
coreThing.iotHost | your AWS IoT endpoint. Check it in the AWS IoT console -> Settings -> Custom endpoint |
coreThing.ggHost | replace [AWS_REGION_HERE] with your region |
runtime.cgroup.useSystemd | `yes` or `no`. The Jetson Nano image runs systemd, so use `yes` |
sudo vim /greengrass/config/config.json
{
"coreThing": {
"caPath": "[ROOT_CA_PEM_HERE]",
"certPath": "[CLOUD_PEM_CRT_HERE]",
"keyPath": "[CLOUD_PEM_KEY_HERE]",
"thingArn": "[THING_ARN_HERE]",
"iotHost": "[HOST_PREFIX_HERE]-ats.iot.[AWS_REGION_HERE].amazonaws.com",
"ggHost": "greengrass-ats.iot.[AWS_REGION_HERE].amazonaws.com"
},
"runtime": {
"cgroup": {
"useSystemd": "[yes|no]"
}
},
"managedRespawn": false,
"crypto": {
"caPath" : "file://certs/[ROOT_CA_PEM_HERE]",
"principals": {
"IoTCertificate": {
"privateKeyPath": "file://certs/[CLOUD_PEM_KEY_HERE]",
"certificatePath": "file://certs/[CLOUD_PEM_CRT_HERE]"
},
"SecretsManager": {
"privateKeyPath": "file://certs/[CLOUD_PEM_KEY_HERE]"
}
}
}
}
Connect the USB camera before starting the Greengrass daemon.
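If you want to verify the camera before starting the daemon, a quick check with OpenCV follows (device index 0 is an assumption):

```python
# Sketch: confirm the USB camera is visible (device index 0 is an assumption).
import cv2

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
print("camera opened:", cap.isOpened(), "frame captured:", ok,
      "size:", None if not ok else frame.shape)
cap.release()
```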
Start Greengrass
sudo /greengrass/ggc/core/greengrassd start
Check the Greengrass log.
sudo tail -F /greengrass/ggc/var/log/system/runtime.log
- Copy Greengrass/greengrass.service.txt to /etc/systemd/system/greengrass.service
- Enable service
sudo systemctl enable greengrass.service
- Start service
sudo systemctl start greengrass.service
See here to deploy Greengrass to your device.
The source code of the deployed Lambda function can be found in the LambdaFunction directory of this repo.
This demo uses a Raspberry Pi 3+ to control the servo motor and communicate with the Arduino Uno.
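The actual servo logic lives in the controller sources in this repo. Purely as an illustration (the pin number, library, and duty-cycle values below are assumptions, not the demo's settings), driving a hobby servo from a Raspberry Pi with RPi.GPIO typically looks like this:

```python
# Illustration only: software PWM servo control with RPi.GPIO.
# GPIO pin 18 and the duty-cycle values are assumptions, not the demo's actual settings.
import time
import RPi.GPIO as GPIO

SERVO_PIN = 18                  # assumed BCM pin
GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)

pwm = GPIO.PWM(SERVO_PIN, 50)   # 50 Hz is the standard servo frame rate
pwm.start(7.5)                  # roughly center position
time.sleep(1)
pwm.ChangeDutyCycle(5.0)        # rotate toward one end
time.sleep(1)
pwm.stop()
GPIO.cleanup()
```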
- Install software
sudo pip install AWSIoTPythonSDK
- Rename appconfig_template.json to appconfig.json
- Edit appconfig.json for your environment. These values are used to connect to AWS IoT; see the connection sketch after these setup steps.
attribute | value |
---|---|
IOT_ENDPOINT | your AWS IoT endpoint |
CERT_FILE | your certificate file name. ex) abcd1234-certificate.pem.crt |
PRIVATE_KEY_FILE | your private key file name. ex) afcf52c6b2-private.pem.key |
THING_NAME | your RailController thing name |
- Copy all demo source in this repo's /RailController directory to /home/pi/RailController on the Raspberry Pi
- SSH into the Raspberry Pi
- Correct the WorkingDirectory and ExecStart paths in railcontroller.service.txt
- Copy railcontroller.service.txt to /etc/systemd/system/railcontroller.service
- Enable service
sudo systemctl enable railcontroller.service
- Start service
sudo systemctl start railcontroller.service
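For reference, the appconfig.json values above are what the controller needs to connect to AWS IoT via AWSIoTPythonSDK. A minimal connection sketch follows; the topic name and root CA file name are assumptions, and the real logic is in the RailController source.

```python
# Sketch: connect to AWS IoT Core using the values from appconfig.json.
# Topic name and root CA file name are assumptions; see the RailController source for the real logic.
import json
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

with open("appconfig.json") as f:
    cfg = json.load(f)

client = AWSIoTMQTTClient(cfg["THING_NAME"])
client.configureEndpoint(cfg["IOT_ENDPOINT"], 8883)
client.configureCredentials("root.ca.pem", cfg["PRIVATE_KEY_FILE"], cfg["CERT_FILE"])
client.connect()
client.subscribe("sushi/rail", 1, lambda client, userdata, msg: print(msg.topic, msg.payload))
```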
Use the Arduino IDE to compile and upload to the Arduino Uno.
- Open arduino/arduino_robot.ino with the Arduino IDE
- Compile and upload to the Arduino Uno
- Rename appconfig_template.json to appconfig.json
- Edit appconfig.json for your environment (same format as the RailController configuration above)
attribute | value |
---|---|
IOT_ENDPOINT | your AWS IoT endpoint |
CERT_FILE | your certificate file name. ex) abcd1234-certificate.pem.crt |
PRIVATE_KEY_FILE | your private key file name. ex) afcf52c6b2-private.pem.key |
THING_NAME | your RobotController thing name |
- Copy all demo source in this repo's /RobotController directory to /home/pi/RobotController on the Raspberry Pi
- SSH into the Raspberry Pi and change directory to /home/pi/RobotController
- Correct the WorkingDirectory and ExecStart paths in robotcontroller.service.txt
- Copy robotcontroller.service.txt to /etc/systemd/system/robotcontroller.service
- Enable service
sudo systemctl enable robotcontroller.service
- Start service
sudo systemctl start robotcontroller.service
This demo uses the SageMaker built-in Image Classification algorithm to create the model. The trained model is compiled with SageMaker Neo.
- About built-in Image Classification Algorithm
- About compiling with SageMaker Neo
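For reference, compiling a trained model for the Jetson Nano with Neo can also be started from boto3. This is only a sketch; the job name, role ARN, S3 paths, and input shape are placeholders.

```python
# Sketch: compile the trained image-classification model for Jetson Nano with SageMaker Neo.
# Job name, role ARN, S3 paths, and input shape are placeholders.
import boto3

sm = boto3.client("sagemaker")
sm.create_compilation_job(
    CompilationJobName="sushi-classifier-neo",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerRole",
    InputConfig={
        "S3Uri": "s3://your-bucket/model/model.tar.gz",
        "DataInputConfig": '{"data": [1, 3, 224, 224]}',
        "Framework": "MXNET",  # the built-in Image Classification algorithm is MXNet-based
    },
    OutputConfig={
        "S3OutputLocation": "s3://your-bucket/compiled/",
        "TargetDevice": "jetson_nano",
    },
    StoppingCondition={"MaxRuntimeInSeconds": 900},
)
```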
This demo uses OpenCV to detect the sushi saucer, but you could use an object detection model instead. The SageMaker built-in Object Detection algorithm is not supported by SageMaker Neo (as of 2019/11). If you want to use object detection together with image classification, use MXNet.
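As an illustration of the OpenCV approach only (not the demo's actual code; every parameter value below is an assumption), a round saucer can be located with a Hough circle transform:

```python
# Illustration only: find a round saucer in a frame with a Hough circle transform.
# All parameter values are assumptions; the demo's detection logic is in the Lambda function source.
import cv2
import numpy as np

frame = cv2.imread("frame.jpg")                        # placeholder input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
gray = cv2.medianBlur(gray, 5)

circles = cv2.HoughCircles(
    gray, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
    param1=100, param2=40, minRadius=50, maxRadius=200,
)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        cv2.circle(frame, (x, y), r, (0, 255, 0), 2)   # crop around (x, y) to feed the classifier
    cv2.imwrite("detected.jpg", frame)
```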
- Jun Ichikawa [email protected]
- Tatsuhiro Iida [email protected]
This library is licensed under the Apache 2.0 License.