This recipe shows you how to read and write an entity in Cloud Datastore from a Cloud Function. Where applicable, replace [PROJECT-ID] with your Google Cloud Platform project ID.
- Follow the Cloud Functions quickstart guide to set up Cloud Functions for your project
- Clone this repository
    cd ~/
    git clone https://github.com/jasonpolites/gcf-recipes.git
    cd gcf-recipes/datastore
- Create a Cloud Storage bucket to stage our deployment
    gsutil mb gs://[PROJECT-ID]-gcf-recipes-bucket
- Ensure the Cloud Datastore API is enabled
- Deploy the "ds-get" function with an HTTP trigger (a sketch of the get, set, and del entry points appears after these steps)
    gcloud alpha functions deploy ds-get --bucket [PROJECT-ID]-gcf-recipes-bucket --trigger-http --entry-point get
- Deploy the "ds-set" function with an HTTP trigger
    gcloud alpha functions deploy ds-set --bucket [PROJECT-ID]-gcf-recipes-bucket --trigger-http --entry-point set
- Deploy the "ds-del" function with an HTTP trigger
    gcloud alpha functions deploy ds-del --bucket [PROJECT-ID]-gcf-recipes-bucket --trigger-http --entry-point del
- Call the "ds-set" function to create a new entity
    gcloud alpha functions call ds-set --data '{"kind": "gcf-test", "key": "foobar", "value": {"message": "Hello World!"}}'
- Call the "ds-get" function to read the newly created entity
    gcloud alpha functions call ds-get --data '{"kind": "gcf-test", "key": "foobar"}'
- Call the "ds-del" function to delete the entity
    gcloud alpha functions call ds-del --data '{"kind": "gcf-test", "key": "foobar"}'
- Call the "ds-get" function again to verify the entity was deleted
    gcloud alpha functions call ds-get --data '{"kind": "gcf-test", "key": "foobar"}'
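The deploy commands above reference get, set, and del entry points exported from the recipe's Node.js module. The following is a minimal sketch of what those handlers might look like, assuming the @google-cloud/datastore client and Express-style HTTP request and response arguments; the actual source in this repository may differ.

    // Minimal sketch only -- not the repository's actual source.
    // Assumes the @google-cloud/datastore client and HTTP-triggered
    // functions that receive Express-style (req, res) arguments.
    const Datastore = require('@google-cloud/datastore');
    const datastore = Datastore();

    // set: write an entity of the given kind/key with the supplied value
    exports.set = (req, res) => {
      const key = datastore.key([req.body.kind, req.body.key]);
      datastore.save({ key: key, data: req.body.value })
        .then(() => res.status(200).send('Entity saved: ' + req.body.key))
        .catch((err) => res.status(500).send(err.message));
    };

    // get: read the entity back and return it as JSON
    exports.get = (req, res) => {
      const key = datastore.key([req.body.kind, req.body.key]);
      datastore.get(key)
        .then(([entity]) => {
          if (!entity) {
            res.status(404).send('No entity found for key ' + req.body.key);
          } else {
            res.status(200).json(entity);
          }
        })
        .catch((err) => res.status(500).send(err.message));
    };

    // del: delete the entity
    exports.del = (req, res) => {
      const key = datastore.key([req.body.kind, req.body.key]);
      datastore.delete(key)
        .then(() => res.status(200).send('Entity deleted: ' + req.body.key))
        .catch((err) => res.status(500).send(err.message));
    };

Each handler builds a Datastore key from the kind and key fields of the same JSON payload passed to the gcloud functions call commands above.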
This recipe comes with a suite of unit tests. To run the tests locally, install the dependencies and use npm test:
    npm install
    npm test
The tests will also produce code coverage reports, written to the /coverage directory. After running the tests, you can view coverage with:
    open coverage/lcov-report/index.html