
Commit

add replicator example
austencollins committed Jun 3, 2019
1 parent 21c75ac commit 073fa48
Showing 7 changed files with 413 additions and 0 deletions.
102 changes: 102 additions & 0 deletions aws-node-fetch-file-and-store-in-s3/package-lock.json


2 changes: 2 additions & 0 deletions aws-node-s3-file-replicator/.gitignore
@@ -0,0 +1,2 @@
node_modules
.serverless
105 changes: 105 additions & 0 deletions aws-node-s3-file-replicator/README.md
@@ -0,0 +1,105 @@
<!--
title: 'AWS Fetch image from URL and upload to S3 example in NodeJS'
description: 'This example shows how to fetch an image from a remote source (URL) and then upload it to an S3 bucket.'
layout: Doc
framework: v1
platform: AWS
language: nodeJS
authorLink: 'https://github.com/ScottBrenner'
authorName: 'Scott Brenner'
authorAvatar: 'https://avatars2.githubusercontent.com/u/416477?v=4&s=140'
-->
# Fetch image from URL then upload to S3 example

This example shows how to fetch an image from a remote source (URL) and then upload it to an S3 bucket.

## Use-cases

- Store a user's profile picture from another service.

## How it works

We first fetch the data from the given URL and then call the S3 `putObject` API to upload it to the bucket.

```js
// Assumes `fetch` (e.g. from the node-fetch package), an initialized
// S3 client `s3`, and `Bucket`/`Key` variables are in scope.
fetch('image URL')
  .then(res => {
    return s3.putObject({ Bucket, Key, Body: res.body }).promise();
  }).then(res => {
    callback(null, res);
  }).catch(err => {
    callback(err, null);
  });
```

## Setup

Since this example uses the Serverless plugin `serverless-secrets-plugin`, you need to set up the `node_modules` by running:

```bash
npm install
```

In addition, you need to create the S3 bucket you want to store the files in. After you have created the bucket, change the bucket name in the `serverless.yml` custom settings to your bucket's name.

```yml
custom:
bucket: <your-bucket-name>
```
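For context, here is a minimal `serverless.yml` sketch showing how the custom bucket value might be wired into the function's environment and IAM permissions. The runtime, handler name, and environment variable name are assumptions for illustration, not part of this example; the service and function names come from the deploy and invoke output below.

```yml
service: aws-node-fetch-file-and-store-in-s3

custom:
  bucket: <your-bucket-name>

provider:
  name: aws
  runtime: nodejs10.x  # assumed runtime
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:PutObject
      Resource: "arn:aws:s3:::${self:custom.bucket}/*"

functions:
  save:
    handler: handler.save  # assumed handler export name
    environment:
      BUCKET: ${self:custom.bucket}  # assumed variable name
```

The `${self:custom.bucket}` references mean the bucket name only has to be changed in one place.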
## Deploy

In order to deploy the endpoint, simply run:

```bash
serverless deploy
```

The expected result should be similar to:

```bash
Serverless: Creating Stack...
Serverless: Checking Stack create progress...
.....
Serverless: Stack create finished...
Serverless: Packaging service...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading service .zip file to S3 (1.8 KB)...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
................
Serverless: Stack update finished...

Service Information
service: aws-node-fetch-file-and-store-in-s3
stage: dev
region: us-west-1
api keys:
None
endpoints:
None
functions:
aws-node-fetch-file-and-store-in-s3-dev-save: arn:aws:lambda:us-west-1:377024778620:function:aws-node-fetch-file-and-store-in-s3-dev-save
```

## Usage

You can now invoke the function directly with the Serverless CLI:

```bash
serverless invoke --function save --log --data='{ "image_url": "https://assets-cdn.github.com/images/modules/open_graph/github-mark.png", "key": "github.png"}'
```

The expected result should be similar to:

```bash
"Saved"
--------------------------------------------------------------------
START RequestId: c658859d-bd11e6-ac1f-c7a7ee5bd7f3 Version: $LATEST
END RequestId: c658859d-bd11e6-ac1f-c7a7ee5bd7f3
REPORT RequestId: c658859d-bd11e6-ac1f-c7a7ee5bd7f3 Duration: 436.94 ms Billed Duration: 500 ms Memory Size: 1024 MB Max Memory Used: 29 MB
```

## Scaling

By default, AWS Lambda limits the total concurrent executions across all functions within a given region to 100. The default limit is a safety limit that protects you from costs due to potential runaway or recursive functions during initial development and testing. To increase this limit above the default, follow the steps in [To request a limit increase for concurrent executions](http://docs.aws.amazon.com/lambda/latest/dg/concurrent-executions.html#increase-concurrent-executions-limit).
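Conversely, if you want to cap how much of the account-wide concurrency pool a single function can consume, the Serverless Framework supports per-function reserved concurrency; a sketch (the handler name and the value of 10 are illustrative assumptions):

```yml
functions:
  save:
    handler: handler.save   # assumed handler export name
    reservedConcurrency: 10 # cap this function at 10 concurrent executions
```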
55 changes: 55 additions & 0 deletions aws-node-s3-file-replicator/handler.js
@@ -0,0 +1,55 @@
const aws = require('aws-sdk')
const s3 = new aws.S3()

const outputBucket = process.env.OUTPUT_BUCKET

exports.replicate = function main(event, context) {
  // Fail on missing data
  if (!outputBucket) {
    context.fail('Error: Environment variable OUTPUT_BUCKET missing')
    return
  }
  if (!event.Records) {
    context.fail('Error: Event has no records.')
    return
  }

  // Replicate every record in the event in parallel
  const tasks = []
  for (let i = 0; i < event.Records.length; i++) {
    tasks.push(replicatePromise(event.Records[i], outputBucket))
  }

  Promise.all(tasks)
    .then(() => { context.succeed() })
    .catch(() => { context.fail() })
}

function replicatePromise(record, destBucket) {
  return new Promise((resolve, reject) => {
    // The source bucket and source key are part of the event data
    const srcBucket = record.s3.bucket.name
    const srcKey = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '))

    // Modify destKey if an alternate copy location is preferred
    const destKey = srcKey
    const msg = 'copying ' + srcBucket + ':' + srcKey + ' to ' + destBucket + ':' + destKey

    console.log('Attempting: ' + msg)
    s3.copyObject({
      Bucket: destBucket,
      Key: destKey,
      CopySource: encodeURIComponent(srcBucket + '/' + srcKey),
      MetadataDirective: 'COPY'
    }, (err, data) => {
      if (err) {
        console.log('Error: ' + msg)
        console.log(err, err.stack) // an error occurred
        return reject(new Error('Error: ' + msg))
      }
      console.log('Success: ' + msg)
      return resolve('Success: ' + msg)
    })
  })
}
102 changes: 102 additions & 0 deletions aws-node-s3-file-replicator/package-lock.json


10 changes: 10 additions & 0 deletions aws-node-s3-file-replicator/package.json
@@ -0,0 +1,10 @@
{
"name": "aws-fetch-file-and-store-in-s3",
"description": "Fetch an image from a remote source (URL) and then upload it to an S3 bucket.",
"version": "1.0.0",
"author": "Bozhao Yu",
"license": "MIT",
"dependencies": {
"aws-sdk": "^2.467.0"
}
}
