Merge pull request #31 from Bladrak/async
Implement async
Bladrak committed Oct 14, 2015
2 parents 3fb6002 + e0cd8b8 commit e11994b
Showing 20 changed files with 745 additions and 415 deletions.
35 changes: 35 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,35 @@
# Thumbor Community - AWS: Changelog

This file describes the changes in each version, notably the BC breaks that may have occurred.

## 2.0 - Async Connection

Switched the connection from Boto 2 to botocore in order to handle Tornado async connections. This update introduces some major BC breaks.

* [BC BREAK] Authentication is now handled by botocore directly; the following configuration values are no longer used:
* AWS_ROLE_BASED_CONNECTION
* AWS_ACCESS_KEY
* AWS_SECRET_KEY
* BOTO_CONFIG
You'll need to use boto's configuration file directly to handle your server's authentication to S3, or a role-based connection. See <https://github.com/boto/boto3> (a sample credentials file is sketched after the option table below)
* [BC BREAK] A new option has been added to configure the AWS region, named ``TC_AWS_REGION``; it defaults to ``eu-west-1``
* [BC BREAK] Option names have been made consistent as well; here is the mapping from old to new option names (a sample migration is sketched right after the table):

| Old option | New option |
| ---------- | ---------- |
| STORAGE_BUCKET | TC_AWS_STORAGE_BUCKET |
| RESULT_STORAGE_BUCKET | TC_AWS_RESULT_STORAGE_BUCKET |
| S3_LOADER_BUCKET | TC_AWS_LOADER_BUCKET |
| S3_LOADER_ROOT_PATH | TC_AWS_LOADER_ROOT_PATH |
| STORAGE_AWS_STORAGE_ROOT_PATH | TC_AWS_STORAGE_ROOT_PATH |
| RESULT_STORAGE_AWS_STORAGE_ROOT_PATH | TC_AWS_RESULT_STORAGE_ROOT_PATH |
| S3_STORAGE_SSE | TC_AWS_STORAGE_SSE |
| S3_STORAGE_RRS | TC_AWS_STORAGE_RRS |
| S3_ALLOWED_BUCKETS | TC_AWS_ALLOWED_BUCKETS |
| RESULT_STORAGE_S3_STORE_METADATA | TC_AWS_STORE_METADATA |
| AWS_ENABLE_HTTP_LOADER | TC_AWS_ENABLE_HTTP_LOADER |
| AWS_ACCESS_KEY | N/A |
| AWS_SECRET_KEY | N/A |
| AWS_ROLE_BASED_CONNECTION | N/A |
| BOTO_CONFIG | N/A |
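
To make the rename concrete, here is a minimal before/after sketch of a thumbor.conf excerpt; the bucket names and paths are placeholders taken from the Readme examples:

    # tc_aws 1.x (old option names)
    # STORAGE_BUCKET = 'thumbor-images'
    # STORAGE_AWS_STORAGE_ROOT_PATH = 'storage'
    # S3_LOADER_BUCKET = 'thumbor-images'
    # S3_LOADER_ROOT_PATH = 'source-images'

    # tc_aws 2.0 (new option names)
    TC_AWS_REGION = 'eu-west-1'
    TC_AWS_STORAGE_BUCKET = 'thumbor-images'
    TC_AWS_STORAGE_ROOT_PATH = 'storage'
    TC_AWS_LOADER_BUCKET = 'thumbor-images'
    TC_AWS_LOADER_ROOT_PATH = 'source-images'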

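For the authentication change, a minimal ``~/.aws/credentials`` file as read by botocore might look like this (the values are placeholders):

    [default]
    aws_access_key_id = YOUR_ACCESS_KEY_ID
    aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
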
79 changes: 25 additions & 54 deletions Readme.md
@@ -15,7 +15,7 @@ Installation
Origin story
------------

This is a fork of https://github.com/willtrking/thumbor_aws ; as this repository was not maintained anymore,
This is a fork of https://github.com/willtrking/thumbor_aws; as this repository was not maintained anymore,
we decided to maintain it under the ``thumbor-community`` organisation.

Features
@@ -28,61 +28,32 @@ Features

Additional Configuration values used:

# the Amazon Web Services access key to use
AWS_ACCESS_KEY = ""
# the Amazon Web Services secret of the used access key
AWS_SECRET_KEY = ""

# Alternatively (recommended), use Role-based connection
# http://docs.aws.amazon.com/IAM/latest/UserGuide/roles-assume-role.html
AWS_ROLE_BASED_CONNECTION = True or False (Default: False)


# configuration settings specific for the s3_loader

# list of allowed buckets for the s3_loader
S3_ALLOWED_BUCKETS = []

# alternatively: set a fixed bucket, no need for bucket name in Image-Path
S3_LOADER_BUCKET = 'thumbor-images'
# A root path for loading images, useful if you share the bucket
S3_LOADER_ROOT_PATH = 'source-images'

# configuration settings specific for the storages

STORAGE_BUCKET = 'thumbor-images'
# A root path for the storage, useful if you share a bucket for loading / storing
STORAGE_AWS_STORAGE_ROOT_PATH = 'storage'

RESULT_STORAGE_BUCKET = 'thumbor-images'
RESULT_STORAGE_AWS_STORAGE_ROOT_PATH = 'result'
# It stores metadata like content-type on the result object
RESULT_STORAGE_S3_STORE_METADATA = False

STORAGE_EXPIRATION_SECONDS

TC_AWS_REGION='eu-west-1' # AWS Region

TC_AWS_STORAGE_BUCKET='' # S3 bucket for Storage
TC_AWS_STORAGE_ROOT_PATH='' # S3 path prefix for Storage bucket

TC_AWS_LOADER_BUCKET='' # S3 bucket for loader
TC_AWS_LOADER_ROOT_PATH='' # S3 path prefix for Loader bucket

TC_AWS_RESULT_STORAGE_BUCKET='' # S3 bucket for result Storage
TC_AWS_RESULT_STORAGE_ROOT_PATH='' # S3 path prefix for Result storage bucket

# put data into S3 using the Server Side Encryption functionality to
# encrypt data at rest in S3
# https://aws.amazon.com/about-aws/whats-new/2011/10/04/amazon-s3-announces-server-side-encryption-support/
S3_STORAGE_SSE = True or False (Default: False)

TC_AWS_STORAGE_SSE=False
# put data into S3 with Reduced Redundancy
# https://aws.amazon.com/about-aws/whats-new/2010/05/19/announcing-amazon-s3-reduced-redundancy-storage/
S3_STORAGE_RRS = True or False (Default: False)


#Optional config value to enable the HTTP loader
#This would allow you to load watermarks in over your images dynamically through a URI
#E.g.
#http://your-thumbor.com/unsafe/filters:watermark(http://example.com/watermark.png,0,0,50)/s3_bucket/photo.jpg
AWS_ENABLE_HTTP_LOADER = True or False (Default: False)


# Optional additional configuration for the Boto-Client used to access S3.
# see http://boto.readthedocs.org/en/latest/ref/s3.html?highlight=boto.s3.connection.s3connection#boto.s3.connection.S3Connection
# for all available config options
# Hint: If you are using S3 Frankfurt, you have to set the host to "s3.eu-central-1.amazonaws.com".
BOTO_CONFIG = {
'host': 'fakes3.local.dev',
'is_secure': False
}
TC_AWS_STORAGE_RRS=False # S3 redundancy


# Enable HTTP Loader as well?
# This would allow you to load watermarks in over your images dynamically through a URI
# E.g.
# http://your-thumbor.com/unsafe/filters:watermark(http://example.com/watermark.png,0,0,50)/s3_bucket/photo.jpg
TC_AWS_ENABLE_HTTP_LOADER=False

TC_AWS_ALLOWED_BUCKETS=False # List of allowed buckets to be requested
TC_AWS_STORE_METADATA=False # Store result with metadata (for instance content-type)
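
For context, here is a hedged sketch of how these settings might be wired together in a thumbor.conf alongside the tc_aws loader and storage classes; the module paths and bucket name below are assumptions for illustration, not part of this commit:

    # Assumed tc_aws module paths -- verify against the installed package.
    LOADER = 'tc_aws.loaders.s3_loader'
    STORAGE = 'tc_aws.storages.s3_storage'
    RESULT_STORAGE = 'tc_aws.result_storages.s3_storage'

    TC_AWS_REGION = 'eu-west-1'
    TC_AWS_LOADER_BUCKET = 'thumbor-images'          # placeholder bucket
    TC_AWS_STORAGE_BUCKET = 'thumbor-images'
    TC_AWS_RESULT_STORAGE_BUCKET = 'thumbor-images'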

7 changes: 4 additions & 3 deletions setup.py
@@ -4,7 +4,7 @@

setup(
name='tc_aws',
version='1.3.1',
version='2.0.0',
description='Thumbor AWS extensions',
author='Thumbor-Community & William King',
author_email='[email protected]',
@@ -13,14 +13,15 @@
packages=find_packages(),
install_requires=[
'python-dateutil',
'thumbor',
'boto'
'thumbor>=5.2',
'tornado-botocore',
],
extras_require={
'tests': [
'pyvows',
'coverage',
'tornado_pyvows',
'boto',
'moto',
'mock',
],
32 changes: 17 additions & 15 deletions tc_aws/__init__.py
@@ -1,24 +1,26 @@
# coding: utf-8

# Copyright (c) 2015, thumbor-community
# Use of this source code is governed by the MIT license that can be
# found in the LICENSE file.

from thumbor.config import Config

Config.define('STORAGE_BUCKET', 'thumbor-images', 'S3 bucket for Storage', 'S3')
Config.define('RESULT_STORAGE_BUCKET', 'thumbor-result', 'S3 bucket for result Storage', 'S3')
Config.define('S3_LOADER_BUCKET', None, 'S3 bucket for loader', 'S3')
Config.define('TC_AWS_REGION', 'eu-west-1', 'S3 region', 'S3')

Config.define('TC_AWS_STORAGE_BUCKET', None, 'S3 bucket for Storage', 'S3')
Config.define('TC_AWS_STORAGE_ROOT_PATH', '', 'S3 path prefix for Storage bucket', 'S3')

Config.define('S3_LOADER_ROOT_PATH', '', 'S3 path prefix for Loader bucket', 'S3')
Config.define('STORAGE_AWS_STORAGE_ROOT_PATH', '', 'S3 path prefix for Storage bucket', 'S3')
Config.define('RESULT_STORAGE_AWS_STORAGE_ROOT_PATH', '', 'S3 path prefix for Result storage bucket', 'S3')
Config.define('TC_AWS_LOADER_BUCKET', None, 'S3 bucket for loader', 'S3')
Config.define('TC_AWS_LOADER_ROOT_PATH', '', 'S3 path prefix for Loader bucket', 'S3')

Config.define('STORAGE_EXPIRATION_SECONDS', 3600, 'S3 expiration', 'S3')
Config.define('TC_AWS_RESULT_STORAGE_BUCKET', None, 'S3 bucket for result Storage', 'S3')
Config.define('TC_AWS_RESULT_STORAGE_ROOT_PATH', '', 'S3 path prefix for Result storage bucket', 'S3')

Config.define('S3_STORAGE_SSE', False, 'S3 encriptipon key', 'S3')
Config.define('S3_STORAGE_RRS', False, 'S3 redundency', 'S3')
Config.define('S3_ALLOWED_BUCKETS', False, 'List of allowed bucket to be requeted', 'S3')
Config.define('RESULT_STORAGE_S3_STORE_METADATA', False, 'S3 store result with metadata', 'S3')
Config.define('TC_AWS_STORAGE_SSE', False, 'S3 encryption', 'S3')
Config.define('TC_AWS_STORAGE_RRS', False, 'S3 redundancy', 'S3')

Config.define('AWS_ACCESS_KEY', None, 'AWS Access key, if None use environment AWS_ACCESS_KEY_ID', 'AWS')
Config.define('AWS_SECRET_KEY', None, 'AWS Secret key, if None use environment AWS_SECRET_ACCESS_KEY', 'AWS')
Config.define('AWS_ROLE_BASED_CONNECTION', False, 'EC2 instance can use role that does not require AWS_ACCESS_KEY see http://docs.aws.amazon.com/IAM/latest/UserGuide/roles-usingrole-ec2instance.html', 'AWS')
Config.define('TC_AWS_ENABLE_HTTP_LOADER', False, 'Enable HTTP Loader as well?', 'S3')
Config.define('TC_AWS_ALLOWED_BUCKETS', False, 'List of allowed buckets to be requested', 'S3')
Config.define('TC_AWS_STORE_METADATA', False, 'S3 store result with metadata', 'S3')

Config.define('BOTO_CONFIG', None, 'Additional Boto options for configuring S3 access (see http://boto.readthedocs.org/en/latest/ref/s3.html?highlight=boto.s3.connection.s3connection#boto.s3.connection.S3Connection)')
5 changes: 5 additions & 0 deletions tc_aws/aws/__init__.py
@@ -1 +1,6 @@
# coding: utf-8

# Copyright (c) 2015, thumbor-community
# Use of this source code is governed by the MIT license that can be
# found in the LICENSE file.

108 changes: 108 additions & 0 deletions tc_aws/aws/bucket.py
@@ -0,0 +1,108 @@
# coding: utf-8

# Copyright (c) 2015, thumbor-community
# Use of this source code is governed by the MIT license that can be
# found in the LICENSE file.

import botocore.session

from tornado_botocore import Botocore
from tornado.concurrent import return_future

class Bucket(object):
"""
This handles all communication with AWS API
"""
_bucket = None
_region = None
_local_cache = dict()

def __init__(self, bucket, region):
"""
Constructor
:param string bucket: The bucket name
:param string region: The AWS API region to use
:return: The created bucket
"""
self._bucket = bucket
self._region = region

@return_future
def get(self, path, callback=None):
"""
Returns object at given path
:param string path: Path or 'key' to retrieve AWS object
:param callable callback: Callback function for once the retrieval is done
"""
session = Botocore(service='s3', region_name=self._region, operation='GetObject')
session.call(
callback=callback,
Bucket=self._bucket,
Key=path,
)

@return_future
def get_url(self, path, method='GET', expiry=3600, callback=None):
"""
Generates the presigned url for given key & methods
:param string path: Path or 'key' for requested object
:param string method: Method for requested URL
:param int expiry: URL validity time
:param callable callback: Called function once done
"""
session = botocore.session.get_session()
client = session.create_client('s3', region_name=self._region)

url = client.generate_presigned_url(
ClientMethod='get_object',
Params={
'Bucket': self._bucket,
'Key': path,
},
ExpiresIn=expiry,
HttpMethod=method,
)

callback(url)

@return_future
def put(self, path, data, metadata={}, reduced_redundancy=False, encrypt_key=False, callback=None):
"""
Stores data at given path
:param string path: Path or 'key' for created/updated object
:param bytes data: Data to write
:param dict metadata: Metadata to store with this data
:param bool reduced_redundancy: Whether to reduce storage redundancy or not?
:param bool encrypt_key: Encrypt data?
:param callable callback: Called function once done
"""
storage_class = 'REDUCED_REDUNDANCY' if reduced_redundancy else 'STANDARD'

args = dict(
callback=callback,
Bucket=self._bucket,
Key=path,
Body=data,
Metadata=metadata,
StorageClass=storage_class,
)

if encrypt_key:
args['ServerSideEncryption'] = 'AES256'

session = Botocore(service='s3', region_name=self._region, operation='PutObject')
session.call(**args)

@return_future
def delete(self, path, callback=None):
"""
Deletes key at given path
:param string path: Path or 'key' to delete
:param callable callback: Called function once done
"""
session = Botocore(service='s3', region_name=self._region, operation='DeleteObject')
session.call(
callback=callback,
Bucket=self._bucket,
Key=path,
)
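
For completeness, a minimal usage sketch (not part of this commit) of the new Bucket wrapper from a Tornado coroutine; because get() is decorated with @return_future, it can be yielded directly when no callback is passed. The bucket name, region and key are placeholders:

    from tornado import gen, ioloop

    from tc_aws.aws.bucket import Bucket


    @gen.coroutine
    def fetch_source_image():
        # Placeholder bucket and region; credentials are resolved by botocore.
        bucket = Bucket('thumbor-images', 'eu-west-1')
        # Yields the botocore response for the GetObject call.
        response = yield bucket.get('source-images/photo.jpg')
        raise gen.Return(response)


    if __name__ == '__main__':
        print(ioloop.IOLoop.current().run_sync(fetch_source_image))
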
25 changes: 0 additions & 25 deletions tc_aws/aws/connection.py

This file was deleted.
