Commit dd24621: Update README.md (#14)

* Update README.md
* Support object lambda
mjgp2 authored Jul 18, 2021
1 parent 70afb5d commit dd24621
Showing 2 changed files with 12 additions and 11 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -6,7 +6,7 @@ In order to support development either on RDS or locally, we implemented our own
the one provided in RDS. It was implemented in Python using the boto3 library.

## Installation
- Make sure boto3 is installed using the default Python 2 installed on your computer.
+ Make sure boto3 is installed using the default Python 3 installed on your computer.
On MacOS, this can be done as follows:

sudo /usr/bin/easy_install boto3
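
A quick way to confirm the install landed in the interpreter PL/Python will use (a sketch; run it with your system `python3`):

```python
# Sanity check: boto3 must be importable from Python 3, not just Python 2.
import sys

import boto3

assert sys.version_info.major == 3, "expected Python 3"
print(boto3.__version__)  # the installed boto3 version
```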
@@ -35,7 +35,7 @@ Then install `postgres-aws-s3`:

Finally in Postgres:
```postgresql
- psql> CREATE EXTENSION plpythonu;
+ psql> CREATE EXTENSION plpython3u;
psql> CREATE EXTENSION aws_s3;
```

@@ -395,7 +395,7 @@ psql> CREATE EXTENSION aws_s3;
Set the endpoint URL and the AWS keys to use S3 (in localstack you can set the AWS credentials to any non-empty string):
```
psql> SET aws_s3.endpoint_url TO 'http://localstack:4566';
- psql> SET aws_s3.s3.aws_access_key_id TO 'dummy';
+ psql> SET aws_s3.aws_access_key_id TO 'dummy';
psql> SET aws_s3.secret_access_key TO 'dummy';
```
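
For reference, the same endpoint and dummy credentials can be exercised straight from Python with boto3, roughly what the extension does internally once these settings are read (a minimal sketch; the bucket name and region are hypothetical):

```python
import boto3

# Mirror the psql settings above; localstack accepts any non-empty credentials.
s3 = boto3.resource(
    's3',
    endpoint_url='http://localstack:4566',
    aws_access_key_id='dummy',
    aws_secret_access_key='dummy',
    region_name='us-east-1',  # hypothetical; any region works with localstack
)

# Hypothetical bucket, created only to confirm the endpoint is reachable.
s3.create_bucket(Bucket='test-bucket')
print([b.name for b in s3.buckets.all()])
```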

17 changes: 9 additions & 8 deletions aws_s3--0.0.1.sql
@@ -82,24 +82,25 @@ AS $$
    'endpoint_url': endpoint_url if endpoint_url else default_aws_settings.get('endpoint_url')
}

-s3 = boto3.client(
+s3 = boto3.resource(
    's3',
    region_name=region,
    **aws_settings
)

-response = s3.head_object(Bucket=bucket, Key=file_path)
+obj = s3.Object(bucket, file_path)
+response = obj.get()
content_encoding = response.get('ContentEncoding')
+body = response['Body']

with tempfile.NamedTemporaryFile() as fd:
    if content_encoding and content_encoding.lower() == 'gzip':
-        with tempfile.NamedTemporaryFile() as gzfd:
-            s3.download_fileobj(bucket, file_path, gzfd)
-            gzfd.flush()
-            gzfd.seek(0)
-            shutil.copyfileobj(gzip.GzipFile(fileobj=gzfd, mode='rb'), fd)
+        with gzip.GzipFile(fileobj=body) as gzipfile:
+            while fd.write(gzipfile.read(204800)):
+                pass
    else:
-        s3.download_fileobj(bucket, file_path, fd)
+        while fd.write(body.read(204800)):
+            pass
    fd.flush()
formatted_column_list = "({column_list})".format(column_list=column_list) if column_list else ''
res = plpy.execute("COPY {table_name} {formatted_column_list} FROM {filename} {options};".format(
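
The substance of this change: the low-level client and its `download_fileobj` temp-file round trip are replaced by a resource-level `Object.get()`, whose streaming `Body` is either wrapped in `gzip.GzipFile` or copied directly, 200 KiB at a time. A standalone sketch of the same pattern (bucket, key, and output path are hypothetical):

```python
import gzip

import boto3

s3 = boto3.resource('s3')  # credentials resolved from the environment here

obj = s3.Object('my-bucket', 'path/to/file.csv.gz')  # hypothetical bucket/key
response = obj.get()
body = response['Body']  # a streaming body; the object is not buffered up front

with open('/tmp/out.csv', 'wb') as fd:  # hypothetical output path
    if (response.get('ContentEncoding') or '').lower() == 'gzip':
        # Decompress while streaming: GzipFile pulls from the body on demand.
        with gzip.GzipFile(fileobj=body) as gzipfile:
            # fd.write() returns the byte count, so the empty read at EOF
            # ends the loop.
            while fd.write(gzipfile.read(204800)):
                pass
    else:
        while fd.write(body.read(204800)):
            pass
```

Compared to the old client code, this avoids staging the gzipped object in a second temporary file before decompressing it.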
