
AWS With Python

The AWS SDK for Python (Boto3) enables you to use Python code to interact with AWS services like Amazon S3. For example, you can use the SDK to create an Amazon S3 bucket, list your available buckets, and then delete the bucket you just created.

AWS SDK for Python (Boto3)

Get started quickly using AWS with boto3, the AWS SDK for Python. Boto3 makes it easy to integrate your Python application, library, or script with AWS services including Amazon S3, Amazon EC2, Amazon DynamoDB, and more.

Install Package

pip install boto3
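
A quick way to verify the installation is the create / list / delete bucket flow described above. This is a minimal sketch: the bucket name is a placeholder and credentials are assumed to be configured already (for example via aws configure).

  import boto3

  s3 = boto3.client("s3")

  # Create a bucket (outside us-east-1 this also needs a LocationConstraint).
  s3.create_bucket(Bucket="my-example-bucket-12345")

  # List the buckets in the account.
  print([b["Name"] for b in s3.list_buckets()["Buckets"]])

  # Delete the bucket that was just created.
  s3.delete_bucket(Bucket="my-example-bucket-12345")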

Program Files

  1. AWS S3 Bucket Connection

    • Set up credentials to connect Python to S3.
    • Authenticate with boto3.
    • Read and write data from/to S3.
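
    A minimal sketch of this flow, with placeholder credentials, bucket, and key rather than the repo's actual values:

      import boto3

      # Explicit credentials shown for clarity; a shared credentials file or
      # environment variables work the same way.
      s3 = boto3.client(
          "s3",
          aws_access_key_id="YOUR_ACCESS_KEY_ID",          # placeholder
          aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",  # placeholder
      )

      # Write a small object to the bucket.
      s3.put_object(Bucket="my-example-bucket", Key="hello.txt", Body=b"Hello from boto3")

      # Read the same object back.
      body = s3.get_object(Bucket="my-example-bucket", Key="hello.txt")["Body"].read()
      print(body.decode("utf-8"))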
  2. Analyzing Invoice Data Using Textract

    • Plain text detection from documents (DetectDocumentText).
    • Form detection from documents (the FORMS feature type).
    • Table detection from documents (the TABLES feature type).
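
    A minimal sketch of the three detection modes, assuming the invoice is an image stored in S3 (bucket and file names are placeholders):

      import boto3

      textract = boto3.client("textract")
      document = {"S3Object": {"Bucket": "my-invoice-bucket", "Name": "invoice.png"}}

      # Plain text: raw lines and words.
      text_response = textract.detect_document_text(Document=document)

      # Forms and tables: key-value pairs and table cells via AnalyzeDocument.
      analysis = textract.analyze_document(Document=document, FeatureTypes=["FORMS", "TABLES"])

      for block in analysis["Blocks"]:
          print(block["BlockType"], block.get("Text", ""))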
  3. Lambda IAM user calls for AWS interaction

    • An AWS Identity and Access Management (IAM) user is an entity that you create in AWS to represent the person or application that uses it to interact with AWS.
    • The examples listed on this page are code samples written in Python that demonstrate how to interact with AWS Identity and Access Management (IAM).
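
    A minimal sketch of the kind of IAM interaction involved, creating a user and then listing the account's users (the user name is a placeholder):

      import boto3

      iam = boto3.client("iam")

      # Create a new IAM user to represent an application.
      iam.create_user(UserName="example-app-user")

      # List all IAM users in the account.
      for user in iam.list_users()["Users"]:
          print(user["UserName"], user["Arn"])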
  4. Using Lambda to read a file from S3 via a Lambda trigger

    • S3 Object Lambda gives you the flexibility to invoke Lambda functions directly from S3 GET requests to process data to meet the specific requirements of your applications.
    • S3 Object Lambda works with your existing applications and uses AWS Lambda functions to automatically process and transform your data as it is being retrieved from S3.
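
    A minimal sketch of a handler for the simpler trigger case in the title: the function is invoked by an S3 event notification and reads the object that fired it (the function's role is assumed to have s3:GetObject on the bucket):

      from urllib.parse import unquote_plus

      import boto3

      s3 = boto3.client("s3")

      def lambda_handler(event, context):
          # The triggering bucket and (URL-encoded) key arrive in the event record.
          record = event["Records"][0]["s3"]
          bucket = record["bucket"]["name"]
          key = unquote_plus(record["object"]["key"])

          body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
          print(f"Read {len(body)} bytes from s3://{bucket}/{key}")
          return {"statusCode": 200}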
  5. Upload to an S3 bucket from Lambda / upload a file to an S3 bucket

    • Create the IAM role that the Lambda function will use.
    • Create an S3 bucket.
    • Create a JSON file in the Lambda function and upload it to the S3 bucket, or upload a file from a local directory to the S3 bucket.
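
    A minimal sketch of the Lambda side, building a JSON document and putting it in the bucket (the bucket name is a placeholder and the role needs s3:PutObject):

      import json

      import boto3

      s3 = boto3.client("s3")

      def lambda_handler(event, context):
          payload = {"message": "generated by lambda", "items": [1, 2, 3]}
          s3.put_object(
              Bucket="my-example-bucket",
              Key="output/data.json",
              Body=json.dumps(payload).encode("utf-8"),
              ContentType="application/json",
          )
          return {"statusCode": 200}

      # Uploading from a local directory instead uses upload_file:
      # boto3.client("s3").upload_file("local/data.json", "my-example-bucket", "data.json")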
  6. Download All Files From S3 Using Boto3

    • client.download_file() – the API method used to download a file from your S3 bucket.
    • Create an S3 resource and loop over the bucket's contents with the objects.all() collection.
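
    A minimal sketch of that loop, downloading every object in the bucket into a local folder (bucket name and target directory are placeholders):

      import os

      import boto3

      s3 = boto3.resource("s3")
      bucket = s3.Bucket("my-example-bucket")

      os.makedirs("downloads", exist_ok=True)

      for obj in bucket.objects.all():
          if obj.key.endswith("/"):
              continue  # skip "folder" placeholder keys
          target = os.path.join("downloads", os.path.basename(obj.key))
          bucket.download_file(obj.key, target)
          print(f"Downloaded {obj.key} -> {target}")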
  7. Send and read logs from CloudWatch using Boto3

    • Sending logs to a CloudWatch log group.
    • Filtering CloudWatch Logs by log groups and log streams and reading them using Python and the Boto3 SDK.
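
    A minimal sketch of both directions, assuming the log group and log stream already exist (their names are placeholders):

      import time

      import boto3

      logs = boto3.client("logs")
      group, stream = "/example/app", "example-stream"

      # Send one log event; timestamps are in milliseconds
      # (recent API versions no longer require a sequence token).
      logs.put_log_events(
          logGroupName=group,
          logStreamName=stream,
          logEvents=[{"timestamp": int(time.time() * 1000), "message": "hello from boto3"}],
      )

      # Read events back, filtered by log group and log stream.
      for event in logs.filter_log_events(logGroupName=group, logStreamNames=[stream])["events"]:
          print(event["timestamp"], event["message"])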
  8. AWS Textract Query feature

    • When a user provides a query, Amazon Textract returns a specialized response object containing the text answer to the question, the confidence Amazon Textract has in that answer, and the location of the answer on the page.
    • Detected queries are returned as Block objects in the responses from AnalyzeDocument and GetDocumentAnalysis.
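
    A minimal sketch of the Queries feature, asking one question against a document in S3 (bucket, file name, and question are placeholders):

      import boto3

      textract = boto3.client("textract")

      response = textract.analyze_document(
          Document={"S3Object": {"Bucket": "my-invoice-bucket", "Name": "invoice.png"}},
          FeatureTypes=["QUERIES"],
          QueriesConfig={"Queries": [{"Text": "What is the invoice total?"}]},
      )

      # QUERY blocks echo the question; QUERY_RESULT blocks carry the answer text,
      # Textract's confidence in it, and its geometry (location) on the page.
      for block in response["Blocks"]:
          if block["BlockType"] == "QUERY_RESULT":
              print(block["Text"], block["Confidence"])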
  9. Multiple file upload to AWS S3 from a Nextcloud folder

    • First, create a folder in your Nextcloud account, store some files in it, and create a shareable public URL for it.
    • In the script, set that URL with /download appended and the required credentials, then execute it (see the sketch below).
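
    A minimal sketch of that flow, assuming a public Nextcloud share link: the folder is fetched as a zip from the share's /download endpoint and each file is pushed to S3 (the share URL, bucket name, and credential setup are placeholders; requests is an extra dependency):

      import io
      import zipfile

      import boto3
      import requests

      SHARE_URL = "https://cloud.example.com/s/AbCdEfGh"  # public share link (placeholder)
      BUCKET = "my-example-bucket"                        # target bucket (placeholder)

      # Public Nextcloud shares expose the whole folder as a zip at <share URL>/download.
      archive = requests.get(f"{SHARE_URL}/download", timeout=60)
      archive.raise_for_status()

      s3 = boto3.client("s3")
      with zipfile.ZipFile(io.BytesIO(archive.content)) as zf:
          for name in zf.namelist():
              if name.endswith("/"):
                  continue  # skip directory entries
              s3.put_object(Bucket=BUCKET, Key=name, Body=zf.read(name))
              print(f"Uploaded {name} to s3://{BUCKET}")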