docs: remove unused header #18

Closed
wants to merge 1 commit into from

README.md (5 changes: 1 addition & 4 deletions)
@@ -2,7 +2,6 @@

**AWS AI Stack** – A ready-to-use, full-stack boilerplate project for building serverless AI applications on AWS. A great fit for those seeking a trusted AWS foundation for AI apps and access to powerful LLMs via Bedrock that keep your app’s data separate from model providers.


**[View the Live Demo – awsaistack.com](https://awsaistack.com)**

Use this as a boilerplate project to create an AI chat bot, authentication services, business logic, and async workers, all on AWS Lambda, API Gateway, DynamoDB, and EventBridge.
@@ -84,7 +83,7 @@
Upon request, it may take a few minutes for AWS to enable the models. Once they
are enabled, you will receive a confirmation email from AWS.

Some users have reported issues with getting models enabled on AWS Bedrock. Make
sure you have sufficient permissions in AWS to enable the models first. Often,
AWS accounts that are new or have not historically had a monthly invoice over a few
dollars may require contacting AWS to enable models.
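
If enablement seems stuck, the AWS CLI can at least confirm which models Bedrock
offers in your region. This is a minimal sketch, assuming the CLI is configured
for the same account and region where access was requested; the provider filter
and region are illustrative, and an `AccessDeniedException` when invoking a
listed model usually means access has not been granted yet.

```bash
# Minimal sketch: list the Anthropic models Bedrock offers in us-east-1.
# Assumes the AWS CLI is configured for the account/region where access was requested.
# Note: this shows what the region offers, not whether your account has been granted access.
aws bedrock list-foundation-models \
  --region us-east-1 \
  --by-provider anthropic \
  --query "modelSummaries[].modelId" \
  --output table
```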

@@ -519,5 +518,3 @@
necessary.
Consider using a more fine-grained approach to deploying services, such as
only deploying the services that have changed by using the `serverless <service>
deploy` command.

##
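
For the `serverless <service> deploy` tip above, a rough sketch of the difference,
assuming the project deploys all services from the repo root (e.g., via Serverless
Compose); the `ai-chat` service name is a placeholder and may not match the
directories in this repo:

```bash
# Deploy every service from the root (redeploys stacks even if they are unchanged):
serverless deploy

# Deploy only the service you changed, per the README's suggestion.
# "ai-chat" is a placeholder service name, not necessarily one defined in this repo.
serverless ai-chat deploy
```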