This application is an AI-powered chatbot designed to help users navigate and use the Spack package manager. The chatbot employs Retrieval-Augmented Generation (RAG) to answer user queries and can be used from both Slack and Amazon Q. The solution can be applied to a variety of use cases that require integrating domain knowledge into a chatbot. Its two main selling points are: (1) the chatbot can be used from two mediums, and (2) it can cite its sources, both public websites and documents in an S3 bucket served via CloudFront.
- The Slackbot and Amazon Q both share the same vector database, reducing solution complexity and cost.
- Both the Slackbot and the Amazon Q bot provide clickable links to the sources used to generate their answers:
- https://spack-tutorial.readthedocs.io/en/latest/index.html
- Text files (`docuemnts/slack/*.txt`) uploaded to an S3 bucket and served via a CloudFront distribution
- Automated updates of both data sources.
- Dashboard for tracking bot invocations for Slack and Amazon Q:
  - AWS Console -> Amazon Q Business -> Radiuss -> Analytics dashboard
  - AWS Console -> CloudFormation -> Stacks -> SlackStack -> Outputs -> AmazonQCloudwatchDashboardOutput
- Data Stack:
  - A) Documentation Data
    - `Documentation Processing Lambda` pulls in data from `Raw Documentation Bucket` and does the following:
      - Converts `.rst` files into Markdown.
      - Splits the Markdown text based on its titles.
      - Generates metadata files that will be used by Kendra. The metadata files contain the following attributes:
        - title: section title from the data split
        - data_source: `documentation`
        - _source_uri: URL of the documentation, which is `https://spack.readthedocs.io/en/latest/` + file name + "#" + section title
    - `Documentation Processing Lambda` saves the split Markdown and the metadata files into `Processed Documentation Bucket`.
    - `Documentation Processing Lambda` triggers a Kendra data source sync job to crawl the `Processed Documentation Bucket`.
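As a rough sketch of the metadata step above (the actual Lambda implementation may differ, and the helper names here are illustrative), each Markdown split gets a companion `*.metadata.json` object in the Kendra S3 metadata format:

```python
import json

# Base URL the README describes for _source_uri (documentation data source).
DOCS_BASE_URL = "https://spack.readthedocs.io/en/latest/"

def build_kendra_metadata(section_title: str, file_name: str) -> dict:
    """Build a Kendra S3 metadata document for one Markdown split.

    `_source_uri` is what Kendra surfaces as the clickable source link:
    base URL + file name + "#" + section title.
    """
    return {
        "Title": section_title,
        "Attributes": {
            "data_source": "documentation",  # custom attribute to tell sources apart
            "_source_uri": f"{DOCS_BASE_URL}{file_name}#{section_title}",
        },
    }

def metadata_key(split_key: str) -> str:
    # The Kendra S3 connector expects "<document key>.metadata.json"
    return f"{split_key}.metadata.json"

meta = build_kendra_metadata("Basic Usage", "basic_usage.html")
print(json.dumps(meta, indent=2))
print(metadata_key("basic_usage.md"))
```

The `Title` and `Attributes` keys follow Kendra's S3 document-metadata layout; the exact anchor construction for `_source_uri` follows the rule stated above.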
  - B) Slack Data
    - Slack Processing Lambda:
      - `Slack Processing Lambda` pulls in data from `Raw Slack Bucket`, which contains historical Slack data, and does the following:
        - Generates metadata files that will be used by Kendra. The metadata files contain the following attributes:
          - title: section title from the data split
          - data_source: `slack`
          - _source_uri: generated CloudFront URL from the `Raw Slack Bucket`
      - `Slack Processing Lambda` saves the historical Slack data and the metadata files into `Processed Slack Bucket`.
      - `Slack Processing Lambda` triggers a Kendra data source sync job to crawl the `Processed Slack Bucket`.
      - `Raw Slack Bucket` data is passed into a CloudFront distribution for public access.
    - Slack Ingest Lambda:
      - `Slack Ingest Lambda` is triggered by EventBridge daily.
      - `Slack Ingest Lambda` pulls the past 24 hours of conversation data from Slack and writes it to `Raw Slack Bucket`.
      - `Slack Ingest Lambda` saves the conversation data into `Processed Slack Bucket` together with its metadata.
      - `Slack Processing Lambda` triggers a Kendra data source sync job to crawl the `Processed Slack Bucket`.
      - `Processed Slack Bucket` data is passed into a CloudFront distribution for public access.
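A minimal sketch of the daily ingest window described above, assuming the ingest uses Slack's `conversations.history` API (which accepts an `oldest` Unix-timestamp parameter); the function names are hypothetical and a real ingest would also page through results:

```python
import time

DAY_SECONDS = 24 * 60 * 60

def last_24h_oldest(now=None) -> str:
    """Return the `oldest` timestamp for a query window covering the past
    24 hours, formatted as Slack expects (seconds with microseconds)."""
    now = time.time() if now is None else now
    return f"{now - DAY_SECONDS:.6f}"

def filter_window(messages, oldest: str):
    """Keep only messages whose Slack `ts` falls inside the window.
    Useful when re-checking locally ingested data against the cutoff."""
    cutoff = float(oldest)
    return [m for m in messages if float(m["ts"]) >= cutoff]

msgs = [{"ts": "1000.000100", "text": "old"}, {"ts": "90000.000100", "text": "new"}]
# Only the message from inside the 24-hour window survives.
print(filter_window(msgs, last_24h_oldest(now=100000.0)))
```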
- Amazon Q Stack: Amazon Q Business is a fully managed, generative-AI powered assistant, tailored in this use case to answer questions based on the data from the Data Stack.
  - `Identity Center` provides authentication to Amazon Q.
  - `Kendra` provides context for the responses via semantic search and supplies the sources.
  - `CloudFront` links are provided by Kendra and propagate to the responses of Amazon Q.
  - `Public Docs` links are provided by Kendra and propagate to the responses of Amazon Q.
  - Invocations are logged in a `CloudWatch` dashboard.
- Slack
  - A) Answering Questions
    - The Slack app invokes `API Gateway` with the question as part of the payload.
    - `API Gateway` invokes `Slackbot Lambda`.
    - `Slackbot Lambda` pulls the Slack token from `Secrets Manager`.
    - `Slackbot Lambda` pulls Slack parameters for responses from `SSM Parameter Store`.
    - `Kendra` is queried with the question and responds with relevant passages and sources from the documentation and the Slack data served via `CloudFront`.
      - Public docs links are returned as part of the response if the chatbot used them as a source.
      - Slack data links via `CloudFront` are returned as part of the response if the chatbot used them as a source.
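A minimal sketch of how the Slackbot Lambda might turn a Kendra response into a Slack reply with clickable source links. The response shape (`ResultItems` with `DocumentTitle`, `DocumentURI`, and `DocumentExcerpt`) follows Kendra's Query API; the formatting function itself is illustrative, not the actual implementation:

```python
def format_slack_answer(kendra_response: dict, max_sources: int = 3) -> str:
    """Build a Slack mrkdwn reply from a Kendra Query response,
    appending a clickable link for each source used."""
    lines, seen = [], set()
    for item in kendra_response.get("ResultItems", [])[:max_sources]:
        title = item["DocumentTitle"]["Text"]
        uri = item["DocumentURI"]
        excerpt = item["DocumentExcerpt"]["Text"].strip()
        lines.append(f"> {excerpt}")
        if uri not in seen:
            seen.add(uri)
            lines.append(f"Source: <{uri}|{title}>")  # Slack link syntax
    return "\n".join(lines) or "Sorry, I couldn't find anything relevant."

# Hypothetical Kendra response for demonstration.
fake = {"ResultItems": [{
    "DocumentTitle": {"Text": "Basic Usage"},
    "DocumentURI": "https://spack.readthedocs.io/en/latest/basic_usage.html#installing",
    "DocumentExcerpt": {"Text": "Use `spack install` to build a package."},
}]}
print(format_slack_answer(fake))
```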
  - B) Reporting
    - `Metrics Lambda` is triggered every day at 0:00 UTC.
    - Every time the `Slackbot Lambda` is triggered, the invocation is captured in `CloudWatch` as a metric.
    - `Metrics Lambda` pulls the daily data from `CloudWatch`.
    - `Metrics Lambda` pulls the Slack token from `Secrets Manager`.
    - `Metrics Lambda` pulls Slack parameters for responses from `SSM Parameter Store`.
    - `Metrics Lambda` sends a message on Slack with the daily report.
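The reporting step can be sketched as follows, assuming the metric is pulled with CloudWatch's `get_metric_statistics` using the `Sum` statistic (the metric name and message wording here are illustrative):

```python
from datetime import datetime, timedelta, timezone

def daily_report(datapoints, metric_name: str = "SlackbotInvocations") -> str:
    """Summarize CloudWatch datapoints (shaped like get_metric_statistics
    output with the Sum statistic) into a one-line Slack report for the
    previous UTC day."""
    total = sum(dp.get("Sum", 0) for dp in datapoints)
    day = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%d")
    return f"Daily report for {day}: {metric_name} = {int(total)} invocations"

dps = [{"Sum": 3.0}, {"Sum": 2.0}]
print(daily_report(dps))
```

In the real flow, the returned string would be posted to the Child Channel via Slack's `chat.postMessage` API using the token from Secrets Manager.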
- Active AWS account
- Docker
- AWS CLI
- Slack workspace
  - `Parent Channel`: public Slack channel where users will interact with the Slack chatbot.
  - `Child Channel`: private Slack channel where the metrics report will be sent.
- If building from an ARM-based machine (Apple M series), change the `architecture` parameter of `documentation_processing_lambda` in `stacks/data.py` to `lambda_.Architecture.ARM_64`.
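For reference, the change might look like the following fragment inside `stacks/data.py` (surrounding construct code elided; the construct ID shown here is a placeholder, but the parameter names follow the CDK `aws_lambda.Function` API):

```python
from aws_cdk import aws_lambda as lambda_

documentation_processing_lambda = lambda_.Function(
    self, "DocumentationProcessingLambda",
    # ... other parameters unchanged ...
    architecture=lambda_.Architecture.ARM_64,  # default is X86_64
)
```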
- Ensure Bedrock model access for `anthropic.claude-v2:1`.
- Enable IAM Identity Center for your account.
```shell
pip install -r requirements.txt
cdk bootstrap
cdk synth
cdk deploy --all
```
- Create a Slack app:
  - Go to: https://api.slack.com/apps
  - Select `Create an App`
  - Select `From a manifest`
  - Select the `Spack` workspace
  - Select the `YAML` tab, copy in the contents of the app manifest, and select `Next`
  - Select `Create`
- Install App:
  - On the left pane, under Settings, select `Install App`
  - Select `Install to <workspace>`
  - Select `Allow`
  - Copy the Bot User OAuth Token
  - Go to: AWS Console -> AWS Secrets Manager -> Secrets -> SlackAccessKey### -> Overview -> Retrieve Secret Value -> Edit
  - Paste the value where it says `place-holder-access-key`
  - Click `Save`
- Enter the endpoint (once the app has finished deploying from Step 1):
  - Go to: AWS Console -> CloudFormation -> Stacks -> SlackStack -> Outputs -> SlackBotEndpointOutput (copy the Value)
  - Enable events
  - Paste the value under `Request URL`
  - On the bottom right of the screen, select `Save Changes`
- Invite the bot to the channels (Parent and Child):
  - Select the channel
  - On the upper right, next to huddle, click on the three dots
  - Select `edit settings`
  - Go to the `integrations` tab
  - Select `Add an app`
  - Under the `In your workspace` tab, add the chatbot
- Enter Slack workspace information:
  - Obtain the following information from Slack:
    - Parent channel ID
    - Child channel ID
      - Note: to obtain a channel ID, right-click the channel -> View Channel Details -> About -> Copy channel ID
    - Slackbot member ID: under Apps -> right-click the bot -> View App Details -> Copy Member ID. If the Slack bot is not under Apps, click `Add apps` and select the Slackbot.
  - Enter the above information into AWS:
    - Go to: AWS Console -> Systems Manager -> Application Management -> Parameter Store -> My parameters
    - Select `/Radiuss/Spack/ChildChannelId` and edit. Enter the child channel ID as the value and select `Save changes`.
    - Select `/Radiuss/Spack/ParentChannelId` and edit. Enter the parent channel ID as the value and select `Save changes`.
    - Select `/Radiuss/Spack/SlackbotMemberId` and edit. Enter the Slackbot member ID as the value and select `Save changes`.
- Security: it is highly recommended to rotate the Slack token periodically.
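If you prefer the CLI over the console, the same parameters can be set with `aws ssm put-parameter` (parameter names from the list above; the placeholder values must be replaced with your channel and member IDs):

```shell
aws ssm put-parameter --name /Radiuss/Spack/ParentChannelId --value "<parent-channel-id>" --overwrite
aws ssm put-parameter --name /Radiuss/Spack/ChildChannelId --value "<child-channel-id>" --overwrite
aws ssm put-parameter --name /Radiuss/Spack/SlackbotMemberId --value "<slackbot-member-id>" --overwrite
```

With `--overwrite` and no `--type`, the existing parameter type is preserved.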
- Go to: AWS Console -> Amazon Q Business -> Applications -> Radiuss -> User Access -> Manage user access
- Select `Add groups and users`
- Select `Add and assign new users`
- Select `Next`
- Enter the user's information
- Select `Next`
- Select `Add`
- Go to: AWS Console -> Amazon Q Business -> Applications -> Radiuss -> User Access -> Manage user access
- Select the user via the radio button
- Select `Edit subscription`
- Select a subscription from the `Choose subscription` dropdown. Subscription tiers are available in this link.
- `Amazon Q`: AWS Console -> Amazon Q Business -> Applications -> Radiuss -> Web experience settings -> Deployed URL
- `Slack`: Workspace -> Designated Channel -> Send a single message that starts with @SpackChatbot
- Nick Biso, Machine Learning Engineer - Amazon Web Services Inc.
- Ian Lunsford, Aerospace Cloud Consultant - Amazon Web Services Inc.
- Natasha Tchir, Machine Learning Engineer - Amazon Web Services Inc.
- Katherine Feng, Machine Learning Engineer - Amazon Web Services Inc.
See CONTRIBUTING for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.