{/* src/pages/[platform]/build-a-backend/ai/concepts/tools/index.mdx */}
</Callout>
Large language models (LLMs) are stateless text generators: they have no knowledge of the real world and can't access data on their own. For example, if you asked an LLM "what is the weather in San Jose?" it would not be able to tell you, because it does not know what today's weather is. Tools (sometimes referred to as function calling) are functions/APIs that LLMs can choose to invoke to get information about the world. This allows the LLM to answer questions with information not included in its training data, like the weather, application-specific data, and even user-specific data.
When an LLM is prompted with tools, it can choose to respond to a prompt by saying that it wants to call a tool to get some data or take an action on the user's behalf. The tool's output is then added to the conversation history so the LLM can see what data was returned. Here is a simplified flow of what happens:
1. User: "what is the weather in san jose?"
2. Code: Call LLM with this message: "what is the weather in san jose?", and let it know it has access to a tool called `getWeather` that takes an input like `{ city: string }`
3. LLM: "I want to call the 'getWeather' tool with the input `{city: 'san jose'}`"
4. Code: Run `getWeather({city: 'san jose'})` and append the results to the conversation history so far and call the LLM again
5. LLM: "In San Jose it is 72 degrees and sunny"
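The loop above can be sketched in code. This is an illustrative stub, not Amplify's implementation: `callLLM` and `getWeather` here are stand-ins that fake the model's two turns, so the orchestration logic is easy to see.

```typescript
// A minimal sketch of the tool-use loop described above. The "LLM" here is a
// stub that fakes the model's two turns; in a real app this would be a call
// to a model provider such as Bedrock.
type Message = { role: "user" | "assistant" | "tool"; content: string };

// Hypothetical tool implementation (a real one could call any weather API).
function getWeather(input: { city: string }): { value: number; unit: string } {
  return { value: 72, unit: "F" }; // mock value
}

// Stub LLM: asks for the tool on the first turn, and answers once a tool
// result is present in the conversation history.
function callLLM(history: Message[]): Message {
  const toolMsg = history.find((m) => m.role === "tool");
  if (!toolMsg) {
    // Special "I want to call a tool" message, encoded as JSON for simplicity.
    return {
      role: "assistant",
      content: JSON.stringify({ tool: "getWeather", input: { city: "san jose" } }),
    };
  }
  const result = JSON.parse(toolMsg.content);
  return { role: "assistant", content: `In San Jose it is ${result.value} degrees and sunny` };
}

function converse(prompt: string): string {
  const history: Message[] = [{ role: "user", content: prompt }];
  let reply = callLLM(history);
  const request = JSON.parse(reply.content);
  if (request.tool === "getWeather") {
    // Run the tool, append the result to the history, and re-prompt the LLM.
    const result = getWeather(request.input);
    history.push(reply, { role: "tool", content: JSON.stringify(result) });
    reply = callLLM(history);
  }
  return reply.content;
}

console.log(converse("what is the weather in san jose?"));
// → "In San Jose it is 72 degrees and sunny"
```

The key design point is that the orchestrating code, not the model, executes the tool; the model only ever sees and produces messages.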
<Callout>
Note: the LLM itself is not actually executing any function or code. It responds with a special message saying that it wants to call a tool with specific input. That tool then needs to be called, and the results returned to the LLM in the message history. For more information on tools, see the [Bedrock docs on tool use](https://docs.aws.amazon.com/bedrock/latest/userguide/tool-use.html).
</Callout>
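To make that special message concrete: with the Amazon Bedrock Converse API, a tool request arrives as a `toolUse` content block inside the assistant's message. The shape below is illustrative (the `toolUseId` value is made up); see the linked Bedrock docs for the full schema.

```typescript
// Illustrative shape of an assistant message that requests a tool call.
// The LLM returns this instead of a final answer; your code must run the
// tool and send the result back in a follow-up message.
const assistantMessage = {
  role: "assistant",
  content: [
    {
      toolUse: {
        toolUseId: "tooluse_example_id", // example id, assigned by the model
        name: "getWeather",
        input: { city: "san jose" },
      },
    },
  ],
};

console.log(assistantMessage.content[0].toolUse.name); // → "getWeather"
```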
## Tools in data schema
The default way to define tools for the LLM to use is with data models and custom queries in your data schema. When you define tools in your data schema, Amplify takes care of all the heavy lifting required to properly implement tool use, such as:
* **Describing the tools to the LLM:** because each tool is a custom query or data model that is defined in the schema, Amplify knows the input shape needed for that tool.
* **Invoking the tool with the right parameters:** after the LLM responds that it wants to call a tool, the code that initially called the LLM needs to run that tool with the arguments the LLM provided.
* **Maintaining the caller identity and authorization:** we don't want users to have access to more data through the LLM than they normally would, so when the LLM wants to invoke a tool, it is called with the user's identity. For example, if the LLM invoked a query to list Todos, it would only return that user's todos, not everyone's.
* **Re-prompting the LLM:** after the tool is executed and returns a response, the LLM needs to be re-prompted with the tool results. This process could be repeated several times if the LLM needs to invoke several tools to get the data necessary to respond to the user.
### 1. Add a custom query
In your **`amplify/data/resource.ts`** file, add a custom query.
```ts title="amplify/data/resource.ts"
// highlight-start
import { type ClientSchema, a, defineData, defineFunction } from "@aws-amplify/backend";
// highlight-end
```

### 2. Implement the query handler

The handler is the function that runs when the LLM invokes the tool:

```ts
export const handler = async (event) => {
  // This returns a mock value, but you can connect to any API, database, or other service
  return {
    value: 42,
    unit: 'C'
  };
};
```
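For context, the custom query that exposes `getWeather` as a tool is declared with `a.query()` in the schema. The sketch below shows one way this might look; the argument and return types here are assumptions for illustration, not this guide's exact schema:

```ts
const schema = a.schema({
  getWeather: a
    .query()
    .arguments({ city: a.string() })
    .returns(a.customType({ value: a.integer(), unit: a.string() }))
    .handler(a.handler.function(getWeather))
    .authorization((allow) => [allow.authenticated()]),
});
```

Declaring the query in the schema is what lets Amplify describe the tool's input shape to the LLM automatically.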
### 3. Add query function to backend
Lastly, update your **`amplify/backend.ts`** file to include the newly defined `getWeather` function.
```ts title="amplify/backend.ts"
import { defineBackend } from "@aws-amplify/backend";
import { auth } from "./auth/resource";
import { data } from "./data/resource";
// highlight-start
import { getWeather } from "./data/resource";
// highlight-end

defineBackend({
  auth,
  data,
  // highlight-start
  getWeather
  // highlight-end
});
```
## Connecting to external APIs
### 1. Create a secret
Most APIs require an API key. Get an API key from the service you are using and then store it in a [secret](/[platform]/deploy-and-host/fullstack-branching/secrets-and-vars/). If you are running code locally, you can add a secret with the command:
```bash
npx ampx sandbox secret set [name]
```
where `[name]` is the name of the secret you want to set.
### 2. Add secret to function definition
In the function definition you can add environment variables and pass in secrets using the `secret` function. Make sure the input to the `secret` function is the name you entered above.
0 commit comments