
Commit f80b846

docs(ai): add ai/concepts/tools page (#7961)
1 parent 4c00e47 commit f80b846

File tree

1 file changed: +259 −3 lines changed

  • src/pages/[platform]/build-a-backend/ai/concepts/tools


src/pages/[platform]/build-a-backend/ai/concepts/tools/index.mdx

Lines changed: 259 additions & 3 deletions
@@ -33,12 +33,268 @@ Amplify AI sections are under construction

</Callout>

Large language models (LLMs) are stateless text generators: they have no knowledge of the real world and can't access data on their own. For example, if you asked an LLM "what is the weather in San Jose?" it would not be able to tell you, because it does not know what today's weather is. Tools (sometimes referred to as function calling) are functions/APIs that LLMs can choose to invoke to get information about the world. This lets the LLM answer questions with information not included in its training data -- like the weather, application-specific data, and even user-specific data.
When an LLM is prompted with tools, it can respond to a prompt by saying that it wants to call a tool to get some data or take an action on the user's behalf. The tool's result is then added to the conversation history so the LLM can see what data was returned. Here is a simplified flow of what happens (a minimal code sketch of this loop follows the callout below):

1. User: "what is the weather in san jose?"
2. Code: Call the LLM with the message "what is the weather in san jose?", and let it know it has access to a tool called `getWeather` that takes an input like `{ city: string }`
3. LLM: "I want to call the 'getWeather' tool with the input `{city: 'san jose'}`"
4. Code: Run `getWeather({city: 'san jose'})`, append the result to the conversation history so far, and call the LLM again
5. LLM: "In san jose it is 72 degrees and sunny"

<Callout>

Note: the LLM itself does not actually execute any function or code. It responds with a special message saying that it wants to call a tool with specific input. That tool then needs to be called, and the results returned to the LLM in the message history. For more information on tools, see the [Bedrock docs on tool use](https://docs.aws.amazon.com/bedrock/latest/userguide/tool-use.html)

</Callout>
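
Here is a minimal, illustrative sketch of that loop written directly against the Bedrock Converse API (`@aws-sdk/client-bedrock-runtime`). The model ID, the stubbed `getWeather` helper, and the overall wiring are assumptions for this example only; the sections below show how Amplify generates and runs this plumbing for you when you define tools in your schema.

```ts
// Minimal sketch of the tool-use loop described above (illustration only).
import {
  BedrockRuntimeClient,
  ConverseCommand,
  type Message,
  type Tool,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient();
const modelId = "anthropic.claude-3-haiku-20240307-v1:0"; // example model

// 2. Describe the tool to the LLM: its name and the input shape it accepts
const tools: Tool[] = [
  {
    toolSpec: {
      name: "getWeather",
      description: "Returns the current weather for a city",
      inputSchema: {
        json: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  },
];

// Hypothetical tool implementation; could call any API, database, or service
const getWeather = async (input: { city: string }) => ({ value: 72, unit: "F" });

export const answer = async () => {
  // 1. The user's question starts the conversation history
  const messages: Message[] = [
    { role: "user", content: [{ text: "what is the weather in san jose?" }] },
  ];

  // Call the LLM with the messages and the tool definitions
  let response = await client.send(
    new ConverseCommand({ modelId, messages, toolConfig: { tools } })
  );

  // 3. The LLM replies that it wants to call getWeather with specific input
  while (response.stopReason === "tool_use") {
    const assistantMessage = response.output?.message;
    if (!assistantMessage) break;
    messages.push(assistantMessage);

    // 4. Run the tool and append its result to the conversation history
    for (const block of assistantMessage.content ?? []) {
      if (block.toolUse?.name === "getWeather") {
        const result = await getWeather(block.toolUse.input as { city: string });
        messages.push({
          role: "user",
          content: [
            {
              toolResult: {
                toolUseId: block.toolUse.toolUseId,
                content: [{ json: result }],
              },
            },
          ],
        });
      }
    }

    // Re-prompt the LLM with the tool result included
    response = await client.send(
      new ConverseCommand({ modelId, messages, toolConfig: { tools } })
    );
  }

  // 5. The LLM now has the data it needs to answer the user
  return response.output?.message?.content?.[0]?.text;
};
```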

## Tools in data schema

The default way to define tools for the LLM to use is with data models and custom queries in your data schema. When you define tools in your data schema, Amplify takes care of the heavy lifting required to implement them properly, such as:

* **Describing the tools to the LLM:** because each tool is a custom query or data model defined in the schema, Amplify knows the input shape needed for that tool.
* **Invoking the tool with the right parameters:** after the LLM responds that it wants to call a tool, the code that initially called the LLM needs to then invoke that tool with the arguments the LLM provided.
* **Maintaining the caller identity and authorization:** we don't want users to have access to more data through the LLM than they normally would, so when the LLM wants to invoke a tool we call it with the user's identity. For example, if the LLM wanted to invoke a query to list Todos, it would only return that user's todos and not everyone's todos.
* **Re-prompting the LLM:** after the tool is executed and returns a response, the LLM needs to be re-prompted with the tool results. This process could be repeated several times if the LLM needs to invoke several tools to get the data necessary to respond to the user.

### 1. Add a custom query

In your **`amplify/data/resource.ts`** file, add a custom query.

```ts title="amplify/data/resource.ts"
// highlight-start
import { type ClientSchema, a, defineData, defineFunction } from "@aws-amplify/backend";
// highlight-end

// highlight-start
export const getWeather = defineFunction({
  name: 'getWeather',
  entry: 'getWeather.ts'
});
// highlight-end

const schema = a.schema({
  // highlight-start
  getWeather: a.query()
    .arguments({ city: a.string() })
    .returns(a.customType({ value: a.integer(), unit: a.string() }))
    .handler(a.handler.function(getWeather))
    .authorization((allow) => allow.authenticated()),
  // highlight-end

  chat: a.conversation({
    aiModel: a.ai.model('Claude 3 Haiku'),
    systemPrompt: 'You are a helpful assistant',
    // highlight-start
    tools: [
      {
        query: a.ref('getWeather'),
        description: 'Provides the weather for a given city'
      },
    ]
    // highlight-end
  }),
});
```

### 2. Implement the custom query

Now create a new **`amplify/data/getWeather.ts`** file.

```ts title="amplify/data/getWeather.ts"
import type { Schema } from "./resource";

export const handler: Schema["getWeather"]["functionHandler"] = async (
  event
) => {
  // This returns a mock value, but you can connect to any API, database, or other service
  return {
    value: 42,
    unit: 'C'
  };
}
```
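
Note that the shape of the returned object matches the `a.customType({ value: a.integer(), unit: a.string() })` return type declared for the `getWeather` query in the schema above.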

### 3. Add query function to backend

Lastly, update your **`amplify/backend.ts`** file to include the newly defined `getWeather` function.

```ts title="amplify/backend.ts"
// highlight-start
import { getWeather } from "./data/resource";
// highlight-end

defineBackend({
  auth,
  data,
  // highlight-start
  getWeather
  // highlight-end
});
```

## Connecting to external APIs

### 1. Create a secret

Most external APIs require an API key. Get an API key from the service you are using and then store it in a [secret](/[platform]/deploy-and-host/fullstack-branching/secrets-and-vars/). If you are running your backend locally in a sandbox, you can add a secret with the command:

```
npx ampx sandbox secret set [name]
```

where `[name]` is the name of the secret you want to set.
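
For example, to create a secret named `API_KEY` (the name used in the next step):

```
npx ampx sandbox secret set API_KEY
```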

### 2. Add secret to function definition

In the function definition, you can add environment variables and pass in secrets using the `secret` function. Make sure the input to the `secret` function matches the name you entered above.

```ts title="amplify/data/resource.ts"
import {
  type ClientSchema,
  a,
  defineData,
  defineFunction,
  // highlight-start
  secret,
  // highlight-end
} from "@aws-amplify/backend";

export const getWeather = defineFunction({
  name: "getWeather",
  entry: "./getWeather.ts",
  // highlight-start
  environment: {
    API_KEY: secret("API_KEY"),
  },
  // highlight-end
});
```
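
The secret value is resolved at runtime and read through the generated `env` object (`env.API_KEY`), as shown in the next step.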

### 3. Use the secret to call the API

```ts title="amplify/data/getWeather.ts"
// highlight-start
import { env } from "$amplify/env/getWeather";
// highlight-end
import type { Schema } from "./resource";

export const handler: Schema["getWeather"]["functionHandler"] = async (
  event
) => {
  // highlight-start
  const res = await fetch(
    `http://api.weatherstack.com/current?access_key=${
      env.API_KEY
    }&units=f&query=${encodeURIComponent(event.arguments.city ?? "")}`
  );

  const weather = await res.json();

  return {
    value: weather.current.temperature,
    unit: weather.request.unit,
  };
  // highlight-end
};
```

## Custom Lambda Tools

Conversation routes can also have completely custom tools defined in a Lambda handler.

### 1. Create your custom conversation handler function

```ts title="amplify/custom-conversation-handler/resource.ts"
import { defineConversationHandlerFunction } from '@aws-amplify/backend-ai/conversation';

export const customConversationHandler = defineConversationHandlerFunction({
  name: 'customConversationHandlerFunction',
  entry: './custom_handler.ts',
  models: [
    {
      modelId: 'anthropic.claude-3-haiku-20240307-v1:0',
    },
  ],
});
```

### 2. Define the custom handler function implementation

```ts title="amplify/custom-conversation-handler/custom_handler.ts"
import {
  ConversationTurnEvent,
  ExecutableTool,
  handleConversationTurnEvent,
} from '@aws-amplify/ai-constructs/conversation/runtime';
import { ToolResultContentBlock } from '@aws-sdk/client-bedrock-runtime';

// A tool the handler executes directly when the LLM asks for it
const thermometer: ExecutableTool = {
  name: 'thermometer',
  description: 'Returns current temperature in a city',
  execute: (input): Promise<ToolResultContentBlock> => {
    if (input && typeof input === 'object' && 'city' in input) {
      if (input.city === 'Seattle') {
        return Promise.resolve({
          text: `75F`,
        });
      }
    }
    return Promise.resolve({
      text: 'unknown',
    });
  },
  // JSON schema describing the input the LLM should provide
  inputSchema: {
    json: {
      type: 'object',
      properties: {
        city: {
          type: 'string',
          description: 'The city name',
        },
      },
      required: ['city'],
    },
  },
};

/**
 * Handler with simple tool.
 */
export const handler = async (event: ConversationTurnEvent) => {
  await handleConversationTurnEvent(event, {
    tools: [thermometer],
  });
};
```
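
In this setup, `handleConversationTurnEvent` runs the tool-use loop described at the top of this page for each turn: when the model asks to call `thermometer`, the handler executes it and sends the result back to the model before the final response is returned to the conversation.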

### 3. Update conversation route

Finally, update your conversation route definition to use the custom handler.

```ts title="amplify/data/resource.ts"
import { a, defineData } from '@aws-amplify/backend';
// highlight-start
import { customConversationHandler } from '../custom-conversation-handler/resource';
// highlight-end

const schema = a.schema({
  customToolChat: a.conversation({
    aiModel: a.ai.model('Claude 3 Haiku'),
    systemPrompt: 'You are a helpful chatbot. Respond in 20 words or less.',
    // highlight-start
    handler: customConversationHandler,
    // highlight-end
  }),
});
```
