Feature request: Add AppSync Resolver Event Schema to Parser #3283
Thanks for opening your first issue here! We'll come back to you as soon as we can.
Hi @svozza, thank you for opening this issue. Before deciding whether to move forward or not, I'd like to ask for @leandrodamascena's opinion on this, since I noticed that the model/schema is also missing from the Python version of Powertools for AWS. Leo, do you know why we didn't add this in Parser, and if it makes sense to add it?
I thought the AppSync resolver was supported in Python: https://docs.powertools.aws.dev/lambda/python/latest/utilities/data_classes/#appsync-resolver. Or is that different?
That's different. Powertools for AWS Lambda (Python) has two utilities that share some overlap: Data Classes and Parser. The main goal of the former is to provide type hinting and code completion for common event types at authoring time, as well as some basic runtime benefits around decoding/deserializing nested fields. In this version of Powertools for AWS we opted to skip Data Classes entirely and focus on Parser.

In the TypeScript ecosystem there are already established packages for AWS event types, such as `@types/aws-lambda`. If you want to simply type the event in your function, you can use the type from that library (which comes already installed with Powertools for AWS) roughly like this:

```ts
import type { AppSyncResolverHandler } from 'aws-lambda';

// AppSyncResolverHandler takes the resolver arguments and result types as generics
export const handler: AppSyncResolverHandler<Record<string, unknown>, unknown> =
  async (event, context) => {
    // event is now of type AppSyncResolverEvent
    // ...
  };
```

Note that this approach, while entirely valid, does not provide any runtime validation or parsing. If instead we were to add a Zod schema to the Parser utility, you'd be able to use it to actually parse the event at runtime and ensure that the event has the shape you expected.
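For illustration, this is roughly what the runtime-validation side looks like today with a hand-rolled schema (a minimal sketch: `MyAppSyncEventSchema` is a hypothetical, deliberately partial schema covering only a couple of fields, not the built-in one proposed in this issue):

```ts
import type { AppSyncResolverHandler } from 'aws-lambda';
import { z } from 'zod';

// Hypothetical, partial schema - just enough to show the difference between
// compile-time typing and runtime validation.
const MyAppSyncEventSchema = z.object({
  arguments: z.record(z.any()),
  info: z.object({
    fieldName: z.string(),
    parentTypeName: z.string(),
  }),
});

export const handler: AppSyncResolverHandler<Record<string, unknown>, unknown> =
  async (event) => {
    // Throws a ZodError at runtime if the event doesn't match the schema,
    // instead of silently trusting the compile-time type.
    const parsed = MyAppSyncEventSchema.parse(event);
    return { field: parsed.info.fieldName };
  };
```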
Yes, it's the runtime parsing I'm interested in, not the typing.
Hello folks! I think it makes sense to add the AppSync schema to the Parser in TypeScript and Python. We just need to make some considerations in the documentation and implementation.

Unlike other services such as ALB, API Gateway, or Lambda Function URLs, where you can't mutate the entire payload, in AppSync you can mutate the payload by using Lambda Resolvers plus custom code in JS/VTL, and in that case the parser model is kind of useless. We need to make it very clear that we only support Direct Resolvers and not custom resolvers. AppSync also behaves differently when working with single resolvers versus batch resolvers, and there is another difference when resolving parent fields.

I can help provide feedback on the TypeScript side if we decide to go ahead with this implementation, and after that I can implement it in Python. If we decide to go ahead with this, can you help us validate it @svozza?
Yes, fully agreed. I would envisage two separate schemas, where the batch one is just something like:

```ts
const AppSyncResolverEvent = z.object({ /* ... */ });
// ...
const BatchAppSyncResolverEvent = z.array(AppSyncResolverEvent);
```
Absolutely! As I said, I'm even happy to implement it.
Thank you both! @svozza, thank you also for offering to help. I'd say as a first step let's start with writing up a spec with all the schemas we're gonna add here in this issue, so we can align on implementation even before moving on to the PR. This way Leandro can also check that it makes sense for Python and then, when we agree, we can move to the PR, and Python can add it whenever they see fit.
This is a modified version of the AppSync event schema that I am currently using (I've made it more generic, as mine is for a very specific resolver event):

```ts
const AppSyncIamIdentity = z.object({
accountId: z.string(),
cognitoIdentityPoolId: z.string(),
cognitoIdentityId: z.string(),
sourceIp: z.array(z.string()),
username: z.string(),
userArn: z.string(),
cognitoIdentityAuthType: z.string(),
cognitoIdentityAuthProvider: z.string(),
});
const AppSyncCognitoIdentity = z.object({
sub: z.string(),
issuer: z.string(),
username: z.string(),
claims: z.any(),
sourceIp: z.array(z.string()),
defaultAuthStrategy: z.string(),
groups: z.array(z.string()).nullable(),
});
const AppSyncOidcIdentity = z.object({
claims: z.any(),
issuer: z.string(),
sub: z.string(),
});
const AppSyncLambdaIdentity = z.object({
resolverContext: z.any(),
});
const AppSyncIdentity = z.union([
AppSyncCognitoIdentity,
AppSyncIamIdentity,
AppSyncLambdaIdentity,
AppSyncOidcIdentity,
]);
const AppSyncResolverEvent = z.object({
arguments: z.record(z.any()),
identity: z.optional(AppSyncIdentity),
source: z.record(z.any()).nullable(),
request: z.object({
headers: z.record(z.string()),
}),
info: z.object({
selectionSetList: z.array(z.string()),
selectionSetGraphQL: z.string(),
parentTypeName: z.string(),
fieldName: z.string(),
variables: z.record(z.any()),
}),
prev: z
.object({
result: z.record(z.any()),
})
.nullable(),
stash: z.record(z.any()),
});
```

We need to keep this as generic as possible because most users are going to want to extend this schema to parse the `arguments` of their specific queries and mutations. Unfortunately, it doesn't look like `z.discriminatedUnion` supports a nested discriminator key such as `info.fieldName`:

```ts
const getPost = AppSyncResolverEvent.extend({
info: z.object({fieldName: z.literal('getPost')}),
  arguments: z.object({
id: z.string()
}),
});
const addPost = AppSyncResolverEvent.extend({
info: z.object({fieldName: z.literal('addPost')}),
  arguments: z.object({
id: z.string(),
author: z.string(),
title: z.string(),
content: z.string(),
url: z.string().url()
}),
});
// info.fieldName is not supported
const EventSchema = z.discriminatedUnion('info.fieldName', [
getPost,
addPost
]);
```

My guess is that users will need to work around this in some other way. I'm also not sure how we should handle batch resolvers.
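Coming back to the `z.discriminatedUnion` limitation, one purely illustrative workaround (reusing the `getPost` and `addPost` schemas from the snippet above) is a plain union, which still validates correctly but produces less targeted error messages than a discriminated union would:

```ts
// Plain union: Zod tries each branch in turn, so parsing still works, but a
// failure reports issues from every branch instead of just the one matching
// info.fieldName.
const EventSchema = z.union([getPost, addPost]);
```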
I've found a nice pattern for associating arguments with their corresponding query or mutation:

```ts
const event = {
arguments: {
id: 'my identifier',
},
source: null,
request: {
// ...
},
prev: null,
info: {
// ...
parentTypeName: 'Query',
fieldName: 'getPost',
variables: {},
},
stash: {},
};
const fieldArgumentSchemas = {
getPost: z.object({
id: z.string(),
}),
addPost: z.object({
id: z.string(),
author: z.string(),
title: z.string(),
content: z.string(),
url: z.string().url(),
}),
// ...
};
const fieldNames = Object.keys(fieldArgumentSchemas);
const EventSchema = AppSyncResolverEvent.extend({
info: z.object({
fieldName: z.enum([fieldNames[0], ...fieldNames.slice(1)]),
}),
}).superRefine((val, ctx) => {
fieldArgumentSchemas[val.info.fieldName].parse(val.arguments);
});
EventSchema.parse(event);
```

Would there be a way to encapsulate this in an envelope, as it would be a very common use case? My main concern is that the API would be different from the others because the envelope would have to take an argument:

```ts
const fieldArgumentSchemas = {
getPost: z.object({
id: z.string(),
}),
addPost: z.object({
id: z.string(),
author: z.string(),
title: z.string(),
content: z.string(),
url: z.string().url(),
}),
};
export const handler = middy(lambdaHandler).use(
parser({ schema: orderSchema, envelope: AppSyncResolverEnvelope(fieldArgumentSchemas) })
); // parser middleware returns { arguments: { id: 'my identifier' }, fieldName: 'getPost' }
```
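To make the idea above concrete, here is one possible shape for such a parameterised envelope, sketched as a plain Zod schema factory (hypothetical: `makeAppSyncFieldArgumentsSchema` is not an existing Powertools API, and a real envelope would need to follow the Parser utility's Envelope conventions):

```ts
import { z } from 'zod';

// Hypothetical factory illustrating the "envelope that takes an argument" idea:
// it builds a schema that checks the common event shape and then validates
// `arguments` against the schema registered for the incoming fieldName.
const makeAppSyncFieldArgumentsSchema = (
  fieldArgumentSchemas: Record<string, z.ZodTypeAny>
) =>
  z
    .object({
      arguments: z.record(z.any()),
      info: z.object({ fieldName: z.string() }),
    })
    .passthrough() // keep the other resolver event fields as-is
    .superRefine((val, ctx) => {
      const argumentsSchema = fieldArgumentSchemas[val.info.fieldName];
      if (argumentsSchema === undefined) {
        ctx.addIssue({
          code: z.ZodIssueCode.custom,
          path: ['info', 'fieldName'],
          message: `No arguments schema registered for field ${val.info.fieldName}`,
        });
        return;
      }
      const result = argumentsSchema.safeParse(val.arguments);
      if (!result.success) {
        for (const issue of result.error.issues) {
          ctx.addIssue({
            code: z.ZodIssueCode.custom,
            path: ['arguments', ...issue.path],
            message: issue.message,
          });
        }
      }
    });

// Usage with the fieldArgumentSchemas object from the snippet above:
// const EventSchema = makeAppSyncFieldArgumentsSchema(fieldArgumentSchemas);
```

The sketch is only meant to show that the per-field lookup can be encapsulated behind a single call; how that maps onto the existing envelope API is exactly the open question raised above.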
Quick question about the testing. There's an existing AppSync resolver test event, but I would like to try multiple configurations of the event, i.e. the different identity types (OIDC, IAM, etc.). Should I follow the apigw-rest pattern and create a folder for all these different event types, or should I add them as flat files in the main folder?
Hi @svozza, sorry about the late reply. I just saw your PR and you went in the right direction - we are gradually moving the tests to follow that pattern but haven't been able to do it just yet. Regarding the actual schemas, the proposal seems sensible based on the events you shared - before we merge it I'd like to do some tests with a build of the utility by deploying an AppSync API and seeing how things are parsed. We can continue the discussion & review in the PR. Thank you again for the bias for action!
Yes, I think this is a good idea. In fact, if you look at the `AppSyncIamIdentity` schema, I've already made several of its fields nullable:

```ts
const AppSyncIamIdentity = z.object({
accountId: z.string(),
cognitoIdentityPoolId: z.string().nullable(),
cognitoIdentityId: z.string().nullable(),
sourceIp: z.array(z.string()),
username: z.string(),
userArn: z.string(),
cognitoIdentityAuthType: z.string().nullable(),
cognitoIdentityAuthProvider: z.string().nullable(),
});
```

The reason I made them nullable is because, during my testing, I executed AppSync functions with this auth type to inspect the resulting event payloads, and those fields came back as `null`.
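As a quick illustration (the values below are made-up placeholders, not captured from a real account), the nullable fields let an IAM-authorised identity like this pass validation:

```ts
// Illustrative payload: with IAM auth the Cognito-related fields can be null,
// which the schema above now accepts.
const iamIdentity = {
  accountId: '123456789012',
  cognitoIdentityPoolId: null,
  cognitoIdentityId: null,
  sourceIp: ['203.0.113.10'],
  username: 'AROAEXAMPLEID:session-name',
  userArn: 'arn:aws:sts::123456789012:assumed-role/example-role/session-name',
  cognitoIdentityAuthType: null,
  cognitoIdentityAuthProvider: null,
};

console.log(AppSyncIamIdentity.safeParse(iamIdentity).success); // true
```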
Use case
I would like to use the Powertools Parser utility to parse the AppSync Lambda resolver event (the fields are described here: https://docs.aws.amazon.com/appsync/latest/devguide/resolver-context-reference.html). It would be nice to be able to use an off-the-shelf Zod schema from Powertools rather than hand-rolling the schema myself, as I am currently doing. Happy to work on this PR myself.
Solution/User Experience
It would work like any of the other built-in schemas that Powertools currently supports: https://docs.powertools.aws.dev/lambda/typescript/latest/utilities/parser/#built-in-schemas.
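For example, the user experience could look something like the following (the schema name `AppSyncResolverSchema` and its final shape are placeholders to be decided during implementation; the middleware usage mirrors the existing built-in schemas):

```ts
import middy from '@middy/core';
import type { Context } from 'aws-lambda';
import { parser } from '@aws-lambda-powertools/parser/middleware';
// Hypothetical: the built-in schema proposed in this issue; the actual export
// name would be decided in the PR.
import { AppSyncResolverSchema } from '@aws-lambda-powertools/parser/schemas';
import { z } from 'zod';

type ParsedAppSyncEvent = z.infer<typeof AppSyncResolverSchema>;

const lambdaHandler = async (
  event: ParsedAppSyncEvent,
  _context: Context
) => {
  // By this point the event has been validated against the AppSync resolver
  // event shape at runtime.
  return event.arguments;
};

export const handler = middy(lambdaHandler).use(
  parser({ schema: AppSyncResolverSchema })
);
```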
Alternative solutions
Users can write the Zod schema themselves.
Future readers
Please react with 👍 and your use case to help us understand customer demand.