
Stripe - Data too long exception #3350

Open
derXear opened this issue Feb 7, 2025 · 0 comments
Labels
type: bug 🐛 Something isn't working


derXear commented Feb 7, 2025

Describe the bug
Using the Stripe plugin with the remix storefront leads to the following error message:

[server] warn 2/6/25, 11:26 AM - [Vendure Server] Job data for "update-search-index" is too long to store with the mysql driver (145kb).
[server] The following keys were truncated:  
[server] /home/georg/Develop/bigrep/repo/shop/vendure-shop/node_modules/typeorm/src/driver/mysql/MysqlQueryRunner.ts:246
[server]                                 new QueryFailedError(query, parameters, err),
[server]                                 ^
...
[server]   sqlMessage: "Data too long for column 'data' at row 1",
[server]   sql: 'INSERT INTO `job_record`(`createdAt`, `updatedAt`, `queueName`, `data`, `state`, `progress`, `result`, `error`, `startedAt`, `settledAt`, `isSettled`, `retries`, `attempts`, `id`) VALUES (DEFAULT, DEFAULT, \'update-search-index\', \'{\\"type\\":\\"update-variants\\" ...
,\\"body\\":{\\"0\\":123,\\"1\\":10,\\"2\\":32,\\"3\\":32,\\"4\\":34,\\"5\\":105,\\"6\\":100,
...

The payment itself is processed successfully in both Stripe and Vendure, but there appears to be a side effect in the SearchIndexService.updateVariants function, probably triggered by the stock level change.
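For context, the size guard the warning refers to comes from MySQL's TEXT column type, which holds at most 65,535 bytes (64 KiB − 1), so a ~145 kB job payload cannot fit. A minimal sketch of the arithmetic (the payload shape below is a made-up stand-in, not the real job data):

```typescript
import { Buffer } from 'buffer';

// MySQL's TEXT type stores at most 65,535 bytes (2^16 - 1); this is the
// limit the SqlJobQueueStrategy warning is guarding against.
const TEXT_COLUMN_MAX_BYTES = 64 * 1024 - 1; // 65,535

function fitsInTextColumn(data: unknown): boolean {
    const serialized = JSON.stringify(data);
    return Buffer.byteLength(serialized, 'utf8') <= TEXT_COLUMN_MAX_BYTES;
}

// A job payload that drags along a large webhook rawBody, as in the report:
const jobData = {
    type: 'update-variants',
    ctx: { rawBody: 'x'.repeat(145 * 1024) }, // ~145 kB, matching the warning
};

console.log(fitsInTextColumn({ type: 'update-variants' })); // true
console.log(fitsInTextColumn(jobData)); // false
```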

To Reproduce
Steps to reproduce the behavior:

  1. Start Vendure v3.1.2 with the payments-plugin and DefaultJobQueuePlugin, with MySQL and Stripe configured
  2. Start the remix-storefront
  3. Perform a Stripe payment
  4. See error

In the StripeController class, with a breakpoint set at the constructEventFromPayload function call:

@Post('stripe')
    async webhook(
        @Headers('stripe-signature') signature: string | undefined,
        @Req() request: RequestWithRawBody,
        @Res() response: Response,
    ): Promise<void> {
        ...

        const outerCtx = await this.createContext(channelToken, languageCode, request);

        await this.connection.withTransaction(outerCtx, async (ctx: RequestContext) => {
            const order = await this.orderService.findOneByCode(ctx, orderCode);

            if (!order) {
                throw new Error(
                    `Unable to find order ${orderCode}, unable to settle payment ${paymentIntent.id}!`,
                );
            }

            try {
                // Throws an error if the signature is invalid
                const event = await this.stripeService.constructEventFromPayload(ctx, order, request.rawBody, signature);

                // TODO Clearing the buffer here fixes the issue
                request.rawBody = Buffer.alloc(0);
                request.body = undefined;

In my case the request.rawBody is very large, and it is part of the context that SearchIndexService uses to create a job to reindex the changed variants. This stores the rawBody as part of job.data in the database (when using the DefaultJobQueuePlugin).
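This also explains the `"0":123,"1":10,...` pattern in the logged INSERT: when a Buffer loses its prototype (e.g. via an object spread or a plain clone during job serialization), JSON.stringify emits one numeric key per byte. A sketch (illustrative only; the actual cloning path inside Vendure may differ):

```typescript
import { Buffer } from 'buffer';

// The first bytes of a JSON webhook body: '{' = 123, '\n' = 10, ' ' = 32, ...
const rawBody = Buffer.from('{\n  "id');

// Spreading a Buffer into a plain object copies its indexed byte properties,
// so serialization produces one numeric key per byte.
const asPlainObject = { ...rawBody };
console.log(JSON.stringify(asPlainObject));
// → {"0":123,"1":10,"2":32,"3":32,"4":34,"5":105,"6":100}
```

Each raw byte becomes several characters of JSON this way, so even a modest webhook body can push job.data past the 64 KiB limit.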

Expected behavior
In my opinion, the rawBody / body from the request should not be stored in the job_record table. I see two possible options to achieve this:

Option A:

Immediately clear the rawBody in Stripe plugin after it was validated as seen above:

// Throws an error if the signature is invalid
const event = await this.stripeService.constructEventFromPayload(ctx, order, request.rawBody, signature);

// TODO Clearing the buffer here fixes the issue
request.rawBody = Buffer.alloc(0);
request.body = undefined;

Option B:

Improve the SqlJobQueueStrategy.constrainDataSize function to also truncate oversized nested objects, instead of only checking string lengths:

private constrainDataSize<Data extends JobData<Data> = object>(job: Job<Data>): Data | undefined {
        const type = this.rawConnection?.options.type;
        if (type === 'mysql' || type === 'mariadb') {
            const stringified = JSON.stringify(job.data);
            if (64 * 1024 <= stringified.length) {
                const truncatedKeys: Array<{ key: string; size: number }> = [];
                const reduced = JSON.parse(stringified, (key, value) => {
                    if (typeof value === 'string' && 2048 < value.length) {
                        truncatedKeys.push({ key, size: value.length });
                        return `[truncated - originally ${value.length} bytes]`;
                    // ADDED ->
                    } else if (typeof value === 'object' && value !== null) {
                        // TODO to be implemented - check serialized object size
                        const objectSize = getObjectSize(value);
                        if (objectSize > 2048) {
                            truncatedKeys.push({ key, size: objectSize });
                            return `[truncated - originally ${objectSize} bytes]`;
                        }
                    }
                    // ADDED <-
                    return value;
                });
               ...
                return reduced;
            }
        }
    }
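The getObjectSize helper above is left as a TODO; a minimal sketch of it, assuming the size that matters is the serialized JSON byte length (since that is what actually lands in the mysql `data` column):

```typescript
import { Buffer } from 'buffer';

// Sketch of the hypothetical getObjectSize() helper from Option B: measure
// an object's cost in job.data as the byte length of its JSON serialization.
function getObjectSize(value: object): number {
    try {
        return Buffer.byteLength(JSON.stringify(value), 'utf8');
    } catch {
        // Circular structures cannot be serialized; treat them as oversized
        // so they get truncated rather than crashing the job queue.
        return Number.MAX_SAFE_INTEGER;
    }
}
```

Note that a JSON.parse reviver visits values bottom-up, so by the time an object is measured, its oversized string children have already been truncated; the object check would therefore only fire for objects that are still too large in aggregate.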

This might be the better way in general, as it avoids polluting the database with unnecessary data.

Environment (please complete the following information):

  • @vendure/core: v3.1.2 (same for @vendure/payments-plugin)
  • Node.js version: v20
  • Database: mysql2 client v3.12.0
@derXear derXear added the type: bug 🐛 Something isn't working label Feb 7, 2025
@michaelbromley michaelbromley moved this to 📦 Backlog in Vendure OS Roadmap Feb 9, 2025