
feat: Add edge compatibility for custom frameworks and Next.JS #918

Merged

Changes from 9 commits (33 commits total)
94138c5
Add support for polyfilling buffer and node for next
deepjyoti30Alt Aug 30, 2024
94300b9
Use querystringify instead of querystring for edge compatibility
deepjyoti30Alt Aug 30, 2024
5262e6b
Add support for parse implementation using URLSearchParams
deepjyoti30Alt Aug 30, 2024
7f9f61e
Add tests for parseParams
deepjyoti30Alt Aug 30, 2024
97d723e
Run build-pretty on all files
deepjyoti30Alt Aug 30, 2024
ab4f5f9
Add a workflow for testing Next.JS with edge runtime
deepjyoti30Alt Aug 30, 2024
a0145dc
Add support for gzip compression
deepjyoti30Alt Aug 30, 2024
aad18ad
Add support for br compression
deepjyoti30Alt Aug 30, 2024
028edd8
Disable brotli compression in edge
deepjyoti30Alt Aug 30, 2024
15ac624
Remove unnecessary log
deepjyoti30Alt Aug 31, 2024
5a9649d
Replace parseParams with one URLSearchParams alternative
deepjyoti30Alt Sep 2, 2024
a9d6535
Add support for throwing a proper error when brotli doesn't work
deepjyoti30Alt Sep 2, 2024
7cd8d97
Add support for CF workflow to test edge function compatibility
deepjyoti30Alt Sep 3, 2024
cbfbc63
Get rid of netflify edge test function
deepjyoti30Alt Sep 3, 2024
4a0b2c3
Remove newly added conflicting route from netlify next test wf
deepjyoti30Alt Sep 3, 2024
fcc69f4
Rename api/auth dynamic variable in next app router
deepjyoti30Alt Sep 3, 2024
39aaca4
Use ponyfilled process to refactor checking for test env
deepjyoti30Alt Sep 3, 2024
bac1e8f
Refactor process and expose an util function to make it accessible
deepjyoti30Alt Sep 3, 2024
476cda0
Add fallback implementation for accessing Buffer
deepjyoti30Alt Sep 3, 2024
a4645da
Refactor base64 encoding/decoding to use ponyfilled buffer
deepjyoti30Alt Sep 3, 2024
309c2e4
Reuse ponyfilled buffer at more places
deepjyoti30Alt Sep 3, 2024
66e3cdb
Remove polyfill functionality for buffer and process
deepjyoti30Alt Sep 3, 2024
043c05a
Remove unused test file
deepjyoti30Alt Sep 4, 2024
d2e4c32
Update some functions according to comments in PR
deepjyoti30Alt Sep 4, 2024
1627d3b
Update workflow to use more values from secrets
deepjyoti30Alt Sep 4, 2024
35a3748
Get rid of pages directory in next emailpassword example
deepjyoti30Alt Sep 4, 2024
df2ac22
Remove use of getBuffer in loopback framework
deepjyoti30Alt Sep 4, 2024
2f2fb62
Add error handling for possible malformed body in lambda request
deepjyoti30Alt Sep 4, 2024
58ab0e5
Feat/hono api example repo (#2)
deepjyoti30Alt Sep 4, 2024
a26942a
Merge branch '20.0' into feat/edge-compatibility-next
rishabhpoddar Sep 5, 2024
793925a
Address requested changes
deepjyoti30Alt Sep 5, 2024
5114804
Fix a typo regarding boxPrimitives check
deepjyoti30Alt Sep 5, 2024
d74b20d
Update changelog with details of changes
deepjyoti30Alt Sep 5, 2024
40 changes: 40 additions & 0 deletions .github/workflows/test-edge-function-next.yml
@@ -0,0 +1,40 @@
name: "Test edge function compatibility for NextJS"
on: push
jobs:
    test:
        runs-on: ubuntu-latest
        env:
            NETLIFY_AUTH_TOKEN: ${{ secrets.netlify_auth_token }}
            NETLIFY_SITE_ID: ${{ secrets.netlify_site_id }}
            TEST_DEPLOYED_VERSION: true
        defaults:
            run:
                working-directory: examples/next/with-emailpassword
        steps:
            - uses: actions/checkout@v2
            - run: echo $GITHUB_SHA
            - run: npm install git+https://github.com:supertokens/supertokens-node.git#$GITHUB_SHA
            - run: npm install
            - run: npm install [email protected] [email protected] puppeteer@^11.0.0 isomorphic-fetch@^3.0.0

            # Step to set the runtime to edge in all files in app/api/ and pages/api/
            - name: Add runtime export to API files
              run: |
                  find app/api -type f -name "*.js" -exec sed -i '1s/^/export const runtime = "edge";\n/' {} +
                  find pages/api -type f -name "*.js" -exec sed -i '1s/^/export const runtime = "edge";\n/' {} +

            - run: netlify deploy --alias 0 --build --json --auth=$NETLIFY_AUTH_TOKEN > deployInfo.json
            - run: cat deployInfo.json
            - run: |
                  ( \
                      (echo "=========== Test attempt 1 ===========" && npx mocha --no-config --timeout 80000 test/**/*.test.js) || \
                      (echo "=========== Test attempt 2 ===========" && npx mocha --no-config --timeout 80000 test/**/*.test.js) || \
                      (echo "=========== Test attempt 3 ===========" && npx mocha --no-config --timeout 80000 test/**/*.test.js) \
                  )
            - name: The job has failed
              if: ${{ failure() }}
              uses: actions/upload-artifact@v3
              with:
                  name: screenshots
                  path: |
                      ./**/*screenshot.jpeg
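The "Add runtime export to API files" step above rewrites every API route so it opts into the edge runtime before deployment. A minimal standalone sketch of that sed invocation (throwaway paths under /tmp are an assumption here, as is GNU sed for in-place `-i` editing):

```shell
# Hypothetical sample API route, standing in for the example app's routes.
mkdir -p /tmp/sed-demo/app/api
printf 'export default function handler() {}\n' > /tmp/sed-demo/app/api/route.js

# Prepend the edge runtime export to line 1 of every .js file,
# exactly as the workflow step does for app/api and pages/api.
find /tmp/sed-demo/app/api -type f -name "*.js" \
    -exec sed -i '1s/^/export const runtime = "edge";\n/' {} +

# The file now starts with the runtime export, followed by the original code.
head -2 /tmp/sed-demo/app/api/route.js
```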
3 changes: 1 addition & 2 deletions lib/build/framework/awsLambda/framework.js
@@ -12,15 +12,14 @@
 const response_1 = require("../response");
 const utils_2 = require("../utils");
 const constants_1 = require("../constants");
 const supertokens_1 = __importDefault(require("../../supertokens"));
-const querystring_1 = require("querystring");
 class AWSRequest extends request_1.BaseRequest {
     constructor(event) {
         super();
         this.getFormDataFromRequestBody = async () => {
             if (this.event.body === null || this.event.body === undefined) {
                 return {};
             } else {
-                const parsedUrlEncodedFormData = querystring_1.parse(this.event.body);
+                const parsedUrlEncodedFormData = utils_2.parseParams(this.event.body);
                 return parsedUrlEncodedFormData === undefined ? {} : parsedUrlEncodedFormData;
             }
         };
1 change: 1 addition & 0 deletions lib/build/framework/utils.d.ts
@@ -4,6 +4,7 @@
 import type { Request, Response } from "express";
 import type { IncomingMessage } from "http";
 import { ServerResponse } from "http";
 import type { HTTPMethod } from "../types";
+export declare function parseParams(string: string): object;
 export declare function getCookieValueFromHeaders(headers: any, key: string): string | undefined;
 export declare function getCookieValueFromIncomingMessage(request: IncomingMessage, key: string): string | undefined;
 export declare function getHeaderValueFromIncomingMessage(request: IncomingMessage, key: string): string | undefined;
163 changes: 143 additions & 20 deletions lib/build/framework/utils.js
@@ -13,19 +13,155 @@
* License for the specific language governing permissions and limitations
* under the License.
*/
var __asyncValues =
    (this && this.__asyncValues) ||
    function (o) {
        if (!Symbol.asyncIterator) throw new TypeError("Symbol.asyncIterator is not defined.");
        var m = o[Symbol.asyncIterator],
            i;
        return m
            ? m.call(o)
            : ((o = typeof __values === "function" ? __values(o) : o[Symbol.iterator]()),
              (i = {}),
              verb("next"),
              verb("throw"),
              verb("return"),
              (i[Symbol.asyncIterator] = function () {
                  return this;
              }),
              i);
        function verb(n) {
            i[n] =
                o[n] &&
                function (v) {
                    return new Promise(function (resolve, reject) {
                        (v = o[n](v)), settle(resolve, reject, v.done, v.value);
                    });
                };
        }
        function settle(resolve, reject, d, v) {
            Promise.resolve(v).then(function (v) {
                resolve({ value: v, done: d });
            }, reject);
        }
    };
var __importDefault =
    (this && this.__importDefault) ||
    function (mod) {
        return mod && mod.__esModule ? mod : { default: mod };
    };
Object.defineProperty(exports, "__esModule", { value: true });
-exports.serializeCookieValue = exports.getCookieValueToSetInHeader = exports.setCookieForServerResponse = exports.setHeaderForExpressLikeResponse = exports.assertFormDataBodyParserHasBeenUsedForExpressLikeRequest = exports.assertThatBodyParserHasBeenUsedForExpressLikeRequest = exports.parseURLEncodedFormData = exports.parseJSONBodyFromRequest = exports.normalizeHeaderValue = exports.getHeaderValueFromIncomingMessage = exports.getCookieValueFromIncomingMessage = exports.getCookieValueFromHeaders = void 0;
+exports.serializeCookieValue = exports.getCookieValueToSetInHeader = exports.setCookieForServerResponse = exports.setHeaderForExpressLikeResponse = exports.assertFormDataBodyParserHasBeenUsedForExpressLikeRequest = exports.assertThatBodyParserHasBeenUsedForExpressLikeRequest = exports.parseURLEncodedFormData = exports.parseJSONBodyFromRequest = exports.normalizeHeaderValue = exports.getHeaderValueFromIncomingMessage = exports.getCookieValueFromIncomingMessage = exports.getCookieValueFromHeaders = exports.parseParams = void 0;
const cookie_1 = require("cookie");
const error_1 = __importDefault(require("../error"));
const constants_1 = require("./constants");
const utils_1 = require("../utils");
const content_type_1 = __importDefault(require("content-type"));
const inflation_1 = __importDefault(require("inflation"));
const pako_1 = __importDefault(require("pako"));
let brotliDecompress = null;
try {
    // @ts-ignore
    if (typeof EdgeRuntime === "undefined") brotliDecompress = require("brotli").decompress;
} catch (error) {
    brotliDecompress = null;
}
async function inflate(stream) {
    var e_1, _a, e_2, _b, e_3, _c;
    if (!stream) {
        throw new TypeError("argument stream is required");
    }
    const encoding = (stream.headers && stream.headers["content-encoding"]) || "identity";
    let decompressedData;
    if (encoding === "gzip" || encoding === "deflate") {
        const inflator = new pako_1.default.Inflate();
        try {
            for (
                var stream_1 = __asyncValues(stream), stream_1_1;
                (stream_1_1 = await stream_1.next()), !stream_1_1.done;

            ) {
                const chunk = stream_1_1.value;
                inflator.push(chunk, false);
            }
        } catch (e_1_1) {
            e_1 = { error: e_1_1 };
        } finally {
            try {
                if (stream_1_1 && !stream_1_1.done && (_a = stream_1.return)) await _a.call(stream_1);
            } finally {
                if (e_1) throw e_1.error;
            }
        }
        if (inflator.err) {
            throw new Error(`Decompression error: ${inflator.msg}`);
        }
        decompressedData = inflator.result;
    } else if (encoding === "br") {
        if (!brotliDecompress) throw new Error("Brotli decompression not supported on the platform");
        const chunks = [];
        try {
            for (
                var stream_2 = __asyncValues(stream), stream_2_1;
                (stream_2_1 = await stream_2.next()), !stream_2_1.done;

            ) {
                const chunk = stream_2_1.value;
                chunks.push(chunk);
            }
        } catch (e_2_1) {
            e_2 = { error: e_2_1 };
        } finally {
            try {
                if (stream_2_1 && !stream_2_1.done && (_b = stream_2.return)) await _b.call(stream_2);
            } finally {
                if (e_2) throw e_2.error;
            }
        }
        const compressedData = Buffer.concat(chunks);
        decompressedData = brotliDecompress(compressedData);
    } else {
        // Handle identity or unsupported encoding
        decompressedData = Buffer.concat([]);
        try {
            for (
                var stream_3 = __asyncValues(stream), stream_3_1;
                (stream_3_1 = await stream_3.next()), !stream_3_1.done;

            ) {
                const chunk = stream_3_1.value;
                decompressedData = Buffer.concat([decompressedData, chunk]);
            }
        } catch (e_3_1) {
            e_3 = { error: e_3_1 };
        } finally {
            try {
                if (stream_3_1 && !stream_3_1.done && (_c = stream_3.return)) await _c.call(stream_3);
            } finally {
                if (e_3) throw e_3.error;
            }
        }
    }
    if (typeof decompressedData === "string") return decompressedData;
    return new TextDecoder().decode(decompressedData);
}
function parseParams(string) {
    // Set up a new URLSearchParams object using the string.
    const params = new URLSearchParams(string);
    // Get an iterator for the URLSearchParams object.
    const entries = params.entries();
    const result = {};
    // Loop through the URLSearchParams object and add each key/value
    for (const [key, value] of entries) {
        // Split comma-separated values into an array.
        result[key] = value.split(",");
        // If a key does not have a value, delete it.
        if (!value) {
            delete result[key];
        }
    }
    return result;
}
exports.parseParams = parseParams;
function getCookieValueFromHeaders(headers, key) {
    if (headers === undefined || headers === null) {
        return undefined;
@@ -110,8 +246,7 @@ async function parseJSONBodyFromRequest(req) {
     if (!encoding.startsWith("utf-")) {
         throw new Error(`unsupported charset ${encoding.toUpperCase()}`);
     }
-    const buffer = await getBody(inflation_1.default(req));
-    const str = buffer.toString(encoding);
+    const str = await inflate(req);
     if (str.length === 0) {
         return {};
     }
@@ -123,8 +258,7 @@ async function parseURLEncodedFormData(req) {
     if (!encoding.startsWith("utf-")) {
         throw new Error(`unsupported charset ${encoding.toUpperCase()}`);
     }
-    const buffer = await getBody(inflation_1.default(req));
-    const str = buffer.toString(encoding);
+    const str = await inflate(req);
     let body = {};
     for (const [key, val] of new URLSearchParams(str).entries()) {
         if (key in body) {
@@ -164,7 +298,8 @@ async function assertThatBodyParserHasBeenUsedForExpressLikeRequest(method, requ
     try {
         // parsing it again to make sure that the request is parsed atleast once by a json parser
         request.body = await parseJSONBodyFromRequest(request);
-    } catch (_a) {
+    } catch (err) {
+        console.log("err:", err);
         throw new error_1.default({
             type: error_1.default.BAD_INPUT_ERROR,
             message: "API input error: Please make sure to pass a valid JSON input in the request body",
@@ -175,6 +310,7 @@ async function assertThatBodyParserHasBeenUsedForExpressLikeRequest(method, requ
 }
 exports.assertThatBodyParserHasBeenUsedForExpressLikeRequest = assertThatBodyParserHasBeenUsedForExpressLikeRequest;
 async function assertFormDataBodyParserHasBeenUsedForExpressLikeRequest(request) {
+    console.log("Entering assert 2");
     if (typeof request.body === "string") {
         try {
             request.body = Object.fromEntries(new URLSearchParams(request.body).entries());
@@ -324,16 +460,3 @@ function serializeCookieValue(key, value, domain, secure, httpOnly, expires, pat
     return cookie_1.serialize(key, value, opts);
 }
 exports.serializeCookieValue = serializeCookieValue;
-// based on https://nodejs.org/en/docs/guides/anatomy-of-an-http-transaction
-function getBody(request) {
-    return new Promise((resolve) => {
-        const bodyParts = [];
-        request
-            .on("data", (chunk) => {
-                bodyParts.push(chunk);
-            })
-            .on("end", () => {
-                resolve(Buffer.concat(bodyParts));
-            });
-    });
-}
1 change: 1 addition & 0 deletions lib/build/nextjs.d.ts

Some generated files are not rendered by default.

29 changes: 15 additions & 14 deletions lib/build/nextjs.js

1 change: 1 addition & 0 deletions lib/build/polyfill.d.ts

9 changes: 9 additions & 0 deletions lib/build/polyfill.js
5 changes: 2 additions & 3 deletions lib/ts/framework/awsLambda/framework.ts
@@ -25,12 +25,11 @@ import { HTTPMethod } from "../../types";
 import { getFromObjectCaseInsensitive, makeDefaultUserContextFromAPI, normaliseHttpMethod } from "../../utils";
 import { BaseRequest } from "../request";
 import { BaseResponse } from "../response";
-import { normalizeHeaderValue, getCookieValueFromHeaders, serializeCookieValue } from "../utils";
+import { normalizeHeaderValue, getCookieValueFromHeaders, serializeCookieValue, parseParams } from "../utils";
 import { COOKIE_HEADER } from "../constants";
 import { SessionContainerInterface } from "../../recipe/session/types";
 import SuperTokens from "../../supertokens";
 import { Framework } from "../types";
-import { parse } from "querystring";

 export class AWSRequest extends BaseRequest {
     private event: APIGatewayProxyEventV2 | APIGatewayProxyEvent;
@@ -45,7 +44,7 @@ export class AWSRequest extends BaseRequest {
         if (this.event.body === null || this.event.body === undefined) {
             return {};
         } else {
-            const parsedUrlEncodedFormData = parse(this.event.body);
+            const parsedUrlEncodedFormData = parseParams(this.event.body);
             return parsedUrlEncodedFormData === undefined ? {} : parsedUrlEncodedFormData;
         }
     };