SSR Compression missing? Document size of SSR Astro + SPA much larger in comparison to SPA without Astro #9397
I am not sure built-in compression is necessary. Node servers typically run behind reverse proxies, which are able to do it more efficiently. It would be duplicated effort, and if you want to add it, the middleware is really simple anyway:

```ts
import { defineMiddleware } from "astro:middleware";

export const onRequest = defineMiddleware(async (_context, next) => {
  const response = await next();
  return new Response(response.body?.pipeThrough(new CompressionStream("gzip")), {
    ...response,
    headers: {
      ...response.headers,
      "Content-Encoding": "gzip",
    },
  });
});
```

SvelteKit's node adapter did compress SSR responses at one point, but it was removed over a year ago for similar reasons. Can you share what version you are using?

Precompressing static resources would be a good enhancement, but the library we are using (send) does not support it and generally seems "done" on the maintenance front. It might be a while before we introduce precompression, because it would likely mean migrating to something else for static files. |
I created two SvelteKit apps about a week ago using the standard command from the docs, one opting for the Svelte 5 beta, the other for Svelte 4. I have just checked, and both SvelteKit apps compress the document, at least judging by its size. As for your suggested middleware: it works! It's good to know that nginx or Caddy can do the same; most of us don't know that. |
Hey! As @lilnasy pointed out, you can do compression as middleware if you'd like. It's not something we currently want to add to Astro, as it's commonly done already at the CDN or proxy server layer, and those layers are able to do it more efficiently. Astro doing it would be double work in most cases. Thanks for filing, though! |
It's funny that the middleware posted above increases my document size by about 20 kB. |
That didn't work for me. The below does, however:

```ts
import { defineMiddleware } from "astro:middleware";

const compression = defineMiddleware(async ({ cookies, locals }, next) => {
  const response = await next();
  if (!response.body) {
    return response;
  }
  const contentType = response.headers.get("Content-Type");
  const isCompressible =
    contentType &&
    (contentType.includes("text/") ||
      contentType.includes("application/json") ||
      contentType.includes("application/javascript") ||
      contentType.includes("application/xml"));
  if (!isCompressible) {
    return response;
  }
  const compressedBody = response.body.pipeThrough(
    new CompressionStream("gzip"),
  );
  const newHeaders = new Headers(response.headers);
  newHeaders.set("Content-Encoding", "gzip");
  return new Response(compressedBody, {
    status: response.status,
    statusText: response.statusText,
    headers: newHeaders,
  });
});
```

Bun doesn't yet support CompressionStream, so I also needed to use this: oven-sh/bun#1723 (comment)

For the record, I disagree with Astro not supporting compression out of the box. For simple setups, IMHO it's ridiculous to add an entirely new process in between your server and the client, especially if you're already putting Cloudflare in front. Node/Deno/Bun have no trouble compressing responses with good enough performance for 99% of web servers. The "duplicated effort" is the effort everyone must go through with Astro to add their own compression middleware. |
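A likely reason the shorter snippet above misbehaves: spreading a `Response` or a `Headers` object yields an empty object, because their properties (`status`, `headers`, etc.) are getters on the prototype rather than own enumerable properties, so the spread silently drops everything. A quick check, runnable in Node 18+ (the `X-Demo` header is just an illustration):

```javascript
// Object spread copies only own enumerable properties. Per WebIDL,
// Response and Headers expose their state via prototype getters, so
// spreading them produces empty objects.
const original = new Response("hi", {
  status: 201,
  headers: { "X-Demo": "1" },
});

const spreadResponse = { ...original };        // {}
const spreadHeaders = { ...original.headers }; // {}

console.log(Object.keys(spreadResponse).length); // 0
console.log(Object.keys(spreadHeaders).length);  // 0

// Copying the fields explicitly, as the longer middleware does, works:
const copy = new Response(original.body, {
  status: original.status,
  statusText: original.statusText,
  headers: new Headers(original.headers),
});
console.log(copy.status);                // 201
console.log(copy.headers.get("X-Demo")); // "1"
```

This is why copying `status`, `statusText`, and a fresh `Headers` into the new `Response` is necessary rather than relying on spread.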
To make things even easier for people in the future, here's the full implementation, updated to also support brotli compression.

```ts
// middleware.ts
import { defineMiddleware, sequence } from "astro:middleware";
// polyfill CompressionStream -
// remove this once Bun supports CompressionStream: https://github.com/oven-sh/bun/issues/1723
import "@lib/compressionStream.js";

const compression = defineMiddleware(
  async ({ request, cookies, locals }, next) => {
    const response = await next();
    if (!response.body) {
      return response;
    }
    const contentType = response.headers.get("Content-Type");
    const isCompressible =
      contentType &&
      (contentType.includes("text/") ||
        contentType.includes("application/json") ||
        contentType.includes("application/javascript") ||
        contentType.includes("application/xml"));
    if (!isCompressible) {
      return response;
    }
    const acceptEncoding = request.headers.get("Accept-Encoding") || "";
    let compressionType = "";
    const acceptedEncodings = acceptEncoding
      .split(",")
      .map((encoding) => encoding.trim().toLowerCase());
    if (acceptedEncodings.includes("br")) {
      compressionType = "br";
    } else if (acceptedEncodings.includes("gzip")) {
      compressionType = "gzip";
    } else if (acceptedEncodings.includes("deflate")) {
      compressionType = "deflate";
    }
    if (!compressionType) {
      return response;
    }
    // Note: "br" is not one of the standard CompressionStream formats
    // ("gzip", "deflate", "deflate-raw"); it only works here because the
    // zlib-backed polyfill below accepts it.
    const compressedBody = response.body.pipeThrough(
      new CompressionStream(compressionType),
    );
    const newHeaders = new Headers(response.headers);
    newHeaders.set("Content-Encoding", compressionType);
    return new Response(compressedBody, {
      status: response.status,
      statusText: response.statusText,
      headers: newHeaders,
    });
  },
);

export const onRequest = sequence(compression);
```

```js
// compressionStream.js
// @bun
// This module is only required for Bun, because it doesn't currently have
// CompressionStream; it just polyfills it in.
// Use like this: import "./compressionStream.js";
// It should be possible to remove this soon and depend upon the built-in
// CompressionStream once it has landed in Bun.
// Modified from: https://github.com/oven-sh/bun/issues/1723#issuecomment-1774174194
import zlib from "node:zlib";

const make = (ctx, handle) =>
  Object.assign(ctx, {
    writable: new WritableStream({
      write: (chunk) => handle.write(chunk),
      close: () => handle.end(),
    }),
    readable: new ReadableStream({
      type: "bytes",
      start(ctrl) {
        handle.on("data", (chunk) => ctrl.enqueue(chunk));
        handle.once("end", () => ctrl.close());
      },
    }),
  });

globalThis.CompressionStream ??= class CompressionStream {
  constructor(format) {
    make(
      this,
      format === "deflate"
        ? zlib.createDeflate()
        : format === "gzip"
          ? zlib.createGzip()
          : format === "br"
            ? zlib.createBrotliCompress()
            : zlib.createDeflateRaw(),
    );
  }
};

globalThis.DecompressionStream ??= class DecompressionStream {
  constructor(format) {
    make(
      this,
      format === "deflate"
        ? zlib.createInflate()
        : format === "gzip"
          ? zlib.createGunzip()
          : format === "br"
            ? zlib.createBrotliDecompress()
            : zlib.createInflateRaw(),
    );
  }
};
```
|
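A quick way to sanity-check the polyfill (or the native implementation on Node 18+) is a gzip round trip through CompressionStream and DecompressionStream; the sample text here is arbitrary:

```javascript
// Round-trip a repetitive string through gzip CompressionStream /
// DecompressionStream. Runs natively on Node 18+, or on Bun via the
// polyfill above. Run as an ES module (top-level await).
const text = "hello world ".repeat(200);

const compressed = await new Response(
  new Response(text).body.pipeThrough(new CompressionStream("gzip")),
).arrayBuffer();

const decompressed = await new Response(
  new Response(compressed).body.pipeThrough(new DecompressionStream("gzip")),
).text();

console.log(decompressed === text);               // true
console.log(compressed.byteLength < text.length); // true: repetitive text shrinks a lot
```

Wrapping the streams in `Response` is just a convenient way to collect a web stream into an `ArrayBuffer` or string without extra helpers.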
Astro Info
If this issue only occurs in one browser, which browser is a problem?
Chrome, Firefox
Describe the Bug
If I use Astro + Svelte and create a large test component with 1000 div elements, the document gets large: about 50 kB over the network. If I use the exact same code inside SvelteKit (no Astro), the document size is 20 times smaller.
I assume compression is missing in SSR mode (astrojs/node), which makes it unusable, since a typical large home page will weigh half a megabyte.
Astro+Svelte: (screenshot)
Astro+Solid: (screenshot)
SvelteKit, the same app: (screenshot)
Astro 3 and Astro 4 behave the same way, and it's also independent of which framework (Solid, Preact, or Svelte) we're using.
I assume, but did not test, that only SSR is affected; I did not test whether SSG is affected.
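The reported 20x gap is plausible for this kind of markup, since highly repetitive HTML compresses extremely well. A rough illustration with node:zlib (the repeated div string is a stand-in, not the actual test component):

```javascript
import { gzipSync } from "node:zlib";

// 1000 identical div elements, similar in spirit to the test component.
const html = "<div>hello</div>".repeat(1000);
const compressed = gzipSync(html);

console.log(html.length); // 16000 bytes uncompressed
console.log(compressed.length + " bytes gzipped (far smaller)");
console.log((html.length / compressed.length).toFixed(1) + "x reduction");
```

Real components with varying content compress less dramatically than a pure repeat, but SSR output is still typically several times smaller after gzip.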
What's the expected result?
HTML Document should be compressed.
Link to Minimal Reproducible Example
No