This repository has been archived by the owner on Feb 1, 2024. It is now read-only.
In the prototype, the ink! dependencies contained in the Change.json file are fetched from the Netlify server after the initial page load by calling fetch(). This can take up to 2 minutes. Are there ways to improve the loading speed, for example:
Load them from another server?
Split the file into multiple smaller files?
Stream the .json file?
Subscribe directly to a stream of JSON data instead of transferring the file(s)?
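One of the options above, splitting Change.json into multiple smaller files, could be sketched roughly as follows. This is a minimal sketch, not the prototype's actual code: the chunk count, the `/change-chunks/` path, and the assumption that Change.json is a JSON array that can be sliced per chunk are all hypothetical.

```javascript
// Sketch: fetch Change.json as several smaller, parallel requests and
// reassemble the result. CHUNK_COUNT and the /change-chunks/ layout are
// assumptions about how the file might be split at build time.
const CHUNK_COUNT = 8;

async function loadChangeJson(fetchImpl = fetch) {
  const urls = Array.from(
    { length: CHUNK_COUNT },
    (_, i) => `/change-chunks/change.${i}.json`
  );
  // Parallel requests let the browser overlap network latency and
  // decompression instead of waiting on one 50 MB response.
  const parts = await Promise.all(
    urls.map((url) => fetchImpl(url).then((res) => res.json()))
  );
  // Assumes each chunk is a slice of one top-level array; concatenate them.
  return parts.flat();
}
```

Smaller files also tend to interact better with HTTP caching and retries: a failed or evicted chunk can be re-fetched individually instead of restarting the whole transfer.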
The Change.json file is currently fetched in full; it is approximately 50 MB (~5 MB gzipped transfer volume). It is probably not cached for long due to its size. Are there ways to cache this data more persistently?
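One way to cache the file more persistently would be the Cache Storage API, keyed by a small version stamp so the 50 MB download only happens when the data actually changes. This is a sketch under assumptions: the `/change-version.txt` endpoint and the cache/key names are hypothetical, and the prototype would need to publish such a version stamp next to Change.json.

```javascript
// Decide whether the cached copy is usable; pure logic, easy to test.
function isStale(cachedVersion, liveVersion) {
  // Re-download when nothing is cached or the server advertises a new build.
  return cachedVersion === null || cachedVersion !== liveVersion;
}

// Sketch: store Change.json in Cache Storage and re-fetch only on a version
// bump. Large entries persist until the browser evicts them under storage
// pressure, which is usually much longer than the HTTP cache keeps them.
async function cachedChangeJson() {
  const cache = await caches.open("ink-deps-v1");
  const liveVersion = await (await fetch("/change-version.txt")).text();
  const cachedVersion = localStorage.getItem("change-json-version");
  if (isStale(cachedVersion, liveVersion)) {
    // cache.add() fetches the URL and stores the response.
    await cache.add("/Change.json");
    localStorage.setItem("change-json-version", liveVersion);
  }
  const hit = await cache.match("/Change.json");
  return hit.json();
}
```

Requesting persistent storage via `navigator.storage.persist()` could additionally reduce the chance of eviction, though browsers may decline.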
Another approach that should be considered to improve the performance of loading and applying the Change.json file is to fetch the data through Comlink web workers and to pass it to another web worker thread which executes rust-analyzer, see e.g.:
https://issueexplorer.com/issue/GoogleChromeLabs/comlink/450
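The worker-to-worker hand-off could look roughly like the sketch below: one worker fetches and encodes Change.json, then transfers the bytes to the rust-analyzer worker as a transferable ArrayBuffer, so the 50 MB payload moves between threads without being copied. Comlink would wrap this same postMessage protocol in proxy calls. All function and variable names here are illustrative, not the prototype's actual API.

```javascript
// Encode the parsed data into an ArrayBuffer suitable for zero-copy transfer.
function encodeForTransfer(changeJson) {
  const bytes = new TextEncoder().encode(JSON.stringify(changeJson));
  return bytes.buffer;
}

// Reverse the encoding on the receiving worker.
function decodeAfterTransfer(buffer) {
  return JSON.parse(new TextDecoder().decode(new Uint8Array(buffer)));
}

// In the fetching worker (illustrative):
//   const data = await (await fetch("/Change.json")).json();
//   const buf = encodeForTransfer(data);
//   analyzerWorker.postMessage(buf, [buf]); // second arg = transfer list
//
// In the rust-analyzer worker (illustrative):
//   onmessage = (e) => applyChange(decodeAfterTransfer(e.data));
```

Transferring (rather than structured-cloning) the buffer keeps both the fetch/parse cost and the hand-off off the main thread, which is the main win of the Comlink approach linked above.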