[icons] Improve icon search performance #44450
base: master
Conversation
Force-pushed from 2c3147d to 38602c3
Netlify deploy preview: https://deploy-preview-44450--material-ui.netlify.app/ · Bundle size report
Force-pushed from 38602c3 to e52845e
Force-pushed from e52845e to 0eb969b
Force-pushed from 0eb969b to eefd1d6
Force-pushed from ec41a21 to 8c96355
Force-pushed from 8c96355 to 4724f34
I'd also be interested in whether creating and serializing the index on the server and just deserializing it on the client would be faster.
@Janpot Oh, I didn't think about this, smart. I guess the index could be created with a macro, during the bundle phase, like we do for parsing the markdown. Maybe https://github.com/devongovett/unplugin-parcel-macros would do it. For example, I have seen @PookMook use it in https://github.com/PookMook/boomer-css/blob/fc051df3a67268b685e20765c3eb7a6a7c01686b/src/components/hero-section.tsx#L2.

Edit: hum, looking at the issues with unplugin-parcel-macros, it might not fly. Maybe https://github.com/kentcdodds/babel-plugin-macros is still our best option 🤔.
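For reference, a minimal sketch of the "build the index ahead of time, revive it at runtime" idea with MiniSearch. The field names, file paths, and the shape of `synonyms.js` are assumptions for illustration, not what this PR does:

```js
// build-icon-index.mjs — hypothetical build-time script for the docs.
import fs from 'node:fs';
import MiniSearch from 'minisearch';
import synonyms from './synonyms.js'; // assumed shape: { IconName: 'space separated synonyms' }

const allIcons = Object.keys(synonyms).map((name) => ({
  id: name,
  name,
  synonyms: synonyms[name],
}));

const miniSearch = new MiniSearch({ fields: ['name', 'synonyms'] });
miniSearch.addAll(allIcons);

// MiniSearch instances are JSON-serializable, so the index can be written to disk…
fs.writeFileSync('icon-search-index.json', JSON.stringify(miniSearch));

// …and later revived on the client without paying the indexing cost again:
// const index = MiniSearch.loadJSON(json, { fields: ['name', 'synonyms'] });
```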
What are the issues with parcel macros that concern you?
@PookMook Two concerns:
1. Turbopack support (though to be fair, Turbopack is starting to feel a bit like a failed project compared to Rspack or Rolldown, but as long as we are within the Next.js ecosystem, it's important).
2. Low npm download count; Devon is already spread thin with Lightning CSS, Parcel, and React Aria.
Those are two very valid concerns.
Probably not an acceptable state for a project like MUI, indeed. I'm exploring those two options in the very distant background, as time has been short recently and I personally have very little incentive to do so (we're using Vite on our prod projects + bundling our design system on its own and using it like CSS Modules).
I was more thinking about using it in general. I'm not in favor of using a macro system, as it tends to make code non-standard, hard to reason about, and locks you into a single compiler implementation. It seems that parcel macros just execute at build time and don't alter JavaScript semantics? Perhaps it could be acceptable? It would still blow up bundle size though.

Edit: 🤔 perhaps that can be circumvented by loading it with a dynamic import?

A third option I'm pondering is to pregenerate a static JSON file at build time, which we statically host and fetch on page load. It would have the added benefit that JSON parsing happens on a background thread automatically. The drawback is that we have to add some locking behavior to make sure it's loaded before we start filtering, and we have to solve SSR.
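A rough sketch of that third option: fetch a pregenerated index and use a memoized promise as the "lock" so filtering only starts once the index is loaded. The URL and field options are made up for illustration:

```js
import MiniSearch from 'minisearch';

// Memoize the in-flight request so concurrent callers wait on the same load
// instead of fetching and parsing the index twice.
let indexPromise = null;

export function loadSearchIndex() {
  if (indexPromise === null) {
    indexPromise = fetch('/static/icon-search-index.json') // hypothetical statically hosted file
      .then((response) => response.text())
      .then((json) => MiniSearch.loadJSON(json, { fields: ['name', 'synonyms'] }));
  }
  return indexPromise;
}

// Usage: await the index before filtering; during SSR, skip the fetch entirely.
// const index = await loadSearchIndex();
// const results = index.search('camera');
```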
@Janpot Unclear: https://news.ycombinator.com/item?id=21013215, https://www.reddit.com/r/javascript/comments/r8pwoz/comment/hn86ge6/.
Option 4 is to be lazy and do nothing 😄. I would be curious to see if this helps, but I don't know if it's still needed after this PR.
Did we ever make a version of this without a full-text search engine? Intuitively, I'd expect filtering a 6k list of relatively short strings not to be a super heavy operation. The best part is no part, they say 🙂.
@Janpot Oh, I didn't know that: https://v8.dev/blog/cost-of-javascript-2019#json. The file is pretty big: https://github.com/mui/material-ui/blob/master/docs/data/material/components/material-icons/synonyms.js. Maybe it could save time.
I can't remember, I might have added this in the very first version as I was using Font Awesome as a reference. I struggle to envision the search being great without synonyms. Right now, the search takes a few ms and the indexation never blocks the main thread, so maybe it's OK. I'm unclear on what the main bottleneck is now and whether it's fast enough or not.
```js
const tasks = { current: allIcons.length };

function work() {
  miniSearch.addAll([allIcons[tasks.current - 1]]);
  tasks.current -= 1;
}
```

Suggested change:

```diff
-  miniSearch.addAll([allIcons[tasks.current - 1]]);
+  miniSearch.add(allIcons[tasks.current - 1]);
```
Comment on the `asyncWorker({` call:
It's kind of a weird method signature. Wouldn't it make more sense as `async function asyncWorker<T>(tasks: T[], doWork: (task: T) => void): Promise<void>`?
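A plain-JavaScript sketch of that shape (not what the PR implements): process the tasks array in small time-boxed chunks and yield back to the event loop between chunks so indexing doesn't block input.

```js
// Hypothetical reshaping of asyncWorker into the signature suggested above.
function asyncWorker(tasks, doWork) {
  return new Promise((resolve) => {
    let index = 0;
    function processChunk() {
      const start = performance.now();
      // Work for at most ~8ms per chunk before yielding.
      while (index < tasks.length && performance.now() - start < 8) {
        doWork(tasks[index]);
        index += 1;
      }
      if (index < tasks.length) {
        setTimeout(processChunk, 0); // yield so pending input events can run
      } else {
        resolve();
      }
    }
    processChunk();
  });
}

// Usage (hypothetical): await asyncWorker(allIcons, (icon) => miniSearch.add(icon));
```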
```js
// Copied from mui-x/packages/x-data-grid-generator/src/services/asyncWorker.ts
// https://lucaong.github.io/minisearch/classes/MiniSearch.MiniSearch.html#addAllAsync is crap.
function asyncWorker({ work, tasks, done }) {
```
Could this be simplified with `scheduler.yield`?

Edit: interesting exploration on the topic.
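Something along these lines, assuming `scheduler.yield()` is available (Chromium-only at the time of writing), with a `setTimeout` fallback; the chunk size and function names are illustrative:

```js
// Hypothetical variant of the indexing loop built on scheduler.yield.
async function indexIcons(miniSearch, allIcons) {
  for (let i = 0; i < allIcons.length; i += 1) {
    miniSearch.add(allIcons[i]);
    if (i % 100 === 0) {
      // Yield to the browser every 100 icons so pending input can be handled.
      if (globalThis.scheduler?.yield) {
        await scheduler.yield();
      } else {
        await new Promise((resolve) => setTimeout(resolve, 0));
      }
    }
  }
}
```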
> I struggle to envision the search being great without synonyms.

I don't think it would be that hard to make this filter work with synonyms, but we can try that out later. Left a few suggestions, but nothing blocking.
@oliviertassinari Just did a quick test with the following snippet and it seems to consistently run in the 1ms range or lower on my machine:

```js
const filteredIcons = React.useMemo(() => {
  console.time('filteredIcons');
  const result = allIcons.filter((icon) => {
    if (theme !== icon.theme) {
      return false;
    }
    if (query.length === 0) {
      return true;
    }
    // TODO: normalize strings
    if (icon.name.includes(query) || synonyms[icon.name]?.includes(query)) {
      return true;
    }
    return false;
  });
  console.timeEnd('filteredIcons');
  return result;
}, [query, theme]);

// The results slightly differ
console.log(filteredIcons.length, icons.length);
```

I tried to time the indexed search and interestingly get results in the same ballpark for 1 or 2 character queries. For longer strings the indexed search is indeed an order of magnitude faster.

```js
const icons = React.useMemo(() => {
  console.time('indexedIcons');
  const keys = query === '' ? null : searchIndex.search(query, { limit: 3000 });
  const result = (
    keys === null ? allIcons : keys.map((key) => allIconsMap[key])
  ).filter((icon) => theme === icon.theme);
  console.timeEnd('indexedIcons');
  return result;
}, [query, theme]);
```

I assume the index defaults to a simple filter for short queries, which would be what I'd expect with my limited knowledge of how these FT indices work 🙂.
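For what it's worth, the `// TODO: normalize strings` above could be something along these lines (a sketch, not part of this PR): lower-case the strings and strip diacritics before comparing.

```js
// Lower-case and strip diacritics so that, e.g., "Café" matches "cafe".
function normalize(str) {
  return str
    .toLowerCase()
    .normalize('NFD')
    .replace(/[\u0300-\u036f]/g, '');
}

// Inside the filter above, the name/synonym check would become something like:
// const q = normalize(query);
// return normalize(icon.name).includes(q) || normalize(synonyms[icon.name] ?? '').includes(q);
```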
Problem
This is a continuation of #44001. I'm not sure that the change had any positive impact; I suspect that PR only surfaced a Chrome DevTools bug. I'm saying this because of what we see now: https://pagespeed.web.dev/analysis/https-mui-com-material-ui-material-icons/cvfn967l26?form_factor=desktop. The INP is "poor":
It's a problem because this page is one of the pages with the most traffic in our docs. We have to provide a nice UX.
Looking at the page load timeline, I see we spend over 500ms creating the search index, and the page is only hydrated after 1.40s:
Solution
In our case, minisearch takes about 250ms to index, flexsearch about 500ms, and fuse.js about 50ms. The problem is that fuse.js gains its indexing speed at the cost of slow searches, so minisearch seems like the best trade-off.
Re-renders of the <Icons> component are also avoided by using useEventCallback. Actually, many more parts of the docs are impacted by this problem; I have left a comment in the source pointing to the root solution: useRouterQueryParam() instead of useRouter() for performance, vercel/next.js#45969. We could already start to fix this performance issue by having wrappers matching the new App Router API. The issue can be observed using https://github.com/aidenybai/react-scan/blob/main/CHROME_EXTENSION_GUIDE.md.

Preview: https://deploy-preview-44450--material-ui.netlify.app/material-ui/material-icons/
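As an illustration of the useEventCallback part, a sketch of the pattern (component and prop names are hypothetical, not the exact code of this PR, and the `@mui/utils/useEventCallback` import is an assumption): the wrapper keeps the handler's identity stable across renders, so the memoized icon grid doesn't re-render on every keystroke in the search field.

```jsx
import * as React from 'react';
import useEventCallback from '@mui/utils/useEventCallback';

// The heavy part: renders the full icon grid, so it should skip re-renders
// whenever its props are referentially equal.
const Icons = React.memo(function Icons({ icons, onSelect }) {
  return icons.map((icon) => (
    <button key={icon.name} onClick={() => onSelect(icon)}>
      {icon.name}
    </button>
  ));
});

function SearchIcons({ filteredIcons }) {
  const [selected, setSelected] = React.useState(null);
  // useEventCallback returns a stable function reference, so typing in the
  // search field (which re-renders the parent) doesn't invalidate <Icons>.
  const handleSelect = useEventCallback((icon) => setSelected(icon.name));
  return (
    <React.Fragment>
      <p>Selected: {selected}</p>
      <Icons icons={filteredIcons} onSelect={handleSelect} />
    </React.Fragment>
  );
}
```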
Our search is so much faster than on https://fontawesome.com/search 🥹. So, will this solve the 533ms INP field data? I hope so. Two hypotheses as to why we have this data now: