"Determining test suites to run" step is slow #3445

Closed
jonathanglasmeyer opened this issue May 3, 2017 · 11 comments
Closed

"Determining test suites to run" step is slow #3445

jonathanglasmeyer opened this issue May 3, 2017 · 11 comments

Comments

@jonathanglasmeyer
Copy link

jonathanglasmeyer commented May 3, 2017

Do you want to request a feature or report a bug?
Bug

What is the current behavior?
Running in watch mode with patternInfo: { input: '', lastCommit: false, onlyChanged: true, watch: true }, the "determining test suites to run..." step takes >4 seconds for our private monorepo. Digging into the code showed that the slow part is in jest-resolve-dependencies/build/index; these lines:

    const modules = this._hasteFS.getAllFiles().map(file => ({
      dependencies: this.resolve(file, options),
      file,
    }));

make up > 99% of the overall time.

this._hasteFS.getAllFiles().length is 2742. Are 4 seconds expected for a hasteFS of this size? (That works out to roughly 1.5 ms per file.)

What is the expected behavior?
As this step is repeated on every file change, using --watch is not feasible for us; we have to fall back to --watchAll. Expected would be -- faster? :)

Please provide your exact Jest configuration and mention your Jest, node, yarn/npm version and operating system.
./node_modules/.bin/jest -v --> v19.0.2
node -v --> v7.6.0
yarn version: 1.0.0

.jestrc
{
  "moduleFileExtensions": [
    "js",
    "json",
    "svg",
    "client.js",
    "less.module"
  ],
  "moduleNameMapper": {
    "\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga|sprite|less)$": "/test/fileMock.js"
  },
  "modulePaths": [
    "/node_modules",
    "/packages",
    ""
  ],
  "modulePathIgnorePatterns": [
    "/build/",
    "/__packages/",
    "/service/node_modules/"
  ],
  "testPathIgnorePatterns": [
    "/node_modules/",
    "/__packages/",
    "/admin/",
    "/api/",
    "/config/",
    "/deployment/",
    "/service/",
    "/partner/",
    "/flow/",
    "/website/",
    "/wish-website/"
  ],
  "testRegex": "\\.test\\.js$",
  "transform": {
    "\\.less\\.module$": "/test/transformCssModules.js",
    "^.+\\.js$": "/node_modules/babel-jest",
    "styles$": "/test/transformCssModules.js"
  }
}
@cpojer (Member) commented May 3, 2017

It really depends on your setup, but this.resolve actually does node resolution so that it can track dependencies within the system. For watch mode, it would be awesome if we could make this step incremental rather than process everything on every change. At Facebook, we are looking at similar times on a monorepo that is orders of magnitude bigger, so I'm a bit unsure what's causing the slow time for you.

Can you try changing this to true in Jest and see if that affects things: https://github.com/facebook/jest/blob/master/packages/jest-resolve/src/index.js#L138?

@jonathanglasmeyer (Author)

Setting const skipResolution = true introduces the following error when running yarn jest -- --watch:

 FAIL  packages/fluxible-graphql-helpers/createStore.test.js
  ● Test suite failed to run

    Cannot find module 'jest-matchers' from 'jest-expect.js'
      
      at Resolver.resolveModule (node_modules/jest-resolve/build/index.js:169:17)

 FAIL  packages/fluxible-graphql-helpers/customMerge.test.js
  ● Test suite failed to run

    Cannot find module 'jest-matchers' from 'jest-expect.js'
      
      at Resolver.resolveModule (node_modules/jest-resolve/build/index.js:169:17)

@cpojer (Member) commented May 3, 2017

We could take a look at this if you create a mock-repo on github where we could run Jest and test performance for ourselves.

Otherwise, we do have a plan to build a faster resolver, see #2925 for details, but there is no timeline attached to it.

@jonathanglasmeyer (Author)

Not sure how I should replicate our private monorepo setup publicly. 🤔
K, thanks for the pointer! 👍🏻

@jonathanglasmeyer (Author)

I investigated further to explain where the time goes: I measured how long resolving each file takes.
I printed the milliseconds spent in DependencyResolver.resolve for each file that took more than 1ms, which covers 842 of the ~2700 files overall.
The following 800 files account for 4.4 seconds of the overall 4.8 seconds.
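A sketch of this kind of instrumentation (illustrative only, not the exact code used; it assumes the DependencyResolver class is the export of jest-resolve-dependencies/build/index, and the export shape may differ between Jest versions):

    // Illustrative sketch: monkey-patch DependencyResolver.resolve to log
    // per-file timings for anything slower than 1ms.
    const DependencyResolver = require('jest-resolve-dependencies/build/index');

    const originalResolve = DependencyResolver.prototype.resolve;
    DependencyResolver.prototype.resolve = function(file, options) {
      const start = Date.now();
      const dependencies = originalResolve.call(this, file, options);
      const elapsed = Date.now() - start;
      if (elapsed > 1) {
        // prints e.g. "5ms 7 deps"
        console.log(`${elapsed}ms ${dependencies.length} deps`);
      }
      return dependencies;
    };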

Sorted timings for 800 files - relationship of time taken to # of deps
2ms 0 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 1 deps 
2ms 10 deps 
2ms 10 deps 
2ms 10 deps 
2ms 10 deps 
2ms 11 deps 
2ms 11 deps 
2ms 14 deps 
2ms 15 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 2 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 3 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 4 deps 
2ms 41 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 5 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 6 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 7 deps 
2ms 8 deps 
2ms 8 deps 
2ms 8 deps 
2ms 8 deps 
2ms 9 deps 
2ms 9 deps 
2ms 9 deps 
2ms 9 deps 
3ms 1 deps 
3ms 1 deps 
3ms 1 deps 
3ms 1 deps 
3ms 1 deps 
3ms 11 deps 
3ms 11 deps 
3ms 11 deps 
3ms 15 deps 
3ms 17 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 2 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 3 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 4 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 5 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 6 deps 
3ms 7 deps 
3ms 7 deps 
3ms 7 deps 
3ms 7 deps 
3ms 7 deps 
3ms 7 deps 
3ms 7 deps 
3ms 7 deps 
3ms 7 deps 
3ms 8 deps 
3ms 8 deps 
3ms 8 deps 
3ms 8 deps 
3ms 8 deps 
3ms 8 deps 
3ms 9 deps 
3ms 9 deps 
3ms 9 deps 
4ms 10 deps 
4ms 12 deps 
4ms 12 deps 
4ms 12 deps 
4ms 13 deps 
4ms 13 deps 
4ms 18 deps 
4ms 2 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 3 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 4 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 5 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 6 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 7 deps 
4ms 8 deps 
4ms 8 deps 
4ms 8 deps 
4ms 8 deps 
4ms 8 deps 
4ms 8 deps 
4ms 8 deps 
4ms 9 deps 
4ms 9 deps 
4ms 9 deps 
5ms 10 deps 
5ms 10 deps 
5ms 11 deps 
5ms 11 deps 
5ms 12 deps 
5ms 12 deps 
5ms 13 deps 
5ms 13 deps 
5ms 15 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 3 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 4 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 5 deps 
5ms 6 deps 
5ms 6 deps 
5ms 6 deps 
5ms 6 deps 
5ms 6 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 7 deps 
5ms 8 deps 
5ms 8 deps 
5ms 8 deps 
5ms 8 deps 
5ms 8 deps 
5ms 8 deps 
5ms 8 deps 
5ms 8 deps 
5ms 8 deps 
5ms 9 deps 
5ms 9 deps 
5ms 9 deps 
5ms 9 deps 
5ms 9 deps 
5ms 9 deps 
5ms 9 deps 
6ms 10 deps 
6ms 10 deps 
6ms 10 deps 
6ms 10 deps 
6ms 11 deps 
6ms 11 deps 
6ms 12 deps 
6ms 13 deps 
6ms 13 deps 
6ms 14 deps 
6ms 15 deps 
6ms 15 deps 
6ms 3 deps 
6ms 3 deps 
6ms 4 deps 
6ms 4 deps 
6ms 4 deps 
6ms 4 deps 
6ms 4 deps 
6ms 4 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 5 deps 
6ms 6 deps 
6ms 6 deps 
6ms 6 deps 
6ms 6 deps 
6ms 6 deps 
6ms 6 deps 
6ms 6 deps 
6ms 6 deps 
6ms 6 deps 
6ms 7 deps 
6ms 7 deps 
6ms 7 deps 
6ms 7 deps 
6ms 7 deps 
6ms 7 deps 
6ms 7 deps 
6ms 8 deps 
6ms 8 deps 
6ms 8 deps 
6ms 8 deps 
6ms 8 deps 
6ms 8 deps 
6ms 8 deps 
6ms 8 deps 
6ms 9 deps 
6ms 9 deps 
6ms 9 deps 
7ms 10 deps 
7ms 10 deps 
7ms 10 deps 
7ms 11 deps 
7ms 12 deps 
7ms 13 deps 
7ms 15 deps 
7ms 17 deps 
7ms 19 deps 
7ms 24 deps 
7ms 3 deps 
7ms 4 deps 
7ms 5 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 6 deps 
7ms 7 deps 
7ms 7 deps 
7ms 7 deps 
7ms 7 deps 
7ms 7 deps 
7ms 7 deps 
7ms 7 deps 
7ms 8 deps 
7ms 8 deps 
7ms 9 deps 
7ms 9 deps 
7ms 9 deps 
7ms 9 deps 
7ms 9 deps 
8ms 10 deps 
8ms 10 deps 
8ms 10 deps 
8ms 10 deps 
8ms 10 deps 
8ms 11 deps 
8ms 12 deps 
8ms 12 deps 
8ms 13 deps 
8ms 13 deps 
8ms 13 deps 
8ms 14 deps 
8ms 34 deps 
8ms 4 deps 
8ms 5 deps 
8ms 5 deps 
8ms 6 deps 
8ms 6 deps 
8ms 6 deps 
8ms 7 deps 
8ms 8 deps 
8ms 8 deps 
8ms 9 deps 
8ms 9 deps 
9ms 10 deps 
9ms 10 deps 
9ms 14 deps 
9ms 14 deps 
9ms 20 deps 
9ms 6 deps 
9ms 6 deps 
9ms 6 deps 
9ms 8 deps 
9ms 8 deps 
9ms 9 deps 
9ms 9 deps 
10ms 10 deps
10ms 10 deps
10ms 10 deps
10ms 11 deps
10ms 11 deps
10ms 12 deps
10ms 12 deps
10ms 13 deps
10ms 13 deps
10ms 15 deps
10ms 15 deps
10ms 25 deps
10ms 6 deps 
10ms 6 deps 
10ms 7 deps 
10ms 8 deps 
10ms 8 deps 
10ms 9 deps 
10ms 9 deps 
10ms 9 deps 
10ms 9 deps 
11ms 10 deps
11ms 11 deps
11ms 12 deps
11ms 12 deps
11ms 13 deps
11ms 17 deps
11ms 21 deps
11ms 6 deps 
11ms 6 deps 
11ms 7 deps 
11ms 7 deps 
11ms 8 deps 
11ms 8 deps 
11ms 8 deps 
11ms 9 deps 
12ms 10 deps
12ms 10 deps
12ms 11 deps
12ms 14 deps
12ms 15 deps
12ms 17 deps
12ms 6 deps 
12ms 6 deps 
12ms 7 deps 
12ms 7 deps 
12ms 8 deps 
12ms 8 deps 
12ms 8 deps 
12ms 9 deps 
13ms 10 deps
13ms 10 deps
13ms 12 deps
13ms 12 deps
13ms 14 deps
13ms 5 deps 
13ms 9 deps 
14ms 10 deps
14ms 10 deps
14ms 12 deps
14ms 14 deps
14ms 15 deps
14ms 15 deps
14ms 17 deps
14ms 39 deps
14ms 8 deps 
15ms 10 deps
15ms 12 deps
15ms 12 deps
15ms 16 deps
15ms 16 deps
15ms 25 deps
15ms 9 deps 
16ms 10 deps
16ms 11 deps
16ms 23 deps
16ms 26 deps
16ms 7 deps 
17ms 13 deps
17ms 15 deps
17ms 15 deps
17ms 18 deps
18ms 11 deps
18ms 16 deps
18ms 16 deps
18ms 16 deps
18ms 25 deps
19ms 18 deps
19ms 24 deps
20ms 18 deps
21ms 18 deps
22ms 16 deps
22ms 25 deps
24ms 17 deps
24ms 28 deps
25ms 15 deps
26ms 20 deps
28ms 17 deps
29ms 14 deps
30ms 26 deps
31ms 26 deps
33ms 28 deps
36ms 31 deps
38ms 36 deps
41ms 35 deps
50ms 32 deps
56ms 48 deps

So the cost seems to be explained by the n^2 complexity introduced by this block:

    const result = compact(
      dependencies.map(dependency => {
        if (this._resolver.isCoreModule(dependency)) {
          return null;
        }
        try {
          return this._resolver.resolveModule(file, dependency, options);
        } catch (e) {}
        return this._resolver.getMockModule(file, dependency) || null;
      })
    );

I'm wondering how you achieve faster speeds for the facebook monorepo though. 🤔

@cpojer (Member) commented May 3, 2017

Would you be able to share the dependencies (relative paths as they are in require) with us? You can of course obfuscate the names a bit.

I'm not sure where you are getting n^2 from, it should be O(n*m) where n is the number of files and m is the avg number of dependencies per file, roughly. We should be able to speed it up by using more aggressive caching; as it stands it is doing too many filesystem reads, which is because of how the node resolution algorithm works :(
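A sketch of what such caching could look like (hypothetical, not Jest's actual implementation): node-style resolution depends only on the requiring file's directory and the request string, so results can be memoized per (directory, dependency) pair:

    // Hypothetical sketch: memoize node-style resolution per
    // (directory, dependency) pair to avoid repeated filesystem walks.
    const path = require('path');

    const resolutionCache = new Map();

    function cachedResolveModule(resolver, file, dependency, options) {
      const key = path.dirname(file) + '\0' + dependency;
      let resolved = resolutionCache.get(key);
      if (resolved === undefined) {
        resolved = resolver.resolveModule(file, dependency, options);
        resolutionCache.set(key, resolved);
      }
      return resolved;
    }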

I'm assuming the reason this is so much faster at FB is that we use global module ids (which is the point of jest-haste-map). ~95% of all dependencies are O(1) lookups in Jest's ModuleMap.

@jonathanglasmeyer (Author)

Are you referring to the import statements of ~800 files? That'd be a lot. :D

You're right, O(n*m) is more accurate.

Good point with the global module ids. Yea, caching should be able to improve this. :)

Thanks for helping investigate!

@sebinsua

> I'm assuming the reason this is so much faster at FB is that we use global module ids (which is the point of jest-haste-map). ~95% of all dependencies are O(1) lookups in Jest's ModuleMap.

Is this the default behaviour of Jest, or do I need to configure something to get Jest to use jest-haste-map?

@cpojer (Member) commented Aug 1, 2017

@sebinsua that only works if you use @providesModule and global module ids. There is an explanation here: https://reactnatve.wordpress.com/2016/06/16/alias-in-react-native/#more-550
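For illustration, the convention looks roughly like this (module names here are made up): a file declares a global id in its docblock and can then be required by that id from anywhere, without relative paths:

    // mathUtils.js -- declares a global haste module id in its docblock:
    /**
     * @providesModule MathUtils
     */
    module.exports = {add: (a, b) => a + b};

    // elsewhere.js -- requires it by global id, no relative path needed;
    // jest-haste-map turns this into a single O(1) ModuleMap lookup:
    const MathUtils = require('MathUtils');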

@cpojer (Member) commented Aug 24, 2017

This individual task is not actionable, and we have separate issues to track further performance work. If you'd like to work on Jest's perf, reach out to me :)

@github-actions (bot)

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

github-actions bot locked as resolved and limited conversation to collaborators May 13, 2021