CreateTestContext doesn't get assigned different ports when run in parallel #1326

Open
luhagel opened this issue Aug 5, 2020 · 3 comments · May be fixed by #1418
Labels
type/bug Something is not working the way it should

Comments

@luhagel
Contributor

luhagel commented Aug 5, 2020

Screenshot

Description

get-port currently suffers from an issue preventing the assignment of a new port when the initial one is already taken.

This leads to tests erroring out because

  1. The new server tries to bind to 4000 and fails, or
  2. The requests all go to 4000, but that server has already been shut down after the only working test context finished.

Workaround

Right now, tests either have to be run with --runInBand, or nexus has to be patched to assign Math.floor(Math.random() * (6000 - 4000)) + 4000 instead, which is ironically more reliable and works at least most of the time.
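
To make the race concrete, here is a minimal sketch (illustrative only, not nexus source; the function name is made up) of how two parallel Jest workers can both end up with port 4000: get-port only checks that the preferred port is free at the moment it is asked, it cannot reserve it until a server actually binds.

  import getPort from 'get-port'

  // Both Jest workers run this concurrently. Because neither server has bound
  // the port yet, get-port sees 4000 as free for both of them, so the second
  // app.start() fails with EADDRINUSE (or every request goes to whichever
  // server is still listening).
  async function allocateTestPort(): Promise<number> {
    return getPort({ port: 4000 })
  }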

@luhagel added the type/bug (Something is not working the way it should) label Aug 5, 2020
@brainafesoh

brainafesoh commented Aug 19, 2020

I had a similar problem here #1085
The workaround I finally used consists of increasing the Jest timeout like this:
jest.setTimeout(100000)

See this Stack Overflow question for more info.
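
For reference, a sketch of where that call typically lives (this is just the workaround above, not code from this repo):

  // At the top of the test file (or in a Jest setup file). Jest's default
  // per-test timeout is 5000 ms, which a slow server start / port search can
  // easily exceed when suites run in parallel.
  jest.setTimeout(100000)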

@haysclark

haysclark commented Sep 15, 2020

I dug into this issue and added some monkey-patched traces to see what is going on under the hood with get-port. I am using my plugins-prisma-and-jwt-auth-and-shield-piecemeal example for those who want a test case.

Clearly others have seen the need for some caching mechanism for some time as well:

"// todo figure out some caching system here, e.g. imagine jest --watch mode" @Weakky.

Additionally, there are some loose ends, as illustrated by this comment by @jasonkuhrt, which references issue #758 (currently closed).

While my efforts may be fruitless, I'm currently experimenting with refactoring this code to use a Singleton or an Object Pool.
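
For illustration, a singleton port pool along those lines might look roughly like the sketch below (names are hypothetical, not nexus code). As the next comment explains, this can only track ports within a single process, so it cannot coordinate Jest's separate worker processes.

  import getPort from 'get-port'

  // Hypothetical in-process port pool: hands out each port at most once per process.
  class PortPool {
    private static instance: PortPool
    private used = new Set<number>()

    static get(): PortPool {
      if (!PortPool.instance) PortPool.instance = new PortPool()
      return PortPool.instance
    }

    async acquire(): Promise<number> {
      // Walk the range ourselves so we can skip ports this process already handed out.
      for (const candidate of getPort.makeRange(4000, 6000)) {
        if (this.used.has(candidate)) continue
        const port = await getPort({ port: candidate })
        if (port === candidate) {
          // get-port confirmed the candidate is free system-wide right now.
          this.used.add(port)
          return port
        }
      }
      throw new Error('No free port available in range 4000-6000')
    }

    release(port: number): void {
      this.used.delete(port)
    }
  }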

@haysclark

Update: Singletons and Object Pools are complete dead ends, because Jest runs each test suite in a separate process. There seem to be few options for sharing data between processes, and using Redis or other shared-memory libraries seems like a huge hack.

The core issue appears to be the long delay between the port number being assigned, the DevAppRunner being created in createDevAppRunner, and the port actually being used when app.start() is executed. IMHO, no change to the get-port library will prevent a port from being used twice. Possibly, a "retry" feature could be utilized.

Amazingly, @luhagel's randomly generated ports are extremely effective and can be paired with get-port to avoid ports that the system is already using.

  // Pick a random starting point in the 4000-5999 range, then let get-port
  // search upward from there for a port that is actually free.
  const randomStartPort = Math.floor(Math.random() * (6000 - 4000)) + 4000
  const randomPort = await getPort({ port: getPort.makeRange(randomStartPort, 6000) })
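
For completeness, the "retry" idea mentioned above could look roughly like the sketch below (purely illustrative; neither nexus nor get-port ships anything like this). If the port picked earlier gets taken in the window between assignment and bind, grab another one and try again.

  import getPort from 'get-port'
  import * as http from 'http'

  // Illustrative retry loop around binding a server to a freshly picked port.
  async function listenWithRetry(server: http.Server, attempts = 3): Promise<number> {
    for (let i = 0; i < attempts; i++) {
      const port = await getPort({ port: getPort.makeRange(4000, 6000) })
      try {
        await new Promise<void>((resolve, reject) => {
          server.once('error', reject)
          server.listen(port, () => {
            server.removeListener('error', reject)
            resolve()
          })
        })
        return port
      } catch (err: any) {
        if (err.code !== 'EADDRINUSE') throw err
        // The port was taken between the get-port check and the bind; retry.
      }
    }
    throw new Error('Could not bind to a free port after several retries')
  }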

Just for comparison, here are the performance improvements I see by doing these minor tweaks to Nexus/test.

Original code with --runInBand utilized.

npm run test -- --runInBand 

...

Test Suites: 10 passed, 10 total
Tests:       20 passed, 20 total
Snapshots:   8 passed, 8 total
Time:        109.546 s
Ran all test suites.

randomStartPort code running in parallel

npm run test

...

Test Suites: 10 passed, 10 total
Tests:       20 passed, 20 total
Snapshots:   8 passed, 8 total
Time:        41.149 s, estimated 53 s
Ran all test suites.

randomStartPort code with --runInBand utilized.

npm run test -- --runInBand 

...

Test Suites: 10 passed, 10 total
Tests:       20 passed, 20 total
Snapshots:   8 passed, 8 total
Time:        109.214 s
Ran all test suites.

randomStartPort code with --watch utilized, running single suite.

npm run test -- --watch

...

Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   1 passed, 1 total
Time:        14.814 s, estimated 19 s
Ran all test suites related to changed files.

haysclark added a commit to haysclark/nexus that referenced this issue Sep 16, 2020
- port workaround allows more performant parallel testing
- testing on CI environments should still utilize Jest’s `--runInBand` flag
- resolves prisma-labs#1326
haysclark added a commit to haysclark/nexus that referenced this issue Sep 16, 2020
- port workaround allows more performant parallel testing
- testing on CI environments should still utilize Jest’s `--runInBand` flag
- resolves prisma-labs#1326
@haysclark linked a pull request Sep 16, 2020 that will close this issue
haysclark added a commit to haysclark/nexus that referenced this issue Sep 16, 2020
- port workaround allows more performant parallel testing
- testing on CI environments should still utilize Jest’s `--runInBand` flag
haysclark added a commit to haysclark/nexus that referenced this issue Sep 16, 2020
- during testing, get-port often re-uses the same port, due to the delay between when a free-port is assigned to a test, and when the port is actually used. Simply supplying an initial random port value to get-port dramatically reduces the chance of a port being used multiple times in a test run.
- testing on CI environments should still utilize Jest’s `--runInBand` flag
haysclark added a commit to haysclark/nexus that referenced this issue Sep 22, 2020
- during testing, get-port often re-uses the same port, due to the delay between when a free-port is assigned to a test, and when the port is actually used. Simply supplying an initial random port value to get-port dramatically reduces the chance of a port being used multiple times in a test run.
- testing on CI environments should still utilize Jest’s `--runInBand` flag