
fix(fetch): properly release fetch during long-lived stream handling #13967

Open · wants to merge 45 commits into base: develop

Conversation


@Lei-k Lei-k commented Oct 12, 2024

Hi,
I saw that the trackFetchStreamPerformance option was introduced in PR #13951 to bypass this issue, and Issue #13950 was closed as a result. However, this hasn't fully resolved the problem, so I attempted to address it in this PR.

Here is my modified test result after applying the changes:

[Screenshot: test results, 2024-10-13 2:39 AM]

Lei-k added 3 commits October 13, 2024 02:05
…tream is cancelled

- Modified `resolveResponse` to properly handle the cancellation of child streams by overriding the `cancel` method on the parent response's body.
- Updated unit tests to cover scenarios for both successful stream resolution and error cases involving stream cancellation.
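
A rough TypeScript sketch of the change the commit message describes, to make the idea concrete: the SDK reads a clone of the response to learn when the stream finishes, and the `cancel` method on the body the application consumes is wrapped so that cancelling the stream also releases the SDK's reader. The helper name and the clone-based wiring are illustrative assumptions, not the actual PR code.

```ts
// Illustrative sketch only, not the PR's actual implementation.
function resolveResponseSketch(res: Response, onFinishedResolving: () => void): void {
  if (!res.body) {
    onFinishedResolving();
    return;
  }

  // Assumption: the SDK reads a clone so the application's own body stays untouched.
  const reader = res.clone().body!.getReader();

  // Wrap `cancel` on the body the application consumes so that a
  // consumer-initiated cancellation also ends the SDK's read loop instead of
  // leaving it pending forever on a long-lived stream.
  const originalCancel = res.body.cancel.bind(res.body);
  res.body.cancel = (reason?: unknown) => {
    // Cancelling our reader makes the pending read() below resolve with
    // done: true, which ends the loop and releases everything.
    void reader.cancel(reason).catch(() => undefined);
    return originalCancel(reason);
  };

  void (async () => {
    try {
      // Drain the cloned body until the stream ends or our reader is cancelled.
      while (true) {
        const { done } = await reader.read();
        if (done) {
          break;
        }
      }
    } catch {
      // The stream errored; treat it as finished.
    } finally {
      onFinishedResolving();
      reader.releaseLock();
    }
  })();
}
```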
@chargome
Member

@Lei-k we have not released #13951 yet. Can you specify why it does not resolve the issue?

@Lei-k
Author

Lei-k commented Oct 14, 2024

@chargome From what I understand, #13951 only adds the trackFetchStreamPerformance option and sets it to false by default to bypass the issue. However, when this option is enabled, the problem still persists.
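
For context, enabling the option from #13951 would look roughly like this (assuming it is passed to `browserTracingIntegration`, as the PR description suggests); with the flag turned on, the hang on long-lived streams reportedly still occurs:

```ts
import * as Sentry from '@sentry/browser';

Sentry.init({
  dsn: '__PUBLIC_DSN__', // placeholder
  integrations: [
    // Assumed placement of the option introduced in #13951; it defaults to
    // false, which only sidesteps the issue rather than fixing it.
    Sentry.browserTracingIntegration({ trackFetchStreamPerformance: true }),
  ],
  tracesSampleRate: 1.0,
});
```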

@Lei-k Lei-k changed the title from "fix(fetch): fix fetch not release" to "fix(fetch): properly release fetch during long-lived stream handling" on Oct 15, 2024
@Lei-k Lei-k requested a review from a team as a code owner October 15, 2024 14:36
@Lei-k
Author

Lei-k commented Oct 16, 2024

I saw some errors in the CI. I'll deal with them later.

@Lei-k
Author

Lei-k commented Oct 18, 2024

It seems that there are no further errors related to this PR. Could you please help confirm this, @chargome?

@Lei-k
Author

Lei-k commented Oct 18, 2024

[Screenshots: CI check results, 2024-10-18 17:33–17:35]

@chargome

`Response` was already in the code before, but now it's triggering a TS2304 error. Do you have any idea why this is happening?

It looks like only the TS2304 error and the file size check remain to be solved now.
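
Not familiar with this repo's setup, but TS2304 ("Cannot find name") on `Response` usually means the global DOM (or equivalent) type declarations are not in scope for the package being built. A hedged sketch of the common workaround, in case it helps: reference the DOM lib where the global type is used, or add "DOM" to the `lib` array in that package's tsconfig.

```ts
// Hypothetical workaround sketch; only relevant if the failing package does
// not already pull in DOM lib types where the global `Response` is used.
/// <reference lib="dom" />

export function isOkResponse(res: Response): boolean {
  return res.ok;
}
```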

@chargome
Member

@Lei-k not sure, but the same error is also crashing the node-otel-sdk-node and node-otel-custom-sampler tests.

@Lei-k
Author

Lei-k commented Oct 20, 2024

@chargome

All tests have passed on my side. Please check, thanks!

@Lei-k
Author

Lei-k commented Oct 21, 2024

Hi, @chargome

There’s one remaining bundle size check that failed due to the size limit. Could you assist me in adjusting the size, or do you have any suggestions for resolving this?

@chargome chargome requested a review from lforst October 21, 2024 11:53
Member

@chargome chargome left a comment


Hey @Lei-k, sorry for the silence on this one. Could you please have a look at my review and potentially also merge develop into your PR? We made some changes to our package structure lately, e.g. @sentry/types got deprecated.

if (res && res.body) {
  const body = res.body;
  const responseReader = body.getReader();
async function resloveReader(reader: WebReadableStreamDefaultReader, onFinishedResolving: () => void): Promise<void> {
Member

Suggested change
async function resloveReader(reader: WebReadableStreamDefaultReader, onFinishedResolving: () => void): Promise<void> {
async function resolveReader(reader: WebReadableStreamDefaultReader, onFinishedResolving: () => void): Promise<void> {

*
* Export For Test Only
*/
export function resolveResponse(
Member

I think we might be introducing a potential memory leak within this function. Previously we were cancelling the resolving after a max timeout plus a timeout for individual chunk resolving. Could you explain why you removed these?
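
For reference, the kind of guard being described here, a per-chunk timeout plus an overall cap so the resolver cannot hang on a stream that never ends, might look something like the sketch below. The constants and helper name are illustrative, not the SDK's actual values.

```ts
// Sketch of the timeout pattern referred to in the review; values are made up.
const MAX_TOTAL_MS = 90_000; // overall cap for resolving the whole body
const MAX_CHUNK_MS = 5_000; // cap for waiting on any single chunk

async function readWithTimeouts(
  reader: ReadableStreamDefaultReader<Uint8Array>,
  onFinishedResolving: () => void,
): Promise<void> {
  const deadline = Date.now() + MAX_TOTAL_MS;
  try {
    while (Date.now() < deadline) {
      const chunkTimeout = new Promise<'timeout'>(resolve =>
        setTimeout(() => resolve('timeout'), MAX_CHUNK_MS),
      );
      const result = await Promise.race([reader.read(), chunkTimeout]);
      if (result === 'timeout' || result.done) {
        break;
      }
    }
  } finally {
    onFinishedResolving();
    // Stop reading so the stream and its buffered chunks can be released.
    void reader.cancel().catch(() => undefined);
  }
}
```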

Member

@Lei-k could you address this concern?
