
feat: use devtools api to get response payload #1019

Open

tmair wants to merge 1 commit into main
Conversation

@tmair commented Jun 7, 2024

This PR uses the Chrome devtools API to access the response payload instead of patching fetch.

There is one issue I could not resolve (I suspect it is an issue in the Chrome devtools): sometimes, during Next.js development mode, the payload of the response cannot be accessed.
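A minimal sketch of what this approach looks like. The `handleNetworkRequest` body, the `processPayload` sink, and the content-type filter below are illustrative placeholders, not the PR's actual code; the only confirmed piece is the `onRequestFinished` registration:

```javascript
// Sketch of the devtools-based approach (placeholder names, not the PR's code).
// chrome.devtools.network.onRequestFinished hands the listener a HAR entry
// whose getContent() callback yields the response body.
const seen = []; // placeholder sink for captured payloads
function processPayload(content) {
  seen.push(content);
}

function handleNetworkRequest(request) {
  // Filter by content type; the exact check here is illustrative.
  const ct = request.response.headers.find(
    (h) => h.name.toLowerCase() === "content-type"
  );
  if (!ct || !ct.value.includes("text/x-component")) return;

  request.getContent((content, _encoding) => {
    // getContent can hand back null (cf. the Next.js dev-mode issue above).
    if (content != null) processPayload(content);
  });
}

// Only register inside an actual devtools page.
if (typeof chrome !== "undefined" && chrome.devtools) {
  chrome.devtools.network.onRequestFinished.addListener(handleNetworkRequest);
}
```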

Comment on lines +119 to +120
chrome.devtools.network.onRequestFinished.addListener(handleNetworkRequest);

Owner

onRequestFinished implies the full response has already been returned, right? The extension needs to support HTTP streaming, so that you can later rewind time to see when specific chunks came in.

Do you know of a way to do that without using a fetch patch?
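For context, the fetch-patch approach can observe chunk timing by teeing the response body, so the page's stream is untouched while a second branch is drained and timestamped. A minimal sketch, reconstructed for illustration (not the extension's actual implementation):

```javascript
// Sketch of a fetch patch that records when each streamed chunk arrives.
// `recorded` and `patchFetch` are illustrative names, not the extension's code.
const recorded = []; // { timestamp, chunk } entries

function patchFetch(globalObj) {
  const originalFetch = globalObj.fetch;
  globalObj.fetch = async (...args) => {
    const response = await originalFetch(...args);
    if (!response.body) return response;

    // tee() splits the body: one branch for the page, one for observation.
    const [forPage, forObserver] = response.body.tee();

    // Drain the observer branch, timestamping each chunk as it arrives.
    (async () => {
      const reader = forObserver.getReader();
      const decoder = new TextDecoder();
      for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        recorded.push({
          timestamp: Date.now(),
          chunk: decoder.decode(value, { stream: true }),
        });
      }
    })();

    // Hand the page an equivalent Response backed by the other branch.
    return new Response(forPage, {
      status: response.status,
      statusText: response.statusText,
      headers: response.headers,
    });
  };
}
```

Because both branches of the tee receive the same chunks, the page behaves as before while the observer sees each chunk the moment it comes in.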

Author

Yes, that is correct. You get access to the request and response. I was not aware of the timing feature for individual chunks within the same response. I guess that is not possible with the API offered by the devtools. I will dig a little deeper to find a definitive answer for it.

If it turns out that it is not possible, you can close the PR.

Owner

You'd be able to see this in action if you have a Suspense boundary on a page with an artificially slowed-down component inside. The response is an HTTP stream, and it won't wait for the slowed-down component to finish before it begins sending.
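A toy illustration of that behaviour, using only web-standard streams (no Next.js specifics): a single response body whose first chunk (the shell with the Suspense fallback) goes out immediately, while the slow component's chunk arrives later on the same stream.

```javascript
// Toy model of a streamed response (illustrative, not app code): the shell
// is sent immediately, the suspended content after an artificial delay.
function makeStreamingBody(delayMs) {
  const enc = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      // Shell with the Suspense fallback goes out right away.
      controller.enqueue(enc.encode("<div>shell + fallback</div>"));
      // The slow component's chunk arrives later on the same stream.
      setTimeout(() => {
        controller.enqueue(enc.encode("<div>slow component</div>"));
        controller.close();
      }, delayMs);
    },
  });
}

// Read the stream and note how long after the start each chunk arrived.
async function readWithTimings(stream) {
  const reader = stream.getReader();
  const dec = new TextDecoder();
  const t0 = Date.now();
  const arrivals = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    arrivals.push({ afterMs: Date.now() - t0, chunk: dec.decode(value) });
  }
  return arrivals;
}
```

An `onRequestFinished`-style listener only fires once the whole stream above has closed, which is why per-chunk timing is lost with that approach.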

> I will dig a little deeper to find a definitive answer for it.

Thanks, let me know what you find!
