mirror of
https://github.com/ProxymanApp/Proxyman.git
synced 2026-04-25 16:15:55 +03:00
[GH-ISSUE #1323] Is there a way to see response data for streaming API calls (e.g. endpoints meant for receiving server-sent events)? #1317
Originally created by @gamerkhang on GitHub (Aug 5, 2022).
Original GitHub issue: https://github.com/ProxymanApp/Proxyman/issues/1323
Originally assigned to: @NghiaTranUIT on GitHub.
Currently, you do not see the response data for an API call until it has completed.
However, some API calls are meant to stay open to receive server-sent events, where the response data is continuously streamed.
Is there a way to see this information in Proxyman?
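For context, an SSE endpoint keeps the connection open and flushes `data:` frames one at a time, which is why a proxy that buffers the whole body shows nothing until the call completes. Below is a minimal sketch of such an endpoint using only Python's standard library; the handler, port, and event contents are illustrative, not related to Proxyman's code:

```python
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def sse_frame(data: str) -> str:
    """Format one server-sent event: 'data:' lines, blank-line terminated."""
    return "".join(f"data: {line}\n" for line in data.splitlines()) + "\n"

class SSEHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/event-stream")
        self.send_header("Cache-Control", "no-cache")
        self.end_headers()
        # Each event is flushed immediately. A proxy that buffers the body
        # until the connection closes shows nothing until the stream ends.
        for i in range(3):
            self.wfile.write(sse_frame(f"event {i}").encode())
            self.wfile.flush()
            time.sleep(1)

# To run locally:
# HTTPServer(("127.0.0.1", 8000), SSEHandler).serve_forever()
```

Pointing a debugging proxy at an endpoint like this reproduces the complaint: the events arrive a second apart, but a buffering proxy only displays them after the handler returns.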
@NghiaTranUIT commented on GitHub (Aug 6, 2022):
We implemented this feature in the past, but it didn't work so well, so we completely removed it 😿
Can you elaborate on what type of streaming API you'd like to check? (Content-Type?)
@ivanmoskalev commented on GitHub (Nov 23, 2022):
I'm guessing server-sent events (`text/event-stream`).
@wesbos commented on GitHub (May 24, 2023):
This would be a neat feature to have. As streaming becomes more popular in the browser, there isn't a single tool I've found that will let you see the streamed response as it's coming in; they all wait for the request to close before showing the entire payload.
A common use case right now: many of these GPT chat apps stream the response in from OpenAI. Many use Web Streams (https://developer.mozilla.org/en-US/docs/Web/API/Streams_API) and others use server-sent events.
@NghiaTranUIT commented on GitHub (May 25, 2023):
I guess I can support the Streaming Body by looking at the Response Header:
Let me play around and send you a Beta build 👍
Reference: https://gist.github.com/CMCDragonkai/6bfade6431e9ffb7fe88
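The header-based detection mentioned above could be sketched as follows. The signals checked here (SSE content type, NDJSON, chunked transfer encoding) are common conventions for streamed bodies, and the function is an illustration, not Proxyman's actual logic:

```python
def looks_like_stream(headers: dict[str, str]) -> bool:
    """Heuristic: guess from response headers whether the body is streamed."""
    h = {k.lower(): v.lower() for k, v in headers.items()}
    content_type = h.get("content-type", "")
    return (
        content_type.startswith("text/event-stream")      # server-sent events
        or content_type.startswith("application/x-ndjson")  # newline-delimited JSON
        or "chunked" in h.get("transfer-encoding", "")      # chunked transfer
    )
```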
@SOVRON commented on GitHub (Jul 13, 2023):
Any update on this? I too am trying to intercept our ChatGPT stream API request, but the Proxyman Mac app shows nothing. Thanks!
@farmisen commented on GitHub (Aug 23, 2023):
I would also be super interested in that feature. Trying to debug our in-house SSE-streamed events, and being able to see them as they come in instead of all at once when the last one is sent, would help a ton. I'll definitely be able to help QA that feature if needed.
@reubn commented on GitHub (Nov 5, 2023):
Currently facing this issue as well
@ChristianWeyer commented on GitHub (Dec 21, 2023):
Oh yeah, this is a super helpful feature @NghiaTranUIT - any updates on this?
Thanks!
@NghiaTranUIT commented on GitHub (Dec 22, 2023):
@ChristianWeyer not yet 😢 I tried to implement it, but it breaks our current flow and doesn't meet our requirements. Thus, we postponed it until we find a better solution.
For example: chunks arrive in a very short time (milliseconds) -> causing the UI to update too many times -> lag and unresponsiveness.
@ChristianWeyer commented on GitHub (Dec 22, 2023):
Thanks for getting back with the details @NghiaTranUIT - do you know of any similar HTTPS debugging proxy tool running on macOS that can handle response streaming?
@NghiaTranUIT commented on GitHub (Dec 22, 2023):
@ChristianWeyer you can use Charles Proxy. However, it's hard to set up, and you might need to follow some tutorials on Google 👍
@ChristianWeyer commented on GitHub (Dec 22, 2023):
Charles is too slow and cumbersome... 😅
@NghiaTranUIT commented on GitHub (Mar 8, 2024):
Good news everyone 🎉
`Content-Type: text/event-stream`
Video:
https://github.com/ProxymanApp/Proxyman/assets/5878421/fba011b2-576e-4fcd-9d34-a5e489d19400
@ChristianWeyer @reubn @farmisen @SOVRON please give it a try and share with me the result 👍 I appreciate it 🙇
@lennondotw commented on GitHub (Mar 13, 2024):
It's working! 🎉
But there's an issue if scripting is enabled. I use scripts to add custom headers and don't change the HTTP body data. With scripting enabled, SSE data shows as a stream in Proxyman, but Chrome isn't receiving any data from Proxyman until the request is done; Chrome receives all the SSE data at once.
Can we have an option to tell Proxyman that a script will not modify the HTTP body, so it doesn't have to wait for the entire request to end, and instead returns the data to the client in real time?
@NghiaTranUIT commented on GitHub (Mar 15, 2024):
@reekystive I'm not sure how to implement the Scripting with SSE yet.
Currently, when a request matches Scripting/Breakpoint, the script is executed once the body is fully received -> so we can modify the body (`response.body`) -> then it writes the entire HTTP response to the client.
@NghiaTranUIT commented on GitHub (Mar 15, 2024):
@reekystive I'm working on this change. May I ask:
Do you modify the `Request` or the `Response` part?
@lennondotw commented on GitHub (Mar 15, 2024):
@NghiaTranUIT I only modify the request header with scripts, to test APIs in production and test environments. But maybe someone will want to modify the response header, who knows?
@DeniDoman commented on GitHub (Oct 1, 2024):
Thank you for the implementation! The only UX issue I have is the data representation. GPT-like streaming APIs output data word by word, and the current output looks like this:
While I'd prefer it to look more natural, like a combined text.
If someone else faces this issue, I created a simple online converter to merge `data: {"type":"Content","event_type":"data","content":" prompt"}` lines into readable output: https://brown-celinka-85.tiiny.site
@ivanmoskalev commented on GitHub (Oct 1, 2024):
Not all SSE APIs would benefit from this. I have worked with an instant messaging product that utilized SSE for messages that were not intended to be joined.
@DeniDoman commented on GitHub (Oct 1, 2024):
Sure, I agree.
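The merging that converter performs can be sketched like this. The top-level `content` key follows the example line quoted above, and the function name is made up for illustration:

```python
import json

def merge_data_lines(raw: str, key: str = "content") -> str:
    """Join the value of `key` from each `data:` JSON line into one
    readable string, similar to what the converter linked above does."""
    pieces = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            obj = json.loads(line[len("data:"):].strip())
            pieces.append(obj.get(key, ""))
    return "".join(pieces)
```

As noted in the follow-up, this only makes sense for streams whose events are fragments of one text; event streams carrying independent messages should stay separate.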
@Swimburger commented on GitHub (Mar 25, 2025):
There's another format that's commonly streamed: NDJSON.
https://github.com/ndjson/ndjson-spec
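An NDJSON stream is one JSON object per line, so a viewer can parse and display each object as soon as its terminating newline arrives, even when network chunks split a line. A rough sketch of such incremental parsing (the function is illustrative, not from the spec or from Proxyman):

```python
import json

def iter_ndjson(chunks):
    """Incrementally parse NDJSON: yield each object as soon as its
    newline-terminated line is complete, even across chunk boundaries."""
    buf = ""
    for chunk in chunks:
        buf += chunk
        while "\n" in buf:
            line, buf = buf.split("\n", 1)
            if line.strip():
                yield json.loads(line)
    if buf.strip():  # trailing object without a final newline
        yield json.loads(buf)
```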
@NghiaTranUIT commented on GitHub (Mar 25, 2025):
FYI, you can prettify each JSON Streaming message by selecting a JSON string -> Right-click -> View as -> Prettify JSON
https://github.com/user-attachments/assets/bf7b4858-ac82-46de-ad6b-5690206f0263
@Swimburger commented on GitHub (Mar 25, 2025):
Oh, I forgot to mention: in my use case with NDJSON, it's actually in the HTTP request, not the response (as opposed to the rest of this thread).
I think streamed requests should receive the same realtime UX in Proxyman.
@avarayr commented on GitHub (Apr 14, 2025):
If anyone here has a use case of debugging OpenAI-like outputs, use this script for a Custom Tab that shows the content deltas.
@johnib commented on GitHub (Jun 5, 2025):
Does this still work for you? The concept of a function being invoked directly? I can't get the Previewer tabs to work in SSE responses.
@NghiaTranUIT commented on GitHub (Jun 5, 2025):
@johnib Maybe I will introduce a native SSE tab for OpenAI. I will trim the `data:` prefix and prettify each JSON part. Does that work for you?
@ChristianWeyer commented on GitHub (Jun 5, 2025):
Actually, for OpenAI-compatible endpoints, not just OpenAI ;-)
@johnib commented on GitHub (Jun 5, 2025):
I think my problem is different: even the simplest values aren't being set on the Previewer tab.
Not sure how to debug this; even the example provided by Proxyman doesn't work.
The request is getting processed by the script, I can see this on the logs.
@NghiaTranUIT commented on GitHub (Jun 5, 2025):
@ChristianWeyer @johnib let's try this beta build: https://download.proxyman.io/beta/Proxyman_5.20.0_Support_SSE_Tab_for_openapi_endpoints.dmg
Changelogs
Screenshots
@ChristianWeyer commented on GitHub (Jun 5, 2025):
Thanks, just tried it. It works.
But... how can we see the final response - for humans?
@NghiaTranUIT commented on GitHub (Jun 5, 2025):
Just open the Body Tab. It shows the Raw SSE events
@ChristianWeyer commented on GitHub (Jun 5, 2025):
No, I mean the final response. The 'assembled' response.
@NghiaTranUIT commented on GitHub (Jun 5, 2025):
Can you give me an example of "the 'assembled' response"?
It's SSE; the server sent a bunch of events during the connection. There is no "assembled" response.
@ChristianWeyer commented on GitHub (Jun 5, 2025):
Yes, exactly. But especially for LLM calls (like OpenAI), the actually interesting trace is the final response.
When I send a request to OpenAI and get back an answer, it is the text of the LLM response that is of interest. In streamed mode, they use SSE; this is fine, and your tool currently handles it nicely.
But the 'assembled' final response is the concatenated `content` values of all events. We would need a way to 'prettify' this. Does that make sense?
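The assembly being asked for can be sketched as follows, assuming the public OpenAI Chat Completions stream format (`choices[0].delta.content` per event, plus a `data: [DONE]` sentinel). This is an illustration of the idea, not Proxyman's implementation:

```python
import json

def assemble_openai_stream(raw_sse: str) -> str:
    """Concatenate the `choices[0].delta.content` values from an
    OpenAI-style SSE body into the final assistant message text."""
    parts = []
    for line in raw_sse.splitlines():
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # OpenAI's end-of-stream sentinel
            break
        event = json.loads(payload)
        for choice in event.get("choices", []):
            content = choice.get("delta", {}).get("content")
            if content:
                parts.append(content)
    return "".join(parts)
```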
@NghiaTranUIT commented on GitHub (Jun 5, 2025):
Thanks, I understand it, but each event has different JSON key-values.
How can I assemble a new one?
Input:
I might need a sample output here.
@ChristianWeyer commented on GitHub (Jun 5, 2025):
See attached a sample answer from OpenAI GPT-4o for a simple RAG query.
The final text response is:
Response - api.openai.com_v1_chat_completions.txt
@NghiaTranUIT commented on GitHub (Jun 6, 2025):
Sorry, I don't understand how your attached file is assembled. From what I see, it has many individual events.
If you don't mind, it'd be great if you could share the output for my sample input in the previous comment.
@ChristianWeyer commented on GitHub (Jun 7, 2025):
This is an export from your tool :-)
@NghiaTranUIT commented on GitHub (Jun 13, 2025):
@ChristianWeyer I understand your suggestion. Working on it now 👍
Merged the `content` key from the OpenAI stream events; much easier to read the OpenAI response 👍
@NghiaTranUIT commented on GitHub (Jun 13, 2025):
@ChristianWeyer can you try this beta build: https://download.proxyman.io/beta/Proxyman_5.21.0_Try_to_merge_openai_response.dmg
Changelog
Demo
https://github.com/user-attachments/assets/2ef7afd6-cf30-4813-b3b3-bf13bb238cdf
@ChristianWeyer commented on GitHub (Jun 17, 2025):
Nice, this is great. Thank you!