mirror of
https://github.com/cbeuw/Cloak.git
synced 2026-04-26 21:15:55 +03:00
[GH-ISSUE #140] Regression in 2.3.2: Long download closed after StreamTimeout even if data still moving #115
Originally created by @notsure2 on GitHub (Dec 6, 2020).
Original GitHub issue: https://github.com/cbeuw/Cloak/issues/140
Hello @cbeuw
I think the latest pull request accidentally introduced a regression. If StreamTimeout is set to a low value (e.g. 5 seconds) and a long download takes more than StreamTimeout seconds, the session is forcefully closed even while data is still being downloaded. I will investigate.
The session needs to be kept alive as long as any traffic is passing through it.
@notsure2 commented on GitHub (Dec 6, 2020):
OK, upon further investigation, this is not a regression. The bug was there from the start (the read deadline is not refreshed when there is activity), but it appeared to work by luck: on every loop, streamBufferedPipe called timer.AfterFunc to schedule a wake-up just a very short time before the deadline actually passed, so StreamTimeout was in fact malfunctioning and never being applied correctly.
github.com/cbeuw/Cloak@f1c656758f/internal/multiplex/streamBufferedPipe.go (L64-L92)
In 2.3.2 the StreamTimeout is now enforced correctly. However, a new behaviour is required: any activity from the other side should reset the timeout. This handles the HTTP download case, where the browser sends a short request and then nothing, and the rest of the time is spent downloading.
Actually, even 2.3.2 still has a bug: if a browser uploads data continuously for the duration of the StreamTimeout, the connection will also be forcefully closed, because the deadline is only reset when the buffer runs out, not on every loop.