Mirror of https://github.com/s3fs-fuse/s3fs-fuse.git, synced 2026-04-25 13:26:00 +03:00
[GH-ISSUE #1019] Time problem with large files #559
Originally created by @DigitalCyberSoft on GitHub (Apr 25, 2019).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/1019
This is using the latest built version of s3fs.
While uploading a very large file, it eventually starts throwing the following error until retries run out.
The file size is 342,798,923,129 bytes, and the error occurs at around [start=342360064000][size=438859129][part=654].
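For what it's worth, the figures in the log are internally consistent with a 500 MiB multipart chunk size (an assumption inferred from the numbers, not confirmed against the s3fs configuration): part 654 would be the final, short part, so the failure happens at the very end of the upload. A quick check:

```python
FILE_SIZE = 342_798_923_129   # total file size from the report
PART_SIZE = 524_288_000       # 500 MiB, inferred from the start offset / part number

# Parts are numbered from 1; all parts before the failing one are full-size,
# so the failing part's offset is (part_number - 1) * PART_SIZE.
start = (654 - 1) * PART_SIZE
size = FILE_SIZE - start

assert start == 342_360_064_000   # matches [start=342360064000]
assert size == 438_859_129        # matches [size=438859129], the final short part
```

That part 654 is the last part fits the timeline below: the error only shows up well into the upload.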
For reference:
@gaul commented on GitHub (Apr 25, 2019):
I wonder if the retry code updates x-amz-date and re-signs the request?

@DigitalCyberSoft commented on GitHub (Apr 25, 2019):
I assumed something like this, as the problem appears to have popped up 30-40 minutes after the upload started.
@gaul commented on GitHub (Jul 2, 2019):
You can probably work around this by downgrading to version 1.84 until we can address this properly.