mirror of
https://github.com/s3fs-fuse/s3fs-fuse.git
synced 2026-04-25 21:35:58 +03:00
[GH-ISSUE #1675] Bad file descriptor #873
Originally created by @blakemcbride on GitHub (Jun 6, 2021).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/1675
Additional Information
The following information is very important in helping us help you. Omitting any of these details may delay your support request or cause it to go unanswered.
Keep in mind that the commands we provide to retrieve information are oriented to GNU/Linux distributions, so you may need to use different commands if you run s3fs on macOS or BSD.
Version of s3fs being used (s3fs --version)
1.89
Version of fuse being used (pkg-config --modversion fuse, rpm -qi fuse, dpkg -s fuse)
2.9.7
Kernel information (uname -r)
5.4.0-1049-aws
GNU/Linux Distribution, if applicable (cat /etc/os-release)
NAME="Ubuntu"
VERSION="18.04.5 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.5 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
s3fs command line used, if applicable
cp Flash-VM.ova /S3
/etc/fstab entry, if applicable
s3fs on /S3 type fuse.s3fs (rw,nosuid,nodev,relatime,user_id=0,group_id=0)
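(Editorial note: the line above is mount output rather than an /etc/fstab entry. For reference, a typical s3fs fstab line looks like the following sketch; the bucket name and option set are placeholders, not taken from this issue.)

```
mybucket /S3 fuse.s3fs _netdev,passwd_file=/etc/key 0 0
```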
s3fs syslog messages (grep s3fs /var/log/syslog, journalctl | grep s3fs, or s3fs outputs)
Jun 6 11:59:31 aws-linux-desktop s3fs[10964]: s3fs version 1.89(c2c56d0) : s3fs -o passwd_file=/etc/key arahant-backups /S3
Jun 6 11:59:31 aws-linux-desktop s3fs[10964]: Loaded mime information from /etc/mime.types
Jun 6 11:59:31 aws-linux-desktop s3fs[10966]: init v1.89(commit:c2c56d0) with OpenSSL
Jun 6 11:59:31 aws-linux-desktop s3fs[10966]: s3fs.cpp:s3fs_check_service(3541): Failed to connect region 'us-east-1'(default), so retry to connect region 'us-east-2'.
Jun 6 12:01:59 aws-linux-desktop s3fs[11005]: s3fs version 1.89(c2c56d0) : s3fs -o passwd_file=/etc/key arahant-backups-2 /S3
Jun 6 12:01:59 aws-linux-desktop s3fs[11005]: Loaded mime information from /etc/mime.types
Jun 6 12:01:59 aws-linux-desktop s3fs[11007]: init v1.89(commit:c2c56d0) with OpenSSL
Jun 6 12:01:59 aws-linux-desktop s3fs[11007]: s3fs.cpp:s3fs_check_service(3541): Failed to connect region 'us-east-1'(default), so retry to connect region 'us-east-2'.
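The "Failed to connect region" lines show s3fs probing the default us-east-1 endpoint and then retrying us-east-2. As a hedged aside (options taken from the s3fs man page; the bucket name and paths are placeholders), pointing s3fs at the bucket's region directly avoids that retry:

```
s3fs mybucket /S3 -o passwd_file=/etc/key -o endpoint=us-east-2 -o url=https://s3.us-east-2.amazonaws.com
```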
Details about issue
I have a file that is 9.6GB. I can copy it from one location on my disk to another without a problem. But when I do:
cp Flash-VM.ova /S3
I get:
cp: error writing '/S3/Flash-VM.ova': Bad file descriptor
It only copies 5GB of the 9.6GB file. It seems like it has a 5GB limit.
Thanks!
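For context only (general S3 limits from AWS documentation, not a diagnosis from this thread): a single PUT request tops out at 5 GB, so larger objects must go through multipart upload, which allows up to 10,000 parts. A quick shell sketch of the part arithmetic for a file of roughly this size, assuming s3fs's default 10 MB part size:

```shell
# General S3 limits (AWS docs): single PUT <= 5 GiB; multipart <= 10,000 parts.
file_mib=9830   # ~9.6 GiB file from the report, expressed in MiB
part_mib=10     # assumed s3fs default multipart part size (10 MiB)
parts=$(( (file_mib + part_mib - 1) / part_mib ))   # ceiling division
echo "parts needed: $parts"
```

The file fits comfortably within multipart limits, so a hard stop at exactly 5 GB would point at the upload path rather than S3's object-size ceiling.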
@blakemcbride commented on GitHub (Jun 6, 2021):
The following did work:
@gaul commented on GitHub (Jun 7, 2021):
Can you run s3fs with debug logging via s3fs -f -d -o curldbg? Are you using AWS or another S3 implementation?
@blakemcbride commented on GitHub (Jun 7, 2021):
Yes, AWS.
t.txt
@gaul commented on GitHub (Jun 7, 2021):
Did you upload the correct log file? This uploads (275) 10 MB parts apparently successfully. This does not correlate with the 9.6 GB file you originally described.
@blakemcbride commented on GitHub (Jun 7, 2021):
The messages came out on the console. Once it hit the 5GB point it started spewing out many messages. I killed it and cut/paste the content of the console. It did error out at the same 5GB point.
@blakemcbride commented on GitHub (Jun 7, 2021):
Can you copy a random 10 GB file?
@gaul commented on GitHub (Jun 7, 2021):
Please provide the full log or close this issue.
@blakemcbride commented on GitHub (Jun 7, 2021):
You'll have to be explicit about which log file you want.
@gaul commented on GitHub (Jun 8, 2021):
Refer to the instructions in https://github.com/s3fs-fuse/s3fs-fuse/issues/1675#issuecomment-855491843.
@CarstenGrohmann commented on GitHub (Jun 9, 2021):
@blakemcbride: You can use the tee utility to collect the debug output. tee captures the output of the command, prints it to the console, and writes it to a file. Example:
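(A sketch of what such an invocation could look like; the s3fs line is illustrative only, and the bucket name, mount point, and log path are placeholders, not from this thread.)

```shell
# tee duplicates its stdin: one copy to the terminal, one to a file.
# For the requested debug run it would be combined like:
#   s3fs -f -d -o curldbg mybucket /S3 2>&1 | tee /tmp/s3fs-debug.log
# Minimal demonstration of the same pattern:
echo "sample debug line" | tee /tmp/tee-demo.log
cat /tmp/tee-demo.log   # the same line was also written to the file
```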
@ggtakec commented on GitHub (Jun 20, 2021):
@blakemcbride
In the log you provided, I found the following line:
This (pseudo_fd = -1) is a bug, and I posted PR #1693 to fix it. I would appreciate it if you could check the corrections in this PR.
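As general background (not s3fs code): a pseudo file descriptor of -1 reaching a write path fails with EBADF, which is exactly the "Bad file descriptor" message cp reported. A minimal shell reproduction of the symptom:

```shell
# Writing through a descriptor that is not open fails with EBADF,
# which userspace reports as "Bad file descriptor".
( exec 3>&-          # make sure fd 3 is closed
  echo data >&3      # the shell's redirection then fails with EBADF
) 2>&1 | grep -o "Bad file descriptor"
```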
@gaul commented on GitHub (Jul 25, 2021):
Please reopen if symptoms persist with the latest master.