[GH-ISSUE #656] Impossible to read large files from a mount in a raspberry pi 2 #374

Closed
opened 2026-03-04 01:44:55 +03:00 by kerem · 7 comments

Originally created by @ghost on GitHub (Oct 9, 2017).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/656

Hello!

When I mv/cp a large file from the mount to the local disk, I get an I/O error:

$ cp /mnt/s3/myBigFile ~/
cp: error reading '/mnt/s3/myBigFile': Input/output error

The same happens with any other software (e.g. a video player cannot read large videos):

$ mpv '/mnt/s3/myBigFile.mkv'
Failed to recognize file format.

I get these errors in the journal:

fdcache.cpp:Open(839): ftruncate() or fsync returned err(22)
s3fs[804]: s3fs.cpp:s3fs_read(2081): could not find opened fd(/myBigFile)
s3fs[804]: s3fs.cpp:s3fs_read(2081): could not find opened fd(/myBigFile)
s3fs[804]: s3fs.cpp:s3fs_read(2081): could not find opened fd(/myBigFile)
s3fs[804]: s3fs.cpp:s3fs_release(2213): could not find fd(file=/myBigFile)

Tested with different files.
It works with relatively small files, so I don’t think it’s permission-related.
Also, I use SSHFS on this Pi to retrieve the same files and never had any problem; WebDAV also works correctly.

However, s3fs works well on my laptop with the exact same configuration, except for the kernel (4.13.3, amd64).

Also, thank you for your work!

Steps to reproduce

  • Put a large file (more than 1.5 GB) in an S3 bucket, from anywhere: big.file.
  • Create a mount point for the current user: mkdir /mnt/s3 && chown me:me /mnt/s3
  • s3fs my-bucket /mnt/s3 -o passwd_file=~/.config/s3fs-credentials,allow_other,connect_timeout=15,retries=3,noatime,curldbg -d
  • Try to copy/read the big file. E.g. cp /mnt/s3/big.file ~/

Additional Information

Hardware: Rpi 2 (armv7)
Distro: Archlinux
Kernel: Linux 4.9.52-1
s3fs version: 1.80
fuse version: 2.9.7

kerem closed this issue 2026-03-04 01:44:55 +03:00

@ggtakec commented on GitHub (Oct 15, 2017):

@gui-don
@gui-don
This error appears to indicate that the ftruncate() function returned EINVAL (the length argument is negative or larger than the maximum file size).
We don't know the details of the Raspberry Pi, but there seems to be some restriction on it, consistent with the "It works with relatively small files" behavior you describe.

s3fs opens a temporary file with tmpfile() and changes its size with ftruncate().
Could you check whether the length ftruncate() accepts on the Raspberry Pi is restricted?

Thanks in advance for your assistance.


@gaul commented on GitHub (Oct 16, 2017):

I wonder if Raspberry Pi has a 32-bit off_t? Maybe try recompiling with replacing ftruncate with ftruncate64?


@gorky commented on GitHub (Dec 30, 2017):

Raspberry Pi models before the Pi 3 all have 32-bit CPUs; the Pi 3 is the first 64-bit Pi. So maybe ftruncate64 would be best?


@arl commented on GitHub (Feb 10, 2018):

@gui-don, did you have the opportunity to try ftruncate64? If so, I'd be interested to know whether it changed the outcome.


@ghost commented on GitHub (Feb 22, 2018):

I’m sorry, I don’t have the time to test it for now. I’ll post anything I try here.


@gaul commented on GitHub (Mar 15, 2019):

I successfully ran s3fs tests on a 64-bit Amazon ARM instance but lack a 32-bit instance that I can test on.


@gaul commented on GitHub (Apr 9, 2019):

If someone can provide a Raspberry Pi that I can ssh into or a working VM image we can investigate this.
