[GH-ISSUE #141] Can create/remove objects but content is lost #85

Closed
opened 2026-03-04 01:41:53 +03:00 by kerem · 6 comments
Owner

Originally created by @benkehoe on GitHub (Mar 5, 2015).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/141

Starting with commit https://github.com/s3fs-fuse/s3fs-fuse/commit/114966e7c0371eed17e2898cdf505ba4dbfb9b59 (I have narrowed it to this commit) I observe the following:

```
$ s3fs my_bucket ~/s3data && cd ~/s3data
$ echo "foo" > bar
bash: echo: write error: Input/output error
$ ls
bar
$ cat bar
$ rm bar
$ ls
$
```

The object can be seen to be present in the S3 console, but with a size of 0 bytes.
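For anyone reproducing this, a quick round-trip check can confirm whether writes on a mountpoint survive. This is a hypothetical helper (the function name and defaults are mine, not part of s3fs); it is demonstrated against a temporary local directory, but `mountpoint` could be pointed at the s3fs mount instead:

```python
import os
import tempfile

def roundtrip_check(mountpoint, name="s3fs-roundtrip-check", payload=b"foo\n"):
    """Write a small file, read it back, and report whether the content survived."""
    path = os.path.join(mountpoint, name)
    with open(path, "wb") as f:
        f.write(payload)           # on the broken commit this write failed with EIO
    with open(path, "rb") as f:
        data = f.read()
    os.remove(path)
    return data == payload         # False would match the 0-byte object seen in the console

# demo against a plain local directory
with tempfile.TemporaryDirectory() as d:
    print(roundtrip_check(d))      # prints True on a healthy filesystem
```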

kerem 2026-03-04 01:41:53 +03:00
  • closed this issue
  • added the bug label

@gaul commented on GitHub (Mar 5, 2015):

This fails in `small-integration-test.sh` as well. We should really automate these tests per-pull request via Travis #129!

```
$ ./small-integration-test.sh
./small-integration-test.sh: connect: Connection refused
./small-integration-test.sh: line 14: /dev/tcp/localhost/8080: Connection refused
./small-integration-test.sh: connect: Connection refused
./small-integration-test.sh: line 14: /dev/tcp/localhost/8080: Connection refused
I 03-05 11:41:31.906 main org.eclipse.jetty.util.log:186 |::] Logging initialized @1867ms
I 03-05 11:41:31.953 main o.eclipse.jetty.server.Server:327 |::] jetty-9.2.z-SNAPSHOT
I 03-05 11:41:31.986 main o.e.j.server.ServerConnector:266 |::] Started ServerConnector@3cdf2c61{HTTP/1.1}{127.0.0.1:8080}
I 03-05 11:41:31.987 main o.eclipse.jetty.server.Server:379 |::] Started @1951ms
echo HELLO WORLD to test-s3fs.txt
./integration-test-main.sh: line 77: echo: write error: Input/output error
```
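(Aside: the `Connection refused` lines appear to be the test script polling `/dev/tcp/localhost/8080` until the local test server comes up, and are harmless. A minimal sketch of that wait loop, with names of my own choosing rather than anything from the test script:)

```python
import socket
import time

def wait_for_port(host, port, timeout=10.0, interval=0.5):
    """Poll host:port until it accepts a TCP connection or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True      # server is accepting connections
        except OSError:
            time.sleep(interval) # not up yet; retry, like the /dev/tcp probe
    return False

# e.g. wait_for_port("localhost", 8080) before running the integration tests
```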

@admorphit commented on GitHub (Mar 7, 2015):

+1

I am using CentOS 7 and getting this error as well:

```
s3fs mybucket -f -d -o allow_other,del_cache,endpoint=us-west-1,url=https://s3-us-west-1.amazonaws.com /mnt/s3data/
```

The copy command fails:

```
cp s3data/bigfile.mp4 /home
cp: error reading ‘s3data/bigfile.mp4’: Input/output error
cp: failed to extend ‘/home/bigfile.mp4’: Input/output error
```

and here's the log:

```
s3fs_getattr(721): [path=/bigfile.mp4]
    HeadRequest(2112): [tpath=/bigfile.mp4]
    RequestPerform(1572): connecting to URL https://mybucket.s3-us-west-1.amazonaws.com/bigfile.mp4
    RequestPerform(1588): HTTP response code 200
    AddStat(248): add stat cache entry[path=/bigfile.mp4]
    GetStat(171): stat cache hit [path=/bigfile.mp4][time=1425723772][hit count=0]
s3fs_open(1911): [path=/bigfile.mp4][flags=32768]
    DelStat(370): delete stat cache entry[path=/bigfile.mp4]
    HeadRequest(2112): [tpath=/bigfile.mp4]
    RequestPerform(1572): connecting to URL https://mybucket.s3-us-west-1.amazonaws.com/bigfile.mp4
    RequestPerform(1588): HTTP response code 200
    AddStat(248): add stat cache entry[path=/bigfile.mp4]
    GetStat(171): stat cache hit [path=/bigfile.mp4][time=1425723772][hit count=0]
    ParallelGetObjectRequest(1105): [tpath=/bigfile.mp4][fd=5]
    GetStat(171): stat cache hit [path=/bigfile.mp4][time=1425723772][hit count=1]
    PreGetObjectRequest(2417): [tpath=/bigfile.mp4][start=0][size=10485760]
    PreGetObjectRequest(2417): [tpath=/bigfile.mp4][start=10485760][size=10485760]
    PreGetObjectRequest(2417): [tpath=/bigfile.mp4][start=20971520][size=10485760]
    PreGetObjectRequest(2417): [tpath=/bigfile.mp4][start=31457280][size=10485760]
    PreGetObjectRequest(2417): [tpath=/bigfile.mp4][start=41943040][size=10485760]
    Request(3627): [count=5]
s3fs_read(1957): could not find opened fd(/bigfile.mp4)
s3fs_read(1957): could not find opened fd(/bigfile.mp4)
s3fs_read(1957): could not find opened fd(/bigfile.mp4)
s3fs_read(1957): could not find opened fd(/bigfile.mp4)
s3fs_read(1957): could not find opened fd(/bigfile.mp4)
s3fs_flush(2017): [path=/bigfile.mp4][fd=5]
    GetStat(171): stat cache hit [path=/bigfile.mp4][time=1425723772][hit count=2]
s3fs_release(2057): [path=/bigfile.mp4][fd=5]
s3fs_release(2071): could not find fd(file=/bigfile.mp4)
```

Thoughts?
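The log pattern reads as if the open-file entry is being lost between `s3fs_open` (which reports fd=5) and `s3fs_read` (which cannot find it). A purely illustrative toy model of that registry pattern — none of these names come from s3fs — shows how an errant eviction between open and read would surface as exactly this EIO:

```python
class FdRegistry:
    """Toy model of an open-file registry keyed by path (illustrative only)."""
    def __init__(self):
        self._entries = {}

    def open(self, path, fd):
        self._entries[path] = fd
        return fd

    def read(self, path):
        fd = self._entries.get(path)
        if fd is None:
            # corresponds to the "could not find opened fd" log line
            raise OSError("could not find opened fd(%s)" % path)
        return fd

reg = FdRegistry()
reg.open("/bigfile.mp4", 5)
reg._entries.pop("/bigfile.mp4")   # an eviction between open and read...
try:
    reg.read("/bigfile.mp4")       # ...reproduces the EIO that cp reported
except OSError as e:
    print(e)
```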


@admorphit commented on GitHub (Mar 8, 2015):

I managed to resolve the issue today by upgrading FUSE from 2.9.2 to 2.9.3. I am using CentOS 7.

Hope this helps someone.


@ggtakec commented on GitHub (Mar 8, 2015):

Hi,
I confirmed that this is a bug introduced by commit 114966e.
I will fix this problem as soon as possible.
Please wait a while.


@ggtakec commented on GitHub (Mar 8, 2015):

I fixed this and merged it into the master branch in #143.
Please try it and check whether the problem is resolved.

Thanks in advance for your help.


@benkehoe commented on GitHub (Mar 9, 2015):

Looks fixed to me, and I'm still on fuse 2.9.2. Thanks for the quick response!
