mirror of
https://github.com/s3fs-fuse/s3fs-fuse.git
synced 2026-04-25 13:26:00 +03:00
[GH-ISSUE #2336] Inconsistency of bucket size and filesystem volume size where the bucket is mounted #1154
Originally created by @itsystem on GitHub (Sep 27, 2023).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/2336
Additional Information
Version of s3fs being used
Amazon Simple Storage Service File System V1.93 (commit:unknown) with OpenSSL
Version of fuse being used
2.9.2
Kernel information
3.10.0-1160.6.1.el7.x86_64
GNU/Linux Distribution
CentOS Linux 7 (Core)
How to run s3fs
command line
s3fs /home/bitrix/static -o passwd_file=$HOME/.passwd-s3fs -o url=https://storage.yandexcloud.net -o use_path_request_style -o bucket=test.static.example.com
/etc/fstab entry
test.static.example.com /home/bitrix/static fuse.s3fs _netdev,allow_other,use_path_request_style,url=https://storage.yandexcloud.net,passwd_file=/root/.passwd-s3fs 0 0
s3fs syslog messages
Syslog messages when mounted via fstab entry:
Syslog messages when mounted via command line:
Details about issue
The issue is that the volume of the mounted bucket reported by df -h is 4GB, while the real volume of the bucket is 500GB.
I tried to mount with the option bucket_size=500G, but then s3fs printed the message "s3fs: invalid bucket_size option."
So the question is: how can I mount the bucket with its real volume?
@OttaviaB commented on GitHub (Oct 13, 2023):
Hello,
The bucket_size option expects GB or GiB as the unit, not G.
Does that fix your problem?
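For readers hitting the same "invalid bucket_size option" error, the fix above amounts to changing the unit suffix from G to GB (or GiB). A sketch of the corrected invocation, reusing the bucket, mount point, and endpoint from this report; the exact accepted suffixes may vary by s3fs version:

```shell
# Mount with an explicitly reported volume of 500GB ("GB", not "G").
# Bucket name, mount point, and endpoint are taken from the report above.
s3fs /home/bitrix/static \
  -o passwd_file=$HOME/.passwd-s3fs \
  -o url=https://storage.yandexcloud.net \
  -o use_path_request_style \
  -o bucket=test.static.example.com \
  -o bucket_size=500GB

# Equivalent /etc/fstab entry with the same option appended:
# test.static.example.com /home/bitrix/static fuse.s3fs _netdev,allow_other,use_path_request_style,url=https://storage.yandexcloud.net,passwd_file=/root/.passwd-s3fs,bucket_size=500GB 0 0
```

After mounting, df -h on the mount point should report the configured 500GB instead of the default size.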
@itsystem commented on GitHub (Oct 19, 2023):
Hello. Thanks a lot. That fixed our problem.