Mirror of https://github.com/s3fs-fuse/s3fs-fuse.git (synced 2026-04-25 13:26:00 +03:00)
[GH-ISSUE #106] Limit Cache Size #64
Originally created by @Jafo232 on GitHub (Jan 20, 2015).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/106
Is there a way to limit the size of the cache, or somehow expire items in the cache after X amount of time?
@ggtakec commented on GitHub (Mar 4, 2015):
s3fs does not currently have a function to limit the cache size.
However, there is a sample script at test/sample_delcache.sh that may help solve your problem.
The script searches for files in the cache directory and removes cache files in order of atime (least recently accessed first).
Please try using and modifying the script.
Thanks in advance for your assistance.
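The idea behind the sample script can be sketched as a small shell function: walk the cache directory, sort files by access time, and delete the least recently used files until the directory fits under a size limit. This is a hypothetical minimal sketch, not the actual contents of test/sample_delcache.sh; the function name `trim_cache` and its interface are assumptions for illustration, and it relies on GNU `find`/`du`.

```shell
# trim_cache <cache_dir> <limit_kb>
# Delete the least-recently-accessed files in <cache_dir> until its total
# size (as reported by du) is at or below <limit_kb> kilobytes.
# Hypothetical sketch; the real sample_delcache.sh in the s3fs-fuse repo
# may differ in detail.
trim_cache() {
    dir="$1"
    limit_kb="$2"

    # List files as "<atime> <path>", oldest access first, then remove
    # them one by one until the directory is under the limit.
    find "$dir" -type f -printf '%A@ %p\n' | sort -n |
    while read -r _atime path; do
        used_kb=$(du -sk "$dir" | awk '{print $1}')
        [ "$used_kb" -le "$limit_kb" ] && break
        rm -f "$path"
    done
}
```

In practice such a script would be run periodically (e.g. from cron) against the directory passed to s3fs via the `use_cache` option.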
@ggtakec commented on GitHub (Jan 17, 2016):
#280 (#269) adds the ensure_diskfree option, which limits the cache size on your disk.
I'm closing this issue; if you still have a problem, please open a new issue or reopen this one.
Thanks in advance for your help.
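For readers landing here later, a hedged example of how the option is typically combined with a local cache at mount time (bucket name, mount point, cache path, and the 1024 MB threshold are all placeholders, not values from this issue):

```shell
# Mount a bucket with a local cache directory, asking s3fs to keep at
# least 1024 MB of free space on the disk holding the cache; s3fs evicts
# local cache files as needed to honor this.
s3fs mybucket /mnt/s3 \
    -o use_cache=/var/cache/s3fs \
    -o ensure_diskfree=1024
```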