[GH-ISSUE #1536] s3fs consumes lot of CPU after 5 days running or mounting #806

Closed
opened 2026-03-04 01:48:56 +03:00 by kerem · 2 comments
Owner

Originally created by @atulvspl on GitHub (Jan 25, 2021).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/1536

Hello guys,

Hope you are all doing well.

New Server information: Debian GNU/Linux 10 (buster)
Old server information: Debian GNU/Linux 8 (Jessie)

I had the S3 bucket mounted on the old server, holding approximately 2.5 TB of data. A few days ago I mounted the same bucket on the new server, but it is still syncing and causing very high CPU load. The server has 4 CPU cores, yet s3fs is using over 300% CPU, and because of that load my application is not working properly. I also checked on the AWS side for any per-day or per-request limits on data transfer, but I have not set any limits on uploads or downloads.

I have also updated /etc/updatedb.conf to add the mount point to PRUNEPATHS so that file indexing skips it, but the load is still the same.
Can you please advise me on how to resolve this issue?
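For reference, the PRUNEPATHS change described above typically looks like the following in /etc/updatedb.conf. This is only a sketch: the mount point `/mnt/s3` is a hypothetical placeholder, and you should substitute your actual s3fs mount path.

```shell
# /etc/updatedb.conf -- tell the daily updatedb indexing run to skip the
# s3fs mount, so locate's index does not walk the whole bucket over the network.
# "/mnt/s3" is a placeholder; substitute your real mount point.
PRUNEPATHS="/tmp /var/spool /media /mnt/s3"

# Alternatively, prune by filesystem type, which covers any fuse.s3fs mount
# regardless of where it is mounted:
PRUNEFS="NFS nfs nfs4 proc smbfs autofs iso9660 devpts fuse.s3fs"
```

Pruning by filesystem type (`fuse.s3fs`) is the more robust option, since it keeps working even if the mount point moves.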

kerem 2026-03-04 01:48:56 +03:00
  • closed this issue
  • added the
    need info
    label
Author
Owner

@gaul commented on GitHub (Feb 11, 2021):

Can you run s3fs with -f -d and see what files are being accessed? It might point to an application doing operations that you don't expect.
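For anyone finding this later, the suggested debug run looks roughly like this. The bucket name `mybucket` and mount point `/mnt/s3` are placeholders, not values from the original report:

```shell
# Unmount the bucket first, then re-run s3fs in the foreground.
# "mybucket" and "/mnt/s3" are placeholders for your bucket and mount point.
umount /mnt/s3

# -f keeps s3fs in the foreground; -d enables debug output, which logs the
# FUSE operations and S3 requests so you can see which files are being
# accessed. Tee the output to a file for later inspection.
s3fs mybucket /mnt/s3 -f -d 2>&1 | tee /tmp/s3fs-debug.log
```

Watching the log while the CPU spike occurs should reveal whether some process (for example a file indexer or backup job) is repeatedly scanning the mount.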

<!-- gh-comment-id:777168572 -->
Author
Owner

@gaul commented on GitHub (Apr 21, 2021):

Please reopen if symptoms persist.

<!-- gh-comment-id:824110717 -->