[GH-ISSUE #368] Feature Request Content-Encoding:gzip #191

Closed
opened 2026-03-04 01:43:07 +03:00 by kerem · 4 comments
Owner

Originally created by @lordtangent on GitHub (Feb 27, 2016).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/368

How about support for automatic gzip compression and tagging with Content-Encoding: gzip?

That would reduce the file sizes for many common file types including executables.
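A minimal sketch of what the requested behavior would mean, using only the Python standard library (the payload and header dict are illustrative, not s3fs code): s3fs would gzip the file body before the PUT and tag the object with `Content-Encoding: gzip`, so that HTTP clients decompress it transparently on GET.

```python
import gzip

# Hypothetical illustration of the feature request: compress the body
# before upload and record the Content-Encoding header on the object.
body = b"#!/bin/sh\necho hello\n" * 1000   # a highly compressible payload
compressed = gzip.compress(body)

headers = {"Content-Encoding": "gzip"}     # would be set on the S3 PUT

print(len(body), "->", len(compressed))    # compressed is far smaller
assert gzip.decompress(compressed) == body # lossless round trip
```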

kerem closed this issue 2026-03-04 01:43:08 +03:00

@tspicer commented on GitHub (Mar 3, 2016):

+1


@RobbKistler commented on GitHub (Mar 4, 2016):

I suspect the mapping of file-like access to object access (ranged GETs, S3 multipart PUTs) makes this difficult.


@ggtakec commented on GitHub (Mar 6, 2016):

I think it is difficult to compress dynamically.

Compressing on upload is possible, once s3fs has accumulated the whole file in its temporary cache.
Downloading, however, is difficult:
to read at a given offset, you would have to download and decompress the entire object first.
I think this would be inefficient.

Therefore, I recommend that you compress the object before uploading it.
(I don't think it is a good idea for s3fs to do the compression.)
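The download problem described above can be demonstrated with the Python standard library (a self-contained sketch, not s3fs code): a byte range taken from the middle of a gzip stream cannot be decompressed on its own, because DEFLATE decoder state depends on all preceding bytes.

```python
import gzip
import zlib

# Simulate an object stored gzip-compressed in S3.
data = bytes(range(256)) * 512
blob = gzip.compress(data)

# A ranged GET returns a slice from the middle of the compressed
# stream. That slice is not independently decompressible, so s3fs
# would have to fetch and decompress the whole object just to serve
# a read() at an arbitrary offset.
middle = blob[len(blob) // 2:]
try:
    gzip.decompress(middle)
    print("unexpectedly decompressed")
except (gzip.BadGzipFile, zlib.error, EOFError):
    print("mid-stream slice is not independently decompressible")
```

This is why the comment above recommends compressing before upload: a whole-object GET with `Content-Encoding: gzip` works, but offset reads do not.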


@ggtakec commented on GitHub (Mar 30, 2019):

We kept this issue open for a long time.
I will close this, but if the problem persists, please reopen or post a new issue.
