[PR #1016] [MERGED] Fix ACL errors for newly created and pre-existing blobs #1118

opened 2026-03-03 12:33:09 +03:00 by kerem · 0 comments

📋 Pull Request Information

Original PR: https://github.com/fsouza/fake-gcs-server/pull/1016
Author: @RachitSharma2001
Created: 12/22/2022
Status: Merged
Merged: 1/8/2023
Merged by: @fsouza

Base: main ← Head: fix_944_945


📝 Commits (3)

  • 4b0dd7e Fix ACL errors for existing and pre-existing blobs
  • 2312a77 internal/backend: generalize code a bit
  • 0085fc6 Remove some comments

📊 Changes

7 files changed (+113 additions, -27 deletions)


📝 fakestorage/object.go (+23 -3)
📝 internal/backend/fs.go (+7 -13)
📝 internal/backend/memory.go (+5 -10)
📝 internal/backend/object.go (+21 -0)
📝 internal/backend/storage.go (+1 -1)
📝 main.go (+7 -0)
📝 main_test.go (+49 -0)

📄 Description

This fixes #944 and fixes #945.

The following two Python snippets no longer crash and instead produce the correct output. The first one exercises a pre-existing blob (thus fixing #944):

```python
import os

from google.auth.credentials import AnonymousCredentials
from google.cloud import storage

os.environ["STORAGE_EMULATOR_HOST"] = "http://0.0.0.0:4443"

client = storage.Client(
    credentials=AnonymousCredentials(),
    project="test-project",
)

# initial bucket/file for docker image
bucket_name = "sample-bucket"
blob_name = "some_file.txt"

# test
bucket = client.get_bucket(bucket_name)
blob = bucket.blob(blob_name)
print(list(blob.acl))
blob.make_public()
print(list(blob.acl))
blob.make_private()
print(list(blob.acl))
```

The outputs:

```
[{'entity': 'projectOwner-test-project', 'role': 'OWNER'}]
[{'entity': 'projectOwner-test-project', 'role': 'OWNER'}, {'entity': 'allUsers', 'role': 'READER'}]
[{'entity': 'projectOwner-test-project', 'role': 'OWNER'}]
```

The second snippet shows that for a newly created blob, the ACLs also update when make_public() and make_private() are called (thus fixing #945):

```python
import os

from google.auth.credentials import AnonymousCredentials
from google.cloud import storage

os.environ["STORAGE_EMULATOR_HOST"] = "http://0.0.0.0:4443"

client = storage.Client(
    credentials=AnonymousCredentials(),
    project="test-project",
)

# buckets/files to create and test
upload_bucket_name = "test-bucket-with-globally-unique-name"
upload_blob_name = "test-blob-upload.svg"
upload_blob_file_path = "./image.svg"

# initialize
try:
    bucket = client.bucket(upload_bucket_name)
    bucket.storage_class = storage.constants.STANDARD_STORAGE_CLASS
    client.create_bucket(bucket, location="EU", retry=None)
except Exception:
    pass  # bucket may already exist
bucket = client.get_bucket(upload_bucket_name)
blob = bucket.blob(upload_blob_name)
blob.upload_from_filename(upload_blob_file_path, retry=None)

# test
bucket = client.get_bucket(upload_bucket_name)
blob = bucket.blob(upload_blob_name)
print(list(blob.acl))
blob.make_public()
print(list(blob.acl))
blob.make_private()
print(list(blob.acl))
```

The outputs:

```
[{'entity': 'projectOwner-test-project', 'role': 'OWNER'}]
[{'entity': 'projectOwner-test-project', 'role': 'OWNER'}, {'entity': 'allUsers', 'role': 'READER'}]
[{'entity': 'projectOwner-test-project', 'role': 'OWNER'}]
```

Explanation of my implementation:

For fixing #944:
Within main.go, when the server reads the pre-existing files in a bucket at startup, it previously never set an ACL on any of these objects. I assumed that for these pre-existing blobs, the ACL for each should simply be `[{'entity': 'projectOwner-test-project', 'role': 'OWNER'}]`.
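The assumption above can be sketched in Go as a small helper that produces the default ACL for a seeded object. This is a minimal illustration, not the PR's actual code: the `ACLRule` struct and `defaultObjectACL` function names are hypothetical.

```go
package main

import "fmt"

// ACLRule mirrors the entity/role pairs the Storage API returns for
// object ACLs (e.g. {"entity": "allUsers", "role": "READER"}).
type ACLRule struct {
	Entity string
	Role   string
}

// defaultObjectACL is a hypothetical helper capturing the PR's assumption:
// a blob that existed before the server started gets a single owner-only
// ACL entry for the project.
func defaultObjectACL(projectID string) []ACLRule {
	return []ACLRule{{Entity: "projectOwner-" + projectID, Role: "OWNER"}}
}

func main() {
	fmt.Println(defaultObjectACL("test-project"))
}
```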

For fixing #945:
From looking at how the Python client library updates ACLs, I found that rather than sending a POST to the endpoint `/b/{bucketName}/o/{objectName:.+}/acl`, it sends a PATCH request to the endpoint `/b/{bucketName}/o/{objectName:.+}`.

Thus, I needed to update the patchObject method within fakestorage/object.go to detect whether new ACLs are passed in and, if so, update the object's ACL.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
