mirror of
https://github.com/s3fs-fuse/s3fs-fuse.git
synced 2026-04-25 21:35:58 +03:00
[GH-ISSUE #777] s3fs Memory Error #446
Originally created by @yssefunc on GitHub (Jun 13, 2018).
Original GitHub issue: https://github.com/s3fs-fuse/s3fs-fuse/issues/777
I am working on an AWS EMR cluster with data in S3 storage. After cleaning the data, I upload it back to S3 via the s3fs library. The code works for files between 200 and 500 MB, but uploads in the 2.0 to 2.5 GB range fail with a "MemoryError". Does anyone have ideas or experience with this issue?
import s3fs
bytes_to_write = nyc_green_20161.to_csv(None).encode()
fs = s3fs.S3FileSystem(key='#', secret='#')
with fs.open('s3://ludditiesnyctaxi/new/2016/yellow/yellow_1.csv', 'wb') as f:
    f.write(bytes_to_write)
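(Editor's note: the snippet above materializes the entire CSV as one bytes object via `to_csv(None).encode()`, which is the likely source of the MemoryError on multi-gigabyte files. A memory-friendlier pattern, sketched below with only the standard library and a local buffer standing in for `fs.open(...)`, is to stream rows to the open file object instead of building one giant string first; the column names and row values here are illustrative, not from the original data.)

```python
import csv
import io

def write_rows_streaming(rows, f):
    """Write rows one at a time so the full CSV never exists in memory."""
    writer = csv.writer(f)
    for row in rows:
        writer.writerow(row)

# Stand-in for a file object like fs.open('s3://bucket/key.csv', 'w');
# any text-mode file object works the same way.
buf = io.StringIO()
write_rows_streaming([["id", "fare"], [1, 9.5], [2, 12.0]], buf)
print(buf.getvalue().splitlines()[0])  # prints the header row: id,fare
```

With the Python s3fs library, passing the object returned by `fs.open(...)` as `f` would let the library buffer and upload in parts rather than holding the whole payload at once.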
@gaul commented on GitHub (Jun 13, 2018):
@yssefunc Please close this issue and follow up with the Python s3fs project. Unfortunately we share the same name, which causes confusion.
@yssefunc commented on GitHub (Jun 13, 2018):
Ok..