mirror of
https://github.com/asciinema/asciinema.git
synced 2026-04-25 07:55:51 +03:00
[GH-ISSUE #91] Error: Sorry, your asciicast is too big. #69
Originally created by @drvinceknight on GitHub (Apr 10, 2015).
Original GitHub issue: https://github.com/asciinema/asciinema/issues/91
I can't seem to upload any casts?
I've just done one that was very short in duration. Is the size dependent on the amount of text being displayed? (I am, for example, using a pip install, so perhaps that's the problem.)
@daixtr commented on GitHub (Apr 10, 2015):
I had this error too. I saved locally instead and saw it's only 4 MB.
@drvinceknight commented on GitHub (Apr 10, 2015):
Using `pip install -q` to greatly decrease the output allowed me to upload. Perhaps something about the size limit could be added to the documentation? (Happy to work on a PR if this is a welcome addition.)

@ku1ik commented on GitHub (Apr 10, 2015):
The max size was set to 2 MB, which appears to be too low. I have upped it to 5 MB. This isn't much, but I'm paying for the storage (S3) out of my own pocket, so I can't offer GBs of storage for every user. Let me know if that works for you. I'm fine with increasing it even more, but for now I want to figure out a good middle ground between user needs and hosting costs.
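For readers scripting their uploads, the limit discussed here can be checked locally before attempting an upload. A minimal Python sketch, assuming the 5 MB figure from this comment (interpreted as binary megabytes) and using a hypothetical helper name that is not part of the asciinema CLI:

```python
import os

# Size limit discussed in this thread (assuming binary megabytes).
UPLOAD_LIMIT_BYTES = 5 * 1024 * 1024

def fits_upload_limit(path: str, limit: int = UPLOAD_LIMIT_BYTES) -> bool:
    """Return True if the recording at `path` is within the server's limit."""
    return os.path.getsize(path) <= limit
```

Checking up front avoids a round trip to the server just to learn the cast is too big.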
@drvinceknight commented on GitHub (Apr 10, 2015):
That's cool @sickill, do you have a monetization plan? Perhaps people could pay to lift the limit?
@drvinceknight commented on GitHub (Apr 10, 2015):
How would you feel about something about the limit being in the documentation? (I missed it if it's already there)
@ku1ik commented on GitHub (Apr 10, 2015):
Paying for lifting the limit sounds interesting, yeah.
I'll add info about the limit to the docs. Thanks!
@ku1ik commented on GitHub (Apr 22, 2015):
@drvinceknight @daixtr is the current limit good enough for now? Still hitting the limit?
@drvinceknight commented on GitHub (Apr 22, 2015):
Nope, current limit works great for me. Thanks!
@morissette commented on GitHub (Apr 13, 2017):
:(
```
[mharris@mori example]$ asciinema upload /tmp/tmp0p8dkaof-asciinema.json
~ Upload failed: Sorry, your asciicast is too big.
~ Retry later by running: asciinema upload /tmp/tmp0p8dkaof-asciinema.json
[mharris@mori example]$ ls -lah /tmp/tmp0p8dkaof-asciinema.json
-rw------- 1 mharris mharris 8.5M Apr 12 20:51 /tmp/tmp0p8dkaof-asciinema.json
```
@caglar10ur commented on GitHub (Oct 10, 2017):
How about storing those files gzipped in S3? I just produced a 21 MB file and realized that there is a limit, but compressing the same file yields less than 1 MB, which I believe is under the current limit.
@ku1ik commented on GitHub (Oct 11, 2017):
@caglar10ur asciinema-server actually stores the recordings gzipped on S3 :)
The thing is, if we allowed gzipped uploads from the client to the server, then a 5 MB gzipped file could actually be 100 MB ungzipped when later downloaded by the browser (files on S3 are served with `Transfer-Encoding: gzip`, so they're automatically ungzipped by the browser before being handed to the XHR result callback). From this 100 MB of JSON the player builds frames, transforming each stdout event into full screen contents (lines with groups of colored text etc.). This whole thing probably takes 10x the memory the fetched JSON takes.
In other words, my concern here is high memory usage in the browser.
One option to optimize memory usage would be to fetch the recording in gzipped form, without the `Transfer-Encoding: gzip` header, and ungzip it on the go, in a lazy/streaming fashion. This way we lose the super-fast native ungzipping provided by the browser, and I have yet to find a streaming/incremental gzip decoder for JS (I'd love to be pointed to one if such exists). I'm open to further discussion of this!
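For what it's worth, the incremental gunzipping described above is straightforward outside the browser. A minimal Python sketch of the idea, using the stdlib `zlib` with `wbits=31` to accept the gzip wrapper, and feeding the compressed payload in small chunks the way a streaming JS decoder would:

```python
import gzip
import io
import zlib

# Stand-in for a gzipped recording fetched from storage.
raw = b'{"version": 1, "stdout": []}' * 1000
compressed = gzip.compress(raw)

# wbits=31 tells zlib to expect the gzip header and trailer.
decompressor = zlib.decompressobj(wbits=31)
stream = io.BytesIO(compressed)
chunks = []
while block := stream.read(4096):  # feed 4 KB at a time
    chunks.append(decompressor.decompress(block))
chunks.append(decompressor.flush())

assert b"".join(chunks) == raw
```

The point of the chunked loop is that peak memory is bounded by the chunk size plus whatever the consumer retains, rather than by the full decompressed payload.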
@massimolepore commented on GitHub (Mar 8, 2018):
@sickill Running into "~ Upload failed: Sorry, your asciicast is too big." with a 1.1 MB file. What is the current limit for files now?
@ku1ik commented on GitHub (Mar 8, 2018):
@massimolepore no, that's not supposed to have changed. I think I may have made a mistake when reconfiguring the load balancer a few days ago. Let me check that.
@ku1ik commented on GitHub (Mar 8, 2018):
Fixed, it's 5 MB again. Sorry about that!
@ku1ik commented on GitHub (Jul 2, 2023):
FYI the limit is 8 MB now (for a while now actually).
@pscriptos commented on GitHub (Feb 2, 2024):
Hello,
is it possible to set the maximum upload size on a self-hosted instance?
I am running asciinema in Docker and had a longer session tonight, which left my file at about 13 MB in the end.
Unfortunately, I can't upload it because I get the following error:

```
-rw-r--r-- 1 root root 13M Feb 2 22:46 pix
```

Thank you very much for your great work!
@ku1ik commented on GitHub (Feb 3, 2024):
Hey @pscriptos!
Right now you'd need to customize the image as shown here: https://docs.asciinema.org/manual/server/self-hosting/customization/
But having an easy option (e.g. an env var) is something that would be useful. I'll see if I can implement that quickly.
@pscriptos commented on GitHub (Feb 3, 2024):
That would be fantastic. Thank you very much! :)
@ku1ik commented on GitHub (Feb 3, 2024):
@pscriptos good news. I've just released version 20240203, with a new `UPLOAD_SIZE_LIMIT` option. See here: https://docs.asciinema.org/manual/server/self-hosting/configuration/#upload-size-limit

@pscriptos commented on GitHub (Feb 3, 2024):
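For self-hosters following this thread: the new option is set via the container environment. A sketch of a docker-compose fragment, where the image name, tag, and value format are assumptions; verify the exact syntax against the configuration docs linked above:

```yaml
# docker-compose.yml fragment (layout, image, and value format assumed)
services:
  asciinema:
    image: ghcr.io/asciinema/asciinema-server:20240203  # image/tag assumed
    environment:
      - UPLOAD_SIZE_LIMIT=16MB  # value format assumed; see the linked docs
```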
You are my personal hero. Thank you so much! 🍻