[GH-ISSUE #1182] S3 improvements #677

Closed
opened 2026-02-26 02:33:56 +03:00 by kerem · 2 comments
Owner

Originally created by @Inkybro on GitHub (Feb 17, 2020).
Original GitHub issue: https://github.com/koel/koel/issues/1182

Description
Koel looks like one of the only projects out there that can really meet the needs I have.

The S3 integration *works*, but not very reliably, and the `koel-aws` repo isn't really up to date with the `koel` repo.

As an example, `koel-aws` ignores files with the `.flac` extension, even though `koel` supports it. This was the easiest/smallest fix I made locally; I already [opened a PR for this here](https://github.com/koel/koel-aws/pull/11).
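For illustration, that kind of fix boils down to an allow-list check like the sketch below (the exact extension set in `koel-aws` may differ; the point is just that `flac` has to be included):

```python
# Hypothetical allow-list check, mirroring the kind of fix in the koel-aws PR.
# The actual extension list in koel-aws may differ; what matters is that
# "flac" is present, so FLAC objects are not skipped.
SUPPORTED_EXTENSIONS = {"mp3", "ogg", "m4a", "aac", "flac"}

def is_supported(key: str) -> bool:
    """Return True if an S3 object key has a supported audio extension."""
    ext = key.rsplit(".", 1)[-1].lower() if "." in key else ""
    return ext in SUPPORTED_EXTENSIONS
```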

There are also problems introduced by -- I'm making a few assumptions here -- the file size of some `.flac` files. If the file is too large to be A) downloaded into memory and/or B) copied to local disk for tag reading, the Lambda will simply time out or fail. That seems to be the case, but only once the `.flac` file reaches a size of ~100 MB.

I've managed to take care of most of these issues on my own instance/repo. The file size issue is one I still haven't been able to figure out. Part of it may have ended up being due to a failure to clean `tmp`, but I'm not sure yet without more testing.

Example
The number one thing I'd want is some analogue of `koel:sync` for S3. I've already written a script to do this for me, since I don't want to pay the charge of uploading to S3 every time I want the data updated. The script runs locally: it lists the contents of the specified S3 bucket and progressively iterates through those files, running an operation very similar to `handlePut` in `koel-aws` for each one.
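A rough sketch of that loop (the page shape follows boto3's `list_objects_v2` responses; `handle_put` and the extension list are stand-ins for the real `koel-aws` logic, not its actual API):

```python
from typing import Callable, Iterable

# Assumed extension list; adjust to match what koel actually supports.
AUDIO_EXTENSIONS = {"mp3", "ogg", "m4a", "flac"}

def sync_bucket(pages: Iterable[dict], handle_put: Callable[[str], None]) -> int:
    """Iterate over S3 listing pages (shaped like list_objects_v2 responses)
    and invoke handle_put for every audio object key.

    Returns the number of objects processed."""
    count = 0
    for page in pages:
        for obj in page.get("Contents", []):
            key = obj["Key"]
            ext = key.rsplit(".", 1)[-1].lower() if "." in key else ""
            if ext in AUDIO_EXTENSIONS:
                handle_put(key)  # analogue of handlePut in koel-aws
                count += 1
    return count
```

With boto3 installed, `pages` would come from something like `boto3.client("s3").get_paginator("list_objects_v2").paginate(Bucket=...)`, so nothing is re-uploaded: the script only reads what is already in the bucket.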

So ultimately, IMHO, it might be better for the user to specify their AWS details along with the bucket(s) used, and for there to be some analogue of the normal FS sync command, or something along those lines. It could perhaps even be implemented alongside the normal FS sync command (as in, a user could use both the local FS and S3?). I'm not sure, just some ideas.
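For illustration, the kind of configuration I have in mind might look like this hypothetical `.env` fragment (the `AWS_S3_BUCKETS` variable is made up; the credential names are just the standard AWS ones):

```ini
# Hypothetical .env fragment: AWS credentials plus a bucket list
# that an S3 analogue of koel:sync could iterate over.
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=us-east-1
AWS_S3_BUCKETS=my-music-bucket,my-other-bucket
```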

Followup
I actually have a lot more I could add, but I don't have time right now.

Another thing I didn't fully understand is the lack of some kind of upload feature, but this is ultimately secondary to everything else I mentioned. If I just had a nice way to keep everything synced between S3 and Koel without having to re-upload (and waste money) to test things out, I'd be a lot happier.

Is this something you'd be interested in a PR for?

kerem 2026-02-26 02:33:56 +03:00
Author
Owner

@sergiustheblack commented on GitHub (Nov 15, 2021):

@Inkybro can you please share your "sync" script somehow?

Author
Owner

@phanan commented on GitHub (Jul 10, 2024):

A bit late, but Koel Plus now has full support for S3 as a storage driver.
