mirror of
https://github.com/lucascbeyeler/zmbackup.git
synced 2026-04-24 22:55:56 +03:00
[GH-ISSUE #175] S3 storage backup destination #140
Originally created by @ananiasfilho on GitHub (Apr 27, 2022).
Original GitHub issue: https://github.com/lucascbeyeler/zmbackup/issues/175
Send backups to an S3 storage bucket (AWS S3, MinIO, Wasabi, etc.)
ISSUE TYPE
ENVIRONMENT VERSION
SUMMARY
Since many Zimbra users need to send backup files to external storage, S3-compatible services are a good and inexpensive option.
The idea is:
Back up each account and send each file to an S3 bucket; importantly, this should run in parallel.
We can discuss it further if you accept this feature request.
@protontrigg3r commented on GitHub (May 30, 2022):
A better idea might be a generic remote storage backup destination, taking the code from rclone to connect to remote storage ...
@ananiasfilho commented on GitHub (May 30, 2022):
IMHO we need an integrated solution. rclone is good, but it is not integrated with zmbackup's procedures and would mean one more service/piece of software on the Zimbra server. When I proposed this, I was thinking of the following:
If I have a 2 TB backup, I shouldn't need 2 TB of free space on my server to store it, and I don't want to run the backup directly against remote storage (a slow procedure over NFS, SMBFS/CIFS, etc.). The idea is: run the 10-20 GB backup of one account/mailbox, and when that account finishes, send it to an S3 bucket, then start the next one and delete the old local copy. Today our environment needs more than 6 TB of free space to make backups, which is bad: after making the backup, I have to send 6 TB to remote storage and then delete it from local storage. If zmbackup synced directly to an S3 bucket, I could keep a list of backups and restore them from the bucket. Let's discuss!
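The workflow described above (back up one account, ship it to the bucket, delete the local copy, with several accounts in flight at once) can be sketched as a bounded-parallel pipeline. This is a hypothetical illustration, not zmbackup code: `run_backup` and `upload` are stand-ins for the real per-account backup command and an S3 client (e.g. boto3 or the AWS CLI), injected as callables so the structure is clear.

```python
import shutil
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def backup_and_ship(account, backup_root, run_backup, upload):
    """Back up one account, upload the result, then delete the local copy.

    run_backup(account, backup_root) -> Path of the session directory it
    wrote (hypothetical stand-in for the real zmbackup invocation).
    upload(path) ships that directory to the S3 bucket (hypothetical
    stand-in for an S3 client call).
    """
    session = run_backup(account, backup_root)   # per-account backup
    upload(session)                              # ship to the S3 bucket
    shutil.rmtree(session)                       # free local space at once
    return account

def backup_all(accounts, backup_root, run_backup, upload, parallel=4):
    # At most `parallel` account backups exist on local disk at any moment,
    # so local usage stays bounded by parallel * (largest account) instead
    # of the full size of all mailboxes combined.
    with ThreadPoolExecutor(max_workers=parallel) as pool:
        return list(pool.map(
            lambda a: backup_and_ship(a, backup_root, run_backup, upload),
            accounts))
```

The key design point is deleting each session directory as soon as its upload finishes, rather than uploading everything at the end, which is what keeps the peak local disk usage small.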
@protontrigg3r commented on GitHub (May 31, 2022):
I agree with you; in fact, I didn't write "using rclone" but "taking the code from rclone".
Very interesting ...