mirror of
https://github.com/007revad/Synology_M2_volume.git
synced 2026-04-26 00:06:14 +03:00
[GH-ISSUE #130] After successfully creating NVMe Storage Pool I can no longer add HDDs to other Storage Pool #28
Originally created by @cnstudios on GitHub (Feb 27, 2024).
Original GitHub issue: https://github.com/007revad/Synology_M2_volume/issues/130
I have a Synology NAS with Storage Pool 1, where all my data sits on slow disks. I added two NVMe drives and used them to create Storage Pool 2, which holds my Docker containers. Works as advertised. However, after adding a new slow drive because Storage Pool 1 was running out of space, I could no longer use that drive to expand the pool, because "the drive requirements are not met". Checking the details, it appears I can now only add NVMe drives, not HDDs. Perhaps I'm doing something wrong, but it seems weird, as I have not seen such a message before. Below is a screenshot of the requirements. Is there a way around it?
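One way to sanity-check which drives DSM has actually grouped into each storage pool is to inspect the software-RAID state over SSH. This is a minimal sketch, assuming a root shell on the NAS and the standard `/proc/mdstat` interface that DSM's Linux base exposes; the `mdN` array names and member partition names will differ per system:

```shell
#!/bin/sh
# Minimal sketch: list software-RAID arrays and their member devices.
# Assumes an SSH shell on the NAS; /proc/mdstat is standard on Linux-based DSM.
if [ -r /proc/mdstat ]; then
    # Each mdN line lists a pool's member partitions
    # (HDD members look like sataXpY or sdXY, NVMe members like nvme0n1pY).
    cat /proc/mdstat
else
    echo "mdstat not available on this system"
fi
```

If the HDD you just installed appears in no `mdN` array, DSM has not claimed it for any pool, which matches the "drive requirements are not met" symptom described above.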
@007revad commented on GitHub (Mar 25, 2024):
Sorry, somehow I didn't notice this issue until now.
It looks like you were trying to add HDD 9 to your NVMe storage pool.
@cnstudios commented on GitHub (Mar 25, 2024):
Hmm, I just tried again, but I couldn't choose the storage pool. It simply says the drive requirements aren't met. I'll try to create a new pool and/or volume tonight. Thanks
@cnstudios commented on GitHub (Mar 26, 2024):
Well, I restarted the server and somehow adding new drives to the other pools works again. I don't really know what was going on, but I'm happy again! Thanks
@007revad commented on GitHub (Mar 26, 2024):
That's strange. But good that the reboot fixed it.