mirror of
https://github.com/007revad/Synology_M2_volume.git
synced 2026-04-25 07:46:05 +03:00
[GH-ISSUE #150] synostgpool failed to create storage pool! #229
Originally created by @007revad on GitHub (May 5, 2024).
Original GitHub issue: https://github.com/007revad/Synology_M2_volume/issues/150
Originally assigned to: @007revad on GitHub.
@aelmardi wrote:
Hey guys,
I'm facing an issue with my DS923+ on DSM 7.2.1-69057 Update 5. (Seagate Firecuda 530)
I don't have any logs available, but it failed during the storage creation step:
Thanks in advance for your help.
@007revad commented on GitHub (May 5, 2024):
I can see what the issue is. I just need to work out why it happened and prevent it from happening again.
Line 10 in your screenshot shows:
Where it should show:
@007revad commented on GitHub (May 5, 2024):
Can you download and run this test version and reply with a screenshot of the output?
https://github.com/007revad/Synology_M2_volume/archive/refs/tags/v10.0.0-TEST.zip
@aelmardi commented on GitHub (May 5, 2024):
Hello,
Got it, here's the screenshot. Thanks for your prompt response.
@007revad commented on GitHub (May 6, 2024):
What does the following command return?
@aelmardi commented on GitHub (May 6, 2024):
Can't get the location of /dev/nvme0n1
But when I go to /dev/ and run the ls command, I have the nvme0n1 file:
/dev$ ls -lart nvme0*
crw------- 1 root root 250, 0 May 5 00:29 nvme0
brw------- 1 root root 259, 0 May 5 00:29 nvme0n1
brw------- 1 root root 259, 1 May 5 00:29 nvme0n1p1
brw------- 1 root root 259, 3 May 5 00:29 nvme0n1p3
brw------- 1 root root 259, 2 May 5 00:29 nvme0n1p2
brw------- 1 root root 259, 4 May 5 00:29 nvme0n1p5
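As a side note, the ls output above confirms the device nodes exist (the leading "b" marks a block device). Whether a path is a block-device node can also be checked directly with the shell's -b test, independent of synonvme. A minimal sketch, assuming nothing about the script itself (check_block_device is a hypothetical helper, not part of Synology_M2_volume):

```shell
#!/usr/bin/env bash
# Hypothetical helper: report whether a path is a block device node,
# as the ls output above shows for /dev/nvme0n1.
check_block_device() {
    if [ -b "$1" ]; then
        echo "$1: block device"
    else
        echo "$1: missing or not a block device"
    fi
}

# /dev/null is a character device, not a block device,
# so the -b test fails for it.
check_block_device /dev/null
```

On the DS923+ in this thread, check_block_device /dev/nvme0n1 would report a block device, matching the ls listing.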
@007revad commented on GitHub (May 6, 2024):
That is strange, because the line before that in the script is:

synonvme --vendor-get /dev/nvme0n1

So I'm not sure why this line does not work for you:

synonvme --get-location /dev/nvme0n1

I've got a new test version for you to test:
https://github.com/007revad/Synology_M2_volume/releases/tag/v10.0.0-TEST2

It is the full script with 2 changes:
1. If synonvme --get-location returns an error, it uses "nvme0n1" instead of expecting "M2 Drive 1".
2. It does not run the synostgpool command. Instead it will echo the command so I can see what arguments it would have used.

@aelmardi commented on GitHub (May 6, 2024):
Here's the result of the script:
@007revad commented on GitHub (May 6, 2024):
I noticed another bug in your latest screenshot, which would have been the cause of the synostgpool failing.
I've released a new version of the script:
https://github.com/007revad/Synology_M2_volume/releases/tag/v2.0.27

Changes:
1. Uses "nvme0n1" if synonvme --get-location fails.
2. Fixed "synostgpool failed to create storage pool" when the "Single volume storage pool" type is selected with only 1 M.2 drive.

@sin509 commented on GitHub (May 9, 2024):
Hey,

I'm facing a similar problem, like this:

What can I do?
@007revad commented on GitHub (May 9, 2024):
@sin509
Can you download this version:
https://github.com/007revad/Synology_M2_volume/releases/tag/v10.0.0-TEST2
Then run the syno_create_m2_volume_test2.sh file and let me know if it works.
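The two TEST2 changes described earlier (falling back to "nvme0n1" when synonvme --get-location errors, and echoing the synostgpool command instead of running it) can be sketched roughly as follows. This is a hedged sketch, not the script's real code: get_location stands in for synonvme --get-location and is made to fail here to mimic the error in this thread, and the synostgpool arguments are placeholders.

```shell
#!/usr/bin/env bash
# Sketch of the two TEST2 changes, under assumed names.

dev="/dev/nvme0n1"

# Stand-in for "synonvme --get-location $dev"; simulates the
# "Can't get the location" failure reported in this thread.
get_location() {
    echo "Can't get the location of $1" >&2
    return 1
}

# Change 1: fall back to the bare device name ("nvme0n1") when
# --get-location errors out, instead of expecting "M2 Drive 1".
if ! location="$(get_location "$dev" 2>/dev/null)"; then
    location="$(basename "$dev")"
fi
echo "location: $location"

# Change 2: echo the pool-creation command rather than running it,
# so its arguments can be inspected (arguments are illustrative).
echo "Would run: synostgpool --create single $dev"
```

The dry-run echo in change 2 is what lets the script's author see, from a user's screenshot, exactly which arguments would have been passed on a system they cannot access.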
@aelmardi commented on GitHub (May 9, 2024):
With the latest version it doesn't work for me. I'm going to reinstall my DSM to see.
@sin509 commented on GitHub (May 9, 2024):
I have reinstalled my DSM, going from 918+ to 920+, and everything is OK.
So maybe it is down to the 918+.
@aelmardi commented on GitHub (May 12, 2024):
You can close this issue, because I believe the problem is that my M2 drive is not recognized either. I have created another issue:
https://github.com/007revad/Synology_HDD_db/issues/292