Mirror of https://github.com/007revad/Synology_HDD_db.git (synced 2026-04-25 21:55:59 +03:00)
[GH-ISSUE #119] DSM Version: 7.2-64570 Update 2 - all nvme drives gone #550
Originally created by @cfsnate on GitHub (Jul 24, 2023).
Original GitHub issue: https://github.com/007revad/Synology_HDD_db/issues/119
After updating to 7.2 Update 2, all NVMe drives are gone from the GUI; the script still finds them and shows them as enabled. I have tried a number of script runs, restores and reapplies, reboots, etc. - no dice.
Using version 3.1.60.
I have 2 in the native slots as an SHR-1 volume,
and then 2 on an E10M20-T1 card as a read-write cache.
Everything is degraded and basically in limp mode.
Any thoughts on how to revive?
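For anyone hitting the same symptom, a quick way to check from SSH whether the kernel still sees the drives the GUI has lost (a generic diagnostic sketch, not commands taken from this thread):

```shell
#!/bin/sh
# Generic diagnostic sketch: confirm the kernel still exposes NVMe block
# devices and show md RAID state, even when the DSM GUI has lost them.
check_nvme() {
    # Device nodes like /dev/nvme0n1 exist if the kernel sees the drives.
    ls /dev/nvme* 2>/dev/null || echo "no NVMe device nodes found"
    # md RAID status; /proc/mdstat exists on DSM and most Linux systems.
    if [ -r /proc/mdstat ]; then
        cat /proc/mdstat
    fi
}
check_nvme
```

If the device nodes are present but DSM does not show the drives, the problem is in DSM's compatibility layer rather than the hardware.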
@ct85msi commented on GitHub (Jul 24, 2023):
Same issue here. The NVMe drive is accessible from the terminal but DSM won't load it. Is it a bug with Update 2, or is it by design?
ash-4.4# mdadm --assemble --scan
mdadm: /dev/md/4 has been started with 1 drive.
I can scan for the volume group and mount it. Update 2 doesn't like NVMe devices.
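The assemble-and-mount sequence described above could be sketched like this. The device, volume-group and mount-point names (/dev/vg1, volume_1, /mnt/recovery) are hypothetical placeholders; DRY_RUN=1, the default here, only prints the commands so nothing is touched:

```shell
#!/bin/sh
# Sketch of the manual recovery path: reassemble the md array, rescan
# LVM, activate the volume group, and mount the volume. DRY_RUN=1 (the
# default) prints each command instead of running it. vg1 and volume_1
# are hypothetical names; check yours with "cat /proc/mdstat" and
# "lvdisplay" first.
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "WOULD RUN: $*"
    else
        "$@"
    fi
}

recover_volume() {
    run mdadm --assemble --scan                 # reassemble stopped md arrays
    run vgscan                                  # rescan for LVM volume groups
    run vgchange -ay vg1                        # activate the (hypothetical) VG
    run mkdir -p /mnt/recovery
    run mount /dev/vg1/volume_1 /mnt/recovery   # mount read-write for recovery
}

recover_volume
```

Run with DRY_RUN=0 only after confirming the names match your system.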
@cfsnate commented on GitHub (Jul 24, 2023):
I'm also able to assemble the RAIDs from SSH, so I don't think there's data loss.
But since one is an SSD cache for a volume, that volume is crashed; the other is its own volume, so that's down too.
I haven't tried to mount the volume array yet since I ran into some LUKS issues and would need to decrypt it first.
Hopefully an easy update to the script 🙃
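For the encrypted case mentioned above, the volume has to be unlocked with cryptsetup before it can be mounted. A hypothetical sketch (the /dev/md4 device and crypt_vol mapper names are placeholders, and DRY_RUN=1, the default, only prints the commands):

```shell
#!/bin/sh
# Sketch of unlocking a LUKS-encrypted volume before mounting it.
# /dev/md4 and crypt_vol are hypothetical names; DRY_RUN=1 (the
# default) prints the commands instead of running them.
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "WOULD RUN: $*"
    else
        "$@"
    fi
}

unlock_and_mount() {
    run cryptsetup open /dev/md4 crypt_vol    # prompts for the passphrase
    run mkdir -p /mnt/recovery
    run mount /dev/mapper/crypt_vol /mnt/recovery
}

unlock_and_mount
```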
@007revad commented on GitHub (Jul 25, 2023):
I'll update my test NAS to 7.2 update 2 and see what's changed.
@007revad commented on GitHub (Jul 25, 2023):
After spending too many hours unpacking and comparing what's different between the 7.2 Update 1 and 7.2 Update 2 pat files, I did find that a couple of NVMe-related files and libsynostoragemgmt have changed. The main NVMe change is that DSM now has preset power limits for PCIe cards.
So I backed up all my DS720+'s root files and then updated to 7.2 update 2. To my surprise my NVMe volume survived the update. But I didn't have any of my scripts scheduled to run at shut down or boot up.
@007revad commented on GitHub (Jul 25, 2023):
@cfsnate
This sounds like something went wrong with the update.
You could try my Synology_DSM_reinstall script, which will allow you to reinstall DSM 7.2-64570 (with Update 1). But I've never tried using the script to roll back from an update.
@007revad commented on GitHub (Jul 25, 2023):
@ct85msi
It sounds like a bug. But in DSM 7.2-64570 there were 4 or 5 changes that looked as if their only purpose was to break my scripts.
@cfsnate commented on GitHub (Jul 25, 2023):
Will try the reinstall script and see what happens.
If I recall correctly, my NVMe volume survived the update but my cache did not. After rebooting once, both were gone.
Did you reboot after your update?
@magicdude4eva commented on GitHub (Jul 25, 2023):
FWIW - on my DS1019+ I am using an NVMe storage pool with Crucial NVMe drives and did not have an issue after the upgrade:

[screenshot omitted]

I do, however, have syno_enable_m2_volume.sh and syno_hdd_db.sh as boot-up scripts.
@cfsnate commented on GitHub (Jul 25, 2023):
Did this volume survive an additional reboot?
@007revad commented on GitHub (Jul 25, 2023):
I just rebooted now and the volume still exists.
Do you have any of my scripts scheduled to run at shut-down or boot-up?
I don't, and I haven't run any of them yet since updating DSM.
@cfsnate commented on GitHub (Jul 25, 2023):
I did when I ran the update; IIRC I had the HDD script running at boot-up.
@magicdude4eva commented on GitHub (Jul 25, 2023):
Yes it did - but remember that I have the boot-up script from @007revad as per my screenshot.
@007revad commented on GitHub (Jul 25, 2023):
@magicdude4eva
Personally I would schedule the enable_m2_volume script to run at shutdown. If it makes any changes, you need to reboot; running it at shutdown lets it make the changes (if they are needed) without an extra reboot.
@cfsnate commented on GitHub (Jul 25, 2023):
I was able to use the reinstall script to decrease the version and then install 7.2 Update 1 over top of Update 2.
This was successful!!!! Everything has been restored - no data loss.
I did have to re-run the hdd script 3.1.60 and reboot to enable my PCI card after "updating" down to Update 1.
@007revad what's the current best practice for scheduling the HDD script or the other scripts? It seems like all the functionality has been built into the HDD script. Is it best to schedule that one at shutdown?
@007revad commented on GitHub (Jul 25, 2023):
Excellent. I thought the downgrade would work because it's still 7.2-64570.
I'm undecided on whether it's best to schedule the HDD script to run at shutdown or boot-up. Either works for me, but there have been edge cases where one was better than the other for some people. I know of one person who has it scheduled to run at both shutdown and boot-up.
@ct85msi commented on GitHub (Jul 25, 2023):
Did the same, reinstalled update1 and I see my nvme array. Thank you.
@007revad commented on GitHub (Jul 25, 2023):
Awesome.