RAID 10 (0+1) messed up after BIOS update
Posted: 09 Jul 2012, 07:14
Hello,
My system is as follows: ASUS Maximus IV Gene-Z mobo, Intel i7 2600 CPU, and 4 x 2TB WDC HDDs set up as a RAID 10 (0+1) array from the mobo, using Intel IRST drivers.
I have Windows 7 Ultimate edition installed on a 240 GB OCZ Vertex SSD. The SSD had (and still has) two partitions, one for the OS and the other a 64GB cache for IRST acceleration. The RAID (Volume A) was effectively one UEFI GPT partition, which I created at the beginning using DiskPart during the OS installation. It reads as 3.6TB in the Intel IRST Option ROM menu.
Everything was working perfectly and as the saying goes, if it ain't broke don't fix it! Well I did just that and decided to update the IRST drivers to the latest 10.8 version in order to support the latest BIOS version.
The IRST driver update went perfectly after a reboot. I then flashed the BIOS and, on completion, entered setup but forgot that everything had been reset to defaults. I exited the BIOS and Windows started to load, but it was doing so in repair mode. It asked me if I wanted to repair the system. I said yes, and after a time it said that repairs could not be made. Realising my mistake with horror, I shut down the system to think about what to do (at this time I was not aware of TestDisk).
I powered up again into the BIOS and, sure enough, the SATA setting was AHCI rather than RAID. At this juncture I did not reset it to RAID, but I inserted my SpinRite v6 data recovery tool to see if it would help. There were no options that I was happy with, so I removed it without selecting any commands; however, I fear that even this may have been unwise.
Anyway, after more research I reset the SATA mode to RAID and then entered the IRST Option ROM (Ctrl+I). See the image below:
[img=http://s19.postimage.org/smsoc86z3/IRST_Option_ROM.jpg]
From the image you can see that the VOL A RAID 10 has failed: two of the WDC disks are shown as non-RAID, but the other two (4 & 5) are still members. I then made those two (nos 4 & 5) non-RAID and recreated a new RAID 10 Vol A, exactly as before, with the default 64KB strip size. All disks are in exactly the same order as before - none of the connections have been changed at all.
So with the RAID recreated I restarted, but the system will not start up properly. Windows still tries to load into repair mode; however, I did not select any repair options. So far the only way I can get TestDisk to run is from a bootable CD-ROM environment (CAINE 5x), and when the TestDisk utility runs it only sees the drives (SSD + 4 HDDs) individually. I have run a quick scan on all the drives but have not issued any 'write' instructions.
Disks 2 and 3 show up as having a UEFI (GPT) partition, while disks 4 and 5 show up as Intel/PC, which I don't understand. Anyway, I am simply not sure what to do next, as I am afraid to try any action: from this position an action can only be taken on an individual disk, and I cannot see how that would lead to reconstructing the single GPT partition that existed across the four disks in the array.
My understanding is that TestDisk should 'see' the array when launched from Windows. Should I reinstall Windows onto the SSD and run TestDisk that way, or can I perform the necessary recovery operations from the boot CD environment?
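For what it's worth, from what I have read, the Intel RAID metadata (IMSM format) can apparently be assembled under a Linux live environment such as CAINE using mdadm, so that TestDisk would see the whole 3.6TB logical volume instead of the individual disks. Something like the following - the device names are my guesses and would need adjusting to whatever actually appears:

```shell
# Sketch only: assemble the Intel Matrix RAID (IMSM) container so TestDisk
# can scan the logical RAID 10 volume rather than the raw member disks.
# Device names (/dev/md126 etc.) are assumptions -- check /proc/mdstat.
mdadm --assemble --scan    # auto-detect and assemble IMSM containers/volumes
cat /proc/mdstat           # confirm the RAID 10 volume appeared (e.g. md126)
testdisk /dev/md126        # point TestDisk at the assembled volume
```

Would that be a sensible approach, or is it too risky given that the array has already been recreated?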
Ideally I would like to get the setup back to where it was before; however, my priority is to recover my data.
Many thanks for your help and understanding.
Bill