I have a Fedora 12 x86 server with RAID1+LVM acting as a file server. It was running with two degraded RAID1 devices (one drive dead/missing in each); that is, two of the physical volumes were RAID1 devices with only one drive. I added a new pair of drives, created a single partition on each, and made a new RAID1 device, which I then added as a physical volume to the volume group. I then migrated the data off one old physical volume to the new one. Once that was complete, I removed the old physical volume from the group and added its underlying drive to the other degraded RAID1 device.
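For reference, the sequence I followed was roughly this (the device names are illustrative, not my actual ones: sdc/sdd stand in for the new pair, md0 for the old degraded array, md1 for the other degraded array, md2 for the new array, and vg0 for the volume group):

```shell
# Partition each new drive with a single full-disk partition
# (done interactively with fdisk on /dev/sdc and /dev/sdd)

# Build the new RAID1 device from the two new partitions
mdadm --create /dev/md2 --level=1 --raid-devices=2 /dev/sdc1 /dev/sdd1

# Make it a physical volume and add it to the volume group
pvcreate /dev/md2
vgextend vg0 /dev/md2

# Migrate all extents off the old (degraded) PV onto the new one
pvmove /dev/md0 /dev/md2

# Remove the old PV from the group and wipe its PV label
vgreduce vg0 /dev/md0
pvremove /dev/md0

# Stop the now-empty array, then add its surviving drive
# to the other degraded array as a replacement member
mdadm --stop /dev/md0
mdadm /dev/md1 --add /dev/sda2
```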
At some point in the process, I screwed up, and I've been trying to remember exactly where things went wrong. I believe I may have deactivated the one RAID1 device and added its underlying drive to the other RAID1 device without doing a pvremove first. A bit later, I know I screwed up when I incorrectly marked the original good drive as faulty, so I had one "new" drive that was still being synced and the original good drive marked faulty. Based on something I read elsewhere, I concluded that this was not truly fatal and that a reboot would cause the array to be reassembled correctly. So I rebooted.
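To be concrete about the mistake (again with illustrative device names, sda2 being the original good drive and sdb2 the new one that was still syncing), it was something like:

```shell
# The mistake: marking the only fully-synced member as failed
# while the other member was still rebuilding
mdadm /dev/md1 --fail /dev/sda2

# Afterwards I was checking the array state with
cat /proc/mdstat
mdadm --detail /dev/md1
```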
At that point, the system would reboot and try to start up all of the old RAID1 devices, including the one that no longer existed. That included another one I didn't mention, which was not part of the volume group. After starting all of those, it would then try to activate the logical volumes, find that parts of the volumes were missing, and reboot.
I rebooted from the Fedora 12 DVD in recovery mode. That booted perfectly, seeing only the correct RAID1 devices and the correct LVM information. Everything mounted correctly except the /boot partition (which is another RAID1 device). I let the system sit overnight in this mode while the arrays finished syncing. I thought I would be good at that point, but it still won't boot.
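While it sat in recovery mode, I was watching the arrays with roughly these commands (vg0 is a stand-in for my actual volume group name):

```shell
# Assemble whatever arrays the on-disk superblocks describe
mdadm --assemble --scan

# Activate the volume group on top of them
vgchange -ay vg0

# Watch the rebuild progress until the sync finishes
watch cat /proc/mdstat
```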
After staring at the messages for a while, I concluded I needed to rebuild the initramfs. So I booted from the DVD, rebuilt the initramfs, and rebooted. I no longer get messages about missing parts of the LVM; only the RAID1 devices that are actually in use (and in /etc/mdadm.conf) are started. However, after appearing to have started everything, including the volumes, I get a brief message saying it can't mount them, and it immediately reboots.
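The rebuild itself was roughly this (the rescue environment mounts the installed system under /mnt/sysimage; the kernel version shown is just an example and should be whatever is actually in /boot):

```shell
# From the DVD's rescue shell, chroot into the installed system
chroot /mnt/sysimage

# Use the installed kernel's version string, not `uname -r`,
# since the rescue kernel can differ from the installed one
ls /boot/vmlinuz-*

# Rebuild the initramfs for that kernel, overwriting the old one
dracut --force /boot/initramfs-<kernel-version>.img <kernel-version>
```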
My data is there, and I believe the on-disk info is correct, since the DVD can find it all and mount it, but something is still messed up when I boot from the drives. My guess at this point is that dracut is using some configuration information that is not consistent with the on-disk info, but I'm not sure what it is.
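In case it's relevant, this is how I've been comparing the configuration against what's actually on disk (if the two disagree, I assume the fix is to regenerate /etc/mdadm.conf from the live arrays and rebuild the initramfs again so it picks up the new file):

```shell
# Arrays as described by the on-disk superblocks
mdadm --detail --scan

# Arrays the initramfs expects (mdadm.conf is baked into it)
cat /etc/mdadm.conf

# Logical volumes that actually exist...
lvs

# ...versus what the boot process tries to mount
cat /etc/fstab
```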
I'm wondering whether I need to go through the procedure on the Linux Journal website here: http://www.linuxjournal.com/article/8874?page=0,0 or whether there is something else I can do to get this working.