https://bugzilla.novell.com/show_bug.cgi?id=333753#c15
Occo Eric Nolf changed:
What |Removed |Added
----------------------------------------------------------------------------
Status|NEEDINFO |NEW
Info Provider|oen@dsl.pipex.com |
--- Comment #15 from Occo Eric Nolf 2007-10-26 18:07:20 MST ---
Before getting to the output produced by the different commands, it may be
useful to draw your attention to the following two facts:
1 - Although RAID0+1 runs fine under 10.2, there was a problem with the
original installation: for some reason, the 10.2 installer would fail when LVM
was used in combination with RAID0+1. Even a setup with non-LVM partitions for
the boot and root partitions wouldn't succeed.
2 - The numbering of the partitions on the RAID0+1 in my system does not match
the physical position of the partitions on disk. To be precise: the physical
order is 1, 5, 6, 8, 9, 10, 7.
This is caused by the dual-boot setup of my system, where partitions can be
created both under Windows XP x64 and under Linux. David Sauve, the other
poster in this thread, has a single-boot system, and the partitions on his
RAID0+1 are numbered in physical order; as we both experience the same
problems installing 10.3, this may not be an important issue.
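To illustrate how the logical numbering can diverge from the on-disk position,
here is a minimal sketch. The start sectors below are hypothetical placeholders
(not taken from my disks); only the resulting physical order 1, 5, 6, 8, 9,
10, 7 matches the real layout:

```python
# Hypothetical start sectors, chosen only to reproduce the physical order
# 1, 5, 6, 8, 9, 10, 7 described above; real values would come from the
# partition table (e.g. the output of fdisk or parted).
start_sectors = {
    1: 63,          # primary partition at the front of the disk
    5: 40960000,    # logical partitions inside the extended partition,
    6: 81920000,    # created at different times under Windows and Linux,
    8: 122880000,   # so the partition numbers no longer track the
    9: 163840000,   # on-disk position
    10: 204800000,
    7: 245760000,   # partition 7 was created last, at the end of the disk
}

# Sorting by on-disk position shows the numbering is out of order.
physical_order = sorted(start_sectors, key=start_sectors.get)
print(physical_order)  # [1, 5, 6, 8, 9, 10, 7]
```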
On the installed 10.2 system, the command 'dmsetup ls' produces this output:
---------- Start of output
nvidia_acdaaaad-0_part6 (253, 19)
nvidia_acdaaaad-0_part10 (253, 23)
nvidia_acdaaaad_part10 (253, 16)
nvidia_acdaaaad-1_part9 (253, 8)
nvidia_acdaaaad-0_part5 (253, 18)
nvidia_acdaaaad-1_part8 (253, 7)
nvidia_acdaaaad_part1 (253, 10)
nvidia_acdaaaad-1_part7 (253, 6)
nvidia_acdaaaad (253, 2)
nvidia_acdaaaad-1_part6 (253, 5)
nvidia_acdaaaad-1_part5 (253, 4)
nvidia_acdaaaad-0_part1 (253, 17)
nvidia_acdaaaad-1_part10 (253, 9)
nvidia_acdaaaad_part9 (253, 15)
nvidia_acdaaaad_part8 (253, 14)
nvidia_acdaaaad-1_part1 (253, 3)
nvidia_acdaaaad_part7 (253, 13)
nvidia_acdaaaad-1 (253, 1)
nvidia_acdaaaad-0_part9 (253, 22)
nvidia_acdaaaad_part6 (253, 12)
nvidia_acdaaaad-0 (253, 0)
nvidia_acdaaaad-0_part8 (253, 21)
nvidia_acdaaaad_part5 (253, 11)
nvidia_acdaaaad-0_part7 (253, 20)
---------- End of output
On the 10.3 rescue system, after executing 'dmraid -ay -p', the command
'dmsetup ls' produces this output:
---------- Start of output
nvidia_acdaaaad-0_part6 (253, 6)
nvidia_acdaaaad-0_part10 (253, 10)
nvidia_acdaaaad_part10 (253, 18)
nvidia_acdaaaad-1_part9 (253, 25)
nvidia_acdaaaad-0_part5 (253, 5)
nvidia_acdaaaad_part2 (253, 12)
nvidia_acdaaaad-1_part8 (253, 24)
nvidia_acdaaaad_part1 (253, 11)
nvidia_acdaaaad-1_part7 (253, 23)
nvidia_acdaaaad (253, 2)
nvidia_acdaaaad-1_part6 (253, 22)
nvidia_acdaaaad-0_part2 (253, 4)
nvidia_acdaaaad-1_part5 (253, 21)
nvidia_acdaaaad-0_part1 (253, 3)
nvidia_acdaaaad-1_part10 (253, 26)
nvidia_acdaaaad_part9 (253, 17)
nvidia_acdaaaad-1_part2 (253, 20)
nvidia_acdaaaad_part8 (253, 16)
nvidia_acdaaaad-1_part1 (253, 19)
nvidia_acdaaaad_part7 (253, 15)
nvidia_acdaaaad-1 (253, 1)
nvidia_acdaaaad-0_part9 (253, 9)
nvidia_acdaaaad_part6 (253, 14)
nvidia_acdaaaad-0 (253, 0)
nvidia_acdaaaad-0_part8 (253, 8)
nvidia_acdaaaad_part5 (253, 13)
nvidia_acdaaaad-0_part7 (253, 7)
---------- End of output
There are differences: the 10.3 listing has three extra lines.
As far as I can see, this is because the 10.3 listing contains entries for
partition #2 of the RAID0+1 and of both underlying RAID0 arrays. That seems
strange, as partition #2 is the extended partition containing all the other
partitions except #1.
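The extra entries can be confirmed mechanically by diffing the device names
from the two listings. A small sketch, with the name lists reconstructed from
the output above rather than typed out in full:

```python
# Device-mapper names from the two 'dmsetup ls' listings above: each of the
# three arrays (the RAID0+1 and its two underlying RAID0 halves) has a
# whole-array node plus one node per partition.
arrays = ["nvidia_acdaaaad", "nvidia_acdaaaad-0", "nvidia_acdaaaad-1"]

parts_102 = [1, 5, 6, 7, 8, 9, 10]     # partitions mapped under 10.2
parts_103 = [1, 2, 5, 6, 7, 8, 9, 10]  # 10.3 additionally maps #2

names_102 = set(arrays) | {f"{a}_part{p}" for a in arrays for p in parts_102}
names_103 = set(arrays) | {f"{a}_part{p}" for a in arrays for p in parts_103}

extra = sorted(names_103 - names_102)
print(extra)
# ['nvidia_acdaaaad-0_part2', 'nvidia_acdaaaad-1_part2', 'nvidia_acdaaaad_part2']
```

This matches the counts above: 24 entries under 10.2, 27 under 10.3, and the
three extra ones are exactly the part2 nodes.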
On the 10.3 rescue system the command 'dmraid -ay -p -vvv' produces this
output:
---------- Start of output
WARN: locking /var/lock/dmraid/.lock
NOTICE: /dev/sda: asr discovering
NOTICE: /dev/sda: ddf1 discovering
NOTICE: /dev/sda: hpt37x discovering
NOTICE: /dev/sda: hpt45x discovering
NOTICE: /dev/sda: isw discovering
NOTICE: /dev/sda: jmicron discovering
NOTICE: /dev/sda: lsi discovering
NOTICE: /dev/sda: nvidia discovering
NOTICE: /dev/sda: nvidia metadata discovered
NOTICE: /dev/sda: pdc discovering
NOTICE: /dev/sda: sil discovering
NOTICE: /dev/sda: via discovering
NOTICE: /dev/sdb: asr discovering
NOTICE: /dev/sdb: ddf1 discovering
NOTICE: /dev/sdb: hpt37x discovering
NOTICE: /dev/sdb: hpt45x discovering
NOTICE: /dev/sdb: isw discovering
NOTICE: /dev/sdb: jmicron discovering
NOTICE: /dev/sdb: lsi discovering
NOTICE: /dev/sdb: nvidia discovering
NOTICE: /dev/sdb: nvidia metadata discovered
NOTICE: /dev/sdb: pdc discovering
NOTICE: /dev/sdb: sil discovering
NOTICE: /dev/sdb: via discovering
NOTICE: /dev/sdc: asr discovering
NOTICE: /dev/sdc: ddf1 discovering
NOTICE: /dev/sdc: hpt37x discovering
NOTICE: /dev/sdc: hpt45x discovering
NOTICE: /dev/sdc: isw discovering
NOTICE: /dev/sdc: jmicron discovering
NOTICE: /dev/sdc: lsi discovering
NOTICE: /dev/sdc: nvidia discovering
NOTICE: /dev/sdc: nvidia metadata discovered
NOTICE: /dev/sdc: pdc discovering
NOTICE: /dev/sdc: sil discovering
NOTICE: /dev/sdc: via discovering
NOTICE: /dev/sdd: asr discovering
NOTICE: /dev/sdd: ddf1 discovering
NOTICE: /dev/sdd: hpt37x discovering
NOTICE: /dev/sdd: hpt45x discovering
NOTICE: /dev/sdd: isw discovering
NOTICE: /dev/sdd: jmicron discovering
NOTICE: /dev/sdd: lsi discovering
NOTICE: /dev/sdd: nvidia discovering
NOTICE: /dev/sdd: nvidia metadata discovered
NOTICE: /dev/sdd: pdc discovering
NOTICE: /dev/sdd: sil discovering
NOTICE: /dev/sdd: via discovering
NOTICE: added /dev/sda to RAID set "nvidia_acdaaaad"
NOTICE: added /dev/sdb to RAID set "nvidia_acdaaaad"
NOTICE: added /dev/sdc to RAID set "nvidia_acdaaaad"
NOTICE: added /dev/sdd to RAID set "nvidia_acdaaaad"
RAID set "nvidia_acdaaaad" already active
INFO: Activating raid10 RAID set "nvidia_acdaaaad"
WARN: unlocking /var/lock/dmraid/.lock
---------- End of output
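As a side note, the -vvv log can be filtered to see which metadata format
actually matched on each disk, as opposed to the formats that were merely
probed. A small sketch, run against an abbreviated copy of the lines quoted
above:

```python
# Abbreviated excerpt of the 'dmraid -ay -p -vvv' log quoted above.
log = """\
NOTICE: /dev/sda: nvidia discovering
NOTICE: /dev/sda: nvidia metadata discovered
NOTICE: /dev/sdb: nvidia metadata discovered
NOTICE: /dev/sdc: nvidia metadata discovered
NOTICE: /dev/sdd: nvidia metadata discovered
NOTICE: added /dev/sda to RAID set "nvidia_acdaaaad"
"""

# Collect, per device, the format whose metadata was found: only the
# "metadata discovered" lines indicate a match.
found = {}
for line in log.splitlines():
    if line.endswith("metadata discovered"):
        # e.g. 'NOTICE: /dev/sda: nvidia metadata discovered'
        _, dev, rest = line.split(": ", 2)
        found[dev] = rest.split()[0]

print(found)
# {'/dev/sda': 'nvidia', '/dev/sdb': 'nvidia', '/dev/sdc': 'nvidia', '/dev/sdd': 'nvidia'}
```

All four disks match only the nvidia format, which is consistent with dmraid
then assembling them into the single set "nvidia_acdaaaad".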
If you need any more information, just let me know - I'll try to respond a
little faster next time :)
Eric