Welcome to the Community,
if you don't know this, it might be time to start with documentation :smileylaugh:
You could check the partition layout of the local disk to see whether it contains the typical ESXi partitions.
André
What's the output of ls -lisa /dev/disks/ and fdisk -lu?
André
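For anyone reading /dev/disks output for the first time: a partition entry is simply the base device name with a ":&lt;n&gt;" suffix appended, so each disk appears once as a whole device and once per partition. A minimal sketch of that naming rule, run here against sample names copied from this thread (not on the ESXi host itself):

```shell
# Reading /dev/disks output: a partition entry is the base device name
# plus a ':<n>' suffix. Sample names copied from this thread's output.
for name in "mpx.vmhba0:C0:T0:L0" \
            "mpx.vmhba0:C0:T0:L0:3" \
            "naa.6006016049ef27000d6c501f7ff2e011" \
            "naa.6006016049ef27000d6c501f7ff2e011:1"; do
  case "$name" in
    *:L0:[1-9]*|naa.*:[1-9]*) echo "$name -> partition" ;;
    *)                        echo "$name -> whole device" ;;
  esac
done
```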
Hi ap
~ # ls -lisa /dev/disks/ and fdisk -lu
ls: and: No such file or directory
ls: fdisk: No such file or directory
/dev/disks/:
4 0 drwxr-xr-x 1 root root 512 Jul 8 11:16 .
1 0 drwxr-xr-x 1 root root 512 Jul 8 11:16 ..
135 143338560 -rw------- 1 root root 146778685440 Jul 8 11:16 mpx.vmhba0:C0:T0:L0
119 917504 -rw------- 1 root root 939524096 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:1
121 4193280 -rw------- 1 root root 4293918720 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:2
123 138223680 -rw------- 1 root root 141541048320 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:3
125 4080 -rw------- 1 root root 4177920 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:4
127 255984 -rw------- 1 root root 262127616 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:5
129 255984 -rw------- 1 root root 262127616 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:6
131 112624 -rw------- 1 root root 115326976 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:7
133 292848 -rw------- 1 root root 299876352 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:8
144 73400320 -rw------- 1 root root 75161927680 Jul 8 11:16 naa.6006016049ef27000d6c501f7ff2e011
142 73392888 -rw------- 1 root root 75154317824 Jul 8 11:16 naa.6006016049ef27000d6c501f7ff2e011:1
162 2097152 -rw------- 1 root root 2147483648 Jul 8 11:16 naa.6006016049ef270022ea9bedfcf3e011
160 2096418 -rw------- 1 root root 2146732544 Jul 8 11:16 naa.6006016049ef270022ea9bedfcf3e011:1
167 52428800 -rw------- 1 root root 53687091200 Jul 8 11:16 naa.6006016049ef270030336e5bfbfee011
169 52428063 -rw------- 1 root root 53686337024 Jul 8 11:16 naa.6006016049ef270030336e5bfbfee011:1
140 314572800 -rw------- 1 root root 322122547200 Jul 8 11:16 naa.6006016049ef270058667b3f7ff2e011
157 314568701 -rw------- 1 root root 322118349824 Jul 8 11:16 naa.6006016049ef270058667b3f7ff2e011:1
175 314572800 -rw------- 1 root root 322122547200 Jul 8 11:16 naa.6006016049ef27007ec737eff514e111
182 314568701 -rw------- 1 root root 322118349824 Jul 8 11:16 naa.6006016049ef27007ec737eff514e111:1
165 262144000 -rw------- 1 root root 268435456000 Jul 8 11:16 naa.6006016049ef27008210b670fbfee011
172 262140573 -rw------- 1 root root 268431947264 Jul 8 11:16 naa.6006016049ef27008210b670fbfee011:1
177 52428800 -rw------- 1 root root 53687091200 Jul 8 11:16 naa.6006016049ef2700d23047c3f514e111
179 52428063 -rw------- 1 root root 53686337024 Jul 8 11:16 naa.6006016049ef2700d23047c3f514e111:1
136 0 lrwxrwxrwx 1 root root 19 Jul 8 11:16 vml.0000000000766d686261303a303a30 -> mpx.vmhba0:C0:T0:L0
120 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:1 -> mpx.vmhba0:C0:T0:L0:1
122 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:2 -> mpx.vmhba0:C0:T0:L0:2
124 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:3 -> mpx.vmhba0:C0:T0:L0:3
126 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:4 -> mpx.vmhba0:C0:T0:L0:4
128 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:5 -> mpx.vmhba0:C0:T0:L0:5
130 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:6 -> mpx.vmhba0:C0:T0:L0:6
132 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:7 -> mpx.vmhba0:C0:T0:L0:7
134 0 lrwxrwxrwx 1 root root 21 Jul 8 11:16 vml.0000000000766d686261303a303a30:8 -> mpx.vmhba0:C0:T0:L0:8
145 0 lrwxrwxrwx 1 root root 36 Jul 8 11:16 vml.02000000006006016049ef27000d6c501f7ff2e011524149442035 -> naa.6006016049ef27000d6c501f7ff2e011
143 0 lrwxrwxrwx 1 root root 38 Jul 8 11:16 vml.02000000006006016049ef27000d6c501f7ff2e011524149442035:1 -> naa.6006016049ef27000d6c501f7ff2e011:1
141 0 lrwxrwxrwx 1 root root 36 Jul 8 11:16 vml.02000100006006016049ef270058667b3f7ff2e011524149442035 -> naa.6006016049ef270058667b3f7ff2e011
158 0 lrwxrwxrwx 1 root root 38 Jul 8 11:16 vml.02000100006006016049ef270058667b3f7ff2e011524149442035:1 -> naa.6006016049ef270058667b3f7ff2e011:1
163 0 lrwxrwxrwx 1 root root 36 Jul 8 11:16 vml.02000200006006016049ef270022ea9bedfcf3e011524149442035 -> naa.6006016049ef270022ea9bedfcf3e011
161 0 lrwxrwxrwx 1 root root 38 Jul 8 11:16 vml.02000200006006016049ef270022ea9bedfcf3e011524149442035:1 -> naa.6006016049ef270022ea9bedfcf3e011:1
168 0 lrwxrwxrwx 1 root root 36 Jul 8 11:16 vml.02000300006006016049ef270030336e5bfbfee011524149442035 -> naa.6006016049ef270030336e5bfbfee011
170 0 lrwxrwxrwx 1 root root 38 Jul 8 11:16 vml.02000300006006016049ef270030336e5bfbfee011524149442035:1 -> naa.6006016049ef270030336e5bfbfee011:1
166 0 lrwxrwxrwx 1 root root 36 Jul 8 11:16 vml.02000400006006016049ef27008210b670fbfee011524149442035 -> naa.6006016049ef27008210b670fbfee011
173 0 lrwxrwxrwx 1 root root 38 Jul 8 11:16 vml.02000400006006016049ef27008210b670fbfee011524149442035:1 -> naa.6006016049ef27008210b670fbfee011:1
178 0 lrwxrwxrwx 1 root root 36 Jul 8 11:16 vml.02000500006006016049ef2700d23047c3f514e111524149442031 -> naa.6006016049ef2700d23047c3f514e111
180 0 lrwxrwxrwx 1 root root 38 Jul 8 11:16 vml.02000500006006016049ef2700d23047c3f514e111524149442031:1 -> naa.6006016049ef2700d23047c3f514e111:1
176 0 lrwxrwxrwx 1 root root 36 Jul 8 11:16 vml.02000600006006016049ef27007ec737eff514e111524149442031 -> naa.6006016049ef27007ec737eff514e111
183 0 lrwxrwxrwx 1 root root 38 Jul 8 11:16 vml.02000600006006016049ef27007ec737eff514e111524149442031:1 -> naa.6006016049ef27007ec737eff514e111:1
~ #
The ESXi 4.1 OS is running on an HP BL460c server, but we don't know where ESXi is installed.
The server is connected to EMC SAN storage for the VM servers, as shown in the first attached picture.
From the output it looks like ESXi is installed on the local disk (vmhba0).
135 143338560 -rw------- 1 root root 146778685440 Jul 8 11:16 mpx.vmhba0:C0:T0:L0
119 917504 -rw------- 1 root root 939524096 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:1
121 4193280 -rw------- 1 root root 4293918720 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:2
123 138223680 -rw------- 1 root root 141541048320 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:3
125 4080 -rw------- 1 root root 4177920 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:4
127 255984 -rw------- 1 root root 262127616 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:5
129 255984 -rw------- 1 root root 262127616 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:6
131 112624 -rw------- 1 root root 115326976 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:7
133 292848 -rw------- 1 root root 299876352 Jul 8 11:16 mpx.vmhba0:C0:T0:L0:8
From the sizes it looks like you are using 2x146GB HDDs as RAID1.
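As a quick sanity check on that inference: two 146 GB disks in RAID1 present as a single LUN of the same size, and the byte count reported for mpx.vmhba0:C0:T0:L0 converts cleanly to the marketing figure:

```shell
# Size reported for mpx.vmhba0:C0:T0:L0 in the listing above.
bytes=146778685440
# Decimal gigabytes, as used in drive marketing (1 GB = 10^9 bytes):
awk -v b="$bytes" 'BEGIN { printf "%.1f GB\n", b / 1e9 }'     # 146.8 GB
# Binary gibibytes, as most OS tools report (1 GiB = 2^30 bytes):
awk -v b="$bytes" 'BEGIN { printf "%.1f GiB\n", b / 1024^3 }' # 136.7 GiB
```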
Btw., ls -lisa and fdisk -lu are two separate commands; run them one after the other.
André
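One more detail that can be read straight out of the listing: the trailing bytes of the vml.* symlink names look like hex-encoded ASCII (my assumption: part of the model string the EMC array reports for each LUN). Decoding the two suffixes seen above, "524149442035" and "524149442031", with a small sketch:

```shell
# Decode a hex string to ASCII. The two arguments below are the trailing
# bytes of the vml.* symlink names in the /dev/disks listing above.
decode() { printf '%b\n' "$(echo "$1" | sed 's/../\\x&/g')"; }
decode 524149442035   # RAID 5
decode 524149442031   # RAID 1
```

If the assumption holds, the symlink names themselves tell you which RAID level each SAN LUN was carved from.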
Thanks a lot a.p., I appreciate your help & support. My question is now answered,
thanks to http://communities.vmware.com/people/a.p.
I'm trying to determine whether my client has ESXi 5.0 installed on an internal flash drive or on the local hard drive. I've read this post, but none of it makes sense to me; the output from the commands does not compute.
How can someone tell from the above output what is an internal flash drive, and what are its contents?
With ESXi 5.x you may need to use the partedUtil command instead of fdisk. Please post the output of this command for the boot device to see whether this helps determine what you are looking for.
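Until then, one rough heuristic (an assumption on my part, not an official rule): an install on a hard disk normally carries a multi-GB VMFS partition, while a small flash/USB boot device does not. Sketched against the partition sizes (in bytes) posted earlier in this thread for mpx.vmhba0:C0:T0:L0:

```shell
# Partition sizes in bytes for mpx.vmhba0:C0:T0:L0:1 .. :8, copied from
# the listing above. Flag the device if any partition exceeds ~10 GB,
# which suggests a VMFS datastore and therefore a disk install.
sizes="939524096 4293918720 141541048320 4177920 262127616 262127616 115326976 299876352"
big=0
for s in $sizes; do
  if [ "$s" -gt 10000000000 ]; then
    big=1
  fi
done
if [ "$big" -eq 1 ]; then
  echo "likely installed on a hard disk"
else
  echo "possibly a flash/USB boot device"
fi
```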
André
Note: Please consider opening a new discussion rather than replying to an old one.