Your home ESX server lab hardware specs?

esx01 - poweredge 1950 Dual Quad-Core Xeon 2GHz, 32GB RAM, Dual 146GB 15K SAS RAID1 boot, GigE to iSCSI host, running ESX 4.0_U1

san01 - powervault 775N Dual Dual-Core Xeon 3GHz, 2GB RAM w/powervault 220s attached via U320 SCSI, running Openfiler 2.3 x86 as iSCSI host
5x146GB 10K U320 SCSI RAID5 (Main VM DataStore, boot disks, file server storage)
2x72GB 10K U320 SCSI RAID1 (Templates, thin-provisioned)
4x146GB 15K U320 SCSI RAID10 (Exchange DBs / SQL data disks)
4x146GB 10K U320 SCSI RAID5 (Exchange logs, App server data disks)
 
My box sucks, but it gets the job done. Barely.

Whitebox, ESX 4.0
E2180, 2GB, 400gb IDE
Server 2003, Win XP, and Backtrack

How are you running ESX on an E2180? I have one of those and it's an Allendale with no Intel VT-x support, so no hardware virtualization support. ESX 4 won't install without hardware support on the CPU. :confused:
 
How are you running ESX on an E2180? I have one of those and it's an Allendale with no Intel VT-x support, so no hardware virtualization support. ESX 4 won't install without hardware support on the CPU. :confused:

Whoops, left off the "i" from ESX. Maybe that's why... Or I'm speshul :D

esx.png
 
How are you running ESX on an E2180? I have one of those and it's an Allendale with no Intel VT-x support, so no hardware virtualization support. ESX 4 won't install without hardware support on the CPU. :confused:

I think VT is required for x64 guests, but not for ESX itself or x86 guests. IIRC, ESXi 4 requires a minimum 64-bit CPU, but that's it. I could be wrong, but prochobo could easily test it out for us.
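A quick way to settle this sort of question is to boot any Linux live CD and read the CPU feature flags before installing; a minimal sketch:

```shell
# Print the virtualization-related CPU feature flags, deduplicated:
# 'vmx' = Intel VT-x, 'svm' = AMD-V (hardware virtualization),
# 'lm'  = long mode, i.e. 64-bit capable (what ESXi 4 itself needs).
grep -o -w -e vmx -e svm -e lm /proc/cpuinfo | sort -u
```

On an Allendale E2180 this should show lm but no vmx, which is consistent with ESXi installing fine while 64-bit guests stay unavailable.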
 
Running ESXi 4.0 on an HP DL380 G6 at home. The server has one 2.26GHz E5520, 4GB UDIMMs, two 146G 10k 3G SAS disks. Almost forgot - ESXi is installed on an SDHC card in the embedded SD slot.

Next year is the year of the upgrade: Expanding to eight drives, 24G RDIMMs, second CPU, second PSU, BBWC.
 
I also grabbed a Gateway SX2800-03 Desktop PC ($300 refurb), and ESXi installs just fine. No 64-bit guest support (no VT-x on the 8200 quad), but I'm only running some Windows XP (for web browser compatibility testing) and a couple of 32-bit Linux-based servers on it.

- Qlippoth
 
Running ESXi 4.0 on a homebuilt white box with the following specs

Tyan S2912 with V4.0 bios
2x AMD 8356 @ 2.3GHZ
4x 4GB of some noname ramz
2 port Pro/1000GT
NORCO RPC-250

ESXi is running off of a 2GB flash drive.


For storage I built a little file server:

Gigabyte p35-ds3l
Intel e2200
2x 2GB of noname ramz
Rocket Raid 2680
4x 500GB Western Digital Black for VMware
3x 1.5TB Western Digital Green for Storage
Rammed into an Antec Three Hundred with all fans running except the big one on top (I broke it)

The 500GB drives are in RAID 10.
And the 1.5TB drives are in RAID 5, though one of them keeps dropping out, and I am too lazy to fix it.
The file server is running Ubuntu 9.10 serving NFS and iSCSI.
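A setup like that usually comes down to a couple of small config fragments. A hypothetical sketch for Ubuntu of that era, assuming the nfs-kernel-server and iscsitarget (IET) packages, with placeholder paths, addresses, and device names:

```
# /etc/exports -- export the VM store over NFS to the ESX hosts
/srv/vmstore  192.168.1.0/24(rw,no_root_squash,sync)

# /etc/ietd.conf -- expose the RAID10 array as an iSCSI target
Target iqn.2010-01.lab.example:vmstore
    Lun 0 Path=/dev/md0,Type=blockio
```

The no_root_squash option matters for NFS datastores because the ESX vmkernel mounts and writes as root.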


This whole system is a complete hack job, and I am surprised it's working. Most of the parts are second-hand or off eBay.
 
My specs :p running some Windows 2008 Servers

Motherboard: Tyan S2895 K8WE
CPU: 2x Amd Opteron 275 DC
Memory: 8GB ECC DDR400

Running 2 Windows 2008 R2 servers and 2x Windows 2003 R2 Servers
 
Just a GX620 running a couple of VMs for my basic hosting needs; it will progress once we've moved out.

esxi.jpg
 
My Home lab for learning/testing things:

HP Proliant ML110 G5
Xeon 3065 Dual Core
8GB Ram
3x640Gb in RAID5

Used for running Server/SBS 2008

HP Proliant ML115
Opteron 1354 Quad Core
8GB Ram
640GB HD

Running Server 2008 R2's and XP machines
 
I've had difficulties getting RAID working on SATA drives.

Even though you see whitebox hardware posts claiming "SATA compatible", you may find it's not SATA RAID compatible. Not sure about the new ESXi 4; I'm sure they added more driver support. We had to integrate our Areca SAS drivers into the 3.5 install disk, which was a huge pain.
 
I've had difficulties getting RAID working on SATA drives.

Even though you see whitebox hardware posts claiming "SATA compatible", you may find it's not SATA RAID compatible. Not sure about the new ESXi 4; I'm sure they added more driver support. We had to integrate our Areca SAS drivers into the 3.5 install disk, which was a huge pain.

Usually onboard controllers (ICH10R, etc.) only support JBOD and not actual "RAID" through the controller, since it's software RAID!

In the ESXi 3.x world hardware support was much poorer and the oem.tgz file had to be hacked to get driver support (lame!). I decided to go with a non-whitebox for my ESXi 4 box and don't have the issue.
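For anyone still on 3.x, the oem.tgz hack was essentially: unpack the archive, drop in the vendor's driver module, append a PCI ID-to-driver mapping, and repack. A purely illustrative sketch; the file names, module name, and PCI IDs below are made-up stand-ins, not the real Areca driver package:

```shell
set -e
work=$(mktemp -d)

# Stand-in for the stock oem.tgz pulled off the ESXi 3.5 boot media.
mkdir -p "$work/stock/etc/vmware" "$work/stock/mod"
: > "$work/stock/etc/vmware/simple.map"
tar -C "$work/stock" -czf "$work/oem.tgz" .

# 1. Unpack the stock archive.
mkdir "$work/unpacked"
tar -C "$work/unpacked" -xzf "$work/oem.tgz"

# 2. Drop in the controller module and map its PCI ID to the driver
#    (IDs and names here are placeholders, not real Areca values).
echo "placeholder driver" > "$work/unpacked/mod/arcmsr.o"
echo "17d3:1680 0000:0000 storage arcmsr.o" >> "$work/unpacked/etc/vmware/simple.map"

# 3. Repack; the result replaces oem.tgz on the install/boot media.
tar -C "$work/unpacked" -czf "$work/oem.tgz" .
tar -tzf "$work/oem.tgz"
```

The fragility of this approach (any patch or upgrade could clobber the hacked archive) is a big part of why a box on the official HCL is the easier route for ESXi 4.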
 
p45-ds3l
Q6600
8GB DDR2-800
2x 640GB WD Black
2x Intel PRO100 dual port NICs

ESXi 3.5
 
I have a Sun Java Workstation W2100z (precursor to the Ultra 40) currently as my ESXi box.

2x AMD Opteron 252s @ 2.6ghz
6GB ECC RAM
1x73GB 15k rpm scsi (boot / primary vm storage)
1x250GB sata storage
1x750GB sata storage

The machine cost me about $150. :) Can't do 64bit vms, but there is no 64bit version of HLDS anyways. :)

I also have an IBM xSeries 306 (I think, might be 336) with 2x 2.8GHz dual-core Xeons, 2x 36GB 15k RPM SCSI drives, and 3GB of RAM (also $150). I was thinking of making it another ESXi box. Right now it is my arcade machine in the basement because of the noise, but I am getting tired of dealing with the Radeon VE graphics (stuck at 1024x768).
 
Finally got my Opteron box working with ESXi; we'll see in the long run whether or not the TLB errata has any effect on performance or stability. Originally had Gentoo 2008.0 running on the box by itself, but I think I'll get more use out of it with ESXi on it.

esxi.png
 
Mine is pretty simple, but effective. I stopped running real servers (Other than my norco 4220) in my house last year. Just couldn't justify the sound.

So....

4 Dell Optiplex 755 Desktops
Q9400 Processors
8 GB DDR2 Memory
2x500 GB Sata Drives
Intel 1000MT Dual Port NICs

And for iSCSI I use my Norco 4220.
 
I decided to move back to ESX from Hyper-V. I work on Hyper-V all day at work, so I wanted something different at home. Plus, the $999 Small biz package made it too tempting.

Here's my set up:
ESXi on 3 identical Dell T300's.
Xeon X3363 2.83GHz QC
16gb RAM
4x500gb in a RAID 6 on a Perc6/i
Intel dual gigabit card
redundant PSUs

4 ports trunked into an HP 1800-24 gigabit switch. Definitely overkill.

vCenter:
Toshiba Qosmio X305* - 2ghz Core 2 Duo, 4gb RAM, 2x 320gb drives in RAID 1 running Server 2008R2.

I'm currently using 50% RAM on each host, running a 3 node Exchange 2010 DAG, some DCs, a Squeeze VM, a file server VM, etc. Soon I'll be adding a 3 node file server cluster. (I love TechNet.) I'll be in good shape whenever I bother to start studying for my MS certs again.


*The Qosmio used to be my main computer, but I got sick of the small screen and built a desktop. It made more sense to simply use the Qosmio rather than build another box for vCenter. It's definitely fast enough. I know I could have run vCenter in a VM, but I like having vCenter independent, and the laptop makes a nice way to access the VMs when I'm in the basement.
 
ESXi 4

HP ML150
2x Xeon Dual @3ghz
8g DDR 533
Boot 2x 72 gig SAS
VMFS 2x 500g SATA

Tower case with a single power supply, so no huge power drain, and it's surprisingly quiet. Mostly for lab use and studying.
 
I'm running :

Asus 780G SB710
Kingston 2gb memory stick (the OS)
Mushkin redline 4gb kit(2x)
Phenom 1 9850.
FSP 270W psu.

DFI lanparty JR 790GX M2RS
Kingston 2gb memory stick (the OS)
Some corsair xms2 memory 8gb.
Phenom II 940
FSP 270W psu.

The NAS is a dual-core 5600+ with 2GB RAM and 12 drives in total, which acts as the SAN for my ESX hosts.

I'm wondering about motherboards whose onboard NIC is supported natively by ESXi 4; those Intel NIC add-in board prices are killing me!
I also run a Cisco router, 1 switch, and 1x 2-series HP ProCurve, all gigabit of course!
My SAN has 3 gigabit uplinks :)

Rocks pretty well, I have to say =). Now I need to find time to figure out what kind of servers I want to run on them; I've got domain, Exchange, web, SQL, netmonitor, a test W7, and an XP PC to check if everything works.
 
Just installed ESXi 3.5 on my hardware. I originally used Server 2003 x64 and VMware Server 2.x but recently started using ESXi at work and was impressed so I decided to migrate at home too. Only using a whitebox setup, but so far it works really well.

Gigabyte G33M-DS2R motherboard, Intel Core 2 Duo E4500 CPU, 8GB DDR2-800 no-name memory, 2x 160GB Samsung SATA hard drives for local datastores, 1x Intel PRO/1000 GT NIC.

I had to make a few changes to the BIOS on the motherboard, changed the operational mode of the SATA interfaces so that ESXi recognised them, turned off all the other integrated peripherals, and added the Intel NIC, etc but other than that the installation was straightforward.

Currently running a Ubuntu VM attached to my LAN, a Vyatta router VM sitting between my LAN and the ESXi sandbox network, where a pair of Windows 2003 DC VM's are residing along with an installation of BackTrack in a VM. Will be adding lots more to this as I go along.

I used to have servers and PCs strewn all over the place, never again! Next stop, upgrade the CPU to something with Intel VT so I can support x64 guest machines! :)

Edit: Just side-graded from the E4500 to an E6550, which gives me 2.2->2.33GHz (big wow!), 2->4MB cache (bigger wow!), 800->1333MHz FSB (bigger wow), and - most crucially - the Intel VT support for x64 guests. Best of all, I picked up the E6550 S/H for £45 and sold the E4500 for £25, so not a bad move for £20!
 
A bit of an update to my ESXi server: I wanted to convert from ESXi 3.5 to 4 and ended up building a new machine while I was at it. Here are the specs:

2x amd quad core 2347
8gb ddr2 memory (for now)
supermicro dual socket f server motherboard
1 tb western digital caviar black drive for vms
boot partition runs on a flash drive
backups will be done every weekend
 
at home
dell 840, x3220, 8gb, 3tb on adaptec 3405, 74gb raptor as boot drive (lol) I will change this in the future (esxi4 u1)

in the datacenter
dell r210, pentium g6950, 4gb, 1x250, 1x1tb (server 2k8, with vmware workstation)
dell r710, dual x5540s, 24gb, 1x 50gb ssd, 7x500gb sas 2.5 - esxi4 u1
dell r710, dual e5520's, 16gb, 4x 500gb sas 2.5 (dev box) (esxi4 u1)
 
Hmm, I'm not running VMware anymore :p Since I got CPUs with virtualization support, I use Hyper-V.

My server02 from my sig is running the Hyper-V console, and I must say I like it better than VMware!
 
Running ESXi 4.0 on an HP DL380 G6 at home. The server has one 2.26GHz E5520, 4GB UDIMMs, two 146G 10k 3G SAS disks. Almost forgot - ESXi is installed on an SDHC card in the embedded SD slot.

Next year is the year of the upgrade: Expanding to eight drives, 24G RDIMMs, second CPU, second PSU, BBWC.

Server is kicking it with eight 146G 10k SAS drives (two 3G and six 6G). I've migrated off the underpowered iSCSI datastore and now see extremely good I/O reads from the RAID5.
 
ESXi 4

White box
MB: Biostar TF720 A2+
CPU: AMD 4850e dual-core 2.8Ghz
RAM: 6GB
Boot: 4GB Usb flash drive
VMFS: Dell Perc 5/i, 2x146GB 15k SAS RAID1, 3x500GB SATA RAID5
NIC: HP NC7170 Gig Dual-Port, Intel Pro 1000 GT
 
I figure I will post mine up while it is down for the 4.0.2 update.

vmware.jpg


Sometime soon the RAM is going to be going from 4 to 16GB and the Data pool is going to get expanded by replacing my current 74GB SCSI drives with 146GB 10K SCSIs + a 1TB external over eSATA

Currently running 2x Windows Server 2008 boxes for primary AD and an Exchange server, and a Fedora box for a seedbox and SNMP capture.
 
Did your update to v4 U2 go smoothly? I'm planning on upgrading my home lab this weekend and then the servers at work if that all goes well.
 
Did your update to v4 U2 go smoothly? I'm planning on upgrading my home lab this weekend and then the servers at work if that all goes well.

My test servers upgraded fine this morning. I think I'm gonna take the leap and upgrade the production servers after everybody goes home for the day.
 
Did your update to v4 U2 go smoothly? I'm planning on upgrading my home lab this weekend and then the servers at work if that all goes well.

Went just fine for me. Had a slight issue with the "ERROR: Unsupported boot disk / The boot device layout on the host does not support upgrade" message; thankfully I was able to work through it, and now everything is chugging along happily.
 
I just built a system on a Biostar TH55B with a Core i3 540, 4GB of RAM, and a 640GB WD Black. I am only using it to host my router and a few toy Linux boxes. I never run more than two VMs at once.

I wanted low power usage :)

I must say, it is FUN. It's like building a new computer every day, if I want that. I sorta regret not going for a quad core... :p
 
2 ESX Hosts
1x Dell Optiplex 745 w/8Gigs of Ram, C2Q Q6600, PCI Express Intel PT Dual 1000, Qlogic 2312isp
1x Gigabyte P35-DS3R w/8Gigs of Ram, C2Q Q6600, 2xIntel MT Dual 1000, Emulex 11000
(VC is Virtualized)

1 Openfiler SAN
Asus P5Q-Pro, 2Gigs RAM, E2200, Qlogic 2312ISP in target mode, 2x Intel MT Dual 1000 (iSCSI multipathing), 1x PCI Express 4-port SATA. All of this is in a retrofitted IBM Storage Server case with a 16x SATAII backplane.
(current hard drive specs are 2x2tb Raid1, 2x1.5TB Raid1, 3x500gb Raid5, 2x250gb Raid1)
Openfiler is set up to do multipathing on both the Fibre and the iSCSI for my ESX hosts.

1xDell/Brocade 2gb/s Switch 8Port
1xNetgear 16Port 10/100/1000 Layer 3

All stored in my quarter-height rack in my office.
 
Production/UAT/Beta/DR at one of our datacenters - down to 74 physical servers from 630 servers - took about a year and a half to virtualize everything in Beta, UAT, DR, and Production

02x Dell R900's 20GB DDR2 4X 300GB Raid 10 vCenter 4.0 update 2
14X Dell R910's vSphere Ent 128GB DDR3
09x Dell R710's vSphere Ent 72GB DDR3
21x Dell 2950's vSphere Ent 32GB DDR2
05x Dell 1950's vSphere Ent 32GB DDR2
13x Dell MD3000i's 15 X 300GB 15K iscsi guest storage
10x Dell EqualLogic PS6010 15 x 300GB 15K iscsi guest storage
2x Cisco 6513 Catalyst with 9x 6748 gigabit blades

Home

2x Dell 1950 24GB DDR2 with 2x 300GB 1k in raid 1 - free Esxi 4.0
2x Dell 2900 8GB DDR2 with 8TB each of shared storage -Openfiler installed -set as iSCSI storage
1x Cisco 3750 48 port full gigabit switch
 
My home lab will be assembled this week.

ESXi box 1
AMD Athlon II X4 600e 45w CPU
8GB RAM
3x Gb NIC (SC/VMotion, iSCSI, VM port groups)
80GB 2.5" HDD

ESXi box 2
AMD Athlon II X3 405e 45w CPU
8GB RAM
3x Gb NIC (SC/VMotion, iSCSI, VM port groups)
80GB 2.5" HDD

File Server (Debian Linux)
Sempron 140 45w CPU
2GB RAM
2x 120GB 2.5" HDDs mirrored (OS)
4x 250GB HDDs RAID 10 (VMs, iSCSI)
3x 1.5TB HDDs RAID 5 (file share)
2x Gb NICs (one for iSCSI, one for file share)

I'm going after the VCAP once it comes out and the VCDX next year so it'll be really handy to have a couple boxes to fiddle with at home.
 