4-node SuperMicro SYS-e200-8d setup

  • 1 December 2017
  • 10 replies

Badge +7
Hi, it's my first post here. I've been running this Nutanix CE lab for close to 8 months. So far, it has been great!

I would like to call it a "mini DC in an Ikea Shelf".

Here are the specifications:
4 x SuperMicro SYS-e200-8d
- Intel® Xeon® processor D-1528, single socket FCBGA 1667, 6 cores / 12 threads, 35W
- 128GB memory
- 2 x 512GB SSD
- 2 x 10GbE
1 x Netgear XS716T - 16 ports
1 x Netgear XS716E - 16 ports (not shown in pic)
1 x Synology DS2015Xe
- 2 x 512GB SSD (cache)
- 6 x 4TB WD NAS SATA disks
1 x Asus RT-AC88U wireless router
1 x Intel NUC Skull Canyon NUC6i7KYK

Here's the PRISM console


Badge +5
Do you have the NAS directly integrated with CE, or is it just storage and backup for VMs?
Userlevel 7
Badge +35
WOW - looks great thanks for sharing!
Userlevel 1
Badge +8
The setup looks almost like an enterprise cluster, great post!
Badge +7
It serves as just storage and backup. I was hoping that AHV could mount an external NFS share and use it as a storage container (like a VMware NFS datastore), so that I could move powered-off VMs and templates to the external NFS storage easily.
Looks amazing - I'm looking at the Supermicro nodes; how have you configured each one? Does each node have 128GB of RAM? Looking at the specs, the node only supports one 2.5" drive - have you used the M.2 slot as well? Would appreciate your thoughts on what you would change (if anything) now that you have been running it for a while.
Badge +7
Hi Kalamath,

Each SuperMicro SYS-e200-8d came with 2 x 10GbE and 2 x 1GbE interfaces. I connected both 10GbE ports to the Netgear XS716T 10G switch.

From the Acropolis host
> ovs-vsctl show
> ovs-appctl bond/show bond0

From the CVM
# Remove the 1GbE ports from bond0 so only the 10GbE uplinks remain
> manage_ovs --bridge_name br0 --bond_name bond0 --interfaces 10g update_uplinks

From the Acropolis host
> ovs-vsctl set port bond0 bond_mode=balance-slb
> ovs-vsctl set port bond0 other_config:bond-rebalance-interval=6000

Followed this article: https://next.nutanix.com/blog-40/network-load-balancing-with-acropolis-hypervisor-6463
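For reference, the steps above can be sketched as a single script. This is a sketch only, not something from the article: it assumes the default br0/bond0 names used above, and the manage_ovs step must still be run from the CVM (shown here as a comment), with the rest run on the AHV host.

```shell
#!/bin/sh
# Sketch: switch bond0 to balance-slb on an AHV host (assumes default br0/bond0 names).

# Inspect the current OVS layout and bond state on the AHV host.
ovs-vsctl show
ovs-appctl bond/show bond0

# From the CVM (not the host): keep only the 10GbE NICs as uplinks for bond0.
# manage_ovs --bridge_name br0 --bond_name bond0 --interfaces 10g update_uplinks

# Back on the AHV host: enable balance-slb with the rebalance interval (ms) used above.
ovs-vsctl set port bond0 bond_mode=balance-slb
ovs-vsctl set port bond0 other_config:bond-rebalance-interval=6000

# Verify the new bond mode took effect.
ovs-appctl bond/show bond0
```

Running `ovs-appctl bond/show bond0` afterwards should report `bond_mode: balance-slb` and show traffic hashes spread across both 10GbE members.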

Yes, each node has 128GB of RAM.
I used up both the 2.5" SATA and M.2 slots.
I would have used higher-capacity SSDs if the budget had allowed.
Badge +5
Hi Lancez, I'm unable to get balance-slb going; whatever I do, one of the 10GbE ports goes offline even though all the cables are connected to the switch (all is fine with ESXi). Is there anything I'm missing here?
How did you manage to install Nutanix on the SuperMicro servers? When I boot from a USB key, it gets stuck at the end of system initialization with a weird message about a device not found, using "ce-2019.02.11-stable".
When I installed mine, I had to flash the BIOS on the SuperMicro server. You could start there. Also, what hard drive configuration do you have?
My BIOS is at the latest version available. The problem is not booting from the media but completing the Nutanix boot itself. I've got a package with a 2TB Micron SSD and a 256GB Samsung 970 EVO Plus NVMe, and I boot from a reliable 32GB USB stick.