HPC Systems

ada: an IBM (mostly) NeXtScale Cluster

System Name: Ada
Host Name: ada.tamu.edu
Operating System: Linux (CentOS 6.5)
Nodes/cores per node: 845/20-core @ 2.5 GHz IvyBridge
Nodes with GPUs: 30 (2 Nvidia K20 GPUs/node)
Nodes with Phis: 9 (2 Phi coprocessors/node)
Memory size: 811 nodes with 64 GB/node;
34 nodes with 256 GB (DDR3 1866 MHz)
Extra-fat nodes/cores per node: 15/40-core @ 2.26 GHz Westmere;
4 with 2 TB and 11 with 1 TB (DDR3 1066 MHz)
Interconnect: FDR10 fabric based on the
Mellanox SX6536 core switch
Peak Performance: ~337 TFLOPs
Global Disk: 4 PB (raw) via IBM's GSS26 appliance
File System: Global Parallel File System (GPFS)
Batch: Platform LSF
Production Date: September 2014
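
As a quick consistency check, the node and core counts listed above account for the cluster-wide core total quoted in the description that follows:

    845 \times 20 + 15 \times 40 = 16900 + 600 = 17500 \text{ cores}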

Ada is a 17,500-core IBM commodity cluster with nodes based mostly on Intel's 64-bit, 10-core IvyBridge processors (two per 20-core node). Twenty of the 30 GPU nodes have 256 GB of memory each. Included in the 845 nodes are 8 login nodes, each with 256 GB of memory; 3 of these have two GPUs each and 3 have two Phi coprocessors each.
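
On the GPU nodes, a simple device query shows the per-node GPU configuration. The following is a minimal sketch, assuming the CUDA toolkit (nvcc and the runtime library) is available on Ada's GPU nodes; it is illustrative and not taken from the Ada documentation.

    // devquery.cu -- minimal CUDA device query (illustrative sketch).
    // Compile with: nvcc devquery.cu -o devquery
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess) {
            std::printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
            return 1;
        }
        // On one of Ada's GPU nodes this should report 2 devices (Tesla K20).
        std::printf("Visible CUDA devices: %d\n", count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            std::printf("  Device %d: %s, %.1f GB global memory, compute capability %d.%d\n",
                        i, prop.name,
                        prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
                        prop.major, prop.minor);
        }
        return 0;
    }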

For details on using this system, see the User Guide for Ada.


eos: an IBM iDataPlex Cluster

System Name: Eos
Host Name: eos.tamu.edu
Operating System: Linux (RedHat Enterprise Linux and CentOS)
Number of Nodes: 372 (324 8-way Nehalem-based and 48 12-way Westmere-based)
Number of Nodes with Fermi GPUs: 4 (2 with 2 M2050 each and 2 with 1 M2070 each)
Number of Processing Cores: 3,168 (all @ 2.8 GHz)
Interconnect Type: 4x QDR Infiniband (Voltaire Grid Director GD4700 switch)
Total Memory: 9,056 GB
Peak Performance: 35.5 TFlops
Total Disk: ~500 TB via a DDN S2A9900 RAID array
File System: GPFS
Production Date: May 2010

Eos is an IBM "iDataPlex" commodity cluster with nodes based on Intel's 64-bit Nehalem and Westmere processors. The cluster is composed of 6 head nodes, 4 storage nodes, and 362 compute nodes. The storage and compute nodes have 24 GB of DDR3 1333 MHz memory, while the head nodes have 48 GB of DDR3 1066 MHz memory. A Voltaire Grid Director 4700 QDR IB switch provides the core switching infrastructure.
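
For reference, both the core count and the quoted peak follow from the figures above, assuming the usual 4 double-precision floating-point operations per core per cycle for SSE-based Nehalem/Westmere parts (an assumption, not stated on this page):

    324 \times 8 + 48 \times 12 = 2592 + 576 = 3168 \text{ cores}
    3168 \times 2.8\,\text{GHz} \times 4\,\text{FLOPs/cycle} \approx 35.5\,\text{TFLOPS}

The match suggests the quoted peak counts the CPUs only, with the small Fermi GPU complement left out.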

For details on using this system, see the User Guide for Eos.