Hardware¶
The Bora HPC cluster consists of computing servers and storage servers. All servers in the HPC environment run Linux, specifically the Rocky Linux 8.9 distribution.
A cluster is a collection of computing nodes connected by a communication network.
Three kinds of computing nodes are available on Bora:
- CPU node: multiple CPU cores
- FAT node: multiple CPU cores + a large amount of RAM
- GPU node: multiple CPU cores + at least one GPU
Computing nodes¶
Here are the specifications of the computing nodes:
| Node type | Description | Node count | CPU | Cores | Frequency | RAM | GPU |
|---|---|---|---|---|---|---|---|
| Login | Entry point to the cluster | 1 | Xeon Silver 4410Y | 24 | 2.0 GHz | 128 GB | – |
| CPU | General-purpose CPU nodes | 4 | Xeon Gold 6354 | 36 | 3.0 GHz | 256 GB | – |
| GPU | Nodes equipped with NVIDIA graphics cards | 1 | Xeon Gold 6354 | 36 | 3.0 GHz | 256 GB | 2 x NVIDIA A30 24 GB |
| FAT | Nodes for memory-intensive jobs | 1 | Xeon Gold 6354 | 36 | 3.0 GHz | 1024 GB | – |
In total, the 6 computing nodes provide 216 cores and 2.3 TB of RAM.
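As a quick back-of-the-envelope check of these totals, the per-node figures from the table above can be summed as follows (the script is purely illustrative):

```python
# Illustrative check of the aggregate core and RAM figures,
# using the per-node values from the table above.
nodes = [
    # (node type, node count, cores per node, RAM per node in GB)
    ("CPU", 4, 36, 256),
    ("GPU", 1, 36, 256),
    ("FAT", 1, 36, 1024),
]

total_nodes = sum(count for _, count, _, _ in nodes)
total_cores = sum(count * cores for _, count, cores, _ in nodes)
total_ram_gb = sum(count * ram for _, count, _, ram in nodes)

print(f"Computing nodes: {total_nodes}")        # 6
print(f"Total cores:     {total_cores}")        # 216
print(f"Total RAM:       {total_ram_gb} GB")    # 2304 GB, i.e. ~2.3 TB
```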
Each GPU node has 2 physical NVIDIA A30 GPUs, one of which is split into 4 virtual 6 GB sub-GPUs (MIG devices). See the User guide for further information on their usage.
Hyperthreading is disabled on all the nodes.
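As a minimal sketch of how the physical GPUs and MIG sub-GPUs can be inspected from a GPU node (this assumes the NVIDIA driver's `nvidia-smi` tool is available on the node; it is not a prescribed workflow, see the User guide for the supported usage):

```python
# Minimal sketch: list physical GPUs and MIG sub-GPUs on a GPU node.
# Assumes the NVIDIA driver's `nvidia-smi` tool is installed and on the PATH.
import subprocess

# `nvidia-smi -L` prints one line per physical GPU and one per MIG device.
result = subprocess.run(["nvidia-smi", "-L"],
                        capture_output=True, text=True, check=True)

for line in result.stdout.splitlines():
    print(line)

# Illustrative shape of the output (UUIDs omitted):
#   GPU 0: NVIDIA A30 (UUID: ...)
#   GPU 1: NVIDIA A30 (UUID: ...)
#     MIG 1g.6gb Device 0: (UUID: ...)
#     ...
```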
Storage nodes¶
| Level | Description | Storage | Total space |
|---|---|---|---|
| 1st (fast) | Fast I/O operations | 10 x 3.84 TB SAS in RAID5 | 30 TB |
| 2nd (slow) | Data storage for home and shared folders | 6 x 18 TB SAS III in RAID6 | 70 TB |
All storage nodes are equipped with a Xeon Gold 5317 processor at 2.0 GHz with 24 cores and 256 GB of RAM.
Network¶
The network is based on Gigabit Ethernet for management purposes, and on 200 Gbit/s InfiniBand, a low-latency, high-bandwidth interconnect.