University of Sussex

High performance computing

We provide specialist computing services for research projects needing high-capacity systems.

At present the HPC central facility at the University of Sussex comprises 65 nodes connected by a 100 Gb/s OmniPath interconnect, with 1,100 physical cores available in total.
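To give a sense of how such a system is used, a parallel job typically runs many processes at once, spread across nodes that communicate over the OmniPath interconnect. The minimal sketch below uses mpi4py; the MPI stack, launcher and rank counts are assumptions for illustration, not details from this page.

    # A minimal sketch of a distributed job spanning the cluster's nodes,
    # written with mpi4py. The MPI stack, launcher and rank counts are
    # assumptions for illustration; the page does not specify them.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()              # this process's index, 0..size-1
    size = comm.Get_size()              # total number of ranks in the job
    node = MPI.Get_processor_name()     # which node this rank landed on

    # Ranks on different nodes exchange messages over the interconnect;
    # here each rank simply reports where it is running.
    print(f"rank {rank} of {size} on {node}")

    # Hypothetical launch across several nodes:
    #   mpirun -n 64 python hello_mpi.py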

The IT Services (ITS) Research Computing team handles the configuration, maintenance and development of the High-Performance Computing system, while IT Services provides and maintains the general infrastructure hosting it, including a modern data centre, security, electricity, cooling, rack space, networking and storage. ITS initially provided HPC systems for general use, and these served as a basis for expansion by researchers according to their computing requirements and budgets.


Each user of the system is allocated a home working area with nightly incremental backups and archive functionality, as well as non-backed-up scratch space, limited to 2 TB, in our Lustre parallel file system for I/O-intensive applications. The Lustre system, introduced in 2019, has been expanded by the EPP group and the Astronomy Centre with an extra 500 TB of usable space and is now 1.4 PB in size. It can provide up to 9 Gb/s of throughput for intensive parallel applications. Researchers can request Lustre scratch space for specific projects; this project space can be shared by research group members and provides a common place for reference or intermediate data sets. Project space is limited by a Lustre quota set according to the project's needs.
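To make the home/scratch split concrete, here is a minimal sketch of a job that keeps large intermediate data on Lustre scratch and writes only small final results to the backed-up home area. The SCRATCH environment variable and the /lustre/scratch/<user> path layout are assumptions for illustration, not paths documented here.

    # A minimal sketch of directing intensive I/O to Lustre scratch rather
    # than the backed-up home area. The SCRATCH variable and the
    # /lustre/scratch/<user> layout are assumptions for illustration only.
    import getpass
    import os

    import numpy as np

    scratch = os.environ.get("SCRATCH",
                             f"/lustre/scratch/{getpass.getuser()}")
    results = os.path.expanduser("~/results")
    os.makedirs(scratch, exist_ok=True)
    os.makedirs(results, exist_ok=True)

    # Large, reproducible intermediates belong in scratch: fast for
    # parallel I/O, but not backed up and capped at 2 TB per user.
    data = np.random.default_rng(0).standard_normal((1000, 1000))
    np.save(os.path.join(scratch, "intermediate.npy"), data)

    # Small, hard-to-recreate results belong in the backed-up home area.
    np.save(os.path.join(results, "summary.npy"), data.mean(axis=0))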

Schools, for their part, provide additional staff for first-line support as well as specialised software or service deployment. Schools or research teams can also fund extra computing resources to be added to the system. Depending on the funding mechanism, these resources can be reserved exclusively for particular research purposes, or made available so that other researchers can make opportunistic use of them when the funders are not using them.

User accounts can be requested through .

Updated on 2 December 2021