The IT Center operates high-performance computers to support institutions and employees in education and research.
All machines are integrated into one “RWTH Compute Cluster” running under the Linux operating system.
General information about using the RWTH Compute Cluster is described in this area, whereas information about programming the high-performance computers is described in RWTH Compute Cluster - Parallel Programming.
All members of RWTH Aachen University have free access to the RWTH Compute Cluster, but the amount of resources they can use is limited.
Above a certain threshold, applications for additional resources must be submitted, which are then reviewed. This application process is also open to external German scientists at institutions related to education and research. Please find related information here.
Please find information about how to get access to the system here.
You can get information about using and programming the RWTH Compute Cluster online on this website, or during our HPC-related events. For many of these events, particularly tutorials, we collect related material on our website as well - see here. In addition, the Chair for HPC offers regular lectures, exercises, and software labs covering related topics.
Service disruption reports for RWTH Aachen services
2020-01-05, HPC Software News:
- New versions of Gurobi (8.1.1, 9.0.0) have been installed and are ready for use via the modules system.
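Software announced as "ready for use via the modules system" is typically accessed with the Environment Modules commands. A minimal sketch (the exact module names are assumptions; check `module avail` on the cluster for the names actually installed):

```shell
# List the Gurobi versions installed on the cluster
module avail gurobi

# Load a specific version (name assumed from the announcement above)
module load gurobi/9.0.0

# Verify which modules are currently loaded
module list
```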
2020-01-02, HPC Software News:
- Older versions of Ansys (16.0, 16.2, 17.0, 17.1, 17.2, 18.0) have been moved to the DEPRECATED area. These versions are delivered with older versions of Intel MPI that are not fully supported on the Intel OmniPath network.
2019-12-30, HPC Software News:
- Intel Inspector 2020 (initial release) installed and set to be the default 'intelixe' module.
2019-12-27, HPC Software News:
- Intel MPI version 2019.6.166 installed as intelmpi/2019.6 module.
2019-12-20, HPC Software News:
- Intel compiler "Parallel Studio XE 2020" installed as intel/19.1 module (the real version number is 18.104.22.168, sic!)
- Intel "Python for Linux* 2020" distribution installed as pythoni/3.7 module
- Intel MKL update installed as intelmkl/2020 module; however, the default remains the intelmkl/2017(default) version. (Note: this library is mainly relevant for non-Intel compilers, as the Intel compilers always include an integrated version of MKL.)
- Intel TBB update installed as inteltbb/2020 module; however, the default remains the inteltbb/2017(default) version. (Note: this library is mainly relevant for non-Intel compilers, as the Intel compilers always include an integrated version of TBB.)
- Intel Trace Analyzer and Collector for Linux 2020 (ITAC) installed as intelitac/2020 module and set to be the default.
- Intel Advisor for Linux 2020 installed as intelaxe/2020 module and set to be the default.
- Intel Inspector for Linux 2020 is not available yet and will be installed at a later date.
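Since some of the 2020 tools above are installed but not set as defaults, they have to be selected explicitly. A sketch of how this might look with the Environment Modules commands (module names taken from the announcements above; the actually loaded defaults on the cluster may differ):

```shell
# Replace the default Intel compiler with Parallel Studio XE 2020
module switch intel intel/19.1

# Replace the default Intel MPI with the 2019.6 release
module switch intelmpi intelmpi/2019.6

# MKL and TBB 2020 are not the defaults, so load them explicitly
module load intelmkl/2020
module load inteltbb/2020
```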
2019-11-27, HPC Software News:
- Gamess version 2019 R2 installed.
2019-11-14, HPC Software News:
2019-11-11, HPC Software News:
2019-10-23, old legacy RV-NRW accounts locked.
- All known active RV-NRW users were informed many times prior to this switch.
- All users can continue to log in, either via a private 'rnrw' project granted to them, or as owner/coworker on another project.
- If in doubt, please contact the Service Desk.
2019-10-2x, major change in handling of project/group accounts:
- Non-login accounts such as project accounts (jaraNNNN, rwthMMMM, and so on) are no longer listed in Selfservice.
- The member script has been rewritten and gained new functionality: self-administration of UNIX groups with 'member'.
Previous blog post: Status RWTH Compute Cluster 2019-09-03