Service Description


The IT Center operates high-performance computers to support the university's institutions and employees in education and research.

All machines are integrated into a single “RWTH Compute Cluster” running the Linux operating system.

General information about using the RWTH Compute Cluster is described in this area, whereas information about programming the high-performance computers can be found in RWTH Compute Cluster - Parallel Programming.

All members of RWTH Aachen University have free access to the RWTH Compute Cluster, but the amount of resources they can use is limited.

Above a certain threshold, applications for additional resources have to be submitted, which are then reviewed. This application process is also open to external German scientists at institutions related to education and research. Please find related information here.

Please find information about how to get access to the system here.

You can find information about using and programming the RWTH Compute Cluster online on this website and at our HPC-related events. For many of these events, particularly tutorials, we also collect related material on our website - see here. In addition, the Chair for HPC offers regular lectures, exercises, and software labs covering related topics.

Users of the RWTH Compute Cluster are kept informed through the HPC mailing list (registration, archive).

Maintenance Information


RWTH Störungsmeldungen
Fault reports for services of RWTH Aachen
Compute Cluster - FastX web connections require TLS 1.2
Change from Monday 24.02.2020 06:00 until further notice - For security reasons, FastX web connections to the URL https://login18-x-1.hpc.itc.rwth-aachen.de:3443 now only allow TLS 1.2. Please let us know if you encounter any problems.
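If you are unsure whether your client still negotiates a connection with this endpoint, the TLS version handling can be checked from the command line. This is a minimal sketch using the standard openssl s_client options, not an official diagnostic procedure:

    # A TLS 1.2 handshake with the FastX web endpoint should succeed
    # and print the server certificate chain:
    openssl s_client -connect login18-x-1.hpc.itc.rwth-aachen.de:3443 -tls1_2 < /dev/null

    # Older protocol versions are now rejected, so this is expected to fail:
    openssl s_client -connect login18-x-1.hpc.itc.rwth-aachen.de:3443 -tls1_1 < /dev/null
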
Compute Cluster - No HPCWORK on login18-4
Partial outage from Monday 16.03.2020 06:30 until further notice - HPCWORK is currently not available on login18-4. Please use one of the other dialog systems until the problem has been resolved.
Compute Cluster - Metadata operations on $HPCWORK sometimes very slow
Partial outage from Friday 20.03.2020 16:00 until further notice - Metadata operations on $HPCWORK (cd, mkdir/rmdir, ls, ...) can sometimes take very long or even hang completely. The system vendor has been notified and is working on a solution to the problem.
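
Whether the dialog system you are logged in to is affected by the $HPCWORK problems above can be checked with standard shell tools. A minimal sketch:

    # Check where $HPCWORK points and whether the file system is mounted:
    echo "$HPCWORK"
    df -h "$HPCWORK"

    # Time a simple metadata operation; if this takes very long or hangs,
    # the login node is affected by the outage described above:
    time ls "$HPCWORK" > /dev/null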

News


  • 2020.04.06: Intel software updated (see the module usage sketch after this list).
    • Intel compiler intel/19.1.1.217 installed (marketing name 'Parallel Studio XE 2020 Update 1').
    • Intel VTune Profiler (previously known as 'Amplifier') intelvtune/XE2020-u01 installed and set to be the default 'intelvtune' module version.
    • Intel MPI version 2019.7.217 installed and available as the intelmpi/2019.7.217 module.
    • Intel Python version 3.7 release 2020.1.893 installed and available as the pythoni/3.7 module; version 3.7.2020.0.014, previously known by this name, has been moved to the DEPRECATED area.
    • Intel Threading Building Blocks 2020 Update 2 installed, available as the inteltbb/2020.2 module, and set to be the default inteltbb module.
    • Intel MKL version 2020.1.217 installed and set to be the intelmkl/2020 module; version 2020.0.166, previously known by this name, has been moved to the DEPRECATED area.
    • Intel Inspector 2020 Update 1 (Build: 604266) installed as intelixe/2020-u1 and set to be the default intelixe module; the previous default version intelixe/2020-u0 as well as version intelixe/2019-u3 have been moved to the DEPRECATED area.
    • Intel Advisor 2020 Update 1 (Build: 605410) installed as intelaxe/2020 and set to be the default intelaxe module; the previous default version (2020 initial release, 2020-u0) as well as version intelaxe/2019-u1 have been moved to the DEPRECATED area.
    • Intel ITAC 2020 Update 1 (2020.1.024) installed as intelitac/2020 and set to be the default intelitac module; the previous default version (2020 initial release, 2020.0.015) as well as version intelitac/2019[.1.022] have been moved to the DEPRECATED area.
  • 2020.03.26: GROMACS version 2020.1 available in different flavours. Cf. gromacs
  • 2020.03.25:
    • Maple 2020 has been installed. Due to license server limitations, versions prior to 2018 are no longer supported and have therefore been removed.
    • Mathematica 12.1 has been installed.
  • 2020.03.23:
    • GCC compiler version 9.3.0 installed and set to be the 'gcc/9' module. Version 9.2.0, previously known as the 'gcc/9' module, has been moved to the DEPRECATED area.
    • Clang compiler version 9.0.1 installed as the 'clang/9.0' module and set to be the default clang module version. Version 9.0.0, previously known as the 'clang/9.0' module, has been moved to the DEPRECATED area, as has version 7.0[.0].
  • 2020.03.19: Open MPI versions 1.10.7, 3.1.5, and 4.0.3 installed.
  • Quantum Espresso 6.5 installed in ScaLAPACK and no-ScaLAPACK flavours. This time more additional binaries were compiled, including gipaw.x.
  • TotalView debugger version 2020.0[.5] installed.
  • PGI compiler version 19.10 installed and set to be the default 'pgi' version; version 20.1 installed; many older versions moved to the DEPRECATED area.
  • System maintenance from 10.03. to 12.03.2020
    • Maintenance starts on 10.03. at 8:00 and will presumably end on 12.03. at 18:00.

    • During the maintenance no batch jobs will run. The dialog systems are rebooted at the beginning and at the end of the maintenance, but are available in between. $HOME and $WORK will be accessible as usual; however, $HPCWORK will not be available for the entire three days.
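
The module names and versions in the news items above are taken from the announcements. The following is a minimal sketch of how to pick up such an update in a shell session, assuming the standard module command available on the dialog systems; exact defaults may differ:

    # List the available versions of a module, e.g. the Intel compiler:
    module avail intel

    # Load a specific, newly installed version explicitly ...
    module load intel/19.1.1.217

    # ... or switch from an already loaded version to the new one:
    module switch intel intel/19.1.1.217

    # Updated defaults (e.g. intelvtune, inteltbb, gcc/9) are picked up
    # by loading the module without a version suffix:
    module load intelvtune

    # Verify what is currently loaded:
    module list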


Previous blog post: Status RWTH Compute Cluster 2019-11-27
