Service Description


The IT Center operates high-performance computers in order to support institutions and employees in education and research.

All machines are integrated into one “RWTH Compute Cluster” running under the Linux operating system.

General information about using the RWTH Compute Cluster is described in this area, whereas information about programming the high-performance computers is described in RWTH Compute Cluster - Parallel Programming.

All members of RWTH Aachen University have free access to the RWTH Compute Cluster, but the amount of resources they can use is limited.

Above a certain threshold, applications for additional resources have to be submitted, which are then reviewed. This application process is also open to external German scientists at institutions related to education and research. Please find related information here.

Please find information about how to get access to the system here.

You can get information about using and programming the RWTH Compute Cluster online on this website, in our Primer, which you can download as a single PDF file for printout, or during our HPC-related events. For many of these events, particularly tutorials, we collect related material on our website as well - see here. In addition, there are regular lectures, exercises and software labs of the Chair for HPC covering related topics.

Users of the RWTH Compute Cluster are continuously informed through the HPC mailing list (registration, archive).

Maintenance Information


RWTH incident reports (Störungsmeldungen)
Job restrictions on the BULL cluster - jobsize
Notice, valid from Friday, 2017-12-01 10:00 until Thursday, 2018-11-01 00:00 - Due to problems in the BULL InfiniBand fabric, jobs on the BULL cluster are restricted to one chassis. This means that a) the maximum core count is restricted to 216 cores and b) the maximum number of hosts is restricted to 18 hosts. In both cases, the job will be rejected if these numbers are exceeded. This does NOT affect the NEC cluster or the Integrative Hosting service!
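A quick pre-submission check of the limits above can avoid rejected jobs. The following sketch is hypothetical (not an official cluster tool); it only encodes the temporary 216-core / 18-host chassis limit and the 12 cores per BULL host:

```shell
#!/bin/sh
# Hypothetical sanity check for the temporary BULL chassis limit:
# at most 216 cores and at most 18 hosts, with 12 cores per host.

check_jobsize() {
    cores=$1
    hosts=$(( (cores + 11) / 12 ))   # hosts needed at 12 cores each
    if [ "$cores" -gt 216 ] || [ "$hosts" -gt 18 ]; then
        echo "rejected: $cores cores on $hosts hosts exceeds one chassis"
        return 1
    fi
    echo "ok: $cores cores on $hosts hosts fits one chassis"
}

check_jobsize 216               # fits exactly: 18 hosts x 12 cores
if ! check_jobsize 240; then    # needs 20 hosts - would be rejected
    echo "submission would be refused"
fi
```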

News


  • 2018-02-25, HPC Software News: TotalView debugger version 2018.0.5 installed and set as the default. Older versions moved to the DEPRECATED area.
     
  • 2018-01-25, HPC Software News:
    •  In order to simplify access, all ANSYS software has been consolidated into the "ansys" module. The former modules cfx, icem and fluent will be moved to the DEPRECATED area on March 1st, 2018 (formerly: February 1st). Users of said software should check their job scripts and load the "ansys" module instead. 
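The job-script change might look like the following sketch; the deprecated per-product module names are the ones listed above:

```shell
# Old job scripts (modules deprecated as of March 1st, 2018):
#   module load fluent
# New job scripts - the consolidated module covers cfx, icem and fluent:
module load ansys
```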


  • 2018-01-18: Intel compiler 17.0.6.256 installed and set to be the intel/17.0 module; the previous version known under this name (17.0.5.239) has been moved to the DEPRECATED area.
  • 2018-01-17: Oracle Java/JDK has been updated from version 1.8.0_151 to 1.8.0_162
     
  • Due to technical issues in the BULL InfiniBand fabric, jobs on the BULL cluster are restricted to one chassis. This means that:
    • the maximum core count is restricted to 216 cores (18 nodes x 12 cores)
    • the maximum number of hosts is restricted to 18 hosts
      In both cases, the job will be rejected if these limits are exceeded.
    This does NOT affect the NEC cluster or the Integrative Hosting service (IH)!

     
  • 2018-01-15: The systems survived the Meltdown/Spectre updates.
    • CentOS has been updated from 7.3 to 7.4.
    • SSH keys have been regenerated, so don't panic about 'man-in-the-middle attack' messages, but read them carefully.
    • Due to the security updates, system performance may suffer. We consider a loss of about 10% acceptable; please report if you observe worse performance.
    • The workaround for the Intel-GLIBC issue broke some binaries, so it has been reverted, as a fixed GLIBC version (2.17-196.el7_4.2) is now installed.
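If a client still refuses to connect after the key regeneration, the cached old host key can be removed and the new fingerprint verified on the next login. The sketch below is self-contained for illustration only; the hostname and file names are placeholders, and in practice you would simply run `ssh-keygen -R <frontend>` against your own ~/.ssh/known_hosts:

```shell
#!/bin/sh
# Illustration: remove a stale host key entry after a key regeneration.
# Hostname and file names are placeholders; a throwaway key pair and
# known_hosts file are created so the example runs anywhere.
host=cluster.rz.rwth-aachen.de
ssh-keygen -q -t ed25519 -N '' -f demo_key
printf '%s %s\n' "$host" "$(cut -d' ' -f1-2 demo_key.pub)" > demo_known_hosts

# Drop the outdated entry; the next ssh connection will then ask to
# confirm the new fingerprint, which should match the published list.
ssh-keygen -R "$host" -f demo_known_hosts
```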
  • 2018-01-09: We have generated new SSH host keys on our systems. The fingerprints of the new keys for the dialog systems are published at https://doc.itc.rwth-aachen.de/x/cYA1Ag.
     
     
  • 2018-01-08, HPC Software News:
    • New version 3.10.1 of the CMake tool installed, cf. 'module avail cmake'.

    • [UPD 2018-01-13: workaround reverted] For all versions of the Intel compiler prior to 18.0.1.163, the environment variable LD_BIND_NOW will now be set to 1. This is a workaround for a bug in the Intel compiler (an incompatibility with GCC). The performance impact is considered to be low, but has not been ruled out. Further reading:

      [1] https://blog.fefe.de/?ts=a71ab7b5
      [2] https://bugzilla.redhat.com/show_bug.cgi?id=1499012
      [3] https://software.intel.com/en-us/articles/intel-compiler-not-compatible-with-glibc-224-9-and-newer
      [4] https://software.intel.com/en-us/articles/inconsistent-program-behavior-on-red-hat-enterprise-linux-74-if-compiled-with-intel
      [5] https://sourceware.org/bugzilla/show_bug.cgi?id=21236
      [6] https://sourceware.org/bugzilla/show_bug.cgi?id=21265#c11
      [7] https://sourceware.org/ml/libc-alpha/2017-03/msg00344.html


 

  • 2018-01-02, HPC Software News:
    • New version 5.1 of CP2K software available.

    • Limitations:

      • not yet regression-tested (tests for intel/18.0 + intelmpi still running)
      • not available for Intel compilers 16 and older due to compiler bugs. Use Intel compiler version 17 or (recommended) 18, or the GCC compilers.
      • parallel (MPI) versions (popt, psmp) not available for Open MPI 1.10.x due to issues with the C++ bindings. Use Intel MPI (recommended in general!).

    • Loading:
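A plausible load sequence, given the limitations above, might look as follows; the exact module names are assumptions, so check `module avail cp2k` on the cluster first:

```shell
# Assumed module names - verify with 'module avail' on the cluster.
module switch intel intel/18.0    # CP2K 5.1 needs Intel 17/18 or GCC
module switch openmpi intelmpi    # no Open MPI 1.10.x builds available
module load CHEMISTRY cp2k/5.1
```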


     
  • 2017-12-28, HPC Software News:
    • New version 3.8.2 of the PETSc library installed. Unfortunately, it was not possible to compile it using the intel/16.0 compilers (the default in the HPC Cluster). We recommend using the intel/18.0 compiler.
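Since the default intel/16.0 compilers cannot build this PETSc version, switching the compiler module first is advisable. A sketch with assumed module names (verify with `module avail petsc`):

```shell
# Assumed module names - verify with 'module avail petsc'.
module switch intel intel/18.0    # recommended compiler for PETSc 3.8.2
module load LIBRARIES petsc/3.8
```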


Previous blog post: Status RWTH Compute Cluster 2017-11-22

 
