Editorial: Supercomputers for the scientific daily routine

Even though the term "supercomputer" is reminiscent of literary models such as "HAL" or "Deep Thought", high-performance computing has been part and parcel of everyday scientific life for quite a while. For decades, the field has dealt with algorithms, methods, and techniques for parallelizing code. Until about 2005, the relevance of this topic in the civilian sector was seen mainly in simulations in climate and weather modelling, fluid dynamics, and quantum chemistry and physics.

With the introduction of multi-core processors to the mass market, the average office computer has also become a parallel computer, whose potential is hardly exploited in ordinary interactive use. Instruction set extensions for ever wider operands (SSE, AVX, ...) turn each core into an array processor waiting to be programmed efficiently; modern processor chips currently offer up to 16 such computing cores.
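
What "array processor" means in practice can be illustrated by a short C sketch (a minimal illustration with simplified length and alignment handling, not taken from any particular code base): with AVX intrinsics, eight single-precision operands are added per instruction.

#include <immintrin.h>
#include <stddef.h>

/* Add two float arrays element-wise, eight values per AVX instruction. */
void add_arrays(const float *a, const float *b, float *c, size_t n)
{
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {            /* vectorized main loop: 256-bit registers */
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(c + i, _mm256_add_ps(va, vb));
    }
    for (; i < n; ++i)                      /* scalar tail for the remaining elements */
        c[i] = a[i] + b[i];
}

Written naively element by element, the same loop leaves the wide registers idle; explicit intrinsics are one way to use them, hand-tuned libraries and compiler auto-vectorization are others.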

Modern compilers are good, but by far not sufficient on their own. The expertise that has been accumulated in high-performance computing since the 1960s therefore needs to be kept available.
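
A small, hedged example of why the compiler alone is not enough: in the following C sketch, without the restrict qualifiers the compiler must assume that out and in may overlap, and will typically insert a runtime aliasing check or fall back to scalar code; the non-aliasing guarantee is knowledge only the programmer can supply.

#include <stddef.h>

/* Scale an input array into a separate output array.
 * The 'restrict' qualifiers tell the compiler that the arrays do not overlap,
 * which is what allows straightforward vectorization of the loop. */
void scale(float *restrict out, const float *restrict in, float factor, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        out[i] = factor * in[i];
}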