ETH Zürich

INSTITUTE OF ASTRONOMY
HOENGGERBERG CAMPUS





Extragalactic Astrophysics &
Observational Cosmology Group

Simulating the Universe with the Brutus Cluster

The ETH Brutus Cluster

The Cluster

Brutus is a high-performance Linux cluster with 2200 processor cores in 756 compute nodes and a peak performance of about 11.3 TFlops. For massively parallel computing we use 360 nodes based on AMD Opteron 250 processors, connected via a high-speed Quadrics QsNetII network. Brutus supersedes the former ETH Beowulf cluster Gonzales.

Find out more about Brutus on the ETH IT Service website.

The virtues of Brutus

The theoretical/numerical work within the Extragalactic Astrophysics Group (led by Prof. C.M. Carollo) and the Observational Cosmology Group (led by Prof. S.J. Lilly) is dedicated to the growth and evolution of structures in our universe, from the largest scales encompassing the whole universe down to the formation of individual planets. Topics of research are in particular (i) the large-scale clustering of dark matter, (ii) the co-evolution of dark matter and baryons on the scale of galaxy clusters and groups, (iii) the study of the intracluster and interstellar medium, (iv) the evolution of supermassive black holes in the centers of galaxies and (v) the formation of individual stars and planets.

High resolution, which translates into a large number of simulation particles, is mandatory in most astrophysical applications, where the relevant processes occur on a wide range of spatial and temporal scales. This calls for a machine with a large number of processors like Brutus, so that the workload can be distributed across the computational volume. We employ the N-body/SPH codes GASOLINE and Gadget-2, which allow efficient parallel runs on hundreds of processors.

The ETH Brutus cluster is an integral component of the research of many members of the group.

Selected Projects within the Extragalactic Astrophysics and Observational Cosmology Group

Large Scale Simulations


So far we have run three high-resolution simulations comprising 512³ collisionless dark matter particles in cosmic volumes of 45, 90 and 180 Mpc/h on a side with periodic boundaries. We have also carried out a very-high-resolution simulation consisting of 1024³ collisionless dark matter particles in a cosmic volume of 90 Mpc/h. This allows us to study a volume 8 times larger at the same mass resolution as the 512³, 45 Mpc/h run. These runs reach a significantly higher mass resolution than the Millennium simulation (which, however, covers a much larger volume).
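The "same mass resolution" statement follows from a simple scaling: the particle mass in a uniform-resolution N-body run is proportional to (box length)³ divided by the particle number. A quick check (arbitrary mass units; the true particle mass would also carry the mean matter density):

```python
# Particle mass in a uniform-resolution N-body run scales as
# (box length)^3 / (number of particles); units here are arbitrary.
m_small = 45.0**3 / 512**3    # relative particle mass, 512^3 / 45 Mpc/h run
m_large = 90.0**3 / 1024**3   # relative particle mass, 1024^3 / 90 Mpc/h run

volume_ratio = (90.0 / 45.0)**3  # the larger box covers 8x the volume
```

Doubling the box length and doubling the particle number per dimension both scale by a factor of 8, so the particle mass is unchanged.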

A dark matter simulation cube of 180 Mpc/h on a side. Clusters are identified by the eigenvalues of the tidal field tensor and indicated in red.

All simulations are performed with the code Gadget-2. Dark matter simulations solve Poisson's equation for all particles and advance particle positions and momenta with a time-stepping algorithm. Gadget-2 uses a Barnes-Hut tree approach for the short-range forces, while the long-range forces are computed with an FFT. This approach leads to force computations that scale as O(N log N) for N particles and is thus close to optimal. Gadget-2 is MPI-parallel. The 1024³ run required roughly 3 months on 64 CPUs of Brutus' predecessor Gonzales.
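The time-stepping scheme used by Gadget-2 is a kick-drift-kick leapfrog. A minimal serial sketch in Python, with a direct-summation O(N²) force loop standing in for the tree (code units with G = 1 and the Plummer softening length are our own assumptions):

```python
import numpy as np

G = 1.0  # gravitational constant in code units (assumption)

def accelerations(pos, mass, eps=0.05):
    """Direct-summation gravitational accelerations with Plummer softening.
    A tree code replaces this O(N^2) loop with an O(N log N) approximation."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                    # separation vectors to all others
        r2 = (d**2).sum(axis=1) + eps**2    # softened squared distances
        r2[i] = np.inf                      # exclude self-interaction
        acc[i] = G * (mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
    return acc

def leapfrog_kdk(pos, vel, mass, dt, nsteps):
    """Kick-drift-kick leapfrog: a second-order symplectic integrator."""
    acc = accelerations(pos, mass)
    for _ in range(nsteps):
        vel += 0.5 * dt * acc    # half kick
        pos += dt * vel          # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc    # half kick
    return pos, vel
```

Being symplectic, the leapfrog keeps orbits stable over many periods, which is why it is the standard choice for collisionless dynamics.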

Once a simulation is finished, post-processing of such huge amounts of data is rather involved. Gravitationally bound structures (dark matter haloes) are identified in the 512³ and 1024³ simulations. To this end, we have developed a parallel halo finder which is run on each snapshot of the simulation.
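A widely used approach to halo finding (shown here for illustration; not necessarily the exact algorithm of our in-house finder) is friends-of-friends: any two particles closer than a chosen linking length are joined into the same group. A serial toy version using union-find:

```python
import numpy as np

def friends_of_friends(pos, linking_length):
    """Toy friends-of-friends group finder.  Particles closer than the
    linking length end up with the same group label.  O(N^2) pair search;
    a production finder uses a tree or grid and runs in parallel."""
    n = len(pos)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        d2 = ((pos - pos[i])**2).sum(axis=1)
        for j in np.nonzero(d2 < linking_length**2)[0]:
            ri, rj = find(i), find(int(j))
            if ri != rj:
                parent[ri] = rj  # merge the two groups
    return np.array([find(i) for i in range(n)])
```

In cosmological practice the linking length is typically a fixed fraction (about 0.2) of the mean interparticle separation.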


Processes such as star formation, tidal interactions between galaxies, strangulation of gas and morphological transformations by dynamical instabilities are expected to be most efficient in galaxy groups (agglomerations of several tens of galaxies), which comprise about 50% of all galaxies today. Galaxy groups are thus expected to be the typical environment in which galaxies acquire their current appearance, and they have recently begun to attract considerable attention. Currently we are preparing a major computational campaign to simulate individual galaxy groups in fully self-consistent cosmological simulations. To reach the resolution of individual galaxies, we employ a renormalization technique which concentrates the computational resources predominantly on a given sub-volume while maintaining the tidal influence of the large-scale structure surrounding such a region. Our simulations employ the state-of-the-art N-body/SPH solver GASOLINE and include gas dynamics, gas cooling, star formation and feedback processes from supernovae and stellar winds. The simulations are computationally very demanding and run for several months on 64 processors. The high speed and low latency of the Brutus network interconnect are essential to reduce the communication overhead and to allow our hydrodynamic simulations to scale to a high number of processors.

A group-size halo at different redshifts

The formation of a group-sized dark matter halo with cosmic time. The different snapshots show the projected matter density when the universe was merely a billion years old (top-left) up to the present day (bottom-right).

Black Holes and Galaxy Mergers


Coalescence of supermassive black holes in merging galaxies.
Massive galaxies contain supermassive black holes at their centers which weigh up to billions of solar masses. The coalescence of supermassive black holes produces the strongest bursts of gravitational waves, which should be detectable with the Laser Interferometer Space Antenna (LISA) in the next decade or so, providing an important test of General Relativity. The number of gravitational-wave bursts that will be observed depends on the coalescence rate of supermassive black holes during the merger and assembly of galaxies in a hierarchical Universe. To date, such coalescence rates are poorly known. Using the SPH code GASOLINE and a new technique called particle splitting, which increases the resolution in a selected region of a simulation, we have studied for the first time the orbital decay of black holes from tens of kiloparsecs down to parsec scales while galaxies merge. We have found that the black holes form a binary in less than a million years after the galaxies merge, being dragged to the center of a circumnuclear gaseous disk that arises in the merger remnant as a result of a gas inflow. The rapid orbital decay is due to friction caused by the gas in the disk. Such massive gaseous disks have recently been observed at the centers of merger remnants. Future simulations will explore the later phase of decay, entering the relativistic regime with the aid of a post-Newtonian approximation. If the decay continues at the measured rate even below a parsec, the two black holes should merge in much less than a billion years, as opposed to the billion years or more previously estimated in less realistic calculations. This implies a much higher rate of gravitational-wave burst events.
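As a toy illustration (not the GASOLINE calculation itself), gas drag can be mimicked by a simple friction force -v/τ acting on a body orbiting a central point mass; the orbit then spirals inward on a timescale set by τ. All units, the potential, and the drag timescale below are assumptions of this sketch:

```python
import math

def inspiral(r0=1.0, v0=1.0, tau=30.0, dt=1e-3, t_end=10.0):
    """Toy orbital decay: a body on a circular orbit around a point mass
    (G*M = 1, code units) with a drag force -v/tau standing in for gas
    dynamical friction.  Semi-implicit Euler integration; returns the
    final orbital radius."""
    x, y = r0, 0.0
    vx, vy = 0.0, v0   # v0 = 1 is the circular speed at r0 = 1
    t = 0.0
    while t < t_end:
        r = math.hypot(x, y)
        ax = -x / r**3 - vx / tau   # gravity + drag
        ay = -y / r**3 - vy / tau
        vx += ax * dt; vy += ay * dt  # kick first (semi-implicit, stable)
        x += vx * dt;  y += vy * dt   # then drift
        t += dt
    return math.hypot(x, y)
```

For a quasi-circular orbit the drag removes orbital energy at a rate that shrinks the radius roughly exponentially, r(t) ≈ r0 exp(-2t/τ), which is why even a modest friction term drives a rapid decay.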

Merging black holes: sequence showing the merger of two gas-rich galaxies with two supermassive black holes at their centers. The last stages are shown by zooming in on the central 5 kiloparsecs (upper right panel) and then again on the central 100 pc just after the galaxy merger, when the two black holes have formed a bound binary system with a separation of about 2 pc in the central rotating gaseous disk (grey-scale inset; the disk is seen face-on and edge-on). The figure is taken from the paper that appeared recently in Science Magazine (2007, 316, 1874) and was featured on ETH Life.


Binary galaxy mergers.
Gravitationally bound structures in our universe are expected to form out of smaller lumps of matter in a hierarchical fashion. It is therefore plausible that mergers among galaxies are an important evolutionary process in shaping the galaxy population. Mergers are invoked to explain, e.g., the build-up of the elliptical galaxy population, morphological transformations of galaxies, and the excessive star formation rates of starbursting galaxies at high redshifts. To understand the role of mergers and their impact on galaxy morphology and kinematics, we simulate binary mergers between model galaxies which are constructed to match observed properties. This method allows us to resolve individual galaxies with several million particles, thus achieving a much higher resolution than past cosmological simulations.

(Left) t = 0 Gyr: two elliptical galaxies on a parabolic encounter.
(Middle) t = 4.5 Gyr: a broad fan of stars is visible for a short time.
(Right) t = 6.4 Gyr: shells appear due to phase wrapping of infalling material.

A movie of this merger is available here.
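A parabolic encounter is a zero-energy two-body orbit, so the relative speed required at a given separation follows directly from setting the orbital energy to zero (G = 1 code units assumed here):

```python
import math

G = 1.0  # gravitational constant in code units (assumption)

def parabolic_relative_speed(m1, m2, d):
    """Relative speed of two point masses on a zero-energy (parabolic)
    orbit at separation d:
        (1/2) mu v^2 - G m1 m2 / d = 0   with  mu = m1 m2 / (m1 + m2)
    which gives  v = sqrt(2 G (m1 + m2) / d)."""
    return math.sqrt(2.0 * G * (m1 + m2) / d)
```

This is how merger initial conditions of this kind are typically set up: choose the masses, starting separation and pericenter distance, then fix the relative velocity so that the total orbital energy vanishes.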

The Interstellar and the Intergalactic Medium


We have been developing new numerical methods, including a new implementation of Adaptive Mesh Refinement (AMR), for studying astrophysical and cosmological systems. In addition to higher-order Godunov methods for hydrodynamics, particle-mesh methods for collisionless matter, and a multigrid multilevel-relaxation elliptic solver, the code incorporates newly developed higher-order Godunov methods for stiff sources (e.g. efficient radiative losses or stiffly coupled multi-fluid models) and cosmic-ray hydrodynamics. The fast switches and large bandwidth characterizing Brutus are very important for running efficient, communication-intensive elliptic solvers, which are necessary for self-gravitating systems. We apply the code to the study of the intracluster medium in galaxy clusters and the interstellar medium of galaxies.

Cosmological AMR calculation of a cluster of galaxies during the early phases of its formation. The image shows the gas distribution (grey-scale) with the box layout superimposed (cyan lines).
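At the heart of a relaxation-based elliptic solver is a smoothing sweep like the Gauss-Seidel update below for the discrete Poisson equation (a single-grid sketch only; a real multigrid code accelerates convergence by relaxing on a hierarchy of coarser grids):

```python
import numpy as np

def gauss_seidel_poisson(rho, h, n_sweeps=2000):
    """Solve the 2D discrete Poisson equation  laplacian(phi) = rho  on a
    square grid with spacing h and phi = 0 on the boundary, by Gauss-Seidel
    relaxation: each interior cell is repeatedly replaced by the value that
    satisfies the 5-point stencil given its current neighbours."""
    phi = np.zeros_like(rho)
    for _ in range(n_sweeps):
        for i in range(1, rho.shape[0] - 1):
            for j in range(1, rho.shape[1] - 1):
                phi[i, j] = 0.25 * (phi[i + 1, j] + phi[i - 1, j]
                                    + phi[i, j + 1] + phi[i, j - 1]
                                    - h * h * rho[i, j])
    return phi
```

Relaxation damps short-wavelength errors quickly but long-wavelength errors slowly, which is exactly what the coarser grids of a multigrid hierarchy (and fast, low-latency communication between nodes) are there to fix.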

Star Formation and Planet Formation


We use Brutus to simulate molecular clouds, the only known sites of star formation, in order to study the early evolution of prestellar and protostellar systems. As a dense prestellar core of gas and dust begins to collapse under its own gravitational attraction, the core material differentiates into a protostar with a circumstellar disc and an envelope. By simulating the spontaneous birth of a star within a molecular cloud we can study the effect of the cloud environment on the early evolution of the star, disc, and envelope. At slightly later times, when the prestellar gas disc has mostly accreted onto the protostar, we can study the role of the remains of the gas disc, the protoplanetary disc, in the formation of planets.

Cloud density maps at t = 0.625 t_ff (left) and t = 1.15 t_ff (right).
Two prestellar cores are clearly visible in the second image.
A movie of the cloud collapse is available here.
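The unit of time in these figures, t_ff, is the free-fall time, which for a uniform sphere depends only on its density. A quick evaluation in cgs units (the core density of n(H2) ≈ 10^5 cm^-3 below is an illustrative value, not necessarily the one used in our runs):

```python
import math

G = 6.674e-8  # gravitational constant, cgs units

def free_fall_time(rho):
    """Free-fall time of a pressureless uniform sphere of density rho
    (g/cm^3):  t_ff = sqrt(3 pi / (32 G rho))."""
    return math.sqrt(3.0 * math.pi / (32.0 * G * rho))

# illustrative dense-core density: n(H2) ~ 1e5 cm^-3, mean mass ~ 2 m_H
rho = 1e5 * 2.0 * 1.66e-24          # g/cm^3
t_ff_years = free_fall_time(rho) / 3.156e7
```

For such a core this gives a free-fall time of order 10^5 years, which is why prestellar collapse simulations are conveniently clocked in units of t_ff.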


Here we demonstrate the effect of the equation of state on the migration of a Jovian planet.
The equation of state can have a strong influence on the migration rate of the planet. So far, the literature has concentrated on vertically isothermal disks, making the implicit assumption that any heat generated at a given place is immediately radiated away. At the other extreme, we can use an adiabatic equation of state and assume that heat is not evacuated at all. Recent simulations, as well as our preliminary results obtained with the SPH code GASOLINE, show that migration under adiabatic conditions is much slower than in the isothermal case. The next step will be to study migration under a more realistic radiative transfer model, using the flux-limited diffusion approximation. The figures below show the circumplanetary material in the one-million-particle isothermal simulation. The gas settles into a Keplerian disk. In the adiabatic runs, the circumplanetary material stays in a spherical envelope.

Disk seen face-on (left) and edge-on (right).
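The two limiting equations of state bracket the thermodynamic response of the gas: under a tenfold compression, isothermal gas raises its pressure tenfold, while adiabatic gas raises it by 10^γ, roughly 25 for γ = 1.4 (the value of γ below is an assumption of this sketch). In minimal form:

```python
def pressure_isothermal(rho, cs):
    """Isothermal EOS, P = cs^2 * rho: heat is radiated away immediately."""
    return cs**2 * rho

def pressure_adiabatic(rho, K, gamma=1.4):
    """Adiabatic EOS, P = K * rho^gamma: no heat is evacuated at all."""
    return K * rho**gamma

# pressure amplification for a tenfold compression (cs = K = 1 for clarity)
iso_ratio = pressure_isothermal(10.0, 1.0) / pressure_isothermal(1.0, 1.0)
adi_ratio = pressure_adiabatic(10.0, 1.0) / pressure_adiabatic(1.0, 1.0)
```

The stiffer adiabatic response resists the compression of gas near the planet, which is one way to see why the envelope stays pressure-supported and spherical instead of settling into a thin disk.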

Last update 16/11/07