A grid captures the data flood from particle physics

The four most important experiments at the Large Hadron Collider at CERN in Geneva, ALICE, ATLAS, CMS and LHCb, will alone generate about 15 million gigabytes of measurement and simulation data annually. No computing centre in the world could cope with this flood of information on its own. For this reason, regional computing centres are now being set up in several countries and linked up via high-speed cables. Grid software provides for the automatic distribution of data and for the sharing of the computing load.
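To make the load-sharing idea concrete, here is a minimal, purely illustrative Python sketch: incoming datasets are assigned, one by one, to whichever regional centre currently has the most free capacity. The site names, capacities and dataset sizes are invented for illustration and do not describe the actual LHC Computing Grid middleware.

import heapq

# Hypothetical regional ("Tier 1") centres with free capacity in terabytes.
# These figures are made up for the example.
sites = {
    "GridKa":  4000,
    "Tier1-A": 3000,
    "Tier1-B": 2500,
}

def distribute(datasets_tb, capacities):
    """Greedily place each dataset on the site with the most free space."""
    # Max-heap via negated free capacity.
    heap = [(-free, name) for name, free in capacities.items()]
    heapq.heapify(heap)
    placement = {}
    for i, size in enumerate(datasets_tb):
        neg_free, name = heapq.heappop(heap)
        free = -neg_free
        if size > free:
            # The site with the most room cannot hold it, so no site can.
            raise RuntimeError(f"no site can hold dataset {i} ({size} TB)")
        placement[f"dataset-{i}"] = name
        heapq.heappush(heap, (-(free - size), name))
    return placement

# Example run with made-up dataset sizes in terabytes.
print(distribute([800, 600, 700, 500, 900], sites))

A real grid scheduler also weighs network bandwidth, replica placement and job queues, but the greedy capacity rule captures the basic principle of spreading the burden across centres.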
In Germany, Forschungszentrum Karlsruhe has taken on this task. With the establishment of the Grid Computing Centre Karlsruhe (GridKa), the German elementary particle physics groups now have at their disposal enough computing power and storage capacity to make their contribution to the data analysis of the LHC experiments. GridKa participates in the global LHC Computing Grid Project as a regional data and computing centre of the “Tier 1” category for the experiments ALICE, ATLAS, CMS and LHCb.
However, research groups from Germany can also use GridKa to analyse their experiments at other large particle accelerators at Fermilab or CERN. “Apart from this, we are developing and testing new technologies with GridKa that will simplify the analysis of experiments in high-energy physics,” explains Klaus-Peter Mickel, director of the Steinbuch Centre for Computing at Forschungszentrum Karlsruhe, which coordinates GridKa. “In the process we are gaining insights and experience that will also be useful for other branches of science with similar demands on computing time and data volumes.”

