The Grid to handle atom smasher's 15 million GB of data

November 4th, 2008 - 1:11 pm ICT by ANI  

Large Hadron Collider

Washington, Nov 4 (ANI): Computer scientists are going to use the Grid to deal with the unprecedented volume of data to be produced by the Large Hadron Collider (LHC) when it is fully up and running: a total of 15 million gigabytes of data every year.

Located in a tunnel complex under the Franco-Swiss border, the LHC is the world's largest and highest-energy particle accelerator complex, intended to collide opposing beams of protons with very high kinetic energy.

It circulated its first particle beams on 10 September 2008, but a few days later had to suspend operations due to equipment failure.

During the mid-1990s, when CERN staff first considered how they might deal with the large volume of data that the huge collider would produce when its two beams of protons collide, a single gigabyte of disk space still cost a few hundred dollars.

At that time, CERN's total external connectivity was equivalent to just one of today's broadband connections.

It quickly became clear that computing power at CERN, even taking Moore's Law into account, would be significantly less than that required to analyse LHC data.

The solution, it transpired during the 1990s, was to turn to high-throughput computing.

High-throughput computing is ideal for particle physics because the data produced in the millions of proton-proton collisions are all independent of one another - and can therefore be handled independently.

So, rather than using a massive all-in-one mainframe supercomputer to analyse the results, the data can be sent to separate computers, all connected via a network.
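The idea can be sketched in a few lines. This is an illustration of the high-throughput principle described above, not CERN's actual software: each collision "event" (here a hypothetical list of energy readings) is analysed on its own, so a pool of workers can process a batch with no coordination between tasks.

```python
# Minimal sketch of high-throughput processing of independent events.
# The events and the analyse_event function are hypothetical stand-ins.
from multiprocessing import Pool

def analyse_event(event):
    # Stand-in analysis: total the energy readings in one collision event.
    return sum(event)

if __name__ == "__main__":
    # A hypothetical batch of independent events.
    events = [[1.0, 2.0], [3.0, 4.0], [5.0]]
    # Because events share no state, they can be farmed out to any
    # number of workers - or, at Grid scale, to computers worldwide.
    with Pool(processes=2) as pool:
        results = pool.map(analyse_event, events)
    print(results)
```

The same property is what makes a grid of ordinary networked machines viable in place of a single supercomputer: no event ever needs to wait for another.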

From here sprung the LHC Grid.

The Grid, which was officially inaugurated last month, is a tiered structure centred on CERN (Tier-0), which is connected by superfast fibre links to 11 Tier-1 centres at places like the Rutherford Appleton Laboratory (RAL) in the UK and Fermilab in the US.

More than one CD's worth of data (about 700 MB) can be sent down these fibres to each of the Tier-1 centres every second.

Tier-1 centres then feed down to another 250 regional Tier-2 centres, which are in turn accessed by individual researchers through university computer clusters, desktops and laptops (Tier-3).
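The figures quoted above can be sanity-checked with simple arithmetic. As a back-of-the-envelope illustration (not an official CERN calculation), 15 million gigabytes a year works out to an average of roughly 476 MB per second, comfortably under the ~700 MB/s capacity of each Tier-1 link:

```python
# Back-of-the-envelope check of the data rates quoted in the article.
SECONDS_PER_YEAR = 365 * 24 * 3600           # ~31.5 million seconds
annual_data_gb = 15_000_000                   # 15 million gigabytes per year

# Convert GB/year to MB/s (1 GB = 1000 MB here, as in the article's units).
avg_rate_mb_s = annual_data_gb * 1000 / SECONDS_PER_YEAR

print(round(avg_rate_mb_s))  # about 476 MB/s on average
```

So even a single one of the eleven Tier-1 links could, on average, keep pace with the collider's annual output; the headroom absorbs bursts when the machine is actively colliding beams.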

According to Andreas Hirstius, manager of CERN Openlab and the CERN School of Computing, "The LHC challenge presented to CERN's computer scientists was as big as the challenges to its engineers and physicists."

"The computer scientists managed to develop a computing infrastructure that can handle huge amounts of data, thereby fulfilling all of the physicists' requirements and in some cases even going beyond them," he added. (ANI)
