Scientists gear up to tackle 15 million gigabytes of data
November 4th, 2008 - 4:33 pm ICT by IANS
London, Nov 4 (IANS) The four huge detectors of the new Large Hadron Collider near Geneva, when fully operational, are expected to generate up to a staggering 15 million gigabytes of data every year. Andreas Hirstius, manager of CERN Openlab and the CERN School of Computing, explained how computer scientists have met the challenge of handling this unprecedented volume of data.
When CERN staff first considered in the mid-1990s how they might deal with the large volume of data that the vast circular contraption would produce when its two beams of protons collide, a single gigabyte of disk space still cost a few hundred dollars and CERN’s total external connectivity was equivalent to just one of today’s broadband connections.
It quickly became clear that computing power at CERN, even taking Moore’s Law into account, would be significantly less than that required to analyse Large Hadron Collider (LHC) data.
The solution, which emerged during the 1990s, was to turn to “high-throughput computing”, where the focus is not on shifting data as quickly as possible from A to B but on shifting as much data as possible between those two points over time.
High-throughput computing is ideal for particle physics because the data produced in the millions of proton-proton collisions are all independent of one another - and can therefore be handled independently, according to an Institute of Physics (IOP) release.
So, rather than using a massive all-in-one mainframe supercomputer to analyse the results, the data can be sent to separate computers, all connected via a network.
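Because each collision event is independent, the workload is what programmers call “embarrassingly parallel”. A minimal sketch of the idea in Python (a toy illustration with made-up event data, not CERN’s actual analysis software):

```python
from multiprocessing import Pool

def analyse_event(event):
    # Toy stand-in for real event reconstruction: each collision event
    # can be analysed on its own, with no coordination between workers.
    return sum(event["energies"])

if __name__ == "__main__":
    # Invented sample events standing in for recorded proton-proton collisions.
    events = [{"energies": [1.0, 2.5, 0.5]}, {"energies": [3.0, 1.5]}]
    # Farm the independent events out to a pool of worker processes -
    # on the Grid, the "pool" is thousands of networked machines.
    with Pool() as pool:
        results = pool.map(analyse_event, events)
    print(results)  # [4.0, 4.5]
```

The same pattern scales from a pool of local processes to a grid of networked computers precisely because no event needs to see any other event’s data.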
Enter the LHC Grid. The Grid, which was officially inaugurated last month, is a tiered structure centred on CERN (Tier-0), which is connected by superfast fibre links to 11 Tier-1 centres at places like the Rutherford Appleton Lab (RAL) in Britain and Fermilab in the US.
Every second, more than one CD’s worth of data (about 700 MB) can be sent down these fibres to each of the Tier-1 centres.
Tier-1 centres then feed down to another 250 regional Tier-2 centres, which individual researchers in turn access through university computer clusters, desktops and laptops (Tier-3).
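A quick back-of-envelope check of the figures quoted above (an illustrative calculation using only the numbers in this article, not official CERN specifications):

```python
# 15 million gigabytes per year, expressed in petabytes.
annual_gb = 15_000_000
annual_pb = annual_gb / 1_000_000
print(f"{annual_pb:.0f} PB per year")

# Average transfer rate needed to move that volume in one year, in MB/s.
SECONDS_PER_YEAR = 365 * 24 * 3600
avg_rate_mb_s = annual_gb * 1000 / SECONDS_PER_YEAR
print(f"~{avg_rate_mb_s:.0f} MB/s average")
```

This works out to 15 petabytes per year, or an average of roughly 476 MB/s - so a single fibre link sustaining 700 MB/s (one CD per second) comfortably exceeds the average rate required.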
Andreas Hirstius wrote: “The LHC challenge presented to CERN’s computer scientists was as big as the challenges faced by its engineers and physicists. The computer scientists managed to develop a computing infrastructure that can handle huge amounts of data, fulfilling all of the physicists’ requirements and in some cases even exceeding them.”
These findings will appear in November’s Physics World.