CERN and Big Data Analytics: Solving the Mysteries of The Universe With Big Data
Last month, the Chief Technology Officer at CERN presented at the Big Data Innovation Summit in London. His talk, ‘Solving the Mysteries of the Universe with Big Data’, is now available to watch on demand: http://bit.ly/19IR3A7
ABOUT THE PRESENTATION (From: http://theinnovationenterprise.com/webinars/solving-the-mysteries-of-the-universe-with-big-data)
Solving the Mysteries of The Universe With Big Data
CERN is one of the world’s largest and most respected centres for scientific research. Its business is fundamental physics: finding out what the Universe is made of and how it works. CERN operates the Large Hadron Collider (LHC), the world’s largest and most powerful particle accelerator, where the ATLAS and CMS experiments recently announced their observations of a particle consistent with the long-sought Higgs boson.
The particle’s detection has set the worldwide scientific community buzzing, but behind the success of the work undertaken at the LHC lies a story of Big Data success that is truly ground-breaking.
Data handling on a massive scale is essential to achieve such results. CERN operates the Worldwide LHC Computing Grid (WLCG), which combines the IT power of tens of thousands of computers distributed across more than 150 computer centres to meet the needs of the LHC experiments. The rapid increase in performance of the LHC accelerator is having an impact on the computing requirements, since it increases the rate, complexity and quantity of data that the LHC experiments need to store, distribute and process. Around 30 petabytes of data are stored annually.
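To put the 30-petabytes-per-year figure in perspective, a quick back-of-envelope calculation shows the sustained data rate it implies. The annual volume is taken from the text; the conversion itself is an illustrative sketch, not a CERN-published number.

```python
# Back-of-envelope: express "around 30 petabytes per year" as an
# average sustained data rate. The 30 PB figure comes from the text;
# everything else is an illustrative assumption (SI units, 365-day year).
PETABYTE = 10**15                     # bytes, SI definition
SECONDS_PER_YEAR = 365 * 24 * 3600    # ignoring leap years

annual_volume_bytes = 30 * PETABYTE
avg_rate_gb_per_s = annual_volume_bytes / SECONDS_PER_YEAR / 10**9

print(f"Average sustained rate: {avg_rate_gb_per_s:.2f} GB/s")
# → Average sustained rate: 0.95 GB/s
```

In other words, even averaged evenly across the whole year, the archive grows at roughly a gigabyte every second; peak rates during LHC runs are considerably higher.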
CERN is actively investigating a number of new approaches and technologies that will help ensure it continues to meet the extreme IT needs of the LHC over its foreseen 15-year lifetime. This presentation explains how the LHC data is managed today and the future directions being investigated with leading IT companies and research organisations around the world.