Monday, November 4, 2013

Beyond the God particle

By Clive Cookson
Computers, born out of science, are transforming every aspect of the scientific process, as this year’s Nobel physics and chemistry prizes show. Fifty years ago the physics laureates Peter Higgs and François Englert applied their brains, through pencil and paper, blackboard and chalk, to come up with proposals for a new fundamental energy field and subatomic particle, which would explain why matter has mass. Last year one of the most computer-intensive scientific experiments ever undertaken confirmed their theory by making the Higgs boson – the so-called “God particle” – in an $8bn atom smasher, the Large Hadron Collider at Cern outside Geneva.

The three chemistry laureates, Martin Karplus, Michael Levitt and Arieh Warshel, won their prize for “taking the experiment to cyberspace”, in the words of the Royal Swedish Academy of Sciences. Since the 1970s they have pioneered the modelling of increasingly complex chemical reactions and molecules in computers.
The way computing is changing almost every field of research is not obvious to non-scientists who learn about newsworthy results from the media but see little of the process used to obtain them.
It may not even be clear to someone visiting, say, an organic chemistry laboratory, which will probably look – and smell – very like its counterpart 10 or 20 years ago, with chemicals and glassware arrayed on shelves above the lab benches and reactions brewing in fume cupboards.
But the preparations behind the scenes for the experiments are changing, as chemists go beyond drawing up their experiments on paper and begin to model molecules and reactions in silico before heading for the lab bench.
“The field of computational modelling has already revolutionised how we design new medicines, by allowing us to predict accurately the behaviour of proteins,” says Professor Dominic Tildesley, president-elect of the Royal Society of Chemistry and director of the European Centre for Atomic and Molecular Computation at the École Polytechnique Fédérale de Lausanne. “The time is coming when no one will carry out a chemistry experiment at the bench without doing some kind of modelling before they start.”
There will be huge savings of time, money and materials, he adds, because chemists will only try to make compounds in the lab that are shown by computer simulation to have a high chance of success.
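To make the idea concrete, here is a minimal, purely illustrative sketch of such an in-silico screen. It assumes the open-source RDKit toolkit and a few arbitrary example molecules (none drawn from the article), and simply ranks candidates by crude drug-likeness descriptors before anyone approaches a fume cupboard – a far cry from the laureates’ multiscale simulations, but a hint of the workflow.

    # Toy in-silico screen: compute simple drug-likeness descriptors for a few
    # hypothetical candidates and decide which merit time at the lab bench.
    # Requires the open-source RDKit cheminformatics toolkit.
    from rdkit import Chem
    from rdkit.Chem import Descriptors, QED

    candidates = {
        "aspirin-like": "CC(=O)Oc1ccccc1C(=O)O",
        "caffeine-like": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
        "long alkane": "CCCCCCCCCCCCCCCC",
    }

    for name, smiles in candidates.items():
        mol = Chem.MolFromSmiles(smiles)      # parse the structure from SMILES
        if mol is None:
            continue                          # skip anything that fails to parse
        mw = Descriptors.MolWt(mol)           # molecular weight, g/mol
        logp = Descriptors.MolLogP(mol)       # crude lipophilicity estimate
        score = QED.qed(mol)                  # 0-1 drug-likeness score
        print(f"{name:13s}  MW={mw:6.1f}  logP={logp:5.2f}  QED={score:4.2f}")

Only the better-scoring structures would then be carried forward to synthesis, which is where the savings Prof Tildesley describes come from.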
In astronomy and cosmology, new telescopes on Earth and in space attract most of the publicity but Lord (Martin) Rees of Cambridge university, Britain’s astronomer-royal, mentions computer simulation as the factor that has made most difference over the past decade. “It has been a huge boost for the field,” he says. “Now you can compute what happens for example when black holes collide – the magnetic fields, jets and so on – in a realistic way.”
Lord Rees admits to being “slightly a dinosaur” himself when it comes to carrying out simulations. “On the computational side, I’m a cheerleader rather than an adept personal performer,” he says – before demonstrating on his laptop a spectacular flight through the universe created by the Virgo Consortium for Cosmological Supercomputer Simulations, an international collaboration based at Durham University in the UK and the Max Planck Institute for Astrophysics in Garching, Germany.
Cern has come to exemplify the application of information technology to science since the world wide web was invented there in 1989 as a way for researchers taking part in its experiments to share data around the globe.
Now its Worldwide LHC Computing Grid gives 8,000 physicists access to the one petabyte of data – equivalent to about 210,000 DVDs – generated every day; at peak times the transfer rate exceeds 10 gigabytes per second.
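A quick back-of-the-envelope check of those figures, assuming a 4.7GB single-layer DVD:

    # Sanity-check the grid figures quoted above (assumes a 4.7 GB single-layer DVD).
    PETABYTE = 1e15          # bytes
    DVD = 4.7e9              # bytes per single-layer DVD
    PEAK_RATE = 10e9         # bytes per second (10 gigabytes per second)

    print(f"1 PB/day ~ {PETABYTE / DVD:,.0f} DVDs/day")   # ~212,800, roughly the 210,000 quoted
    print(f"1 PB at peak rate ~ {PETABYTE / PEAK_RATE / 3600:.1f} hours of transfer")  # ~27.8 hours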
Another technology has also made a huge difference to the way Cern researchers work: videoconferencing. “When I started working here [at Cern] in the 1990s we didn’t have videoconferencing and people had to come to Geneva for meetings,” says Tara Shears, a particle physics professor at Liverpool university in the UK. “Now all our working group meetings are videoconferenced.”
In addition to a huge band of experimentalists such as Prof Shears, Cern and other physics labs also employ theorists, scientists of a distinctly different class, some of whom are less dependent on computers.
“Computers still coexist with the old methods,” says Cern theorist Gian Giudice. “I still use pen and paper and blackboard and chalk a lot. Standing in front of a blackboard and writing down the formula really helps me think.”
Even when he is taking part in a Skype call, Prof Giudice will turn the webcam on his computer round to point at the blackboard.
Every field of science is wrestling with the problem and opportunity known in short as “big data”. Some disciplines, such as particle physics and meteorology, have used supercomputers for decades to extract insights from large-scale observational and experimental data.
Others are relatively new to supercomputing. Ecologists, for example, focused until recently on local-scale research into the way plants and animals interact with the environment. Now computing is enabling them to be far more ambitious.
In the US the National Science Foundation is spending more than $400m to set up a National Ecological Observatory Network that will gather and synthesise data from 106 sites across the country, with the aim of tracking changes in the American ecosystem through space and time.
Scientists at Microsoft Research Cambridge, working with the UN Environment Programme, have developed a global ecosystem model. They say it is the first general purpose model of the biosphere that researchers can use to simulate any system of living organisms on Earth, big or small – an ecological counterpart to the computer models used to predict climate change.
“As recently as five years ago people were saying it would be impossible to build a global ecosystem model because there were too many complexities and uncertainties,” says Stephen Emmott, Microsoft’s head of computational science research. “But now we have done it.”
In the life sciences, one of the most spectacular applications of information technology will be the EU’s 10-year €1.2bn Human Brain Project, launched in Lausanne this week as the world’s largest neuroscience research programme.
Every aspect of the project depends on computing, from neuroinformatics – extracting the maximum information from the thousands of scientific papers published every year about the brain – to eventually simulating a working brain in a machine.
Given the importance of supercomputing for research, the world of science took note when a Chinese machine led the most recent TOP500 list of the fastest computers. The Tianhe-2, developed at the National University of Defence Technology, will be deployed at the National Supercomputer Centre in Guangzhou by the end of this year – two years ahead of schedule. Working at 33.86 petaflops (33,860 trillion calculations per second), Tianhe-2 will be available to Chinese researchers across a wide range of disciplines.
Worryingly for Europe, US supercomputers take second and third places, with a Japanese machine fourth. The top-rated European entry at number seven is Juqueen at the Forschungszentrum Juelich in Germany. The world’s most beautiful facility, Barcelona Supercomputing Centre’s MareNostrum inside the converted Torre Girona chapel, is number 30. Taking the list as a whole, the US is the clear leader with more than half (252) of the world’s 500 fastest supercomputers. Asia has 119 (including 66 in China) and Europe 112 systems.
But sheer speed and power are not everything in scientific computing, says Prof Tildesley. Clever programming and networking matter too.
“In the future academic researchers can expect to access different tiers of computing power, depending on their need,” he says. “At the top there will be exaflop machines [performing a billion billion operations per second] in a few international centres, then progressively larger tiers of less powerful computers at national centres and universities – down to personal computers on the lab bench.”
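To give a rough feel for those tiers, the sketch below compares how long a single, arbitrary workload of 10^21 floating-point operations would take at each rung; the flop rates other than Tianhe-2’s are assumed round numbers, and perfect efficiency is assumed throughout.

    # Illustrative comparison of the computing tiers described above.
    # The 1e21-operation workload is arbitrary; real jobs rarely reach peak efficiency.
    WORKLOAD = 1e21  # floating-point operations (hypothetical)

    tiers = {
        "exaflop machine (1e18 flop/s)": 1e18,
        "Tianhe-2-class (33.86 petaflop/s)": 33.86e15,
        "national centre (assumed 1 petaflop/s)": 1e15,
        "lab workstation (assumed 1 teraflop/s)": 1e12,
    }

    for name, rate in tiers.items():
        hours = WORKLOAD / rate / 3600
        print(f"{name:40s} {hours:12,.1f} hours")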
The software will increasingly incorporate artificial intelligence, spotting patterns in data. Yet Prof Giudice insists: “The old ways of working will continue for a long time to come. Discoveries will be made by people coexisting with computers, not by computers alone, and there will always be room for human inspiration.”
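As a toy illustration of that kind of pattern-spotting – not the method of any project mentioned here – a few lines of Python with scikit-learn can recover two hidden groups in synthetic data:

    # Minimal pattern-spotting sketch: k-means clustering on synthetic measurements.
    # Purely illustrative; the data and parameters are invented for this example.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    data = np.vstack([
        rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2)),   # first hidden group
        rng.normal(loc=[3.0, 3.0], scale=0.5, size=(200, 2)),   # second hidden group
    ])

    model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
    print("recovered cluster centres:\n", model.cluster_centers_)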
_____
From the FINANCIAL TIMES, 12/10/2013
Photograph: Barcelona Supercomputing Centre’s MareNostrum, one of Europe’s most powerful supercomputers, is housed inside the converted Torre Girona chapel
