Wednesday, 29 September 2010

Oxford Martin School

Simulating the Universe

Pete Wilton | 27 Sep 10

How do you accurately simulate the Universe on a computer?

A new programme at the Oxford Martin School, led by Pedro Ferreira of Oxford University's Department of Physics, aims to see how this and other seemingly impossible tasks can be tackled by developing new ways of handling data.




Pedro will be explaining this new approach as part of a showcase at the Royal Society this evening, but before that I caught up with him to ask about data, supercomputers and the biggest problems in science:



OxSciBlog: Why do we need new approaches to handling data?

Pedro Ferreira: We are at the threshold of a new era in cosmology. Over the past few years we have developed very powerful instruments - ground-based, balloon-borne and satellite telescopes - that have a phenomenal capacity for collecting data. New surveys of galaxies and maps of the cosmic radiation at different frequencies all probe the Universe on a wide range of scales.



We want to learn the fundamental properties of the Universe, such as what it's made of and how it is evolving. The new data sets are so massive that this can't be done using conventional methods.



We need to be clever, innovative, pushing the boundaries of data management and statistical analysis. In particular we need to come up with radically different methods. Otherwise we won’t be able to extract the knowledge we want from the data we have in hand. And it is going to get much, much worse.
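To make concrete why brute-force analysis stops working at this scale, here is a small illustrative sketch (mine, not the programme's actual pipeline): counting close pairs of galaxies, the raw ingredient of a correlation-function measurement, naively requires comparing every galaxy with every other, which scales as N². A tree-based search avoids most of those comparisons. The "catalogue" below is just a random toy sample.

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy 'catalogue': random 3D positions in a unit box.  A real survey
# catalogue would contain billions of galaxies, for which the naive
# O(N^2) approach below would mean ~10^18 pair comparisons.
rng = np.random.default_rng(42)
positions = rng.random((20_000, 3))
r_max = 0.02  # count pairs closer than this separation

def pairs_brute_force(pos, r):
    """Compare every point with every other: O(N^2)."""
    n_pairs = 0
    for i in range(len(pos)):
        d = np.linalg.norm(pos[i + 1:] - pos[i], axis=1)
        n_pairs += int(np.count_nonzero(d <= r))
    return n_pairs

def pairs_tree(pos, r):
    """A k-d tree prunes distant regions: roughly O(N log N)."""
    return len(cKDTree(pos).query_pairs(r))

print(pairs_brute_force(positions, r_max), pairs_tree(positions, r_max))
```

Even the tree-based count eventually has to be distributed across many machines once a catalogue reaches billions of objects, which is the kind of scaling problem the new methods are meant to address.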



OSB: What lessons have been learnt from astrophysics & cosmology about processing/searching large amounts of data?

PF: We have learnt a lot. For example, we have had to come up with very powerful ways of simulating the data sets.



Think about it: we want to simulate the Universe on a computer - something immense - but with enough detail that we can recognize the fine structure we see around us. That has proven a challenge, but as a result members of our team have taken part in developing codes that are among the most powerful in the world.



By this I mean that they can run on the largest supercomputers in the world, making full use of the gigantic computing capacity of these machines. In fact, because of this, these codes can be used to benchmark the next generation of supercomputers.
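As a rough illustration of what sits at the core of such simulation codes (a toy sketch under strong simplifying assumptions, not the team's actual code): every particle feels the gravity of every other and is stepped forward in time. Production codes replace the direct force summation shown here with tree or particle-mesh solvers, work in expanding cosmological coordinates, and parallelise across thousands of processors.

```python
import numpy as np

G = 1.0           # gravitational constant in code units
SOFTENING = 1e-2  # avoids singular forces at small separations

def accelerations(pos, mass):
    """Direct-summation gravity: O(N^2) force evaluations per step."""
    diff = pos[None, :, :] - pos[:, None, :]   # diff[i, j] = pos[j] - pos[i]
    inv_d3 = (np.sum(diff**2, axis=-1) + SOFTENING**2) ** -1.5
    np.fill_diagonal(inv_d3, 0.0)              # no self-force
    return G * np.einsum('ij,ijk,j->ik', inv_d3, diff, mass)

def leapfrog(pos, vel, mass, dt, n_steps):
    """Kick-drift-kick time integration, the standard N-body scheme."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc        # half kick
        pos += dt * vel              # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc        # half kick
    return pos, vel

# A small box of equal-mass particles, started from rest.
rng = np.random.default_rng(0)
n = 200
pos, vel = rng.random((n, 3)), np.zeros((n, 3))
mass = np.full(n, 1.0 / n)
pos, vel = leapfrog(pos, vel, mass, dt=1e-3, n_steps=100)
```

The reason a toy version like this cannot simply be scaled up is exactly Ferreira's point: the force calculation and the memory footprint both explode with particle number, so real codes need far cleverer algorithms and the largest supercomputers available.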



OSB: What sort of new techniques will you look to develop?

PF: We have three strands of research that we want to pursue. First of all, we want to develop methods to deal with the upcoming surveys of galaxies, such as those that will come out of the Square Kilometre Array (SKA) or the Large Synoptic Survey Telescope (LSST). We need to be able to deal with data sets containing billions, not millions, of galaxies.



Second, we want to enlist the public to take part in the analysis of these data sets. Our Citizen Science project has been very successful in harnessing the creativity of hundreds of thousands of individuals online. We want to explore the potential of this new way of doing science.



Finally, we really want to push our ability to simulate the Universe using the most advanced computer codes in the field, developed by our team. These simulations have to capture the largest scales - the overall properties of the visible Universe - while at the same time picking out the fine details: how galaxies interact, merge and evolve to build up the complex cosmic ecology we observe.
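A minimal sketch of one idea behind the first of these strands - coping with catalogues of billions of galaxies - assuming a hypothetical CSV catalogue and column name rather than an actual SKA or LSST data product: instead of loading the whole catalogue into memory, it is streamed through in chunks and reduced to running summary statistics, so memory use stays bounded however large the survey grows.

```python
import numpy as np
import pandas as pd

# Hypothetical catalogue file with one row per galaxy and a 'redshift'
# column; real survey catalogues will use binary or database formats,
# but the streaming idea is the same.
CATALOGUE = "galaxy_catalogue.csv"

count, redshift_sum = 0, 0.0
edges = np.linspace(0.0, 5.0, 51)   # redshift histogram bins, 0 <= z < 5
hist = np.zeros(len(edges) - 1)

# Read a million rows at a time so memory stays bounded.
for chunk in pd.read_csv(CATALOGUE, chunksize=1_000_000):
    z = chunk["redshift"].to_numpy()
    count += z.size
    redshift_sum += z.sum()
    hist += np.histogram(z, bins=edges)[0]

print("galaxies:", count, "mean redshift:", redshift_sum / count)
```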



OSB: How might this help researchers in fields such as oceanography, climate science and medicine?

PF: The problems we are facing in cosmology are present in many other fields. In climate science we need to be able to simulate incredibly complex systems on a wide range of scales. In oceanography, there are experiments that will try to map out the oceans in real time at tens of thousands of different points.



Imaging can and will play a crucial role in medicine and is amenable to the use of novel statistical methods. Citizen Science, as developed by our group, is already being deployed in a range of fields, from the classification of weather logs to the reconstruction of classical papyri.
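The Citizen Science strand mentioned above ultimately comes down to combining many independent volunteer judgements into a consensus answer. Here is a hedged sketch of that aggregation step, with made-up galaxy labels and a made-up data layout rather than any project's real schema:

```python
from collections import Counter, defaultdict

# Hypothetical volunteer classifications: (galaxy_id, label) pairs, as
# might be exported from an online citizen-science platform.
classifications = [
    ("gal_001", "spiral"), ("gal_001", "spiral"), ("gal_001", "elliptical"),
    ("gal_002", "elliptical"), ("gal_002", "elliptical"),
    ("gal_003", "merger"), ("gal_003", "spiral"), ("gal_003", "merger"),
]

def consensus(votes):
    """Majority vote per galaxy, plus the fraction of volunteers who agree."""
    by_galaxy = defaultdict(list)
    for galaxy_id, label in votes:
        by_galaxy[galaxy_id].append(label)
    results = {}
    for galaxy_id, labels in by_galaxy.items():
        label, n = Counter(labels).most_common(1)[0]
        results[galaxy_id] = (label, n / len(labels))
    return results

for galaxy_id, (label, fraction) in consensus(classifications).items():
    print(galaxy_id, label, f"agreement: {fraction:.2f}")
```

Real projects typically weight volunteers by reliability and send disputed objects back for further classification, but the principle - many noisy judgements combined into a robust answer - is the same.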


Science cuts: the dangers

Jonathan Wood | 23 Sep 10



Scientists are braced for grim news in the forthcoming Comprehensive Spending Review, in which funding for science and research is expected to be cut significantly. This is despite arguments - not least in a Royal Society report from March - that science and innovation should be at the heart of future economic growth.



At the beginning of the month, Business Secretary Vince Cable said universities would have to do 'more with less', and angered researchers by suggesting that up to 45% of grants went to research that was not of excellent standard (with the implication that mediocre science could reasonably be cut). Now the heads of leading research universities are getting involved.



Lord Krebs, head of the House of Lords’ science and technology select committee, has warned today that cuts to the government’s science research budget will affect the ability of UK universities to attract and retain the best researchers from around the world.



He spoke this morning on Radio 4's Today programme after sending the science minister, David Willetts, a letter setting out the views of the heads of six leading universities, including the Vice-Chancellor of Oxford, Professor Andrew Hamilton.



The Times and BBC News Online have both covered the story and have published the letter in full.



David Willetts gave evidence to the Lords’ science and technology committee in July, when he invited the committee to provide evidence that the UK is becoming a less attractive place for science research.



Lord Krebs then wrote to the vice-chancellors of six leading research universities – Oxford, Cambridge, Manchester, Imperial, Edinburgh and UCL – asking about their experiences. Their responses provide the material for the letter to David Willetts, including the suggestion that a handful of top researchers have already returned to the US given the outlook for research funding here.



In Oxford’s submission, Professor Andrew Hamilton says: ‘We have very real concerns that the brightest and best researchers at all stages of their career could accept offers of study or employment at our international competitor institutions should the national funding environment become more challenging ... We are of the firm view that it is less expensive to retain our [leading UK research intensive universities’] current quality endeavour than it would be if it had to be rebuilt in the future.’



Lord Krebs’ letter and the statements from the universities have now been published on the science and technology committee’s website [for more see PDF of Oxford's response].



It's clear that discussion of the effect of science cuts will continue in the days and weeks ahead.

Spin-outs show Oxford's impact

One way to gauge a university's success in commercialising its research is to look at the number of spin-out firms it creates in order to exploit new scientific ideas and techniques.




As Richard Tyler writes in The Telegraph, a new report into life science start-ups suggests that Oxford University is doing particularly well at turning good ideas into companies. The report states that 'if all the Russell Group universities were operating to the same level as Oxford, in theory there would have been an additional 78 university spin-outs over the period [2005-09].'



It goes on to show that, in terms of both the number of spin-outs and the investment in them, Oxford comes 'way out in front of the pack', ahead of Imperial and Cambridge. It also highlights the importance of university technology transfer companies, such as Oxford's own Isis Innovation.



Of course, as the report also makes clear, spin-outs are not the only measure of success in terms of the commercialisation of research. But these new companies are the most obvious sign that good science is being turned into good business.



Read more about Oxford's business links on our Enterprise web pages.
