Brian Cox on Science Britannica: "Britain is in a world-leading position in science and engineering"

"I honestly believe that, with visionary expansion, science and technology could transform our economy”

By Anjana Ahuja

There is a sticky moment, near the start of his new series Science Britannica, when Professor Brian Cox is surveying a mist-draped London skyline. All the danger signs are there: he’s dressed in a parka like an off-duty rock star, perched on a hill, a look of awe creeping over his face.

And then, as the camera pans across the spires and tower blocks, the great names of British science and invention start popping out of the sky: Alexander Fleming (penicillin), Charles Darwin (evolution), Michael Faraday (electricity) and Frank Whittle (jet engine). Cox disappears from view – and the threat of parody is narrowly averted.

The trio of programmes about Britain’s long and glorious scientific heritage, Cox insists, was never meant to continue the Wonders franchise – Wonders of the Solar System, Wonders of the Universe and Wonders of Life – that turned the 45-year-old physics professor into a household name, a tabloid columnist and a target for mimicry. “The idea had been around for about three years: to make a programme that asks why Britain has been so good at science and technology for so long. When you look at the London skyline, you can see spread out before you some of the greatest milestones in scientific history. This tiny piece of land has had a tremendous influence on civilisation.”

It would have been wrong, says Cox – a particle physicist who divides his time between Manchester University and the Cern complex in Geneva – to dress it up as a Wonders-style offering because the almost 400-year-long story of British science is also a saga of suspicion, fear and controversy.

In the 18th century, John Hunter, celebrated surgeon to such luminaries as Sir Joshua Reynolds and Adam Smith, appalled Georgian sensibilities by grafting a cockerel’s testicle into a hen’s belly and pickling dead foetuses to study their development. Yet today, researchers who experiment on animals to find cures for human diseases, and scientists who explore animal and human cloning, are vilified more than Hunter ever was. Climate scientists frequently run the gauntlet of politically motivated campaigns to discredit them; stem-cell biologists and fertility researchers are condemned for “playing God”.

In the opening programme, Cox explores what he euphemistically calls these “points of contact” between science and society by meeting Tipu Aziz, a neurosurgeon at Oxford University who studies Parkinson’s disease and is an outspoken advocate of animal experimentation. Aziz has used the results of his experiments on macaque monkeys to help patients out of their wheelchairs, but his laboratory is often picketed by anti-vivisectionists, some of whom would like to see him dead.

So is Cox a supporter of Aziz’s branch of science? Realising the minefield he’s stepping into, he chooses his words carefully. “We didn’t want to get into the rights or wrongs of it, and we deliberately didn’t want to come down on one side or the other. What we are trying to do is to present the most controversial points of contact between scientific research and social attitudes. We thought quite hard about whether it would help or hinder to include difficult issues like climate change, GM crops and animal experiments. I think it helps because it’s an honest representation of points of friction. Otherwise you get a sanitised view of what science is about.”

But isn’t it a cop-out that Cox doesn’t say whether he’s for or against Aziz’s methods? “That would be a polemic, and I didn’t want to make a polemic,” he responds, somewhat testily. “The programme is about the interaction of science and society, presented by a scientist. We didn’t want it to be a scientist’s view of the interaction between science and society. I honestly wanted not to impose my views. I don’t want the press interviews to be about my views on anything.”

He reveals more later, including his vision for universal science education and why it’s OK for a scientist to conduct research without regard to its consequences. But back to the past.

In the mid-17th century, a group of natural philosophers began meeting in London to discuss a new way of learning about the natural world: through observation and experiment. That salon crystallised in 1660 into the Royal Society, the oldest and most venerated scientific institution in the world, whose fellows have included Isaac Newton, Joseph Banks (a founder of Kew Gardens), Charles Darwin, Benjamin Franklin and Isambard Kingdom Brunel. Knowledge expanded rapidly in such fields as astronomy, engineering, chemistry and zoology. Britain, already the centre of a geographical empire, became the centre of the scientific universe – the Royal Society boasted a foreign secretary 59 years before the government did.

When the Society started printing the world’s first scientific journal, Philosophical Transactions: Giving Some Accompt [sic] of the Present Undertakings, Studies, and Labours of the Ingenious in many Considerable Parts of the World, the idea of scientific publication and peer review – the rigorous, public checking and correction of methods and data, which is the bedrock of global science today – was born. This, together with the Industrial Revolution, saw English gradually replace Latin as the lingua franca of progress.

Then, in 1799, the Royal Institution, dedicated to educating the public about science, opened its doors in Piccadilly. Fashionable Londoners flocked to Albemarle Street to see its star turn: handsome, 22-year-old chemistry professor Humphry Davy, who delighted in turning science into theatre. The research of Davy’s assistant at the RI, Michael Faraday, into electromagnetism paved the way for electric power. That legacy of technical brilliance survives; national spending on research and development remains relatively well protected in an age of austerity.

Cox is still not satisfied: “Maybe we don’t notice it but we’re in a world-leading position, with a science and engineering base that’s been built up over 400 years. And I honestly believe that, with visionary expansion, science and technology could transform our economy.” Cox’s new series rounds off with a look at our future as a science nation, and why blue-skies research, with no immediate payoff, is worth doing.

He even thinks we could be on the verge of another Golden Age, citing as examples graphene, the single-atom-thick layer of carbon that is set to revolutionise electronics and whose discovery won the 2010 Nobel Prize in Physics, and the experiments at Cern earlier this year that confirmed the existence of the elementary particle the Higgs boson: “The idea that there’s nothing left to discover is nonsense.”

We veer off-message again, discussing how he would exploit Britain’s pre-eminent position. He would keep people in education until they’re 21 – with compulsory tuition in the country’s scientific, industrial and philosophical heritage. “Forty per cent of people go into higher education now. What if you double that? Including tuition fees, it would cost about £21 billion. That’s on a total government spend of £720 billion. How many Utopian visions are affordable? Well, this one is.”

Cox’s Utopia would turn us into a scientifically literate nation and raise the level of public debate: “We’re in a democracy and quite rightly public policy should be democratically decided, but everybody needs to at least understand what it means to make scientific statements. I’m not saying that you have to be a scientist or an expert on everything, but you have to understand what weight to give the peer-reviewed scientific consensus. Otherwise you just get uninformed reaction.”

With climate change, for example, challenging that consensus has become a way of rubbishing the whole field: “Nobody’s saying science is perfect or that it’s always right, but people need to understand it’s the best method we have [of acquiring and using knowledge].”

Law-makers and politicians, especially, should be able to grapple with uncertainties, error bars and statistical significance: “They should take due account of the weight of scientific evidence when they make decisions. But if they really don’t understand what it means to make a scientific statement on climate modelling or the dangers, or not, of GM crops, then they’ve got no chance [of making informed decisions].”

Politicians’ scientific ignorance doesn’t, however, mean that scientists should dictate regulation. Scientists, in his view, should do science, and society should decide its uses. Is it OK, then, for scientists to wash their hands of the moral consequences of what they do, even if it’s potentially dangerous, like the atom bomb?

“Well, that’s a legitimate view in the sense that, in a pure world, a scientist’s job is to understand nature, to generate knowledge,” he muses. “It’s for society to decide what to do with that knowledge.” Otherwise, he warns, scientists become an unaccountable “priesthood”.

Could, in fact, a blindness to moral niceties be vital for progress? By definition, new science upsets old thinking; pioneers must have the stomach for the fight. We talk about people such as Craig Venter, an American geneticist and synthetic biologist who wants to create artificial organisms, including microbes that could produce biofuels to power the planet, and Severino Antinori, an Italian embryologist who wants to clone a human being.

“Maybe you’re right, that you’re always labelled as radical and cavalier if you go off in some direction that generates new knowledge,” says Cox. “That’s an interesting conundrum for science, because it has to be true that understanding nature is always a good thing. Being an ostrich is never a great strategy. But, at some point, you have to trust civilisation to use that knowledge wisely.”

Science Britannica starts tonight at 9:00pm on BBC2