Understanding scientific uncertainty

People expect a lot from scientists – preferably ready-made, unambiguous answers, valid for eternity. But because of science’s critical character and its rigorous reality checks of hypotheses, different scientists can give different answers to the same questions. In cutting-edge research this is the rule rather than the exception. Only after years or even decades of extensive checks do some scientific hypotheses make it into the handbooks of science, where they are hardly doubted anymore. But even handbook science can be overturned in time.

Furthermore, even the best scientists of their time can be terribly mistaken. When the American physicist Charles Townes in 1951 started to think about microwave amplification by the stimulated emission of radiation – a maser, the microwave equivalent and predecessor of the laser – Nobel prize winner Isidor Isaac Rabi and Polykarp Kusch, who was yet to win the Nobel prize, told Townes that it was impossible and asked him to stop his research. Luckily Townes didn’t stop: he developed the first maser only two years later, work that won him the 1964 Nobel prize.

The story repeats itself with the development of the laser. Townes’ brother-in-law Arthur Schawlow, who was also to win a Nobel prize, had predicted that it was impossible to build a laser with ruby as the laser medium. The young Theodore Maiman wasn’t convinced and started intensive research at Hughes Research Laboratories. The Hughes management, however, trusting Schawlow’s prediction, discouraged Maiman’s ruby research. Maiman stubbornly continued, and in 1960 – exactly fifty years ago this year – he demonstrated the first laser… with ruby as the laser medium. The great freedom to doubt the thoughts of even the best scientists led Townes to the maser and Maiman to the laser. Uncertainty in science is a strong stimulus for creativity.

By better understanding the role of uncertainty in science we may better understand what science can and cannot offer society. Uncertainty in science has essentially three roots: in measurements, in data analysis and in models (conceptual, physical and numerical alike). Scientists try to get rid of uncertainties as much as possible, but cannot get rid of all of them. Therefore, science is first of all a process that separates the evidently untrue from the possibly true. This is very different from the public perception of science as an encyclopedia of absolutely true facts.

Unfortunately, when people hear scientists say they don’t know everything, they often conclude that they know nothing, or that one opinion is as good as any other, or that evident blunders in the IPCC report make the whole report worthless. Uncertain science, however, is something different from bad science. There are degrees of uncertainty, varying from extremely uncertain to virtually certain. Hardly anything in science is absolutely certain. Watch below what physicist and Nobel prize winner Richard Feynman had to say on uncertainty: “It’s much more interesting to live not knowing than to have answers that might be wrong… I don’t feel frightened by not knowing things.”


Science journalism in the Entertainment Age

In his essay ‘Science journalism: Too close for comfort’ (Nature, 25 June 2009) the American science reporter Boyce Rensberger analyzes the history of science journalism and distinguishes three ages: the ‘Gee-Whiz Age’, the ‘Watchdog Age’ and the ‘Digital Age’.

About the first two there can be little disagreement. But calling the third age – our present time – the ‘Digital Age’ says something only about the technology used to convey science journalism, nothing about its character. I would call our age the ‘Entertainment Age’. Before I explain science journalism in the ‘Entertainment Age’, let’s first go back into history.

What is the ‘Gee-Whiz Age’? ‘Gee-whiz’ expresses amazement or enthusiasm and is thought to originate from the exclamation ‘Jesus Christ!’ – as in ‘What a surprise!’ The two decades immediately following the Second World War were, at least in the Western World, years of general reconstruction, industrialization and strong economic growth. Almost autonomously, it was thought, fundamental research would produce a stream of technological innovations – an idea that created a kind of scientific paradise where researchers were given a large degree of freedom and substantial financial support.

Science journalism in the ‘Gee-Whiz Age’ put the emphasis on the wonders of science. The underlying idea was that science brings progress and prosperity. This is true, but only up to the point where our basic needs are satisfied. Simply said, once we have food, housing and clothing, our happiness is determined more by social relations, self-fulfillment and the absence of dramatic changes in life than by owning a new mobile phone or a flat-screen TV. There is no evidence that people in the Western World today are happier than thirty years ago, although we own more ‘stuff’ and have longer life expectancies.

From the end of the sixties both the public and politicians started viewing science in a different way. The optimism of the post-war years made way for a more critical attitude, with research coming under greater scrutiny than ever before. The shift began with the movement to democratize society in general and the universities in particular at the end of the 1960s. It was subsequently reinforced by the oil crisis of the early 1970s, the Club of Rome report Limits to Growth and the awareness of environmental problems. Science journalism entered the ‘Watchdog Age’, as Rensberger calls it: science journalists scrutinizing scientists in the way political journalists scrutinize politicians.

However, what Rensberger doesn’t write is that in the Watchdog Age there was still a lot of Gee-Whiz science journalism (maybe even more than watchdog journalism), simply because most people are more interested in knowing, let’s say, what black holes are than in knowing how much money goes into black-hole research and in scrutinizing black-hole scientists. And this holds up to the present day. ‘Gee-Whiz’ science journalism never dies, because of people’s natural curiosity about how ‘it all’ works and where ‘it all’ comes from.

And now our present day. Sure, we live in a digital age. Science journalism is moving online. But what about its character? In the Western World, I think the entertainment character of journalism is more prominent than ever before. We live in an Entertainment Age, and science journalism has moved into it too. We have busy lives, we have all the material requirements for a life without too many worries, and in our few free hours per day we want to be entertained. Science for fun: a magician showing how he tricks our minds; a documentary imagining the disaster after a super-volcano explosion; a story about time travel. The Entertainment Age is characteristic of wealthy, stable societies where technological progress no longer contributes to increasing happiness.

Is it a bad thing that science journalism is now in the Entertainment Age? Not at all. Or at least, not necessarily. Entertainment can be a great way to convey the messages of science journalism. Entertainment brings emotion and we tend to remember information in an emotional package better. The challenge is to find the right combination of Gee-Whiz-, Watchdog- and Entertainment-journalism, either apart from each other or combined, because a good science story or documentary can combine all three.

Fantasy is cheap, facts are expensive

“By the end of 2013, 100,000 Europeans have died of starvation.”

“One solar storm could destroy power grids all over the world…”

Sometimes I wonder why I don’t change my profession from being a science journalist to being a fantasy writer. Just writing whatever sells. It would save days of checking facts.

This thought was running through my mind when I was doing research for an article for the Dutch popular science magazine NWT about the effects of a geomagnetic super storm on modern societies. A geomagnetic super storm? Yes, sometimes the sun blows out a bubble of charged particles, a solar hurricane. When the solar hurricane hits the earth, it can create a geomagnetic super storm, which – in principle – can knock down power grids.

In principle – that is the key phrase here. In fact, only electricity lines hundreds of kilometres long are vulnerable. Nobody knows exactly what is going to happen. And what happens will differ from country to country, depending on latitude and electricity infrastructure.

In my research I first read scientific reports and papers. Later I started calling electrical engineers and power-grid specialists. While I was still at the beginning of my research, New Scientist published the article ‘Gone in 90 seconds’, about exactly the same subject. The author sketched a worst-case scenario for the US. It was a dramatized extrapolation of an American scientific report.

The crucial question remained whether the American situation is typical for countries on other continents. What about the situation in China, India, Argentina, Poland, Cameroon…? Don’t these countries count?

No, the American situation is not typical, although the article doesn’t mention this. For example, Europe has hardly any of the very long electricity lines that the US has. Therefore the quote from the New Scientist lead – the second quote with which I started: “One solar storm could destroy power grids all over the world…” – is rubbish.

Does such an article lead sell? For me it has the opposite effect. As soon as somebody claims that things will be destroyed ‘all over the world’, I think: “Oh no, somebody is trying to fool me.”

A month later a colleague – the editor of my article-in-the-making – sent me a link to another article about geomagnetic super storms, this time from the UK’s Daily Mail: ‘Meltdown! A solar super storm could send us back into the dark ages – and one is due in just THREE years’.

An exclamation mark and a word in capitals in the title… reason enough to put the article aside. However, I was doing my own research on geomagnetic super storms, so I wanted to read everything I could about the subject. It turned out that the New Scientist scenario sketched for the US had been shamelessly translated to a European context. No critical question was asked about whether that translation is valid. No original research at all. Just add a bit of spice to what you hear from others.

Some extra fantasy was added, with little creativity though. The scenario only got worse. No electricity, no delivery of food to shops, fridges breaking down, and soon the first people start to die… 100,000 Europeans would die of starvation after the geomagnetic super storm hit the earth, the article stated.

100,000 dead? It takes a fraction of a second to fantasize such a number. But even weeks of research wouldn’t lead to a reliable one. It is impossible to argue for any particular number – and irresponsible to speculate about it in a journalistic article.

Fantasy is cheap, facts are expensive. I hope that the worldwide cuts in (science) journalism will not lead to even more fantasy and even fewer facts. Bad fantasy is so much harder to swallow than good facts.

The inflationary news universe

This February’s AAAS conference in Chicago once again brought together a varied selection of scientists and scientific topics. It was my third AAAS in a row and it was the third time that I have found it very useful to interview scientists, talk to them informally and hear about new research directions in various lectures.

But it was also the third time that I heard some colleagues complaining: “There is no hard news.” And you could hear them thinking: “Why do I travel to a conference if I can receive all the science news on my computer at home?”

I find it worrisome that the notion of ‘news’ – even a very narrow notion of news – has for some science journalists become a dogma that completely determines their way of working.

Why is it worrisome?

First of all, it’s not just a publication in one of the science journals that makes news. Why wait until a journal sends a press release announcing that there is news? And is it really news? With the ever-growing number of scientific papers per year, news inflation is growing too. The discovery of the first exoplanet is thrilling, the next one merely exciting, but the discovery of lifeless exoplanet number 314 is rather boring.

We should also make our own news, as I have argued in my previous contribution to this blog. For making our own news and finding fresh angles to ongoing research, a conference like the AAAS provides an excellent opportunity.

The second reason to worry about the news dogma is that we work for people who are interested in much more than just news. Most of all, they need scientific context and background to form opinions about an ever more complex world. What does it mean for my practical life that scientists can unravel my genome? What does a brain scan tell us about who I am? How does a climate model work? How reliable are mathematical models of the economy?

We are living in an inflationary news universe. Our modern information world provides an overload of so-called news and a lack of context. Most people get totally confused if they first read in a hundred-word article that green tea is healthy and half a year later, in another hundred-word article, that its benefits are unproven. This news swing can continue for years, ultimately leading people to turn their backs on science news altogether. Too much published science news is trash news.

Luckily, people continue to be intrigued by scientific questions of everyday life: Why do humans sleep eight hours and elephants only 3.5? Or why do women cry more than men? And of course every new generation wonders about philosophical questions such as what is time, what is life, or what is consciousness. There is always ongoing research that provides a hook for covering such timeless or everyday questions in a fresh way.

The third and most important reason the news dogma is worrisome is that the notion of ‘news’ plays a different role in science journalism than in ‘ordinary’ journalism. Science acts on large timescales – mostly years, sometimes even decades – and everyday life mostly does not. Reporting on trends is therefore at least as important in science journalism as reporting so-called news. If a scientific discovery is announced today, you can be sure it was preceded by years of work. But if a plane crashes today… that, of course, is news of today. There is no way we could have reported it yesterday.

Science journalism should not be guided by the narrow notion of news that ordinary journalism seems to demand from us. Let’s give people the scientific context that they need and enjoy knowing, instead of boring and meaningless ‘news’ about the still-not-one-hundred-percent-proven-healthiness-of-green-tea.

(By the way, news or no news, I love fresh green tea.)

Go to the lab and your mind can be read

Science is what scientists do. But what scientists really do appears only partly in their scientific publications. In the publications we read what went well, not what went wrong; we read the results, not the struggle to find them. When I was doing science myself – as a PhD student in physics – I saw colleagues struggle for four or five years to build an experiment and get it to work. When the experiment finally worked, the data were sometimes collected in a month. From their publications you would guess that the research had gone smoothly and logically, but the reality was the opposite.

Science is a process – not even a logical process, but an irregular one. To understand that process, science journalists should regularly go out and see science in action: in the lab, at the accelerator, in the Arctic, on a volcano, wherever. We cannot fully understand science from scientific publications alone.

Having reported regularly on brain-scan experiments, I finally wanted to take part in one myself – not just for fun, but as a subject in a scientific study. Last Thursday I lay in an fMRI scanner at Maastricht University in the Netherlands for an hour. In a three-tesla magnetic field my brain was scanned and my mind read.

The aim of the researchers is to let ‘locked-in’ patients communicate with their family and friends. Locked-in patients have suffered a stroke or illness that leaves them unable to move, speak, or even blink their eyes. But they are conscious, as we know from those who luckily managed to recover. They can hear what others are saying but cannot react in any way. The German Karl-Heinz Pantke was one of the lucky ones who recovered, and he wrote about his striking experiences in the book Locked-in – Gefangen im eigenen Körper. Such patients would be helped enormously if their minds could be read.

Researchers from the Maastricht Brain Imaging Centre have now devised a way to indirectly read the letters a patient produces in his mind. For example, when I was in the scanner, we had agreed that the letter D stood for silently reciting the Shakespeare quote ‘To be or not to be, that is the question’, and that A stood for mentally drawing a house. I produced seven letters through different cognitive tasks, and the researchers reconstructed all of them correctly from the scans – without, of course, knowing which letters I intended to produce. The principle works: they read my mind. And I will be one of the six subjects on whose experimental results the scientific publication will be based.
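The encoding principle is simple enough to sketch in toy form. The following is only my own minimal illustration – the task names and code book are hypothetical, and I assume the hard part (classifying each cognitive task from the fMRI activity patterns) has already been done perfectly:

```python
# Toy sketch of the indirect letter-encoding idea. The task names and the
# code book are hypothetical; in the real experiment the tasks are decoded
# from fMRI activity patterns, not handed over as strings.

# Agreed-upon code book: each letter corresponds to a distinct mental task.
CODEBOOK = {
    "recite_quote": "D",  # silently reciting a Shakespeare quote
    "draw_house": "A",    # mentally drawing a house
}

def decode(detected_tasks):
    """Turn a sequence of classified tasks back into letters."""
    return "".join(CODEBOOK[task] for task in detected_tasks)

# If the classifier detects 'draw_house' and then 'recite_quote',
# the subject spelled "AD".
print(decode(["draw_house", "recite_quote"]))  # -> AD
```

The point of the indirection is that the scanner never reads letters directly; it only has to tell a handful of very different mental activities apart, which is a far easier classification problem.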

Sure, it’s very cool to lie in the scanner, have your mind read and, after the experiment, inspect your own brain on a high-resolution scan. But it also gave me much more insight into the scientific process. I saw the clever way in which the researchers had devised the experiment, but I also noticed little things that went wrong. When I misunderstood a certain task, I started to analyze my own mistake – which messed up that part of the experiment, because I could no longer concentrate well on the cognitive tasks I had to do.

I know, going out there to watch science in action takes time, and time is money, but it is an essential part of science journalism. I am afraid that with the growing commercial pressure on journalism, even fewer journalists than today will take the time and effort to go to the lab. But without that effort our job loses much of its meaning. As the saying goes: one dead person is a tragedy, a thousand dead are statistics. Science is more than reciting the statistics. To show this, we have to go out there and report science in action.

Hypothesis God and Ockham’s razor

In 2009 it will be 150 years since Charles Darwin published his theory of evolution. It will also be 400 years since Galileo Galilei became the first to explore the heavens through a telescope. So we are soon to celebrate both the International Darwin Year and the International Year of Astronomy.

Apart from celebrating these events and explaining to the public the powerful insights that evolution theory and astronomy have given us, science journalists will have an extra job to do. We can be sure that creationists, intelligent-design-dreamers and religious believers will do their utmost to cast doubt on the theory of evolution. And it will be our job to counteract.

The strategy of creationists and other believers will be the same as the one the tobacco industry has used in the debate about the health effects of smoking: casting doubt. The trick is easy: creationists point at those questions that science hasn’t answered yet – obviously always the most difficult nuts to crack – and conclude either that science doesn’t know anything about the origin of life, or that science and religion are two equally valuable ways to understand the world.

Both conclusions are of course invalid. If scientists don’t know all about a phenomenon, it doesn’t mean they know nothing. And whereas science produces knowledge, religion produces only beliefs. Scientists test their theories with experiments – the scientific method. Religion doesn’t have a validated method of investigation.

Interestingly enough, it is the freedom to doubt that has made science so successful, and it is the lack of doubt that has made religion often so problematic.

Personally, discussing evolution theory with creationists bores me to death. Their arguments are always the same, and they have no facts to support their beliefs. Science, on the contrary, offers a still-growing body of facts supporting evolution theory. For example: the scientific Breakthrough of the Year 2005 – as chosen by the magazine Science – was that we can now see evolution in action at the genetic level. A splendid discovery that deepens Darwin’s insights. No religious book has given us that.

Unfortunately, it doesn’t matter to creationists how large the body of scientific knowledge has grown. For them the discussion always comes down to the questions science has not yet answered, and of course to the question of what happened at the very first moment. Hypothesis God can be used to ‘explain’ that the universe was ‘created’. But those who are happy with this hypothesis should be equally happy with the hypothesis that Cookie Monster created everything, or Tweedledum and Tweedledee, or that a bunch of drunken gods and goddesses enjoying an orgy created the world. And so on. There is no way to check any of these assumptions.

To save a lot of time and energy in discussions with creationists, let me give my shortest version of the argument.

It’s based on what philosophers call Ockham’s razor, named after the 14th-century English philosopher William of Ockham. A modernized version of Ockham’s razor reads: ‘don’t make more assumptions to explain a phenomenon than logically necessary’. Ockham’s razor cuts away superfluous assumptions. For example: the attraction between the earth and the moon is explained by the theory of gravity. Adding the hypothesis that God created the earth, the moon and gravity explains nothing more about that attraction. Therefore Hypothesis God is superfluous.

Everybody has the freedom to believe, but once you are interested in knowledge, Hypothesis God is a superfluous assumption. We can explain a great deal of the visible universe by describing it in terms of matter, energy, space, time and gravity. We can make observations, do experiments and test our theories. But nowhere in our reasoning does Hypothesis God explain anything more than science does. On the contrary: Hypothesis God hasn’t given a single insight into the universe that has withstood experimental tests. Science has given plenty of insights that have.

Only if we start thinking about the moment of creation – assuming there was such a moment – does Hypothesis God do equally well, or equally poorly, as the scientific hypothesis of a Big Bang. (For all the 13.7 billion years after the beginning, astronomy does infinitely better than Hypothesis God…) As we have no way to test any theory about the first moment, we can assume any beginning we like, but that is not science anymore.

Ockham’s razor is the shortest argument I know to end the discussion. So let’s sharpen it: we’ll need it soon in the discussions with creationists, in whatever disguise they come.

Let’s start the New Year by launching a two-step rocket. The first step celebrates the International Year of Astronomy. The second step, fired automatically once the first is already high up in the air, celebrates the International Darwin Year. This two-step rocket shows the power of science as a unified body of knowledge for understanding the world: it is the evolution of the universe that has led, by chance, to the evolution of life.

A happy New Year! Cheers!

The engineering journalist

“For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”

Substitute ‘financial product’ for ‘technology’ and ‘economy’ for ‘nature’, and we have formulated the cause of the present-day global financial crisis: “For a successful financial product, reality must take precedence over public relations, for the economy cannot be fooled.”

This quote ends physicist Richard Feynman’s report to the Space Shuttle Challenger inquiry. The Challenger exploded shortly after its launch on January 28, 1986, killing seven people. Before the launch, engineers had warned that the so-called O-rings were unreliable at low ambient temperatures – and on January 28 temperatures at the launch site were low. Despite the warnings, managers didn’t want to postpone the launch any longer. They tried to fool nature, the O-rings failed, and what followed is tragic history.

Building a Space Shuttle is an art of engineering. And engineering differs from science. Scientists want to understand the world; they simplify a problem until they can solve it. Engineers devise solutions to real problems, and thereby change the world. Often they fix a problem without knowing why the fix works. Trial and error. Nothing wrong with that – it’s what you have to do when the world is too complicated. Engineers deal with real-life problems with all their dirty traps. Planes have to fly not only when the sky is clear and sunny, but also in storm and thunder. The science of a flying plane has long been known, but the engineering of planes improves every year. It’s the engineer who has to incorporate all the safety aspects into the design of the plane, not the scientist.

The global financial crisis would not have happened if all those innovative financial products had been designed by engineers instead of by scientists-trying-to-impress-their-managers-with-fancy-mathematical-money-models. The Americans call them ‘quants’, the quantitative analysts. The quants were too much scientist and too little engineer. They had too little feeling for the risks involved.

Of course, the scientists-trying-to-impress-their-managers-with-fancy-mathematical-money-models are not the only ones to blame. Consumers were greedy for products they couldn’t afford, and bankers were greedy for selling those products and collecting high bonuses. But the innovative financial products themselves were clearly very poorly engineered.

We science journalists – as the name suggests – concentrate mostly on science. But we should pay much more attention to the engineering side in our reporting. It would prevent misunderstanding on the part of the public. A few attention-seeking pseudo-scientists claimed earlier this year that the world’s biggest science experiment – the Large Hadron Collider (LHC) in Geneva – might produce tiny black holes that would eat the world in a fraction of a second. They received more attention than all the LHC engineering work of the last twenty years put together.

Why did we give a bunch of silly pseudo-scientists so much attention? Because predicting a disaster sells, even if it’s utter nonsense? Why not explain the difficulty of building this highly complicated accelerator? Think about it: Twenty years of engineering work before the science can start. Can you really expect it to work perfectly from the start?

On September 10 a few hundred science journalists were invited to Geneva to glorify the LHC before it had even been proved to work as a collider and detector. Protons raced once around the 27-kilometre loop. There were no collisions, and the energy was much lower than the collider is designed for. Do we glorify something that looks like a plane but can’t yet fly? No wonder the public was confused to hear, a week after the big launch, that the machine had a serious problem.

There is much more engineering that we science journalists tend to overlook: climate models are engineering models. Here too there is nothing wrong with that, as researchers have no other option. Let’s just explain it honestly to the public. Climate models are full of empirical rules, and of buttons that can be turned left or right to patch over scientific details we don’t yet understand. But you don’t have to understand everything to produce sound results. That’s what we can learn from engineers.
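Such a ‘button’ can be illustrated with a toy example of my own (this is a textbook zero-dimensional energy balance, not an actual climate model): an effective emissivity parameter stands in for all the greenhouse physics that isn’t modeled explicitly, and turning that single button slightly shifts the global answer.

```python
# Toy zero-dimensional energy-balance model. The numbers are standard
# textbook values; the point is the tunable 'button': the effective
# emissivity lumps together greenhouse physics not modeled explicitly.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp(absorbed_solar=240.0, emissivity=0.61):
    """Surface temperature (K) at which outgoing thermal radiation,
    emissivity * SIGMA * T**4, balances the absorbed solar flux."""
    return (absorbed_solar / (emissivity * SIGMA)) ** 0.25

# emissivity = 0.61 is tuned to reproduce an Earth-like ~289 K;
# a small re-tune of the button shifts the answer by about a degree.
print(round(equilibrium_temp(), 1))
print(round(equilibrium_temp(emissivity=0.60), 1))
```

The model produces a sound global number even though everything interesting about the greenhouse effect is hidden inside one empirically tuned parameter – which is exactly the engineering spirit described above.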

There is much more engineering in science than we think, and we should report on it better.