Dutch fact-checking project offers valuable tips for journalists

Journalism and New Media students at Leiden University and the Fontys School of Journalism in Tilburg, both in the Netherlands, scrutinised media reports last year, functioning as fact checkers. Their supervisors, Alexander Pleijter, Peter Burger and Theo Dersjant, wrote a contribution to the recently published anthology ‘Journalism brought into discredit’, produced by the Catholic Institute for Mass Media (KIM, University of Nijmegen), in which they described what the students had discovered. The part of that chapter that looks at causes and offers suggestions is reproduced below. The original Dutch version was published on the De Nieuwe Reporter website.

Flawed facts: six causes
The fact checkers came across news items which were not always corroborated by facts. They also found exaggerated information, statistics taken out of context, fabricated problems and muddle-headed experts. They even stumbled across a case of plagiarism. Journalists make mistakes, but whether they make many or few mistakes – well, everyone will have their own opinion on that. We have tried to discover a pattern in these mistakes. The most important causes can be summarised as follows.

1. Unknown sources.
Many of the errors the students noted would not have been made if the journalists concerned had requested a report or phoned up a researcher. At least, where that was possible – and often it is not, because the sources are not disclosed or the research has not yet been published. The Code of Bordeaux states with good reason (Article 3): “Journalists should report using only the facts of which they know the origin”.
An example: Bureaucracy is taking up less of Dutch people’s time and money. This was the conclusion the television programme ‘Rondom 10’ drew from statistics published by the ministries of Justice and Home Affairs. None of the newspapers that reported this news had read the research reports – and neither had the TV programme editors.
In order to form an opinion on the reliability of any research, American quality media start from the premise that one should know who commissioned the research, who paid for it and which method was used to collect the data. If the research is not accessible, journalists should state that – or throw the whole subject in the waste-paper bin.

2. Lack of subtlety and context.
News facts and subtlety do not go well together. Trusting the Dutch news agency ANP, various newspapers reported that children were suffering from ‘St Nicholas stress’, a reference to a traditional holiday in the Netherlands. The agency had not included the information that the expert who made this statement was talking only about autistic children, who become nervous when the treats for 5 December are stacked up in the shops long before that date. In fact, the ANP had brought in an educational expert who did not believe this story, but this subtlety was dropped from the versions printed by the newspapers De Telegraaf and Metro.
Ready-made salads are vulnerable to Listeria contamination, according to the ANP (the bacterium having killed around 20 Dutch people in 2006). The source of this information was a Wageningen University PhD student who was not talking about ready-made salads at all, but about the danger of improperly chilled food products.
A related source of error is lack of context. Is there any previous research on the subject? What do other specialists think? A professor of sales and account management in Rotterdam told various media, including the popular TV chat show Pauw & Witteman, that within five years brain scans would help to judge the suitability of job applicants. However, it was not made clear that he was not a brain scan expert and that scientists who were experts in the field did not believe in his vision of the future.
Journalists also tend to rely on far too few measurements when making claims about trends. The Sp!ts newspaper, for example, opened on 21 October with the rather worrying headline “Child killers on the increase”. The gist of the report was that increasing numbers of children in the Netherlands are being murdered by their parents (whether natural, foster or step-parents). The source for the article was the extremely reliable Statistics Netherlands (CBS). The article compared the number of children murdered in 2000 with the number in 2007. Between those two years there was certainly an increase, but anyone who requested the statistics for the intervening and previous years – as a Tilburg student did – would discover there was no upward trend at all. The point was that 2000 happened to be a year with extremely few such murders.

This is how presumptions become certainties, a small chance becomes an enormous risk, a few measurements suddenly signal a trend, and random examples turn into newsworthy phenomena.
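The Sp!ts example shows how much hinges on which years you compare. The sketch below uses purely hypothetical yearly counts (not the actual CBS figures) to show how comparing only the first and last year can suggest a rise that the full series does not support:

```python
# A minimal, hypothetical illustration (invented figures, NOT the CBS data):
# an unusually low starting year makes an endpoint comparison look like a trend.
counts = {2000: 9, 2001: 21, 2002: 18, 2003: 24,
          2004: 17, 2005: 22, 2006: 19, 2007: 20}

first, last = counts[2000], counts[2007]
print(f"Endpoint comparison 2000 vs 2007: {first} -> {last} "
      f"({(last - first) / first:+.0%})")  # looks like a steep increase

# The full series tells a different story: 2000 is simply an outlier.
other_years = [y for y in sorted(counts) if y != 2000]
average = sum(counts[y] for y in other_years) / len(other_years)
print(f"Average for 2001-2007: {average:.1f} per year; "
      f"2000 ({counts[2000]}) is far below that, so no upward trend.")
```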

3. Inadequate research methods.
According to a British professor of psychology, women go shopping more often and make a larger number of irresponsible purchases in the ten days before menstruating. The BBC, De Telegraaf and other media reported this as news, but anyone making the effort to study the research report would see that the method was totally inadequate and the results inconclusive. For example, there was no significant difference in the degree of compulsive buying between the last and the middle period of the menstrual cycle. In other words, it could just as easily have been coincidence.
We get the impression that editorial offices do not give high priority to understanding research methods and statistics. Or perhaps they do understand these matters but are unable to see the consequences. Internet polls are often unreliable because those taking part select themselves; everyone recognises this to be a problem, but that does not prevent news media from presenting the results of such polls as news.
One example is ikmeldagressie.nl, a Dutch website for reporting aggression. It held an online survey from which the media concluded that aggression towards public servants is a big and growing problem.
An even more idiotic example: one of the fact checkers came across a news item about gossiping men (they do this more than women: on average 76 minutes per day, against only 52 for women!) sourced from a British website called OnePoll, where participants earn money by filling in questionnaires. The more rapidly – and more carelessly – they do this, the more they earn.

4. Biased sources.
All journalists get it hammered into them that sources should be unbiased. Nevertheless, the fact checkers discovered countless reports along the lines of ‘We from Brand X recommend Brand X’. Another example was a claim from two computer games magazines that gaming was the cheapest of all hobbies. A fact checker calculated, however, that the board game Settlers of Catan costs much less and that you could always go walking or do some knitting, should the board game also prove too expensive.
Some more examples: the organisers of the Emigration Fair commissioned a survey which concluded that many Dutch people are thinking of emigrating; the job vacancy site Monsterboard commissioned a poll which showed that most respondents do not have the perfect job; and the person responsible for all the ATMs and point-of-sale machines knows that PIN usage prevents violent robbery – even though he has no figures to prove this.

5. Editorial intervention.
Mistakes can arise when changes made by an editor are not checked with the author of the original piece. Various regional Dutch newspapers published an alarming report about an ‘army of cats’ that was ‘spreading death and destruction’; in other words, an explosive growth in the number of stray cats in the Netherlands was causing a wave of stench, filth, feline howling and other such problems. The journalist who had written the original text for the regional paper, the Leeuwarder Courant, did not recognise his own story when he saw the version produced by the press agency GPD and then reworked once again by the regional Brabants Dagblad. At each editing stage, the stray-cat problem had become more sensational.

6. Fabricated explanations.
The facts are correct but the explanation is not. Does the annual switch to Summer Time cause an increase in the number of heart attacks? Yes, there is certainly something strange about the statistics on heart attacks, but scientists cannot agree that this is due to Summer Time. Do people cope with their worries about the recession by eating more meat snacks? Yes, more snacks are being sold, but this cannot be ascribed to the credit crisis, because the crisis had not yet started when the snack sales were investigated.
The credit crisis, which was raging in the period our fact checkers were at work, is a category in itself when it comes to explaining the news and all its whys and wherefores. When the students were checking news about the desire to emigrate, they stumbled across one expert who ascribed rising emigration to the recession and another expert who observed that emigration was dropping because of the recession.
The Metro newspaper, using exaggerated statistics, reported that the crisis was causing an increase in the number of stray cats and dogs – dumped by their poverty-stricken owners. The television current affairs programme Netwerk, in its coverage of the large numbers of pets that, because of the crisis, were ending up in animal shelters, based its report on research that predated the credit crisis. The Netwerk editor responsible said: “Hard facts weren’t necessary in this case”.

Escape the fact checker’s knife
Truth and factual information are both essential to journalism, but the urge to produce entertaining, striking or out-of-the-ordinary stories sometimes prevails. A common reaction from journalists to questions from fact checkers was: “I didn’t set out to write a scientific analysis; it was lightweight news, something nice to read at breakfast”.

They are not saying that lies are OK, but they are defending their right to be economical with the truth. One is at liberty to give facts an interesting slant to make them newsworthy. If scientific research shows that injecting L-Cystine into rats causes them to have an erection, that is not newsworthy. However, if a journalist learns that L-Cystine can be converted into hydrogen sulphide and he remembers it is the stuff that makes rotten eggs stink so abominably, before you know it, the headline is ‘Men get a hard-on from rotten eggs’. Factually incorrect, but it sounds great and will undoubtedly attract lots of readers.

A term like L-Cystine also immediately demonstrates where things can so easily go wrong. Many journalists do not possess the necessary knowledge to understand scientific subjects fully. Just try reading an article in the Proceedings of the National Academy of Sciences, with all those scientific terms and statistical analyses. Try making head or tail of that!

And that’s assuming you have the time to read a research report of that kind. Many journalists do not even receive a copy of the research they are writing about. Many newspaper articles are written from press releases or from reports in other media. The latter, in particular, allow news to spread without anybody raising the alarm – and anyway, why should one check? This is just meant to be entertaining news, surely? Why spend time poring over a research report? It comes down to choices: checking takes time, so is the subject worth that time? For the enjoyable – and thus not too serious – news items, that usually means they are only worth writing if they do not take up much time.

The above examples seem to suggest that fact finding is not a priority for many a journalist. The pressure to deliver favours quantity over quality. Furthermore, editorial staff seem limited in their ability to correct themselves and to learn from their own mistakes. They are not used to challenging each other about shoddy copy. It takes courage to tell a colleague that a piece is not up to scratch, or that they have allowed rubbish to be printed in the newspaper. The consequence of this non-intervention culture is an inability to self-regulate, and this leads to unnecessary errors.

For editors who wish to break away from this culture, here are a few suggestions for escaping the knives of watchful students. These suggestions arise from our experiences with the fact-checking projects.

  1. Don’t be afraid to check stories to the point where they fall apart. Students often came up with corrected stories that were certainly just as entertaining as the original (but erroneous) news. You could, for example, simply accept the press release from the Dutch Institute for Road Safety Research stating that the number of serious accidents in 30-kilometre zones increased sharply in the period 2002-2007. You could also check the facts and discover that the number of 30-kilometre zones also increased sharply in that period – which in turn makes for a nice report about the Institute disseminating statistics that should be taken with a pinch of salt.
  2. Be critical of other media. News facts published by the BBC or the Guardian are not necessarily correct; they make mistakes too. Here again, reporting errors made by other media can produce sound as well as entertaining reports. If all the media are claiming that men get an erection from the smell of rotten eggs, you can write a splendid piece in which the researcher rejects this claim as a load of rubbish.
  3. Make sure the necessary expertise is present in the editorial office. It is really not necessary to staff the whole office with statisticians and scientists in order to prevent errors. However, it is essential to have several editors with a basic knowledge of figures, research and statistics. There should also be someone who can function as an information desk and who has the courage to point out colleagues’ errors. When we aired these suggestions in an editorial office, the editor-in-chief’s response was “the need to employ scientists to write about science is just as great as the need to employ a footballer to write about sport”. OK, but the other extreme is a football reporter who does not know what ‘offside’ means. Or a science journalist who does not understand the principles and pitfalls of research.
  4. Throw out idiotic rules – such as ‘at least 500 people should be interviewed in a research project’. There are innumerable large-scale – yet inadequate – research projects. At the same time, a great deal of research with fewer than 500 participants is of excellent quality. Journalists, it appears, seldom look at the non-response in surveys, even though it says a great deal about a survey’s reliability. If 500 people respond to a survey and 5,000 slam the door, what does that say about that particular piece of research? (A short sketch after this list works out the sums.) Rules like this do not protect one against fiascos. Editors with an understanding of and insight into methodology would be much more useful.
  5. Request figures and research reports. Do not be kidded into believing something has increased. Ask for the figures! Companies will often benefit from research being publicised in the news but, at the same time, are unwilling to release the research report. How can a journalist check to see if the research has been well done? Do not let yourself be manipulated; do not publish information about research that you have not been allowed to cast your eye over. Otherwise, you are making it very easy for these companies to manoeuvre you into giving them free publicity.
  6. Consult the experts. You may not have the necessary expertise to evaluate a piece of research yourself. In that case, do not be embarrassed about seeking the advice of the researchers in question or other experts. They can point out any wrongful interpretations or shortcomings in the research. Take, for example, a research project which showed that vegetarians have an increased chance of developing intestinal cancer. Check with the experts to see if these results correspond with those of any previous research.
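The non-response point in suggestion 4 can be made concrete with a back-of-the-envelope calculation. The sketch below uses the purely hypothetical 500-versus-5,000 figures from that suggestion, not data from any real survey:

```python
# Hypothetical figures from suggestion 4: 500 respondents, 5,000 refusals.
respondents = 500
refusals = 5_000

approached = respondents + refusals
response_rate = respondents / approached
print(f"Approached: {approached}, responded: {respondents} "
      f"-> response rate of about {response_rate:.0%}")
# Roughly 9%: with that much non-response, the 500 answers may say more about
# who was willing to talk than about the group the survey claims to describe.
```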

If all journalists and editorial staff took these suggestions to heart, our brilliant didactic concept would soon be superfluous. Fortunately, it appears that not everyone feels enthusiastic about fact checking. One journalist was unwilling to respond to our students because, as he said, “You fact checkers are always a few steps behind us”. Maybe so, but we are watching your backs!