From knowing to doing, in 20 lessons

Last November 21st, a paper was published in Nature (available for free), co-authored by three scientists, two from Cambridge, UK, and one from Melbourne, Australia, suggesting twenty tips for interpreting scientific claims. Their goal was to help improve policy makers’ understanding of scientific results. Politicians are used to taking advice from experts and consultants, and they constantly have to assess the quality, limitations and biases of the studies on which their arguments and policies are based.

The authors of this work suggest their 20 lessons should be part of any training program dedicated to civil servants, politicians, policy advisers and journalists who have to interact with scientists. When I was Dean of the French School of Public Health (EHESP, Rennes), the school which, by monopoly, trains all civil servants in charge of the public health care system in France, we introduced a mandatory core curriculum where these kinds of basics were taught. However, when I stepped down last year, I think we were still far from having achieved the ambitious goals presented in this paper (I invite interested readers to read the full article in Nature directly).

Here, I will only list the 20 tips as selected by William J. Sutherland, David Spiegelhalter and Mark Burgman, since I believe such a list may inspire all those intending to set up a training program for senior officials in charge of public affairs in their own country:

1. Differences and chance cause variation

2. No measurement is exact

3. Bias is rife

4. Bigger is usually better for sample size

5. Correlation does not imply causation

6. Regression to the mean can mislead

7. Extrapolating beyond the data is risky

8. Beware the base-rate fallacy (see the short numerical sketch after this list)

9. Controls are important

10. Randomization avoids bias

11. Seek replication, not pseudoreplication

12. Scientists are human

13. Significance is significant

14. Separate no effect from non-significance

15. Effect size matters

16. Study relevance limits generalizations

17. Feelings influence risk perception

18. Dependencies change the risks

19. Data can be dredged or cherry picked

20. Extreme measurements may mislead
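
To make one of these tips concrete, here is a minimal sketch of the base-rate fallacy (tip 8). The numbers are my own illustration, not taken from the paper: a screening test with an assumed 95% sensitivity and 95% specificity, applied to a condition affecting 1% of the screened population.

```python
# Base-rate fallacy, minimal sketch (illustrative numbers, not from the paper):
# even an "accurate" test gives mostly false positives when the condition is rare.

prevalence = 0.01   # P(disease): 1% of the screened population
sensitivity = 0.95  # P(positive test | disease)
specificity = 0.95  # P(negative test | no disease)

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {ppv:.1%}")  # about 16.1%
```

Despite the test looking accurate, only about one positive result in six corresponds to a real case, because the low base rate dominates: this is exactly the kind of arithmetic a policy maker reading a screening study needs to be comfortable with.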

This gives us a comprehensive agenda for our next summer schools, doesn’t it?

We had attempted a similar exercise in France in 2011 at the National Academy of Medicine (the PDF report is freely available, in French), with a stronger focus on epidemiological results. That report was commissioned by the Parliamentary Office for the Evaluation of Scientific and Technological Options (Office Parlementaire d’Evaluation des Choix Scientifiques et Technologiques, OPECST).

These attempts to set up such training programs seem important; however, it may be urgent to see them actually delivered … and probably to assess their usefulness too!
