Fourteen Do’s and Don’ts for Medical Bloggers


Fellow freelance journalist James Butcher alerted me to the existence of a clutch of rather verbose guidelines for journalists and others pertaining to the reporting of medical research results.

The guidelines were published in November by the UK’s Academy of Medical Sciences, an organisation that apparently promotes advances in medical science and campaigns to ensure these are translated as quickly as possible into healthcare benefits for society. The report primarily highlights the role of observational research in identifying environmental and lifestyle causes of disease, such as obesity, diabetes, cancer, and cardiovascular disease, but warns researchers against overstating the importance of their findings.

The guidelines themselves actually formed part of a much wider document entitled, not very snappily, Identifying the environmental causes of disease: how should we decide what to believe and when to take action? The title alone should warn readers that the guidelines themselves are likely to raise a few issues.

Of course, most medical and science journalists will already have their own internal list of guidelines to follow, probably in much clearer and simpler English, and aimed firmly at writing the best piece they can rather than at satisfying some higher bureaucratic order. Depending on the writer’s background, these internal guidelines will overlap in essence with the fourteen do’s and don’ts listed below; in other areas there will be little common ground.

Some might say that the list of guidelines is a little patronising to journalists and written in an ostentatious and overblown manner. Others might point out that they assume far too much prior knowledge. Do political, legal, financial, and arts correspondents get such lists of guidelines, I wonder? Would those correspondents, as opposed to a science or medical writer, understand the technicalities of item 1a, “What was the sample?” Well, if you know how to answer that question, you probably don’t need the guidelines, and if you don’t know how to answer it, then you should maybe stick to covering art gallery openings and ministerial indiscretions. Either that or head back to school for a quick stats course.

This is not the first time an organisation like this has attempted to lay down guidelines for journalists and others. The Royal Society made an attempt a few years ago and recently revised its guidelines. Ironically, it did not consult the Association of British Science Writers in producing them, and so they went down like the proverbial plumbum inflatable. More to the point, rulez is for breakin’, and if you’re a (tabloid) hack intent on writing a health scare story, then you’re going to write it regardless of any list of guidelines from an organisation of which you have probably never even heard. And, if you’re not a scaremonger, then, as I said before, you will already have your own endogenous list of guidelines.

Perhaps what is needed is some kind of guidance for the public, rather than the journalists, that allows them to make more sense of the dozens and dozens of health stories on cancer, obesity, estrogen, bird flu, HIV, and MMR vaccination. Such advice might help them to see the facts when those internal guidelines have been overridden in the name of great headlines.

It would be an interesting exercise to analyse each of the news articles and other items I cited in a recent post entitled Obesity News Epidemic, to see just how well each of them meshes with the ACMedSci guidelines. Anyone care to take on the research project?

Anyway, with ACMedSci’s permission I’ve cribbed the guidelines below for your delectation and to save you wading through the 150 pages of the less-than-snappy document. They were originally aimed at journalists and others in the media, presumably to help prevent sensationalisation and health scares. They could be equally useful/useless (del. as applic.) to bloggers and others too.

  1. Pay detailed attention to the methodology of all studies being reported. Important questions to consider include:
     a. What was the sample?
     b. What were the measures?
     c. How strong were the effects in both relative and absolute terms?
     d. Has there been adequate attention to alternative explanations, and to good control of possible confounding variables?
     e. Has the finding been replicated?
     f. Is there supporting experimental or quasi-experimental evidence?
     g. Are the findings in keeping with what is known about disease mechanisms?
  2. Whilst it may not be appropriate to offer extensive discussion of all these details when writing or speaking to the general public, key aspects can be communicated successfully using clear, jargon-free language.
  3. The science or medical correspondent needs to have an appropriate grasp of the scientific issues in order to know how best to convey what was novel, interesting and important in the research.
  4. Exercise appropriate judgment in identifying and drawing attention to those points of design that are particularly relevant to the study in question – especially when ignoring them might lead to misunderstanding.
  5. Bear in mind the research track record of the researchers and of their employing institutions.
  6. Consider whether there are any conflicts of interest that might lead to possible bias.
  7. Seek to determine the theory or set of biological findings that constitute the basis for the research – noting how this fits in with, or forces changes in, what we already know or believe.
  8. Whilst paying appropriate attention to competing views, be wary of creating spurious and misleading ‘balance’ by giving equal weight to solid research evidence and weakly supported idiosyncratic views.
  9. Be very wary of drawing conclusions on the basis of any single study, whatever its quality.
  10. When considering public policy implications, draw a careful distinction between relative risk (i.e. the increased probability of some outcome given the disease-causing factor) and absolute risk (i.e. the probability of that disease outcome in those with the risk factor) – see the worked sketch after this list.
  11. Use simple counts to describe risk whenever possible, rather than probabilities.
  12. Be careful, insofar as the evidence allows, to clarify whether the causal effect applies to everyone or only to a small special sub-segment of the population.
  13. Set the causal factor you are describing in the context of all known causal factors, whilst explaining that there may be others, as yet unknown or unsuspected.
  14. In writing about research, seek to educate and engage readers with the science and to encourage them to think critically.
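
To make items 10 and 11 concrete, here is a minimal sketch with entirely made-up numbers (a hypothetical exposure and invented case counts, not figures from the report): the same finding can be reported as a headline-friendly relative risk or as the rather less alarming absolute risk and simple counts.

```python
# Hypothetical figures only: 10,000 people exposed to some risk factor,
# 10,000 not exposed, and the number of cases of a disease counted in each group.
exposed_cases, exposed_total = 6, 10_000
unexposed_cases, unexposed_total = 4, 10_000

risk_exposed = exposed_cases / exposed_total        # absolute risk, exposed group
risk_unexposed = unexposed_cases / unexposed_total  # absolute risk, unexposed group

relative_risk = risk_exposed / risk_unexposed       # the headline number
absolute_increase = risk_exposed - risk_unexposed   # the rather smaller number

print(f"Relative risk: {relative_risk:.2f} ({relative_risk - 1:.0%} increase)")
print(f"Absolute risk: {risk_unexposed:.3%} vs {risk_exposed:.3%}")
print(f"Extra cases per 10,000 people: {absolute_increase * 10_000:.0f}")
```

With these invented numbers, “a 50% increased risk” and “two extra cases in every 10,000 people” describe exactly the same data; item 11’s advice is essentially to lead with the second formulation.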