A Rough Guide to Spotting Bad Science:
The vast majority of people get their science news from articles on online news sites, and rarely delve into the research those articles are based on. Personally, I therefore think it's important that people are capable of spotting bad scientific methods, or of realising when articles are being economical with the conclusions drawn from research, and that's what this graphic aims to do. Note that this is not a comprehensive overview, nor is it implied that the presence of one of the points noted automatically means the research should be disregarded. This is merely intended to provide a rough guide to things to be alert to when reading science articles or evaluating research.
I think one of the biggest problems in science reporting is that the effect size (which studies usually report, and which many journals in fact require) is ignored. It's one thing to say that eating some expensive and exotic nut leads to better health, but if the effect is equivalent to getting up each morning and doing jumping jacks for two minutes, it probably isn't worth a trip to the specialty store or the cost. My other pet peeve is that news reporting tends to simplify issues and strip out the caveats that are usually present in the original articles. In most bad news stories, it probably isn't the science itself that's bad but rather the reporting of it, by reporters who aren't qualified to write about science, for a public that doesn't have a sense of what science really looks like.
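To make the effect-size point concrete, here is a minimal sketch of one common measure, Cohen's d (the difference in group means divided by the pooled standard deviation). The groups, scores, and `cohens_d` helper below are entirely made-up illustrations, not drawn from any real study:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a = sum(group_a) / n_a
    mean_b = sum(group_b) / n_b
    # Sample variances (n - 1 in the denominator).
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical health scores: nut eaters vs. a control group.
nut_eaters = [72, 75, 71, 74, 73, 76]
control    = [70, 73, 69, 72, 71, 74]
print(round(cohens_d(nut_eaters, control), 2))  # → 1.07
```

The headline "nuts improve health" tells you nothing about whether d is 0.05 (negligible) or 1.0 (large); reporting the raw difference alongside the spread of the data is what lets a reader judge whether the effect matters in practice.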