There’s a common mistake in the way survey findings are reported. It annoys me every time I see it, and when I saw it again yesterday it annoyed me enough to write about it here. Here’s a screenshot of a tweet promoting the article in question. Can you spot the mistake?
Pretty awful, right?
So, the survey asked people which groups they thought faced a lot of discrimination in the United States today. 44% of white evangelicals said Muslims did, and 57% of white evangelicals said Christians did. So more white evangelicals think Christians face a lot of discrimination than think Muslims do. But that’s not what the headline says.
The headline says that white evangelicals think Christians face more discrimination than Muslims. And it does follow from the survey’s findings that at least 13% of them think this, since at least 13% think Christians face a lot of discrimination and don’t think Muslims do. But it’s still open, consistently with the findings, that up to 87% think Muslims face more discrimination. I guess the true figure probably isn’t as high as 87%, but we just don’t know what it is. If a survey finds only that at least 13% of people in a group think something, it’s misleading to report the survey as finding that the group as a whole thinks that thing. And if that thing is both false and a dangerous and foolish thing to think, as in this case, it’s especially important not to misleadingly report the group as thinking it.
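The bound reasoning above can be sketched numerically. This is just an illustration of the logic, assuming only the two published percentages (57% for Christians, 44% for Muslims) and nothing else about how the answers overlap:

```python
# Shares of white evangelicals answering "yes, faces a lot of
# discrimination" for each group, per the survey.
christians_yes = 0.57
muslims_yes = 0.44

# The smallest possible share who said yes for Christians but no for
# Muslims: the overlap between the two yes-groups is largest when every
# Muslim-yes respondent is also a Christian-yes respondent, leaving
# 57% - 44% = 13% who must have said yes only for Christians.
min_christians_only = max(christians_yes - muslims_yes, 0.0)

# Everyone else could, consistently with the data, think Muslims face
# as much or more discrimination.
max_remainder = 1.0 - min_christians_only

print(round(min_christians_only, 2))  # 0.13
print(round(max_remainder, 2))        # 0.87
```

So the data pin down only the 13% floor; the headline's "white evangelicals think" claim needs the unreported overlap, which the survey doesn't give us.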
Now, you might say that from the results of the survey it’s more or less certain that most white evangelicals think Christians face more discrimination than Muslims. I don’t think it is. But even if it is, the reader can be the judge of that. Report what the survey says, and let the reader draw their own conclusions. Or argue for a conclusion. But don’t just misreport the findings of the survey.