research to chew on, or that "the chances of a child’s dying in a school shooting are remote," as Sam Harris points out in his excellent overview. We humans don't care about probability theory when we've seen the shattered grimaces of mourning parents on TV and pictured their kids in our minds. No statistic can defeat that, recent polls suggest.
(There's another side of this story, of course: Loving the texture of a trigger too much can also shut down your frontal lobe.)
That said, is it acceptable for journalists and designers to take advantage of people's cognitive biases and shortcomings rather than trying to overcome them? To what extent is an illustration like the one on the left misleading (source: National Post)? My liberal reptilian brain screams that's a lot of deaths, you NRA morons! This is "the point" the illustration makes. At the same time, we all should know that absolute figures are dangerous when presented out of context: Are 900 deaths a lot in a country of 300 million? Are all gun-related deaths equal? And so forth. Still, it is fair to think that a good portion of those deaths could have been avoided if we had stricter control measures, of the kind that even knowledgeable libertarians embrace. Saving hundreds of lives a month is a worthwhile goal in both absolute and relative terms. These should all be elements of a national debate.
I think you can see where I'm going: There's depth, messiness, and uncertainty behind the deceptive simplicity of the National Post illustration, which is more an op-ed than an infographic. Why should we hide them? Why should we devote so much real estate to promoting such a shallow message, rather than using it to help citizens reach a reasonable conclusion? As much as I like this graphic, which is essentially true, I believe that pondering the data further could make it truer.