Jay Rosen on the Afghanistan war logs and the Post's reporting on the intelligence system post-9/11:
I’ve been trying to write about this observation for a while, but haven’t found the means to express it. So I am just going to state it, in what I admit is speculative form. Here’s what I said on Twitter Sunday: “We tend to think: big revelations mean big reactions. But if the story is too big and crashes too many illusions, the exact opposite occurs.” My fear is that this will happen with the Afghanistan logs. Reaction will be unbearably lighter than we have a right to expect— not because the story isn’t sensational or troubling enough, but because it’s too troubling, a mess we cannot fix and therefore prefer to forget.
Last week, it was the Washington Post’s big series, Top Secret America, two years in the making. It reported on the massive security shadowland that has arisen since 9/11. The Post basically showed that there is no accountability, no knowledge at the center of what the system as a whole is doing, and too much “product” to make intelligent use of. We’re wasting billions upon billions of dollars on an intelligence system that does not work. It’s an explosive finding but the explosive reactions haven’t followed, not because the series didn’t do its job, but rather: the job of fixing what is broken would break the system responsible for such fixes.
The mental model on which most investigative journalism is based states that explosive revelations lead to public outcry; elites get the message and reform the system. But what if elites believe that reform is impossible because the problems are too big, the sacrifices too great, the public too distractible? What if cognitive dissonance has been insufficiently accounted for in our theories of how great journalism works… and often fails to work?
That challenge of having "too much 'product' to make intelligent use of" struck me, among other things in the article. On one hand, clients think that more is better-- it must be, right?-- but more is also a lot harder to deal with. I've had clients who wanted to be able to see the raw material I work with, but who certainly didn't have time to read it and think about it themselves. Partly they wanted me to be able to demonstrate that I wasn't just making things up, but there is also this intuitive belief that when it comes to information, more and faster are better.
Unfortunately, that's not the case, and I think most of us recognize it, but we don't have a way to describe "less" as a virtue. All too often "less intelligence" (whether competitive or strategic) translates into "only looking at the data that support my position," or it sounds like "being stupid," rather than being judicious and recognizing the impossibility of reading and digesting everything.
Likewise, the idea that if a "story is too big and crashes too many illusions" it will be ignored strikes me as a nice description of the problem of getting people to take the end of bubbles and Black Swans seriously.