Reflections on treatments for today's 'information disorder'
Few people who see how many of us interact with each other in the age of social media would dispute the Aspen Institute's assertion that contemporary society is suffering from an "information disorder."
This week, the institute, a global think tank striving for, as its website www.aspeninstitute.org declares, "a free, just, and equitable society," issued a report describing the information disorder it sees and suggesting some ways to address it.
The 80-page report is a product of a commission established by the institute to study the issue and make recommendations for addressing it. The website's introduction to the report opens with an ominous declaration: "America is in a crisis of trust and truth. Bad information has become as prevalent, persuasive and persistent as good information, creating a chain reaction of harm."
Then, the report itself proceeds to describe what the commission sees as the symptoms of this disease and offers 15 strategies for alleviating, if not entirely eliminating, them. I'm under no illusions that most people will rush out to read a think tank's wonky report, but if you care about the state of information these days, even a cursory reflection on the document is worth your time, and some of its highlights are worth noting.
One of the most striking, for me, comes in the observation that an inherent "demand" for misinformation already is baked into society.
"'Disinformation' and information campaigns by bad actors don't magically create bigotry, misogyny, racism or intolerance," the report states. "Instead, such efforts are often about giving readers and consumers permission to believe things they were already predisposed to believe."
In other words, social media are not the source of some of our most difficult divisions, but they do widen them. And closing the gulfs will require a variety of public and personal actions.
The report cites a range of additional impacts and characteristics of the information disorder, arguing that government has been slow to respond to the problem, that "Big Tech" uses this sluggishness to suggest an interest in addressing the problem while lobbying against initiatives to do so and, particularly, that the consequences for sowing misinformation and discord have shifted.
In the past, the report notes, business and political leaders have faced the risk of punishment for lying or misleading the public. Now, that fear has been flipped on its head.
"Today, though, they're increasingly celebrated for their lies and mistruths -- and punished, politically, for not ascribing to others' falsehoods," the report states.
Some of these observations are hardly news to anyone who has been paying attention to public conversations over the past decade. But they are still interesting to contemplate in the context of efforts to produce solutions.
The commission breaks down its 15 recommendations into three categories of action -- increasing transparency, building trust and reducing harms. Suggestions include requiring social media platforms to monitor and report on the dissemination of bad information, developing digital tools that strengthen trust among communities, investing in local news media, improving education about the ways misinformation spreads and creating punishments for individuals and agencies that spread it.
The tone and detail of a report of this magnitude cannot be thoroughly distilled in the small space of a simple newspaper column, but the message deserves to be emphasized: An "information disorder" is infecting society; it has real, damaging consequences; and there are things that can be done about it.
Are the Aspen Institute's recommendations the be-all and end-all response to the problem? Of course not.
But its ideas certainly are worth considering and studying.
• Jim Slusher, email@example.com, is deputy managing editor for opinion at the Daily Herald. Follow him on Facebook at www.facebook.com/jim.slusher1 and on Twitter at @JimSlusher.