In a recent Wall Street Journal op-ed, News Corp chief executive Robert Thomson sets his sights on Facebook, Google, and Silicon Valley more broadly. He argues that, through the commodification of content, “the two most powerful news publishers in human history have created an ecosystem that is dysfunctional and socially destructive.” It is an ironic start, given that content commodification is the raison d’être of the news business, and one that quickly spirals into a screed against the entire modern digital economy.

Strap yourselves in, because there’s a lot to pick at here.

The Times They Are A-Changin’

To start, Mr. Thomson’s general contention seems to be that Facebook and Google (the “digital duopoly”) have made just about everything worse. The contentious political environment around the world is, he argues, “in part, a result of the idealism of the Silicon Valley set” that aims to connect people through digital channels. The greed driving the Valley’s pursuit of ubiquitous global interconnectivity, and its failure to curate content appropriately, are the reasons the world is going to hell in a handbasket. After all, Thomson points out, these companies:

could have done far more to highlight that there is a hierarchy of content, but instead they have prospered mightily by peddling a flat-earth philosophy that doesn’t distinguish between the fake and the real because they make copious amounts of money from both.

The commodification of content is what drives these decisions, according to Mr. Thomson. If tech firms had employed the same approach to governing content distribution as the newspapers of yesteryear, presumably we wouldn’t have “fake news.” Is this an argument for editorial discretion and curation, perhaps even censorship? Mr. Thomson seems to waffle on this point.

On the one hand, he bemoans how “institutional neglect has left us perched on the edge of the slippery slope of censorship.” Yet this lamentation is preceded by his “hoping, mostly against hope” that the same companies that have supposedly placed us in this precarious position will “take meaningful action … to purge their sites of the rampant piracy that undermines creativity.” So which is it? To censor, or not to censor? He doesn’t really say.

Instead, Mr. Thomson not-so-subtly suggests that the problem is the lack of a reputable code of honor in the tech community. If only Silicon Valley could imitate the great bastions of journalistic integrity, we presumably would not be staring down an impending information dystopia. Unfortunately, he goes on, “[t]here is no Silicon Valley tradition, as there is at great newspapers, of each day arguing over rights and wrongs, of fretful, thoughtful agonizing over social responsibility and freedom of speech.” Except Silicon Valley does fret over those issues. All the time.

Of course, no piece on the fake news crisis would be complete without throwing some blame at the algorithms.

Mr. Thomson castigates what he characterizes as the Valley’s almost-religious devotion to algorithms as an end-all-be-all solution to the epidemic of proliferating falsehoods. Those systems, he argues, are “routinely cite[d] as a supposedly objective source of wisdom and insight. These algorithms are obviously set, tuned and repeatedly adjusted to suit their commercial needs.” None of that is true. Nobody with even a superficial understanding of algorithms claims they serve as an “objective source of wisdom and insight,” and it’s far from “obvious” that they are tuned and retrained solely to suit “commercial needs.” For an article that so haughtily criticizes the “journalistic jetsam and fake flotsam” of the tech industry, there certainly doesn’t seem to be much concern with mischaracterizing a notoriously complicated technology.

Mr. Thomson’s criticisms, slathered in a molasses-drenched coating of pretentious loquaciousness (see, I can do it, too), read more as a diatribe against changing times than an empirically grounded, solution-oriented approach to the problems he (mis)identifies. The article isn’t a critique; it’s a polemical condemnation of the digital age, a lament for a bygone, idyllic past. One can almost hear Grandpa Simpson yearning for the times when he was with “it” before they changed what “it” was.

But if the “digital duopoly” isn’t to blame for fake news, who is? As with so many things in life, it’s complicated.

The Unaddressed Benefits of Fake News and Social Media

Although I take issue with Mr. Thomson’s tirade against the digital content distribution industry, he is broadly correct in noting that Silicon Valley has changed the rules of the game. That’s not necessarily a bad thing, however.

Where online service providers like Facebook create platforms for individual expression and content creation, the news media giants of yesteryear offered gated communities of content, curated and whitewashed by publishers who until recently held a monopoly on deciding what counts as “newsworthy.” That older model was not necessarily a bad thing. Especially amid the current crisis of truth-seeking in an online ecosystem so polluted by falsehoods, good journalism is more important than it has ever been. But so is recognizing the role that social media and the Internet have played in revealing the social malaise underlying the fake-news epidemic.

Ironically, the speed at which digital communications platforms have spread falsehoods corresponds to the speed at which we’ve been able to recognize an undercurrent of social discontent. Social media can spread fake news, but it also spreads awareness of fake news. That in turn has sparked a broader conversation about the current state of our post-truth political atmosphere and socio-cultural perturbations.

Whatever blame can be assigned to online service providers like Facebook and Google is insignificant compared to those larger socio-cultural forces. And while it’s easy to assign the lion’s share of condemnation to society, it’s just as true that individuals bear significant responsibility for the current state of affairs. The Internet, after all, is just a reflection of ourselves. It “mirrors society and, by extension, each of us: our preferences, associations, and world views.”

Every link we click matters; every scurrilous post we share makes a difference, sometimes for the worse. In the aggregate, all of these small actions have an impact on the norms of the still-emerging cybersociety. As in the real world, we share a collective responsibility for the world still being built in cyberspace. That broader responsibility will determine the Internet’s cultural trajectory. The world of binary bits can either reflect the best that humanity has to offer, or the worst. Each of us has to decide what and how we contribute to this digital ecosystem, and bear the responsibility for the outcomes.

None of these problems is new, and contrary to Mr. Thomson’s assertions, they haven’t gotten worse either. Social media and the Internet have actually enabled us to see them more clearly, revealing the messy complexity at the heart of the human social order. The problems only seem intractable because the solutions aren’t as simple as we’d like to believe.

In an age of global digital connectivity, we are all complicit. Unfortunately, this interwoven web of infinite complexity means that simple solutions will not suffice.

Conclusion

Danah Boyd recently wrote what I consider to be one of the most articulate and intellectually honest assessments of the fake-news issue:

Try writing a content policy that you think would work. And then think about all of the ways in which you’d be eliminating acceptable practices through that policy. Next, consider how your adversaries would work around your policy. This was what I did at Blogger and LiveJournal, and I can’t even begin to express how challenging it was to do that work. I can’t even begin to tell you the number of images I saw that challenged the line between pornography and breastfeeding. These lines aren’t as clean as you’d think.

I don’t want to let companies off the hook, because they do have a responsibility in this ecosystem. But they’re not going to produce the silver bullet that they’re being asked to produce. And I think that most critics of these companies are really naive if they think that this is an easy problem for them to fix.

Her solution is less simple, but it recognizes the inherent difficulty of policing online content:

The puzzles made visible through “fake news” are hard. They are socially and culturally hard. They force us to contend with how people construct knowledge and ideas, communicate with others and construct a society. They are also deeply messy, revealing divisions and fractures in beliefs and attitudes. And that means that they are not technically easy to build or implement. If we want technical solutions to complex socio-technical issues, we can’t simply throw it over the wall and tell companies to fix the broken parts of society that they made visible and helped magnify. We need to work together and build coalitions of groups who do not share the same political and social ideals to address the issues that we can all agree are broken. Otherwise, all we’re going to be doing is trying to wage a cultural war with companies as the intermediary and referee. And that sounds like a dreadful idea.

Mr. Thomson was right about two things. The first is that “authenticated authenticity” is indeed “an asset of increasing value in an age of the artificial.” That may offer an opportunity for high-caliber journalists of integrity to find a more effective voice in the digital age.

The second is that “understanding the ebb and flow of humanity will not be based on fake news or ersatz empathy, but on real insight.” I think that’s absolutely true. However, it’s also true that insight is not uniquely the property and purview of the “great newspapers” that held a monopoly on the dissemination of information in the pre-digital era. As the propagation of fake news has shown us, insight into “the ebb and flow of humanity” can come from many quarters, as can solutions to the emerging problems online. And as the Internet continues to mature, it will almost certainly spawn solutions to this problem that we cannot yet imagine.

In the meantime, we’ll all have to work a little harder to make a better cybersociety, one comment or share at a time.