Did Facebook Really Polarize and Misinform the 2016 Electorate?

From Russian trolls to racist rhetoric, Facebook has been blamed for the divisive 2016 presidential election. Does Facebook direct users to diverse information or to fake news and ads that misinform, making us hate the other side? Michael Beam finds that Facebook users actually saw more information from the other political side in 2016; he finds no evidence that Facebook polarized our attitudes. But Young Mie Kim finds 2016 Facebook users saw lots of divisive misinformation from untraceable groups via ads. Facebook may just be the latest scapegoat for our polarized politics, but its stream of information is making it hard to sort fact from fiction.

The Niskanen Center’s Political Research Digest features up-and-coming researchers delivering fresh insights on the big trends driving American politics today. Get beyond punditry to data-driven understanding of today’s Washington with host and political scientist Matt Grossmann. Each 15-minute episode covers two new cutting-edge studies and interviews two researchers.

You can subscribe to the Political Research Digest on iTunes here.

Photo by MICHAEL REYNOLDS/EPA-EFE/REX/Shutterstock

Transcript

Matt Grossmann: This week on Political Research Digest, did Facebook polarize and misinform the 2016 electorate? From the Niskanen Center, I’m Matt Grossmann.

From Russian trolls to racist rhetoric, Facebook has been blamed for the divisive 2016 presidential election. So, does Facebook direct users to diverse information or to fake news and ads that misinform, making us hate the other side? New research tracking opinion during the 2016 campaign suggests Facebook use was not responsible for our division.

I talked to Michael Beam of Kent State University about his new Information, Communication and Society article with Myiah Hutchens and Jay Hmielowski: “Facebook News and (De)polarization: Reinforcing Spirals in the 2016 Election.” They find no evidence that Facebook users developed more negative attitudes toward the other party or increasingly avoided news from the other side during the campaign. But misinformation may still have reached voters through Facebook ads. I also talked to Young Mie Kim of the University of Wisconsin about her new article in Political Communication with eight co-authors: “The Stealth Media? Groups and Targets behind Divisive Issue Campaigns on Facebook.”

She finds that Facebook users in battleground states saw plenty of divisive ads from untrackable groups in the late stages of the campaign. So, are we using social media to cocoon ourselves, seeing only news that fits our current views and perhaps even helping Donald Trump in 2016? Michael Beam says that’s mostly a myth.

Michael Beam: Using Facebook for news during the 2016 election in the U.S. did not lead to increased attitude polarization. There was a popular notion in news, journalism, and some tech circles that people who use Facebook and social media for news would be trapped in echo chambers or filter bubbles, basically surrounded by news they already agree with. But our results show that people who use Facebook for news actually show modest levels of depolarization compared to those who don’t use it. We argue that is happening because they’re more likely than others to see news they disagree with come across their feed.

Matt Grossmann: The conventional wisdom is also out of step with most other research.

Michael Beam: Most of the empirical research in this area, ours and others’, has found that using technology and social media for news recommendations generally relates to a lot more information exposure: more news that you agree with and more news that you disagree with.

Matt Grossmann: The 2016 election was quite different, but Beam says we should identify specific changes rather than blame social media for Trump.

Michael Beam: In 2016, I would argue that more information was probably flying around when you used Facebook. But our data do not show, and I don’t believe, that for the general Facebook user the needle was moved in terms of being more likely to support Trump because of greater exposure to misinformation. For specific groups, like people with high conspiratorial beliefs, this was a great place to really dig into that material, but that’s such a small share of the population that I’m skeptical it made a big difference in terms of the vote.

Matt Grossmann: Beam used panel survey data to identify changes during the campaign.

Michael Beam: The survey data that we were given was 500 completed surveys for each of the three waves. The same person took the survey three times: once right after the party conventions, once right after the debates ended, which I believe was early October, and then once in the week just before the election.

Matt Grossmann: At each time point, they looked at polarization, Facebook use and consumption of news that agrees and disagrees with users’ views.

Michael Beam: We measured affective polarization using party feeling thermometers. We asked people to rate, on a clickable thermometer running from zero to 100, how they felt about the Democratic Party and the Republican Party. Then we took the difference of those two scores, and that gave us a polarization score. That’s a common way affective polarization has been measured in the political science literature for quite some time. We also measured how frequently people were using Facebook for news.
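
To make the measure concrete, here is a minimal Python sketch of that polarization score: the absolute gap between the two feeling-thermometer ratings, tracked across the three survey waves. The respondent data below are hypothetical, not from the study.

```python
# Affective polarization as Beam describes it: the absolute difference
# between feeling-thermometer ratings of the two parties (0-100 scale).
# The sample respondent below is hypothetical.

def polarization_score(dem_thermometer: float, rep_thermometer: float) -> float:
    """Absolute gap between the two party ratings on a 0-100 thermometer."""
    return abs(dem_thermometer - rep_thermometer)

# One hypothetical respondent, rated across the three survey waves.
waves = [
    {"dem": 75, "rep": 30},  # wave 1: after the conventions
    {"dem": 72, "rep": 31},  # wave 2: after the debates
    {"dem": 70, "rep": 33},  # wave 3: the week before the election
]

scores = [polarization_score(w["dem"], w["rep"]) for w in waves]
print(scores)                  # [45, 41, 37]
print(scores[-1] - scores[0])  # -8: this respondent depolarized slightly
```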

Matt Grossmann: Overall, the changes in views were quite minimal.

Michael Beam: We asked them three times and then looked at how their responses changed over time. We saw some slight changes, but by and large these behaviors and attitudes stayed mostly static.

Matt Grossmann: If you don’t ask repeatedly, it can look like Facebook is polarizing us but that’s because the already polarized gravitate to Facebook for their news.

Michael Beam: The 20 percent of people for whom news and politics are a hobby or a profession have very solidified attitudes that are going to be hard to change. So it’s unsurprising to us that early in the election, the people using these tools for news more are the ones who are more polarized at that time, because historically a lot of people just aren’t that interested or informed during elections. Throughout the election, we find this slight depolarization. Again, it’s very modest: one or two degrees on this 100-point feeling thermometer, where we find that others polarize by one or two degrees over the same period.

Michael Beam: In the end, we’re looking at an average level of polarization of 50 degrees on this 100-point feeling thermometer.

Matt Grossmann: The change that did occur in 2016 was toward reduced ratings of the out-party during the campaign, but the declines were minimal overall and actually less severe among Facebook news readers.

Michael Beam: We do see a slight negative dip across the sample in out-party scores. People become colder toward their out-party over time, and that’s true not just in our 2016 data: if we look at national trends across 30 or 40 years of affective polarization, it’s not that we’re becoming more gung-ho about our in-party; we dislike the out-party more.

Matt Grossmann: Beam thinks that’s because Facebook users were more likely to encounter news that disagreed with their political perspective.

Michael Beam: Going to those counter-attitudinal news stories and getting more involved in news, I suspect that’s where the change is happening, and it likely shows up as very slight increases in out-party scores.

Matt Grossmann: Overall, he finds that social media expands access to diverse sources of news.

Michael Beam: Using social and algorithmic news facilitates more entry into news in general, both in the election and political news categories and across other categories. People who are interested in business news or sports news find that when they use these social and algorithmic systems, they read more types of news from more sources.

Matt Grossmann: And he saw similar effects for Trump and Clinton supporters, mostly confirming that they did not seek out news against their views.

Michael Beam: We see very, very small differences in these variables. For pro-attitudinal news exposure, we see no differences. For counter-attitudinal news exposure, we see a very small but statistically significant difference in two of the three waves. But looking at the mean values, Trump supporters are at .7 and Clinton supporters at .99, on a scale where zero meant “I never see counter-attitudinal news” and one meant “less than several times a month.”

Matt Grossmann: But Young Mie Kim found that most Facebook users saw a lot of suspicious ads in 2016.

Young Mie Kim: We have two key top-line findings. First, anonymous groups ran divisive issue campaigns, including candidate attacks. Second, these anonymous groups targeted battleground states such as Wisconsin and Pennsylvania, and low-income white voters.

First of all, anonymous groups ran divisive issue campaigns. Anonymous groups include unidentifiable suspicious groups, astroturf groups, unregistered movement groups, and nonprofits that do not report to the FEC. We define suspicious groups as basically unidentifiable, unknown groups with no public footprint. Even after we matched them against Federal Election Commission reports, IRS databases, and other research databases, we could not find any information about these sponsors. Those are the suspicious groups. Later, one in six of our suspicious groups turned out to be Kremlin-linked Russian groups. The volume of ads run by these anonymous groups was four times larger than that of FEC-registered groups.
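
That matching step can be pictured with a minimal Python sketch: a sponsor is flagged as suspicious only when its name matches nothing in the public registries. The registry contents and the unmatched group name below are illustrative stand-ins, not the study’s actual databases.

```python
# A sponsor is "suspicious" when its name appears in no public registry.
# These tiny in-memory registries are illustrative stand-ins; the study
# matched against full FEC reports, IRS databases, and research databases.

fec_registered = {"priorities usa action", "nra political victory fund"}
irs_nonprofits = {"heritage action for america"}

def classify_sponsor(name: str) -> str:
    key = name.strip().lower()
    if key in fec_registered:
        return "FEC-registered group"
    if key in irs_nonprofits:
        return "IRS-listed nonprofit"
    return "suspicious: no public footprint found"

# "Patriots for Prosperity 2016" is a made-up name for illustration.
for sponsor in ["Priorities USA Action", "Patriots for Prosperity 2016"]:
    print(sponsor, "->", classify_sponsor(sponsor))
```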

Matt Grossmann: Kim did not have to rely on self-reports. She saw the same ads her participants did via an app.

Young Mie Kim: We developed an app that worked like an ad blocker, but instead of blocking ads, it captured them as content, along with meta-information such as the landing page. The landing page is the ultimate destination page: if you click the ad, where does it lead you? Using this app, we captured nearly five million paid ads on Facebook shown to 10,000 volunteer participants between September 28 and November 8, 2016.
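
Conceptually, each capture is a structured log entry, as in this minimal Python sketch: for every ad a participant sees, record the creative, the sponsor label, and the landing page the ad resolves to. This is an illustration only, not the team’s actual app; every field name and value below is an assumption.

```python
# One hypothetical log entry per ad exposure, written as newline-delimited
# JSON. All field names and values are illustrative assumptions.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class CapturedAd:
    participant_id: str  # anonymized volunteer identifier
    ad_text: str         # the ad creative's visible text
    sponsor_label: str   # the "sponsored by" label shown with the ad
    landing_page: str    # final destination URL of the ad link
    captured_at: str     # UTC timestamp of the exposure

def record_ad(log_path: str, ad: CapturedAd) -> None:
    """Append one ad exposure to a newline-delimited JSON log."""
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(ad)) + "\n")

record_ad("ads.jsonl", CapturedAd(
    participant_id="p0042",
    ad_text="Tell Congress: secure our borders now",  # hypothetical creative
    sponsor_label="Concerned Citizens United",        # hypothetical sponsor
    landing_page="https://example.org/petition",
    captured_at=datetime.now(timezone.utc).isoformat(),
))
```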

Matt Grossmann: She found that Facebook users saw many political ads leading up to election day, including some from Russian sources and others that looked just as shady.

Young Mie Kim: The conventional wisdom about Facebook is that people probably don’t get many political ads there. My empirical research shows that there are actually a lot of ads: 23 percent of all paid ads were political. That’s quite a lot. In the media, there were many reports about Russian ads on Facebook. Based on our data, we now know that there were quite a lot of Russian groups on Facebook and that they generated a lot of ads.

Matt Grossmann: Only some of the ads were accompanied by organic news content, but the stories that did appear matched their ad content.

Young Mie Kim: Some of the suspicious groups focus only on paid ads. If you look at their Facebook pages, there is almost no information; it’s a collection of pictures or abstract artwork and things like that.

Some of the suspicious groups seem to promote so-called fake news. The paid ads are candidate attacks, but if you click on an ad and it directs you to the group’s Facebook page, the organic posts there read like a series of breaking news items, and most of them are misleading or false information. Some of the suspicious groups are clearly extreme ideologues.

Matt Grossmann: By surveying users and observing their online activity, she was also able to investigate targeting.

Young Mie Kim: The reason we wanted to do a survey is that we wanted to reverse-engineer the targeting patterns, because targeting is a black box. It is based on algorithms, on things like the modeling or targeting metrics provided by Facebook. It is largely unknown; nobody actually knows. So the only way we can make an inference is to reverse-engineer it and see who is more likely to see what types of ads.
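
The inference she describes can be sketched in a few lines of Python: join each ad exposure to the viewer’s demographic group, then compare each group’s exposure rate for an issue against the overall rate. The exposure records below are made up for illustration, and a real analysis would control for confounders with a regression model rather than raw rates.

```python
# Infer likely targeting by comparing each group's exposure rate for an
# issue to the overall rate. All records here are hypothetical.
from collections import Counter

# (viewer's demographic group, ad issue category), one tuple per exposure
exposures = [
    ("white_low_income", "immigration"),
    ("white_low_income", "immigration"),
    ("white_low_income", "guns"),
    ("other", "immigration"),
    ("other", "economy"),
    ("other", "economy"),
]

by_group = Counter(group for group, _ in exposures)
by_group_issue = Counter(exposures)
overall = sum(1 for _, issue in exposures if issue == "immigration") / len(exposures)

for group in by_group:
    rate = by_group_issue[(group, "immigration")] / by_group[group]
    lift = rate / overall - 1  # how far above or below the overall rate
    print(f"{group}: immigration ad share {rate:.0%} ({lift:+.0%} vs overall)")
```

A group whose exposure rate sits far above the overall rate, as with the hypothetical low-income white group here, is a plausible target of those ads.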

Matt Grossmann: She found upper Midwest swing-state targets.

Young Mie Kim: Voters were clearly targeted. Geographically, the ads targeted key battleground states including Pennsylvania, Wisconsin, and Virginia; those are the top three most-targeted states. Pennsylvania and Wisconsin, especially, overlap with the battleground states.

Matt Grossmann: Poor whites were specifically targeted for messages about race and immigration.

Young Mie Kim: Low-income voters, those with household incomes under $40,000, were specifically targeted with ads focusing on immigration and racial conflict. White voters, compared to the other groups we measured, were also heavily targeted on the issue of immigration. For example, white voters received about 44 percent more immigration ads than the average voting-age population, and about 87 percent of all immigration ads targeted white voters.

Matt Grossmann: Instead of one big campaign, lots of small efforts added up to a lot of ads.

Young Mie Kim: We found that divisive issue campaigns were run by a large number of small groups, not by one big campaign. With just a few exceptions, most of the suspicious groups are small niche groups that nobody has ever heard of, but when a large number of suspicious groups like that operate on digital platforms, it adds up and can have a significant impact.

Matt Grossmann: Beam says both studies showed a lot of diversity of information, leaving voters to decide what is true.

Michael Beam: There is a consistency across our studies, and that is that Facebook is an important vector for all kinds of information. When you’re using these platforms, you’re thrown into an information world and you’re going to be exposed to a lot of stuff: stuff that’s true and untrue, stuff on the right and on the left. I think that’s a really important point and something we can explore more in social science. Where we’re at right now is that people are not being isolated from particular types of information, which was the idea of a filter bubble. Instead, people are really getting the whole hose, the whole stream of information, which puts the onus on users to suss out what is reliable and what is not.

Matt Grossmann: But the picture from Kim’s study is decidedly bleaker, as ads were designed to reinforce existing views or even suppress turnout, rather than persuade.

Young Mie Kim: These ads are targeted to people who are already on board with particular positions. For example, anti-immigration ads are targeted to people who are already against immigration policies. That just reinforces those people and increases cynicism about mainstream politics and the functioning of institutions in general.

The other tactic we found is voter suppression. Some ads, for example anti-Hillary ads, targeted weak Democrats. These people were clearly exposed to ads saying things like “I will not vote for Hillary Clinton this election.” That might not change their voting decision, but it might influence whether they turn out to vote.

Matt Grossmann: Beam says rather than blame Facebook, we should isolate the types of people and the uses most likely to be problematic.

Michael Beam: Facebook has become more of a scapegoat than it should be, but that’s not to say it’s something we shouldn’t be attentive to. There is an important point that we don’t hear about a lot, and that is that particular groups are very attentive and engaged on social media. For example, our data show that basically nobody is using Twitter. It’s a very select group of people who use Twitter, but it does happen to be a place where journalists and academics find themselves.

Michael Beam: When we look at how people come to know what President Trump is tweeting, the vast majority of people … We published a book chapter showing that Republicans and Trump supporters are actually far less likely to be on Twitter than others, but his supporters still find out what he says, because the people who are on Twitter amplify that voice. Primarily it is journalists amplifying what he’s saying, and then people find out through television, radio, or their friends. I don’t want to say that social media is not having very real impacts, both positive and negative, on our public discourse, but it’s not as simple as saying that people who use it will show these differences and people who don’t won’t.

Matt Grossmann: But Kim says regulation is needed, at least to match what voters learn about television ads.

Young Mie Kim: Voters should know. They have a right to know who is influencing them, or even who is trying to manipulate them, and who is collecting their personal information and using it for what purposes.

Matt Grossmann: There’s a lot more to learn. Political Research Digest is available biweekly from the Niskanen Center and on iTunes. I am your host, Matt Grossmann. Thanks to Young Mie Kim and Michael Beam for joining me. Join us next time to find out whether American politics are nationalizing as voters tune out local media, and whether the news is still informing voters.

Photo Credit: Anthony Quintano from Honolulu, HI, United States [CC BY 2.0 (https://creativecommons.org/licenses/by/2.0)]