Though former Vice President Joe Biden has won the recent election in the United States, the country remains deeply polarized. In Israel, which has held three consecutive elections with another one possibly around the corner, "polarization" doesn't even begin to cover it.
Israeli researcher Dr. Roi Levy studies how societies polarize into camps that struggle to find common ground. His new study shows that one way to reduce polarization is through exposure to media from the opposite side of the ideological aisle.
Levy, a political economy researcher, is currently a postdoctoral fellow at MIT. His study, based on a large-scale experiment with 38,000 participants in their "natural" Facebook environment, was recently accepted for publication in the American Economic Review.
Lately there has been a growing preoccupation with the concept of polarization, but societies have always divided into political camps. So, is this a new concept? What has changed?
"In the academic field, this is not a new concept, it has existed since the 1980s and 1990s, when researchers showed a decline in cooperation between party representatives in Congress,” says Levy. “But yes, lately there is a sense that this trend is strengthening, societies are divided into two separate camps. Support a Democratic presidential candidate and a Republican Senate candidate? Today, people are less willing to even consider it.”
"There is a lot of evidence that emotional polarization is increasing. Democrats and Republicans hate each other more,” Levy continues.
But he adds an important caveat. “We have not necessarily moved farther apart in terms of our opinions,” Levy says. “What has changed is our emotions. Emotions between different groups are more negative. I would call it emotional polarization.”
The rise in emotional polarization led Levy to investigate what happens when people are exposed to news written from an opposing viewpoint.
"The question of the relationship between the media that people consume, how they consume it, and its impact on their behaviors and opinions is a topic that has always interested me,” Levy says. “There are not many careful studies that systematically present findings on these issues."
Levy explains that in the field of political economy, large-scale field trials are still relatively rare. "It was important for me to investigate in the most organic environment, in the natural environment where people consume news – their Facebook."
One like, and see what follows
Levy kept his experiment simple, interfering as little as possible with his subjects’ Facebook feeds.
In the first phase, he posted an ad asking users to respond to a short survey examining their political preferences. Some 38,000 people responded. He divided the respondents randomly into three groups: the first was prompted to "like" the pages of left-leaning media outlets, the second to like right-leaning pages, and the third, serving as a control, received prompts to like pages from a variety of viewpoints.
"The beauty is that it's all the experiment, basically. I don't pay them, but once they like a page, the feed changes a bit. They don't have to like anything, but more than half did," Levy explains.
Levy found that those who liked pages matching their own views saw 64 more posts a week, while those who liked pages representing an opposing viewpoint saw about 35 more posts a week.
Can you explain that gap?
"This gap can have several explanations. But the algorithm probably exposes you to things that are similar to you. I think that's one of the strongest proofs of how Facebook's algorithm works," Levy says.
Levy followed the participants in several ways: he tracked their activity on Facebook (likes, shares, and so on), asked a small group to install a browser plugin that logs the news sites they visit, and after two months asked all participants to answer a follow-up survey.
"The second survey dealt with two main things: how exposure to other media affected participants' opinions, and whether their perception of the other side changed," Levy says.
It’s likely that no one will transform into a Democrat or a Republican in a week.
"That's why I focused on the issues that were on the agenda at the time and there was a media discussion about them. At that time there was talk of gun ownership, investigations into Trump, trade agreements. Not a complete change of opinion, but an attitude to specific issues."
And how do you test the relationship to the other side?
“The questions were not about how much you agree or identify, but questions like, ‘How much can you see the perspective of the other side? Do you think the other side's positions should be taken into account? How do you feel about them on a scale of 1 to 100?’”
Did the results surprise you?
"Yes. The truth is they surprised me. The study had three main results: The first is that what people see in their feed greatly affects the sites they visit. If your feed randomly becomes more left-wing, you will go to more left-wing sites, even if it does not match your perception. It did not surprise me, but it is very significant. It shows the great power of the algorithm. People get into what is accessible, not in their opinions.
"The second result is that people's opinions did not change at all, which surprised me. People went in and read articles on sites contrary to their worldview, but their opinions on current affairs did not move and they answered in the poll as someone whose worldview is similar and not exposed to other media.
"The third result is that the emotional polarization was indeed affected by it. Whoever was exposed to the other side developed a less negative attitude. Exposure to the other side's worldview probably did not convince anyone that the other side was right, but perhaps it made them realize there is a world of values that should not be ignored.”
"It is important to keep an open channel"
"Polarization has long been not just a topic for academic discussion, but a real danger to society. In the liberal public in the United States, for example, Trump is perceived as a clown at best, and a mad dictator at worst. In the Republican public, many see him as the savior of the nation. These two publics do not live in the same places, read the same newspapers, or laugh at the same jokes. Yet they experienced the same polarized reality, which led them to vote at an unprecedented rate and increased the fear of violent riots when the results became known, to the point of fear of a civil war.
"In Israel, similar divides can be attributed to the huge controversy surrounding Prime Minister Benjamin Netanyahu. And in concepts such as Israel I and Israel II, it also seems that in Israel the level of understanding between the parties is declining and at the same time, the level of violence is rising.
"The ability of the parties to recognize the legitimacy of other worldviews and reduce emotional polarization may prove crucial in the coming years."
Levy says the results of the experiment made him think about his use of the media, and like some pages that contradict his positions. "The department where I taught at Yale has a subscription to the Wall Street Journal, a newspaper I do not identify with and have not read before. So I started reading a column here and there. Sometimes I agree, sometimes I disagree. But it’s important to keep this channel open.”
What do you think the findings mean?
"On the one hand, if you're a pessimist, you might say the result of the study is disturbing. Today, people consume more articles through Facebook, and most of those articles are from their own political side. This may show that algorithms increase emotional polarization, making it difficult for people to identify with and understand the other side.
"On the other hand, because the research shows that people are willing to visit sites from the opposite side, we can also think about how to change that. My conclusion is that people can be encouraged to change their news consumption without being forced to."
How can this be done?
"Facebook can incorporate an alert: '80% of the news you read is from the same side of the political map, would you like to diversify?' Facebook's page recommendations can also diversify.”