The authors of a recent paywalled article in MIS Quarterly summarize their findings here, in another restrictive and normally paywalled site, the Washington Post. At least the latter gives some access – I was able to read it without forking out $15, and I hope you can too. Unfortunately I don’t have access to the original paper (yet), but I’d really like to read it.
The authors examined the web browsing history of nearly 200,000 US adults, and looked at differences in the diversity and polarization of their news consumption related to their use of Reddit, Twitter, and Facebook, correlating these with political leanings. What they found will surprise few who have been following such issues. The headliner is that Facebook is over five times more polarizing for US conservatives than for liberals, driving them to far more partisan news sites, far more of the time. Interestingly, though, those using Reddit visited a far more diverse range of news sites than expected, and tended towards more moderate sites than usual: in fact, the sites they visited were, it is claimed, around 50% more moderate than what they would typically read. Furthermore, and just as interesting to me, Twitter seemed to have little effect either way.
The authors blame this on the algorithms – Facebook preferentially shows posts that drive engagement (so polarizing issues naturally bubble to the top), while Reddit relies on votes for its emphasis, thus presenting a more balanced view. In the Washington Post article they have little to say about Twitter, apart from noting that it wants to be more transparent about its algorithms (though nothing like as transparent as Reddit). But it isn’t yet, and I think I know why that lack of effect was seen.
Algorithms vs structure
You could certainly look at it from an algorithmic perspective. There is no doubt that different algorithms do lead to different behaviours. Facebook and Twitter both make use of hidden algorithms to filter, sort, and alter the emphasis of posts. In Twitter’s case this is a relatively recent invention: it started out with a simpler, time-based sort order, and it has become a much less worthwhile site since it began to emphasize posts it thinks individuals want to see. I don’t like it, and I am very glad to hear that it intends to revert to providing greater control to its users (what Judy Kay calls scrutable adaptation). Reddit’s algorithms, on the other hand, are entirely open and scrutable, as well as being intuitive and (relatively) simple. It is important to remember that none of these sites are entirely driven by computer algorithms, though: all have rules, conditions of use, and plentiful supplies of humans to enforce them. Reddit has human moderators but, unlike the armies of faceless paid moderators employed by Twitter and Facebook to implement their rules, you can see who the moderators are and, if you put in the effort and feel so inclined, you could become one yourself.
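To make the contrast concrete, here is a minimal, entirely hypothetical sketch – not any platform’s actual code – of the difference between a transparent, time-based feed and an opaque, engagement-weighted one. The weights are invented for illustration; the point is simply that one ordering is scrutable and the other rewards whatever provokes reactions.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created: datetime
    likes: int = 0
    shares: int = 0
    comments: int = 0

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Transparent, scrutable ordering: newest first, nothing hidden."""
    return sorted(posts, key=lambda p: p.created, reverse=True)

def engagement_feed(posts: list[Post], w_likes=1.0, w_shares=3.0, w_comments=2.0) -> list[Post]:
    """Opaque ordering: whatever provokes reactions floats to the top,
    regardless of when it was posted or how representative it is."""
    def score(p: Post) -> float:
        return w_likes * p.likes + w_shares * p.shares + w_comments * p.comments
    return sorted(posts, key=score, reverse=True)
```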
However, though algorithms do play a significant role, I think that the problem is far more structural, resulting from the social forms each system nurtures. These findings accord very neatly with the distinction that Terry Anderson and I have made between nets (social systems formed from connections between individuals) and sets (social systems that form around interests or shared attributes of their users). Facebook is the archetypal exemplar of the network social form; Reddit is classically set-oriented (as the authors put it ‘topic based’); Twitter is a balanced combination of the two, so the effects of one cancel out the effects of the other (on average). It’s all shades of grey, of course – none are fully one or the other (and all also support group social forms), and none exist in isolation, but these are the dominant forms in each system.
Networks – more specifically, scale-free networks – have a natural tendency towards the Matthew Effect: the rich get richer while the poor get poorer. You can see this in everything from academic paper citations to the spread of diseases, and it is the essence of any human social network. Their behaviours are enormously dependent on highly connected influencers, so they are naturally inclined to polarize, and this would happen without the algorithms. The algorithms might magnify or diminish the effects but they are not going to stop them from happening. To make things worse, when such networks move online it is not just current influence that matters, because posts are persistent, and continue to have an influence (potentially) indefinitely, whether the effect is good or bad (though seldom if it is somewhere in between).
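The rich-get-richer dynamic is easy to see in a toy simulation. The sketch below (a simplified preferential-attachment model in the spirit of Barabási–Albert, with invented parameters) shows how a handful of hubs come to hold a disproportionate share of all connections, with no ranking algorithm involved at all.

```python
import random
from collections import Counter

def preferential_attachment(n_nodes: int = 10_000, seed: int = 1) -> None:
    """Toy rich-get-richer model: each new node links to one existing node
    chosen with probability proportional to that node's current degree."""
    random.seed(seed)
    # 'stubs' lists each node once per link it has, so a uniform pick from it
    # is a pick proportional to degree.
    stubs = [0, 1]  # start with two nodes joined by one link
    for new_node in range(2, n_nodes):
        target = random.choice(stubs)
        stubs.extend([new_node, target])
    degrees = Counter(stubs)
    top_ten = sum(count for _, count in degrees.most_common(10))
    print(f"Top 10 of {n_nodes:,} nodes hold {top_ten / len(stubs):.0%} of all connections")

preferential_attachment()
```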
There are plenty of sets that are also highly partisan. However, they are quite self-contained and are thus containable, either because you simply don’t bother to join them or because they can easily be eliminated: Reddit, for instance, recently removed r/the_donald, an extreme right-wing subreddit for particularly rabid supporters of Trump, for its overwhelmingly violent and hateful content. Also, on a site such as Reddit, there are so many other interesting subreddits that even the hateful stuff can get a bit lost in the torrent of other news (have you seen the number of subreddits devoted to cats? Wow). And, to a large extent, a set-based system has a natural tendency to be more democratic, and to tend towards moderate views. Reddit’s collective tools – karma, votes, and various kinds of tagging – allow the majority (within a given subreddit) to have a say in shaping what bubbles to the top whereas, in a network, the clusters that form around influencers inevitably channel a more idiosyncratic, biased perspective. Sets are intentional, nets are emergent, regardless of algorithms, and there are patterns to that emergence that will occur whether or not they are further massaged by algorithms. Sets have their own intractable issues, of course: flaming, griefing, trolling, sock-puppeting and many more big concerns are far greater in set-based systems, where the relatively impersonal and often anonymous space tends to suck the worst of humanity out of the woodwork.
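Because Reddit’s ranking code was long open source, the vote-and-time logic behind its ‘hot’ ordering is well known. The following is a simplified, from-memory rendering of that published formula (the live site may well differ in detail now); what matters is that every vote is counted in the open, and recency gives newer posts a fair chance against entrenched ones.

```python
from datetime import datetime, timezone
from math import log10

# Reference point used in Reddit's published ranking code (an arbitrary epoch).
REDDIT_EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(upvotes: int, downvotes: int, created: datetime) -> float:
    """Simplified vote-and-time 'hot' score, after Reddit's open-sourced version."""
    score = upvotes - downvotes
    order = log10(max(abs(score), 1))             # log scale: the 1,000th vote matters less than the 10th
    sign = 1 if score > 0 else -1 if score < 0 else 0
    age = (created - REDDIT_EPOCH).total_seconds()
    return round(sign * order + age / 45_000, 7)  # newer posts get a steady, predictable boost
```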
I would really like to see the researchers’ results for Twitter. I hypothesize that the reason for its apparent lack of effect is that the set-based features (that depolarize) counterbalance the net-based features (that polarize) so the overall effect is null, but that’s not to say that it has no effect: far from it. People are going to be seeing very different things than they would if they did not use Twitter – both more polarized and more moderate, but (presumably) a bit less in between the two. That’s potentially very interesting, especially as the nuances might be quite varied.
Are networks necessarily polarizing?
Are all online social networking systems evil? No. I think the problem emerges mainly in undifferentiated, large-scale, general-purpose social networking systems, especially when they use algorithmic means to massage what members see. There are not many of those (well, not any more). There are, however, very many vertical social networks, or niche networks, that, though often displaying the same kinds of polarization problem on a smaller scale, are far less problematic because they start with a set of people who share attributes or interests that draw them to the sites. People are on Facebook because other people are on Facebook (a simple example of Metcalfe’s Law). People on (say) ResearchGate are there because they are academics and researchers – they go elsewhere to support the many other facets of their social lives. This means that, for the most part, niche networks are only part of a much larger environment that consists of many such sets, rather than trying to be everything to everyone. Some are even deliberately focused on kindness and mutual support.
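For anyone unfamiliar with it, Metcalfe’s Law just says that a network’s value grows roughly with the square of its user count, because that is how the number of possible connections grows. A few lines of arithmetic (a back-of-the-envelope illustration, not a claim about any real platform’s value) make the point, and also hint at why the effect is so brutal in reverse, as noted towards the end of this post.

```python
def potential_connections(n_users: int) -> int:
    """Metcalfe's Law at its simplest: n users can form n*(n-1)/2 pairwise links,
    so the potential value of the network grows roughly with the square of n."""
    return n_users * (n_users - 1) // 2

for n in (10, 1_000, 1_000_000):
    print(f"{n:>9,} users -> {potential_connections(n):>15,} possible connections")
```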
Could Facebook shift to a more set-oriented perspective, or at least develop more distinct and separate niches? I doubt it very much. The whole point of Facebook is and always has been to get more people spending more time on the site, and everything it does is focused on that one goal, regardless of consequences. It sucks, in every way, pulling people and content from other systems, giving nothing back, and it thrives on bias. In fact, it is not impossible that it deliberately nurtures the right-wing bias it naturally promotes, because it wishes to avoid being regulated. Without the polarization that drives engagement, it would lose money and users hand over fist, and there are bigger, more established incumbents than Facebook in the set space (YouTube, at least). Could it adjust its algorithms to reduce the bias? Yes, but it would be commercial suicide. Facebook is evil and will remain so because its business model is evil. For more reasons than I can count, I hope it dies.
It could have been – and still could be – different. Facebook more or less single-handedly and very intentionally maimed the OpenSocial project, to which virtually all other interested organizations had signed up, and that would have allowed federation of such systems in many flexible ways. However, the dream is not dead. A combination of initiatives like Solid, perhaps a browser-based approach to payments, certainly connecting protocols like Webmention, and even the not-quite-dormant OpenSocial might yet enable this to happen. Open source social networking software, like Diaspora or Mastodon or Elgg or Minds (with some provisos), supports something like a distributed model, or at least one that can be owned by individuals rather than corporations. WordPress dwarfs every other social system in terms of users and websites, and is inherently set-based and distributed: there are also plentiful plugins that support those other open protocols to provide deeper connections. This kind of distributed, open, standards-based initiative could radically alter the dynamic, giving far more control to end users to pick the sets and networks that matter to them, to wrest control of the algorithms from one big behemoth, and to help build a richer, more tolerant society. I am delighted to see that Facebook has lost a couple of million of its US and Canadian users as well as being boycotted by lots of advertisers, and I hope that the void isn’t being filled by Instagram (also Facebook). It could be the start of something big, because Metcalfe’s Law works the same in reverse: what goes up fast can come down just as quickly. Get in on the trend while it’s hot, and join the exodus!
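As a flavour of how lightweight these connecting protocols are, here is a rough Python sketch of sending a Webmention: the sender discovers the target site’s advertised endpoint and POSTs two URLs to it. This is a simplified illustration of the protocol’s outline rather than a spec-complete client (real implementations do fuller endpoint discovery and verification).

```python
import re
import requests
from urllib.parse import urljoin

def discover_webmention_endpoint(target_url: str) -> str | None:
    """Find the target page's Webmention endpoint: check the HTTP Link header,
    then fall back to a naive scan of the HTML for rel="webmention"."""
    resp = requests.get(target_url, timeout=10)
    link = resp.links.get("webmention")
    if link:
        return urljoin(target_url, link["url"])
    match = re.search(r'<(?:link|a)\s[^>]*rel="webmention"[^>]*href="([^"]+)"', resp.text)
    return urljoin(target_url, match.group(1)) if match else None

def send_webmention(source_url: str, target_url: str) -> bool:
    """Tell the site behind target_url that the page at source_url links to it."""
    endpoint = discover_webmention_endpoint(target_url)
    if endpoint is None:
        return False  # the target does not advertise Webmention support
    resp = requests.post(endpoint, data={"source": source_url, "target": target_url}, timeout=10)
    return resp.status_code in (200, 201, 202)
```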
Originally posted at: https://landing.athabascau.ca/bookmarks/view/6960862/echoes-and-polarization-in-facebook-reddit-and-twitter-its-not-just-about-the-algorithms