The unsurprising fact that Facebook selectively suppresses and promotes different things has been getting a lot of press lately. I am not yet convinced that this particular claim of political bias is credible: selectively chosen evidence from aggrieved ex-employees that fits a clearly partisan narrative should at least be viewed with caution, especially given that it flies in the face of what we know about Facebook. Facebook is a deliberate maker of filter bubbles, echo chambers and narcissism amplifiers, and it thrives on giving people what it thinks they want. It has little or no interest in the public good, however that may be perceived, unless serving it drives growth. It just wants to increase the number and persistence of eyes on its pages, period. Engagement is everything. The one question of Zuckerberg’s that drives the whole business is “Does it make us grow?” So it makes little sense that Facebook should selectively ostracize a fair segment of its used/users.
This claim reminds me of those who attack the BBC for both its right-wing and its left-wing bias. There are probably those who critique it for being too centrist, too. Indeed, in the news today, NewsThump, noting exactly that point, sums it up well. The parallels are interesting. The BBC is a deliberately created institution, backed by a government, with an aggressively neutral mission, so it is imperative that it not show bias. Facebook has also become a de facto institution, likely with higher penetration than the BBC: in terms of direct users it is roughly twenty times the size of the entire UK population, although BBC programs likely reach a similar number of people. But it has very little in the way of ethical checks and balances beyond legislation and popular opinion, is autocratically run, and is beholden to no one but its shareholders. Any good that it does (and, to be fair, it has been used for some good) is entirely down to the whims of its founder or to incidental affordances. For the most part, what is good for Facebook is not good for its used/users. This is a very dangerous way to run an institution.
Whether or not this particular bias is accurately portrayed, it remains highly problematic that what has become a significant source of news, opinion and value-setting for about a sixth of the world’s population is clearly susceptible to systematic bias, even if its political stance remains, at least in intent and for purely commercial reasons, somewhat neutral. For a site in such a position of power, almost every decision becomes a political decision. For instance, though I approve of its intent, it is hard not to see Facebook’s ban on gun sales on the site as a politically relevant act, albeit one likely driven more by commercial and legal concerns than by morality (it is quite happy to point you to a commercial gun seller instead). It is the same kind of thing as its reluctant concessions to basic privacy controls, or its banning of drug sales: though ignoring such issues might drive more engagement from some people, it would draw too much flak and ostracize too many people to make economic sense. It would thwart growth.
The fact that Facebook algorithmically removes 95% or more of potentially interesting content, and then uses humans to edit what remains, makes it far more of a publisher than a social networking system. People are farmed to provide stories rather than paid to produce them, and everyone gets a different set of stories chosen to suit their perceived interests, but the effect is much the same. As it continues its unrelenting and morally dubious efforts to suck in more people and keep them for more of the time, with ever more refined and more ‘personalized’ (not personal) content, its editorial role will only grow. People will continue to use it because it is extremely good at doing what it is designed to do: getting and keeping people engaged. The filtering exists to get and keep more eyes on the page, and the vast bulk of effort in the company is focused wholly and exclusively on better ways of doing that. If Facebook is the digital equivalent of a drug pusher (and, in many ways, it is), then what it does to massage its feed is much the same as refining drugs to increase their potency and addictiveness. And, as with actual drug pushing that follows the same principles, the human consequences matter far less than Facebook’s profits. This is bad.
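To make that kind of massaging concrete, here is a minimal, purely illustrative sketch of engagement-driven feed filtering. Facebook’s actual ranking system is proprietary and vastly more complex; the `engagement_score` weights, the 5% cutoff, and every field name here are my own assumptions for the sake of illustration, not Facebook’s real algorithm or API.

```python
# Hypothetical sketch of engagement-driven feed filtering -- NOT Facebook's
# actual algorithm. It scores every candidate story by predicted engagement
# and keeps only the top few percent; the rest are silently discarded.

from dataclasses import dataclass


@dataclass
class Story:
    author_affinity: float   # how often the viewer interacts with the author (0-1)
    predicted_clicks: float  # model's guess at click/like/share probability (0-1)
    recency: float           # freshness of the story (0-1, 1 = just posted)


def engagement_score(story: Story) -> float:
    """Assumed scoring rule: weight whatever keeps eyes on the page.

    Note what is absent: accuracy, diversity of viewpoint, public good.
    Only predicted engagement counts.
    """
    return (0.5 * story.predicted_clicks
            + 0.3 * story.author_affinity
            + 0.2 * story.recency)


def build_feed(candidates: list[Story], keep_fraction: float = 0.05) -> list[Story]:
    """Rank all candidate stories and drop roughly 95% of them unseen."""
    ranked = sorted(candidates, key=engagement_score, reverse=True)
    cutoff = max(1, int(len(ranked) * keep_fraction))
    return ranked[:cutoff]
```

The point of the sketch is the objective function: when the only thing being optimized is predicted engagement, the filter bubble is not a side effect but the target.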
There’s a simple solution: don’t use Facebook. If you must be a Facebook user, for whatever reason, don’t let it use you. Go in quickly and get out right away (log out, clear your cookies), ideally using a different browser and even a different machine from the one you would normally use. Use it to tell people you care about where to find you, then leave. There are hundreds of millions of far better alternatives: small-scale vertical social media like the Landing, special-purpose social networks like LinkedIn (which has its own issues but a less destructive agenda) or GitHub, less evil competitors like Google+, junctions and intermediaries like Pinterest or Twitter, or hundreds of millions of blogs and similar sites that retain loose connections and bottom-up organization. If people really matter to you, contact them directly, or connect through an intermediary that doesn’t have a vested interest in farming you.
Address of the bookmark: http://gizmodo.com/former-facebook-workers-we-routinely-suppressed-conser-1775461006