Mark Zuckerberg no longer wants to run Facebook alone. In the wake of the Cambridge Analytica privacy scandal, he told Recode that he was "fundamentally uncomfortable" making some political decisions. Later, he described an idea for a "high court" independent of Facebook that could review decisions about community standards. And in his sweeping manifesto last year, he called for a "large-scale democratic process" to govern Facebook. "We are committed to doing it better, even if that means building a global voting system to give you more voice and control," he promised.
Zuckerberg has long talked about Facebook as a new kind of nation, and his comments have fed a broader debate over how to give users a stake in the platforms they populate. But it's worth remembering that, years ago, Facebook tried to become a democracy, and almost nobody showed up.
I'm talking about Facebook's site governance voting system, which was announced in February 2009. The company was responding to controversy over a recent policy change, which critics had interpreted as giving Facebook unchecked power over user data. The company promised to publish drafts of major rules, let users submit comments, and then release new versions based on that feedback. Facebook users would vote on the drafts, and if more than 30 percent of all active registered users participated, their decision would be "binding."
"Companies like ours need to develop new models of government."
Zuckerberg described the new system in idealistic language. "As people share more information on services like Facebook, a new relationship is created between internet companies and the people they serve. The past week reminded us that users feel a real sense of ownership over Facebook, not just the information they share," he said in the press release. "Companies like ours need to develop new models of governance." The post even included a tentative endorsement from the charity Privacy International, praising "Facebook's bold move towards transparency and democratization."
A couple of months later, Facebook tested its fledgling democratic process, asking users to approve some new terms of service. Nobody voted.
Well, according to Facebook, a total of 665,654 people voted. But that was roughly 0.3 percent of its 200 million users at the time. The Los Angeles Times called Facebook's vote "a poll nobody took," noting that Facebook had asked users to choose between two long and very similar versions of a policy whose effects they would probably barely notice either way. Facebook followed the majority's opinion, but since most people voted for the proposed change, that just meant doing something it already wanted to do.
Facebook maintained a governance page and opened a few more comment periods for various policies. But the next widely publicized vote, held in 2012, was over a new site policy that would eliminate voting itself. (It also let the company share account data between services such as Facebook and Instagram.) Facebook said it was ending voting to encourage quality feedback from a small group of users rather than shallow participation from many of them. This time, the vast majority of voters disapproved, with 88 percent voting to keep the old documents. But since that was 88 percent of just 668,500 voters, about 0.0668 percent of Facebook's 1 billion users, it didn't really matter. Facebook's brief experiment with direct user control was over.
Most people won't volunteer to help run an online government
That was more than five years ago, and today, an idea like the governance vote would be even harder to pull off. Facebook now has more than 2 billion users, and its policies carry more real-world weight than they used to. Would the platform feel comfortable asking people to vote on policies addressing something like anti-Rohingya propaganda in Myanmar? If so, should users around the world weigh in on such a country-specific issue? Would there be federated Facebook "states" with their own rules? And since Facebook is still struggling to ban fake propaganda accounts, how would it prevent people from committing virtual voter fraud to game the system?
But all these questions are moot if Facebook can't get people to vote in the first place. And that would require not just putting up a page or a press release, but making sure that every user, regardless of location or language, is clearly encouraged to participate. If Facebook wants people to do more than blindly check a box, it should also make sure users understand exactly what they're voting for, in plain language, not just a few links buried in a wall of policy text.
Facebook's site governance page is still active, with new rules posted for comment (but not voting) every couple of years. One went up yesterday, so you can take part in the site's version of a town hall right now, as long as you're willing not only to read Facebook's entire proposal for new privacy rules and terms of service, but also to dig up the old policies on your own to see what's being changed.
My colleague Casey Newton has rightly pointed out that making Facebook a "democratic system," as Zuckerberg puts it, could let the platform shirk responsibility for difficult decisions. But there's a more immediate problem: as we learned almost a decade ago, building a system for social network self-governance is hard. Zuckerberg uses "democracy" as shorthand for "letting users make decisions" or "doing what communities want," but he doesn't talk about how Facebook would inform and engage users and communities, rather than simply assuming they'll proactively take on the extra work of running a digital government. We've already seen Facebook make that particular assumption. It didn't end well.