You may have heard that on Wednesday Facebook’s oversight board upheld the website’s decision to block the account of former President Donald Trump, at least temporarily. That in and of itself is a remarkable result, with far-reaching consequences for how Trump communicates with his supporters and helps define the future of the GOP. But as a tech and political reporter for FiveThirtyEight, I kept an eye on the process as well as the outcome, and on what it tells us about the state of Big Tech. This process shows that, if only under the threat of regulation, tech companies will find ways of regulating themselves that are little more than judicial theater.
The board’s decision was the final chapter in a story that began in January. After the attack on the Capitol on January 6th, Trump’s account was banned because of posts in which he praised the attackers. Throwing the leader of the free world into “Facebook Prison” was controversial, to say the least, but other tech platforms followed suit. After Inauguration Day, Facebook referred the case to its oversight board before taking further action. The board consists of 20 members, including an impressive list of former judges, legal experts and a Nobel Prize winner.
The board released its decision on Wednesday, stating that Facebook was authorized to suspend Trump’s account but should not have been allowed to do so indefinitely, as such a suspension is not described in the website’s policies. (Instead, Facebook should have suspended the account for a defined period of time or banned Trump outright.) The board said Facebook would have to determine and justify a defined penalty within the next six months.
In its 12,000-word decision, released Wednesday, the board meticulously breaks down whether the suspension of Trump’s account was warranted, relying on Facebook’s community standards, the website’s previous actions and nearly 10,000 public comments. It also cited several United Nations documents, such as the Guiding Principles on Business and Human Rights – lofty ideals that Facebook has endorsed to varying degrees, but which are not legally binding. The board criticized Facebook for trying to evade its responsibility to set and enforce policies by handing that decision to the board, and for failing to answer questions about whether the website’s algorithms might have amplified Trump’s content promoting the January 6 attack.
But all of this – the critical tone, the authoritative language, the multiple citations of U.N. documents – is just very well-executed set dressing. The oversight board is ultimately a creation of Facebook, financed by Facebook and intended to serve Facebook: As a private, for-profit company, Facebook has little incentive to invest in projects that could do it more harm than good. The social media giant tucked $130 million into an irrevocable trust to fund the board for at least six years, money that helps pay the six-figure salaries of board members so they can write long opinions that ultimately carry only superficial authority. The board’s decisions are “binding,” according to its website, meaning Facebook “will be required to implement it, unless doing so could violate the law.” But required by whom? The board that Facebook created? In practice, Facebook may or may not take the board’s advice. It could dissolve the board tomorrow. It’s all just regulatory pageantry.
That’s the problem: there is no current legal process that can hold Big Tech accountable for its moderation policies. Facebook, along with the rest of the tech industry, is almost completely unregulated. Aside from hard restrictions on things like child pornography, there are virtually no legal consequences for the decisions these companies make. For a long time, that meant Big Tech did pretty much anything it wanted. Now, facing increasing public and political pressure to combat some of the industry’s worst habits, and hoping to stave off actual regulation, Facebook has created its own version of self-regulation. This is what it looks like.
This does not mean that federal regulation is necessarily the answer. It is difficult to imagine laws that could hold platforms accountable for misinformation and extremist content while protecting freedom of expression and avoiding the destruction of some of the best parts of the internet. I don’t envy anyone who tries to solve this complex puzzle. Despite its age, the internet is still a new frontier in many ways.
But in the absence of regulation, and with only the ever-looming threat of it, we are left with legal theater that offers no practical accountability and does not address some of the most conspicuous problems facing our modern tech oligopoly. This is Facebook’s answer to the problem. It is up to all of us to decide whether it is enough.