by Rafaela Prifti
Topics of police and justice reform dominate an already intense news cycle as heavily trafficked social media platforms, particularly Facebook, enjoy record-high stock prices in August while swimming in criticism and grievances from both sides.
An estimated 600 million people see a news story on Facebook every week. The social network’s founder, Mark Zuckerberg, has been transparent about his goal to monopolize digital news distribution. Facebook’s news section operates like a traditional newsroom, reflecting the biases of its employees and the institutional imperatives of the corporation. The company claims that the trending module provides lists of “topics that have recently become popular on Facebook,” without acknowledging the imposition of human editorial values on the items an algorithm spits out.
ANTITRUST AND FAIRNESS
Mark Zuckerberg, founder and CEO of Facebook, admitted to buying Instagram and WhatsApp to eliminate them as competitors. Yet Facebook insists neither acquisition has harmed competition. On the question of fairness, Facebook Inc. is adamant that it does not play favorites. Longtime and former employees reportedly say that Zuckerberg isn’t easily influenced by politics, but that he cares deeply about Facebook’s growth potential. The co-founder of Accountable Tech – an organization that makes recommendations to tech companies on public-policy issues – noted that Facebook, more so than other platforms, has gone out of its way not to ruffle feathers in the current administration. As long as the government is pursuing antitrust cases against big tech companies, the President has leverage over Zuckerberg, who has been called before regulators in Congress a few times.

The pattern has come to light in countries around the world. The Wall Street Journal reported on the Facebook posts of a lawmaker in India calling for violence against Rohingya Muslim immigrants. A Facebook executive was accused of granting special treatment to the lawmaker, a member of Prime Minister Narendra Modi’s party. It was only after The Wall Street Journal’s reporting that the company banned him. On September 14, a memo from a former Facebook employee was published detailing how the company had ignored or delayed taking action against foreign governments using fake accounts to mislead their citizens.
IMMUNITY UNDER SECTION 230 OF THE COMMUNICATIONS DECENCY ACT OF 1996
In late May, President Trump signed an executive order that threatens to revoke the immunity enjoyed by social media companies, including Facebook, if they show political bias. Facebook responded by saying the move would restrict free speech. The order was an apparent threat to social networks like Twitter that had censored posts from President Trump and his allies. The pressure is also mounting on Facebook’s main rivals. The U.S. Department of Justice is preparing to file a case against Google before Election Day. Another key competitor of Facebook, Chinese-owned TikTok, is facing ejection from the country unless it finds a U.S. buyer. Oracle has agreed to become TikTok’s business partner, though it is unclear whether the deal will satisfy government officials on either side, who have indicated they intend to carefully review any new arrangement.
Facebook executives say their only loyalty is to free speech. Nick Clegg, the head of policy and communications, claims that despite isolated cases, systematic or deliberate political bias in Facebook’s decisions is not borne out by the facts. In 2016, a former journalist who had worked on the project told Gizmodo that company workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section. Facebook executives often point out that the company was seen as overly friendly to Democrats during the Obama administration and that it takes plenty of heat from the Right. But the Gizmodo story emboldened claims of anti-conservative bias at social media companies. In response to the backlash, Facebook started to drift rightward, according to Bloomberg Businessweek reporting. The company flew conservative commentators to its California headquarters to reassure them that there was no need for concern about how Facebook operated.
Historically, Facebook had left most of the decision-making about its products to executives. By 2018, the company’s policy team seemed to have veto power. In January of that year, Zuckerberg asked to reduce the prevalence of news in users’ feeds, especially from incendiary and untrustworthy outlets. An internal report around the same time touted Trump’s superior strategy with Facebook ads, noting that candidate Trump had followed advice and training from the company that his opponent, Hillary Clinton, had rejected. Andrew Bosworth, who ran the ads department at the time and is now head of augmented and virtual reality, wrote in a 2018 memo to employees that Trump “got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser. Period.”

In the eyes of Facebook’s mostly liberal staff, the Republican relationship-building was the price of doing business. According to reports, Russia’s spread of election misinformation and the company’s failure to stop Cambridge Analytica’s data-gathering operation caused a shift among the rank and file. After the Kavanaugh hearings, employees began to notice that Joel Kaplan of Facebook’s policy team seemed more concerned about critiques of bias from conservatives than from liberals. When the product team tweaked the news feed, test simulations reviewed by Kaplan’s team showed the change was causing traffic to drop more severely for right-wing outlets, which tend to publish more incendiary content, the source noted. The engineers were ordered to tweak the algorithm a little more, until it punished liberal outlets as much as conservative ones, before releasing the update to 2.5 billion users. Even as employees started to worry about Facebook’s proximity to the Right, Facebook’s management seemed intent on pushing the company even closer to it.
Faced with criticism about misinformation, Facebook’s policy team responded in a blog post: “There is an election coming in November and we will protect political speech, even when we strongly disagree with it.”
In the 2016 election, Russian operatives created fake accounts aimed at Black voters, directing people who followed these accounts not to vote, or to vote by text message, which isn’t possible. In all, the Russian posts reached more than 150 million Americans. In response, Facebook’s election-integrity and cybersecurity teams are charged with rooting out fake content created by foreign governments. Last year, Facebook removed 50 networks of accounts like the Russian one from 2016, and the company has since made rules against giving incorrect information about how to vote. But when Twitter fact-checked posts containing voter disinformation, Zuckerberg went on Fox News to criticize it. An outside civil rights auditor later concluded that Facebook had failed to enforce its own policies. Instead, Zuckerberg came up with “the largest voting information campaign in U.S. history,” a plan to register 4 million voters. Facebook designed a “Voting Information Center,” a web page with facts about the election compiled from state authorities. The social network has been promoting the page atop every user’s Facebook and Instagram feed and attaches a link to it to every post on the service that mentions the election process. Facebook’s head of cybersecurity policy told reporters that the hub “ensures that people can see the post and hear from their elected officials.” But users are not warned if the information is untrue; Facebook simply advertises an information center. Facebook has said that the suggestion that the company scaled down its voter registration plans for political reasons is “pure fabrication.” Meanwhile, conspiracy theorists abound on the site.

In June, Zuckerberg announced that he had rehired Chris Cox, Facebook’s former Chief Product Officer, who had been active in Democratic politics since a high-profile departure from the company last year.
In reference to a possible future administration, Nick Clegg, Vice President of Global Affairs and Communications at Facebook, said, “We’ll adapt to the environment in which we’re operating.”