Mark Zuckerberg, founder and CEO of Meta, declared last month that his social media empire is distancing itself from politics. The declaration comes as a shock, especially considering the important role Facebook played in the 2016 and 2020 elections. While the decision’s effects are still unknown, it presents an opportunity to see how effectively a common source of news and political opinion can distance itself from the partisan fray. And while Meta’s policy of keeping out of politics this election cycle may prove beneficial to democracy, it also risks users sharing misinformation at higher rates.
Social media has played an increasingly crucial role in political campaigns over the past few elections. Starting with the election of Barack Obama in 2008, presidential candidates of both major parties have used social media to reach voters online. This includes former President Donald Trump’s use of X, formerly Twitter, throughout his campaigns and presidency, as well as both Trump and Vice President Kamala Harris establishing accounts on other platforms for their respective campaigns.
However, as more politicians campaign on social media, the platforms have been increasingly weaponized for political purposes. The 2016 election saw allegations that Russia used social media to interfere in the race. Beyond foreign interference, in 2020, social media algorithms pushed election denial rhetoric and gave a platform to fringe conspiracy groups like QAnon. These groups were present during the Jan. 6 insurrection at the U.S. Capitol, an event spurred in part by social media.
The politics of social media carry even more weight for Zuckerberg personally. Zuckerberg alleged that in 2021 the White House pressured him to censor potential misinformation related to COVID-19 on his platforms, a demand he now regrets complying with. Combined with accusations that the company allowed misinformation to spread in previous elections, these pressures led Zuckerberg to have his company step back from politics, a decision seemingly driven by the government’s demands and a desire to reduce the company’s liability.
The new, less political Meta no longer prioritizes political content or promotes it to the platform’s users. Zuckerberg has also stepped back from directly handling election security issues and reduced Facebook’s team dedicated to handling election content. These steps are part of a plan to reduce his and his company’s accountability for election-related issues.
Zuckerberg and Meta’s move looks promising from a democratic standpoint, as the deprioritizing of fringe content should reduce the amount of misinformation on the platform. The platform’s new policy could limit the impact of social media on the election process, especially how users share election-related content.
However, this choice isn’t without drawbacks. One of Meta’s biggest impacts on elections has been the spread of election-related misinformation, whether about a candidate, a policy or an event in the news. Meta’s decision to stay out of politics also reduces its ability to combat that misinformation.
Although political content is no longer recommended to users, it can still spread, and misinformation spreads with it. Take the oft-repeated false claim that Haitian migrants were eating domesticated pets. Facebook allowed this misinformation to go viral. Even though Facebook’s algorithm no longer promotes political content, the claim gained enough traction for Trump to repeat it on the debate stage in September. The presidential debate demonstrated that Meta’s intent to step back from politics doesn’t stop politics from running through the platform.
As much as Zuckerberg may want to remove politics from Meta’s platforms, stepping back completely is a Sisyphean undertaking, since politics is ingrained in how Facebook functions. Changing the algorithm alone isn’t enough: Facebook must do more to distance itself from politics while also reducing misinformation from its current levels. Doing so requires taking a more active role in content moderation.
While the burden of monitoring political content is one reason Zuckerberg initially wanted to step away from politics, that burden doesn’t mean he can ignore it. Zuckerberg has recognized that misinformation was a problem on his platforms before and established teams to stop its spread. Facing pressure over how one issue was moderated doesn’t mean the rest of the platform’s content no longer deserves the scrutiny it received before. Monitoring misinformation should be a nonpartisan goal for Zuckerberg and his company: moderating content isn’t an appeal to a specific partisan group but to the country as a whole, which should appreciate efforts to promote the truth during an election.
Ultimately, Zuckerberg and Meta’s decision to step back from politics shows promise, but it requires more work if it’s to benefit democracy rather than just the company. Much of that work lies in prioritizing content moderation, something the company did regularly until it decided to step away from politics. By committing to monitoring misinformation while maintaining the new algorithm that deemphasizes political content, Zuckerberg and Meta could show companies and users across the U.S. that social media can exist without provocative and often inaccurate content taking center stage during elections.
Thomas Muha is an Opinion Columnist and can be reached at tmuha@umich.edu. His column “Internet Insight” discusses the legal and economic issues facing the internet today.