In 2020, Mark Zuckerberg said that he strongly believed Facebook should not be the arbiter of truth of what is said online. Despite Mr Zuckerberg's efforts to avoid this, however, Facebook seems to be moving in that direction. The company has the ability to highlight certain messages and to hide others. In January this year it showed the force of that power by kicking Mr Trump off its platform for stoking the riots at the Capitol. In February it announced that falsehoods about vaccination would no longer be tolerated. Previously the company merely demoted claims that vaccines make people ill or cause autism; now it removes such posts and blocks anti-vaccine groups. These actions come as Facebook and other internet companies face pressure to do more to police anti-vax content.

Social-media platforms in America are tools both for spreading misinformation and for co-ordination. The anti-vax campaigners who briefly halted immunisations in Los Angeles, for instance, used a Facebook page to organise their activities. Campaigners are also using social media to push anti-vax bills in many American states.

How much Facebook will actually curb vaccine misinformation is unclear. Showing users truthful content in their feeds and searches will help, but removing problematic content could also drive users to other platforms. A former director of public policy at Facebook believes that "censoring speech and pretending you can make it go away is really problematic." After Facebook and Twitter cracked down on accounts promoting QAnon conspiracies, for example, those users simply went elsewhere.

Facebook's recent policing of content also forces the company to contend with its public-relations problems. The decision to combat anti-vax propaganda risks further alienating conservatives in Washington who are concerned about the censorship of free speech.
It could also revive discussion of antitrust enforcement among the many politicians who worry about the dominance of big tech. Already, politicians and regulators are grappling with whether and how to change the liability shield that internet companies enjoy when hosting users' content. That protection comes from section 230 of the Communications Decency Act, an internet law passed in 1996, which grants online platforms immunity from civil liability for third-party content and for the removal of content in certain circumstances. By taking more proactive steps to police content on its own platform, Facebook may be hoping to head off discussion of reforming or entirely repealing section 230. Indeed, Facebook and other internet firms worry that tweaking section 230 could unleash a deluge of lawsuits from people who hold them responsible for material posted on their sites.