SAN FRANCISCO — For months, Facebook has weathered criticism for its willingness to show all types of political advertising to its billions of users, even if those ads contained lies.
Now the company is changing tack — sort of.
On Tuesday, the social network said it would allow people in the United States to opt out of seeing social issue, electoral or political ads from candidates or political action committees in their Facebook or Instagram feeds. The ability to hide those ads will begin with a small group of users, before rolling out in the coming weeks to the rest of the United States and later to several other countries.
“Everyone wants to see politicians held accountable for what they say — and I know many people want us to moderate and remove more of their content,” Mark Zuckerberg, chief executive of Facebook, wrote in an op-ed piece in USA Today on Tuesday. “For those of you who’ve already made up your minds and just want the election to be over, we hear you — so we’re also introducing the ability to turn off seeing political ads. We’ll still remind you to vote.”
The move allows Facebook to play both sides of a complicated debate about the role of political advertising on social media ahead of the November presidential election. With the change, Facebook can continue allowing political ads to flow across its network, while also finding a way to reduce the reach of those ads and to offer a concession to critics who have said the company should do more to moderate noxious speech on its platform.
Mr. Zuckerberg has long said that Facebook would not police and moderate political ads. That’s because the company does not want to limit the speech of candidates, he has said, especially candidates in smaller elections who do not have the deep pockets of the major campaigns.
But critics, including the Biden presidential campaign, have argued that Facebook’s laissez-faire approach has dangerous consequences, with untruthful political ads leading to the spreading of disinformation and potential voter disenfranchisement. Some Republicans have argued that Facebook should not act as an arbiter of what can and cannot be posted in ads, and that the company’s intervention amounts to censorship.
The Biden presidential campaign lashed out at Facebook over its hands-off policy on political ads last October after the Trump campaign released ads on the social network that falsely claimed that Mr. Biden had offered to bribe Ukrainian officials to drop an investigation into his son. Since then, the Biden campaign has called for the company to fact-check ads from candidates and their campaigns.
Last week, Mr. Biden’s campaign also began an online petition and letter to Mr. Zuckerberg to demand changes to the company’s speech policies ahead of the 2020 presidential contest. At the same time, the Biden campaign also spent $5 million in advertising on Facebook, surging past political ad spending by Mr. Trump on the platform.
Facebook has previously modified what some users can see with political ads. In January, the company said it would allow people the option to see fewer such ads. The update announced on Tuesday will let them opt out entirely.
Other social media companies have taken a far harder line on political ads. Last year, Jack Dorsey, Twitter’s chief executive, said Twitter would ban all political ads because they presented challenges to civic discourse.
“We believe political message reach should be earned, not bought,” Mr. Dorsey said.
Both Facebook and Twitter publish libraries of political ads that have run on their sites, allowing people to research specific advertisers while tracking their messages and spending habits. The companies also regularly take down coordinated disinformation campaigns, and are monitoring attempts at election interference from foreign operatives.
Still, critics said that Facebook wasn’t being transparent enough. “There are significant problems with the Facebook ad library, which makes it really difficult to keep on top of what is circulating to even monitor for disinformation in ads, let alone to judge what the impact is with audiences,” said Claire Wardle of First Draft, a nonprofit that researches the impact of misinformation in the media.
Facebook also unveiled a voting information center on Tuesday, a feature that aims to give people more information about elections, including details on how and when to vote, voter registration, voting by mail and early voting.
“Covid is going to make it really difficult for people to understand what’s going on and how to vote,” Emily Dalton Smith, a director of social impact products at Facebook, said in an interview. She said the voting information center would help people get necessary and accurate information for the fall elections.
The feature will roll out at the top of the news feeds for American users of Facebook and Instagram. Facebook has set a goal of helping more than four million people register to vote through the initiative, and it estimated that half of the U.S. population would see information on how to vote in the November elections.
Kate Conger contributed reporting from Oakland, Calif., and Cecilia Kang from Washington.