By Jeff Horwitz

Facebook Inc. said it is beginning to reduce how much political content users see in their news feeds, potentially diminishing the role that the world's largest social network plays in elections and civil discourse more broadly.

The announcement, made in a Wednesday blog post, follows Facebook Chief Executive Mark Zuckerberg's declaration on the company's earnings call last month that most users wanted to see less political content. He said at the time that cutting back on politics would allow Facebook to "do a better job of helping to bring people together and helping to promote healthier communities."

Facebook says that political content currently constitutes only 6% of what people see on the platform. It will begin running experiments to reduce that amount for a small percentage of people in Canada, Brazil and Indonesia immediately, with tests in the U.S. in weeks to come.

The company said it isn't removing political content but rather exploring ways to reduce the exposure for users who would prefer not to see it. In practice, that means Facebook will still allow users to post about politics and argue among friends, but its algorithms will de-prioritize those conversations and spread them less widely across the network, especially for people who have not expressed interest in those topics.

The company didn't specify how it will define political content.

Facebook said on Wednesday that its new efforts will be gradual and will accompany tools the platform already offers users, such as the ability to opt out of political ads or to ensure that content from chosen entities appears high in the news feed.

The effort marks a pivot for Facebook, which has historically relished its role as a central, populist actor in elections and social movements around the world. In an October 2019 speech, Mr. Zuckerberg declared social media to be "the fifth estate," a center of civic power on par with the press as well as the executive, legislative, and judicial branches of government.

Mr. Zuckerberg said on the company's earnings call that he is rethinking the place of politics on the platform as part of a continuing effort to "turn down the temperature and discourage divisive conversations and communities." The shift comes after a bruising U.S. presidential election that twice led Facebook to invoke what it called emergency "break glass" measures to calm civic discourse. The first was just after the election in November and the second was in the wake of Trump supporters storming the U.S. Capitol on Jan. 6.

Those emergency measures were designed to be temporary but some, such as restrictions on how quickly certain Facebook Groups can grow, are now permanent.

The Wall Street Journal previously reported that internal Facebook research concluded ahead of the election that the most active political Facebook Groups were incubating a toxic brew of hate speech, conspiracy theories and calls to violence.

Depending on how far Facebook's overhaul goes, the reduction in the visibility of political content could jumble the ecosystem of online activism, publishing and advertising that has grown up around the social network and its reported 2.8 billion monthly users. It also could reanimate complaints, largely from conservatives, that the company is stifling political speech.

The ultimate prominence of political content will depend on testing and user feedback, the company says.

"Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person's appetite for it at the top of their News Feed," said the blog post by Facebook product manager Aastha Gupta.

Facebook said it would exempt from the tests health information from organizations that it classifies as authoritative, as well as content from official government agencies.

The move isn't the first time that Facebook has tinkered with the prominence of particular categories of content on its platforms. After gearing its content-recommendation algorithms to increase how much time users spent on the platform and how often they responded to content, Facebook gradually shifted in the second half of the last decade toward maximizing a new metric: "meaningful social interaction."

The upshot was reduced attention to passively consumed content -- posts from businesses, brands and media, according to Mr. Zuckerberg -- and greater prominence for material shared by a user's family, friends and interest-based groups. Outside researchers and the company's own researchers found that the changes corresponded to less traffic for many newspaper publishers but more attention for stories that elicited strong reactions from users.

The Journal reported last year that internal research indicated Facebook's algorithms were rewarding the purveyors of polarizing content that elicited an emotional response from users.

Ms. Gupta also said the company would publicly discuss changes as it made them.

"As we embark on this work, we'll share what we learn and the approaches that show the most promise," she wrote.

Facebook is facing other hot-button political issues in the near term. Its new independent content-oversight board is set to determine later this year whether Facebook erred in suspending former President Donald Trump from its platform.

Write to Jeff Horwitz at Jeff.Horwitz@wsj.com
