A tool which gives more information about publishers and why their content is appearing on a user’s News Feed is also being extended to images, the social network said.
In a wide range of updates, a new section has also been added to Facebook’s Community Standards website which will enable users to track changes the company makes to its policies.
Facebook has been at the centre of the debate around the influence of social media on society, well-being and political interference in recent years, and has also been widely criticised for its business practices in relation to personal data.
The company has been attempting to regain public trust since the Cambridge Analytica scandal last year, with a number of changes to its service.
In a blog post outlining the latest changes, Facebook’s vice president of integrity Guy Rosen and head of news feed integrity Tessa Lyons said the updates aimed to “keep people safe and maintain the integrity of information that flows through the Facebook family of apps”.
The new ranking signal, known as “Click-Gap”, maps websites on a graph based on the number of links going to and from them across the web.
The tool will identify sites receiving a “disproportionate” number of clicks from Facebook compared with their prominence in that graph, and show less of their content in the News Feed.
The company said such a reading “can be a sign that the domain is succeeding on News Feed in a way that doesn’t reflect the authority they’ve built outside it and is producing low-quality content”.
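Facebook has not published how Click-Gap is calculated, but the idea described above can be sketched roughly as comparing a domain's share of Facebook-driven clicks against its standing in the wider web link graph. The function names, figures and threshold below are all illustrative assumptions, not Facebook's actual method:

```python
# Illustrative sketch only: Facebook has not published the Click-Gap formula.
# All names, figures and the threshold here are invented for demonstration.

def click_gap(facebook_click_share, web_graph_authority):
    """Compare a domain's share of News Feed clicks against its authority
    in the wider web link graph (both normalised to the range 0-1).
    A large positive gap suggests the domain's Facebook traffic
    outstrips the standing it has built on the open web."""
    return facebook_click_share - web_graph_authority

def flag_for_demotion(domain_stats, threshold=0.5):
    """Return domains whose click gap exceeds an arbitrary threshold."""
    return [domain for domain, (fb, web) in domain_stats.items()
            if click_gap(fb, web) > threshold]

stats = {
    "established-news.example": (0.30, 0.80),  # authority matches its traffic
    "clickbait-farm.example":   (0.90, 0.10),  # nearly all traffic via Facebook
}
print(flag_for_demotion(stats))  # ['clickbait-farm.example']
```

On this toy reading, the second domain shows exactly the pattern the company describes: success on News Feed that does not reflect authority built outside it.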
The expanded context button, which currently allows users to get more information about a publisher and its articles, will now also be used for images, with relevant posts being reviewed by third-party fact-checkers.
Facebook also said it would begin demoting News Feed posts from groups that repeatedly share misinformation, meaning groups whose content has repeatedly been rated false by independent fact-checkers.
The social network said it is also looking to expand its network of independent experts to help “fight false news”.
“We need to find solutions that support original reporting, promote trusted information, complement our existing fact-checking programs and allow for people to express themselves freely — without having Facebook be the judge of what is true,” Facebook said.
“Any system we implement must have safeguards from gaming or manipulation, avoid introducing personal biases and protect minority voices.”