On the claim that the change to "Meaningful Social Interactions" in 2018 amplified polarizing and hateful content: "The goal of the Meaningful Social Interactions ranking change is in the name: improve people's experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends - which research shows is better for people's well-being - and deprioritizing public content. Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook even existed, and that it is decreasing in other countries where Internet and Facebook use has increased."

On the claim that incentives within Facebook are misaligned, and that the desire for engagement on the platform and profit outweighs safety in some instances: "Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business. Our incentive is to provide a safe, positive experience for the billions of people who use Facebook. That's why we've invested so heavily in safety and security. We've invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. We have a strong track record of using our research - as well as external research and close collaboration with experts and organizations - to inform changes to our apps. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago."