Although Facebook has said it wants users to get the content they desire on the platform, the company’s algorithms and policies make it clear it believes it knows better than you what you should be reading.
For example, “Facebook recently stated that it plans to temporarily reduce the visibility of political content for some users in the U.S., Canada, Brazil, and Indonesia as the company faces increased scrutiny over its ability to moderate misinformation and ‘hateful content,’” according to a Breitbart report.
Of course, Facebook isn’t going to tell users what it is hiding. It will decide for you, because it is Facebook and it knows best.
The changes won’t immediately impact everyone, but Facebook admitted it will “temporarily reduce the visibility of political content” for a select group.
To read more about Facebook’s actions to control what you read and see, click here.