How Facebook Takes Care of Your News Feed
- 28 Sep 2021
Facebook recently came under fire again after the Wall Street Journal investigated new transparency issues. This is not the first time Facebook has had to address public concerns about how the platform moderates content. The latest controversy, dubbed the “Facebook Files,” covered the special XCheck moderation program for celebrities, the platform’s effect on teens, and other problems. Now Facebook has decided to take a step toward rebuilding public trust by explaining how its News Feed algorithm works.
Facebook has shared details about its algorithm before, but this time the explanation focuses specifically on the rules that limit the distribution of posts in News Feed. Indeed, some posts get less reach than others, and there are reasons for that. Facebook explained them, so here are the types of content that reduce a post’s distribution:
- Engagement bait that prods users to like or share a post (to vote in a poll, for example) rather than because they genuinely liked it;
- Ad farms that direct users to ad-riddled pages;
- Comments that should be hidden under Facebook’s moderation policy;
- Links that request users’ personal data, or links to suspected cloaking domains;
- Low-quality comments (such as a nickname only), poor-quality videos, or links to low-quality websites that display poorly;
- Pages that spread spam or dangerous health-related content.
If a post contains any of these types of content, its potential to reach a wide audience is reduced. That is how Facebook fights those who abuse the social platform for their own business. These limits also give Facebook users a “cleaner” News Feed, with a lower risk of being caught by dangerous links or misled by false information. Do you feel safer knowing that Facebook takes care of you in this way? Or do you think these measures are unnecessary? Share your opinion in the comments below.