Facebook announced this morning a set of new rules designed to further penalize those who violate its community standards, particularly around Facebook Groups, along with rules intended to crack down on misinformation circulating in these more private spaces. The changes will affect those who led groups that were eventually banned, as well as members who participated in them. Among other things, the new guidelines will also exclude some of the more potentially harmful groups from Facebook's group recommendations.
Facebook's existing recidivism policy was intended to discourage people from creating new groups similar to those banned for violating its community standards, but the rule applied only to group admins. Now Facebook says that admins and moderators alike will not be allowed to create any new groups for "a period" after their group has been taken down for a policy violation. That period is 30 days, Facebook tells us. If the admin or moderator tries to create another violating group after those 30 days, they will be paused again for another 30 days.
In addition, group members who have had community standards violations in a group will now require post approval for the next 30 days, meaning all of their posts must be pre-approved by a group admin or moderator. This could help groups deal with members whose activity is repeatedly flagged, but it could also overwhelm groups with a large number of offending members. And if admins or moderators then approve a post that violates community standards, Facebook says the group will be removed.
Facebook will also require groups to have an active admin, since admins sometimes get busy, step down, or leave their group. Facebook will now try to identify groups without an admin and proactively suggest admin roles to members who may be interested. You may have already received notifications from some of your groups that an admin is needed; if so, it is because Facebook has identified you as someone who could run the group, since you have no history of violations. Another change affects which groups are recommended to users.
Health groups will no longer be recommended, as people need to get their health information from credible sources, the company said.
Unfortunately, this change alone only reduces the visibility of health misinformation; it does nothing to actively stop it. Because health groups can still be found through search, users can easily find groups that match their beliefs, even when those beliefs are harmful to themselves or others. Today, a large number of groups continue to share false health information or push members toward new and untested cures.
Conclusion:
This suggests that much of Facebook's work in this area is performative rather than effective. A one-time sweep of harmful groups is not the same as dedicating resources and staff to the ongoing task of pushing these dangerous radicalized groups, violence-prone organizers, and anti-science adherents to the fringes of society.