Facebook is retiring the secret and closed privacy settings for groups—the groups themselves aren’t going anywhere—in an attempt to simplify group privacy from three options to two: public and private.
Groups product manager Jordan Davis said in a Newsroom post that the move was made because users expressed a desire for more clarity in the privacy settings for groups, adding, “Having two privacy settings—public and private—will help make it clearer about who can find the group and see the members and posts that are part of it. We’ve also heard that most people prefer to use the terms ‘public’ and ‘private’ to describe the privacy settings of groups they belong to.”
Anyone on Facebook can see who belongs to public groups and content that is shared in those groups. For private groups, only members can see who belongs and what they have posted.
By default, groups that had been secret will now be private and hidden, and groups that were closed will now be private and visible. Public groups will remain public and visible.
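The default migration described above amounts to a simple mapping from the old single privacy setting to a pair of new settings. The sketch below illustrates that mapping in Python; the names and function are purely illustrative and not Facebook’s actual API.

```python
# Illustrative mapping of legacy group privacy settings to the new
# (privacy, visibility) pair, per Facebook's described defaults.
# Names are hypothetical, not Facebook's real identifiers.
OLD_TO_NEW = {
    "secret": ("private", "hidden"),
    "closed": ("private", "visible"),
    "public": ("public", "visible"),
}

def migrate(old_setting: str) -> tuple[str, str]:
    """Return the (privacy, visibility) pair for a legacy setting."""
    return OLD_TO_NEW[old_setting]
```

Under this mapping, a formerly secret group stays undiscoverable (private and hidden), while a closed group becomes private but still findable in search.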
Group administrators can find the new controls in their group settings, and, as before, restrictions govern when they can change a group’s privacy setting. Members are notified whenever a group’s privacy settings are updated.
Davis pointed out that Facebook has added over 30,000 people to its safety and security teams over the past few years, adding that a specialized team has been working on its Safe Communities Initiative with the aim of protecting people who use groups on the social network.
He wrote, “Increasingly, we can use AI (artificial intelligence) and machine learning to proactively detect bad content before anyone reports it, and sometimes before people even see it. As content is flagged by our systems or reported by people, trained reviewers consider context and determine whether the content violates our community standards. We then use these examples to train our technology to get better at finding and removing similar content.”
Alison said the Safe Communities Initiative is made up of product managers, engineers, machine learning experts and content reviewers, who work together to anticipate ways people could use groups to cause harm and to develop solutions that minimize and prevent those actions.
He wrote, “Deciding whether an entire group should stay up or come down is nuanced. If an individual post breaks our community standards, it comes down, but with dozens, hundreds or sometimes thousands of different members and posts, at what point should a whole group be deemed unacceptable for Facebook?”
Factors that the social network weighs include whether the group’s name or description contains hate speech or other forbidden content, as well as the actions of admins and moderators.
Alison explained, “If group leaders often break our rules, or if they commonly approve posts from other members who break our rules, those are clear strikes against the overall group. And if a group member repeatedly violates our standards, we’ll start requiring admins to review their posts before anyone else can see them. Then if an admin approves a post that breaks our rules, it will count against the whole group.”
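Alison’s description amounts to a group-level strike policy: violations by leaders, or admin approval of a violating post, count against the whole group, and repeat violators get their posts held for admin review. The sketch below models that logic; the threshold, class names and fields are assumptions for illustration, not Facebook’s implementation.

```python
# Hypothetical sketch of the group-level strike policy described above.
# The threshold and all names are illustrative assumptions.
from dataclasses import dataclass

REPEAT_THRESHOLD = 3  # assumed cutoff for "repeatedly violates our standards"

@dataclass
class Member:
    violations: int = 0          # count of posts that broke community standards
    needs_review: bool = False   # future posts require admin approval

@dataclass
class Group:
    strikes: int = 0             # strikes counted against the whole group

def record_violation(group: Group, member: Member,
                     author_is_leader: bool = False,
                     approved_by_admin: bool = False) -> None:
    """Register one rule-breaking post and apply the described consequences."""
    member.violations += 1
    # Repeat violators must have future posts reviewed before anyone sees them.
    if member.violations >= REPEAT_THRESHOLD:
        member.needs_review = True
    # Rule-breaking by group leaders, or an admin approving a rule-breaking
    # post, counts as a strike against the overall group.
    if author_is_leader or approved_by_admin:
        group.strikes += 1
```

The design point is that individual misbehavior only escalates to the group when its leadership is implicated, either by breaking rules themselves or by waving violating content through.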
Alison wrote, “We help admins to establish positive group norms by adding a section for rules so that they can be clear about what is and isn’t allowed. Admins and moderators also have the option to share which rule a member broke when declining a pending post, removing a comment or muting a member.”
And he pointed out that before joining a group, Facebook users see relevant details about it, such as who its admins and moderators are and whether it had a different name in the past. People can also preview groups when deciding whether to accept or decline invitations to join them.
Alison concluded, “Through the Safe Communities Initiative, we’ll continue to ensure that Facebook groups can be places of support and connection, not hate or harm. There’s always more to do, and we’ll keep improving our technology, tools and policies to help keep people safe.”