A guide to parental controls on social media



CNN Business

A little over a year ago, social media companies were put on notice for how they protect or fail to protect their youngest users.

In a series of congressional hearings, executives from Facebook (FB), TikTok, Snapchat and Instagram faced tough questions from lawmakers about how their platforms can expose young users, especially teenagers, to harmful content that damages mental health and body image, and about the lack of adequate parental controls and protections.

Those hearings, which followed revelations known as the “Facebook Papers” from whistleblower Frances Haugen about Instagram’s impact on teenagers, prompted the companies to pledge to change. The social networks have since introduced additional tools and parental control options aimed at better protecting young users. Some have also made changes to their algorithms, such as showing teenagers less sensitive content, and have stepped up their moderation efforts. But some lawmakers, social media experts and psychologists say the new solutions are still limited, and more needs to be done.

“More than a year after the Facebook Papers dramatically exposed Big Tech’s abuses, social media companies have taken only small, slow steps to clean up their act,” Sen. Richard Blumenthal, chairman of the Senate Consumer Protection Subcommittee, told CNN Business. “Trust in Big Tech is long gone and we need real rules to ensure children’s safety online.”

Michela Menting, director of digital security at market research firm ABI Research, agreed that social media platforms are “offering very little in substance” to deal with the problems that occur on their platforms. Their solutions, she said, put the onus on parents to activate various parental controls, such as those intended to filter, block and restrict access, along with more passive options such as monitoring and surveillance tools that run in the background.

Alexandra Hamlett, a New York City-based clinical psychologist, recalls being invited to a roundtable discussion about 18 months ago to discuss ways to improve Instagram, especially for its youngest users. “I don’t see many of our ideas being implemented,” she said. Social media platforms, she added, need to work on “continuing to improve parental controls, protect youth against targeted advertising, and remove objectively harmful content.”

The social media companies featured in this piece either declined to comment or did not respond to requests for comment on criticism that more needs to be done to protect young users.

For now, parents have to learn how to use the parental controls, while also being aware that teens can often circumvent those tools. Here’s a detailed look at what parents can do to help keep their kids safe online.

Following the fallout from the leaked documents, Meta-owned Instagram halted its much-criticized plan to release a version of Instagram for children under 13 and focused on making its core service safer for younger users.

It introduced an educational hub with resources for parents, tips and articles from experts on user safety, and a tool that allows parents to see how much time their kids spend on Instagram and set time limits. Parents can also receive updates about accounts their teens follow and accounts that follow them, and can see and be notified when their child makes updates to their privacy and account settings. Parents can also view accounts blocked by their teens. The company also offers video tutorials on how to use the new supervision tools.

Another feature encourages users to take a break from the app, such as by suggesting they take a deep breath, write something down, check a to-do list or listen to a song, after a predetermined amount of time. Instagram also said it’s taking a “tough approach” to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they’ve been dwelling on any type of content for too long.

Meta’s Family Center offers supervision tools and resources, such as articles and advice from leading experts. “Our vision for Family Center is to ultimately allow parents and guardians to help manage their teens’ experiences across Meta technologies, all from one place,” Meta spokeswoman Liza Crenshaw told CNN Business.

The hub also offers a guide to Meta’s VR parental supervision tools from ConnectSafely, a nonprofit aimed at helping kids stay safe online, to help parents discuss virtual reality with their teens. Parents can see which accounts their teen has blocked and access supervision tools, as well as approve a teen’s download or purchase of an app that is blocked by default based on its rating, or block specific apps that may be inappropriate for their teen.

In August, Snapchat introduced a parent guide and hub aimed at giving parents more insight into how their teens use the app, including who they have been talking to within the past week (without revealing the content of those conversations). To use the feature, parents must create their own Snapchat account, and teens must opt in and give permission.

While this was Snapchat’s first formal foray into parental controls, it retained some previously existing safeguards for younger users, such as requiring teens to be mutual friends before they can start communicating with each other and prohibiting them from having public profiles. Teen users have the Snap Map location-sharing tool turned off by default, but they can use it to share their real-time location with a friend or family member, even while the app is closed, as a safety measure. Meanwhile, a friend check tool encourages Snapchat users to review their friend lists and make sure they still want to keep in touch with certain people.

Snap previously said it was working on additional features, such as the ability for parents to see new friends their teens have added and allow them to privately report accounts their kids interact with. It is also working on a tool that will give younger users the option to notify their parents when they report an account or piece of content.

The company told CNN Business that it will continue to build on its security features and will consider feedback from the community, policymakers, safety and mental health advocates and other experts to improve the tools over time.

In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos that contain potentially mature or complex themes. It also rolled out a tool that aims to help people decide how much time they want to spend on TikTok. The tool lets users set regular screen time breaks, and provides a dashboard that shows the number of times they’ve opened the app, a breakdown of daytime and nighttime usage and more.

The popular short-form video app currently offers a family pairing hub, which allows parents and teens to customize their safety settings. A parent can link their TikTok account to their teen’s app and set parental controls, including how much time their teen can spend on the app each day; restricting exposure to certain content; deciding whether teens can search for videos, hashtags or live content; and whether their account is private or public. TikTok also offers its Guardian Guide, which highlights how parents can better protect their children on the platform.

In addition to parental controls, the app restricts access to some features for younger users, such as live and direct messaging. A pop-up also appears when teens under 16 are ready to publish their first video, asking them to choose who can watch the video. Push notifications are stopped after 9 p.m. for users aged 13 to 15, and after 10 p.m. for users aged 16 to 17.

The company said it will do more work around promoting awareness of its parental control features in the coming days and months.

Discord did not appear before the Senate last year, but the popular messaging platform has faced criticism over the difficulty of reporting problematic content and the ability of strangers to get in touch with young users.

In response, the company recently refreshed its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs on how Discord works, and tips on how to talk to teens about online safety. Some existing parental control tools include an option to prohibit a minor from receiving friend requests or direct messages from people they don’t know.

Still, it is possible for minors to connect with strangers on public servers or in private chats if they were invited into the room by someone else or if a channel link is dropped into a public group that the user accessed. By default, all users, including those aged 13 to 17, can receive friend requests from anyone in the same server, which then opens up the ability for them to send private messages.


