In a much-needed step towards a safer internet for children, Meta's latest set of parental supervision tools helps monitor children and teens as they traipse through the world of virtual reality.
Launched on June 14, the update allows parents and guardians to block apps and web browsers directly, see their children's screen time and friend lists, and turn off the Link and Air Link features on Quest headsets, which can otherwise be used to access blocked content on a connected personal computer. The supervision tools also let guardians view app downloads and purchases on their teen's headset, and teens can optionally be required to notify their guardians and request approval for purchases. The company will also launch a new parent education hub with a how-to guide for the VR supervision tools.
"With VR technologies increasingly gaining traction, and the Quest becoming a favorite product of many youth, parents and guardians will now have access to a suite of tools to safeguard and stay involved with their teen's participation and experiences," wrote Dr. Sameer Hinduja, co-director of the Cyberbullying Research Center, in the update's announcement.
In addition to the new safety tools for virtual reality, Meta is also expanding teen well-being resources for Instagram users. Parents will be able to set specific "quiet hours" during the day or week for their child's use, and see more information on accounts and posts reported by monitored users. The app will also begin alerting users to switch topics on their Explore Page after they've scrolled through the same content for an app-designated amount of time. According to the company, the alert is "designed to encourage teens to discover something new and excludes certain topics that may be associated with appearance comparison." Instagram will also feature new "Take a Break" videos when a user has been scrolling through Instagram Reels for too long, much like TikTok's screen time prompts.
The new parental controls in VR debut only a few days after law firm Beasley Allen filed eight lawsuits against Meta for, as it claims, failing to adequately protect children and "exploiting young people for profit." It's the latest in a wave of criticism aimed at the company over its apparent lack of concern for adolescent safety, following accusations last year that its social media platforms ignored concerns about teen mental health, accusations that led to congressional testimony by Instagram head Adam Mosseri in 2021.
Following the launch of Meta's Horizon Worlds (a VR "creator space" for users to connect and build virtual worlds) and its new "safety-focused" features, users and researchers alike expressed concern that young users would still be easily exposed to unmoderated hate speech and harassment. Meta later added a "garbled voices" filter to Horizon Worlds that turns the voice chats of VR strangers into unintelligible, friendly sounds, and a "personal boundary" feature intended to block harassment by uninvited users. Then in May, Meta announced new locking tools to block specific apps from a user's Quest headset in response to concerns that teens and children with unsupervised access were being exposed to inappropriate virtual reality spaces.
This announcement isn't Meta's first attempt to make its apps and new tech safer for young people, and it likely won't be the last. In March, Meta launched the Family Center for Instagram, which houses the app's teen safety and parental monitoring tools, including supervision dashboards where guardians can monitor activity, followers, and frequently-interacted-with accounts. The center also includes educational resources about online safety for families, created in partnership with outside organizations like The Trevor Project. More recently, Instagram added the ability to filter sensitive content, like graphic violence or sexually explicit posts, out of users' feeds.
In the virtual reality world of Meta Quest, a realm of almost frighteningly varied possibilities, tools like these are an even more pressing need. But user-oriented controls still have their limits in a space where even adults can't escape harassment, and they raise the question of how companies can address the damage already done.