I was alone in the virtual world, meandering through a side corridor of a simplistically rendered palace, when a stranger entered.

“Hello?” a small voice called from the palace’s central plaza. “Where are you?”

I hesitated. The voice sounded like it belonged to a child.

“I can’t find you,” the child said, plaintively. “Can you give me a hint?”

In theory, kids aren’t allowed in the game. The new virtual-reality app Horizon Worlds, the first foray into the much-hyped “metaverse” for Facebook parent company Meta, is limited to adults 18 and older.

In practice, however, very young kids appear to be among its earliest adopters. The person I met that day, who told me they were 9 and using their parents’ Oculus VR headset, was one of many apparent children I encountered in several weeks on the app. And reviews of Horizon Worlds include dozens of complaints about youngsters, some of them foulmouthed and rude, gleefully ruining the experience for the grown-ups.

But experts say the presence of children in Meta’s fledgling metaverse raises a graver concern: that by mixing children with adult strangers in a largely self-moderated virtual world, the company is inadvertently creating a hunting ground for sexual predators.

When new online forums arise that attract kids, sexual predators “are often among the first to arrive,” said Sarah Gardner, vice president of external affairs at Thorn, a tech nonprofit that focuses on protecting children from online sexual abuse. “They see an environment that is not well protected and does not have clear systems of reporting. They’ll go there first to take advantage of the fact that it is a safe ground for them to abuse or groom kids.”

Once a predator has identified a target, they’ll try to isolate the child, gain their trust, and coax or bribe them into sending nude images or videos of themselves or even meeting up in person. It’s a problem that has been documented on social apps such as Instagram and Discord along with games such as “Fortnite” and “Minecraft,” many of which have taken steps to address it.

Meta appears to have done little to address the possibility of child-grooming specifically, despite throwing huge amounts of resources into the development of the metaverse, even changing its corporate name from Facebook in October to reflect a new emphasis on virtual reality.

Asked whether Meta views young children on Horizon Worlds and Horizon Venues, a sister app focused on live VR events, as a problem, spokeswoman Kristina Milian emphasized that social VR is an emerging and rapidly evolving medium and that the company is figuring things out as it goes.

“We will continue to look at how people use Quest devices and how the product develops in making decisions about our product policies,” Milian said. “Our goal is to make Horizon Worlds and Horizon Venues safe, and we are committed to doing that work.”

Meta did not respond to a question about whether it had received any reports of child exploitation or grooming in Horizon Worlds. It also declined to say whether it had taken any measures aimed at protecting children from those threats.

It’s impossible to know with certainty that an avatar is a child, but judging by their voices and actions, they’re hard to miss. For a company that is building what it hopes is the future of online interaction, the failure to enforce an age limit — one of its most basic rules — in Horizon Worlds would seem to be an ominous sign.

The apparent 9-year-old I met in that crudely drawn palace, part of a Horizon Worlds minigame called “Magic Mania” involving wands and spells, didn’t hesitate to share their age or the fact that they were using a parent’s Facebook account. I didn’t ask more questions. I told them to be careful and left the virtual room. I couldn’t help wondering how the interaction might have gone if the strange adult in the game with them had been a predator instead of a journalist.

For Meta, a lot is riding on Horizon Worlds.

CEO Mark Zuckerberg and company built a social media empire over the past 18 years, but the flagship Facebook app has plateaued and the company’s reputation has been battered by ceaseless waves of scandal over its handling of user data, its moderation of the content users post, and its influence on public opinion and debate. In October, under renewed pressure from regulators in multiple countries over leaks that showed it seeming to prioritize business dominance over ethics and users’ well-being, Zuckerberg announced the rebranding from Facebook to Meta.

In the future, Zuckerberg said, people will meet up not on text-based social networks but in virtual physical spaces, where their avatars will move around, interact, play games, shop and hold business meetings. While Facebook was hardly the first company to embrace the metaverse — which Zuckerberg described as an “embodied internet” — its dramatic pivot ignited a frenzy of interest in the concept as rival tech giants, start-ups and venture capitalists raced to capitalize.

Horizon Worlds, a beta version of which featured prominently in Zuckerberg’s announcement, launched Dec. 9 in the United States and Canada on the company’s Oculus virtual-reality platform and represents its first major attempt to deliver on his vision.

Users represented by cartoonish, legless avatars move around its three-dimensional landscape, interacting in real time with whoever happens to be in the same area simply by talking into their headsets. Within Horizon Worlds are a wide array of minigames and experiences, a few built into the app by Meta but most — including “Magic Mania” — created by other users with the help of the app’s development tools.

Imagine the wildly popular kids’ games “Minecraft” and “Roblox,” but in virtual reality — and, so far, with far fewer users, since the programs are available only via Meta’s own pricey Quest and Rift headsets. That potential audience, however, is growing fast: The Oculus companion app was by some metrics the most downloaded app in the United States over the Christmas holiday week, suggesting that the headset was a popular gift.

A key difference is that Horizon Worlds is ostensibly adults-only, at least for now. As a result, it lacks the parental controls and guardrails for younger users, such as disabling chat functions, that “Minecraft” and “Roblox” have implemented. Instead, it focuses on empowering users to control their own experience by muting, blocking or reporting bad actors.

Meta’s Milian said the company designed the app with safety in mind, pointing to a November blog post from Chief Technology Officer Andrew Bosworth on keeping people safe in VR and beyond. “Technology that opens up new possibilities can also be used to cause harm, and we must be mindful of that as we design, iterate, and bring products to market,” Bosworth wrote.

Users who are being harassed, feel uncomfortable or just need a break can raise their wrist — a movement echoed by their virtual avatar — and tap a button with their other hand to activate a “safe zone” feature that pulls them out of their surroundings and brings up a menu of moderation options. The feature is explained in a tutorial when you first download Horizon Worlds, and in-app billboards serve as reminders. On Friday, Meta announced a new measure called “personal boundary” that keeps users’ avatars four feet apart by default, with the goal of “making it easier to avoid unwanted interactions.”

The app continually records each user’s experience, storing the last few minutes on a rolling basis for moderators to review if needed. There are also some human “community guides” in the app — represented by specially marked avatars — who can answer questions and help if summoned. But those moderators are sparse except in the app’s few most populated spaces, and some users say they rarely intervene proactively or enforce the app’s age restrictions.

“Age verification does not eliminate children using those technologies,” said Eliza McCoy, executive director for outreach, training and prevention at the nonprofit National Center for Missing and Exploited Children, or NCMEC, a U.S. government-sanctioned clearinghouse for reports of child exploitation by online service providers. “We know that two populations get around those [restrictions].” One is “kids who are trying to get on there even though they know they’re not allowed, because it’s interesting and engaging and fun.” The other: “adults who are looking to offend against children,” sometimes by pretending to be kids themselves.

McCoy said the NCMEC received a record 37,872 reports of “online enticement” — attempts by adults to communicate with kids to sexually abuse or abduct them — across all platforms in 2020. That number was up 97 percent from 2019, an increase McCoy said was driven in part by the coronavirus pandemic pushing more people to interact online.

Children aren’t the only ones at risk if Meta doesn’t get moderation right. Bloomberg News’s Parmy Olson wrote in December that she found Horizon Worlds to be male-dominated, populated with trolls (including kids), and “deeply uncomfortable at times” for a woman because of unwanted advances that bordered on harassment. In November, a beta tester reported being virtually groped in the app, MIT Technology Review reported; an internal review by Meta found that the user didn’t deploy the “safe zone” feature.

The early reviews of Horizon Worlds on the Oculus store read like a litany of complaints about boorish behavior by unwelcome youngsters. The game had a rating of just over three out of five stars as of Feb. 4, with many users lamenting that its potential was undercut by ineffectual moderation. Similar complaints dot other online forums that discuss the app, and users who spoke to The Washington Post — including some I spoke with inside the app — unanimously agreed that kids are a problem.

David Corbett, a 56-year-old Horizon Worlds user from Des Moines, delved into the social VR app last month after getting an Oculus headset for Christmas. Corbett, a product specialist who works on medical devices, said that he sees great potential in the platform but that the prevalence of youngsters in the app is a daily annoyance and a serious safety concern.

“Every session, I have seen kids who sound very young,” he said. “A lot of them are being rude and waving their hands in people’s faces and jumping around.”

Corbett has taken to blocking the kids when he encounters them, which means he no longer hears or sees them in the app, though other users still can. But he said conversations with other adult users have persuaded him to try reporting underage kids to Meta as well, out of concern for their safety.

“A kid running around unsupervised I think is a big red flag or target to someone who might be a predator,” Corbett said. “It would be easy for someone to say to that kid, ‘Hey, come on over to this side of the room where no one else can hear.’ And that worries me.”

Jeff Haynes, senior video games editor for the nonprofit Common Sense Media, which reviews entertainment with an eye to age appropriateness, said he has found the pervasiveness of kids in Horizon Worlds “alarming,” encountering youngsters 10 and younger every time he has used it.

“Since there are no guardrails there, you have kids that are volunteering a ton of information,” he said. “Maybe part of it is because the graphics almost feel akin to other kid experiences, like a ‘Roblox’ or a ‘Minecraft,’ but they’re not taking any of the precautions to safeguard themselves in a virtual-world experience that is open to pretty much anybody. And that’s hazardous. If they’re willing to volunteer their age, or that they’re not in school, or the state they happen to be in, what’s to say they’re not going to volunteer where they live, or more details about themselves?”

He echoed Gardner’s concern that the app may soon begin to attract sexual predators, if it hasn’t already.

Meta’s age policies, Haynes said, are a “paper tiger.” While there are some age barriers to setting up the headset, once it’s tied to an adult’s Facebook account, anyone who puts it on can access all the same apps and experiences, regardless of their age rating. Horizon Worlds is free to download and requires no additional age verification.

Congress has been exploring ways to better protect children and teens online, partly in response to concerns over young people’s use of Meta-owned Instagram. A 1998 law designed to protect kids’ privacy online limits the data that companies can collect from those younger than 13, which helps to explain why many online platforms require users to be 13 or older.

“It’s not anything to do with age appropriateness or child development,” the NCMEC’s McCoy said of the under-13 cutoff. “It’s just that they can’t collect data on those users, so why bother?”

In an ideal world, parents would keep tabs on young kids’ gaming activity, Haynes noted. But many aren’t yet familiar with Horizon Worlds or other social virtual-reality apps. Unlike phones, computers or gaming consoles, VR headsets have no external display that parents can see while their kids are playing, and there’s no telltale record of their kids’ interactions on them.

Haynes has gotten a few questions from parents who said they could hear their kid talking to someone and wondered if they should be worried. The child often assures the parent that they’re talking to “a friend” — though for kids growing up in a digital age and a pandemic, he said, that could mean someone they’ve just met online.

Gardner, of Thorn, said Facebook has been an industry leader in identifying and reporting child sex-abuse material on its social media platforms, and she’s hopeful it will take child safety seriously as it builds its next generation of platforms. But she said she worries the company faces incentives to sweep the problem under the rug or ignore it until after serious harm is done.

“We’ve been playing catch-up against Web 2 for 10 years, cleaning up the fact that we didn’t have child safety in our minds” when it was built, Gardner said, referring to the generation of Internet platforms that includes social networks such as Facebook and Instagram. “That will happen again with Web 3 unless we push the creators of those environments to do it differently.”

Haynes said he is hopeful that Meta will rethink its approach, prioritize either keeping kids out of the app or keeping them safe within it, and invest much more heavily in moderation. But he said it needs to happen soon, given the company’s grand ambitions for the metaverse. Until then, Haynes said, “it’s really open season for anybody that happens to have these devices.”
