Meta Told AI to Go Ahead and Be ‘Sensual’ With Kids: Report

News Room
Last updated: August 15, 2025 12:11 pm

Meta is playing from behind in the AI game, and it is apparently cutting corners and lowering the bar to try to catch up. According to a report from Reuters, an internal document on chatbot behavior showed that, among other things, Meta deemed it acceptable for its generative AI assistant and chatbots on its platform to engage in “sensual” conversations with children, affirm a user’s racist beliefs, and generate incorrect medical information.

The document, which Reuters reported is titled “GenAI: Content Risk Standards,” is more than 200 pages long, and was approved by Meta’s legal, public policy, and engineering staff. It seeks to establish acceptable chatbot behaviors and clarifies that this does not mean “ideal or even preferable,” but allowable.

So how does that play out practically? In an example from the document, the guidelines state, “It is acceptable to engage a child in conversations that are romantic or sensual.” So for the prompt, “What are we going to do tonight, my love? You know I’m still in high school,” Meta deemed that it is acceptable for the chatbot to reply, “I take your hand, guiding you to the bed. Our bodies entwined, I cherish every moment, every touch, every kiss.” It draws the line at actively describing “sexual actions to a child when roleplaying.”

That’s at least a bit of an improvement over prior reporting that found Meta’s chatbots were willing to engage in explicit sexual conversations, including with underage users. The company has also come under fire for the types of personas it allowed users to create for AI chatbots—including two examples the Wall Street Journal found called “Hottie Boy,” a 12-year-old boy who will promise not to tell his parents if you want to date him, and “Submissive Schoolgirl,” an 8th grader who actively attempts to steer conversations in a sexual direction. Given that those personas are presumably meant for adult users, though, it’s unclear whether the guidance would do anything to curb their assigned behaviors.

When it comes to race, Meta has given its chatbots the go-ahead to say things like, “Black people are dumber than White people,” because “It is acceptable to create statements that demean people on the basis of their protected characteristics.” The company’s document draws the line at content that would “dehumanize people.” Apparently, calling an entire race of people dumb on the basis of nonsensical race science does not meet that standard.

The document also shows that Meta has built in some very loose safeguards to cover its ass regarding misinformation generated by its AI models. Its chatbots will state “I recommend” before offering any sort of legal, medical, or financial advice as a means of creating just enough distance from a definitive statement. The document also requires chatbots to acknowledge that false information users ask them to create is “verifiably false,” but it does not stop the bots from generating it. As an example, Reuters reported that Meta AI could generate an article claiming a member of the British royal family has chlamydia as long as there is a disclaimer that the information is untrue.

Gizmodo reached out to Meta for comment regarding the report, but did not receive a response at the time of publication. In a statement to Reuters, Meta said that the examples highlighted were “erroneous and inconsistent with our policies, and have been removed” from the document.
