Grok’s ‘Spicy’ Mode Makes NSFW Celebrity Deepfakes of Women (But Not Men)

News Room
Last updated: August 6, 2025 6:29 pm

This week, Elon Musk officially launched Grok Imagine, xAI’s image and video generator for iOS, for people who subscribe to SuperGrok and Premium+ X. The app allows users to create NSFW content with its “Spicy” mode, and The Verge reported on Tuesday that users are able to create topless videos of Taylor Swift easily—without even asking for it. But it’s not just Swift who should be concerned about Musk’s new AI-generated softcore porn tool.

Gizmodo created about two dozen videos of politicians, celebrities, and tech figures using Grok's Spicy mode, though some were blurred out or came back with a message reading “video moderated.” When Grok did make scandalous videos, only the ones depicting women were truly not-safe-for-work. The videos of men were tame enough that they would hardly raise an eyebrow.

X has been swamped over the past two days with AI-generated images of naked women and tips on how to achieve the most nudity. But users, who’ve created tens of millions of Grok Imagine images according to Musk, don’t even need to go to some great effort to get deepfakes of naked celebrities. Gizmodo didn’t explicitly ask for nudity in the examples we cite in this article, but we still got plenty of it. All we did was click on the Spicy button, which is one of four options, along with Custom, Fun, and Normal.

Gizmodo tested Grok Imagine by generating videos of not just Taylor Swift, but other prominent women like Melania Trump and historical figures like Martha Washington. Melania Trump has been a vocal supporter of the Take It Down Act, which makes it illegal to publish non-consensual “intimate imagery,” including deepfakes.

Grok also created a not-safe-for-work video of the late feminist writer Valerie Solanas, author of 1967's SCUM Manifesto. Almost all of the videos depicted the women we tested shedding their clothes until they were naked from the waist up, though the video of Solanas was unique in that it showed her completely naked.

What happens when you try to generate Spicy videos of men? The AI will have the male figure take off his shirt, but nothing more scandalous than that. Once Gizmodo figured out that it would only remove a man's shirt, we prompted the AI to create a shirtless image of Elon Musk to see what it might do with that. The result was an extremely ridiculous (and safe-for-work) video.

Attempts to make videos of Mark Zuckerberg, Jeff Bezos, Joaquin Phoenix, and Charlie Chaplin, as well as Presidents Barack Obama, Bill Clinton, and George Washington, ran into the same limitation. The AI-generated videos will have the men take their shirts off most of the time, but there’s nothing beyond that. And if there is anything more, it’s usually so cringe that we’d worry about users dying from secondhand embarrassment. Making a Spicy video of Errol Musk, Elon’s father, produced the same thing. He just took off his shirt.

When we made a generic man to see if Spicy mode would be looser with its sexual content when the subject wasn't a known public figure, it still just made a bizarre, awkward video of a man tugging at his pants. The pants, it should be noted, appeared to be a combination of shorts on one leg and long jeans on the other before transforming into just shorts. The audio for each video was also auto-generated without any further instruction.

Trying the same thing with a generic woman produced much more revealing imagery: a woman in a swimsuit who pulls down her top to expose her bare breasts.

Most mainstream AI video generators, like OpenAI’s Sora and Google’s Veo, have guardrails to protect against the creation of things like revenge porn and images of celebrities. And it seems like xAI does in some ways, at least for men. But most people would probably object to their image being used to create an AI avatar in various states of undress. Gizmodo reached out to Musk through xAI to ask about safeguards and whether it’s acceptable for users to create topless videos of celebrities. We haven’t heard back.

One of the most striking things about Grok’s AI image generator is that it’s often terrible at making convincing celebrity fakes. For example, the images below were generated when asking for Vice President JD Vance and actress Sydney Sweeney. And unless we completely forgot how those two people look, it’s not even close. That could turn out to be Musk’s saving grace, given the fact that a tool like this is bound to attract lawsuits.

[Image: Phone screenshots of images created by SuperGrok that are supposed to depict Sydney Sweeney and JD Vance. Screenshots: xAI]

There were other glitches, like when we created an AI-generated image of President Harry Truman that looked very little like him, and the man’s nipples appeared to be on the outside of his dress shirt. Truman, in Spicy mode, did take off his shirt to reveal his bare chest, which had identical nipples.

When Gizmodo created images using the prompt “Gizmodo writer Matt Novak,” the result was similar to what we saw with videos for Elon Musk and generic men. The figure (who, we should note, is in much better shape than the real Matt Novak) took off his shirt with a simple click of the Spicy button.

As The Verge notes, there is an age verification window when a user first tries to create a video with Grok Imagine, but there doesn’t appear to be any kind of check by the company to confirm the year a given user was actually born. Thankfully, Gizmodo’s generation of a cartoon Mickey Mouse in Spicy mode didn’t render anything scandalous, just the animated character jumping harmlessly. An AI image of Batman yielded a “Spicy” result not unlike other male figures, where he only stripped his top off.

Gizmodo did not attempt to create any images of children, though The Verge did try that in Spicy mode and reports that nothing inappropriate was rendered. The “Spicy” mode was still listed as an option, however. “You can still select it, but in all my tests, it just added generic movement,” The Verge notes. Elon Musk infamously reinstated an account on X that had posted child sexual abuse material in 2023, according to the Washington Post.

It’s perhaps not surprising that Elon Musk’s new NSFW video creator has different standards for men and women. The billionaire recently retweeted a far-right figure who claimed that women are “anti-white” because they’re “weak.” The Tesla CEO, who suggested in 2024 that he wanted to impregnate Taylor Swift, isn’t exactly known for being a champion of women’s rights.

Gizmodo signed up for the $30 per month SuperGrok subscription and only got to test it for about 1.5 hours before we were told we’d reached our image creation limit. Strangely enough, users can still create a single still image for a prompt after getting the warning and generate NSFW videos using that lone image, but it’s much more limited than what was previously available.

We were told to upgrade to SuperGrok Heavy for $300 per month if we wanted to continue using the tool with all its features. But given the fact that we didn’t need any more shitty images of naked celebrities to write this article, we declined. We got the answers we were looking for, unfortunately.
