2026-01-08 23:44:35

The biggest AI story of the first week of 2026 involves Elon Musk’s Grok chatbot turning the social media platform X into an AI child sexual imagery factory, seemingly overnight.
I’ve said several times on the 404 Media podcast and elsewhere that we could devote an entire beat to “loser shit.” What’s happening this week with Grok—designed to be the horny edgelord AI companion counterpart to the more vanilla ChatGPT or Claude—definitely falls into that category. People are endlessly prompting Grok to make nude and semi-nude images of women and girls, without their consent, directly on their X feeds and in their replies.
Sometimes I feel like I’ve said absolutely everything there is to say about this topic. I’ve been writing about nonconsensual synthetic imagery since before we had half a dozen different acronyms for it, before people called it “deepfakes,” and way before “cheapfakes” and “shallowfakes” were coined, too. Almost nothing about the way society views this material has changed in the seven years since it came about, because fundamentally—once it’s left the camera and made its way to millions of people’s screens—the behavior behind sharing it is not very different from the behavior behind sharing images made with a camera or stolen from someone’s Google Drive or private OnlyFans account. We all agreed in 2017 that making nonconsensual nudes of people is gross and weird, and today, occasionally, someone goes to jail for it, but otherwise the industry is bigger than ever. What’s happening on X right now is an escalation of the way it’s always been there, and almost everywhere else on the internet.
The internet has an incredibly short memory. It would be easy to imagine Twitter Before Elon as a harmonious and quaint microblogging platform, considering the four years After Elon have, comparatively, been a rolling outhouse fire. But even before it was renamed X, Twitter was one of the places for this content. It used to be (and for some, still is) an essential platform for getting discovered and going viral for independent content creators, and as such, it’s also where people are massively harassed. A few years ago, it was where people making sexually explicit AI images went to harass female cosplayers. Before that, it was (and still is) host to real-life sexual abuse material, where employers could search your name and find videos of the worst day of your life alongside news outlets and memes. Before that, it was how Gamergate made the jump from 4chan to the mainstream. The things that happen in Telegram chats and private Discord channels make the leap to Twitter and end up on the news.
What makes the situation this week with Grok different is that it’s all happening directly on X. Now, you don’t need to use Stable Diffusion or Nano Banana or Civitai to generate nonconsensual imagery and then take it over to Twitter to do some damage. X has become the Everything App that Elon always wanted, if “everything” means all the tools you need to fuck up someone’s life, in one place.
This is the culmination of years and years of rampant abuse on the platform. Reporting from the National Center for Missing and Exploited Children (NCMEC), the organization platforms notify when they find instances of child sexual abuse material and which then refers reports to the relevant authorities, shows that Twitter, and eventually X, has been one of the leading hosts of CSAM every year for the last seven years. In 2019, the platform reported 45,726 instances of abuse to NCMEC’s CyberTipline. In 2020, it was 65,062. In 2024, it was 686,176. These numbers should be considered with the caveat that platforms report to NCMEC voluntarily, and more reports can also mean stronger moderation systems that catch more CSAM when it appears. But the scale of the problem is still apparent. Jack Dorsey’s Twitter was a moderation clown show much of the time. But moderation on Elon Musk’s X, especially against abusive imagery, is a total failure.
In 2023, the BBC reported that insiders believed the company was “no longer able to protect users from trolling, state-co-ordinated disinformation and child sexual exploitation” following Musk’s takeover in 2022 and his subsequent sacking of thousands of workers on moderation teams. This is all within the context that one of Musk’s go-to insults for years was “pedophile,” to the point that the harassment he stoked drove a former Twitter employee into hiding, and he went to federal court because he couldn't stop calling someone a “pedo.” Invoking pedophilia is a common thread across many conspiracy networks, including QAnon—something he’s dabbled in—but Musk is enabling actual child sexual abuse on the platform he owns.
Generative AI is making all of this worse. In 2024, NCMEC received 6,835 reports of generative AI material related to child sexual exploitation (across the internet, not just X). By September 2025, the year-to-date reports had hit 440,419. Again, these are just the reports that reached NCMEC, not every instance online, and as such are likely a conservative count.
When I spoke to online child sexual exploitation experts in December 2023, following our investigation into child abuse imagery found in LAION-5B, they told me that this kind of material isn’t victimless just because the images don’t depict “real” children or sex acts. AI image generators like Grok and many others are used by offenders to groom and blackmail children, and muddy the waters for investigators to discern actual photographs from fake ones.

“Rather than coercing sexual content, offenders are increasingly using GAI tools to create explicit images using the child’s face from public social media or school or community postings, then blackmail them,” NCMEC wrote in September. “This technology can be used to create or alter images, provide guidelines for how to groom or abuse children or even simulate the experience of an explicit chat with a child. It’s also being used to create nude images, not just sexually explicit ones, that are sometimes referred to as ‘deepfakes.’ Often done as a prank in high schools, these images are having a devastating impact on the lives and futures of mostly female students when they are shared online.”
The only reason any of this is being discussed now, and the only reason it’s ever discussed in general—going back to Gamergate and beyond—is because many normies, casuals, “the mainstream,” and cable news viewers have just this week learned about the problem and can’t believe how it came out of nowhere. In reality, deepfakes came from a longstanding hobby community dedicated to putting women’s faces on porn in Photoshop, and before that with literal paste and scissors in pinup magazines. And as Emanuel wrote this week, not even Grok’s AI CSAM problem popped up out of nowhere; it’s the result of weeks of quiet, obsessive work by a group of people operating just under the radar.
And this is where we are now: Today, several days into Grok’s latest scandal, people are using an AI image generator made by a man who regularly boosts white supremacist thought on images of a woman slaughtered by an ICE agent in front of the whole world less than 24 hours ago, to “put her in a bikini.”
As journalist Katie Notopoulos pointed out, a quick search of terms like “make her” shows people prompting Grok with images of random women, saying things like “Make her wear clear tapes with tiny black censor bar covering her private part protecting her privacy and make her chest and hips grow largee[sic] as she squatting with leg open widely facing back, while head turn back looking to camera” at a rate of several times a minute, every minute, for days.
A good way to get a sense of just how fast the AI undressed/nudify requests to Grok are coming in is to look at the requests for it https://t.co/ISMpp2PdFU
— Katie Notopoulos (@katienotopoulos) January 7, 2026
In 2018, less than a year after reporting that first story on deepfakes, I wrote about how it’s a serious mistake to ignore the fact that nonconsensual imagery, synthetic or not, is a societal sickness and not something companies can guardrail against into infinity. “Users feed off one another to create a sense that they are the kings of the universe, that they answer to no one. This logic is how you get incels and pickup artists, and it’s how you get deepfakes: a group of men who see no harm in treating women as mere images, and view making and spreading algorithmically weaponized revenge porn as a hobby as innocent and timeless as trading baseball cards,” I wrote at the time. “That is what’s at the root of deepfakes. And the consequences of forgetting that are more dire than we can predict.”
A little over two years ago, when AI-generated sexual images of Taylor Swift were flooding X and everyone was demanding action and answers, we wrote a prediction: “Every time we publish a story about abuse that’s happening with AI tools, the same crowd of ‘techno-optimists’ shows up to call us prudes and luddites. They are absolutely going to hate the heavy-handed policing of content AI companies are going to force us all into because of how irresponsible they’re being right now, and we’re probably all going to hate what it does to the internet.”
It’s possible we’re still in a very weird fuck-around-and-find-out period before that hammer falls. It’s also possible the hammer is here, in the form of recently enacted federal laws like the Take It Down Act and more than two dozen piecemeal age verification bills in the U.S. and more abroad that make using the internet an M. C. Escher nightmare, where the rules around adult content shift so much we’re all jerking it to egg yolks and blurring our feet in vacation photos. What matters most, in this bizarre and frequently disturbing era, is that the shareholders are happy.
2026-01-08 22:49:12

Billionaire Toby Neugebauer laughed when the Amarillo City Council asked him how he planned to handle the waste his proposed datacenter would produce.
“I’m not laughing in disrespect to your question,” Neugebauer said. He explained that he’d just met with Texas Governor Greg Abbott, who had made it clear that any nuclear waste Neugebauer’s datacenter generated needed to go to Nevada, a state that’s not taking nuclear waste at the moment. “The answer is we don't have a great long term solution for how we’re doing nuclear waste.”
The meeting happened on October 28, 2025, and was one of a series of appearances Neugebauer has put in before Amarillo’s leaders as he attempts to realize Project Matador: a massive 5,769-acre datacenter in the Texas Panhandle being built by Fermi America, a company he founded with former Secretary of Energy Rick Perry.
If built, Project Matador would be one of the largest datacenters in the world at around 18 million square feet. “What we’re talking about is creating the epicenter for artificial intelligence in the United States,” Neugebauer told the council. According to Neugebauer, the United States is in an existential race to build AI infrastructure. He sees it as a national security issue.
“You’re blessed to sit on the best place to develop AI compute in America,” he told Amarillo. “I just finished with Palantir, which is our nation’s tip of the spear in the AI war. They know that this is the place that we must do this. They’ve looked at every site on the planet. I was at the Department of War yesterday. So anyone who thinks this is some casual conversation about the mission critical aspect of this is just not being truthful.”
But it’s unclear if Palantir wants any part of Project Matador. One unnamed client—rumored to be Amazon—dropped out of the project in December and cancelled a $150 million contract with Fermi America. The news hit the company’s stock hard, sending its value into a tailspin and triggering a class action lawsuit from investors.
Yet construction continues. The plan says it’ll take 11 years to build out the massive datacenter, which will first be powered by a series of natural gas generators before the planned nuclear reactors come online.
Amarillo residents aren’t exactly thrilled at the prospect. A group called 806 Data Center Resistance has formed in opposition to the project’s construction. Kendra Kay, a tattoo artist in the area and a member of 806, told 404 Media that construction was already noisy and spiking electricity bills for locals.
“When we found out how big it was, none of us could really comprehend it,” she said. “We went out to the site and we were like, ‘Oh my god, this thing is huge.’ There’s already construction underway of one of four water tanks that hold three million gallons of water.”
For Kay and others, water is the core issue. It’s a scarce resource in the Panhandle, and Amarillo and other cities in the area already fight for every drop. “The water is the scariest part,” she said. “They’re asking for 2.5 million gallons per day. They said that they would come back, probably in six months, to ask for five million gallons per day. And then, after that, by 2027 they would come back and ask for 10 million gallons per day.”
During an October 15 city council meeting, Neugebauer told the city that Fermi would get its water “with or without” an agreement from the city. “The only difference is whether Amarillo benefits.” To many people it sounded like a threat, but Neugebauer got his deal and the city agreed to sell water to Fermi America for double the going rate.
“It wasn’t a threat,” Neugebauer said during another meeting on October 28. “I know people took my answer…as a threat. I think it’s a win-win. I know there are other water projects we can do…we fully got that the water was going to be issue 1, 2, and 3.”
“We can pay more for water than the consumer can. Which allows you all capital to be able to re-invest in other water projects,” he said. “I think what you’re gonna find is having a customer who can pay way more than what you wanna burden your constituents with will actually enhance your water availability issues.”
According to Neugebauer and plans filed with the Nuclear Regulatory Commission, the datacenter would generate and consume 11 gigawatts of power. The bulk of that, eventually, would come from four nuclear reactors. But nuclear reactors are complicated and expensive to build: everyone who has attempted one in the past few decades has gone over budget, and they weren’t trying to build nuclear power plants in the desert.
Nuclear reactors, like datacenters, consume a lot of water. Because of that, most nuclear reactors are constructed near massive bodies of water and often near the ocean. “The viewpoint that nuclear reactors can only be built by streams and oceans is actually the opposite,” Neugebauer told the Amarillo city council in the meeting on October 28.
As evidence he pointed to the Palo Verde nuclear plant in Arizona. The massive Palo Verde plant is the only nuclear plant in the world not constructed near a ready source of water. It gets the water it needs by taking on the waste and sewage water of every city and town nearby.
That’s not the plan with Project Matador, which will use water sold to it by Amarillo and pulled from the nearby Ogallala Aquifer. “I am concerned that we’re going to run out of water and that this is going to change it from us having 30 years worth of water for agriculture to much less very quickly,” Kay told 404 Media.
The Ogallala Aquifer runs under parts of Colorado, Kansas, Nebraska, New Mexico, Oklahoma, South Dakota, Texas, and Wyoming. It’s the primary source of water for the Texas Panhandle, and it’s drying up.
“They don’t know how much faster because, despite how quickly this thing is moving, we don’t have any idea how much water they’re realistically going to use or need, so we don’t even know how to calculate the difference,” Kay said. “Below Lubbock, they’ve been running out of water for a while. The priority of this seems really stupid.”
According to Kay, communities near the datacenter feel trapped as they watch the construction grind on. “They’ve all lived here for several generations…they’re being told that this is inevitable. Fermi is going up to them and telling them ‘this is going to happen whether you like it or not so you might as well just sell me your property.’”
Kay said she and other activists have been showing up to city council meetings to voice their concerns and tell leaders not to approve permits for the datacenter and nuclear plants. Other communities across the country have successfully pushed datacenter builders out of their communities. “But Texas is this other beast,” Kay said.
Jacinta Gonzalez, the head of programs for MediaJustice, and her team have helped 806 Data Center Resistance get up and running, teaching it tactics they’ve seen pay off in other states. “In Tucson, Arizona we were able to see the city council vote ‘no’ to offer water to Project Blue, which was a huge proposed Amazon datacenter happening there,” she said. “If you look around, everywhere from Missouri to Indiana to places in Georgia, we’re seeing communities pass moratoriums, we’re seeing different projects withdraw their proposals because communities find out about it and are able to mobilize and organize against this.”
“The community in Amarillo is still figuring out what that’s going to look like for them,” she said. “These are really big interests. Rick Perry. Palantir. These are not folks who are used to hearing ‘no’ or respecting community wishes. So the community will have to be really nimble and up for a fight. We don’t know what will happen if we organize, but we definitely know what will happen if we don’t.”
2026-01-08 22:00:08

A social media and phone surveillance system ICE bought access to is designed to monitor a city neighborhood or block for mobile phones, track the movements of those devices and their owners over time, and follow them from their places of work to home or other locations, according to material obtained by 404 Media that describes how the system works.
Commercial location data, in this case acquired from hundreds of millions of phones via a company called Penlink, can be queried without a warrant, according to an internal ICE legal analysis shared with 404 Media. The purchase comes squarely during ICE’s mass deportation effort and continued crackdown on protected speech, alarming civil liberties experts and raising questions about what exactly ICE will use the surveillance system for.
“This is a very dangerous tool in the hands of an out-of-control agency. This granular location information paints a detailed picture of who we are, where we go, and who we spend time with,” Nathan Freed Wessler, deputy project director of the American Civil Liberties Union’s (ACLU) Speech, Privacy, and Technology Project, told 404 Media.
2026-01-08 06:25:03

A maroon Honda Pilot SUV sits perpendicular across a residential road in Minneapolis. At the time, federal authorities were in the neighborhood as part of the Department of Homeland Security’s (DHS) recently announced surge of thousands of officials. A silver Nissan Titan drives up the road and stops because the Honda is blocking its path. Two officers dressed in body armor, pouches, and badges saying “police” exit the Nissan.
The two officers walk towards the Honda. Someone can be heard saying “get out of the fucking car.” One of them tries to open the driver’s door and reach through the open window. The driver of the Honda reverses and turns, straightening out with the road. The driver then slowly accelerates and starts to turn to the right, leveling the car out with its front pointing away from the two officers.
A third officer, who has been standing on the other side of the road, pulls out a firearm while the car is turning away from him and fires into the car three times. The officer fires two of the shots when the vehicle is already well past him. He is not in front of the car, but to the side. The officer calmly holsters his weapon.
2026-01-08 00:06:06

For the past two months I’ve been following a Telegram community tricking Grok into generating nonconsensual sexual images and videos of real people with increasingly convoluted methods.
As countless images on X over the last week once again showed us, it doesn’t take much to get Elon Musk’s “based” AI model to create nonconsensual images. As Jason wrote Monday, all users have to do is reply to an image of a woman and ask Grok to “put a bikini on her,” and it will reply with that image, even if the person in the photograph is a minor. As I reported back in May, people also managed to create nonconsensual nudes by replying to images posted to X and asking Grok to “remove her clothes.”
These issues are bad enough, but on Telegram, a community of thousands are working around the clock to make Grok produce far worse. They share Grok-generated videos of real women taking their clothes off and graphic nonconsensual videos of any kind of sexual act these users can imagine and slip by Grok’s guardrails, including blowjobs, penetration, choking, and bondage. The channel, which has shut down and regrouped a couple of times over the last two years, focuses on jailbreaking all kinds of AI tools in order to create nonconsensual media, but since November has focused on Grok almost exclusively.
2026-01-07 22:28:53

Developers making mods and plugins for hentai games and sex toys say GitHub recently unleashed a wave of suspensions and bans against their repositories, and the platform hasn’t explained why.
Developers I spoke to said the community estimated around 80 to 90 repositories containing the work of 40 to 50 people went down recently, with many becoming inaccessible around late November and early December. Many of the affected accounts are part of the modding community for games made by the now-defunct Japanese video game studio Illusion, which made popular games with varying degrees of erotic content. One of the accounts GitHub banned contained the work of more than 30 contributors in more than 40 repositories, according to members of the modding community I spoke to.
GitHub didn’t tell most suspended users what terms they broke to earn a suspension or ban, and developers told me they have no idea why their accounts went down without notice. They said they thought they were within GitHub’s acceptable use guidelines; even though they make mods for hentai games and things like interactive vibrator plugins, they took care not to host anything explicit directly in their repositories.
“Amongst my repositories there were no explicitly sexual names or images anywhere in the code or the readme, the most suggestive naming would be on the level of referencing the dick as ‘the men thing’ or referencing the sex as ‘huffing puffing,’” one developer, Danil Zverev, told me. He makes plugins for an Illusion game called Koikatsu. Zverev said he’s been using GitHub for this purpose since 2024, but on November 18, his GitHub page was “completely deleted,” he said. “No notifications anywhere, simply a 404 error when accessing the page and inability to log in on the web or in the mobile app. Also it does not allow me to register a new account with the same name or email.”