Sex is getting scrubbed from the internet, but a billionaire can sell you AI nudes

On the wonderful new internet, teenage girls can't learn about periods on Reddit and independent artists can't sell smutty games on itch.io, but a military contractor will make you nonconsensual deepfakes of Taylor Swift for $30 a month.
On Tuesday, Elon Musk's xAI launched a new image and video generator called Grok Imagine with a "spicy" mode whose output ranges from suggestive gestures to nudity. Since Grok Imagine also has no meaningful guardrails against generating images of real people, this means it can produce softcore porn of basically anyone famous enough for it to recreate (though in practice, it seems to reserve the NSFW treatment mostly for women). Musk boasted that more than 34 million images were created within a day of launch. But the real coup is demonstrating that xAI can shrug off the pressures that keep adult content off other services while helping users create something widely condemned, thanks to legal loopholes and political leverage that no other company has.
The xAI video feature, which debuted around the same time as a romantic chatbot companion named Valentine, arrives from a strange angle, because it's being released during a period when sex (down to the word itself) is being pushed to the margins of the internet. Late last month, the UK began enforcing age verification rules that require X and other services to block sexual or otherwise "harmful" content for users under 18. Around the same time, an activist group called Collective Shout successfully pressured Steam and itch.io to crack down on adult games and other media, leading itch.io in particular to broadly deindex adult content.
Deepfake porn of real people is a form of nonconsensual intimate imagery, which is now illegal to knowingly publish in the US under the Take It Down Act, signed by President Donald Trump earlier this year. In a statement published on Thursday, the anti-sexual-violence organization RAINN said the Grok feature "is part of a growing problem of image-based sexual abuse" and quipped that Grok "didn't get the memo" about the new law.
But according to Mary Anne Franks, a professor at George Washington University Law School and president of the nonprofit Cyber Civil Rights Initiative (CCRI), there's "little risk of facing any kind of liability" under the Take It Down Act. "The criminal provision requires 'publication,' which, while unfortunately not defined in the statute, suggests making the content available to more than one person," Franks says. "If Grok only makes the videos viewable to the person using the tool, that doesn't seem like it would be enough."
Regulators have failed to enforce laws against powerful companies even when they do apply
It's also possible that Grok isn't required to remove the images under the Take It Down Act, even though that rule is broad enough to threaten most social media services. "I don't think Grok (or at least this Grok tool) would qualify as a 'covered platform,' because the statute's definition of covered platform requires that it 'primarily provides a forum for user-generated content,'" Franks says. "AI-generated content involves user inputs, but the actual content, as the term indicates, is generated by AI." The takedown process is also designed to work through people reporting the content, and Grok doesn't publicly post the images where other users can see them; it just makes them very easy to create at scale (and, almost inevitably, they end up posted on social media).
Franks and the CCRI flagged the limited definition of a "covered platform" as a problem for other reasons months ago. It's one of the many ways the Take It Down Act fails to serve people harmed by nonconsensual intimate imagery while threatening web platforms that operate in good faith. Franks told Spitfire News in June that the law may not even bar Grok from generating AI nudes, in part because there are open questions about whether Grok counts as a "person" covered by it.
These kinds of failures are par for the course in internet regulation meant to stamp out harmful or objectionable content; the UK's rules, for example, have made life difficult for operators of independent forums while remaining fairly easy for kids to circumvent.
Compounding the problem, particularly in the US, regulatory agencies have failed to impose meaningful consequences on all kinds of powerful companies, including many of Musk's. Trump has given Musk-owned companies a pass on bad behavior, and even after leaving his powerful position at the Department of Government Efficiency, Musk likely retains enormous leverage over regulators like the FTC. (xAI just landed a $200 million contract with the Department of Defense.) So even if xAI is violating the Take It Down Act, it may never face an investigation.
Beyond the government, there are layers of gatekeepers dictating what's acceptable on platforms, and they often take a dim view of sex. Apple, for example, has pushed Discord, Reddit, Tumblr, and other platforms to police NSFW material, with varying levels of success. Steam and itch.io reevaluated adult content under threat of losing their relationships with payment processors and banks, which have already put the screws to platforms like OnlyFans and Pornhub.
In some cases, like Pornhub's, that pressure was a response to platforms allowing unambiguously harmful and illegal uploads. But Apple and the payment processors don't appear to maintain firm policies that are enforced consistently. Their enforcement seems to depend largely on public pressure weighed against how much power the target holds, and despite his falling-out with Trump, hardly anyone in business has more political power than Musk. Apple and Musk have clashed repeatedly over Apple's policies, and Apple has mostly held firm on things like its fee structure, but it appears to have backed down on smaller issues, including resuming ads on X after pulling them from the Nazi-filled platform.
Apple has banned smaller apps that make nudes of real people. Will it apply that kind of pressure to Grok, which launched exclusively on iOS? Apple didn't respond to a request for comment, but don't hold your breath.
Grok's new feature is harmful to the people who can now easily be targeted with nonconsensual nudes made by a major AI service, but it also shows just how hollow the promise of a "safer" internet is. Small platforms face pressure to remove recorded or fictional media made by humans, while a company run by a billionaire can profit from something that is, at least in some cases, illegal. On the internet of 2025, nothing is really about sex, including sex itself, which is, as usual, about power.