Trump signs bipartisan Take It Down Act to fight ‘revenge porn’ and deepfakes. Here’s what’s in it

On Monday, President Donald Trump signed the Take It Down Act, bipartisan legislation that enacts stricter penalties for the distribution of nonconsensual intimate images, sometimes called "revenge porn," as well as deepfakes created with artificial intelligence.

The measure, which takes effect immediately, was introduced by Senator Ted Cruz, a Republican from Texas, and Senator Amy Klobuchar, a Democrat from Minnesota, and later gained the support of first lady Melania Trump. Critics of the measure, which covers both real and AI-generated imagery, say its language is too broad and could lead to censorship and First Amendment issues.

What is the Take It Down Act?

The law makes it illegal to knowingly publish, or threaten to publish, intimate images without a person's consent, including AI-created "deepfakes." It also requires websites and social media platforms to remove such material within 48 hours of notice from a victim, and to take steps to delete duplicate content. Many states have already banned sexually explicit deepfakes or revenge porn, but the Take It Down Act is a rare example of federal regulation imposed on internet companies.

Who supports it?

The Take It Down Act drew strong bipartisan support and was championed by Melania Trump, who lobbied on Capitol Hill in March, saying it was "heartbreaking" to see what teenagers, especially girls, go through after falling victim to people who spread such content.

Cruz said the measure was inspired by Elliston Berry and her mother, who visited his office after Snapchat refused for nearly a year to remove an AI-generated "deepfake" of the then 14-year-old.

Meta, which owns and operates Facebook and Instagram, supports the legislation.

"Having an intimate image, real or AI-generated, shared without consent can be devastating, and Meta developed and backs many efforts to help prevent it," said Meta spokesman Andy Stone.

"This is an important step forward that will help people pursue justice when they are victims of nonconsensual intimate imagery, including deepfake images created using AI," the Information Technology and Innovation Foundation, a tech industry-supported think tank, said in a statement following the bill's passage last month.

"We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse," Klobuchar said in a statement. "These images can ruin lives and reputations, but now that our bipartisan legislation has become law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."

Klobuchar called the law "a major victory for victims of online abuse" and said it gives people "legal protections and tools for when their intimate images, including deepfakes, are shared without their consent," while "enabling law enforcement to hold perpetrators accountable."

"This is also a landmark move towards establishing common-sense rules of the road around social media and AI," she added.

"Predators who weaponize new technology to post this exploitative filth will now rightfully face criminal consequences, and Big Tech will no longer be allowed to turn a blind eye to the spread of this vile material," Cruz said.

What are the censorship concerns?

Free speech advocates and digital rights groups say the bill is too broad and could lead to the censorship of legitimate images, including legal pornography and LGBTQ content, as well as of government critics.

"While the bill is meant to address a serious problem, good intentions alone are not enough to make good policy," said the nonprofit Electronic Frontier Foundation, a digital rights advocacy group. "Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse."

The EFF said the takedown provision in the bill "applies to a much broader category of content, potentially any images involving intimate or sexual content," than the narrower definitions of nonconsensual intimate images found elsewhere in the text.

"The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Services will rely on automated filters, which are infamously blunt tools," the EFF said. "They frequently flag legal content, from fair-use commentary to news reporting. The law's tight time frame requires that apps and websites remove speech within 48 hours, which is rarely enough time to verify whether the speech is actually illegal."

As a result, the group said, online companies, especially smaller ones that lack the resources to wade through large volumes of content, "will likely choose to avoid the onerous legal risk by simply depublishing the speech rather than even attempting to verify it."

The EFF said the measure also pressures platforms to "actively monitor speech, including speech that is presently encrypted," to address liability threats.

The Cyber Civil Rights Initiative, a nonprofit that helps victims of online crimes and abuse, said it has "serious reservations" about the bill. It called the takedown provision unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse.

For instance, the group said, platforms could be obligated to remove a journalist's photographs of a topless protest on a public street, photos of subway flashers distributed by law enforcement to locate a perpetrator, commercially produced sexually explicit content, or sexually explicit material that is consensual but falsely reported as being nonconsensual.

This story was originally featured on Fortune.com

2025-05-20 11:22:00
