Neurotech companies are selling your brain data, senators warn

Three Democratic senators are warning that brain-computer interface (BCI) technologies can collect our neural data, and potentially sell it. In a letter to the Federal Trade Commission (FTC), Sens. Chuck Schumer (D-NY), Maria Cantwell (D-WA), and Ed Markey (D-MA) called for an investigation into how neurotechnology companies handle user data, and for tighter regulations on their data-sharing policies.

“Unlike other personal data, neural data, captured directly from the human brain, can reveal mental health conditions, emotional states, and cognitive patterns, even when anonymized,” the letter reads. “This information is not only deeply personal; it is also strategically sensitive.”

While the term neurotechnology may evoke images of brain implants like Elon Musk’s Neuralink, there are far less invasive, and far less regulated, neurotech products on the market, including headsets that help people meditate, purport to induce lucid dreaming, and promise to help users with online dating by letting them swipe through apps based on their “instinctive reaction.” These consumer products glean insights from users’ neural data, and since they aren’t classified as medical devices, the companies behind them aren’t barred from sharing that data with third parties.

“Neural data is the most private, personal, and powerful information we have, and no company should be allowed to harvest it without transparency, ironclad consent, and strict guardrails. Yet companies are collecting it with vague policies and zero transparency,” Schumer told The Verge via email.

The letter cites a 2024 report by the Neurorights Foundation, which found that most neurotech companies not only have few safeguards on user data but also have the ability to share sensitive information with third parties. The report surveyed the data policies of 30 consumer-facing BCI companies and found that all but one “appear to have access” to users’ neural data “and provide no meaningful limitations to this access.” The Neurorights Foundation included only companies whose products are available to consumers without the assistance of a medical professional; implants like Neuralink’s were not among them.

The companies covered in the Neurorights Foundation report also make it difficult for users to opt out of having their neural data shared with third parties. Only slightly more than half of the companies in the report explicitly let consumers revoke consent for data processing, and just 14 of the 30 give users the ability to delete their data. In some cases, user rights aren’t universal: some companies, for example, allow users in the European Union to delete their data without granting the same right to users elsewhere in the world.

To protect against potential abuses, the senators are calling on the FTC to:

  • investigate whether neurotech companies are engaging in unfair or deceptive practices that violate the FTC Act
  • compel companies to report on their data handling, commercial practices, and third-party access
  • clarify how existing privacy standards apply to neural data
  • enforce the Children’s Online Privacy Protection Act (COPPA) as it relates to BCIs
  • begin a rulemaking process to establish safeguards for neural data, including limits on secondary uses like training artificial intelligence and behavioral profiling
  • and ensure that both invasive and noninvasive neurotechnologies are subject to baseline disclosure and transparency standards, even when the data is anonymized

Though the senators’ letter calls out Neuralink by name, Musk’s brain implant technology is already subject to more regulation than other BCI technologies. Since Neuralink’s brain implant is a “medical” technology, it’s required to comply with the Health Insurance Portability and Accountability Act (HIPAA), which protects individuals’ medical data.

Stephen Damianos, executive director of the Neurorights Foundation, said that HIPAA may not have fully caught up with current neurotechnology, particularly around “informed consent” requirements.

“There are incredibly robust and well-established informed consent models from the medical world, but I think there’s work to be done around understanding informed consent when it comes to neurotechnology,” Damianos told The Verge. “The analogy I like to give is, if you walked through my apartment, I would know what you would and wouldn’t find there, because I have a sense of exactly what’s in it. But brain scans are overbroad, meaning they collect more data than what’s required to operate.”

Data collection gets even murkier for “wellness” neurotech products, which don’t have to comply with HIPAA even when they advertise themselves as helping with mental health conditions like depression and anxiety.

Damianos said there is a “very fuzzy gray area” between medical devices and wellness devices.

“There’s an emerging category of devices that are marketed for health and wellness, which is distinct from medical applications, but there can be a lot of overlap between those applications,” Damianos said. The dividing line is often whether a medical intermediary is needed to obtain a product, or whether someone can “just go online, put in your credit card, and have it show up in a box a few days later.”

There are very few regulations around neurotech that is advertised as “wellness.” In April 2024, Colorado passed the first-ever legislation protecting consumers’ neural data. The state updated its existing consumer protection law, which guards users’ “sensitive data,” so that the definition now covers “biological data,” including genetic, biochemical, physiological, and neural information. In September, California amended its consumer privacy act to protect neural data as well.

“We believe in the transformative potential of these technologies, and sometimes I think there’s a lot of doom and gloom around them,” Damianos told The Verge. “We want to get this moment right. We believe it’s a really profound moment that has the potential to reshape what it means to be human. Huge risks stem from that, but we also believe in harnessing the potential to improve people’s lives.”
