
Think Twice Before Creating That ChatGPT Action Figure

In early April, a wave of AI-generated action figures began appearing on social media sites including LinkedIn and X. Each figure depicted the person who created it with uncanny accuracy, complete with personalized accessories such as reusable coffee cups, yoga mats, and headphones.

All of this is possible thanks to OpenAI's new GPT-4o image generator, which supercharges ChatGPT's abilities in photo editing, text rendering, and more. OpenAI's image generator can also create pictures in the style of the Japanese animation studio Studio Ghibli, a trend that quickly went viral.

The images are fun and easy to make; all you need is a free ChatGPT account and a photo. However, to create an action figure or Ghibli-style image, you also need to hand over a lot of data to OpenAI, which could be used to train its models.

Hidden data

The data you hand over when using an AI image editor is often far from obvious. Every time you upload an image to ChatGPT, you're potentially handing over "a full set of metadata," says Tom Vazdar, head of cybersecurity at the Open Institute of Technology. "That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot."
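If you want to see exactly what a photo would reveal before uploading it, checking is straightforward. The snippet below is a minimal illustrative sketch, not something from the article: it assumes Python's Pillow library and a hypothetical filename, prints the EXIF tags embedded in an image (including timestamps and the GPS pointer), and saves a copy with that metadata stripped out.

```python
# Illustrative sketch: inspect and strip EXIF metadata with Pillow
# before uploading a photo. Filenames below are hypothetical.
from PIL import Image
from PIL.ExifTags import TAGS


def show_exif(path: str) -> None:
    """Print the EXIF tags embedded in an image, e.g. timestamps and GPSInfo."""
    img = Image.open(path)
    exif = img.getexif()
    for tag_id, value in exif.items():
        name = TAGS.get(tag_id, tag_id)
        print(f"{name}: {value}")


def strip_exif(src: str, dst: str) -> None:
    """Save a copy of the image with no EXIF block attached."""
    img = Image.open(src)
    clean = Image.new(img.mode, img.size)   # new image carries no metadata
    clean.putdata(list(img.getdata()))      # copy pixel data only
    clean.save(dst)


if __name__ == "__main__":
    show_exif("portrait.jpg")
    strip_exif("portrait.jpg", "portrait_clean.jpg")
```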

OpenAI also collects data about the device you're using to access the platform. That means your device type, operating system, browser version, and unique identifiers, says Vazdar. "And because platforms like ChatGPT operate conversationally, there's also behavioral data, such as what you typed, what kind of images you asked for, how you interacted with the interface, and how often you did so."

And it's not just your face. If you upload a high-resolution photo, you're giving OpenAI whatever else is in the image too: the background, other people, objects in your room, and anything readable such as documents or badges, says Camden Woolven, head of the AI product group at GRC International Group.

Vazdar says this kind of voluntarily provided data is "a gold mine for training generative models," especially multimodal models that rely on visual inputs.

OpenAI denies that it orchestrates viral image trends as a ploy to collect user data, but the company certainly gains an advantage from them. OpenAI doesn't need to scrape the web for your face if you're happily uploading it yourself, Vazdar points out. "This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from a range of age groups, ethnicities, and geographies."

OpenAI says it does not actively seek out personal information to train its models, and it does not use public data on the internet to build profiles about people in order to advertise to them or sell their data. However, under OpenAI's current privacy policy, images submitted through ChatGPT can be retained and used to improve its models.

Any data, prompts, or requests you share help teach the algorithm, and personalized information helps fine-tune it further, says Jake Moore, global cybersecurity adviser at security firm ESET, who created his own action figure to demonstrate the privacy risks of the trend on LinkedIn.

Legal protections

In some markets, your photos are protected by regulation. In the UK and the EU, data protection law, including the GDPR, offers strong protections, including the right to access or delete your data. At the same time, the use of biometric data requires explicit consent.

However, photographs only become biometric data when they are processed through a specific technical means that allows the unique identification of a particular individual, says Melissa Hall, senior associate at law firm MFMac. Processing an image to create a cartoon version of the subject in the original photograph is "unlikely to meet this definition," she says.
