Sam Altman’s goal for ChatGPT to remember ‘your whole life’ is both exciting and disturbing

Sam Altman, CEO of OpenAI, laid out a grand vision for the future of ChatGPT at an AI event hosted by VC firm Sequoia earlier this month.
When one attendee asked how ChatGPT could become more personalized, Altman replied that he eventually wants the model to document and remember everything in a person's life.
The ideal, he said, is "a very tiny reasoning model with a trillion tokens of context that you put your whole life into."
"This model can reason across your whole context and do it efficiently. Every conversation you've ever had in your life, every book you've ever read, every email you've ever read, everything you've ever looked at is in there, plus connected to all your data from other sources. And your life just keeps appending to the context."
“Your company does the same thing for all your company’s data,” he added.
Altman may have some data-driven reason to believe this is ChatGPT's natural future. In that same discussion, when asked about interesting ways young people use ChatGPT, he said: "People in college use it as an operating system." They upload files, connect data sources, and then use "complex prompts" against that data.
Additionally, with ChatGPT's memory options, which can use previous chats and saved facts as context, one trend he's noticed is that young people "don't really make life decisions without asking ChatGPT."
"The gross oversimplification is: older people use ChatGPT as a Google replacement," he said. "People in their 20s and 30s use it like a life advisor."
It's not much of a leap to see how ChatGPT could become an all-knowing AI system. Paired with the agents Silicon Valley is currently trying to build, that's an exciting future to imagine.
Picture an AI that automatically schedules your car's oil changes and reminds you; plans the travel for an out-of-town wedding and orders the gift from the registry; or preorders the next volume of the book series you've been reading for years.
But the frightening part? How much should we trust a Big Tech for-profit company to know everything about our lives? These are companies that don't always behave in model ways.
Google, which began life with the motto "don't be evil," lost a lawsuit in the U.S. that accused it of engaging in anticompetitive, monopolistic behavior.
Chatbots can also be trained to respond in politically motivated ways. Not only have Chinese bots been found to comply with China's censorship requirements, but Elon Musk's Grok chatbot was recently discussing a "white genocide in South Africa" when people asked it completely unrelated questions. Many noted that the behavior suggested deliberate manipulation of its response engine at the direction of its South African-born founder.
Last month, ChatGPT became so agreeable it verged on sycophantic. Users began sharing screenshots of the bot applauding problematic, even dangerous, decisions and ideas. Altman quickly responded by promising that the team had fixed the tweak that caused the problem.
Even the best, most reliable models still outright make things up from time to time.
So an AI assistant that knows everything about our lives could help us in ways we can only begin to imagine. But given Big Tech's long history of iffy behavior, it's also a situation ripe for misuse.
2025-05-15 23:05:00