
The Rise and Fall of Inflection’s AI Chatbot, Pi

In the past few years, AI has set Silicon Valley on fire. The new book AI Valley: Microsoft, Google, and the Trillion-Dollar Race to Cash In on Artificial Intelligence chronicles those heady, high-burn times: a tale of startups, venture capital firms, and legacy tech companies, those that have shone and those that have already flamed out.

In the excerpt below, author Gary Rivlin tells the inside story of the startup Inflection, founded in 2022 by LinkedIn co-founder Reid Hoffman and DeepMind co-founder Mustafa Suleyman. Inflection aimed to distinguish itself by building a chatbot with high emotional intelligence, and at one point the company was valued at $4 billion. But its chatbot, Pi, failed to gain market share, and in March 2024 Microsoft acquired most of the company’s workforce, leaving what remained of Inflection to license Pi as a foundation for customer-service bots.

Pi was not human and therefore could not have a personality. It fell to the “personality team,” however, to endow Pi with a set of traits and quirks that might make it seem as if it did. The team included several engineers and two linguists, as well as Rachel Taylor, who had been a creative director at a London-based advertising agency before coming to work at Inflection.

“Mustafa gave me a little overview of what they were working on, and I couldn’t stop thinking about it,” Taylor said. “I thought it might be the most impactful thing I’d ever worked on.”

Humans develop a personality through a complex interplay of genetics and environmental influences, including upbringing, culture, and life experiences. Pi started with a list of traits the team drew up. Some were positives: be kind, be supportive. Others were negatives to avoid, such as irritability, arrogance, and combativeness.

“You show the model lots of comparisons that demonstrate the difference between good and bad instances of a given behavior,” Mustafa Suleyman said, a technique known in industry parlance as “reinforcement learning from human feedback,” or RLHF. Sometimes, teams working on RLHF focus only on the behaviors they want a model to avoid (the sexual, the violent, the gory). But Inflection had people assign a numerical score to the machine’s responses. “That way the model essentially learns, ‘Oh, that was a really good answer, I’ll do more of that,’ or ‘That was terrible, I’ll do less of that.’” The scores were fed into an algorithm that adjusted the model’s weights accordingly, and the process was repeated.
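The scoring loop Suleyman describes can be sketched in miniature. The toy below is hypothetical and vastly simplified (a real RLHF pipeline trains a reward model and then fine-tunes a large language model against it); here the “model” is just a weighted choice among three response styles, and the human scores nudge the weights up or down:

```python
# Toy sketch of score-driven feedback (not Inflection's actual pipeline):
# a human "teacher" rates each response style 0-10, and ratings above or
# below the midpoint raise or lower that style's sampling weight.
import random

styles = ["curt", "supportive", "arrogant"]
weights = {s: 1.0 for s in styles}  # start with no preference

def sample_style(rng):
    """Pick a style with probability proportional to its weight."""
    total = sum(weights.values())
    r = rng.uniform(0, total)
    for s in styles:
        r -= weights[s]
        if r <= 0:
            return s
    return styles[-1]

def teacher_score(style):
    """Stand-in for a human rater's 0-10 score of a response."""
    return {"curt": 2, "supportive": 9, "arrogant": 1}[style]

def update(style, score, lr=0.1):
    """Scores above 5 grow the style's weight; scores below 5 shrink it."""
    weights[style] *= 1 + lr * (score - 5) / 5
    weights[style] = max(weights[style], 0.01)  # keep weights positive

rng = random.Random(0)
for _ in range(500):
    s = sample_style(rng)
    update(s, teacher_score(s))

best = max(weights, key=weights.get)
print(best)  # prints "supportive"
```

After a few hundred scored rounds, the style the teacher rates highly crowds out the others, which is the gist of “I’ll do more of that / I’ll do less of that,” minus all the machinery of an actual language model.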

Developing Pi’s personality traits

Unlike many other AI companies, which outsourced reinforcement learning to third parties, Inflection hired and trained its own people. Applicants were put through a battery of tests, starting with a reading-comprehension exercise that Suleyman described as “very precise and very difficult.” Then came another set of exams and several rounds of training before they were put to work. Suleyman said the average “teacher” earned between $16 and $25 per hour, but as much as $50 an hour if someone was an expert in a relevant field. “We try to make sure they come from a wide range of backgrounds and represent a wide range of ages,” Suleyman said.

Many hundreds of teachers were training Pi in the spring of 2023. “In some cases, we paid several hundred dollars an hour to highly specialized experts such as behavioral therapists, psychologists, playwrights, and novelists,” Suleyman said. At one point they even hired several comedians to help give Pi a sense of humor. “Our goal is a less formal, more relaxed, conversational experience,” Suleyman said.

The company met a self-imposed deadline of March 12, 2023, to have a test version of Pi, which it shared with thousands of beta testers. With the beta, the company emerged from stealth mode. Its press release described Pi as a “supportive and empathetic” AI “eager to talk about anything at any time.” The company billed Pi as a “new class of AI,” different from the other chatbots on the market. By May, the app was free and available to anyone who wanted to sign up and use the service.

The New York Times rarely runs even a short item marking the release of a new product, especially from a small, unknown company. But few companies could boast founders with the connections and star power of Inflection’s: Reid Hoffman, co-founder of LinkedIn, and Suleyman, AI royalty as a co-founder of DeepMind. That clout translated into prime real estate on the front page of the Times business section, including a large, eye-catching illustration and a headline spanning multiple columns: “My New BFF: Pi, an Emotional Support Chatbot.” The reporter, Erin Griffith, was skeptical of the breathing exercises Pi suggested to help her reduce the stress in her life. But the bot helped her work through a plan, and she came away feeling heard. Pi assured Griffith that her feelings were “understandable,” “reasonable,” and “completely normal.”

Suleyman posted a statement on Inflection’s website the day Pi was released. Social media has poisoned the world, it began. Outrage drives engagement, and the profit motive has proven too strong. “Imagine an AI that helps you empathize with, or at least tolerate, ‘the other side,’ rather than stoking your anger and fear of them,” Suleyman wrote. “Imagine an AI that optimizes for your long-term goals rather than exploiting your need for distraction when you’re tired at the end of a long day.” He described the AI they were building as “a personal AI companion with the single mission of making you happier, healthier, and more productive.”

In June 2023, Inflection closed its next funding round. Suleyman and Hoffman had gone out believing they would raise between $600 million and $675 million, but after Pi’s launch, Inflection ranked as one of the hot new startups, and a long list of investors wanted in. “We were overwhelmed with offers,” Suleyman said. In the end, they raised $1.3 billion in a round that valued the company at $4 billion.

HarperCollins Publishers

Inflection’s technical and commercial challenges

Pi’s willingness to engage with almost any topic was a point of pride inside Inflection. Where other chatbots shut users down if they strayed anywhere near a sensitive topic, Pi invited conversation. “It will try to acknowledge that a topic is sensitive or controversial, and then take care not to render strong judgments and to let the user lead,” Suleyman said. Pi corrected factual claims that were wrong, so as not to spread misinformation, but rather than rejecting an outright opinion, it offered a counterpoint.

Suleyman was especially proud of Pi in the weeks that followed Hamas’s attack on Israel and the subsequent bombing campaign in Gaza. “It was good in real time as things were unfolding, and it’s good now,” he said two months into the hostilities. “It’s very balanced and even-handed, very respectful.” If it had one bias, Suleyman said, it was a deliberate one in favor of “peace and respect for human life.” A bot that believed at its core in the sanctity of human life did not seem like a bad thing.

Taylor considered the first version of Pi merely “acceptable.” “It was very polite and very formal,” she said. “But it wasn’t the conversationalist we wanted.” Engaging. Positive. Respectful. All of those were great, but they didn’t quite add up to the “fun” experience the company was selling. Finding the right balance proved difficult. The personality team would fix one failing or another, but it was as if they were playing whack-a-mole. They fiddled with the weights and coaxed the model into using more slang and colloquialisms, “but then it was too friendly and informal in a way some people might find rude,” Taylor said.

The wide range of preferences among users was a constant topic of conversation inside the company. Pi’s default mode was “friendly,” but a short list of alternatives was added for people to choose from: casual, witty, sympathetic, dedicated. Pi would switch modes if a user said they were looking for a sympathetic ear rather than a friend trying to fix a problem. But the future, as Suleyman imagined it, was a model that read a person’s emotional tone and adjusted on the fly, on its own, much as a person might greet a friend cheerfully and then shift immediately upon learning the friend was calling with bad news. Bots, though, were not yet at the point where they could read a person’s preferences without explicit instruction. Suleyman said it took at least ten conversational turns, and as many as thirty, to discern a user’s mood.
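The explicit mode switch described above can be sketched as a simple rule. The names and cue phrases here are hypothetical, and Pi’s real inference ran over many conversational turns rather than keyword matching:

```python
# Hypothetical sketch of switching a chatbot's mode on an explicit user cue.
# Real systems infer mood from the dialogue itself; this toy version only
# reacts to a few hard-coded phrases.
EMPATHY_CUES = (
    "just want to vent",
    "need someone to listen",
    "not looking for advice",
)

def choose_mode(user_message: str, current_mode: str = "friendly") -> str:
    """Return "sympathetic" if the user asks for a listening ear;
    otherwise keep the current mode."""
    text = user_message.lower()
    if any(cue in text for cue in EMPATHY_CUES):
        return "sympathetic"
    return current_mode

print(choose_mode("I just want to vent about my day"))  # prints "sympathetic"
print(choose_mode("Can you help me fix my resume?"))    # prints "friendly"
```

Suleyman’s ambition was to remove the explicit cue entirely, inferring the shift from tone alone over the course of the conversation.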

“In the future, an AI will be many, many things at once,” Suleyman said. “People ask me, ‘Is it a therapist?’ Well, it has flavors of that.” Among their loftier goals, Pi would have multiple personas, like a cyborg Sybil with dissociative identity disorder. As they saw it, Pi would eventually be able to take on a nearly unlimited number of modes, each capable of matching the moment.

By December 2023, Pi was available for Android and its roughly 3 billion users worldwide. But Suleyman and others at Inflection were in the dark about user numbers, deliberately so. They were bound to be a disappointment. That fall, pollsters asked people who used chatbots which one they turned to most often. Fifty-two percent said ChatGPT and another 20 percent named Claude. Perplexity was in third place with a 10 percent share, followed by Google’s Bard (9 percent) and Bing (7 percent). Pi was lumped in with the 2 percent of users who chose “other.”

The company had the usual long to-do list. The main challenge, though, was teaching Pi to get better at a wider range of tasks. People thought of Pi as a conversationalist, which was fine, but an assistant that is good only at conversation is of limited use. “Pi can’t code,” Balakrishnan said that winter. “It has to get better at reasoning. It can’t take actions. It’s really only useful if you want to talk about your feelings.”

From the book AI Valley: Microsoft, Google, and the Trillion-Dollar Race to Cash In on Artificial Intelligence by Gary Rivlin. Copyright © 2025 by Gary Rivlin. Reprinted with permission of Harper Business, an imprint of HarperCollins Publishers.


Published April 1, 2025
