Sam Altman’s AI empire will devour as much power as New York City and San Diego combined. Experts say it’s ‘scary’

Picture New York City on a sweltering summer night: air conditioners straining, subway cars rumbling underground, towers ablaze with light. Now add San Diego at the height of a record heat wave, when demand surged to roughly 5,000 megawatts and pushed the grid to its limit.

That is roughly how much electricity Sam Altman and his partners say their next wave of artificial intelligence data centers will devour: a single buildout consuming more power, day after day, than two American cities pushed to their breaking point.

The announcement is a “seminal moment” that Andrew Chien, a professor of computer science at the University of Chicago, says he has long seen coming.

“I’ve been a computer scientist for 40 years, and for most of that time computing was a tiny part of our economy’s power use,” Chien told Fortune. “Now it is becoming a large share of what the entire economy consumes.”

He called the transformation both exciting and disturbing.

“It’s scary because … now [computing] could be 10% or 12% of the world’s power by 2030. We’re reaching a seminal moment in how we think about artificial intelligence and its impact on society.”

OpenAI announced this week a plan with NVIDIA to build AI data centers consuming up to 10 gigawatts of power, with additional projects bringing the total pipeline to 17 gigawatts. That is roughly equivalent to powering New York City, which draws about 10 gigawatts in summer, plus San Diego during the intense 2024 heat wave, when demand topped 5 gigawatts. Or, as one expert put it, it approaches the total electricity demand of Switzerland and Portugal combined.

“It’s quite astonishing,” Chien said. “A year and a half ago they were talking about 5 gigawatts. Now they’ve raised it to 10, 15, even 17. There is a continuous escalation.”

Fengqi You, a professor of energy systems engineering at Cornell University who studies AI, agreed.

“Seventeen gigawatts is like powering both of those countries together,” he told Fortune.

The Texas grid, where Altman broke ground on one of the projects this week, typically runs at about 80 gigawatts.

“So you’re talking about an amount of power comparable to 20% of the entire Texas grid,” Chien said. “That grid serves everything else: municipalities, factories, and households. It’s a huge amount of power.”

Altman frames the buildout as necessary to keep pace with runaway demand for artificial intelligence.

“This is what it takes to deliver AI,” he said in Texas, noting that ChatGPT usage has jumped tenfold in the past 18 months.

Where will the power for AI come from?

Altman has made no secret of his favored source: nuclear. He has backed both fission and fusion startups, betting that only reactors can provide the steady, concentrated output needed to keep up with AI’s relentless demand.

“Compute infrastructure will be the basis for the economy of the future,” he said, framing nuclear power as the backbone of that future.

Chien, however, is blunt about the near-term limits.

“To my knowledge, the amount of nuclear power that can be brought onto the grid before 2030 is less than a gigawatt,” he said. “So when you hear 17 gigawatts, the numbers don’t add up.”

With projects like OpenAI’s demanding 10 to 17 gigawatts, Chien said, nuclear is a long way off and will ramp up slowly even once it arrives. Instead, he expects wind, solar, natural gas, and new storage technologies to dominate.

You, the Cornell energy expert, struck a middle ground. Nuclear may be unavoidable in the long run if AI keeps expanding, he said, but he cautioned that “in the short term, there is not much spare capacity,” whether fossil, renewable, or nuclear. “How do we expand that capacity in the short term? That is unclear,” he said.

He also warned that the timeline may be unrealistic.

“A typical nuclear plant takes years to permit and build,” he said. “In the short term, they will have to rely on renewables, natural gas, and possibly retrofitting older plants. Nuclear will not come online quickly.”

Environmental costs

Environmental costs loom large for these experts as well.

“We have to face the fact that these companies promised to be clean and green, and in the face of AI’s growth, they may not be,” Chien said.

Local ecosystems could also come under strain, the Cornell professor said.

“If data centers consume all the local water or disrupt biodiversity, that creates unintended consequences,” he said.

The investment figures are staggering. Each of OpenAI’s planned data center projects is valued at roughly $50 billion, adding up to some $850 billion in planned spending. NVIDIA alone has pledged up to $100 billion to support the expansion, supplying millions of its new Vera Rubin GPUs.

Chien added that society needs a broader conversation about the environmental costs of funneling so much electricity into AI. Beyond carbon emissions, he pointed to hidden strains on water supplies, biodiversity, and local communities near massive data centers. Cooling alone can consume enormous amounts of fresh water in regions already facing scarcity, he noted. And because hardware turns over so quickly, with new NVIDIA processors released every year, older chips are constantly discarded, creating waste streams laced with toxic chemicals.

“We were told these data centers would be clean and green,” Chien said. “But in the face of AI’s growth, I don’t think they can be. It’s time to hold their feet to the fire.”

2025-09-24 18:25:00
