AI Networking: Cornelis’ CN5000 Boosts Performance

In the old days, networks connected a small number of local computers. But times have changed. In a world dominated by artificial intelligence, the trick is coordinating the activity of tens of thousands of servers to train a large language model, without delays in communication. Now there is an improved architecture to do so. Cornelis Networks says its CN5000 network fabric boosts AI performance, supporting deployments of up to 500,000 computers or processors (more than today's largest deployments) with no added latency.
The new technology offers a third major option in the networking world, alongside Ethernet and InfiniBand. It is designed to let AI and high-performance computing (HPC, or supercomputing) systems achieve faster completion times more efficiently. For HPC, Cornelis claims its technology outperforms InfiniBand NDR, the version introduced in 2022, with twice the number of messages per second and 35 percent lower latency. For AI applications, it provides six times faster communication than Ethernet-based protocols.
Ethernet has always been synonymous with local area networks, or LANs. Software updates to its communication protocols have allowed it to stand the test of time. InfiniBand, invented later, improved on it, but it was still designed with the same goal: linking a small number of local devices. “When these technologies were invented, they had nothing to do with parallel computing,” says Philip Murphy, co-founder and head of operations at Pennsylvania-based Cornelis.
When data centers began to appear, engineers needed new networking solutions. Because different systems ran different software, they were unable to share resources, until the likes of Ethernet and InfiniBand were adapted to accommodate the busiest periods of operation. “That spurred the development of the entire cloud,” Murphy says. Sharing pooled CPU resources among different computers, or even different organizations, became the solution.
But while the data-center pioneers tried to maximize the number of applications running on one server, Murphy and his colleagues saw value in the opposite approach: maximizing the number of processors running one application. “That requires a completely different networking solution,” he says, which Cornelis now offers. The company’s Omni-Path architecture, originally developed by Intel for supercomputing applications such as simulating climate models or molecular interactions for drug design, maximizes throughput without losing data packets.
A Congestion-Free Highway
Coordinating processors to train AI models requires exchanging many messages, or data packets, at very high frequency. The rate of messages per second matters, and so does latency, meaning the time it takes the recipient to respond.
One of the main challenges in moving so many data packets around a network is traffic congestion. You need a way to reliably route packets around points of congestion without creating other problems, Murphy explains. For example, if packets take different routes to the same destination, they may arrive out of order.
Cornelis’ dynamic adaptive routing algorithm reduces congestion by steering traffic around short-lived congestion events, while its congestion-control architecture handles “popular” destinations. “If there’s an event at a venue we all want to go to, you don’t want the traffic that’s just passing the stadium to get stuck there,” Murphy says. This telemetry-based technology lets the network control congestion at its source. Switches see where traffic is building up, then tell senders to slow down until the congestion dissipates. “Think of it like traffic metering as you come onto the highway,” Murphy explains.
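Murphy's highway-metering analogy can be sketched as a toy simulation. This is purely illustrative, not Cornelis' actual algorithm: a switch watches its queue depth, and when the queue grows past a threshold it signals senders to halve their rates, which then creep back up while the link is clear.

```python
class Switch:
    """Toy telemetry-based congestion point: forwards a fixed number of
    packets per tick and reports congestion when its queue builds up."""
    def __init__(self, drain_rate, threshold):
        self.queue = 0
        self.drain_rate = drain_rate   # packets forwarded per tick
        self.threshold = threshold     # queue depth that signals congestion

    def tick(self, arriving):
        self.queue = max(0, self.queue + arriving - self.drain_rate)
        return self.queue > self.threshold   # congestion signal to senders

class Sender:
    def __init__(self, rate):
        self.rate = rate

    def adjust(self, congested):
        # Halve on congestion, creep back up otherwise (AIMD-style).
        self.rate = max(1, self.rate // 2) if congested else self.rate + 1

switch = Switch(drain_rate=10, threshold=20)
senders = [Sender(rate=8) for _ in range(3)]  # 24 pkts/tick into a 10 pkt/tick link

for _ in range(50):
    congested = switch.tick(sum(s.rate for s in senders))
    for s in senders:
        s.adjust(congested)

# With metering, the queue oscillates near the threshold instead of growing
# by ~14 packets every tick (which would leave ~700 queued after 50 ticks).
print(switch.queue < 100)  # -> True
```

The rate-halving rule here is borrowed from classic additive-increase/multiplicative-decrease congestion control; the article does not specify what policy Cornelis' switches and endpoints actually apply.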
Another challenge is avoiding latency. In a traditional Ethernet architecture, the endpoint must have enough memory available to receive a packet. “If I send to you and you deny the memory, you have to go back and tell me that,” Murphy says. That long round trip requires large temporary buffers that don’t scale well. Instead, Cornelis uses an algorithm called credit-based flow control, which allocates memory in advance. “You don’t have to tell me anything, and I’ll know how much I can send,” Murphy says.
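The idea behind credit-based flow control can be shown in a minimal sketch (an illustrative model, not Cornelis' implementation): the receiver advertises its buffer capacity up front as credits, the sender spends one credit per packet and simply stops when credits run out, and drained buffer slots are returned as fresh credits. No ask-and-deny round trip ever happens.

```python
from collections import deque

class Receiver:
    """Advertises its buffer capacity up front as credits."""
    def __init__(self, buffer_slots):
        self.buffer = deque()
        self.capacity = buffer_slots

    def initial_credits(self):
        return self.capacity

    def deliver(self, packet):
        # The credit scheme guarantees this never overruns the buffer.
        assert len(self.buffer) < self.capacity, "buffer overrun"
        self.buffer.append(packet)

    def drain(self, n):
        """Consume up to n packets; each freed slot becomes a returned credit."""
        freed = min(n, len(self.buffer))
        for _ in range(freed):
            self.buffer.popleft()
        return freed

class Sender:
    """Transmits only while it holds credits -- no round trip needed."""
    def __init__(self, receiver):
        self.receiver = receiver
        self.credits = receiver.initial_credits()
        self.sent = 0

    def send(self, packets):
        for p in packets:
            if self.credits == 0:
                return False           # must wait for credits; never overruns
            self.receiver.deliver(p)
            self.credits -= 1
            self.sent += 1
        return True

    def add_credits(self, n):
        self.credits += n

rx = Receiver(buffer_slots=4)
tx = Sender(rx)
tx.send(range(6))            # only 4 fit; the sender stops on its own
print(tx.sent)               # -> 4
tx.add_credits(rx.drain(2))  # receiver frees 2 slots, returning 2 credits
tx.send(range(2))
print(tx.sent)               # -> 6
```

The same mechanism is used at the link layer of InfiniBand, which is why it scales without the "send, get refused, retry" buffer churn Murphy describes for Ethernet.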
Finally, the system avoids grinding to a halt if a graphics processing unit fails. In a traditional architecture, if a server goes down, so does the application. Repairing it requires restarting from the latest checkpoint, which itself takes substantial computing power to create. “Imagine if every time you hit ‘save’ on your document, you had to wait 20 minutes,” Murphy says. Instead, because applications are spread across multiple computers, Cornelis’ network keeps an application running, albeit at slightly lower bandwidth, while the faulty link is replaced. No checkpoints required.
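The degraded-but-running behavior can be modeled in a few lines. The link count and per-link bandwidth below are hypothetical, chosen only to make the arithmetic visible; the point is that a failure subtracts bandwidth rather than killing the job.

```python
class Fabric:
    """Sketch: traffic is spread across several links; if one fails, the job
    keeps running at reduced bandwidth instead of restarting from a checkpoint."""
    def __init__(self, links, link_bw_gbps):
        self.links = [True] * links   # True = healthy
        self.link_bw = link_bw_gbps

    def bandwidth(self):
        return sum(self.links) * self.link_bw

    def fail(self, i):
        self.links[i] = False         # traffic reroutes over the survivors

    def repair(self, i):
        self.links[i] = True          # replaced link rejoins the fabric

fabric = Fabric(links=4, link_bw_gbps=400)
print(fabric.bandwidth())  # -> 1600
fabric.fail(2)
print(fabric.bandwidth())  # -> 1200  (the job continues, just slower)
fabric.repair(2)
print(fabric.bandwidth())  # -> 1600
```

Contrast this with checkpoint-restart, where the cost is the full time since the last checkpoint plus the cost of writing checkpoints in the first place, the "wait 20 minutes after every save" Murphy describes.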
Putting AI to Work
Physically, the CN5000 is a network card based on a dedicated chip. The network cards connect to each server, “like you’d plug an Ethernet card into your computer at home,” Murphy explains. A top-of-rack switch connects to each server and to other switches, and a director-class switch with 48 or 576 ports connects the rack switches. “Every server has cards connected, so you can create endpoint groups of any size,” Murphy says.
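To get a feel for how switch port counts translate into cluster sizes, here is the standard fat-tree arithmetic. This is generic topology math, not necessarily the topology Cornelis uses; the 48-port figure comes from the article, while the tier counts are hypothetical.

```python
def fat_tree_endpoints(radix, tiers):
    """Maximum endpoints of a full fat tree built from switches with
    `radix` ports each: radix * (radix/2)^(tiers - 1)."""
    return radix * (radix // 2) ** (tiers - 1)

# A single 48-port switch connects 48 endpoints; adding tiers multiplies reach.
print(fat_tree_endpoints(48, 2))  # -> 1152
print(fat_tree_endpoints(48, 3))  # -> 27648
```

Each extra tier multiplies the endpoint count by half the radix, which is how a fabric built from modest switches can, in principle, scale toward the half-million-endpoint deployments the article mentions.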
The company’s main market is organizations that want to upgrade to a new cluster to run HPC simulations faster. That happens through one of three original equipment manufacturers that work with Cornelis and build servers and network switches. The OEM buys Cornelis’ hardware cards, installs them in servers, and then deploys the system.
Until recently, training a neural network model was a one-time deal. But now, AI models with millions of parameters are fine-tuned or updated frequently. Cornelis expects to benefit. “If you don’t adopt AI, you’re going to go out of business. If you adopt AI inefficiently, you’re going to go out of business,” Murphy says. “Our customers want to build AI in the most efficient way.”
2025-06-22 13:00:00