Big Tech develops AI networking standard

Major tech firms, including Meta, Microsoft, AMD and Broadcom, have developed a new industry standard for networking in AI data centres, the latest effort to break the dominance of market leader Nvidia.

The “Ultra Accelerator Link” is an attempt to establish an open standard for communication between artificial intelligence accelerators – the systems that help process the vast amounts of data used in AI tasks.

Other members include Alphabet-owned Google, Cisco Systems, Hewlett Packard Enterprise and Intel.

Nvidia, the biggest player in the AI chip market with a share of around 80 percent, is not part of the grouping.

Tech giants like Google and Meta are keen to reduce their dependence on Nvidia, whose networking business forms an essential part of the package that enables its AI dominance.

Broadcom’s main rival in the networking and custom chip market, Marvell Technology, is also not part of the group.

“An industry specification becomes critical to standardize the interface for AI and Machine Learning, HPC (high-performance computing), and Cloud applications for the next generation of AI data centres and implementations,” the companies said in a statement.

Tech companies are pouring billions of dollars into the hardware required to support AI applications, boosting demand for AI data centres and the chips that power them.

The Ultra Accelerator Link group has designed specifications governing connections among different accelerators in a data centre.

The specifications will be available in the third quarter of 2024 to companies that join the Ultra Accelerator Link (UALink) Consortium.
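The article does not describe UALink’s interface itself, so the sketch below is only a rough illustration of the kind of accelerator-to-accelerator traffic such an interconnect standard would carry: a distributed all_reduce across GPUs using PyTorch. The library, the “nccl” backend (Nvidia’s collective-communication layer, the part of the stack an open interconnect targets) and the tensor sizes are illustrative choices, not anything specified by the consortium.

```python
# Illustrative only: ordinary multi-GPU collective traffic, not the
# UALink specification, which is not described in the article.
import torch
import torch.distributed as dist

def main():
    # One process per accelerator; rank and world size come from the
    # launcher (e.g. torchrun). "nccl" is Nvidia's collective backend;
    # UALink-class links sit beneath this layer of the stack.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    device = torch.device(f"cuda:{rank % torch.cuda.device_count()}")

    # Each accelerator holds its own tensor; all_reduce sums the tensors
    # across every accelerator over the interconnect and returns the
    # result to all of them.
    local = torch.ones(1024, device=device) * rank
    dist.all_reduce(local, op=dist.ReduceOp.SUM)
    print(f"rank {rank}: reduced value = {local[0].item()}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with, say, `torchrun --nproc_per_node=8 allreduce_sketch.py` (a hypothetical file name), each process exchanges data with its peers over whatever interconnect the backend exposes; UALink’s specifications target those accelerator-to-accelerator links.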

A spokesperson for Nvidia declined to comment. Marvell did not immediately respond to a Reuters request for comment.
