Huawei OceanStor A800 AI storage claims first spot in global benchmark test
The Huawei OceanStor A800 AI storage solution has ranked first in a global benchmark test. The product topped the test thanks to a range of smart AI capabilities and high-end performance that Huawei says is 10 times that of traditional storage systems.
MLPerf organized the Storage v1.0 AI benchmark test. MLPerf is a standard test platform that measures the performance of AI hardware, software, and services.

The authoritative global platform conducts benchmark tests every year. Results are generated by evaluating how many GPUs an AI storage system can support at a time.

The test also covers aspects like storage bandwidth and compute throughput. The data must be read from the storage node and must not be cached on the host in advance. Overall, the test reflects real-world performance and the storage experience for large-model training.
In the latest MLPerf 2024 global storage benchmark test, the Huawei OceanStor A800 AI storage participated for the first time and came out on top, beating 12 rivals.
It met demanding data throughput requirements, including simulated training for 255 GPUs on a single device, GPU utilization above 90%, and a stable single-frame bandwidth of 679 GB/s.
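The headline figures imply a sustained read rate per simulated GPU. A minimal back-of-the-envelope sketch, using only the numbers reported above (the per-GPU figure is derived, not from the article):

```python
# Derive per-GPU bandwidth from the article's reported figures.
simulated_gpus = 255        # GPUs simulated by a single device (reported)
aggregate_bw_gb_s = 679     # stable single-frame bandwidth in GB/s (reported)

# Sustained read bandwidth each simulated GPU receives on average
per_gpu_bw_gb_s = aggregate_bw_gb_s / simulated_gpus
print(f"~{per_gpu_bw_gb_s:.2f} GB/s sustained per simulated GPU")
```

That works out to roughly 2.66 GB/s per GPU, the kind of feed rate needed to keep accelerator utilization above the 90% threshold.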
Huawei OceanStor A800
The OceanStor A800 AI storage delivers 340 GB/s of bandwidth per node, or 85 GB/s per U on average. It can also reach TB-level bandwidth through large-scale horizontal expansion, and it cuts checkpoint read and write times from 10 minutes to seconds for faster training.
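The checkpoint claim is essentially size divided by bandwidth. A hypothetical illustration, assuming an example checkpoint size (not a figure from the article) and taking the article's "10 times traditional storage" claim at face value:

```python
# Hypothetical illustration of how node bandwidth maps to checkpoint time.
checkpoint_size_gb = 3400          # ASSUMED example checkpoint size, GB
node_bandwidth_gb_s = 340          # per-node bandwidth (reported)
legacy_bandwidth_gb_s = 340 / 10   # article claims ~10x traditional storage

fast_seconds = checkpoint_size_gb / node_bandwidth_gb_s
legacy_seconds = checkpoint_size_gb / legacy_bandwidth_gb_s
print(f"A800 node: ~{fast_seconds:.0f} s, legacy: ~{legacy_seconds / 60:.1f} min")
```

Under these assumed numbers, a checkpoint that takes minutes on slower storage completes in seconds on a single A800 node; scaling out multiplies the effect.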
Resuming training from a breakpoint takes 15 minutes less, which reduces GPU idle time and raises end-to-end compute utilization by more than 30%.
Huawei has long promoted its data storage solutions. The company is also building AI capabilities tailored to its new hardware into a leading storage system architecture. The latest AI storage result not only highlights the company's storage technology but also marks another milestone.