DeepSeek R2 Slashes Costs and Packs 512 PFLOPS

AI startup DeepSeek is expected to release its new R2 model soon, according to tech insider @iruletheworldmo on X. The company reportedly pairs Huawei Ascend 910B chips with a custom training stack to reach 82% hardware utilization. That works out to 512 PetaFLOPS of compute, roughly 91% of what comparable older NVIDIA clusters deliver, at about 97% lower cost.
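As a back-of-envelope check of those figures, the sketch below relates chip count, per-chip peak, and utilization to aggregate throughput. The per-chip peak and chip count are illustrative assumptions, not numbers from the report; only the 82% utilization, 512 PFLOPS, 91%, and 97% figures come from the article.

```python
def effective_pflops(num_chips: int, peak_tflops_per_chip: float,
                     utilization: float) -> float:
    """Aggregate sustained throughput in PFLOPS."""
    return num_chips * peak_tflops_per_chip * utilization / 1000

# Assumption: ~376 TFLOPS FP16 per chip; on that assumption, roughly
# 1,660 chips would sustain ~512 PFLOPS at 82% utilization.
sustained = effective_pflops(1660, 376, 0.82)
print(f"{sustained:.0f} PFLOPS")  # ~512 PFLOPS

# Cost efficiency implied by the article's claims: 91% of the NVIDIA
# baseline's performance at 3% of its cost.
perf_per_dollar_ratio = 0.91 / 0.03
print(f"~{perf_per_dollar_ratio:.0f}x performance per dollar")
```

Under these assumptions, the headline 97% cost cut at 91% performance would imply roughly a 30x performance-per-dollar advantage over the baseline.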

Several partners support the rollout. Tuowei Information handles most of the hardware orders, Sugon supplies specialized cooling racks for the high heat loads, and Innolight contributes components that cut power consumption by 35%. The system runs across data centers throughout China, with Runjian Shares managing southern operations for roughly ¥5 billion per year.

The R2 model is already in private deployment. Through the Yun Sai Zhilian platform, it powers smart-city projects in 15 Chinese provinces. If more compute is needed, Huawei can offer its CloudMatrix 384 system as an alternative to NVIDIA hardware; it packs 384 Ascend chips and offers more total memory, but draws four times as much electricity.
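The power tradeoff above can be made concrete with a tiny efficiency sketch. The factor-of-four power draw comes from the article; the relative performance value is a placeholder assumption, since no CloudMatrix benchmark is given.

```python
def perf_per_watt_ratio(relative_perf: float, relative_power: float) -> float:
    """Performance per watt relative to a baseline system (baseline = 1.0)."""
    return relative_perf / relative_power

# Assumption: performance parity with the NVIDIA baseline (hypothetical);
# "four times more electricity" is the article's claim.
print(perf_per_watt_ratio(1.0, 4.0))  # 0.25: a quarter of baseline efficiency
```

In other words, even at hypothetical performance parity, the four-fold power draw would leave CloudMatrix at a quarter of the baseline's performance per watt.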

Experts expect a smooth launch for R2. Tech watchers are awaiting official announcements to see how it performs against rival AI systems.
 
