AMD expects data centers to become less central to AI workloads as phones and laptops take on more of the processing. Its chief technology officer, Mark Papermaster, said the industry is shifting toward running AI on consumer devices rather than in massive server farms, and he expects the bulk of AI processing to happen on personal devices by 2030 as new applications are built for them. The driving force is cost: running AI in data centers is far more expensive than companies are willing to sustain.
AMD is taking the AI PC trend seriously, competing with Intel and Qualcomm. Its newest chips, such as Strix Point, are aimed at bringing AI capability to compact devices at lower prices. Papermaster noted that the industry is working hard to make AI models more accurate and efficient, and he expects that our devices will eventually run complex AI workloads without needing an internet connection.
His comments echo remarks from Intel's former CEO about AI processing shifting to everyday devices. The reasoning is that NVIDIA dominates the market for training AI models, so the most realistic way for competitors to challenge it is by handling AI workloads on consumer hardware. AMD has already begun shipping chips designed to run these tasks locally.