Microsoft Azure's AI inference accelerator, Maia 200, aims to outperform Google TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute.
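The FP4 figure matters for inference economics because 4-bit weights shrink a model's memory footprint roughly fourfold versus FP16. A minimal sketch of that arithmetic, using a hypothetical 70B-parameter model (the numbers are illustrative assumptions, not Maia 200 specifications):

```python
# Illustrative footprint arithmetic for low-precision inference.
# None of these numbers are Maia 200 specs; the 70B model is hypothetical.

def model_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight-storage footprint in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

fp16_gb = model_memory_gb(70, 16)  # 140.0 GB at 16-bit precision
fp4_gb = model_memory_gb(70, 4)    # 35.0 GB at 4-bit precision
print(f"FP16: {fp16_gb} GB, FP4: {fp4_gb} GB")
```

Halving the bit width twice (16 → 8 → 4) is what lets an inference-oriented chip serve larger models per accelerator, one reason vendors quote FP4 throughput for inference parts.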
The Maia 200 AI chip is described as an inference powerhouse, meaning it is built for the stage where AI models apply their knowledge to ...
Azure's Scott Guthrie claims Microsoft's inference-optimized chip is 30% cheaper than any other AI silicon on the market today ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
Microsoft has unveiled its second-generation artificial intelligence chip, Maia 200, as it pushes to strengthen its cloud ...
Microsoft says the new chip is competitive against in-house solutions from Google and Amazon, but stops short of comparing to ...
Microsoft’s new Maia 200 inference accelerator enters this overheated market aiming to cut the price ...
Microsoft (NASDAQ:MSFT) just dropped a new AI chip, and the subtext is obvious: it's tired of AI being so expensive. The chip is called Maia 200, and instead of being some future concept, it's ...