Physical AI marks a transition from robots as programmed tools to robots as adaptable collaborators. That transition will unfold over years rather than months, but the foundation models emerging from ...
Software developers have spent the past two years watching AI coding tools evolve from advanced autocomplete into something that can, in some cases, build entire applications from a text prompt. Tools ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
Maia 200 is Microsoft’s latest custom AI accelerator, designed to meet the demands of large-scale inference workloads.
The government says artificial intelligence is saving money on translators, but human translators worry it may cost them work ...
Microsoft has announced that Azure’s Central US datacentre region is the first to receive its new artificial intelligence (AI) inference accelerator, Maia 200.
Calling it the highest-performing chip of any custom cloud accelerator, the company says Maia 200 is optimized for AI inference across multiple models.
According to Gartner, public cloud spend will rise 21.3% in 2026, and yet, according to Flexera's latest State of the Cloud ...
Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to ...
Microsoft unveils the Maia 200 AI chip. Learn about the tech giant's shift toward in-house silicon, its performance edge over ...