Microsoft Bets on Faster Chips, AI Services, to Win Cloud Wars

Microsoft will let customers take advantage of its AI tools to lure business from Amazon and Google.

An exhibitor demonstrates the Microsoft Corp. Surface Hub at the Microsoft Developers Build Conference in San Francisco, California, U.S. (Photographer: David Paul Morris/Bloomberg)  

(Bloomberg) -- Microsoft has spent the past few years coming up with ways to use artificial intelligence internally. Now it will let customers take advantage of some of these tools while aiming to lure business from Amazon and Google.

The company will let customers use a chip system it built, called Project Brainwave, to process AI queries faster and more cheaply. The first Brainwave service will speed up image recognition so it's almost instantaneous, said Doug Burger, a distinguished engineer at Microsoft Research who works on the company's chip development strategy for the cloud.

Starting next year, Microsoft also will sell an AI-sensor device based on the technology in its motion-controlled Kinect gaming sensor. Called Project Kinect for Azure, it will let cloud customers do things like track motion and map the space around them.

Microsoft Chief Executive Satya Nadella wants to win customers with artificial intelligence tools. Increasingly these services need to operate in Microsoft's own cloud data centers and on customers' connected devices, including factory equipment and drones. As Microsoft, Amazon.com Inc. and Alphabet Inc.'s Google race to add AI products and make their clouds run faster, all are boosting work on customized microprocessors to try to gain an edge.

“It comes down to the cloud wars -- all of these vendors are salivating at the AI workloads because they are very compute intensive and they are very data intensive,” said Mike Gualtieri, an analyst at Forrester Research.

Microsoft is announcing the new services and products Monday at its annual Build conference for software developers in Seattle.

Brainwave uses customizable chips known as field programmable gate arrays, or FPGAs. Microsoft buys the chips from Altera, a subsidiary of Intel Corp., and adapts them for its own purposes using software, an ability that's unique to that type of chip. “It's pretty tricky engineering stuff to program these,” Gualtieri said. “The significance of Brainwave is it's simple to do that -- Microsoft does it for you.”

One early client is electronics manufacturer Jabil Inc., which plans to use the service in factories where it has optical scanners that find possible product defects, including variations in tiny components.

Right now, Jabil's scanners are very conservative when they flag possible issues, which then get examined by workers -- 40 percent of the time there's nothing actually wrong. Jabil has an AI system that has lowered the false positives by 75 percent, but it's running on pricier graphics chips. As the company looks to move the system from testing on two manufacturing lines to hundreds, it's planning to switch to Microsoft's option, which is cheaper, said Ryan Litvak, information technology manager at Jabil.

The image processing is done in Jabil's factories, an example of Microsoft's strategy to let customers use its AI products in the cloud and on the customer's own devices.

Many customers want to have the AI services available for equipment like factory machinery or drones scanning power lines and pipe networks for defects -- and those devices often aren’t connected to the Internet, which means the service has to run on the device.

Burger said Microsoft's Brainwave service will provide the fastest analysis of images using one of the most common AI neural networks for such a task -- a nearly instantaneous response.

Improvements in chip performance are slowing. Intel's next advance, 10-nanometer microprocessors, is running late; the company said last month the chips won't be in mass production until 2019. That has put pressure on Microsoft and its rivals to find the best way to augment commodity processors and speed up their networks.

“If one company picks the right architecture and one picks the wrong one, it's a pretty big deal,” Burger said.  

Microsoft's Azure already uses these FPGA chips -- every Azure server put into service in the past three years has one of these chips in it, said Azure Chief Technology Officer Mark Russinovich.

Project Kinect for Azure devices will go on sale next year and will let software developers write cloud applications that make use of sound, gestures or spatial understanding of the surrounding area. For example, a customer could place the devices at work sites to track things like spills, or in a retail store as part of a cashless checkout experience, an idea similar to Amazon's Go store.

Nadella also committed $25 million over five years for a program to use AI for accessibility -- projects like helping people with disabilities to communicate and find employment. 

To contact the author of this story: Dina Bass in Seattle at dbass2@bloomberg.net.

To contact the editor responsible for this story: Andrew Pollack at apollack1@bloomberg.net.

©2018 Bloomberg L.P.