By: Patrick Moorhead
I’m back on the road again attending Microsoft Build 2018 in Seattle, Washington. The company’s annual developer conference is always a highlight of the season for me, with developers, partners, press, analysts, and some customers all coming together to hear about the latest and greatest from the tech giant. Microsoft continues to demonstrate success in modern business and workforce transformation through its cloud and AI offerings, Microsoft 365, and mixed reality. Last quarter, Azure grew 93%, Office 365 commercial seats grew 42%, and Dynamics 365 grew 65%. Microsoft has a long list of new customers engaged in these kinds of transformations, including Toyota, Jaguar Land Rover, and Coca-Cola.
CEO Satya Nadella led off the Day 1 keynote and didn’t waste much time launching into the topic of civic responsibility for tech companies (citing the need for privacy and ethical AI). I know Nadella wasn’t talking directly to Facebook and Google, but it sure sounded like it to me. It was a good note to kick things off, positioning Microsoft as one of the “good guys” before launching into the newest offerings (many of which center around data, AI, and the like). Unlike others, Microsoft’s core business isn’t mining personal information for advertising, so the company has a good leg to stand on right off the bat. Today I wanted to give a recap of some of the more important announcements to come out of Day 1 of the event.
Some of the first announcements that caught my eye pertained to Microsoft’s Azure IoT Edge technology, where Microsoft announced, among other things, a new partnership with DJI on drone development and one with Qualcomm on a machine learning vision developer kit. Another huge announcement here was that Microsoft is open sourcing the Azure IoT Edge Runtime. I won’t go into much detail here, but I published a separate, in-depth blog on Microsoft’s IoT news which I would encourage you to read if interested.
Another IoT-adjacent announcement made at the event was the return of Kinect, in the form of Project Kinect for Azure—a package of sensors and the depth-sensing Kinect camera, now geared towards developers. Microsoft says that Project Kinect can provide fully articulated hand tracking and high-fidelity spatial mapping, all while making use of Azure AI to gain valuable insights and drive operations. Ironically, this technology makes more sense for IoT computer vision than it ever did for games, where it was originally introduced on the Xbox 360 in 2010.
Azure AI announcements
There were several AI-related announcements on Day 1. Microsoft announced that its Project Brainwave (an architecture for deep neural net processing) is now available in preview on the edge and in Azure. Supporting both Intel Altera FPGA hardware and ResNet50-based neural networks, Microsoft is saying that Project Brainwave makes Azure the fastest cloud for running real-time AI. Specifically, the company means machine learning inference. I have to say Project Brainwave is getting much more interesting and real. Microsoft is boasting up to 5 times lower hardware latency than Google’s TPU, though I’ll have to dig into that claim. Analyst Karl Freund wrote about FPGAs and AI here.