Steve Clayton, general manager, Microsoft AI, said that the company was experiencing an “explosion of interest in AI and its potential” from Australian enterprises.
The products released are:
- Visual Studio Tools for AI
- Cloud and AI capabilities for Azure IoT Edge
- Innovations in Microsoft Translator
- Chinese language learning smartphone app
- Seeing AI, to help vision-impaired people interact with the world around them
Visual Studio Tools for AI is aimed at AI developers and data scientists. The new product in the Visual Studio line will leverage Visual Studio's debugging and rich editing capabilities, adding support for deep learning frameworks such as Microsoft Cognitive Toolkit, TensorFlow and Caffe.
According to Clayton, “AI is about amplifying human ingenuity through intelligent technology that will reason with, understand and interact with people and, together with people, help us solve some of society’s most fundamental challenges.”
Locally, these tools are being embraced by the University of Canberra, which has developed the Lucy and Bruce chatbots to support students and university employees with intelligent insights and advice to streamline support services. The chatbots were developed using Microsoft Bot Framework and Microsoft Cognitive Services Language Understanding Intelligent Service.
Microsoft also collaborated with Pact Group to develop a proof-of-concept workplace safety solution. Using Microsoft Cognitive Services Computer Vision for facial and object recognition, the Workroom Kiosk Demo can identify individual employees in a workshop environment, detect if the correct safety equipment is being worn and monitor workplace behaviour based on an understanding of the tasks and tools the individual is authorised to perform. Team leaders are automatically alerted to potential issues.
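The decision logic described for the kiosk can be sketched in a few lines. This is an illustrative assumption, not Pact Group's or Microsoft's actual implementation: the `Detection` class, the task and equipment names, and the `safety_alerts` function are all hypothetical, and a real deployment would populate the detection fields from Microsoft Cognitive Services Computer Vision rather than by hand.

```python
from dataclasses import dataclass, field

# Hypothetical rule table: safety equipment required for each task
# an employee may be authorised to perform.
REQUIRED_EQUIPMENT = {
    "welding": {"helmet", "gloves"},
    "cutting": {"goggles", "gloves"},
}

@dataclass
class Detection:
    employee_id: str  # from facial recognition
    task: str         # inferred from the tools in view
    equipment: set = field(default_factory=set)  # items recognised on the person

def safety_alerts(detection: Detection, authorised_tasks: dict) -> list:
    """Return alert messages for a team leader; empty list means all is well."""
    alerts = []
    # Is this employee authorised for the task they appear to be doing?
    if detection.task not in authorised_tasks.get(detection.employee_id, set()):
        alerts.append(f"{detection.employee_id} is not authorised for {detection.task}")
    # Is any required safety equipment missing?
    missing = REQUIRED_EQUIPMENT.get(detection.task, set()) - detection.equipment
    if missing:
        alerts.append(f"{detection.employee_id} is missing: {', '.join(sorted(missing))}")
    return alerts
```

For example, `safety_alerts(Detection("emp42", "welding", {"gloves"}), {"emp42": {"welding"}})` would flag the missing helmet, while a fully equipped, authorised employee produces no alerts.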
Microsoft has been working in the field of AI for almost three decades and last year created the AI & Research Group which now has more than 8000 computer scientists, researchers and engineers.
The output of the group combined with “the power and scale of Azure, which ensures the big compute foundations for AI, and Microsoft Graph, which delivers one of the broadest collections of data insights into world and workplace knowledge, are promising huge impact for organisations and individuals,” said Clayton.
Microsoft says its release today of multiple innovations in Microsoft Translator will expand the use of neural network technologies to improve text and speech translations across all of Microsoft Translator’s supported products, from the developer-centric API, to apps, to conversation and presentation translation features.
The company announced it would soon be releasing a smartphone app from Microsoft Research Asia that will benefit people learning Chinese, acting as an always-available, AI-based language learning assistant.
Microsoft's free app, Seeing AI, is designed to support blind and low-vision people by narrating the world around them and is now available in Australia. Seeing AI is an ongoing research project that uses Microsoft's AI technologies and the iPhone camera to read documents, identify products, recognise faces and describe people's appearance.
For the estimated 384,000 Australians who are blind or have low vision, Seeing AI is a “beautifully simple app that describes people, things and text”, said Jenny Lay-Flurrie, chief accessibility officer, Microsoft, who is visiting Australia.
“We’re providing computers with the intelligent capabilities to see, hear, talk and understand natural ways of communication,” she said. “This has profound implications for enterprise technology customers but critically also allows us to develop tools that promote inclusivity and allow more people to benefit from digital innovation.”
Microsoft says the foundations for its AI advances are a number of breakthroughs in computer vision, speech recognition and natural language understanding. It has received accolades for developing AI technologies that can recognise speech with an error rate of just 5.1%, achieved two months ago, and identify images with an error rate of only 3.5%.
Microsoft is also currently leading a Stanford University competition based on the SQuAD dataset, which uses information from Wikipedia to test how well AI systems can answer questions about text passages, capabilities that fuel results in areas like Bing search and chatbot responses.
Practically, this means computers will recognise words in a conversation on par with a person, deliver relevant answers to very specific questions and provide real-time translation.
It also means computers on a factory floor can distinguish between a fabricated part and a human arm, or that an autonomous vehicle can tell the difference between a bouncing ball and a toddler skipping across a street.
Clayton says, "Our job is to democratise AI so every company can be an AI company. AI is all part of the stack. It is infrastructure – you should have whatever GPU compute you want, so you can build your own intelligence. It's a set of APIs. If you want speech recognition, you want image recognition and text understanding."
“It's about applications with built-in AI. Every part of what we do will be AI. But, more importantly, every company that works to build something of their own will incorporate AI. That's what I mean by democratising AI. AI is perhaps the most transformative thing that's ever happened."
“We believe AI will complement rather than replace human endeavour in all fields.
"We encourage business leaders to replace the labour-saving and automation mindset with a maker and creation mindset – that’s what we are seeing with AI adoption at the University of Canberra and Pact. AI is augmenting rather than diminishing us.”