News | 05-04-2018 | by Tiffany Won
A NEW WAY AUGMENTED REALITY IS MEETING ARTIFICIAL INTELLIGENCE
In any field, people are limited by the capabilities of their tools: the tools you use shape the quality and scope of your work and set the boundary of what you can build. Augmented reality applications are no exception.
Augmented reality applications for industrial uses such as manufacturing—whether to bolster complex assembly, support maintenance and repair technicians, or aid quality assurance—are still largely experimental.
There haven't really been any breakthrough industrial applications of augmented reality, partly because the technology is new and partly because the tools available to software developers are limited in scope.
Artificial intelligence and augmented reality seem ideally suited to one another, but developers may struggle to find an entryway for incorporating AI into augmented reality applications. AI would allow AR to interact with physical environments on a multidimensional level—by tagging real-world objects, for example. That's why the latest news from Unity about its partnership with IBM is so intriguing.
Incorporating Artificial Intelligence into Augmented Reality Applications
Unity announced a partnership with IBM to launch the IBM Watson Unity SDK, a programming interface that gives developers a way to add cloud-based AI to their applications. From IBM's perspective, the motivation is data-centric.
Data is now being generated at an enormous rate—2.5 quintillion bytes per day—and IBM Research writes that 90 percent of the data that exists today was created in only the last two years. That is far more data than can be properly analyzed or put to productive use.
IBM is looking to improve data visualization through augmented-reality-driven visual environments. An IBM project called Immersive Insights is tackling this problem by letting AR developers incorporate computer vision and speech recognition AI into Unity applications.
Most people are familiar with speech recognition, having spoken commands to their smartphones. In the IBM Watson Unity SDK, IBM also lets developers combine speech recognition with classification and language translation features.
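As a rough illustration of the pattern an SDK like this wraps, a cloud speech-recognition call typically amounts to an authenticated HTTP POST of audio data to a service endpoint, which returns a transcript. The sketch below builds (but does not send) such a request in Python; the endpoint URL and API key are hypothetical placeholders, not IBM's actual Watson API.

```python
import base64
import urllib.request

# Hypothetical endpoint and credential for illustration only --
# a real SDK would supply these for the specific cloud service.
API_URL = "https://api.example.com/v1/speech-to-text/recognize"
API_KEY = "YOUR_API_KEY"

def build_recognize_request(audio_bytes: bytes) -> urllib.request.Request:
    """Construct (but do not send) an authenticated transcription request.

    The general pattern: raw audio goes in the request body, the content
    type tells the service how to decode it, and an Authorization header
    carries the credential (here, HTTP Basic with an API key).
    """
    token = base64.b64encode(f"apikey:{API_KEY}".encode()).decode()
    return urllib.request.Request(
        API_URL,
        data=audio_bytes,
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "audio/wav",
        },
        method="POST",
    )

# Stand-in bytes where real WAV data would go.
request = build_recognize_request(b"\x00" * 16)
print(request.get_method())  # POST
```

An SDK's value is in hiding this plumbing: instead of hand-building requests and parsing JSON responses, the developer calls a service object and receives typed results inside the Unity event loop.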
Computer vision is the branch of AI now seeping into augmented reality applications built by companies such as Vuforia, Unity, Microsoft, and Blippar. IBM's Vision API for computer vision will be available through the IBM Watson Unity SDK.