Augmented Reality, from WWDC to ARKit
Apple has been the subject of rumors relating to AR and VR projects for quite some time, with details surfacing mostly through patents, industry reports, and speculation based on what the rest of the tech world has released. Not much information about its plans was revealed in the first half of 2017, but a trickle of stories before WWDC suggested something was on the way.
ARKit was introduced at that June 5 developer conference, presented as a toolkit that lets developers add AR to their iOS apps more easily. An on-stage demonstration of an AR scene produced by director Peter Jackson's Wingnut AR showed that ARKit could place complex scenes on a surface within the view of an iPad, and that Apple had solved some of the tougher problems developers face when working in AR.
For a start, ARKit does a lot of the work on behalf of developers, analyzing an environment and locating horizontal planes suitable for placing virtual objects, while keeping track of those planes even when they move out of view of the rear camera. It can also monitor the lighting of the environment, applying that data to objects within the AR view to make them appear more “real” in a scene.
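In ARKit terms, the plane detection and light estimation described above come down to a few lines of session configuration. A minimal sketch is below; the class name and `sceneView` property are illustrative, not from any specific app:

```swift
import ARKit

// Minimal ARKit setup sketch: horizontal plane detection plus light
// estimation. Assumes a view controller that owns an ARSCNView
// (named `sceneView` here for illustration).
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // World tracking analyzes the environment using the rear camera
        // combined with CoreMotion data; no extra hardware is required.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal    // find surfaces for virtual objects
        configuration.isLightEstimationEnabled = true // sample ambient lighting
        sceneView.session.run(configuration)
    }

    // Called when ARKit locates a horizontal plane; the anchor remains
    // tracked even after it leaves the camera's view.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Plane found, extent: \(planeAnchor.extent)")
    }
}
```

The framework surfaces detected planes as `ARPlaneAnchor` objects and the estimated lighting via each frame's light estimate, which is why apps can sit virtual objects on real tables and shade them plausibly without writing any computer-vision code themselves.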
Because ARKit combines CoreMotion data with input from a single rear camera, and requires no extra accessories or add-ons, its apps can run on a vast number of iPhones and iPads already in use around the world. These relatively low hardware requirements mean that effectively anyone with an iOS device released in the last few years can run ARKit apps, giving developers a potentially large audience.
More importantly, ARKit was designed to make it relatively simple for developers to incorporate AR into their apps, to the point that there are easy-to-follow tutorials on using it in Apple's Swift Playgrounds. This ease of integration, combined with the massive audience of people intrigued by AR, led the development community to embrace the framework and quickly put together apps to try it out.
Major companies including Lego, Ikea, and Amazon have adopted the technology for their apps, with others replacing their own systems with Apple's version. For example, Niantic optimized its Pokemon Go iOS app with ARKit: the AR+ mode allows players to “sneak” up on an AR Pokemon to capture it, and makes the creature's placement in the environment more realistic.
Owners of the iPhone X can also use ARKit in another way: on their faces. ARKit's face tracking system powers the popular Animoji feature, as well as Portrait Lighting effects, masks, and avatars, capabilities that developers are only just starting to use for their own purposes.
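The face tracking that drives Animoji is exposed to third-party developers through the same ARKit session API. A hedged sketch of how an app might read the per-frame blend-shape coefficients follows; again, the class name and `sceneView` property are illustrative:

```swift
import ARKit

// Sketch of iPhone X face tracking: blend-shape coefficients from the
// TrueDepth camera can drive Animoji-style character animation.
// Assumes a view controller owning an ARSCNView (`sceneView` here).
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // Face tracking requires the TrueDepth camera (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // ARKit delivers per-frame blend shapes (jaw open, smile, blink, etc.)
    // that an app can map onto its own character rig.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        // Use `jawOpen` (0.0 to 1.0) to animate an avatar's jaw here.
        _ = jawOpen
    }
}
```

Each `ARFaceAnchor` carries over fifty such named coefficients, which is what lets developers build their own masks and avatars on top of the same tracking Apple uses for Animoji.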
Considering the months of availability to the public and the adoption of the platform by developers, it is hard to call ARKit anything but a great start for the fledgling framework. As it gets more established, expect there to be even more AR-related content in the App Store, especially if AR technology acquired through the purchase of Shazam is tapped for use.
If it does, Apple is poised to greatly increase its revenue from AR apps. In a November interview, Apple CEO Tim Cook said the company is “all about making sure the customer experience is great,” with “revenue and profits to follow” if everything goes right.
“I view AR as profound. Not today, not the app you’ll see on the App Store today, but what it will be, what it can be,” said Cook. “I think it’s profound and I think Apple is in a really unique position to lead in this area.”
Apple finally arrives to the VR party
Pre-WWDC, it was fair to say that VR was treated in a similar fashion to AR: there were rumors, but little in the way of solid detail. Closer to WWDC itself, reports leaned towards a major AR-related announcement rather than one for VR, and considering third-party VR efforts were largely nonexistent on the Mac, there was a possibility Apple wouldn't bother showing it off at all.
Although brief, Apple did take time to highlight its VR credentials during WWDC, demonstrating that its new iMac systems were capable of rendering VR content. The demonstration, using Epic’s Unreal Engine to render a Star Wars-themed scene, ran at a smooth 90 frames per second, with the high frame rate making it suitable for VR applications.
While the new iMac models and the announcement of the more powerful iMac Pro desktops shipping later the same year were proof that Mac systems were able to match PCs in pushing pixels fast enough for VR content creation and consumption, a different development offered more potential in the field.
Support for external graphics cards, introduced in macOS High Sierra, gave a boost to the eGPU enclosure market, in which a graphics card is installed in a separate box connected over Thunderbolt 3, letting the host computer take advantage of its graphical power. This option offers the possibility of higher-quality VR visuals and more complex scenes while maintaining the frame rate, as a lower-cost upgrade to a compatible Mac rather than necessitating the purchase of a more powerful system.
As an indication that this may be the way forward for VR content and its higher GPU processing demands, even Apple has issued its own eGPU hardware. The external GPU developer kit provided by the company includes a Sonnet enclosure and a Sapphire RX 580 reference-design GPU, a combination that shows the concept has some promise.
Even so, Apple has advised it will be making improvements to eGPU support in early 2018, fixing issues that could keep potential users from trying it out. The list of supported GPUs is limited, with users needing to perform some minor hacks to get around the restrictions, while the lack of a loopback and a “clamshell” mode makes the current implementation less than ideal for anyone other than those inclined to tinker.
Once these issues are solved, it is probable that VR will become more prevalent on the Mac, though it remains to be seen how much of a push it needs to properly take off.
No headsets now, nor in the immediate future
One area that Apple has certainly avoided in 2017 is that of wearable VR and AR hardware. While major rival Samsung is working on its phone-based VR headsets, with HTC, Facebook’s Oculus, and Sony in the VR marketplace with their own computer-connected efforts, along with continued teases by Magic Leap over its AR headset, Apple has so far declined to enter that field at all.
Anyone looking to try out virtual reality will need to look at the rising number of headsets already on the market. For AR, the current best options are to try to acquire one of the headsets inspired by Microsoft's HoloLens, or to wait for Magic Leap's long-awaited release.
This lack of Apple-branded hardware releases hasn’t stopped the rumor mill from running, with numerous stories surfacing throughout the year over AR hardware that Apple is claimed to be working on, but has yet to reveal.
At the start of the year, Apple was reportedly working with Zeiss to produce mixed-reality glasses, with Zeiss' lack of related products at its CES 2017 booth used as evidence. The acquisition of SensoMotoric Instruments also has potential for AR applications, with the firm's eye-tracking technology used both in AR experiences and in the medical field.
In August, one report claimed AR glasses had become “a particular area of experimentation” within Apple, with teams working on designs using screens integrated into glasses and a system similar to Samsung’s Gear VR headset.
Magic Leap’s One Lightwear mixed reality goggles
An October report from Bloomberg added more fuel to the rumor bonfire, claiming the fabled glasses are codenamed “T288,” would be a standalone unit with its own display and processor instead of tethering to a host device, and would run its own “rOS” platform. According to the report, even this hardware is a long way from public consumption, with Apple apparently seeking to develop technology for the hardware by 2019 for a market release in 2020.
An interview with Tim Cook in the same month does indicate that there will be a long wait before Apple releases AR glasses, regardless of how accurate the earlier rumors have been. The technology to make such a device “doesn't exist to do that in a quality way” at the moment, Cook insisted at the time, citing the huge challenges of putting hardware near a user's face and the field of view of current optics.
Cook also advised that Apple probably won’t be first to the market with such hypothetical hardware, declaring a preference to be the best on the market and to “give people a great experience.”
While the official Apple line is to neither confirm nor deny its development, some leaks may have confirmed its existence: a leaked safety report from a contractor indicates members of staff suffered eye-related injuries in February and March while testing prototypes at two offices in Cupertino.
Is there a chance that Apple will release AR glasses in 2018? It's highly doubtful, but it is far more likely that rumors surrounding the supposed headgear will continue over the coming 12 months, especially if ARKit continues to grow in popularity.