WWDC revealed Apple's plans for augmented reality, long rumored as Cupertino's next big thing. With ARKit, Apple is readying developers to bring AR apps to your iPhone and iPad.

With integration into SpriteKit and SceneKit, it should be easy for developers to update their apps. But how will this affect the apps of the future?

What Is Augmented Reality?

As James wrote in his coverage of Apple's VR efforts, Pokémon Go was many people's first taste of an AR app. It was a simplistic beginning: the game put a flat overlay of a Pokémon on your camera feed. The creature did not interact with the background, nor did the game do anything with the orientation of your device.

So it should not be a surprise that Apple showed off an updated Pokémon Go demo. Rather than ignoring the background, Pikachu jumps up and down in the sand, kicking up dust. Apple also showed off moving your phone around AR objects: you can see them from different angles against the background, and they react to the position of the phone.

One impressive keynote moment came when a developer built a landscape on top of a table, with live animation and 3D objects you could view from every angle. Apps still use your iPhone or iPad as a portal to see the content, but it was impressive to see the first possibilities. So far we have mostly seen game gimmicks, though that could be down to hardware limitations.

ARKit and the New APIs

ARKit is the centerpiece of this new technology. It is a new set of APIs that lets developers pull data from the camera and motion sensors, then use that tracking data to render app content as an overlay on the real world.
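To give a sense of how little code is involved, here is a minimal sketch of starting an AR session, as it might appear in a playground. ARSCNView and ARWorldTrackingConfiguration are real ARKit classes; the setup around them is my own, not taken from Apple's demos:

```swift
import UIKit
import ARKit

// Minimal sketch: an ARSCNView owns an ARSession, and running a
// world-tracking configuration turns on camera capture and motion
// tracking together.
let sceneView = ARSCNView(frame: UIScreen.main.bounds)
sceneView.session.run(ARWorldTrackingConfiguration())
```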

In theory, a developer can add augmented reality to an app with just a few new calls, getting back the data needed to draw an object in real space. The secret sauce Apple touts for ARKit is horizontal plane detection. If you do not remember your geometry, a plane in this sense is a flat surface. The iPhone can use detected planes as anchor points for objects, letting an app render onto multiple surfaces at once, such as putting a chair on the floor and a lamp on a table.
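As a hedged sketch of what that looks like in practice, the snippet below opts into horizontal plane detection and listens for detected surfaces. The view controller wiring is an assumption on my part, but the delegate callback and ARPlaneAnchor type are the real ARKit API:

```swift
import UIKit
import SceneKit
import ARKit

// Sketch: enable horizontal plane detection, then respond as ARKit
// reports each flat surface it finds via an ARPlaneAnchor.
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds a node for a newly detected anchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        // plane.extent gives the surface's size; attach content to
        // `node` to pin it to the detected surface.
        print("Found a horizontal plane with extent \(plane.extent)")
    }
}
```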

Apple uses something called Visual Inertial Odometry (VIO) to track how a device moves around, then passes that data to the app without the user having to recalibrate anything. VIO draws on two existing iOS data sources: camera input and CoreMotion. If you watch the developer sessions from WWDC or read the docs, it is clear that Apple focused on using existing data to create AR.
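For the curious, the pose that VIO computes surfaces in the API as a transform on every frame. A small sketch, with the function name being my own:

```swift
import ARKit

// Sketch: each ARFrame carries the camera's pose as a 4x4 matrix in
// world space, updated continuously by VIO with no recalibration.
func currentDevicePosition(in session: ARSession) -> simd_float4? {
    guard let frame = session.currentFrame else { return nil }
    return frame.camera.transform.columns.3  // translation column of the pose
}
```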

There is some additional work involved, but it should be easy for existing apps to adopt this. ARKit requires an A9 chip or later, which brings phones as far back as the iPhone SE along for the ride.
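Apps can check for that support at runtime rather than hard-coding a device list. A short sketch (the helper function is hypothetical; the isSupported check is the real API):

```swift
import ARKit

// Sketch: gate the AR experience at runtime. isSupported is false on
// devices older than the A9 generation.
func startARIfAvailable(on sceneView: ARSCNView) {
    if ARWorldTrackingConfiguration.isSupported {
        sceneView.session.run(ARWorldTrackingConfiguration())
    } else {
        // Fall back to a non-AR experience here.
        print("This device does not support world tracking")
    }
}
```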

SpriteKit and 2D Animation in Real Life

SpriteKit is the 2D animation toolset that Apple offers developers. It is mainly used to create 2D games, though it has other uses as well. SpriteKit is getting some upgrades for integration into AR apps.

Sprites are 2D digital art that can move around the screen; the best-known examples are the characters of 8-bit and 16-bit games. On iOS, sprites live in a scene, which acts as the layout for a game or app, and developers then define physics and movement for them.
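For readers who have not touched SpriteKit, here is a tiny sketch of those pieces; the image name is a placeholder:

```swift
import SpriteKit

// Sketch: an SKScene lays out SKSpriteNodes, and SKActions move them.
class GameScene: SKScene {
    override func didMove(to view: SKView) {
        let sprite = SKSpriteNode(imageNamed: "character") // hypothetical asset
        sprite.position = CGPoint(x: frame.midX, y: frame.midY)
        addChild(sprite)
        sprite.run(SKAction.moveBy(x: 100, y: 0, duration: 1.0))
    }
}
```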

First, SpriteKit now supports moving and flipping sprites in real space. This means your sprite can have two sides as you flip it, giving it perspective even though it is flat. In the WWDC session, Apple used the example of placing floating emoji in space; as the camera moved around, they hung in place relative to the phone and to each other.
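Something like that emoji demo can be sketched with ARSKView, SpriteKit's AR counterpart. The tap handling and anchor math here are my assumptions modeled on the session, but the delegate method is the real API:

```swift
import UIKit
import SpriteKit
import ARKit

// Sketch: tap to drop an anchor half a meter in front of the camera;
// ARSKView asks its delegate for a 2D node to render at each anchor.
class EmojiScene: SKScene {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let sceneView = self.view as? ARSKView,
              let frame = sceneView.session.currentFrame else { return }
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.5  // 0.5 m in front of the camera
        let transform = simd_mul(frame.camera.transform, translation)
        sceneView.session.add(anchor: ARAnchor(transform: transform))
    }
}

class EmojiViewController: UIViewController, ARSKViewDelegate {
    @IBOutlet var sceneView: ARSKView!

    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        // Each anchor shows up as a floating emoji hanging in space.
        let label = SKLabelNode(text: "👾")
        label.verticalAlignmentMode = .center
        return label
    }
}
```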

Another interesting demonstration broke a scene into three layers spread out on a table, giving a 2D game depth and perspective in the real world. Apple also showed off putting an arcade cabinet into the room to play a game. That last one was not so practical, but it did look cool.

SceneKit Works Out the Shadows

SceneKit is Apple's 3D animation toolkit for apps. Not much is changing in SceneKit to integrate with AR, as it is already a fairly comprehensive animation kit. SceneKit is a complex API, but if you are curious, check out the documentation.

ARKit does change how the API uses camera data: it captures how the lighting behaves in your environment and passes that data to the 3D animation, so objects are displayed under the same lighting as their surroundings. This prevents AR objects from having an unearthly glow.
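In code, the light estimate arrives on each frame. A hedged sketch of applying it by hand (the helper name is mine), though ARSCNView can also handle this automatically via its automaticallyUpdatesLighting property:

```swift
import SceneKit
import ARKit

// Sketch: feed ARKit's ambient light estimate into a SceneKit light.
// ambientIntensity is in lumens; roughly 1000 reads as neutral.
func updateLighting(sceneView: ARSCNView, ambientLight: SCNLight) {
    guard let estimate = sceneView.session.currentFrame?.lightEstimate else { return }
    ambientLight.intensity = estimate.ambientIntensity
}
```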

IKEA is coming out with an app that lets you preview furniture in the room where you want it. This lighting work means you should see it as close to reality as possible. Alas, you cannot tell how comfy that chair is, but you can see whether your reading light gives it a funny color.

Glasses or Tired Arms in September?

If all these tools are limited to Pokemon Go clones and apps like IKEA's, AR is (still) not going to change the game. These are all cool, but the gimmick of running around with your iPhone or iPad as a portal will lose its novelty.

Throughout this article are some YouTube clips of demos already being made, so it does seem that developers are as excited as Apple about the possibilities.

We will have to wait and see what Apple offers this fall to know how developers will adopt ARKit. What is clear for now is that Apple has created a ton of easy-to-use developer tools. With support from Unity and Unreal, game makers are going to have fun as well.

Personally, I'd like to see a board game app use AR to put the board and pieces in front of you, with your player info still on the iPad. You could implement multi-device or pass-and-play multiplayer too.

What would be your ideal AR app for the iPad? If you feel underwhelmed, what hardware from Apple would get you excited about AR? Let us know in the comments.