I have been interested in Augmented Reality (AR) for a while now, and so far I have mostly explored its practical side by experimenting with ARToolkit. Now I have decided to do some more theoretical research, firstly to clear up in my head what Augmented Reality, Augmented Virtuality (AV) and Mixed Reality (MR) are, and then to document important AR projects for museums. The latter, as this post became 1000 words long, will be the subject of the next post.
The Augmented Reality described here refers to visual AR, whilst for audio augmented reality I have found the following publications:
- Guided by Voices: An Audio Augmented Reality System
- Hear&There: An Augmented Reality System of Linked Audio
The most comprehensive definitions of Mixed Reality, Augmented Reality and Augmented Virtuality come from Wikipedia:
Mixed Reality refers to the merging of real and virtual worlds to produce new environments and visualisations where physical and digital objects co-exist and interact in real time.
Augmented Reality is a term for a live direct or indirect view of a physical real-world environment whose elements are augmented by virtual computer-generated imagery.
Augmented Virtuality refers to the merging of real world objects into virtual worlds.
In other words: Mixed Reality is a superset of both Augmented Reality and Augmented Virtuality, covering all the cases where the user sees real and digital objects co-existing; Augmented Reality is when the user sees the real world as a background with digital objects superimposed on it, and Augmented Virtuality is the reverse situation, where the viewer sees a virtual world with some real objects in it. I must admit, once I had cleared those terms up in my mind, I felt better; like Kyle from South Park I said to myself, “You see, I’ve learned something today!” The subject of this post, however, is Augmented Reality alone.
With the definitions given above, Augmented Reality could also be taken to mean any application that uses the camera feed as its background. In most cases, though, when Augmented Reality is referred to, it is meant that the superimposed graphics and the way they appear depend on tracking: the localisation of the position and orientation of real physical objects. For example, in Figure 1 the green creature always appears on top of the black square card. If that card, a real-world object, is moved or rotated, so is the creature. There are several tracking techniques, but in this post we will focus on the two most popular approaches: marker-based and GPS/compass tracking.
A marker is usually a square black and white illustration with a thick black border and a white background. Here are some examples:
Using a camera, the software recognises the position and orientation of the marker and, according to that, creates a 3D virtual world whose origin, i.e. the (0,0,0) point, is the centre of the marker, while the X and Y axes are parallel to two of its sides and the Z axis is perpendicular to the plane formed by the X and Y axes. For the ARToolKit library in particular, one of the most popular marker-based tracking libraries, the coordinate system of the 3D world is as shown below:
The superimposed graphics are designed according to that coordinate system, so every move of the marker subsequently affects the graphics.
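To make the idea concrete, here is a minimal sketch (not ARToolKit's actual API) of how a point defined in the marker's coordinate system is mapped into camera coordinates through the transformation matrix a marker tracker estimates each frame. The matrix values below are made up for illustration:

```python
# Hypothetical marker-to-camera transform (rotation + translation), the
# kind of 3x4 matrix a marker tracker such as ARToolKit estimates every
# frame. The values are invented: the marker is rotated 90 degrees about
# its Z axis and sits 500 mm in front of the camera.
MARKER_TO_CAMERA = [
    [0.0, -1.0, 0.0,   0.0],
    [1.0,  0.0, 0.0,   0.0],
    [0.0,  0.0, 1.0, 500.0],
]

def to_camera_coords(x, y, z):
    """Map a point from marker coordinates (origin at the marker's centre,
    Z perpendicular to the marker plane) into camera coordinates."""
    point = (x, y, z, 1.0)  # homogeneous coordinates
    return tuple(sum(r * p for r, p in zip(row, point))
                 for row in MARKER_TO_CAMERA)

# A virtual object drawn 40 mm above the centre of the marker:
print(to_camera_coords(0.0, 0.0, 40.0))   # → (0.0, 0.0, 540.0)
```

Because the graphics are defined relative to the marker's axes, rotating or moving the marker changes this matrix and the rendered scene follows automatically.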
The main advantages of marker-based tracking are accuracy and stability, in the sense that, as long as the marker is clearly in the camera’s view, the scene is solid and positioned precisely. Portability is another feature, as no changes to the software are needed when the real-world environment changes. However, one major disadvantage I faced with marker-based tracking is the flickering of the superimposed graphics when the marker is viewed face-on. I tried optimisations and work-arounds to avoid it, but nothing worked; I simply should not point at the marker directly.
Another very popular tracking technique used in AR applications is the combination of a GPS and a compass, which is very common in the latest smartphones. The concept here is fairly simple: the software has some places of interest stored as if on a map (a longitude and latitude value for each of them), and since, thanks to the GPS and the compass, the software knows the user's position and the direction he or she is looking in, if any of the stored places lie in the area in front of the user, the information about each of them is displayed. For the places outside the user's field of view, arrows may appear pointing in each place's direction. An indicative example is the Nearest Tube iPhone 3GS application:
GPS/compass tracking lacks accuracy, as the GPS has a precision of about ±10 metres, as well as stability, because its functionality depends on the GPS reception of the area where it is used. However, it has the major advantage of being marker-less, and it can therefore be applied without any intervention in the real world.
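The core of the GPS/compass approach can be sketched in a few lines: compute the bearing from the user's position to each stored place and check whether it falls within the camera's horizontal field of view. This is an illustrative sketch, not code from any of the applications mentioned; the coordinates and the 60° field of view are assumptions:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
    in degrees clockwise from North."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def in_view(user_heading, target_bearing, fov_deg=60.0):
    """True if the target's bearing falls inside the camera's assumed
    horizontal field of view, centred on the compass heading."""
    diff = (target_bearing - user_heading + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

# Made-up example: user facing due North, a place of interest slightly
# further north along the same longitude.
poi = bearing_deg(51.5074, -0.1278, 51.5080, -0.1278)
print(in_view(0.0, poi))   # → True
```

Places for which `in_view` returns False would get the off-screen arrows described above, pointing in the direction given by `bearing_deg`.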
Optical vs. Video see-through
Apart from the tracking method, Augmented Reality applications are also categorised by the display they use: optical see-through and video see-through displays. An optical see-through display employs half-silvered mirror technology to allow views of the physical world to pass through the lens and graphical overlay information to be reflected into the user's eyes (Wikipedia, § Augmented Reality). An optical see-through display is usually part of a Head Mounted Display (HMD), a RoboCop-like device which allows whoever is wearing it to see the real world through the glasses but with computer-generated graphics superimposed. A video see-through display is one where the real world and the virtual world are shown in a single video stream. Nearest Tube, shown above, as well as any other AR application for handheld devices, uses a video see-through display, i.e. the smartphone's screen.
Developing AR Applications
There is a plethora of free tools that can be used to develop AR applications, and an extended list of them can be found on Wikipedia. Below are some of the suggested libraries.
- ARToolKit – A cross-platform library for the creation of augmented reality applications, developed by Hirokazu Kato in 1999. It is maintained as an open-source project hosted on SourceForge, and its newer versions, since 2007, can be bought from ARToolWorks.
- ATOMIC Authoring Tool – A cross-platform authoring tool which works as a front-end for the ARToolKit library. It was developed to enable non-programmers to create small and simple Augmented Reality applications, and is released under the GNU GPL License.
- OSGART A combination of ARToolKit and OpenSceneGraph.
- SSTT Core – Marker-based tracking whose markers are coloured, without the thick black border.
- ARKit Open-Source Augmented Reality library for iPhone.
- mixare – Open-Source (GPLv3) Augmented Reality Engine for Android. It works as a completely autonomous application and is available as well for the development of own implementations.
- NyARToolkit – an ARToolkit class library released for virtual machines, particularly those which host Java, C# and Android.
- AndAR – A native port of ARToolkit to the Android platform.
- FLARToolKit An ActionScript 3 port of ARToolKit for Flash 9 and later.
- SLARToolkit A Silverlight port of NyARToolkit.
Well, I did learn something today… like that Android is a particularly AR-friendly platform. The next post will be on Applications of Augmented Reality for Museums.