Presenting Augmented Reality at MEIC5

We presented our 3-month research & prototyping project in augmented reality for mobile at the Mobile Experience and Innovation Center (MEIC) event on Thursday, November 26, to a very receptive, mobile-aware crowd.

In short, our R&P project, partially supported by MEIC, aims to develop a toolkit for making augmented reality applications on mobile platforms. Half of the project was an industry survey of the state of AR (who’s doing what with AR on mobile phones), and the other half was dedicated to coding and development. We worked with a very bright intern from Ryerson University, Michael Lawrie, on the development end.

Even though AR has been gathering a steady stream of attention from online advertisers and other media over the past two years, once we looked into it we found that development for both web-based and mobile phone applications is still quite difficult and resources are limited. The open source library ARToolKit and its Flash port, FLARToolKit, have helped push development forward on the web, but projects in the mobile space are, relatively speaking, rather sparse.

In particular, we were interested in making an AR application that overlays virtual objects onto public spaces (at specific locations) – this requires both image recognition and location-based services (LBS). Building AR apps with LBS is already possible with the iPhone SDK and other platforms (and several such apps have launched in the past months); doing image recognition on the phone itself, however, is not. We spent the past month developing tools that would allow us to do image recognition on the iPhone, and while we are well aware of the examples and demos that various companies have put on YouTube, we have yet to see a compelling working application.
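
Purely as an illustration of how the two pieces fit together, here is a minimal sketch (in Python, not our iPhone code): a location fix from LBS narrows the search down to a short list of nearby anchor points, and image recognition only runs once the user is within range of one. The anchor names and coordinates below are made up for the example.

```python
# Hypothetical sketch of combining LBS with image recognition:
# GPS picks the candidate anchors, recognition only runs when we are close.
import math

# Illustrative anchors: (name, latitude, longitude) of tagged public spaces.
ANCHORS = [
    ("Dundas Square marker", 43.6561, -79.3802),
    ("Ryerson quad marker", 43.6577, -79.3788),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def anchors_in_range(lat, lon, radius_m=50.0):
    """Return anchors close enough that marker detection is worth running."""
    return [name for name, alat, alon in ANCHORS
            if haversine_m(lat, lon, alat, alon) <= radius_m]

if __name__ == "__main__":
    # Pretend the device's LBS reported this fix.
    nearby = anchors_in_range(43.6560, -79.3800)
    print("Run image recognition for:", nearby)
```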

At MEIC5, we presented our overview of the state of AR on mobile platforms, its highlights and pitfalls, and showed the progress of our own development. Unfortunately there was no video recording. We’ve made our slides available, but SlideShare does not support the video clips embedded in them.

Here’s Mark at MEIC5:
Mark presenting @ #MEIC5

Our SlideShare deck:

Demo videos of the split-screen iPhone apps doing marker detection:

Our prototype test of AR on the iPhone, using NYARToolKit to detect a marker

Our prototype test of AR on the iPhone, using OpenCV to detect a marker
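
For the curious, here is a minimal sketch of the general contour-based approach to marker detection with OpenCV. It is not the code from our prototype, just an illustration under the assumption of a square, high-contrast marker: threshold the frame, find contours, and keep large convex four-cornered shapes as marker candidates.

```python
# Hypothetical sketch of contour-based marker detection with OpenCV --
# the general technique, not our iPhone prototype code.
import cv2

def find_marker_candidates(frame_bgr, min_area=1000.0):
    """Return 4-corner contours that look like square marker borders."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Adaptive threshold copes with uneven lighting better than a fixed cutoff.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY_INV, 11, 7)
    # OpenCV 4.x: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue  # ignore small speckles
        # A marker border should simplify down to exactly four corners.
        perimeter = cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, 0.03 * perimeter, True)
        if len(approx) == 4 and cv2.isContourConvex(approx):
            candidates.append(approx.reshape(4, 2))
    return candidates

if __name__ == "__main__":
    capture = cv2.VideoCapture(0)   # any webcam stands in for the phone camera
    ok, frame = capture.read()
    capture.release()
    if ok:
        for quad in find_marker_candidates(frame):
            cv2.polylines(frame, [quad], True, (0, 255, 0), 2)
        cv2.imwrite("marker_candidates.png", frame)
```

A full AR pipeline would then estimate the camera pose from those four corners in order to place the virtual overlay.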

Thanks, MEIC!

December 3rd, 2009

Posted in Notes
