augmenting

tracking augmented reality

Archive for the ‘Notes’ Category

Presenting Augmented Reality at MEIC5


We presented our 3-month research & prototyping project in augmented reality for mobile at the Mobile Experience and Innovation Center event on Thursday, November 26, to a very receptive, mobile-aware crowd.

In short, our R&P project, partially supported by MEIC, aimed to develop a toolkit for making augmented reality applications on mobile platforms. Half of the project was an industry survey of the state of AR (who’s doing what with AR on mobile phones), and the other half was dedicated to coding and development. We worked with a very bright intern from Ryerson University, Michael Lawrie, on the development end.

Even though AR has been gathering a steady stream of attention from online advertisers and other media over the past two years, once we looked into it we found that development for both web-based and mobile phone applications is still quite difficult, and resources are limited. The open source library ARToolKit and its Flash port (FLARToolKit) have helped push development on the web end, but projects in the mobile space are, relatively speaking, rather sparse.

In particular, we were interested in making an AR application that overlays virtual objects onto public spaces (at specific locations) – this would have required both image recognition and location-based services. Developing an AR app with LBS is already possible with the iPhone SDK and other platforms (and several such apps have launched in the past months); doing image recognition on the phone, however, is not. We spent the past month developing some tools that would allow us to do image recognition on the iPhone, and while we are well aware of the examples and demos that various companies have put on YouTube, we have yet to see a compelling working application.
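For the curious, the location-based half is mostly compass math: given the phone’s position and heading, you work out the bearing to a geotagged point and map that angle to a spot on screen. Here’s a rough C++ sketch of that idea – the coordinates, field of view, and function names are ours for illustration only, not anything from the iPhone SDK.

#include <cmath>
#include <cstdio>

static const double kPi = 3.14159265358979323846;
static const double kDegToRad = kPi / 180.0;
static const double kRadToDeg = 180.0 / kPi;

// Initial great-circle bearing, in degrees clockwise from north, from the
// device's position (lat1, lon1) to a point of interest (lat2, lon2).
double bearingTo(double lat1, double lon1, double lat2, double lon2) {
    double phi1 = lat1 * kDegToRad;
    double phi2 = lat2 * kDegToRad;
    double dLambda = (lon2 - lon1) * kDegToRad;
    double y = std::sin(dLambda) * std::cos(phi2);
    double x = std::cos(phi1) * std::sin(phi2) -
               std::sin(phi1) * std::cos(phi2) * std::cos(dLambda);
    double theta = std::atan2(y, x) * kRadToDeg;
    return std::fmod(theta + 360.0, 360.0);
}

// Map a point of interest to a horizontal screen coordinate, given the compass
// heading and an assumed horizontal camera field of view. Returns false if the
// point is outside the current view.
bool screenXForPOI(double bearing, double heading, double fovDeg,
                   double screenWidth, double *outX) {
    double rel = bearing - heading;
    while (rel > 180.0) rel -= 360.0;   // normalize to [-180, 180]
    while (rel < -180.0) rel += 360.0;
    if (std::fabs(rel) > fovDeg / 2.0) return false;
    *outX = screenWidth / 2.0 + (rel / fovDeg) * screenWidth;
    return true;
}

int main() {
    // Hypothetical device position and heading, aimed roughly at a landmark.
    double bearing = bearingTo(43.6596, -79.3977, 43.6426, -79.3871);
    double x = 0.0;
    if (screenXForPOI(bearing, 150.0, 45.0, 320.0, &x))
        std::printf("draw overlay at x = %.1f px (bearing %.1f deg)\n", x, bearing);
    return 0;
}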

At MEIC5, we presented our overview of the state of AR on mobile platforms, its highlights and pitfalls, and showed the progress of our own development. Unfortunately there was no video recording. We’ve made our slides available, but the video clips embedded in the slides are not supported by SlideShare.

Here’s Mark at MEIC5:
Mark presenting @ #MEIC5

Our SlideShare deck:

Demo videos of the split-screen iPhone apps doing marker detection:

Our prototype testing for doing AR on the iPhone using NYARToolKit to detect a marker

Our prototype testing for doing AR on the iPhone using OpenCV to detect a marker
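The OpenCV prototype in the second clip boils down to a fairly standard pipeline: threshold the frame and look for dark, convex quadrilaterals that could be markers. The snippet below is not our prototype code, just a minimal sketch of the same idea against a recent OpenCV build (the threshold values and file names are placeholders).

#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

// Return the 4-corner outlines of candidate markers found in a single frame:
// dark, convex quadrilaterals above a minimum size.
std::vector<std::vector<cv::Point>> findMarkerCandidates(const cv::Mat &frame) {
    cv::Mat gray, bin;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, bin, 100, 255, cv::THRESH_BINARY_INV); // markers are dark on light

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(bin, contours, cv::RETR_LIST, cv::CHAIN_APPROX_SIMPLE);

    std::vector<std::vector<cv::Point>> candidates;
    for (const std::vector<cv::Point> &c : contours) {
        std::vector<cv::Point> approx;
        cv::approxPolyDP(c, approx, 0.03 * cv::arcLength(c, true), true);
        if (approx.size() == 4 && cv::isContourConvex(approx) &&
            std::fabs(cv::contourArea(approx)) > 1000.0) {
            candidates.push_back(approx);
        }
    }
    return candidates;
}

int main() {
    cv::Mat frame = cv::imread("frame.png");  // one captured camera frame
    if (frame.empty()) return 1;
    std::vector<std::vector<cv::Point>> candidates = findMarkerCandidates(frame);
    cv::drawContours(frame, candidates, -1, cv::Scalar(0, 255, 0), 2);
    cv::imwrite("frame_marked.png", frame);
    return 0;
}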

Thanks, MEIC!

Written by admin

December 3rd, 2009 at 11:05 am

Posted in Notes

Feasibility summary – video analysis on the iPhone


We spent a good number of hours last week looking into analyzing the video feed on the iPhone in order to do marker tracking for AR. The common knowledge among developers is that this is currently not supported by Apple, but we wanted to see what exactly would need to happen and whether there is a way around it using a combination of available techniques.

Well, in short, the answer is a big, disappointing no – at least not just yet with the SDK. There is no quick way to get at the video stream in order to analyze its frames. While image capturing provides enough data to do image analysis on a single frame, drawing on top of that image and then re-analyzing the next frame is impossible, because any elements drawn onto the video get captured as well. In addition, the frame rate would be too low to make the application feel realistic (1 fps at best).
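To be concrete about the single-frame part: once you have the raw pixels of one captured image, handing them to something like OpenCV for a single analysis pass is straightforward. The sketch below assumes a hypothetical RGBA buffer as input – it is not an iPhone SDK call, and it deliberately stops before the overlay/feedback problem described above.

#include <opencv2/opencv.hpp>
#include <cstdint>
#include <vector>

// Run one analysis pass over a captured frame supplied as a raw 32-bit RGBA
// buffer (the buffer source is a placeholder, not an actual SDK call).
// Returns the binarized image a marker detector would work from.
cv::Mat analyzeCapturedFrame(const std::uint8_t *rgba, int width, int height) {
    // Wrap the existing buffer without copying it.
    cv::Mat frame(height, width, CV_8UC4,
                  const_cast<std::uint8_t *>(rgba));
    cv::Mat gray, bin;
    cv::cvtColor(frame, gray, cv::COLOR_RGBA2GRAY);
    cv::threshold(gray, bin, 100, 255, cv::THRESH_BINARY_INV);
    // Anything drawn back on screen would become part of the *next* capture,
    // which is the feedback problem described above, and capture itself is
    // far too slow (~1 fps) for live tracking.
    return bin;
}

int main() {
    // Stand-in for one captured frame: a blank 480x320 RGBA buffer.
    std::vector<std::uint8_t> buf(480 * 320 * 4, 255);
    cv::Mat bin = analyzeCapturedFrame(buf.data(), 480, 320);
    return bin.empty() ? 1 : 0;
}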

Our assumption is that the marker-tracking iPhone apps that may be floating around out there (as we’ve seen in the ARToolKit 4.4 demo video) must be using non-SDK code, i.e. a jailbroken method.

AR using only location-based data, however, is available in the SDK and ready to roll. The iPhone 3.1 SDK added the ability to draw sprites on top of the video, and we’ve already seen way-finding applications in the App Store (see the now-available apps in the previous post).

Also, a techy read:  Why 3d markerless tracking is difficult for mobile augmented reality

Written by admin

September 21st, 2009 at 12:02 pm

Posted in Mobile, Notes

There are inspirations, but then there are also other things


Our path into the world of augmented reality probably began the way it does for most curious coders out there: it started with a link to some YouTube video and a “hey, that’s cool, let me figure out how they did that,” which naturally led to Googling, finding some open source code, staying up late, opening up 20 more browser windows, etc.

Wait, actually, maybe it was seeing that GE ad… or was it that Japanese video… or was it this blog post?

Anyway, there’s a lot out there, and it’s steadily gaining street cred despite the gimmicky nature of the 3D graphics.  Now that the venerable iPhone is about to get some AR applications in the App Store (they said September, right?), the time seems right to pay attention to this thing.  We’re starting this website as part of our R&D effort to collect resources and notes and to track the world of augmented reality development, on mobile platforms and otherwise.  Who knows, it might be useful.

Written by admin

September 2nd, 2009 at 12:48 pm

Posted in Notes
