Feasibility summary – video analysis on the iPhone
We spent a good number of hours last week looking into analyzing the iPhone's video feed in order to do marker tracking for AR. Common knowledge among developers is that Apple does not currently support this, but we wanted to pin down exactly what would need to happen and whether some combination of available techniques could work around it.
Well, in short, the answer is a big, disappointing no. At least not just yet with the SDK. There is no quick way to get at the video stream in order to analyze frames. Image capture does provide enough data to analyze a single frame, but the draw-then-reanalyze loop breaks down: anything drawn on top of the video gets captured along with the next frame, so analyzing that frame cleanly is impossible. On top of that, the frame rate would be far too low for the application to feel real-time (1 fps at best). A sketch of this one-frame-at-a-time loop follows below.
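For the curious, here is a minimal sketch of that capture-and-analyze loop, written in modern Swift for readability rather than the Objective-C of the post's era. It uses UIImagePickerController.takePicture(), which is the public still-capture call; analyzeFrame(_:) is a hypothetical marker-detection routine, not anything the SDK provides.

```swift
import UIKit

// Sketch of the only SDK-sanctioned loop we found:
// take a still photo, analyze it, take the next one.
final class FrameGrabber: NSObject,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    let picker = UIImagePickerController()

    func start(from host: UIViewController) {
        picker.sourceType = .camera
        picker.showsCameraControls = false   // we drive capture ourselves
        picker.delegate = self
        host.present(picker, animated: false) {
            self.picker.takePicture()        // grab the first frame
        }
    }

    func imagePickerController(_ picker: UIImagePickerController,
        didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let frame = info[.originalImage] as? UIImage {
            analyzeFrame(frame)              // hypothetical marker detection
        }
        // Each still-capture round trip takes on the order of a second,
        // which is where the "1 fps at best" figure comes from.
        picker.takePicture()
    }

    func analyzeFrame(_ image: UIImage) {
        // Marker tracking would run here on the captured pixels.
    }
}
```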
Our assumption is that any marker-based tracking iPhone apps out there (such as the one in the ARToolKit 4.4 demo video we've seen) must be using non-SDK code, i.e., built for jailbroken devices.
AR using only location-based data, however, is available in the SDK and ready to roll. The iPhone 3.1 SDK added the ability to draw sprites on top of the live video, and way-finding applications are already in the App Store (see the now-available apps in the previous post).
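As an illustration of that overlay capability, here is a short sketch, again in modern Swift rather than the era's Objective-C. The sprite asset name and its placement are placeholders; a real way-finding app would reposition the sprite from Core Location data.

```swift
import UIKit

// Sketch of the 3.1 SDK overlay capability: the camera preview stays
// live underneath while we draw our own sprites on top of it.
func presentCameraWithOverlay(from host: UIViewController) {
    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.showsCameraControls = false

    // Any UIView works as an overlay; a way-finding app would move
    // this sprite as the device's location and heading change.
    let overlay = UIView(frame: UIScreen.main.bounds)
    let sprite = UIImageView(image: UIImage(named: "poi-marker")) // placeholder asset
    sprite.center = overlay.center
    overlay.addSubview(sprite)
    picker.cameraOverlayView = overlay

    host.present(picker, animated: false)
}
```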
Also, a techy read: Why 3d markerless tracking is difficult for mobile augmented reality