Several natural feature tracking libraries are now available: the Qualcomm AR SDK, Metaio's Unifeye SDK, and Metaio's Junaio Glue let you register and track images on mobile devices, and HIT Lab NZ offers the OPIRA library, which provides similar functionality on the desktop (with very good development tools).
There are two main approaches to inserting a 3D object into an unprepared, markerless video stream. The first is a variant of SLAM (simultaneous localization and mapping): detect feature points in the video stream, find the ones that can be reliably matched from frame to frame, and use them to build a 3D point map that serves as a model of the environment, both for tracking and for anchoring the inserted 3D object. The best-known system in AR circles is PTAM (Parallel Tracking and Mapping). The second approach is mostly used outdoors: assume the user stays in one place and ask them to capture a panorama, which you project onto a cylinder around the user. Objects can then be inserted at coordinates on that cylinder, and the environment can be tracked as the user looks around. This is panoramic tracking and mapping (loosely, the same acronym).
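To make the SLAM side concrete: once a feature has been matched in two frames with known camera poses, its 3D position can be estimated by intersecting the two viewing rays. This is a hedged sketch (a hypothetical helper, not code from PTAM or any SDK above) using the standard midpoint-of-closest-approach formula, since in practice the two rays never intersect exactly:

```python
def sub(u, v): return tuple(a - b for a, b in zip(u, v))
def dot(u, v): return sum(a * b for a, b in zip(u, v))
def add_scaled(p, s, d): return tuple(a + s * b for a, b in zip(p, d))

def triangulate_midpoint(o1, d1, o2, d2):
    """Estimate a 3D map point from two views of the same feature.

    o1, o2 are the camera centres; d1, d2 are the ray directions from
    each camera toward the matched feature. Returns the midpoint of the
    shortest segment connecting the rays o1 + s*d1 and o2 + t*d2 --
    the kind of point a SLAM map is built from.
    """
    w0 = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # approaches 0 for near-parallel rays
    s = (b * e - c * d) / denom    # parameter along ray 1
    t = (a * e - b * d) / denom    # parameter along ray 2
    p1 = add_scaled(o1, s, d1)
    p2 = add_scaled(o2, t, d2)
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

A real system would first recover the ray directions from pixel coordinates via the camera intrinsics, and would reject near-parallel ray pairs (tiny `denom`) as unreliable, since a short baseline gives a poorly constrained depth.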
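The panoramic approach boils down to parameterizing everything around the user as a cylinder, so an object's position is just an azimuth angle plus a height. A minimal sketch of that mapping (a hypothetical helper under the assumption of a unit-radius cylinder centred on the camera, not taken from any of the libraries above):

```python
import math

def direction_to_cylinder(dx, dy, dz):
    """Map a viewing-ray direction (camera at the cylinder centre) to
    panorama coordinates on a unit-radius cylinder.

    Returns (theta, h): theta is the azimuth around the cylinder axis
    (0 when looking along +z), and h is the height at which the ray
    crosses radius 1 -- the vertical coordinate on the panorama.
    """
    theta = math.atan2(dx, dz)       # azimuth around the vertical axis
    h = dy / math.hypot(dx, dz)      # height where the ray hits r = 1
    return theta, h
```

For example, a ray straight ahead maps to `(0.0, 0.0)`, and a ray 90 degrees to the right maps to `(pi/2, 0.0)`; tracking then amounts to estimating how the camera's current view rotates within this cylindrical panorama.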
Hope this helps you get started.