Horizontal & vertical plane tracking

One of the main uses of augmented reality is the projection of virtual 3D objects into a physical environment. To make such an environment usable as a projection canvas, you first need to map it accurately. This can be done using the camera of a mobile phone or tablet. Depending on the capture rate, every second of footage yields roughly 30 to 60 separate images, or video frames.
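As a minimal sketch of this first step, the Python snippet below reads individual frames from a device camera with OpenCV. The camera index and frame rate are assumptions and will differ per device; it only illustrates that the footage arrives as a stream of single images.

```python
import cv2

cap = cv2.VideoCapture(0)            # default device camera; index 0 is an assumption
fps = cap.get(cv2.CAP_PROP_FPS)      # nominal capture rate, often 30 or 60

for _ in range(int(fps) or 30):      # grab roughly one second of footage
    ok, frame = cap.read()           # one video frame per iteration
    if not ok:
        break
    # each frame would be handed to the feature-point search described below

cap.release()
```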

The plane-tracking software searches every frame for distinctive visual reference points, known as feature points. These might be the contours of objects or the dividing lines between different planes. Because the same reference points reappear in frame after frame, they can be followed, or tracked, through the moving images.
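The sketch below illustrates the general idea, not the exact algorithm used by any particular AR framework: it detects corner-like feature points in one frame with OpenCV and follows them into the next frame using Lucas-Kanade optical flow. The function name and parameter values are illustrative choices.

```python
import cv2
import numpy as np

def track_feature_points(prev_frame, next_frame):
    """Detect corner-like feature points in one frame and follow them into the next."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)

    # Distinctive visual reference points: strong corners such as object
    # contours and the edges where two planes meet.
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                     qualityLevel=0.01, minDistance=7)
    if points is None:
        return np.empty((0, 1, 2)), np.empty((0, 1, 2))

    # Follow the same points into the next frame (Lucas-Kanade optical flow),
    # which is what makes them trackable in moving images.
    moved, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray,
                                                points, None)
    kept = status.flatten() == 1
    return points[kept], moved[kept]
```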

As the camera moves around, the software gathers more and more information about the spatial relationships between all of these reference points, which together form what is known as a point cloud. By combining this data with readings from your mobile device's sensors (e.g. the compass and accelerometer), algorithms identify the various horizontal and vertical planes in a space, together with how they relate to one another.
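The following sketch makes some simplifying assumptions: it takes a cluster of 3D points from the point cloud, already expressed in a gravity-aligned coordinate frame, and uses a made-up `gravity` vector standing in for the accelerometer reading. It fits a plane with a least-squares fit and labels it horizontal or vertical from the angle between the plane's normal and gravity. Function names and the tolerance value are illustrative.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal) of a 3D point cluster."""
    centroid = points.mean(axis=0)
    # The singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def classify_plane(normal, gravity=np.array([0.0, -1.0, 0.0]), tol_deg=10.0):
    """Label a plane horizontal or vertical by comparing its normal with gravity."""
    g = gravity / np.linalg.norm(gravity)
    n = normal / np.linalg.norm(normal)
    angle = np.degrees(np.arccos(abs(np.dot(n, g))))
    if angle < tol_deg:
        return "horizontal"   # normal (anti)parallel to gravity: floor or tabletop
    if abs(angle - 90.0) < tol_deg:
        return "vertical"     # normal perpendicular to gravity: wall
    return "other"
```

A real system repeats this kind of fit over many clusters of the point cloud and keeps refining the detected planes as new feature points come in.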

Horizontal plane tracking