Simple image-based lighting for AR
In this experiment we looked at how to make 3D augmentations look more realistic. In many AR applications it is easy to tell the real world and the virtual add-ons apart: the virtual objects are often much sharper, brighter, and more colorful than the real world, and in almost all work done so far (in the research community), light from the real world does not affect the virtual objects.
In the real world, objects receive light from light sources directly (e.g. the sun, lamps, nearby lava pools) and indirectly (via mirrors, other objects, etc.). Physical objects can also block light, so a virtual object should be able to receive a (hard or soft) shadow as well.
More advanced research has been done to allow real light to affect virtual objects (see for example this and this paper), for instance by analysing the light received by the camera and splitting it into a) the components that make up the native color of the object, and b) the color of the light that the object has received.
Since most of our applications should be able to run on mobile devices, we adopted a much simpler approach: in the video below we take the image from the camera, blur it to a greater or lesser degree, and use a special form of reflection mapping to add this ‘light’ to the object.
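As a rough sketch of the idea (in plain Python rather than shader code, and with illustrative function names that are not from the original experiment), the two steps are: blur the camera frame, then sample the blurred frame per fragment via sphere mapping, using the view-space normal as the lookup direction, and mix the result into the object's own color:

```python
def box_blur(img, radius):
    """Separable box blur on a 2D grayscale image (list of lists of floats)."""
    h, w = len(img), len(img[0])
    tmp = [[0.0] * w for _ in range(h)]
    for y in range(h):                      # horizontal pass
        for x in range(w):
            lo, hi = max(0, x - radius), min(w - 1, x + radius)
            tmp[y][x] = sum(img[y][lo:hi + 1]) / (hi - lo + 1)
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):                      # vertical pass
        for x in range(w):
            lo, hi = max(0, y - radius), min(h - 1, y + radius)
            out[y][x] = sum(tmp[i][x] for i in range(lo, hi + 1)) / (hi - lo + 1)
    return out

def sample_sphere_map(env, normal):
    """Classic sphere-map lookup: map the view-space normal's x/y
    components from [-1, 1] into texture coordinates."""
    nx, ny, _ = normal
    h, w = len(env), len(env[0])
    u = min(w - 1, max(0, int((nx * 0.5 + 0.5) * (w - 1))))
    v = min(h - 1, max(0, int((ny * 0.5 + 0.5) * (h - 1))))
    return env[v][u]

def lit_color(albedo, env, normal, strength=0.5):
    """Mix the object's own color with the sampled environment 'light'.
    `strength` (an assumed knob) controls how much the camera image tints it."""
    light = sample_sphere_map(env, normal)
    return albedo * (1.0 - strength) + albedo * light * strength
```

In a real renderer the blur would happen on the GPU (e.g. by downsampling the camera texture and letting mipmapping do the work) and the lookup would live in a fragment shader, but the structure is the same.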
It’s not perfect, but it is certainly a step in the right direction, and further measures can be taken to increase the realism without adding much computation, for example by making sure that light from above is brighter than light from below.
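That last refinement could be as simple as weighting the sampled environment light by how far the surface normal points upward. A minimal sketch, assuming a view-space normal whose y component is in [-1, 1] and a `floor` parameter of our own invention for the minimum light from below:

```python
def sky_weight(normal_y, floor=0.3):
    """Scale environment light so up-facing surfaces receive more of it.
    Maps normal_y in [-1, 1] linearly to a weight in [floor, 1.0]:
    a surface facing straight up gets full light, one facing straight
    down gets only `floor` of it."""
    t = (normal_y + 1.0) * 0.5  # 0.0 = facing down, 1.0 = facing up
    return floor + (1.0 - floor) * t
```

The sampled light would then simply be multiplied by this weight before being added to the object's color.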
This experiment was done on a laptop with a webcam, and we’re looking forward to trying this approach with a mobile phone or tablet.