The warped world of projection mapping

17 January 2019

Media Technology students were joined today by Creative Technology’s Head of Technology Tom Burford for a fascinating session on projection mapping. Now that’s a lot of ‘technologies’ for one sentence!

Tom took us through the process of producing graphics for projection onto large and complex structures such as buildings and stage sets. The session was based around several case studies including a live demonstration using a model prepared for an event at Holland’s largest windmill.

We discussed the various methods for modelling a structure, including architectural drawings as well as measurement techniques such as photogrammetry, total station laser surveys or LiDAR. Tom mentioned how difficult this can be, particularly with older buildings that tend to have been constructed to loose tolerances, or temporary structures that don’t yet exist! Creative Technology have successfully projected onto some major landmarks and even onto a mountain made of irregularly placed foam wedges. We learned how this often involves a combination of modelling techniques and calibration, occasionally even building the set to match the projection model!

The data that is collected tends to be stored in very raw formats, and the bespoke nature of each job means that processing it involves quite manual workflows. This underlined the need for an understanding of scripting languages to ‘glue’ applications and data together. Tom described the UV unwrapping tools they use to transform three-dimensional objects into two-dimensional mesh textures that can be projected. This involves some serious trigonometry, as well as consideration of the resulting light levels and dot pitch across the structure. Because the surfaces vary in distance and depth from the projector, a linearity adjustment needs to be made to account for this, alongside traditional keystoning corrections. The surfaces also tend to contain elements with different gain and scattering characteristics, which may need to be factored into the textures to produce a consistent image. Projecting live images and data onto the BBC’s Broadcasting House during the 2015 general election could have become extremely embarrassing if any of the politicians’ faces had been distorted in the wrong way.
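To give a flavour of what that sort of adjustment might look like, here is a rough Python sketch of a per-vertex brightness compensation step. It is purely illustrative (not Creative Technology’s pipeline): the function name, the reference distance and the geometry are all made up, and it only models inverse-square falloff and the angle of incidence, not gain or scattering.

```python
# Illustrative sketch only: compute a per-vertex gain factor so that parts of
# the structure further from, or more oblique to, the projector are brightened
# to match nearer, face-on surfaces.
import numpy as np

def brightness_compensation(vertices, normals, projector_pos, ref_distance=10.0):
    """Return a per-vertex gain factor that evens out illumination.

    vertices      -- (N, 3) points on the structure, in metres
    normals       -- (N, 3) unit surface normals
    projector_pos -- (3,) position of the projector lens
    ref_distance  -- distance at which the factor is 1.0 (illustrative value)
    """
    to_projector = projector_pos - vertices
    distance = np.linalg.norm(to_projector, axis=1)
    direction = to_projector / distance[:, None]

    # Inverse-square falloff relative to the reference distance.
    falloff = (distance / ref_distance) ** 2

    # Oblique surfaces receive less light per unit area (Lambert's cosine law).
    incidence = np.clip(np.einsum("ij,ij->i", direction, normals), 1e-3, 1.0)

    return falloff / incidence

# Example: three points on a wall that recedes from 8 m to 14 m away.
verts = np.array([[0.0, 0.0, 8.0], [4.0, 0.0, 11.0], [8.0, 0.0, 14.0]])
norms = np.tile([0.0, 0.0, -1.0], (3, 1))
print(brightness_compensation(verts, norms, projector_pos=np.zeros(3)))
```

In a real workflow a factor like this would be baked into the projected texture alongside the keystone and gain corrections Tom described.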

In order to produce even coverage over a wide viewing angle, the output of several projectors usually needs to be blended. This means they have to be carefully aligned and calibrated, which is achieved using a six-point pose estimation. The offsets obtained are then used to conform the image to the structure during rendering, often in real time. Tom discussed how NVIDIA’s GTX-class GPUs have made this possible by providing CUDA cores with large numbers of floating-point ALUs that are well suited to parallelising graphics tasks. Automation tools are also starting to appear, such as cameras that can analyse a structured light pattern to calibrate an array of projectors automatically within minutes.
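The six-point pose estimation is the kind of thing you can experiment with yourself. The sketch below treats the projector as an inverse camera and uses OpenCV’s solvePnP to recover its position from six surveyed points on a structure and the pixels they map to; every coordinate and the intrinsics matrix here are invented for illustration, and this is an assumption about the general technique rather than the exact tool Tom showed.

```python
# Hedged sketch: recover a projector's pose from six known 3D points on the
# structure and the pixel positions they land on in the projector raster.
import numpy as np
import cv2

# Six reference points on the structure, in metres (e.g. surveyed corners).
object_points = np.array([
    [0.0, 0.0, 0.0], [4.0, 0.0, 0.0], [4.0, 3.0, 0.0],
    [0.0, 3.0, 0.0], [2.0, 1.5, 1.0], [1.0, 2.5, 0.5],
], dtype=np.float64)

# Where each point appears in a 1920x1200 projector raster (pixels, made up).
image_points = np.array([
    [210.0, 980.0], [1700.0, 960.0], [1690.0, 140.0],
    [220.0, 160.0], [960.0, 540.0], [610.0, 330.0],
], dtype=np.float64)

# Idealised projector intrinsics: focal length and principal point in pixels.
K = np.array([[2200.0, 0.0, 960.0],
              [0.0, 2200.0, 600.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, distCoeffs=None)
if ok:
    R, _ = cv2.Rodrigues(rvec)           # rotation from world to projector frame
    projector_position = -R.T @ tvec     # projector location in world space
    print("Projector position (m):", projector_position.ravel())
```

The resulting pose is what lets the renderer warp each projector’s output onto the shared model so the blended image lines up on the structure.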

As if those processes weren’t impressive enough, Tom left us with some thoughts on tracking moving or deforming objects. Data can be captured from winch automation or infrared cameras to track moving objects; for a Miley Cyrus concert, Creative Technology covered a giant inflatable dog in IR dots so that the projection could be remapped as the structure wobbled and deflated. The acoustics buffs in the audience were particularly interested to hear how ray tracing software can be used to calculate positional data from this input.
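As a speculative illustration of that idea (not the system used on the tour), the sketch below warps a projection mesh towards a handful of tracked marker positions using inverse-distance weighting; the function, the weighting scheme and the numbers are all assumptions made for the example.

```python
# Speculative sketch: IR markers give the current positions of a few known
# points each frame; the rest of the mesh is pulled towards them so the
# projection follows the deforming surface.
import numpy as np

def warp_mesh(mesh, markers_rest, markers_now, power=2.0):
    """Displace mesh vertices based on how the tracked markers have moved."""
    displacement = markers_now - markers_rest            # (M, 3) marker motion
    # Distance from every vertex to every marker in its rest position.
    d = np.linalg.norm(mesh[:, None, :] - markers_rest[None, :, :], axis=2)
    weights = 1.0 / np.maximum(d, 1e-6) ** power         # nearer markers dominate
    weights /= weights.sum(axis=1, keepdims=True)
    return mesh + weights @ displacement

# Example: one marker sags by half a metre as the inflatable deflates.
mesh = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
rest = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
now  = rest + np.array([[0.0, -0.5, 0.0], [0.0, 0.0, 0.0]])
print(warp_mesh(mesh, rest, now))
```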

The students were left with their eyes opened to an enthralling application of a range of technologies. Many thanks to Tom for sharing some of his experience with us.