Monday interview: OMG's Julian Morris on motion capture advances, camera tracking and, er, road signs

Andy Serkis might not have got his Oscar for playing Gollum, but there’s no doubt that motion capture technology has well and truly arrived in Hollywood. And in games too, for that matter – nowadays no self-respecting football game comes without an array of famous players whose dribbles, flicks and vicious-testicle-stamps have been motion captured for inclusion.

OMG is one of the leading names in the motion capture industry, through its Vicon subsidiary which supplies kit to all manner of studios and game developers. I talked to deputy chairman Julian Morris about how the technology is evolving, and how it’s being developed for use in other unexpected areas.

OMG started in 1984, initially using its Vicon technology for medical purposes, such as analysing the gait of children with cerebral palsy. To put it in layman's terms, motion capture involves attaching markers to someone's body which can then be used to track and model their movements in a computer (Wikipedia says it better and at more length).

Around 1995, OMG realised that there might be applications for this technology in the movie business, not to mention console games – anywhere people wanted digitally-animated characters that moved more realistically. The rest is history (and countless promotional shots showing famous footballers covered in ping-pong-ball motion markers).

According to Morris, the biggest developments in motion capture technology in recent years have been increases in the resolution and processing power of the cameras used in the process.

“The resolution has driven down the size of the markers that we need to place on the actor,” he says. “They started as 25-millimetre ping-pong balls, but now when we’re doing facial motion capture, we put tiny little markers that are a couple of millimetres in diameter on their face.”

OMG’s cameras also now have internal processors, which means the results can be seen in near real-time during filming, allowing directors to review performances as they are shot, rather than waiting through several hours of number-crunching to discover that the actor made a wrong move. The other benefit is the ability to have several actors in one shot.

“In the early days, it was one actor performing, but now we can have five or even ten people acting as a group,” says Morris. “That’s great for sports and fighting games, while in animated films like Beowulf it’s essential to capture them simultaneously, not just their bodies, but their facial movements too.”

With the boom in animated films – and the need for animated characters even in live-action films (think Gollum, King Kong…) – being able to do motion capture is becoming a specialised skill for actors. But it hasn’t killed them off.

“In the early days, a lot of people predicted that computer graphics and motion capture would completely replace human actors,” says Morris. “Almost the reverse has happened in both games and films. The kudos of a star doing the motion capture is important, for example. And the motion capture has become so subtle that no amount of automation can completely replace a human actor.”

One of the interesting things about OMG is the crossover between its different areas. For example, the smaller markers came out of demand from medical customers who wanted more accuracy in the way body movements were analysed – but then the film and games customers realised they could use them too.

This means OMG’s tech is also being used in other new areas. For example, the company developed camera-tracking technology which matches the movement of a computer-generated character to the movement of a real-life camera. In other words, if you slap a 3D digital character into a film scene where the camera moves around, camera-tracking tech ensures their perspective and scale don’t go to pot. This camera tracking technology has applications in other areas too.

“It turns out various other people can use the information you get out of a moving camera,” says Morris. “For example, if UAVs are flying along and looking at the ground, you can work out where the aircraft is using GPS, but if you want to work out where the camera was pointing to get precise measurements or model the landscape, you need to do the same mathematics as these camera tracking projects in the visual effects industry.”

That’s fairly technical, but there could be other applications. Intriguingly, OMG has just started a new business using similar technology for a more consumer-relevant purpose: monitoring the huge number of roadside objects such as road signs, speed limits, cameras and so on.

“It’s put there by lots of different agencies, mostly public but some private, and there’s no central database of what all this stuff is and where it is,” says Morris. The new division, Geospatial Vision, is basically putting vans with advanced cameras on board onto the roads, driving around recording and automatically recognising all this stuff.

“The previous method of collecting this information was to have people walking the road with tape measures and clipboards,” says Morris. “They stopped 10-15 years ago because the casualty rate was so high! It was apparently more dangerous than being in the armed forces.”

From Gollum to Gatso cameras in a few easy moves – who’d have thought it?

Stuart Dredge
For latest tech stories go to TechDigest.tv