Real-time ray tracing in Unreal Engine 4 will enable future games to become even more cinematic.
First released in 1998 by Epic Games, the Unreal Engine was the game engine developed for the first-person PC shooter Unreal.
Other software developers were so impressed with the technology that Epic Games began licensing the engine, and by the end of the following year approximately sixteen other games were being developed with the Unreal Engine.
The current iteration is Unreal Engine 4, which was first revealed to the public in 2012. However, the engine is continually being upgraded, and two major new features have just been showcased at the Game Developers Conference.
The term ‘motion capture’ is now very common when discussing visual effects in the motion picture industry. The best-known examples include Avatar and the recent Planet of the Apes trilogy.
In those films, actors wore jumpsuits covered with special markers, with additional markers applied to their faces. Cameras positioned around the film set record the movement of these markers and send the data to a computer.
The visual effects team can then replace the actor with a digital creature that replicates the actor's movements exactly, including their facial expressions.
Then comes the process of rendering this data into a final, screen-ready animation, in which a single frame can take hours to create.
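The marker-driven pipeline described above can be sketched in a few lines of Python. This is an illustrative toy only, not how Unreal Engine or 3Lateral's tools actually work; the Joint and DigitalCharacter names are invented for this example. The core idea is simply that each captured frame maps marker names to 3D positions, and the digital character replays a frame by moving its matching rig joints.

```python
# Toy sketch of marker-based motion retargeting (illustrative only;
# real pipelines are vastly more sophisticated).
from dataclasses import dataclass, field

@dataclass
class Joint:
    name: str
    position: tuple = (0.0, 0.0, 0.0)

@dataclass
class DigitalCharacter:
    joints: dict = field(default_factory=dict)

    def apply_frame(self, marker_frame):
        """Drive each rig joint from the matching capture marker."""
        for marker_name, position in marker_frame.items():
            if marker_name in self.joints:
                self.joints[marker_name].position = position

# One captured frame: marker name -> position recorded by the cameras.
frame = {"head": (0.0, 1.7, 0.0), "jaw": (0.0, 1.55, 0.05)}

creature = DigitalCharacter(joints={
    "head": Joint("head"),
    "jaw": Joint("jaw"),
})
creature.apply_frame(frame)
print(creature.joints["head"].position)  # the rig now mirrors the actor
```

Because the same marker data can drive any rig with matching joint names, this is also why one performance can be retargeted to an actor's digital double or to a completely different creature.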
At the Game Developers Conference, Epic Games and 3Lateral demonstrated a system running on Unreal Engine 4 that takes that raw motion capture data and outputs a final rendered animation in real-time.
Epic Games stated:

"While this is a stunning proof-of-concept achievement that for now will remain in the realm of professional visual effects, someday photorealistic digital humans will be used in interactive entertainment, simulations, research, non-verbal communication as an interface with the machines, artificial intelligence and mixed reality applications as well."
Here is the first video showcasing a real-time rendered Andy Serkis.
As already explained, motion capture data can be applied to any digital creation. Here is the same Andy Serkis performance from above, but applied to a digital creature. Again, this is running in real-time.
Another demonstration that was unveiled was the use of real-time ray tracing. Ray tracing is a rendering technique in which the computer calculates how light reflects off different surfaces to generate a realistic image.
In the 1990s, ray tracing was used to create pictures of silver balls on chequered floors. This sounds unimpressive now, but at the time it was amazing to think that computers could create these stunning pictures by working out how light behaved in the scene and calculating what you could see reflected on the surface of each ball.
The only drawback is that the calculations for this kind of effect are incredibly processor-intensive, and it can take hours to render a single picture depending on the complexity.
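The basic idea can be sketched as a toy ray tracer in that same silver-ball spirit: for each pixel, cast a ray from the camera, test it against a sphere, and shade any hit by how directly the surface faces a light. This is a minimal illustration under simplified assumptions, nothing like a production renderer, but it shows why the cost scales with pixels and bounces.

```python
# Toy ray tracer: one sphere, one directional light, ASCII output.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def hit_sphere(origin, direction, center, radius):
    """Solve the ray-sphere quadratic; return nearest positive t, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width=24, height=12):
    camera = (0.0, 0.0, 0.0)
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    light_dir = normalize((1.0, 1.0, 1.0))
    shades = " .:-=+*#%@"  # darkest to brightest
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map this pixel to a point on the image plane at z = -1.
            x = 2 * (i + 0.5) / width - 1
            y = 1 - 2 * (j + 0.5) / height
            ray = normalize((x, y, -1.0))
            t = hit_sphere(camera, ray, sphere_center, sphere_radius)
            if t is None:
                row += " "  # ray missed: background
            else:
                hit = tuple(o + t * d for o, d in zip(camera, ray))
                normal = normalize(sub(hit, sphere_center))
                brightness = max(0.0, dot(normal, light_dir))
                row += shades[min(int(brightness * len(shades)), len(shades) - 1)]
        rows.append(row)
    return "\n".join(rows)

print(render())
```

Even this toy traces one ray per pixel against a single object; real scenes multiply that by millions of pixels, many objects, and recursive reflection and shadow rays, which is where the hours-per-frame cost comes from.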
An example of early ray tracing.
Software and hardware have obviously come a long way in the last twenty or so years. Epic Games, along with ILMxLAB, demonstrated a new enhancement to Unreal Engine 4.
This short clip, created using Star Wars characters, would again take hours for a computer to render, yet this incredible, life-like, entirely computer-generated animation was running in real-time.
Here is a more down-to-earth showcase of real-time ray tracing from Remedy Entertainment.
Admittedly, these demonstrations aren’t running on a normal high-end PC. The Star Wars ray tracing demonstration was running on an NVIDIA DGX, an incredibly powerful workstation that sells for $60,000!
But technology is always improving and these are the kind of amazing graphical effects that will feature on our gaming screens in years to come.