InnerPiece’s Real-time Ray Tracing Technology

Prime Engine X

1. Ray tracing

Ray tracing is a rendering technique widely used in 3D animation, architectural visualization and visual effects because it can produce images with a very high degree of visual realism. It requires a great deal of computational power, because it generates effects by simulating the travel of light rays through a 3D scene.
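At its core, a ray tracer shoots a ray per pixel and finds the nearest surface it hits. As an illustration only (not Prime Engine X code), here is a minimal ray–sphere intersection in Python, the basic building block of any ray tracer:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along a ray to a sphere, or None on a miss.

    origin/direction/center are (x, y, z) tuples; direction is normalized.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c      # discriminant of the quadratic in t
    if disc < 0.0:
        return None             # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A ray shot down the z-axis hits a unit sphere 4 units away.
print(intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A full renderer repeats this test (for spheres, triangles, etc.) millions of times per frame, which is where the computational cost comes from.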

This image was produced by a ray tracing algorithm.

2. Real-time ray tracing

Because of its extremely high computational cost, ray tracing traditionally could not be used for real-time applications such as games, interactive simulations, etc. The following techniques can speed up ray tracing:

  • Algorithms: store the scene in a form optimized for ray tracing (an acceleration structure) and use better strategies for simulating light rays
  • Clustering: build a complex system consisting of multiple computers, ranging from hundreds to thousands depending on purpose and hardware
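As a sketch of the first idea: an acceleration structure such as a bounding volume hierarchy lets a ray skip whole groups of triangles with one cheap box test. The slab test below is illustrative Python, not engine code:

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does the ray touch the axis-aligned box at all?

    inv_dir holds 1/d per component (precomputed once per ray, so the
    per-box test needs no divisions). For a pure hit/miss test the ray
    direction does not need to be normalized.
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t0, t1 = (lo - o) * inv, (hi - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0
        t_near, t_far = max(t_near, t0), min(t_far, t1)
        if t_near > t_far:
            return False          # slabs do not overlap: the ray misses
    return True

# Only rays whose box test passes go on to the per-triangle tests inside.
print(ray_hits_aabb((0, 0, 0), (1, 1, 1), (2, 2, 2), (4, 4, 4)))  # True
```

In a real hierarchy these boxes nest, so a single miss can cull thousands of triangles at once.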

Disney’s animation studio used a 55,000-core system to render its animated film Big Hero 6. Their rendering solution is called Hyperion. You can learn more about it here.

But that’s an old story. It’s 2018, and the future of computer graphics is completely changing:

  • Microsoft announced the DirectX Raytracing (DXR) API, enabling real-time ray tracing features on graphics cards. Before DXR, people could already involve the GPU in ray tracing, but only through general-purpose compute, where the GPU was completely unaware of what you were trying to do. APIs like DXR provide core ray tracing features on consumer hardware, making the technique easier to implement and faster to run.
  • NVIDIA released the world’s first ray tracing GPU (the Turing architecture) after 10 years of research. This hardware is called RTX and is able to perform real-time ray tracing as well as AI denoising. See the demo video.
  • The Vulkan API is becoming more popular and is now available on most platforms, including Apple’s macOS and iOS. NVIDIA has also contributed its RTX API to Vulkan, so developers can leverage Turing’s power easily.
  • Apple has its own real-time ray tracing API.

These are some of the events that truly mark a new era of rendering realistic images in interactive environments, at low cost and available to everyone.

3. Prime Engine X

Technology pioneering is one of our principles. InnerPiece is currently researching and developing a real-time ray tracing renderer prototype. Our goal is to catch up with the world and bring more practical technology to people.

Ray-traced shadow

This effect is computed by casting a ray from the object’s surface towards the light. If the ray reaches the light unobstructed, the point is not in shadow.

And it avoids the problems of common shadow mapping techniques in traditional real-time graphics, such as shadow acne and aliased edges.
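The shadow test described above can be sketched in a few lines of Python. This is an illustrative toy with sphere occluders and made-up names, not our engine’s implementation:

```python
import math

def hit_sphere(origin, direction, center, radius, max_t):
    """Distance to a sphere along the ray, limited to max_t; None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if 1e-4 < t < max_t else None   # 1e-4 avoids self-intersection

def in_shadow(point, light_pos, occluders):
    """Cast a shadow ray from the shaded point towards the light."""
    to_light = tuple(l - p for l, p in zip(light_pos, point))
    dist = math.sqrt(sum(c * c for c in to_light))
    direction = tuple(c / dist for c in to_light)
    # Any hit closer than the light means the point is shadowed.
    return any(hit_sphere(point, direction, ctr, r, dist) is not None
               for ctr, r in occluders)

light = (0, 10, 0)
blocker = [((0, 5, 0), 1.0)]     # a sphere between the point and the light
print(in_shadow((0, 0, 0), light, blocker))   # True
print(in_shadow((5, 0, 0), light, blocker))   # False
```

Because the test asks the scene directly whether the light is visible, the result is exact at every pixel, with no shadow-map resolution to tune.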

More images of ray-traced shadows to demonstrate the accuracy:

The section above showed our shadow implementation with ray tracing. Further improvements to this technique will include soft shadows.

Planned techniques

We will add more effects to our real-time ray tracing renderer and post updates on this page. Below are our references.


Reflection

Screen-space ray-traced reflection (SSR) has been used in real-time graphics for years, but its downside is the lack of off-screen detail: if something is behind the camera (viewer), for example, it cannot be reflected. Our engine will account for these details and render true reflections.
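For illustration, a true ray-traced reflection simply mirrors the incoming ray about the surface normal and traces the new ray into the full scene, so visibility does not depend on what the screen currently shows. A minimal Python sketch (not engine code):

```python
def reflect(d, n):
    """Reflect direction d about unit normal n: r = d - 2 (d.n) n."""
    dn = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dn * b for a, b in zip(d, n))

# A ray going straight down bounces straight back up off a floor plane...
print(reflect((0, -1, 0), (0, 1, 0)))   # (0.0, 1.0, 0.0)
# ...and a grazing ray keeps its horizontal motion. The reflected ray is
# then traced against the whole scene, so even off-screen or behind-camera
# geometry can appear in the reflection.
print(reflect((1, -1, 0), (0, 1, 0)))   # (1.0, 1.0, 0.0)
```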

Ambient Occlusion

This effect calculates how an object should look under ambient lighting, by estimating how much of the surroundings is blocked by nearby geometry.
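A common ray-traced way to estimate it is to cast many rays over the hemisphere above a point and measure the fraction that escape without hitting anything nearby. The following Python sketch is illustrative only, with made-up scene data:

```python
import math, random

def sphere_blocks(origin, direction, center, radius, max_t):
    """True if the ray hits the sphere within max_t."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < max_t

def ambient_occlusion(point, normal, occluders, samples=256, max_t=10.0):
    """Fraction of the hemisphere above `point` that is open to the sky.

    1.0 = fully lit by ambient light, 0.0 = fully occluded.
    """
    rng = random.Random(1)          # seeded for a repeatable result
    unblocked = 0
    for _ in range(samples):
        # Rejection-sample a direction inside the unit sphere, normalize it.
        while True:
            d = tuple(rng.uniform(-1, 1) for _ in range(3))
            n2 = sum(c * c for c in d)
            if 1e-6 < n2 <= 1.0:
                break
        d = tuple(c / math.sqrt(n2) for c in d)
        if sum(a * b for a, b in zip(d, normal)) < 0.0:
            d = tuple(-c for c in d)    # flip into the upper hemisphere
        if not any(sphere_blocks(point, d, ctr, r, max_t)
                   for ctr, r in occluders):
            unblocked += 1
    return unblocked / samples

open_sky = ambient_occlusion((0, 0, 0), (0, 1, 0), occluders=[])
shadowed = ambient_occlusion((0, 0, 0), (0, 1, 0), occluders=[((0, 2, 0), 1.5)])
print(open_sky, shadowed)   # 1.0, and a noticeably lower value for the
                            # point sitting under a large sphere
```

The more of the hemisphere that is blocked, the darker the point appears, which is what gives contact areas and crevices their characteristic soft darkening.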

Remember to subscribe to get the latest news about InnerPiece’s real-time ray tracing technology.