Do you know the difference between the three Rs of rendering? We shine a light on the basics of real-time, ray-traced and rasterized rendering to break it down.
Since its release late last year, V-Ray for Unreal has completely changed the way artists and arch-viz studios can explore their scenes, breaking down the boundaries between photorealism and real-time. An update in May bolstered the product further — making it rock solid and ready to tackle tough projects — while smoothing the transition from V-Ray to Unreal.
But we often get asked questions about V-Ray for Unreal: How does it work, and why do I need it? To get to the bottom of it, Chris Nichols recorded CG Garage Episode #228, dedicated to answering some of the most important V-Ray for Unreal questions. Joining Chris were V-Ray for Unreal’s Product Manager, Simeon Balabanov, and Vice President of Product Management, Phil Miller. What came about was a fascinating and enlightening conversation covering the very basics of rendering, as well as the importance of V-Ray for Unreal.
Episode #228 of CG Garage features V-Ray for Unreal’s Product Manager, Simeon Balabanov (left), and Vice President of Product Management, Phil Miller (right).
Listen to CG Garage Episode #228 now:
In this article, we’ll share the three Rs of rendering — as discussed by Chris, Phil and Simeon in CG Garage Episode #228 — and give insight into the differences between real-time, ray-traced and rasterized rendering.
Rasterizing and ray tracing
We’ll start at the very beginning. To depict a 3D object in a 3D space (for the sake of simplicity, let’s say it’s a view from within an open cardboard box), you’ll use your chosen 3D software to create the basic box shape, apply a flat image of cardboard (a texture) to each side, and lighten or darken each side depending on where it sits in relation to your positioned light source. Your 3D software works out the color of each pixel of the box based on that light’s position, which results in a fairly realistic depiction of a 3D cardboard box.
In a game-engine workflow, this is known as rasterized rendering, and the information about whether the cardboard should be light or dark is basic direct illumination. With rasterized rendering, you’ll often need to place extra lights into a scene — sometimes below the floor line and in the corners of walls — to achieve the lightness and darkness effects you’re looking for, because light doesn’t bounce in a rasterizer as it does in real life.
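To make the idea of direct illumination concrete, here is a minimal sketch, in Python, of the kind of per-pixel shading a rasterizer performs. The function and parameter names are invented for this illustration; they aren't taken from Unreal Engine or V-Ray.

```python
def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_direct(surface_color, surface_normal, point, light_pos, light_color):
    """Lambertian direct lighting: brightness depends only on the angle
    between the surface and the light; no bounced light is considered."""
    to_light = normalize(tuple(lp - p for lp, p in zip(light_pos, point)))
    facing = max(0.0, dot(normalize(surface_normal), to_light))
    return tuple(c * lc * facing for c, lc in zip(surface_color, light_color))

# One pixel on the floor of the cardboard box, lit by a single lamp above it.
pixel = shade_direct(
    surface_color=(0.65, 0.5, 0.35),   # sampled from the cardboard texture
    surface_normal=(0.0, 1.0, 0.0),    # the floor faces straight up
    point=(0.0, 0.0, 0.0),
    light_pos=(0.5, 2.0, 0.5),
    light_color=(1.0, 1.0, 1.0),
)
print(pixel)  # brighter when the surface faces the light, black when it faces away
```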
On the other hand, with ray-traced rendering, you only need to place lights where they actually are, as the light emitted will bounce around the scene as it would in the real world, and therefore looks far more realistic. So, let’s say we take your cardboard box “room” and we want the computer to produce the true illumination from a light source. Thanks to ray tracing, the light from that source will bounce around the scene: the renderer works out where light rays intersect with the box, how much light each surface should reflect, how that light travels through the lens of a virtual camera and, finally, how the image from this camera should be displayed on your screen. The results are highly photorealistic. This bounced light is indirect illumination, more often called Global Illumination, which is typically shortened to simply “GI.”
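To see why that bounced light matters, here is an equally rough, self-contained example. The inside of the box is reduced to just two surfaces, a brightly lit floor and a wall the lamp barely reaches, and all of the numbers are made up for illustration: with direct light alone the wall stays almost black, while a single bounce off the floor brightens it noticeably.

```python
# Toy "global illumination" example: the box interior reduced to two surfaces.
floor_albedo, wall_albedo = 0.6, 0.6     # fraction of light each surface reflects
floor_direct, wall_direct = 1.0, 0.05    # direct light reaching each surface
floor_to_wall = 0.3                      # share of the floor's reflected light that reaches the wall

# Rasterized-style result: direct light only.
wall_raster = wall_direct * wall_albedo

# Ray-traced-style result: direct light plus one bounce off the floor.
bounced_from_floor = floor_direct * floor_albedo * floor_to_wall
wall_ray_traced = (wall_direct + bounced_from_floor) * wall_albedo

print(f"wall brightness, direct only:     {wall_raster:.3f}")      # ~0.03
print(f"wall brightness, with one bounce: {wall_ray_traced:.3f}")  # ~0.14
```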
Both techniques have benefits:
- Rasterized rendering is a faster way of creating 3D imagery, but it comes at the cost of quality. This makes it ideal for video games, where speed is more important than aesthetics. Unreal Engine is an example of a popular rasterized renderer.
- Ray-traced rendering is more computationally intensive and is suited to industries where quality is more important than quick interaction: VFX for film, TV and advertising, and imagery and animation for arch-viz, design and automobiles. V-Ray is a popular ray-tracing renderer.
A little history
While ray-traced rendering is used by all the major animation studios and VFX companies now, this wasn’t always the case. Just over a decade ago, ray tracing revolutionized the results that artists could achieve, and allowed “lighters” (the CG artists responsible for working out how to light scenes) to create far more shots in much less time.
Chris Nichols explained in CG Garage Episode #228: “If you were going to do a movie, you would look at how many shots you had to light, and [back then] you could only budget about 20 shots per lighter, because that's how long it took for each scene to be lit correctly. You had to fake all the lights, all the shaders, all of the stuff to make it look correct. When [artists] moved to ray-traced global illumination rendering, suddenly that budget went way up. One lighter could light 120 shots, easily, or more! You don't have to fake anything.”
Let’s summarize some of the facts:
- Ray tracing can render accurate shadows, recursive reflections, refractions, and any reflected/bounced light.
- Rasterized rendering can’t do accurate shadows, recursive reflections, refractions, or any reflected/bounced light; you have to fake these qualities or do without them.
- Ray tracing is ideal for those who care about model complexity and their shaders, and who want to keep their original geometry; the results can be scaled up to whatever is required.
- Rasterized rendering is ideal for those who can work with reduced model complexity and simplified shaders; the results will be restricted to the game engine.
What about hybrid rendering?
This hybrid approach to ray tracing, essentially “ray-tracing effects,” is now being included in Unreal Engine and Unity. These effects are applied on top of the raster engine, and the process of setting up the scene for the game engine has not changed.
Let’s get real-time
The process of a computer generating a series of images fast enough to allow for interaction is referred to as real-time rendering. The goal with real-time rendering is to match the monitor’s refresh rate so that visuals look buttery smooth: for games and VR, that means 60 Hz = 60 FPS (frames per second). (24 or 30 FPS is often referred to as “real-time,” but this technically relates to film and broadcast frame rates.)
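As a quick back-of-the-envelope check on what those numbers mean for the renderer, each frame rate implies a fixed time budget per frame:

```python
# Time budget per frame implied by common refresh / frame rates.
for fps in (24, 30, 60, 90):
    budget_ms = 1000.0 / fps
    print(f"{fps:>2} FPS -> roughly {budget_ms:.1f} ms to produce each frame")
```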
In the past, the term “real-time” only applied to game engines, but with the advent of fast ray-tracing technology — such as Chaos Group’s Project Lavina — it now applies to anything which generates images quickly and smoothly. “Real-time, technically speaking, is just the speed at which the renders can be screened,” says Simeon. “Nowadays, I believe that the threshold would be more than 30 frames per second.”
Preview the major features coming soon to Project Lavina, our groundbreaking application for 100% ray tracing in real-time.
Today, VR is still finding its feet in the videogame industry, but for architects and designers it’s already indispensable. Phil Miller explains: “Having that feeling of one-to-one scale is revolutionary. In all of the centuries of visualization, it's always been a 2D picture. But a lot of people can't relate to a 2D picture — there's perspective enforced, you don’t know what it really looks like, or where your eye level is. You don't feel like you're there. In VR, you do; you get a real sense of space.”
You can use both rasterized and ray-traced render engines to create VR experiences; rasterized engines allow for movement and interaction at the cost of realism, while ray-traced engines can display photorealistic worlds (if you don’t move anything other than your head). And this is exactly why we created V-Ray for Unreal: to help artists achieve photorealistic results with real-time workflows.
Real-time, real problems
Ray-traced and rasterized engines are two very different things, and each requires a different skill set from artists. However, V-Ray for Unreal now allows artists to get the same high-quality, ray-traced V-Ray results from within Unreal’s real-time workflow.
An artist working on a video game, for example, has to understand how to make the game assets fit certain polycounts and how to optimize textures without compromising how they look. A VFX or arch-viz artist has to make sure everything looks photorealistic and physically correct. Working photorealistically is easier by comparison since the results relate to the real world, and so this workflow is beneficial to all designers and artists.
Today, industries that typically used a ray-traced workflow, such as architecture, want to have the benefits of real-time: the ability to interact with a scene and inspect it from any angle, and the chance to experience it in VR with full movement in any direction. But they also want the incredible fidelity they’re used to from ray-traced rendering engines.
One of V-Ray for Unreal’s greatest benefits is its ability to bake light just as photorealistically in Unreal as it does in V-Ray. V-Ray Light Bake (or, more simply, V-Ray Bake) is the process of creating lightmaps using V-Ray and its GI and lighting calculations, generating accurate results in Unreal. Most importantly, it means you get the actual lights from your 3D software as you designed them. So, if you create a photorealistic environment in 3ds Max or Maya, for example, it can be baked just as photorealistically in Unreal as it is in V-Ray. What’s more, once your light is baked, you can adjust the light intensity and color (though not the lighting position). Note that if your lighting isn’t baked, it’s going to look entirely different, because your light isn’t bouncing.
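For a sense of how baking works in principle, here is a heavily simplified sketch (not V-Ray's actual implementation): the expensive GI calculation runs once, offline, and its results are stored per texel in a lightmap, so at runtime the engine only has to look up the stored lighting and multiply it by the surface's base color.

```python
def expensive_gi(texel):
    """Stand-in for the offline, V-Ray-style GI calculation (slow, run once)."""
    u, v = texel
    return 0.2 + 0.8 * u * v            # made-up lighting values for illustration

def bake_lightmap(resolution):
    """Precompute lighting for every texel and store it in a lightmap."""
    return {
        (x / resolution, y / resolution): expensive_gi((x / resolution, y / resolution))
        for x in range(resolution)
        for y in range(resolution)
    }

def shade_at_runtime(base_color, texel, lightmap):
    """Cheap runtime shading: just look up the baked light and multiply."""
    baked = lightmap[texel]
    return tuple(c * baked for c in base_color)

lightmap = bake_lightmap(resolution=4)                               # done once, offline
print(shade_at_runtime((0.65, 0.5, 0.35), (0.75, 0.75), lightmap))   # done every frame
```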
If you’re looking for photorealistic results in Unreal, it’s going to be difficult to get them without V-Ray, or without investing a lot more time. Chris explains: “You're taking artists who are used to just designing their space, not thinking about some of these concepts. They’re used to getting beautiful photorealistic images at the cost of render time. Suddenly they have to do a lot more technical work and develop a lot more knowledge and spend a lot of artist time in order to get to that level.”
And this is where V-Ray for Unreal helps. A lot.
With V-Ray for Unreal, you can now bring V-Ray scenes from 3ds Max, Maya, SketchUp and Rhino directly into the Unreal Editor — and, for the first time, you can render 100% ray-traced, photorealistic images with V-Ray directly from Unreal.
V-Ray for Unreal
Now that we’ve nailed the basics of the three Rs (rasterization, ray tracing and real-time), we can take a look at V-Ray for Unreal. As we’ve already highlighted, Unreal Engine is typically used for rasterized rendering, whereas V-Ray is a ray-tracing engine. The clever bit is that, with V-Ray for Unreal, ray-traced scenes created with 3D software such as 3ds Max, Maya or SketchUp can be imported into Unreal Engine and explored in real-time.
How does this work? V-Ray for Unreal reads the V-Ray scene, brings the geometry across, and automatically works out the nearest Unreal equivalents for the lights and materials in your scene. For the most part, the Unreal materials will look just like their V-Ray counterparts. With your baked lighting, you can explore your scene in Unreal Engine in real-time from any angle, and even take it into virtual reality.
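Purely as a conceptual sketch of what that translation step involves (none of these names or parameters come from V-Ray for Unreal's actual code; they are invented for illustration), you can think of it as walking the imported scene and mapping each material's parameters to their nearest engine equivalents:

```python
from dataclasses import dataclass

@dataclass
class SourceMaterial:          # simplified stand-in for a ray-tracer material
    name: str
    diffuse: tuple
    reflection_glossiness: float

@dataclass
class EngineMaterial:          # simplified stand-in for a game-engine material
    name: str
    base_color: tuple
    roughness: float

def translate(material: SourceMaterial) -> EngineMaterial:
    """Map each parameter to its nearest equivalent (glossiness ~ 1 - roughness)."""
    return EngineMaterial(
        name=material.name,
        base_color=material.diffuse,
        roughness=1.0 - material.reflection_glossiness,
    )

scene_materials = [SourceMaterial("cardboard", (0.65, 0.5, 0.35), 0.4)]
print([translate(m) for m in scene_materials])
```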
With V-Ray for Unreal, artists don’t have to learn a whole new workflow or become game-engine experts to get their work into a real-time format.
The GPU equation
Exploring complex rasterized scenes in real-time requires a high-performance GPU. Once a scene is loaded into V-Ray for Unreal, artists can make use of V-Ray’s ray-traced rendering to create high-quality photorealistic imagery, without having to leave the game engine. V-Ray for Unreal even links the materials in Unreal with their V-Ray counterparts, so renders generated from within Unreal look the same as they did in V-Ray.
But quality is subjective, as Simeon Balabanov explains: “I’ve had talks with customers that wanted to just do space explorations in VR. They didn't care about the materials. They just wanted to see how the shadow was cast inside a building and make sure that when they put on VR goggles and put it into human-scale they have physical accurateness for the lighting but not physical accurateness for materials. They just throw in a grey-scale material there and make the windows transparent.”
V-Ray for Unreal offers the best of both worlds: incredible quality combined with Unreal Engine’s sprightly real-time rendering.
Want to try V-Ray for Unreal for yourself? Download a free 30-day trial and jump on in.
If you have technical questions, be sure to check out our detailed V-Ray for Unreal Help docs and visit our friendly forums for the latest discussions.