
Blender Eevee

The guide to real-time rendering with Blender 2.8

Allan Brito

Description and data


Technical info about the Book

Author: Allan Brito

Reference: blender3darchitect.com

Edition: 1st

Cover image credits: Thor Alvis @Unsplash

Licensed in public domain - https://unsplash.com/license

Blender version used in the Book: 2.80

First edition date: August 2019


Imprint: Independently published

About the author


Allan Brito is a Brazilian architect with a passion for applying
technology and open source software to design and visualization. He has been a
Blender user since 2005 and believes the software can become a great
player in the architecture and design markets.

You can find more about him and the use of Blender for architecture at
blender3darchitect.com, where he writes articles on the subject daily.
Who should read this book?
Ever since the release of Blender 2.8 and the debut of Eevee as a rendering
option, artists have been struggling with the unique challenges of a
real-time render engine. How to use probes? How to get indirect lights in
Eevee? How to fix light leaks?

If you are starting to use Blender and would like to know how to set up and
use some of the essential options of Eevee, this book is for you.

You will learn how to render and apply options to generate indirect lights,
control shadows, and handle materials.

It doesn't matter if you are starting with Blender or are an experienced artist.
The book will guide you through all the steps necessary to use Eevee in your
projects.
Foreword
When Blender Cycles appeared for the first time in the early releases of
Blender 2.6x, people got excited about the possibilities brought by an
advanced renderer. We have the same feeling now with Eevee and real-time
rendering inside Blender 2.8.

The technology behind those real-time render engines is incredible and
points to a future where we won't have to wait several minutes or hours for
a render to finish. With Eevee, we can have our images and animations
rendered in seconds or a couple of minutes.

Eevee is not a replacement for Cycles in Blender, but it points to a future
where the vast majority of projects will use Eevee instead of Cycles for
rendering.

One of the main problems with Eevee nowadays is that people using
Blender are not familiar with real-time rendering. How to use a probe? How
to get materials with the best settings?

Throughout the book, you will find a significant amount of information on
using Eevee in your projects, regardless of context and field. I will explain
with simple examples the most common workflows for Eevee, which you
can apply to your own projects later.

I hope you like the content and that it helps you understand and use Eevee
to create renders in real time!

Allan Brito
Downloading Blender
One of the significant advantages of Blender compared to similar
software is its open-source nature. You can use Blender without any
hidden costs! All you have to do is download the software and start using it.

How to download it? To download Blender, you should visit the Blender
Foundation website:

https://www.blender.org/download/

For this book, we will use version 2.80 of Blender, but the vast majority of
techniques will still work with later versions.
Chapter 1 - Rendering with Eevee and Blender
What is real-time rendering? It is a technology that debuted in
Blender 2.8 with a brand new renderer called Eevee. With Eevee, you can get
incredible rendering results with only a couple of seconds of render time.
Coming from Cycles, that is an incredible feature, and one that is hard to beat.

The way Eevee works is unique regarding the settings needed to get a scene
with realistic lights, materials, and effects.

In our first chapter, you will learn the essential aspects of rendering with
Eevee in Blender, from the creation of a single image to animation
rendering.

Here is a list of all topics you will learn in this chapter:

Understand the differences between Cycles and Eevee
How Eevee works
Rendering scenes with Eevee
Saving a static image
Working with the camera
Placing the camera in a scene
Rendering an animation
Convert image sequences to video
1.1 Real-time rendering with Eevee
The release of Blender 2.8 featuring Eevee as an option to render our scenes
in real-time is a milestone for the software and a welcome addition to any
artist using Blender. Before Eevee, we only had the option to use Cycles as
the advanced renderer for Blender.

Why is Eevee becoming so popular among artists using Blender? The main
reason for the popularity of Eevee is its combination of render quality
and speed. Imagine a render engine that can deliver realistic results in a
couple of seconds. That is what Eevee can do in Blender 2.8.

One of the secrets behind Eevee's efficiency is the use of a technique called
Rasterization. You will find Rasterization as the primary technique used in
game engines to display 3D graphics on a 2D screen. The Rasterization
process in Eevee works by projecting the faces of a model as pixels that
compose a 2D image on your screen.

Rasterization can be incredibly fast, but it is not as visually accurate as
other methods. In Eevee, you will get Rasterization using OpenGL 3.3, which
delivers incredible speeds for rendering.
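The projection step at the heart of Rasterization can be sketched in a few lines of Python. This is a simplified illustration of the idea only, not Eevee's actual OpenGL pipeline: each 3D vertex is divided by its depth and scaled to pixel coordinates.

```python
def project_vertex(x, y, z, focal=1.0, width=640, height=480):
    """Project a 3D point (camera space, z > 0) to 2D pixel coordinates.

    This is the core idea of rasterization: a perspective divide by depth,
    then scaling to the screen. Real pipelines add matrices, clipping, and
    depth testing on top of this.
    """
    sx = (focal * x / z) * (width / 2) + width / 2
    sy = (focal * y / z) * (height / 2) + height / 2
    return int(sx), int(sy)

# A point straight ahead of the camera lands at the screen center:
print(project_vertex(0.0, 0.0, 5.0))  # (320, 240)
```

Note how a point that moves farther away (larger z) projects closer to the center of the screen, which is exactly the perspective effect you see in a render.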

How to view Rasterization with Eevee in action? That is simple. You have
to enable the Rendered shading mode in your 3D Viewport in Blender
(Figure 1.1).
Figure 1.1 - Rendered Mode

If you navigate in 3D using that mode, you won't experience any delays or
hiccups. The scene will update in real time with all the effects you need to
create a rendered version at the end. That is quite different from Cycles,
where you have to wait a few seconds for the processing of an image to end
before you see a noise-free image.

Tip: You can also use the Z key to open a small menu with all the shading
options.

1.2 Comparing Eevee with Cycles


A critical difference between Eevee and Cycles, other than render times,
lies in the method used to render scenes. With Eevee, you get
Rasterization projecting 3D models onto your screen as pixels. In
Cycles, we get something called Path Tracing.

The Path Tracing method from Cycles handles light bounces around the
scene, and it produces a much more realistic result than Eevee. It has the
name Path Tracing because it works by:
1. Casting rays from the camera into your scene
2. The rays bounce around the scene
3. Each ray interacts with surfaces, materials, and textures
4. Once a ray finds a light source, it stops
5. The light contribution of each ray gets calculated every time it hits
a light source

Since it performs all those calculations based on the path a light ray takes
through the scene, it receives the name Path Tracing (Figure 1.2).
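The bounce-until-light logic of the five steps above can be sketched as a toy Python function. This is a deliberately minimal illustration that ignores geometry entirely; it only shows the attenuate-per-bounce, stop-at-light control flow, not how Cycles is actually implemented.

```python
import random

def trace_ray(surfaces, max_bounces=8, seed=1):
    """Toy sketch of path tracing for a single camera ray.

    `surfaces` is a list of (albedo, emission) pairs; emission > 0 marks
    a light source. Real path tracers intersect actual geometry; here we
    just pick a random surface per bounce to show the logic.
    """
    rng = random.Random(seed)
    throughput = 1.0                              # energy the ray can still carry
    for _ in range(max_bounces):
        albedo, emission = rng.choice(surfaces)   # the ray "hits" a surface
        if emission > 0:
            return throughput * emission          # ray found a light: stop
        throughput *= albedo                      # bounce: surface absorbs some light
    return 0.0                                    # never reached a light: no contribution
```

A ray that hits a light immediately returns the full emission, while a ray that only bounces between dark surfaces contributes nothing, which is why Path Tracing needs many rays per pixel and why it is slow compared to Rasterization.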

Figure 1.2 - Path Tracing diagram

The method is accurate for calculating realistic lights and materials, but it
is slow to process. With modern hardware, we can get much faster render
times, but it is still not real-time.

Since Cycles bases all those calculations on lights, it has a clear
advantage in something critical for realism: shading. A lot of
effects will look better with Cycles, such as:

Ambient Occlusion
Global Illumination
Reflections
Transparency
Refraction
Shadows
Subsurface Scattering
Volumetrics
Motion Blur
Depth of Field

All those aspects of a scene will look accurate and based on physics with
Cycles. Keep in mind that looking better doesn't mean you have to put
Eevee aside and only use Cycles. The accuracy of Cycles comes with a cost
in render time.

Do you want to see a quick example of how Cycles handles some aspects of a
render better? Look at Figure 1.3.
Figure 1.3 - Comparing shadows

On the top, we have a scene using Cycles and a realistic projection of
shadows. The image below uses Eevee for shadows, and without any
tweaks, it clearly shows some pixelation in the shadows.

The image using Cycles took 45 seconds to render. With Eevee, that render
time was 1 second.

That is one example of how Cycles can produce better results for a
particular project.

1.3 When should you use Cycles?


Even though Eevee is a great option for creating realistic images with
Blender, you will eventually have to move away from Eevee and use Cycles.
The main question is: when should you make the move?

If you get a project that requires accurate lights and physically based
calculations to produce high-quality images, you should move to Cycles. One
of the points where Eevee will struggle is advanced materials. In
scenes that require effects such as subsurface scattering or complex
refractions, you will see significant improvements by using Cycles.

In most projects, you will be able to get incredible results with Eevee and
save an impressive amount of rendering time. Here are a few types of
projects where you will probably get better results with Cycles:

Cinematic animation for TV or cinema
Marketing materials for large print formats
VFX for video
Those types of projects usually require a high level of image quality
and will benefit from Cycles' accuracy. Be ready to invest in some
top-of-the-line hardware to handle large amounts of data and processing.

1.4 Rendering with Eevee from the 3D Viewport


Using the shading modes in the 3D Viewport header is the fastest way to
start a render with Eevee, and you will be able to see the results in real
time. But what if you want to save the results of the render you have in the
3D Viewport?

You can start a render by pressing the F12 key, but it will only display
what the active camera is viewing, not necessarily what you see in the 3D
Viewport.

To render any angle from the 3D Viewport, regardless of the cameras, you
will have to use the View → Viewport Render Image menu (Figure 1.4).
Figure 1.4 - Render the viewport

It will start a render with everything you see at the moment in the 3D
Viewport. That is a great way to produce a quick render using Eevee.

The downside of this option is that it will also get elements that usually
wouldn't appear in a render. For instance, if you are in Edit Mode working
on a modeling project and want a snapshot of the object, you will get
vertices, edges, and faces (Figure 1.5).

Figure 1.5 - Snapshot from the Viewport

For artists trying to build a visual record of their 3D modeling progress,
that is a great option, and it avoids the need to take screenshots with
third-party software. You can make the screenshots in Blender.

What if you have an animation? You also have the option Viewport Render
Animation that will process all the frames from an animation. It will use the
data you have on the project like keyframes, framerate, and objects.

It is essential to do two things before rendering an animation:

Save your Blender file
Set the output directory in the Output tab at the Properties Editor

To make it easier for you to find the rendered animation, it is always a good
practice to save the Blender file in a unique folder. Make another folder
there to keep your renders.

From the Output tab, you can also pick the format in which you want to
save your animations rendered with Eevee.

1.5 Saving renders from Eevee


Regardless of the method you use to start a render with Eevee, at some
point you will have to save the results to disk. How do you save your
renders from Eevee? In the Output tab of the Properties Editor, you will find
all options related to saving your renders, from the default output folder to
the format used to save the images (Figure 1.6).
Figure 1.6 - Output tab settings

There, you should keep the file format as PNG and avoid saving
your images as JPG. The main reason is to keep your images at
the highest possible quality. Every time you save a file as a JPG, you
apply some level of compression to that image.

Whenever you save an image file, the software applies compression to
reduce the file size. Nowadays, we have two main types of compression:

Lossy: To make your files even smaller, the compression discards
parts of the image data. That is what you will find in JPG files.
Lossless: The goal of this compression is to preserve the data as much as
possible, at the cost of bigger files. You will find this compression in PNG
files.
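The lossless guarantee is easy to demonstrate. PNG stores pixel data with the DEFLATE algorithm, the same lossless codec implemented by Python's standard zlib module, so a small sketch can show that every byte survives a compress/decompress round trip:

```python
import zlib

# Stand-in for raw image data; PNG compresses pixels with DEFLATE,
# which zlib implements.
pixels = bytes(range(256)) * 64
packed = zlib.compress(pixels)

# Lossless: decompressing gives back exactly the original bytes.
assert zlib.decompress(packed) == pixels
print(f"{len(pixels)} bytes -> {len(packed)} bytes")
```

A lossy codec such as JPEG has no equivalent guarantee: the decoded pixels are only an approximation of the original, and each re-save degrades them further.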

An optimal workflow to save your renders from Eevee is to get them
in PNG first and convert the files to JPG later. You can keep the PNG
file as a source for post-processing and for generating lower-resolution
versions. If you save the project straight to JPG, you start
losing data immediately.

Besides keeping the images in PNG, you should also use the RGBA color
mode. RGBA means you will use colors in RGB plus an Alpha
channel. With an alpha channel, you will be able to get transparent
backgrounds for the render.

That is perfect for compositing and for adding the render results to other
projects. To get a render with a transparent background, you have to enable
the Transparent option in the Film settings of the Render tab
(Figure 1.7).
Figure 1.7 - Transparent option

After you render the project, you will see the transparency as a squared
pattern in the background of your render (Figure 1.8).
Figure 1.8 - Render with transparency

What if you forget to set the image as a PNG file using the RGBA format?
At the save dialog in Blender, you will be able to change the format and
settings on the left side. You can easily change the image settings using
those options.
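If you prefer to set these options once from a script, the same settings are exposed through Blender's Python API. This is a sketch for Blender 2.80; the bpy module only exists inside Blender, so run it from the Scripting workspace or the Python console there:

```python
import bpy  # available only inside Blender

scene = bpy.context.scene
scene.render.engine = 'BLENDER_EEVEE'            # render with Eevee
scene.render.image_settings.file_format = 'PNG'  # lossless format, as recommended
scene.render.image_settings.color_mode = 'RGBA'  # keep the alpha channel
scene.render.film_transparent = True             # transparent background (Film section)
```

Setting these in the startup file (File → Defaults → Save Startup File) means every new project begins with the recommended output settings.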

Info: The save dialog appears after you render a scene; from the
window where your image appears, choose the Image → Save As…
option.

How do you save the file? If you trigger a render from Eevee, you will see the
results in the Image Editor. In the editor, you will see a menu called
“Image,” and there you have the option “Save As…”. Pick that option to
save the render to disk.

1.5.1 Choosing what to render


The first time you start rendering projects in Blender, you may find
something strange. You set up the view of a scene with the navigation
shortcuts, start a render, and get a completely different view of the
scene.

To choose what a render will show, you must set what the active
camera in Blender is viewing. You can have multiple cameras in a
Blender scene, but only one of them will be the active one (Figure 1.9).
Figure 1.9 - Multiple cameras

How to find the active camera? You will identify the active camera with a
visual cue. At the top of the camera icon, you will see a filled triangle. All
other cameras will have the same triangle, but only with an outline (Figure
1.10).
Figure 1.10 - Active camera

To make any camera active, select the camera and press
CTRL+Numpad 0. Once you have an active camera set, the render will show
what that camera is looking at.

Info: If you are using a computer that doesn't have a Numpad, you can go
to Edit → Preferences and, in the Input tab, enable “Emulate
Numpad.” That will make your alphanumeric keys work like the Numpad.

The easiest way to change and place the camera is with a shortcut. With
CTRL+ALT+Numpad 0, you can align the active camera with your current view
of the scene. Use the 3D navigation shortcuts to frame your view
and press the shortcut. The active camera will align with your view.

After you have the active camera in place, select the border of the camera,
and make further adjustments with the G and R keys (Figure 1.11).
Figure 1.11 - Camera border

You can even emulate a dolly movement for the camera by moving it along its
local Z-axis. Press the G key and then the Z key twice. That will move the
camera along the local Z-axis and emulate a dolly movement.

1.5.3 Camera settings for Eevee


When you have a camera selected in Blender, the settings and options for
that object become available in the Object Data tab. Look at the
Properties Editor, and you will see the options to change and adjust the
camera (Figure 1.12).
Figure 1.12 - Camera settings

Among the most useful options, you will find the focal length of the
camera. For artists with a background in photography, the focal length
will look familiar. It regulates how wide the view from the camera
will be.

The default value for the focal length in Blender is 50mm, which won't
give a broad view of the scene. To get a wider perspective, you will
have to reduce it to values around 20mm. Keep in mind that
using values lower than 18mm might give you an ultra-wide view at the cost of
distorting objects at the borders of your render (Figure 1.13).
Figure 1.13 - Comparing focal distances
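For readers comfortable with scripting, the active camera and focal length can also be set through bpy. This sketch is for Blender 2.80, runs only inside Blender, and assumes the default camera object named "Camera":

```python
import bpy  # run inside Blender

cam_obj = bpy.data.objects["Camera"]  # assumes the default camera name
bpy.context.scene.camera = cam_obj    # make it the active camera (like CTRL+Numpad 0)
cam_obj.data.lens = 20.0              # focal length in mm: wider than the 50mm default
```

Driving the focal length from a script is handy when you want to animate a zoom or apply the same camera setup to several scenes.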

Besides the focal length, you can also use settings to
enable some helpers for your project. With the Composition Guides, you
can display lines in your camera view that will help you make a better
composition.

How do you use them? For instance, you can turn on the “Rule of Thirds” to
display guides at strategic locations in your camera view. The rule states
that you should place essential objects where the lines cross. That
will make the objects appear more visually important in your image
(Figure 1.14).

Figure 1.14 - Rule of thirds

Use those rules to help you make better renders by finding optimal
locations to frame your objects.

1.6 Rendering animations from Eevee


The process of rendering animations from Eevee looks a lot like what you do
with Cycles, but with a significant advantage: since Eevee renders in
real time, the animations will be ready in a much
shorter time. Before we start rendering animations, it is crucial to check a
few details about your scene and the format used to save the animation.

You have two main options to save animations from Eevee:

Save as a sequence of images
Save to a video container

What is the best option? To save time and keep multiple post-processing
options open, you should always prefer the image sequence. The main reason
to pick the image sequence option is to save the files in PNG.

Once you save the animation as a sequence of PNG files, you can later use
Blender to transform the images on any video container quickly. For
instance, you can save a sequence of PNG files and then make Blender
process them to become an MP4 file.

Having the animation as a sequence of images has other advantages:

You can render in multiple steps by choosing different ranges of frames
You get to keep the source files with lossless compression
You can use the PNG files for all types of compositions and to
work on video timing

If you decide to render the project straight to a video container like an MP4
file, you will have fewer options to work in post-production.

1.6.1 Saving as an image sequence


Regardless of the format you wish to use for your animations, you must
always save the Blender file before trying to render an animation. By
saving the file, you will have more freedom to pick a folder for the
animation, and you avoid potential data loss from crashes.

To save your animations as an image sequence, you must use the Output tab
in the Properties Editor (Figure 1.15). There you can choose the folder
where Blender will save the files. Below the folder settings, you can pick
the format used to save each image. As mentioned before, always use PNG
to save your animation, unless you desperately need to save on file size
and have to use JPG instead.
Figure 1.15 - Output tab

What happens when you start rendering an animation as an image
sequence? After you trigger the animation render with the CTRL+F12 keys
or the Render → Render Animation menu, your output window will
show each frame. Once Blender starts displaying the frames, it is just a
matter of waiting until you get all the images from the animation.

In the output folder, you will see the image sequence using the frame
numbers as names. If you want to add your own text to the filenames, you can
type it in the output path settings (Figure 1.16).

Figure 1.16 - Image file names

Use the filename to make different versions of a shot or animation.
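The naming scheme is easy to reproduce: Blender appends a zero-padded frame number to whatever text you type in the output path. This small Python sketch mimics that pattern (it is an illustration of the naming convention, not Blender code):

```python
def frame_filename(prefix, frame, ext="png"):
    """Mimic Blender's image-sequence naming: prefix + 4-digit frame number."""
    return f"{prefix}{frame:04d}.{ext}"

print(frame_filename("shot01_", 1))    # shot01_0001.png
print(frame_filename("shot01_", 250))  # shot01_0250.png
```

Knowing the pattern helps when you later import the sequence into a video editor or a script, since most tools accept a padded-number pattern like `shot01_%04d.png`.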

1.6.2 Saving animations as video


To save your animations as video files, you have to change the settings
in the Output tab. There, change the File Format from an image format to
FFmpeg video. After you set the format to FFmpeg video, you will see a
new section below called Encoding (Figure 1.17).
Figure 1.17 - Encoding video
You can use presets to save your file to video, which will appear in the icon
right next to the Encoder label (Figure 1.18).
Figure 1.18 - Presets for video
Use the presets as a starting point and make adjustments for each video. For
instance, you will see that choosing H.264 in MP4 sets the
output quality to “Medium.” You can change that to “High quality” if you
wish.

Below your video settings, you will also see audio options.

1.6.3 Converting images to video


If you want to quickly make different video versions of an animation,
you can render the animation as images and later convert the images
to video files.

How to convert image sequences to video files?

Assuming you rendered an animation project to a sequence of images, the
next step is to process them in the Video Sequencer. In
Blender, we can use the Workspace for Video Editing. Open the Workspace
selector and go to the Video Editing → Video Editing option (Figure
1.19).
Figure 1.19 - Video Editing Workspace

The bottom half of this Workspace has the Video Sequencer, where
we can add the image sequence from your animation. It works like a video
editor inside Blender. Use the green vertical line to control the playback and
use each channel to place content (Figure 1.20).

Figure 1.20 - Video Sequencer

To add your image sequence, open the Add menu and choose
Image/Sequence. Use the A key to select all images, and you will see them
as a block of content in the Sequencer (Figure 1.21).
Figure 1.21 - Image sequence

Now, go to the Output tab and choose the settings you want to use for your
video files. You will notice that rendering an image sequence to video is
much faster than processing your scenes from 3D Data. Use the Render →
Render Animation menu to start the process (Figure 1.22).

Figure 1.22 - Rendering video from images


Set the format you wish and also the output folder. Since the process is
relatively fast, you can use multiple presets to see what best fits your needs
regarding video formats.

Whenever you have data in the Video Sequencer, Blender will render the
contents of the video channels there instead of your 3D data. The reason is
that, in the Post Processing field of the Output tab, the
“Sequencer” option is enabled. That makes Blender render content from
your video editor and ignore any 3D data.

What is next?
The first step to start using Eevee is managing to render and save images
from your 3D Viewport in Blender. After you have all the knowledge
necessary to get your work saved, it is time to start digging into how to add
lights and materials to the scene.

Eevee has particular requirements regarding lights because of how
Rasterization works. For Cycles, we usually place some key lights in the
scene and press the render button. Since it is based on physics, the
software will handle most of the hard work regarding illumination.

In Eevee, we must add some key components to help the Rasterization
process mimic how Cycles works and render our scenes in a snap.
Chapter 2 - Environment lights with Eevee
One of the most important aspects of a scene, in either Eevee or Cycles, is
the environment lights, because those lights give you a head start in
setting up realistic lights for any project. Another benefit of using
environment lights is that they provide reflections for objects with glossy
materials.

The most common type of environment texture is an HDR
image, which works great as a light source for a scene. In Eevee, you
will find that sometimes an HDR can cause problems like light
leaks. If your scene experiences such issues, you can replace the HDR
texture with an Area Light.

Of all the lights available in Blender, you will find that Area Lights give
the best results in Eevee. You can get them to work as an
environment texture or mix them with an HDR. You will learn how to
manage them and place them in the best possible locations for interior
scenes you wish to render with Eevee.

Here is a list of what you will learn in this chapter:

What are environmental textures
Apply an HDR as an Environmental texture
Control the visibility of an environmental texture
Use gradients as the main background for Eevee
Replace the environmental texture with an Area light
How to place Area Lights for interior scenes

2.1 Using Environmental textures with Eevee


The first time you open Blender and switch to the Rendered shading mode, you
will notice that Eevee displays your scene in real time, and also that the
scene appears dark. We could use some additional light in the scene to make
it look brighter. Another great way to improve the lighting of any project
that uses Eevee for rendering is with environment textures.

What are environmental textures? They offer a way to add light to the
scene using something that every project has from the very beginning: the
background of the scene.

From the background of your scene, we can add light from all directions,
which will make any project or scene look brighter immediately!

To use your background as a light source, you have to open the World tab in
the Properties Editor and look for the Surface field. There you will see a
button with a "Use Nodes" option. That means you are not using the
background to emit light yet (Figure 2.1).
Figure 2.1 - World tab

After you press the "Use Nodes" button, you will see all options regarding
the use of your background. At this point, we have two main options for the
background:

Use color to add lights to the scene
Use a texture that will provide real-world reflections and natural lights

If you want to use a solid color, the Color and Strength fields will be
enough for that task. However, for a more complex and realistic light setup,
you will need environmental textures associated with the Color field.

At the right of your Color field, you will see a button with a dot in the
middle. Click on that button and choose Environment Texture (Figure 2.2).
Figure 2.2 - Environment texture

When you add an Environment texture to the color, you will be able to pick
an image texture to use in your background. You can use any texture for
that task, but to have better results and natural lights, you will want to use
an HDR map.

An HDR map is a particular type of image that can store vital
information regarding lights. With it, you will be able to mimic the lighting
of the environment where the image was captured. For instance, if you get
an HDR map from a location with a bright daylight scene, it will
generate that same effect in Eevee.

Another exciting aspect of HDR maps is that most of them come in a
format called "Equirectangular." An image in this format is a flat
representation of the full sphere of directions around a point. You can map
it onto a shape like a sphere, cylinder, or cube.

The main benefit of using such images in Eevee is that you will get lights
and reflections from all directions. In Figure 2.3, you can see an example of
an HDR map with an equirectangular projection.

Figure 2.3 - Equirectangular HDR

You don't have to change any settings in Blender to use these
types of maps. That is the default format Blender uses to apply maps to the
background.
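The whole setup described in this chapter so far — enabling nodes, adding an Environment Texture, and loading an HDR map — can also be sketched with Blender's Python API. This targets Blender 2.80, runs only inside Blender, and the file path is a placeholder for your own HDR:

```python
import bpy  # run inside Blender

world = bpy.context.scene.world
world.use_nodes = True                        # same as pressing "Use Nodes"
nodes = world.node_tree.nodes
links = world.node_tree.links

env = nodes.new("ShaderNodeTexEnvironment")   # the Environment Texture node
env.image = bpy.data.images.load("/path/to/your_map.hdr")  # placeholder path
bg = nodes["Background"]
links.new(env.outputs["Color"], bg.inputs["Color"])
bg.inputs["Strength"].default_value = 1.0     # light intensity from the map
```

Scripting this is useful when you want to try several HDR maps against the same scene: swap the path, re-run, and compare renders.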

2.2 Using an HDR map


How do you use an HDR map in Blender Eevee? After you add the environment
texture to the color, you can use the "Open" button to pick an HDR texture
from your disk. There are several free online libraries that offer
public domain HDR maps. One of the most used options for Blender is
HDRI Haven (https://hdrihaven.com).

There you will be able to pick a map and download high-quality textures
for your projects using Eevee. For instance, we can use the map shown in
Figure 2.4 as the background of our scene.

Figure 2.4 - HDR map for Eevee

A key element of each HDR map is how the light works in it. Figure 2.5
shows a preview of two different HDR maps. Each one of them has a unique
light behavior.
Figure 2.5 - HDR Preview

Notice how one of the maps has a strong light source direction and
hard-edged shadows. The other map has soft-edged shadows and a more
scattered light pattern.
Even though an HDR map can suggest those kinds of shadows, at this
moment Eevee cannot cast shadows from HDR maps alone. You will have to
add auxiliary lights to create shadows.

You should pick the HDR for your project based on the light you want in the
scene. What happens after you add an HDR to a scene in Eevee? You will see
an immediate benefit from that light source in the project (Figure 2.6).

Figure 2.6 - Scene with HDR

Besides getting a more natural color for the project using an HDR map, you
will also get the texture appearing in the background. In some cases, that
could be a benefit for composition purposes. For instance, you can get the
background with the sky and clouds for a project.

In other cases, you may need to find a way to hide the HDR in the
background. The HDR may give you an excellent lighting effect but not
align with the camera angle for a particular scene.

Info: With an HDR map alone, you won't get shadows in Eevee, but you can use
it together with other lights like an Area or Sun light.

2.2.1 Hiding the HDR map


Using an HDR map in the background for a scene in Eevee will give you a
significant effect for any scene in Blender. After you add the HDR image to
the background, it will start to appear in your scene and cover the entire
backplate of the render.

Is there a way to hide the HDR? Unfortunately, we don't have a dedicated
control for HDR visibility in Eevee, but we have two options that can help
hide the texture from your renders.

The first one involves the use of an object to block the visibility of your
texture in the background.

The technique consists of:

1. Adding a sphere to the scene and using the scale to make it big
enough to surround the entire scene
2. In Edit Mode, change the normals of every face of your sphere to
point at the interior of your model
3. Smooth the borders of your faces. Right-click and choose Shade
Smooth.

The trick here is to place the entire scene inside a sphere with all normals
pointing to the interior. When the normals point to the interior of an
object, it becomes "transparent" to light from the outside. In computer
graphics, a normal also defines the visible side of a polygon.

When the normals of a face point to the interior, it will
become "invisible" to anyone looking from the outside (Figure 2.7). That also
includes light sources.

Figure 2.7 - Sphere on the outside

To achieve that, you can use the following procedure:

1. Press SHIFT+C to center your 3D Cursor.
2. Press SHIFT+A and add a UV Sphere.
3. Use the S key to scale up the sphere until it surrounds the entire
scene.
4. Press the TAB key to enter Edit Mode.
5. With all vertices of the sphere selected, go to the Mesh → Normals
→ Flip menu.
6. Go to the Face → Shade Smooth menu.
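The six steps above can also be sketched as a script using Blender's operator API. This runs only inside Blender (2.80), and the radius value is an assumption — scale it to surround your own scene:

```python
import bpy  # run inside Blender

# Add a sphere at the 3D Cursor and make it big enough to surround the scene
bpy.ops.mesh.primitive_uv_sphere_add(radius=50.0)  # radius is an assumption
sphere = bpy.context.active_object

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.flip_normals()            # point every normal at the interior
bpy.ops.object.mode_set(mode='OBJECT')
bpy.ops.object.shade_smooth()          # same as the Shade Smooth menu option
```

Having it as a script is convenient because you will likely repeat this setup in every interior scene you render with Eevee.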

If you try to view the scene from a camera using the Rendered shading
mode, you won't see the HDR in the background anymore (Figure 2.8).

Figure 2.8 - View from inside the sphere

To improve this setup even more, you can use the sphere as a fill light for
the scene. Select the sphere and add a material to the object with the
Emission shader (Figure 2.9).
Figure 2.9 - Emission shader

Adjust the material color to something close to a light blue, and you will
have a great fill light to make your scene even brighter. Use the strength
settings to control the intensity of your background lights.
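The emission fill material can also be created from Python. The sketch below assumes the sphere is the active object and uses a hypothetical material name; the light-blue color and strength are just starting values:

```python
import bpy

# Hypothetical name for the fill-light material
mat = bpy.data.materials.new(name="BackgroundFill")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

# Emission shader with a light blue color, as suggested in the text
emission = nodes.new('ShaderNodeEmission')
emission.inputs['Color'].default_value = (0.7, 0.8, 1.0, 1.0)
emission.inputs['Strength'].default_value = 2.0  # controls the fill intensity

output = nodes.new('ShaderNodeOutputMaterial')
links.new(emission.outputs['Emission'], output.inputs['Surface'])

# Assign the material to the active object (the sphere)
bpy.context.active_object.data.materials.append(mat)
```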

2.2.2 Using a transparent background


The second option to hide the HDR map in the background is to render with
a transparent background. Eevee has an option that applies a transparency
effect to the render and visually removes the HDR. You will find it in the
Render tab, inside the Film section.

There you will see the "Transparent" checkbox (Figure 2.10). That is the
same option we saw in chapter 1.
Figure 2.10 - Transparent option

If you enable this option, you won't see the HDR map in your background
or any other element. Instead, you will see a squared background pattern
that identifies a transparent surface (Figure 2.11).
Figure 2.11 - Transparent background

Make sure you save the render using the PNG format with the color mode set
to RGBA so the transparency also appears in your image file.
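These settings can be applied in one short script, a sketch for Blender 2.80:

```python
import bpy

scene = bpy.context.scene

# Render tab → Film → Transparent
scene.render.film_transparent = True

# Save with alpha: PNG format, RGBA color mode
scene.render.image_settings.file_format = 'PNG'
scene.render.image_settings.color_mode = 'RGBA'
```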

Tip: Use a transparent background whenever you need a scene that will
receive a background later in software such as GIMP or Photoshop. You can
add something like a sky background or any other image to compose your
final render.

2.2.3 Controlling HDR rotation


What if you want to view the HDR map in the background, but it is in the
wrong position or rotation? We can control the rotation of any HDR map
used in the background of a scene with the Shader Editor of Blender. After
you add the HDR map in the World tab, we can go to the Shader Editor.

We can open the Shading Workspace by choosing the General → Shading
option from the selector at the top of your 3D Viewport (Figure 2.12).
Figure 2.12 - Shading Workspace

At the bottom, you will see the Shader Editor. Change the Shader Type to
World to view the Nodes for our background (Figure 2.13).

Figure 2.13 - Nodes for the background


Use the SHIFT+A keys to add two new Nodes:

From Input → Texture Coordinates


From Vector → Mapping

Connect the Generated output socket to the Vector input of your Mapping
Node, and the Mapping output to the Vector input of the Image Texture. You
should have a setup that looks like Figure 2.14.

Figure 2.14 - Nodes for HDR control

To connect two Nodes, you can click and drag with the mouse between an
output socket and an input socket. Hold the CTRL key and click and drag
with the right mouse button to cut a connection.

The Texture Coordinates Node will control the location used to orient the
HDR map around your scene. With the Mapping Node, we can have
numeric control over the texture.

After you make all connections, the Mapping rotation settings will control
the orientation of the HDR map. That will be perfect to set the direction of
lights generated with the texture. For instance, if you choose an HDR
texture that creates hard edge shadows, you will be able to set the direction
of lights using the Node (Figure 2.15).
Figure 2.15 - Rotation controls

Always keep the scene in the Rendered shading mode to evaluate whether
the shadows point in the correct direction. Use the Z-axis controls to rotate
the HDR map and take full control over the shadows.
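The same node setup can be built in Python. The sketch below targets Blender 2.80, where the Mapping Node still exposes rotation as a property (in 2.81 and later it became an input socket); it assumes an Environment Texture Node was already added through the World tab:

```python
import bpy
from math import radians

world = bpy.context.scene.world
nodes = world.node_tree.nodes
links = world.node_tree.links

# Find the existing Environment Texture Node (assumed to exist already)
env = next(n for n in nodes if n.type == 'TEX_ENVIRONMENT')

coord = nodes.new('ShaderNodeTexCoord')
mapping = nodes.new('ShaderNodeMapping')

links.new(coord.outputs['Generated'], mapping.inputs['Vector'])
links.new(mapping.outputs['Vector'], env.inputs['Vector'])

# Rotate the HDR map 90 degrees around the Z-axis to reorient the light
mapping.rotation[2] = radians(90.0)
```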

2.3 Using a gradient in the background


In computer graphics, you will find some scenes that use a color gradient in
the background to represent an artificial horizon. In Eevee, you can also
create such a gradient and use it to light the scene. The process is similar to
the controls we used to manage the HDR rotation.

To add the gradient to the background, you have to start a scene and, in the
World tab, enable the "Use Nodes" button. In the Shader Editor, you will
change the Shader Type to World and start working on the Node setup.

We will need the following Nodes, which you can add using the SHIFT+A
keys or the Add menu:

From Input → Texture Coordinates


From Vector → Mapping
From Converter → Separate XYZ
From Converter → ColorRamp

You can start by connecting the Texture Coordinates to the Mapping, just as
we did for the HDR rotation controls.

Why do we need the other two Nodes? The Separate XYZ sets the
orientation of the gradient in the background. If you use Z, you get a
"vertical" gradient; using either X or Y will make a "horizontal" gradient
aligned to that axis.

From the Mapping Node, connect the output socket to the Separate XYZ
and, using only the Z output, connect it to the ColorRamp. Use the
ColorRamp to set up your gradient. You will get a Node setup like the one
Figure 2.16 shows.

Figure 2.16 - Nodes for gradient background

The next step is to use the ColorRamp to prepare the gradient with all the
colors you want in the background. How does the ColorRamp work? In
Figure 2.17, you have a diagram of all available controls for the
ColorRamp.

Figure 2.17 - ColorRamp

For instance, you can add new colors and use the sliders to place them in the
gradient bar. You can also use only two colors. Click the color stop you want
to edit, and in the color picker, choose the tone you need.

A background that simulates an artificial sky would have a darker blue at
the bottom and a lighter version at the top (Figure 2.18).
Figure 2.18 - Artificial Sky gradient

The ColorRamp will give you some flexibility on how to choose and place
each color for the background.
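The whole gradient setup can be reproduced in a script. This is a sketch for Blender 2.80; the two ColorRamp colors are arbitrary blues chosen to match the artificial-sky example:

```python
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

coord = nodes.new('ShaderNodeTexCoord')
mapping = nodes.new('ShaderNodeMapping')
separate = nodes.new('ShaderNodeSeparateXYZ')
ramp = nodes.new('ShaderNodeValToRGB')  # the ColorRamp Node

# The default World node tree already contains a Background Node
background = nodes['Background']

links.new(coord.outputs['Generated'], mapping.inputs['Vector'])
links.new(mapping.outputs['Vector'], separate.inputs['Vector'])
links.new(separate.outputs['Z'], ramp.inputs['Fac'])  # vertical gradient
links.new(ramp.outputs['Color'], background.inputs['Color'])

# Darker blue at the bottom, lighter blue at the top (example values)
ramp.color_ramp.elements[0].color = (0.05, 0.15, 0.40, 1.0)
ramp.color_ramp.elements[1].color = (0.55, 0.75, 1.00, 1.0)
```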

2.4 Using Area lights for the environment


Although lighting the scene from the background is a great choice, you
might find that in some cases you can get better results with a Blender light
providing the energy instead of the background. Blender offers several types
of lights that you can use:

Point
Area
Sun
Spot
To replace an HDR map in the background, you can easily add an Area
light that will fill any scene. Why an Area light and not another type of
light? Because an Area light gives you a large surface that emits energy into
the scene, much like a background does.

You will also have the advantage of generating a "constant" flow of light to
the scene.

For instance, look at the scene shown in Figure 2.19.

Figure 2.19 - Scene for testing

Instead of using an HDR map in the background, we will apply an Area


light that will point to the interior of the room. You can either add a new
Area light to the scene using the SHIFT+A keys or turn any existing light
source into an Area.

You can select an existing light and at the Object Data tab swap the type of
light you are using (Figure 2.20).
Figure 2.20 - Types of light

An Area light can assume a square or rectangular shape. If you choose the
square, which is the default, you control the size in the Object Data tab. For
the rectangular shape, you set the width (X) and height (Y) of the light
(Figure 2.21).

Figure 2.21 - Area light size

To mimic the behavior of your Environment Texture, the Area light should
be large enough to fill all the space in your scene. One aspect of Area lights
in Blender that you will notice is that they produce a faint light. To make a
difference in the scene, you should increase the Power setting of the Area
light.

Here is a comparison between different energy levels in Figure 2.22.


Figure 2.22 - Energy for Area Lights

As you can notice from the results, you have to increase the Area light
energy a lot to have a considerable effect on the scene.

In the Area light tab, you will find lots of interesting controls that allow you
to fix several aspects of the scene. For instance, you can turn on contact
shadows, which will enhance the realism of your scene.
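The Area light setup described above can also be scripted. This is a sketch for Blender 2.80; the location, size, and energy values are placeholders to adapt to your scene:

```python
import bpy

# Add an Area light and grab its light data-block
bpy.ops.object.light_add(type='AREA', location=(0.0, 0.0, 4.0))
light = bpy.context.active_object.data

# Rectangular shape with explicit width (X) and height (Y)
light.shape = 'RECTANGLE'
light.size = 6.0     # width (X)
light.size_y = 3.0   # height (Y)

# Area lights are faint; raise Power considerably to see an effect
light.energy = 500.0

# Enable contact shadows for extra realism
light.use_contact_shadow = True
```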

2.5 Mixing HDR maps and Area lights


When you use only an Area light to generate the environmental lights for a
project, you will lose some of the benefits of HDR maps in general, for
instance, having a light source that gives a more natural look to the overall
illumination of the scene.

An Area light generates constant illumination using a single color,
whereas the HDR adds all the colors from the map to the scene. In the
end, the HDR gives you a more realistic scene.

Can we use them together? A common lighting technique is to mix an HDR
and some Area lights to compose a scene. The process consists of:

Using the HDR in the background as an Environment Texture


Adding an Area light to each window, door, or opening in the scene

By using this technique, you will be able to get all the benefits of using an
HDR as an Environment Texture and also will be able to get additional light
from places where you would expect those lights to enter the scene.

For instance, you can take a look at Figure 2.23 to see a scene that needs
more light.
Figure 2.23 - Scene and HDR

The scene has only one HDR as the primary light source, and even after
increasing the strength of that light, you still don't get enough energy in the
scene. In Eevee, this type of setup has another problem: the renderer doesn't
compute indirect lights the way Cycles does. We have to help it by adding
those Area lights.

A solution for this scene would be to add in each opening an Area light that
will work as a fill light for the scene. For each opening, we can add an Area
light and scale it in a way that it fills the whole dimension of each opening
(Figure 2.24).
Figure 2.24 - Area light in the scene

As a result, you will get a much better lighting effect for the scene. You can
play around with the strength of the Area light to find the best balance to
the project (Figure 2.25).
Figure 2.25 - Scene with Area lights

At this point, you might get some problems with aspects of lights like
shadows and light leak from corners of your 3D model. You can improve
the shadows in several ways like in the Render tab in the Eevee settings.
For light leaks, we will cover that in chapter 4 in more detail.

Tip: Remember that HDR maps won't cast shadows in Eevee and have a
chance of causing light leaks in interiors. You should prefer to use them in
exterior visualization projects.

What is next?
The first step in many projects is adding an Environment Texture to the
background, which will give you some initial light setup for a scene. That
first light will make it easy to start working with other aspects of the scene
like materials.

After you have that light set, you can move straight to the materials. Before
we move any further with the setup of a scene in Eevee, you should add all
available materials to the scene. That will be important later when we start
working with indirect lights.

Since Eevee can offer real-time results for a render, you will have a much
better experience working with materials than Cycles. As you will learn in
the following chapters, you need materials in a scene to use all the power
from Eevee to render your scenes.

The materials will help you generate indirect lights that will eventually
bounce and blend colors related to the materials in your scene. You must
apply materials before moving forward in Eevee.
Chapter 3 - PBR materials with Eevee
After you have the environment lights in place, you can start to move
forward with the setup of your scene in Eevee. You should begin to work
with materials as the next step to build a scene.

In the following chapter, you will learn how to manage and work with
materials in Blender, which will allow you to apply the concepts not only to
Eevee but also with Cycles. Handling materials involves working with a
collection of shaders and textures.

To get the most out of Eevee, we will use materials in a format called PBR
to make realistic surfaces, with a collection of image textures that work
with the powerful Principled BSDF shader.

Here is what you will learn about materials:

How materials work in Eevee


Managing and applying materials to objects
Using multiple indexes for materials
Setup and use the Shader Editor for materials
Apply PBR Materials to an object in Eevee
Control tiling for PBR materials
Creating glossy and glass materials for Eevee
Reuse materials from other files
3.1 Using materials in Eevee
The use of materials for any project related to 3D modeling, design, or
animation is essential to give context and meaning to surfaces. In Eevee,
you will see that working with materials is way easier than in Cycles because
of the real-time previews. If you have previous experience using Cycles to
create materials, you might remember the render tests required to check if a
material looks good.

Since Eevee allows us to create materials and immediately see the results in
real-time, you won't lose any time waiting for a render to complete and
check a material. For artists using Cycles, you can even change the renderer
to Eevee during the materials setup stage. That will save an incredible
amount of time.

One of the benefits of Blender having both Eevee and Cycles available is
that materials work in both renderers. You can start a project making
materials for Cycles and later switch the renderer to Eevee. In most cases,
you won't have to change any settings.

3.1.1 Starting with materials


Is there a better moment to add materials to elements in your scene?

If you are wondering whether you should start with materials or leave them
as the last step in a project, an approach that is popular among artists is to
set up materials right after the environment map. The main reason to add
materials early in the scene setup is directly related to the calculation of
indirect lights.

You will learn later in chapters 4 and 5 that Eevee can't process indirect
lights for a scene the way Cycles does. We have to use special objects called probes
to compute those lights. Those probes will capture light and bounce it
around the scene. By the time you start using probes, you must have all
materials in place.

Light bouncing off a surface carries the colors of that surface's material and
contributes to the shading. If the material is not present, you will lose the
shading contribution of that material (Figure 3.1).

Figure 3.1 - Materials and shading

For instance, if you have a scene with a red floor, you will expect that color
to bounce in white walls. If the red doesn't appear as part of the indirect
illumination calculations, you will lose a lot of the realism for that scene.
Since Eevee can provide some fast previews, you will be able to quickly
add all necessary materials for a scene and evaluate if they look good on
your project.

3.1.2 How to apply materials in Blender


To apply materials in Blender, you have first to select the object you wish to
edit and go to the Materials tab at the Properties Editor. There you will find
all information regarding materials. If you choose an object that already has
materials applied, you will see the options for that material.

An object that doesn't have any materials will show an empty panel with a
"New" button (Figure 3.2).
Figure 3.2 - Materials options

After you add the materials to objects, you will find controls at the top of
your Material tab that will give you additional options to manage each
material. You will find a description of each one of the settings in Figure
3.3.
Figure 3.3 - Material controls

Here is what you can do with each one of the controls:

Material selector: The option allows you to choose from a list of all
available materials in the scene.
Material name: If you have to rename the material you can click at
this field and type the name you wish to use. Each material in Blender
must have a unique name.
Fake user: When you remove a material from an object, it will
remain in the scene asset list. After you save and close the file,
Blender will purge all materials that are not in use. If you want to
keep a material that is not in use by any objects, you can turn on the
Fake User.
Duplicate material: For the cases where you need a copy of existing
material to make small changes or use it as a template.
Remove material: If you want to remove a material, you can use this
option. But it will still appear in the list of assets for the scene. The
exclusion will occur after you save and close the file, and only if no
object is using the material.

An important concept regarding materials in Blender is that they belong to
the file, not to the object. In the Materials tab, you only associate the
material with the object. You can have the same material on multiple
objects.

3.1.3 Getting materials from other files


If you have existing materials in other Blender files, you can quickly get
them to your project using an option called Append. If you go to the File →
Append menu, you can get the materials from a different file. Once you
open the Append option, you will see the Blender file picker (Figure 3.4).

Figure 3.4 - Using the Append option

When you pick a Blender file, you will see some folders in that file, where
one of them has a name of "Material." By opening the folder, you will see a
list with all available materials you can get.

Tip: You can also use the Link option instead of the Append, but it will not
import the data directly to your file. It will use the material as a reference
link, where you can't edit the contents of your material.
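The Append operation also has a Python equivalent. In the sketch below, the file path and material name are hypothetical placeholders; note that the "Material" folder name becomes part of the directory argument:

```python
import bpy

# Hypothetical library file and material name
blend_file = "/path/to/library.blend"
material_name = "Wood"

# Append the material from the "Material" folder inside the .blend file
bpy.ops.wm.append(
    directory=blend_file + "/Material/",
    filename=material_name,
)
```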

3.1.4 Using multiple materials


The materials in Blender have a direct relation not to the objects in your
scene but to the file. However, something that we must edit directly on
objects is the use of material indexes. At the top of your material options,
you will notice a field that displays the material name.

That is the material index list, which can hold multiple materials and does
have a relation to the object (Figure 3.5).
Figure 3.5 - Material indexes

You can add multiple materials to each index of an object. Press the "+"
button, and you will be able to either choose an existing material or create a
new one (Figure 3.6).
Figure 3.6 - Adding new materials

Once you have an additional index available you can assign that to any part
of your 3D models. The options to manage and assign each of those indexes
will appear in Edit Mode. With the object selected, you can go to Edit
Mode, and at the bottom, you will see the new buttons (Figure 3.7).
Figure 3.7 - New options for indexes

Select the faces you wish to assign the new indexes and press the "Assign"
button. They will now display the material selected for the index you have
active in the list.

Tip: You can remove an index without affecting the material. It will still be
available from the material selector.
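Material indexes can also be assigned from Python, without entering Edit Mode. This sketch adds a hypothetical second material and assigns it by writing each face's material_index directly; the face-selection rule is arbitrary, just for illustration:

```python
import bpy

obj = bpy.context.active_object

# Hypothetical second material appended as index 1 of the object
second = bpy.data.materials.new(name="Accent")
obj.data.materials.append(second)

# Assign index 1 to some faces; here, every face above the object origin
for poly in obj.data.polygons:
    if poly.center.z > 0.0:  # arbitrary rule for this example
        poly.material_index = 1
```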

3.2 Shaders for materials


How to build materials for Eevee? After you have an active material
assigned to an object, it is time to make a few choices based on the type of
material you wish to create. At the Material tab, you will have to choose
among several shaders that will set how the material interacts with light
(Figure 3.8).
Figure 3.8 - Shader list

You will find several different types of shaders available like:

Diffuse BSDF: One of the most simple materials available. It will let
you use a solid color for a surface.
Emission: If you need a material that emits light, you can use this
shader to turn any object into a light source.
Transparent BSDF: With this shader, you can create simple
transparency for surfaces in Eevee.
Mix Shader: An option that will let you blend two or more shaders to
craft complex materials.

The type of surface you choose to create in Eevee will depend on the
project you are trying to make. Regardless of the type, you will have to use
shaders to get the desired surface.

If you are converting a scene from Cycles, you will find that most of the
materials from Cycles will also work in Eevee. A few unique materials with
effects like transparency and glossy surfaces will require a few changes to
work correctly in Eevee.

For a simple surface with a plain color, you can easily set the shader as
Diffuse BSDF and pick a color for that surface (Figure 3.9).

Figure 3.9 - Material color


It is easy to create simple color materials, but they won't help much to
create realistic images with Eevee. For that, we will need additional shaders
and Nodes.

3.2.1 Shader Editor


The best way to edit and craft materials for Eevee is with the use of a
particular editor in Blender that will allow you to manage Nodes. With the
Shader Editor, you will have a flexible space to mix and create all types of
materials. To open the Shader Editor, you can use any available space in the
Blender user interface (Figure 3.10).

Figure 3.10 - Shader Editor

Inside the Shader Editor, you will see Nodes representing different parts of
your materials. That is the same Editor used in chapter 2 to control the
HDR Rotation.
A few tips on using the Shader Editor:

You can add Nodes using either the Add menu or the SHIFT+A keys
Each Node might have input (left) and output (right) sockets
To connect Nodes, you can click and drag between sockets
Use the color codes from sockets to check data types
To cut a connection between two Nodes, you can hold the CTRL key
while clicking and dragging with the right mouse button
Use the same selection shortcuts to manage your Nodes
You can use the same navigation shortcuts to manage your
visualization of the Shader Editor

The Shader Editor is a vital component of any project in Eevee because it
will allow you to create physically based materials later.

Info: We already used the Shader Editor back in chapter 2 to control HDR
maps.

3.3 Using PBR materials for Eevee


If you have plans to create realistic images using Eevee, you must invest
time working only with PBR materials. The acronym PBR means
Physically Based Rendering and identifies a type of material that displays a
high level of fidelity to a surface in the real world.

Luckily for us, we can easily use PBR materials in Eevee with a powerful
shader called Principled BSDF. That shader can handle multiple texture
maps that compose a PBR material. In some projects, you may have your
materials using only the Principled BSDF for all surfaces because it can
replace most of the other shaders available in Blender.
When using PBR materials, you have two options:

Create your materials to use in Blender


Get materials ready from an existing library

For artists looking to make original content and invest in a workflow to
produce those materials, you will find lots of tools and hardware available
for the task.

3.3.1 Getting PBR materials


You can also go to several online libraries that offer those materials in a
public domain license. You can get PBR materials from those sites at no
cost:

CC0 Textures (https://www.cc0textures.com)


CGBookcase (https://cgbookcase.com)
TextureHaven (https://www.texturehaven.com)

For instance, we can visit one of those three sites and download a fabric
texture with a 4K resolution, which will give us maps 4096 pixels in size.
You will download PBR materials as a ZIP file, which you will have to
extract somewhere on your computer.

Good practice in those cases is to save your Blender file in a folder and
extract the textures to that same location. Our fabric texture has a total of
four maps (Figure 3.11).
Figure 3.11 - PBR material maps

Each of the maps has a function in the PBR material:

Color: Defines the visible color of the material


Roughness: Adjusts the glossy level of your surface
Normal: We can use this map to create bumps on your surface
Displacement: Also makes bumps, but uses real geometry to generate
those details

Depending on the type of material, you can also have additional maps. Both
the normal and displacement maps have similar functions, with a difference
in the results. With the normal map, you get bumps that don't use real
geometry. The displacement map can make bumps with geometry, but it
requires your model to have a high polygon density.

3.3.2 Setup of PBR materials


How to properly set up PBR materials using Eevee? If you got yourself a
PBR material from one of those online libraries, you would have to use the
Principled BSDF shader. That shader will connect all maps in a single Node
to output a realistic surface.
The first thing you need to get your physically-based surface is to apply a
new material to any object in your scene. In that material, you will use the
Principled BSDF as the main shader (Figure 3.12).

Figure 3.12 - Principled BSDF as shader

Open the Shader Editor to make your workflow smoother and add an Image
Texture Node. Press the SHIFT+A keys and add Texture → Image Texture
three times. You can also create one Node and duplicate it two times with
the SHIFT+D keys.

Tip: From your file manager you can also drag and drop the images
straight to the Shader Editor in Blender. They will appear with an Image
Texture Node ready to use.

In each one of the Image Texture Nodes, you will open the following maps
for the PBR material:
Color
Roughness
Normal

For the Roughness and Normal maps, you will change the Color Space
setting in the Node to "Non-Color" (Figure 3.13).

Figure 3.13 - Color Space

If you are applying the material to a flat surface, you won't have to make
changes to the Mapping of your textures. However, you will experience
misplaced textures if you use a three-dimensional object. You can change
the Projection of your textures from "Flat" to "Box," and they will adapt to
almost any shape (Figure 3.14).
Figure 3.14 - Texture Projection

The last step is to connect the Image Texture maps to each of the
corresponding input sockets at the Principled BSDF:

Color to Base Color


Roughness to Roughness

For the Normal map, we need an additional Node. Press the SHIFT+A keys
and add from Vector → Normal Map. Connect the Image Texture with the
normal to the Normal Map Node, and then to the Normal input socket at the
Principled BSDF.

In the end, you will have the setup shown in Figure 3.15 for the PBR
material.

Figure 3.15 - PBR material setup

If you render the scene in Eevee, you will be able to visualize the material
in real-time (Figure 3.16).
Figure 3.16 - PBR material in real-time

You can make changes to the material in your Shader Editor and view them
at the 3D Viewport with no delays, which is a great advantage of Eevee.
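The whole PBR setup above can be scripted as well. This sketch targets Blender 2.80, where a new node material already contains a Principled BSDF Node; the texture paths are hypothetical placeholders relative to the saved .blend file:

```python
import bpy

# Hypothetical texture paths from a downloaded PBR set ("//" = blend folder)
maps = {
    'color': "//textures/fabric_color.png",
    'roughness': "//textures/fabric_roughness.png",
    'normal': "//textures/fabric_normal.png",
}

mat = bpy.data.materials.new(name="FabricPBR")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
principled = nodes['Principled BSDF']  # default node in Blender 2.80

def image_node(path, non_color=False):
    """Create an Image Texture Node; non-color data for roughness/normal."""
    node = nodes.new('ShaderNodeTexImage')
    node.image = bpy.data.images.load(path)
    if non_color:
        node.image.colorspace_settings.name = 'Non-Color'
    return node

color = image_node(maps['color'])
rough = image_node(maps['roughness'], non_color=True)
normal = image_node(maps['normal'], non_color=True)

# The normal map needs the extra Normal Map Node before the Principled BSDF
normal_map = nodes.new('ShaderNodeNormalMap')

links.new(color.outputs['Color'], principled.inputs['Base Color'])
links.new(rough.outputs['Color'], principled.inputs['Roughness'])
links.new(normal.outputs['Color'], normal_map.inputs['Color'])
links.new(normal_map.outputs['Normal'], principled.inputs['Normal'])
```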

3.3.3 PBR materials with Ambient Occlusion


In some PBR materials, you will find an additional map for your surfaces
called Ambient Occlusion or AO. That map will also contribute to the final
setup of your scene and should go alongside the Color texture to the Base
Color of your Principled BSDF shader.

How do we use two maps at the same time? To use two maps for the Base
Color, we will need a MixRGB Node. Add the Node using the SHIFT+A
keys and choose Color → MixRGB.

Connect the Color to the top input of your MixRGB and the Ambient
Occlusion to the lower input (Figure 3.17).
Figure 3.17 - Using Ambient Occlusion

With the Ambient Occlusion map, you will get a PBR material that can use
contact shadows in bumps and ridges of your surface.
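Adding the Ambient Occlusion map can be sketched in Python too. The node lookups by name below are hypothetical; they assume you labeled your existing Image Texture Nodes "Color" and "AO" beforehand:

```python
import bpy

mat = bpy.context.active_object.active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links
principled = nodes['Principled BSDF']

# Assumes Image Texture Nodes were renamed "Color" and "AO" in advance
color = nodes['Color']
ao = nodes['AO']

mix = nodes.new('ShaderNodeMixRGB')  # blend type left at the default Mix

# Color to the top input, Ambient Occlusion to the lower input
links.new(color.outputs['Color'], mix.inputs['Color1'])
links.new(ao.outputs['Color'], mix.inputs['Color2'])
links.new(mix.outputs['Color'], principled.inputs['Base Color'])
```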

3.4 Glass materials for Eevee


A common type of material that you will have to create is a glass surface
that mixes glossy reflections with transparency. In Eevee, that is a
complicated setup to achieve because of the way rasterization works. For
Cycles, you can quickly get the Glass BSDF shader working for a simple
setup.

At first, you may try to use the Glass BSDF for Eevee but soon will realize
it doesn't work to get transparent surfaces ready for a more realistic project.

To create a convincing glass material for Eevee, we will have to use two
shaders:

Principled BSDF
Transparent BSDF
As you will notice from the Principled BSDF, we don't have any control
related to transparency. To get transparency with the Principled BSDF, we
have to mix it with the Transparent BSDF.

The first step to create a glass material for Eevee is to make a material and
in the Shader Editor add:

One Transparent BSDF Node


One Principled BSDF Node
One Mix Shader Node

Connect both the Transparent and Principled Nodes to the Mix Shader and
the Mix Shader to the Material Output (Figure 3.18).
Figure 3.18 - Initial glass setup

To get a better reflection for the glass material, we need to create some
angular reflections. You can achieve that with two additional Nodes. Press
the SHIFT+A keys and create from:

Input → Fresnel
Color → RGB Curves
Connect the Fresnel to the RGB Curves, and the RGB Curves to the Fac
socket of your Mix Shader (Figure 3.19).

Figure 3.19 - Glass Nodes setup

Set the IOR from the Fresnel to 1.45 and use the curves in the RGB Curves
to adjust the reflectivity of light.

At this point, if you try to preview the material using Eevee, you won't see
any transparency. The reason is that we still have to change the Blend Mode
and Shadow Mode of the material to create the glass.

You will find those controls in the Materials tab, in the Settings panel. Set
them as follows:

Blend Mode: Additive


Shadow Mode: Alpha Hashed
By changing those settings, you will get transparency for materials in
Eevee. You can also enable Screen Space Refraction and Show Backface
(Figure 3.20).

Figure 3.20 - Glass material settings
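Those material settings map directly to Python properties. A sketch for Blender 2.80, assuming the glass material is the active material of the selected object:

```python
import bpy

mat = bpy.context.active_object.active_material

# Materials tab → Settings
mat.blend_method = 'ADD'       # Blend Mode: Additive
mat.shadow_method = 'HASHED'   # Shadow Mode: Alpha Hashed
mat.use_screen_refraction = True   # Screen Space Refraction
mat.show_transparent_back = True   # Show Backface
```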

If you apply that material to an object and render it with Eevee, you will
get a glass effect for that surface (Figure 3.21).
Figure 3.21 - Glass effect for surface

You can control the Color of your glass using the Base Color settings and
the glossy reflections for the surface using the Roughness. A value of zero
will give you a clear surface reflection.

If you want to use a more straightforward solution without the need for a
Principled BSDF, you can try the Glass BSDF with a Transparent BSDF.
Use the Fresnel as the Fac for the Mix Shader (Figure 3.22).
Figure 3.22 - Simpler glass

That will not create the same type of glass for your scenes but will require
fewer steps to build.

Info: In Chapter 7, you will learn a third way to create glass materials for
Eevee using the Principled BSDF alone.

3.5 Controlling tiling for materials


Once you start to work with PBR materials, you will notice that some
surfaces appear on your objects with the appropriate size and tiling.
However, on models that require a larger area to fill, you will get large
blocks of images. That is when tiling controls become essential in your
materials.

To control the tiling of your PBR materials in Eevee, we will need two
Nodes:

Input → Texture Coordinates


Vector → Mapping

You must connect the Generated output socket from the Texture
Coordinates to the Mapping. From the Mapping Node, you will connect to
each one of the Image Texture Nodes (Figure 3.23).

Figure 3.23 - Nodes for tilling control

Depending on the number of textures you have in a PBR material, you
might have a large number of connections to make. To control the tiling of
a PBR material, you will use the Scale settings of the Mapping Node. For
instance, a scale value of one will give the default size for the
texture.

A lower value will increase the size of your textures, and a higher value
will give you smaller blocks (Figure 3.24).
Figure 3.24 - Texture sizes

If you feel your textures are not looking good with those tile sizes, you can
always set the scale back to one and restore the default tiling of the PBR
material.
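The tiling setup, including the many connections to the Image Texture Nodes, is a good candidate for a script. This sketch is for Blender 2.80, where Mapping scale is a node property (an input socket in 2.81 and later); the scale value of 4.0 is just an example:

```python
import bpy

mat = bpy.context.active_object.active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links

coord = nodes.new('ShaderNodeTexCoord')
mapping = nodes.new('ShaderNodeMapping')
links.new(coord.outputs['Generated'], mapping.inputs['Vector'])

# Feed the same Mapping into every Image Texture Node of the material
for node in nodes:
    if node.type == 'TEX_IMAGE':
        links.new(mapping.outputs['Vector'], node.inputs['Vector'])

# Values above 1.0 repeat the texture more often (smaller tiles)
mapping.scale = (4.0, 4.0, 4.0)
```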

What is next?
The materials are a vital part of any project that uses Eevee for realism, and
you won't get great images without investing some time with PBR
materials. Having a preview of your materials in real-time will add an
incredible boost in productivity for Eevee and your workflow.

Most of the effects you see on screen using Eevee will appear in the render
preview without the need for any unique setup, but due to the nature of
rasterization, we will have to use a few tricks to see the full extent of the
visual effects in a scene.

In Cycles, we can prepare the scene and start the render after making some
simple choices for materials and lights. The Path Tracing algorithm will
handle most of the physics and lights with accuracy at the cost of render
times.

You can't get many of those effects in Eevee by default because
rasterization can't handle them. You will need to add helper objects that
compute effects like indirect lights and reflections. The materials we
created in this chapter will show in the preview window with all the
necessary effects but will lack reflections in the scene.

In the following chapter, you will learn how to use a unique type of object
in Eevee called probes. Those probes will help us add those effects to a
scene to improve both materials and the overall render.
Chapter 4 - Probes and lights for Eevee
The rasterization process of Eevee is incredible to deliver high-quality
images in real-time, but to get to that quality, we have to overcome a few
limitations. In Cycles, we have a process that uses common types of objects
like lights and textures to create images for a project.

Since it uses a physically accurate process, you can add the lights and start
the render process, which will result in a realistic image. That comes at the
cost of render times, which are significantly higher than what we get in
Eevee.

Among the limitations of Eevee, we can list indirect lights and
reflections for glossy materials. Those two elements, critical for
realism, will not appear in a rasterization render automatically. We have
to use some helper objects to enable them.

In this chapter, you will learn how to use those helper objects that receive
the name of probes. Eevee has three different light probes that will give you
tools and options to mimic the behavior of light. You can get results that are
close to what we find with Cycles.

Here is a list of what you will learn in the chapter:

What are light probes and their purpose in Eevee
How to use the Irradiance Volume
Control the Irradiance Volume settings
Use nested Irradiance Volumes
Create reflections with a Cubemap
Place and setup a Cubemap
Make flat reflection planes
Use visibility controls with collections for probes

4.1 Light probes in Eevee


When Eevee first appeared in Blender 2.8, a lot of artists became excited
about the possibilities a real-time render engine gives to digital art. Along
with the excitement came some questions about the new options in Eevee.
In the Blender creation panel, a new set of tools appeared with the name of
Light Probe (Figure 4.1).
Figure 4.1 - Light Probes

The light probes will help with the process of setting up a scene in Eevee
to get effects and better lights. Unlike Cycles, which uses a Path Tracing
algorithm to compute interactions between lights and surfaces, Eevee's
Rasterization process applies visual tricks to the scene.
One of the objectives of those tricks is to mimic effects like global
illumination for real-time renders. If you have any previous experiences
with game engines, you will find the light probes familiar. That is because
on those engines you also have the same concept of a light probe.

The probes in Eevee have a single purpose, which is to fake some of the
effects you would get in traditional render engines like Cycles, but in real-
time. From the list of probes, we get the following options:

Irradiance Volume
Reflection Cubemap
Reflection plane

All those probes will help you make specific effects in Eevee and have a
vital role in any attempt to render realistic scenes.

4.2 Irradiance Volume


The Irradiance Volume is one of the essential probes from Eevee, and you
should give it special attention in any setup. What is the primary purpose of
the probe, and why would you need one or many of them?

With an Irradiance Volume, we can fake an effect that you can quickly get
in Cycles, which is indirect illumination. In chapter 5, we will talk more
about the process of baking and managing indirect lighting with Eevee.

To fully understand how you can benefit from the use of an Irradiance
Volume, we can take a look at a scene that doesn't use any probes. In Figure
4.2, you can see a scene that has a red color for the surface and white walls.
Figure 4.2 - Scene with no probes

The scene doesn't look realistic for many reasons, and one of the missing
aspects is the indirect light effect. In a scene like this one, you would expect
the light hitting the red surface to bounce around, and as a result, you
would get a faint red color spread over the walls.

What you should expect from a scene is what you see in Figure 4.3, where
the bounces from the floor will add some color to the scene. That is indirect
light.
Figure 4.3 - Indirect light effect

Due to the nature of Rasterization, we can’t get that in Eevee without some
tricks.

When you add a light source in Eevee, it will light surfaces but not bounce
around the scene. The light will hit a surface and stop. Here is where our
Irradiance Volume enters. Using that volume will enable Eevee to capture
light in a particular area and compute the indirect bounces in that specific
region.

4.2.1 Irradiance Volume structure


To fully understand how to use and manage Irradiance Volumes, we have to
take a moment to analyze the structure of that object and how the placement
of one or multiple of them in a scene will impact indirect lights. After you
press the SHIFT+A keys and add the probe to the scene, you will get a
unique box-shaped probe (Figure 4.4).
Figure 4.4 - Irradiance Volume

You can change the settings and the appearance of the probe using the
Object Data panel when you select the Irradiance Volume (Figure 4.5).
Figure 4.5 - Irradiance Volume options

Among the options, you will see:

Distance
Falloff
Intensity
Resolution (X/Y/Z)
Clipping (Start/End)

Those are the main options for the Irradiance Volume, and we will work
with them to get the best results. How do they affect the probe?

Before we describe each option from the probe, it is vital to see a visual
breakdown of the object. In Figure 4.6, you can see an image of the probe
with some key points highlighted.

Figure 4.6 - Probe description

We can begin our analysis of the probe with the distance. As you can see
from the images representing the Irradiance Volume, we have a two-box
structure for the object: a big box on the outside and a smaller one inside.
The gap between those two boxes is what the “distance” setting
controls.
The main indirect light calculation will happen in the smaller box. Between
the two shapes, you will have a decay of the effect that will eventually fade
away. You can set the amount of that decay with the Falloff control. By
default, the value of one gives the most realistic results.

You can adopt as a rule for any scene: everything for which you want to
process indirect lights has to fit inside the smaller box. If you set the
distance to zero, you will have only a single box, which might help you
align the Irradiance Volume with a scene (Figure 4.7).

Figure 4.7 - Distance set to zero
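The same probe and its settings can be created from a script inside Blender. This is a sketch assuming Blender 2.80 property names; the values shown are illustrative, not recommendations.

```python
import bpy

# Add an Irradiance Volume at a chosen location
bpy.ops.object.lightprobe_add(type='GRID', location=(0.0, 0.0, 1.0))
probe = bpy.context.object.data

probe.influence_distance = 0.0  # "Distance": zero keeps a single box
probe.falloff = 1.0             # decay between the inner and outer box
probe.intensity = 1.0
probe.grid_resolution_x = 4     # "Resolution": samples per axis
probe.grid_resolution_y = 4
probe.grid_resolution_z = 2
probe.clip_start = 0.1
probe.clip_end = 40.0
```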

Another key element of the Irradiance Volume is the sample distribution.
By default, you will have the small dots inside the volume with a
distribution of 3 on each axis. How many do you need for a scene? That
will depend on the quality you are looking to achieve with indirect lights.

Tip: You will be able to later increase the size of the samples to make it
easier to manage the way they capture and bounce lights.

If you get an area in your scene that shows some low-quality shadows or a
criss-cross pattern on shading, you might need additional samples. You can
add more of them using the resolution settings.

Having samples between objects and surfaces also helps with the indirect
light calculations. If you look at Figure 4.8, you will notice a simple scene
where some objects are in the area of effect of an Irradiance Volume while
others don't have any samples between them and the floor.

Figure 4.8 - Objects with no samples

The object on the right has samples between its shape and the floor.
Because of that, it receives indirect light from the floor material. As a
result, you will get a shadow that has a mix of colors from the floor
material. The object on the left doesn't show that shading, because it
doesn't have samples between its shape and the floor.

In Figure 4.9, you can see the results of the same scene with additional
samples.
Figure 4.9 - Scene with additional points

The number of samples in your scene will have an impact on the indirect
light calculations and shading quality. You will also notice some increase in
indirect light processing later. We will discuss how to manage those
calculations in chapter 5.
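Since the processing cost grows with the sample count, it helps to estimate how many samples a set of volumes adds. A small helper, in plain Python with no Blender required:

```python
def total_samples(volumes):
    """Total irradiance samples for a list of (res_x, res_y, res_z) tuples."""
    return sum(x * y * z for x, y, z in volumes)

# One default 3x3x3 volume plus a denser nested 6x6x2 volume
print(total_samples([(3, 3, 3), (6, 6, 2)]))  # 27 + 72 = 99
```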

4.2.2 Nested Irradiance Volumes


Another possible solution for a scene that requires more points to calculate
indirect lights is the use of nested volumes. In the example shown in Figure
4.8, we could also add a second volume. You have to create a new volume
and scale it down to fit the area you wish to improve.

A solution with a nested Irradiance Volume would look like Figure 4.10
shows.
Figure 4.10 - Nested Irradiance Volume

You can add as many volumes as you think are necessary to fix
problematic shading for indirect lights.

4.2.3 How to place Irradiance Volumes?


What is the best strategy to place an Irradiance Volume in a scene? The rule
you have to follow is: place all objects of the scene inside the volume to get
indirect lights.

For instance, if you have a scene that needs indirect light calculations, you
can use the scale transformation in Blender with the S key to adjust the size
of your volume. In a project that has a rectangular shape, making the
volume fit everything in the scene will be easy to achieve.

What if you have a scene that has another type of shape? In that case, you
can use multiple volumes. In Figure 4.11, you have an example of a scene
that has a form that requires various volumes.
Figure 4.11 - Scene with multiple volumes

The scene requires two volumes to cover it for indirect light
calculations.

To make your process of adapting a volume to a scene easier, you can set
the distance of each volume to zero. That way you won't have to worry
about getting objects in the falloff area with a reduced shading effect for the
indirect lights.
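Fitting a volume by hand works well, but you can also fit one to the scene with a script inside Blender. This sketch assumes the scene contains at least one mesh object and that the probe's display box spans -1 to 1 in local units, scaled by the object scale.

```python
import bpy
from mathutils import Vector

scene = bpy.context.scene
meshes = [o for o in scene.objects if o.type == 'MESH']

# World-space corners of every mesh bounding box
corners = [o.matrix_world @ Vector(c) for o in meshes for c in o.bound_box]
lo = Vector([min(c[i] for c in corners) for i in range(3)])
hi = Vector([max(c[i] for c in corners) for i in range(3)])

bpy.ops.object.lightprobe_add(type='GRID')
vol = bpy.context.object
vol.location = (lo + hi) / 2
vol.scale = (hi - lo) / 2            # probe box is -1..1 in local space
vol.data.influence_distance = 0.0    # single box, easier to align
```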

4.3 Reflection Cubemap


Unlike Cycles that can perform a lot of calculations for the behavior of
light, we have to use a few tricks with Eevee to get effects like indirect
illumination. Another effect that Eevee can’t do alone is the reflection on
surfaces. For that particular task, we can use a probe called Reflection
Cubemap that will help the renderer to make reflections.

To create a Cubemap, you have to press the SHIFT+A keys and choose the
Reflection Cubemap from the probes group. The object will have a sphere
shape by default, but you can also use a cube as the primary reference. To
keep it visually unique, it is a good idea to leave it as a sphere, since the
Irradiance Volume already uses a cube shape (Figure 4.12).

Figure 4.12 - Reflection Cubemap

What is the benefit of a Reflection Cubemap? The main advantage of using
such an object is to have better glossy surfaces. If you use a glossy material
in your scene, the object won't show reflections at first, even after enabling
the rendered mode.

Tip: You can create glossy surfaces by changing the roughness settings for
any shader in Blender.
When you place the object inside the Reflection Cubemap, the probe will
immediately capture the surrounding objects, and the glossy surface will
start displaying reflections (Figure 4.13).

Figure 4.13 - Glossy material with reflections

That will give you a lot of freedom to create and place objects with glossy
reflections in your scene.

4.3.1 Placing and setup of a Cubemap


To place the Cubemap in your scene, you must follow a rule similar to what
we used with the Irradiance Volume: place the objects that have a glossy
reflection or surface inside the Cubemap. The best way to perform
the setup is with a scale transformation applied to the probe.

If you want to streamline the setup process, you can simply scale the probe
to fit the entire scene, as Figure 4.14 shows.
Figure 4.14 - Reflection Cubemap

After you add the Cubemap, you will have to bake reflections to the
objects, which we will cover in Chapter 5. If you have a small area or
surface that has a glossy reflection, you can make the probe small enough to
fit only the area of that object.

When you select the Reflection Cubemap object, you will be able to see all
options regarding the control and behavior of that probe (Figure 4.15).
Figure 4.15 - Cubemap options

It has similar options to the Irradiance Volume like:

Intensity
Radius
Falloff
Clipping (Start/End)

The Intensity defaults to one, which produces the best results for
reflections. Unless you have a good reason to increase or decrease the
reflections, you can leave the value at one. With the Radius you can
control the size of your probe, and the Falloff will set the distance between
the inner and outer probe. If you don't want reflections to fade, change the
value to zero.

For big scenes that have lots of objects with a glossy reflection, you can
make a large Cubemap that will surround the entire scene.
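A scripted version of this setup could look like the sketch below, to be run inside Blender and assuming Blender 2.80 names; the radius value is illustrative.

```python
import bpy

# Add a Reflection Cubemap and size it to cover the glossy objects
bpy.ops.object.lightprobe_add(type='CUBEMAP', location=(0.0, 0.0, 1.5))
probe = bpy.context.object.data

probe.influence_distance = 8.0  # "Radius": area covered by the probe
probe.falloff = 0.0             # zero: reflections don't fade at edges
probe.intensity = 1.0
probe.clip_start = 0.8
```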

Regarding the position of the probe, make sure it is in a location that will
create proper reflections. The probe acts like a camera that takes a
snapshot of the surroundings and casts the image onto the glossy surfaces. It
will only project what is within its visual range.

For instance, if you place the probe near the floor, it will project the
background of your scene to the objects. You will see a black reflection in
the bottom half of every object.

Info: We will have to bake the probe to see all results with reflections. If you
don't bake the probe, the scene will remain the same.

4.4 Reflection plane


In some cases, a Reflection Cubemap will add too much computational load
to a complex scene, especially in the bake processing. The probe helps with
reflections from tridimensional objects, but for simpler models, we can use
a Reflection plane. The primary purpose of this probe is to create
reflections for flat surfaces.

After you add the probe to the scene, it will appear to be a simple plane. It
has an arrow showing the reflection direction and an area of influence. If
you increase the Distance for a Reflection plane, you will see the results as
a bigger probe (Figure 4.16).

Figure 4.16 - Reflection plane size

The size is essential to define what the probe sees, which will also be
visible as a reflection.

You can preview the results of your probe using the “Show Preview Plane”
option, found in the “Viewport Display” field of the probe options, which
displays what the plane sees (Figure 4.17).
Figure 4.17 - Probe preview

That is the best way to create true mirror objects in Eevee with a minimum
computational load to the scene. For projects that require true reflective
surfaces like mirrors, water, or anything with a perfect reflection, you
should use Reflection planes.

One of the advantages of the Reflection Planes is that they don't require any
type of baking.

Info: It is possible to create reflective objects using materials. For instance,
using the Principled BSDF, you can change the roughness of the material to
zero. That will turn the surface into a polished mirror.

4.5 Visibility controls for probes


For each probe in Eevee, you will find an additional control called
“Visibility Collection,” which can help you organize a complex
scene. What are collections in Blender? They are the tool Blender uses to
place objects in groups.

A collection can store multiple types of objects in a scene and can receive
a unique name, and you can even nest collections for a more powerful
organization. When you use collections with the probe visibility options,
you can control which objects will interact with the probe.

A scene that has no collection set in the visibility options will reflect and
interact with anything that is in the range of a probe. If you add a collection
to the visibility, only the objects that are part of that collection will show in
the probe's area of effect.

For instance, we have in Figure 4.18 a Reflection Cubemap that is
surrounded by multiple objects. They will all appear in a glossy surface that
is at the center of our probe.

Figure 4.18 - Reflection Cubemap

Now, take the objects with a red color from that scene and place them in
a collection called “Red Objects.” After adding that collection to the
Visibility options, only they will appear for the probe (Figure 4.19).
Figure 4.19 - Visibility options for probes

Using those options will give you more power to choose whether an object
will appear as part of the probe. If you start to adopt collections for all your
projects, it will be easy to control the visibility of objects for probes. For
instance, you could have a collection called “Furniture” and move all
objects representing furniture to that group.

After adding the collection to the visibility options, a mirror represented by
a Reflection plane would only show objects that are part of the “Furniture”
collection.
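In the Python API, the same control appears as a property on the probe data. A sketch to run inside Blender, assuming the active object is a light probe and that a collection named "Furniture" exists in the file:

```python
import bpy

probe = bpy.context.object.data  # active object must be a light probe

# Only objects from "Furniture" will interact with this probe
probe.visibility_collection = bpy.data.collections["Furniture"]

# Set to True to show everything EXCEPT that collection instead
probe.invert_visibility_collection = False
```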

4.5.1 Managing collections


Since collections are essential to manage how you display and interact with
objects and probes, we can review how you can create, manage, and remove
collections in Blender. Where are the collections? In the default Blender
user interface, you will find collections at the Outliner Editor (Figure 4.20).

Figure 4.20 - Outliner Editor


There you can interact with collections by:

Renaming
Moving
Erasing
Creating
Nesting

You will have all those controls for collections available at the Outliner
Editor.

To create a new collection, you can use the right-mouse button at the
Outliner Editor to open a small menu. In this menu, you will see a
“New” option that creates an empty collection (Figure 4.21).
Figure 4.21 - Empty collection

You can rename the collection to something that will help identify the
contents; for instance, we can name this collection “Red chair.” Double-
click the collection name to rename it. Since collections will help you with
the organization of a scene, it is essential to assign meaningful names to
both collections and objects.

Info: Each scene in Blender will have a “Scene Collection,” which works
like a master collection that has all others nested. You can't remove the
Scene Collection. Newly created collections appear on the list as nested
collections.

To move an object to that collection, you can click the object's name, drag
it, and release it over the collection name. A simple click and drag will
move objects between collections.

Another way to add objects to collections is with the M key in the 3D
Viewport. You can select one or multiple objects and press the M key.
When you press that key, a small menu will appear (Figure 4.22).

Figure 4.22 - Collections menu


The menu has a list with all existing collections in the scene, including the
Scene Collection, and an option to create a new one. If you choose an
existing collection, the object will move to that group. By selecting
“+ New Collection,” you will move the object to a brand new collection.
You can even set the collection name right in the 3D Viewport.
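The same management is available through scripting inside Blender. A sketch that creates the "Red chair" collection and moves a hypothetical object named "Chair" into it:

```python
import bpy

scene = bpy.context.scene

# Create a collection nested under the master Scene Collection
red_chair = bpy.data.collections.new("Red chair")
scene.collection.children.link(red_chair)

# Unlink the object from its current collections, then link it here
obj = bpy.data.objects["Chair"]  # hypothetical object name
for col in list(obj.users_collection):
    col.objects.unlink(obj)
red_chair.objects.link(obj)
```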

Now, if you set the “Red chair” as the Visibility Collection for the probe,
you will only see the objects from that collection in the reflections (Figure
4.23).

Figure 4.23 - Visibility options

Use the collections as much as possible to organize a scene, not only to
control the visibility of your scene regarding probes but also to give you a
higher level of control over the scene.

One of the benefits of using collections for a scene is the ability to use the
Append and Link options from the File menu in Blender. With those two
options, you can move data between files and reuse some of your assets.
For large projects that have dozens or hundreds of assets, you can use
collections to identify groups of objects quickly.
You can even create a “template” for Eevee with all necessary probes and
lights and place them in a collection. After starting a new project, you can
simply append the contents of that collection with pre-made settings.

What is next?
The light probes in Eevee are an excellent help for any project that aims to
create realistic images in real-time. If you know how to use them in a
project, you can get results that are close to what Cycles delivers.

Using probes like the Irradiance Volume might transform any scene you
make in Eevee by adding Indirect Light calculations. When working with
projects that must show interiors or a scene in enclosed spaces, using
indirect lights might enhance the realism and quality of your shading.

Even though Eevee delivers images in real-time, we still have to do some
processing before rendering. That processing has a direct connection to the
probes. If you just add an Irradiance Volume or Reflection Cubemap to a
scene, you won't see much difference at first.

The visuals of your scene will only change after you bake the results to the
scene, which is the main subject of the next chapter. With the baking
process, you will have to wait a few seconds or minutes for Eevee to
calculate each probe contribution. It is not like a render from Cycles, but it
will add a few moments to your render.

Unlike many workflows that require baking lights to textures, you don't
have to keep objects at fixed locations. After baking, you can move them
around, which will make animation production a lot easier.

The next step in your Eevee setup is to get your probes baked to the scene
and see the benefits of using a real-time render engine like Eevee.
Chapter 5 - Indirect Lights with Eevee
At this point, you have a solid understanding of how to use probes with
Eevee and add them to a scene, but placing them in a 3D scene won't trigger
the visuals or effects we need. There is an extra step we have to perform to
get the effects from each of those probes.

You have to bake the results of each probe to get the visuals in a render.
What is the baking process? That is the main subject of this chapter, where
you will learn how to manage and process the probes.

The baking component of Eevee will remind you of Cycles for a moment,
because we have to process the scene to get a good result. But it is a lot
faster than what we get using Path Tracing.

You will also learn how to manage common problems associated with the
baking of Indirect Lights in Eevee, like the light leaks from surfaces that
can cause many delays in projects and make you go back to the modeling
stage.

In this chapter, you will learn how to manage and fix light leaks. Here is a
list of everything you will learn:

Using indirect lights in Eevee
Baking effects for probes
Change settings for baking indirect lights
Managing light leaks
Fixing light leaks in Eevee
Modeling for Eevee
Adjusting shadow settings for Eevee
Baking reflection probes
Using Screen Space Reflections

5.1 Indirect lights for Eevee


A key component for any project that wants to create realistic images
coming from Eevee is working with indirect lights. When you use a
renderer like Cycles, it already handles indirect lights from surfaces
automatically. Due to the nature of how Path Tracing works, you will get
good results with "minimum" effort.

The price you will pay in Cycles is a long render time. In Eevee, we have
the benefit of getting renders in real-time, but most of the process to get
good indirect lights for the scene will require the use of probes and a panel
to process indirect lights.

Why do we have to "bake" indirect lights? Eevee can't process light
bounces alone. That is the main reason. You have to add probes like:

Irradiance Volume
Reflection plane
Reflection Cubemap

Each one of those probes will contribute to the final solution to make a
better render in Eevee.

Before you start working on the solution to get your indirect lights in
Eevee, it is imperative to add all the materials to the project. Regardless of
their complexity or type, you will set all the colors for each surface.

The primary reason to work on all materials before you start dealing with
indirect lights is to ensure you are using the accurate light bounces. When
you process indirect lights, you will add to the scene some light bounces
that will carry colors from each surface.

To get that effect in Eevee, you will need a scene that has:

All materials and textures
Light probes to help with calculations

With those two aspects of the scene ready, we can move to calculate
indirect lights in Eevee and make our scenes look better visually. The
process of baking indirect lights in Eevee requires some calculations and
resembles a pre-rendering. As a result, you may have to wait a few seconds
or minutes until the calculations are complete.

It is not as long as a render in Cycles but will require some time to process.

5.2 Baking and calculating indirect lights


To process indirect lights in Blender, we have to use the Render tab and the
Eevee options. There you will find a field called "Indirect Lighting" where
you have all the necessary options to process that information. In the panel,
you have the button that starts the process, which is "Bake Indirect
Lighting" (Figure 5.1).
Figure 5.1 - Indirect lighting options
A requirement to bake indirect lights in Eevee is the existence of at least
one Irradiance Volume in the scene. If you don't have any probes, the
indirect lights won't appear.

You can start the process by pressing the button to make Blender calculate
all contributions to the indirect lights. When you press "Bake Indirect
Lighting," a progress bar will appear at the bottom of your interface,
showing the state of the calculation (Figure 5.2).

Figure 5.2 - Progress indicator

By pressing the button, you will bake not only the Irradiance Volume results
but also the Reflection Cubemap.

Depending on the complexity, you may have to wait a few seconds or
minutes to finish each calculation. After the process finishes, you will see
some data below your settings with:

Number of irradiance samples
Size in memory
Number of Cubemaps
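For batch or scripted workflows, the bake button has an operator equivalent. A sketch to run inside Blender, assuming Blender 2.80's operator and property names:

```python
import bpy

# Script equivalent of pressing "Bake Indirect Lighting"; it bakes the
# Irradiance Volumes and the Reflection Cubemaps in one pass
bpy.ops.scene.light_cache_bake()

# The summary shown below the settings is also exposed as a string
print(bpy.context.scene.eevee.gi_cache_info)
```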

Does it make a huge difference to bake indirect lights? Here is a
comparison of an image that has no indirect lights and the same file with
the baked solution in Figure 5.3.

Figure 5.3 - Indirect lights comparison

Look at the contribution made by the light bouncing off surfaces that have a
more prominent color.

After you calculate the indirect lights, you can view the irradiance samples
with an increased size to compare how they are reflecting light. Use the
“Irradiance Size” option to increase the size and enable the visualization
(Figure 5.4).

Figure 5.4 - Irradiance Volume size

Each sphere will show a preview of how your samples are bouncing light
across the scene.

5.2.1 Updating the indirect lighting


Depending on the project you are working on, it may be necessary to make
changes to the model or replace a material. In those cases, you will have to
erase and update the indirect light calculations. At the bottom of your
options to start baking those lights, you have the "Delete Lighting Cache"
button (Figure 5.5).
Figure 5.5 - Erasing indirect light calculations

After you press this button, you will remove any saved solutions for
indirect lights and will be able to start over with the process. Replace
materials and make changes to the scene. Once you have the updates, you
can press the "Bake indirect Lighting" again.

How often do you have to erase and recalculate indirect lights?

Changes to materials are the most common cause requiring a full
recalculation of the scene. Moving objects around will not cause any
problems or errors, but changing materials may affect the scene directly.

For instance, you may have a wood floor for a scene and replace that
material with metal tiles. The colors of those two materials are entirely
different, and the difference will appear in the rendered scene. You have to
recalculate to keep the scene visually accurate.

In any case, you can enable the automatic update of the calculations. At the
Render tab, find the "Auto Bake" option (Figure 5.6).
Figure 5.6 - Auto Bake

Every time you update an object in your scene in a way that requires a
change in the indirect lights, Blender will trigger the baking process
automatically.
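Both controls are exposed to Python as well; a sketch to run inside Blender, with Blender 2.80 names:

```python
import bpy

scene = bpy.context.scene

# Script equivalent of "Delete Lighting Cache"
bpy.ops.scene.light_cache_free()

# Re-bake automatically whenever a relevant change happens
scene.eevee.gi_auto_bake = True
```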

5.3 Optimizing Indirect lights


The default values for the indirect lighting calculations will already give
you great results, but we can make them even better with a few tweaks in
the settings. From the list of options, you will find settings to control
bounces, shadows, and the resolution for the calculations (Figure 5.7).
Figure 5.7 - Indirect light options

When should you make changes to the settings for indirect lights? Usually,
in a situation that demands changes because of poor shadows or bad
shading. You may see artifacts in the final render, strange shadows, or light
leaks in the model.

Here is a list of possible settings that you can use to enhance the indirect
light calculations:

Diffuse bounces: How many times light will bounce in the scene.
Cubemap size: Resolution for the Cubemap used to record the
illumination.

By using those two settings, you can already improve the final solution for
your render in Eevee. To show how those settings affect the final render of
a scene, we can use the following project that has indirect lights calculated
using the default values (Figure 5.8).

Figure 5.8 - Scene with default values

If we increase the number of bounces to 6 and the Cubemap size to 1024px,
the final result will have a much better look (Figure 5.9).
Figure 5.9 - Scene with improved solution

Use those settings whenever you feel that your final indirect lights solution
still lacks some quality for either the number of bounces or the irradiance
resolution.

Info: The Cubemap size will only affect the Reflection Cubemap probe. You
must bake that probe to see the effect.
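Scripted, the improved settings would look like this sketch, to be run inside Blender; property names are from Blender 2.80, and the resolution enum is stored as a string.

```python
import bpy

eevee = bpy.context.scene.eevee

eevee.gi_diffuse_bounces = 6           # more bounces for indirect light
eevee.gi_cubemap_resolution = '1024'   # Cubemap size, enum as a string

# Re-bake so the new settings show in the render
bpy.ops.scene.light_cache_bake()
```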

5.3.1 Managing noise from glossy surfaces


A common problem that can appear in renders using Eevee is the presence
of noise in glossy materials. That is a significant problem not only for Eevee
but also for Cycles. A solution to that problem is to increase the number of
samples used to render your scenes.

At the Render tab, you will see the Sampling field. There you have two
settings for the Render and Viewport (Figure 5.10).
Figure 5.10 - Samples settings

In most cases, 64 samples will be enough for the final render. When glossy
surfaces start to display noise, which will appear as small white dots, you
can increase the render samples to remove them.
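The two sample settings map to scene properties in the Python API; a sketch to run inside Blender, with illustrative values:

```python
import bpy

eevee = bpy.context.scene.eevee

eevee.taa_render_samples = 128  # samples for the final render
eevee.taa_samples = 16          # samples for the viewport preview
```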

5.4 Fixing light leaks in Eevee


A major problem that you may encounter in Eevee is the appearance of
light leaks in interior scenes. That will happen for several reasons, like a
poorly designed 3D model or misplaced shadows. No matter the cause, you
will eventually face that problem in Eevee.

The light leaks will show up as strange highlights in the corners of your
model after you process indirect lights (Figure 5.11).

Figure 5.11 - Light leaks example

You can fix light leaks in Eevee using a variety of settings from different
locations. We will have to tweak settings for:

3D Models
Shadows
Lights

Depending on your model, you may fix the leaks by changing just one of
those settings.

5.4.1 Managing 3D models to prevent leaks


One of the main problems that can cause light leaks in Eevee is the way you
build the 3D model. In general, you must avoid structures that have only a
single plane and no thickness. Those types of structures will eventually
suffer from light leaks.

What would be a bad structure for a 3D model?

From a modeling point of view, you can say that a structure that doesn't
have the proper optimizations would look like Figure 5.12 shows.

Figure 5.12 - 3D model with single planes

The problem with that model is that it has only single planes for the
structure. If you plan to use Cycles for rendering, it will work fine, but with
Eevee, we have to be extra careful. After you start adding lights to the
scene, it will eventually show some light leaks in corners from indirect
light calculations.

If you decide to use an HDR or Sunlight from Blender, the problem will
appear with a higher level of intensity.

How to fix that? You can follow some simple guidelines to create 3D
models for Eevee and prevent light leaks. Regarding modeling, you can add
some thickness to all your 3D models. One of the most straightforward
ways to add thickness to the models is with the Solidify modifier.

For instance, we can take a model that will eventually suffer from light
leaks like the one from Figure 5.12. Take that model and apply a Solidify
modifier. As a result, you will get some thickness to the object (Figure
5.13).

Figure 5.13 - Model with a solidify modifier

That alone will not solve light leaks in Eevee but will reduce the chance of
having those undesired effects in your projects.
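If you want to apply that fix by script, the Solidify modifier can be added through Blender's Python API. A sketch (the object name and thickness value are illustrative assumptions):

```python
import bpy

# Give a single-plane wall some thickness with a Solidify modifier.
wall = bpy.data.objects["Wall"]  # hypothetical object name
mod = wall.modifiers.new(name="Solidify", type='SOLIDIFY')
mod.thickness = 0.1              # 10 cm of wall thickness
mod.offset = -1.0                # grow the thickness inward
```

Any value of thickness greater than zero already reduces the chance of leaks; match it to your real wall dimensions when accuracy matters.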

5.4.2 Modeling for Eevee


Since the structure of your 3D models can become one of the reasons you
start to see light leaks in Eevee, it is essential to adopt a few practices
during the modeling stage of a project. For Cycles, you don't have to
change much regarding the creation of a scene in 3D.

For Eevee, you should make a few adjustments to the process. The first
thing to do is to prepare the scene with models that have a lot of thickness.
To demonstrate how you can start a simple scene that has optimized
polygons for Eevee, we can make the model for the room shown in Figure
5.14.

Figure 5.14 - Room to Model

As you can see from the image, we have a simple room with standard
thickness for the walls. A common approach to the modeling of such
environments would be to get a small plane at the corner of your room and
start extruding it from that point (Figure 5.15).
Figure 5.15 - Plane on the corner

That would work for a model you have to create and render in Cycles, but
in Eevee, you will eventually face some potential light leak problems. At
the beginning of your modeling, start adding some thickness to the walls. In
Edit Mode, select the vertices of your plane and stretch them (Figure 5.16).

Figure 5.16 - Stretching objects


From that plane, you can start making the extrusions until you end up
with a model whose wall thickness is not accurate. But it will work for
Eevee (Figure 5.17).

Figure 5.17 - Walls with extra thickness

Since that extra thickness won't show up in the final render, you don't have to worry about
accuracy regarding 3D modeling (Figure 5.18).
Figure 5.18 - Extruded walls

What if you got a model coming from an old project? In that case, you can
always apply the Solidify modifier.

5.4.3 Tweaking shadows to fix light leaks


The next settings you should tweak are the shadows for
your render in Eevee. You will find the Shadow settings in the Render tab.
There you will have options to change the method used to generate shadows
and also the resolution (Figure 5.19).
Figure 5.19 - Shadow settings

For the shadows settings, you can use the following parameters to create
high-quality shadows that prevent leaks:
Method: Change to VSM
Cubemap: Increase to 1024px or higher. Those are shadows for area
and point lights.
Cascade Size: Increase to 2048px if you are using sunlight.
High Bitdepth: Enable
Soft Shadows: Enable
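The same parameters can be set from a script. A sketch assuming Blender 2.80's Python API (note that the ESM/VSM shadow method and these exact property names were removed in later Blender versions, and the size values are enum strings, not numbers):

```python
import bpy

eevee = bpy.context.scene.eevee
eevee.shadow_method = 'VSM'          # Variance Shadow Maps
eevee.shadow_cube_size = '1024'      # shadows for Area and Point lights
eevee.shadow_cascade_size = '2048'   # shadows for Sun lights
eevee.use_shadow_high_bitdepth = True
eevee.use_soft_shadows = True
```

After changing these, delete the light cache and bake the indirect lights again so the new shadows are used.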

Those settings will help you to fix potential light leaks in shadows. If you
have a scene with a baked solution for indirect lights, it is now time to
delete the cache and process them once again.

5.4.4 Tweaking lights to fix light leaks


Another step you can perform to fix light leaks in Eevee is in the settings
for each of the light sources in our scene. From a light perspective, you can
have three main types of lights for a scene in Eevee:

Area
Point
Sun

Each of them will require specific settings to prevent leaks. Among those
options, you will find light leaks most often when using the Sun, which is
why an Area light is the best choice for scenes in Eevee.

If you are using an Area or Point light, you can select it and open the Object
Data panel to see all options regarding lights (Figure 5.20).
Figure 5.20 - Area light options

There we can change the following settings:

Clip Start: Change to 0.1
Softness: Change to zero
Bias: Change to 0.001
Bleed Bias: Change to 0.1
Contact shadows: Enable
Softness (Contact shadows): Change to 5
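Those light settings can also be applied from a script. A sketch assuming Blender 2.80's Python API; the light name is hypothetical, and the shadow buffer property names below belong to the 2.80 shadow system (removed in later versions), so verify them in your build:

```python
import bpy

light = bpy.data.lights["Area"]       # hypothetical light name
light.shadow_buffer_clip_start = 0.1  # "Clip Start" in the UI
light.shadow_buffer_soft = 0.0        # "Softness" in the UI
light.shadow_buffer_bias = 0.001
light.shadow_buffer_bleed_bias = 0.1  # only used by the VSM method
light.use_contact_shadow = True       # enable Contact Shadows
```

Loop over bpy.data.lights if you want to apply the same values to every light in the scene.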

If you are using a Sun for your scene, you can use the same values with a
few differences:

Direction: Do not point the Sun directly at the interior of your scene
Angle: Change to 0.1

By making those changes, you will help prevent and fix potential light
leaks. With those settings and tweaks, you can delete the cache for your
Indirect lights and process them again (Figure 5.21).
Figure 5.21 - Sun options

What if the leaks don't disappear? If they remain in your scene, you can try
to move the lights from their location or look at the structure of your 3D
model. At some point, you will find a combination for settings and 3D
models that will not generate leaks.

The most common scenario for leaks comes from projects that you try to
migrate from Cycles to Eevee. Since we don't have those problems in
Cycles, you will eventually find yourself with a scene that needs a fix in
Eevee.

For the cases where you start from scratch with the purpose of rendering in
Eevee, you can follow the modeling guidelines to prevent the problem
with your lights.

5.5 Baking Cubemaps for reflections


At the Indirect Lighting options, you will see two options for baking. One
of them is for Indirect Lights, which processes both the Irradiance Volumes
and the Reflection Cubemaps. The second option is for the Cubemap probes alone. If
you only need to process the reflections from a Cubemap, you can use
"Bake Cubemap Only."

The process is simple to complete and will require you to only place one of
the probes in the scene, and at the Render tab, press the "Bake Cubemap
Only" button (Figure 5.22).
Figure 5.22 - Cubemap bake

When you finish the Cubemap bake, you will be able to see reflections and
other visual effects related to the probe (Figure 5.23).

Figure 5.23 - Reflections in objects


One thing you should keep in mind before hitting the bake button is the
probe position. Depending on the location of your probe, you might get
different reflections. For instance, if you place the probe by the floor of
your scene, the bottom half of the capture will be below the floor.

Tip: You can make glossy materials that look like metal using the Glossy
BSDF shader and setting the roughness to zero.

The result will be a considerable black reflection on all glossy surfaces
(Figure 5.24).

Figure 5.24 - Black reflections

If you look closely at Figure 5.23, you will notice the metallic object is not
reflecting the red chair. The reason is that our
probe is in the middle of the scene; to its right, it doesn't see any other
objects.

To fix that you have to either add another Cubemap between the chair and
object or move the probe. By placing the probe between both objects, you
will get the chair reflected in the object (Figure 5.25).
Figure 5.25 - Object reflection

As a plus to the baking process of Cubemaps in Eevee, you don't have to
worry about keeping objects in the same locations. The reflections will
update even if you move the objects around the scene. That is great for
animations where you have moving objects inside the probe's area of
influence.

Tip: You can view the probe as a mirror ball after enabling the eye icon for
the Cubemap at the Indirect Lighting panel.

Of the two reflection probes we have available in Eevee, you have to bake
only the Cubemap. The Reflection Plane doesn't require any baking.

5.5.1 Improving reflections with Screen Space Reflections

Since we are dealing with reflections in Eevee, you can improve the visuals
of a scene that has glossy surfaces with Screen Space Reflections. The
option is available at the Render tab, and you must enable it to make the
effect appear in your renders (Figure 5.26).
Figure 5.26 - Screen Space Reflections

We already mentioned the effect when discussing materials with settings for
Eevee, but now it is time to explore it a little more.

As the name states, Screen Space Reflections mirrors your scene on glossy
surfaces: it captures what you see on screen and projects it onto those
surfaces. It will help to add another level of
realism to the scenes. The effect works even if you don't have probes in the
scene.

Does it make a difference to use Screen Space Reflections for a project?
Look at Figure 5.27 for an example of a scene with the effect applied.

Figure 5.27 - Screen Space Reflections applied

The result is a set of reflections for materials like a floor with a glossy
surface. That is different from the effect you get with a probe because it
blends with the materials; a reflection probe would make a perfect mirror.

With the Screen Space Reflections, you will get an effect closer to glossy
surfaces in Cycles.

5.5.2 Overscan for Screen Space Reflections


The Screen Space Reflections effect works by getting what you see on screen and
applying that as a mirror to glossy surfaces, which is excellent for quick effects
but can result in strange renders. If an object goes off-screen, it will
disappear from the reflections.
A solution for that is a feature from Eevee called Overscan, which is
available at the Film options in the Render tab (Figure 5.28).

Figure 5.28 - Overscan

When you activate the Overscan option, you will get Eevee extending the
area of your render for Screen Space Reflections. The default value will
extend the visible region by 3%. To get more objects appearing on screen
for rendering, you can increase the Overscan to a more significant amount.
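Both effects can be enabled from a script as well. A sketch assuming Blender 2.80's Python API:

```python
import bpy

eevee = bpy.context.scene.eevee
eevee.use_ssr = True          # enable Screen Space Reflections
eevee.use_overscan = True     # render beyond the visible frame
eevee.overscan_size = 10.0    # extend the region by 10% instead of 3%
```

A larger overscan_size keeps more off-screen objects in the reflections, at the cost of rendering a bigger internal frame.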

What is next?
You now have all the necessary information to start working on a project that
will use all available tools and resources to get a realistic real-time render
in Eevee, from indirect lights and probes to the final render.

The next step regarding using Eevee is to get an existing project and apply
those settings to check results and fix potential problems. In the following
two chapters, we will get projects that have only the modeling part ready,
and add elements like lights and materials to render with Eevee.

In the first scene, we will cover what to do in an interior project that has
challenges for light placement and visualization. The next scene includes an
exterior visualization with unique challenges for representing a daylight
simulation. Both scenes will allow us to apply the knowledge acquired until
this point in the book.

After you have the scenes ready, we will jump to the color management
panel and effects to apply some exposure and gamma settings to improve
lights and the overall image of our render.
Chapter 6 - Interior lights with Eevee
Until this point in the book, we learned a lot about how to work with Eevee
and setup several aspects of a project like lights, probes, materials, and the
render process. It is now time to put the workflow to the test with the
complete settings on how to create an interior render with Eevee.

We will follow a checklist of actions to work with Eevee, which is unique
to the renderer and takes advantage of aspects like light probes and effects.

The objective of this chapter is to define a workflow with the necessary
steps to create an interior render. You will be able to replicate that same
process in any other project using Eevee to make interior renders. From the
locations of Area Lights for your scene to the use of probes and effects to
enhance the render.

At the end of the chapter, you will have a better understanding of how to go
from start to finish with a project using Eevee, and have your renders in
real-time.

Here is a list of what you will learn in this chapter:

Make a checklist of actions for interior rendering with Eevee
Prepare a model to receive lights for interiors
Place lights for an interior scene
Control shadows for Area Lights
Change shadow settings for Area Lights
Prepare materials for an interior scene
Use assets from external files
Add probes to compute indirect lights and reflections
Improve indirect light calculations
Prepare the camera for interior renders
Optimize the processing of effects
Use Ambient Occlusion to create proximity shadows

6.1 Eevee workflow for interiors


After a few projects using Eevee, you will start to realize that we can follow
a checklist of actions and procedures, which will appear in every single
scene you work. That will help you to develop a workflow of actions to
prepare a scene to render. If you are reading the book in the order of
chapters, you are already following that list of tasks.

You might encounter a few changes between projects based on the nature of
what you are trying to render. For instance, a project that focuses on
interior scenes will include a few unique problems that will not appear in
exterior scenes.

Here is the workflow for interior models in Eevee:

1. Check the model to see if you have some thickness to walls and make
sure there are no unnecessary gaps between objects.
2. Add environmental lights using either an HDR map or Area Lights.
For interiors, you should prefer Area Lights.
3. Create all materials to the scene. If you can, use only PBR materials
for your surfaces.
4. Add the light sources that will generate the proper mood for your
scene. It could be an Area, Point, or Sun.
5. Create probes to compute indirect lights and reflections.
6. Bake the probes to the scene
7. Review the solution to fix potential shading and light leak problems.
8. Add effects and post-production to finish your image.

As you can see from the list, a critical step that you must perform before
going any further in the process is adding materials. You should look for
PBR materials for all the surfaces to make sure you have the benefits of
accurate physical reflections and surfaces.

We will apply the checklist during the rest of this chapter to create the
image shown in Figure 6.1.
Figure 6.1 - Interior scene in Eevee

The scene will work as an example of how we can create realistic images in
real-time using Eevee.
6.2 Interior models for Eevee
The first step regarding modeling in a project you know will use Eevee for
rendering is to make sure all elements have a proper thickness to avoid
future problems regarding light leaks. You can use a model that has thin
planes as structure and not have light leaks at all, but it will be better to
avoid any potential problems.

Our scene is simple and consists of a room with:

Floor
Walls
Window

A critical practice that you can follow in your projects is to
remove geometry behind the camera. That will give your scene another
source of light coming from the background.

In Figure 6.2, you can take an overview of the model.


Figure 6.2 - Overview of the scene

At this point, the scene doesn't have any lights or materials in Eevee. If you
trigger a render, it will display only the primary colors for all the objects.

6.3 Environmental lights for interiors


For the scene environmental lights, we will take a different approach from
previous examples and won't use any HDR textures. Since they could add a
significant amount of noise to the scene and can't cast shadows in Eevee, we
will focus on Area Lights for the environment.

The main point of entrance for lights in the scene is the side window, which
doesn't have any frames, and the big opening behind the camera (Figure
6.3).

Figure 6.3 - Lights point of entrance


If you follow our advice regarding lights in Eevee, you will place two
main lights for this particular scene. The first one will be a massive Area
Light at the same position in which you would place a Sun (Figure 6.4).

Figure 6.4 - Large Area Light

For the settings regarding this Area Light that will simulate the illumination
coming from the background, you will enable “Contact Shadows” and
adjust the Power to meet the desired level (Figure 6.5).
Figure 6.5 - Light settings

Since we are still at the beginning of our setup for the scene, it is too early
to change settings related to light leaks. If the scene requires some
additional adjustments, we will wait until the indirect light calculations.

Info: Why not use a Sun Light? The Sun Light in Eevee has a high
probability of generating light leaks, and you should avoid using it if you
have other options. In this case, an Area Light will create the effect we
need.

The next light for the scene will be another Area that will stay at the big
opening behind the camera. You will adjust the scale of this particular light
to fit the exact size of the opening (Figure 6.6).

Figure 6.6 - Second Area Light


What is the purpose of this second Area Light? It will work as a fill light for
the scene, and you should use a significantly lower Power. For instance, in
the light that creates the environmental illumination, you will use a Power
of 1000. In the second light, you will use a Power of 80 (Figure 6.7).
Figure 6.7 - Second light settings

Also, you will not want multiple shadows in the scene. For that reason, you
must disable the shadow casting for this second light. You can use this
technique for any interior scene you work:

1. Make a large light source for the environment
2. Add a small Area Light to each one of the openings in the 3D model

That alone will create a great start for any interior rendering with Eevee.
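The two-light setup above can be sketched with Blender's Python API (positions and light names are illustrative assumptions for this scene):

```python
import bpy

scene = bpy.context.scene

# 1. Large "environment" Area Light, placed where a Sun would be.
key_data = bpy.data.lights.new("Environment", type='AREA')
key_data.energy = 1000.0                 # Power in watts
key = bpy.data.objects.new("Environment", key_data)
key.location = (4.0, -6.0, 5.0)          # hypothetical position
scene.collection.objects.link(key)

# 2. Fill Area Light at the opening behind the camera.
fill_data = bpy.data.lights.new("Fill", type='AREA')
fill_data.energy = 80.0                  # much lower Power
fill_data.use_shadow = False             # avoid multiple shadows
fill = bpy.data.objects.new("Fill", fill_data)
fill.location = (0.0, 4.0, 1.5)          # hypothetical position
scene.collection.objects.link(fill)
```

Note how the fill light has shadow casting disabled, exactly as described above, so only the environment light produces shadows.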

Tip: A quick way to fit Area Lights to openings is to resize their shapes
interactively. If you select an Area Light, you will see a
yellow border with small squares on each corner of the light shape. Click
and drag one of those squares to resize the shape.

6.4 Materials and textures for interiors


Before we start adding probes and working with the indirect lights for the
scene, it is time to add materials to the objects. With the materials in place,
we will get light bounces that will contribute to the overall illumination.

The first object that will receive materials is the floor model, which will use
a PBR parquet texture. For this particular texture, we will use a set of
textures containing:

Color
Ambient Occlusion
Roughness
Normal

To add the material in the floor model of that scene, you can:
1. Select the floor object
2. Go to the Material tab
3. Create a new material
4. Set the shader as the Principled BSDF
5. Open the Shader Editor
6. Add four Image Texture Nodes
7. Open each one of the textures in the Texture Nodes
8. Set the color space for both Roughness and Normal to Non-Color
9. Add a MixRGB Node
10. Add a Normal Map Node
11. Connect the Color and Ambient Occlusion to the MixRGB
12. Connect the MixRGB to the Base Color in the Principled BSDF
13. Connect the Roughness to the Roughness socket of the Principled
BSDF
14. Connect the Normal to the Normal Map Node
15. Connect the Normal Map Node to the Normal socket of the
Principled BSDF

That will create the base PBR texture for the floor. We also want to add
some tiling control:

1. Add a Texture Coordinate Node


2. Add a Mapping Node
3. Connect the Generated Socket from your Texture Coordinate to the
Mapping
4. Connect the Mapping to each one of the Image Texture Nodes
5. Adjust the texture size using the Scale option from the Mapping

In the end, you will have the setup shown in Figure 6.8.

Figure 6.8 - Floor material Nodes

You can adjust the scale of the texture using the Mapping Node until you
got a size that fits your scene scale (Figure 6.9).

Figure 6.9 - Texture scale

If you want a more detailed explanation regarding materials and Nodes, you
can go back to chapter 3, where we discuss them in more detail.

Tip: You can isolate the selection of an object to make it easier to edit
materials. Select the object and press SHIFT+H. Press ALT+H to display
all other objects again.

6.4.1 Setup a chrome material


A unique type of material will appear in the chair model of the scene,
which is a chrome surface. It must look metallic and reflect all surrounding
objects. To create that type of material for Eevee, you can use either
the Principled BSDF or a Glossy BSDF shader.

For instance, we can select the chair frame and go to the Material tab. There
you will add a new material, and you don't even have to use the Shader Editor.
The whole process can be done with the Material tab options.

Choose a Glossy BSDF as the shader for this material and set the Roughness
to zero. With the roughness at zero, you will have a clear reflection on the
surface. Make the material color white, and you should have a chrome
surface (Figure 6.10).
Figure 6.10 - Chrome material
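The chrome material can also be built by script, replacing the default Principled BSDF with a Glossy BSDF. A sketch assuming Blender 2.80's Python API:

```python
import bpy

mat = bpy.data.materials.new("Chrome")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

# Swap the default Principled BSDF for a Glossy BSDF.
nodes.remove(nodes["Principled BSDF"])
glossy = nodes.new("ShaderNodeBsdfGlossy")
glossy.inputs["Roughness"].default_value = 0.0            # mirror-like
glossy.inputs["Color"].default_value = (1.0, 1.0, 1.0, 1.0)  # white
links.new(glossy.outputs["BSDF"],
          nodes["Material Output"].inputs["Surface"])
```

The Principled BSDF route would keep the default node and instead set its Metallic input to 1.0 and Roughness to 0.0.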

The process with the Principled BSDF would require a similar approach,
using the roughness settings.

Remember that we won't see much difference in that material until we add a
Reflection Cubemap to the scene.

6.4.2 Append other materials


For all other materials in the scene, you can use the Append option from the
File menu to bring them in from other files. That will allow you to reuse some
of the materials previously created in Blender. If you have the projects saved
on your local computer or network, you can use Append to get data from
other Blender files.

For instance, we can get the fabric materials for the chair cushions from
another project. Go to the File → Append menu. Pick the file you want to
use as an external library, and you will see folders with all the available
data (Figure 6.11).
Figure 6.11 - Folders to Append

Use the Material folder and look for a material that you want to bring to the
current project. In our case, we will use the Fabric material. Select the
material and hit the Append button on the top right.

Tip: You should always give meaningful names to materials if you want to
reuse them later in other projects.

How to use that material? It will now be part of your file assets. At the
Material tab, you will find it in a list you can open from the material
selector (Figure 6.12).
Figure 6.12 - Material selector

If you have the model for the cushions selected, you can easily open the
material selector and apply the Fabric material to the objects. If you don't
like the tiling of that particular material, you can always open the Shader
Editor to change and adjust the texture.

6.5 Adding probes for interiors


It is time to start working with the most important part of any Eevee-related
project. You will add the probes to start the calculations for indirect lights
and also reflections. The probes we will need for this particular scene are:

Irradiance Volume
Reflection Cubemap

The Irradiance Volume will generate all the necessary light bounces for indirect
light, since Eevee can't process that alone. With the Reflection Cubemap,
materials using glossy surfaces will have something to reflect and won't
look artificial.

You can start with the Irradiance Volume for the scene. Press SHIFT+A
and add the probe. Set the Falloff for the object to zero to use
no falloff, so the irradiance shape will have its full effect.

Since the default distribution of points inside the Irradiance Volume is low
for an interior scene, we can increase the resolution from a matrix of 4
to 6. Go to the Object Data tab, and with the Irradiance Volume selected,
change the Resolution to six on all axes (Figure 6.13).
Figure 6.13 - Irradiance Volume settings

Now, using the scale transformation, you will try to make the probe fit the
scene interior. Use the S key with each corresponding axis to make the
probe fit the entire scene (Figure 6.14).
Figure 6.14 - Adjusting the Irradiance Volume size

Based on the shape of the scene, we will need an additional Irradiance
Volume to cover all the available space. Using only one Irradiance Volume
would leave part of the model out of the indirect light solution.

To avoid any potential problems, we can add another Irradiance Volume to the
scene, or duplicate the existing probe and adjust its scale to make it fit the
remaining space (Figure 6.15).
Figure 6.15 - Additional Irradiance Volume

With both Irradiance Volumes in place, we can also include a Reflection
Cubemap in the scene. Place it in the center of your scene and scale it up
until the whole scene is inside the probe. Remember to raise the probe above
coordinate zero on the Z-axis; otherwise, you will have a black reflection on
the bottom half of all objects (Figure 6.16).
Figure 6.16 - Reflection Cubemap

It is now time to process the scene's indirect lights! Go to the Render tab,
and in the Indirect Lighting options, change the Diffuse Bounces to six and
press "Bake Indirect Lighting" to calculate both probes.
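The probe setup and the bake can be sketched with Blender's Python API as well. This is a sketch assuming Blender 2.80's bpy; the probe positions and scales are illustrative assumptions for this room:

```python
import bpy

scene = bpy.context.scene

# Irradiance Volume with a 6x6x6 grid and no falloff.
bpy.ops.object.lightprobe_add(type='GRID', location=(0.0, 0.0, 1.5))
grid = bpy.context.object
grid.data.falloff = 0.0
grid.data.grid_resolution_x = 6
grid.data.grid_resolution_y = 6
grid.data.grid_resolution_z = 6
grid.scale = (4.0, 4.0, 1.5)   # fit the room (hypothetical size)

# Reflection Cubemap raised above the floor to avoid black
# reflections on the bottom half of objects.
bpy.ops.object.lightprobe_add(type='CUBEMAP', location=(0.0, 0.0, 1.5))
cube = bpy.context.object
cube.scale = (6.0, 6.0, 3.0)

# Bake indirect lighting with six diffuse bounces.
scene.eevee.gi_diffuse_bounces = 6
bpy.ops.scene.light_cache_bake()
```

The bake operator processes both probe types, matching the "Bake Indirect Lighting" button in the Render tab.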

The process may take a while to finish, but after a few moments, you will
have the final solution (Figure 6.17).
Figure 6.17 - Baked indirect lights

As you can see from Figure 6.17, we don’t have any light leaks, which is a
great sign. But, we can improve the simulation. You can increase the Power
for the environment light from 1000 to 2500 to make it stronger and mimic
a Sun.

From that simple change in settings, we will have a much better solution to
emulate a daylight scene (Figure 6.18).
Figure 6.18 - Simulating a daylight scene

Our next step is to add additional effects and enhancements to improve
shadows and the overall look of the scene.

6.6 Setting the camera for interiors


Before we jump to the next step in the project setup, it is essential to define
the view we will have of the scene to render. Back in chapter 1, we
discussed some options to place and adjust a camera for rendering. You
can have multiple cameras in a scene, but only one of them will be
active. In Blender, you will get the output from the active camera as the
render result.

The easiest way to set the active camera is with the CTRL+ALT+Numpad 0
shortcut. To use this shortcut, you have to set the view you want to have
from the scene with the 3D navigation shortcuts and press the keys.

For instance, we can use the middle mouse button to orbit the scene until
you find the best point of view. Press the keys, and your active camera in
the scene will align to that view (Figure 6.19).
Figure 6.19 - Align the view

After you align for the first time, you can select the camera border and use:

G Key to move
G key and Z key twice to dolly your camera

Since we are viewing an interior, you might want to reduce the focal
length to give your camera a wider field of view. A typical value
for interiors is between 18 and 24 mm (Figure 6.20).
Figure 6.20 - Camera focal distance
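Those camera settings map directly to the Python API. A minimal sketch assuming the scene already has an active camera:

```python
import bpy

cam = bpy.context.scene.camera   # the active camera object
cam.data.lens = 18.0             # focal length in mm; wide for interiors

# Aligning the active camera to the current 3D view (the same as
# CTRL+ALT+Numpad 0) is also an operator, but it needs to run with
# a 3D Viewport context:
# bpy.ops.view3d.camera_to_view()
```

Setting scene.camera to another camera object is how you switch which camera produces the render result.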

Once you have the camera in the desired location for rendering, we can add
some effects and work on improved shadows.

6.7 Effects and rendering for interiors


After you have all the lights and probes in place, we can start to use some
effects and tweaks to improve the visual quality of our render. In Eevee,
you will find several of those effects and enhancements at the Render tab
(Figure 6.21).
Figure 6.21 - Render tab effects

For our scene, we will use a couple of those effects to improve the overall
visualization. They will be a great help, but we will do even
more regarding effects in chapter 8.
The first effect that you will enable is the Ambient Occlusion at the top of
your Render tab. What effect will the Ambient Occlusion add to the scene?
By using Ambient Occlusion in our project, you will start to see proximity
shadows on objects (Figure 6.22).

Figure 6.22 - Ambient Occlusion effect

That is a visual effect that we usually see in the real world and that Eevee
can't reproduce by default.

If you compare the scene with and without Ambient Occlusion, you will
notice how the proximity shadows make a difference in the overall realism
of the scene (Figure 6.23).
Figure 6.23 - Ambient Occlusion comparison

You can even use the Intensity control to make your proximity shadows
spread below objects. One of the benefits of using the effect is the depth
perception of the scene. The lack of proximity shadows might make your
objects look like they are "floating" above the floor.
In Cycles, you don't have to use such an effect because the renderer processes
those shadows as part of the physics calculations for the image.
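Ambient Occlusion and its intensity can be enabled from a script too. A sketch assuming Blender 2.80's Python API (the distance and factor values are illustrative starting points):

```python
import bpy

eevee = bpy.context.scene.eevee
eevee.use_gtao = True        # enable Ambient Occlusion
eevee.gtao_distance = 0.2    # how far occlusion is traced, in meters
eevee.gtao_factor = 1.0      # "Factor"; raise it to darken the shadows
```

A small gtao_distance keeps the effect limited to tight corners and contact areas, which usually looks more natural for interiors.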

6.7.1 Shadow quality


Since we are discussing the use of shadows to improve realism, you will
notice that our scene has some blurred and fragmented shadow borders. We
can improve the quality of shadows in the render using a few tweaks in the
Shadows settings for the project (Figure 6.24).

Figure 6.24 - Shadow settings

Using the default settings for shadows will result in those poor shadows in
any render. For high-quality shadows in your interior renders, you should
use:

Cube Size: 2048px
Cascade Size: 2048px
High Bitdepth: Enabled

Using those settings will genuinely improve the shadows in your scene.
Look at Figure 6.25 to see how the shadow settings change the way
any object casts shadows.

Figure 6.25 - Shadows settings

What about the shadows from light sources? In that case, you can use the
settings available at the light object. For instance, using the options from
our Area Light will enable you to control the softness of your shadows.

The default value will give a weak, soft edge for all shadows. If you reduce
the softness of your shadows for that particular light, you will get hard edge
shadows (Figure 6.26).
Figure 6.26 - Hard-edge shadows

If you are trying to create a daylight scene with Eevee, the control of your
shadows is critical. In bright daylight, you will get most objects casting
hard edge shadows. As a final effect, you can also enable the Screen Space
Reflections.
The scene still needs a boost in lighting, which we will add with the color
management settings in Chapter 8.

What is next?
Even after those tweaks and adjustments to the scene effects, we can still
improve the overall quality of our lights a lot. If you take a close look at the
scene, it is still too dark. The shadows and Ambient Occlusion helped to
build a more realistic scene, but we can go further.

The next chapter will cover the aspects of exterior lighting in Eevee for any
project, where you can use an HDR map for the environment and a light
setup unique to external scenes.

Is it too different from an interior scene? You will see several aspects where
we use an identical setup for external scenes, with a few key differences.
The probes placement and handling will require special attention because
they have a limited volume.

If you want to learn how to improve your projects in any context using the
Color Management settings of Eevee, you can go straight to chapter 8.
There you will learn how to give scenes a significant boost in lighting
using controls like the exposure and also gamma settings.

Those are settings that can completely transform any project, and you will
learn how to take advantage of them along with other options, like the
Depth of Field, which can also change the render.
Chapter 7 - External lights with Eevee
In chapter 6, we learned how to apply the workflow for Eevee in an interior
scene that uses all the techniques explained along with the book. You had
the opportunity to see how to place probes, lights, and materials for an
interior scene.

How different is an interior from an exterior render in Eevee? The primary
difference between the scenes is the scale of the objects we will be
dealing with. Interiors have small enclosed spaces, while exterior
scenes feature large open areas.

That will present unique challenges to using Eevee, like working with indirect
lights and probes. In this chapter, you will find our workflow adapted
to an external scene to handle those unique problems.

Here is what you will learn in this chapter:

Apply the Eevee rendering workflow to external projects
Setup environment lights with an HDR map
Use an additional light source for shadow casting
Apply and control PBR materials with correct tiling
Use glass materials to reflect the HDR maps
Setup and place probes for indirect lights and reflections
Use effects to enhance the depth perception of the scene
7.1 Eevee checklist for exteriors
In chapter 6, we made a list of all the steps necessary to prepare an interior
scene for rendering with Eevee. The checklist helps you create a workflow
of actions to streamline the production of any scene using Eevee. Is it
different for an exterior scene?

An exterior scene presents some unique lighting challenges in Eevee,
especially regarding the environment map, which works much better with
an HDR. Here is the Eevee workflow for exteriors:

1. Check the model to make sure objects have some thickness and that
there are no unnecessary gaps between objects or overlapping faces.
2. Add environmental lights using an HDR map with an Area Light or
Sun.
3. Adjust the HDR rotation to match the background.
4. Align the active camera with the project.
5. Add a light source that can cast shadows (Sun or Area).
6. Create all materials for the scene. If you can, use only PBR materials
for your surfaces.
7. Create probes to compute indirect lights and reflections.
8. Review the solution to fix potential shading and light leak problems.
9. Add effects and post-production to finish your image.

As you can see from the list of steps, you should use an HDR for the
environment map this time. In the checklist for interiors, you could either
use an HDR or go with Area Lights. For open spaces, you will have a much
harder time trying to find the best settings with Area Lights alone.
The HDR map will help in several ways to create a realistic render for
exteriors with Eevee. Here are the main benefits of using an HDR for
exteriors:

1. Generate natural light using color variations from the sky
2. Provide reflections for glossy surfaces
3. Compose the background of your render

Since we will also encounter far fewer problems with light leaks in
exterior scenes, we can use lights that could generate shadow problems in
interior scenes, like a Sun. Notice that in step two, you can use either a
Sun or an Area Light in the scene.

Using a Sun brings many benefits for shadow casting, like easier-to-manage
hard-edged shadows.

The main reason you have to use a light and an environmental texture is
that Eevee can’t generate shadows for HDR maps. If you only use an HDR
map as the light source for the scene, you won't get any shadow casting for
the objects, which will compromise the realism of your scene.

During the rest of the chapter, we will use all the steps to set up and prepare
the scene for rendering. Some of the procedures will use similar options
from the interior render, but others will be unique to external scenes.

We will apply the workflow to create the image shown in Figure 7.1.
Figure 7.1 - External model in Eevee

The model is simple but will help us understand and apply the concepts
from the checklist.

7.2 Exterior models for Eevee


For interior models, you should keep a close eye on the thickness of the
models; thin geometry could lead to light leaks during the indirect light
calculations. That is one of the main causes of leak-related problems in
Eevee.

Regarding external models, that won't be much of a problem because most
models will have a volumetric shape. If you take a close look at the model
we have for the project, it already features shapes with considerable width
(Figure 7.2).

Figure 7.2 - 3D Models overview

Since they are not flat and have some thickness, you will not encounter
light leaks on those objects.

The main issue that can appear in external models, and which you should
fix to avoid texturing problems, is overlapping faces. After the modeling
process, you may have some overlapping faces in your models. They appear
in the 3D Viewport as darker faces.

When does that type of modeling problem usually occur? For instance, if
you press the SHIFT+D keys to duplicate an object and cancel the
transformation of the copy with the ESC key, you will end up with
overlapping geometry unless you press the CTRL+Z keys to undo the
duplication.
The result will eventually show up in the render. To fix those overlapping
faces, you can use the Merge option from the Context menu in Blender.
Make sure the selection mode is set to vertices in Edit Mode.

Select the overlapping vertices and right-click to open the Context menu.
You can also use the A key to select all elements. From that menu, go to
Merge → By Distance (Figure 7.3).
Figure 7.3 - Merge by distance

By keeping the distance to merge set to zero, you will remove all
overlapping faces (Figure 7.4).

Figure 7.4 - Merging vertices

The problem could also appear when you import models from external
sources like an OBJ or FBX file that you bring to Blender.

Info: Merge by Distance is the name Blender now uses for a tool that had a
different name for several years. In previous versions of Blender, the tool
was called Remove Doubles.
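The idea behind Merge by Distance can be sketched in a few lines of plain Python: vertices closer to each other than a threshold collapse into one. This is only an illustration of the concept, not Blender's actual implementation.

```python
def merge_by_distance(vertices, threshold=0.0001):
    """Collapse vertices that are closer than `threshold`.

    `vertices` is a list of (x, y, z) tuples. Returns the surviving
    vertices. A naive O(n^2) sketch of the idea behind Blender's
    Merge by Distance, not its real implementation.
    """
    merged = []
    for v in vertices:
        # Keep the vertex only if no surviving vertex is within the threshold
        if not any(sum((a - b) ** 2 for a, b in zip(v, m)) <= threshold ** 2
                   for m in merged):
            merged.append(v)
    return merged

# Two corners duplicated by SHIFT+D and ESC collapse back into one vertex each
verts = [(0, 0, 0), (0, 0, 0), (1, 0, 0), (1, 0, 0.00005)]
print(len(merge_by_distance(verts, threshold=0.0001)))  # → 2
```

With the threshold at zero, only exactly coincident vertices merge, which is precisely the case produced by a canceled duplication.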
7.3 Environmental lights for exteriors
When you are dealing with interior models, it might not be a good idea to
use an HDR image with Eevee, because it will not generate shadows, and
the benefit of more natural light will not compensate for the chance of
getting light leaks. Instead of using an HDR for interiors, it is much better
to go with an Area Light.

For exterior render projects, the HDR map becomes more useful for both
lighting and composition. You can use the natural shading offered by an
HDR map applied as an environment map, combined with another light
source to create shadows.

In our project, you will start by adding the HDR map shown in Figure 7.5
to the World tab as an Environment map.

Figure 7.5 - HDR map for background

The map has a sky background that will be perfect to use as a composition
for the final render. Use the Rendered mode from Eevee to align and place
the HDR in the proper location.

At the World tab, you can also set the intensity of your HDR map shading
using the Strength value (Figure 7.6).

Figure 7.6 - HDR intensity

Keep in mind that we will later add a light source to generate shadows for
the scene. The HDR map should work now as a reference for us to later
place the camera.

7.3.1 Environmental map rotation


What if you don't like the angle and alignment of the HDR map? We can
use the Mapping and Texture Coordinate options to control the rotation as
we did in chapter 2. The process we used back in chapter 2 involved the
Shader Editor, which is a powerful and flexible option.

However, you can also use the World tab alone to set up the Nodes that
control an HDR rotation. At the World tab, click the small circle next to
the Vector option of the Environment map (Figure 7.7).
Figure 7.7 - Vector input socket

From the options that appear for the Vector, find and pick Mapping. In the
Mapping options, you will see another Vector field, which represents the
input socket for that Node. Click, once again, on the small circle on the
right and choose the Generated option from the Texture Coordinate group
(Figure 7.8).
Figure 7.8 - Texture coordinate
Use the Rotation controls from the Mapping to set the rotation of your HDR
map for the render. Keep your shading mode as Rendered to get instant
feedback about the location of your HDR.
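To picture what the Mapping Node's Z rotation does, you can think of it as rotating the lookup direction before it samples the equirectangular HDR, which slides the image horizontally. Here is a minimal sketch of that idea in plain Python; it is an illustration of the geometry, not Blender's shading code.

```python
import math

def rotate_z(direction, angle_deg):
    """Rotate a lookup direction about the Z axis, as the Mapping Node's
    Z rotation effectively does for an environment texture."""
    a = math.radians(angle_deg)
    x, y, z = direction
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def longitude(direction):
    """Horizontal angle (degrees) that the direction samples on the HDR."""
    x, y, _ = direction
    return math.degrees(math.atan2(y, x))

view = (1.0, 0.0, 0.0)                            # looking along +X
print(round(longitude(view), 3))                  # → 0.0
print(round(longitude(rotate_z(view, 90)), 3))    # → 90.0
```

A 90-degree Z rotation shifts the sampled longitude by a quarter turn, which is why adjusting the Rotation field appears to spin the background around the scene.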

Tip: If you don't want the HDR to appear in your final render you can
always enable the Transparent option in the Film settings. You will find
those options in the Render tab for Eevee.

7.4 Camera settings for exteriors


The HDR in the background will help you find the best angle for your
camera in an exterior render. With the background ready, you can look for
the best framing. How do you align the camera? The best option is the
same shortcut used in chapter 6 for interior scenes.

You can use the 3D navigation options to find a viewing angle that you like
and press the CTRL+ALT+Numpad 0 keys. That will align the active
camera with the view (Figure 7.9).
Figure 7.9 - Aligned camera

Use the camera settings to change the Focal Length, and the transformation
shortcuts to fine-tune the framing. Select the camera border to make the
adjustments.

When you get the perfect alignment, you can go back to the World tab and
also change the settings for the HDR rotation.

7.5 Adding a Sunlight in Eevee


With the camera aligned, we can develop our lighting further. The scene
already has a large environment map in the background, and now we need
some shadows. For external scenes, we can use a Sun from Blender, which
will cast a constant, strong flow of light into the scene.

Using a Sun to create shadows for interiors is dangerous because it has a
high chance of generating light leaks. However, in exterior visualization
projects, we don't have that problem.

By pressing the SHIFT+A keys and going to the light options, we can add
a Sun to the scene. With the Sun still selected, move it up and away from
the 3D model. Since we already have the camera with the final framing,
you can adjust the Sun rotation with the R key to make the light come
from behind the camera (Figure 7.10).

Figure 7.10 - Sun behind camera

After you have the Sun in the correct location, we can open its options and
make adjustments to the shadows and angle. To improve the shadows and
other aspects of the Sun, you can change the following options:

Bias: 0.75
Exponent: 1.5
Softness: 2.5
Contact shadows: Enabled

Those settings will help create an excellent-looking daylight simulation
with a Sun casting hard shadows.

7.6 Materials and textures for exteriors


The materials and textures for exteriors will use the same PBR surfaces
from the previous examples in chapters 3 and 6. For exterior scenes, you
will have to pay attention to one aspect of those surfaces: since we are
handling scenes at a much larger scale than interiors, you must be extra
careful about tiling.

When adding materials to the surfaces, you will probably have to add two
Nodes:

Texture Coordinate
Mapping

For instance, look at the brick fence from the scene in Figure 7.11.
Figure 7.11 - Brick fence

That material is using the default tiling from our PBR material with no
additional controls for the surface. Notice how big the bricks are relative
to the rest of the scene. We have to add both Nodes to the PBR material to
scale the textures so the bricks appear at the correct size (Figure 7.12).

Figure 7.12 - Scaled textures

Finding the best scale for each texture is the biggest challenge in exterior
models because each PBR material will require a unique setting for the
scale. Luckily for us, we can use the Rendered mode in Eevee to get
real-time feedback on textures without the need for render tests.
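The scale you need is easy to reason about: with Generated coordinates the texture spans the object once, so the Mapping Node's scale is roughly the object size divided by the real-world size one tile of the texture should cover. A small sketch of that rule of thumb, with hypothetical dimensions:

```python
def mapping_scale(object_size_m, tile_size_m):
    """How many times a texture tile must repeat across an object so that
    each tile covers `tile_size_m` meters of the surface. With Generated
    coordinates the texture spans the object once, so this repeat count
    is approximately the value to enter in the Mapping Node's Scale.
    The sizes below are hypothetical, for illustration only.
    """
    return object_size_m / tile_size_m

# A 6 m long brick fence with a texture photographed over a 1.5 m area
print(mapping_scale(6.0, 1.5))  # → 4.0
```

In practice you would dial the value in while watching the Rendered viewport, but this estimate gives you a sensible starting point instead of guessing blindly.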

Tip: If you want to review how to set up PBR materials and control tiling,
go back to chapter 3, where we discuss the workflow to handle PBR
materials.

7.6.1 Transparency and refractions with Eevee


Back in chapter 3, we learned how to create glass materials for Eevee using
two different methods. For exterior projects, we can use a much simpler
version of that material to reflect the surroundings of your scene. That will
help a lot with the realism of the final render.

For instance, we can apply the glass material to the windows of our project.
With the glass object selected, we can apply a new material and use a
Principled BSDF shader.

In the settings for the shader, you must:

Set the Transmission to 1.0: that will make the material fully
transparent
Set the Roughness to zero: with a value of zero, you will get crisp
reflections from the glass
Enable Screen Space Refraction in the Settings panel: the option
will help the material reflect the surroundings of the scene, even
without a probe

The last setting is what will make a huge difference in our glass setup. But
it will only work if you also enable the Refraction option under Screen
Space Reflections (Figure 7.13).
Figure 7.13 - Glass for windows

Since we are using an HDR map as the background, we will have some
great reflections for the windows. Don’t forget to enable the Screen Space
Reflections and the Refraction option.

Info: So far, we have seen three different methods to create glass in Eevee.
Which is the best option? That depends on the visuals you want to achieve
for the glass surface. All three options will give you great results, but with
differences in the final reflections.

7.7 Probes and indirect lights for exteriors


It is time to add some probes to the scene to work with indirect lights and
reflections. Perhaps one of the most significant differences between
interior and exterior scenarios is the way we set up the probes. For
interiors, where we have a confined space that benefits from indirect
lights, it is imperative to add an Irradiance Volume.

Otherwise, you will only get the lights from direct sources in the scene. In
external scenes, you will only have a few places in the model that will
benefit from indirect bounces.

For instance, in our example, you will notice only a few areas that will
most likely receive indirect light bounces (Figure 7.14).

Figure 7.14 - Indirect light bounces locations

Those locations in the model have small enclosed spaces that will enable
light to bounce around a few times. Using your camera viewing angle as a
reference, we will add the Irradiance Volume probe to the model and scale
it to cover all the front space of the model (Figure 7.15).
Figure 7.15 - Irradiance Volume area

Make the Irradiance Volume cover only the areas visible to the camera,
which will optimize the calculations later. We also have some surfaces that
represent glass, which will benefit from a Reflection Cubemap. Add the
probe and place it at the front of the scene (Figure 7.16).

Figure 7.16 - Probes for exteriors

Regarding the Irradiance Volume, you should increase the number of
samples in each axis to 12. That will make the light bounces interact with
more surfaces. Also, add a Reflection Cubemap and scale it up until the
whole scene is inside the probe.
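To see why the per-axis resolution matters, note that the bake cost grows with the product of the three axes: the probe bakes one irradiance sample per grid cell. A quick sketch of that arithmetic, using a hypothetical volume size:

```python
def irradiance_grid(resolution=(12, 12, 12), size=(10.0, 10.0, 3.0)):
    """Total sample count and per-axis spacing for an Irradiance Volume.

    A rough sketch of the cost of raising the per-axis resolution:
    memory and bake time grow with the product of the three axes.
    The volume size in meters is hypothetical, for illustration.
    """
    rx, ry, rz = resolution
    total = rx * ry * rz
    spacing = tuple(s / r for s, r in zip(size, resolution))
    return total, spacing

total, spacing = irradiance_grid()
print(total)  # → 1728 samples to bake at 12 per axis
```

Going from the default resolution to 12 per axis multiplies the number of baked samples quickly, which is why it makes sense to keep the volume tight around the areas the camera actually sees.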
Once you have the probes in place, we can go to the Render tab and trigger
the Indirect light baking (Figure 7.17).

Figure 7.17 - Baking for exterior renders

The baking results will help us evaluate the location of the Irradiance
Volume probe, and if necessary, you can make adjustments to the samples
and location. It will not have the same impact on the render as it would in
interior visualizations.

7.8 Effects and render for exteriors


The indirect lights calculations will help you to get better visuals for the
exterior visualization project, but the render will still need some
enhancements. We can use some of the same options from chapter 6, where
we added the following effects to the render:

Ambient Occlusion
Screen Space Reflections
Bloom

You can enable all the effects using the Render tab options. With the Screen
Space Reflection, we will get better reflections for glossy surfaces. The
Bloom effect is useful for external scenes simulating bright daylight
because it will add a glow effect on surfaces receiving direct light (Figure
7.18).

Figure 7.18 - Bloom effect

From all three effects, you can use the Ambient Occlusion to add some
depth to the scene by increasing the Distance value. If you increase the
Distance, you will get more prominent contact shadows for objects, which
will give the impression of more depth for the scene (Figure 7.20).

Figure 7.20 - Ambient Occlusion distance

As a result, you will get a render that has a visual aspect of depth. It still
looks dark, but we will improve the visuals using some color correction
techniques in chapter 8.
What is next?
The process to set up and render an exterior scene has many similarities
with an interior, and you will be able to reuse a big part of the workflow.
Since we are dealing with a context that has a scene with a larger scale and
open spaces, we can experiment more with light sources like the Sun and
use an HDR map in the background.

Using those two options in interior scenes would eventually increase the
chance of light leaks, but not in exterior renders.

What is the next step? After processing the scene and adding effects, we
still have to work with the Color Management options to further enhance
the render in Eevee. There we will find options to improve brightness and
color balance.

In the next chapter, you will learn how to use the exposure and gamma
controls from that panel. Besides those effects, we will also take a look at
how to create and use volumetric lights in Eevee and how to work with a
studio setup with an infinite background.
Chapter 8 - Color management and
effects with Eevee
After you finish a project in Eevee, you may have a render that still needs
some work regarding color or contrast balance. Some artists would take that
image and bring it to image editing software. In Blender, you will find a
panel that can help you make those adjustments to any render.

In the Color Management panel, you will be able to fine-tune aspects of
your render, like exposure and gamma, which have the potential to
transform any render you are working on. It can take a project and make it
brighter or enhance contrast to make it look more appealing.

Besides the color management options, you will also find in this chapter
explanations of how to add several effects to a project. For instance, you
will learn how to work with Depth of Field in Eevee, which can create
blurred backgrounds for your render. It works much faster than in Cycles.

You will also find a detailed explanation about the use of volumetric lights
for Eevee that can add depth to a scene by showing the light beams. To get
that effect, you need an object container and a particular type of shader
called Volume Scatter.

To finish the chapter, you will see how to create a studio scene with an
infinite background to make renders and present products in real-time.

Here is a list of what you will learn in this chapter:

How to work with volumetric lights with Eevee
Using the Principled Volume shader
Create objects that emit light with materials
Enhance your renders using color controls
Work with exposure and gamma settings
Add Depth of Field to renders
Create a studio scene with an infinite background

8.1 Using volumetric lights with Eevee


The use of volumetric lights is a great way to add realism to any scene, and
Eevee does support such volumetrics. What is a volumetric light? You will
quickly identify those types of lights when you see light passing through
fog: you can see the light beams in that type of scene.

As a result, you will see the volume lights for that scene. In Blender, we can
create that type of effect using a combination of materials and render
settings for Eevee.

You will need a few objects in your scene to work with volumetric lights in
Eevee. The renderer needs a container object with a special type of shader
as its material. You will control the density and other aspects of the effect
in the Material tab. In Eevee, we also have to enable the use of volumetric
lights in the Render tab.

If you look at the Render tab, you will see an option for volumetrics
(Figure 8.1).
Figure 8.1 - Volumetrics for rendering

You must enable the Volumetric Lighting and if you need shadows to
interact with the effect, also enable the Volumetric Shadow. Like other
effects in Eevee, you will have several controls to adjust the way your lights
will appear:

Samples: Since the effect uses some processing to display the volumes, it
has a samples control. If you think the volume is showing noise, you can
increase the samples.
Tile size: The size, in pixels, of the grid tiles Eevee uses to compute the
volumes. To get better-looking beams of light, you can decrease the Tile
Size to a smaller value.
Distribution: A value that controls how many samples will appear close
to the camera. The default value of "0.8" biases most of the samples
toward the camera.
Start/End: The distances from the camera where the effect begins and
ends.

If you think the effect you see for volumetric lights doesn't have the quality
you were expecting, use those settings to increase samples and resolution.

8.1.1 Volumetric Shader


Having the volumetric option enabled won't be enough to display the effect
in Eevee. It would be difficult to apply the volumetric lights to the entire
scene, and for that reason, we need a container object to limit the area of
our volume light.

You can use any Mesh object as a container and apply the Principled
Volume shader. For instance, we can create a cube in the scene and assign a
material with that particular shader (Figure 8.2).
Figure 8.2 - Volume Scatter shader

After you create the object and assign the shader, you will place the camera
inside the container area. Otherwise, the volumetric effect won't be visible.
You will start to see the volume lights when rendering the scene (Figure
8.3).
Figure 8.3 - Volumetric effect

In Figure 8.3, you can see the effect from a Spotlight. The density of your
Principled Volume will start at 1.0. Reduce the value to 0.1 to see your
lights.

You can change settings in the material to control certain aspects of the
effect like the density of your volume and also the color used to create the
foggy aspect of your effect.

Info: Keep in mind that for the Principled Volume to work, you must
connect it to the Volume socket, not Surface.

8.2 Emission shader with Eevee


A great way to create visual effects with Eevee regarding lights is the use of
a special type of shader called Emission. If you take a look at the list of
available shaders from the Material tab, you will see the Emission (Figure
8.4).
Figure 8.4 - Emission shader

After you add the Emission shader to an object, it will start to contribute to
the lights of your project. At least they should contribute to the lights. In
Cycles, you don't have to perform any extra steps to use materials to emit
light. For Eevee, we have to use the Irradiance Volume and the Indirect
Lights baking process.

You can select an object and add the material with the shader set to
Emission (Figure 8.5).
Figure 8.5 - Object with Emission

If you render the scene, you will notice that even with an emission shader,
you won't get any lights from the object. The solution for that type of object
is to use an Irradiance Volume and bake indirect lights (Figure 8.6).
Figure 8.6 - Emission after the baking process

As a result, you will see the objects contributing to the overall illumination
of the scene. However, those types of lights have a limitation in Eevee:
they will not cast shadows.

If you need shadows for the light coming from those planes, you will have
to add another light source close to them, like an Area Light with a small
amount of energy, to generate the shadows.

8.3 Color management


The projects you are working on in Blender and rendering with Eevee
might have everything they need regarding lights, probes, and indirect light
baking and still not look "right." You might experience the same type of
problem when rendering with Cycles in Blender.

At the Render tab, you will find a collection of tools and options that can
transform a project from a flat and unbalanced light into a realistic and
bright image. With the Color Management options, you will find the
settings needed to make significant changes to your renders (Figure 8.7).
Figure 8.7 - Color management options

At the Color Management options you will find:

Look: A quick panel that offers templates for contrast settings for the
scene. You can choose from options starting with "Very Low" and
going up to "Very High." The scene will have the colors changed to
match the template you choose.
Exposure: The exposure settings will help you manage the brightness
of a render with both Eevee and Cycles. You get a slider that starts
with a value of zero for no changes in exposure. If you need
additional lights for a scene, you can increase the amount to get more
light to the scene.
Gamma: Do you think your render needs a change in the balance of
whites and blacks? That is what you can do with the gamma settings.
The slider lets you control the balance between dark and bright colors.
You can even use a more advanced option to handle gamma settings
using curves for more visual control.

No matter what type of change or enhancement you need to apply to a
particular render, you should definitely try the color management settings
to get an improvement in the render.

Unfortunately, you won't find any templates for best settings to use with the
values available at the color management. The main reason for that is the
unique nature of each project and scene. You will have to manage lights,
materials, and environments that will only apply to your scene.

Info: One of the many benefits of using Eevee for rendering is that you can
quickly evaluate the results of adjustments from the color management in
real-time. Activate the Rendered mode and change the settings to find the
best possible setup for a render.
8.3.1 Exposure settings
In some scenes, you will add an environmental texture and also several
types of lights to the project and will still feel that it looks too dark. An easy
and quick way to increase the light levels at once for your scene is with the
use of exposure settings.

If you have previous experience with photography, the term exposure will
be familiar. In photography, exposure is usually a measure of the time the
camera sensor is exposed to light: more time receiving light generates a
brighter image.

For rendering in Eevee, the same rule applies, and you will get a brighter
image by increasing the exposure setting. The exposure control is simple to
operate and only requires you to choose the level you wish to use
(Figure 8.8).

Figure 8.8 - Exposure settings

The default value for exposure is zero, which will not add or subtract any
light from the scene.

If we take one of the scenes we created in chapter 6 and compare it with a
higher value for the exposure, you will notice an immediate gain in
brightness (Figure 8.9).

Figure 8.9 - Scene with higher exposure

Using the exposure settings will make your life easier in terms of finding
the best levels for lights in a scene. You can add several Area Lights to a
project and use an HDR in the background and still have a dark scene. A
quick change in exposure settings can solve the problem.

Unlike Cycles, we will have the benefit of evaluating the results of the
exposure settings in real-time using Eevee.

Watch out for a side effect of the exposure settings: overexposure. If you
increase the exposure too far, some parts of your image will start to look
burned, with excessive white.
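The exposure slider works in photographic stops: the linear colors are scaled by 2 raised to the exposure value, so each increment of one doubles the brightness. A quick sketch of that math, including the overexposure case:

```python
def apply_exposure(linear_value, exposure_stops):
    """Exposure in color management scales linear colors by 2**exposure,
    so each +1 stop doubles the brightness (a sketch of the formula)."""
    return linear_value * 2 ** exposure_stops

pixel = 0.25
print(apply_exposure(pixel, 0))  # → 0.25 (default: no change)
print(apply_exposure(pixel, 2))  # → 1.0  (two stops brighter)
print(apply_exposure(pixel, 3))  # → 2.0  (above 1.0: this area will look burned)
```

The last line shows why overexposure happens: once a scaled value passes 1.0, the display transform can only show pure white, and the detail in that area is lost.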

8.3.2 Gamma settings for renders


The use of gamma has great potential to change the overall look of a scene
and enhance realism. However, using the slider to make changes is not easy
because most people don't have experience managing the balance of blacks
and whites in images.

When you open an image editor like Photoshop or GIMP, you will find
tools like a histogram that will display a graph for the black and white
balance. From that graph, you can find if you need an additional boost on
either of those tones.

To edit and change an image in Eevee, you will have to develop a critical
eye to evaluate the needs of each image. Since you are the author of the
project, you will have to find the best results from your point of view. Do
you need more blacks? Or should the image receive more whites?

Too much of either will result in a darker image or a washed-out effect.
The challenge is to find a balance between them. The gamma settings will
always start with a value of one (Figure 8.10).
Figure 8.10 - Gamma settings

If you change the values to a lower number, you will increase the amount of
black in the images. A higher value will add more whites (Figure 8.11).
Figure 8.11 - Gamma results

As always, you will be able to evaluate the results in Eevee using the real-
time render preview. That will make the process of finding the point with
the best balance a lot easier.
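The gamma slider can be pictured as the usual power-law adjustment, where values are raised to 1/gamma. This is a sketch of the common gamma-correction formula, not necessarily Blender's exact internal code, but it matches the behavior described above:

```python
def apply_gamma(value, gamma):
    """Gamma correction as commonly applied in color management:
    values are raised to 1/gamma, so gamma > 1 brightens mid-tones
    and gamma < 1 darkens them. Pure black and white stay fixed."""
    return value ** (1.0 / gamma)

mid = 0.25
print(apply_gamma(mid, 1.0))  # → 0.25 (default: no change)
print(apply_gamma(mid, 2.0))  # → 0.5  (higher gamma: more whites)
print(apply_gamma(1.0, 2.0))  # → 1.0  (pure white is unaffected)
```

Note that only the mid-tones move: the endpoints stay at zero and one, which is why gamma changes the black/white balance without clipping the image the way exposure can.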
8.3.3 Gamma with curves
If you want additional control over the gamma settings, you can enable the
Curve option. When you enable it, a small panel will appear in the color
management settings (Figure 8.12).

Figure 8.12 - Curve options

The graph will start with a straight line defining the gamma settings, which
we can change by modifying the curve. How do you change the curve?
Here are the procedures:

Click and drag anywhere on the curve to add a node and deform the
graph
Click and drag an existing node to move it
To remove a node, click on it to select it and use the "X" button
The curves accept changes for the overall color or each spectrum of
the RGB. A selector at the top lets you pick the channel from C, R, G,
and B
For each channel, you can make changes to the RGB mix for the
blacks and whites using numeric values
To reset all the gamma settings, use the "Reset" button at the bottom

Working with the graph view can become a challenge until you find the best
balance between the vertical (black) and horizontal (white) fields.
8.4 Depth of field in Eevee
An effect that can add a significant level of realism to your projects in
Eevee is Depth of Field, or DoF. Have you ever seen an image where parts
were out of focus? That is what you can do with the Depth of Field effect.
After you apply the effect to a scene, you will get visuals like Figure 8.13.

Figure 8.13 - Depth of Field effect

You can notice that part of the image is out of focus and with a blur effect.
The effect works with both Cycles and Eevee using a similar workflow. To
create the blur on your image, you have to follow two simple rules:

You must view the scene from the camera
You need a reference object to focus on

The reference object is essential to help Eevee identify which parts of the
image it must keep in focus; everything else will receive the blur effect.
For instance, take a scene like the one shown in Figure 8.14.

Figure 8.14 - Scene for DoF effect

In the scene, you can see that we have a camera and also a couple of
objects. What will be the focused object? You can either pick one of the 3D
models in the scene or create an Empty object in Blender to use as a
reference. Using an Empty is better because it allows you to move the
focus point around without changing your 3D scene.

To create an Empty, you can press the SHIFT+A keys and choose Empty
→ Plain Axes. Use a move transformation to place the object at the
location you wish your camera to focus on (Figure 8.15).
Figure 8.15 - Location to focus

The next step is to change your view to the camera. Press Numpad 0 to
look through the camera, and select the camera object. You can do this by
clicking the camera in the Outliner or its rectangular border on the screen.
With the camera selected, open the Object Data tab in the Properties
Editor (Figure 8.16).

Figure 8.16 - Camera options


In the camera options, locate the Depth of Field settings and, in the
"Focus Object" field, select the Empty from the list. If you changed the
name of your Empty, make sure you pick the name currently in use.

At first, you won't see any difference in your image after selecting the
Empty as the focus object. To enhance the Depth of Field effect, we have
to change the F-Stop value. The F-Stop in photography is the ratio of the
focal length to the diameter of the lens aperture.

What values produce a strong Depth of Field effect? If you reduce the
value, you will get an extreme defocus effect. For instance, use a value
of "0.1" to get a heavy blur in your scene.
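The reason a lower F-Stop means more blur comes straight from the thin-lens model: dividing the focal length by a smaller f-number gives a wider aperture, which projects off-focus points as larger blur circles. The sketch below illustrates that relationship with the standard circle-of-confusion formula; it is a conceptual model with illustrative numbers, not Eevee's actual implementation.

```python
def coc_diameter_mm(focal_mm, f_stop, focus_dist_mm, subject_dist_mm):
    """Circle-of-confusion diameter (thin-lens model) for a point at
    subject_dist_mm when the lens is focused at focus_dist_mm."""
    aperture = focal_mm / f_stop  # smaller f-stop -> wider aperture
    return (aperture
            * abs(subject_dist_mm - focus_dist_mm) / subject_dist_mm
            * focal_mm / (focus_dist_mm - focal_mm))

# A point on the focus plane stays perfectly sharp...
assert coc_diameter_mm(50, 0.1, 3000, 3000) == 0.0

# ...while, for the same background point, f/0.1 blurs far more than f/2.8.
heavy = coc_diameter_mm(50, 0.1, 3000, 6000)
mild = coc_diameter_mm(50, 2.8, 3000, 6000)
assert heavy > mild
```

This is why the extreme value of 0.1 defocuses almost everything outside the focus plane, while conventional photographic values such as 2.8 or 5.6 keep the blur subtle.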

You can move the focus object in real-time and watch the effect update in
your 3D Viewport (Figure 8.17).

Figure 8.17 - Depth of Field with Eevee

You can increase the number of Blades to get a better quality effect and
increase the F-Stop to reduce the blur.
Another way to control the Depth of Field in Eevee is the Max Size option
in the Render tab, which sets the maximum radius Eevee uses to blur
your pixels (Figure 8.18).

Figure 8.18 - Max Size for DoF

If you reduce the size to a maximum of 2 instead of the default value, you
will get only a small amount of blur on your pixels. The Depth of Field
effect adds only a small amount of processing to your scene at render time.
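Conceptually, Max Size acts as a ceiling on the per-pixel blur radius: whatever blur the F-Stop would produce, it never grows past the cap. The toy sketch below shows that clamp; the real work happens on the GPU and the exact units are internal to Eevee, so treat this as an illustration only.

```python
def clamped_blur_radius(coc_radius_px, max_size_px):
    """Max-Size-style clamp: the per-pixel blur never exceeds the cap."""
    return min(coc_radius_px, max_size_px)

# A generous cap leaves a strong blur untouched...
assert clamped_blur_radius(35.0, 100.0) == 35.0
# ...while Max Size = 2 flattens the same blur down to 2 pixels.
assert clamped_blur_radius(35.0, 2.0) == 2.0
```

This is why lowering Max Size tames an aggressive F-Stop without changing the focus distance itself.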

One thing you must keep in mind regarding Depth of Field in Eevee is
that it has limitations in image quality and won't look as good as in Cycles.
Still, scenes that use Depth of Field gain a much higher level of realism
from the effect.

8.5 Studio lights with Eevee


If you need a generic scene with an infinite background, like most studio
spaces, you can easily make one with Eevee. That is a perfect setup to
render single objects with bright light and very few shadows. The concept
behind a studio is to have large light panels placed to the sides and behind
the camera, plus an infinite background with no visible borders.

To make such a scene in Blender and render it with Eevee, you can start
with a plane in your scene and use the extrude tool to make the background.
After you add the plane:

1. Select the object and go to Edit Mode
2. Apply a scale on the Y-axis
3. Select one edge of the plane
4. Press the E key and then the X key to extrude along the X-axis
5. Make three extrusions and move the edges along the Z-axis
6. The result will be a curved object
7. Create one last extrusion to add some height to the background
8. Using the CTRL+R keys, add new loops to the four corners of the
background (Figure 8.19)
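The three extrusions moved along the Z-axis approximate a quarter-circle cove that blends the floor smoothly into the wall, which is what hides the corner and makes the background read as infinite. The quick numeric sketch below builds that side profile as (x, z) points; the function name and dimensions are illustrative, not anything from Blender itself.

```python
from math import cos, sin, pi, isclose

def cove_profile(floor_len, radius, wall_height, arc_steps=3):
    """Side profile of an 'infinite' backdrop: a flat floor, a curved
    cove (approximated by arc_steps segments), then a vertical wall."""
    pts = [(0.0, 0.0), (floor_len, 0.0)]          # the floor
    cx, cz = floor_len, radius                     # arc center above floor end
    for i in range(1, arc_steps + 1):              # quarter-circle cove
        a = -pi / 2 + (pi / 2) * i / arc_steps
        pts.append((cx + radius * cos(a), cz + radius * sin(a)))
    pts.append((cx + radius, radius + wall_height))  # top of the wall
    return pts

profile = cove_profile(floor_len=4.0, radius=1.0, wall_height=2.0)
assert profile[0] == (0.0, 0.0)                 # starts flat on the floor
assert isclose(profile[-1][0], profile[-2][0])  # the wall is vertical
```

More extrusion steps (a larger `arc_steps`) give a smoother curve, which is exactly what the Subsurf modifier will do for you in the next step.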
Figure 8.19 - Background for studio lights

Make additional extrusions until you get the model shown in Figure 8.20.
Figure 8.20 - Background with all faces

Apply a Subsurf modifier to the background in Object Mode and, from the
context menu, which you can open with a right-click, choose Shade
Smooth.

As a result, you will have an infinite background for a studio scene (Figure
8.21).
Figure 8.21 - Infinite background

Apply a white material to the object and add three Area lights to the scene:
one on each side of the scene, and one behind the camera (Figure
8.22).
Figure 8.22 - Area lights for the studio scene

Since we are working with Eevee, you will also need an Irradiance Volume
and a Reflection Cubemap probe in the scene. Add them and make sure
their scale is large enough to contain the full studio scene inside the
probes (Figure 8.23).
Figure 8.23 - Probes for the studio

You can now add an object at the center of your studio scene and bake the
indirect lighting and reflections for the probes. The result is a perfect
scene for presenting products and 3D models with an infinite background
(Figure 8.24).
Figure 8.24 - Studio scene for Eevee

You can save that scene and swap the objects whenever you need to make
product presentations.

What is next?
With the release of Eevee in Blender 2.8, we have a powerful tool in our
hands that can deliver quality images in real-time. If you are an artist who
usually renders projects with Cycles, you will notice that Eevee has both
highlights and drawbacks.

The most significant advantage of Eevee is getting your renders in a
couple of seconds. Beyond a possible baking step for indirect lights, there
is no waiting: you will have your images or animations right away. For
animation projects, it is a game-changing feature to have your work
rendered in minutes instead of days.
Of course, you will have to learn and manage the limitations of real-time
render technology. For instance, you have fewer shading options for
creating and handling complex materials, and you must bake indirect lights
and reflections before rendering.

Once you learn how to manage those limitations and take advantage of a
powerful PBR material system, you will be able to create incredible
projects with Eevee and also add some effects to make them look even
more realistic.

The next step is to put all your newly acquired knowledge about Eevee into
practice and start projects using real-time rendering. That is by far the
best way to develop your skills even further and find solutions to common
problems in Eevee, whether a light leak or the setup of a probe.

You now have a solid base to start taking advantage of real-time rendering
with Blender.
