Allan Brito
Reference: blender3darchitect.com
Edition: 1st
You will find more about him and the use of Blender for architecture in
blender3darchitect.com, where he writes articles about the subject on a
daily basis.
Who should read this book?
Ever since the release of Blender 2.8 and the debut of Eevee as an option
for rendering, artists have been struggling with the unique challenges of a
real-time render engine. How to use probes? How to get indirect lights in
Eevee? How to fix light leaks?
If you are starting to use Blender and would like to know how to set up and
use some of the essential options of Eevee, this book is for you.
You will learn how to render and apply options to generate indirect lights,
control shadow, and handle materials.
One of the main problems with Eevee nowadays is that people using
Blender are not familiar with real-time rendering. How to use a probe? How
to get materials with the best settings?
Throughout the book, you will find a significant amount of information to
help you use Eevee in your projects, regardless of context and field. I will
explain the most common workflows for Eevee with simple examples, which
you can apply to your own projects later.
I hope you like the content and it helps you to understand and use Eevee to
create renders in real-time!
Allan Brito
Downloading Blender
One of the significant advantages of Blender compared to similar software
is its open-source nature. You can use Blender without any hidden costs!
All you have to do is download the software and start using it.
How to download it? To download Blender, you should visit the Blender
Foundation website:
https://www.blender.org/download/
For this book, we will use version 2.80 of Blender, but the vast majority of
techniques will still work with later versions.
Chapter 1 - Rendering with Eevee and
Blender
What is real-time rendering? It is a technology that debuted in Blender 2.8
with a brand new renderer called Eevee. With Eevee, you can get incredible
results with only a couple of seconds of render time. Coming from Cycles,
that is an incredible feature, which is hard to beat. The way Eevee works
is unique regarding the settings needed to get a scene with realistic
lights, materials, and effects.
In our first chapter, you will learn the essential aspects of rendering with
Eevee in Blender, from the creation of a single image to animation
rendering.
Why is Eevee becoming so popular among artists using Blender? The main
reason for the popularity of Eevee is a combination of render quality allied
with speed. Imagine a render engine that can deliver realistic results in a
couple of seconds. That is what Eevee can do in Blender 2.8.
One of the secrets behind Eevee's efficiency is the use of a technique
called rasterization, the primary technique Game Engines use to display 3D
graphics on a 2D screen. The rasterization process in Eevee works by
projecting the faces of a model as pixels that compose a 2D image on your
screen.
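To make the idea concrete, here is a minimal sketch (not Eevee's actual implementation, which runs on the GPU through OpenGL) of the core step a rasterizer performs: projecting a 3D point in camera space onto 2D pixel coordinates.

```python
import math

def project_to_pixel(point, fov_deg=50.0, width=1920, height=1080):
    """Project a 3D point in camera space onto 2D pixel coordinates
    using a simple perspective projection (a sketch of what a
    rasterizer does for every vertex; real pipelines use matrices)."""
    x, y, z = point
    if z >= 0:
        raise ValueError("point must be in front of the camera (negative z)")
    # Perspective divide: distant points shrink toward the image center.
    focal = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    ndc_x = focal * x / -z          # normalized device coords in [-1, 1]
    ndc_y = focal * y / -z
    # Map normalized coordinates onto the pixel grid.
    px = (ndc_x + 1.0) / 2.0 * width
    py = (1.0 - (ndc_y + 1.0) / 2.0) * height
    return px, py

# A point straight ahead of the camera lands in the image center.
print(project_to_pixel((0.0, 0.0, -5.0)))  # (960.0, 540.0)
```

Eevee repeats a projection like this for every face, then fills the covered pixels, which is why it is so much faster than tracing light paths.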
Rasterization can be incredibly fast, but it is not as visually accurate as
other methods. Eevee performs rasterization using OpenGL 3.3, which
delivers incredible speeds for rendering.
How to view Rasterization with Eevee in action? That is simple. You have
to enable the Rendered shading mode in your 3D Viewport in Blender
(Figure 1.1).
Figure 1.1 - Rendered Mode
If you navigate in 3D using that mode, you won't experience any delays or
hiccups. The scene will update in real-time with all the effects you need to
create a rendered version at the end. That is quite different from Cycles,
where you have to wait a few seconds for the processing of an image to end
until you see a noise-free image.
Tip: You can also use the Z key to open a small menu with all the shading
options.
The Path Tracing method from Cycles handles light bounces around the
scene, and it produces a much more realistic result than Eevee. It has the
name of Path Tracing because it works by:
1. Rays are cast from the camera into your scene
2. The rays bounce around the scene
3. Each ray interacts with surfaces, materials, and textures
4. Once a ray finds a light source, it stops
5. The light contribution of each ray gets calculated every time it hits
a light source
Since the method bases all those calculations on the path a light ray takes
through the scene, it receives the name Path Tracing (Figure 1.2).
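The five steps above can be sketched as a toy loop. This is a deliberately simplified illustration, not Cycles' actual implementation; the `scene` callback and its fields are hypothetical stand-ins for real ray-scene intersection.

```python
def trace_path(scene, origin, direction, max_bounces=4):
    """Toy sketch of the path-tracing loop described above: follow a
    ray through bounces until it hits a light, accumulating how much
    each surface dims the eventual light contribution."""
    throughput = 1.0
    for _ in range(max_bounces):
        hit = scene(origin, direction)            # 1. cast the ray
        if hit is None:
            return 0.0                            # ray escaped the scene
        if hit["is_light"]:
            return throughput * hit["emission"]   # 4-5. found a light: done
        throughput *= hit["reflectance"]          # 3. surface dims the ray
        origin, direction = hit["point"], hit["bounce_dir"]  # 2. bounce
    return 0.0  # too many bounces: give up, keeping render time bounded

# Example: the ray hits a 50%-gray wall, then a light with emission 2.0.
bounces = [
    {"is_light": False, "reflectance": 0.5, "point": (0, 0, 1), "bounce_dir": (0, 1, 0)},
    {"is_light": True, "emission": 2.0},
]
print(trace_path(lambda o, d: bounces.pop(0), (0, 0, 0), (0, 0, 1)))  # 1.0
```

Running a loop like this many times per pixel is what makes Cycles accurate, and also what makes it slow compared to rasterization.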
Since Cycles bases all those calculations on lights, it has a clear
advantage in something critical for realism: shading. A lot of effects will
look better with Cycles, such as:
Ambient Occlusion
Global Illumination
Reflections
Transparency
Refraction
Shadows
Subsurface Scattering
Volumetrics
Motion Blur
Depth of Field
All those aspects of a scene will look accurate and based on physics with
Cycles. Keep in mind that looking better doesn't mean you have to put
Eevee aside and only use Cycles. The accuracy of Cycles comes with a cost
in render time.
Do you want to see a quick example of how Cycles handles some aspects of a
render better? Look at Figure 1.3.
Figure 1.3 - Comparing shadows
The image using Cycles took 45 seconds to render. With Eevee, that render
time was 1 second.
That is one example of how Cycles can produce better results for a
particular project.
If you get a project that requires you to work with accurate lights and
physics to get high-quality images, you should move to Cycles. One of the
points where Eevee will struggle is with advanced materials. In scenes that
require advanced effects such as subsurface scattering or complex
refractions, you will see significant improvements by using Cycles.
In most projects, you will be able to get incredible results with Eevee and
save an impressive amount of render time. Projects that require accurate
physics-based lighting, subsurface scattering, or complex refractions will
probably get better results with Cycles.
You can start a render by pressing the F12 key, but it will only display
what the active camera is viewing. To render any angle from the 3D
Viewport, regardless of the cameras, you will have to use the View →
Viewport Render Image menu (Figure 1.4).
Figure 1.4 - Render the viewport
It will start a render with everything you see at the moment in the 3D
Viewport. That is a great way to produce a quick render using Eevee.
The downside of this option is that it will also get elements that usually
wouldn't appear in a render. For instance, if you are in Edit Mode working
on a modeling project and want a snapshot of the object, you will get
vertices, edges, and faces (Figure 1.5).
What if you have an animation? You also have the option Viewport Render
Animation that will process all the frames from an animation. It will use the
data you have on the project like keyframes, framerate, and objects.
To make it easier for you to find the rendered animation, it is always a good
practice to save the Blender file in a unique folder. Make another folder
there to keep your renders.
From the Output tab, you can also pick the format in which you want to
save your animations rendered with Eevee.
You should keep the file format as PNG all the time and avoid saving your
images as JPG. The main reason for that is keeping your images at the
highest possible quality. Every time you save a file as a JPG, you will
apply some level of compression to that image.
Whenever you save an image file, the software will apply compression to
reduce the file size. Nowadays, we have two main types of compression:
lossless (used by PNG) and lossy (used by JPG).
An optimal workflow to save your renders from Eevee would be to get them
in PNG first and convert them to JPG later. You can keep the PNG file as a
source for post-processing and for generating lower-resolution versions.
If you save the project straight to JPG, you will start losing data
immediately.
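To see why generational loss matters, here is a crude simulation. Real JPG compression is far more sophisticated than this rounding step, but the principle is the same: once detail is discarded by a lossy save, no later step can recover it.

```python
def lossy_save(values, step=10):
    """Crude stand-in for lossy compression: round each channel value
    to the nearest multiple of `step`, discarding fine detail (real
    JPG compression works very differently, but also discards data)."""
    return [round(v / step) * step for v in values]

pixels = [7, 23, 100, 250]        # hypothetical channel values
once = lossy_save(pixels)
print(once)                       # [10, 20, 100, 250] -- detail already lost
print(lossy_save(once) == once)   # True: the first save's loss is permanent
```

A PNG master keeps the original values, so you can always derive fresh JPG copies from it without stacking losses.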
Besides keeping the images in PNG, you should also use the RGBA color
mode. RGBA means you will use colors in RGB with an Alpha channel. With an
alpha channel, you will be able to get transparent backgrounds for the
render.
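The alpha channel is what makes later compositing possible. As an illustration, here is the classic "over" operation that image editors perform when you place a transparent render on a new background (a sketch with 8-bit color values and a 0-1 alpha).

```python
def alpha_over(fg, bg):
    """Composite an RGBA foreground pixel over an opaque RGB background:
    each channel is a weighted blend controlled by the alpha value."""
    r, g, b, a = fg
    return tuple(round(c * a + bc * (1.0 - a)) for c, bc in zip((r, g, b), bg))

# A half-transparent white render pixel over a black sky background:
print(alpha_over((255, 255, 255, 0.5), (0, 0, 0)))  # (128, 128, 128)
# A fully opaque pixel ignores the background entirely:
print(alpha_over((255, 0, 0, 1.0), (0, 0, 255)))    # (255, 0, 0)
```

Without the alpha channel, there is no `a` value to blend with, and the render can only sit on a baked-in background.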
That is perfect for composition and also adding the render results in other
projects. To get a render with a transparent background, you have to keep
the Film settings in the Render tab marked with the transparent option
enabled (Figure 1.7).
Figure 1.7 - Transparent option
After you render the project, you will see the transparency as a checkered
pattern in the background of your render (Figure 1.8).
Figure 1.8 - Render with transparency
What if you forget to set the image as a PNG file using the RGBA format?
At the save dialog in Blender, you will be able to change the format and
settings on the left side. You can easily change the image settings using
those options.
Info: The save dialog will appear when you render a scene; from the
window where your image appears, choose the Image → Save As…
option.
How to save the file? If you trigger the render from Eevee, you will see the
results in the Image Editor. At the editor, you will see a menu called
“Image,” and there you have an option “Save As…”. Pick that option to
save the render to disk.
To choose what you will view in a render, you must set what the active
camera in Blender is viewing. You can have multiple cameras in a
Blender scene, but only one of them will be the active one (Figure 1.9).
Figure 1.9 - Multiple cameras
How to find the active camera? You will identify the active camera with a
visual cue. At the top of the camera icon, you will see a filled triangle. All
other cameras will have the same triangle, but only with an outline (Figure
1.10).
Figure 1.10 - Active camera
To make any camera active, select the camera and press CTRL+Numpad 0.
Once you have an active camera set, you will see the results of what that
camera is looking at in the render.
Info: If you are using a computer that doesn't have a Numpad, you can go
to Edit → Preferences and, in the Input tab, enable "Emulate Numpad."
That will make your alphanumeric keys work like the Numpad.
The easiest way to change and place the camera is with a shortcut. With
CTRL+ALT+Numpad 0, you can align the active camera with your current view
of the scene. Use the 3D navigation shortcuts to align your view and press
the shortcut. The active camera will align with your view.
After you have the active camera in place, select the border of the camera,
and make further adjustments with the G and R keys (Figure 1.11).
Figure 1.11 - Camera border
You can even emulate a dolly movement to the camera by moving it in the
local Z-axis. Press the G key and the Z key twice. That will move the
camera in the local Z-axis and emulate a dolly movement.
Among the most useful options, you will find the focal length of the
camera. For artists with a background in photography, the focal length
will look familiar. It regulates how wide the view from the camera will
be.
The default value for the focal length in Blender is 50mm, which won't
result in a broad view of the scene. To get a wider perspective, you will
have to reduce it to values around 20mm. Keep in mind that using values
lower than 18mm might give you an ultra-wide view at the cost of
distorting objects at the border of your render (Figure 1.13).
Figure 1.13 - Comparing focal distances
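The relationship between focal length and the width of the view follows from simple trigonometry. The sketch below assumes Blender's default 36mm sensor width; note how 18mm works out to a 90-degree horizontal view.

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view for a given focal length, assuming a
    36mm sensor (Blender's default): fov = 2 * atan(sensor / (2 * focal))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

for f in (50, 20, 18):
    print(f"{f}mm -> {horizontal_fov(f):.1f} degrees")
```

Shorter focal lengths widen the angle quickly, which is exactly why values below 18mm start to distort objects near the frame edges.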
Besides the focal distance from the camera, you can also use settings to
enable some helpers for your project. With the Composition Guides, you
can view some lines on your camera that will help to make a better
composition.
How to use them? For instance, you can turn on the "Rule of Thirds" to
display guides at strategic locations of your camera. The rule states that
you should place essential objects at the locations where the lines cross.
That will make the objects appear visually more important in your image
(Figure 1.14).
Use those rules to help you make better renders by finding optimal
locations to frame your objects.
What is the best option? To save time and allow you to work on multiple
post-processing options, you should always prefer the image sequence
option. The main reason to pick the image sequence option is to save the
files in PNG.
Once you save the animation as a sequence of PNG files, you can later use
Blender to transform the images on any video container quickly. For
instance, you can save a sequence of PNG files and then make Blender
process them to become an MP4 file.
If you decide to render the project straight to a video container like an MP4
file, you will have fewer options to work in post-production.
To save your animations as an image sequence, you must use the Output tab
in the Properties Editor (Figure 1.15). There you can choose the folder
where Blender will save the files. Below the folder settings, you can pick
the format used to save each image. As mentioned before, always use PNG
to save your animation, unless you desperately have to save on file sizes
and must use JPG instead.
Figure 1.15 - Output tab
In the output folder, you will see the image sequence using the frame
numbers as names. If you want to add a prefix to the filenames, you can
add the text in the Output folder settings (Figure 1.16).
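Frame numbering follows a simple zero-padded pattern. This small helper is only an illustration of that naming scheme; the prefix, padding, and extension are assumptions you would match to your own Output settings.

```python
def frame_filename(prefix, frame, digits=4, ext="png"):
    """Build a zero-padded frame filename, e.g. a text prefix followed
    by the frame number padded to a fixed number of digits (a sketch;
    adjust digits and extension to match your output settings)."""
    return f"{prefix}{frame:0{digits}d}.{ext}"

print([frame_filename("render_", f) for f in (1, 25, 250)])
# ['render_0001.png', 'render_0025.png', 'render_0250.png']
```

Consistent zero padding is what keeps the frames sorted correctly when a video editor imports the sequence.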
The bottom half of this Workspace has the Video Sequencer Editor where
we can add the image sequence from your animation. It works like a video
editor inside Blender. Use the green vertical line to control the playback and
use each channel to place content (Figure 1.20).
To add your image sequences, open the Add menu and choose
Image/Sequence. Use the A key to select all images, and you will see them
as a block of content in the Sequencer (Figure 1.21).
Figure 1.21 - Image sequence
Now, go to the Output tab and choose the settings you want to use for your
video files. You will notice that rendering an image sequence to video is
much faster than processing your scenes from 3D Data. Use the Render →
Render Animation menu to start the process (Figure 1.22).
Whenever you have data in the Video Sequencer, Blender will render the
contents of the video channels there instead of the 3D Data. The reason is
that in the Post Processing field of the Output tab, the "Sequencer"
option is marked. That will make Blender render content from your video
editor and ignore any 3D Data.
What is next?
The first step to start using Eevee is managing to render and save images
from your 3D Viewport in Blender. After you have all the knowledge
necessary to get your work saved, it is time to start digging into how to
add lights and materials to the scene.
Of all the lights available in Blender, you will find that Area lights are
the ones that give the best results in Eevee. You can use them alongside
an environment texture or mix them with an HDR. You will learn how to
manage and place them in the best possible locations for interior scenes
you wish to render with Eevee.
What are environment textures? They offer a way to add light to the
scene using something that you already have in a project from the very
beginning: the background of the scene.
From the background of your scene, we can add light from all directions,
which will make any project or scene look brighter immediately!
To use your background as a light source, you have to open the World tab in
the Properties Editor and look for the Surface field. There you will see a
button with a "Use Nodes" option. That means you are not using the
background to emit light yet (Figure 2.1).
Figure 2.1 - World tab
After you press the "Use Nodes" button, you will see all options regarding
the use of your background. At this point, we have two main options for
the background: a solid color or an environment texture.
If you want to use a solid color, the Color and Strength fields will be
enough for that task. However, for a more complex and realistic light
setup, you will need an environment texture associated with the Color
field.
To the right of your Color field, you will see a button with a dot in the
middle. Click on that button and choose Environment Texture (Figure 2.2).
Figure 2.2 - Environment texture
When you add an Environment texture to the color, you will be able to pick
an image texture to use in your background. You can use any texture for
that task, but to have better results and natural lights, you will want to use
an HDR map.
An HDR map is a particular type of image that can store some vital
information regarding lights. You will be able to mimic the same lights used
in the environment where the image got captured. For instance, if you get
an HDR map from a location where you had a bright daylight scene, it will
generate that same effect for Eevee.
Another exciting aspect of HDR maps is that most of them come in a
format called "Equirectangular." An image in this format is a flat
representation of a full 360-degree view, which you can map onto a shape
like a sphere, cylinder, or cube.
The main benefit of using such images in Eevee is that you will get lights
and reflections from all directions. In Figure 2.3, you can see an example of
an HDR map with an equirectangular projection.
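For the curious, the equirectangular projection itself is simple: each direction around you maps to a longitude/latitude pair, which becomes a pixel position in the flat image. A sketch, with axis conventions chosen purely for illustration:

```python
import math

def equirect_uv(direction):
    """Map a 3D view direction onto (u, v) coordinates of an
    equirectangular image: longitude becomes u, latitude becomes v
    (a sketch of the projection; conventions vary between tools)."""
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    u = 0.5 + math.atan2(y, x) / (2.0 * math.pi)  # around the horizon
    v = 0.5 + math.asin(z) / math.pi              # up/down
    return u, v

# Looking straight along +X hits the middle of the map; straight up, the top.
print(equirect_uv((1, 0, 0)))  # (0.5, 0.5)
print(equirect_uv((0, 0, 1)))  # (0.5, 1.0)
```

Because every direction lands somewhere on the map, wrapping it back around the scene gives light and reflections from all sides at once.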
You don't have to change any settings in Blender to use those types of
maps. That is the default format Blender uses to apply maps to the
background.
There you will be able to pick a map and download high-quality textures
for your projects using Eevee. For instance, we can use the map shown in
Figure 2.4 as the background of our scene.
A key element of each HDR map is how the light works on each map. Here
is a preview from two different types of HDR maps in Figure 2.5. Each one
of them has a unique approach to light behavior.
Notice how one of the maps has a bright light source direction and hard
edge shadows. The other map has soft edges for shadows and a more
scattered pattern.
Even with those examples of shadows generated by an HDR map, at this
moment Eevee cannot cast shadows from HDR maps alone. You will have to
add auxiliary lights to create shadows.
You should pick the HDR for your project based on the light you want to
apply to the scene. What happens after you add an HDR to a scene in
Eevee? You will start to see the immediate benefit of that light source in
the project (Figure 2.6).
Besides getting a more natural color for the project using an HDR map, you
will also get the texture appearing in the background. In some cases, that
could be a benefit for composition purposes. For instance, you can get the
background with the sky and clouds for a project.
In other cases, it may require you to find a way to hide the HDR in the
background. The HDR may give you an excellent effect for lights, but it
may not align with the camera angle for a particular scene.
Info: With an HDR map you won't get shadows in Eevee, but you can use it
together with other lights like an Area or Sun.
There are a couple of ways to hide it. The first involves using an object
to block the visibility of your texture in the background:
1. Add a sphere to the scene and use the scale to make it big enough
to surround the entire scene
2. In Edit Mode, change the normals of every face of your sphere to
point at the interior of your model
3. Smooth the borders of your faces: right-click and choose Shade
Smooth
The trick here is to place the entire scene inside a sphere with all
normals pointing to the interior. When the normals of an object point to
the interior, it becomes "transparent" to light coming from the outside.
In computer graphics, a normal also defines the visible side of a polygon.
When the normals of a face point to the interior, it becomes "invisible"
to anyone looking from the outside (Figure 2.7). That also includes light
sources.
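The "invisible from outside" behavior comes down to a single dot product, the same backface test rasterizers use. A minimal sketch, assuming unit vectors for simplicity:

```python
def faces_camera(normal, view_dir):
    """A face is visible when its normal points back toward the viewer,
    which means the normal and the viewing direction give a negative
    dot product. Flipping the normals inward, as in the sphere trick
    above, makes the outside of every face fail this test."""
    dot = sum(n * v for n, v in zip(normal, view_dir))
    return dot < 0.0

# Camera looking along -Z at a face whose normal points back at it (+Z):
print(faces_camera((0, 0, 1), (0, 0, -1)))   # True: visible
# Same face with its normal flipped to point away (-Z): culled.
print(faces_camera((0, 0, -1), (0, 0, -1)))  # False: invisible from outside
```

From inside the sphere the same test passes, which is why the scene still sees the emissive interior while the HDR behind it is hidden.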
If you try to view the scene from a camera using the Rendered shading
mode, you won't see the HDR in the background anymore (Figure 2.8).
To improve this setup even more, you can use the sphere as a fill light
for the scene. Select the sphere and add a material to the object with the
Emission shader (Figure 2.9).
Figure 2.9 - Emission shader
Adjust the material color to something close to a light blue, and you will
have a great fill light to make your scene even brighter. Use the strength
settings to control the intensity of your background lights.
In the Film settings of the Render tab, you will see the "Transparent"
checkbox (Figure 2.10). That is the same option we saw in chapter 1.
Figure 2.10 - Transparent option
If you enable this option, you won't see the HDR map in your background
or any other element. Instead, you will see a checkered background pattern
that identifies a transparent area (Figure 2.11).
Figure 2.11 - Transparent background
Make sure you save the file using a PNG format with the color set to RGBA
to get the transparency effect to also appear in your image.
Tip: Use a transparent background whenever you need a scene that will
receive a background later in software like GIMP or Photoshop. You can
add something like a sky background or any other image to compose your
final render.
We can use the Shading Workspace by choosing the General → Shading option
from the selector at the top of your 3D Viewport (Figure 2.12).
Figure 2.12 - Shading Workspace
At the bottom, you will see the Shader Editor. Change the Shader Type to
World to view the Nodes for our background (Figure 2.13).
Connect the Generated output socket to the Vector input of your Mapping
Node, and the Mapping to the Image Texture. You should have a setup that
looks like Figure 2.14.
To connect two Nodes, you can click and drag with the mouse between an
output and an input socket. Hold the CTRL key and drag with the right
mouse button to cut connections.
The Texture Coordinate Node will control the location used to orient the
HDR map around your scene. With the Mapping Node, we get numeric control
over the texture.
After you make all connections, the Mapping rotation settings will control
the orientation of the HDR map. That will be perfect to set the direction of
lights generated with the texture. For instance, if you choose an HDR
texture that creates hard edge shadows, you will be able to set the direction
of lights using the Node (Figure 2.15).
Figure 2.15 - Rotation controls
Always keep the scene in the Rendered shading mode to evaluate if you
have the shadows in the correct direction. Use the Z-axis controls to
rotate the HDR map and have full control over shadows.
To add the gradient to the background, start a scene and, in the World
tab, enable the "Use Nodes" button. In the Shader Editor, change the
Shader Type to World and start working on the Node setup.
We will need the following Nodes, which you can add using the SHIFT+A
keys or the Add menu:
Texture Coordinate
Mapping
Separate XYZ
ColorRamp
Why do we need the other two Nodes? The Separate XYZ will get the
orientation of the gradient for the background. If you use Z, it will be a
"vertical" gradient and using either X or Y will make a "horizontal"
gradient aligned to each axis.
From the Mapping Node, connect the output socket to the Separate XYZ
and, using only the Z output, connect it to the ColorRamp. Use the
ColorRamp to set up your gradient. You will get a Node setup like Figure
2.16 shows.
The next step is to use your ColorRamp to prepare the gradient with all
the colors you need for the background. How does the ColorRamp work?
In Figure 2.17, you have a diagram of all available controls for the
ColorRamp.
For instance, you can add new colors and use the sliders to place them in
the gradient bar. You can also use only two colors. Click on the index you
want to use, and in the color picker, choose the tone you need.
The ColorRamp will give you some flexibility on how to choose and place
each color for the background.
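Under the hood, a ColorRamp in Linear interpolation mode is just a blend between neighboring color stops. A sketch of that evaluation, using a hypothetical two-stop sky gradient as the example:

```python
def color_ramp(stops, pos):
    """Evaluate a gradient the way a ColorRamp does in Linear mode:
    `stops` is a list of (position, (r, g, b)) pairs sorted by
    position, and `pos` is where along the ramp (0..1) to sample."""
    if pos <= stops[0][0]:
        return stops[0][1]          # clamp below the first stop
    if pos >= stops[-1][0]:
        return stops[-1][1]         # clamp past the last stop
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if p0 <= pos <= p1:
            t = (pos - p0) / (p1 - p0)
            # Linear blend between the two neighboring stops.
            return tuple(a + (b - a) * t for a, b in zip(c0, c1))

# A two-stop ramp from a deep blue horizon to a pale zenith:
sky = [(0.0, (0.1, 0.2, 0.8)), (1.0, (0.8, 0.9, 1.0))]
print(color_ramp(sky, 0.5))  # the halfway blend of the two colors
```

Fed with the Z coordinate from the Separate XYZ Node, this blend is what turns a single value per direction into a vertical sky gradient.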
Point
Area
Sun
Spot
To replace an HDR map in the background, you can easily add an Area
light that will fill any scene. Why an Area light and not another type of
light? Because with an Area light, you will have a large surface emitting
energy into the scene, much like a background would.
You will also have the advantage of generating a "constant" flow of light
in the scene.
You can select an existing light and, in the Object Data tab, swap the
type of light you are using (Figure 2.20).
Figure 2.20 - Types of light
An Area light can assume a square or rectangular shape. If you choose the
square, which is the default, you will be able to control the size in the
Object Data tab. For the rectangular shape, you will have to choose the
width (X) and height (Y) of the light (Figure 2.21).
To mimic the behavior of your Environment Texture, the Area light should
have a large size that fills all the space in your scene. One aspect of
Area lights in Blender you will notice is that they produce a faint light
effect. To make a difference in the scene, you should increase the Power
value of the Area light.
As you can see from the results, you have to increase the Area light's
energy a lot to have a considerable effect on the scene.
In the Area light settings, you will find lots of interesting controls
that will allow you to fix several aspects of the scene. For instance, you
can turn on contact shadows, which will enhance the realism of your scene.
By using this technique, you will be able to get all the benefits of using
an HDR as an Environment Texture and also get additional light from
places where you would expect light to enter the scene.
For instance, you can take a look at Figure 2.23 to see a scene that needs
more light.
Figure 2.23 - Scene and HDR
The scene has only one HDR as the primary light source, and even after
increasing the strength of that light, you still don't get enough energy
in the scene. In Eevee, we have another problem with that type of setup
because it doesn't compute indirect lights the way Cycles does. We have to
help the renderer by adding those Area lights.
A solution for this scene would be to add in each opening an Area light that
will work as a fill light for the scene. For each opening, we can add an Area
light and scale it in a way that it fills the whole dimension of each opening
(Figure 2.24).
Figure 2.24 - Area light in the scene
As a result, you will get a much better lighting effect for the scene. You
can play around with the strength of the Area lights to find the best
balance for the project (Figure 2.25).
Figure 2.25 - Scene with Area lights
At this point, you might get some problems with aspects of lights like
shadows and light leaks from corners of your 3D model. You can improve
the shadows in several ways, such as with the Eevee settings in the Render
tab. As for light leaks, we will cover those in chapter 4 in more detail.
Tip: Remember that HDR maps won't cast shadows in Eevee and have a
chance of causing light leaks in interiors. You should prefer to use them
in exterior visualization projects.
What is next?
The first step in many projects is adding an environment texture to the
background, which will give you an initial light setup for a scene. That
first light will make it easy to start working on other aspects of the
scene, like materials.
After you have that light set, you can move straight to the materials.
Before we move any further with the setup of a scene in Eevee, you should
add all available materials to the scene. That will be important when we
work with indirect lights.
Since Eevee can offer real-time results for a render, you will have a much
better experience working with materials than with Cycles. As you will
learn in the following chapters, you need materials in a scene to use all
the power of Eevee to render your scenes.
The materials will help you generate indirect lights that will eventually
bounce and blend colors related to the materials in your scene. You must
apply materials before moving forward in Eevee.
Chapter 3 - PBR materials with Eevee
After you have the environment lights in place, you can start to move
forward with the setup of your scene in Eevee. You should begin to work
with materials as the next step to build a scene.
In this chapter, you will learn how to manage and work with materials in
Blender, which will allow you to apply the concepts not only in Eevee but
also in Cycles. Handling materials involves working with a collection of
shaders and textures.
To get the most out of Eevee, we will use materials in a format called
PBR (Physically Based Rendering) to make realistic surfaces with a
collection of image textures that work with the incredible Principled
BSDF shader.
Since Eevee allows us to create materials and immediately see the results
in real-time, you won't lose any time waiting for a render to complete to
check a material. Artists using Cycles can even change the renderer to
Eevee during the material setup stage. That will save an incredible
amount of time.
One of the benefits you will find in Blender by having both Eevee and
Cycles available is that materials will work in both renderers. You can
start a project making materials for Cycles and later change the renderer
to Eevee. In most cases, you won't have to change the settings.
If you are wondering whether you should start with materials or leave them
as the last step in a project, an approach that is popular among artists
is to set up materials right after the environment map. The main reason
to add materials early in the scene setup has a direct relation to the
calculation of indirect lights.
You will learn later in chapters 4 and 5 that Eevee can't process indirect
lights for a scene the way Cycles does. We have to use special objects
called probes to compute those lights. Those probes capture light and
bounce it around the scene. By the time you start using probes, you must
have all materials in place.
Light bouncing off a surface that has a material will carry colors from
that material and contribute to the shading; if the material is not
present, you will lose that shading contribution (Figure 3.1).
For instance, if you have a scene with a red floor, you will expect that
color to bounce onto white walls. If the red doesn't appear as part of the
indirect illumination calculations, you will lose a lot of the realism of
that scene.
Since Eevee provides fast previews, you will be able to quickly add all
necessary materials for a scene and evaluate if they look good in your
project.
An object that doesn't have any materials will show an empty panel with a
"New" button (Figure 3.2).
Figure 3.2 - Materials options
After you add the materials to objects, you will find controls at the top of
your Material tab that will give you additional options to manage each
material. You will find a description of each one of the settings in Figure
3.3.
Figure 3.3 - Material controls
Material selector: The option allows you to choose from a list of all
available materials in the scene.
Material name: If you have to rename the material, you can click on
this field and type the name you wish to use. Each material in Blender
must have a unique name.
Fake user: When you remove a material from an object, it will
remain in the scene asset list. After you save and close the file,
Blender will purge all materials that are not in use. If you want to
keep a material that is not in use by any objects, you can turn on the
Fake User.
Duplicate material: For the cases where you need a copy of an existing
material to make small changes or use it as a template.
Remove material: If you want to remove a material, you can use this
option. It will still appear on the list of assets for the scene, though.
The exclusion will occur after you save and close the file, and only if
no object is using the material.
When you open a Blender file to append from, you will see some folders in
that file, one of which is named "Material." By opening the folder, you
will see a list of all available materials you can get.
Tip: You can also use the Link option instead of the Append, but it will not
import the data directly to your file. It will use the material as a reference
link, where you can't edit the contents of your material.
That is the material index list, which allows an object to use various
materials (Figure 3.5).
Figure 3.5 - Material indexes
You can add multiple material indexes to an object. Press the "+"
button, and you will be able to either choose an existing material or create a
new one for the new index (Figure 3.6).
Figure 3.6 - Adding new materials
Once you have an additional index available, you can assign it to any part
of your 3D models. The options to manage and assign each of those indexes
will appear in Edit Mode. With the object selected, you can go to Edit
Mode, and at the bottom, you will see the new buttons (Figure 3.7).
Figure 3.7 - New options for indexes
Select the faces you wish to assign the new indexes and press the "Assign"
button. They will now display the material selected for the index you have
active in the list.
Tip: You can remove an index without affecting the material. It will still be
available from the material selector.
Diffuse BSDF: One of the simplest materials available. It lets
you use a solid color for a surface.
Emission: If you need a material that emits light, you can use this
shader to turn any object into a light source.
Transparent BSDF: With this shader, you can create simple
transparency for surfaces in Eevee.
Mix Shader: An option that will let you blend two or more shaders to
craft complex materials.
The type of surface you choose to create in Eevee will depend on the
project you are trying to make. Regardless of the type, you will have to use
shaders to get the desired surface.
If you are converting a scene from Cycles, you will find that most of the
materials from Cycles will also work in Eevee. A few unique materials with
effects like transparency and glossy surfaces will require a few changes to
work correctly in Eevee.
For a simple surface with a plain color, you can easily set the shader as
Diffuse BSDF and pick a color for that surface (Figure 3.9).
Inside the Shader Editor, you will see Nodes representing different parts of
your materials. That is the same Editor used in chapter 2 to control the
HDR Rotation.
A few tips on using the Shader Editor:
You can add Nodes using either the Add menu or the SHIFT+A keys
Each Node might have input (left) and output (right) sockets
To connect Nodes, you can click and drag between sockets
Use the color codes from sockets to check data types
To cut a connection between two Nodes, you can hold the CTRL key
while clicking and dragging with the right mouse button
Use the same selection shortcuts to manage your Nodes
You can use the same navigation shortcuts to manage your
visualization of the Shader Editor
Info: We already used the Shader Editor back in chapter 2 to control HDR
maps.
Luckily for us, we can easily use PBR materials in Eevee with a powerful
shader called Principled BSDF. That shader can handle multiple texture
maps that compose a PBR material. In some projects, you may have your
materials using only the Principled BSDF for all surfaces because it can
replace most of the other shaders available in Blender.
When using PBR materials, you have two options:
For instance, we can visit one of those three sites and download a fabric
texture with a 4K resolution, which will give us maps having 4096 pixels in
size. You will download PBR materials as a ZIP file, which you will have to
extract somewhere on your computer.
A good practice in those cases is to save your Blender file in a folder and
extract the textures to that same location. Our fabric texture has a total of
four maps (Figure 3.11).
Figure 3.11 - PBR material maps
Depending on the type of material, you can also have additional maps. Both
the normal and displacement have similar functions with the difference in
the results. With the normal map, you get bumps that don't use real
geometry. The displacement can make bumps with geometry but will
require you model to have a high-density for your polygons.
Open the Shader Editor to make your workflow smoother and add an Image
Texture Node. Press the SHIFT+A keys and add Texture → Image Texture
three times. You can also create one Node and duplicate it two times with
the SHIFT+D keys.
Tip: From your file manager you can also drag and drop the images
straight to the Shader Editor in Blender. They will appear with an Image
Texture Node ready to use.
In each one of the Image Texture Nodes, you will open the following maps
for the PBR material:
Color
Roughness
Normal
For the Roughness and Normal maps, you will change the Color Space
from the Node to "Non-Color" (Figure 3.13).
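The map-to-socket pairings and color spaces described above can be summarized in a small table. A sketch in plain Python, with socket names taken from the Principled BSDF inputs:

```python
# Which Principled BSDF input each PBR map feeds, and which color space
# its Image Texture node should use.  Only the Color map carries actual
# color data; the others are raw numeric data, hence "Non-Color".
pbr_maps = {
    "Color":     {"socket": "Base Color", "color_space": "sRGB"},
    "Roughness": {"socket": "Roughness",  "color_space": "Non-Color"},
    "Normal":    {"socket": "Normal",     "color_space": "Non-Color"},
}

for name, info in pbr_maps.items():
    print(f"{name}: connect to '{info['socket']}' ({info['color_space']})")
```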
If you are applying the material to a flat surface, you won't have to make
changes to the Mapping of your textures. However, you will experience
misplaced textures if you use a three-dimensional object. You can change
the Projection of your textures from "Flat" to "Box," and they will adapt to
almost any shape (Figure 3.14).
Figure 3.14 - Texture Projection
The last step is to connect the Image Texture maps to each of the
corresponding input sockets at the Principled BSDF:
For the Normal map, we need an additional Node. Press the SHIFT+A keys
and add from Vector → Normal Map. Connect the Image Texture with the
normal to the Normal Map Node, and then to the Normal input socket at the
Principled BSDF.
In the end, you will have the setup shown in Figure 3.16 for the PBR
material.
If you render the scene in Eevee, you will be able to visualize the material
in real-time (Figure 3.16).
Figure 3.16 - PBR material in real-time
You can make changes to the material in your Shader Editor and view them
at the 3D Viewport with no delays, which is a great advantage of Eevee.
How to use two maps at the same time? To use two maps for the Base
Color, we will need a MixRGB Node. Add the Node using the SHIFT+A
keys and choose Color → MixRGB.
Connect the Color to the top input of your MixRGB and the Ambient
Occlusion to the lower input (Figure 3.17).
Figure 3.17 - Using Ambient Occlusion
With the Ambient Occlusion map, you will get a PBR material that can use
contact shadows in bumps and ridges of your surface.
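To see what that mix does numerically, here is a sketch in plain Python. It assumes the MixRGB Node is set to the Multiply blend mode (a common choice for ambient occlusion, though the book does not name the blend type), and the sample color is hypothetical:

```python
# Per-channel multiply of a base color by an ambient-occlusion value,
# blended by a factor, which is what a MixRGB node set to Multiply
# computes.  The Multiply assumption and the color are illustrative.
def apply_ao(base_color, ao, fac=1.0):
    """Blend base_color toward base_color * ao by fac (0..1)."""
    return tuple(c * (1.0 - fac + fac * ao) for c in base_color)

brick_red = (0.8, 0.3, 0.2)              # hypothetical base color
print(apply_ao(brick_red, ao=0.5))       # crevices darken to half brightness
print(apply_ao(brick_red, ao=1.0))       # open areas keep the original color
```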
At first, you may try to use the Glass BSDF in Eevee, but you will soon realize
it doesn't produce transparent surfaces convincing enough for a more realistic project.
To create a convincing glass material for Eevee, we will have to use two
shaders:
Principled BSDF
Transparent BSDF
As you will notice from the Principled BSDF, we don't have any control
related to transparency. To get transparency with the Principled BSDF, we
have to mix it with the Transparent BSDF.
The first step to create a glass material for Eevee is to make a material and
in the Shader Editor add:
Connect both the Transparent and Principled Nodes to the Mix Shader and
the Mix Shader to the Material Output (Figure 3.18).
Figure 3.18 - Initial glass setup
To get a better reflection for the glass material, we need to create some
angular reflections. You can achieve that with two additional Nodes. Press
the SHIFT+A keys and create from:
Input → Fresnel
Color → RGB Curves
Connect the Fresnel to the RGB Curves and this one to the Fac socket of
your Mix Shader (Figure 3.19).
Set the IOR from the Fresnel to 1.45 and use the curves in the RGB Curves
to adjust the reflectivity of light.
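To get a feel for what the Fresnel Node contributes, here is a plain-Python sketch using Schlick's approximation, a common stand-in for the exact dielectric Fresnel equation the node computes:

```python
# Schlick's approximation of the Fresnel effect for IOR 1.45 (a typical
# glass value).  Reflectivity is low when you look straight at the
# surface and rises toward 1.0 at grazing angles -- the "angular
# reflections" the Fresnel node feeds into the Mix Shader.
import math

def fresnel_schlick(cos_theta, ior=1.45):
    f0 = ((ior - 1.0) / (ior + 1.0)) ** 2   # reflectance at 0 degrees
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

for angle in (0, 45, 80, 89):
    c = math.cos(math.radians(angle))
    print(f"{angle:2d} deg -> reflectivity {fresnel_schlick(c):.3f}")
```

Looking straight at the glass gives only a few percent reflectivity, which is why the RGB Curves adjustment matters for the overall look.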
At this point, if you try to preview the material using Eevee, you won't see
any transparency. The reason is that we still have to change the
Blend Mode and Shadow Mode in the material settings to create the glass.
You will find those controls in the Materials tab at the Settings field. You
will set them like:
If you apply that material to an object and render it with Eevee, you will
get a glass effect for that surface (Figure 3.21).
Figure 3.21 - Glass effect for surface
You can control the Color of your glass using the Base Color settings and
the glossy reflections for the surface using the Roughness. A value of zero
will give you a clear surface reflection.
If you want to use a more straightforward solution without the need for a
Principled BSDF, you can try the Glass BSDF with a Transparent BSDF.
Use the Fresnel as the Fac for the Mix Shader (Figure 3.22).
Figure 3.22 - Simpler glass
That will not create the same type of glass for your scenes but will require
fewer steps to build.
Info: In Chapter 7, you will learn a third way to create glass materials for
Eevee using the Principled BSDF alone.
To control the tiling of your PBR materials in Eevee, we will need two
Nodes:
Input → Texture Coordinate
Vector → Mapping
You must connect the Generated output socket from the Texture
Coordinate Node to the Mapping Node. From the Mapping Node, you will connect to
each one of the Image Texture Nodes (Figure 3.23).
Using a lower Scale value will increase the size of your textures, and a higher
value will give you smaller tiles (Figure 3.24).
Figure 3.24 - Texture sizes
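The inverse relationship between the Mapping Scale and the visible tile size can be sketched in plain Python; the 4-unit wall is a hypothetical example:

```python
# The Mapping node's Scale multiplies the texture coordinates, so the
# number of repeats grows with Scale and each visible tile shrinks.
def tile_size(surface_size, mapping_scale):
    """Size of one texture repeat on a surface, in scene units."""
    return surface_size / mapping_scale

wall = 4.0  # a hypothetical 4-unit-wide wall
for scale in (0.5, 1.0, 2.0, 4.0):
    print(f"Scale {scale}: each tile is {tile_size(wall, scale)} units wide")
```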
If you feel your textures are not looking good with those tiles,
you can always set the scale back to one and have the default tiling for the
PBR material.
What is next?
The materials are a vital part of any project that uses Eevee for realism, and
you won't get great images without investing some time with PBR
materials. Having a preview of your materials in real-time will add an
incredible boost in productivity for Eevee and your workflow.
Most of the effects you see on screen using Eevee will appear in the render
preview without the need of any unique setups, but due to the nature of
Rasterization, we will have to use a few tricks to see the full extent of
visual effects for a scene.
In Cycles, we can prepare the scene and start the render after making some
simple choices for materials and lights. The Path Tracing algorithm will
handle most of the physics and lights with accuracy at the cost of render
times.
You can't get lots of those effects in Eevee by default because Rasterization
can't handle them. You will need to add helper objects that will compute
effects like indirect lights and reflections. The materials we created in this
chapter will show in the preview window with all the necessary effects but
will lack reflections for the scene.
In the following chapter, you will learn how to use unique objects for
Eevee called probes. Those probes will help us add those effects to a
scene to improve both materials and the overall scene.
Chapter 4 - Probes and lights for Eevee
The rasterization process of Eevee is incredible at delivering high-quality
images in real-time, but to get to that quality, we have to overcome a few
limitations. In Cycles, we have a process that uses common types of objects
like lights and textures to create images for a project.
Since it uses a physically accurate process, you can add the lights and start
the render process, which will result in a realistic image. That comes at the
cost of render times, which are significantly higher than what we get in
Eevee.
Among the limitations of Eevee, we can list indirect lights and
reflections for glossy materials. Those two elements, which are critical for
realism, will not appear in a rasterized render automatically. We have to use
some helper objects to enable them.
In this chapter, you will learn how to use those helper objects, which receive
the name of probes. Eevee has three different light probes that will give you
tools and options to mimic the behavior of light. You can get results that are
close to what we find with Cycles.
The light probes will help with the process of setting up a scene in Eevee
for getting effects and better lights. Unlike Cycles, which works with a
Path Tracing algorithm to compute interactions between lights and
surfaces, Eevee uses a Rasterization process that applies visual
tricks to the scene.
One of the objectives of those tricks is to mimic effects like global
illumination for real-time renders. If you have any previous experience
with game engines, you will find the light probes familiar. That is because
those engines use the same concept of a light probe.
The probes in Eevee have a single purpose, which is to fake some of the
effects you would get in traditional render engines like Cycles, but in real-
time. From the list of probes, we get the following options:
Irradiance Volume
Reflection Cubemap
Reflection plane
All those probes will help you make specific effects in Eevee and have a
vital role in any attempt to render realistic scenes.
With an Irradiance Volume, we can fake an effect that you can quickly get
in Cycles, which is indirect illumination. In chapter 5, we will talk more
about the process of baking and managing indirect lighting with Eevee.
To fully understand how you can benefit from the use of an Irradiance
Volume, we can take a look at a scene that doesn't use any probes. In Figure
4.2, you can see a scene that has a red color for the surface and white walls.
Figure 4.2 - Scene with no probes
The scene doesn't look realistic for many reasons, and one of the missing
aspects is the indirect light effect. In a scene like this one, you would expect
the light hitting the red surface to bounce around, and as a result, you
would get a faint red color spread over the walls.
What you should expect from a scene is what you see in Figure 4.3, where
the bounces from the floor will add some color to the scene. That is indirect
light.
Figure 4.3 - Indirect light effect
Due to the nature of Rasterization, we can’t get that in Eevee without some
tricks.
When you add a light source in Eevee, it will light surfaces but not bounce
around the scene. The light will hit a surface and stop. Here is where our
Irradiance Volume enters. Using that volume will enable Eevee to catch
light in a particular area and compute the indirect bounces in a specific
region.
You can change the settings and the appearance of the probe using the
Object Data panel when you select the Irradiance Volume (Figure 4.5).
Figure 4.5 - Irradiance Volume options
Distance
Falloff
Intensity
Resolution (X/Y/Z)
Clipping (Start/End)
Those are the main options for the Irradiance Volume, and we will work
with them to get the best results. How do they affect the probe?
Before we describe each option from the probe, it is vital to see a visual
breakdown of the object. In Figure 4.6, you can see an image of the probe
with some key points highlighted.
We can begin our analysis of the probe with the distance. As you can notice
from the images representing the Irradiance Volume, we have a two-box
structure for the object. A big box on the outside and another one smaller
inside. The gap between those two boxes is what the “distance” setting
controls.
The main indirect light calculation will happen in the smaller box. Between
the two shapes, the effect will decay until it eventually fades
away. You can set the amount of that decay with the Falloff control. By
default, the value of one is what gives the most realistic results.
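The decay between the two boxes can be pictured with a small plain-Python model. This is a conceptual illustration of the Distance and Falloff interaction, not Eevee's exact formula:

```python
# Conceptual model of the influence weight between the inner and outer
# boxes of an Irradiance Volume.  Inside the inner box the weight is
# 1.0; across the gap set by Distance it decays to 0.0, shaped by the
# Falloff value.  This is an illustration, not Eevee's actual code.
def influence(t, falloff=1.0):
    """t = 0.0 at the inner box, 1.0 at the outer box."""
    if t <= 0.0:
        return 1.0
    if t >= 1.0:
        return 0.0
    return (1.0 - t) ** falloff

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"gap position {t:.2f} -> weight {influence(t):.2f}")
```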
You can adopt as a rule that any part of a scene where you want to process indirect
lights will have to fit inside the smaller box. If you set the distance
to zero, you will have only a single box design, which might help you to
align the Irradiance Volume with a scene (Figure 4.7).
Tip: You will be able to later increase the size of the samples to make it
easier to manage the way they capture and bounce lights.
If you get an area in your scene that shows some low-quality shadows or a
criss-cross pattern on shading, you might need additional samples. You can
add more of them using the resolution settings.
Having samples between objects and surfaces also helps with the indirect
light calculations. If you look at Figure 4.8, you will notice that we have a
simple scene. All the objects are in the area of effect of an Irradiance Volume,
but some of them don't have any sample between them and the floor.
The object on the right has samples between its shape and the floor.
Because of that, it receives indirect light from the material. As a result, you
will get a shadow that has a mix of colors from the floor material. The
object on the left doesn't show that shade because it doesn't have samples
between its shape and the floor.
In Figure 4.9, you can see the results of the same scene with additional
samples.
Figure 4.9 - Scene with additional points
The number of samples in your scene will have an impact on the indirect
light calculations and shading quality. You will also notice some increase in
indirect light processing times later. We will discuss how to manage those
calculations in chapter 5.
A solution with a nested Irradiance Volume would look like Figure 4.10
shows.
Figure 4.10 - Nested Irradiance Volume
You can add as many volumes as you think are necessary to fix problematic
shading for indirect lights.
For instance, if you have a scene that needs indirect light calculations, you
can use the scale transformation in Blender with the S key to adjust the size
of your volume. In a project that has a rectangular shape, making the volume
fit everything in the scene will be easy to achieve.
What if you have a scene that has another type of shape? In that case, you
can use multiple volumes. In Figure 4.11, you have an example of a scene
that has a form that requires various volumes.
Figure 4.11 - Scene with multiple volumes
The scene requires two volumes to get covered for indirect lights
calculations.
To make your process of adapting a volume to a scene easier, you can set
the distance of each volume to zero. That way you won't have to worry
about getting objects in the falloff area with a reduced shading effect for the
indirect lights.
To create a Cubemap, you have to press the SHIFT+A keys and choose
from the probes group the Reflection Cubemap. The object will have a
sphere shape by default, but you can also use a cube as the primary
reference. To keep it visually distinct, it is a good idea to leave it
as a sphere, since the Irradiance Volume already uses a cube shape (Figure
4.12).
Tip: You can create glossy surfaces by changing the roughness settings for
any shader in Blender.
When you place an object inside the Reflection Cubemap, the probe will
immediately capture the surrounding objects, and the object will start displaying
reflections (Figure 4.13).
That will give you a lot of freedom to create and place objects with glossy
reflections in your scene.
If you want to streamline the setup process, you can quickly get the probe
with a scale that fits the entire scene, like Figure 4.14 shows.
Figure 4.14 - Reflection Cubemap
After you add the Cubemap, you will have to bake reflections to the
objects, which we will cover in Chapter 5. If you have a small area or
surface that has a glossy reflection, you can make the probe small enough to
fit only the area of that object.
When you select the Reflection Cubemap object, you will be able to see all
options regarding the control and behavior of that probe (Figure 4.15).
Figure 4.15 - Cubemap options
Intensity
Radius
Falloff
Clipping (Start/End)
The Intensity defaults to one, which will produce the best results for
reflections. Unless you have a good reason to increase or decrease the
reflections, you can always leave the value at one. With the Radius you can
control the size of your probe, and the Falloff will set the distance between
the inner and outer probe. If you don't want to fade reflections, change the
value to zero.
For big scenes that have lots of objects with a glossy reflection, you can
make a large Cubemap that will surround the entire scene.
Regarding the position of the probe, make sure it is in a location that will
create proper reflections. The probe will act like a camera that takes a
snapshot of the surroundings and casts the image onto the glossy surfaces. It
will only project what is within its visual range.
For instance, if you place the probe near the floor, it will project the
background of your scene to the objects. You will see a black reflection in
the bottom half of every object.
Info: We will have to bake the probe to see all results with reflections. If you
don't bake the probe, the scene will remain the same.
After you add the probe to the scene, it will appear to be a simple plane. It
has an arrow showing the reflection direction and an area of influence. If
you increase the Distance for a Reflection plane, you will see the results as
a bigger probe (Figure 4.16).
The size is essential to define what the probe sees, which will also be
visible as a reflection.
You can preview the results of your probe using the “Show Preview Plane”
at the bottom of your probe options. At the “Viewport Display” field, you
will find that option to display what the plane sees (Figure 4.17).
Figure 4.17 - Probe preview
That is the best way to create true mirror objects in Eevee with a minimum
computational load to the scene. For projects that require true reflective
surfaces like mirrors, water, or anything with a perfect reflection, you
should use Reflection planes.
One of the advantages of the Reflection Planes is that they don't require any
type of baking.
A collection can store multiple types of objects in a scene and can receive
unique names, and you can even nest collections for a more powerful
organization. When you use collections with the probe visibility options,
you can control what objects will interact with the probe.
A scene that has no collections set in the visibility options will reflect and
interact with anything that is in the range of a probe. If you add a collection to
the visibility, only the objects that are part of that collection will show in
the probe area of effect.
Now, suppose we take the objects with a red color from that scene and place them in
a collection called “Red Objects.” If we add that collection to the Visibility
options, only they will appear for the probe (Figure 4.19).
Figure 4.19 - Visibility options for probes
Using those options will give you more power to choose if an object will
appear as part of the probe. If you start to adopt collections for all your
projects, it will be easy to control the visibility of objects for probes. For
instance, you could have a collection called “Furniture” and move all
objects representing furniture to that group.
Renaming
Moving
Erasing
Creating
Nesting
You will have all those controls for collections available at the Outliner
Editor.
To create a new collection, you can click the right mouse button in the
Outliner Editor, which will open a small menu. In this menu, you will see a
“New” option that creates an empty collection (Figure 4.21).
Figure 4.21 - Empty collection
You can rename the collection to something that will help identify the
contents. For instance, we can use “Red chair” for this collection. Double-
click the collection name to rename it. Since collections will help you with the
organization of a scene, it is essential to assign meaningful names to either
collections or objects.
Info: Each scene in Blender will have a “Scene Collection,” which works
like a master collection that has all others nested. You can’t remove the
Scene Collection. A newly created collection appears on the list as a
nested collection.
To move an object to that collection, you can click the object's name,
drag it, and release it over the collection name. A simple
click and drag will move objects between collections.
Now, if you set the “Red chair” as the Visibility Collection for the probe,
you will only see the objects from that collection in the reflections (Figure
4.23).
One of the benefits of using collections for a scene is the ability to use the
Append and Link options from the File menu in Blender. With those two
options, you can move data between files and reuse some of your assets.
For large projects that have dozens or hundreds of assets, you can use
collections to identify groups of objects quickly.
You can even create a “template” for Eevee with all necessary probes and
lights and place them in a collection. After starting a new project, you can
simply append the contents of that collection with pre-made settings.
What is next?
The light probes in Eevee are an excellent help for any project that aims
to make realistic real-time images. If you know how to use
them in a project, you can get results that are close to what Cycles delivers.
Using probes like the Irradiance Volume might transform any scene you
make in Eevee by adding Indirect Light calculations. When working with
projects that must show interiors or a scene in enclosed spaces, using
indirect lights might enhance the realism and quality of your shading.
The visuals of your scene will only change after you bake the results to the
scene, which is the main subject of the next chapter. With the baking
process, you will have to wait a few seconds or minutes for Eevee to
calculate each probe contribution. It is not like a render from Cycles, but it
will add a few moments to your render.
Unlike many processes that also require the baking of lights to textures, you
don't have to keep objects at fixed locations. After baking, you can move
them around, which will make animation production a lot easier.
The next step in your Eevee setup is to get your probes baked to the scene
and see the benefits of using a real-time render engine like Eevee.
Chapter 5 - Indirect Lights with Eevee
At this point, you have a solid understanding of how to use probes with
Eevee and add them to a scene, but placing them in a 3D scene won't trigger
the visuals or effects we need. There is an extra step we have to perform to
get the effects for each of those probes.
You have to bake the results of each probe to get the visuals in a render.
What is the baking process? That is the main subject of this chapter, where
you will learn how to manage and process the probes.
The baking component of Eevee will remind you of Cycles for a
moment because we have to process the scene to get a good result. But it
is a lot faster than what we get using Path Tracing.
You will also learn how to manage common problems associated with the
baking of Indirect Lights in Eevee, like the light leaks from surfaces that
can cause many delays in projects and make you go back to the modeling
stage.
In this chapter, you will learn how to manage and fix light leaks. Here is a
list of everything you will learn:
The price you will pay in Cycles is a long render time. In Eevee, we have
the benefit of getting renders in real-time, but most of the process to get
good indirect lights for the scene will require the use of probes and a panel
to process indirect lights.
Irradiance Volume
Reflection plane
Reflection Cubemap
Each one of those probes will contribute to the final solution to make a
better render in Eevee.
Before you start working on the solution to get your indirect lights in
Eevee, it is imperative to add all the materials to the project. Regardless of
the complexity or types, you will set all the colors for each surface.
The primary reason to work on all materials before you start dealing with
indirect lights is to ensure you are using the accurate light bounces. When
you process indirect lights, you will add to the scene some light bounces
that will carry colors from each surface.
To get that effect in Eevee, you will need a scene that has:
With those two aspects of the scene ready, we can move to calculate
indirect lights in Eevee and make our scenes look better visually. The
process of baking indirect lights in Eevee requires some calculations and
resembles a pre-rendering. As a result, you may have to wait a few seconds
or minutes until the calculations are complete.
It is not as long as a render in Cycles but will require some time to process.
You can start the process by pressing the button to make Blender calculate
all contributions to the indirect lights. When you press the "Bake indirect
Lighting," a progress bar will appear at the bottom of your interface,
showing the state of your calculation (Figure 5.2).
By pressing the button, you will bake not only the Irradiance Volume results
but also the Reflection Cubemap.
Look at the contribution made by the light bouncing in surfaces that have a
more prominent color.
After you calculate the indirect lights, you can view the irradiance samples
with an increased size to compare how they are reflecting light. Use the
“Irradiance Size” option to increase the size and enable the visualization
(Figure 5.4).
Each sphere will show a preview of how your samples are bouncing light
across the scene.
After you press this button, you will remove any saved solutions for
indirect lights and will be able to start over with the process. Replace
materials and make changes to the scene. Once you have the updates, you
can press the "Bake indirect Lighting" again.
Changes to the materials of a scene are the most
common cause of a full recalculation of the scene. Moving objects
around will not cause any problems or errors, but changing materials may
affect the scene directly.
For instance, you may have a wood floor for a scene and replace that
material with metal tiles. The colors of those two materials are entirely
different, and the difference will appear in the rendered scene. You have to recalculate to
have a visually accurate scene.
In any case, you can enable the automatic update of the calculations. At the
Render tab, find the "Auto Bake" option (Figure 5.6).
Figure 5.6 - Auto Bake
Every time an update to an object in your scene requires a change in the
indirect lights, Blender will trigger the baking process automatically.
When should you make changes to the settings for indirect lights? Usually,
when you have a situation that demands changes based on poor
shadows or bad shading. You may see artifacts in the final render, strange
shadows, or light leaks in the model.
Here is a list of possible settings that you can use to enhance the indirect
light calculations:
Diffuse bounces: How many times light will bounce in the scene.
Cubemap size: Resolution for the Cubemap used to record the
illumination.
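Why the bounce count matters can be shown with a small plain-Python model. Each bounce multiplies the light by the surface color (albedo), so later bounces add smaller contributions; this is a simplified geometric-series illustration, not Eevee's actual computation, and the albedo of 0.5 is a hypothetical mid-grey surface:

```python
# Simplified model of diffuse bounces: each bounce multiplies the light
# by the surface albedo, so contributions shrink with every bounce.
def indirect_energy(light, albedo, bounces):
    """Total indirect light gathered after a number of diffuse bounces."""
    return sum(light * albedo ** n for n in range(1, bounces + 1))

for bounces in (1, 2, 3, 8):
    print(f"{bounces} bounce(s): {indirect_energy(1.0, 0.5, bounces):.3f}")
```

The output climbs quickly at first and then flattens, which is why a handful of bounces is usually enough.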
By using those two settings, you can already improve the final solution for
your render in Eevee. To show how those settings affect the
final render of a scene, we can use the following project, which has indirect
lights calculated using the default values (Figure 5.8).
Use those settings whenever you feel that your final indirect lights solution
still lacks some quality for either the number of bounces or the irradiance
resolution.
Info: The Cubemap size will only affect the Reflection Cubemap probe. You
must bake that probe to see the effect.
At the Render tab, you will see the Sampling field. There you have two
settings for the Render and Viewport (Figure 5.10).
Figure 5.10 - Samples settings
In most cases, you will only need 64 samples for the final render. When glossy
surfaces start to display noise, which will appear as small white dots, you
can increase the samples for rendering to remove them.
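A rough way to reason about how many samples to add is the generic square-root rule for stochastic sampling. This is a heuristic estimate, not a measurement of Eevee itself:

```python
# Rule of thumb: noise in stochastic sampling tends to fall off with the
# square root of the sample count, so going from 64 to 256 samples
# roughly halves the remaining noise.  A generic estimate, not a
# measured property of Eevee.
import math

def relative_noise(samples, baseline=64):
    return math.sqrt(baseline / samples)

for s in (64, 128, 256, 512):
    print(f"{s} samples -> ~{relative_noise(s):.2f}x the noise of 64 samples")
```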
You can fix light leaks in Eevee using a variety of settings from different
locations. We will have to tweak settings for:
3D Models
Shadows
Lights
Depending on your Model, you may fix the leaks by changing just one of
the settings.
From a modeling point of view, you can say that a structure that doesn't
have the proper optimizations would look like Figure 5.12 shows.
The problem with that model is that it has only single planes for the
structure. If you plan to use Cycles for rendering, it will work fine, but with
Eevee, we have to be extra careful. After you start adding lights to the
scene, it will eventually show some light leaks on corners from indirect
light calculations.
If you decide to use an HDR or Sunlight from Blender, the problem will
appear with a higher level of intensity.
How to fix that? You can follow some simple guidelines to create 3D
models for Eevee and prevent light leaks. Regarding modeling, you can add
some thickness to all your 3D models. One of the most straightforward
ways to add thickness to the models is with the Solidify modifier.
For instance, we can take a model that will eventually suffer from light
leaks like the one from Figure 5.12. Take that model and apply a Solidify
modifier. As a result, you will get some thickness to the object (Figure
5.13).
That alone will not solve light leaks in Eevee but will reduce the chance of
having those undesired effects in your projects.
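As a sketch, the Solidify modifier can also be added from Blender's Python console. The thickness value below is only an illustration; adjust it to your wall width.

```python
import bpy

obj = bpy.context.active_object  # e.g. a wall built from single planes

# Add a Solidify modifier to give the planes some thickness,
# which reduces the chance of light leaks in Eevee
mod = obj.modifiers.new(name="Solidify", type='SOLIDIFY')
mod.thickness = 0.1  # thickness in scene units; an example value
```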
For Eevee, you should make a few adjustments to the process. The first
thing to do is to prepare the scene with models that have a lot of thickness.
To demonstrate how you can start a simple scene that has optimized
polygons for Eevee, we can make the model for the room shown in Figure
5.14.
As you can see from the image, we have a simple room with standard
thickness for the walls. A common approach to the modeling of such
environments would be to get a small plane at the corner of your room and
start extruding it from that point (Figure 5.15).
Figure 5.15 - Plane on the corner
That would work for a model you have to create and render in Cycles, but
in Eevee, you will eventually face some potential light leak problems. At
the beginning of your modeling, start adding some thickness to the walls. In
Edit Mode, select the vertices of your plane and stretch them (Figure 5.17).
If that geometry won't show up in the final render, you don't have to worry
about accuracy regarding the 3D modeling (Figure 5.18).
Figure 5.18 - Extruded walls
What if you got a model coming from an old project? In that case, you can
always apply the Solidify modifier.
For the shadows settings, you can use the following parameters to create
high-quality shadows that prevent leaks:
Method: Change to VSM
Cubemap: Increase to 1024px or higher. Those are shadows for area
and point lights.
Cascade Size: Increase to 2048px if you are using sunlight.
High Bitdepth: Enable
Soft Shadows: Enable
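The shadow settings above map to Eevee scene properties in the Python API. Here is a hedged sketch using the Blender 2.80 property names; note that `shadow_method` was removed in later versions, where Eevee uses a single shadow method.

```python
import bpy

eevee = bpy.context.scene.eevee

eevee.shadow_method = 'VSM'            # Variance Shadow Maps (Blender 2.80)
eevee.shadow_cube_size = '1024'        # shadows for Area and Point lights
eevee.shadow_cascade_size = '2048'     # shadows for the Sun light
eevee.use_shadow_high_bitdepth = True  # High Bitdepth
eevee.use_soft_shadows = True          # Soft Shadows
```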
Those settings will help you to fix potential light leaks in shadows. If you
have a scene with a baked solution for indirect lights, it is now time to
delete the cache and process them once again.
Regarding lights, you can use three types in the scene:
Area
Point
Sun
Each of them will require specific settings to prevent leaks. Of those
options, you will find light leaks most often when using the Sun, which is
why an Area light is the best choice for scenes in Eevee.
If you are using an Area or Point light, you can select it and open the Object
Data panel to see all options regarding lights (Figure 5.20).
Figure 5.20 - Area light options
If you are using a Sun for your scene, you can use the same values with a
few differences:
Direction: Do not point the Sun directly to the interior of your scene
Angle: Change to 0.1
By making those changes, you will help prevent and fix potential light
leaks. With those settings and tweaks, you can delete the cache for your
Indirect lights and process them again (Figure 5.21).
Figure 5.21 - Sun options
What if the leaks don't disappear? If they remain in your scene, you can try
to move the lights from their location or look at the structure of your 3D
model. At some point, you will find a combination for settings and 3D
models that will not generate leaks.
The most common scenario for leaks comes from projects that you try to
migrate from Cycles to Eevee. Since we don't have those problems in
Cycles, you will eventually find yourself with a scene that needs a fix in
Eevee.
For the cases where you start from scratch with the purpose to render in
Eevee, you can follow the guidelines for modeling to prevent the problem
with your lights.
The process is simple to complete and only requires you to place one of
the probes in the scene and, at the Render tab, press the "Bake Cubemap
Only" button (Figure 5.22).
Figure 5.22 - Cubemap bake
When you finish the Cubemap bake, you will be able to see reflections and
other visual effects related to the probe (Figure 5.23).
Tip: You can make glossy materials that look like metal using the Glossy
BSDF shader and setting the roughness to zero.
If you look closely at Figure 5.23, you will notice the metallic object is
not reflecting the red chair. The reason is that our probe is in the middle
of the scene; from that position, it doesn't see any objects to the right.
To fix that you have to either add another Cubemap between the chair and
object or move the probe. By placing the probe between both objects, you
will get the chair reflected in the object (Figure 5.25).
Figure 5.25 - Object reflection
Tip: You can view the probe as a mirror ball after enabling the eye icon for
the Cubemap at the Indirect Lighting panel.
Of the two reflection probes we have available in Eevee, you have to bake
only the Cubemap. The Reflection Plane doesn't require any baking.
We already mentioned the effect when discussing material settings for
Eevee, but now it is time to explore it a little more.
As the name states, with Screen Space Reflections you get an effect that
will mirror your scene on glossy surfaces. It will capture what you see on
screen and mirror that on those surfaces, which helps to add another level
of realism to the scenes. The effect works even if you don't have probes in
the scene.
The result is a surface with reflections for materials like a floor with glossy
reflections. That is different from the effect you get in a probe because it
blends with the materials. A reflection probe would make a perfect mirror.
With the Screen Space Reflections, you will get an effect closer to glossy
surfaces in Cycles.
When you activate the Overscan option, you will get Eevee extending the
area of your render for Screen Space Reflections. The default value will
extend the visible region by 3%. To get more objects appearing on screen
for rendering, you can increase the Overscan to a more significant amount.
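As a rough illustration of what Overscan does, the arithmetic below extends a render resolution by a given percentage. How exactly Blender rounds or distributes the extra pixels internally is an implementation detail, so treat the numbers as approximate.

```python
def overscan_size(width, height, percent=3.0):
    """Approximate the extended buffer size Eevee renders internally
    when Overscan is enabled (each dimension grows by `percent`%)."""
    scale = 1.0 + percent / 100.0
    return round(width * scale), round(height * scale)

# A 1920 x 1080 render with the default 3% overscan
w, h = overscan_size(1920, 1080, 3.0)
print(w, h)  # roughly 1978 x 1112
```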
What is next?
You now have all the necessary information to start working on a project
that will use all available tools and resources to get a realistic
real-time render in Eevee, from indirect lights and probes to the final
render.
The next step regarding using Eevee is to get an existing project and apply
those settings to check results and fix potential problems. In the following
two chapters, we will get projects that have only the modeling part ready,
and add elements like lights and materials to render with Eevee.
In the first scene, we will cover what to do in an interior project that has
challenges for light placement and visualization. The next scene includes an
exterior visualization with unique challenges for representing a daylight
simulation. Both scenes will allow us to apply the knowledge acquired until
this point in the book.
After you have the scenes ready, we will jump to the color management
panel and effects to apply some exposure and gamma settings to improve the
lights and the overall image of our render.
Chapter 6 - Interior lights with Eevee
Until this point in the book, we learned a lot about how to work with Eevee
and set up several aspects of a project, like lights, probes, materials, and
the render process. It is now time to put the workflow to the test with the
complete settings to create an interior render with Eevee.
At the end of the chapter, you will have a better understanding of how to go
from start to finish with a project using Eevee, and have your renders in
real-time.
You might encounter a few changes between projects based on the nature of
what you are trying to render. For instance, a project that has a focus on
interior scenes will include a few unique problems that will not appear in
external scenes.
1. Check the model to see if you have some thickness to walls and make
sure there are no unnecessary gaps between objects.
2. Add environmental lights using either an HDR map or Area Lights.
For interiors, you should prefer Area Lights.
3. Create all materials to the scene. If you can, use only PBR materials
for your surfaces.
4. Add the light sources that will generate the proper mood for your
scene. It could be an Area, Point, or Sun.
5. Create probes to compute indirect lights and reflections.
6. Bake the probes to the scene
7. Review the solution to fix potential shading and light leak problems.
8. Add effects and post-production to finish your image.
As you can see from the list, a critical step that you must perform before
going any further in the process is adding materials. You should look for
PBR materials for all the surfaces to make sure you have the benefits of
accurate physical reflections and surfaces.
We will apply the checklist during the rest of this chapter to create the
image shown in Figure 6.1.
Figure 6.1 - Interior scene in Eevee
The scene will work as an example of how we can create realistic images in
real-time using Eevee.
6.2 Interior models for Eevee
The first step regarding modeling in a project you know will use Eevee for
rendering is to make sure all elements have a proper thickness to avoid
future problems regarding light leaks. You can use a model that has thin
planes as structure and not have light leaks at all, but it will be better to
avoid any potential problems.
The structure of the scene has three main elements:
Floor
Walls
Window
A critical technique that you can try to follow in your projects is to
remove the geometry behind the camera. That will give your scene another
source of light coming from the background.
At this point, the scene doesn't have any lights or materials in Eevee. If
you trigger a render, it will display only plain default colors for all the
objects.
The main point of entrance for lights in the scene is the side window, which
doesn't have any frames, and the big opening behind the camera (Figure
6.3).
For the settings regarding this Area Light that will simulate the illumination
coming from the background, you will enable “Contact Shadows” and
adjust the Power to meet the desired level (Figure 6.5).
Figure 6.5 - Light settings
Since we are still at the beginning of our setup for the scene, it is too early
to change settings related to light leaks. If the scene requires some
additional adjustments, we will wait until the indirect light calculations.
Info: Why not use a Sun Light? The Sun Light in Eevee has a high
probability of generating light leaks, and you should avoid using it if you
have other options. In this case, an Area Light will create the effect we
need.
The next light for the scene will be another Area that will stay at the big
opening behind the camera. You will adjust the scale of this particular light
to fit the exact size of the opening (Figure 6.6).
Also, you will not want multiple shadows in the scene. For that reason, you
must disable shadow casting for this second light. You can use this
technique for any interior scene you work on.
That alone will create a great start for any interior rendering with Eevee.
Tip: A quick way to adjust Area Lights to an opening is to scale them
dynamically. If you select an Area Light, you will see a yellow border with
small squares on each corner of the light shape. Click and drag one of
those squares to dynamically resize the shape.
The first object that will receive materials is the floor model, which will use
a PBR parquet texture. For this particular texture, we will use a set of
textures containing:
Color
Ambient Occlusion
Roughness
Normal
To add the material in the floor model of that scene, you can:
1. Select the floor object
2. Go to the Material tab
3. Create a new material
4. Set the shader as the Principled BSDF
5. Open the Shader Editor
6. Add four Image Texture Nodes
7. Open each one of the textures in the Texture Nodes
8. Set the color space for both Roughness and Normal to Non-Color
9. Add a MixRGB Node
10. Add a Normal Map Node
11. Connect the Color and Ambient Occlusion to the MixRGB
12. Connect the MixRGB to the Base Color in the Principled BSDF
13. Connect the Roughness to the Roughness socket of the Principled
BSDF
14. Connect the Normal to the Normal Map Node
15. Connect the Normal Map Node to the Normal socket of the
Principled BSDF
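The fifteen steps above can be sketched as a Python script using Blender's node API (Blender 2.80 names). The texture file paths are hypothetical placeholders, and the Multiply blend mode for the MixRGB Node is an assumption for illustration; the steps themselves don't fix a blend mode.

```python
import bpy

# Hypothetical texture files for the parquet floor (placeholders)
paths = {
    "color": "//textures/parquet_color.png",
    "ao": "//textures/parquet_ao.png",
    "roughness": "//textures/parquet_roughness.png",
    "normal": "//textures/parquet_normal.png",
}

mat = bpy.data.materials.new("Parquet_PBR")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
principled = nodes["Principled BSDF"]  # created automatically by use_nodes

# Steps 6-7: add four Image Texture Nodes and load the maps
tex = {}
for name, path in paths.items():
    node = nodes.new('ShaderNodeTexImage')
    node.image = bpy.data.images.load(path)
    tex[name] = node

# Step 8: Roughness and Normal carry data, not color
tex["roughness"].image.colorspace_settings.name = 'Non-Color'
tex["normal"].image.colorspace_settings.name = 'Non-Color'

# Steps 9-10: MixRGB and Normal Map Nodes
mix = nodes.new('ShaderNodeMixRGB')
mix.blend_type = 'MULTIPLY'  # assumption: multiply the AO over the color
normal_map = nodes.new('ShaderNodeNormalMap')

# Steps 11-15: connections
links.new(tex["color"].outputs['Color'], mix.inputs['Color1'])
links.new(tex["ao"].outputs['Color'], mix.inputs['Color2'])
links.new(mix.outputs['Color'], principled.inputs['Base Color'])
links.new(tex["roughness"].outputs['Color'], principled.inputs['Roughness'])
links.new(tex["normal"].outputs['Color'], normal_map.inputs['Color'])
links.new(normal_map.outputs['Normal'], principled.inputs['Normal'])

# Assign the material to the selected floor object
bpy.context.active_object.data.materials.append(mat)
```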
That will create the base PBR texture for the floor. We also want to add
some tiling control with a Texture Coordinate and a Mapping Node:
In the end, you will have the setup shown in Figure 6.8.
You can adjust the scale of the texture using the Mapping Node until you
get a size that fits your scene scale (Figure 6.9).
If you want a more detailed explanation regarding materials and Nodes, you
can go back to chapter 3, where we discuss them in more detail.
Tip: You can isolate the selection of an object to make it easier to edit
materials. Select the object and press SHIFT+H. Press ALT+H to display
all other objects again.
For instance, we can select the chair frame and go to the Material tab.
There you will add a new material, and you don't even have to use the
Shader Editor; the whole process can use the Material tab options.
Choose a Glossy BSDF as the shader for this material and set the roughness
to zero. With the roughness at zero, you will have a clear reflection on the
surface. Make the material color white, and you should have a chrome
surface (Figure 6.10).
Figure 6.10 - Chrome material
The process with the Principled BSDF would require a similar approach,
using the roughness settings.
Remember that we won't see much difference in that material until we add a
Reflection Cubemap to the scene.
For instance, we can get the fabric materials for the chair cushions from
another project. Go to the File → Append menu. Pick the file you want to
use as an external library, and you will see folders with all the available
data (Figure 6.11).
Figure 6.11 - Folders to Append
Use the Material folder and look for a material that you want to bring to the
current project. In our case, I will use the Fabric material. Select the
material and hit the Append button on the top right.
Tip: You should always give meaningful names to materials if you want to
reuse them later in other projects.
How to use that material? It will now be part of your file assets. At the
Material tab, you will find it in a list you can open from the material
selector (Figure 6.12).
Figure 6.12 - Material selector
If you have the model for the cushions selected, you can easily open the
material selector and apply the Fabric material to the objects. If you don't
like the tiling for that particular material, you can always open the Shader
Editor to change and adjust the texture.
To compute the indirect lights and reflections, this scene will need two
probes:
Irradiance Volume
Reflection Cubemap
The Irradiance Volume will generate all necessary light bounces for indirect
light, since Eevee can't process that alone. With the Reflection Cubemap, we
will give the materials using glossy surfaces something to reflect, so they
don't look artificial.
You can start with the Irradiance Volume for the scene. Press the SHIFT+A
keys and add the probe. Set the Distance for the object to zero to use no
falloff; the irradiance shape will then have its full effect.
Since the default distribution of points inside the Irradiance Volume is low
for an interior scene, we can increase the number of points from a matrix of
4 to 6. Go to the Object Data tab, and with the Irradiance Volume selected,
change the Resolution to six on all axes (Figure 6.13).
Figure 6.13 - Irradiance Volume settings
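If you prefer scripting, the Irradiance Volume setup can be sketched with `bpy` as below. Property names follow the Blender 2.80 API; the scale values are placeholders for this particular room, and mapping the book's "distance to zero" to the probe's falloff is an assumption.

```python
import bpy

# Add an Irradiance Volume probe at the 3D cursor
bpy.ops.object.lightprobe_add(type='GRID')
probe = bpy.context.active_object

# The book's "distance to zero for no falloff"; assumed to map to falloff
probe.data.falloff = 0.0

# Increase the sample points from the default 4x4x4 to 6x6x6
probe.data.grid_resolution_x = 6
probe.data.grid_resolution_y = 6
probe.data.grid_resolution_z = 6

# Scale the probe to cover the room interior (example values)
probe.scale = (4.0, 4.0, 2.5)
```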
Now, using the scale transformation, you will try to make the probe fit the
scene interior. Use the S key with each corresponding axis to make the
probe fit the entire scene (Figure 6.14).
Figure 6.14 - Adjusting the Irradiance Volume size
It is now time to process the scene indirect lights! Go to the Render tab
and, in the Indirect Lighting options, change the Diffuse Bounces to six and
press the “Bake Indirect Lighting” button to calculate both probes.
The process may take a while to finish, but after a few moments, you will
have the final solution (Figure 6.17).
Figure 6.17 - Baked indirect lights
As you can see from Figure 6.17, we don’t have any light leaks, which is a
great sign. But, we can improve the simulation. You can increase the Power
for the environment light from 1000 to 2500 to make it stronger and mimic
a Sun.
From that simple change in settings, we will have a much better solution to
emulate a daylight scene (Figure 6.18).
Figure 6.18 - Simulating a daylight scene
The easiest way to set the active camera is with the CTRL+ALT+Numpad 0
shortcut. To use this shortcut, you have to set the view you want to have
from the scene with the 3D navigation shortcuts and press the keys.
For instance, we can use the middle mouse button to orbit the scene until
you find the best point of view. Press the keys, and your active camera in
the scene will align to that view (Figure 6.19).
Figure 6.19 - Align the view
After you align for the first time, you can select the camera border and use:
G Key to move
G key and Z key twice to dolly your camera
Since we are viewing an interior, you might want to reduce the focal
distance to make your camera have a wider field of view. A typical value
for interiors is between 18 and 24 mm (Figure 6.20).
Figure 6.20 - Camera focal distance
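The relation between focal length and field of view explains why 18-24 mm feels wide for interiors. Here is a small, self-contained calculation assuming Blender's default 36 mm sensor width.

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view (in degrees) for a given focal length,
    assuming Blender's default 36 mm camera sensor width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

print(round(horizontal_fov(18), 1))  # 90.0 degrees: very wide
print(round(horizontal_fov(24), 1))  # about 73.7 degrees
print(round(horizontal_fov(50), 1))  # about 39.6 degrees (default lens)
```

The shorter the focal length, the wider the view, which is why dropping from the default 50 mm to 18-24 mm lets the camera capture more of a small room.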
Once you have the camera in the desired location for rendering, we can add
some effects and work on improved shadows.
For our scene, we will use a couple of those effects to improve the overall
visualization of the scene. Those will be a great help, but we will do even
more regarding effects in chapter 8.
The first effect that you will enable is the Ambient Occlusion at the top of
your Render tab. What effect will the Ambient Occlusion add to the scene?
By using Ambient Occlusion in our project, you will start to see proximity
shadows on objects (Figure 6.22).
That is a visual effect that we usually see in the real world but that
Eevee can't reproduce by default.
If you compare the scene with and without Ambient Occlusion, you will
notice how the proximity shadows make a difference in the overall realism
of the scene (Figure 6.23).
Figure 6.23 - Ambient Occlusion comparison
You can even use the Intensity control to make your proximity shadows
spread below objects. One of the benefits of using the effect is improved
depth perception in the scene. The lack of proximity shadows might make
your objects look like they are “floating” above the floor.
In Cycles, you don’t have to use such an effect, because the renderer
processes those shadows as part of the physical light calculations for the
image.
Using the default settings for shadows will result in poor shadows in any
render. For high-quality shadows in your interior renders, you can reuse the
settings from chapter 5: the VSM method, larger Cubemap and Cascade sizes,
High Bitdepth, and Soft Shadows enabled.
Using those settings will genuinely improve the shadows for your scene.
Look at Figure 6.25 to see how the shadow settings change the way any
object casts shadows.
What about the shadows from light sources? In that case, you can use the
settings available at the light object. For instance, using the options from
our Area Light will enable you to control the softness of your shadows.
The default value will give a weak, soft edge for all shadows. If you reduce
the softness of your shadows for that particular light, you will get hard edge
shadows (Figure 6.26).
Figure 6.26 - Hard-edge shadows
If you are trying to create a daylight scene with Eevee, the control of your
shadows is critical. In bright daylight, you will get most objects casting
hard edge shadows. As a final effect, you can also enable the Screen Space
Reflections.
The scene still needs a boost in lighting, which we will add with the color
management settings in Chapter 8.
What is next?
Even after those tweaks and adjustments to the scene effects, we can still
improve the overall quality of our lights a lot. If you take a close look at
the scene, it is still too dark. The shadows and Ambient Occlusion helped to
build a more realistic scene, but we can go further.
The next chapter will cover the aspects of exterior lighting in Eevee for any
project, where you can use an HDR map for the environment and a light
setup unique to external scenes.
Is it too different from an interior scene? You will see several aspects where
we use an identical setup for external scenes, with a few key differences.
Probe placement and handling will require special attention because probes
have a limited volume.
If you want to learn how to improve your projects, in any context, using the
Color Management settings of Eevee, you can go straight to chapter 8. There
you will learn how to give scenes a significant boost in lighting using
controls like the exposure and also gamma settings.
Those are the settings that can completely transform any project, and you
will learn how to take advantage of them and of other options, like Depth of
Field, which can also change the render.
Chapter 7 - External lights with Eevee
In chapter 6, we learned how to apply the workflow for Eevee in an interior
scene that uses all the techniques explained throughout the book. You had
the opportunity to see how to place probes, lights, and materials for an
interior scene.
An interior scene presents unique challenges for Eevee, like working with
indirect lights and probes. In this chapter, you will find our workflow
adapted to an external scene to handle its own unique problems.
An exterior scene will show some unique challenges for lighting with
Eevee, especially regarding the environment map, which works much better
with an HDR. Here is the Eevee workflow for exteriors:
1. Check the model to see if you have some thickness to objects and
make sure there are no unnecessary gaps between objects or
overlapping faces.
2. Add environmental lights using an HDR map with an Area Light or
Sun
3. Adjust the HDR rotation to match with the background
4. Align the active camera with the project
5. Add a light source that can cast shadows (Sun or Area)
6. Create all materials to the scene. If you can, use only PBR materials
for your surfaces.
7. Create probes to compute indirect lights and reflections.
8. Review the solution to fix potential shading and light leak problems.
9. Add effects and post-production to finish your image.
As you can see from the list of steps, you should use an HDR for the
environment map this time. In the checklist for interiors, you could either
use an HDR or go with Area Lights. For open spaces, you will have a much
harder time trying to find the best settings with Area Lights alone.
The HDR map will help in several ways to create a realistic render for
exteriors with Eevee, providing both natural lighting and a background for
the composition.
Since we will also encounter much fewer problems regarding light leaks in
exterior scenes, we can use lights that could eventually generate shadow
problems in interior scenes, like a Sun. You will notice that in step two,
you can use either a Sun or an Area Light in the scene.
Using a Sun will bring lots of benefits for shadow casting, such as
easier-to-manage hard-edge shadows.
The main reason you have to use a light and an environmental texture is
that Eevee can’t generate shadows for HDR maps. If you only use an HDR
map as the light source for the scene, you won't get any shadow casting for
the objects, which will compromise the realism of your scene.
During the rest of the chapter, we will use all the steps to set up and prepare
the scene for rendering. Some of the procedures will use similar options
from the interior render, but others will be unique to external scenes.
We will apply the workflow to create the image shown in Figure 7.1.
Figure 7.1 - External model in Eevee
The model is simple but will help us understand and apply the concepts
from the checklist.
Since the objects are not flat and have some thickness, you will not find
problems regarding light leaks on them.
The main issue that could potentially appear in external models, and which
you should fix to avoid texturing problems, is overlapping faces. After the
modeling process, you may have some overlapping faces in your models, which
will appear in the 3D Viewport as darker faces.
Select the overlapping faces and use a right-click to open the Context menu.
You can also use the A key to select all elements. From that menu, you can
go to Merge → By Distance (Figure 7.3).
Figure 7.3 - Merge by distance
By keeping the distance to merge set to zero, you will remove all
overlapping faces (Figure 7.4).
The problem could also appear when you import models from external
sources like an OBJ or FBX file that you bring to Blender.
Info: Merge by Distance is the name Blender now uses for a tool that had a
different name for several years. In previous versions of Blender, the tool
was called Remove Doubles.
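Conceptually, Merge → By Distance collapses vertices that sit within a threshold distance of an already-kept vertex. The following is a simplified, self-contained model of that behavior; Blender's actual implementation (in its mesh editing core) is more involved and also merges the affected faces.

```python
def merge_by_distance(vertices, threshold=0.0):
    """Keep only vertices that are farther than `threshold` from every
    already-kept vertex -- a toy model of Merge -> By Distance. With a
    threshold of zero, only exact duplicates are merged."""
    kept = []
    for v in vertices:
        duplicate = any(
            sum((a - b) ** 2 for a, b in zip(v, k)) <= threshold ** 2
            for k in kept
        )
        if not duplicate:
            kept.append(v)
    return kept

# Two vertices at the origin collapse into one; the third survives
verts = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(merge_by_distance(verts))  # [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
```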
7.3 Environmental lights for exteriors
When you are dealing with interior models, it might not be a good idea to
use an HDR image with Eevee, because it will not generate shadows, and the
benefits of having more natural light will not compensate for the increased
chance of light leaks. Instead of using an HDR for interiors, it is much
better to go with an Area Light.
For exterior render projects, the HDR map becomes more useful for both
lighting and composition. You can use the natural shading offered by an HDR
map, applied as an environment map, combined with another light source to
create shadows.
In our project, you will start by adding the HDR map shown in Figure 7.5
to the World tab as an Environment map.
The map has a sky background that will be perfect to use as a composition
for the final render. Use the Rendered mode from Eevee to align and place
the HDR in the proper location.
At the World tab, you can also set the intensity of your HDR map shading
using the Strength value (Figure 7.6).
Keep in mind that we will later add a light source to generate shadows for
the scene. The HDR map should work now as a reference for us to later
place the camera.
However, you can also use the World tab alone to set up the Nodes that
control the HDR rotation. At the World tab, you can click on the small
circle next to the Vector option of the Environment map (Figure 7.7).
Figure 7.7 - Vector input socket
From the options that will appear for the Vector, find and pick the
Mapping Node. In the Mapping options, you will see another Vector field,
which represents the input socket for that Node. Click, once again, on the
small circle on the right and choose the Generated option from the Texture
Coordinate field (Figure 7.8).
Figure 7.8 - Texture coordinate
Use the Rotation controls from the Mapping to set the rotation of your HDR
map for the render. Keep your shading mode as Rendered to get instant
feedback about the location of your HDR.
Tip: If you don't want the HDR to appear in your final render you can
always enable the Transparent option in the Film settings. You will find
those options in the Render tab for Eevee.
You can use the 3D navigation options to find a viewing angle that you like
and press the CTRL+ALT+Numpad 0 keys. That will make the active camera
align with the view (Figure 7.9).
Figure 7.9 - Aligned camera
Use the camera settings to change your Focal Distance and also the
transformation shortcuts to fine-tune the framing. Select the camera border
to make the adjustments.
When you get the perfect alignment, you can go back to the World tab and
also change the settings for the HDR rotation.
Using a Sun to create shadows in an interior is dangerous, because it has a
high chance of generating light leaks. However, in exterior visualization
projects, we don't have such a problem.
By pressing the SHIFT+A keys and going to the light options, we can add
a Sun to the scene. With the Sun still selected, move it up and away from
the 3D model. Since we already have the camera with the final framing, you
can adjust the Sun rotation with the R key to make your light come from
behind the camera (Figure 7.10).
After you have the Sun in the correct location, we can open its options to
make adjustments to shadows and angle. To improve shadows and other
aspects of the Sun, you can change the following options:
Bias: 0.75
Exponent: 1.5
Softness: 2.5
Contact shadows: Enabled
Those settings will help to create an excellent-looking daylight simulation,
with the Sun casting hard-edge shadows.
When adding materials to the surfaces, you will probably have to add two Nodes:
Texture Coordinate
Mapping
For instance, look at the brick fence from the scene in Figure 7.11.
Figure 7.11 - Brick fence
That material is using the default tiling from our PBR material, with no
additional controls for the surface. Notice how big the bricks are
compared to the rest of the scene. We have to add both Nodes to the PBR
material to scale our textures so that they appear with the correct size
for a brick (Figure 7.12).
Finding the best scale for each texture is the biggest challenge in
exterior models, because each PBR material will require a unique setting for
the scale. Luckily for us, we can use the Rendered mode from Eevee to get
real-time feedback on textures without the need for render tests.
Tip: If you want to see how to set up PBR materials and control tiling,
go back to chapter 3, where we discuss the workflow to handle PBR
materials.
For instance, we can apply the glass material to the windows of our project.
With the glass object selected, we can apply a new material and use a
Principled BSDF shader.
Set the Transmission to 1.0: that will make the material fully transparent.
Set the Roughness to zero: with a value of zero, you will get crisp
reflections from the glass.
From the Settings options, enable Screen Space Refraction: the option will
help our material to reflect the surroundings of our scene, even without a
probe.
The last setting is what will make a huge difference in our glass setup. But,
it will only work if you enable the Refraction option at the Screen Space
Reflections (Figure 7.13).
Figure 7.13 - Glass for windows
Since we are using an HDR map as the background, we will have some
great reflections for the windows. Don’t forget to enable the Screen Space
Reflections and the Refraction option.
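The glass setup can be condensed into a short `bpy` sketch. Socket and property names below follow the Blender 2.80 Principled BSDF and Eevee settings; run it inside Blender.

```python
import bpy

mat = bpy.data.materials.new("Window_Glass")
mat.use_nodes = True
principled = mat.node_tree.nodes["Principled BSDF"]

principled.inputs['Transmission'].default_value = 1.0  # fully transparent
principled.inputs['Roughness'].default_value = 0.0     # crisp reflections

# Material-level switch (the Settings panel of the material)
mat.use_screen_refraction = True

# Scene-level switches: Screen Space Reflections plus its Refraction option
eevee = bpy.context.scene.eevee
eevee.use_ssr = True
eevee.use_ssr_refraction = True
```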
Info: Until now, we have seen three different methods to create glass in
Eevee. What is the best option? That will depend on the visuals you want to
achieve for the glass surface. All three options will give you great
results, but with differences in the final reflections.
Like in the interior scene, you have to add probes to compute indirect
lights; otherwise, you will only get the lights from direct sources in the
scene. In external scenes, you will only have a few places in the model
that will benefit from indirect bounces.
For instance, in our example, you will notice only a few areas that will
most likely receive indirect bounces from lights (Figure 7.14).
Those locations in the model have small enclosed spaces that will enable
light to bounce around a few times. Using your camera viewing angle as a
reference, we will add the Irradiance Volume probe to the model and scale
it to cover all the front space of the model (Figure 7.15).
Figure 7.15 - Irradiance Volume area
Make the Irradiance Volume cover only the areas visible to the camera,
which will optimize the calculations later. We also have some surfaces that
represent glass, which will benefit from a Reflection Cubemap. Add
the probe and place it at the front of the scene (Figure 7.16).
The baking results will help us evaluate the location of your Irradiance
Volume probe, and if necessary, you can make adjustments to the samples
and location. It will not have the same impact on the render as it would
in interior visualizations.
To improve the render, you can enable three effects:
Ambient Occlusion
Screen Space Reflections
Bloom
You can enable all the effects using the Render tab options. With the Screen
Space Reflection, we will get better reflections for glossy surfaces. The
Bloom effect is useful for external scenes simulating bright daylight
because it will add a glow effect on surfaces receiving direct light (Figure
7.18).
Of the three effects, you can use the Ambient Occlusion to add some
depth to the scene by increasing the Distance value. If you increase the
Distance, you will get more prominent contact shadows for objects, which
will give the impression of more depth in the scene (Figure 7.20).
As a result, you will get a render that has a visual aspect of depth. It still
looks dark, but we will improve the visuals using some color correction
techniques in chapter 8.
What is next?
The process to set up and render an exterior scene has many similarities
with an interior, and you will be able to reuse a big part of the workflow.
Since we are dealing with a larger scale and open spaces, we can
experiment more with light sources like the Sun and use an HDR map in
the background.
Using those two options in interior scenes would increase the chance of
light leaks, but not in exterior renders.
What is the next step? After setting up the scene and adding effects, we
will still have to work with the Color Management options to further
enhance the render in Eevee. There we will find options to improve
brightness and color balance.
In the next chapter, you will learn how to use the exposure and gamma
controls from that panel. Besides those effects, we will also take a look at
how to create and use volumetric lights in Eevee and how to work with a
studio setup with an infinite background.
Chapter 8 - Color management and
effects with Eevee
After you finish a project in Eevee, you may have a render that still needs
some work regarding color or contrast balance. Some artists would take
that image into image-editing software. In Blender, you will find a panel
that can help you make those adjustments to any render.
Besides the color management options, you will also find in this chapter
explanations of how to add several effects to a project. For instance,
you will learn how to work with Depth of Field in Eevee, which can create
blurred backgrounds for your render. It works much faster than in Cycles.
You will also find a detailed explanation about the use of volumetric lights
for Eevee that can add depth to a scene by showing the light beams. To get
that effect, you need an object container and a particular type of shader
called Volume Scatter.
To finish the chapter, you will see how to create a studio scene with an
infinite background to make renders and present products in real-time.
In Blender, we can create volumetric lights using a combination of
materials and render settings for Eevee.
You will need a few objects in your scene to work with volumetric lights in
Eevee. The renderer uses a container object with a special type of shader
as its material. You will control the density and other aspects of the
effect in the Material tab, and in Eevee, we have to enable volumetric
lights in the Render tab.
If you look at the Render tab, you will see a section for Volumetrics
(Figure 8.1).
Figure 8.1 - Volumetrics for rendering
You must enable Volumetric Lighting and, if you need shadows to
interact with the effect, also enable Volumetric Shadows. Like other
effects in Eevee, you will have several controls to adjust the way your
lights appear, such as the number of samples and the tile resolution.
If the volumetric lights don't have the quality you were expecting, use
those settings to increase samples and resolution.
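These settings can be scripted as well. A sketch of this render configuration, assuming the Blender 2.8x Eevee property names; it must run inside Blender:

```python
import bpy

# Volumetric settings from the Render tab, set through the Python API
# (Blender 2.8x property names).
eevee = bpy.context.scene.eevee

eevee.use_volumetric_lights = True    # Volumetric Lighting
eevee.use_volumetric_shadows = True   # shadows interacting with the volume
eevee.volumetric_samples = 128        # more samples improve quality, cost speed
eevee.volumetric_tile_size = '2'      # smaller tiles give higher resolution
```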
You can use any Mesh object as a container and apply the Principled
Volume shader. For instance, we can create a cube in the scene and assign a
material with that particular shader (Figure 8.2).
Figure 8.2 - Volume Scatter shader
After you create the object and assign the shader, you will place the camera
inside the container area. Otherwise, the volumetric effect won't be visible.
You will start to see the volume lights when rendering the scene (Figure
8.3).
Figure 8.3 - Volumetric effect
In Figure 8.3, you can see the effect from a Spotlight. The Density of your
Principled Volume starts at 1.0; reduce the value to 0.1 to see your
lights.
You can change settings in the material to control certain aspects of the
effect like the density of your volume and also the color used to create the
foggy aspect of your effect.
Info: Keep in mind that for the Principled Volume to work you must add it
to the Volume field and not Surface.
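The container setup above can be sketched in Python. The node and operator names below are from the Blender 2.8x API, and the material name is a hypothetical choice; the script must run inside Blender:

```python
import bpy

# Create a cube container and assign a volume shader to its Volume output.
bpy.ops.mesh.primitive_cube_add(size=10.0)
cube = bpy.context.active_object

mat = bpy.data.materials.new("VolumeContainer")  # hypothetical name
mat.use_nodes = True
nodes = mat.node_tree.nodes
nodes.clear()

volume = nodes.new('ShaderNodeVolumePrincipled')
volume.inputs['Density'].default_value = 0.1  # 1.0 is usually too dense

output = nodes.new('ShaderNodeOutputMaterial')
# Connect to the Volume socket, not Surface, or the effect will not appear.
mat.node_tree.links.new(volume.outputs['Volume'], output.inputs['Volume'])

cube.data.materials.append(mat)
```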
After you add an Emission shader to an object, it should start to
contribute to the lights of your project. In Cycles, you don't have to
perform any extra steps for materials to emit light. For Eevee, we have
to use an Irradiance Volume and the Indirect Lighting baking process.
You can select an object and add the material with the shader set to
Emission (Figure 8.5).
Figure 8.5 - Object with Emission
If you render the scene, you will notice that even with an emission shader,
you won't get any lights from the object. The solution for that type of object
is to use an Irradiance Volume and bake indirect lights (Figure 8.6).
Figure 8.6 - Emission after the baking process
As a result, you will see the objects contributing to the overall illumination
of the scene. However, we have a limitation in those types of lights for
Eevee. They will not generate shadows.
If you need shadows for the lights coming from those planes, you will have
to add another source close to them to get the shadows generated. You can
do that using options like an Area Light with a small amount of energy to
create shadows.
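The baking step itself can be triggered from a script. A sketch, assuming the Blender 2.8x operator names and an Irradiance Volume probe already placed in the scene; it must run inside Blender:

```python
import bpy

# Bake indirect lighting so Emission materials contribute light in Eevee.
# Assumes an Irradiance Volume probe covers the emitting objects.
bpy.ops.scene.light_cache_bake()

# To discard a bake and start over:
# bpy.ops.scene.light_cache_free()
```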
In the Render tab, you will find a collection of tools and options that can
transform a flat and unbalanced render into a realistic and bright image.
With the Color Management options, you will find the settings needed to
make significant changes to your renders (Figure 8.7).
Figure 8.7 - Color management options
Look: A quick panel that offers templates for contrast settings for the
scene. You can choose from options starting with "Very Low" and
going up to "Very High." The scene will have the colors changed to
match the template you choose.
Exposure: The exposure settings will help you manage the brightness
of a render with both Eevee and Cycles. The slider starts at zero,
which means no change in exposure. If a scene needs additional
light, increase the value to brighten it.
Gamma: Do you think your render needs a change in the balance of
whites and blacks? That is what you can do with the gamma
settings. The slider lets you control the balance between dark and
bright colors. You can even use a more advanced option to handle
gamma with curves, for more visual control.
Unfortunately, you won't find any templates with the best values to use in
the Color Management panel. The main reason for that is the unique
nature of each project and scene. You will have to manage lights,
materials, and environments that only apply to your scene.
Info: One of the many benefits of using Eevee for rendering is that you will
be able to quickly evaluate the results of adjustments from the color
management in real-time. Activate the Render mode and make changes to
the settings to find the best possible setup for a render.
8.3.1 Exposure settings
In some scenes, you will add an environment texture and several types of
lights to the project and still feel that it looks too dark. An easy and
quick way to raise the light levels of the whole scene at once is the
exposure setting.
If you have previous experience with photography, the term exposure will
be familiar. In photography, exposure is usually a measure of the time in
which the camera sensor is exposed to light. More time receiving light
generates a brighter image.
For rendering in Eevee, the same rule applies: you will get a brighter
image by increasing the exposure settings. The settings for exposure are
simple to operate and only require you to choose the level you wish to use
(Figure 8.8).
The default value for exposure is zero, which will not add or subtract any
light from the scene.
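Exposure is measured in stops, where each stop doubles the amount of light. A minimal sketch of this stop-based model, showing how an exposure value scales a linear color value:

```python
def apply_exposure(linear_value: float, exposure: float) -> float:
    """Scale a linear color value by an exposure setting measured in stops.

    Each +1 stop doubles the light; 0 leaves the image unchanged.
    """
    return linear_value * (2.0 ** exposure)


print(apply_exposure(0.25, 0.0))  # 0.25 -> unchanged
print(apply_exposure(0.25, 2.0))  # 1.0  -> two stops brighter
```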
Using the exposure settings will make your life easier in terms of finding
the best levels for lights in a scene. You can add several Area Lights to a
project and use an HDR in the background and still have a dark scene. A
quick change in exposure settings can solve the problem.
Unlike Cycles, we will have the benefit of evaluating the results of the
exposure settings in real-time using Eevee.
You have to watch for one side effect of the exposure settings:
overexposure. If you increase the exposure too far, parts of your image
will start to look burned, with excessive white.
8.3.2 Gamma settings
When you open an image editor like Photoshop or GIMP, you will find
tools like a histogram that will display a graph for the black and white
balance. From that graph, you can find if you need an additional boost on
either of those tones.
To edit and change an image in Eevee, you will have to develop a critical
eye to evaluate the needs of each image. Since you are the author of the
project, you will have to find the best results from your point of view. Do
you need more blacks? Or more whites?
Too much of either will result in a darker image or a washed-out effect.
The challenge is to find a balance between them. The gamma settings
always start with a value of one (Figure 8.10).
Figure 8.10 - Gamma settings
If you change the value to a lower number, you will increase the amount of
black in the image. A higher value will add more whites (Figure 8.11).
Figure 8.11 - Gamma results
As always, you will be able to evaluate the results in Eevee using the real-
time render preview. That will make the process of finding the point with
the best balance a lot easier.
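A common way gamma adjustments are implemented is raising each normalized color value to the power of one over gamma; Blender's full color pipeline is more involved, so take this as a minimal sketch of the behavior described above:

```python
def apply_gamma(linear_value: float, gamma: float) -> float:
    """Apply a gamma adjustment to a normalized (0..1) value.

    Values below 1.0 darken the image (more blacks); values above 1.0
    brighten it (more whites). Gamma 1.0 changes nothing.
    """
    return linear_value ** (1.0 / gamma)


print(apply_gamma(0.5, 1.0))  # 0.5 -> unchanged
print(apply_gamma(0.5, 0.5))  # 0.25 -> darker
print(apply_gamma(0.5, 2.0))  # ~0.707 -> brighter
```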
8.3.3 Gamma with curves
If you want additional control over the gamma settings, you can enable the
Curve option. When you enable it, a small panel will appear in the Color
Management section (Figure 8.12).
The graph starts as a straight line defining the gamma settings, which
we can change by modifying the curve. How to change the curve? Here are
the procedures:
Click and drag anywhere on the curve to add a node and deform the
graph
You can click and drag an existing node
To remove a node, click on it to select it and use the "X" button
The curves accept changes for the overall color or for each channel of
the RGB. A selector at the top lets you pick the channel (C, R, G, or B)
For each channel, you can adjust the mix of blacks and whites using
numeric values
To reset all the gamma settings, use the "Reset" button at the
bottom
Working with the graph view can become a challenge until you find the best
balance between the vertical (black) and horizontal (white) fields.
8.4 Depth of field in Eevee
An effect that can add a significant level of realism to your projects in
Eevee is Depth of Field, or DoF. Have you ever seen an image where
parts were out of focus? That is what you can do with the Depth of Field
effect. After you apply it to a scene, you will have visuals like
Figure 8.13.
You will notice that part of the image is out of focus, with a blur effect.
The effect works with both Cycles and Eevee using a similar workflow. To
create the blur in your image, you have to do two things: choose the point
where the camera will focus, and set the F-Stop value that controls the
blur intensity.
In the scene, we have a camera and also a couple of objects. What will be
the focused object? You can either pick one of the 3D models in the scene
or create an Empty object in Blender to use as a reference. Using an
Empty is better because it allows you to move the focus point around
without changing your 3D scene.
To create an Empty, press SHIFT+A and choose Empty → Plain Axes. Use a
move transformation to place the object at the location where you wish
your camera to focus (Figure 8.15).
Figure 8.15 - Location to focus
The next step is to switch to the camera view. Press Numpad 0 to view
your camera, and select the camera object. You can do this by clicking on
the camera in the Outliner or on the rectangular border on your screen.
With the camera selected, open the Object Data tab in the Properties
Editor (Figure 8.16).
At first, you won't see any difference in your image after selecting the
Empty as the focus object. To enhance the Depth of Field effect, we have
to change the F-Stop value. The F-Stop in photography is the ratio
between the focal length and the diameter of the lens aperture.
What values give a strong Depth of Field effect? If you reduce the value,
you will get an extreme defocus effect. For instance, use a value of "0.1"
to get a heavy blur in your scene.
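The ratio mentioned above can be written directly as a small helper, which shows why a tiny F-Stop such as 0.1 corresponds to a very wide aperture:

```python
def f_stop(focal_length_mm: float, aperture_diameter_mm: float) -> float:
    """F-Stop is the ratio of focal length to aperture diameter."""
    return focal_length_mm / aperture_diameter_mm


print(f_stop(50.0, 25.0))   # 2.0 -> an f/2 lens
print(f_stop(50.0, 500.0))  # 0.1 -> huge aperture, extreme blur
```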
You can easily move the focus object in real-time to view the effect update
in your 3D Viewport (Figure 8.17).
You can increase the number of Blades to get a better quality effect and
increase the F-Stop to reduce the blur.
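The whole Depth of Field setup can also be scripted. A sketch, assuming the camera DoF property names from Blender 2.81 and later (in 2.80 these settings were organized differently) and a focus Empty named "Empty", which is a hypothetical name; it must run inside Blender:

```python
import bpy

# Depth of Field setup through the Python API (Blender 2.81+ names).
camera = bpy.context.scene.camera
empty = bpy.data.objects.get("Empty")  # hypothetical name of the focus Empty

dof = camera.data.dof
dof.use_dof = True
dof.focus_object = empty    # focus follows the Empty as it moves
dof.aperture_fstop = 0.1    # low F-Stop -> strong blur
dof.aperture_blades = 6     # more blades -> better quality bokeh
```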
Another way to control the Depth of Field in Eevee is with the Max Size
option in the Render tab. There you can choose the value Eevee uses to
blur your pixels (Figure 8.18).
If you reduce the size to a maximum of 2 instead of the default value, you
will get a small amount of blur in your pixels. The Depth of Field effect
adds only a small amount of processing to your scene rendering.
One thing you must keep in mind regarding Depth of Field in Eevee is
that it has limitations in image quality and won't look as good as Cycles.
Still, scenes that use Depth of Field will gain a much better level of
realism from the effect.
To make such a scene in Blender and render it with Eevee, you can start
with a plane in your scene and use the extrude tool to make the
background. After you add the plane, make additional extrudes until you
get the model shown in Figure 8.20.
Figure 8.20 - Background with all faces
Apply a Subdivision Surface modifier to the background and, in Object
Mode, open the context menu with a right-click and choose Shade Smooth.
As a result, you will have an infinite background for a studio scene (Figure
8.21).
Figure 8.21 - Infinite background
Apply a white color to the object and add three area lights to the scene.
They will be on each side of the scene, and one behind the camera (Figure
8.22).
Figure 8.22 - Area lights for the studio scene
Since we are working with Eevee, you will also need an Irradiance Volume
and Reflection Cubemap probes in the scene. Add them and make sure the
scale is large enough that you will have the full studio scene inside the
probes (Figure 8.23).
Figure 8.23 - Probes for the studio
You can now add an object to the center location of your studio scene and
bake any indirect lights and reflections for the probes. The result will be a
perfect scene for making presentations of products and 3D models with an
infinite background (Figure 8.24).
Figure 8.24 - Studio scene for Eevee
You can save that scene and swap the objects whenever you need to make
product presentations.
What is next?
With the release of Eevee in Blender 2.8, we have a powerful tool in our
hands that can deliver quality images in real-time. If you are an artist
who usually renders projects with Cycles, you will notice that Eevee has
both highlights and drawbacks.
Once you learn how to manage those limitations and take advantage of a
powerful PBR material system, you will be able to create incredible
projects with Eevee and also add some effects to make them look even
more realistic.
The next step is to put all your recently acquired knowledge about Eevee
into practice and start doing projects using real-time rendering. That is
by far the best way to develop your skills even further and find solutions
to common problems in Eevee, whether a light leak or the setup of a probe.
You now have a solid base to start taking advantage of real-time rendering
with Blender.