Legal Notes
Definiens, Definiens Cellenger and Definiens Cognition Network Technology are
registered trademarks of Definiens AG in Germany and other countries. Cognition
Network Technology, Definiens eCognition, Enterprise Image Intelligence, and
Understanding Images, are trademarks of Definiens AG in Germany and other
countries.
All other product names, company names, and brand names mentioned in this
document may be trademark properties of their respective holders.
Protected by patents US 7146380, US 7117131, US 6832002, US 6738513, US 6229920,
US 6091852, EP 0863485, WO 00/54176, WO 00/60497, WO 00/63788, WO 01/45033,
WO 01/71577, WO 01/75574, and WO 02/05198. Further patents pending.
Definiens Developer 7.0 Essentials Training Module 1: Complete Basic Workflow - Operating Principles and Tools;
the main Features and Functions
Table of Contents
Essentials Training Module 1: Complete Basic Workflow _____________________ 1
Imprint and Version __________________________________________________ 2
Legal Notes_________________________________________________________ 2
Table of Contents _______________________________________________________ 3
Introduction to this Module ______________________________________________ 7
Symbols at the side of the document 8
Lesson 1 Introduction to the Essential Tools for Rule Set Development _______ 9
1.1 The Definiens Workspace and Project File _________________________ 10
1.1.1 Open the Workspace 10
1.1.2 Open the Project file 11
1.2 The predefined Viewing Modes _________________________________ 11
1.3 The Visualization Tools ________________________________________ 13
1.3.1 Visualize Image Objects 13
1.3.2 Display Classification results 14
1.3.3 Panning and zooming functions 14
1.4 The essential windows for Rule Set development ___________________ 16
1.4.1 The Feature View and Image Object Information 16
1.4.2 The Class Hierarchy 17
1.4.3 The Process Tree 18
1.5 Execute a Process sequence ____________________________________ 19
Lesson 2 Load and View Data__________________________________________ 21
2.1 Create a Project in a Workspace _________________________________ 21
2.1.1 Supported image formats 22
2.1.2 Define the data to be loaded 23
2.1.3 Overview: Create Project dialog box 23
2.1.4 Define Layer Alias 24
2.1.5 Confirm the settings to create the Project 25
2.2 How to open the created Project _________________________________ 25
2.3 How to save a Project _________________________________________ 26
2.4 Adjust the View settings _______________________________________ 26
2.4.1 Changing the Layer Mixing 26
2.4.2 Navigating through Image Layers 28
Lesson 3 Introduction to Processes _____________________________________ 29
3.1 Introduction to Cognition Network Language (CNL) _______________ 29
The Concept of the Image Object Domain 30
3.2 Overview of available algorithms ________________________________ 31
3.3 Working with Processes________________________________________ 33
3.3.1 Create a Process 33
3.3.2 Arrange Processes 34
3.3.3 Save a Rule Set or single Process 34
3.3.4 Execute Processes 34
3.3.5 Delete Rule Set or single Process 35
Lesson 4 Segmentation: How to Create Image Objects ____________________ 37
4.1 Theory: Segmentation and Image Objects _________________________ 38
4.1.1 Image Object Primitives and Objects of Interest 38
4.1.2 The Image Object Hierarchy 39
4.1.3 Generating Suitable Image Objects 40
4.2 The Multiresolution Segmentation _______________________________ 41
4.2.1 The Multiresolution Segmentation algorithm 41
4.2.2 Segment with Multiresolution Segmentation 43
4.2.3 Effect of different Image Layer Weights 45
4.2.4 Effect of different Homogeneity Criterion 50
10.3.2 Add Features and configure the attribute table 99
Lesson 11 Sample Based Classification with Nearest Neighbor Classifier ______ 103
11.1 Nearest Neighbor (NN) theory__________________________________ 103
Classic workflow 104
11.2 Nearest Neighbor configurations _______________________________ 106
11.3 Declare Sample Objects for the NN Classification (manual step!) ______ 109
11.4 Add, edit and execute a Process to classify________________________ 111
11.5 Refine the Classification_______________________________________ 111
Lesson 12 Batch-Processing with eCognition Server _______________ 113
12.1 Import data using an existing template __________________________ 113
12.1.1 Load the import template 114
12.1.2 Load the data 114
12.2 Submitting data for analysis ___________________________________ 115
12.3 View Job Scheduler status in a browser __________________________ 116
12.3.1 Review user jobs 117
12.3.2 Review job overview 117
12.3.3 View job details 118
12.3.4 Review engine status 118
12.3.5 Review engine usage 118
12.4 Roll-back to initial status ______________________________________ 119
Introduction to the Essential Tools for Rule Set Development
Introduction If the margin is hatched and Introduction is added, this indicates that the text gives a
general introduction or methodology for the following Lesson, method or exercise.
Information If the margin is hatched and Information is added, this indicates that the text gives
information about the following exercise.
If this symbol is shown, you have to follow the numbered items in the text. If you just
want to work through the exercises without reading the theory part, follow only this
sign.
Action!
If this symbol is shown, compare the settings shown in the screenshot with the settings
in the corresponding dialog box in Developer.
Settings
Check
If this symbol is shown, compare the screenshot of the Process Tree with the content of
the Process Tree in Developer.
Rule Set
Check
If this symbol is shown, compare the screenshot alongside with the result in Developer.
It should look similar.
Result
Check
After participating in this training the trainee will be able to solve image analysis tasks
comparable to the following example. The essential concepts of object-based image
analysis will be taught and the main strategies for Rule Set development will be
introduced.
In this first Lesson you will get an introduction to the most important tools for Rule Set
development and get a feeling for the goal of the whole training module.
This example Project contains an already classified subset of a Quickbird scene (Data
courtesy of Digital Globe).
Figure 1: Definiens Developer 7.0 GUI with one viewer open. At the top are the menus and
toolbars. At the right, the Process Tree and Class Hierarchy windows, as well as the Feature
View and Image Object Information windows, are shown.
The Yokosuka Project file represents a subset of a whole Quickbird scene (Image data
courtesy of DigitalGlobe). This Project has already been processed using the Process
stored in the Process Tree.
The Project contains one Image Object Level. In addition, the Image Objects have been
classified.
1. To open a Project file from the Workspace window do one of the following:
NOTE:
The currently opened Project is marked with an asterisk in the Workspace window
Figure 3: View Settings toolbar with the 4 predefined View Setting buttons: Load and Manage
Data, Configure Analysis, Review Results, Develop Rule Sets.
2. Select the predefined View Setting number 4 Develop Rulesets from the
View Settings toolbar.
Action!
By default one viewer window for the image data is open, as well as the Process Tree
and the Image Object Information window.
3. Check whether the following tools are open:
Process Tree: go to the main menu and select Process>Process Tree or press
the Process Tree button .
Feature View: go to the main menu and select Tools>Feature View or press
the Feature View button .
Figure 4: Definiens GUI with main windows: image data Viewer in the left window, menu- and tool bars
at the top, Process Tree, Class Hierarchy, Feature View and Image Object Information windows at the
right side.
Introduction
During Rule Set development, and to verify your results, it is essential to display the
image data, segmentation outlines, Feature values in a color range, and Classification
results.
Several options for displaying the content of the Image Object Level can be chosen from
the View Settings toolbar.
Action!
2. Select the Show or Hide Outlines button to show all outlines of all Image
Objects of the Level. See how they fit with the image content.
3. To show the Object mean view select the Pixel View of Object Mean View
button.
Display
Check
Figure 6: Image View, Object outlines with Pixel View and Object Outlines with Object Mean View.
1. Select the View Classification button and make sure that the Show or Hide
Outlines button is deselected.
Action! 2. Move the cursor over the Classification and the assigned class will appear in a tool
tip.
3. Select the Pixel View or Objects Mean View button to switch on and off the
transparency view.
4. Select the Show or Hide Outlines button and the outlines will be displayed in
the Classification colors.
Display
Check
Figure 7: Classification View with Pixel View, Classification View with Object Mean View and
Classification View with Object Outlines and Pixel View.
Action! 2. To pan, drag the hand-shaped cursor around the Project window to move to
other areas of the image. (Alternatively Ctrl + P)
5. Zoom In Center
6. Select or enter a zoom value to change the display in the Project view.
1. Zoom In
2. Zoom Out
5. The Pan Window enables you to move around the image. Drag the red
rectangle to move to a different region of the image.
To get information about several Feature values for one Image Object use the
Image Object Information.
To get values of one Feature for all Image Objects use the Feature View.
1. Select a single Image Object by clicking on it and see the associated Features and
values in the Image Object Information window.
Action! 2. Select another Image Object and the values will change.
3. Double-click on one Feature listed in the Feature View window.
All Objects now appear in gray values representing the values of the selected Feature.
Objects with low values are shown in dark gray, Objects with high values in bright gray.
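This gray-value display can be thought of as a simple min-max scaling of the Feature values across all Objects. The sketch below illustrates that principle only; the function and its exact scaling are assumptions for illustration, not the actual Definiens rendering code:

```python
def feature_to_gray(values):
    """Map Feature values to 0-255 gray levels: low values dark, high values bright.

    A sketch of the display principle only; the actual Definiens rendering
    may scale differently.
    """
    lo, hi = min(values), max(values)
    if hi == lo:  # all Objects share one value: render mid-gray
        return [128 for _ in values]
    return [round(255 * (v - lo) / (hi - lo)) for v in values]

# Four Objects with increasing Feature values map to increasingly bright grays.
grays = feature_to_gray([10.0, 40.0, 70.0, 100.0])
# lowest value -> 0 (dark), highest -> 255 (bright)
```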
Display
Check
Figure 9: Image Object Information window with Feature values for a selected Image Object.
Figure 10: Feature View window with value displayed in the viewer in grey values.
Display
Check
Figure 11: Class Hierarchy window, the Groups Hierarchy tab is selected.
Action! 2. Double-click on a single Process and examine the content and settings.
Display
Check
Figure 12: Process Tree window showing the Process for analyzing the subset.
Load and View Data
Introduction
In this Lesson we create a new Project and examine the loaded data.
After importing the data, and at different steps of an image analysis workflow, you
investigate your Definiens Projects visually. Different visualization methods enable you
to focus on what you are searching for.
At the beginning, you explore the single image layers. You can define the color
composition for the display of image layers and set equalizing options.
In this Lesson a subset of a Quickbird scene is used (Data courtesy of Digital Globe).
Figure 13: Different view settings for the same data set.
Image layers
Thematic layers
While image layers contain continuous information, the information of thematic layers is
discrete. The two types of layers have to be treated differently in both segmentation and
Classification. Thematic layers can be imported in addition to image layers.
Information
Definiens Developer 7.0 supports the import of a variety of raster file formats
including:
04mar_pan.img
04mar_multi.img
The Create Project dialog box opens.
5. In the Create Project dialog box in the field Name enter a meaningful name for the
Project, e.g. QB_Maricopa.
The geocoding information is displayed if the Use geocoding check box is selected,
and the resolution is automatically detected and displayed in the Resolution field.
The unit is detected automatically if auto is selected from the drop-down list.
The unit is automatically set to meters, but can be changed by selecting another
one from the drop-down list.
A subset of the loaded images can be selected by clicking the Subset Selection
button. The Create Subset dialog box opens.
All preloaded image layers are displayed along with their properties. To select an
image layer, click it. To select multiple image layers, press Ctrl or the Shift key and
click on the image layers.
To edit a layer double-click or right-click an image layer and choose Edit. The
Layer Properties dialog will open. Alternatively you can click the Edit button.
To insert an additional image layer you can click the Insert button or right-click
inside the image layer display window and choose Insert on the context menu.
To remove one or more image layers, select the desired layer(s) and click Remove.
To change the order of the layers select an image layer and use the up and down
arrows.
To set No Data values for those pixels not to be analyzed, click No Data. The
Assign No Data Values dialog box opens.
To insert a thematic layer, you can click the Insert button or right-click inside the
thematic layer display window and choose Insert from the context menu.
Editing a thematic layer works similarly to editing image layers, as described above.
1. To assign a layer alias, select the layer in the Create Project dialog box and
double-click it.
The Layer Properties dialog opens.
Action!
Layer1 blue
Right-click on the Project and select Open from the context menu.
Note:
The currently opened Project is marked in the Workspace window with an asterisk.
NOTE:
As there is no undo command, it is recommended that you save a Project prior to any
operation that could lead to the unwanted loss of information, such as deleting an
Object layer or splitting Objects. To retrieve the last saved state of the Project, close
the Project without saving and reopen.
NOTE:
The layer mixing changes the display only and does not affect image processing.
Action!
or click on the Edit Image Layer Mixing button in the View Settings
toolbar.
2. To view the image in true color, set the blue, green and red layers to the respective
color slots by clicking in the according field.
3. To additionally view the nir layer, also select the respective box.
4. To weight the display of the image layers, uncheck the field No Layer Weights
and click inside the desired R, G or B box. Increase a weight with a left mouse click,
decrease it with a right mouse click.
5. To confirm the settings click OK at the bottom of the Edit Layer Mixing dialog
box.
The image will now be displayed using the view settings you specified.
Settings
Check
Figure 18: False color mix with additional nir layer displayed green.
Figure 19: False color mix settings with weighted image layers and resulting image display in the viewer.
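The effect of the layer weights on a display channel can be pictured as a normalized weighted average of the contributing image layers. The sketch below is an assumption for illustration (function name and exact mixing formula are hypothetical), not the documented Definiens mixing behavior:

```python
def mix_channel(layer_values, weights):
    """Weighted mix of image layer values into one display channel.

    Hypothetical sketch of the principle behind Edit Image Layer Mixing
    weights: each layer contributes proportionally to its weight, and the
    weights are normalized so the result stays in the pixel value range.
    """
    total = sum(weights)
    if total == 0:  # no layer assigned to this channel
        return 0.0
    return sum(v * w for v, w in zip(layer_values, weights)) / total

# Red display channel mixing the red and nir layer values with weights 1 and 2:
red_display = mix_channel([100.0, 220.0], [1, 2])  # -> 180.0
```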
The Single Layer Gray scale Button displays the first image layer in gray
values.
The Mix Three Layers RGB Button shows the first three layers as true
color.
The Show previous image layer Button shifts the weight arrangement
down.
The Show next image layer Button shifts the weight arrangement up.
The Edit Image Layer Mixing Button opens the related window.
Introduction to Processes
Introduction
This Module gives you an introduction to the Cognition Network Language (CNL), the
unique computing language for developing advanced image analysis algorithms.
Create a Process
Arrange Processes
Save a Rule Set or single Process
Execute Processes
Delete Rule Set or single Process
Introduction
All Processes are created, saved, and stored in the Process Tree window. In the Process
Tree window you can add, arrange, delete, load, and save single Processes and Process
sequences.
Processes may have any number of Child Processes. The hierarchy formed this way
defines the structure and flow control of the image analysis. Arranging Processes
containing different types of algorithms allows the user to build a sequential image
analysis routine.
Figure 20: Process Tree window with one Process inserted. As this Process has the default settings, it is
listed as "for all".
Action! 2. Alternatively a Process can be dragged beneath another Process while pressing
the right mouse button.
3. Change the order of a Child Process. Drag it below all Processes while pressing the
left mouse button.
Process
Check
1. Right-click on the Process or Process Sequence you want to execute and select
Execute.
Right-click and select Delete Rule Set to delete the complete Rule Set.
Create a Process
Arrange Processes
Save a Rule Set or single Process
Execute Processes
Delete Rule Set or single Process
Segmentation: How to Create Image Objects
Introduction
In this Lesson you will learn how to create Objects with the Multiresolution
Segmentation. Segmentation is always the first step when starting an image analysis.
All subsequent analysis steps require initial Image Objects. Depending on the level of
detail of your task, several Segmentation and Classification Processes may follow.
Segmentation algorithms are used to subdivide the entire image, represented by the
pixel Level domain, or specific Image Objects from other domains, into smaller Image
Objects, or to merge small Objects into larger ones.
Definiens provides several different approaches ranging from very simple algorithms like
chessboard and quad tree based segmentation to highly sophisticated methods like
multiresolution segmentation or the contrast filter segmentation.
Segmentation algorithms are required whenever you want to create new Image
Object Levels based on the image layer information. But they are also a very valuable
tool to refine existing Image Objects by subdividing them into smaller pieces for a
more detailed analysis. Some are used for merging, e.g. the Spectral Difference
Segmentation.
NOTE:
For more detailed information on the segmentation algorithm, please refer to the
Reference Book, which you will find in the folder User Guide of your Definiens
Developer installation.
Object Primitives are the starting point for all processing in Definiens.
Ideally, Object Primitives are fragments of the Objects of Interest (but not
necessarily).
The internal representation and data structure are the same for Object Primitives
and Objects of Interest.
In most applications there are no generic procedures that are able to reliably extract
Objects of Interest. Objects of Interest can be heterogeneous, variable, noisy, or
structured. Semantics and expert knowledge, as used in Definiens, are needed to
accurately identify and shape the right Objects of Interest.
Always produce Image Objects representing the class Features accurately: as large
as possible and as fine as necessary.
In each loop every Object in the Object Level is handled once.
Higher values for the scale parameter therefore result in larger Objects, smaller values
in smaller Objects.
Figure 23: With these settings only the panchromatic layer is weighted.
Note that the sum of all chosen weights for image layers is internally normalized to 1.
Thematic Layer usage:
If Thematic Layers are used, the segmentation will not segment (i.e. merge) across
thematic Objects. Consequently, wherever there is an Object boundary in the thematic
layer, there will always be an Object boundary in the segmentation result.
Figure 24: Left: Thematic Layer; Right: Object outlines on basis of Thematic Layer.
Scale parameter:
The scale parameter is an abstract term.
It determines the maximum allowed heterogeneity for the resulting Image Objects.
In heterogeneous data the resulting Objects for a given scale parameter are smaller than
in homogeneous data.
By modifying the value of the scale parameter, you can vary the size of the resulting
Image Objects. A high scale parameter results in large Objects and vice versa.
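The relationship between the scale parameter and Object size can be illustrated with a toy region-merging routine: neighboring segments keep merging as long as the increase in heterogeneity stays below a threshold derived from the scale parameter. This is a deliberately simplified 1-D sketch, not the actual Multiresolution Segmentation algorithm (which also considers shape criteria and merge order):

```python
import statistics

def heterogeneity(pixels):
    """Size-weighted standard deviation, a common heterogeneity measure."""
    if len(pixels) < 2:
        return 0.0
    return len(pixels) * statistics.pstdev(pixels)

def grow_segments(pixels, scale):
    """Toy 1-D region merging illustrating the scale parameter.

    Adjacent segments are merged as long as the increase in heterogeneity
    stays below scale**2 -- a simplified sketch of the multiresolution
    principle, not the Definiens implementation.
    """
    segments = [[p] for p in pixels]
    merged = True
    while merged:
        merged = False
        for i in range(len(segments) - 1):
            a, b = segments[i], segments[i + 1]
            cost = heterogeneity(a + b) - heterogeneity(a) - heterogeneity(b)
            if cost < scale ** 2:
                segments[i:i + 2] = [a + b]  # fuse the two neighbors
                merged = True
                break
    return segments

row = [10, 11, 10, 50, 52, 51]
# A small scale keeps the two homogeneous groups separate;
# a large scale lets everything merge into one Object.
assert len(grow_segments(row, 2)) > len(grow_segments(row, 50))
```

The sketch also reproduces the point made above: for a fixed scale, heterogeneous data (here the jump from ~10 to ~50) yields more, smaller segments than homogeneous data.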
Figure 25: The higher the scale parameter is set, the coarser the Objects are allowed to grow.
Preparation
Clear the existing Rule Set and append a new Process.
Action!
1. Delete the Rule Set in the QB_Maricopa Project created earlier in this training.
2. Select the predefined view setting number 4 Develop Rulesets from the View
Settings toolbar, if not already selected.
Process Tree: go to the main menu and select Process>Process Tree or press
the Process Tree button .
Feature View: go to the main menu and select Tools>Feature View or press
the Feature View button .
4. Append a new Process by right-clicking in the Process Tree and selecting: Append
New.
Algorithm parameters:
7. Set the Scale parameter to 60.
8. Set the Level Name to Level1.
9. Keep the Image Layer weights and all remaining settings as they are per default.
10. Execute the Process.
Settings
Check
Figure 26: Edit Process dialog box with sample settings for multiresolution segmentation.
Result
Check
Examine the result and see that in homogeneous areas (e.g. water areas) the Objects
are bigger, and in more heterogeneous areas the Objects are smaller.
If you want to see the Image Object outlines, press the Show/Hide Outlines button.
04MAR17_MS_Image_Layer_Weights.TIF
04MAR17_PAN_Image_Layer_Weights.TIF
Information about these files is listed in the Create Project dialog box.
5. Define the corresponding aliases for the image layers.
6. In the Create Project dialog box in the field Name enter a meaningful name for
the Project, e.g. Layer Weights.
7. Click OK at the bottom of the Create Project dialog box.
8. Switch back to the Develop Ruleset view by clicking in the View Settings
tool bar.
Settings
Check
Figure 28: Edit Process dialog box with settings for Multiresolution Segmentation with all image layers
weighted.
Result
Check
Figure 29: An Image Object Level is created where all image layers have the same influence on the
Object shapes.
Settings
Check
Figure 30: Edit Process dialog box with settings for deleting Image Object Level1.
Rule Set
Check
Figure 31: Process Tree with Segmentation Process and Process to delete the Image Object Level.
Figure 32: Only the panchromatic layer is weighted to be used for the Object creation.
8. Select the layers blue, green, red and nir, insert the weighting 0 in the
New value text box, and click Apply.
9. Keep the weighting 1 for the panchromatic layer.
10. Confirm the Process settings with OK.
11. Execute the Process.
Settings
Check
Figure 33: Edit Process dialog box with settings for Multiresolution Segmentation with only
panchromatic layer weighted.
Result
Check
Figure 34: Left: Image with only multispectral image layers displayed; Right: only panchromatic
layer displayed.
Figure 35: Image Objects created with the same scale parameter but different layer weightings.
Left: Image Objects created on basis of all image layers; Right: Image Objects created on basis of
only the higher-resolution panchromatic layer.
Shape Criterion:
In the Shape field you can define to which percentage the shape of the Objects (in
terms of the parameters smoothness and compactness) contributes to the entire
homogeneity criterion, as opposed to the percentage of the color.
For most cases the color is most important for creating meaningful Objects. However, a
certain degree of shape homogeneity often improves the quality of Object extraction.
Therefore:
Increasing the weight of the shape criterion results in Objects more optimized for
spatial homogeneity; setting it to 0 means Objects are optimized for spectral
homogeneity only. The shape criterion cannot have a value higher than 0.9, due to
the obvious fact that without the spectral information of the image, the resulting
Objects would not be related to the spectral information at all.
Compactness:
In addition to spectral information, the Object homogeneity is optimized with regard to
the Object shape. The shape criterion is composed of two parameters:
The smoothness criterion is used to optimize Image Objects with regard to smooth
borders.
The compactness criterion is used to optimize Image Objects with regard to compact
shapes.
Although they do not share an antagonistic relationship, they sum up to 1. When the
compactness value is set to 1, the shapes of the Objects will be optimized for
compactness only, whereas when the value is set to 0, Objects will be optimized for
smoothness.
Action!
1. Switch back to the Load and Manage Data view by clicking in the View
Settings toolbar.
2. To create a new Project, right-click in the Workspace window and select Add
Project from the menu.
The Create Project dialog box opens.
3. Navigate to the folder SAR_Indonesia.
4. Mark the Indonesia_aug_sep.bmp file and click Open.
Information about these files is listed in the Create Project dialog box.
5. In the Create Project dialog box in the field Name enter a meaningful name for
the Project, e.g. Homogeneity Criterion.
6. Click OK at the bottom of the Create Project dialog box.
7. Switch back to the Develop Ruleset view by clicking in the View Settings
tool bar.
8. Delete the existing Rule Set.
9. Examine the outlines of the Image Objects to see whether they fit to
represent the river.
10. After you have examined the result, delete the Image Object Level, change the
parameters in the Process according to the list below, and re-segment.
Shape   Compactness
0       0
0.9     0.5
0.6     0.5
0.6     0.1
0.6     0.9
0.2     0.9
Table 1: Parameters for different compositions of the Homogeneity Criterion.
11. After each segmentation, show the Object outlines and zoom in to compare
single Image Objects with the information at pixel Level.
Result
Check
Every segmentation uses the Image Objects of the next lower Image Object
Level as building blocks, which are subsequently merged into new segments.
At the same time, the Object borders of the next higher Level are stringently
obeyed.
For this reason, it is not possible to build a Level containing larger Objects (i.e., using a
larger scale parameter) than its super-Objects. Consequently, it is also not possible to
build a Level containing Objects smaller than its sub-Objects.
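The super-Object constraint can be illustrated with a minimal check: a merge on a lower Level is only permitted between Objects that share the same super-Object on the next higher Level. The mapping below is a hypothetical stand-in for the Image Object Hierarchy, used only to make the rule concrete:

```python
def can_merge(obj_a, obj_b, super_of):
    """Merging two neighboring Objects on a Level is only allowed if they
    share the same super-Object on the next higher Level -- this is why a
    lower Level can never contain Objects larger than its super-Objects.

    `super_of` maps each Object id to its super-Object id (a hypothetical
    representation of the Image Object Hierarchy for illustration).
    """
    return super_of[obj_a] == super_of[obj_b]

super_of = {"a1": "A", "a2": "A", "b1": "B"}
assert can_merge("a1", "a2", super_of)      # same super-Object: allowed
assert not can_merge("a2", "b1", super_of)  # crosses a super-Object border
```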
The image data used in this chapter is a subset of an IKONOS scene (Data courtesy of
GeoEye).
Note
When creating the first Object Level, the lower limit is represented by the pixels, the
upper limit by the scene size.
Figure 38: The first Level is created on basis of the Pixel Level.
Action!
Settings
Check
Process
Check
Figure 40: Process Tree with Parent Process and Child Process added.
1. Right-click in the Process Tree on the already existing Process and choose
Append New from the list.
NOTE:
Algorithm parameters:
4. In the Level Settings change use current (merge only) to create above
5. As Level Name insert Level2.
6. For the Image Layer Weights, set a value of 0 for all layers except the pan layer.
Only the panchromatic layer will be used for segmentation.
7. Change the Scale Parameter to 35.
8. Keep the default settings for the homogeneity criterion.
9. Click on the Execute button to perform the Segmentation.
A second Object Level is created by merging Objects from Level 1 into larger Objects in
Level 2.
Settings
Check
Process
Check
Figure 42: Process Tree with Process for creating the second Level added.
1. Right-click in the Process Tree on the already existing Process and choose
Append New from the list.
2. Algorithm: choose Multiresolution Segmentation. Action!
3. In the Image Object domain change the Level domain from pixel Level to Level2.
Algorithm Parameters:
4. In the Level Settings change use current (merge only) to create above.
5. As Level Name insert Level3.
6. For the Image Layer Weights, set a value of 0 for all layers except the pan layer.
Only the panchromatic layer will be used for segmentation.
7. Change the Scale Parameter to 110.
8. Keep the default settings for the homogeneity criterion.
9. Click on the Execute button to perform the Segmentation.
Settings
Check
Result
Check
Figure 44: Object outline view of all three Image Object Levels: Level1, Level2, Level3.
1. First view the outlines by clicking on the Show or Hide Outlines button .
Action! 2. To navigate through the Levels one by one, click for one Level down or for
one Level up on the Navigate toolbar.
3. To navigate to a specific Level, select the desired Level from the drop down list in
the Navigate toolbar.
Figure 45: Navigate toolbar with Level 1 selected for display in the viewer.
Image Objects - the Information Carriers
Information
No matter which algorithm the Objects have been created with, they all contain a lot of
information. This information, called Features, is the basis for formulating conditions
for Classification or further segmentation steps.
It is crucial to find the right Feature and the right threshold for conditions to be used in
processing.
This Lesson gives you an introduction to the Features available and how to visualize
Feature values using the Feature View.
Preparation
1. From the folder LANDSAT_Dessau import the already existing Project
LANDSAT_Dessau_Segmented.dpr.
Figure 47: The Feature View window with the Features of the category Mean expanded.
Figure 48: Left: False color image with Object outlines; Right: Feature values for Brightness in a
gray range.
NOTE:
By default the Feature Brightness is calculated using all image layers, but it is
possible to choose dedicated image layers. To do so, go to the menu Classification
>Select Image Layers for Brightness and select those which should be used for the
brightness calculation.
Result
Check
Figure 49: The Feature View with the updated Feature Brightness and the Feature range checkbox switched on.
Figure 50: The Feature Brightness displayed with a Feature range from the minimum value to 40.
4. Isolate the low values (water areas) by clicking the down arrow to the right of the
maximum value. Continue until you reach the value 40 or type it in.
Action!
The upper end of the range is decreased. Only Objects within this new range are now
displayed in color.
Result
Check
Figure 51: Final Feature range for the Mean of nir for the water bodies.
5. In the Feature View select the Feature Mean nir, right-click and select Update
Range.
Action! 6. Use the up arrow of the minimum value until all water body Objects are out of the range.
NOTE:
Be sure to update the range of Feature values each time you select a different Feature. Otherwise, the range of the previously selected Feature is used.
Everything still displayed in color has too high a value in nir to belong to the water bodies.
Result
Check
Basic Classification
This Chapter covers the following content:
Create a Class
Define the first Classification Process
Define the second Classification Process
Alternative Classification method: Insert conditions in the Class Description
Introduction
The most basic algorithm for Classification is the algorithm assign class. One fixed threshold for a Feature is defined directly in the Process condition. All Objects that are selected in the Image Object Domain and meet the condition are assigned to the given class.
The most crucial part of Classification is to translate knowledge into Processes and conditions.
In the previous chapter we found that Brightness and the Mean of nir describe the class Water Body.
We will use the basic Classification algorithm assign class. With this algorithm you can set one condition as the basis for Classification. Therefore we will need two Classification Processes: one to classify Objects with a Brightness lower than 40 to the class Water, and a second which un-classifies Water Objects with a too high value for near infrared.
Before we start defining the Processes and the conditions, we first have to create a class Water.
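The two-Process logic just described can be sketched in plain Python. This is a hedged illustration only: Objects are reduced to dicts of Feature values, and the thresholds 40 (Brightness) and 44 (Mean nir) are the ones used in this lesson.

```python
# Hedged sketch of the two-step assign class logic, outside the Definiens
# environment. The example Feature values are invented for illustration.

def assign_class(objects, condition, new_class):
    """Assign new_class to every object that meets the condition."""
    for obj in objects:
        if condition(obj):
            obj["class"] = new_class

objects = [
    {"brightness": 28.0, "mean_nir": 16.0, "class": "unclassified"},  # water
    {"brightness": 35.0, "mean_nir": 50.0, "class": "unclassified"},  # dark road
    {"brightness": 70.0, "mean_nir": 80.0, "class": "unclassified"},  # vegetation
]

# Process 1: Brightness < 40 -> Water
assign_class(objects, lambda o: o["brightness"] < 40, "Water")
# Process 2: Water Objects (the Image Object Domain) with Mean nir > 44
# are set back to unclassified
assign_class(objects, lambda o: o["class"] == "Water" and o["mean_nir"] > 44,
             "unclassified")

print([o["class"] for o in objects])  # ['Water', 'unclassified', 'unclassified']
```

Note how the second call restricts itself to Water Objects, mirroring the selection of the class Water in the Image Object Domain of the second Process.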
All Classes are stored in the so-called Class Hierarchy. To create a class you have two possibilities:
Insert a class manually in the Class Hierarchy window
For some algorithms you can create a class directly in the Process (e.g. assign class)
You will learn more about class descriptions in the subsequent chapters. For the Classification using the algorithm assign class, only the name and the color need to be set.
1. Right-click in the Class Hierarchy window and select Insert Class.
The Class Description dialog box opens.
Action!
2. In the field Name enter Water.
3. From the drop-down list next to the field choose an appropriate color, e.g. blue.
4. Confirm the settings with OK.
Result
Check
The Processes classifying the water bodies are then Child Processes to the Basic Classification Parent Process.
1. Open the Process Tree and examine the existing Process structure.
Action! 2. Select the multiresolution segmentation Process, right-click on it and select Append New from the menu.
3. Name the Process Basic Classification and confirm with OK.
Now the Parent Process Basic Classification is added in the Process Tree in the same
hierarchical Level as the multiresolution segmentation.
Rule Set
Check
4. Choose the smaller than operator and insert the value 40 as shown in the figure
below.
5. Confirm with OK.
Settings
Check
Select the Process in the Process Tree and right-click on it, then select
Execute Process.
Settings
Check
Figure 55: Process settings to assign all Objects of Level 1 that have a value lower than 40 for the Feature Brightness to the class Water.
Rule Set
Check
Figure 56: Process Tree with first Process for water Classification added.
Result
Check
Figure 58: Classification result for Water, with transparent view and only the nir layer displayed.
Information
The first Classification Process is inserted and all Objects with a Brightness lower than 40 are classified as Water. But there are some misclassifications. To get rid of the misclassified Objects, we will insert a second Process with an additional condition.
We will assign all Water Objects with a Mean of nir larger than 44 to unclassified again. To express that only the Water Objects shall be treated, the class Water now has to be selected in the Image Object Domain of the Process.
Preparation
1. Append a new Process in the same hierarchy Level as the previous Classification Process.
Settings
Check
Rule Set
Check
Result
Check
Figure 62: After the first Classification step there were misclassifications; the second Classification Process eliminates the wrongly classified Objects.
Figure 63: Using the algorithm assign class, two Processes were needed to achieve a correct Classification.
Another method to achieve the same result is to insert both conditions in the Class Description of the class itself and then add only one Process pointing to the content of the Class Description.
It is important to understand that whenever you want to classify using the information contained in the Class Description, you have to use the algorithm Classification.
Using Class Descriptions you can retrace why an Object was classified to a certain class.
This can be seen in the Image Object Information window and in the Feature View using
the Relations to Classification Features.
Figure 64: The Objects carry the Classification information, if the conditions are inserted in the
Class Description.
Further advantages of Class Descriptions are:
The conditions can be combined using different operators like and/or.
The conditions of the classes can be inherited by Child classes, if they are arranged in the Inheritance Hierarchy.
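A Class Description can be pictured as a set of conditions joined by an operator. In the hedged sketch below, "and" is taken as the minimum and "or" as the maximum of the condition values, a common fuzzy-logic simplification; the crisp 1.0/0.0 memberships stand in for the software's actual membership functions.

```python
# Hedged sketch: a Class Description as conditions combined with a logical
# operator. Membership is simplified to 1.0 (condition met) or 0.0 (not met).

def membership(obj, conditions, operator="and"):
    """Evaluate a class description: 'and' -> minimum, 'or' -> maximum."""
    values = [1.0 if cond(obj) else 0.0 for cond in conditions]
    return min(values) if operator == "and" else max(values)

water_description = [
    lambda o: o["brightness"] < 40,
    lambda o: o["mean_nir"] < 44,
]

obj = {"brightness": 28.28, "mean_nir": 16.08}  # values from Figure 67
print(membership(obj, water_description))  # 1.0 -> classified as Water
```

With both conditions fulfilled the total membership is 1, matching the Object inspected in Figure 67.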
Settings
Check
Figure 65: Class Description for the Class Water containing two threshold conditions.
Rule Set
Check
Figure 66: Process Tree with Process for Classification according to the Class Description added.
Result
Check
Figure 67: The selected Object has a total membership value of 1. Both conditions have been
fulfilled. For Mean nir the Object has the value 16.08, for Brightness 28.28.
This Chapter covered the following content:
Create a Class
Define the first Classification Process
Define the second Classification Process
Alternative Classification method: Insert conditions in the Class Description
Exercise: Recap Classification and Segmentation
Introduction
The aim of this exercise is that you recap the lessons learned on your own. The trainer will assist you when questions occur. At the end, the different results of the group are discussed together.
The image data used in this and the following lessons is a subset of a Quickbird scene
(Data courtesy of Digital Globe).
Task: Classify water and roads in a Quickbird subset.
During this exercise, you will realize that water and roads have quite similar spectral values in this image. Try to classify the water Objects with one Process and without any misclassifications. Simply said, it is better to miss some water areas than to classify some roads as water. We will handle misclassified roads later using context information.
Guideline:
Examine which Image Layers contain significant information for the class
water.
With a Child Process, create Objects using multiresolution segmentation and weight only the image layers that contain information about water.
Figure 68: Left: higher-resolution panchromatic layer; Right: coarser-resolution multispectral layer.
1. Switch back to the Load and Manage Data View, right-click in the Workspace
window and select Add Project.
Settings
Check
Figure 69: Settings for the Create Project and the Assign No Data Values dialog box.
7.2 Segmentation
This Chapter covers the following content:
Examine which Image Layers contain significant information for the class water
Set up the Segmentation Process
Information
The task of this lesson is to find water and road Objects. The first step is to define which segmentation algorithm to use and which layers contain the relevant information.
As it is a rather simple task to identify two very obvious classes in this subset, the multiresolution segmentation will do a good job. It will give back quite realistic, larger Objects outlining the homogeneous water and road areas. The next crucial step is to select those image layers which contain the most relevant information for segmentation.
Result
Check
In the nir and the pan layer, the water and the roads are clearly visible; therefore these two layers should be used for segmentation. Using the pan layer also has the advantage that its higher resolution gives back fine outlines.
Action!
Result
Check
Figure 71: Left: scale parameter 300: a too coarse scale parameter results in mixed Objects; Right: scale parameter 150: the Objects are smaller and reflect the layer values more accurately.
7. Enter different compactness values and evaluate the different results. Finally decide
which one to use.
Action!
Result
Check
Figure 72: Left: Shape 0.2, Compactness 0.2; Right: Shape 0.2, Compactness 0.8.
8. Enter the scale parameter, shape and compactness values you consider optimal and execute the Process.
Action!
Chapter 7.2 covered the following content:
Examine which Image Layers contain significant information for the class water
Set up the Segmentation Process
Rule Set
Check
Figure 73: Example solution: Water is classified using the assign class algorithm.
Figure 74: Example solution: Road is classified using the Classification algorithm and the content
of the Class Description.
Result
Check
Figure 75: Water and roads are classified in this subset. Some Road Objects are still misclassified; these will be eliminated in the next lesson.
Classify Using Context Information: Relative border to class
Context information is expressed in the so-called Class-Related Features. These Features express relationships of Objects within one Level, or between super- and sub-Objects. Relations to neighbors within a Level can be: existence or number of neighbors of a class, common border to a class, relative area of a class, spatial distances, or the spectral difference to a class.
The Feature we will use in this lesson is Relative Border to a class. It expresses the amount of border shared with a certain class compared to the overall border of the Object.
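The definition just given can be sketched directly: the Feature is the fraction of an Object's total border length that is shared with neighbors of a given class. The border lengths and the neighbor list below are assumptions for illustration.

```python
# Hedged sketch of the Relative Border to a class Feature: shared border
# length with a target class divided by the Object's total border length.

def relative_border_to(neighbors, target_class):
    """neighbors: list of (neighbor_class, shared_border_length) pairs."""
    total = sum(length for _, length in neighbors)
    shared = sum(length for cls, length in neighbors if cls == target_class)
    return shared / total if total else 0.0

# A misclassified Road Object surrounded mostly by Water:
road_object = [("Water", 120.0), ("Water", 60.0), ("unclassified", 20.0)]
print(relative_border_to(road_object, "Water"))  # 0.9
```

A value close to 1 means the Object is almost entirely enclosed by the target class, which is exactly the context cue used in this lesson to find the misclassified Road Objects.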
NOTE:
To create the Feature for all classes at once, right-click and select Create all.
3. Select Water from the Value drop-down list and confirm with OK.
The Feature is now available in the Feature View.
Result
Check
Figure 76: The Feature Rel. border to Water is created in the Feature View.
Result
Check
Figure 77: Classification, Classification in outlines view, compared with the Feature View for Relative Border to Water Objects.
Settings
Check
Figure 78: Process settings for classifying Road Objects with border to Water Objects.
Rule Set
Check
Result
Check
Figure 80: Left: previous Classification result; Right: current Classification result. All misclassified Road Objects now belong to the class Water Body.
Merge Objects
Introduction
The final goal of an image analysis is to obtain the outlines of the Objects of interest. Within a more complex Rule Set there are always several segmentation and Classification steps before the final Objects are reached. In our first simple example we will now merge all adjacent Objects.
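The merge step can be pictured as grouping neighboring Objects of the same class into one region. The hedged sketch below uses a union-find structure over an adjacency list; Object ids, classes, and adjacency are assumptions for illustration and not the software's internal representation.

```python
# Hedged sketch of merging adjacent Image Objects of the same class,
# using union-find over an adjacency list.

def merge_adjacent(classes, adjacency):
    """classes: {obj_id: class}; adjacency: list of (obj_id, obj_id) pairs.
    Returns {obj_id: group_id}: adjacent Objects of equal class share a group."""
    parent = {o: o for o in classes}

    def find(o):
        while parent[o] != o:
            parent[o] = parent[parent[o]]  # path compression
            o = parent[o]
        return o

    for a, b in adjacency:
        if classes[a] == classes[b]:
            parent[find(a)] = find(b)     # union neighbors of the same class
    return {o: find(o) for o in classes}

classes = {1: "Water", 2: "Water", 3: "Road", 4: "Water"}
adjacency = [(1, 2), (2, 3), (3, 4)]      # Object 4 touches only the Road
groups = merge_adjacent(classes, adjacency)
print(groups[1] == groups[2], groups[1] == groups[4])  # True False
```

Objects 1 and 2 merge because they are adjacent Water; Object 4, although Water, stays separate because it only borders a Road Object.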
Rule Set
Check
Result
Check
Export Results
Note:
In the field Desktop Export Folder, Scene Dir means that the result is stored at the same place the original images come from. It is also possible to point to a selected place instead. If you run the Rule Set in batch processing, the default path changes to the Workspace folder. This, too, can be changed; it must be edited in the Analysis dialog box.
Settings
Check
Rule Set
Check
Figure 85: Process Tree with Process added to export the current view.
Result
Check
Settings
Check
Rule Set
Check
Result
Check
Figure 89: The content of the exported csv file.
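The exported csv is a plain table with one row per exported item and one column per Feature. A minimal sketch of its shape is shown below; the column names and values are illustrative assumptions, not the exact columns produced by the export algorithm.

```python
# Hedged sketch of a statistics export: one row per Image Object with
# selected Feature values, written as csv. Column names are illustrative.

import csv
import io

objects = [
    {"id": 1, "class": "Water", "brightness": 28.28, "mean_nir": 16.08},
    {"id": 2, "class": "unclassified", "brightness": 70.10, "mean_nir": 80.30},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "class", "brightness", "mean_nir"])
writer.writeheader()
writer.writerows(objects)
print(buf.getvalue())
```

Any spreadsheet or GIS package that reads csv can then join such a table back to the exported geometries.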
Note:
You also have the possibility to influence the vectorization by defining a smoothing of the polygons. If you want to smooth the outlines, you additionally have to add the algorithm Set Rule Set Options before the actual export algorithm.
Note:
If you run the analysis in batch mode, the scene name will automatically be added to the defined name. This separates the shape files from each other and keeps the reference to the source file name.
Settings
Check
Note:
If you choose Smoothed, the vectors will be generalized according to the settings made via the algorithm Set Rule Set Options.
Define the length and the number of decimal places (scale) of the output, if needed.
Figure 91: The Edit Attribute Table Columns dialog box with settings to export.
Note:
It is a known bug that aliases get lost when loading new Features.
Result
Check
Figure 92: Content of the exported .dbf file, which belongs to the .shp file.
Sample Based Classification with Nearest Neighbor Classifier
Introduction
The Nearest Neighbor (NN) classifier is the Definiens solution for a quick and simple Classification of Image Objects based on given sample Image Objects within a defined Feature space.
After a representative set of sample Objects has been declared for each class, each Image Object is assigned to the class of the nearest sample Object in the Feature space. Starting with a few samples, it produces fast results that can quickly be improved by adding or editing samples.
Nearest Neighbor: the Feature space can be defined independently for each individual class.
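The core assignment rule just described can be sketched in a few lines: each Object receives the class of the closest sample in the Feature space. Feature vectors and samples below are invented for illustration; the software additionally applies membership functions not reproduced here.

```python
# Hedged sketch of Nearest Neighbor classification: assign the class of the
# closest sample in the Feature space (here, Euclidean distance over two
# invented Features).

import math

def nearest_neighbor(obj, samples):
    """samples: list of (feature_vector, class) pairs; obj: feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(samples, key=lambda s: dist(obj, s[0]))[1]

samples = [
    ((28.0, 16.0), "Water Bodies"),
    ((55.0, 90.0), "Woodland General"),
    ((80.0, 75.0), "Grassland General"),
]
print(nearest_neighbor((30.0, 20.0), samples))  # Water Bodies
```

Adding or editing samples simply changes the reference points of this search, which is why the Classification can be refined so quickly.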
Classic workflow
Information
To classify Image Objects using the Nearest Neighbor classifier, follow the recommended workflow:
1. Choose the Features you want to use for the Feature Space. The default Feature Space consists of the Mean Features of the layers.
2. Load or create classes.
3. Append a new Process.
4. Choose algorithm Nearest Neighbor Configuration.
5. Set the Algorithm Parameters:
Preparation
1. Import the project file Dessau_NearestNeighbor.dpr from the LANDSAT_Dessau folder.
Action! 2. Open the project. An Image Object Level already exists and the following classes
have been inserted:
Woodland General
Grassland General
Impervious General
Water Bodies
Result
Check
Action!
Rule Set
Check
Figure 96: Process Tree with inserted Parent Process for NN Classification.
Append a Process using the algorithm Nearest Neighbor Configuration
Information
With this algorithm, the Feature Space can be defined and applied to the selected classes. The Feature Space is an n-dimensional combination of Features used for calculating membership values. Before defining the Feature Space, decide which Features you intend to use. The Feature View will help you define your Feature Space.
NOTE:
This algorithm is not listed in the default algorithm list. You first have to make it available. To do so, go to the end of the algorithm list and select more.
Settings
Check
Settings
Check
Figure 98: Select Multiple Features window with Mean and Standard deviation Features selected for Feature Space definition.
Settings
Check
NOTE:
The Feature space for both the nearest neighbor and the standard nearest neighbor classifier can be edited by double-clicking it in the Class Description.
Open the Sample Navigation toolbar and choose the tools there.
Settings
Check
Figure 100: The Sample Navigation toolbar.
Sample Selection Information: Once a class has at least one sample, the quality of a
new sample can be assessed in this dialog box. It can help to decide if an Object contains
new information for a class, or if it should belong to another class. Three distances are
shown in the dialog box:
Membership: shows the potential degree of membership according to the
adjusted function slope of the nearest neighbor classifier.
Mean Distance: shows the mean distance to all samples of the respective
class.
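The two distances listed above can be sketched as follows. The membership value here is a simple decaying function of the distance to the nearest sample, controlled by an assumed slope parameter; the exact membership function used by the software is not reproduced, and the sample values are invented.

```python
# Hedged sketch of two of the sample quality measures: a membership value
# derived from the distance to the nearest sample (assumed linear slope),
# and the mean distance to all samples of a class.

import math

def distances(candidate, class_samples, slope=0.2):
    d = [math.dist(candidate, s) for s in class_samples]
    nearest = min(d)
    membership = max(0.0, 1.0 - slope * nearest)  # assumed, not the real slope
    mean_distance = sum(d) / len(d)
    return membership, mean_distance

woodland_samples = [(55.0, 90.0), (57.0, 88.0)]
m, md = distances((56.0, 89.0), woodland_samples)
print(round(m, 3), round(md, 3))  # 0.717 1.414
```

A candidate very close to existing samples yields a high membership but adds little new information; a more distant candidate with a still-acceptable membership is the kind of sample worth adding.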
Action! 3. Click once on a sample Object for the Woodland General class.
4. Double-click to accept this Object as a sample for the Woodland General class.
5. Click another potential sample Object for the Woodland General class.
The Sample Selection Information
Information
Once a sample is assigned to a class, the quality of a new sample can be assessed in the Sample Selection Information dialog box.
Analyze its membership value and its distance to the Woodland General class and to all other classes within the Feature space.
Decide
if the sample includes new information to describe the selected class (low membership value to the selected class, low membership value to the other classes)
Action!
Rule Set
Check
Figure 103: Process Tree with Process for NN configuration and Classification.
NOTE:
When you are finished collecting samples, be sure to click Select Samples in the Samples menu to turn off sample selection.
Rule Set
Check
Figure 104: Process Tree with helper Process for refining NN Classification.
Batch-Processing with eCognition Server
Introduction
Definiens in combination with a Definiens eCognition Server allows batch processing of entire sets of image data. The following Lesson will walk you through the different steps involved in setting up, running and monitoring a batch process.
For batch processing we will use a Rule Set which classifies impervious surface from aerial RGB data and calculates categories compared to a GIS.
Note
Without a Definiens Server, you will not be able to submit data for batch processing.
However, it still may be useful to go through the steps of this tutorial to see how to
set up a Workspace. The Workspace can be quite useful for managing data and results
even without batch processing.
Information
We will first load the data to be processed into the existing Workspace using an import template. Multiple projects will be created automatically. Standardized import templates are used to load the image data according to the necessary file structure.
The import via template guarantees that the names of the image layers correspond to those used in the Rule Set (red, green and blue). This includes the names of thematic layers (parcels).
Aerial images together with the corresponding subset of a shape file will be loaded.
Action! 5. Select the created Folder, right-click it and select Predefined Import.
6. From the drop-down Import Template list select DSS applanix with parcels.
7. Next to the Root Folder of Image Data field, click the folder icon to open the
Browse For Folder dialog box.
8. Make sure the Search in Subfolders checkbox is activated, because the data to be
imported is stored in different sub folders.
9. Select the folder containing the data to be analyzed, here:
\01_Definiens_ESSENTIALS_TRAINING\Module1\Aerial_Thematic.
10. Once the data is loaded in the Import Scenes dialog box, expand folders in the
Preview window by clicking the + sign or collapse them by clicking on the -
button to evaluate the data to be loaded.
11. Finally, click OK to import the data into the Workspace.
The Workspace now shows all datasets imported as well as additional information
available on the different Workspace items.
Result
Check
Figure 105: Left: Import Scenes dialog; Right: Workspace with automatically created Projects.
Chapter 12.1 covered the following content
Settings
Check
After submitting data for analysis, the Workspace entries display the current processing
status. If statistical information is exported, it will be added to the details of each
Workspace entry.
Workspace
Check
Workspace entries can be opened by double-clicking. The result project will be opened
in the current Definiens application. If the project is modified and saved, the status will
be set to Edited. To reset a Workspace entry, select the entry and right-click, then select
History.
Enter the job scheduler address. If a local job scheduler is used, use http://localhost:8184.
The HTML page is split into four parts: User Jobs and Engines (on the left side of the screen), Engine Usage (lower part) and the Job Overview on the right, which is empty by default. You can resize the panes by clicking the dividers and dragging them.
Inactive jobs encompass both successfully completed jobs and those that failed or
were cancelled.
Failed lists only those that did not successfully finish. Any filter in use is
surrounded by asterisks (this information applies to all filters on the page).
Look at some of the available data in this pane:
1. Click Active to display only jobs currently running.
2. Click the Refresh button to reload the page.
Action!
3. Click Log to see additional information about how the job was processed. The log
lists the dates of events, followed by machine and engine number and the type of
event (an engine either connecting or shutting down).
4. Click on the index number of a job in the User Jobs pane to view its details in the
Job Overview section.
A job can have one of the following states:
done
failed
waiting
cancelled
processing
Information displayed about a specific job includes the start and end times, the version number of your Definiens software, the (local) path of the utilized Rule Set, a list of the image layers submitted for processing, and the paths of all the output files you specified in the Configure Exported Results dialog. In case of errors, a remarks section will also be displayed, providing information about the origin of the error.
Chapter 12.3 covered the following content