
Deadline User Manual

Release 7.1.0.35

Thinkbox Software

May 04, 2015

CONTENTS

1 Introduction
    1.1 Overview
    1.2 Feature Set
    1.3 Supported Software
    1.4 Render Farm Considerations

2 Installation
    2.1 System Requirements
    2.2 Licensing
    2.3 Database and Repository Installation
    2.4 Client Installation
    2.5 Submitter Installation
    2.6 Upgrading or Downgrading Deadline
    2.7 Relocating the Database or Repository
    2.8 Importing Repository Settings

3 Getting Started
    3.1 Application Configuration
    3.2 Submitting Jobs
    3.3 Monitoring Jobs
    3.4 Controlling Jobs
    3.5 Archiving Jobs
    3.6 Monitor and User Settings
    3.7 Local Slave Controls

4 Client Applications
    4.1 Launcher
    4.2 Monitor
    4.3 Slave
    4.4 Pulse
    4.5 Balancer
    4.6 Command
    4.7 Web Service
    4.8 Mobile

5 Administrative Features
    5.1 Repository Configuration
    5.2 User Management
    5.3 Slave Configuration
    5.4 Pulse Configuration
    5.5 Balancer Configuration
    5.6 Job Scheduling
    5.7 Pools and Groups
    5.8 Limits and Machine Limits
    5.9 Job Failure Detection
    5.10 Notifications
    5.11 Remote Control
    5.12 Network Performance
    5.13 Cross Platform Rendering

6 Advanced Features
    6.1 Manual Job Submission
    6.2 Power Management
    6.3 Slave Scheduling
    6.4 Farm Statistics
    6.5 Client Configuration
    6.6 Auto Configuration
    6.7 Render Environment
    6.8 Multiple Slaves On One Machine
    6.9 Cloud Controls
    6.10 Job Transferring

7 Scripting
    7.1 Scripting Overview
    7.2 Application Plugins
    7.3 Event Plugins
    7.4 Cloud Plugins
    7.5 Balancer Plugins
    7.6 Monitor Scripts
    7.7 Job Scripts
    7.8 Web Service Scripts
    7.9 Standalone Python API

8 REST API
    8.1 REST Overview
    8.2 Jobs
    8.3 Job Reports
    8.4 Tasks
    8.5 Task Reports
    8.6 Slaves
    8.7 Pulse
    8.8 Balancer
    8.9 Limits
    8.10 Users
    8.11 Repository
    8.12 Pools
    8.13 Groups

9 Application Plugins
    9.1 3ds Command
    9.2 3ds Max
    9.3 After Effects
    9.4 Anime Studio
    9.5 Arion Standalone
    9.6 Arnold Standalone
    9.7 AutoCAD
    9.8 Blender
    9.9 Cinema 4D
    9.10 Cinema 4D Team Render
    9.11 Clarisse iFX
    9.12 Combustion
    9.13 Command Line
    9.14 Command Script
    9.15 Composite
    9.16 Corona Standalone
    9.17 Corona Distributed Rendering
    9.18 CSiBridge
    9.19 CSiETABS
    9.20 CSiSAFE
    9.21 CSiSAP2000
    9.22 DJV
    9.23 Draft
    9.24 Draft Tile Assembler
    9.25 EnergyPlus
    9.26 FFmpeg
    9.27 Fusion
    9.28 Fusion Quicktime
    9.29 Generation
    9.30 Hiero
    9.31 Houdini
    9.32 Lightwave
    9.33 LuxRender
    9.34 LuxSlave
    9.35 Mantra Standalone
    9.36 Maxwell
    9.37 Maya
    9.38 Media Encoder
    9.39 Mental Ray Standalone
    9.40 Messiah
    9.41 MetaFuze
    9.42 MetaRender
    9.43 MicroStation
    9.44 modo
    9.45 Naiad
    9.46 Natron
    9.47 Nuke
    9.48 Nuke Frame Server
    9.49 Octane Standalone
    9.50 PRMan (Renderman Pro Server)
    9.51 Puppet
    9.52 Python
    9.53 Quicktime Generation
    9.54 Realflow
    9.55 REDLine
    9.56 Renderman (RIB)
    9.57 Rendition
    9.58 Rhino
    9.59 RVIO
    9.60 Salt
    9.61 Shake
    9.62 SketchUp
    9.63 Softimage
    9.64 Terragen
    9.65 Tile Assembler
    9.66 V-Ray Distributed Rendering
    9.67 VRay Standalone
    9.68 VRay Ply2Vrmesh
    9.69 VRay Vrimg2Exr
    9.70 VRED
    9.71 VRED Cluster
    9.72 Vue
    9.73 xNormal

10 Event Plugins
    10.1 Draft
    10.2 FontSync
    10.3 ftrack
    10.4 Puppet
    10.5 Salt
    10.6 Shotgun

11 Cloud Plugins
    11.1 Amazon EC2
    11.2 Google Cloud
    11.3 Microsoft Azure
    11.4 OpenStack
    11.5 vCenter

12 Release Notes
    12.1 Deadline 7.0.0.54 Release Notes
    12.2 Deadline 7.0.1.3 Release Notes
    12.3 Deadline 7.0.2.3 Release Notes
    12.4 Deadline 7.0.3.0 Release Notes
    12.5 Deadline 7.1.0.35 Release Notes

CHAPTER ONE: INTRODUCTION

1.1 Overview
Deadline is a hassle-free administration and rendering toolkit for Windows, Linux, and Mac OSX based render farms.
It offers a world of flexibility and a wide range of management options for render farms of all sizes, and supports over
60 different rendering packages out of the box.
Deadline 7 is the latest version of Thinkbox Software's scalable high-volume compute management solution. It features built-in VMX (Virtual Machine Extension) capabilities, which allow artists, architects and engineers to harness
resources in both public and private clouds.
In addition to enhanced cloud support, Deadline 7 expands support for the Jigsaw multi-region rendering feature,
which can now be accessed in 3ds Max, Maya, modo, and Rhino. Deadline 7 also includes an updated version of
Draft, Thinkbox's lightweight compositing and video processing plug-in designed to automate typical post-render
tasks such as image format conversion as well as the creation of animated videos and QuickTimes, contact sheets, and
watermark elements on exported images. Finally, Deadline 7 introduces a wealth of new features, enhancements, and
bug fixes.
Deadline 7.1 adds many new features to Deadline 7.0, including new slave metrics, better font synchronization, and
new application support. It also fixes some bugs that were discovered after Deadline 7.0 was released.
Note that a new 7.1 license is required to run this version. If you have a license for Deadline 7.0 or earlier, you will
need an updated license. In addition, the version of Draft that ships with Deadline 7.1 needs a new 1.3 license. If you
have a license for Draft 1.2 or earlier, you will need an updated license.

1.1.1 Components
The Deadline Render Farm Management System is made up of three components:
A single Deadline Database
A single Deadline Repository
One or more Deadline Clients


The Database and Repository together act as a global system where all of Deadline's data is stored. The Clients
(workstations and render nodes) then connect to this system to submit, render, and monitor jobs. It is important to
note that while the Database and Repository work together, they are still separate components, and therefore can be
installed on separate machines if desired.

1.1.2 Database
The Database is the global database component of the Deadline Render Farm Management System. It stores the jobs,
settings, and slave configurations. The Clients access the Database via a direct socket connection over the network. It
only needs to be installed on one machine (preferably a server), and does not require a license.

1.1.3 Repository
The Repository is the global file system component of the Deadline Render Farm Management System. It stores the
plugins, scripts, logs, and any auxiliary files (like scene files) that are submitted with the jobs. The Clients access the
Repository via a shared network path. It only needs to be installed on one machine (preferably a server), and does not
require a license.

1.1.4 Client
The Client should be installed on your render nodes, workstations, and any other machines you wish to participate in
submitting, rendering, or monitoring jobs. The Client consists of the following applications:


Launcher: Acts as a launch point for the Deadline applications on workstations, and facilitates remote communication on render nodes.
Monitor: An all-in-one application that artists can use to monitor their jobs and administrators can use to monitor
the farm.
Slave: Controls the rendering applications on the render nodes.
Command: A command line tool that can submit jobs to the farm and query for information about the farm (see the submission sketch following this list).
Pulse: An optional mini server application that performs maintenance operations on the farm, and manages
more advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering,
and the Web Service. If you choose to run Pulse, it only needs to be running on one machine.
Balancer: An optional Cloud-controller application that can create and terminate Cloud instances based on
things like available jobs and budget settings.
Note that the Slave and Balancer applications are the only Client applications that require a license.
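
The Command application can be driven from scripts as well as from a shell. As a hedged illustration of the two-file submission workflow (covered in detail under Manual Job Submission), the Python sketch below writes a job info file and a plugin info file and hands them to deadlinecommand. The plugin name, frame range, pool, and plugin info keys are placeholder examples only; the keys a given plugin expects are listed in its own documentation.

```python
import os
import subprocess
import tempfile

# Placeholder job info keys -- adjust the plugin, frames, and pool for your farm.
job_info = "\n".join([
    "Plugin=CommandLine",   # which application plugin the Slaves should load
    "Name=Example Job",
    "Frames=1-100",
    "ChunkSize=20",         # 20 frames per task (see Job Breakdown below)
    "Pool=none",
    "Priority=50",
])

# Plugin info keys depend entirely on the chosen plugin; these are placeholders.
plugin_info = "\n".join([
    "Executable=/usr/bin/echo",
    "Arguments=rendering <STARTFRAME> to <ENDFRAME>",
])

def write_submission_file(contents, suffix):
    # Write a submission file to a temporary location and return its path.
    handle, path = tempfile.mkstemp(suffix=suffix)
    with os.fdopen(handle, "w") as f:
        f.write(contents)
    return path

job_file = write_submission_file(job_info, ".job")
plugin_file = write_submission_file(plugin_info, ".job")

# deadlinecommand is installed with the Deadline Client and must be on the PATH.
subprocess.check_call(["deadlinecommand", job_file, plugin_file])
```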

1.1.5 Jobs
A Deadline job typically represents one of the following:
The rendering of an animation sequence from a 3D scene.
The rendering of a frame sequence from a composition. It could represent a single write node, or multiple write
nodes with the same frame range.
The generation of a Quicktime movie from an existing image sequence.
A simulation.
These are just some common cases. Since a job simply represents some form of processing, a plug-in can be created
for Deadline to do almost anything you can think of.
Job Breakdown
A job can be broken down into one or more tasks, where each task is an individual unit that can be rendered by the
Slave application. Each task can then consist of a single frame or a sequence of frames. Here are some examples:
When rendering an animation with 3ds Max where each frame can take hours to render, each frame can be
rendered as a separate task.
When rendering a compositing job with After Effects where each frame can take seconds to render, each task
could consist of 20 frames.
When rendering a Quicktime job to create a movie from an existing sequence of images, the job would consist
of a single task, and that task would consist of the entire image sequence.


Job Scheduling
Use numeric job priorities, machine groups and pools, and job-specific machine lists to explicitly control distribution
of rendering resources among multiple departments. Limits allow you to handle both limited license plug-ins and
render packages, while job dependencies and scheduling allow you to control when your jobs will begin rendering.


The Slave applications are fully responsible for figuring out which job they should render next, and they do this by
connecting directly to the Database. In other words, there is no central server application that controls which jobs the
Slaves are working on. The benefit of this is that as long as your Database and Repository are online, Deadline will be
fully operational.

1.2 Feature Set


1.2.1 Rock-steady Operation
Deadline's unique architecture removes the need for a centralized manager application by using a highly scalable
database and basic file sharing to manage the farm. As long as your Database and File Server are running, Deadline is
running.

1.2.2 Intuitive User Interface


Built with your creativity in mind, Deadline's User Interface has evolved in response to extensive feedback from artists.
The flexible and intuitive interface provides a unified experience to artists and administrators across all platforms.
For job submission, Deadline offers integrated submission scripts for 3ds Max, After Effects, Blender, Cinema 4D,
Clarisse iFX, Composite, Fusion, Generation, Hiero, Houdini, Lightwave, Maya, Messiah, modo, Nuke, RealFlow,
Rhino, SketchUp 3D, Softimage, and Vue, providing a comfortable native environment for cross-application tasks.


1.2.3 Supported Software


Deadline supports over 60 different rendering packages out of the box. See the Supported Software page in the
Deadline documentation for more information.

1.2.4 Customizable and Scriptable


With its Python-based plug-in API, studios can customize the out-of-the-box plug-ins and scripts to suit their individual
pipelines, or create custom plug-ins to support in-house applications. Event plug-ins can be created to trigger events
like updating pipeline tools when jobs are submitted or finish rendering, and Cloud plug-ins can be created to control
VMs in public and private Cloud providers. Finally, job scripts can be created to set up custom dependencies, as well
as perform operations when a job starts, when a job finishes, and before and after each task is rendered.
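
To give a feel for the plug-in API, the following is a minimal sketch of an event plug-in that reacts when a job is submitted. It is illustrative only: the listener class name is made up, the script runs inside Deadline's embedded Python rather than standalone, and the authoritative list of callbacks and helper methods is in the Event Plugins section of the Scripting chapter.

```python
# Illustrative event plug-in sketch; runs inside Deadline's embedded Python.
from Deadline.Events import *

def GetDeadlineEventListener():
    # Deadline calls this factory function to create the listener instance.
    return MyEventListener()

def CleanupDeadlineEventListener(eventListener):
    eventListener.Cleanup()

class MyEventListener(DeadlineEventListener):
    def __init__(self):
        # Hook the job-submitted event to our handler.
        self.OnJobSubmittedCallback += self.OnJobSubmitted

    def Cleanup(self):
        del self.OnJobSubmittedCallback

    def OnJobSubmitted(self, job):
        # For example, tell a pipeline tool that a new job exists.
        self.LogInfo("Job submitted: %s" % job.JobName)
```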

1.2.5 Flexible Job Scheduling


Use numeric job priorities, machine groups and pools, and job-specific machine lists to explicitly control distribution
of rendering resources among multiple departments. Limits allow you to handle both limited license plug-ins and
render packages, while job, asset, and script based dependencies allow you to control when your jobs will begin
rendering. Stick with the default First-in, First-out scheduling logic, or switch to a Balanced or Weighted system.
Launch and configure an arbitrary number of Slaves on a single machine. Each Slave instance can be given a unique
name, and can be assigned its own list of pools and groups, which allows Slaves to work on separate jobs. A single
high performance machine can process multiple 3D, compositing, and simulation jobs simultaneously. Slave instances
running on the same machine will share a single Deadline license.

1.2.6 Notifications
Deadline can be configured to notify users of job completion or failure through an automatic e-mail notification or a
popup message on the user's machine.
Administrators can also configure Deadline to notify them with information about Power Management, stalled Slaves,
licensing issues, and other issues that may arise on the farm.

1.2.7 Statistics Gathering


Deadline automatically stores job and render farm statistics in the Database. Statistics can be viewed from the Monitor,
or retrieved from the Database by custom pipeline tools.

1.2.8 Shotgun and ftrack Integration


Deadline integrates with Shotgun to enable a seamless render and review data flow. When a render job is submitted,
a version is automatically created in Shotgun with key metadata. When the render is complete, Shotgun is updated
with a thumbnail image, paths to frames, render stats, and playback links. Deadline can also automatically upload a
movie and/or a filmstrip when the render is complete. Shotgun then dispatches targeted notifications with links back
to the work. Studios can view versions in various contexts, create reports, and organize work into playlists for review
sessions where they can quickly take notes with the Shotgun Note App.
The Deadline/ftrack integration enables a seamless render and review data flow. When Deadline starts a render, an
Asset Version is automatically created within ftrack using key metadata. When the render is complete, Deadline
automatically updates the created Version appropriately: a thumbnail image is uploaded, components are created
from the Job's output paths (taking advantage of ftrack's location plugins), and the Version is flagged for Review. In
doing so, Deadline provides a seamless transition from Job Submission to Review process, without artists needing to
monitor their renders.

1.2.9 Draft
Draft is a tool that provides simple compositing functionality. It is implemented as a Python library, which exposes
functionality for use in Python scripts. Draft is designed to be tightly integrated with Deadline, but it can also be used
as a standalone tool.
Using the Draft plugin for Deadline, artists can automatically perform simple compositing operations on rendered
frames after a render job finishes. They can also convert them to a different image format, or generate Quicktimes for
dailies.
Active Deadline subscribers are entitled to Draft licenses at no additional cost. Active Deadline subscribers can request
a Draft license by emailing sales@thinkboxsoftware.com.
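
Because Draft is exposed as a Python library, a typical post-render step can be just a few lines of script. The snippet below is a hedged sketch in the style of the Draft cookbook examples: the paths are placeholders, and the exact function signatures should be confirmed against the Draft documentation.

```python
import Draft  # requires a Draft installation and license

# Placeholder paths; in a Draft event script these would come from the job's output.
in_frame = "/mnt/renders/shot010/beauty.0001.exr"
out_frame = "/mnt/renders/shot010/preview/beauty.0001.jpg"

# Read a rendered frame, scale it down, and write it back out in another format.
image = Draft.Image.ReadFromFile(in_frame)
image.Resize(1280, 720)
image.WriteToFile(out_frame)
```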

1.2.10 QuickTime Support


Install QuickTime on your slaves to create QuickTime movies from your own rendered frames.

1.2.11 Jigsaw and Tile Rendering


Jigsaw is available for 3ds Max, Maya, modo, and Rhino, and can be used to split up large frames into arbitrary sized
tiles and distribute them over your render farm. When the tiles are finished rendering, they are automatically assembled
into the final image using Draft. Specific tiles can be re-rendered and automatically composited on top of the original
image.
Regular tile rendering, which supports fixed tile sizes only, is still supported as well, and is available for 3ds Max,
Maya, modo, Rendition, Rhino, and Softimage.

1.2.12 Easy Installation and Upgrade Deployment


Deadline has gone through rigorous analysis to make the installation and configuration process smooth and efficient. A
detailed document provides easy, step-by-step instructions explaining the various components that will be installed. In
addition, Deadline has the ability to auto-upgrade the whole render farm from a centralized deployment - an incredible
time-saver for large render farms.
Auto Configuration allows studios to efficiently increase the size of their farm by removing the need to configure each
new Slave individually. The Repository Path, License Server, and additional settings can be configured in a single
location, and broadcast to the slaves when they start up.

1.2.13 Slave Scheduling and Idle Detection


Start and stop the slave based on the time of day to allow workstations to join the render farm overnight. Alternatively,
start the slave if the machine has been idle for a certain amount of time, and stop it when the machine is in use again.
Other criteria like CPU usage, memory usage, and running processes can also be checked before starting the slave.
A warning message can be displayed before starting the slave, allowing an artist to choose to delay the slave's start if they
are still using the machine.


1.2.14 Local Slave Controls


Artists can monitor and control the slave application running on their workstation, which is useful if the slave is
running as a service. Override the Idle Detection settings for your slave, or change the slave's Job Dequeuing Mode
to control whether the slave should render all jobs, jobs submitted from the artist's machine, or jobs submitted by specific
users.

1.2.15 Remote Control and Farm Administration


Stream the log from a Slave in real time, or start, stop, and restart Slave instances (as well as the remote machines on
which they are running) remotely from within the Monitor. In addition, execute arbitrary command lines (applications,
command line operations or batch files) on a single or group of remote machines to rollout software or install updates.
In addition, Deadline integrates seamlessly with VNC, Remote Desktop Connection, Apple Remote Desktop, and
Radmin using custom scripts. These scripts can be modified or new scripts can be created to support other remote
access software.

1.2.16 Access Control and Auditing


While full access is granted for all users to modify their own jobs, the User Group Management System prevents users
from inadvertently disrupting other jobs, and allows Administrators to configure the types of actions available to each
user group. An optional password protected Super User mode allows for global network administration.
Any command that affects a job or Slave is logged along with the originating user name and machine. This allows
everyone, including project managers and supervisors, to track changes and troubleshoot issues with confidence. It
also encourages responsibility and cooperation on the part of all users.

1.2.17 Reduced Energy Footprint


Save on energy consumption, power and cooling costs with Power Management, a feature that shuts down idle machines and starts them back up when needed. This feature is available for render farms with machines that support
WakeOnLan.

1.3 Supported Software


Deadline offers extensive out-of-the-box support for third-party applications, as well as an Application Plugin API and
Event Plugin API for custom plugin development. The following applications (and associated renderers) are supported
out of the box.


1.3.1 3ds Max


Highlighted Features

Supports Versions 2010 to 2016


3ds Max and 3ds Max Design
Integrated Submission
RPManager Submission
Keeps Scene In Memory
Tile Rendering
Jigsaw Support
Interactive VRay Distributed Rendering
Interactive Corona Distributed Rendering
Offload VRay Distributed Rendering
Offload Mental Ray Distributed Rendering
Render To Texture Support
Maxscript Jobs
Scene States/Sub-States
Custom Sanity Check
Local Rendering
Sticky/Default Settings Configuration
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path
Path Mapping Of Pre/Post Script Paths
Path Mapping Of Path Config File Path

Supported Renderers

Brazil r/s
Corona
finalRender
finalToon
Krakatoa
Maxwell
NVIDIA iray
NVIDIA Mental Ray
Quicksilver
RenderPipe
Scanline
VRay

Documentation: 3ds Max Documentation, 3ds Command Documentation

1.3.2 After Effects


Highlighted Features

Supports Versions CS3 to CS6 and CC to CC2014


Integrated Submission
Local Rendering
Multi-Machine Rendering
Submit Layers As Separate Jobs
Custom Sanity Check
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output Path
Path Mapping Of Scene File Contents (.aepx format only)

Documentation: After Effects Documentation


1.3.3 Anime Studio


Highlighted Features

Supports Versions 8 to 11
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output Path

Documentation: Anime Studio Documentation

1.3.4 Arion Standalone


Highlighted Features

Supports Version 2 and Later


Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Path Mapping Of Output Path
Path Mapping Of Scene File Contents

Documentation: Arion Standalone Documentation

1.3.5 Arnold Standalone


Highlighted Features

Supports the Pre-Release Beta and Version 1


Local Rendering
Shotgun Support
ftrack Support
Path Mapping Of Input File Paths
Path Mapping Of Output Path
Path Mapping Of Plugin Folder Paths

Documentation: Arnold Standalone Documentation


1.3.6 AutoCAD
Highlighted Features

Supports AutoCAD 2015


Local Rendering
Plotting
Exporting
Shotgun Support
ftrack Support
Draft Support

Documentation: AutoCAD Documentation

1.3.7 Blender
Highlighted Features

Supports Version 2.5 and Later


Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output Path

Supported Renderers
All

Documentation: Blender Documentation

1.3.8 Cinema 4D
Highlighted Features

Supports Versions 12 to 16
Integrated Submission
Local Rendering
Automatic Scene Exporting
Team Render Support
Custom Sanity Check
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output Path

Supported Renderers
All

Documentation: Cinema 4D Documentation, Cinema 4D Team Render Documentation


1.3.9 Clarisse iFX


Highlighted Features

Integrated Submission
Automatic Render Archiving
Path Mapping Of Scene File Path
Path Mapping Of Config File Path
Path Mapping Of Module Paths
Path Mapping Of Search Paths

Documentation: Clarisse iFX Documentation

1.3.10 Combustion
Highlighted Features

Supports Versions 4 and 2008


Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Scene File Contents

Documentation: Combustion Documentation

1.3.11 Command Line


Highlighted Features

Run Arbitrary Command Line Jobs


Run The Same Command For Different Frames
Run Different Commands For Different Tasks
Path Mapping Of Executable File Path
Path Mapping Of Arguments

Documentation: Command Line Documentation, Command Script Documentation

1.3.12 Composite
Highlighted Features

Supports Versions 2010 to 2016


Integrated Submission
Shotgun Support
ftrack Support

Documentation: Composite Documentation


1.3.13 Corona Standalone


Highlighted Features

Override number of passes and render time during submission


Specify multiple configuration files to use when rendering
Path Mapping Of Scene File Path
Path Mapping Of Output File Path
Path Mapping Of Config File Paths

Documentation: Corona Standalone Documentation

1.3.14 Corona Distributed Rendering


Highlighted Features

Submit DR Jobs to Reserve Machines
Interactive Distributed Rendering
Existing Server Process Handling
Slave Auto Session Timeout Controls

Supported Applications

3ds Max (fully integrated)
DR Server (server launching only)

Documentation: Corona Distributed Rendering Documentation

1.3.15 CSiBridge
Highlighted Features

Submit Solver, Analysis and Reporting jobs


Cleanup Options to Optimize Data Size
Optionally Perform Design after Analysis
Optional Automatic Compression of Output

Documentation: CSiBridge Documentation

1.3.16 CSiETABS
Highlighted Features
Submit Solver, Analysis and Reporting jobs
Cleanup Options to Optimize Data Size
Optional Automatic Compression of Output
Documentation: CSiETABS Documentation


1.3.17 CSiSAFE
Highlighted Features

Submit Solver, Analysis and Reporting jobs


Cleanup Options to Optimize Data Size
Optionally Export to external Database
Optional Automatic Compression of Output

Documentation: CSiSAFE Documentation

1.3.18 CSiSAP2000
Highlighted Features

Submit Solver, Analysis and Reporting jobs


Cleanup Options to Optimize Data Size
Optionally Perform Design after Analysis
Optional Automatic Compression of Output

Documentation: CSiSAP2000 Documentation

1.3.19 DJV
Highlighted Features

Image/Movie Type Conversion


Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Input File Path
Path Mapping Of Output File Path
Path Mapping Of Slate Input Path

Documentation: DJV Documentation

1.3.20 Draft
Highlighted Features

Deep Integration With Deadline


Create Movies From Rendered Images
Perform Other Image Processing
Shotgun Support
ftrack Support
Path Mapping Of Template File Path
Path Mapping Of Template Arguments

Documentation: Draft Documentation, Draft Event Documentation


1.3.21 EnergyPlus
Highlighted Features

Off-load US Gov. Energy Analysis Jobs


Optional Weather EPW Files
Multithreading/DEBUG Options
Post-Processing Options
Optional Automatic Compression of Output

Documentation: EnergyPlus Documentation

1.3.22 FFmpeg
Highlighted Features

Up To 10 Input Files or Sequences


Path Mapping Of Input File Paths
Path Mapping Of Output File Path
Path Mapping Of Video Preset File Path
Path Mapping Of Audio Preset File Path
Path Mapping Of Subtitle Preset File Path

Documentation: FFmpeg Documentation

1.3.23 ftrack
Highlighted Features

Create new Asset Versions on job submission


Update Version status on job completion
Automatic thumbnail generation and upload
Automatic component upload

Documentation: ftrack Event Documentation

1.3.24 Fusion
Highlighted Features

Supports Versions 5 to 7
Integrated Submission
Keeps Scene In Memory
Custom Sanity Check
Quicktime Generation
Shotgun Support
ftrack Support
Draft Support

Documentation: Fusion Documentation, Fusion Quicktime Documentation


1.3.25 Generation
Highlighted Features
Integrated Submission
Submit Comp Jobs To Fusion
Documentation: Generation Documentation

1.3.26 Hiero
Highlighted Features
Integrated Submission
Submit Transcoding Jobs To Nuke
Documentation: Hiero Documentation

1.3.27 Houdini
Highlighted Features

Supports Versions 9 to 14
Integrated Submission
Submit ROPs as Separate Jobs
Submit Wedge ROPs as Separate Jobs
IFD Export Jobs
Custom Sanity Check
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path
Path Mapping Of Scene File Contents
Path Mapping Of IFD File Path

Supported Renderers
All

Documentation: Houdini Documentation


1.3.28 Lightwave
Highlighted Features

Supports Versions 8 to 11 and 2015


FPrime Rendering
Integrated Submission
Keeps Scene In Memory
Custom Sanity Check
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Config Folder Path
Path Mapping Of Content Folder Path
Path Mapping Of Content File Contents

Supported Renderers
All

Documentation: Lightwave Documentation

1.3.29 LuxRender
Highlighted Features
Path Mapping Of Scene File Path
Documentation: LuxRender Documentation

1.3.30 LuxSlave
Highlighted Features

Submit Luxconsole Jobs to Reserve Machines


Interactive Distributed Rendering
Existing Slave Process Handling
Slave Auto Session Timeout Controls

Documentation: LuxSlave Documentation

1.3.31 Mantra Standalone


Highlighted Features

Supports Versions 7 to 13
Shotgun Support
ftrack Support
Path Mapping Of IFD File Path
Path Mapping Of Output File Path
Path Mapping Of IFD File Contents

Documentation: Mantra Standalone Documentation


1.3.32 Maxwell
Highlighted Features

Supports Versions 2 and 3


Cooperative Rendering
Automatic MXI Merging
Local Rendering
Resume Rendering from MXI Files
Override Time and Sampling Level Values
Override Extra Sampling Values
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Path Mapping Of MXI File Path
Path Mapping Of Output File Path

Documentation: Maxwell Documentation

1.3.33 Maya
Highlighted Features

Supports Versions 2010 to 2016


Integrated Submission
Keeps Scene In Memory
Tile Rendering
Jigsaw Support
VRay Distributed Rendering
Local Rendering
Submit Layers As Separate Jobs
Submit Cameras As Separate Jobs
Mental Ray Export Jobs
VRay Export Jobs
Renderman Export Jobs
Arnold Export Jobs
Melscript/Python Script Jobs
Custom Sanity Check
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output Folder Path
Path Mapping Of Project Folder Path
Path Mapping Of Scene File Contents (.ma format
only)

Supported Renderers

3Delight
Arnold
Caustic Visualizer
Final Render
Gelato
Krakatoa
Maxwell
MayaSoftware
MayaHardware
MayaVector
Mental Ray
Octane
Redshift
Renderman
Renderman RIS
Turtle
VRay

Documentation: Maya Documentation


1.3.34 Media Encoder


Highlighted Features

Local Rendering
Shotgun Support
ftrack Support
Path Mapping Of Input File Path
Path Mapping Of Output File Path

Documentation: Media Encoder Documentation

1.3.35 Mental Ray Standalone


Highlighted Features

Local Rendering
Shotgun Support
ftrack Support
Path Mapping Of Input File Path
Path Mapping Of Output File Path

Documentation: Mental Ray Standalone Documentation

1.3.36 Messiah
Highlighted Features

Integrated Submission
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Path Mapping Of Output Folder Path
Path Mapping Of Content Folder Path

Documentation: Messiah Documentation

1.3.37 MetaFuze
Highlighted Features
Batch Folder Submission
Path Mapping Of Scene File Path
Documentation: MetaFuze Documentation


1.3.38 MetaRender
Highlighted Features
Path Mapping Of Input File Path
Path Mapping Of Output File Path
Documentation: MetaRender Documentation

1.3.39 MicroStation
Highlighted Features

Supports MicroStation v8i SS3


Integrated Submission
Keeps Design File in Memory
Animation Renders
Single View Renders
Batched View Renders
Export to modo scene file
Export to DWG / DXF
Export to ACIS SAT
Export to flat DGN
Export visible edges
Print jobs
Path Mapping Of Scene File Path
Path Mapping Of Output File Path

Supported Renderers
Luxology (modo)
All built-in renderers

Documentation: MicroStation Documentation

1.3.40 modo
Highlighted Features

Supports Versions 3xx to 8xx


Integrated Submission
Keeps Scene In Memory
Modo Distributed Rendering
Tile Rendering
Jigsaw Support
Pass Groups Support
Submit Pass Group As Separate Jobs
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path
Path Mapping Of Scene File Contents

Supported Renderers
modo's default renderer
VRay

Documentation: modo Documentation


1.3.41 Naiad
Highlighted Features

Simulation Jobs
EMP to PRT Conversion Jobs
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Path Mapping Of EMP File Path

Documentation: Naiad Documentation

1.3.42 Natron
Highlighted Features

Supports Versions 0.9 to 1.0


Specify Writer Node to Render
Shotgun Support
ftrack Support
Path Mapping Of Project File Path
Path Mapping Of Project File Contents

Documentation: Natron Documentation

1.3.43 Nuke
Highlighted Features

Supports Versions 6 to 9
Integrated Submission
Keeps Scene In Memory
Submit Write Nodes As Separate Jobs
Submit Write Nodes in Precomp Nodes
Specify Views to Render
Render Using Proxy Mode
Nuke Studio Support
Studio Frame Server distributed rendering
Studio Sequence Submission
Custom Sanity Check
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Scene File Contents

Documentation: Nuke Documentation


1.3.44 Octane Standalone


Highlighted Features

Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path

Documentation: Octane Standalone Documentation

1.3.45 PRMan (Renderman Pro Server)


Highlighted Features

Shotgun Support
ftrack Support
Path Mapping Of Input File Path
Path Mapping Of Working Directory Path

Documentation: PRMan Documentation

1.3.46 Puppet
Highlighted Features
Sync applications and plugins across render nodes
Automatically sync when render nodes are idle
Documentation: Puppet Event Documentation

1.3.47 Python
Highlighted Features

Supports Versions 2.3 to 2.7 and 3.0 to 3.2


Submit Python Scripts as Jobs
Path Mapping Of Script File Path
Path Mapping Of Script Arguments

Documentation: Python Documentation


1.3.48 Quicktime
Highlighted Features

Generate Quicktime Movies from Images


Shotgun Support
ftrack Support
Path Mapping Of Input File Path
Path Mapping Of Output File Path
Path Mapping Of Audio File Path

Documentation: Quicktime Documentation

1.3.49 RealFlow
Highlighted Features

Supports Versions 4 to 5, and 2012 to 2014


Integrated Submission
Submit IDOCs as Separate Jobs
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path

Documentation: RealFlow Documentation

1.3.50 REDLine
Highlighted Features
Path Mapping Of Scene File Path
Path Mapping Of Output Folder Path
Path Mapping Of RSX File Path
Documentation: REDLine Documentation

1.3.51 Renderman (RIB)


Note that while this plugin supports PRMan, it is recommended that you use PRMan's dedicated plugin instead if you
are using that renderer.


Highlighted Features

Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Input File Path

Supported Renderers

3Delight
AIR
Aqsis
BMRT
Entropy
PRMan
Pixie
RenderDotC
RenderPipe

Documentation: Renderman Documentation

1.3.52 Rendition
Highlighted Features

Tile Rendering
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path

Documentation: Rendition Documentation

1.3.53 Rhino
Highlighted Features

Supports Versions 4 and 5


Integrated Submission
Render Bongo Animations
Shotgun Support
ftrack Support
Draft Support
Tile Rendering
Jigsaw Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path

Supported Renderers

Brazil r/s
Flamingo Raytrace
Flamingo Photometric
Maxwell
Penguin
Rhino
TreeFrog
VRay

Documentation: Rhino Documentation


1.3.54 RVIO
Highlighted Features

Shotgun Support
ftrack Support
Path Mapping Of Input File Paths
Path Mapping Of Audio File Paths
Path Mapping Of Output File Path

Documentation: RVIO Documentation

1.3.55 Salt
Highlighted Features
Sync applications and plugins across render nodes
Automatically sync when render nodes are idle
Documentation: Salt Event Documentation

1.3.56 Shake
Highlighted Features
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Documentation: Shake Documentation

1.3.57 Shotgun
Highlighted Features

Create new Versions on job submission


Update Version status on job completion
Automatic thumbnail generation and upload
Automatic movie generation and upload
Automatic film strip generation and upload

Documentation: Shotgun Event Documentation


1.3.58 SketchUp
Highlighted Features

Supports Versions 7 to 8 and 2013 to 2015


Integrated Submission
Export 3D Models
Export 2D Images
Export 2D Image Sequences
Path Mapping Of Scene File Path
Path Mapping Of Export Directory Path

Supported Renderers
All

Documentation: SketchUp Documentation

1.3.59 Softimage
Highlighted Features

Supports Versions 2010 to 2015


Integrated Submission
Keeps Scene In Memory
Tile Rendering
Local Rendering
Submit Passes As Separate Jobs
Fx Render Tree Jobs
Shotgun Support
ftrack Support
Draft Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path
Path Mapping Of Workgroup Folder Path

Supported Renderers
All

Documentation: Softimage Documentation

1.3.60 Terragen
Highlighted Features

Supports Versions 2 to 3
Local Rendering
Path Mapping Of Scene File Path
Path Mapping Of Output File Path

Documentation: Terragen Documentation


1.3.61 VRay Distributed Rendering


Highlighted Features

Submit Spawner Jobs to Reserve Machines
Interactive Distributed Rendering

Supported Applications

3ds Max (fully integrated)
Maya (fully integrated)
Rhino (spawner launching only)
SketchUp (spawner launching only)
Softimage (fully integrated)
VRay Standalone (spawner launching only)

Documentation: VRay Distributed Rendering Documentation

1.3.62 VRay Standalone


Highlighted Features

VRIMG to EXR Conversion


PLY to VRMESH Conversion
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path
Path Mapping Of Output File Path
Path Mapping Of Scene File Contents

Documentation: VRay Standalone Documentation, Ply2Vrmesh Documentation, Vrimg2Exr Documentation

1.3.63 VRED
Highlighted Features

Supports VRED 2015


Local Rendering
Single Frame and Animation Rendering
Sequencer and RenderQueue Supported
Shotgun Support
ftrack Support
Draft Support

Documentation: VRED Documentation

1.3.64 VRED Cluster


Highlighted Features
Supports VRED 2015
Submit Cluster Jobs to Reserve Machines
Documentation: VRED Cluster Documentation


1.3.65 Vue
Highlighted Features

Supports Versions 7 to 11 and 2014 to 2015


Integrated Submission
Shotgun Support
ftrack Support
Path Mapping Of Scene File Path

Documentation: Vue Documentation

1.3.66 xNormal
Highlighted Features
Path Mapping Of Scene File Path
Documentation: xNormal Documentation

1.4 Render Farm Considerations


This is a list of things that should be taken into consideration before installing Deadline.

1.4.1 Rendering Software and Licensing


It is recommended that the applications you plan to use for rendering (e.g., 3ds Max, Maya) be installed
on all of your render nodes. It is preferable to install an application to the same location on each machine,
because this makes configuring the Deadline plugins easier. Note that some applications support being installed and
run from a network location, which can make setup and configuration easier. Refer to your rendering application's
documentation to see if this is supported.
In addition, it is recommended that all licensing that your rendering applications require be set up before attempting to
render on your network. Deadline doesn't handle the licensing of third-party rendering applications, so you should refer
to your application's documentation or contact its support team if you run into issues with licensing.

1.4.2 Store Assets On The Network


It is recommended that all assets (e.g., scenes, footage, textures) used by your render jobs be placed on a network
share (preferably a server), which can be accessed via a shared path or a mapped network drive. This is important for
two reasons:
It ensures that all the slaves in your render farm have access to your asset files.
It ensures that the slaves use the same version of the asset files that are used by your job.
Note that you can optionally submit the scene file with the job. This results in the scene file being sent to the Repository
or an alternate location, and then copied locally to the Slave that renders it. If the scene file contains relative asset
paths, it is recommended to not submit the scene file with the job, as these relative paths will likely be broken when
the Slave renders the scene from its local location.


When rendering in a mixed OS environment, you can configure Deadline to swap paths based on the operating system
it is running on. The way this works is often specific to the rendering application that you are using, so please refer
to the Cross-Platform Rendering Considerations section for the plug-in that you are using for more information. You can
access plug-in specific documentation in the Plug-ins documentation.

1.4.3 Save Output Files To The Network


All output should be saved to a network share as well (preferably a server). This is important because it ensures that
all the slaves in your render farm have access to the output path.
When rendering in a mixed OS environment, you can configure Deadline to swap output paths based on the operating
system it is running on. The way this works is often specific to the rendering application that you are using, so please
refer to the Cross-Platform Rendering Considerations section for the plug-in that you are using for more information. You
can access plug-in specific documentation in the Plug-ins documentation.

1.4.4 Remote Administration


Deadline has a Remote Administration feature that can be enabled in the Client Setup section of the Repository
Options, which can be accessed from the Monitor by selecting Tools -> Configure Repository Options while in Super
User Mode. This feature allows you to control all the render nodes remotely from a single machine, including starting
and stopping the Slave application, and running arbitrary command line applications on each machine. However, this
feature can be a potential security risk if you are not behind a firewall. If this is the case, we recommend that you keep
this feature disabled.

1.4.5 Automatic Updates


Deadline has an Automatic Updates feature that can be enabled in the Client Setup section of the Repository Options,
which can be accessed from the Monitor by selecting Tools -> Configure Repository Options while in Super User
Mode. Enabling this feature makes minor Deadline upgrades easy, with little to no downtime. Refer to the Upgrading
Documentation for more information.

1.4.6 Setup An SMTP Server for Emails


Deadline can use email to notify users when their jobs have succeeded or failed. Email can also be used to notify
system administrators of all sorts of events, like when slaves stall or when jobs fail. It is recommended that an SMTP
server be setup so that you can make use of these features.
You can configure the email notification settings in the Repository Options, which can be accessed from the Monitor
by selecting Tools -> Configure Repository Options while in Super User Mode.

1.4.7 Auto Login on Windows Render Nodes


If you're not running the Slave as a service, it can be set to start automatically when the render node it is on starts up,
but this requires that the render node log in automatically. On Windows, this can be done by modifying the registry on
each render node.
These are the steps to set up your render node registry for automatic login:
1. Download the Registry Entry File For Auto Login from the Miscellaneous Deadline Downloads Page.
2. Edit the file to use the username and password you wish to use.
3. Log in to the render node as the specified user, then double-click on this file to run it.

4. The next time you restart the machine, it should log in automatically as the specified user.
By default, the Slaves are set to start automatically when the machine logs in. This setting, as well as others, can be
modified from the Launcher on each machine.
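A registry file of this kind typically sets the standard Windows auto-login values under the Winlogon key. The following is only a rough sketch of what such a file can look like (the user name, password, and domain values are placeholders you must replace, and be aware that the password is stored in plain text):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
; Placeholder values - replace with the account the render node should log in as
"AutoAdminLogon"="1"
"DefaultUserName"="renderuser"
"DefaultPassword"="yourpassword"
"DefaultDomainName"="YOURDOMAIN"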

1.4.8 App Nap on Mac OS X Render Nodes and Workstations


App Nap is a collection of new features in OS X Mavericks (10.9+) that helps conserve CPU energy use by slowing
down or stopping applications that cannot be seen, for example if they are behind another window or the screen has
been put to sleep. However, this can have an adverse effect on Deadline and/or the applications it is rendering with.
Because of this, we recommend disabling App Nap and screen power saving modes (if applicable) on render nodes,
either by enabling the Prevent App Nap checkbox (via right-click -> Get Info) for each application on each machine,
or across the entire operating system by following these steps in a terminal:

1. Open a terminal (the Terminal can be found in /Applications/Utilities).


2. Run the following command (sudo rights required), then restart the machine:
defaults write NSGlobalDomain NSAppSleepDisabled -bool YES
If you wish to re-enable App Nap, follow the steps above, but run the following command for (2) instead:
defaults delete NSGlobalDomain NSAppSleepDisabled
You can check the status of the setting (if it already exists on a machine) with the following command, where 1 means
App Nap is disabled and 0 means it is enabled:
defaults read NSGlobalDomain NSAppSleepDisabled
If workstations are being used as render nodes, it is recommended to disable App Nap on them as well. However, if
workstations are simply being used to submit and monitor render jobs, then this shouldn't be necessary.
On Macs which have built-in or connected external displays, once a screen saver has begun or the display has been put
to sleep by power management, Deadline as well as other rendering applications will be throttled down to conserve
energy, regardless of the per-app App Nap setting.
Finally, the machine that is running Pulse/Balancer should also have App Nap disabled, or at the very least, disabled for
the Pulse/Balancer applications. To disable App Nap for the Pulse/Balancer application only, right-click (or Command-click) on the DeadlinePulse/DeadlineBalancer application in Finder, and select Get Info. Then in the General section,
check the Prevent App Nap box. If Pulse/Balancer is currently running, you will have to restart it for the changes to
take effect.


1.4.9 Disable WER on Windows


When applications crash on Windows, the system holds the application open in memory and displays a series of helpful
boxes asking if you want to submit the error report to Microsoft. While that's super handy for all sorts of reasons, if
there's no one there to click the dialog (a headless render node), Deadline will assume the application is still running
and wait indefinitely by default.
The registry fix below will stop that from popping up on render nodes that don't have babysitters, meaning that when the
application crashes, it actually exits like we know it should. This change is system-wide, but can be configured per-user
if you like by changing the registry hive used (HKEY_CURRENT_USER versus HKEY_LOCAL_MACHINE).
Ensure you restart the machine after changing the registry setting, and it is always recommended to take a backup before
editing a machine's registry. Copy the code below into a file named DisableCrashReporting.reg and double-click this file
as a user with administrator privileges. Alternatively, you can manually add/edit the registry entry via regedit.exe, or
inject the registry silently via the command line: regedit.exe /s DisableCrashReporting.reg.
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting]
"Disabled"=dword:00000001

For more information about the possible settings, see the MSDN article on WER Settings.
It's also possible to default to just sending the reports if you like, or to store the crash dumps in a safe place if you're a
developer.
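As one illustration of those settings, the LocalDumps keys described in the MSDN article can be used to keep crash dumps locally; the values below are only an assumption of a reasonable starting point, not something Deadline requires:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps]
; Keep up to 10 dumps per application, written as mini dumps (DumpType 1).
; A DumpFolder value (REG_EXPAND_SZ) can also be added to control where the dumps are written.
"DumpCount"=dword:0000000a
"DumpType"=dword:00000001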

1.4.10 Firewall, Anti-Virus & Security Considerations


Here is a checklist of items which should be considered by those responsible for deploying the Deadline Repository and
Client software.
Ensure you consider additional configuration requirements for any software/hardware firewall clients, network
switches, anti-virus software clients, and Operating System specific security controls such as Windows UAC or
SELinux (Security-Enhanced Linux), which may attempt to block Deadline communication.
It is recommended during initial setup & configuration to disable all firewalls, anti-virus software, etc., and test the basic
operation and functionality of Deadline. Once this has been verified as correct, slowly re-enable all other necessary
software, re-testing and confirming that Deadline execution is still correct.
Windows UAC
Ensure Windows UAC is correctly configured to allow Deadline communication and the correct execution of the
Deadline applications.
Anti-Virus Software
Ensure Anti-Virus software does NOT block Deadline and allows Deadline executables to run normally on ALL
machines.
Deadline Executables
Allow Deadline executables to pass through any applicable Client Firewall. Ensure you consider all applicable policy
scopes (Windows - domain, private, public) and both inbound & outbound rules:
[INSTALL_PATH] Windows executable / Mac OSX executable / Linux executable
[INSTALL_PATH] deadlinecommand.exe / DeadlineCommand.app / deadlinecommand
[INSTALL_PATH] deadlinecommandbg.exe / DeadlineCommandBG.app / deadlinecommandbg
[INSTALL_PATH] deadlinelauncher.exe / DeadlineLauncher.app / deadlinelauncher


[INSTALL_PATH] deadlinelauncherservice.exe (Windows Only)


[INSTALL_PATH] deadlinemonitor.exe / DeadlineMonitor.app / deadlinemonitor
[INSTALL_PATH] deadlineslave.exe / DeadlineSlave.app / deadlineslave
[INSTALL_PATH] deadlinepulse.exe / DeadlinePulse.app / deadlinepulse
[INSTALL_PATH] deadlinebalancer.exe / DeadlineBalancer.app / deadlinebalancer
Deadline's default local client software [INSTALL_PATH] for each OS is as follows (where # is the Deadline version):
Windows: C:\Program Files\Thinkbox\Deadline#\bin
Mac OSX: /Applications/Thinkbox/Deadline#/bin
Linux: /opt/Thinkbox/Deadline#
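As an example, inbound rules for these executables can be added from an elevated command prompt on Windows using netsh; the paths below assume the default Deadline 7 install location shown above and should be adjusted for your version:

netsh advfirewall firewall add rule name="Deadline Launcher" dir=in action=allow program="C:\Program Files\Thinkbox\Deadline7\bin\deadlinelauncher.exe" enable=yes
netsh advfirewall firewall add rule name="Deadline Slave" dir=in action=allow program="C:\Program Files\Thinkbox\Deadline7\bin\deadlineslave.exe" enable=yes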
Application Executables
Make sure you allow your application executables to pass through any applicable Client Firewall. Ensure you consider
all applicable policy scopes (Windows - domain, private, public) and both inbound & outbound rules. See here for
specific 3dsMax Firewall Exceptions documentation.
MongoDB Server & Deadline clients
Ensure you allow the MongoDB service daemon to pass through any firewall and network switch. Ensure you consider
all applicable policy scopes (Windows - domain, private, public) and both inbound & outbound rules:
[INSTALL_PATH] Windows executable / Mac OSX executable / Linux executable
[INSTALL_PATH] mongod.exe / mongod / mongod
Deadline's default local database software [INSTALL_PATH] for each OS is as follows (where # is the Deadline
version):
Windows: c:\DeadlineDatabase#\mongo\application\bin
Mac OSX: /Applications/Thinkbox/DeadlineDatabase#/mongo/application/bin
Linux: /opt/Thinkbox/DeadlineDatabase#/mongo/application/bin
Mono (Mac OSX / Linux Only)
Ensure the Mono executable is allowed to pass through any firewall / anti-virus software.
Port Configuration
Ensure the machine(s) running MongoDB, the Deadline Repository, and Deadline Pulse/Balancer/Monitor/Slave ALL have
the ability to communicate with each other on your local and/or extended network over the following (default) TCP or
UDP ports.


Protocol   Port Number   Service                     Comment
UDP        17061         Pulse auto-configuration    Default UDP port - Pulse listens for broadcasts on the UDP port
TCP        17061         Pulse auto-configuration    Default TCP port - Pulse sends auto-config data over TCP
TCP        17062         Pulse                       Default TCP port - Configure Repository Options - Pulse Settings - General
TCP        27017         MongoDB
TCP        28017         MongoDB Web API             Access the http web site (optional) for database information
TCP        8080          Pulse WebService            Default TCP port - Configure Repository Options - Pulse Settings - WebService
UDP                      WoL (Wake-On-Lan)           Default UDP port - Configure Repository Options - Wake On Lan Settings
UDP        123           NTP
TCP        25            SMTP                        For mail server to receive e-mail notifications from Slaves and Pulse
TCP        587           SMTP (submission)
TCP        465           SMTP SSL                    For sending notifications using SSL
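A quick way to verify that one machine can reach another on a given port is a simple connection test; the host name below is a placeholder, and 27017 is only correct if you kept the default MongoDB port during installation:

nc -zv db-server 27017

On newer versions of Windows, the PowerShell equivalent is:

Test-NetConnection db-server -Port 27017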

License Server
If necessary, ensure that the Thinkbox Flexlm license file has been configured to run over a specific TCP port and
that this port has also been allowed through any required firewall or network switch. Please refer to the FLEXnet
Licensing Documentation.
External Web Service Access & Deadline Mobile
If external network access is required, please see the Web Service and Deadline Mobile documentation.


CHAPTER TWO

INSTALLATION

2.1 System Requirements


This section covers the system requirements for all the Deadline components. It is also recommended to read through
the Render Farm Considerations documentation before proceeding with the installation.
For a more complete description of the Deadline components listed below, see the Deadline Overview documentation.

2.1.1 Database
Deadline uses MongoDB for the Database, and requires MongoDB 2.6.1 or later. The Repository installer can install the MongoDB database for you, or you can use an existing MongoDB installation providing that it is running
MongoDB 2.6.1 or later.
The following operating systems are supported for the Database:
Windows Server 2003 and later (64-bit)
Linux (64-bit)
Mac OS X 10.7 and later (64-bit)
These are the minimum recommended hardware requirements for a production Database:
64-bit Architecture
8 GB RAM
4 Cores
RAID or SSD disks
20 GB disk space
Note that MongoDB performs best if all the data fits into RAM, and it has fast disk write speeds. In addition, larger
farms may have to scale up on RAM and Cores as necessary, or even look at Sharding their database. Finally, while
you can install MongoDB to a 32-bit system for testing, it has limitations and is not recommended for production. For
example, the database size will be limited to 2 gigabytes, and Journaling will be disabled. Without Journaling, it will
not be possible to repair the database if a crash corrupts the data. See the MongoDB FAQ for more information.
Windows
If you choose a non-Server Windows Operating System (Vista, 7, or 8) to host the database, you should be aware
that these operating systems have a TCP/IP connection limitation of 10 new connections per second. If your render
farm consists of more than 10 machines, it is very likely that you'll hit this limitation every now and then (and the


odds continue to increase as the number of machines increases). This is a limitation of the operating systems, and isn't
something that we can work around, so we recommend using a Server edition of Windows, or a different operating
system like Linux.
Linux
If you choose a Linux system to host the database, you will need to make sure the system resource limits are configured
properly to avoid connection issues. More details can be found in the Database and Repository Installation Guide.
Other Linux recommendations include:
Do not run MongoDB on systems with Non-Uniform Access Memory (NUMA). It can cause a number of
operational problems, including slow performance or high system process usage.
Install on a system with a minimum Linux kernel version of 2.6.36.
Install on a system with Ext4 or XFS file systems.
Turn off atime or relatime for the storage volume containing the database files, as it can impact performance.
Do not use hugepages virtual memory pages as MongoDB performs better with normal virtual memory pages.
Mac OS X
If you choose a Mac OS X system to host the database, you will need to make sure the system resource limits are
configured properly to avoid connection issues. More details can be found in the Database and Repository Installation
Guide.

2.1.2 Repository
The Repository is just a collection of files and folders, so it can be installed to any type of share on any type of
operating system. Common Repository choices include:
Windows Server
Linux
FreeBSD
While the Repository can be installed on any operating system, the Repository installer is only supported on the
following operating systems. To install on a different operating system, first create the network share on that
system, and then run the Repository installer on one of the systems below and choose the network share as the
installation location.
Windows (32 and 64-bit)
Windows XP and later (32 and 64-bit)
Windows Server 2003 and later (32 and 64-bit)
Linux (64-bit only)
Ubuntu 12.04 and later
Debian 7 and later
Fedora 16 and later
CentOS 6 and later
RHEL 6 and later


Mac OS X (64-bit only)


10.7 (OS X Lion) and later
If you choose a non-Server Windows Operating System (XP, Vista, 7, or 8), these operating systems usually will not
allow more than 10 incoming connections without purchasing additional user access licenses from Microsoft. This
means that if more than 10 machines (render nodes or workstations) connect to the Repository, connections will be
dropped, which could result in unexpected behavior. This is a limitation of the operating systems, and isn't something
that we can work around, so we recommend using a Server edition of Windows, or a different operating system like
Linux or FreeBSD.
For hardware requirements, it mainly depends on if you are planning to submit scene files and other auxiliary files
with your jobs. If you are, keep in mind that the Repository machine will need to serve out these files to the Client
machines, so you will want to treat it like another asset server when it comes to picking hardware. That being said, if
you already have an asset server, you could probably just install the Repository on it. If you are not submitting your
scene files with your jobs (because they are already stored in a network location), then you should be fine with a less
powerful machine.

2.1.3 Client
The Client can be installed on Windows, Linux, or Mac OS X. The requirements for todays rendering applications go
far beyond the requirements of the Client, so if a machine is powerful enough to be used for rendering, it is more than
capable of running the Client applications.
If you choose to run Pulse or Balancer, and you wish to run it on the same machine as the Database and/or Repository,
you will have to install the Client on that machine as well.
The following operating systems are supported for the Client:
Windows (32 and 64-bit)
Windows XP and later (32 and 64-bit)
Windows Server 2003 and later (32 and 64-bit)
Linux (64-bit only)
Ubuntu 12.04 and later
Debian 7 and later
Fedora 16 and later
CentOS 6 and later
RHEL 6 and later
Mac OS X (64-bit only)
10.7 (OS X Lion) and later
Note that on Linux, the Deadline applications have dependencies on some libraries that are installed with the lsb
(Linux Standard Base) package. To ensure you have all the dependencies you need, we recommend installing the full
lsb package. In addition, the libX11 and libXext libraries must be installed on Linux for the Deadline applications to run,
even if running them with the -nogui flag. They're required for the Idle Detection feature, among other things. To check if
libX11 and libXext are installed, open a Terminal and run the following commands. If they are installed, then the path
to the libraries will be printed out by these commands.
ldconfig -p | grep libX11
ldconfig -p | grep libXext


If any of these libraries are missing, then please contact your local system administrator to resolve this issue. Here is
an example assuming you have root access, using YUM to install them on your system:
sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext

Note that if you are choosing a machine to run Pulse, you should be aware that non-Server editions of Windows
have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of more than
10 render nodes, it is very likely that you'll hit this limitation every now and then (and the odds continue to increase
as the number of machines increases). This is a limitation of the operating systems, and isn't something that we can
work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.

2.1.4 License Server


Deadline requires Flexnet License Server version 11.12 or later, and the license server can be run on the following
operating systems:
Windows (32 and 64-bit)
Windows XP and later (32 and 64-bit)
Windows Server 2003 and later (32 and 64-bit)
Linux (64-bit only)
Ubuntu 12.04 and later
Debian 7 and later
Fedora 16 and later
CentOS 6 and later
RHEL 6 and later
Mac OS X (64-bit only)
10.7 (OS X Lion) and later
See the License Server Documentation for more information on the License Server requirements.
Note that if you choose a non-Server Windows Operating System (XP, Vista, 7, or 8), you should be aware that these
operating systems have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists
of more than 10 machines, it is very likely that you'll hit this limitation every now and then (and the odds continue to
increase as the number of machines increases). This is a limitation of the operating systems, and isn't something that
we can work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.

2.2 Licensing
See the License Server Documentation for more information on installing and configuring the License Server.


2.3 Database and Repository Installation


2.3.1 Overview
Before proceeding with this installation, it is highly recommended to read through the Render Farm Considerations
documentation.
The Database is the global database component of the Deadline Render Farm Management System. It stores the jobs,
settings, and slave configurations. The Clients access the Database via a direct socket connection over the network.
It only needs to be installed on one machine (preferably a server), and does not require a license. Deadline uses
MongoDB for the Database.
The Repository is the global file system component of the Deadline Render Farm Management System. It stores the
plugins, scripts, logs, and any auxiliary files (like scene files) that are submitted with the jobs. The Clients access the
Repository via a shared network path. It only needs to be installed on one machine (preferably a server), and does not
require a license.
The Database and Repository together act as a global system where all of Deadline's data is stored. The Clients
then connect to this system to submit, render, and monitor jobs. It is important to note that while the Database and
Repository work together, they are still separate components, and therefore can be installed on separate machines if
desired.
The Repository installer can install the MongoDB database for you, but you can also choose to connect to an existing
MongoDB installation.

2.3.2 Installation
While the Repository can be installed on any operating system, the Repository installer is only available for Windows,
Linux, and Mac OS X. However, the machine that you run the Repository installer on doesn't have to be the same
machine you're installing the Repository to. For example, if you have an existing share on a FreeBSD server or a NAS
system, you can run the Repository installer on Windows, Linux, or Mac OS X and choose that share as the install
location.
To install the Repository, simply run the appropriate installer for your operating system and follow the steps. This
procedure is identical for all operating systems. The Repository installer also supports silent installations.


When choosing the Installation Directory, you can choose either a local path on the current machine, or the path to an
existing network share. Note that if you choose a local path, you must ensure that path is shared on the network so that
the Clients can access it. Do not install over an existing installation unless it's the same major version, or there
could be unexpected results.


If you're installing over an existing Repository installation, all previous binaries, plug-ins, and scripts will be backed
up prior to being overwritten. After the installation is complete, you can find these backed up files in the Backup folder
in the Repository installation root. Note that installing over an existing repository is only supported for repairing a
damaged repository, or for performing a minor upgrade. Major upgrades require a fresh repository installation. See
the Upgrading or Downgrading Deadline Documentation for more information.


After choosing the installation directory, you will be asked to install the MongoDB Database, or connect to an existing
one. If you choose to install the MongoDB Database, you will be asked to choose an installation location and a port
number. It is highly recommended that you choose a local directory to install the Database.
Note that Deadline 7 requires a newer version of the MongoDB database application than the one shipped with
Deadline 6. However, this newer version is backward compatible with Deadline 6. So if you are installing the
MongoDB database application to a machine that already has a Deadline 6 database installed, you can just
install it over top of the existing Deadline 6 database installation.


Next, you need to specify the Database Settings so that the installer can set up the Database. These settings will also
be used by the Clients to connect to the database. The following are required:
Database Server: The host name or the IP address of the machine that the MongoDB database is running on.
If desired, you can specify multiple entries and separate them with semicolons. There are a couple reasons to
specify multiple entries:
You have machines on different subnets that need to access the database differently (e.g. machines in the
cloud might use a different host name than machines on the local network).
Some machines need to resolve the database machine by its host name, and others need to use its IP
address.
Note that if there are IP addresses listed that cannot be resolved, the Deadline Command application
can run slower on Linux and OSX Clients because it won't exit until the connection attempts for those IP
addresses time out.
Database Port: The port that the MongoDB database is listening on.
Database Name: The name of the Database. If you are setting up a new Database, you can leave this as the
default. If you are connecting to an existing Database, make sure to enter the same name you used when you
initially set up the Database.
Replica Set: If you set up your MongoDB database manually and it is part of a Replica Set, specify the Replica
Set Name here. If you don't have a Replica Set, just leave this blank.
When you press Next, the installer will try to connect to the database using these settings to configure it. This can take
a minute or two. If an error occurs, you will be prompted with the error message. If the setup succeeds, you can then
proceed with the installation of the Repository.


Command Line or Silent Installation


The Repository installer can be run in command line mode or unattended mode on each operating system. Note though
that on Mac OS X, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is
inside the Mac Repository Installer package.
To run in command line mode, pass the --mode text command line option to the installer. For example, on Linux:
./DeadlineRepository-X.X.X.X-linux-x64-installer.run --mode text

To run in silent mode, pass the --mode unattended command line option to the installer. For example, on Windows:
DeadlineRepository-X.X.X.X-windows-installer.exe --mode unattended

To get a list of all available command line options, pass the --help command line option to the installer. For example,
on Mac OS X:
/DeadlineRepository-X.X.X.X-osx-installer.app/Contents/MacOS/installbuilder.sh --help

Note that there are a few Repository installer options that are only available from the command line, which you can
view when running the --help command. These options include:
--backuprepo: If enabled, many folders in the Repository will be backed up before overwriting them (this is
enabled by default).
--dbauth: If enabled, Deadline will use the given user and password to connect to MongoDB (if authentication
is enabled on your database).
--dbuser: The user name to connect to MongoDB if authentication is enabled.
--dbpassword: The password to connect to MongoDB if authentication is enabled.
--dbsplit: If enabled, the database collections will be split into separate databases to improve performance (this
is enabled by default).
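As a rough sketch, these options can be combined with an unattended install; the user name and password below are placeholders, and the exact value syntax for boolean options should be confirmed with the --help output:

DeadlineRepository-X.X.X.X-windows-installer.exe --mode unattended --dbauth true --dbuser deadlineadmin --dbpassword secret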
Database Config File
A file called config.conf is installed to the data directory in the database installation folder. This file is used to configure
the MongoDB database, and can be modified to add or change functionality. This is what you will typically see by
default:
#MongoDB config file
#where to log
systemLog:
    destination: file
    path: C:/DeadlineDatabase7/data/logs/log.txt
    quiet: true
    #verbosity: <integer>

#port for mongoDB to listen on
#uncomment below ipv6 and REST option to enable them.
net:
    port: 27070
    #ipv6: true
    #http:
    #    RESTInterfaceEnabled: true

#where to store the data
storage:
    dbPath: C:/DeadlineDatabase7/data

#enable sharding
#sharding:
    #clusterRole
    #configDB

#setup replica set with give replica set name
#replication:
    #replSetName

#enable authentication
#security:
    #authorization: enabled

After making changes to this file, simply restart the mongod process for the changes to take effect. See the MongoDB
Configuration File Options for more information on the available options.
Manual Database Installation
The Repository installer installs MongoDB with the bare minimum settings required for Deadline to operate. Manually
installing the Database might be preferable for some because it gives you greater control over things like authentication,
and allows you to create sharded clusters or replica sets for backup.
If you wish to install MongoDB manually, you can download MongoDB from the MongoDB Downloads Page. Once
MongoDB is running, you can then run the Repository installer, and choose to connect to an existing MongoDB
Database. Here are some helpful links for manually installing the MongoDB database:
Installing MongoDB
Enabling Authentication
Replication
Sharding
MongoDB also has a management system called MMS. It's a cloud service that makes it easy to provision, monitor,
back up, and scale your MongoDB database. Here are some helpful links for setting up and using MMS:
Getting Started
Add MongoDB Servers to MMS
Install the Automation Agent
The Automation Agent mentioned above makes it possible to set up your MongoDB database from a web interface, and
easily configure which MongoDB servers are replica sets or shards. It also allows you to easily upgrade the version of
your MongoDB database. Here are some additional links for how you can use the Automation Agent:
Deploy a Replica Set
Deploy a Sharded Cluster
Deploy a Standalone MongoDB Instance
Change the MongoDB Version


Note though that as of this writing, the Automation Agent is only available for Linux and Mac OS X.
Database Resource Limits
Linux and Mac OS X systems impose a limit on the number of resources a process can use, and these limits can
affect the number of open connections to the database. It is important to be aware of these limits, and make sure they
are set appropriately to avoid unexpected behaviour. Note that MongoDB will allocate 80% of the system limit for
connections, so if the system limit is 1024, the maximum number of connections will be 819.
If you choose a Linux system to host the database, make sure the system limits are configured properly to avoid connection issues. See MongoDB's Linux ulimit Settings documentation for more information, as well as the recommended
system limits to use.
You can check your current Linux/OSX ulimit settings in a terminal shell:
#overall ulimit settings on the machine
ulimit -a
#number of open files allowed
ulimit -n

MongoDB provides these Recommended ulimit Settings for optimal performance of your database. Note, you must
restart the Deadline Database daemon after changing these ulimit settings.
If you choose a Mac OS X system to host the database, and you use the Repository installer to install the database,
the resource limits will be set to 1024. These limits can be adjusted later by manually editing the HardResourceLimits
and SoftResourceLimits values in /Library/LaunchDaemons/org.mongodb.mongod.plist after the Repository installer
has finished.
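On Linux, one common way to raise these limits persistently is through /etc/security/limits.conf; the sketch below follows MongoDB's recommended values, and the account name is a placeholder for whichever user runs the mongod process:

# /etc/security/limits.conf entries for the account that runs mongod
mongouser soft nofile 64000
mongouser hard nofile 64000
mongouser soft nproc 64000
mongouser hard nproc 64000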

2.3.3 Open Firewall Ports


To ensure that the Deadline applications can communicate with MongoDB, you will need to update the firewall on
the machine that MongoDB is running on. You can either disable the firewall completely (assuming it operates in an
internal network), or you can open the port that you chose for the database to use during install. More information on
opening ports can be found below.
Windows
Open Windows Firewall with Advanced Security. Click on Inbound Rules in the left panel to view all inbound rules,
and then right-click on Inbound Rules and select New Rule to start the Inbound Rule Wizard. Select Port for the Rule
Type, and then click Next.


On the Protocol and Ports page, choose TCP, and then specify the port that you chose for the database during the
install, and then press Next. Then on the Action page, choose Allow The Connection and press Next.


On the Profile page, choose the networks that this rule applies to, and then press Next. Then on the Name page, specify
a name for the rule (for example, MongoDB Connection), and then press Finish.
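If you prefer the command line, an equivalent inbound rule can typically be created from an elevated prompt with netsh; the port below assumes the default 27070 used in the examples in this guide:

netsh advfirewall firewall add rule name="MongoDB Connection" dir=in action=allow protocol=TCP localport=27070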


Linux
On RedHat and CentOS, the following commands should allow incoming connections to the Mongo database if iptables are being used. Just make sure to specify the port that you chose for the database during the install.
sudo iptables -I INPUT 1 -p tcp --dport 27070 -j ACCEPT
sudo ip6tables -I INPUT 1 -p tcp --dport 27070 -j ACCEPT

Ubuntu has no firewall installed by default, and we have not yet tested Fedora Core's FirewallD.
Mac OS X
Mac OS X has its firewall disabled by default, but if enabled, it is possible to open ports for specific applications. Open
up System Preferences, choose the Security & Privacy option, and click on the Firewall tab.


Press the Firewall Options button to open the firewall options. Press the [+] button and choose the path to the mongod
application, which can be found in the database installation folder in mongo/application/bin (for example, /Applications/Thinkbox/DeadlineDatabase7/mongo/application/bin/mongod). Then click OK to save your settings.


2.3.4 Sharing The Repository Folder


In general, the Repository must have open read and write permissions for Deadline to operate properly. This section
explains how to share your Repository folder and configure its permissions to ensure the Clients have full access.
Without full read/write access, the Client applications will not be able to function properly.
Note that this guide is for applying full read/write permissions to the entire Repository folder structure. For the more
advanced user, it is possible to enforce tighter restrictions on the Repository folders. Just make sure the Clients have
full read/write access to the following folders in the Repository. The rest must have at least read access.
jobs: This is where job auxiliary files are copied to during submission.
jobsArchived: This is where archived jobs are exported to.
reports: This is where the physical log files for job and slave reports are saved to.
Windows
First, you need to configure the Repository folder permissions. Note that the images shown here are from Windows
XP, but the procedure is basically the same for any version of Windows.
On the machine where the Repository is installed, navigate to the folder where it is installed using
Windows Explorer.

Right-click on the Repository folder and select Properties from the menu.
Select the Security tab.

If there is already an Everyone item under Group or user names, you can skip the next two steps.
Click on the Add button.
In the resulting dialog, type Everyone and click OK.


Select Everyone under Group or user names.


Ensure that Modify, Read & Execute, List Folder Contents, Read, and Write are all checked under
the Allow column.
Click on the OK button to save the settings.


Second, you need to share the Repository folder. Note that the images shown here are from Windows XP, but the
procedure is basically the same for any version of Windows.
On the machine where the Repository is installed, navigate to the folder where it is installed using
Windows Explorer.
Right-click on the Repository folder and select Properties from the menu. If you're unable to see the
Sharing tab, you may need to disable Simple File Sharing in the Explorer Folder Options.


Select the Sharing tab.


Select the option to Share This Folder, then specify the share name.
Click the Permissions button.
Give Full Control to the Everyone user.
Press OK on the Permissions dialog and then the Properties dialog.
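If you prefer to script this rather than use the Properties dialogs, roughly the same result can be achieved from an elevated command prompt; the share name and local path below are placeholders:

net share DeadlineRepository="C:\DeadlineRepository7" /GRANT:Everyone,FULL
icacls "C:\DeadlineRepository7" /grant Everyone:(OI)(CI)M /T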


Linux
Since the Clients expect full read and write access to the Repository, it's recommended to use a single user account
to mount shares across all machines. It is possible to add particular users to a deadline group, but you will need to
experiment with that on your own.
So for both of the sharing mechanisms we explain below, you'll need to create a user and a group named deadline.
They don't need a login or credentials; we just need to be able to set files to be owned by them and for their account to
show up in /etc/passwd. To do this, use the useradd command.
sudo useradd -d /dev/null -c "Deadline Repository User" -M deadline

This should create a user named deadline with no home folder, and a fancy comment. The account login should also
be disabled, meaning your standard users can't ssh or ftp into your file server using this account. Set a password using
sudo passwd deadline if you need your users to log in as deadline using ftp or ssh.

Now add a group using


sudo groupadd deadline

And finally, have the Repository owned by this new user and group
sudo chown -R deadline:deadline /path/to/repository
sudo chmod -R 777 /path/to/repository

Now you're ready to set up your network sharing protocol. There are many ways this can be done, and this just
covers a few of them.
Samba Share
This is an example entry in the /etc/samba/smb.conf file:
[DeadlineRepository]
path = /path/to/repository
writeable = Yes
guest ok = Yes
create mask = 0777
force create mode = 0777
force directory mode = 0777
unix extensions = No
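On the render nodes, a Samba share like this one can then be mounted with something along these lines; the server name and mount point are placeholders:

sudo mkdir -p /mnt/DeadlineRepository
sudo mount -t cifs //fileserver/DeadlineRepository /mnt/DeadlineRepository -o guest,uid=deadline,gid=deadline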

NFS Share
The simplest thing that could possibly work. Note that this is not the most secure thing that could possibly work:
For Linux and BSD, open up /etc/exports as an administrator, and make one new export:
/path/to/repository    192.168.2.0/24(rw,all_squash,insecure)

A breakdown of this entry is as follows:


/path/to/repository: The Repository folder to share. Change the path as necessary.
192.168.2.0/24: The IP range to allow. The zero is important for these ranges. You can also go by hostname if
you have reverse DNS, or * to allow access from anyone's computer.
rw: Allow read/write for the repository, which is required for the Clients to operate properly.
all_squash: Make every single person who connects to the Repository share map to the nobody:nogroup user
and group. This relieves a lot of permissions pain for new users at the cost of zero security. Files and folders
within your repository will be fully readable and writeable by whoever is able to connect to your NFS server.
The Clients require this, but it can also be achieved by creating a group and adding individual users into that
group. Many studios will only need all_squash as Deadline will keep track of who submits what jobs.
insecure: Required for Mac OS X to mount nfs shares. It simply means that NFS doesn't need to receive
requests on a port in the secure port range (a port number less than 1024).
Once that's done, you may need to install an NFS server. To do so, open a terminal or your favourite package manager.
For Ubuntu Server, type the following:
sudo apt-get install nfs-kernel-server

Then start up the server (for those living in an init.d world):


sudo /etc/init.d/nfs-kernel-server start

Any time you change the exports file, you'll need to issue the same command, but replace start with reload.
There is an excellent tutorial here as well: https://help.ubuntu.com/community/SettingUpNFSHowTo
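On the render nodes, the export can then be mounted with something like the following; the server name and mount point are placeholders, and an /etc/fstab entry can be used to make the mount permanent:

sudo mkdir -p /mnt/DeadlineRepository
sudo mount -t nfs fileserver:/path/to/repository /mnt/DeadlineRepository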
Mac OS X
First, you need to configure the Repository folder permissions. Note that the images shown here are from Leopard
(10.5), but the procedure is basically the same for any version of Mac OS X.
On the machine where the Repository is installed, navigate to the folder where it is installed using
Finder.
Right-click on the Repository folder and select Get Info from the menu.
Expand the Sharing & Permissions section, and unlock the settings if necessary.
Give everyone Read & Write privileges.
While probably not necessary, also give admin Read & Write privileges.


If you prefer to set the permissions from the Terminal, run the following commands:
$ chown -R nobody:nogroup /path/to/repository
$ chmod -R 777 /path/to/repository

Now you can share the folder. There are many ways this can be done, and this just covers a few of them.
Using System Preferences
Note that the images shown here are from Leopard (10.5), but the procedure is basically the same for any version of
Mac OS X.
Open System Preferences, and select the Sharing option.

Make sure File Sharing is enabled, and then add the Repository folder to the list of shared folders.
Under Users, give everyone Read & Write privileges.
If sharing with Windows machines, press the Options button and make sure the Share files and
folders using SMB (Windows) option is enabled.

Samba Share
Interestingly, Mac OS X uses samba as well. Apple just does a good job of hiding it. To create a samba share in Mac
OS X, paste this at the bottom of /etc/smb.conf:
[DeadlineRepository]
path = /path/to/repository
writeable = Yes
guest ok = Yes
create mask = 0777
force create mode = 0777
force directory mode = 0777
unix extensions = No

2.3.5 Uninstallation
The Repository installer creates an uninstaller in the folder that you installed the Repository to. To uninstall the
Repository, simply run the uninstaller and confirm that you want to proceed with the uninstallation.


Note that if you installed the Database with the Repository installer, it will be uninstalled as well. If you chose to
connect to a Database that you manually installed, the Database will be unaffected.
Command Line or Silent Uninstallation
The Repository uninstaller can be run in command line mode or unattended mode on each operating system.
To run in command line mode, pass the --mode text command line option to the installer. For example, on Linux:
./uninstall --mode text

To run in silent mode, pass the --mode unattended command line option to the installer. For example, on Windows:
uninstall.exe --mode unattended

To get a list of all available command line options, pass the --help command line option to the installer. For example,
on Mac OS X:
./uninstall --help

2.4 Client Installation


2.4.1 Overview
Before proceeding with this installation, it is highly recommended to read through the Render Farm Considerations
documentation.
This guide will walk you through the installation of the Client. At this point, you should already have the Database and
Repository installed. If you do not, please see the Database and Repository Installation documentation for installation
instructions.
The Client consists of the following applications:
Launcher: Acts as a launch point for the Deadline applications on workstations, and facilitates remote communication on render nodes.
Monitor: An all-in-one application that artists can use to monitor their jobs and administrators can use to monitor
the farm.
Slave: Controls the rendering applications on the render nodes.
Command: A command line tool that can submit jobs to the farm and query for information about the farm.

Pulse: An optional mini server application that performs maintenance operations on the farm, and manages
more advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering,
and the Web Service. If you choose to run Pulse, it only needs to be running on one machine.
Balancer: An optional Cloud-controller application that can create and terminate Cloud instances based on
things like available jobs and budget settings. If you choose to run Balancer, it only needs to be running on one
machine.
Note that the Slaves and the Balancer applications are the only Client applications that require a license.

2.4.2 Installing The Clients


The Client should be installed on your render nodes, workstations, and any other machines you wish to participate in
submitting, rendering, or monitoring jobs. The Slaves and the Balancer applications are the only Client applications
that require a license. Before you can configure the license for the Client, the license server must be running. See the
Licensing documentation for more information.
If you choose to run Pulse, you need to install the Client on the chosen machine. Note that if you wish to run it on the
same machine as the Database and/or Repository, you still have to install the Client on that machine.
There are Client installers for Windows, Linux, and Mac OS X. To install the Client, simply run the appropriate
installer for your operating system and follow the steps.
Windows
Start the installation process by double-clicking on the Windows Client Installer. The Windows Client installer also
supports silent installations with additional options.


Choose an installation location and press Next to continue.


Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:
Repository Directory: This is the shared path to the Repository. Note that if you are unable to browse to your
Repository shared path via your drive mapping in the install wizard, this is more than likely due to a
problem with Windows UAC elevation. Essentially, even if the currently logged in user has the network drive
configured, that configuration is not available in the elevated scope, as you are technically another user here. This
is handled by the OS, so we cannot do anything on our side. However, possible workarounds
are to simply select the UNC path that the drive is mapped to, OR log on to the system as a user account
with elevated permissions (a local administrator, for example) and then run the Client install wizard.
License Server: The license server entry should be in the format @SERVER, where SERVER is the host name
or IP address of the machine that the license server is running on. If you configured your license server to use a
specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you
are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave
this blank for now.
The following Launcher settings are available:
Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher starts.
Install Launcher As A Service: Enable this if you wish to install the Launcher as a service. The service
must run under an account that has network access. See the Windows Service documentation below for more
information.
After configuring the Client and Launcher settings, press Next to continue with the installation.
Linux
Note that on Linux, the Deadline applications have dependencies on some libraries that are installed with the lsb
(Linux Standard Base) package. To ensure you have all the dependencies you need, we recommend installing the full
lsb package. In addition, the libX11 and libXext libraries must be installed on Linux for the Deadline applications to run,
even if running them with the -nogui flag. They're required for the Idle Detection feature, among other things. To check if
libX11 and libXext are installed, open a Terminal and run the following commands. If they are installed, then the path
to the libraries will be printed out by these commands.
ldconfig -p | grep libX11
ldconfig -p | grep libXext

If any of these libraries are missing, then please contact your local system administrator to resolve this issue. Here is
an example assuming you have root access, using YUM to install them on your system:
sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext

Start the installation process by double-clicking on the Linux Client Installer. The Linux Client installer also supports
silent installations with additional options.


Choose an installation location and press Next to continue.


Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:
Repository Directory: This is the shared path to the Repository.
License Server: The license server entry should be in the format @SERVER, where SERVER is the host name
or IP address of the machine that the license server is running on. If you configured your license server to use a
specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you
are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave
this blank for now.
The following Launcher settings are available:
Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher starts.
Install Launcher As A Daemon: Enable this if you wish to install the Launcher as a daemon. You can also
choose to run the daemon as a specific user. If you leave the user blank, it will run as root instead. See the Linux
Daemon documentation below for more information.
After configuring the Client and Launcher settings, press Next to continue with the installation.
Mac OSX
Start the installation process by double-clicking on the Mac Client Installer. The Mac Client installer also supports
silent installations with additional options.


Choose an installation location and press Next to continue.


Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:
Repository Directory: This is the shared path to the Repository. Deadline isn't able to understand paths starting
with afp:// or smb://, so point the installer to the Repository path mounted under /Volumes.
License Server: The license server entry should be in the format @SERVER, where SERVER is the host name
or IP address of the machine that the license server is running on. If you configured your license server to use a
specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you
are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave
this blank for now.
The following Launcher settings are available:
Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher starts.
Install Launcher As A Daemon: Enable this if you wish to install the Launcher as a daemon. You can also
choose to run the daemon as a specific user. If you leave the user blank, it will run as root instead. See the Mac
OSX Daemon documentation below for more information.
After configuring the Client and Launcher settings, press Next to continue with the installation.

2.4.3 Command Line or Silent Installation


The Client installer can be run in command line mode or unattended mode on each operating system. Note though that
on OSX, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is inside the
Mac Client Installer package.


To run in command line mode, pass the --mode text command line option to the installer. For example, on Linux:
./DeadlineClient-X.X.X.X-linux-x64-installer.run --mode text

To run in silent mode, pass the --mode unattended command line option to the installer. For example, on Windows:
DeadlineClient-X.X.X.X-windows-installer.exe --mode unattended

To get a list of all available command line options, pass the --help command line option to the installer. For example,
on OSX:
/DeadlineClient-X.X.X.X-osx-installer.app/Contents/MacOS/installbuilder.sh --help

Note that there are quite a few Client installer options that are only available from the command line, which you can
view when running the --help command. These options include:
--configport: The port that the Client uses for Auto Configuration.
--slavestartupport: The port that the Slaves use to ensure that only one slave is initializing at a time.
--slavedatadir: The local path where the Slave temporarily stores plugin and job data from the Repository during
rendering (if not specified, the default location is used).
--noguimode: If enabled, the Launcher, Slave, and Pulse will run without a user interface on this machine.
--killprocesses: If enabled, the installer will kill any running Deadline processes before proceeding with the
installation (Windows only).
--launcherport: The Launcher uses this port for Remote Administration, and it should be the same on all Client
machines.
--launcherstartup: If enabled, the Launcher will automatically launch when the system logs in (non-service
mode on Windows only).
--restartstalled: If enabled, the Launcher will try to restart the Slave application on this machine if it stalls.
--autoupdateoverride: Overrides the Auto Update setting for this client installation (leave blank to use the value
specified in the Repository Options).
--launcherservicedelay: If the Launcher is running as a service or daemon, this is the number of seconds it waits
after starting up before launching other Deadline applications.
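As a rough sketch, a few of these options can be combined with an unattended install; the exact value syntax for boolean options should be confirmed with the --help output:

./DeadlineClient-X.X.X.X-linux-x64-installer.run --mode unattended --noguimode true --restartstalled true --launcherservicedelay 60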

2.4.4 Installing as a Service or Daemon


On Windows and Linux, you can choose to install the Launcher as a service or daemon during installation. There are
a few things to keep in mind when running Deadline in this mode.
Windows Service
When running as a service on Windows, the Launcher will run without displaying its system tray icon. If the Slave or
Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface.
Finally, the Launcher can still perform an auto-upgrade, but only when launching the Slave and Pulse applications
(launching the Monitor, for example, will not invoke an upgrade).
Note that when running the Launcher as a service, the Slave or Pulse application will also run in a service context.
Since services run in a different environment, and potentially under a different user profile than the one currently
logged in, certain considerations need to be made.

First, the default user for a service has no access to network resources, so while the Launcher service will run without any
issues, neither the Slave nor Pulse applications will be able to access the Repository. To avoid network access issues,
you must configure the service to run as a user with network privileges. Typical desktop users have this permission,
but check with your system administrator to find which account is best for this application.
Another issue presented by the service context is that there is no access to the default set of mapped drives. Applications will either need to map drives for themselves, or make use of UNC paths. While Deadline supports Automatic
Drive Mapping, the SMB protocol does not allow sharing a resource between two users on the same machine. This
means that mapping of drives or accessing a resource with different credentials may fail when running as a service on
a machine which already requires access to the Repository.
There is also an issue with hardware-based renderers. Starting with Windows Vista, services now run in a virtualized
environment which prevents them from accessing hardware resources. Because the renderer will run in the context of
a service, hardware-based renderers will typically fail to work.
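If you do need the Slave to run with network access from the service context, one option is to change the log-on account of the Launcher service after installation using the standard Windows sc.exe tool or the Services control panel. The following is a sketch only; the service name and credentials shown are assumptions, so use the name the installer actually registered:

sc config deadlinelauncherservice obj= "MYDOMAIN\renderuser" password= "MyPassword"
sc stop deadlinelauncherservice
sc start deadlinelauncherservice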
Linux Daemon
When installing the daemon, the Client installer creates the appropriate deadlinelauncherservice script in /etc/init.d.
When running as a daemon on Linux, the Launcher will run without displaying its system tray icon. If the Slave or
Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface.
This is useful when running Deadline on a Linux machine that doesn't have a desktop environment.
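For example, on a SysV-style init system the daemon can typically be controlled like any other init.d script (run with root privileges):

sudo /etc/init.d/deadlinelauncherservice start
sudo /etc/init.d/deadlinelauncherservice stop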
Mac OSX Daemon
When installing the daemon, the Client installer creates the appropriate com.thinkboxsoftware.deadlinelauncher.plist
file in /Library/LaunchDaemons.
When running as a daemon on Mac OSX, the Launcher will run without displaying its system tray icon. If the Slave
or Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface.
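For example, the daemon can be loaded or unloaded manually with the standard launchctl tool (run with root privileges):

sudo launchctl load /Library/LaunchDaemons/com.thinkboxsoftware.deadlinelauncher.plist
sudo launchctl unload /Library/LaunchDaemons/com.thinkboxsoftware.deadlinelauncher.plist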

2.4.5 Client License Configuration


Before you can configure the license for the Client, the license server must be running. See the Licensing documentation for more information.
If you didn't configure the license for the Client during installation (see above), there are a couple of ways to set the
license for the Client. The quickest way is to use the right-click menu in the Launcher or the File menu in the Slave
application to change the license server.

The other option is to set up Auto Configuration so that the Client automatically pulls the license server information.

2.4.6 Uninstallation
The Client installer creates an uninstaller in the folder that you installed the Client to. To uninstall the Client, simply
run the uninstaller and confirm that you want to proceed with the uninstallation.

Command Line or Silent Uninstallation


The Client uninstaller can be run in command line mode or unattended mode on each operating system.
To run in command line mode, pass the mode text command line option to the installer. For example, on Linux:
./uninstall --mode text

To run in silent mode, pass the mode unattended command line option to the installer. For example, on Windows:
uninstall.exe --mode unattended

To get a list of all available command line options, pass the help command line option to the installer. For example,
on Mac OS X:
./uninstall --help

2.5 Submitter Installation


2.5.1 Overview
This guide will walk you through the installation of the integrated submitters, which can be used to submit jobs from
within your application (3ds Max, Maya, Nuke, etc). These should be installed on any machines you wish to submit
jobs from. Note that jobs can also be submitted from the Submit menu in the Monitor. See the Submitting Jobs
documentation for more information.
At this point, you should already have the Database and Repository installed, and the Client software installed. If
you do not, please see the Database and Repository Installation and Client Installation documentation for installation
instructions. You also need to have the software that you will be submitting from installed as well (3ds Max, Maya,
Nuke, etc).

2.5.2 Installing The Submitters


The submitter installers can be found in the submission folder in the Deadline Repository. Open the folder for the
application you want to install the submitter for (3dsmax, Maya, Nuke, etc), and then open the Installers folder. There
will be an installer for each operating system that the current application runs on.
Simply run the appropriate installer and follow the steps below. Note that these steps are similar for each application and each operating system.

The Deadline Client Bin Directory page shows what DEADLINE_PATH is currently set to. This value is originally
set by the Client installer, and is used by the submission scripts to find the Client's bin directory so that it can find the
Repository and submit jobs. You can change the DEADLINE_PATH value here if it's incorrect or if it doesn't exist,
and the submitter installer will give you the option to make the change permanent.
The next page will show the Repository directory that the Client is currently connected to, which is where the submission scripts are installed from. If this path is incorrect, you can change it here.

Select the components you wish to install (the installer will try to auto select the versions it detects), and then verify
the install location for each one.
After configuring these, press Next to continue with the installation.

2.5.3 Silent Installation


The Submitter installers can be run in command line mode or unattended mode on each operating system. Note though
that on OSX, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is inside
the Mac Submitter Installer package.
To run in command line mode, pass the mode text command line option to the installer. For example, on Linux:
./Nuke-submitter-linux-installer.run --mode text

To run in silent mode, pass the mode unattended command line option to the installer. For example, on Windows:
Maya-submitter-windows-installer.exe --mode unattended

To get a list of all available command line options, pass the help command line option to the installer. For example,
on OSX:
./Maya-submitter-osx-installer.app/Contents/MacOS/installbuilder.sh --help

Note that there are quite a few Submitter installer options that are only available from the command line, which you
can view when running the help command. These options include:
enable-components: Select the components which you would like to enable (programs installed in default
locations will be auto-selected).
disable-components: Select the components which you would like to disable (programs installed in default
locations will be auto-selected).
destDir###: The destination directory for each component, where ### identifies the component (the default
location is used for programs installed in their default locations).
An example batch script that puts these all together:
@echo off
.\Maya-submitter-windows-installer.exe --mode unattended --disable-components Maya2014
.\3dsMax-submitter-windows-installer.exe --mode unattended
--enable-components 3dsMax2011,3dsMax2015
--disable-components 3dsMax2012,3dsMax2013,3dsMax2014
--destDir2011 "C:\3dsMax2011_64"
.\Nuke-submitter-windows-installer.exe --mode unattended

This script installs the submitters for Maya (ignoring Maya 2014), 3ds Max (2011 and 2015 only, with 2011 in an
unusual directory), and Nuke (default settings).

2.5.4 Change the DEADLINE_PATH Value


The DEADLINE_PATH value is a system setting that the Integrated Submission scripts use to determine where the
Deadline Client is installed to, and what the Repository path is. This value is set by the Client installer, and if you've
installed more than one version of Deadline on your machine, it's possible that this value could be incorrect.
You can use a Submitter installer to change the DEADLINE_PATH value without installing anything by following
these steps:
Run any submitter installer and set the DEADLINE_PATH value on the Deadline Client Bin Directory page.
Skip past the Repository Directory page.

Uncheck all options on the Components page.


Click Next on the Ready to Install page.
The installer will then update the DEADLINE_PATH variable without actually installing anything.
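If you only want to check what DEADLINE_PATH is currently set to before deciding whether to change it, you can inspect the environment variable directly (assuming it is exposed in your shell; a new command prompt or terminal only picks up changes after it is restarted):

echo %DEADLINE_PATH%     (Windows)
echo $DEADLINE_PATH      (Linux or Mac OS X)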

2.6 Upgrading or Downgrading Deadline


2.6.1 Overview
This will guide you through the process of upgrading or downgrading an existing Deadline installation.

2.6.2 Major Upgrades or Downgrades


If upgrading to a new major version (for example, Deadline 6 to 7), or downgrading from a new major version (for
example, Deadline 7 to 6), you will need to install a new Repository and Database, and you will need to reinstall the
Client software. This is necessary because there are often breaking changes between major releases. Do not install
over an existing installation unless it's the same major version, or there could be unexpected results.
Note that Deadline 7 requires a newer version of the MongoDB database application. However, this newer
version is backward compatible with Deadline 6. So if you are installing the MongoDB database application
to a machine that already has a Deadline 6 database installed, you can just install it over top of the existing
Deadline 6 database installation.
You should also reinstall your integrated submission scripts on your workstations, since it's possible these were
changed between major releases. See the Application Plug-ins documentation for more information on how to set
up the integrated submission scripts (where applicable).
The license server should also be upgraded to ensure it will work with newer releases in case there are incompatibilities
with the previous version of the license server.
Please refer to the following documentation for more information:
Database and Repository Installation Guide
Client Installation Guide
Licensing Guide

2.6.3 Minor Upgrades or Downgrades


If upgrading or downgrading to a minor version that is part of the same major release cycle (for example, Deadline 7.0
to 7.0.1, or Deadline 7.0 to 7.1), you can simply install over the existing installation. If you have Automatic Upgrades
/ Downgrades enabled, you can have the Clients automatically upgrade or downgrade themselves after upgrading or
downgrading the Database and Repository. Automatic Upgrades / Downgrades can be enabled in the Client Setup
section of the Repository Configuration.
You can also enable Remote Administration in the Client Setup section of the Repository Configuration. This will
make it easier to upgrade or downgrade your render nodes remotely.
Note that this upgrade/downgrade method is only supported when upgrading or downgrading an existing Repository
installation. For example, it is NOT recommended to install the Deadline 7.1 Repository to a new location and then
have your 7.0 Clients upgrade by pointing them to the new Repository path. Instead, you should first move your
Repository installation and then do the upgrade once your 7.0 Clients are connected to the new Repository.

Important Notice When Upgrading From 7.0 to 7.1: Due to a change in the Slave Scheduling settings in the
database, you should avoid editing the Slave Scheduling settings from a machine running version 7.1 until all machines
have upgraded to 7.1. Otherwise, you will get the following error when the Launcher tries to auto-upgrade. If this
happens, the workaround is to delete all Slave Scheduling groups in the Slave Scheduling settings, and then recreate
them once all machines have upgraded to 7.1.
An error occurred while deserializing the SlaveSchedulingGroups property of class
Deadline.Configuration.DeadlineNetworkSettings: Element 'AllSlaves' does not match
any field or property of class Deadline.Slaves.SlaveSchedulingGroup.
(System.IO.FileFormatException)

Upgrading or Downgrading the Database and Repository


Launch the new Repository installer, and choose the existing Repository folder for the Installation Directory. Then
choose the option to connect to an existing MongoDB database, and use the same Database Settings you used when
installing the previous version (they should be pre-populated for you).
During the installation, all binaries, plug-ins, and scripts from the previous version will be backed up. You can find
them in the backup folder in the Repository after the installation is complete. Note that any scripts or plugins in the
custom folder will not be affected when upgrading the Repository.
After upgrading or downgrading the Database and Repository, you can then upgrade or downgrade the Clients.
Upgrading or Downgrading Pulse and Balancer
Before upgrading or downgrading all of your client machines, you should first upgrade or downgrade Pulse and the
Balancer (if you're running either of them). If you don't have Automatic Upgrades / Downgrades enabled, you will
have to upgrade or downgrade Pulse and the Balancer manually, which involves running the Client Installer on the
Pulse machine. See the Client Installation Guide for more information.
If you have Automatic Upgrades / Downgrades enabled, all you have to do is restart the Pulse or Balancer application
from the Monitor, providing that Remote Administration is enabled. The Client will notice that the Repository has
been upgraded or downgraded, and will automatically upgrade or downgrade itself.
To restart Pulse remotely, select Pulse in the Pulse List in the Monitor while in Super User mode, then right
click and select Remote Control -> Restart Pulse.
To restart the Balancer remotely, select the Balancer in the Balancer List in the Monitor while in Super User
mode, then right click and select Remote Control -> Restart Balancer.
See the Remote Control documentation for more information about the remote commands that are available.
Upgrading or Downgrading the Clients
If you don't have Automatic Upgrades / Downgrades enabled, you will have to upgrade or downgrade the Clients
manually, which involves running the Client Installers on the machines. See the Client Installation Guide for more
information.
If you have Automatic Upgrades / Downgrades enabled, all you have to do is restart the Slave application on each
render node through the Launcher. The Client will notice that the Repository has been upgraded or downgraded,
and will automatically upgrade or downgrade itself. In addition, the next time artists launch the Monitor on their
workstations through the Launcher, their installation will also be upgraded or downgraded.
To restart the Slaves remotely, Remote Administration must be enabled. Select the Slaves you want to upgrade or
downgrade in the Monitor while in Super User mode, then right click and select Remote Control -> Restart Slaves.

If the slaves are currently rendering and you don't want to disrupt them, you can choose the option to Restart Slaves
After Current Task instead. This option will allow the Slaves to upgrade or downgrade after they finish rendering
their current task to prevent the loss of any render time. See the Remote Control documentation for more information.
After restarting the Slaves, several Slaves may appear offline, or a message may pop up saying that certain Slaves did
not respond. This may occur because all the Slaves are trying to upgrade or downgrade at once. Wait a little bit and
eventually all the Slaves should come back online.

2.7 Relocating the Database or Repository


2.7.1 Overview
There may come a time where you have to move the Database or Repository (or both) to another location or another
machine. This guide will walk you through the steps required.

2.7.2 Migrating the Database


These are the steps to move your Database to a new location:
1. Shut down all the Slave applications running on your render nodes. You don't want them making changes during
the move.
2. Stop the mongod process on the Database machine.
3. Copy the Database folder from the original location to the new one.
4. Update the config.conf file in the data folder to point to the new system log folder and storage folder locations.
5. Start the mongod process on the Database machine.
6. Modify the dbConnect.XML file in the settings folder in the Repository to set the new database host name or IP
address (if you moved it to another machine).
7. Start up the Slaves and ensure that they can connect to the new Database.
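On a Linux database host, steps 2, 3, and 5 might look roughly like the following. This is a sketch only; the service name, folder paths, and mongod invocation are assumptions that depend on how MongoDB was installed on your machine:

sudo systemctl stop mongod
sudo cp -a /opt/DeadlineDatabase7 /data/DeadlineDatabase7
sudo mongod --config /data/DeadlineDatabase7/data/config.conf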
Here is an example of how you would update the config.conf file if the new database location is
C:\NEW_DATABASE_FOLDER:
systemLog:
  destination: file
  path: C:/NEW_DATABASE_FOLDER/data/logs/log.txt
  quiet: true
storage:
  dbPath: C:/NEW_DATABASE_FOLDER/data

Because the Clients use the dbConnect.xml file in the Repository to determine the database connection settings, you
don't have to reconfigure the Clients to find the new database.

2.7.3 Migrating the Repository


These are the steps to move your Repository to a new location:
1. Ensure that the share for the new location already exists. Also ensure that the proper permissions have been set.

2. Shut down all the Slave applications running on your render nodes. You don't want them making changes during
the move.
3. Copy the Repository folder from the original location to the new location.
4. Redirect all your Client machines to point to the new Repository location.
5. Start up the Slaves and ensure that they can connect to the new Repository location.
6. Delete the original Repository (optional).
As an alternative to step (4), you can configure your share name (if the new Repository is on the same machine) or
your DNS settings (if the new Repository is on a different machine) so that the new Repository location has the same
path as the original. This saves you the hassle of having to reconfigure all of your Client machines.
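For example, on Linux the copy in step (3) can be done with rsync so that permissions and timestamps are preserved (the mount points below are assumptions):

rsync -a /mnt/old_server/DeadlineRepository7/ /mnt/new_server/DeadlineRepository7/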

2.8 Importing Repository Settings


After installing a new Repository, you can import settings from a previous Repository into the new one. To do this,
open the Monitor and ensure that you're connected to the new Repository (the title bar for the Monitor window will
show the Repository that you're connected to). Then enter Super User Mode from the Tools menu, and select Tools ->
Import Settings to bring up the Import Repository Settings window.

Specify the path to the old Repository that you want to import the settings from, and then choose which settings you
want to import and press the Import Settings button. Note that all passwords in Repository Options (Super User, SMTP,
Mapped Drives) and Users (Web Service, Windows Login) will not be transferred, so these must be set manually after
the transfer is complete.
Also note that this feature only allows you to import settings from Deadline 6 or later. An unsupported Python
script, DeadlineV5Migration.py, attempts to migrate Deadline v5.x customers over to Deadline v6.x. It can be found
together with other useful example scripts on our Github site. Please note the disclaimer before executing this script
in your Deadline queue.

CHAPTER

THREE

GETTING STARTED

3.1 Application Configuration


3.1.1 Overview
Deadline needs to know the executable file path to your installed application before being able to process jobs across
your network. For many applications (which ship themselves with a default install path), the binary executable file
and its path for each operating system and version is already included in the Configure Plugins... dialog for each
application, which can be accessed via Deadline Monitor > Tools > Super User Mode > Configure Plugins....
Below are example default application paths for the MayaBatch and Nuke plugins.

3.1.2 Multiple Application Paths


Looking at the MayaBatch plugin configuration section as an example, there are multiple render executable paths
defined. When a Deadline Slave dequeues (starts) a MayaBatch job, amongst many other functions in the ../<DeadlineRepository>/plugins/MayaBatch/MayaBatch.py plugin file, the slave attempts to retrieve the first Application
Path that exists on the machine, which is configured for the exact version of MayaBatch to be used, the bitness
build (None, 32bit, 64bit), and all operating systems. In the MayaBatch example below we also have
separate versions for Maya (2016) and Maya's Extension (2016_5), which are defined in the ../<DeadlineRepository>/plugins/MayaBatch/MayaBatch.dlinit configuration file as a semicolon separated list:

RenderExecutable2016_0=C:\\Program Files\\Autodesk\\Maya2016\\bin\\MayaBatch.exe;/usr/autodesk/maya2
RenderExecutable2016_5=C:\\Program Files\\Autodesk\\Maya2016.5\\bin\\MayaBatch.exe;/usr/autodesk/may

The MayaBatch.dlinit file is automatically written to as you commit UI changes in the Plugin Configuration dialog
in Monitor. There is no need to manually edit these text files although this is possible. The ../<DeadlineRepository>/plugins/MayaBatch/MayaBatch.param file is an optional file that is used by the Plugin Configuration dialog in
the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying custom settings in
the MayaBatch.dlinit file.

Typically, there are 3 functions in our scripting API which help identify the correct application executable to
return as the Render Executable, depending on which Build option is selected in your in-app or Monitor
submission UI (see above for an example) - None (default), 32bit or 64bit. These functions check the actual
bitness of the application binary executable to ensure a 32bit or 64bit application is used if applicable:
FileUtils.SearchFileList( string fileList ): Searches a semicolon separated list of files (fileList) for the first
one that exists. For relative file paths in the list, the current directory and the PATH environment variable
will be searched. Returns the first file that exists, or an empty string if no file is found.
FileUtils.SearchFileListFor32Bit( string fileList ): Searches a semicolon separated list of files (fileList) for
the first 32bit file that exists. For relative file paths in the list, the current directory and the PATH environment variable will be searched. Returns the first 32bit file that exists, or an empty string if no file is found.
FileUtils.SearchFileListFor64Bit( string fileList ): Searches a semicolon separated list of files (fileList) for
the first 64bit file that exists. For relative file paths in the list, the current directory and the PATH environment variable will be searched. Returns the first 64bit file that exists, or an empty string if no file is found.

3.1.3 Network Installed Applications


If the application in question supports running across a network, then you can add network application install paths
to the Plugin Configuration dialog in Monitor as well. Access permissions should be configured so that the user
account(s) that the Deadline Slave runs under have the correct access. Alternatively, you may wish to create desktop
shortcuts/symlinks instead and configure these paths in the Plugin Configuration dialog. Although beyond the scope of this
documentation, please note that while many Windows based applications can be installed to a network location,
they still require the presence of many C++/C#/.NET redistributable packages to be installed.
Application Wrapper Scripts
Typically, Linux based VFX studios use a bash/python wrapper script which is called to startup an application. This
allows the studio to execute other commands, configure environment variables accordingly before the actual launching
of an application such as Maya. As the bash/python script file is not a binary executable directly, our 2 x functions
which check the actual bitness of your script file will cause a failure, which can be skipped by simply ensuring None
as the build option is used. This can be better explained by showing a working example if we inspect the actual Python
code in the MayaBatch plugin:

## Called by Deadline to get the render executable.
def RenderExecutable( self ):
    versionString = str( self.Version ).replace( ".", "_" )
    mayaExecutable = ""
    mayaExeList = self.deadlinePlugin.GetConfigEntry( "RenderExecutable" + versionString )

    if( self.Build == "32bit" ):
        self.deadlinePlugin.LogInfo( "Enforcing 32 bit build of Maya" )
        if( SystemUtils.IsRunningOnWindows() ):
            mayaExecutable = FileUtils.SearchFileListFor32Bit( mayaExeList )
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "32 bit Maya " + versionString + " render executable was not found" )
        else:
            # Need to check bitness of Render because maya is just a shell script.
            mayaExeList = mayaExeList.replace( "\\", "/" )
            for executable in mayaExeList.split( ";" ):
                tempExecutable = PathUtils.ChangeFilename( executable, "Render" )
                tempExecutable = FileUtils.SearchFileListFor32Bit( tempExecutable )
                if tempExecutable != "":
                    mayaExecutable = executable
                    break
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "32 bit Maya " + versionString + " render executable was not found" )

    elif( self.Build == "64bit" ):
        self.deadlinePlugin.LogInfo( "Enforcing 64 bit build of Maya" )
        if( SystemUtils.IsRunningOnWindows() ):
            mayaExecutable = FileUtils.SearchFileListFor64Bit( mayaExeList )
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "64 bit Maya " + versionString + " render executable was not found" )
        else:
            # Need to check bitness of Render because maya is just a shell script.
            mayaExeList = mayaExeList.replace( "\\", "/" )
            for executable in mayaExeList.split( ";" ):
                tempExecutable = PathUtils.ChangeFilename( executable, "Render" )
                tempExecutable = FileUtils.SearchFileListFor64Bit( tempExecutable )
                if tempExecutable != "":
                    mayaExecutable = executable
                    break
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "64 bit Maya " + versionString + " render executable was not found" )

    else:
        self.deadlinePlugin.LogInfo( "Not enforcing a build of Maya" )
        mayaExecutable = FileUtils.SearchFileList( mayaExeList )
        if( mayaExecutable == "" ):
            self.deadlinePlugin.FailRender( "Maya " + versionString + " render executable was not found" )

    return mayaExecutable
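For reference, a wrapper script of the kind described in the Application Wrapper Scripts section above might look like the following. This is a hypothetical example; the environment variables and install path are assumptions, and because it is a shell script rather than a binary, the Build option should be left at None when a plugin is pointed at a path like this:

#!/bin/bash
# Hypothetical Maya wrapper: set up the studio environment, then launch the real binary.
export MAYA_MODULE_PATH=/studio/tools/maya/modules:$MAYA_MODULE_PATH
export MAYA_SCRIPT_PATH=/studio/tools/maya/scripts:$MAYA_SCRIPT_PATH
exec /usr/autodesk/maya2016/bin/maya "$@"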

3.2 Submitting Jobs


3.2.1 Overview
The easiest and most common way to submit render jobs to Deadline is via our many submission scripts, which are
written for each rendering application that Deadline supports. After you have submitted your job, you can monitor its progress
using the Monitor. See the Monitoring Jobs documentation for more information.
If you would like more control over the submission process, or would like to submit arbitrary command line jobs to
Deadline, see the Manual Job Submission documentation for more information.

3.2.2 Integrated Submission Scripts


Where possible, we have created integrated submission scripts that allow you to submit jobs directly from the application you're working with. These scripts are convenient because you don't have to launch a separate application to
submit the job. In addition, these scripts often provide more submission options because they have direct access to the
scene or project file you are submitting.
See the Plug-ins documentation for more information on how to set up the integrated submission scripts (where applicable) and submit jobs for specific applications.

3.2.3 Monitor Submission Scripts


In cases where an application doesn't have an integrated submission script, you can submit the jobs from the Submit
menu in the Monitor. Note that applications that have integrated submission scripts also have Monitor scripts here,
but in most cases there are fewer options to choose from. This is because the integrated submission scripts use the
application's native scripting language to pull additional information from the file being submitted. See the Plug-ins
documentation for more information on how to submit jobs for specific applications.
You can also create your own submission scripts for the Monitor. Check out the Monitor Scripting documentation for
more details.

3.2.4 Common Job Submission Options


There are many job options that can be specified on submission. A lot of these options are general job properties
that aren't specific to the application you're rendering with. Some of these options are described below. There are
also many other options that are specific to the application that you're rendering with. These are covered in each
application's plug-in guide, which can be found in the Plug-ins documentation.
Job Name
The name of your job. This is optional, and if left blank, it will default to Untitled.
Comment
A simple description of your job. This is optional and can be left blank.
Department
The department you belong to. This is optional and can be left blank.
Pool and Group

The pool and group that the job belongs to. See the Job Scheduling documentation for more information
on how these options affect job scheduling.
Priority
A job can have a numeric priority ranging from 0 to 100, where 0 is the lowest priority and 100 is the
highest priority. See the Job Scheduling documentation for more information on how this option affects
job scheduling.
Task Timeout and Auto Task Timeout
The number of minutes a slave has to render a task for this job before an error is reported and the task is
requeued. Specify 0 for no limit. If the Auto Task Timeout is properly configured in the Repository Options, then enabling the Auto Task Timeout option will allow a task timeout to be automatically calculated
based on the render times of previous frames for the job.
Concurrent Tasks and Limiting Tasks To A Slave's Task Limit
The number of tasks that can render concurrently on a single slave. This is useful if the rendering application only uses one thread to render and your slaves have multiple CPUs. Caution should be used
with this feature if your renders require a large amount of RAM.
If you limit the tasks to a slave's task limit, then by default, the slave won't dequeue more tasks than it has
CPUs. This task limit can be overridden for individual slaves by an administrator. See the Slave Settings
documentation for more information.
Machine Limit and Machine Whitelists/Blacklists
Use the Machine Limit to specify the maximum number of slaves that can render your job at one time.
Specify 0 for no limit. You can also force the job to render on specific slaves by using a whitelist, or you
can avoid specific slaves by using a blacklist. See the Limit Documentation for more information.
Limits
The limits that your job must adhere to. See the Limit Documentation for more information.
Dependencies
Specify existing jobs that this job will be dependent on. This job will not start until the specified dependencies finish rendering.
On Job Complete
If desired, you can automatically archive or delete the job when it completes.
Submit Job As Suspended
If enabled, the job will submit in the suspended state. This is useful if you don't want the job to start
rendering right away. Just resume it from the Monitor when you want it to render.
Scene/Project/Data File (if applicable)
The file path to the Scene/Project/Data File to be processed/rendered as the job. The file needs to be in
a shared location so that the slave machines can find it when they go to render it directly. See Submit
Scene/Project File With Job below for a further option. Note, all external asset/file paths referenced by
the Scene/Project/Data File should be resolvable by your slave machines on your network.
Frame List
The list of frames to render. See the Frame List Formatting Options below for valid frame lists.
Frames Per Task
Also known as Chunk Size. This is the number of frames that will be rendered at a time for each job task.
Increasing the Frames Per Task can help alleviate some of the inherent overhead that comes with network
rendering, but if your frames take longer than a couple of minutes to render, it is recommended that you
leave the Frames Per Task at 1.
Submit Scene/Project File With Job
If this option is enabled, the scene or project file you want to render will be submitted with the job, and
then copied locally to the slave machine during rendering. The benefit to this is that you have a copy of
the file in the state that it was in when it was submitted. However, if your scene or project file uses relative
asset paths, enabling this option can cause the render to fail when the asset paths can't be resolved.
Note, only the Scene/Project File is submitted with the job and ALL external/asset files referenced by the
Scene/Project File are still required by the slave machines.
If this option is disabled, the file needs to be in a shared location so that the slave machines can find
it when they go to render it directly. Leaving this option disabled is required if the file has references
(footage, textures, caches, etc) that exist in a relative location. Note though that if you modify the original
file, it will affect the render job.

3.2.5 Draft and Integration Submission Options


The majority of the submission scripts that ship with Deadline have Integration options to connect to Shotgun and
ftrack, and/or use Draft to perform post-rendering compositing operations. The Integration and Draft job options are
essentially the same in every submission script, and more information can be found in their respective documentation:
Draft Documentation
Shotgun Documentation
ftrack Documentation

3.2.6 Jigsaw
Jigsaw is a flexible multi-region rendering system for Deadline, and is available for 3ds Max, Maya, modo, and Rhino.
It can be used to render regions of various sizes for a single frame, and in 3ds Max and Maya, it can be used to track
and render specific objects over an animation.
Draft can then be used to automatically assemble the regions into the final frame or frames. It can also be used to
automatically composite re-rendered regions onto the original frame.
Jigsaw is built into the 3ds Max, Maya, modo, and Rhino submitters, and with the exception of 3ds Max, the Jigsaw
viewport will be displayed in a separate window.

The viewport can be used to create and manipulate regions, which will then be submitted to Deadline to render. The
available options are listed below.
General Options
These options are always available:
Add Region: Adds a new region.
Delete All: Deletes all the current regions.
Create From Grid: Creates a grid of regions to cover the full viewport. The X value controls the number of
columns and the Y value controls the number of rows.
Fill Regions: Automatically creates new regions to fill the parts of the viewport that are not currently covered
by a region.
Clean Regions: Deletes any regions that are fully contained within another region.
Undo: Undo the last change made to the regions.
Redo: Redo the last change that was previously undone.
Selected Regions Options
These options are only available when one or more regions are selected.
Delete: Deletes the selected regions.
Split: Splits the selected regions into sub-regions based on the Tiles In X and Tiles In Y settings.
These options are only available when a single region is selected:
Clone: Creates a duplicate region parallel to the selected region in the specified direction.
Lock Position: If enabled, the region will be locked to its current position.
Enable Region: If disabled, the region will be ignored when submitting the job.
X Position: The horizontal position of the selected region, taken from the left.

Y Position: The vertical position of the selected region, taken from the top.
Width: The width of the selected region.
Height: The height of the selected region.
These options are only available when multiple regions are selected.
Merge: Combines the selected regions into a single region that covers the full area of the selected regions.
Zoom Options
These zoom options are always available:
Zoom Slider: Use the slider to zoom the viewport in and out. You can also use the mouse wheel to zoom in and
out, and you can click the mouse wheel down to pan the image if it doesn't fit in the viewport.
Reset Zoom: Resets the zoom within the viewport.
Fit Viewport: Zoom to see everything in the viewport.
Keep Fit: Zoom to see everything in the viewport and keep it fitted, which allows the viewport to scale when
resizing the Jigsaw window.
Maya Options
These options are currently only available for Maya:
Reset Background: Gets the current viewport image from Maya.
Fit Selection: Create regions surrounding the selected items in the Maya scene.
Mode: The type of regions to be used when fitting the selected items. The options are Tight (fitting the minimum
2D bounding box of the points) and Loose (fitting the minimum 2D bounding box of the bounding box of the
object).
Padding: The amount of padding to add when fitting the selection (this is a percentage value that is added in
each direction).
Save Regions: Saves the region information directly into the Maya scene.
Load Regions: Loads the saved regions information from the Maya scene.

3.2.7 Frame List Formatting Options


During job submission, you usually have the option to specify the frame list you want to render, which often involves
manually typing the frame list into a text box. In this case, you can make use of the following frame list formatting
options.
Specifying Individual Frames or a Sequence
You can specify a single frame just by typing in the frame number:
5

You can specify individual frames by separating each frame with a comma or a space:
5,10,15,20
5 10 15 20

You can specify a frame range by separating the start and end frame with a dash:

1-100

Specifying a Sequence with a Step Frame


You can specify a step frame for a sequence using x, step, by, or every:
1-100x5
1-100step5
1-100by5
1-100every5

Each of these examples will render every 5th frame between 1 and 100 (1, 6, 11, 16, etc).
Specifying a Reverse Frame Sequence
You can specify a reverse frame range by separating the end frame and start frame with a dash:
100-1

Using a step frame also works for reverse frame sequences:


100-1x5

Advanced Frame Lists


Individual frames for the same job are never repeated when creating tasks for a job, which allows you to get creative
with your frame lists without worrying about rendering the same frame more than once.
To render frames 5, 18, and then from 28 to 100, you can specify one of the following:
5,18,28-100
5 18 28-100

To render every 5th frame between 1 to 100, then fill in the rest, you can specify one of the following:
1-100x5,1-100
1-100x5 1-100

To render every 10th frame between 1 to 100, then every 5th frame, then every 2nd frame, then fill in the rest, you can
specify one of the following:
1-100x10,1-100x5,1-100x2,1-100
1-100x10 1-100x5 1-100x2 1-100

To render in a mix of forward and reverse by different Nth frames, then fill in the rest in reverse, you can specify one
of the following:
100-1x10,0-100x5,100-1
100-1x10 0-100x5 100-1

NOTE, a job's frame range can be modified after a job has been submitted to Deadline by right-clicking on a job and
selecting Modify Frame Range....
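For illustration only, the formatting rules above can be reproduced with a short Python sketch. This is not Deadline's actual frame list parser; it simply mirrors the rules described in this section, including the behaviour that repeated frames are only rendered once:

import re

def expand_frame_list( frameList ):
    # Split on commas and whitespace, expand each token, and drop repeats
    # while preserving the order in which frames first appear.
    frames = []
    seen = set()
    for token in re.split( r"[,\s]+", frameList.strip() ):
        if not token:
            continue
        match = re.match( r"^(-?\d+)-(-?\d+)(?:(?:x|step|by|every)(\d+))?$", token )
        if match:
            start, end = int( match.group( 1 ) ), int( match.group( 2 ) )
            step = int( match.group( 3 ) or 1 )
            direction = 1 if end >= start else -1
            expanded = range( start, end + direction, direction * step )
        else:
            expanded = [ int( token ) ]
        for frame in expanded:
            if frame not in seen:
                seen.add( frame )
                frames.append( frame )
    return frames

# Every 10th frame first, then every 5th, then fill in the rest.
print( expand_frame_list( "1-100x10,1-100x5,1-100" ) )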

3.3 Monitoring Jobs


3.3.1 Overview
The Monitor application lets you monitor and control your jobs after they have been submitted to the farm. This
documentation only covers some of the basics regarding the Monitor application. For more in-depth information, see
the Monitor documentation.

If you're launching the Monitor for the first time on your machine, you will be prompted with a Login dialog. Simply
choose your user name or create a new one before continuing. Once the Monitor is running, you'll see your user name
in the bottom right corner. If this is the wrong user, you can log in as another user by selecting File -> Change User.
Note that if your administrator set up Deadline to lock the user to the system's login account, you will have to log off
of your system and log back in as the correct user.

3.3.2 Finding Your Jobs


Information in the Monitor is broken up into different panels. When monitoring your jobs, you typically want to use
the following panels:

Job Panel: This panel shows all the jobs in the farm.
Task Panel: When a job is selected, this will show all the tasks for the job.
Job Reports Panel: When a job is selected, this will show all reports (logs and errors) for the job.
These panels, and others, can be created from the View menu, or from the main toolbar. They can be re-sized, docked,
or floated as desired. This allows for a highly customized viewing experience which is adaptable to the needs of
different users. See the Panel Features documentation for instructions on how to create new panels in the Monitor.

The easiest way to find your jobs is to enable Ego-Centric Sorting in the job panel's drop down menu, which can be
found in the upper-right corner of the panel. This keeps all of your jobs at the top of the job list, regardless of which
column the job list is sorted on. Then sort on the Submit Date/Time column to show your jobs in the order they were
submitted.

3.3.3 Filtering the Job List


Another way to find the jobs you are interested in is to use the filtering options in the job panel. The Quick Filter
option in the job panel's drop down menu will open a side panel that allows you to filter out jobs based on status, user,
pool, group, and plugin.

For more advanced filtering, use the Edit Filter option in the drop down menu to filter on any column in the job list. If
you would like to save a filter for later use, use the Pinned Filters option in the drop down menu to pin your filter. You
will then be able to select it later from the Pinned Filters sub menu.

Finally, you can use the search box above the job list to filter your results even further.

3.3.4 Job Batches


Jobs that share the same Batch Name property will be grouped together in the job list. All of the job submitters that
are included with Deadline will automatically set the Batch Name if they are submitting multiple jobs that are related
to each other. The Batch Name for a job can be modified in the Job Properties in the Monitor.

If you prefer to not have the jobs grouped together in the job list, you can disable the Group Jobs By Batch Name
option in the Monitor and User Settings.

3.3.5 Controlling Your Jobs


If you need to pause your job, you can right-click on the job in the job list and select Suspend Job. When you are ready
to let the job continue, simply right-click on the job again and select Resume Job. See the Job States documentation
for more information.

To modify the properties of your job, you can double-click on the job, or right-click on it and select Modify Properties.
Here you can change scheduling options such as priority and pool, as well as other general properties like the job
name. If you wish to limit which render nodes your job runs on, as well as the number of nodes that can render it
concurrently, you can do so on the Machine Limit page. Depending on the application you're rendering with, you
may see an extra page at the bottom of the properties list (with the name of the plug-in) that allows you to modify
properties which are specific to that application. More information on job properties can be found in the Job Properties
documentation.

3.3.6 Why Is My Job Not Rendering?


If a slave isn't rendering a job that you think it should be, you can use the Job Candidate Filter option in the panel's drop
down menu to try and figure out why. See the Job Candidate Filter section in the Slave Configuration documentation
for more information.
The job could also be producing errors when rendering. See the following section below about handling job errors.

3.3.7 Handling Job Errors


If your job starts producing errors, you'll notice that your job will change from green to brown, then eventually to
red (depending on the number of errors). These error reports can be viewed in the Job Reports panel, which can be
opened from the View menu, or from the job's right-click menu. Here you will find all the reports generated for a job,
including the error reports, which will be red. You can filter and sort the reports to help find what you are looking for.
Often, the error reports will clearly show what the cause of the error is, allowing you to take the appropriate steps
to resolve the problem. If you're ever unsure of what an error means, feel free to email the error report to Deadline
Support and we'll try to help. See the Job Reports and History documentation for more information.

3.3.8 Completed Jobs


When your job is complete, you can view the output images by right-clicking on the individual tasks in the task list
and selecting the output filename. This will open the image in the application that is set to open that type of file by
default. Note that this option isn't always available for some applications. In most cases though, you can view the
output image folder by right-clicking on the job and selecting Explore Output. See the Job Output documentation for
more information.
You can also view the logs for the job in the Job Reports panel, which can be opened from the View menu, or from the
job's right-click menu. Finally, once you are happy with the results and no longer need the job, you can delete it by
right-clicking on the job and selecting Delete Job.

3.3.9 Re-rendering Jobs


If you have a completed job that you need to re-render, you can do so by right-clicking on the job and selecting
Requeue Job. If you only need to re-render a few bad frames, you can just requeue their corresponding tasks by
right-clicking on one or more tasks in the task list and selecting Requeue Tasks.
In some cases, the Monitor can try to detect bad frames for you. You can use this feature by right-clicking on the
job and selecting Scan For Missing Output. The scan will check for missing frames or frames that don't meet a size
threshold. You will then have the option to requeue all the corresponding tasks automatically. Note that the Scan For
Missing Output option isn't available for all jobs. See the Job Output documentation for more information.

3.4 Controlling Jobs


3.4.1 Overview
The Jobs panel allows jobs to be controlled and modified using the right-click menu. In addition, the Task panel allows
specific tasks to be controlled using the right-click menu. Note that the availability of these options can vary depending
on the context in which they are used, as well as the User Group Permissions that are defined for the current user.

If the Job or Task panels are not visible, see the Panel Features documentation for instructions on how to create new
panels in the Monitor.

3.4.2 Job States


The state of jobs can be changed using the Job panel's right-click menu. In addition, the states of specific tasks can be
changed using the Task panel's right-click menu. Note that it is possible to modify the states of multiple jobs or tasks
at the same time, providing the selected jobs or tasks are all in the same state.

When suspending a job, a confirmation message will appear that gives you the option to suspend the tasks for the
job that are currently rendering. If you disable this option, any tasks that are currently rendering will be allowed to
complete.

These are the states that a job can be in. They are color coded to make it clear which state the job is in.
Queued (white): No tasks for the job are currently being rendered.
Rendering (green): At least one task for the job is being rendered.
Completed (blue): All tasks for the job have finished rendering.
Suspended (gray): The job will not be rendered until it is resumed.
Pending (orange): The job is waiting on dependencies to finish, or is scheduled to start at a later time.
Failed (red): The job has failed due to errors. It must be resumed before it can be rendered again.
You may notice Queued or Rendering jobs turn slightly red or brown as they sit in the farm. This is an indication that
the job is reporting errors. See the Job Reports section further down for more information.
The Job panel's right-click menu also gives the option to delete or archive jobs. Both options will remove the jobs
from the farm, but archived jobs can be imported again for later use. You can import archived jobs from the File menu
in the Monitor. See the Job Archiving documentation for more information.

3.4.3 Resubmitting Jobs


If you want to render a specific job again, but you don't want to lose the statistics for the original job, you can resubmit it
from the Job panel's right-click menu. This will bring up a window allowing you to adjust the frame list and frames
per task if you want to. All other job properties will remain identical.

Note that you can resubmit it as a normal job or a maintenance job. Maintenance jobs are special jobs where each task
for the job will render the same frame(s) on a different machine in your farm. This is useful for performing benchmark
tests on your machines. When a maintenance job is submitted, a task will automatically be created for each slave, and
once a slave has finished a task, it will no longer pick up the job.
It's even possible to resubmit specific tasks as a new job, which can be done from the Task panel's right-click menu.
Note though that a Maintenance job can only be resubmitted from the Job panel.
Note that Tile jobs will have their own resubmission dialog, and only the Tile frame can be changed.

3.4.4 Job Properties


To modify job properties, select the Modify Job Properties option from the Job panel's right-click menu. Double-clicking on a job will also bring up the Job Properties window. There are many pages of properties you can modify,
which are covered below. Note that it is possible to modify the properties of multiple jobs at the same time.
General
These are the most common job properties, and most of these were specified when the job was originally submitted.

The properties are as follows:


Job ID: The internal ID of the job.
Job Name: The name of the job.
Comment: The comment for the job.
Department: The department the job was submitted from.
Batch Name: The batch the job belongs to. Jobs with the same Batch Name are grouped together in the Monitor.
User: The user who submitted the job.
Pool: The pool that the job belongs to.
Secondary Pool: If enabled, the job can fall back to the secondary pool if there are machines available in that
pool.
Group: The group that the job belongs to.
Priority: The priority of the job (0 = lowest, 100 = highest).
Concurrent Tasks: The number of tasks a slave can dequeue at a time (1-16). Note that not all plug-ins support
this feature, such as Digital Fusion.
Limit Tasks To Slave's Task Limit: If checked, a slave will not dequeue more tasks than it is allowed to based
on its settings.
On Job Complete: When a job completes, you can auto-archive or auto-delete it. You can also choose to do
nothing when the job completes.
Job Is Protected: If enabled, the job can only be deleted by the job's user, a super user, or a user that belongs
to a user group that has permissions to handle protected jobs. Other users will not be able to delete the job, and
the job will also not be cleaned up by Deadline's automatic house cleaning.
Re-synchronize Auxiliary Files Between Tasks: If checked, all job files will be synchronized by the Slave
between tasks for this job. This can add significant network overhead, and should only be used if you are
manually editing any of the files that were submitted with the job.
Reload Plugin Between Tasks: If checked, the slave reloads all the plug-in files between tasks for the same
job.
Enforce Sequential Rendering: Sequential rendering forces a slave to render the tasks of a job in order. If
an earlier task is ever requeued, the slave won't go back to that task until it has finished the remaining tasks in
order.
Suppress Event Plugins: If enabled, this job will not trigger any event plugins while in the queue.
Job Is Interruptible: If enabled, tasks for this job can be interrupted during rendering by a job with a higher
priority.
Interruptible %: A task for this job will only be interrupted if the task progress is less than or equal to this
value.
Timeouts
These properties affect how a job will time out. It is important to note that the Auto Task Timeout feature is based on
the Auto Job Timeout Settings in the Repository Options. The timeout is based on the render times of the tasks that
have already finished for this job, so this option should only be used if the frames for the job have consistent render
times.

The properties are as follows:


Minimum Task Render Time: The minimum amount of time a Slave has to render a task. If a task finishes
faster, an error will be reported.
Maximum Task Render Time: The maximum amount of time a Slave has to render a task. If a Maximum Start
Job Time is set, the Maximum Task Render Time will not be applied to the Starting phase of a job.
Maximum Start Job Time: The maximum amount of time a Slave has to start a job.
On Task Timeout: You have the option to have the job report an error or notify you when a timeout is reached.
Enable Timeouts For Pre/Post Job Scripts: If checked, then the timeouts for this job will also affect its
pre/post job scripts, if any are defined.
Enable Auto Task Timeout: If the job should automatically timeout based on parameters specified in the
Repository Options.
Use Frame Timeouts: If enabled, timeouts will be calculated based on frames instead of by tasks. The timeouts
entered for tasks will be used for each frame in that task.
Notifications
These properties allow you to notify user(s) when jobs complete. There are two list controls beside each other on this panel. The left list contains all the current users on your farm. The right list contains the names of the users who will receive notifications. You can move users from one list to another using the arrow controls between the two lists.

The properties are as follows:


Notification Email Addresses: A comma-delimited list of the notification users' email addresses.
Job Completion Notes: Notes to attach in the email sent when the job has completed.
Override Notification Method: If checked, you can select whether to send an email or to not send an email.
Machine Limit
A Machine Limit can be used to limit the number of slaves that can render one particular job. This is useful if you
want to render a bunch of jobs simultaneously. The list you create can be a whitelist or a blacklist. A whitelist is the
list of slaves that are approved to render this job (only these approved machines will render this job), while a blacklist contains slaves that will not render this job. To move a machine from one list to another, you can use the arrow
buttons between the two lists, drag and drop the machine names you want, or simply double click the machine name.
You are also able to load and save your machine list from a file so you can use the same list across multiple jobs. The file stores one machine name per line.

You can modify the following options for the machine limit:
Slaves that can render this job simultaneously: The number of slaves that can render this job at the same
time.
Return Limit Stub When Task Progress % Reaches: If enabled, you can have a slave release its limit stub
when the current task it is rendering reaches the specified progress. Note that not all plug-ins report task progress,
in which case the machine limit stub will not be released until the task finishes rendering.
Whitelisted/Blacklisted Slaves: If slaves are on a blacklist, they will never try to render this job. If slaves are
on a whitelist, only those slaves will try to render this job. Note that an empty blacklist and an empty whitelist
are functionally equivalent, and have no impact on which machines the job renders on.
Load Machine List: Open a file dialog to load a list of slaves to be used in the white/blacklist. One machine
name per line in the file (.txt).
Save Machine List: Open a file dialog to save the current white/black list. Each machine name will be written
to a single line.
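For illustration, the short Python sketch below writes a machine list file in this one-name-per-line format so it can be loaded with the Load Machine List button. The slave names and output path are hypothetical examples only.

# Hypothetical sketch: write a machine list file, one slave name per line,
# suitable for loading with the Load Machine List button.
slaves = ["render-01", "render-02", "render-03"]
with open("machine_list.txt", "w") as f:
    f.write("\n".join(slaves) + "\n")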
Limits
Here you can add or remove the limits that will affect your job. Limits are used to ensure floating licenses are used correctly on your farm. To add a limit to your job, select the limit(s) you require from the Limit List and press the right arrow between the Limit List and the Required Limits. You are also able to drag and drop your selected limits into or out of the Required Limits list, or just double-click a limit to move it from one list to the other.

Dependencies
Dependencies can be used to control when a job should start rendering. See the Job Dependency Options below for
more information.

Failure Detection
Here you can set how your job handles errors and determine when to fail a job.

The properties are as follows:


Override Job Error Limit: Once checked, the job error limit will be set to the user-specified value.
Override Task Error Limit: Once checked, the task error limit will be changed to the user-specified value.
Send Warning Notification For Job Errors: Whether or not to send a notification to the users specified in the
Notification Panel when a job error occurs.
Ignore Bad Slave Error Limit: If checked, a bad slave error will not count towards job errors.
Clear Bad Slave List: Determines whether or not the bad slave list should currently be cleared.
Cleanup
Here you can override if and how your job is automatically cleaned up when it completes.

The properties are as follows:


Override Automatic Job Cleanup: If enabled, these cleanup settings will be used instead of the ones in the
Repository Options.
Cleanup Job After This Many Days: If enabled, this is the number of days to wait after this job has completed
before cleaning it up.
Cleanup Mode: Whether the cleanup should archive the job or delete it.
Scheduling
You can schedule the job to start and/or stop at a specific date and time, and even repeat on regular intervals. This
can be useful for maintenance jobs that need to run every few days or weeks. In addition, you can define a custom
schedule so that the jobs can start and/or stop at different times on different days of the week.

Scheduling properties are as follows:


Scheduling Mode: Determines how the job will be scheduled. Possible values are Disabled, One Time, Repeating, or Custom.
Once or Repeating Scheduling Settings:
Start Date and Time: The date and time this job should start.
Stop Date and Time: If enabled, the date and time this job should be marked as complete if it is still
active.
Day Interval: The number of days to wait before repeating this job if the Scheduling Mode is set to
Repeating.
Custom Scheduling Settings: Configure the days and times that the job should start and/or stop.
It should be noted that if the job is not put into the Pending state, the job will not wait for the scheduled time to begin
rendering. When the scheduling settings change, you will be prompted to put the job in the pending state. This can
also be done by right clicking the job and choosing Mark as Pending.
Scripts
You can attach custom Python scripts to your job which can be run before and after your job has rendered. You may also attach scripts to your job's tasks which can be run before and after your job's tasks render. For more information
on creating custom job scripts, see the Job Scripting section of the documentation.

You may attach the following scripts which will be executed at different times:
Pre Job Script: Executed before a job is run.
Post Job Script: Executed after a job has completed.
Pre Task Script: Executed before a task is rendered.
Post Task Script: Executed after a task has completed.
For more details on these script properties, see the Job Scripting section of the documentation.
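As a rough illustration only (not the authoritative interface), a post-task script might contain logic like the following. It assumes the __main__ entry point described in the Job Scripting documentation, and the log path is hypothetical.

# Minimal post-task script sketch. Assumes the __main__ entry point described
# in the Job Scripting documentation; the log path below is hypothetical.
import datetime

def __main__(*args):
    with open("/tmp/deadline_post_task.log", "a") as f:
        f.write("Task finished at %s\n" % datetime.datetime.now())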
Environment
When running a job, you are able to attach environment variables through the Environment tab. The environment variables are specified as key-value pairs and are set on the slave machine running the job. You are able to specify whether your job-specific environment variables will only be set while your job is rendering. All job-specific environment variables will be removed when the job has finished running.
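For example, a job might carry hypothetical pairs such as RENDER_TIER=high and TEXTURE_CACHE=/mnt/cache; these would be set on the slave for that job and removed again once the job has finished.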
You are also able to set a custom plugin directory on this panel. This acts as an alternative directory to load your job's plugin from. It is useful while creating and testing custom job plugins, or when you need one or more jobs to specifically use a custom job plugin which is not stored in the Deadline Repository.

The Environment properties are as follows:


Custom Plugin Directory: An alternative directory to load your job's plugin from.
Environment Variables: A list of environment variables to set while running a job. Stored as a list of key value
pairs.
Only Use Job Environment Variables When Rendering: Environment variables for your job will only be set when the job is in the rendering state. They will be removed when the job is finished rendering.
Extra Info
When a job is submitted, it can have extra information embedded in it. For example, if a studio has an in-house
pipeline tool, they may want to embed information in the job that will be used to update the pipeline tool when the job
finishes rendering.

The Extra Info 0-9 properties can be renamed from the Jobs section of the Repository Options, and have corresponding
columns in the Job list that can be sorted on. The additional key/value pairs in the list at the bottom do not have
corresponding columns, and can be used to contain internal data that doesn't need to be displayed in the job list.
Submission Params
Here you can view and export the job info and plugin info parameters that were specified when the job was submitted.
The exported files can be passed to the Command application to manually re-submit the job. See the Manual Job
Submission documentation for more information.
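For illustration, a job whose parameters were exported as job_info.job and plugin_info.job (hypothetical file names) could be re-submitted with a command along these lines; see the Manual Job Submission documentation for the exact syntax:

deadlinecommand job_info.job plugin_info.job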

Plugin Specific Properties


The Plug-in specific properties vary between the different plug-ins, and some plug-ins may not have a Plug-in specific
properties tab at all. Note that when modifying properties for multiple jobs at the same time, the Plug-in specific tab
will only be available if all selected jobs use the same plug-in.

To get a description of specific plug-in properties, just hover your mouse cursor over them in the properties dialog and
a tooltip will pop up with a description.

3.4.5 Job Dependency Options


Dependencies can be used to control when a job should start rendering. There are three types of dependencies available,
and one or more can be specified for a job:
Jobs: Job dependencies can be used to start a job when other jobs that it depends on are finished.
Assets: Asset dependencies can be used to start a job when specific files exist on disk.
Scripts: Script dependencies can be used to start a job based on if a Python script returns True or False.
There are a few ways to set up dependencies in the Monitor, which are described below.
Job Properties
In the Job tab on the Dependencies page, you have the ability to set which jobs your job is dependent on. By default,
the job will only resume when each of its dependencies have completed, but you can also have your job resume when
the dependencies have failed, or have been deleted from the queue. Note that you can only set which jobs this job is
dependent on, not which jobs are dependent on this job.
You can also make the job frame dependent, which means that a frame from the job won't begin rendering until the
same frame from the other job(s) is complete. This is useful if you have a job that is dependent on the frames of
another job, and you want the two jobs to render concurrently.
In the Asset tab, you can make this job dependent on asset files (textures, particle caches, etc). This job won't be able
to render on a slave unless it can access all the files listed here.

In the Script tab, you can make this job dependent on the results of the specified scripts.
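As a rough sketch of the kind of logic such a script might contain (the entry point name and file path are assumptions, not the authoritative interface), a dependency script could release the job once a sentinel file appears on disk:

# Hypothetical dependency script sketch: release the job only once a sentinel
# file exists. The __main__ entry point and the path are assumptions; see the
# Scripting documentation for the exact interface.
import os

def __main__(*args):
    # Returning True releases the job; False keeps it pending.
    return os.path.exists("/mnt/projects/shot010/caches/_READY")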

The following properties apply to all dependency types:


Resume On Completed Dependencies: This job will resume when its dependencies complete.
Resume On Failed Dependencies: This job will resume when its dependencies fail.
Resume On Deleted Dependencies: This job will resume when its dependencies are deleted from the queue.
Resume When Each Dependency is % Complete: This job will resume when each of the jobs this job is
dependent on reaches a certain percentage of completion.
Use Frame Dependencies: Specifies that this job is dependent on specific frames from its dependencies, and
will release tasks for this job as appropriate.
Frame Offset Start/End: Use these to offset the frames that this job is dependent on. It can also be used to
make frames for this job dependent on multiple frames from other jobs.
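For instance, with a hypothetical Frame Offset Start of -1 and Frame Offset End of 1, frame 10 of this job would depend on frames 9 through 11 of the job it depends on.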
You can also specify notes and set overrides for individual dependencies by clicking on them in the dependency list.
Click the Overrides button to view the overrides panel.

Drag and Drop


In the Jobs panel, you can drag one or more jobs and drop them on another job. You will then be presented with some
choices on how to set the dependencies.

Note that drag & drop dependencies will not work if you are holding down a modifier key (SHIFT, CTRL, etc). This
is to help avoid accidental drag & drops when selecting multiple jobs in the list.
If you would like to disable drag & drop dependencies, you can do so from the Monitor Options, which can be accessed
from the main toolbar. Note that if you change this setting, you will have to restart the Monitor for the changes to take
effect.
Dependency View
The Job Dependency View is used to visualize and modify your jobs and their dependencies. You can open
the Job Dependency View panel from the View menu in the Monitor.

The view will show your currently selected job and all nodes that are linked to it by dependencies. The job node colors
indicate the state of the job, while the asset nodes are yellow and the script nodes are purple.
Jobs are dependent on everything that has a connection to the Square Socket on their left side. Connections can be
made by dragging from the sockets on the nodes (square/circle) to the socket/main body of the other node. Connections
can be broken by either dragging the connection off of the node or by selecting the connection and pressing the delete
key. Note that changes made in the dependency view do not take effect until saved. If you have made changes and go
to close the dependency view, you will be notified that you have unsaved changes.
Additional job nodes can be added to the view by dragging them in from the job list (after locking the dependency view first), or through the right-click menu. Asset and script nodes can also be added by dragging the file in from your
explorer/finder window, or through the right click menu as well.
Dependencies can be tested by pressing the Test Dependency button in the toolbar. The results are represented by the
following colors:
Green: The dependency test has passed.
Red: The dependency test has failed.
Yellow: The job is frame dependent, and the dependency test for some of the frames has passed.

All the available dependency view options can be found across the toolbar at the top of the view, and/or from the view's right-click menu.

Options in toolbar and right click menu:


Lock View: When enabled, the view will no longer show the currently selected job and will display the last
job selected before locking. This is necessary before additional jobs can be dragged from the job list into the
dependency view.
Reload View: This redraws the dependency view for the selected job. If changes have been made, you will be
prompted to save your changes.

Save View: Saves the changes made to the dependency view for the selected job.
Selection Style: If off, all nodes and connections touched by the selection area will be selected. If on, only nodes and connections that are fully contained by the selection area will be selected.
Minimap: Controls if the minimap is visible and if so, in which corner.
Elide Titles: Control whether or not the titles of nodes should be elided and if so, in which direction.
Zoom All: Zoom the view to the point where the entire view (area that has been used) is visible.
Zoom Extents: Zoom the view to the point where all nodes currently in the view are visible.
Options in toolbar only:
Modify Job Details: This allows you to set which properties are visible in the nodes.
Test Dependencies: This allows you to test your dependencies.
Zoom Level: The current zoom level.
Options in right-click menu only:
Job Menu: If one or more jobs are selected, you can use the same job menu that is available in the job list.
Add Job: Choose a job to add to the dependency view.
Add Asset: Choose an asset file to add to the dependency view.
Add Script: Choose a script file to add to the dependency view.
Expand/Collapse: Expand or collapse the details in all nodes.

3.4.6 Job Frame Range


To modify the frame range, select the Modify Frame Range option from the Job panel's right-click menu. Note that
modifying these settings will stop and requeue all tasks that are currently rendering.

See the Frame List Formatting Options documentation for more information on options for formatting frame lists.

3.4.7 Job Reports and History


All reports for a job can be viewed in the Job Reports panel. This panel can be opened from the View menu or from
the main toolbar in the Monitor. It can also be opened from the Job and Task panels' right-click menus.

The following reports can be viewed from the Job Report panel:
Render Logs: These are the reports from tasks that rendered successfully.
Render Errors: These are the reports from tasks that failed to render.
Event Logs: These are the reports from Events that were handled successfully.
Event Errors: These are the reports from Events that raised errors.
Requeues: These are reports explaining why tasks were requeued.
You can use the Job Report panel's right-click menu to save reports as files to send to Deadline Support. You can also delete reports from this menu. Finally, if a particular Slave is reporting lots of errors, you can blacklist it from this menu (or remove it from the job's whitelist).
In addition to viewing job reports, you can also view the job's history. The History window can be brought up from the Job panel's right-click menu by selecting the Job History option.

3.4.8 Job Output


Many jobs have the options to explore and view the job's output directly from the Job or Task panels' right-click menus. If the options to explore and view the output are available for the job, there will also be the option to copy the output path to the clipboard. This is helpful if you need to paste the path into another application.
Note that the availability of these options is based on how much information about the job's output could be determined at the time the job was submitted. In some cases, the submitter can't determine where all or some of the job's output will be saved to, so these options won't be available.

When viewing the output for a job, the Monitor will typically open the image file in the default application on the
machine. You can configure the Monitor to use specific image viewer applications in the Monitor Options, which can
be accessed from the main toolbar.

Finally, some jobs will support the ability to scan completed tasks for a job to see if any output is missing or below an
expected file size. The Scan For Missing Output window can be opened by right-clicking on a job and selecting Job
Output -> Scan For Missing Output. If any missing output is detected, or the output file is smaller than the Minimum
File Size specified, you are given the option to requeue those tasks (simply place a check mark beside the tasks to
requeue).

3.4.9 Job Auxiliary Files


Many jobs have additional files submitted with them, such as the scene file being rendered. These files are copied to
the server and are then copied to the Slaves when they render the jobs. If a job has auxiliary files submitted with it,
you can explore these files from the Job panel's right-click menu. There will also be the option to copy the auxiliary
path to the clipboard, which is helpful if you need to paste the path into another application.

3.5 Archiving Jobs


3.5.1 Overview
Deadline allows you to archive jobs, which is useful if you want to keep a backup of every job you've rendered, or
if you want to remove a job from one farm and place it in another. It can also be used to give a problematic job to
Deadline Support for testing purposes.
Jobs can be archived automatically or manually. When a job is archived, its job and task information are exported as
JSON to separate text files. These files are placed in a zip file with any auxiliary files that were submitted with the job,
and any reports the job currently has. The name of the zip file will contain the job's user, plugin, name, and ID (to
guarantee uniqueness). It will have the following format:
USER__PLUGIN__JOBNAME__JOBID.zip
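For example, a job submitted by user jsmith with the MayaBatch plugin might produce an archive named something like jsmith__MayaBatch__shot010_lighting__54f9c2a1b2c3d4e5f6a7b8c9.zip (all values here are hypothetical).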

Typically, this zip file is placed in the jobsArchived folder in the Repository. However, when manually archiving a
job, you have the option to choose an alternative archive location.

3.5.2 Manual Job Archiving


Users can manually archive a job by right-clicking on it in the job list in the Monitor and selecting Archive Job. This
will bring up the following window:

By default, it will save the archive to the jobsArchived folder in the Repository. However, you can choose a different
folder to archive the job. You can also choose whether or not to delete the job from the database after archiving it.
One case where you might not want to delete it is if you are archiving a job to send to Deadline Support for testing
purposes.
If the Job panel is not visible, see the Panel Features documentation for instructions on how to create new panels in
the Monitor.

3.5.3 Automatic Job Archiving


When submitting a job, users can set the On Job Complete setting to Archive. When the job finishes, it will automatically be archived to the jobsArchived folder in the Repository.

Administrators can also configure Deadline to automatically archive all jobs after they have finished rendering and
place them in the jobsArchived folder in the Repository. This can be done in the Job Settings section of the Repository
Options.

3.5.4 Importing Archived Jobs


To import an archived job, simply select File -> Import Archived Jobs in the Monitor and choose one or more zip files
containing archived jobs.

3.6 Monitor and User Settings


3.6.1 Overview
You can customize your Monitor options, User settings, and Styles in the Monitor Options. On Windows and Linux,
select Tools -> Options, and on Mac OS X, select DeadlineMonitor -> Preferences. You can also open these settings
from the main toolbar in the Monitor.

3.6.2 Monitor Options


The Monitor options allow you to customize a few aspects of the Monitor.

Job List
Enable Drag & Drop Dependencies: If enabled, you can drag jobs and drop them on other jobs to set dependencies. Note that you must restart the Monitor for this setting to take effect. See the Controlling Jobs
documentation for more information on setting dependencies this way.
Show Task States In Job Progress Bar: If enabled, the job progress bars will show the states of all the tasks
for the job.
Group Jobs By Batch Name: If enabled, jobs that have the same Batch Name will be grouped together in the
job list. Note that you must restart the Monitor for this setting to take effect.
Change Color Of Jobs That Accumulate Errors: If enabled, jobs will change color from the Rendering color
to the Failed color as they accumulate errors. See the Styles section further down for more on the colors.
Task List
Task Double-click Behavior: Customize the double-click behavior of rendering, completed, and failed tasks in
the task list. Double-clicking on tasks in other states will bring up the task reports panel. These are the available
options:

View Reports: This will bring up the task reports panel for the selected task.
Connect To Slave Log: This will connect to the Slave that is rendering or has rendered the selected task.
View Image: This will open the output image for the selected task in the default viewer.
Change Color Of Tasks That Accumulate Errors: If enabled, tasks will change color from the Rendering color
to the Failed color as they accumulate errors. See the Styles section further down for more on the colors.
Miscellaneous
Start In Super User Mode: If enabled, the Monitor will start with Super User mode enabled. If Super User
mode is password protected, you will be prompted for the password when you start the Monitor.
Stream Job Logs from Pulse: If enabled, the Monitor will stream the job logs from Pulse instead of reading
them directly from the Repository. While streaming the logs this way is typically slower, it can be useful if the
connection to the Repository server is slow.
Show House Cleaning Updates In Status Bar: If enabled, the Monitor status bar will show when the last
House Cleaning was performed.
Show Repository Repair Updates In Status Bar: If enabled, the Monitor status bar will show when the last
Repository Repair was performed.
Show Pending Job Scan Updates In Status Bar: If enabled, the Monitor status bar will show when the last
Pending Job Scan was performed.
Enable Slave Pinging: If enabled, the Slave List will show if slave machines can be pinged or not.

3.6.3 Image Viewers


Configure the image viewer applications that the Job and Task panels use to view output images. See the Controlling
Jobs documentation for more information on viewing job output.

You can specify up to three image viewer applications with the following options:
Executable: The path to the image viewer executable you want to use.
Arguments: The arguments to pass to the image viewer executable. The default is {FRAME}, which represents a path to a single image file for a task. More information about the supported argument tags can be found
below.
Name: The viewer name, which is used in the menu item created for this image viewer (defaults to the executable
name if left blank).
Viewer Supports Chunked Tasks: If enabled, the tasks image viewer dialog will not be shown when viewing
the output for jobs with Frames Per Task greater than 1.
The following tags are supported in the custom viewer arguments, and can be combined with other arguments that the
image viewer accepts:
{FRAME}: This represents the task's frame file. For example: /path/to/image0002.png
{SEQ#}: This represents the task's frame sequence files, using # as the padding. For example: /path/to/image####.png
{SEQ?}: This represents the task's frame sequence files, using ? as the padding. For example: /path/to/image????.png
{SEQ@}: This represents the task's frame sequence files, using @ as the padding. For example: /path/to/image@@@@.png
{SEQ%}: This represents the task's frame sequence files, using %d as the padding. For example: /path/to/image%04d.png

You can also specify the Preferred Image Viewer, which is the default image viewer to use when viewing output files.
If set to DefaultViewer, the system's default application for the output file type will be used.
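For example, a hypothetical setup might point the Executable at C:\Program Files\MyViewer\myviewer.exe and set the Arguments to {SEQ#}, so that viewing a task's output opens the whole frame sequence (such as /path/to/image####.png) rather than a single frame.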

3.6.4 User Settings


You can configure your user settings here.

Notification Settings
If you would like to receive email notifications for your job, you can specify your email address in the Notification
Settings and enable the option to receive them. Note that this requires your administrator to configure the email settings
in the Repository Options.
If you would like to receive popup message notifications for your job, you can specify your machine name in the
Notification Settings and enable the option to receive them. Note that this requires the Launcher to be running on the
machine that you specify here.
Render Job As User Settings
If the Render Job As User option is enabled in the job settings in the Repository Options, these options will be used to
launch the rendering process as the specified user. For Linux and OS X, only the User Name is required. For Windows, the Domain and Password must be provided for authentication. See the Render Jobs As Job's User documentation for more information.
Web Service Authentication Settings
You can also specify a Web Service password, which is typically used for the Mobile application. A password is
required to authenticate with the Deadline web service if authentication has been enabled and empty passwords are
not allowed.
Region
A user's region is used for cross-platform rendering. All the paths a user sees in the Monitor will be replaced based on
the path mappings for their region. Example: Viewing the output of a completed job. See Region Settings and Regions
for more information.

3.6.5 Styles
The Styles panel can be used to customize the color palette and the fonts that the Deadline Applications use. Custom
styles can be saved and imported as well.

By default, the current style will be Default Style, which is the style shipped with Deadline and cannot be modified in
any way. Previously saved styles will be available in the Saved Styles list. Custom styles can be created and deleted
by clicking the Create New Style and Delete Style buttons, respectively.
Once a custom style has been selected, the style's color palette can be modified:
The General Palette color is used to generate the colors for the various controls and text in the Deadline applications. Note that dark palettes will result in light text, and light palettes will result in dark text.
The Selection color is used to highlight selected items or text.
The remaining colors are used to color the text for jobs, tasks, slaves, etc, based on their current state. It is
recommended to choose colors that contrast well with the General Palette and Selection colors to ensure the text
is readable.
The style's fonts can be modified as well:
Primary Font: This is the font used for almost all the text in the Deadline applications.
Console Font: This is the font used in console and log windows. By default, a monospace font is used for these
windows.

Any style changes made are not saved until the Monitor Options dialog is accepted by clicking OK. Once the dialog
has been accepted, the Monitor must be restarted in order to apply the style changes. In order to facilitate testing out
new styles, there is a Preview Style button which opens a dialog that displays an approximation of the current style
settings.

Note that the Deadline applications will always load with the style that was last selected in the Styles panel in the
Monitor Options.

Styles may also be saved and loaded using the View menu in the Monitor. Note that when saving styles, all of the
custom styles are saved, and when loading saved styles from disk the loaded styles will be appended to the list of styles
currently present, overwriting any styles with a shared name.

3.7 Local Slave Controls


3.7.1 Overview
The Local Slave Controls allow you to control the slave on your machine, as well as configure Idle Detection and the
Job Dequeuing Mode. You can access the Local Slave Controls from the Launcher's menu, or from the Tools menu in
the Monitor.

Note that it is possible for administrators to disable the Local Slave Controls. If that's the case, you will see a message indicating this when trying to open them.

3.7.2 Slave Controls


This section allows you to view the state of the slave running on your machine. Also, if the slave is rendering, you can
see which job it is currently rendering in the list. Finally, you can control the slave on your machine by right-clicking on it in the list.

More information about the available controls can be found in the Remote Control documentation.

3.7.3 Override Idle Detection


This section overrides the global Slave Scheduling settings for your machine (if there are any). It can be used to
start the slave when your machine becomes idle (based on keyboard and mouse activity), and stop the slave when the
machine is in use again. Note that Idle Detection is managed by the Launcher, so it must be running for this feature to
work.

The available Idle Detection settings are as follows:


Start Slave When Machine Is Idle For: This option enables Idle Detection, and you can specify the number
of minutes without keyboard, mouse or tablet activity before the slave should start.
Only Start Slave If CPU Usage Less Than: If enabled, the slave will only start if the machine's CPU usage is less than the given value.
Only Start Slave If Free Memory More Than: If enabled, the slave will only start if the machine has this
much free memory available.
Only Start Slave If These Processes Are Not Running: If enabled, the slave will not start if any of the listed
processes are running.
Only Start Slave If Launcher Is Not Running As These Users: If enabled, the slave will not start if the
Launcher process is running as any of the listed users.
Stop Slave When Machine Is No Longer Idle: If enabled, the slave will automatically stop when there is
keyboard, mouse or tablet activity again.
Only Stop Slave If Started By Idle Detection: If enabled, the Slave will only be stopped when the machine is
no longer idle if that Slave was originally started by Idle Detection. If the Slave was originally started manually,
it will not be stopped.
Allow Slave To Finish Its Current Task When Stopping: If enabled, the slave will finish its current task

before stopping when the machine is no longer idle. If disabled, the slave will requeue its current task before
stopping so that another slave can render it.
There are some limitations with Idle Detection depending on the operating system:
On Windows, Idle Detection will not work if the Launcher is running as a service. This is because the service
runs in an environment that is separate from the Desktop, and has no knowledge of any mouse or keyboard
activity.
On Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not
available, Idle Detection will not work.

3.7.4 Job Dequeuing Mode


This section can be used to control how your slave dequeues jobs.

The available dequeuing modes are:


All Jobs: This is the default behavior. The slave will dequeue any job that it can work on.
Only Jobs Submitted From This Slave's Machine: This option will only allow the slave to dequeue jobs
submitted from the same machine. This is a useful way of ensuring that your slave will only render your jobs.
Only Jobs Submitted From These Users: This option will only allow the slave to dequeue jobs submitted by

the specified users. This is another way of ensuring that your slave will only render your jobs. However, it can
also be used to make your slave render jobs from other specific users, which is useful if you're waiting on the
results of those jobs.

CHAPTER FOUR: CLIENT APPLICATIONS

4.1 Launcher
4.1.1 Overview
The Launcher's main use is to provide a means of remote communication between the Monitor and the Slave or Pulse
applications, and therefore should always be left running on your render nodes and workstations. It can also detect if
the Slave running on the machine has stalled, and restart it if it does.
Unless the Launcher is running as a service or daemon, you should see the Launcher icon in your system tray or notification area. You can right-click on the icon to access the Launcher menu, or double-click it to launch the Monitor.

4.1.2 Running The Launcher


To start the Launcher:
On Windows, you can start the Launcher from the Start Menu under Thinkbox\Deadline.
On Linux, you can start the Launcher from a terminal window by running the deadlinelauncher script in the bin
folder.
On Mac OS X, you can start the Launcher from Finder by running the DeadlineLauncher application in Applications/Thinkbox/Deadline.
The Launcher can also be started from a command prompt or terminal window. For more information, see the Launcher
Command Line documentation.

4.1.3 Administration Features


Running the Launcher can help make some administrative tasks easier, which is why it's recommended to keep it
running at all times on your render nodes and workstations.
Automatic Updates
If you have enabled Automatic Upgrades under the Client Setup section of the Repository Options, whenever you
launch the Monitor, Slave, or Pulse using the Launcher, it will check the Repository for updates and upgrade itself
automatically if necessary before starting the selected application.
Note that the upgrade will only trigger when launching applications through the Launcher. Also, if the Launcher is
running as a service on Windows, launching the Monitor will not trigger an update.

Remote Administration
If you have enabled Remote Administration under the Client Setup section of the Repository Options, you will be able
to control the Slave or Pulse applications remotely, and remotely execute arbitrary commands. Note that it may be a
potential security risk to leave it running if you are connected to the internet and are not behind a firewall. In this case,
you should leave Remote Administration disabled.

4.1.4 Launcher Menu Options


Right-click on the Launcher system tray icon to bring up the Launcher menu. The available options are listed below.
Note that if the Launcher is running as a service or daemon, this menu is unavailable because the system tray icon will
be hidden.

Launch Monitor
Launches the Monitor application. If the Repository has been upgraded recently, and Automatic Updates
is enabled, this will automatically upgrade the client machine.
Launch Slave(s)
Launches the Slave application. If this machine has been configured to run more than one Slave instance,
this will launch all of them. If the Repository has been upgraded recently, and Automatic Updates is
enabled, this will automatically upgrade the client machine.
Launch Slave By Name
Launch a specific Slave instance, or add/remove Slave instances from this machine (if enabled for the
current user). Note that new Slave instances must have names that only contain alphanumeric characters,
underscores, or hyphens. See the documentation on running Multiple Slaves On One Machine for more
information.
Local Slave Controls
Opens the Local Slave Controls window, which allows you to control and configure the Slave that runs on
your machine.
Launch Slave at Startup

If enabled, the Slave will launch when the Launcher starts up.
Restart Slave If It Stalls
If enabled, the Launcher will try to restart the Slave on the machine if it stalls.
Scripts
Allows you to run general scripts that you can create. Note that these are the same scripts that you can
access from the Scripts menu in the Monitor. Check out the Monitor Scripts documentation for more
information.
Submit
Allows you to submit jobs for different rendering plug-ins. Note that these are the same submission scripts
that you can access from the Submit menu in the Monitor. More information regarding the Monitor submission scripts for each plug-in can be found in the Plug-Ins section of the documentation. You can also
add your own submission scripts to the submission menu. Check out the Monitor Scripts documentation
for more information.
Change Repository
Change the Repository that the client connects to.
Change User
Change the current user on the client.
Change License Server
Change the license server that the Slave connects to.
Explore Log Folder
Opens the Deadline log folder on the machine.

4.1.5 Command Line Options


To run the Launcher from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the deadlinelauncher application. To view all available command
line arguments, you can run the following:
deadlinelauncher -help

Available Options
To start the Monitor with the Launcher, use the -monitor option. If another Launcher is already running, this will tell
the existing Launcher to start the Monitor. If an upgrade is available, this will trigger an automatic upgrade:
deadlinelauncher -monitor

To start the Slave with the Launcher, use the -slave option. If another Launcher is already running, this will tell the
existing Launcher to start the Slave. If an upgrade is available, this will trigger an automatic upgrade:
deadlinelauncher -slave

To start Pulse with the Launcher, use the -pulse option. If another Launcher is already running, this will tell the existing
Launcher to start Pulse. If an upgrade is available, this will trigger an automatic upgrade:

deadlinelauncher -pulse

To start the Balancer with the Launcher, use the -balancer option. If another Launcher is already running, this will tell
the existing Launcher to start the Balancer. If an upgrade is available, this will trigger an automatic upgrade:
deadlinelauncher -balancer

To trigger an automatic upgrade if one is available, use the -upgrade flag:


deadlinelauncher -upgrade

To run the Launcher without a user interface, use the -nogui option. Note that if the Launcher is running in this mode,
if you launch the Slave or Pulse through the Launcher, they will also run without a user interface:
deadlinelauncher -nogui
deadlinelauncher -nogui -slave

To shutdown the Launcher if its already running, use the -shutdown option:
deadlinelauncher -shutdown

To shutdown the Slaves, Pulse, and Balancer on the machine before shutting down the Launcher, use the -shutdownall
option:
deadlinelauncher -shutdownall

4.1.6 Launcher As A Service


When installing the Deadline Client on Windows, you can choose to install the Launcher as a service. If you want
to configure the Launcher to run as a service after the Client has been installed, it is possible to set up the service
manually, which is explained below. However, it's probably easier to simply run the Client installer again and enable
the service option during installation.
There are also some considerations that need to be made when installing the Launcher as a service. See the Windows
Service documentation for more information.
Manually Installing the Launcher Service
You can use Deadline Command along with the following commands to install or uninstall the Launcher service:
-InstallLauncherService [true/false]
    Installs the Deadline Launcher Service, and optionally starts it.
    [true/false]: Whether or not to start the Launcher Service after it has been installed (optional).

-InstallLauncherServiceLogOn [User Name] [Password] [true/false]
    Installs the Deadline Launcher Service with the given account, and optionally starts it.
    [User Name]: The account user name.
    [Password]: The account password.
    [true/false]: Whether or not to start the Launcher Service after it has been installed (optional).

-UninstallLauncherService
    Stops and uninstalls the Deadline Launcher Service.

-StartLauncherService
    Starts the Deadline Launcher Service if it is not already running.

-StopLauncherService
    Stops the Deadline Launcher Service if it is running.

Here is an example command line to install the service:


deadlinecommand.exe -InstallLauncherServiceLogOn "USER" "PASSWORD"

Here is an example command line to uninstall the service:


deadlinecommand.exe -UninstallLauncherService

4.1.7 FAQ
Why should the Launcher application be left running on the client machines?
Its main purpose is to provide a means of remote communication between the Monitor and the Slave
applications. If it's not running, the Slave will have to be stopped and started manually.
In addition, whenever you launch the Monitor or Slave using the Launcher, it will check the Repository
for updates and upgrade itself automatically if necessary before starting the selected application. If the
Launcher is not running, updates will not be detected.
Finally, the Launcher can detect if the Slave running on the machine has stalled, and restart it.
Can I run the Launcher without a user interface?
Yes, you can do this by passing the -nogui command line argument to the Launcher application:
deadlinelauncher -nogui

I have Idle Detection enabled, but the Launcher doesn't start the Slave on Linux when it's been idle long enough.
The libX11 and libXext libraries must be installed on Linux for Idle Detection to work. To check if libX11
and libXext are installed, open a Terminal and run the following commands. If they are installed, then the
path to the libraries will be printed out by these commands.
ldconfig -p | grep libX11
ldconfig -p | grep libXext

If any of these libraries are missing, then please contact your local system administrator to resolve this
issue. Here is an example assuming you have root access, using YUM to install them on your system:

sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext

4.2 Monitor
4.2.1 Overview
The Monitor application offers detailed information and control options for each job and Slave in your farm. It provides
normal users a means of monitoring and controlling their jobs, and it gives administrators options for configuring and
controlling the entire render farm.

If you're launching the Monitor for the first time on your machine, you will be prompted with a Login dialog. Simply choose your user name or create a new one before continuing. Once the Monitor is running, you'll see your user name in the bottom right corner. If this is the wrong user, you can log in as another user by selecting File -> Change User. Note that if your administrator set up Deadline to lock the user to the system's login account, you will have to log off of your system and log back in as the correct user.

4.2.2 Running the Monitor


To start the Monitor:
On Windows, you can start the Monitor from the Start Menu under Thinkbox\Deadline, or from the Launcher's right-click menu.
On Linux, you can start the Monitor from a terminal window by running the deadlinemonitor script in the bin folder, or from the Launcher's right-click menu.
On Mac OS X, you can start the Monitor from Finder by running the DeadlineMonitor application in Applications/Thinkbox/Deadline, or from the Launcher's right-click menu.
The Monitor can also be started from a command prompt or terminal window. For more information, see the Monitor
Command Line documentation.

4.2.3 Panel Features


Information in the Monitor is broken up into different panels, which are described further down. These panels have
many features in common, which are explained here.
Customization
Monitor panels can be created from the View menu, or from the main toolbar. They can be re-sized, docked, or floated
as desired. This allows for a highly customized viewing experience which is adaptable to the needs of different users.

The current layout can be pinned to the Pinned Layouts menu so that it can be restored at a later time. This can be
done from the View menu, or from the main toolbar. The current layout can also be saved to a file from the View
menu, and then loaded from that file later.

When you pin a layout, you can choose to save the location and size of the Monitor by checking the Save Location and Size box when pinning the layout.

To prevent accidental modifications to the current layout, you can lock the layout from the View menu, by pressing
Alt-, or from the main toolbar. When locked, panels cannot be moved, but they can still be docked and undocked.
To dock a floating panel while the layout is locked, simply double-click on the panel's title. It will be docked to the
same location it was originally undocked from.

The columns in Monitor panels are customizable. The columns can be resized by clicking on the column separator line and dragging it, and can be reordered by clicking on a column header and dragging it. Right-clicking on the column headers in a panel allows you to toggle the visibility of each column.

In this menu, you can modify the visibility and ordering of the columns by clicking the Customize... menu item.
Moving columns to the left side list hides them, and the order that columns are listed in the right list corresponds to
the order they will appear in the panel (top->bottom corresponds to left->right). You move the columns around by
clicking the arrow buttons.

Once you have configured your column layout you can pin it.

You can also set the current list layout as the list layout to load by default, when opening new panels of the same type,
by clicking Save Current List Layout As Default. If you want to restore the original default list layout, click the Reset Default List Layout option.
Data Filtering
Almost every panel has a search box that you can use to filter the information you're interested in. You can simply type in the word(s) you are looking for, or use regular expressions for more advanced searching.
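For example, typing render matches any record containing that word, while a regular expression such as shot_0(1|2)\d+ (a hypothetical naming convention) would match only records whose text contains shot_01 or shot_02 followed by more digits.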

In addition, every panel that has a search box also supports a more advanced filtering system. To add a filter to a
panel, select the Edit Filter option in the panel's drop-down menu, which can be found in the upper-right corner of the panel. A window will appear allowing you to specify the name of the filter being created. You can select to match all of
the filters added or any of the filters added. If all must match, only records where the data matches every filter will be shown, while if any can match, a record containing one or more matches will be shown.

Clicking the add filter button generates a new filter. The filter requires a column to be selected, an operation to perform,
and a value to use in the operation. Filters can also be removed by clicking the minus button to the right of each filter.
After all filters are entered, press OK to apply the filter to the current panel.
A filter can be cloned and opened in a new tab within the panel through the Clone Filter option in the panel's drop-down menu. The Clear Filter option can be used to clear all filters from the current panel.
Finally, you can pin the current filters so that they can be restored at a later time using the Pinned Filters sub menu in
the panel drop down menu. Note that the Pin Current Filter option is only available if a filter is currently being applied.
If there are no filters, the Pin Current Filter option will be hidden.
Automatic Sorting and Filtering
Almost every panel has an option to do automatic sorting and filtering when data changes in the panel. When this
option is disabled, sorting and filters must manually be re-applied to ensure that the data is sorted and filtered properly.
Note that automatic sorting and filtering can affect the Monitor's performance if there are lots of jobs (10,000+) or lots
of slaves (1000+) in the farm. To improve Monitor performance in this case, it is recommended to disable automatic
sorting and filtering. There is an option in the Monitor Settings in the Repository Configuration to automatically
disable it by default.
Saving and Loading Panel Layouts
Every list-based panel (Jobs, Slaves, Tasks, etc) has an option to save and load the list layout, which you can find in the panel's drop-down menu. This allows you to save out a list's filters, column order and visibility, etc, and load them
again later or share them with another user.

Note that when loading a list layout, you must choose a layout that was saved from the same type of list. For example,
you cannot save a layout from the Job list and then load it into the Slave list.
Graph Views
Almost every panel supports showing a graphical representation of the data. The graph can be shown by selecting
the Graph View option in the panel's drop-down menu, which can be found in the upper-left corner of the panel. The graph view can be saved as an image file by right-clicking anywhere in the view and selecting Save Graph As Image.

If the graph is a line graph, the following operations are available:


Zoom In: Use the mouse wheel or the UP arrow key to zoom in. You can also click and hold the left mouse
button and drag to select a sub-area of the graph to zoom in.
Zoom Out: Use the mouse wheel or the DOWN arrow key to zoom out.
Reset Zoom: Use the right-click menu to reset the zoom level.
Pan: Use the middle mouse button or the LEFT and RIGHT arrow keys to pan the graph.
Show/Hide Series: If the line graph has a legend, you can use the right-click menu to customize which series
are shown or hidden.

If the graph is a pie chart, you can filter the data from the graph view by holding down the SHIFT key and clicking
on one of the pie slices. The data will be filtered to only show records that are represented by the pie slice that was
clicked on.

Scripts
Almost every panel has the option to run custom scripts from the panel's right-click menu. Many scripts are already
shipped with Deadline, and additional custom scripts can be written. See the Monitor Scripts documentation for more
information.
These script menus can also be customized from the Repository Options.

4.2.4 Information Panels


As mentioned earlier, information in the Monitor is broken up into different panels. These panels can be created from
the View menu, or from the main toolbar. They can be re-sized, docked, or floated as desired. This allows for a highly
customized viewing experience which is adaptable to the needs of different users.
Jobs
The Jobs panel contains a list that shows all jobs in the farm. It also displays useful information about each job such
as its name, user, status, error count, plugin, etc. As jobs change states, their colors will change. Active jobs will appear as green, and will remain green as they continue to render without errors. But if a job starts to accumulate errors, it will turn brown and then eventually red. This allows you to see at a glance which jobs are having problems. For more
information on job monitoring, see the Monitoring Jobs documentation.

The Jobs panel supports standard filtering, but it also has a Quick Filter option in the panel's drop-down menu to make it easier to filter out unwanted jobs. By toggling the options within the Status, User, Pool, Group, and Plugin sections, you can quickly drill down to the jobs you are interested in. There is also an Ego-Centric Sorting option in the panel's drop-down menu which can be used to keep all of your jobs at the top of the job list.

The Jobs panel also supports the ability to group jobs together based on their Batch Name property. All of the job
submitters that are included with Deadline will automatically set the Batch Name if they are submitting multiple jobs
that are related to each other. The Batch Name for a job can be modified in the Job Properties. If you prefer to not
have the jobs grouped together in the job list, you can disable the Group Jobs By Batch Name option in the Monitor
and User Settings.

Finally, the Jobs panel allows jobs to be controlled and modified using the right-click menu. You can also bring up the
Job Properties window by double-clicking on a job. See the Controlling Jobs documentation for more information.
Tasks
The Task panel shows all the tasks for the job that is currently selected. It displays useful information about each task
such as its frame list, status, and if applicable, the Slave that is rendering it.

The Task panel also allows you to control tasks from the right-click menu. See the Controlling Jobs documentation
for more information. In addition, the double-click behavior in the Task panel can be set in the Monitor and User
Settings, which can be accessed from the main toolbar.
Job Details
The Job Details panel shows all available information about the job that is currently selected. The information is split
up into different sections that can be expanded or collapsed as desired.

Job Dependency View


This panel allows you to view and modify a job's dependency tree in a node-based view. You can lock the view to
the currently selected job, which allows you to drag & drop other jobs into the view to hook up new dependencies. In
addition, you can drag & drop Python scripts or asset files directly into the view and hook them up as dependencies.
See the Controlling Jobs documentation for more information.


Job Report
All reports for a job can be viewed in the Job Reports panel. This includes error reports, logs, and task requeue
reports. This panel can also be opened by right-clicking on a job in the Job List and selecting View Job Reports. More
information can be found in the Controlling Jobs documentation.


Slaves
The Slave panel shows all the Slaves that are in your farm. It shows system information about each Slave, as well as
information about the job the slave is currently rendering.

If you see a slave that is colored orange in the list, this means that the slave is unable to get a license or that the license
is about to expire. When the slave cannot get a license, it could be because there is a network issue, the license has
expired, or the license limit has been reached.
If a slave isn't rendering a job that you think it should be, you can use the Job Candidate Filter option in the panel's drop
down menu to try and figure out why. See the Job Candidate Filter section in the Slave Configuration documentation
for more information.


The Slave panel's right-click menu allows you to modify Slave settings and control the Slaves remotely. See the Slave
Configuration documentation for more information.
Slave Reports
All log and error reports for a Slave can be viewed in the Slave Reports panel. This panel can also be opened by
right-clicking on a slave in the Slave List and selecting View Slave Reports.

Pulses
The Pulse panel shows which machine Pulse is running on, as well as previous machines that Pulse has run on. It also
shows system information about each machine.


Balancers
The Balancer panel shows which machines the Balancer is running on. It also shows system information about each
machine.

The Balancer panel's right-click menu allows you to modify Balancer settings and control the Balancer remotely. See
the Balancer Configuration documentation for more information.
Limits
The Limit panel shows all the Limits that are in your farm. You can access many options for the Limits by right-clicking on them. See the Limits and Machine Limits documentation for more information.

Console
The Console panel shows all lines of text that are written to the Monitor's log.


Remote Commands
The Remote Command panel shows all pending and completed remote commands that were sent from the Monitor.
When sending a remote command, if this panel is not already displayed, it will be displayed automatically (assuming
you have permissions to see the Remote Command panel). See the Remote Control documentation for more information.

Cloud
The Cloud panel shows all the instances from the cloud providers that the Monitor is connected to. This panel allows
you to control and close your existing instances. See the Cloud Controls documentation for more information.


4.2.5 Monitor Menu Options


The available options are listed below. They are available in the Monitor's main menu, and some are also available in
the main toolbar. Note that the availability of these options can vary depending on the context in which they are used,
as well as the User Group Permissions that are defined for the current user.
File Menu
Change Repository
Connect to a different repository, or reconnect to the current repository if the Monitor becomes disconnected. There is also a toolbar button for this option.
Change User
Change the current user. You have the choice to select a different user or create a new one. There is also
a toolbar button for this option.
Import Archived Jobs
Opens a file dialog which allows you to select a zip file containing an archived job which you would like
to add back to the monitor. See the Archiving Jobs documentation for more information.
View Menu
Manual Refresh
Forces an immediate refresh of all the data in the Monitor. Manual refreshing is disabled by default, and
can only be enabled in the Monitor Settings in the Repository Configuration.
New Panel
Spawn a new information panel. See the Information Panels section above for more information. There
is also a toolbar button for this option.
Lock Panels
Prevents the panels from being moved. Panels can still be floated, docked, and closed. To dock a floating
panel, double-click on the panel's title. There is also a toolbar button for this option.
Pinned Layouts
You are able to save different Monitor layouts for quick use. By selecting Pin Current Layout, your current
layout will be added to your pinned layouts. Selecting a pinned layout will restore the Monitor's panels to
the pinned layout's state. There is also a toolbar button for this option.
Open Layout
Load a previously saved layout from file.
Save Layout
Saves the current layout to file.
Save All Pinned Layouts
Save all the pinned Monitor layouts to a zip file.
Reset Layout
Reset the current layout to the Monitor's default layout.
Submit and Script Menus
Submission scripts can be found under the Submit menu, and general scripts can be found under the Scripts menu.
Many scripts are already shipped with Deadline, and additional custom scripts can be written. Check out the Monitor
Scripts documentation for more information.
Tools Menu
Super User
Enter Super User Mode, which allows you to access the administrative Monitor options. Super User
mode can be password protected simply by specifying a password in the Access Control section of the
Repository Configuration.
View Repository History
View all repository history entries generated on the farm.
View Power Management History
View all power management history entries on the farm. See the Power Management documentation for
more information.
View Farm Reports
View various statistical information about the farm. See the Farm Statistics documentation for more information.
Manage Pools
Add or remove Pools, and configure which Pools are assigned to the Slaves. See the Pools and Groups
documentation for more information.
Manage Groups
Add or remove Groups, and configure which Groups are assigned to the Slaves. See the Pools and Groups
documentation for more information.
Manage Users
Add or remove users, and set user information. See the User Management documentation for more
information.
Manage User Groups
Add or remove a user group, and set user group permissions to control which features are accessible. See
the User Management documentation for more information.
Configure Repository Settings
Configure a wide range of global settings. See the Repository Configuration documentation for more
information.
Configure Slave Scheduling
Configure the slave scheduling options. See the Slave Scheduling documentation for more information.
Configure Power Management Options
Configure the Power Management settings. See the Power Management documentation for more information.
Configure Cloud Providers
Set up and enable cloud service providers. See the Cloud Controls documentation for more information.
Configure Plugins
Configure the available render plugins, such as 3ds Max, After Effects, Maya, and Nuke. See the plugin
documentation for more information on the configurable settings for each plugin.
Configure Event Plugins
Configure the available event plugins such as Draft and Shotgun. See the event plugin documentation for
more information on the configurable settings for each plugin.
Connect to Pulse Log
Use this to remotely connect to the Pulse log. See the Remote Control documentation for more information.
Perform Pending Jobs Scan
Performs a scan of pending jobs and determines if any should be released. This operation is normally
performed automatically, but you can force an immediate scan with this option if desired.
Perform House Cleaning
Clean up files for deleted jobs, check for stalled slaves, etc. This operation is normally performed automatically, but you can force an immediate clean up with this option if desired.
Undelete Jobs
Use this to recover any deleted jobs that haven't been purged from the database yet.
Explore Repository Root
View the root directory of the current Repository.
Import Settings
Import settings from another Repository. See the Importing Repository Settings documentation for more
information.
Synchronize Scripts and Plugin Icons
Rebuilds the script-specific menus, and updates your local plugin icon cache with the icons that are currently in the Repository. Note that if any new icons are copied over, you will have to restart the Monitor
before the jobs in the list show the new icons.
Local Slave Controls
Opens the Local Slave Controls window, which allows you to control and configure the Slave that runs on
your machine.
Options
Modify the Monitor and User Settings. There is also a toolbar button for this option.


4.2.6 Command Line Options


To run the Monitor from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the deadlinemonitor application. To view all available command
line arguments, you can run the following:
deadlinemonitor -help

Available Options
To start a new Monitor if there is already another Monitor running, use the -new option:
deadlinemonitor -new

To start the Monitor connected to a different repository, use the -repository option. You can combine this with the
-new option to have different Monitors connected to different repositories:
deadlinemonitor -repository "\\repository\path"
deadlinemonitor -new -repository "\\repository\path"

To start the Monitor without the splash screen, use the -nosplash option:
deadlinemonitor -nosplash

To shut down the Monitor if it's already running, use the -shutdown option:
deadlinemonitor -shutdown

You can also set all of the Monitor Options using command line options. For example:
deadlinemonitor -draganddropdep True -groupjobbatches False

4.2.7 FAQ
I'm unable to move panels in the Monitor, or dock floating panels.
You need to unlock the Monitor layout. This can be done from the View menu or from the toolbar.
Can I dock a floating panel when the Monitor layout is locked?
Yes, you can dock the floating panel by double-clicking on its title bar. It will be docked to its previous
location, or to the bottom of the Monitor if it wasn't docked previously.
What does it mean when a Slave is orange in the Slave list?
This means that the Slave is currently unable to get a license.


4.3 Slave
4.3.1 Overview
The Slave is the application that controls the rendering applications, and should be running on any machine you want
to include in the rendering process.


4.3.2 Running the Slave


To start the Slave:
On Windows, you can start the Slave from the Start Menu under Thinkbox\Deadline, or from the Launcher's
right-click menu.
On Linux, you can start the Slave from a terminal window by running the deadlineslave script in the bin folder,
or from the Launcher's right-click menu.
On Mac OS X, you can start the Slave from Finder by running the DeadlineSlave application in Applications/Thinkbox/Deadline, or from the Launcher's right-click menu.
You can also configure the Slave to launch automatically when the Launcher starts up. To enable this, just enable the
Launch Slave At Startup option in the Launcher menu.
The Slave can also be started from a command prompt or terminal window. For more information, see the Slave
Command Line documentation.

4.3.3 Licensing
The Slave requires a license to run, and more information on setting up licensing can be found in the Licensing Guide.
The Slave only requires a license while rendering. If a Slave cannot get a license, it will continue to run, but it won't
be able to pick up jobs for rendering. In addition, when a Slave becomes idle, it will return its license. The Slave's
licensing information can be found under the Slave Information tab (see next section).
If you have more than one Slave running on a machine, they will all share the same license.

4.3.4 Job and Slave Information Tabs


The Job Information tab shows information about the job currently being rendered. By default, the tab will show
information about all render threads combined, but the drop down control gives the option to show information about
a specific render thread. The Slave Information tab shows information about the Slave and the machine that it's running
on, including license information and resource usage (CPU and memory).


4.3.5 Viewing the Slave Log


To view the Slave's current log, simply press the Open Slave Log button at the bottom of the Slave window. This will
open the Slave's log in a new window to avoid impacting the performance of the main Slave application.

If the Slave is running in the background or without an interface, you can connect to the Slave's log from the command
line. In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the
Resources folder (Mac OS X) and run the following, where SLAVENAME is the name of the Slave you want to
connect to:
deadlinecommand -ConnectToSlaveLog "SLAVENAME"

4.3.6 Slave Menu Options


The available options are listed below. They are available in the Slave's window, or from the Slave system tray icon's
right-click menu. Note that if the Slave is running in the background or without an interface, these options will be
unavailable.

File Menu
Change License Server
Change the license server that the Slave connects to.
Options Menu
Hide When Minimized
The Slave is hidden when minimized, but can be restored using the Slave icon in the system tray.
Minimize On Startup
Starts the Slave in the minimized state.
Control Menu
Search For Jobs
If the Slave is sitting idle, this option can be used to force the slave to search for a job immediately.
Cancel Current Task
If the Slave is currently rendering a task, this forces the slave to cancel it.
Continue Running After Current Task Completion
Check to keep the Slave application running after it finishes its current task.
Stop/Restart Slave After Current Task Completion
Check to stop or restart the Slave application after it finishes its current task.
Shutdown/Restart Machine After Current Task Completion
Check to shut down or restart the machine after the Deadline Slave finishes its current task.

4.3.7 Command Line Options


To run the Slave from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the deadlineslave application. To view all available command
line arguments, you can run the following:
deadlineslave -help

Available Options
To start a new instance of the Slave, use the -name option. If you already have multiple instances of the Slave
configured, use the -name option to start a specific instance:
deadlineslave -name "second-slave"

To start the Slave without a user interface, use the -nogui option:


deadlineslave -nogui

To start the Slave without the splash screen, use the -nosplash option:
deadlineslave -nosplash

To shut down the Slave if it's already running, use the -shutdown option. This can be combined with the -name option
if you have more than one Slave instance running and you want to shut down a specific instance:
deadlineslave -shutdown
deadlineslave -shutdown -name "second-slave"

To control what a running Slave should do after it finishes rendering its current task, use the -aftertask option. The
available options are Continue, StopSlave, RestartSlave, ShutdownMachine, or RestartMachine. This can be combined
with the -name option if you have more than one Slave instance running and you want to control a specific instance:
deadlineslave -aftertask RestartSlave
deadlineslave -aftertask RestartMachine -name "second-slave"

4.3.8 FAQ
Can I run the Slave on an artist's workstation?
Yes. On Windows and Linux, you can set the Affinity in the Slave Settings to help reduce the impact that
the renders have on the artist's workstation.
Can I run the Slave as a service or daemon?
Yes. If you're running the Launcher as a service or daemon, then it will run the Slave in the background
as well. See the Client Installation documentation for more information.
The Slave keeps reporting errors for the same job instead of moving on to a different job. What can I do?
You can enable Bad Slave Detection in the Repository Configuration to have a slave mark itself as bad for
a job when it reports consecutive errors on it.
What does it mean when a Slave is stalled, and is this a bad thing?
Slaves become stalled when they don't update their status for a long period of time, which is often an
indication that the slave has crashed. A stalled slave isn't necessarily a bad thing, because it's possible the
slave just wasn't shut down properly (it was killed from the Task Manager, for example). In either case,
it's a good idea to check the slave machine and restart the slave application if necessary.
On Linux, the Slave is reporting that the operating system is simply Linux, instead of showing the actual
Linux distribution.
In order for the Slave to report the Linux distribution properly, you need to have lsb installed, and
lsb_release needs to be in the path. You can use any package management application to install lsb.
On Linux, the Slave crashes shortly after starting up.
The libX11 and libXext libraries must be installed on Linux for the Slave to run, even if running it with
the -nogui flag. To check if libX11 and libXext are installed, open a Terminal and run the following
commands. If they are installed, then the path to the libraries will be printed out by these commands.


ldconfig -p | grep libX11


ldconfig -p | grep libXext

If any of these libraries are missing, then please contact your local system administrator to resolve this
issue. Here is an example assuming you have root access, using YUM to install them on your system:
sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext

4.4 Pulse
4.4.1 Overview
Pulse is an optional mini server application that performs maintenance operations on the farm, and manages more
advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web
Service. If you choose to run Pulse, it only needs to be running on one machine. Note that Pulse does not play a role
in job scheduling, so if you are running Pulse and it goes down, Deadline will still be fully operational (minus the
advanced features). Note that to build in redundancy in case the Primary Pulse fails in your environment, consider
protecting yourself by configuring Pulse Redundancy.


If you are choosing a machine to run Pulse, you should be aware that non-Server editions of Windows have a TCP/IP
connection limitation of 10 new connections per second. If your render farm consists of more than 10 render nodes,
it is very likely that you'll hit this limitation every now and then (and the odds continue to increase as the number of
machines increases). This is a limitation of the operating system, and isn't something that we can work around, so we
recommend using a Server edition of Windows, or a different operating system like Linux.


4.4.2 Running Pulse


To start Pulse:
On Windows, you can start Pulse from the Start Menu under Thinkbox\Deadline.
On Linux, you can start Pulse from a terminal window by running the deadlinepulse script in the bin folder.
On Mac OS X, you can start Pulse from Finder by running the DeadlinePulse application in Applications/Thinkbox/Deadline.
You can configure Pulse to launch automatically when the Launcher starts up (similar to how the Slave does this). This
can be done by adding LaunchPulseAtStartup=True to the system's deadline.ini file. See the Client Configuration
documentation for more information.
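For example, the relevant entry in deadline.ini might look like the following. Only the LaunchPulseAtStartup line comes
from the description above; the [Deadline] section header is shown as an assumption about the file's layout, and the
location of deadline.ini depends on your installation (see the Client Configuration documentation):
[Deadline]
LaunchPulseAtStartup=True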
Pulse can also be started from a command prompt or terminal window. For more information, see the Pulse Command
Line documentation.

4.4.3 Viewing the Pulse Log


To view Pulse's current log, simply press the Open Pulse Log button at the bottom of the Pulse window. This will
open the Pulse log in a new window to avoid impacting the performance of the main Pulse application.

If Pulse is running in the background or without an interface, you can connect to the Pulse log from the command line.
In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources
folder (Mac OS X) and run the following, where PULSENAME is the name of the Pulse you want to connect to:
deadlinecommand -ConnectToPulseLog "PULSENAME"

4.4.4 Configuring Pulse


Pulse needs to be configured so that the Slave applications know how to connect to Pulse. This is necessary for the
Slave Throttling feature to function properly. There are a couple different ways to configure Pulse, which are described
below.
Auto Configuration
If you launch Pulse, and a Primary Pulse hasn't been set yet, it will automatically configure itself to be the Primary,
and configure itself to be connected to by its host name. These settings can be changed from the Pulse Panel in the
Monitor at any time. See the Pulse Configuration documentation for more information.
If Pulse has already been configured, but you want to quickly switch to another machine to run Pulse on, simply launch
Pulse on the desired machine. Then when it appears in the Pulse list in the Monitor, right-click on it and select Auto
Configure Pulse. Generally, this feature is only available in Super User mode.

Manual Configuration
The connection settings, as well as additional settings, can be configured for Pulse from the Monitor. Advanced
features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web Service can
also be configured in the Monitor. See the Pulse Configuration documentation for more information.

4.4.5 Pulse Menu Options


The available options are listed below. They are available in Pulse's window, or from the Pulse system tray icon's right-click menu. Note that if Pulse is running in the background or without an interface, these options will be unavailable.


Options Menu
Hide When Minimized
Pulse is hidden when minimized, but can be restored using the Pulse icon in the system tray.
Minimize On Startup
Starts Pulse in the minimized state.
Control Menu
Perform Pending Job Scan
If Pulse is between repository pending job scans, this option can be used to force Pulse to perform a
pending job scan immediately. A pending job scan releases pending jobs by checking their dependencies
or scheduling options.
Perform Repository Clean-up
If Pulse is between repository clean-ups, this option can be used to force Pulse to perform a repository
clean-up immediately. A repository clean-up includes deleting jobs that are marked for automatic deletion.
Perform Repository Repair
If Pulse is between repository repairs, this option can be used to force Pulse to perform a repository repair
immediately. A repository repair includes checking for stalled slaves and orphaned limit stubs.
Perform Power Management Check
If Pulse is between power management checks, this option can be used to force Pulse to perform a power
management check immediately.

4.4.6 Command Line Options


To run Pulse from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux)
or the Resources folder (Mac OS X) and run the deadlinepulse application. To view all available command line
arguments, you can run the following:
deadlinepulse -help

Available Options
To start Pulse without a user interface, use the -nogui option:
deadlinepulse -nogui

To start Pulse without the splash screen, use the -nosplash option:
deadlinepulse -nosplash

To shut down Pulse if it's already running, use the -shutdown option:
deadlinepulse -shutdown


4.4.7 FAQ
Does Pulse use any license?
No. It is an unlicensed product and is included in the Deadline Client software installer.
Can I run Pulse on any machine in my farm?
You can run Pulse on any machine in your farm, including the Repository or Database machine. However,
for larger farms, we recommend running Pulse on a dedicated machine.
When choosing a machine to run Pulse on, you should be aware that non-Server editions of Windows
have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of
more than 100 machines, it is very likely that you'll hit this limitation every now and then (and the odds
continue to increase as the number of machines increases). Therefore, if you are running Pulse on a farm
with 100 machines or more, we recommend using a Server edition of Windows, or a different operating
system like Linux.
Can I run Pulse as a service or daemon?
Yes. If you're running the Launcher as a service or daemon, then it will run Pulse in the background as
well. See the Client Installation documentation for more information.
If Pulse is shut down or terminated, is the Power Management feature still functional?
In this case, the only aspect of Power Management that is still functional is the Temperature Checking.
Redundancy for Temperature checking has been built into the Slave application, so if Pulse isn't running,
you're still protected if the temperature in your farm room begins to rise.
Which temperature sensors work with Power Management?
We have tested with many different temperature sensors. Basically, as long as the temperature sensors use
SNMP, and you know its OID (which is configurable in the Power Management settings), it should work.
Can I run multiple Pulses on separate machines?
Yes, and like typical IT best practices, this will provide Pulse Redundancy. Note that only one Pulse can be
Primary at any given time.

4.5 Balancer
4.5.1 Overview
The Balancer is a cloud controller application capable of virtual/physical, private/public, remote/local simultaneous
machine orchestration. It can create, start, stop, and terminate cloud instances based on the current queue load, taking
into account jobs and tasks. Further customization to take into account other job/task factors can be achieved by
utilizing the Deadline plugin API to create a custom Balancer algorithm. Note that to build in redundancy in case the
Primary Balancer fails in your environment, consider protecting yourself by configuring Balancer Redundancy.


The Balancer works in cycles, and each cycle consists of a number of stages.
First, the Balancer will do a House Keeping step in which it will clean up any disks or instances that haven't
been terminated as they were supposed to be.
Second, the Balancer will execute the Balancer Algorithm. These are the steps of the default algorithm (note
that these steps can be customized with your own Balancer Algorithm plugin):
Create State Structure: This sets up the data structures used in the rest of the algorithm.
Compute Demand: Examines the groups for jobs that are queued and assigns a weighting to the group
based on the number of tasks that need to be done and the group priority.
Determine Resources: Here we determine how much space we have available with our provider and how
many limits we have.
Compute Targets: Based on the Demand and the available Resources we set a target number of instances
for each group.
Populate Targets: This sets up a full target data structure for use in Deadline.

Third, the Balancer will equalize the targets by starting or terminating instances.

4.5.2 Running the Balancer


To start the Balancer:
On Windows, you can start the Balancer from the Start Menu under Thinkbox\Deadline.
On Linux, you can start the Balancer from a terminal window by running the deadlinebalancer script in the bin
folder.
On Mac OS X, you can start the Balancer from Finder by running the DeadlineBalancer application in Applications/Thinkbox/Deadline.
You can configure the Balancer to launch automatically when the Launcher starts up (similar to how the Slave does
this). This can be done by adding LaunchBalancerAtStartup=True to the system's deadline.ini file. See the Client
Configuration documentation for more information.
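For example, assuming your deadline.ini follows the layout described in the Client Configuration documentation, the
line added to the file would simply be:
LaunchBalancerAtStartup=True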
The Balancer can also be started from a command prompt or terminal window. For more information, see the Balancer
Command Line documentation.

4.5.3 Viewing the Balancer Log


To view the Balancer's current log, simply press the Open Balancer Log button at the bottom of the Balancer window.
This will open the Balancer log in a new window to avoid impacting the performance of the main Balancer application.


If the Balancer is running in the background or without an interface, you can connect to the Balancer log from the
command line. In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux)
or the Resources folder (Mac OS X) and run the following, where BALANCERNAME is the name of the Balancer
you want to connect to:
deadlinecommand -ConnectToBalancerLog "BALANCERNAME"

4.5.4 Configuring the Balancer


The Balancer needs to be configured before it can do anything. See the Balancer Configuration documentation for
more information.


4.5.5 Balancer Menu Options


The available options are listed below. They are available in the Balancer's window, or from the Balancer system tray
icon's right-click menu. Note that if the Balancer is running in the background or without an interface, these options
will be unavailable.
Options Menu
Hide When Minimized
The Balancer is hidden when minimized, but can be restored using the Balancer icon in the system
tray.
Minimize On Startup
Starts the Balancer in the minimized state.
Control Menu
Perform Balancing
If the Balancer is between balancing cycles, this option forces the Balancer to perform a balancing cycle
immediately. A balancing cycle looks at tasks, groups, limits and cloud regions to determine if it should
create or terminate cloud instances.

4.5.6 Command Line Options


To run the Balancer from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the deadlinebalancer application. To view all available command
line arguments, you can run the following:
deadlinebalancer -help

Available Options
To start the Balancer without a user interface, use the -nogui option:
deadlinebalancer -nogui

To start the Balancer without the splash screen, use the -nosplash option:
deadlinebalancer -nosplash

To shut down the Balancer if it's already running, use the -shutdown option:
deadlinebalancer -shutdown


4.5.7 FAQ
Can I run Balancer on any machine in my farm?
You can run Balancer on any machine in your farm, including the Repository or Database machine.
However, for larger farms, we recommend running Balancer on a dedicated machine.
When choosing a machine to run Balancer on, you should choose a machine which has the correct routable
network access to your local render farm, as well as external access to any public/private connections via
technologies such as VPN.
Can I run Balancer as a service or daemon?
Yes. If you're running the Launcher as a service or daemon, then it will run Balancer in the background
as well. See the Client Installation documentation for more information.
Can I run multiple Balancers on separate machines?
Yes, and like typical IT best practices, this will provide Balancer Redundancy. Note that only one Balancer
can be Primary at any given time, and this is the machine which will check out a Flexlm-based Balancer
license.
Does Balancer use a Deadline Slave license?
No. The Primary Balancer will check out a Balancer-specific license, which is included for all customers who are
currently on Thinkbox annual support for Deadline. The Draft and Balancer licenses will be renewed for
another 12 months as you renew your annual Thinkbox Deadline support contract. Please email Deadline
Sales for further details.

4.6 Command
4.6.1 Overview
The deadlinecommand application is a command line tool for the Deadline render farm management system. It can be
used to control, query, and submit jobs to the farm.
There is also a deadlinecommandbg application which is identical to deadlinecommand, except that it is executed in
the background. When using deadlinecommandbg, the output and exit code are written to the Deadline temp folder
as dsubmitoutput.txt and dsubmitexitcode.txt respectively. If you want to control where these files get written to, you
can use the -outputFiles option, followed by the paths to the output and exit code file names. For example:
deadlinecommandbg -outputFiles c:\output.txt c:\exitcode.txt -pools

You can find the deadlinecommand and deadlinecommandbg applications in the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X).

4.6.2 Command Line Options


The supported command line options and their usage instructions can be printed out by running deadlinecommand
from a command prompt or terminal with the -help argument.
deadlinecommand -help

To get usage information for a specific command, specify the command name after the -help argument:


deadlinecommand -help SubmitCommandLineJob

4.6.3 Long Command Lines


Some operating systems have a limit on the number of characters that an individual command line can consist of, which
can cause problems if you are using deadlinecommand with a large number of command line options. To work around
this issue, you can create a text file with one command line option per line, and pass that file as the only argument to
deadlinecommand or deadlinecommandbg. For example, you can create a file called args.txt that looks like this:
-SubmitMultipleJobs
-dependent
-job
\\path\to\job_1_info_file.txt
\\path\to\job_1_plugin_file.txt
-job
\\path\to\job_2_info_file.txt
\\path\to\job_2_plugin_file.txt
-job
\\path\to\job_3_info_file.txt
\\path\to\job_3_plugin_file.txt

You would then pass it to deadlinecommand like this:


deadlinecommand args.txt

4.6.4 Usage Examples


Submitting a Job
To submit a 3dsmax scene (e.g. C:\MyScene.max), you must first create a job submission info file (e.g. C:\job_info.job)
and a 3dsmax plugin info file (e.g. C:\max_info.job). See the Manual Job Submission documentation for more information.
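As a rough sketch, both files are plain text files containing key=value pairs. The keys and values below are illustrative
assumptions only; the actual keys required for the job info file and the 3dsmax plugin info file are listed in the Manual
Job Submission documentation:
C:\job_info.job (sketch):
Plugin=3dsmax
Name=My 3dsmax Job
Frames=0-100
Priority=50
C:\max_info.job (sketch):
Version=2015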
Once the files are created, you can submit the job using this command:
deadlinecommand "C:\job_info.job" "C:\max_info.job" "C:\MyScene.max"

Querying For Jobs Using Filters


To query for all jobs that belong to jsmith or cdavis:
deadlinecommand -getjobsfilter username=jsmith username=cdavis

To query for all of jsmith's jobs with completed status:


deadlinecommand -getjobsfilterand username=jsmith status=completed


Checking Which Slaves Are Assigned To A Specific Pool


To check which slaves are assigned to the 3dsmax pool:
deadlinecommand -getslavenamesinpool 3dsmax Assigned

To check which slaves are excluded from the xsi pool:


deadlinecommand -getslavenamesinpool Xsi Excluded

Querying For Task Information


To query for task information for the job with the ID of 546cc87357dbb04344a5c6b5:
deadlinecommand -getjobtasks 546cc87357dbb04344a5c6b5

Retrieving and Changing Job Status


To retrieve the status of the job with the ID of 546cc87357dbb04344a5c6b5:
deadlinecommand -getjob 546cc87357dbb04344a5c6b5

To retrieve all of the job's details:


deadlinecommand -getjobdetails 546cc87357dbb04344a5c6b5

To suspend the job with the ID of 546cc87357dbb04344a5c6b5:


deadlinecommand -suspendjob 546cc87357dbb04344a5c6b5
deadlinecommand -suspendjobnonrenderingtasks 546cc87357dbb04344a5c6b5

To resume the job:


deadlinecommand -resumejob 546cc87357dbb04344a5c6b5

To requeue the job:


deadlinecommand -requeuejob 546cc87357dbb04344a5c6b5

To delete the job:


deadlinecommand -deletejob 546cc87357dbb04344a5c6b5

To archive the job:


deadlinecommand -archivejob 546cc87357dbb04344a5c6b5


Sending An Email
To send a message to jsmith@mycompany.com (cc cjones@mycompany.com):
deadlinecommand -sendemail -to jsmith@mycompany.com -cc cjones@mycompany.com
-subject "the subject" -message "C:\MyMessage.html"

To send the same message with the attachment C:\MyAttachment.txt:


deadlinecommand -sendemail -to jsmith@mycompany.com -cc cjones@mycompany.com
-subject "the subject" -message "C:\MyMessage.html" -attach "C:\MyAttachment.txt"

Note that the -to, -subject, and -message options are required. The other two options are optional.
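For example, a minimal call using only the required options might look like this:
deadlinecommand -sendemail -to jsmith@mycompany.com -subject "the subject" -message "C:\MyMessage.html"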

4.6.5 FAQ
What's the difference between the deadlinecommand and deadlinecommandbg applications?
The deadlinecommandbg application is identical to deadlinecommand, except that it is executed in the
background. When using deadlinecommandbg, the exit code and output are written to the Deadline temp
directory as dsubmitexitcode.txt and dsubmitoutput.txt respectively.

4.7 Web Service


4.7.1 Overview
The deadlinewebservice application is a command line application for the Deadline render farm management system.
It allows you to query information from Deadline over an Internet connection, which you can view with the Mobile
application, or you can write custom Web Service Scripts to display this information in a manner of your choice, such
as a web page.
You can find the deadlinewebservice application in the Deadline bin folder (Windows or Linux) or the Resources
folder (Mac OS X).
The Pulse application also has the web service built into it, so if you are already running Pulse, you can just connect to
it directly instead of running the standalone deadlinewebservice application. That being said, there are a few benefits
to running the standalone deadlinewebservice application if you are already running Pulse:
If you make heavy use of the web service, it won't impact Pulse's performance.
You can run multiple instances of the standalone deadlinewebservice application on different machines.
Migrating the web service to another machine doesn't require you to migrate Pulse as well.
If you would like to use Pulse's web service feature, you must enable it in Pulse, which can be done from the Web
Service tab in the Pulse Settings in the Repository Configuration. Note that if you enable or disable the web service
feature while Pulse is running, you must restart Pulse for the changes to take effect.


4.7.2 Setup
Before you can use the web service, you need to configure the general Web Service settings in the Repository Configuration. These settings apply to both the standalone deadlinewebservice application, and Pulse's web service feature.


4.7.3 RESTful HTTP API


The RESTful API in the web service can be used to request information from the database, store new data, alter
existing data or remove entries from the database.
See the REST Overview documentation for more information.
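For example, assuming the web service is listening on its default port of 8080, job data could be requested with a plain
HTTP GET using a tool like curl. The /api/jobs endpoint and the JobID parameter shown here are illustrative
assumptions; the actual endpoints and their parameters are described in the REST Overview documentation:
curl http://[myhost]:8080/api/jobs
curl "http://[myhost]:8080/api/jobs?JobID=546cc87357dbb04344a5c6b5"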

4.7.4 Additional Web Service Functionality


This additional web service functionality is still supported, but is now deprecated in favor of the new RESTful HTTP
API.
Connecting to the Web Service
You can connect to the web service using a URL containing the host name or IP address of the machine that is hosting
the web service application, as well as the port, which we will assume to be 8080 for now (this can be configured in
the Web Service Settings). Note that if port 8080 is being blocked by a firewall, the web service will not be able to
accept web requests. An example URL will look like the following:
http://[myhost]:8080/[command][arguments]

Where:
myhost is your web service server's IP address or host name.
command is the command you want to execute. The web service can support two different types of commands,
which are explained below.
arguments represents the arguments being passed to the command. This can be optional, and depends on the
command.
To confirm that you can at least connect to the web service, try the following URL.
http://[myhost]:8080/

You should see the following if you connect to the web service successfully:
This is the Deadline web service!

Windows Namespace Reservation


If the web service is running on Windows, you may also need to add a namespace reservation for the current user
that the web service is running under, so that it can reserve namespaces for the URL connection. See the Configuring
Namespace Reservations section in this MSDN Article for more information.
Note that by default, the web service listens on http://*:8080/, so make sure you set the port number correctly in the
URL you use when reserving the namespace. For example:
netsh http add urlacl url=http://*:8080/ user=USERNAME

Ensure you have correctly elevated permissions when executing the above in a command prompt and replace USERNAME with the appropriate %USERNAME% that the web service is running under. Depending on your local security
policy, the user account may need to have local administrator rights temporarily for you to initially reserve the namespace. The namespace reservation will also need updating if you ever modify the port number or user account used.
Use the following command in a command prompt to help list what namespace reservations are currently present on
your machine:
netsh http show urlacl

Running Commands
The first set of commands are the same commands that you can use with the Command application. However, these
commands are disabled by default. To enable them, you need to enable the Allow Non-Script Commands setting in the
Web Service settings. If left disabled, you will see the following results when trying to call one of these commands:
Error - Non-Script commands are disabled.

Here is an example of how you would use the web service to call the -GetSlaveNames command:


http://[myhost]:8080/GetSlaveNames

Here is an example of the results that would be displayed:


Jupiter
Rnd-vista
Slave-29
Monkeypantswin7
Electron.franticfilms.com
Test3
Monkeypants
Slave-27d
Proton.franticfilms.com
Atom.franticfilms.com
Rnd-suse
Opensuse-64
Pathos
Neutron.franticfilms.com

Some commands can take arguments. To include arguments, you need to place a ? between the command name and
the first argument, and then a & between additional arguments. Here is an example of how you would use the web
service to call the -GetSlaveNamesInPool command, and pass it two pools as arguments:
http://[myhost]:8080/GetSlaveNamesInPool?show_a&show_b

Here is an example of the results that would be displayed:


Monkeypants
Pathos

Calling Python Scripts


The second set of commands are actually Python scripts that you can create in the Repository. These scripts use Pulse's
Python API to get data, and then return the data in a readable manner. So basically, you can create scripts to access
any type of data and display it in any way you want. See the Web Service Scripts documentation for more information
on how to create these scripts.
Once a script has been created, you can call it by using the name of the script, without the .py extension. For example,
if you have a web service script called GetFarmStatistics.py, you would call it using:
http://[myhost]:8080/GetFarmStatistics

Some scripts can take arguments. To include arguments, you need to place a ? between the command name and the
first argument, and then a & between additional arguments. Here is an example of how you would pass arg1, arg2,
and arg3 as separate arguments to the GetFarmStatistics.py script:
http://[myhost]:8080/GetFarmStatistics?arg1&arg2&arg3

The way the results are displayed depends on the format in which they are returned. Again, see the Web Service
Scripting documentation for more information.


4.8 Mobile
4.8.1 Overview
The Mobile application allows you to monitor your jobs from anywhere. The application connects to the Deadline
web service to download information about the state of your jobs, so the web service must be running before you can
use the Mobile application. See the Web Service documentation for more information.
The minimum requirements for the Mobile application are as follows:
Android: Deadline 5.0 and Android 2.1
iPhone or iPad: Deadline 4.1 and iPhone OS 3.0 - 7.10
Windows Phone: Deadline 5.0 and Windows Phone 7.0

4.8.2 Mobile Setup


When you launch the Mobile application for the first time, you will need to configure it so that it can connect to your
Deadline web service. Just press the Settings button in the top left corner. The important settings are the Deadline
User settings and the Pulse Server settings. For Mobile to connect to the web service, you must provide the following
information:
Deadline User Settings -> User Name: This is the Deadline user that you normally submit render
jobs from.
Deadline User Settings -> Password: If the web service has been configured to require authentication, and
empty passwords are not allowed, you must enter your user password here. This is the password that you
specify in your User Settings in the Monitor. See the User Settings documentation for more information.
Pulse Server Settings -> Server Name: This is the host name or IP address of the server machine that is running
the web service.
Pulse Server Settings -> Server Port: The default is 8080, and should only be changed if the web service has
been configured to listen on a different port.
Note that the Pulse Server Settings can be used to connect to a Pulse instance if the web service feature is enabled,
or it can be used to connect to the standalone web service application. See the Web Service documentation for more
information.
After you have configured your Server and User settings, press the Job List button to return and press the Refresh
button to connect to the web service and load the job list. If you get an error when Mobile attempts to contact the web
service, see the Troubleshooting section for known errors and solutions.

4.8.3 Job List


The job list is the main screen, and by default it shows all the jobs in the repository. See the Settings section below for
information on how to sort and filter this list. You can also use the search field to search for specific jobs.


To refresh the job list, just press the Refresh button. If you want to see more information about a specific job, press
the button to the right of the job name to bring up the job details panel.

4.8.4 Job Details


The job details panel shows additional information for a specific job. In this view, you can see most of the information
you could normally see in the Monitor.

To refresh the job details, just press the Refresh Job button. To return to the job list, press the Job List button in the
upper left corner.


4.8.5 Settings
The settings panel can be accessed from the job list by pressing the Settings button. You can access the online help by
pressing the Help button in the top right corner (Android) or by scrolling down to find the Online Help link (iPhone).
To return to the job list, press the Job List button in the upper left corner.

Auto Refresh Settings


Job List: If enabled, the job list will automatically refresh itself at the interval defined in Job List Interval.
Job Details: If enabled, the job's details will automatically refresh themselves at the interval defined in Job Details
Interval.
Job List Filter Settings
Configure filters to only show the jobs that you're interested in.
Job List Sort Settings
Ego-centric Sort: If enabled, all of your jobs will appear at the top of the job list, followed by the remaining
jobs.
Primary Sort: Set the primary sort field and order for the job list.
Secondary Sort: Set the secondary sort field and order for the job list.
Deadline User Settings
User Name: Your Deadline user name. This is the user that you normally submit render jobs under.
Password: If the web service has been configured to require authentication, and empty passwords are not
allowed, you must enter your user password here. This is the password that you specify in your User Settings in
the Monitor. See the User Settings documentation for more information.
Pulse Server Settings
Server Name: This is the host name or IP address of the server machine that is running the web service.
Server Port: The default is 8080, and should only be changed if the web service has been configured to listen
on a different port.


Note that the Pulse Server Settings can be used to connect to a Pulse instance if the web service feature is enabled,
or it can be used to connect to the standalone web service application. See the Web Service documentation for more
information.
Proxy Server Settings
Server URL: If you are using a proxy web server, you may need to set a more specific URL to connect to the
web service.
Http Authorization: If your proxy web server requires HTTP authorization, you should enable this option and
specify the user name and password.
SSL: If you are using a proxy web server that requires SSL, you should enable this option. Note that this will
change the server port in the Pulse Server Settings to 443 by default.
Download Information
This is a running tally of the data that you've downloaded from the web service.

4.8.6 Proxy Server


Depending on the security restrictions of your studio, you may wish to set up a proxy server that acts as a middleman
between Mobile and the web service. You can run the proxy server on a different machine, and configure it to require
authentication, use SSL, etc.
We have example scripts that you can start with by downloading the Pulse Proxy Script For Deadline Mobile file from
the Miscellaneous Deadline Downloads Page.
Place these scripts into a cgi script executable folder. For Apache, the default is the cgi-bin directory, but different
folders can be configured as script folders.
Once the scripts are in the folder, running them should yield a 403: Not authorized error until the script has been
configured.
The proxy scripts have been written to assume that the root web directory will be where the scripts will be run. Because
of this, if they are placed into the cgi-bin folder you must prepend \cgi-bin\ to the URI regular expression test in the
scripts. Note that all slashes and regular expression special characters must be escaped (hence the double slash).
Common pitfalls with this are forgetting to mark the scripts as executable on unix-based systems (use chmod og+x
Mobile_GetJob* to mark them executable), and forgetting to set the owner and group to be the same as the web server
runs as (use chown www:www Mobile_GetJob* on most systems).
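For convenience, the two fixes mentioned above would typically be run from the cgi script folder like this (the www:www
owner and group are only an example; use whatever user and group your web server actually runs as):
chmod og+x Mobile_GetJob*
chown www:www Mobile_GetJob*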
Note that we provide these scripts as is, and we don't officially support them. However, if you are having difficulties,
contact Deadline Support and we'll do what we can to help.

4.8.7 Troubleshooting
These are some known Mobile errors and solutions.
You must provide a password for authentication
This error occurs when a password has not been set for the current user while authentication is enabled
and empty passwords are not accepted. To resolve this issue, you must fill in the Web Service Password
field for the user in the User Settings in the Monitor. Before you can connect, you may need to wait for
the web service to update its network settings or manually restart the web service.
The provided user name and password are invalid


This error occurs when the password provided is incorrect for the given user. If you believe the password
is correct, you may need to wait for the web service to update its network settings or manually restart the
web service.
The provided user name is invalid
This error occurs when the provided user is not in the web service's cached list. If the user name is
valid, you may need to wait for the web service to update its network settings or manually restart the web
service.
There was an error connecting to Pulse
This error occurs when there are two errors connecting to the web service in a row. The likely cause of
this error is that the web service is not running on the specified server. Verify that the web service is
running on the specified server and that you have entered the server's name or IP address correctly. If you
have a name specified for the server and are not on the local area network of that machine, you may need
to enter the server's IP address instead of its name.
Network Error
The connection with the server failed. Please check your server settings in the Settings Section
Double check your settings in Mobile to make sure they match the required information. If all the Mobile settings
are entered correctly and you still cannot connect, look in your general mobile device settings and make sure you are
connected to the right network. Depending on how things are set up, your device will try to connect to the strongest
network in the area. If the network it switches to doesn't have the correct settings to connect to your server, then the
connection will fail.
If you are still unable to connect, try rebooting the device (fully power off your device and power it back on). This
error also occurs when the server you are trying to connect to has lost access to the internet. Double check that the
server is connected to the internet.

4.8.8 FAQ
How do I get the Mobile application?
The Mobile application can be downloaded from the Android Market and the iPhone App Store.
How much does Mobile cost?
Nothing, it's free!


CHAPTER FIVE: ADMINISTRATIVE FEATURES

5.1 Repository Configuration


5.1.1 Overview
There are a wide variety of Repository options that can be configured. These options can be modified at any time from
the Deadline Monitor while in Super User Mode by selecting Tools -> Configure Repository Options. If you want to
restore all the Repository Options to their defaults, simply click the Reset Settings button.


Note that long-running applications like the Launcher, Slave, and Pulse only update these settings every 10 minutes,
so after making changes, it can take up to 10 minutes for all machines to recognize them. You can restart these
applications to have them recognize the changes immediately.

5.1.2 Client Setup


These settings affect the Deadline Client installed on each machine.
Remote Administration: Enabling Remote Administration allows the Deadline Clients to be controlled remotely from the Monitor running on another machine. Note that this can be a security risk if you are not behind
a firewall.
Automatic Upgrades: Enabling Automatic Upgrades allows the Deadline Clients to detect if the Repository
has been upgraded, and upgrade themselves if necessary. Note that the upgrade check is only performed when
launching applications via the Launcher.


5.1.3 Monitor Settings


These settings affect the Deadline Monitor application on each machine.
Monitor Layouts
Existing Monitor layouts can be added here. These layouts can be assigned to User Groups as a user's default layout. If the Pinned option is enabled, they can also be chosen from the Pinned Layouts menu in the Monitor. The order of the layouts here will be the same in the Pinned Layouts menu.


To add a new layout, simply press the Add button, and then choose an existing Monitor layout file, or use the current Monitor's layout. Note that Monitor layout files can be saved from the Monitor by selecting View -> Save Layout.

Update Settings
Enable Manual Refreshing
If your Auto Refreshing Intervals are set to longer intervals, manual refreshing in the Monitor can be enabled to allow users to get the most up to date data immediately. To prevent users from abusing manual refreshing, a minimum
interval between manual refreshes can be configured.
Sorting and Filtering
For farms that have a large number of jobs (10,000+) or slaves (1000+), disabling Automatic Sorting and Filtering in the lists in the Monitor can improve the Monitor's overall performance. This option in the Repository Options can be used to disable Automatic Sorting and Filtering by default, and users can enable it later in their Monitors if desired.

5.1.4 Slave Settings


These settings affect the Deadline Slave application on each machine.
Slave Settings
General
Limit the number of characters per line for standard output handling: Lines of standard output that are longer than the specified limit will be ignored by the Slave's stdout handling.


Delete Offline/Stalled Slaves from the Repository after this many days: Slaves that are Offline or Stalled
will be removed from the Repository after this many days.
Gather System Resources (CPU and RAM) When Rendering Tasks On Linux/Mac: If enabled, the Slave
will collect CPU and RAM usage for a task while it is rendering. We have seen cases where this can cause the
Slave to crash on Linux or Mac, so you should only disable this feature if you run into this problem.
Use fully qualified domain name (FQDN) for Machine Name instead of host name: If enabled, the Slave will try to use the machine's fully qualified domain name (FQDN) when setting its Machine Name instead of using the machine's host name. The FQDN will then be used for Remote Control, which can be useful if the remote machine name isn't recognized in the local network. If the Slave can't resolve the FQDN, it will just use the host name instead.
Use Slave's IP Address for Remote Control: If enabled, the Slave's IP address will be used for remote control instead of trying to resolve the Slave's host name.
Wait Times
Number of Minutes Before An Unresponsive Slave is Marked as Stalled: If a slave has not provided a status
update in this amount of time, it will be marked as stalled.
Number of Seconds To Wait For a Response When Connecting to Pulse: The number of seconds a slave that is connected to Pulse will wait for Pulse to respond when querying for a job.
Number of Seconds Between Thermal Shutdown Checks if Pulse is Offline: The number of seconds between
thermal shutdown checks. The Slave only does this check if Pulse is not running.


Extra Properties
Extra arbitrary properties can be set for slaves, and these properties can be given user friendly names so that they can
easily be identified and used to filter and sort slaves in the Monitor.


5.1.5 Performance Settings


These settings are used to influence the performance of Deadline by modifying update intervals.
Auto Adjust
The auto adjust option will try to choose the best interval settings based on the number of slaves in your farm. These should act as a good base that you can modify later as necessary. Press the Auto Adjust button to bring up the interval settings. Note that this will show you what your current settings are, and what they'll be changed to based on the number of slaves you entered.


Monitor Refresh Intervals


Number of Seconds Between Job Updates: This controls how often the Monitor reads in new job updates.
Number of Seconds Between Slave Updates: This controls how often the Monitor reads in new slave updates.
Number of Seconds Between Pulse Updates: This controls how often the Monitor reads in new pulse updates.
Number of Seconds Between Limit Updates: This controls how often the Monitor reads in new limit updates.
Number of Seconds Between Settings Updates: This controls how often the Settings such as groups, pools
and users are updated.
Number of Seconds Between Cloud Updates: This controls how often the Monitor updates the Cloud Panel.
Number of Seconds Between Balancer Updates: This controls how often the Monitor reads in new Balancer
updates.
Slave Intervals
Number of Seconds Between Slave Information Updates: This controls how often the Slave updates the information that's shown in the Slave list in the Monitor.
Number of Seconds Between Queries For New Tasks While the Slave is Rendering: The number of seconds
a Slave will wait after it finishes a task before moving on to another. This delay is not applied when the Slave is
idle.
Multiplier to determine seconds between queries while the Slave is Idle: The multiplier applied to the number of slaves to determine how long a slave will wait between polls to the Repository for tasks when it is idle (see the sketch after this list).
Maximum number of seconds between Job queries while the Slave is Idle: The maximum number of seconds
a slave will wait between polls to the Repository for tasks when it is idle.
Minimum number of seconds between Job queries when the Slave is Idle: The minimum number of seconds
a slave will wait between polls to the Repository for tasks when it is idle.
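
Taken together, the idle-query settings above scale the polling interval with the size of the farm. The exact calculation is internal to Deadline; the sketch below assumes the interval is simply the multiplier times the slave count, clamped to the configured minimum and maximum:

    def idle_poll_interval(slave_count, multiplier, min_seconds, max_seconds):
        """Sketch of how the idle job-query interval might scale with farm size:
        multiplier x slave count, clamped to the configured min/max."""
        interval = multiplier * slave_count
        return max(min_seconds, min(max_seconds, interval))

    # Example: with a 0.1 multiplier, 500 slaves and a 10-60 second window,
    # idle slaves would poll roughly every 50 seconds.
    print(idle_poll_interval(500, 0.1, 10, 60))  # 50.0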

5.1.6 Pulse Settings


These settings control how the Slaves connect to Pulse for Throttling, and are also used by the Slave to determine if
Pulse is running.
General
Maximum Incoming Connections: The maximum number of Slaves that can connect to Pulse at any given
time.
Connection Timeout (in milliseconds): The number of milliseconds that messages to and from Pulse have to complete before they time out.

Maximum Connection Attempts: The maximum number of times a Slave will attempt to connect to Pulse
before giving up.
Stalled Pulse Threshold (in minutes): Deadline determines if a Pulse has stalled by checking the last time that
the Pulse has provided a status update. If a Pulse has not updated its state in the specified amount of time, it will
be marked as Stalled.
Use Pulse's IP Address When Slaves Connect To Pulse and For Remote Control: If enabled, Pulse's IP address will be used when the Slaves connect to Pulse, and for remote control, instead of trying to resolve Pulse's host name.

Power Management
Power Management Check Interval: How often Pulse performs Power Management operations.


Throttling
Throttling can be used to limit the number of slave applications that are copying over the job files at the same time.
This can help network performance if large scene files are being submitted with the jobs. Note that a Slave only copies
over the job files when it starts up a new job. When it goes to render subsequent tasks for the same job, it will not be
affected by the throttling feature.
Enable Throttling: Allow throttling to occur.
Maximum Number of Slaves That Can Copy Job Files at The Same Time: The maximum number of Slaves
that can copy a scene file at the same time.
The Interval a Slave Waits Between Updates To See If It Can Start Copying Job Files: The amount of time (in seconds) a Slave will wait to send throttle checks and updates to Pulse.
Throttle Update Timeout Multiplier (based on the Slave Interval): The interval a slave waits between updates is multiplied by this value to determine the timeout value.
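
Conceptually, throttling behaves like a counting semaphore around the job-file copy: only a fixed number of Slaves may copy at once, and the rest poll until a slot frees up. The single-process sketch below is illustrative only; Deadline's actual implementation coordinates this through Pulse over the network.

    import threading
    import time

    MAX_CONCURRENT_COPIES = 5          # "Maximum Number of Slaves That Can Copy Job Files..."
    copy_slots = threading.BoundedSemaphore(MAX_CONCURRENT_COPIES)

    def copy_job_files(slave_name, poll_interval=10):
        # Wait until a copy slot is free, checking at the configured interval.
        while not copy_slots.acquire(blocking=False):
            time.sleep(poll_interval)  # "The Interval a Slave Waits Between Updates..."
        try:
            print("%s is copying job files" % slave_name)
            time.sleep(1)              # stand-in for the actual file copy
        finally:
            copy_slots.release()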


Web Service
Enable the Web Service
The Web Service allows you to execute commands and scripts from a browser, and must be enabled to use the Mobile
applications and the Pulse RESTful API (see REST Overview). While there is a standalone web service application, it
can also be enabled in Pulse if you are running it. All other Web Service settings can be set in the Web Service page,
which is covered further down this page.
Enable the Web Service: Makes the Pulse Web Service Available. Note that if you enable or disable the Web
Service feature while Pulse is running, it must be restarted for the changes to take effect.


5.1.7 Balancer Settings


These settings control general settings for the Balancer.
Balancer Update Interval: How often the Balancer performs a balancing cycle.
Current Algorithm Logic: The Balancer Plugin to use for determining balancing targets.
Use Balancer's IP Address for Remote Control: If enabled, the Balancer's IP address will be used for remote control instead of trying to resolve the Balancer's host name.
Stalled Balancer Threshold (in minutes): Deadline determines if a Balancer has stalled by checking the last
time that the Balancer has provided a status update. If a Balancer has not updated its state in the specified
amount of time, it will be marked as Stalled.
Error Tolerance: How many times we try to connect to the primary Balancer before it fails and we make
another Balancer the new primary.
Enable Group Switching: If there are group mappings that have the same image and hardware types, instances will move between groups as needed. If this is not enabled, instances will shut down and start up as normal.
The settings for the currently selected Algorithm Logic will be shown here as well (if there are any settings).

5.1.8 Region Settings


This is where you can set up Regions in Deadline. Regions are logical groupings for slaves and users. Cross Platform Rendering and Balancer Settings can be unique to each region. For example, a slave that's in the thinkbox_west Region will use the path mapping settings for that Region. The list on the right shows the Cloud Regions and the list on the left shows the general Regions. Regions must have a unique name; all and none are reserved names that cannot be used. See Regions for more information.


5.1.9 Email Notification


This section handles all email related settings within the repository.
Primary and Secondary Server
Set up a primary SMTP server to send email notifications. You can set up an optional secondary SMTP server for
Deadline to use if the primary server is unavailable.
SMTP Server: The SMTP server used by Deadline to send emails.
Sender Account: The email account that Deadline will use to send emails from.
Port: The SMTP port to use.
Use SSL: If enabled, SSL will be used when connecting to the SMTP server to send the notifications.
SMTP Server Requires Authentication: Enable if the SMTP server requires a user name and password to
authenticate.


Testing: Send a test email to the specified email address.


Automatically Generate Email Addresses for New Users: Generates a new email address for each new user in the form username@postfix, where postfix is the value entered in the Email Address Postfix field.

Note that if you have SSL enabled, you may need to configure your Linux and OSX machines for SSL to work. The process for doing this is explained in Mono's Security Documentation.
If you are using Google Mail to send emails (smtp.gmail.com), you will typically use port 25 if SSL is disabled, and port 465 if SSL is enabled. See Google's documentation on Sending Emails for more information.
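
As a point of reference, these settings map directly onto a plain SMTP connection. The sketch below uses Python's standard smtplib with the Gmail server mentioned above; the sender address, recipient, and password are placeholders:

    import smtplib
    from email.mime.text import MIMEText

    SMTP_SERVER = "smtp.gmail.com"      # "SMTP Server"
    SMTP_PORT = 465                     # 465 when "Use SSL" is enabled, 25 otherwise
    SENDER = "render-farm@example.com"  # "Sender Account" (placeholder)
    PASSWORD = "app-password-here"      # needed when the server requires authentication

    msg = MIMEText("Job 'shot_010_comp' has completed.")
    msg["Subject"] = "Deadline notification"
    msg["From"] = SENDER
    msg["To"] = "artist@example.com"

    # SMTP_SSL corresponds to "Use SSL"; plain SMTP would be used on port 25.
    server = smtplib.SMTP_SSL(SMTP_SERVER, SMTP_PORT)
    server.login(SENDER, PASSWORD)      # "SMTP Server Requires Authentication"
    server.sendmail(SENDER, ["artist@example.com"], msg.as_string())
    server.quit()
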
Notifications
Job Completed: When a job completes, an email will be sent to these email addresses.
Job Timed Out: When a job times out, an email will be sent to these email addresses.
Job Error Warning: When a job accumulates a certain number of errors, a warning email will be sent to these
email addresses. You can configure the warning limit in the Failure Detection settings.
Job Failed: When a job fails, an email will be sent to these email addresses.
Job Corrupted: When a corrupted job is detected, an email will be sent to these email addresses.
Slave License Errors: When a slave is unable to get a license, an email will be sent to these email addresses.
Slave Status Errors: When a slave is unable to update its state in the Repository, an email will be sent to these
email addresses.
Slave Error Warning: When a slave accumulates a certain number of errors in one session, a warning email
will be sent to these email addresses. You can configure the warning limit in the Failure Detection settings.
Stalled Slave: When a stalled slave is detected, an email will be sent to these email addresses.
System Administrator: When users use the option in the Error Report Viewer to report error messages to their
system administrator, those emails will be sent to these email addresses.
Low Database Connections: Low Database connection notification emails will be sent to these email addresses.
Database Connection Thresholds: When the number of available database connections drops below the set threshold, a warning email will be sent.

Power Management Notifications


Idle Shutdown: Notifications for Idle Shutdown operations will be sent to these email addresses.
Machine Startup: Notifications for Machine Startup operations will be sent to these email addresses.
Thermal Shutdown: Notifications for Thermal Shutdown operations will be sent to these email addresses.
Machine Restart: Notifications for Machine Restart operations will be sent to these email addresses.

5.1.10 House Cleaning


Pending Job Scan
Pending Job Scan Interval: The maximum amount of time between Pending Job Scans in seconds.
Allow Slaves to Perform the Pending Job Scan If Pulse is not Running: If enabled, the Slaves will perform
the pending job scan if Pulse is not running. If disabled, only Pulse can perform the pending job scan.
Run Pending Job Scan in a Separate Process: If enabled, the pending job scan will be run in a separate process. This can be useful when using dependency scripts to ensure that a crash caused by the script doesn't cause the main application (Pulse, Slave, or Monitor) to crash.
Write Pending Job Scan Output to Separate Log File: If enabled, all output from the pending job scan will be placed into a separate log file.


Pending Job Scan Process Timeout: If running the pending job scan in a separate process, this is the
maximum amount of time the process can take before it is aborted.
Asynchronous Job Events: If enabled, many job events will be processed asynchronously by the Pending Job Scan operation, which can help improve the performance of the Monitor when performing operations on batches of jobs. If this is enabled, the OnJobSubmitted event will still be processed synchronously to ensure that any updates to the job are committed before the job can be picked up by Slaves.
Maximum Job Events Per Session: The maximum number of pending job events that can be processed
per scan.

House Cleaning
House Cleaning Interval: The maximum amount of time between House Cleaning operations in seconds.
Allow Slaves to Perform House Cleaning If Pulse is not Running: If enabled, the Slaves will perform house
cleaning if Pulse is not running. If disabled, only Pulse can perform house cleaning.
Run House Cleaning in a Separate Process: If enabled, the house cleaning operation will be run in a separate
process.


Write House Cleaning Output to Separate Log File: If enabled, all output from the house cleaning will be placed into a separate log file.
House Cleaning Process Timeout: If running the house cleaning in a separate process, this is the maximum amount of time the process can take before it is aborted.
House Cleaning Maximum Per Session
Maximum Deleted Jobs: The maximum number of deleted jobs that can be purged per session.
Maximum Archived Jobs: The maximum number of jobs that can be archived per session.
Maximum Auxiliary Folders: The maximum number of job auxiliary folders that can be deleted per
session.
Maximum Job Reports: The maximum number of job report files that can be deleted per session.

Repository Repair
Repository Repair Interval: The maximum amount of time between Repository Repair operations in seconds.
Allow Slaves to Perform the Repository Repair If Pulse is not Running: If enabled, the Slaves will perform
the repository repair if Pulse is not running. If disabled, only Pulse can perform the repository repair.
Run Repository Repair in a Separate Process: If enabled, the repository repair operation will be run in a
separate process.
Write Repository Repair Output to Separate Log File: If enabled, all output from the repository repair will be placed into a separate log file.
Repository Repair Process Timeout: If running the repository repair in a separate process, this is the
maximum amount of time the process can take before it is aborted.
Automatic Primary Election: If enabled, the Repository Repair operation will elect another running
Pulse/Balancer instance as the Primary if the current Primary instance is no longer running.

5.1.11 Auto Configuration


This allows you to configure your Slaves from a single location. When a Slave starts up, it will automatically pull this
configuration from Pulse and apply it before fully initializing. See the Auto Configuration documentation for more
information.


5.1.12 User Security


Super User Password: The password needed to access Super User Mode in the Monitor. Leave blank for no
password.
Enhanced User Security: When using the System User for the Deadline User, the only way to switch Deadline users is to log off the system and log back in as someone else. This helps improve Deadline's user security, as it prevents users from impersonating others to modify their jobs.
Use The System User For The Deadline User: Enable to use enhanced user security, which prevents users from impersonating others.
Rendering Jobs As User: By default, the rendering process will run under the same user account that the Slave is running as. If Render Jobs As User is enabled, the rendering process will run under the user account associated with the user that submitted the job. Each Deadline user must have their Render Jobs As User settings configured properly for this to work. On Windows, the user's Run As Name, Domain, and Password settings will be used to start the rendering process as that user. On Linux and Mac OS X, only the user's Run As Name setting will be used with su or sudo to start the rendering process as that user. Note that on Linux and Mac OS X, the Slave must be running as root for this to work properly.


Render Jobs As User: Enable to have jobs render as the user that submitted them.
Use su Instead Of sudo On Linux and Mac OS X: If enabled, su will be used to run the process as another user instead of sudo (see the sketch after this list). This setting is ignored on Windows.
Preserve Environment On Linux and Mac OS X: If enabled, the user environment will be preserved
when running the process as another user using su or sudo. This setting is ignored on Windows, and is
ignored on Mac OS X when using su instead of sudo.
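
For illustration, the su/sudo behaviour described above roughly corresponds to wrapping the render command as sketched below. This is a conceptual sketch only, not Deadline's actual launch code, and the command used in the example is a placeholder.

    import subprocess

    def run_as_user(command, run_as_name, use_su=False, preserve_env=False):
        """Sketch of launching a render process as another user on Linux/Mac.
        'command' is a list such as ["/usr/local/bin/render", "scene.file"]."""
        if use_su:
            # "Use su Instead Of sudo": the preserve-environment option is not
            # applied in this branch of the sketch.
            wrapped = ["su", run_as_name, "-c", " ".join(command)]
        else:
            wrapped = ["sudo", "-u", run_as_name]
            if preserve_env:
                wrapped.append("-E")   # "Preserve Environment" -> sudo -E
            wrapped += command
        return subprocess.call(wrapped)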

5.1.13 Job Settings


Job Scheduling
Scheduling Order
Job Scheduling Order: The order of priority that Deadline uses to schedule jobs. See the Job Scheduling
documentation for more details.
Priority Weight: Weight given to job priority when using a Weighted scheduling order.
Submission Time Weight: Weight given to job submission time when using a Weighted scheduling order.
Error Weight: Weight given to the number of errors a job has when using a Weighted scheduling order.
Rendering Task Weight: Weight given to the number of rendering tasks a job has when using a Weighted
scheduling order.
Rendering Task Buffer: A buffer that is used by slaves to give their job extra priority on the farm.
Enhanced Balancing Logic: If enabled, a more enhanced method of balancing slaves between jobs is used, which should prevent slaves from jumping between jobs as much. This feature is still considered experimental.
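
Deadline does not publish the exact formula used for the Weighted scheduling order, but conceptually each job receives a score built from the weights above. The combination below is an assumption, not the actual algorithm; it is only meant to show how the weights pull a job's score up or down:

    def weighted_job_score(job, priority_weight, submission_time_weight,
                           error_weight, rendering_task_weight):
        """Plausible sketch: higher score = scheduled sooner. 'job' is a dict with
        priority, minutes_since_submission, error_count and rendering_task_count."""
        return (priority_weight * job["priority"]
                + submission_time_weight * job["minutes_since_submission"]
                - error_weight * job["error_count"]
                - rendering_task_weight * job["rendering_task_count"])

    jobs = [
        {"name": "comp",  "priority": 80, "minutes_since_submission": 30,
         "error_count": 0, "rendering_task_count": 10},
        {"name": "light", "priority": 50, "minutes_since_submission": 300,
         "error_count": 4, "rendering_task_count": 2},
    ]
    jobs.sort(key=lambda j: weighted_job_score(j, 100, 1, 10, 5), reverse=True)
    print([j["name"] for j in jobs])  # ['comp', 'light']
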
Submission Limitations
Task Limit For Jobs: The maximum number of tasks a job can have. Note that this does not impose a frame limit, so you can always increase the number of frames per task to stay below this limit.
Maximum Job Priority: The maximum priority value a job can have.
Automatic Job Timeout
Configure Deadline to automatically determine a timeout for a job based on the render times of tasks that have already
completed. If a task goes longer than that timeout, a timeout error will occur and the task will be requeued.
Minimum number of completed tasks required before calculating a timeout: The minimum number of tasks
that must be completed before Auto Job Timeout Checking occurs.
Minimum percent of completed tasks required before calculating a timeout: The minimum percent of tasks
that must be completed before Auto Job Timeout Checking occurs.
Enforce an automatic job timeout for all jobs: If enabled, the Auto Job Timeout will be enabled for all jobs, overriding the per-job setting of this value.
Timeout Multiplier: To calculate the Auto Job Timeout, the longest render time of the completed tasks is
multiplied by this value to determine the timeout time.
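
Putting those settings together, the automatic timeout is only applied once enough tasks have finished, and is derived from the slowest completed task. A minimal sketch, assuming the timeout is simply the longest completed render time multiplied by the Timeout Multiplier:

    def auto_job_timeout(completed_task_minutes, total_task_count,
                         min_completed_tasks, min_completed_percent, multiplier):
        """Return a timeout in minutes, or None if not enough tasks have completed."""
        completed = len(completed_task_minutes)
        if completed < min_completed_tasks:
            return None
        if (100.0 * completed / total_task_count) < min_completed_percent:
            return None
        return max(completed_task_minutes) * multiplier

    # Example: 5 of 20 tasks done (25%), longest took 42 minutes, multiplier 1.5.
    print(auto_job_timeout([35, 40, 42, 38, 41], 20, 4, 20, 1.5))  # 63.0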


Failure Detection
Job Failure Detection
Send warnings and fail jobs or tasks if they generate too many errors.
Send a warning to the job's user after it has generated this many errors: A warning will be sent to the job's notification list once its error count reaches this value. By default, the submitting user is automatically added to this list.
Mark a job as failed after it has generated this many errors: The number of errors a job must throw before it is marked as failed.
Mark a task as failed after it has generated this many errors: The number of errors a task must throw before it is marked as failed.
Automatically delete corrupted jobs from the Repository: If enabled, a job that is found to be corrupted will be automatically removed from the Repository.
Maximum Number of Job Error Reports Allowed: This is the maximum number of error reports each job can generate. Once a job generates this many errors, it will fail and cannot be resumed until some of its error reports are deleted or this value is increased.


Slave Failure Detection


Send warnings and prevent Slaves from reattempting jobs that keep generating errors.
Send a warning after a Slave has generated this many errors for a job in a row: The maximum number of
errors that can occur before email warnings are sent to the users specified in the Email Notification section.
Mark a Slave as bad after it has generated this many errors for a job in a row: If a Slave hits this many
errors, it will be marked as bad for its current job.
Frequency at which a slave will attempt a job that it has been marked bad for: The percentage of time a
Slave will attempt a task it has been marked bad for if no good jobs are available.

Cleanup
Automatic Job Cleanup
Cleanup Jobs After This Many Days: If enabled, this is the number of days to wait before cleaning up unarchived jobs.
Cleanup Mode: Whether the cleanup should archive the jobs found or delete them.
You can also set the number of hours since the job was last modified before cleaning it up.

Deleted Job Purging


Set the number of hours after a job has been deleted before it is purged from the database.

Auxiliary Files
Many jobs have an option to submit the scene file and other auxiliary files with the job. This can be useful because it
stores a copy of the scene file with the job that can be referred to later. However, if these files are large and the Repository server isn't designed to handle this load, it can seriously impact the Repository machine's performance.
This problem can be avoided by storing these files in a location on a different server that is designed to handle the
load.
Store job auxiliary files in a different location: If enabled, job auxiliary files submitted to Deadline will be
stored at a location specified and not the Repository.


Extra Properties
Extra arbitrary properties can be submitted with a job, and these properties can be given user friendly names so that
they can easily be identified and used to filter and sort jobs in the Monitor.


5.1.14 Application Logging


Application Log Cleanup
Delete Monitor logs after this many days: The number of days before a Monitor log will be deleted.
Delete Slave logs after this many days: The number of days before a Slave log will be deleted.
Delete Pulse logs after this many days: The number of days before a Pulse log will be deleted.
Delete Balancer logs after this many days: The number of days before a Balancer log will be deleted.
Delete Launcher logs after this many days: The number of days before a Launcher log will be deleted.
History Entries
Maximum Number of Repository History Entries: The maximum number of repository history entries that
are stored before old entries are overwritten.
Maximum Number of Job History Entries: The maximum number of job history entries that are stored before
old entries are overwritten.

Maximum Number of Slave History Entries: The maximum number of slave history entries that are stored
before old entries are overwritten.
Maximum Number of Pulse History Entries: The maximum number of pulse history entries that are stored
before old entries are overwritten.
Maximum Number of Balancer History Entries: The maximum number of balancer history entries that are
stored before old entries are overwritten.
Logging Verbosity
Slave Verbose Logging: If enabled, more information will be written to the Slave log while it is running.
Pulse Verbose Logging: If enabled, more information will be written to the Pulse log while it is running.
Balancer Verbose Logging: If enabled, more information will be written to the Balancer log while it is running.

5.1.15 Statistics Gathering


Configure Deadline to keep track of job and farm statistics. Note that Pulse must be running to gather Slave and
Repository statistics. Job statistics will be gathered regardless of whether Pulse is running or not.


Enable Statistics Gathering: If enabled, Deadline will gather statistical information.


Slave Statistics Gathering Interval (in minutes): The amount of time between polling Slaves for statistical information.
Repository Statistics Gathering Interval (in minutes): The amount of time between polling the Repository for statistical information.
Delete Job Statistics After This Many Days: The number of days from generation that job statistics will be
kept before they are deleted.
Delete Slave Statistics After This Many Days: The number of days from generation that Slave statistics will
be kept before they are deleted.
Delete Repository Statistics After This Many Days: The number of days from generation that Repository
statistics will be kept before they are deleted.

5.1.16 Mapped Paths


Paths to be mapped before rendering (based on Operating System). You may add, remove, or edit paths as well as modify the order in which they will be mapped. See the Cross Platform Rendering section for more details.
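
Path mapping itself is conceptually a per-OS prefix substitution applied to file paths before rendering. The sketch below uses made-up example paths purely for illustration; the real rules are the ones you configure in this section.

    # Hypothetical mapping rules: (Windows path, Linux path, Mac path)
    MAPPED_PATHS = [
        (r"Z:\projects", "/mnt/projects", "/Volumes/projects"),
        (r"Z:\textures", "/mnt/textures", "/Volumes/textures"),
    ]

    def map_path(path, target_os="linux"):
        """Replace a known path prefix with the equivalent for the target OS."""
        index = {"windows": 0, "linux": 1, "mac": 2}[target_os]
        for rule in MAPPED_PATHS:
            for prefix in rule:
                if path.lower().startswith(prefix.lower()):
                    mapped = rule[index] + path[len(prefix):]
                    return mapped.replace("\\", "/") if target_os != "windows" else mapped
        return path  # no rule matched; leave the path untouched

    print(map_path(r"Z:\projects\show\shot.ma"))  # /mnt/projects/show/shot.ma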


5.1.17 Mapped Drives


Drives to be mapped before rendering (Windows Only).
Drive: The drive to be mapped.
Remote Path: The remote path for the drive.
Only Map If Unmapped: Enable to only map the drive if it is unmapped. Disabled by default.
Requires Authentication: (Optional) Enable if the drive requires authentication. If unchecked, the existing
logged in user account credentials will be used.
Username: Username. Must not be blank.
Password: Password. Must not be blank.
Note that drives can be mapped when running as a service. Beware that if a user is logged in and has mapped drives set up for them, the Deadline Slave service won't see them because it runs in a different environment. However, if the drives are mapped in the service's environment (which is what the slave is doing), then they will work fine. Using the following setting can help avoid this potential situation.


Only map drives when the Slave is running as a service: If checked, the slave will only map the drives if it's running as a service. If unchecked, it will also do it when the slave is running as a normal application.

5.1.18 Script Menus


There are many scripts that ship with Deadline, and it's more than likely that you don't need to use them all, especially the submission scripts. Here, you can configure the contents of the individual script menus to only display what you use. You can also set icons and keyboard shortcuts for your script menu items. If a script menu item has the same shortcut as an existing menu item, the script menu item's shortcut will take precedence.
Note though that these settings will affect all Monitors that connect to this Repository.


5.1.19 Python Settings


A list of additional paths to be added to the Python search paths. Specify one path per line, and use the Add Path
button to browse for paths.


5.1.20 Wake On Lan Settings


Deadline's Power Management uses Wake On Lan to wake up machines, and you can configure which port(s) the WOL packet is sent over. If no ports are listed here, Deadline will use port 9 by default.
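
For reference, a Wake On Lan packet is just a UDP broadcast containing six 0xFF bytes followed by the target MAC address repeated sixteen times, sent to one of the configured ports (port 9 by default here). A minimal sketch:

    import socket

    def send_wol_packet(mac_address, port=9, broadcast="255.255.255.255"):
        """Send a standard Wake On Lan magic packet for the given MAC address."""
        mac_bytes = bytes.fromhex(mac_address.replace(":", "").replace("-", ""))
        magic_packet = b"\xff" * 6 + mac_bytes * 16
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(magic_packet, (broadcast, port))
        sock.close()

    send_wol_packet("00:11:22:33:44:55")  # placeholder MAC address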


5.1.21 Web Service


The Web Service allows you to execute commands and scripts from a browser, and must be enabled to use Deadline's
Mobile applications. The Web Service can be run as a console application or as part of Pulse. Note that only one
instance of the Web Service can run on a machine at a time. Also note that all changes to the Web Service settings
require the Web Service to be restarted before they will be implemented.
Listening Port: The port on which the Web Service will listen.
Connection Limit: The maximum number of concurrent connections allowed for the Pulse Web Service.
Connection Timeout (in seconds): The amount of time between sending and receiving messages to and from the Web Service before a timeout occurs.
If the Web Service requires authentication, users would use their Deadline user name along with the password stored
in their User Settings. If empty passwords are allowed, they can leave their password setting blank.
Require Authentication: If enabled, the Pulse Web Service will require a username and password. These are
stored in the user settings.
Allow Empty Passwords: If enabled, the Web Service will accept empty passwords.


Allow Execution of Non-Script Commands: If enabled, users are allowed access to Deadline Command
commands.
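
As a rough illustration, a client talks to the Web Service over plain HTTP on the configured listening port. The host name, port, and endpoint path in the sketch below are assumptions for illustration only; consult the REST Overview documentation for the exact resources exposed by your Deadline version.

    import urllib.request

    # Hypothetical host name and an assumed listening port of 8080.
    WEB_SERVICE = "http://pulse-server:8080"

    # The endpoint path below is an assumption based on the REST Overview.
    # If "Require Authentication" is enabled, credentials must also be supplied
    # as described in the Web Service documentation.
    response = urllib.request.urlopen(WEB_SERVICE + "/api/jobs")
    print(response.read())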

5.2 User Management


5.2.1 Overview
Deadline has its own user system, which is primarily used to tie users to Jobs. By default, users cannot control or
modify the settings of another User's Jobs.
Each user can configure their own user settings from the Monitor by selecting Tools -> Options. See the Monitor and
User Settings documentation for more information on the available user settings.

5.2.2 Managing Users


Administrators can manage all users from the Monitor. This is done by selecting Tools -> Manage Users in Super User mode, or as a user with appropriate User Group privileges. From here, you can add or remove individual users, and edit their user settings. See the Monitor and User Settings documentation for more information on the available user settings.

5.2.3 User Security


User Security settings can be configured in the Repository Configuration.


By default, Deadline does not enforce Enhanced User Security. This means that a user can switch to a different User and edit someone else's Jobs. For some pipelines, this honor system will work fine, but for those looking for tighter security, you should enable Enhanced User Security, so that the system user is used as the Deadline User. When this option is enabled, users will not be able to switch to another Deadline User unless they log off their system and log back in as someone else.
It is also recommended that you add a Super User password if you are looking for enhanced security, as a Super User without a password would allow Users to circumvent the User Job-editing restrictions, as well as any restrictions imposed on them by their User Groups (see below).

5.2.4 User Group


User Groups allow Administrators to restrict what functionality is available to certain users, as well as make certain
features accessible to others without requiring the use of the Super User mode.
Deadline automatically creates an Everyone User Group, which always contains all Users, and cannot be removed
or disabled. This User Group is also populated with the default Permission Settings recommended for normal users.


Managing User Groups


The User Group Management section can be accessed as a Super User through the Tools -> Manage User Groups
menu in the Monitor.

The left side of this dialog contains the list of User Groups that have already been created in the Repository. There are
also controls allowing you to manipulate this list in many ways:
Add: Will create a new User Group using the default options and feature access levels (equivalent to the default
Everyone group before modification).
Remove: Will delete the selected User Group from the Repository. Note that the Everyone group can never
be Removed in order to guarantee that all Users will at least be part of this group.
Clone: Will create a new User Group using the Options and Feature Access Levels of the currently selected
group as defaults.
This list is visible regardless of which tab is selected, allowing you to quickly change which Group you're modifying, and ensuring you're always aware of which one is currently selected.


General Options
This tab contains basic higher-level settings for User Groups. Note that most of the features on this tab, described
below, will be disabled when modifying the Everyone group, since it is a special Group that must always be active
and enabled for all Users.
Group Options
Group Enabled: This indicates whether or not this User Group is currently active. Disabling a User Group instead of Removing it altogether can be useful if you just want to temporarily disable access for a group of users without having to re-create it later. This is always true for the Everyone Group.
Group Expires: This setting will cause a Group to only be valid up to the specified Date and Time.
This can be useful if you are hiring temporary staff and know in advance that you will need to revoke
their access on a certain Date. This cannot be set for the Everyone Group.
Job Access Level
Can View Other Users' Jobs: This setting determines whether or not Users belonging to the Group can see other users' jobs.
Can Modify Other Users' Jobs: This setting indicates whether or not Users in this Group should be allowed to modify other users' jobs (change properties, job state, etc).
Can Handle Protected Jobs: This setting determines whether or not Users belonging to the Group can archive or delete protected jobs that don't belong to them.
Can Submit Jobs: This setting determines whether or not Users belonging to the Group can submit jobs.
Default Monitor Layout: Here you can select a Monitor layout that was added to the Repository Configuration.
This layout will act as the default for users belonging to this user group. The Priority setting is used as a tie
breaker if a user is part of more than one group with a default layout. When a user selects View -> Reset Layout,
it will reset to their user group's default layout instead of the normal default. Finally, if the Reset Layout On
Startup setting is enabled, the Monitor will always start up with that layout when it is launched.
Time-Restricted Access: This section allows you to set windows of time during which this Group is considered
Active. This is useful if you want to set up permissions to change based on the time of day, or if you just want
to lock out certain Users after hours. This cannot be enabled for the Everyone Group.
Group Members: This is where you control which Users are considered members of the currently selected
Group. Users can be part of multiple Groups. All Users are always part of the Everyone Group, and this
cannot be changed.
Controlling Feature Access
The other tabs in the Group Management dialog are dedicated to enabling or restricting access to certain Features on
a per-group basis.
Each tab displays a different type of Feature, representing a different aspect of the end-user experience:
Menu Items: This tab contains all the Menu Item features, including the main menu bar, right-click menus, and
toolbar items.
Job Properties: This tab contains all of a Job's modifiable properties, and determines which ones a User will be allowed to change. Note that this only applies to Jobs a User is allowed to modify in the first place, if he is not allowed to modify other Users' Jobs (see section above).

Scripts: This contains all the different types of Scripts a User could run from the Monitor. This section is a little different from the others, because the actual Features are dynamically generated based on which Scripts are currently in the Repository. Note that all scripts will default to a value of Inherited, so make sure to revisit this screen when adding new Scripts to your Repository.
UI Features: This tab contains all the different types of Panels that a User can spawn in the Monitor, and
controls whether or not a particular User Group is allowed to spawn them.
These Features are also grouped further within each tab into logical categories, to try and make maintenance easier.

There are three possible Access Levels that you can specify for each Feature:
Enabled: The members of this Group will have access to this particular Feature.
Disabled: This Group is not granted access to this Feature. Note, however, that Users in this Group might be
granted access to this Feature by a different Group.
Inherited: Whether this Feature is Enabled or Disabled is deferred to the Feature's Parent Category. Its current inherited value is reflected in the coloured square next to the dropdown; Red indicates it is currently Disabled, while Green indicates it is currently Enabled. Top-level Parents in a category cannot be set to Inherited.
If Users are part of multiple Groups, they will always use the least-restrictive Group for a particular Feature. In other
words, a given User will have access to a Feature as long as he is part of at least one currently active Group that has
access to that Feature, regardless of whether or not his other Groups typically allow it.
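
The way an individual Feature resolves for a given User can be thought of in two steps: resolve Inherited values up the category tree within each Group, then take the least-restrictive answer across all of the User's active Groups. The sketch below illustrates that resolution logic; the data structures and the category names used in it are illustrative, not Deadline's internals.

    def resolve_in_group(feature, group):
        """Walk up parent categories until an explicit Enabled/Disabled is found."""
        node = feature
        while group[node] == "Inherited":
            node = PARENTS[node]          # top-level parents are never Inherited
        return group[node] == "Enabled"

    def user_has_access(feature, user_groups):
        """Least-restrictive rule: access if any active group grants the feature."""
        return any(resolve_in_group(feature, g) for g in user_groups)

    # Illustrative data: a tiny category tree and two hypothetical groups.
    PARENTS = {"Delete Job": "Job Menu Items"}
    artists = {"Job Menu Items": "Disabled", "Delete Job": "Inherited"}
    leads   = {"Job Menu Items": "Enabled",  "Delete Job": "Enabled"}

    print(user_has_access("Delete Job", [artists]))         # False
    print(user_has_access("Delete Job", [artists, leads]))  # True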


5.3 Slave Configuration


5.3.1 Overview
The Slaves panel allows Slaves to be controlled and modified using the right-click menu. Note that the availability of
these options can vary depending on the context in which they are used, as well as the User Group Permissions that
are defined for the current user.
If the Slaves panel is not visible, see the Panel Features documentation for instructions on how to create new panels
in the Monitor.

5.3.2 Slave States


These are the states that a Slave can be in. They are color coded to make it clear which state the Slave is in.
Offline (gray): The Slave application is closed.
Idle (white): The Slave application is running, but it is not currently rendering.
Rendering (green): The Slave application is running, and is rendering a job.
Stalled (red): A Slave becomes stalled if it hasn't updated its state for a certain amount of time. This could be because the machine crashed, or the Slave simply didn't shut down cleanly.
Disabled (yellow): The Slave has been disabled by an administrator. This prevents the Slave application from launching on the machine.
License Warning: The Slave received a license error when last attempting to render. View Job Reports to find the exact error message.
License Problems (orange): The Slave cannot acquire a license, or its temporary license is about to expire.
If you see an orange Slave in the Slave list, it means that the Slave is having licensing problems, or that the license it
is using will expire in less than 10 days. You can check the License column in the Slave list to see what the problem
is.
If you see a red Slave, it means the Slave has been marked as stalled. This happens if the Slave hasn't updated its state for a certain amount of time. You can configure this amount of time in the Wait Times section of the Slave Settings in the Repository Configuration. When a Slave is marked as stalled, it usually means that the machine crashed, or that the Slave simply didn't shut down cleanly. In the latter case, you can simply mark the Slave as offline from the right-click menu.
The Slave panel's right-click menu also gives the option to delete or disable Slaves. When disabled, the Slave application will not be allowed to launch on the machine. This is useful if you are doing maintenance on a machine and you don't want the Slave accidentally starting up on it.

5.3.3 Job Candidate Filter


If a slave isn't rendering a job that you think it should be, you can use the Job Candidate Filter option in the Slave Panel's drop-down menu to try and figure out why. When the option is enabled, simply click on a job in the Job Panel and the Slave Panel will be filtered to only show the slaves that can render the selected job based on the job's settings.
The filtering takes the following into account:
The job's pool and group (see the Pools and Groups documentation for more information).
The job's whitelist/blacklist, and the whitelist/blacklist in the job's assigned limits (see the Limits and Machine Limits documentation for more information).
If the slave has been marked bad for the job (see the Job Failure Detection documentation for more information).


5.3.4 Slave Settings


Most of the Slave settings can be configured from the Monitor while in Super User Mode (or with the proper user privileges) by right-clicking on one or more of them and selecting Modify Slave Properties. To configure Pools and Groups, you can use the Tools menu, or you can use the Slave panel's right-click menu. See the Pools and Groups

Note that the only settings here that have an actual impact on rendering are the Concurrent Tasks and CPU Affinity
settings. Furthermore, the CPU Affinity feature is only supported on Windows and Linux operating systems, since
OSX does not support process affinity.
General
These are some general Slave settings:
Slave Description: A description of the selected Slave. This can be used to provide some pertinent information
about the slave, such as certain system information.
Slave Comment: A short comment regarding the Slave. This can be used to inform other users why certain changes were made to that Slave's settings, or of any known potential issues with that particular Slave.
Normalized Render Time Multiplier: This value is used to calculate the normalized render time of Tasks. For example, a Slave that normally takes twice as long to render a Task should be assigned a multiplier of 2 (see the sketch after this list).

Normalized Task Timeout Multiplier: This value is used to calculate the normalized render time of Task
Timeouts. Typically, this should be the same value as above.
Concurrent Task Limit Override: The concurrent Task Limit for the Slave. If 0, the Slave's CPU count is used as the limit.
Host Name/IP Address Override: Overrides the Host name/IP address for remote commands.
MAC Address Override: This is used to override the MAC Address associated with this Slave. This is useful
in the event that the slave defaults to a different MAC Address than the one needed for Wake On Lan.
Region: The Slave's region. Used for cross platform rendering. Default is None. See Regions for more information.
Exclude Jobs in the none Pool: Enable this option to prevent the Slave from picking up Jobs that are assigned
to the none Pool.
Exclude Jobs in the none Group: Enable this option to prevent the Slave from picking up Jobs that are
assigned to the none Group.
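
The Normalized Render Time Multiplier is meant to make render times comparable across machines of different speeds. The exact calculation is internal to Deadline; a plausible interpretation, shown as a hedged sketch, is that a Slave's actual task time is divided by its multiplier to bring it back to a standard machine's time:

    def normalized_render_time(actual_minutes, multiplier):
        """Plausible sketch: a machine that is twice as slow (multiplier 2) has its
        render times halved so they can be compared with a standard machine."""
        return actual_minutes / float(multiplier)

    print(normalized_render_time(60, 2.0))  # 30.0 -- comparable to a standard Slave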

Idle Detection
These settings can be used to override the global Slave Scheduling settings for the slave (if there are any). It can be
used to start the slave when its machine becomes idle (based on keyboard and mouse activity), and stop the slave when
its machine is in use again. Note that Idle Detection is managed by the Launcher, so it must be running for this feature
to work.
Start Slave When Machine Idle For: If enabled, the Slave will be started on the machine if it is idle. A machine is considered idle if there hasn't been any keyboard, mouse or tablet activity for the specified amount of time.

Only Start Slave If CPU Usage Less Than: If enabled, the slave will only be launched if the machine's CPU usage is less than the specified value.
Only Start Slave If Free Memory More Than: If enabled, the slave will only be launched if the machine has
more free memory than the specified value (in Megabytes).
Only Start Slave If These Processes Are Not Running: If enabled, the slave will only be launched if the
specified processes are not running on the machine.
Only Start If Launcher Is Not Running As These Users: If enabled, the slave will only be launched if the
launcher is not running as one of the specified users.
Stop Slave When Machine Is No Longer Idle: If enabled, the Slave will be stopped when the machine is no longer idle. A machine is considered idle if there hasn't been any keyboard, mouse or tablet activity for the specified amount of time.
Only Stop Slave If Started By Idle Detection: If enabled, the Slave will only be stopped when the machine is
no longer idle if that Slave was originally started by Idle Detection. If the Slave was originally started manually,
it will not be stopped.
Allow Slave To Finish Its Current Task When Stopping: If enabled, the Slave application will not be closed
until it finishes its current Task.
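
The Launcher performs these checks itself, but the conditions above map onto ordinary system queries. The rough sketch below uses the third-party psutil module and is illustrative only; detecting keyboard/mouse idle time is platform-specific and is omitted, and the process names are placeholders.

    import psutil  # third-party module, not shipped with Deadline

    def machine_available_for_rendering(max_cpu_percent=25, min_free_mb=4096,
                                        blocked_processes=("maya.exe", "nuke.exe")):
        """Mirror the 'Only Start Slave If...' conditions described above."""
        if psutil.cpu_percent(interval=1) >= max_cpu_percent:
            return False
        if psutil.virtual_memory().available < min_free_mb * 1024 * 1024:
            return False
        running = {p.name().lower() for p in psutil.process_iter()}
        if any(proc.lower() in running for proc in blocked_processes):
            return False
        return True

    print(machine_available_for_rendering())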

There are some limitations with Idle Detection depending on the operating system:
On Windows, Idle Detection will not work if the Launcher is running as a service. This is because the service
runs in an environment that is separate from the Desktop, and has no knowledge of any mouse or keyboard
activity.
On Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not
available, Idle Detection will not work.


Job Dequeuing
These settings are used to determine when a Slave can dequeue Jobs.
All Jobs: In this mode, the Slave will dequeue any job.
Only Jobs Submitted From This Slave's Machine: In this mode, the Slave will only dequeue jobs submitted from the machine it's running on.
Only Jobs Submitted From These Users: In this mode, the Slave will only dequeue jobs submitted by the specified users.

CPU Affinity
These settings affect the number of CPUs the Slave renders with (Windows and Linux only):
Override CPU Affinity: Enable this option to override which CPUs the Slave and its child processes are limited
to.
Specify Number of CPUs to use: Choose this option if you just want to limit the number of CPUs used, and you aren't concerned with which specific CPUs are used.
Select Individual CPUs: Choose this option if you want to explicitly pick which CPUs are used. This is useful
if you are running multiple Slaves on the same machine and you want to give each of them their own set of
CPUs.


Extra Info
Like jobs, extra arbitrary properties can also be set for slaves.


The Extra Info 0-9 properties can be renamed from the Slaves section of the Repository Configuration, and have
corresponding columns in the Slave list that can be sorted on.

5.3.5 Slave Reports and History


All error reports for a Slave can be viewed in the Slave Reports panel. This panel can be opened from the View menu
or from the main toolbar in the Monitor. It can also be opened from the Slave panel's right-click menu.


You can use the Slave Report panel's right-click menu to save reports as files to send to Deadline Support. You can also delete reports from this menu as well.
In addition to viewing Slave reports, you can also view the Slave's history. The History window can be brought up from the Slave panel's right-click menu by selecting the View Slave History option.


5.3.6 Remote Control


You can view the live log for Slaves or control them remotely from the right-click menu. See the Remote Control
documentation for more information.

5.4 Pulse Configuration


5.4.1 Overview
Pulse has two sets of options that can be configured. There are the global Pulse settings in the Repository Options,
which are applied to every running instance of Pulse, and there are the per-Pulse settings that can be configured from
the right-click menu in the Pulse panel. Note that the availability of these options can vary depending on the context
in which they are used, as well as the User Group Permissions that are defined for the current user.


If the Pulse panel is not visible, see the Panel Features documentation for instructions on how to create new panels in
the Monitor.

5.4.2 Pulse States


These are the states that a Pulse can be in. They are color coded to make it clear which state the Pulse is in.
Offline (gray): The Pulse application is closed.
Running (white): The Pulse application is running.
Stalled (red): Pulse becomes stalled if it hasn't updated its state for a certain amount of time. This could be because the machine crashed, or because Pulse simply didn't shut down cleanly.
If you see a red Pulse, it means the Pulse has been marked as stalled. This happens if the Pulse hasn't updated its state for a certain amount of time. You can configure the Stalled Pulse Threshold in the General Pulse settings in the Repository Options. When a Pulse is marked as stalled, it usually means that the machine crashed, or that Pulse simply didn't shut down cleanly. In the latter case, you can simply mark Pulse as offline from the right-click menu.
The Pulse panel's right-click menu also gives the option to delete Pulse instances.

5.4.3 Pulse Settings


As mentioned above, there are the global Pulse settings in the Repository Options, which are applied to every running
instance of Pulse. However, there are also settings that can be specified for individual Pulse instances, which can be
modified by right-clicking on a Pulse in the Pulse panel and selecting Modify Pulse Properties.

You can also auto-configure a Pulse instance by right-clicking on it in the Monitor and selecting Auto Configure
Pulse. This will automatically make this Pulse the Primary Pulse, and set its connection settings.
General
These are some general Pulse settings:
This Pulse Is The Primary: If enabled, this is the Primary Pulse that the Slaves will connect to. If there is no
Primary, the Slaves will not be able to connect to Pulse.

Override Port: If enabled, this port will be used by Pulse instead of a random port.
Host Name/IP Address Override: Overrides the Host name/IP address used by the Slaves to connect to Pulse,
and for remote commands.
MAC Address Override: This is used to override the MAC Address associated with this Pulse. This is useful
in the event that the pulse defaults to a different MAC Address than the one needed for Wake On Lan.
Region: The region for Pulse. Used for path mapping when executing commands with the Web Service.
When the Slaves connect to Pulse, they will use Pulse's host name, unless the option to use Pulse's IP address is enabled in the Pulse Settings in the Repository Options. Use the Host Name/IP Address Override setting above to override what the Slaves use to connect to Pulse.

5.4.4 Pulse History


You can view a Pulse's history by right-clicking on it in the Pulse panel and selecting the View Pulse History option.


5.4.5 Remote Control


You can view the live log for Pulse or control it remotely from the right-click menu. See the Remote Control documentation for more information.

5.4.6 Pulse Redundancy


You can run multiple instances of Pulse on separate machines as backups in case your Primary Pulse instance goes down. If the Primary Pulse goes offline or becomes stalled, Deadline's Repository Repair operation can elect another running instance of Pulse as the Primary, and the Slaves will automatically connect to the new Primary instance.
To enable Pulse Redundancy, you must enable the Automatic Primary Pulse Election option in the Repository Repair settings in the Repository Options.
Note that when multiple Pulse instances are running, only the Primary Pulse is used by the Slaves for Throttling. In addition, only the Primary Pulse is used to perform Housecleaning, Power Management, and Statistics Gathering. However, you can connect to any Pulse instance to use the Web Service.

5.4.7 Advanced Features


Many advanced features are built into Pulse. These features are described below.
Auto Configuration


This allows you to set the repository path in a single location. When a Slave starts up, it will automatically pull the repository path and other settings from Pulse and apply them before fully initializing. See the Auto Configuration documentation for more information.
Slave Throttling
Pulse supports a throttling feature, which is helpful if you're submitting large files with your jobs. This is used to limit
the number of Slaves that copy over the job and plugin files at the same time. See the Network Performance Guide
documentation for more information.
Power Management
Power management is a system for controlling how machines startup and shutdown automatically based on sets of
conditions on the render farm, including job load and temperature. Power management is built into Pulse, so Pulse must
be running to use this feature. The only exception to this rule is Temperature checking. See the Power Management
documentation for more information.
Statistics Gathering
While Pulse isn't required to gather job statistics, it is required to gather the Slave and Repository statistics. See the
Farm Statistics documentation for more information.
Web Service
While Deadline has a standalone Web Service application, Pulse also has a web service feature built in. The web
service can be used to get information over an Internet connection. It is used by the Mobile application, and can also
be used to display information in a web page. See the Web Service documentation for more information.
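As a hedged illustration, the following Python sketch fetches job information from a running web service over HTTP. The host name, port, and the /api/jobs REST endpoint used here are assumptions made for the example; consult the Web Service documentation for the URLs and port that your installation actually exposes.

# Minimal sketch: query the web service for job information over HTTP.
# The host name, port, and the /api/jobs endpoint are assumptions; check the
# Web Service documentation for the URLs your installation exposes.
import json
import urllib.request

PULSE_HOST = "pulse-machine"   # hypothetical host name
WEB_SERVICE_PORT = 8080        # hypothetical port; use your configured value

url = "http://%s:%d/api/jobs" % (PULSE_HOST, WEB_SERVICE_PORT)
with urllib.request.urlopen(url) as response:
    jobs = json.loads(response.read().decode("utf-8"))
print("The web service returned %d jobs" % len(jobs))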

5.5 Balancer Configuration


5.5.1 Overview
Balancer has three sets of options that can be configured:
Global Balancer settings in the Repository Options.
Cloud Provider Balancer settings in the Cloud Provider Configuration dialog.
Per-Balancer settings that can be configured from the right-click menu in the Balancer panel.
Note that the availability of these options can vary depending on the context in which they are used, as well as the
User Group Permissions that are defined for the current user.
If the Balancer panel is not visible, see the Panel Features documentation for instructions on how to create new panels
in the Monitor.

5.5.2 Global Balancer Settings


As mentioned above, there are the global Balancer settings in the Repository Options, which are applied to every
running instance of Balancer.

5.5.3 Cloud Provider Configuration


Before the Balancer can do anything, you'll need to set up a Cloud Region. Balancer settings for each Cloud Provider can be configured in the Cloud Provider Configuration dialog. Deadline supports a number of cloud providers by default. Custom cloud plugins can be written to support different providers. Here's a list of all the supported Cloud Plugins.

When adding a new Cloud Region, you'll have to enter all of your credentials and settings for that particular provider. You can look at the documentation for each plugin for further details about all the settings and credentials. Enabling the region will show instances in the Cloud Panel. Your credentials need to be verified before you're able to enable the region to work with the Balancer.

Basic Configuration
The basic configuration options are:
Enabled: Enabling the region makes it usable by the Balancer.
Region Preference: Weighting towards the region.
Region Budget: Total Budget for a region. Governs how many instances will be started for this region.
Asset Checking
Asset Checking can be used to sync assets between the repository and the slaves. The Asset Checking options are:
Enable Asset Checking: Enables asset crawler for jobs with assets.
Asset Crawler Hostname: Hostname for the Asset Crawler.
Asset Crawler Port: Port number for the Asset Crawler.
Asset Crawler OS: Operating system of the Asset Crawler.
The asset script itself can be found in the vmx folder in the Repository, and is called AssetCrawler_Server.py.


Balancer Plugins
The Balancer uses an algorithm that's defined in a Balancer Plugin. That can be set in the Balancer Settings section in the Repository Configuration. We've included a default algorithm that should be fine for most use cases, but you can write your own for your specific needs.
Group Mappings
Group Mappings are the heart of the Balancer. They tell the Balancer what kinds of instances to start for each group.

A Group Mapping is mainly made up of a group, an image, a hardware type, and a budget. The image and hardware type are from the provider. The Cost is how much of the region's budget will be consumed by each instance.


You can also add Pools to a mapping so that instances will be started in those pools.

5.5.4 Balancer States


These are the states that a Balancer can be in. They are color coded to make it clear which state the Balancer is in.
Offline (gray): The Balancer application is closed.
Running (white): The Balancer application is running.
Stalled (red): Balancer becomes stalled if it hasn't updated its state for a certain amount of time. This could be because the machine crashed, or because Balancer simply didn't shut down cleanly.
You can configure the Stalled Balancer Threshold in the General Balancer settings in the Repository Options. When a Balancer is marked as stalled, it usually means that the machine crashed, or that Balancer didn't shut down cleanly. In the latter case, you can simply mark Balancer as offline from the right-click menu.
The Balancer panel's right-click menu also gives the option to delete Balancers.

5.5.5 Balancer Settings


There are settings that can be specified for individual Balancer instances, which can be modified by right-clicking on
a Balancer in the Balancer panel and selecting Modify Balancer Properties.


You can also auto-configure a Balancer instance by right-clicking on it in the Monitor and selecting Auto Configure
Balancer. This will automatically make this Balancer the Primary Balancer.
General
These are some general Balancer settings:
This Balancer Is The Primary: If enabled, this is the Primary Balancer.
Host Name/IP Address Override: Overrides the Host name/IP address for remote commands.
MAC Address Override: This is used to override the MAC Address associated with this Balancer. This is useful in the event that the Balancer defaults to a different MAC Address than the one needed for Wake On Lan.
Region: The region for Balancer.

5.5.6 Balancer History


You can view a Balancer's history by right-clicking on it in the Balancer panel and selecting the View Balancer History
option.


5.5.7 Remote Control


You can view the live log for Balancer or control it remotely from the right-click menu. See the Remote Control
documentation for more information.

5.5.8 Balancer Redundancy


You can run multiple instances of Balancer on separate machines as backups in case your Primary Balancer instance goes down. If the Primary Balancer goes offline or becomes stalled, Deadline's Repository Repair operation can elect another running instance of Balancer as the Primary.
To enable Balancer Redundancy, you must enable the Automatic Primary Balancer Election option in the Repository Repair settings in the Repository Options.


Note that when multiple Balancer instances are running, only the Primary Balancer starts and stops virtual instances.

5.6 Job Scheduling


5.6.1 How a Job is Selected by a Slave
By default, a job is selected by a Slave based on the following properties, in this order:
1. The Pools and Groups that the Job has been submitted to.
A Slave will only select a Job if it has been assigned to the Pool and Group to which the Job belongs.
Pools are priority-based, so a Slave will favour Jobs in Pools that are higher on its priority list. This
ordering can be configured on a per-Slave basis through the Manage Pools utility.
Groups are not priority-based, and are typically just used to ensure that Jobs render on machines with
appropriate hardware and software configurations.
2. The Job's Priority:
By default, a Job has a numeric Priority ranging from 0 to 100, where 0 is the lowest priority and 100 is
the highest priority. You can adjust the maximum Job priority in the Job Settings section of the Repository
Configuration.
Everything else being equal, the highest Priority Job will always be chosen first when a Slave is selecting
its next Job.
3. The Date and Time at which the Job was submitted:
This is set automatically and is the timestamp of when the Job was submitted to Deadline.
Everything else being equal, an older Job will take priority over a newer Job when a Slave is looking for a
new one.
4. The Job's Limits and Machine Limits
With Limits, if a Job has the highest priority, but requires a Limit that is maxed out, a Slave will try to
select a different Job.
A Machine Limit is a special type of Limit that restricts the number of machines that can render that
particular Job at the same time.
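To illustrate how these properties interact, here is a small Python sketch of the default selection order. This is only an illustration under assumed data structures (pool_order, groups, limits_maxed_out, and so on), not Deadline's actual implementation.

# Illustrative sketch of the default scheduling order (Pool, Priority, Submit Date).
# The job and slave fields used here are assumptions made for the example.
def pool_priority(slave, job):
    # Lower index = higher priority pool for this Slave.
    return slave.pool_order.index(job.pool)

def pick_next_job(slave, jobs):
    candidates = [
        job for job in jobs
        if job.pool in slave.pool_order      # assigned to one of the Slave's pools
        and job.group in slave.groups        # assigned to one of the Slave's groups
        and not job.limits_maxed_out()       # all required Limits have free stubs
    ]
    # Sort by pool order, then highest priority, then oldest submission date.
    candidates.sort(key=lambda job: (pool_priority(slave, job),
                                     -job.priority,
                                     job.submission_date))
    return candidates[0] if candidates else None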

5.6.2 Changing the Scheduling Order


It is possible to change the order in which Jobs are scheduled in the Job Settings section of the Repository Configuration.


The following options are available:


First-in First-Out: Job order will be based solely on submission date, and jobs will be rendered in the order they are submitted.
Pool, First-in First-Out: Job order will be based on the job's pool first, with submission date being the tie-breaker.
Pool, Priority, First-in First-Out: This is the default scheduling order that is used. Job order will be based on the job's pool, then priority, with submission date being the tie-breaker.
Priority, First-in First-Out: Job order will be based on the job's priority first, with submission date being the tie-breaker.
Priority, Pool, First-in First-Out: Job order will be based on the job's priority, then pool, with submission date being the tie-breaker.
Balanced: Job order will be balanced so that each job has the same number of slaves rendering it at a time.
Pool, Balanced: Job order will be based on the job's pool first, with a balance being applied to jobs that are in the same pool.
Pool, Priority, Balanced: Job order will be based on the job's pool, then priority, with a balance being applied


to jobs that have the same pool and priority.


Priority, Balanced: Job order will be based on the job's priority first, with a balance being applied to jobs that have the same priority.
Priority, Pool, Balanced: Job order will be based on the job's priority, then pool, with a balance being applied to jobs that have the same pool and priority.
Weighted, First-in First-out: A weighted system that takes priority, submission time, number of rendering
tasks, and number of job errors into account, but does not take pools into account. If two or more jobs have the
same calculated weight, the submission date will act as the tie-breaker.
Pool, Weighted, First-in First-out: A weighted system that still respects pool priority. If two or more jobs have
the same calculated weight, the submission date will act as the tie-breaker.
Weighted, Balanced: A weighted system that takes priority, submission time, number of rendering tasks, and
number of job errors into account, but does not take pools into account. A balance will be applied to jobs that
have the same calculated weight.
Pool, Weighted, Balanced: A weighted system that still respects pool priority. A balance will be applied to
jobs that have the same calculated weight.
Note that the Secondary Pool feature was designed for job scheduling orders that have Pool listed first, and might not work as expected otherwise. For example, if Priority is listed first, a job with lower priority that's found during the initial Primary Pool scan will be preferred over a job with higher priority that's found during the Secondary Pool scan. This is because the Secondary Pool scan is only performed if no jobs are found during the initial Primary Pool scan.
See the Pools and Groups documentation for more information.
Balanced Scheduling
For the balanced options, you can have slaves give the job they are currently working on more priority using the Rendering Task Buffer. This can help prevent slaves from jumping between jobs. For example, if this is set to 3, a slave will only drop its current job for another one if the other job has more than 3 fewer rendering tasks than the current job.
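As a hedged illustration of that comparison (the field names are assumptions; only the inequality reflects the behaviour described above):

# Illustrative sketch of the Rendering Task Buffer check for balanced scheduling.
RENDERING_TASK_BUFFER = 3

def should_switch_jobs(current_job, other_job):
    # Only drop the current job if the other job is behind by more than the buffer.
    return other_job.rendering_tasks < current_job.rendering_tasks - RENDERING_TASK_BUFFER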
There is also an experimental option to enhance the balancing logic. When this option is enabled, the slaves will use
the database to get a more accurate snapshot of all the rendering jobs in the farm, and use this information to make
better decisions about which job they should be rendering. Testing has shown that when this option is enabled, a
proper distribution of Slaves among jobs is much more consistent, and Slaves no longer jump between jobs of the
same priority. The result is more predictable behavior, and less wasted time due to the overhead of switching between
jobs that are expensive to start up.
Weighted Scheduling
For the weighted options, you can control how much weight is applied to the job priority, submission time, number of
rendering tasks, and number of errors. You can also give weight to the job that the slave is currently working on using
the Rendering Task Buffer. The buffer is subtracted from the rendering task count for the current job, which pushes it
higher in the queue.
Deadline then sorts by this weight so that jobs with the largest weight value have the highest priority. Note that the weight values can be negative. For example, if you set a negative weight value for the number of job errors, a job with more errors will end up having a lower overall weight so that precedence is given to other jobs in the queue.
Here is how the weight is calculated:
weight = (job.Priority * PW) +
         (job.Errors * EW) +
         ((NOW - job.SubmissionTimeSeconds) * SW) +
         ((job.RenderingTasks - RB) * RW)

Where:
PW = priority weight
EW = error weight
SW = submission time weight
RW = rendering task weight
RB = rendering task buffer
NOW = the current repository time
Note that because the job submission time is measured in seconds, it will have the greatest impact on the overall weight. Reducing the SW value can help reduce the submission time's impact on the weight value.
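As a worked example, the following Python sketch evaluates the formula for two hypothetical jobs. The weight values chosen here are assumptions picked only to illustrate the arithmetic, not recommended defaults.

# Worked example of the weight formula above. The weights (PW, EW, SW, RW, RB)
# and the job values are assumptions chosen only to illustrate the math.
PW, EW, SW, RW, RB = 100, -10, 0.01, -1, 3
NOW = 1000000  # current repository time, in seconds

def weight(priority, errors, submission_time_seconds, rendering_tasks):
    return (priority * PW) + (errors * EW) + \
           ((NOW - submission_time_seconds) * SW) + \
           ((rendering_tasks - RB) * RW)

# Job A: priority 50, no errors, submitted 2 hours ago, 10 tasks rendering.
job_a = weight(50, 0, NOW - 7200, 10)  # 5000 + 0 + 72 - 7 = 5065.0
# Job B: priority 50, 5 errors, submitted just now, no tasks rendering.
job_b = weight(50, 5, NOW, 0)          # 5000 - 50 + 0 + 3 = 4953.0
print(job_a, job_b)  # Job A has the larger weight, so it is scheduled first.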
The experimental option to enhance the balancing logic, described under Balanced Scheduling above, also applies to the weighted balanced options.

5.7 Pools and Groups


5.7.1 What are Pools and Groups?
Groups can be used to organize your farm based on machine configurations (e.g., specs, installed software, etc). For example, if you have several 64-bit machines with 3ds Max installed, you could assign them to groups like 3dsmax, or 3dsmax_64bit, or simply 3D. Groups have no impact on the order in which Jobs are rendered; they just help to ensure that Jobs render on machines with an appropriate hardware/software setup. If you don't care about grouping your machines, you can simply use the default none Group.
Pools are similar to Groups, except that they do affect the order in which Jobs are rendered. Because of this, it is encouraged to use Pools for prioritizing different shows, shots, types of Jobs, etc. If you don't want to set up Pools on your farm, you can simply use the default none Pool. Note that the none Pool always has the lowest priority of all the Pools.
Jobs can be added to an optional Secondary Pool. When searching for a Job, a Slave does a first pass using the Primary Pool of the available Jobs. If the Slave doesn't find any Jobs using the Primary Pool, it then makes a second pass using the Secondary Pool. This system can allow a Job to spread to a Secondary Pool as necessary, and it can also ease the configuration of Pools in the farm if there are lots of Pools and Slaves. An example of this is shown below.
Note that the Secondary Pool feature was designed for Job Scheduling Orders that have Pool listed first, and might not work as expected otherwise. For example, if Priority is listed first, a job with lower priority that's found during the initial Primary Pool scan will be preferred over a job with higher priority that's found during the Secondary Pool scan. This is because the Secondary Pool scan is only performed if no jobs are found during the initial Primary Pool scan.

5.7.2 Managing Pools and Groups


Pools and Groups can be managed from the Monitor while in Super User mode (or as a User with the proper User
Group privileges). Just select Manage Pools (or Manage Groups) from the Tools menu, or from the Slave panel's
right-click menu.
The dialogs are very similar to each other, but the nuances between the two are described below in detail. Note that if you used the Slave panel's right-click menu to open these dialogs, they will be pre-filtered to just show the slaves that you right-clicked on. They will also show the same columns that are currently being shown in the slave list.
Group Management Dialog
From here, you can manage individual Groups, and assign them to various Slaves. It is a bit simpler than the Pool
Management Dialog, which will be covered below in more detail, since it does not have to worry about the order of
Groups for a given Slave.

The functions you can perform here are as follows:


Groups: This section displays existing Groups and allows you to manipulate them, or create new ones. Your
selection here will determine which Groups will be affected by the Group Operations.
New: This will create a new Group in the Repository, and allow you to assign the Group to different Slaves.
You will be prompted for a name for the new Group. Group names cannot be changed once the Group has been created. Adding a Group with the name of a previously Deleted Group will effectively re-instate that Group if it hasn't been Purged yet (see below).
Delete: This will Delete all of the selected Groups from the Repository, and enable the option to Purge
them (described below).
Purge Obsolete Groups on Close: This will purge any obsolete (deleted) Groups from existing Jobs and remove them from any Slaves that are currently assigned to them. They will be replaced with the Group selected in the drop down. Note that if you choose not to Purge the obsolete Groups right now, you can always return to this dialog and do it later.
Slaves: This section displays a list of all known Slaves that have connected to your Repository. Your selection here will determine which Slaves will be affected by the Group Operations.

Only Show Slaves Assigned to a Selected Group: This option will filter the displayed Slaves to only
include the ones that are currently assigned to at least one of the selected Groups.
Group Operations: These operations are used to manipulate which Groups are assigned to which Slaves. They
typically require a selection of one or more Groups and one or more Slaves to be active.
Add: This will add all of the selected Groups to all of the selected Slaves, if they weren't already assigned.
Remove: This will remove all of the selected Groups from all of the selected Slaves, if applicable.
Copy: This will copy the groups from the selected slave to the clipboard.
Paste: This will paste the groups that were copied using the Copy button to the selected slaves.
Clear: This will clear all the groups from all of the selected Slaves. This option does not require a Group
to be selected.
Pool Management Dialog
The Pool Management dialog functions similarly to the Group Management dialog described above, but with a few
added options to deal with managing Pool Ordering for individual Slaves.

The functions you can perform here are as follows. Note that a lot of these overlap with the Group Management functionality described in the previous section.
Pools: This section displays existing Pools and allows you to manipulate them, or create new ones. Your
selection here will determine which Pools will be affected by the Pool Operations described below.
New: This will create a new Pool in the Repository, and allow you to assign it to Slaves. You will be prompted for a name for the new Pool; note that Pool names cannot be changed once the Pool has been created. Adding a Pool with the name of a previously Deleted Pool will effectively re-instate that Pool if it hasn't been Purged yet (see below).

5.7. Pools and Groups

289

Deadline User Manual, Release 7.1.0.35

Delete: This will Delete all of the selected Pools from the Repository, and enable the option to Purge them
(described below).
Purge Obsolete Pools on Close: This will purge any obsolete (deleted) Pools from existing Jobs and
remove them from any Slaves that may have them in their list. They will be replaced with the Pool
selected in the drop down. Note that if you choose not to Purge the obsolete Pools right now, you can
always return to this dialog and do it later.
Priority Distribution: This graph visualizes how many Slaves have one of the selected Pools as #1 priority, #2 priority, etc. It also displays how many Slaves are not currently assigned to the selected Pools.
Slaves: This section displays a list of all known Slaves that have connected to your Repository. Your selection
here will determine which Slaves will be affected by the Pool Operations described below.
Only Show Slaves Assigned to a Selected Pool: This option will filter the displayed Slaves to only include
the ones that are currently assigned to at least one of the selected Pools.
Pool Operations: These operations are used to manipulate which Pools are assigned to which Slaves. They
typically require a selection of one or more Pools and one or more Slaves to be active.
Add: This will add all of the selected Pools to all of the selected Slaves, if they weren't already assigned.
Remove: This will remove all of the selected Pools from all of the selected Slaves, if applicable.
Promote: This will bump up the selected Pools by one position in all of the selected Slaves' Pool lists. Any selected Slaves that are not assigned to the selected Pool(s) are unaffected.
Demote: This will bump down the selected Pools by one position in all of the selected Slaves' Pool lists. Any selected Slaves that are not assigned to the selected Pool(s) are unaffected. Note that a Pool cannot be demoted to be lower than the none Pool; the none Pool is always assigned the lowest priority by Slaves.
Copy: This will copy the pools from the selected slave to the clipboard.
Paste: This will paste the pools that were copied using the Copy button to the selected slaves.
Clear: This will clear all the Pools from all of the selected Slaves. This option does not require a Pool to
be selected.
Preventing Slaves from Rendering Jobs in the none Pool or Group
In some cases, it may be useful to prevent one or more Slaves from rendering Jobs that are assigned to the none Pool or Group. For example, you may have a single machine that you want to only render Quicktime Jobs. Normally, you could add this machine to a quicktime Group, but if there are no Quicktime Jobs, the Slave could move on to Jobs that are in the none Group. If you want this machine to only be available for Quicktime Jobs, you can configure it to exclude Jobs in the none Group.
The option to exclude Jobs in the none Pool or Group can be found in the Slave Settings in the Monitor.

5.7.3 Pools and Job Scheduling


How pools affect the Job selection process is best explained through an example. Note that this example relies on a
Scheduling Order where Pools are the primary determining factor of scheduling (such as the default Pool -> Priority
-> Submit Date scheme).
Say we need to render Jobs for two different shows, and we've already created corresponding pools for each show in
Deadline:
show_a


show_b
Now say we have 10 machines in our render farm, and we want to give each show top priority on half of them. To do this, we'd just assign the pools to our Slaves like this:
Slaves 1-5:
1. show_a
Slaves 6-10:
1. show_b
With this setup, if Jobs from both shows are in the queue, then Slaves 1-5 will pick up the Jobs from show_a, while
Slaves 6-10 will work on Jobs from show_b. This effectively splits our farm in half, like we desired, but with this
configuration Slaves 1-5 would sit idle once show_a finishes production, even if there are plenty of show_b Jobs in the
queue. The reverse would also be true if show_b production slows down while show_a is still ramping up.
To accomplish this second goal of maximizing our resources, we'll assign the Pools to our Slaves as follows:
Slaves 1-5:
1. show_a
2. show_b
Slaves 6-10:
1. show_b
2. show_a
Now, Slaves 1-5 will still give top priority to show_a Jobs, and Slaves 6-10 will similarly give top priority to show_b
Jobs. However, if there are no show_a Jobs currently in the queue, Slaves 1-5 will start working on show_b Jobs
until another show_a Job comes along. Similarly, Slaves 6-10 would start working on show_a if no show_b Jobs were
available.
This concept is also extensible to any number of shows/pools; you just have to make sure to have an even Priority Distribution across your farm (the Priority Distribution graph should help with that). Here's an example of what the Priority Distribution for a 3-show farm with 15 Slaves could look like:
Slaves 1-5:
1. show_a
2. show_b
3. show_c
Slaves 6-10:
1. show_b
2. show_c
3. show_a
Slaves 11-15:
1. show_c
2. show_a
3. show_b

5.7. Pools and Groups

291

Deadline User Manual, Release 7.1.0.35

5.7.4 Secondary Pools and Job Scheduling


How secondary pools affect the Job selection process is best explained through an example. Note that this example
relies on a Scheduling Order where Pools are the primary determining factor of scheduling (such as the default Pool
-> Priority -> First-in First-out option). The Secondary Pool feature was designed for job scheduling orders that have
Pool listed first, and might not work as expected otherwise.
Let's say you have 5 pools and 10 slaves. You want each pool to have top priority on 2 machines, but then be able to
spread to the rest of them if they are idle. Without using the secondary pool system, you might have something like
this:
Slaves 0-1: pool_1, pool_2, pool_3, pool_4, pool_5
Slaves 2-3: pool_2, pool_3, pool_4, pool_5, pool_1
Slaves 4-5: pool_3, pool_4, pool_5, pool_1, pool_2
Slaves 6-7: pool_4, pool_5, pool_1, pool_2, pool_3
Slaves 8-9: pool_5, pool_1, pool_2, pool_3, pool_4
This can be tricky to maintain if you have to reorganize pools or new slaves are added to the farm. The new secondary
pool system can make this easier:
Slaves 0-1: pool_1, pool_all
Slaves 2-3: pool_2, pool_all
Slaves 4-5: pool_3, pool_all
Slaves 6-7: pool_4, pool_all
Slaves 8-9: pool_5, pool_all
In this case, all jobs could have pool_all as their secondary pool, and will spread to the rest of the farm if machines
become available.

5.8 Limits and Machine Limits


5.8.1 Overview
In order to support rendering applications that use floating licensing to limit the number of clients rendering at any one
time, Deadline supports the ability to create Limits to manage this restriction. When creating a Limit, be sure to set
the limit to the number of network licenses you have for the product.
For example, if you have 20 nodes in your render farm and only 10 licenses of Nuke, you can create a Nuke Limit
with a limit of 10. During rendering Deadline will ensure that no more than 10 machines are rendering Jobs associated
with this Nuke Limit at any given time. Because of this, you never have to worry about licensing issues.
Machine Limits function similarly, but are on a per-Job basis. Instead of limiting how many Slaves can render a group
of Jobs, they limit the number of Slaves that can render one particular Job. This is useful if you want to prevent a job
from potentially taking up the entire farm.

5.8.2 Job Machine Limits


Machine Limits are a per-Job option, and can be managed through the Job's Properties window, which you can get to by right-clicking on the Job and selecting Modify Job Properties. More information on the available Machine Limit settings can be found in the Controlling Jobs documentation.


5.8.3 Limits
Limits can be managed from the Limit list in the Monitor while in Super User mode (or as a user with appropriate
User Group privileges). This list shows all the Limits that are in your Repository. It also displays useful information
about each Limit, such as its name, its limit, and the number of Limit stubs that are currently in use. You can access
many options for the Limits (listed below) by right-clicking on them, and you can create a new Limit by clicking on
the [+] button in the Limit list's toolbar.

If the Limits panel is not visible, see the Panel Features documentation for instructions on how to create new panels
in the Monitor.


New Limit
Use this option to add a new Limit to your Repository.

You can modify the following settings for the new Limit:
Name
The name of the new Limit. Note that this setting cannot be changed once the Limit has been created.
Usage Level
The level at which a Limit Stub will be checked out. Slave is the default, and will require each Slave
to acquire a Stub; if Machine is selected, only a single Stub will be required for all Slaves on the same
machine. Conversely, if Task is selected, Slaves will try to acquire one Stub per concurrent Render
Thread. Note that this setting cannot be changed after Limit creation.
Limit
The maximum number of simultaneous uses that this Limit can support at any given time. What counts as a use is based on the usage Level (it will be counted at the Machine, Slave, or Task level).
Release at Task Progress
If enabled, Slaves will release their Limit stub when the current Task reaches the specified percentage.
Note that not all Plugins report Task progress.
Whitelisted/Blacklisted Slaves
If Slaves (or Machines, depending on Level selected above) are on a Blacklist, they will never try to render
Jobs associated with this Limit. If Slaves/Machines are on a Whitelist, then they are the only ones that
will try to render Jobs associated with this Limit. Note that an empty blacklist and an empty whitelist are
functionally equivalent, and have no impact on which machines the job renders on.
Slaves Excluded From Limit
These Slaves (or Machines, depending on Level selected above) will ignore this Limit and won't contribute to the Limit's stub count. This is useful if you are juggling a mix of floating and node-locked licenses, in which case your machines with node-locked licenses should be placed on this list.
Clone Limit
This option allows you to create a new Limit while using an existing Limit as a template. It will bring up a dialog
very similar to the one pictured in Create Limit, with all the same options. This option is handy if you want to create
a Limit that is very similar to an existing one, but with a small variation.
Modify Limit Properties
This option allows you to edit the settings for an existing Limit. All of the settings described in the New Limit section above can be changed except for the Limit's Name and Usage Level, which cannot be changed once the Limit has been created.
Reset Limit Usage Count
Sometimes a Limit stub will get orphaned, meaning that it is counting against the Limit's usage count, but no machines are actually using it. After a while, Deadline will eventually clean up these orphaned Limit stubs. This option provides the means to delete all existing stubs immediately (whether they are orphaned or not), which will require Slaves to acquire them again.
Delete Limit
Removes an existing Limit from your Repository. Any Jobs associated with deleted Limits will still be able to render,
but they will print out Warnings indicating that the Limit no longer exists.

5.8.4 Limits and Job Scheduling


Although Limits and Job Machine Limits aren't priority-based like Pools, they do have an impact on job scheduling.
Here are some examples.
Limits
If a job is assigned to a Limit, and that Limit is currently maxed out, the job will not be picked up by any
additional slaves.


If a job is assigned to a Limit, and that Limit has a whitelist, the job will only render on the slaves in that
whitelist.
If a job is assigned to two Limits, and one of those Limits is currently maxed out, the job will not be picked up
by any additional slaves. This is because a slave must be able to acquire all Limits that the job requires.
If a job is assigned to two Limits, and one of those Limits has slave_1 on its blacklist, slave_1 will never pick
up the job. This is because a slave must be able to acquire all Limits that the job requires.
Job Machine Limits
If a job has a Machine Limit greater than 0, and that Limit is currently maxed out, the job will not be picked up
by any additional slaves.
If a job has a whitelist, the job will only render on the slaves in that whitelist.
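As a hedged sketch of these rules (the Limit and job fields below are assumptions for the example, not Deadline's actual objects), a slave can only pick up a job when every Limit the job requires can be acquired:

# Illustrative sketch of the Limit checks described above.
def slave_can_pick_up(slave, job, limits):
    for limit_name in job.required_limits:
        limit = limits[limit_name]
        if slave.name in limit.blacklist:
            return False  # blacklisted slaves never acquire a stub for this Limit
        if limit.whitelist and slave.name not in limit.whitelist:
            return False  # a non-empty whitelist excludes every other slave
        if limit.stubs_in_use >= limit.limit:
            return False  # the Limit is currently maxed out
    return True           # every required Limit can be acquired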

5.9 Job Failure Detection


5.9.1 Overview
Job Failure Detection can be used to prevent problematic Jobs from wasting previous render time on the farm. There are
two types of Failure Detection, which are both explained below. By default, Jobs will fail after they have accumulated
100 errors, but this can be changed in the Job Settings section of the Repository Configuration.


5.9.2 Job Failure Detection


A Job will enter the Failed state when it has accumulated the maximum permitted number of errors. Once in the Failed state, the Job will no longer be picked up by Slaves for rendering without manual intervention. Because of this, Job Failure Detection can help ensure that problematic Jobs are flagged appropriately and won't waste precious rendering time. In the Repository Options, you can set up failure thresholds for Jobs and for individual Tasks.


If you've resolved the problems that were preventing the Job from rendering properly, you can right-click on it in the Monitor and select Resume Failed Job. You will then be prompted with the option to ignore or override Failure Detection for this Job going forward. Note that an Error Limit of 0 indicates that there is no limit, and the Job will never be marked as Failed by Failure Detection.
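A minimal sketch of that rule follows; the function and parameter names are assumptions made for illustration only.

# Minimal sketch of the Job Failure Detection rule described above.
def job_should_fail(error_count, error_limit):
    # An error limit of 0 means the Job is never failed by Failure Detection.
    return error_limit > 0 and error_count >= error_limit

print(job_should_fail(100, 100))  # True  - the default threshold has been reached
print(job_should_fail(500, 0))    # False - a limit of 0 disables Failure Detection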

If you choose not to ignore Failure Detection, make sure to clear the Job's errors, or a single new error will result in the Job failing again, because its error count is still over the limit. To clear a Job's errors, simply delete all of the Job's Error Reports using the Job Reports Panel.

5.9.3 Slave Failure Detection


Slave Failure Detection works a little differently than Job Failure Detection. Basically, if a particular Slave reports consecutive errors for a given Job, it will add itself to the Job's list of Bad Slaves. When a Slave has been marked as bad for a particular Job, it will not try to render that Job again until it has no other Jobs available. This helps ensure that Slaves aren't wasting render time on Jobs that they likely aren't able to render.
If the issue preventing a Slave from rendering a particular Job properly has been resolved, you can remove it from a
Job's bad list by navigating to the Failure Detection section of a Job's Properties dialog. There is also an option in this section to have your Job completely ignore Slave Failure Detection, if you wish.

5.10 Notifications
5.10.1 Overview
Deadline can be configured to notify Users when their Jobs finish, or if they have failed. In addition, Deadline can be configured to send notifications to administrators when certain events occur on the farm (e.g., when a Slave has stalled, or if a Slave is being shut down by Power Management).

5.10.2 Email Notifications


Before Deadline can send email notifications, you need to configure the Email Notification settings in the Repository
Configuration.


5.10.3 Popup Message Notifications


The popup message notification system can be used to send job notifications to users by popping up a message window
on their workstations.
In order to receive popup message notifications, the user needs to have the Launcher running on their workstation, and
have their workstation machine name specified in their User Settings (see below).

5.10.4 Job Notifications


Users can edit their User Settings to control whether or not they receive notifications for their own Jobs.


In order to receive email notifications, the user needs to set their Email Address setting and enable the Email Notification option. Note that email notifications will only be sent if the SMTP settings in the Repository Options are set
properly, as mentioned in the previous section.
In order to receive popup message notifications, the user needs to have the Launcher running on their workstation, and
have their workstation machine name specified in their User Settings.

5.11 Remote Control


5.11.1 Overview
Remote control features are built into the Monitor to make farm administration easier. These features allow you to connect to and control the Slave application on your render nodes, and also run remote commands on them. They also allow you to control Pulse and the Balancer (if you're running them on your farm).
If the Slaves, Pulse, or Balancer panel are not visible, see the Panel Features documentation for instructions on how to
create new panels in the Monitor.

5.11.2 Connecting to the Application Logs


You can remotely connect to the Slave, Pulse, or Balancer log from the Monitor.
Connecting to the Slave Log
You can remotely connect to a Slave by double-clicking on it in the Slave panel, or by right-clicking on it and selecting
Connect To Slave Log. This will bring up the Slave Log window, allowing you to see what the Slave is currently
doing.

There are a few places in the Monitor you can find the option to connect to the Slave log:
The Slave panel right-click menu.
The Task panel right-click menu. Note that it will only appear for rendering or completed tasks.
The Job Report panel right-click menu.
The Slave Report panel right-click menu.


Connecting to the Pulse Log


You can remotely connect to a Pulse by double-clicking on it in the Pulse panel, or by right-clicking on it and selecting
Connect To Pulse Log. This will bring up the Pulse Log window, allowing you to see what the Pulse is currently doing.

Connecting to the Balancer Log


You can remotely connect to a Balancer by double-clicking on it in the Balancer panel, or by right-clicking on it
and selecting Connect To Balancer Log. This will bring up the Balancer Log window, allowing you to see what the
Balancer is currently doing.


5.11.3 Remote Controlling Slaves, Pulses, and Balancers


The Remote Control menu can be found in the Slave, Pulse, and Balancer panels' right-click menus. Note that the availability of these options can vary depending on the context in which they are used, as well as the User Group Permissions that are defined for the current user. Remote Administration must also be enabled on the farm, and can be enabled in the Client Setup.
These are the options that are available in the Slave, Pulse, and Balancer Remote Control menus:
Start Machine: Starts the machine using Wake On Lan.
Shutdown Machine: Turns off the machine.
Restart Machine: Restarts the machine.
Suspend Machine: Sets the machine's state as suspended (Windows Only).
Execute Command: Executes an arbitrary command on the machine.


When executing an arbitrary command, if you want to execute a DOS command on a Windows machine, the command
must be preceded with cmd /C. This opens the DOS prompt, executes the command, and closes the prompt. For
example:
cmd /C echo "foo" > c:\test.txt
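Similarly, on Linux and Mac OS X machines, shell features such as redirection are likely only available if you invoke a shell explicitly. The following is a hedged example (the shell path and behaviour may vary on your render nodes):
/bin/sh -c "echo foo > /tmp/test.txt"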

These remote commands do not allow for user input for any command requiring a prompt. An example where this might cause confusion is with Microsoft's xcopy command. Here, if it is uncertain whether the destination is a file or a folder, xcopy will immediately exit as though successful instead of asking what should be done.
If a command returns a non-zero exit code, the command will be interpreted as having failed.
Slave Remote Control Options
These options are only available in the Slave Remote Control menu:
Search For Jobs: Forces the Slave to search the Repository for a job to do.
Cancel Current Tasks: Forces the Slave to cancel its current tasks.
Start Slave: Starts the Slave instance.
Stop Slave: Stops the Slave instance.
Restart Slave: Restarts the Slave instance.
Continue Running After Current Task Completion: The Slave will continue to run after it finishes its current
task.
Stop Slave After Current Task Completion: The Slave will stop after the current task is completed.
Restart Slave After Current Task Completion: The Slave will restart after the current task is completed.
Shutdown Machine After Current Task Completion: The machine running the Slave will shut down after the current task is completed.
Restart Machine After Current Task Completion: The machine running the Slave will restart after the current
task is completed.
Start All Slave Instances: Starts all the slave instances on the selected machines.
Start New Slave Instance: Starts a new slave instance with the specified name on the selected machine.


Pulse Remote Control Options


These options are only available in the Pulse Remote Control menu:
Perform Pending Job Scan: Forces Pulse to perform the Pending Job Scan operation.
Perform House Cleaning: Forces Pulse to perform the House Cleaning operation.
Perform Repository Repair: Forces Pulse to perform the Repository Repair operation.
Perform Power Management Check: Forces Pulse to perform the Power Management check.
Start Pulse: Starts the Pulse instance.
Stop Pulse: Stops the Pulse instance.
Restart Pulse: Restarts the Pulse instance.


Balancer Remote Control Options


These options are only available in the Balancer Remote Control menu:
Perform Balancing: Forces the Balancer to perform the Balancing operation.
Start Balancer: Starts the Balancer instance.
Stop Balancer: Stops the Balancer instance.
Restart Balancer: Restarts the Balancer instance.

5.11.4 Remote Commands Panel


The Remote Command panel shows all pending and completed remote commands that were sent from the Monitor.
When sending a remote command, if this panel is not already displayed, it will be displayed automatically (assuming
you have permissions to see the Remote Command panel).

You can view the results of a remote command by clicking on the command in the list. The full results will be shown
in the log window below. All successful commands will start with Connection Accepted.

5.11.5 Remote Desktop Software


There are many applications that allow you to remotely control another computer. The following applications are
supported by Deadline out of the box via Monitor scripts. The scripts can be run from the Scripts menu in the Monitor,
or by right-clicking on a Slave, Pulse, or Balancer in their respective panels. Right-click scripts can also be found in
the Task and Report panels.


Apple Remote Desktop (ARD)


With Apple Remote Desktop (ARD), you can observe and obtain access to the computers on your network. Note that
in order to connect to a machine from the Monitor, that machine must already be in the ARD list of computers because
Deadline can't create new computer entries and add them to the list. An error message is displayed if the machine can't be found in the ARD list.

The following options are available in the ARD window in the Monitor:
Machine IP Address(s): Specify which machines to connect to. Use a comma to separate multiple machine
names.
Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.
Radmin
Radmin is fast, secure and affordable remote-control software that enables you to work on a remote computer in real
time as if you were sitting in front of it.


The following options are available in the Radmin window in the Monitor:
Machine Name(s): Specify which machines to connect to. Use a comma to separate multiple machine names.
Radmin Viewer: The Radmin viewer executable to use.
Radmin Port: The Radmin port.
Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.
Remote Desktop Connection (RDC)
With Remote Desktop Connection (RDC), you can easily connect to a terminal server or to another computer running
Windows. All you need is network access and permissions to connect to the other computer.

The following options are available in the RDC window in the Monitor:
Machine Name(s): Specify which machines to connect to. Use a comma to separate multiple machine names.
Settings:
No Settings: When this option is chosen, no existing RDP settings are used to connect.
Settings File: When this option is chosen, the specified RDP config file is used to connect.
Settings Folder: When this option is chosen, existing RDP config files in this folder are used to connect. If the machine does not have an RDP config file, you'll have the option to save one before connecting.
Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.


VNC
Virtual Network Computing (VNC) is a desktop protocol to remotely control another computer. It transmits the keyboard presses and mouse clicks from one computer to another, relaying the screen updates back in the other direction over a network. There are many options available for VNC software. TightVNC, RealVNC, UltraVNC, and Chicken have all been used successfully with Deadline.

The following options are available in the VNC window in the Monitor:
Machine Name(s): Specify which machines to connect to. Use a comma to separate multiple machine names.
VNC Viewer: The VNC viewer executable to use.
Password: The VNC password.
VNC Port: The VNC port.
Remember Password: Enable to remember your password between sessions.
Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.

5.12 Network Performance


5.12.1 Overview
This guide is intended to help you find and fix potential bottlenecks in your Deadline render farm. If you are noticing
sluggish performance when you are using Deadline, there are a few things you can do to try and improve it.

5.12.2 Adjust Monitor and Slave Settings


There are a few Monitor and Slave settings in the Repository Options that you can tweak to help improve performance,
and reduce load on both the network and the database. You can also use the Auto Adjust option to figure out the best
default values to use based on the number of Slaves in your farm. See the Repository Options documentation for more
information.


5.12.3 Enable Throttling


Pulse supports a Throttling feature, which is helpful if you're submitting large files with your jobs. This can be used to limit the number of Slaves that are copying over the Job files at the same time. The Throttling settings can be found in the Pulse Settings section of the Repository Options.


For example, if you have 100 Slaves, and you're submitting 500MB scene files with your jobs, you may notice a performance hit if all 100 Slaves try to copy over the Job and Plugin files at the same time. You could set the Slave Throttle Limit to 10, so that only 10 of those Slaves will ever be copying those files at the same time. When a Slave goes to render subsequent tasks for the same Job, it will not be affected by the throttling feature, since it already has the required files. Note that for this feature to work, you must be running Pulse.

5.12.4 Utilize Limits / Machine Limits


Irrespective of Pulse Throttling, if your scene files (Maya, 3dsMax, modo, etc) are referencing a large number of external asset files (textures, geo caches), then at initial startup of a job on multiple machines, your network file storage solution may struggle with this firestorm of I/O demand. To lower this demand on your file server, you can use Machine Limits or Limits. One aspect of the Limits feature is the ability to tell the slave not to release its stub until the task it is rendering has reached a certain percentage, at which point it can be presumed to have downloaded all the assets it needs. Note that not all Plugins report task progress, which this feature requires to operate correctly.


5.12.5 Manage Job Auxiliary Files


If you are submitting your scene files with your Jobs, this can affect overall performance if the scene files are quite large. This is because whenever a Slave starts a new Job, it copies those Job files locally before rendering, including the Scene file if submitted with the Job. As mentioned in the previous section, if you have hundreds of Slaves starting a Job with a large scene file, and your Repository hardware isn't built to handle a large load, your performance will suffer.
If enabling Throttling isn't helping, another option (which can also be used in conjunction with it if you wish) is to configure Deadline to store these scene files in an alternate location (like a separate, dedicated file server). This can be done by configuring the Job Auxiliary Files settings in the Repository Options.


From here, you can choose a server that's better equipped to handle the load, which will help improve the performance
and stability of your Repository machine, especially if it is also hosting your Database backend. In a mixed farm
environment, you need to ensure that the paths for each operating system resolve to the same location. Otherwise, a
scene file submitted with the Job on one operating system will not be visible to a Slave running on another.

5.13 Cross Platform Rendering


5.13.1 Overview
Many of the applications that Deadline supports are available for multiple operating systems, and if you have a mixed
farm, you will probably run into one or more of these scenarios:
You want to submit Jobs from one operating system and render on a different one.
You want one or more Jobs to render on machines with different operating systems concurrently.
Both of these can be achieved thanks to Deadline's Path Mapping feature. While there may be other considerations to take into account, depending on the application you're rendering with, the Path Mapping feature will do most of the work for you.


5.13.2 Mapped Path Setup


When using a mixed render farm, it is all but guaranteed that asset paths will be different on each operating system.
In many cases, Deadline is aware of the paths being passed to the rendering application, so you can configure Path
Mappings to swap out these paths when appropriate based on the operating system.

To add a new Path Mapping, just click the Add button. Then, you specify the path that needs to be swapped out,
along with the paths that will be swapped in based on the operating system. You can also specify a region so you can
have different mappings for the same path across different regions. For best results, make sure that all paths end with
their appropriate path separator (/ or \). This helps avoid mangled paths that are a result of one path with a trailing
separator, and one without.


Note that these swaps only work one-way, so if you are swapping from PC to Linux and vice-versa, you will need two separate entries. For example, let's say the PC machines use the path \\server\share\ for assets, while the Linux machines use the path /mnt/share/. Here is what your two entries should look like:
Entry 1 (replaces the Linux path with the PC path on PCs):
Replace Path: /mnt/share/
Windows Path: \\server\share\
Linux Path:
Mac Path:

Entry 2 (replaces the PC path with the Linux path on Linux):


Replace Path: \\server\share\
Windows Path:
Linux Path: /mnt/share/
Mac Path:

If you have Mac machines as well, you will need three entries. For example, if the Macs use /Volumes/share/ to access the assets from the previous example, here is what your three entries should look like:
Entry 1 (replaces the Linux path with the PC path on PCs and the Mac path on Macs):
Replace Path: /mnt/share/
Windows Path: \\server\share\
Linux Path:
Mac Path: /Volumes/share/

Entry 2 (replaces the PC path with the Linux path on Linux and the Mac path on Macs):
Replace Path: \\server\share\
Windows Path:
Linux Path: /mnt/share/
Mac Path: /Volumes/share/

Entry 3 (replaces the Mac path with the PC path on PCs and the Linux path on Linux):
Replace Path: /Volumes/share/
Windows Path: \\server\share\
Linux Path: /mnt/share/
Mac Path:

By default, Deadline just uses regular string replacement to swap out the paths. In this case, Deadline takes care of
the path separators (/ and \) automatically. If you want more flexibility, you can configure your path mappings to
use regular expressions, but note that you will need to handle the path separators manually using [/\\] in your regular
expressions.
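For example, an entry that matches /mnt/share/ regardless of separator direction might use a pattern like the one below as its Replace Path. This is only an illustrative sketch; the replacement fields follow the same layout as the string-replacement entries above, and you should verify the exact pattern against your own paths:
Replace Path: [/\\]mnt[/\\]share[/\\]
Windows Path: \\server\share\
Linux Path:
Mac Path: /Volumes/share/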

5.13.3 Application-Specific Considerations


For some applications, like Maya and Nuke, configuring Path Mappings is enough to allow for cross-platform rendering. For other applications, like After Effects and Cinema 4D, more setup is required. More information on how to
render with these applications on mixed farms can be found in their Cross-Platform Rendering Considerations sections
in the Plugins documentation.

5.13.4 Regions
Regions can be used to set up different mappings for the same path across your farm. For example, let's say we have a
local farm and a remote farm, and we want to map the path /mnt/share/ in our remote farm but not in our local farm.
All we have to do is set the region of our mapping to the same region our remote slaves are in. Slaves in the region
will replace /mnt/share/ but all the other slaves will use /mnt/share/ normally. We could also set up an alternate path
for the slaves in our local farm.
A mapping in the All region will apply to every region. It should be noted that a region's mapping is applied before
the All region.

CHAPTER SIX

ADVANCED FEATURES

6.1 Manual Job Submission


6.1.1 Overview
Manual job submission is useful if you want more control over the submission process. For example, if you're writing
a custom submission script, or you are integrating the submission process into an internal pipeline tool, you will
probably want full control over which job settings are being set.
If you are just looking to submit jobs from one of the many scripts that are shipped with Deadline, you should refer to
the Submitting Jobs documentation.

6.1.2 Arbitrary Command Line Jobs


To manually submit arbitrary command line jobs, you can use the -SubmitCommandLineJob option with the Command
application. The key parameters that you need to specify are:
-executable: The executable we wish to use.
-arguments: The arguments we wish to pass to the executable. In the arguments string, there are a few key
words that will be replaced with the appropriate text when rendering the job:
<STARTFRAME> will be replaced with the current start frame for each task.
<ENDFRAME> will be replaced by the current end frame for each task.
<STARTFRAME%#> will be replaced with the current start frame for each task, and will be padded with
0s to match the length specified for #. For example, <STARTFRAME%4> will ensure a start frame
padding of 4.
<ENDFRAME%#> will be replaced by the current end frame for each task, and will be padded with 0s to
match the length specified for #. For example, <ENDFRAME%6> will ensure an end frame padding of 6.
<QUOTE> will be replaced with an actual quote character (").
-frames: The frames we wish to render.
The following parameters can also be included, but are optional:
-startupdirectory: The directory that the command line will start up in.
-chunksize: The number of frames per task (defaults to 1).
-pool: The pool we wish to submit to (defaults to none).
-group: The group we wish to submit to (defaults to none).

-priority: The job's priority (defaults to 50).
-name: The job's name (defaults to Untitled).
-department: The job's department (defaults to blank).
-initialstatus: Specify Active or Suspended (defaults to Active).
-prop: Specify additional job properties in the form KEY=VALUE, where KEY is any of the property names
that can be specified in the Job Info File.
For example, say we want to submit a job that uses 3dsmaxcmd.exe to render frames in the scene file
\\shared\path\scene.max. We want to render frames 1 to 10, and we want an image resolution of 480x320. The
command line to do this from a command prompt would look like:
3dsmaxcmd.exe -start:1 -end:10 -width:480 -height:320 "\\shared\path\scene.max"

For the job, we want a task chunk size of 2, we want to submit to the 3dsmax group, we want a priority of 50, and we
want a machine limit of 5. Finally, we want to call the job "3dsmax command line job". The command line to submit
this job would look like this:
deadlinecommand.exe -SubmitCommandLineJob -executable "c:\Program Files\Autodesk\3dsmax8\3dsmaxcmd.exe" -arguments "-start:<STARTFRAME> -end:<ENDFRAME> -width:480 -height:320 <QUOTE>\\shared\path\scene.max<QUOTE>" -frames 1-10 -chunksize 2 -group 3dsmax -priority 50 -name "3dsmax command line job" -prop MachineLimit=5
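With a chunk size of 2, the first task covers frames 1 and 2, so after the keyword substitutions described above, the command the Slave actually runs for that task would be equivalent to:
3dsmaxcmd.exe -start:1 -end:2 -width:480 -height:320 "\\shared\path\scene.max"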

6.1.3 Maintenance Jobs


Maintenance jobs are special jobs where each task for the job will render on a different machine in your farm. This
is useful for performing benchmark tests, installing new software, synchronizing files to each machine, etc. When a
maintenance job is submitted, a task will automatically be created for each slave, and once a slave has finished a task,
it will no longer pick up the job.
One way to submit a Maintenance job is to manually submit a job to Deadline by creating the necessary job
submission files as documented below. In the job info file, you must set MaintenanceJob to True:
MaintenanceJob=True

By default, a Maintenance job will render frame 0 on every machine. To render a different frame, or a sequence of
frames, you can specify the MaintenanceJobStartFrame and MaintenanceJobEndFrame options in the job info file:
MaintenanceJob=True
MaintenanceJobStartFrame=1
MaintenanceJobEndFrame=5


Note that if you specify a whitelist or blacklist in the job info file, the number of tasks that are created for the Maintenance job will equal the number of valid slaves that the job could render on.
Another way to submit a Maintenance job is to right-click on an existing job in the Monitor and choose the Resubmit
Job option. See the Resubmitting Jobs section of the Controlling Jobs documentation for more information.

6.1.4 Creating Job Submission Files


This is the method that our submission scripts use to submit jobs. This method is far more flexible, but requires more
work to set up the job. It also uses the Command application to submit the job.
Before the job can be submitted though, a Job Info File and a Plug-in Info File must be created. These are the first two
files that should always be submitted with the job.
You can also submit additional auxiliary files with the job, such as the scene file you want to render, or any other files
the job will need. Any number of auxiliary files can be specified after the job info and plugin info file. These auxiliary
files are copied to the Repository, and are then copied locally to each machine during rendering. Because these files
will be copied to the same folder, it is necessary that every file name be unique.
Once these files are ready to go, you can submit the job using the command line:
deadlinecommand.exe [Job Info File] [Plug-in Info File] [Auxiliary File 1] [Auxiliary File 2]
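For example, if the two description files and a scene file to submit as an auxiliary file were saved locally, the submission could look like this (the file names and paths here are purely hypothetical):
deadlinecommand.exe c:\submission\job_info.job c:\submission\plugin_info.job c:\submission\scene.max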

Job Info File


The Job Info File is a plain text file that uses Key/Value pairs (key=value) to define all the generic job options used to
render the job. A couple of options are required, but most are optional. All jobs can use these options, regardless of the
plug-in that they use. Some examples have been provided further down.
Required Options
These options must be specified in the job info file, or the job will fail to submit. The rest of the options are optional.
Plugin=<plugin name> : Specifies the plugin to use. Must match an existing plugin in the repository.
General Options
Frames=<1,2,3-10,20> : Specifies the frame range of the render job. See the Frame List Formatting Options in
the Job Submission documentation for more information (default = 0).
Name=<job name> : Specifies the name of the job (default = Untitled).
UserName=<username> : Specifies the job's user (default = current user).
MachineName=<machineName> : Specifies the machine the job was submitted from (default = current machine).
Department=<department name> : Specifies the department that the job belongs to. This is simply a way to
group jobs together, and does not affect rendering in any way (default = blank).
Comment=<comment> : Specifies a comment for the job (default = blank).
Group=<groupName> : Specifies the group that the job is being submitted to (default = none).
Pool=<poolName> : Specifies the pool that the job is being submitted to (default = none).
SecondaryPool=<poolName> : Specifies the secondary pool that the job can spread to if machines are available. If not specified, the job will not use a secondary pool.
Priority=<0 or greater> : Specifies the priority of a job with 0 being the lowest (default = 50). The maximum
priority can be configured in the Job Settings of the Repository Options, and defaults to 100.


ChunkSize=<1 or greater> : Specifies how many frames to render per task (default = 1).
ForceReloadPlugin=<true/false> : Specifies whether or not to reload the plugin between subsequent frames of
a job (default = false). This deals with memory leaks or applications that do not unload all job aspects properly.
SynchronizeAllAuxiliaryFiles=<true/false> : If enabled, all job files (as opposed to just the job info and plugin
info files) will be synchronized by the Slave between tasks for this job (default = false). Note that this can add
significant network overhead, and should only be used if you plan on manually editing any of the files that are
being submitted with the job.
InitialStatus=<Active/Suspended> : Specifies what status the job should be in immediately after submission
(default = Active).
LimitGroups=<limitGroup,limitGroup,limitGroup> : Specifies the limit groups that this job is a member of
(default = blank).
MachineLimit=<0 or greater> : Specifies the maximum number of machines this job can be rendered on at
the same time (default = 0, which means unlimited).
MachineLimitProgress=<0.1 or greater> : If set, the slave rendering the job will give up its current machine
limit lock when the current task reaches the specified progress. If negative, this feature is disabled (default =
-1.0). The usefulness of this feature is directly related to the progress reporting capabilities of the individual
plugins.
Whitelist=<slaveName,slaveName,slaveName> : Specifies which slaves are on the jobs whitelist (default =
blank). If both a whitelist and a blacklist are specified, only the whitelist is used.
Blacklist=<slaveName,slaveName,slaveName> : Specifies which slaves are on the jobs blacklist (default =
blank). If both a whitelist and a blacklist are specified, only the whitelist is used.
ConcurrentTasks=<1-16> : Specifies the maximum number of tasks that a slave can render at a time (default
= 1). This is useful for script plugins that support multithreading.
LimitTasksToNumberOfCpus=<true/false> : If ConcurrentTasks is greater than 1, setting this to true will
ensure that a slave will not dequeue more tasks than it has processors (default = true).
Sequential=<true/false> : Sequential rendering forces a slave to render the tasks of a job in order. If an earlier
task is ever requeued, the slave won't go back to that task until it has finished the remaining tasks in order
(default = false).
Interruptible=<true/false> : Specifies if tasks for a job can be interrupted by a higher priority job during
rendering (default = false).
SuppressEvents=<true/false> : If true, the job will not trigger any event plugins while in the queue (default =
false).
NetworkRoot=<repositoryUNCPath> : Specifies the repository that the job will be submitted to. This is
required if you are using more than one repository (default = current default repository for the machine from
which submission is occurring).
Cleanup Options
Protected=<true/false> : If enabled, the job can only be deleted by the job's user, a super user, or a user that
belongs to a user group that has permissions to handle protected jobs. Other users will not be able to delete the
job, and the job will also not be cleaned up by Deadline's automatic house cleaning.
OnJobComplete=<Nothing/Delete/Archive> : Specifies what should happen to a job after it completes (default = Nothing).
DeleteOnComplete=<true/false> : Specifies whether or not the job should be automatically deleted after it
completes (default = false).


ArchiveOnComplete=<true/false> : Specifies whether or not the job should be automatically archived after it
completes (default = false).
OverrideAutoJobCleanup=<true/false> : If true, the job will ignore the global Job Cleanup settings and
instead use its own (default = false).
OverrideJobCleanup=<true/false> : If OverrideAutoJobCleanup is true, this will determine if the job should
be automatically cleaned up or not.
JobCleanupDays=<number of days> : If OverrideAutoJobCleanup and OverrideJobCleanup are both true, this is the
number of days to keep the job before cleaning it up.
OverrideJobCleanupType=<ArchiveJobs/DeleteJobs> : If OverrideAutoJobCleanup and OverrideJobCleanup are both true, this is the job cleanup mode.

Environment Options
EnvironmentKeyValue#=<key=value> : Specifies environment variables to set when the job renders. This
option is numbered, starting with 0 (EnvironmentKeyValue0), to handle multiple environment variables. For
each additional variable, just increase the number (EnvironmentKeyValue1, EnvironmentKeyValue2, etc). Note
that these variables are only applied to the rendering process, so they do not persist between jobs. An example
is shown after this list.
IncludeEnvironment=<true/false> : If true, the submission process will automatically grab all the environment
variables from the submitter's current environment and set them in the job's environment variables (default =
false). Note that these variables are only applied to the rendering process, so they do not persist between jobs.
UseJobEnvironmentOnly=<true/false> : If true, only the job's environment variables will be used at render
time (default = false). If False, the job's environment variables will be merged with the slave's current environment, with the job's variables overwriting any existing ones with the same name.
CustomPluginDirectory=<directoryName> : If specified, the job will look for the plugin it needs to render
in this location. If it does not exist in this location, it will fall back on the Repository plugin directory. For example, if you are rendering with a plugin called MyPlugin, and it exists in \\server\development\plugins\MyPlugin,
you would set CustomPluginDirectory=\\server\development\plugins.
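For example, two environment variables could be defined for a job like this (the variable names and values are placeholders):
EnvironmentKeyValue0=RENDER_PROJECT=ProjectA
EnvironmentKeyValue1=TEMP_DIR=D:\RenderTemp
UseJobEnvironmentOnly=false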
Failure Detection Options
OverrideJobFailureDetection=<true/false> : If true, the job will ignore the global Job Failure Detection
settings and instead use its own (default = false).
FailureDetectionJobErrors=<0 or greater> : If OverrideJobFailureDetection is true, this sets the number of
errors before the job fails. If set to 0, job failure detection will be disabled.
OverrideTaskFailureDetection=<true/false> : If true, the job will ignore the global Task Failure Detection
settings and instead use its own (default = false).
FailureDetectionTaskErrors=<0 or greater> : If OverrideTaskFailureDetection is true, this sets the number
of errors before a task for the job fails. If set to 0, task failure detection will be disabled.
IgnoreBadJobDetection=<true/false> : If true, slaves will never mark the job as bad for themselves. This
means that they will continue to make attempts at jobs that often report errors until the job is complete, or until
it fails (default = false).
SendJobErrorWarning=<true/false> : If the job should send warning notifications when it reaches a certain
number of errors (default = false).
Timeout Options
MinRenderTimeSeconds=<0 or greater> : Specifies the minimum time, in seconds, a slave should render a
task for, otherwise an error will be reported (default = 0, which means no minimum). Note that if MinRenderTimeSeconds and MinRenderTimeMinutes are both specified, MinRenderTimeSeconds will be ignored.

6.1. Manual Job Submission

323

Deadline User Manual, Release 7.1.0.35

MinRenderTimeMinutes=<0 or greater> : Specifies the minimum time, in minutes, a slave should render a
task for, otherwise an error will be reported (default = 0, which means no minimum). Note that if MinRenderTimeSeconds and MinRenderTimeMinutes are both specified, MinRenderTimeSeconds will be ignored.
TaskTimeoutSeconds=<0 or greater> : Specifies the time, in seconds, a slave has to render a task before it
times out (default = 0, which means unlimited). Note that if TaskTimeoutSeconds and TaskTimeoutMinutes are
both specified, TaskTimeoutSeconds will be ignored.
TaskTimeoutMinutes=<0 or greater> : Specifies the time, in minutes, a slave has to render a task before it
times out (default = 0, which means unlimited). Note that if TaskTimeoutSeconds and TaskTimeoutMinutes are
both specified, TaskTimeoutSeconds will be ignored.
StartJobTimeoutSeconds=<0 or greater> : Specifies the time, in seconds, a slave has to start a render job
before it times out (default = 0, which means unlimited). Note that if StartJobTimeoutSeconds and StartJobTimeoutMinutes are both specified, StartJobTimeoutSeconds will be ignored.
StartJobTimeoutMinutes=<0 or greater> : Specifies the time, in minutes, a slave has to start a render job
before it times out (default = 0, which means unlimited). Note that if StartJobTimeoutSeconds and StartJobTimeoutMinutes are both specified, StartJobTimeoutSeconds will be ignored.
OnTaskTimeout=<Error/Notify/ErrorAndNotify/Complete> : Specifies what should occur if a task times
out (default = Error).
EnableAutoTimeout=<true/false> : If true, a slave will automatically figure out if it has been rendering too
long based on some Repository Configuration settings and the render times of previously completed tasks (default = false).
EnableTimeoutsForScriptTasks=<true/false> : If true, then the timeouts for this job will also affect its
pre/post job scripts, if any are defined (default = false).
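As a small illustration, a job that overrides task failure detection and enforces a 90-minute task timeout could include these lines in its job info file:
OverrideTaskFailureDetection=true
FailureDetectionTaskErrors=3
TaskTimeoutMinutes=90
OnTaskTimeout=ErrorAndNotify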
Dependency Options
JobDependencies=<jobID,jobID,jobID> : Specifies what jobs must finish before this job will resume (default
= blank). These dependency jobs must be identified using their unique job ID, which is outputted after the job
is submitted, and can be found in the Monitor in the Job ID column.
JobDependencyPercentage=<-1, or 0 to 100> : If between 0 and 100, this job will resume when all of its
job dependencies have completed the specified percentage number of tasks. If -1, this feature will be disabled
(default = -1).
IsFrameDependent=<true/false> : Specifies whether or not the job is frame dependent (default = false).
FrameDependencyOffsetStart=<-100000 to 100000> : If the job is frame dependent, this is the start frame
offset (default = 0).
FrameDependencyOffsetEnd=<-100000 to 100000> : If the job is frame dependent, this is the end frame
offset (default = 0).
ResumeOnCompleteDependencies=<true/false> : Specifies whether or not the dependent job should resume
when its dependencies are complete (default = true).
ResumeOnDeletedDependencies=<true/false> : Specifies whether or not the dependent job should resume
when its dependencies have been deleted (default = false).
ResumeOnFailedDependencies=<true/false> : Specifies whether or not the dependent job should resume
when its dependencies have failed (default = false).
RequiredAssets=<assetPath,assetPath,assetPath> : Specifies what asset files must exist before this job will
resume (default = blank). These asset paths must be identified using full paths, and multiple paths can be
separated with commas. If using frame dependencies, you can replace padding in a sequence with the #
character, and a task for the job will only be resumed when the required assets for the task's frame exist.


ScriptDependencies=<scriptPath,scriptPath,scriptPath> : Specifies what Python script files will be executed


to determine if a job can resume (default = blank). These script paths must be identified using full paths,
and multiple paths can be separated with commas. See the Scripting section of the documentation for more
information on script dependencies.
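For example, a frame-dependent job that waits on another job and on a rendered image sequence could use settings like the following (the job ID and asset path are illustrative values reused from the examples elsewhere in this section):
JobDependencies=546cc87357dbb04344a5c6b5
IsFrameDependent=true
RequiredAssets=\\fileserver\Project\Renders\OutputFolder\o_HDP_010_BG_v01.####.exr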
Scheduling Options
ScheduledType=<None/Once/Daily> : Specifies whether or not you want to schedule the job (default = None).
ScheduledStartDateTime=<dd/MM/yyyy HH:mm> : The date/time at which the job will run (see the example
after this list). The start date/time must match the specified format. Here's an explanation:
dd: The day of the month. Single-digit days must have a leading zero.
MM: The numeric month. Single-digit months must have a leading zero.
yyyy: The year in four digits, including the century.
HH: The hour in a 24-hour clock. Single-digit hours must have a leading zero.
mm: The minute. Single-digit minutes must have a leading zero.
ScheduledDays=<day interval> : If scheduling a Daily job, this is the day interval for when the job runs
(default = 1).
JobDelay=<dd:hh:mm:ss> : A start time delay. If there is no ScheduledStartDateTime this delay will be
applied to the submission date. The delay value is represented by the number of days, hours, minutes, and
seconds, all separated by colons.
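For example, to schedule a job to run once at 10:30 PM on May 4, 2015, the job info file could contain:
ScheduledType=Once
ScheduledStartDateTime=04/05/2015 22:30
A 12-hour delay after submission would instead be written as JobDelay=00:12:00:00.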
Output Options
OutputFilename#=<fileName> : Specifies the output image filenames for each frame (default = blank). This
allows the Monitor to display the View Output Image context menu option in the task list. There is no
minimum or maximum limit to the padding length supported. A padding of 4 (####) is very common in many
applications. If the filename is a full path, then the OutputDirectory# option is not needed. This option is numbered, starting with 0 (OutputFilename0), to handle multiple output file names per frame. For each additional
file name, just increase the number (OutputFilename1, OutputFilename2, etc).
OutputFilename#Tile?=<fileName> : Specifies the output image filenames for each task of a Tile job (default
= blank). This allows the Monitor to display the View Output Image context menu option in the task list for
Tile jobs. The # is used to support multiple outputs per frame (see OutputFilename# above), and the ? is
used to specify the output for each task in the Tile job. For example, a Tile job with 2 outputs and 2 tiles would
specify OutputFilename0Tile0, OutputFilename0Tile1, OutputFilename1Tile0, and OutputFilename1Tile1.
OutputDirectory#=<directoryName> : Specifies the output image directory for the job (default = blank). This
allows the Monitor to display the Explore Output context menu option in the job list. If the filename is a full
path, then the OutputDirectory# option is not needed. This option is numbered, starting with 0 (OutputDirectory0), to handle multiple output directories per frame. For each additional directory, just increase the number
(OutputDirectory1, OutputDirectory2, etc).
OutputDirectory0=\\fileserver\Project\Renders\OutputFolder\
OutputFilename0=o_HDP_010_BG_v01.####.exr
OutputDirectory1=\\fileserver\Project\Renders\OutputFolder\
OutputFilename1=o_HDP_010_SPEC_v01####.dpx
OutputDirectory2=\\fileserver\Project\Renders\OutputFolder\
OutputFilename2=o_HDP_010_RAW_v01_####.png

Notification Options
NotificationTargets=<username,username,username> : A list of users, separated by commas, who should be
notified when the job is completed (default = blank).


ClearNotificationTargets=<true/false> : If enabled, all of the job's notification targets will be removed (default
= false).
NotificationEmails=<email,email,email> : A list of additional email addresses, separated by commas, to send
job notifications to (default = blank).
OverrideNotificationMethod=<true/false> : If the job user's notification method should be ignored (default =
false).
EmailNotification=<true/false> : If overriding the job user's notification method, whether to use email notification (default = false).
PopupNotification=<true/false> : If overriding the job user's notification method, whether to use popup notification (default = false).
NotificationNote=<note> : A note to append to the notification email sent out when the job is complete (default
= blank). Separate multiple lines with [EOL], for example:
This is a line[EOL]This is another line[EOL]This is the last line

Script Options
PreJobScript=<path to script> : Specifies a full path to a python script to execute when the job initially starts
rendering (default = blank).
PostJobScript=<path to script> : Specifies a full path to a python script to execute when the job completes
(default = blank).
PreTaskScript=<path to script> : Specifies a full path to a python script to execute before each task starts
rendering (default = blank).
PostTaskScript=<path to script> : Specifies a full path to a python script to execute after each task completes
(default = blank).
Tile Job Options
TileJob=<true/false> : If this job is a tile job (default = false).
TileJobFrame=<frameNumber> : The frame that the tile job is rendering (default = 0).
TileJobTilesInX=<xCount> : The number of tiles in X for a tile job (default = 0). This should be specified
with the TileJobTilesInY option below.
TileJobTilesInY=<yCount> : The number of tiles in Y for a tile job (default = 0). This should be specified
with the TileJobTilesInX option above.
TileJobTileCount=<count> : The number of tiles for a tile job (default = 0). This is an alternative to specifying
the TileJobTilesInX and TileJobTilesInY options above.
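For example, a single-frame tile job split into a 2x2 grid, with one output per frame, could describe its four tile outputs like this (the file names are illustrative):
TileJob=true
TileJobFrame=10
TileJobTilesInX=2
TileJobTilesInY=2
OutputDirectory0=\\fileserver\Renders\OutputFolder\
OutputFilename0Tile0=tile0_frame0010.exr
OutputFilename0Tile1=tile1_frame0010.exr
OutputFilename0Tile2=tile2_frame0010.exr
OutputFilename0Tile3=tile3_frame0010.exr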
Maintenance Job Options
MaintenanceJob=<true/false> : If this job is a maintenance job (default = false).
MaintenanceJobStartFrame=<frameNumber> : The first frame for the maintenance job (default = 0).
MaintenanceJobEndFrame=<frameNumber> : The last frame for the maintenance job (default = 0).
Extra Info Options
These are extra arbitrary properties that have corresponding columns in the Monitor that can be sorted on. There are a
total of 10 Extra Info properties that can be specified.
ExtraInfo0=<arbitrary value>
ExtraInfo1=<arbitrary value>


ExtraInfo2=<arbitrary value>
ExtraInfo3=<arbitrary value>
ExtraInfo4=<arbitrary value>
ExtraInfo5=<arbitrary value>
ExtraInfo6=<arbitrary value>
ExtraInfo7=<arbitrary value>
ExtraInfo8=<arbitrary value>
ExtraInfo9=<arbitrary value>
These are additional arbitrary properties. There is no limit on the number that are specified, but they do not have
corresponding columns in the Monitor.
ExtraInfoKeyValue0=<key=value>
ExtraInfoKeyValue1=<key=value>
ExtraInfoKeyValue2=<key=value>
ExtraInfoKeyValue3=<key=value>
...
Job Info File Examples
3ds Max Job Info File:
Plugin=3dsmax
ForceReloadPlugin=false
Frames=0-400
Priority=50
Pool=3dsmax
Name=IslandWaveScene_lighted01
Comment=Testing
OutputDirectory0=\\fileserver\Renders\OutputFolder\
OutputFilename0=islandWaveBreak_Std####.png

Lightwave Job Info File:


Plugin=Lightwave
Frames=1-10,21-30
ChunkSize=10
Priority=99
Pool=LightwavePool
Group=NiceShot
Name=Lightwave Test
OutputFilename0=\\fileserver\Renders\OutputFolder\test####.tif
DeleteOnComplete=true
MachineLimit=5
SlaveTimeout=3600

Fusion Job Info File:


Plugin=Fusion
Frames=1-100
Priority=50
Group=Fusion
Name=Fusion Dependency Test
OutputFilename0=\\fileserver\Renders\OutputFolder\dfusion_test####.tif
JobDependencies=546cc87357dbb04344a5c6b5,53d27c9257dbb027b8a4ccd2
InitialStatus=Suspended
LimitGroups=DFRNode
ExtraInfo0=Regression Testing
ExtraInfoKeyValue0=TestID=344
ExtraInfoKeyValue1=DeveloperID=12

Plug-in Info File


The Plug-in Info File is a plain text file that uses Key/Value pairs (key=value) to define the plug-in specific options
that are used by the individual plug-ins to render the job. Often, these options are used to build up the command line
arguments that are to be passed on to the rendering application.
The plug-ins read in the settings from the Plug-in Info File using the script functions GetPluginInfoEntry(...) and
GetPluginInfoEntryWithDefault(...), which are discussed in more detail in the Plug-in Scripting documentation (Application Plugins).
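The keys themselves are entirely plugin-specific, so consult the documentation or submission script for the plugin you are using. Purely as a hypothetical illustration (these key names are placeholders and are not guaranteed to match any particular plugin), a plug-in info file might contain entries such as:
Version=2015
SceneFile=\\shared\path\scene.max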

6.2 Power Management


6.2.1 Overview
Power Management is a system that automatically controls when machines in the farm start up or shut down, based on
the current conditions of the farm. It can start machines if they are required to render jobs in the farm, and it can shut
down machines that are no longer needed for rendering. It can also poll an external temperature sensor using SNMP
and shut down machines if the server room is too hot. Finally, it can reboot problematic machines on a regular basis.

6.2.2 Running Pulse


Power Management is built into Pulse, so Pulse must be running for Power Management to work. The only exception
for this is the Thermal Shutdown feature. Redundancy for this feature has been built into the Slave applications, so if
Pulse isn't running, you're still protected if the temperature of your server room gets too hot.
See the Pulse documentation for more information about running and configuring Pulse.


6.2.3 Configuration
Power Management can be configured from the Monitor by selecting Tools -> Configure Power Management. You
will need to be in Super User mode for this, if you are not part of a User Group that has access to this feature.


Machine Groups are used by Power Management to organize Slave machines on the farm, and each group has four
sections of settings that can be configured independently of each other. To add a new Machine Group, simply click the
Add button in the Machine Group section.


Power Management Group Settings:


Group Name: The name with which the Power Management Group will be identified.
Group Mode: Whether this particular Group is enabled or disabled.
Include All Slaves in this Group: If enabled, all slaves will be included in this group. Note that you cannot
override the slave order for the Power Management features if this is enabled.
Slaves Not In Group: The Slaves that will not be part of this Group.
Slaves In Group: Slaves that will be part of this Group.
To edit the Power Management settings within a group, simply click on the group in the Machine Groups list.
Idle Shutdown
Idle Shutdown is a system that forces Slaves to shut down after they have been idle for a certain period of time. This
can be used to save on energy costs when the render farm is idle, without having to shut down machines manually.
Combining this feature with Wake-On-Lan will ensure that machines in the render farm are only running when they
are needed.
You can split the idle time period between a Daytime period and an Evening period. This is useful because in most
cases, you want most of your machines to stay on during the workday, and then shut down during the evening when
there are no renders left. In addition, you can also specify exceptions to these two periods, which means (for example)
you could have different idle periods for the weekend.


Idle Shutdown Settings:


Idle Shutdown Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed
as normal, but no action is actually taken.
Number of Minutes Before Idle Slaves Will be Shutdown: Self-explanatory.
Number of Slaves to Leave Running: The minimum number of Slaves to keep running at all times.
Slave Shutdown Type: The method that will be used to shut down idle Slaves:
Shutdown: Power off the machine using the normal shutdown method.
Suspend: Suspends the machines instead of shutting them down. Only works for Windows Slaves.
Run Command: Use this method to have the Slave run a command when attempting to shut down a Slave.
Important Processes: If the Slave has any of these processes running, it will not shut down.


Overrides: Define overrides for different days and times. Simply specify the day(s) of the week, the time
period, the minimum number of Slaves, and the idle shutdown time for each override required. For example, if
more machines are required to be running continuously for Friday evening and Saturday afternoon, this can be
accomplished with an override.
Override Shutdown Order: Whether or not to define the order in which Slaves are shut down. If disabled,
Slaves will be shut down in alphabetical order. If enabled, use the Set Shutdown Order dialog to define the order
in which you would like the Slaves to shut down. Note that this feature is not available if the Power Management
Group is configured to include all slaves.
Machine Startup
This is a system that allows powered-down machines to be started automatically when new Jobs are submitted to the
render farm. Combining this feature with Idle Shutdown will ensure that machines in the render farm are only running
when they are needed.
If Slave machines support it, Wake On Lan (WOL) or IPMI commands can be used to start them up after they shut down. By default, the WOL packet is sent over port 9, but you can change this in the Wake On Lan settings in the
Repository Configuration. Make sure there isn't a firewall or other security software blocking communication over the
selected port(s).
WOL Packets are sent to the MAC addresses that Deadline has on file for each of the Slaves. If your Slaves have multiple Ethernet ports, the Slave may have registered the wrong MAC address, which may prevent WOL from working
properly. If this is the case, you will have to manually set MAC Address overrides for the Slaves that are having this
problem, which can be done through the Slave Settings dialog.
Note that if machines in the group begin to be shut down due to temperature, this feature may be automatically disabled
for the group to prevent machines from starting up and raising the temperature again.


Machine Startup Settings:


Machine Startup Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed, but no action is actually taken.
Number of Slaves to Wake Up per Interval: The maximum number of machines that will be started in a given
Power Management check interval. The interval itself can be configured in the Pulse section of the Repository
Options.
Wake Up Mode: This determines how the machines will be woken up. See the available Wake Up Modes below
for more information.
Override Startup Order: Whether or not to define the order in which Slaves are started up. If disabled, Slaves
will be started in alphabetical order. If enabled, use the Set Startup Order dialog to define the order. Note that
this feature is not available if the Power Management Group is configured to include all slaves.
Wake Up Modes:
Use Wake On Lan: Wake On Lan packets will be sent to machines to wake them up.

Run Command: This is primarily for IPMI support. If enabled, Pulse will run a given command to start Slave
machines. This command will be run once for each Slave that is being woken up. A few tags can be used within
the command:
{SLAVE_NAME} is replaced with the current Slave's hostname.
{SLAVE_MAC} is replaced with the current Slave's MAC address.
{SLAVE_IP} is replaced with the current Slave's IP address.
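For example, an IPMI-based startup command might look something like this. This is only a sketch: it assumes the ipmitool utility is installed and that each Slave's management interface is reachable at a hypothetical <hostname>-ipmi address, so adjust it for your own environment and credentials:
ipmitool -I lanplus -H {SLAVE_NAME}-ipmi -U admin -P password chassis power on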
Thermal Shutdown
The Thermal Shutdown system polls temperature sensors and responds by shutting down machines if the temperature
gets too high. The sensors we have used for testing are NetTherms, and APC Sensors are also known to be compatible.
Note that the temperature sensor uses port 161, so make sure this port is not blocked.

Thermal Shutdown settings:



Thermal Shutdown Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks will be
performed, but no action is actually taken.
Temperature Units: The units used to display and configure the temperatures. Note that this is separate from
the units that the actual sensors use.
Thermal Sensors: The host and OID (Object Identifier) of the sensor(s) in the zone. To add a new sensor,
simply click the Add button.
Temperature Threshold: Thresholds can be added for any temperature. When a sensor reports a temperature
higher than a particular threshold, the Slaves in the zone will respond accordingly. Note that higher temperature
thresholds take precedence over lower ones.
Shut down Slaves if sensors are offline for this many minutes: If enabled, Slaves will shut down after a period
of time in which the temperature sensor could not be reached for temperature information.
Disable Machine Startup if thermal threshold is reached: If enabled, Machine Startup for the current group
will be disabled if a thermal threshold is reached.
Re-enable Machine Startup when temperature returns to: If enabled, this will re-enable Machine Startup
when the temperature returns to the specified temperature.
Override Shutdown Order: Whether or not to define a custom order in which Slaves will be shut down. If
disabled, Slaves will be shut down in alphabetical order. If enabled, use the Set Shutdown Order dialog to
define the order. Note that this feature is not available if the Power Management Group is configured to include
all slaves.

Sensor Settings:
Sensor Hostname or IP Address: The host of the temperature sensor.
Sensor OID: The OID (Object Identifier) of the temperature sensor. The default OID is for the particular type
of sensor we use.
Sensor SNMP Community: If testing the sensor fails with private selected, try using public instead.


Sensor Reports Temperature As: Select the units that your temperature sensor uses to report the temperature.
Sensor Timeout in Milliseconds: The timeout value for contacting the sensor.
Sensor Testing Temperature: If enabled, the corresponding temperature will always be returned by this sensor.
This is useful for testing purposes.
Test Sensor: Queries the sensor for the current temperature, and displays it. If the temperature displayed seems
incorrect, make sure you have selected the correct unit of temperature above.
If you simply want to test the Thermal Shutdown feature, but you don't have any thermal sensors to test with, you
can enable the Sensor Testing Temperature in the Sensor settings above. When enabled, you don't need to provide a
Sensor Hostname or Sensor OID, and the test sensor will always return the specified temperature.
Machine Restart
If you have problematic machines that you need to reboot periodically, you can configure the Machine Restart feature
of Power Management to restart your Slaves for you. Note that if the Slave on the machine is in the middle of
rendering a Task, it will finish its current Task before the machine is restarted.


Machine Restart settings:


Machine Restart Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed as normal, but no action is actually taken.
Restart machines after Slave has been running for: The interval, in minutes, at which this group of Slaves
will be restarted.

6.3 Slave Scheduling


6.3.1 Overview
You can use the Slave Scheduling feature to configure when Slave applications should be launched and shut down.
Slave Scheduling is controlled by the Launcher, so the Launcher must be running on the machines for Slave Scheduling
to work.

If a slave is scheduled to start on a machine, a notification message will pop up for 30 seconds indicating that the slave
is scheduled to start. If someone is still using the machine, they can choose to delay the start of the slave for a certain
amount of time.

6.3.2 Configuration
Slave Scheduling can be configured from the Monitor by selecting Tools -> Configure Slave Scheduling. You will
need to be in Super User mode for this, if you are not part of a User Group that has access to this feature.

Machine Groups are used by Slave Scheduling to organize Slave machines on the farm, and each group can have
different scheduling settings. To add a new Machine Group, simply click the Add button in the Machine Group
section.


Slave Scheduling Group Settings:


Group Name: The name with which the Slave Scheduling Group will be identified.
Group Mode: Whether this particular Group is enabled or disabled.
Include All Slaves in this Group: If enabled, all slaves will be included in this group.
Unassigned Slaves: The Slaves that will not be part of this Group.
Slaves In Group: Slaves that will be part of this Group.
To edit the scheduling settings within a group, simply click on the group in the Machine Groups list.
Slave Scheduling
These settings are used to define the schedule for when slaves should start and stop.
Ensure Slave Is Running During Scheduled Hours: If enabled, slaves will be restarted if they are shut down
during the scheduled hours.
Day of the Week: Configure which days of the week you want to set a schedule for.
Start Time: The time on the selected day that the Slave application should be launched if it is not already
running.
Stop Time: The time on the selected day that the Slave application should be closed if it is running.


Idle Detection
These settings are used to launch the slave if the machine has been idle for a certain amount of time (idle means no
keyboard or mouse input). There are also additional criteria that can be checked before launching the slave, including
the machine's current memory and CPU usage, the currently logged in user, and the processes currently running on the
machine. Finally, this system can stop the slave automatically when the machine is no longer idle.
Start Slave When Machine Is Idle For ___ Minutes: If enabled, the Slave will be started on the machine if
it is idle. A machine is considered idle if there hasn't been any keyboard or mouse activity for the specified
amount of time.
Stop Slave When Machine Is No Longer Idle: If enabled, the Slave will be stopped when the machine is no
longer idle. A machine is considered idle if there hasn't been any keyboard or mouse activity for the specified
amount of time.
Only Stop Slave If Started By Idle Detection: If enabled, the Slave will only be stopped when the machine is
no longer idle if that Slave was originally started by Idle Detection. If the Slave was originally started manually,
it will not be stopped.
There are some limitations with Idle Detection depending on the operating system:
On Windows, Idle Detection will not work if the Launcher is running as a service. This is because the service
runs in an environment that is separate from the Desktop, and has no knowledge of any mouse or keyboard
activity.
On Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not
available, Idle Detection will not work.
Note that Idle Detection can be overridden in the Local Slave Controls so that users can configure if their local slave
should launch when the machine becomes idle.
Miscellaneous Options
These settings are applied to both Slave Scheduling and Idle Detection.
Only Start Slave If CPU Usage Less Than ___%: If enabled, the slave will only be launched if the machine's
CPU usage is less than the specified value.
Only Start Slave If Free Memory More Than ___ MB: If enabled, the slave will only be launched if the
machine has more free memory than the specified value (in Megabytes).
Only Start Slave If These Processes Are Not Running: If enabled, the slave will only be launched if the
specified processes are not running on the machine.
Only Start If Launcher Is Not Running As These Users: If enabled, the slave will only be launched if the
launcher is not running as one of the specified users.
Allow Slaves to Finish Their Current Task When Stopping: If enabled, the Slave application will not be
closed until it finishes its current Task.

6.4 Farm Statistics


6.4.1 Overview
Deadline can keep track of some basic statistics. It can keep track of all of your completed Jobs so that you can refer to
them later. It stores the User that submitted the Job, when the Job was submitted, the error count, as well as some
useful rendering metrics like render time, CPU usage, and memory usage. You can use all of this information to figure
out if there are any Slaves that aren't being utilized to their full potential.
Statistical information is also gathered for individual slaves, including the slave's running time, rendering time, and
idle time. It also includes information about the number of tasks the slave has completed, the number of errors it has
reported, and its average Memory and CPU usage.
Note that some statistics can only be gathered if Pulse is running.

6.4.2 Enabling Statistics Gathering


You must first make sure Statistics Gathering has been enabled before Deadline will start logging information, which
can be done in the Statistics Gathering section of the Repository Options.

Note that if Pulse is not running, only statistics for completed Jobs, User usage and Slave Statistics will be recorded.
You must run Pulse to keep track of Slave Resource Usage and overall Repository statistics. When running, Pulse will
periodically gather information about Slave Resource Usage and the general state of the repository, and record them
in the Database.


6.4.3 Viewing Farm Reports


To view Statistics, open the Monitor and select Tools -> View Farm Reports. This must be done in Super User
mode, unless you have the proper User Privileges to do so.

From this window, you can specify which type of report(s) to generate, and a date range to filter the statistics. You can
also specify a region to filter the statistics, but only the Active Slave Stats and Slaves Overview reports will use it.
There are five default Reports that will always be available, but custom reports can also be created and saved for later
use (see the Custom Reports section below for more info).
Active Slave Stats
The Active Slave Stats report displays Slave usage statistics for the farm, which are logged by Slaves as they are
running. The statistics displayed by this report are generated by each individual slave at regular intervals and do not
require Pulse to be running.


Completed Job Stats


The Completed Job Stats report consists of a list of completed Jobs with detailed statistics. Pulse does not need to be
running to gather these statistics.


Farm Overview
The Farm Overview report displays statistics about the Farm using graphs. The statistics displayed by this report are
assembled by Pulse, and will therefore only be gathered if Pulse is running.
The State Counts section displays the statistics in terms of counts.


The State Totals gives a visual representation of the statistics in terms of percentages.


Slaves Overview
The Slaves Overview report displays the statistics for each Slave on the farm with graphs to help display the statistics.
The statistics displayed by this report are assembled by Pulse, and will therefore only be gathered if Pulse is running.
The Slaves Overview chart shows how many slaves were in each state (starting job, rendering, idle, offline, stalled,
and disabled).

The Available/Active Slaves charts show the number of slaves that are available, and the number of available slaves
that are active.


The Individual Slaves list and charts show the average CPU and Memory usage for individual slaves, as well as average
time each slave spends in each state.


User Farm Time Report


The User Farm Time Report displays the farm usage statistics for each User. Pulse does not need to be running to
gather these statistics.


Custom Reports
Users can create their own custom Reports to control how the gathered statistics are aggregated and presented. By
doing this, users can create their own arsenal of specialized reports that help to drill down and expose potential
problems with the farm.
In order to create or edit Custom Reports you first need to be in Super User mode, or have the appropriate User Group
Permissions to do so. If that is the case, there should be a new set of buttons below the list of Reports, providing
control over Custom Reports.
By clicking the New button, you will be prompted to specify a name for your new report and select the type of
statistics which this report will display.

Once you've done that, you'll be brought to the Edit view for your new Report. You'll note that this is very similar to
generating a report under normal circumstances, but with the addition of several buttons that allow further
customization of your Report.

Chief among these new buttons is the Edit Data Columns button, which will allow you to select which columns are
displayed. You can also specify if you want to aggregate row information by selecting a Group By column, and a
Group Op for each other column.


The way the aggregation works is similar to a SQL query with a group by statement. Data rows will be combined
based on identical values of the Group By column, while the values of other columns will be determined by performing
the Group Ops on the combined rows.
As a simple example to demonstrate how this works in practice, let us consider a case where you might want to view
the error information on a per-plugin basis. We don't have a built-in report to do this, but all this information is
contained in Completed Job Stats. With that in mind, you can create a Custom Report based on Completed Job Stats
to group by Plugin, and aggregate Error Counts and Wasted Error Time, as illustrated below.


Once you've specified which columns are displayed, and whether/how rows are aggregated, you can also add simple
Graphs to your report. Simply click the Add Graph button, and specify the type of graph you want along with the
columns on which the graph should be based. Graphs are always based on all of the data presented in the list view, and
currently cannot be based on selection or a different data model.

Once you're done customizing your new report, simply click the OK button on the Farm Status Reports window, and
your changes will be committed to the Database. Now, every time anyone brings up this dialog, they should be able to
generate the report you've just created!

6.4.4 Custom Statistics


If you need to keep track of more information, we suggest writing your own tool that uses Deadline Command.
Deadline Command can be used to query the repository for all sorts of information, like the current state of all the
Jobs and all the Slaves. You can have it print these out in an ini file format and use any ini file parser to extract the
information (Python has a module for this). This is also handy if you want to post stats to a web page, or insert entries
into a separate database.

6.5 Client Configuration


6.5.1 Overview
Clients are configured using the deadline.ini file. Some settings are stored in a system deadline.ini file, and some
are stored in a per-user deadline.ini file. Most of these settings are set during the Client Installation, but they can be
changed afterwards by editing the deadline.ini file directly. Some of these settings can also be updated using Auto
Configuration.
This guide will cover the various settings, and how they can be configured.

6.5.2 DEADLINE_PATH Environment Variable


The DEADLINE_PATH Environment variable is an environment variable on Windows and Linux which contains the
path to Deadline's bin directory. On OSX, it is instead a file located at /Users/Shared/Thinkbox which contains the
path to Deadline's resources directory.
DEADLINE_PATH is used by the integrated submission scripts that are shipped with Deadline to determine where the
Deadline Client is installed to, and what the Repository path is. While it is possible to modify this value on the system
manually, you can instead use one of the Submitter installers to Change the DEADLINE_PATH Value.

6.5.3 Local Slave Instance Files


Deadline supports the ability to run Multiple Slaves On One Machine. The local slave instances are represented by .ini
files which are stored in the slaves folder in the following locations. Note that the # in the path will change based on
the Deadline version number.
Windows: %PROGRAMDATA%\Thinkbox\Deadline#\slaves\
Linux: /var/lib/Thinkbox/Deadline#/slaves/
OSX: /Users/Shared/Thinkbox/Deadline#/slaves/
To remove local slave instances, simply delete their corresponding .ini file. Note that this does not remove the slave
entries from the repository that the slaves connected to.

6.5.4 Configuration File Format


The deadline.ini file has an ini file format, so there will be a [Deadline] section followed by a number of key=value
pairs that represent each setting. For example:


[Deadline]
LicenseServer=@my-server
NetworkRoot=\\\\repository\\path
LauncherListeningPort=17060
AutoConfigurationPort=17061

6.5.5 System Configuration File


The system deadline.ini file can be found in the following locations. Note that the # in the path will change based on
the Deadline version number.
Windows: %PROGRAMDATA%\Thinkbox\Deadline#\deadline.ini
Linux: /var/lib/Thinkbox/Deadline#/deadline.ini
OSX: /Users/Shared/Thinkbox/Deadline#/deadline.ini
The following settings can be configured in the system deadline.ini file. Note that other settings can show up in this
file, but they are used internally by Deadline and are not documented here.
NetworkRoot
The NetworkRoot setting tells the Client which Repository to connect to.
NetworkRoot=\\\\repository\\path

There can also be additional NetworkRoot# settings that store previous Repository paths. These paths will be prepopulated in the drop down list when changing Repositories.
NetworkRoot0=\\\\repository\\path
NetworkRoot1=\\\\another\\repository
NetworkRoot2=\\\\test\\repository

This setting can be changed using the Change Repository option in the Launcher or the Monitor, and it can also be
configured using Auto Configuration.
LicenseServer
The LicenseServer setting tells the Client where it can get a license from.
LicenseServer=@my-server

This setting can be changed using the Change License Server option in the Launcher or the Slave, and it can also be
configured using Auto Configuration.
LauncherListeningPort
The LauncherListeningPort setting is the port that the Launcher listens on for Remote Control. It must be the same on
all Clients.


LauncherListeningPort=17060

This setting can only be changed manually.


LauncherServiceStartupDelay
The LauncherServiceStartupDelay setting is the number of seconds that the Launcher waits during startup when running as a service or daemon. This delay helps ensure that the machine has set its host name before the Launcher starts
up any other Deadline applications.
LauncherServiceStartupDelay=60
This setting can only be changed manually.
SlaveStartupPort
The SlaveStartupPort setting is the port that the Slaves on this machine use when starting up to ensure that only one
Slave starts up at a time.
SlaveStartupPort=17063

This setting can only be changed manually.


AutoConfigurationPort
The AutoConfigurationPort setting is the port that the Clients use when Auto Configuring themselves. It must be the
same on all Clients.
AutoConfigurationPort=17061

This setting can only be changed manually.


SlaveDataRoot
The SlaveDataRoot setting tells the Slave where to copy its job files temporarily during rendering. The default location is the slave folder in the same folder as the per-user deadline.ini file. If this setting is left blank, the default location will be used.
SlaveDataRoot=C:\\LocalSlaveData

This setting can be configured using Auto Configuration.


MultipleSlavesEnabled
The MultipleSlavesEnabled setting indicates if multiple slaves are allowed to run on this machine or not. The default
is True.
MultipleSlavesEnabled=True

This setting can only be changed manually.


RestartStalledSlave
The RestartStalledSlave setting indicates if the Launcher should try to restart the Slave on the machine if it becomes
stalled. The default is True.
RestartStalledSlave=True

This setting can be changed from the Launcher menu, and it can also be configured using Auto Configuration.
LaunchPulseAtStartup
The LaunchPulseAtStartup setting controls if the Launcher should automatically launch Pulse after the launcher starts
up. The default is False.
LaunchPulseAtStartup=True

This setting can only be changed manually.


KeepPulseRunning
The KeepPulseRunning setting controls if the Launcher should automatically relaunch Pulse if it is shut down or
crashes. The default is False.
KeepPulseRunning=True

This setting can only be changed manually.


LaunchBalancerAtStartup
The LaunchBalancerAtStartup setting controls if the Launcher should automatically launch the Balancer after the
launcher starts up. The default is False.
LaunchBalancerAtStartup=True

This setting can only be changed manually.


KeepBalancerRunning
The KeepBalancerRunning setting controls if the Launcher should automatically relaunch Balancer if it is shut down
or crashes. The default is False.
KeepBalancerRunning=True

This setting can only be changed manually.

LaunchWebServiceAtStartup
The LaunchWebServiceAtStartup setting controls if the Launcher should automatically launch the Web Service after
the launcher starts up. The default is False.
LaunchWebServiceAtStartup=True

This setting can only be changed manually.


KeepWebServiceRunning
The KeepWebServiceRunning setting controls if the Launcher should automatically relaunch Web Service if it is shut
down or crashes. The default is False.
KeepWebServiceRunning=True

This setting can only be changed manually.


AutoUpdateOverride
The AutoUpdateOverride setting can be used to override the Automatic Upgrades setting in the Repository Configuration. If left blank, it will not override the Repository Options, which is also the default behavior if this setting isn't specified.
AutoUpdateOverride=False

This setting can be configured using Auto Configuration.

6.5.6 Per-User Configuration File


The per-user deadline.ini file can be found in the following locations. Note that the # in the path will change based on
the Deadline version number.
Windows: %LOCALAPPDATA%\Thinkbox\Deadline#\deadline.ini
Linux: ~/Thinkbox/Deadline#/deadline.ini
OSX: ~/Library/Application Support/Thinkbox/Deadline#/deadline.ini
The following settings can be configured in the per-user deadline.ini file.
User
The User setting is used by the Client to know which user you are when launching the Monitor or when submitting
jobs.
User=Ryan

This setting can be changed using the Change User option in the Launcher or the Monitor. To prevent users from
changing who they are, see the User Management documentation.

LaunchSlaveAtStartup
The LaunchSlaveAtStartup setting controls if the Launcher should automatically launch the Slave after the launcher
starts up. The default is True.
LaunchSlaveAtStartup=False

This setting can be changed from the Launcher menu, and it can also be configured using Auto Configuration.

6.6 Auto Configuration


6.6.1 Overview
Auto Configuration allows you to configure many Client settings from a single location. When the Deadline applications start up, they will automatically pull these settings, save them locally, and apply them before fully initializing.
Note that Pulse must be running for the Deadline applications to pull the Repository Path setting. All the other settings are pulled directly from the Database once the applications are able to connect to it, so they will still be applied even if Pulse isn't running. To configure and run Pulse, see the Pulse documentation.

6.6.2 Rulesets
You can set up Client Configuration Rulesets from the Auto Configuration section of the Repository Configuration. If
you want to configure groups of Clients differently from others, you can add multiple Rulesets. This is useful if you
have more than one Repository on your network, or if you want to configure your render nodes differently than your
workstations.
New Rulesets can be added by pressing the Add button. You can give the Ruleset a name, and then choose a Client
Filter method to control which Clients will use this Ruleset. There are currently three types of Slave Filters:
Hostname Regex: You can use regular expressions to match a Client's host name. If your Slaves are using IPv6, this is probably the preferred method to use. Note that this is case-sensitive (a small standalone illustration of these patterns follows the list below). For example:
.*host.* will match hostnames containing the word host in lower case.
host.* will match hostnames starting with host.
.*[Hh]ost will match hostnames ending with Host or host.
.* will match everything.
IP Regex: You can use regular expressions to match a Client's IP address. This works with both IPv4 and IPv6 addresses. For example:
192.168..* will match IPv4 addresses starting with 192.168 that are not transported inside IPv6.
[:fF]*192.168. should match IPv4 addresses even if they are carried over IPv6 addresses (for example, ::ffff:192.168.2.128).
.* will match everything.
IPv4 Match: You can specify specific IP addresses, or a range of IP addresses (by using wildcards or ranges).
Note that this only works with IPv4. Do not use this for IPv6 addresses. For example:
192.168.0.1-150
192.168.0.151-255
192.168.*.*
*.*.*.*
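The small standalone Python illustration below uses only the plain re module (not the Deadline API) to show how the case-sensitive hostname patterns above behave; anchoring the match over the full hostname is an assumption made purely for the illustration:

import re

hostnames = [ "Render-01", "host-22", "MyHost", "workstation-05" ]
patterns = [ ".*host.*", "host.*", ".*[Hh]ost", ".*" ]

for pattern in patterns:
    # Append "$" so the pattern must cover the whole hostname (assumed for illustration).
    matched = [ h for h in hostnames if re.match( pattern + "$", h ) ]
    print( "%-12s matches %s" % ( pattern, matched ) )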
Configurations are generated starting from the top rule working down one by one. When there is a match for the
requesting Client, any properties in the rule which are not marked as (Inherited) will override a previous setting. By
default, Slaves will use their local configuration for any property which is not set by a rule. Based on the example
here, all clients starting with the name Render- and ending with a whole number will use the same Repository Path
and launch the Client at startup, while the Default rule above it matches all Clients and sets their license server.

The available options are:


License Server: The license server setting. Use the format @SERVER, or if you have configured your license
file to use a specific port, use PORT@SERVER.
Launch Slave At Startup: Whether or not the Slave should automatically launch when the Launcher starts up.
Auto Update Override: Whether or not launching the Client should trigger an automatic upgrade if it is available.
Restart Slave If It Stalls: If enabled, the Launcher will try to restart the Slave on the machine if it stalls.
Repository Path: This is the path to the Repository that the Slave will connect to. You can specify a different
path for each operating system.
Local Data Path: The local path where the Client temporarily stores plugin and job data from the Repository
during rendering. Note that this should be a local path to avoid conflicts. You can specify a different path for
each operating system.

6.7 Render Environment


6.7.1 Job Environment Variables
Environment variables can be set for a job, and these variables will be applied to the rendering process environment.
These variables can be set in the Job Properties in the Monitor, and they can be set during Manual Job Submission.
Manual Job Submission
For manual job submission, these variables can be specified in the job info file like this:
EnvironmentKeyValue0=mykey=myvalue
EnvironmentKeyValue1=anotherkey=anothervalue
EnvironmentKeyValue2=athirdkey=athirdvalue
...

There is also an IncludeEnvironment option that takes either True or False (False is the default). When IncludeEnvironment is set to True, Deadline will automatically grab all the environment variables from the submitter's environment and set them as the job's environment variables.
IncludeEnvironment=True

This can be used in conjunction with the EnvironmentKeyValue# options above, but note that the EnvironmentKeyValue# options will take precedence over any current environment variables with the same name.
Finally, there is a UseJobEnvironmentOnly option that takes either True or False (False is the default):
UseJobEnvironmentOnly=True

The UseJobEnvironmentOnly setting controls how the job's environment variables are applied to the rendering environment. If True, ONLY the job's environment variables will be used. If False, the job's environment variables will be merged with the Slave's current environment, with the job's variables overwriting any existing ones with the same name.
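As a hedged end-to-end sketch, the Python snippet below writes a minimal job info file that uses these options and submits it with deadlinecommand. The plugin name, job name, frame range, and the pre-existing plugin_info.job file are placeholders for illustration only:

import subprocess

job_info_lines = [
    "Plugin=CommandLine",              # placeholder plugin name
    "Name=Environment Variable Example",
    "Frames=1-10",
    "EnvironmentKeyValue0=mykey=myvalue",
    "EnvironmentKeyValue1=anotherkey=anothervalue",
    "IncludeEnvironment=True",
    "UseJobEnvironmentOnly=False",
]

with open( "job_info.job", "w" ) as f:
    f.write( "\n".join( job_info_lines ) + "\n" )

# Manual submission takes the job info file followed by a plugin info file
# (plugin_info.job is assumed to already exist for this example).
subprocess.call( [ "deadlinecommand", "job_info.job", "plugin_info.job" ] )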
Job Rendering
At render time, the job's environment variables are applied to the rendering process. As explained above, the job's environment can either be merged with the Slave's current environment, or the job's environment can be used exclusively.
Note though that if the job's plugin defines any environment variables, those will take precedence over any job environment variables with the same name. In a job's plugin, there are two functions available on the DeadlinePlugin object that can be used to set environment variables:
SetProcessEnvironmentVariable( key, value ):
This should be used in Advanced plugins only.
Any variables set by this function are applied to all processes launched through Deadline's plugin API.
Note that calling SetProcessEnvironmentVariable in Simple plugins or within ManagedProcess callbacks will not affect the current process environment.
When using SetProcessEnvironmentVariable in an Advanced plugin, make sure to call it outside of the ManagedProcess callbacks.
SetEnvironmentVariable( key, value ):


This is typically used in Simple plugins, or within ManagedProcess callbacks in Advanced plugins.
Any variables set by this function are only applied to the process they are starting up, and they take
precedence over any variables set by SetProcessEnvironmentVariable.
See the Application Plugins documentation for more information.
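As a hedged illustration only (not taken from a shipped plugin), a Simple plugin might set a per-process variable from inside a ManagedProcess callback as shown below. The RenderExecutableCallback and RenderArgumentCallback hooks, the variable name, and the executable path are assumptions made for the example:

from Deadline.Plugins import *

def GetDeadlinePlugin():
    return EnvExamplePlugin()

def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

class EnvExamplePlugin( DeadlinePlugin ):
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess
        self.RenderExecutableCallback += self.RenderExecutable
        self.RenderArgumentCallback += self.RenderArgument

    def Cleanup( self ):
        del self.InitializeProcessCallback
        del self.RenderExecutableCallback
        del self.RenderArgumentCallback

    def InitializeProcess( self ):
        self.PluginType = PluginType.Simple

    def RenderExecutable( self ):
        # Only the render process being launched sees this variable, and it takes
        # precedence over any value set earlier with SetProcessEnvironmentVariable.
        self.SetEnvironmentVariable( "MY_TEXTURE_ROOT", "/mnt/textures" )
        return "/usr/local/bin/my_renderer"  # placeholder executable path

    def RenderArgument( self ):
        return ""  # no extra command line arguments in this sketch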

6.7.2 Render Jobs As Jobs User


Deadline has some features that allow jobs to be rendered with the job's user account, rather than the user account that the Slave is running as.
On Windows, this is done by using the job's user account credentials to start the rendering process using that account.
On Linux and Mac OS X, the Slave must be running as root. It will then use sudo to start the rendering process using the job's user account.
Enabling Render Jobs As User
To render jobs as the job's user, you must enable Render Jobs As User in the User Security section of the Repository Options. Note that this setting affects all jobs, and requires users to ensure that their User Account Settings are configured properly (see below).

User Account Settings


The user account settings used to start the rendering process are stored in the User Settings for each user. For Linux
and OSX, only the User Name is required. For Windows, the Domain and Password must also be provided for
authentication.

6.8 Multiple Slaves On One Machine


6.8.1 Overview
Deadline has the ability to launch and configure an arbitrary number of Slave instances on a single machine. Each
Slave instance can be given a unique name, and can be assigned its own list of Pools and Groups, which allows Slaves
to work independently on separate Jobs. A single high-performance machine could potentially process multiple 3D,
compositing, and simulation Jobs simultaneously.
Note that the configurations for these slave instances are stored locally on the slave machine. This means that these
slave instances exist independently from the repository that the slaves connect to. So if you delete a slave from the
repository, the local configuration for that slave instance still exists. Conversely, if you delete a local slave instance,
the slave will still have an entry in the repository. It is possible to remove both the slave from the repository and the
local slave instance from the slave machine, which is covered below.

6.8.2 Licensing
In Deadline 7, all Slave instances running on a single machine will use the same license. For example, if you had 3
slave instances running on one machine, they would only use 1 license.

6.8.3 Adding and Running Slaves


There are three ways to launch new slave instances:
From the Launcher menu by selecting Launch Slave By Name -> New Slave Instance. This is disabled by
default, but can be enabled in the User Group Management settings.

From the right-click menu in the Slave list in the Monitor by selecting Remote Control -> Slave Commands ->
Start New Slave Instance. By default, this is only available when in Super User Mode.

From the command line using the -name option.


deadlineslave -name "instance-01"

Additionally, for a headless/no GUI machine, you would add the -nogui flag.
deadlineslave -name "instance-01" -nogui
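As a hedged convenience sketch, the documented command above could be driven from Python to spin up several named instances on a headless node; the instance names and the assumption that deadlineslave is on the PATH are ours:

import subprocess

# Start three slave instances named instance-01, instance-02 and instance-03.
for index in range( 1, 4 ):
    instance_name = "instance-%02d" % index
    subprocess.Popen( [ "deadlineslave", "-name", instance_name, "-nogui" ] )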

Note that the name you enter is the postfix that is appended to the slave's base name. For example, if the slave's base name is Render-02, and you start a new instance on it called instance-01, the full name for that slave instance will be Render-02-instance-01. This is done so that if the slave's machine name is changed, the full slave name will be updated accordingly. Using the same example, if the machine was renamed to Node-05, the slave instance will now be called Node-05-instance-01.
Once the new Slave shows up in the Slave List in the Monitor, you can configure it like any other Slave. You might
want to use Slave Settings (see Slave Configuration) to assign the different Slaves to run on separate CPUs. It might
also be a good idea to assign them to different Pools and Groups, so that they can work on different types of Jobs to
avoid competing for the same resource (e.g., you could have one Slave assigned to CPU intensive Jobs, while the other
works on RAM intensive ones).
Once the Slave has been created, you can also launch it remotely like you would any other Slave. See the Remote
Control documentation for more information.

6.8.4 Removing Slaves


There are three ways to remove existing slave instances:
From the Launcher menu by selecting Launch Slave By Name -> Remove Slave Instances. This is disabled by
default, but can be enabled in the User Group Management settings.

From the right-click menu in the Slave list in the Monitor by selecting Remote Control -> Slave Commands
-> Remove Slave Instance. This method gives the additional option to automatically remove the slave instance
from the repository as well. By default, this is only available when in Super User Mode.

Manually delete the .ini files that define the local slave instances on the machine that the slave runs on. See the Client Configuration documentation for more information.

6.8.5 Limiting and Disabling Multiple Slaves


By default, users do not have the ability to launch additional Slaves on their own machines (see User Group Management). However, there are some cases where you might want to completely disable the ability to run multiple slaves
on the same machine.
The only known situation where this might be necessary is if your render nodes all net-boot off the same installation
(meaning they share the same file system). In this case, if multiple Slaves are enabled, each render node will end up
trying to run a Slave instance for every other render node net-booting off the same installation.

In this scenario, you can disable the multi-slave feature by opening the system's deadline.ini file and adding this line:
MultipleSlavesEnabled=False

The system deadline.ini file can be found in the following locations. Note that the # in the path will change based on
the Deadline version number.
Windows: %PROGRAMDATA%\Thinkbox\Deadline#\deadline.ini
Linux: /var/lib/Thinkbox/Deadline#/deadline.ini
OSX: /Users/Shared/Thinkbox/Deadline#/deadline.ini

6.9 Cloud Controls


6.9.1 Overview
Deadline has some built-in cloud features that allow it to connect to different cloud providers and control your instances. Currently, Amazon EC2, Microsoft Azure, Google Cloud, OpenStack, and vCenter are supported, but more providers may be added in the future.
Note that Deadline only allows you to control existing instances. It does not create instances for you, except in the
case where you clone an existing instance. In order to use instances for rendering, you will need to set them up first,
which includes installing the Deadline Client, installing your rendering software, and setting up any licensing that is
required.
Permission for the Cloud Panel can be edited in the User Group Permissions form. See Controlling Feature Access.

6.9.2 Cloud Providers


Cloud providers can be configured from the Monitor by selecting Tools -> Configure Cloud Providers. By default, this
option is hidden for normal users, so you may need to enter Super User Mode. This will bring up the Cloud Options
window.

Adding Providers
To add a provider, click the Add button under the Cloud Region list. Choose the Cloud plugin you wish to use, and
give it a region name. This is useful for providers like Amazon EC2 that have more than one region. Then click OK.

The new Cloud region will now show up in the Cloud Region list.
Configuring Providers
To configure an existing provider, select it in the Cloud Region box, which will bring up its configuration settings. These are the settings that the Monitor will use to connect to your cloud provider(s).
Every provider has an option to enable or disable it, but the other options can vary between providers. To get more
information about a particular setting, just hover your mouse over the setting text, or refer to the Cloud Plugins section
of the documentation.

6.9.3 Cloud Panel


The Cloud panel in the Monitor shows all the instances from the cloud providers that the Monitor is connected to. By
default, this panel is hidden for normal users, so you may need to enter Super User Mode before you can open it.

If the Cloud panel is not visible, see the Panel Features documentation for instructions on how to create new panels in
the Monitor.
Controlling Instances
The Cloud panel allows you to create new instances and control your existing instances using the right-click context
menu. The following options are available when you right-click on an instance:
Create New Instance: Creates a new instance.
Start Instance: Starts an instance that is currently stopped.
Stop Instance: Stops an instance that is currently running.

Destroy Instance: Destroys an existing instance. Once an instance is destroyed, it can not be recovered.
Clone Instance: Clones an existing instance. This allows you to quickly launch multiple copies of the selected
instance.
Reboot Instance: Reboots an instance that is currently running.
It should be noted that some cloud providers don't provide the ability to Start/Stop instances.

6.9.4 Cloud Plug-ins


Cloud providers are supported via the Cloud Plug-in system. This means that the existing ones can be customized,
or you can write your own. See the Cloud Plugins documentation for more information on creating cloud plug-ins.
Plugin data is only loaded and updated when the Cloud Panel is being displayed.

6.10 Job Transferring


6.10.1 Overview
If you have multiple office locations that each have their own Deadline Repository, it is possible to transfer Jobs between them. This can be handy if one office's farm is sitting idle while the other is completely swamped.
Note though that Deadline will only transfer over the files that are submitted with the Job, which in most cases is just
the scene file. You must ensure that all assets the scene requires and all output paths that it writes to exist in the remote
location before transferring the Job.

6.10.2 Setting Up a Transfer


Before you can transfer a Job, it must be in the Suspended, Completed, or Failed state. Just right-click on the Job, and select Scripts -> TransferSubmission. A Transfer Job window will be displayed.

You'll notice that you're actually submitting another Job that will transfer the original Job. The general Deadline options are explained in the Job Submission documentation. The Job Transfer specific options are:
Frame List and Frames Per Task: This is the frame list for the original Job that will be transferred. It will
default to the values for the original Job, but you can change them if you only want to transfer a subset of frames.
New Repository: This is the path to the remote Repository that the original Job will be transferred to. Note that
the Slaves that the transfer Job will be running on must be able to see this path in order to transfer the original
Job to the new repository.
Compress Files During Transfer: If enabled, the original Jobs files will be compressed during the transfer.
Suspend Remote Job After Transfer: If enabled, the original Job will be submitted in the Suspended state to
the new Repository.
Email Results After Transfer: If enabled, you will be emailed when the original Job has been successfully transferred. Note that this requires you to have your email notification options set up properly.
Remove Local Job After Transfer: If enabled, the original Job in the local Repository will be deleted after the
Job has been successfully transferred to the remote Repository.
Once you have your options set, click the Submit button to submit the transfer Job.

6.10.3 Global Transfer Options


Job Transfers are handled by a JobTransfer plugin, which has a few configurable options that affect all transfers. To change the JobTransfer plugin options, open the Monitor and select Tools -> Configure Plugins as a Super User, and then select the JobTransfer plugin from the list on the left.

The following options are available:


Notification Email(s): The email(s) where successful Job Transfer reports will be sent, so that sys admins can
keep track of all successfully transferred Jobs. Leave blank to disable this feature. Use commas to specify more
than one email address.

CHAPTER SEVEN: SCRIPTING

7.1 Scripting Overview


7.1.1 Overview
Scripts can be used to customize various aspects of Deadline, including creating custom plug-ins, submitting jobs to
the farm, or automating specific tasks after a job completes. The scripting language that Deadline uses is Python 2.7,
which is supported using Python for .NET. In addition to supporting native cPython modules, Python for .NET allows
your scripts to make use of the .NET Libraries. This fantastic combination of cPython & .NET allows for the best
of both worlds, suiting both seasoned cPython scripters and .NET technology based developers. Studios are free to
choose to use either or both technologies to their advantage in further customizing the Deadline compute management
framework.

7.1.2 Custom Repository Folder


If desired, custom scripts and plugins can be placed in the custom folder in the Repository. This folder contains
subfolders for different plugins and scripts, allowing you to customize the following areas of Deadline:
Application Plugins ../<DeadlineRepository>/custom/plugins/
Event Plugins ../<DeadlineRepository>/custom/events/
Cloud Plugins ../<DeadlineRepository>/custom/cloud/
Balancer Plugins ../<DeadlineRepository>/custom/balancer/
Monitor Scripts
Submission Scripts ../<DeadlineRepository>/custom/scripts/Submission/
General Scripts ../<DeadlineRepository>/custom/scripts/General/
Job Scripts ../<DeadlineRepository>/custom/scripts/Jobs/
Task Scripts ../<DeadlineRepository>/custom/scripts/Tasks/
Slave Scripts ../<DeadlineRepository>/custom/scripts/Slaves/
Pulse Scripts ../<DeadlineRepository>/custom/scripts/Pulse/
Balancer Scripts ../<DeadlineRepository>/custom/scripts/Balancer/
Limit Scripts ../<DeadlineRepository>/custom/scripts/Limits/
Job Report Scripts ../<DeadlineRepository>/custom/scripts/JobReports/
Slave Report Scripts ../<DeadlineRepository>/custom/scripts/SlaveReports/

Web Service Scripts ../<DeadlineRepository>/custom/scripts/WebService/


Note that any scripts or plugins in the custom folder will not be affected when upgrading or downgrading the Repository. The Repository installer also creates a backup of the custom directory, together with the other Deadline directories, to the ../backup/[timeStamp] and/or [mostRecent]/custom directory during the install process. In addition, any scripts or plugins in the custom folder will override any scripts or plugins that are shipped with Deadline if they share the same name. If you want to check out the scripts and plugins that are shipped with Deadline, you can find them in the events, plugins, and scripts folders in the Repository.
There is also an option for a job to load its Application Plug-in from another location, which can be set in the Job
Properties. This can be useful when testing plugins before updating them directly in the Repository.
Note that the in-app submitters stored under ../<DeadlineRepository>/submission/ are not included in the Custom Repository Folder system, due to the complexity and limitations of some of the application scripting languages. To customize any of the code under the submission directory, it is recommended to take a copy/backup for later reference. Note that any customizations you make will still get backed up when the repository installer is run during an upgrade. However, the contents of the submission directory will be overwritten during an upgrade.

7.1.3 Scripting Reference


The full Deadline Scripting Reference can be found on the Thinkbox Software Documentation Website. Offline PDF and HTML versions can be downloaded from there as well. Ensure you select the correct version of Deadline from the drop-down to view the API that matches your current Deadline version.
There are also many scripts and plug-ins that are shipped with Deadline, which you can use as a reference or starting
point for your own customization. These scripts can be found in the following folders in the Repository:
../<DeadlineRepository>/cloud Cloud Plugins
../<DeadlineRepository>/events Event Plugins
../<DeadlineRepository>/plugins Application Plugins
../<DeadlineRepository>/scripts Monitor Scripts

7.1.4 Application Submission Scripting Reference


Located under the ../<DeadlineRepository>/submission directory in Deadline's repository are the application-specific script files for all the deeply integrated application submitters. Each application directory, where applicable, has three sub-directories:
Client: The local proxy Client script is stored here, which typically is manually copied over to the local submitting client machine, thereby allowing users to open up the submission UI. These scripts tend not to be modified
very often and purely serve as a proxy script, which references/pulls the Main submission script from the Deadline repository, where the actual submission code resides.
Main: The Main script(s) files here are referenced or loaded into application memory, typically by the local
proxy Client script. It is in these script file(s) that the deep, submission integration code resides for each
application in question. All this code is unprotected and studios are invited to customize if they so choose.
Note, the in-app submitters stored under ../<DeadlineRepository>/submission are not included in the Custom
Repository Folder system, due to the complexity and limitation of some of the application scripting languages.
To customize any of the code under the submission directory, it is recommended to take a copy/backup for
later reference. Note, any customization you make, will still get backed up when the repository installer is run
during an upgrade. However, the contents of the submission directory will be overwritten during an upgrade.
Installers: For each application that has an in-app submitter, its documentation page contains instructions on how to manually install the local proxy Client script into the correct directory, along with any further configuration that may be required to get up and running. As an alternative, we provide Installer(s) which
can be run with the correct access permissions, to install the local proxy Client script(s) for you and also carry
out any further configuration that may be required. Where applicable, Installers are provided for the different
operating systems.
The following in-application deeply integrated submitters are available for reference or as a starting point for your
own custom submitter:
3ds Command ../<DeadlineRepository>/submission/3dsCmd/
3ds Max ../<DeadlineRepository>/submission/3dsmax/
Corona Distributed Rendering ../<DeadlineRepository>/submission/3dsmaxCoronaDR/
RPManager Script Setup ../<DeadlineRepository>/submission/3dsmaxRPM/
3ds Max V-Ray DBR ../<DeadlineRepository>/submission/3dsmaxVRayDBR/
After Effects ../<DeadlineRepository>/submission/AfterEffects/
AutoCAD ../<DeadlineRepository>/submission/AutoCAD/
Blender ../<DeadlineRepository>/submission/Blender/
Cinema 4D ../<DeadlineRepository>/submission/Cinema4D/
Cinema 4D Team Render ../<DeadlineRepository>/submission/Cinema4DTeamRender/
Clarisse iFX ../<DeadlineRepository>/submission/Clarisse/
Composite ../<DeadlineRepository>/submission/Composite/
Draft ../<DeadlineRepository>/submission/Draft/
ftrack ../<DeadlineRepository>/submission/FTrack/
Fusion ../<DeadlineRepository>/submission/Fusion/
Generation ../<DeadlineRepository>/submission/Generation/
Hiero ../<DeadlineRepository>/submission/Hiero/
Houdini ../<DeadlineRepository>/submission/Houdini/
Jigsaw ../<DeadlineRepository>/submission/Jigsaw/
Lightwave ../<DeadlineRepository>/submission/Lightwave/
Maya ../<DeadlineRepository>/submission/Maya/
Maya V-Ray DBR ../<DeadlineRepository>/submission/MayaVRayDBR/
Messiah ../<DeadlineRepository>/submission/Messiah/
MicroStation ../<DeadlineRepository>/submission/MicroStation/
modo ../<DeadlineRepository>/submission/Modo/
Interactive Distributed Rendering ../<DeadlineRepository>/submission/ModoDBR/
Nuke ../<DeadlineRepository>/submission/Nuke/
Realflow ../<DeadlineRepository>/submission/RealFlow/
Rhino ../<DeadlineRepository>/submission/Rhino/
SketchUp ../<DeadlineRepository>/submission/SketchUp/
Softimage ../<DeadlineRepository>/submission/Softimage/
Softimage V-Ray DBR ../<DeadlineRepository>/submission/SoftimageVRayDBR/

7.1.5 Running Scripts from the Command Line


To run scripts from the command line, the only requirement is that you define a __main__ function. This is the function
called by the Command application when it executes the script.
def __main__( *args ):
    # Replace "pass" with code
    pass

If you save this script to a file called myscript.py, you can execute it using this command:
deadlinecommand -ExecuteScript "myscript.py"

If you are running the script in a headless environment where there is no display, you should use this command instead:
deadlinecommand -ExecuteScriptNoGui "myscript.py"

The only difference between these commands is that ExecuteScriptNoGui doesn't pre-import any of the user interface modules, so that it can run in a headless environment. If your script doesn't use any user interface modules, then you can use ExecuteScriptNoGui regardless of whether or not you're in a headless environment.
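As a slightly fuller, hedged example of a script body, the snippet below prints any arguments it receives and queries the Repository root using one of the utility classes listed in the next section. The import line (from Deadline.Scripting import *) is how these utilities are normally exposed, but treat it as an assumption here:

from Deadline.Scripting import *

def __main__( *args ):
    # Print whatever arguments were passed along with the script.
    print( "Received %d argument(s)" % len( args ) )
    for arg in args:
        print( "  %s" % arg )

    # Query the Repository root via RepositoryUtils (see the tables in section 7.1.6).
    print( "Repository root: %s" % RepositoryUtils.GetRootDirectory() )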

7.1.6 Migrating Scripts From Deadline 5


Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5.
One change that affects all Deadline scripts is that the globally defined Deadline functions are no longer available.
However, many have functional replacements, which are mentioned below.
For migration tips for specific scripts, see the appropriate documentation:
Application Plug-ins
Event Plug-ins
Monitor Scripts
Job Scripts
Web Service Scripts

Deadline Repository Path Functions

Original Global Function -> Replacement Function

GetJobsDirectory() -> There is no replacement for this function because most job information is now stored in the Database. If you want to get the auxiliary folder for a job, use RepositoryUtils.GetJobAuxiliaryPath(job), which takes an instance of a job as a parameter.
GetJobDropDirectory() -> There is no replacement for this function because drop jobs have been removed.
GetLimitGroupsDirectory() -> There is no replacement for this function because Limit information is now stored in the Database.
GetPluginsDirectory() -> RepositoryUtils.GetPluginsDirectory()
GetPulseDirectory() -> There is no replacement for this function because Pulse information is now stored in the Database.
GetRootDirectory() -> RepositoryUtils.GetRootDirectory()
GetScriptsDirectory() -> RepositoryUtils.GetScriptsDirectory()
GetSettingsDirectory() -> RepositoryUtils.GetSettingsDirectory()
GetSlavesDirectory() -> There is no replacement for this function because Slave information is now stored in the Database.
GetSubmissionDirectory() -> There is no replacement for this function.
GetTempDirectory() -> There is no replacement for this function because there is no longer a temp folder in the Repository.
GetTrashDirectory() -> There is no replacement for this function because there is no longer a trash folder in the Repository.
GetUsersDirectory() -> There is no replacement for this function because User information is now stored in the Database.

Deadline Client Path Functions

Original Global Function -> Replacement Function

GetDeadlineBinPath() -> ClientUtils.GetBinDirectory()
GetDeadlineHomeCurrentUserPath() -> ClientUtils.GetCurrentUserHomeDirectory()
GetDeadlineHomePath() -> ClientUtils.GetUsersHomeDirectory()
GetDeadlineSettingsPath() -> ClientUtils.GetUsersSettingsDirectory()
GetDeadlineTempPath() -> ClientUtils.GetDeadlineTempPath()
GetLocalApplicationDataPath() -> PathUtils.GetLocalApplicationDataPath()
GetSystemTempPath() -> PathUtils.GetSystemTempPath()

General Process Functions

Original Global Function -> Replacement Function

IsProcessRunning(processName) -> ProcessUtils.IsProcessRunning(name)
KillAllProcesses(processName) -> ProcessUtils.KillProcesses(name)
KillParentAndChildProcesses(processName) -> ProcessUtils.KillParentAndChildProcesses(name)
WaitForProcessToStart(processName, timeoutSeconds) -> ProcessUtils.WaitForProcessToStart(name, timeoutMilliseconds)

File/Path/Directory Functions

Original Global Function -> Replacement Function

AddToPath(semicolonSeparatedList) -> DirectoryUtils.AddToPath(directory)
ChangeFilename(path, filename) -> PathUtils.ChangeFilename(path, filename)
FileExists(filename) -> FileUtils.FileExists(filename)
GetExecutableVersion(filename) -> FileUtils.GetExecutableVersion(filename)
GetFileSize(filename) -> FileUtils.GetFileSize(filename)
GetIniFileKeys(iniFilename, section) -> FileUtils.GetIniFileKeys(fileName, section)
GetIniFileSections(iniFilename) -> FileUtils.GetIniFileSections(fileName)
GetIniFileSetting(iniFilename, section, key, default) -> FileUtils.GetIniFileSetting(fileName, section, key, defaultValue)
Is64BitDllOrExe(filename) -> FileUtils.Is64BitDllOrExe(filename)
SearchDirectoryList(semicolonSeparatedList) -> DirectoryUtils.SearchDirectoryList(directoryList)
SearchFileList(semicolonSeparatedList) -> FileUtils.SearchFileList(fileList)
SearchFileListFor32Bit(semicolonSeparatedList) -> FileUtils.SearchFileListFor32Bit(fileList)
SearchFileListFor64Bit(semicolonSeparatedList) -> FileUtils.SearchFileListFor64Bit(fileList)
SearchPath(filename) -> DirectoryUtils.SearchPath(filename)
SetIniFileSetting(iniFilename, section, key, value) -> FileUtils.SetIniFileSetting(filename, section, key, value)
SynchronizeDirectories(srcPath, destPath, deepCopy) -> DirectoryUtils.SynchronizeDirectories(sourceDirectory, destDirectory, deepCopy)
ToShortPathName(filename) -> PathUtils.ToShortPathName(path)

Miscellaneous Functions

Original Global Function -> Replacement Function

BlankIfEitherIsBlank(str1, str2) -> StringUtils.BlankIfEitherIsBlank(str1, str2)
ExecuteScript(scriptFilename, arguments) -> ClientUtils.ExecuteScript(scriptFilename, arguments)
Sleep(milliseconds) -> SystemUtils.Sleep(milliseconds)

OS Functions

Original Global Function -> Replacement Function

GetAvailableRam() -> SystemUtils.GetAvailableRam()
GetApplicationPath(filename) -> PathUtils.GetApplicationPath(applicationName)
GetCpuCount() -> SystemUtils.GetCpuCount()
GetRegistryKeyValue(keyName, valueName, defaultValue) -> SystemUtils.GetRegistryKeyValue(keyName, valueName, defaultValue)
GetTotalRam() -> SystemUtils.GetTotalRam()
GetUsedRam() -> SystemUtils.GetUsedRam()
Is64Bit() -> SystemUtils.Is64Bit()
IsRunningOnLinux() -> SystemUtils.IsRunningOnLinux()
IsRunningOnMac() -> SystemUtils.IsRunningOnMac()
IsRunningOnWindows() -> SystemUtils.IsRunningOnWindows()
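As a hedged before/after sketch of a migration, the script below replaces a few of the old global calls with the Deadline 6+ equivalents listed in the tables above. The Deadline.Scripting import is the usual way these utility classes are exposed, but treat it as an assumption here:

from Deadline.Scripting import *

def __main__( *args ):
    # Deadline 5 (no longer available):
    #   if IsRunningOnWindows() and Is64Bit():
    #       print GetDeadlineBinPath()
    # Deadline 6 and later, using the replacement functions from the tables above:
    if SystemUtils.IsRunningOnWindows() and SystemUtils.Is64Bit():
        print( "Deadline bin directory: %s" % ClientUtils.GetBinDirectory() )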

7.2 Application Plugins


7.2.1 Overview
All of Deadline's plug-ins are written in Python, which means that it's easy to create your own plug-ins or customize the existing ones. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference.
Note that because the Python scripts for application plug-ins will be executed in a non-interactive way, it is important that your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input.
When a plugin is loaded, the log will show where the plugin is being loaded from.

7.2.2 General Plug-in Information


There are two types of plug-ins that can be created:
Simple
Advanced
Simple plug-ins provide the basics to wrap a command line application, and are typically used to build up command line arguments to pass to the application. Advanced plug-ins provide more control, and are typically used when running a simple command line application isn't enough. Other than the plug-in Python script itself though, Simple and Advanced plug-ins are very similar.

7.2.3 Creating a New Plug-in


This section covers the areas that Simple and Advanced plug-ins have in common. Specifics for Simple and Advanced plug-ins are covered later on.
To create a new plug-in, start by creating a folder in the Repository's custom\plugins folder and give it the name of your plug-in. See the Scripting Overview documentation for more information on the custom folder in the Repository and how it is used.
For the sake of this document, we will call our new plug-in MyPlugin. All relevant script and configuration files for this plug-in are to be placed in this plug-in's folder (some are required and some are optional).


The dlinit File - Required


The first required file is MyPlugin.dlinit, which is the main configuration file for this plug-in. It is a plain text file that
defines a few general key=value plug-in properties, which include:
About: A short description of the plug-in.
ConcurrentTasks: Set to True or False (default is False). If tasks for this plug-in can render concurrently without interfering with each other, this can be set to True.
DebugLogging: Set to True or False (default is False). If set to True, then debug plug-in logging will be printed out during rendering.
DeprecatedMode: Set to True or False (default is False). Only set to True if you want a custom Python.NET plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be found later on.

It can also define key=value custom settings to be used by the plug-in. A common custom setting is the executable to
use to render the job. For this example, our MyPlugin.dlinit file might look like this:
About=My Example Plugin for Deadline
# This is a comment
ConcurrentTasks=True
MyPluginRenderExecutable=c:\path\to\my\executable.exe

The py File - Required


The other required file is MyPlugin.py, which is the main plug-in script file. It defines the main DeadlinePlugin class
that contains the necessary code that Deadline uses to render a job. This is where Simple and Advanced plug-ins will
differ, and the specifics for each can be found later on, but the template for this script file might look like this:
from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    # TODO: Place code here instead of "pass"
    pass


The first thing to note is that we're importing the Deadline.Plugins namespace so that we can access the DeadlinePlugin class.
The GetDeadlinePlugin() function is important, as it allows the Slave to get an instance of our MyPlugin class (which is extending the abstract DeadlinePlugin class). In Deadline 6.2 and later, the GetDeadlinePluginWithJob( job ) function can be defined as an alternative. It works the same as GetDeadlinePlugin(), except that it accepts an instance of the Job object that the plug-in is being loaded for. If neither of these functions is defined, the Slave will report an error when it tries to render the job.
The MyPlugin class will need to implement certain callbacks based on the type of plug-in it is, and these callbacks must be hooked up in the MyPlugin constructor. One callback that all plug-ins should implement is the InitializeProcess function. There are many other callbacks that can be implemented, which are covered in the Events section for the DeadlinePlugin class in the Deadline Scripting reference.
The CleanupDeadlinePlugin() function is also important, as it is necessary to clean up the plug-in when it is no longer in use. Typically, this is used to clean up any callbacks that were created when the plug-in was initialized.
To start off, the InitializeProcess callback is typically used to set some general plug-in settings:
from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the plugin.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

These are the common plug-in properties that can be set in the InitializeProcess callback. See the DeadlinePlugin class in the Deadline Scripting reference for additional properties.


PluginType: The type of plug-in this is (PluginType.Simple/PluginType.Advanced).
SingleFramesOnly: Set to True or False. Set to True if your plug-in can only work on one frame at a time, rather than a frame sequence.

The param File - Optional


The MyPlugin.param file is an optional file that is used by the Plugin Configuration dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying custom settings in the MyPlugin.dlinit file. After you've created this file, open the Monitor and enter Super User mode. Then select Tools -> Configure Plugins and look for your plug-in in the list on the left.

The file might look something like:


[MyPluginRenderExecutable]
Type=filename
Label=My Plugin Render Executable
Default=c:\path\to\my\executable.exe
Description=The path to the executable file used for rendering.

Comment lines are supported in the param file, and must start with either ; or #. For example:
# This is the file name picker control to set the executable for this plugin.
[MyPluginRenderExecutable]
Type=filename
Label=My Plugin Render Executable
Default=c:\path\to\my\executable.exe
Description=The path to the executable file used for rendering.


You'll notice that the property name between the square brackets matches the MyPluginRenderExecutable custom setting we defined in our MyPlugin.dlinit file. This means that this control will change the MyPluginRenderExecutable setting. The available key=value pairs for the properties defined here are:

Category: The category the control should go under.
CategoryIndex: This determines the control's order under its category. This does the same thing as Index.
CategoryOrder: This determines the category's order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
Description: A short description of the property the control is for (displayed as a tooltip in the UI).
DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
Index: This determines the control's order under its category. This does the same thing as CategoryIndex.
Label: The control label.
Required: If True, a control will be shown for this property even if it's not defined in the dlinit file (True/False).
Type: The type of control (see table below).

These are the available controls.


Boolean: A drop-down control that allows the selection of True or False.
Color: Allows the selection of a color.
Enum: A drop-down control that allows the selection of an item from a list.
Enumeration: Same as Enum above.
Filename: Allows the selection of an existing file.
FilenameSave: Allows the selection of a new or existing file.
Float: A floating point spinner control.
Folder: Allows the selection of an existing folder.
Integer: An integer spinner control.
Label: A read-only text field.
MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
MultiLineString: A text field with multiple lines.
Password: A text field that masks the text.
SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
String: A text field.

There are also key/value pairs for specific controls:


DecimalPlaces: The number of decimal places for the Float controls.
Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
Increment: The value to increment the Integer or Float controls by.
Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
Maximum: The maximum value for the Integer or Float controls.
Minimum: The minimum value for the Integer or Float controls.
Validator: A regular expression for the String control that is used to ensure the value is valid.
Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.

The options File - Optional


The MyPlugin.options file is an optional file that is used by the Job Properties dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying plug-in specific options as they appear in the plug-in info file that was submitted with the job. After you've created this file, you can right-click on a job in the Monitor that uses this plug-in and select Modify Properties. You should then see a MyPlugin page at the bottom of the list on the left which you can select to view these properties.

Often, these plug-in specific options are used to build up the arguments to be passed to the rendering application. Let's assume that our render executable takes a -verbose argument that accepts a boolean parameter, and that the plug-in info file submitted with the job contains the following:
Verbose=True


Now we would like to be able to change this value from the Job Properties dialog in the Monitor, so our MyPlugin.options file might look like this:
[Verbose]
Type=boolean
Label=Verbose Logging
Description=If verbose logging is enabled.
Required=true
DisableIfBlank=false
DefaultValue=True

You'll notice that the property name between the square brackets matches the Verbose setting in our plug-in info file. This means that this control will change the Verbose setting. The available key=value pairs for the properties defined here are the same as those defined for the param file above. Comment lines are also supported in the options file in the same way they are supported in the param file.
The ico File - Optional
The MyPlugin.ico file is an optional 16x16 icon file that can be used to easily identify jobs that use this plug-in in the Monitor. Typically, it is the plug-in application's logo, or something else that represents the plug-in. If a plug-in does not have an icon file, a generic icon will be shown in the jobs list in the Monitor.
The JobPreLoad.py File - Optional
The JobPreLoad.py file is an optional script that will be executed by the Slave prior to loading a job that uses this
plug-in. Note that in this case, the file does not share its name with the plug-in folder. This script can be used to do
things like synchronize plug-ins or scripts prior to starting the render job.
The only requirement for the JobPreLoad.py script is that you define a __main__ function, which is called by the Slave when it executes the script. It must accept a single parameter, which is the current instance of the DeadlinePlugin class.
Here is an example script that copies a couple files from a server to the local machine, and sets some environment
variables:
from System import *
from System.IO import *

def __main__( deadlinePlugin ):
    deadlinePlugin.LogInfo( "Copying some files" )
    File.Copy( r"\\server\files\file1.ext", r"C:\local\files\file1.ext", True )
    File.Copy( r"\\server\files\file2.ext", r"C:\local\files\file2.ext", True )

    deadlinePlugin.LogInfo( "Setting EnvVar1 to True" )
    deadlinePlugin.SetProcessEnvironmentVariable( "EnvVar1", "True" )

    deadlinePlugin.LogInfo( "Setting EnvVar2 to False" )
    deadlinePlugin.SetProcessEnvironmentVariable( "EnvVar2", "False" )

The PluginPreLoad.py File - Optional


The PluginPreLoad.py file is an optional script that will be executed by the Slave prior to executing any python script
for the plug-in (MyPlugin.py or JobPreLoad.py), and any pre or post job or task script for the current job. Note that
in this case, the file does not share its name with the plug-in folder. This script can be used to set up the Python environment prior to running any other python script, including setting sys.path to control where additional modules will be loaded from.
The only requirement for the PluginPreLoad.py script is that you define a __main__ function, which is called by the
Slave when it executes the script. It does not accept any parameters. Here is an example script that updates sys.path
with custom paths:
import sys

def __main__():
    path = r"\\server\python"
    if path not in sys.path:
        sys.path.append( path )

7.2.4 Simple Plug-ins


A render job goes through three stages:
StartJob: A job enters this stage when it is first picked up by a Slave.
RenderTasks: A job can enter this stage many times (once for each task a Slave dequeues while it has the
current job loaded).
EndJob: A job enters this stage when a Slave is unloading the job.
Simple plug-ins only cover the RenderTasks stage, and are pretty straightforward. They are commonly used to render with applications that support simple command line rendering (running a command line executable and waiting for it to complete). For example, After Effects has a command line renderer called aerender.exe, which can be executed by the Slave to render specific frames of an After Effects project file.
Initialization
By default, a plug-in is considered to be a Simple plug-in, but you can explicitly set this in the InitializeProcess()
callback (as explained above). You can also define settings specific to the simple plug-in, as well as any popup or
stdout handlers that you need. These additional settings are covered in the ManagedProcess class in the Deadline
Scripting reference (note that the DeadlinePlugin class inherits from the ManagedProcess class). For example:
from Deadline.Plugins import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback

        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True

        # StdoutHandling should be enabled if required in your plugin
        self.StdoutHandling = True

        # PopupHandling should be enabled if required in your plugin
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.LogWarning( self.GetRegexMatch(0) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.FailRender( "Detected an error: " + self.GetRegexMatch(1) )

Stdout Handlers
The AddStdoutHandlerCallback() function accepts a string parameter, which is a POSIX compliant regular expression
used to match against lines of stdout from the command line process. This function also returns a RegexHandlerCallback instance, to which you can hook up a callback that is called whenever a line of stdout matches the expression. This can all be
done on one line, as shown in the example above.
Examples of handler callback functions are also shown in the example above. Within these handler functions, the
GetRegexMatch() function can be used to get a specific match from the regular expression. The parameter passed to
GetRegexMatch() is the index of the match that was found: 0 returns the entire matched string, and 1, 2, etc. return
the matched substrings (matches that are surrounded by round brackets). If there isn't a corresponding substring, you'll
get an error (note that 0 is always a valid index).
In HandleStdoutWarning(), 0 is the only valid index because there is no substring in round brackets in the regular
expression. In HandleStdoutError(), 0 and 1 are valid: 0 will return the entire matched string, whereas 1 will return
the substring in the round brackets.
For further examples, please open any of our application plugin Python script files and inspect them. A comprehensive
set of stdout handlers can be found in the MayaBatch plugin:
../plugins/MayaBatch/MayaBatch.py
Note that Deadline's default shipping stdout handlers require the Slave's Operating System to be using ENGLISH as
its language.
Popup Ignorers and Handlers
The AddPopupIgnorer() function accepts a string parameter, which is a POSIX compliant regular expression. If a
popup is displayed with a title that matches the given regular expression, the popup is simply ignored. Popup ignorers
should only be used if the popup doesn't halt the rendering while waiting for a button to be pressed. In the
case where a button needs to be pressed to continue, popup handlers should be used instead. The AddPopupHandler()
function takes two parameters: a regular expression string, and the button(s) to press (multiple buttons can be separated
with semicolons).
Note that Deadline's default shipping popup ignorers and popup handlers require the Slave's Operating System to be
using ENGLISH as its language.
Here is an example using .* at the beginning and end of the title search string, which acts as a wildcard. The dialog
also has an "Adopt the File's Unit Scale?" checkbox that needs to be checked ON before the OK button is
pressed, in that order.
self.PopupHandling = True
self.AddPopupHandler( ".*File Load: Units Mismatch.*", "Adopt the File's Unit Scale?;OK" )

In this example, the Optical Flares license popup uses a wxWindowClassNR control for its OK button, so we need
to add this special class type to our built-in list of possible button classes, just for the After Effects plugin. Once this
class is added, we can search for it and react by pressing the OK button in its dialog. Note that in this case the button
visually displays the word "OK", but the actual name of the button is "panel".
self.PopupHandling = True
self.PopupButtonClasses = ( "Button", "wxWindowClassNR" )
# Handle Optical Flares License popup (the "OK" button is actually called "panel")
self.AddPopupHandler( ".*Optical Flares License.*", "panel" )

For users without access to a recent (2012+) version of Visual Studio, which includes the Spy++ utility, the free
application WinSpy++ is very useful for identifying the correct syntax for a dialog's title or button.

In this example, we force all Qt based widgets to be native widgets instead of alien widgets, set our HandleQtPopups
variable to True, and then we are able to handle V-Ray's Qt based alien widget dialogs while rendering in Rhino
by pressing the [X] symbol in the top right corner of the Rhino Qt dialog:
self.PopupHandling = True
self.HandleQtPopups = True
self.SetEnvironmentVariable( "QT_USE_NATIVE_WINDOWS","1" )
self.AddPopupHandler( r"Rhino", "[X]" )

In this final example, we need to handle Windows 8 Mobile / Windows 10 based popup dialogs and ensure we react
correctly depending on the dialog title, which can be tricky if the application displays multiple popups with very
similar titles. We use the .* characters as a wildcard, the ^ character to ensure the text appears at the start of the
string, and the $ character to ensure the text appears at the end of the string we are searching for.
self.PopupHandling = True
self.HandleWindows10Popups = True
self.AddPopupIgnorer( "SAFE 12.*" )
self.AddPopupIgnorer( "^SAFE$" )
self.AddPopupHandler( "^$", "[X]" )
self.AddPopupHandler( "Tip of the Day", "[X]" )

For further examples, please open up any of our application plugin Python script files and inspect them. Good examples
are to be found in:
../plugins/3dsmax/3dsmax.py
../plugins/AfterEffects/AfterEffects.py
../plugins/CSiSAFE/CSiSAFE.py
../plugins/Rhino/Rhino.py
Further information on Regular Expressions can be found on Wikipedia and many online POSIX compliant RegEx
testers are available to help you develop and test your RegEx before testing your code in Deadline:
regex101
regexr
regexpal
regextester
Finally, the Deadline FranticX.Processes.ManagedProcess class has a number of additional members to further assist with
popup handling, and it is recommended to review our Scripting API docs for further information on them (a brief sketch follows the list below):
PopupButtonClasses
PopupMaxChildWindows
PopupTextClasses
PressEnterDuringRender
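As a rough sketch only, these members are typically set in the InitializeProcess() callback. The values below are illustrative assumptions, not recommended defaults; consult the Scripting reference for the exact behaviour of each member.

def InitializeProcess( self ):
    self.PopupHandling = True
    # Extra window classes to search when looking for popup buttons/text
    # (illustrative values; "wxWindowClassNR" comes from the example above).
    self.PopupButtonClasses = ( "Button", "wxWindowClassNR" )
    self.PopupTextClasses = ( "Static", )
    # Limit on child windows considered per popup (illustrative value).
    self.PopupMaxChildWindows = 50
    # Optionally press Enter periodically during rendering.
    self.PressEnterDuringRender = False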
Render Executable and Arguments
The RenderExecutable() callback is used to get the path to the executable that will be used for rendering. This callback
must be implemented in a Simple plug-in, or an error will occur. Continuing our example from above, we'll use the
path specified in the MyPlugin.dlinit file, which we can access using the GetConfigEntry() function.

Another important (but optional) callback is the RenderArgument() callback. This callback should return the arguments you want to pass to the render executable. Typically, these arguments are built from values that are pulled from
the DeadlinePlugin class (like the scene file name, or the start and end frame for the task), or from the plug-in info file
that was submitted with the job using the GetPluginInfoEntry() function. If this callback is not implemented, then no
arguments will be passed to the executable.
After adding these callbacks, our example plug-in script now looks like this:
from Deadline.Plugins import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess
        self.RenderExecutableCallback += self.RenderExecutable
        self.RenderArgumentCallback += self.RenderArgument

    ## Clean up the plugin.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback

        del self.InitializeProcessCallback
        del self.RenderExecutableCallback
        del self.RenderArgumentCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        self.StdoutHandling = True
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.LogWarning( self.GetRegexMatch(0) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.FailRender( "Detected an error: " + self.GetRegexMatch(1) )

    ## Callback to get the executable used for rendering.
    def RenderExecutable( self ):
        return self.GetConfigEntry( "MyPluginRenderExecutable" )

    ## Callback to get the arguments that will be passed to the executable.
    def RenderArgument( self ):
        arguments = " -continueOnError"
        arguments += " -verbose " + self.GetPluginInfoEntry( "Verbose" )
        arguments += " -start " + str(self.GetStartFrame())
        arguments += " -end " + str(self.GetEndFrame())
        arguments += " -scene \"" + self.GetDataFilename() + "\""
        return arguments

There are many other callbacks that can be implemented for Simple plug-ins, which are covered in the Events section
for the ManagedProcess class in the Deadline Scripting reference. The best place to find examples of Simple plug-ins
is to look at some of the plug-ins that are shipped with Deadline. These range from the very basic (Blender), to the
more complex (MayaCmd).

7.2.5 Advanced Plug-ins


To reiterate, a render job goes through three stages:
StartJob: A job enters this stage when it is first picked up by a Slave.
RenderTasks: A job can enter this stage many times (once for each task a Slave dequeues while it has the
current job loaded).
EndJob: A job enters this stage when a Slave is unloading the job.
Advanced plug-ins are more complex, as they control all three of these job stages. They are commonly used to render
with applications that support some sort of slave/server mode that Deadline can interact with. Usually, this requires the
application to be started during the StartJob phase, fed commands during the RenderTasks stage(s), and finally shut
down during the EndJob stage. For example, the 3ds Max plug-in starts up 3dsmax in slave mode and forces it to load
our Lightning plug-in. The Lightning plug-in listens for commands from Deadline and executes them as necessary.
After rendering is complete, 3ds Max is shut down.
Initialization
To indicate that your plug-in is an Advanced plug-in, you need to set the PluginType property in the InitializeProcess()
callback.
from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Advanced

Render Tasks
The RenderTasks() callback is the only required callback for Advanced plug-ins. If it is not implemented, an error will
occur. It contains the code to be executed for each task that a Slave renders. This could involve launching applications,
communicating with already running applications, or simply running a script to automate a particular task (like backing
up a group of files).
Other common callbacks for Advanced plug-ins are the StartJob() and EndJob() callbacks. The StartJob() callback can
be used to start up an application, or to set some local variables that will be used in other callbacks. If the StartJob()
callback is not implemented, then nothing is done during the StartJob phase. The EndJob() callback can be used to
shut down a running application, or to clean up temporary files. If the EndJob() callback is not implemented, then
nothing is done during the EndJob phase.

In the example below, we will be launching our application during the StartJob phase. The benefit of this is that
the application can be left running for the duration of the job, which eliminates the overhead of having to launch
the application for each task. To launch and monitor the application, we will be implementing a ManagedProcess
class and calling it MyPluginProcess. This ManagedProcess class will define the render executable and command line
arguments for launching the process we will be monitoring. Note that we aren't passing it any frame information, as
this needs to be handled in the RenderTasks() callback when it interacts with the process.
After adding these three callbacks, and the MyPluginProcess class, our example code looks like this. Note that the
RenderTasks() callback still needs code to allow it to interact with the running process launched in the StartJob()
callback.
from Deadline.Plugins import *
from FranticX.Processes import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Variable to hold the Managed Process object.
    Process = None

    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess
        self.StartJobCallback += self.StartJob
        self.RenderTasksCallback += self.RenderTasks
        self.EndJobCallback += self.EndJob

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback
        del self.StartJobCallback
        del self.RenderTasksCallback
        del self.EndJobCallback

        # Clean up the managed process object.
        if self.Process:
            self.Process.Cleanup()
            del self.Process

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Advanced

    ## Called by Deadline when the job starts.
    def StartJob( self ):
        self.Process = MyPluginProcess( self )
        self.StartMonitoredManagedProcess( "My Process", self.Process )

    ## Called by Deadline for each task the Slave renders.
    def RenderTasks( self ):
        # Do something to interact with the running process.
        pass

    ## Called by Deadline when the job ends.
    def EndJob( self ):
        self.ShutdownMonitoredManagedProcess( "My Process" )

######################################################################
## This is the ManagedProcess class that is launched above.
######################################################################
class MyPluginProcess (ManagedProcess):
    deadlinePlugin = None

    ## Hook up the callbacks in the constructor.
    def __init__( self, deadlinePlugin ):
        self.deadlinePlugin = deadlinePlugin
        self.InitializeProcessCallback += self.InitializeProcess
        self.RenderExecutableCallback += self.RenderExecutable
        self.RenderArgumentCallback += self.RenderArgument

    ## Clean up the managed process.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback

        del self.InitializeProcessCallback
        del self.RenderExecutableCallback
        del self.RenderArgumentCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        self.StdoutHandling = True
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.deadlinePlugin.LogWarning( self.GetRegexMatch(0) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.deadlinePlugin.FailRender( "Detected an error: " + self.GetRegexMatch(1) )

    ## Callback to get the executable used for rendering.
    def RenderExecutable( self ):
        return self.deadlinePlugin.GetConfigEntry( "MyPluginRenderExecutable" )

    ## Callback to get the arguments that will be passed to the executable.
    def RenderArgument( self ):
        arguments = " -verbose " + self.deadlinePlugin.GetPluginInfoEntry( "Verbose" )
        arguments += " -scene \"" + self.deadlinePlugin.GetDataFilename() + "\""
        return arguments

Because the Advanced plug-ins are much more complex than the Simple plug-ins, we recommend taking a look at the
following plug-ins that are shipped with Deadline for examples:
3dsmax
Fusion
Lightwave
MayaBatch
Modo
Nuke
SoftimageBatch

7.2.6 Migrating Plug-ins from Deadline 5


Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with plugin scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and this guide will walk you through the API changes so that you can update your scripts as necessary.
Global Functions
In Deadline 6, all global API functions were removed, and replaced with DeadlinePlugin member functions, or with
static utility functions. See the Migrating Scripts From Deadline 5 section in the Scripting Overview documentation
for more information, including replacement functions.
Almost all plugin-specific global functions are now DeadlinePlugin member functions. For example, the global
LogInfo( message ) function has been replaced with a member function for the DeadlinePlugin class, which you
created in your plugin python file. So instead of:
LogInfo( "this is a test message" )

You would use this code:
self.LogInfo( "this is a test message" )

The only functions that aren't DeadlinePlugin member functions are listed below, along with their replacement utility
functions.
Original Global Function -> Replacement Function
CheckPathMapping( path ) -> RepositoryUtils.CheckPathMapping( path )
CheckPathMappingInFile( inFileName, outFileName ) -> RepositoryUtils.CheckPathMappingInFile( inFileName, outFileName )
CheckPathMappingInFileAndReplaceSeparator( inFileName, outFileName, separatorToReplace, newSeparator ) -> RepositoryUtils.CheckPathMappingInFileAndReplaceSeparator( inFileName, outFileName, separatorToReplace, newSeparator )
PathMappingRequired( path ) -> RepositoryUtils.PathMappingRequired( path )
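For instance, a Deadline 5 path mapping call would be migrated like this. This is a minimal sketch; it assumes the RepositoryUtils class is imported from the Deadline.Scripting namespace, and 'path' is whatever path variable your script already has.

from Deadline.Scripting import *

# Deadline 5 (global function, no longer available):
#     newPath = CheckPathMapping( path )

# Deadline 6 and later (static utility function):
newPath = RepositoryUtils.CheckPathMapping( path )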

Callbacks
You need to set up callbacks in the constructor of your DeadlinePlugin class that you created in your plugin python
file. Examples are shown in the documentation above, and you can look at the plug-ins that ship with Deadline for
references as well. For example:
def __init__( self ):
    self.InitializeProcessCallback += self.InitializeProcess
    self.RenderExecutableCallback += self.RenderExecutable
    self.RenderArgumentCallback += self.RenderArgument
    self.PreRenderTasksCallback += self.PreRenderTasks
    self.PostRenderTasksCallback += self.PostRenderTasks

Note that these callbacks need to be manually cleaned up when the plug-in is no longer in use. See the documentation
regarding the CleanupDeadlinePlugin function above for more information.
Deprecated Mode
As mentioned above, you can set the DeprecatedMode property in your dlinit file to True. This mode allows
Python.NET plug-ins written for Deadline 5.1 or 5.2 to work with Deadline 6 and later, which can make the transition to Deadline 6 easier if you have custom plug-ins.
Note that when DeprecatedMode is enabled, all global functions will still be available, so if you have custom
Python.NET plug-ins, you just need to drop them in the custom/plugins folder in the Repository, and add DeprecatedMode=True to your dlinit file.
If you have custom IronPython plug-ins from Deadline 5.2 or earlier, they will not work with Deadline 6 and later.
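For example, a migrated MyPlugin.dlinit might simply gain the extra DeprecatedMode line alongside its existing entries. The executable entry shown is the one used by the examples above, and the path value is just a placeholder.

MyPluginRenderExecutable=C:\path\to\renderer.exe
DeprecatedMode=True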

7.3 Event Plugins


7.3.1 Overview
Event plug-ins can be created to execute specific tasks in response to specific events in Deadline (like when a job is
submitted or when it finishes). For example, event plug-ins can be used to communicate with in-house pipeline tools
to update the state of shots or tasks, or they can be used to submit a post-processing job when another job finishes. All
of Deadline's event plug-ins are written in Python, which means that it's easy to create your own plug-ins or customize
the existing ones. See the Scripting Overview documentation for more information, and links to the Deadline Scripting
reference.
Note that because the Python scripts for event plug-ins will be executed in a non-interactive way, it is important that
your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input.
When an event is executed, the log will show where the script is being loaded from.

7.3.2 Triggering Events


An event plug-in can respond to one or more of the following DeadlineEventListener events:
When a job is submitted - OnJobSubmittedCallback
When a job starts rendering - OnJobStartedCallback
When a job finishes rendering - OnJobFinishedCallback
When a job is requeued - OnJobRequeuedCallback
When a job fails - OnJobFailedCallback
When a job is suspended - OnJobSuspendedCallback
When a suspended or failed job is resumed - OnJobResumedCallback
When a job is placed in the pending state - OnJobPendedCallback
When a job is released from a pending state - OnJobReleasedCallback
When a job is deleted - OnJobDeletedCallback
When a job error occurs during rendering - OnJobErrorCallback
When a job is about to be purged from the database - OnJobPurgedCallback
When a house cleaning operation finishes - OnHouseCleaningCallback
When a repository repair operation finishes - OnRepositoryRepairCallback
When a slave starts - OnSlaveStartedCallback
When a slave stops - OnSlaveStoppedCallback
When a slave becomes idle - OnSlaveIdleCallback
When a slave starts rendering - OnSlaveRenderingCallback
When a slave starts a job - OnSlaveStartingJobCallback
When a slave is marked as stalled - OnSlaveStalledCallback
When power management's Idle Shutdown feature shuts down slaves - OnIdleShutdownCallback
When power management's Machine Startup feature starts up slaves - OnMachineStartupCallback
When power management's Thermal Shutdown feature shuts down slaves - OnThermalShutdownCallback
When power management's Machine Restart feature restarts slaves - OnMachineRestartCallback
The corresponding Event Callbacks for these events can be found in the Deadline.Events.DeadlineEventListener Class
Reference section of the Deadline Scripting Reference documentation. The full Deadline Scripting Reference can be
found on the Thinkbox Software Documentation Website. Offline PDF and HTML versions can be downloaded from
here as well.
By default, all jobs will trigger event plug-ins when they are submitted or change state. However, there is a job property
that can be enabled to suppress events. In the Monitor, you can set the Suppress Events property under the Advanced
tab in the Job Properties dialog. If you have a custom submission tool or script, you can specify the following in the
job info file:
SuppressEvents=True

Note that events will be executed by different Deadline applications, depending on the context of the event. For
example, the job submission event is processed by the Command application after the job has been submitted, while
the job finished event is normally processed by the Slave that finishes the last task for the job. However, the job finished
event could also be processed by the Monitor if a job is manually marked as complete.

7.3.3 Creating an Event Plug-in


To create a custom event plug-in, you start by creating a folder in the Repository's custom\events folder and give it the
name of your event plug-in. See the Scripting Overview documentation for more information on the custom folder
in the Repository and how it's used.
For the sake of this document, we will call our new event plug-in MyEvent. All relevant script and configuration files
for this event plug-in are to be placed in this folder (some are required and some are optional).
The dlinit File - Required
The first required file is MyEvent.dlinit, which is the main configuration file for this event plug-in. It is a plain text file
that defines a few general key=value event plug-in properties, which include:
Key Name - Description
Enabled - Set to True or False (default is False). Only enabled event plug-ins will respond to events.
DeprecatedMode - Set to True or False (default is False). Only set to True if you want a custom Python.NET event plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be found later on.

It can also define key=value custom settings to be used by the event plug-in. For example, if you are connecting to an
in-house pipeline tool, you may want the URL and credentials to be configurable, in which case our MyEvent.dlinit
file might look like this:
Enabled=True
PipelineURL=http://[myserver]/pipeline
PipelineUserName=myuser
PipelinePassword=mypassword

The py File - Required


The other required file is MyEvent.py, which is the main event plug-in script file. It defines the main DeadlineEventListener class that contains the necessary callbacks that will respond to specific events. The template for this script file
might look like this:
from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return MyEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for MyEvent.
######################################################################
class MyEvent (DeadlineEventListener):
    # TODO: Place code here to replace "pass"
    pass

The first thing to note is that we're importing the Deadline.Events namespace so that we can access the DeadlineEventListener class.
The GetDeadlineEventListener() function is important, as it allows Deadline to get an instance of our MyEvent class
(which extends the abstract DeadlineEventListener class). In Deadline 6.2 and later, the GetDeadlineEventListenerWithJobs( jobs ) function can be defined as an alternative. It works the same as GetDeadlineEventListener(), except
that it accepts a list of the Job objects that the event plug-in is being loaded for. If neither of these functions is
defined, Deadline will report an error when it tries to load the event plug-in.
The MyEvent class will need to implement certain callbacks based on the events you want to respond to, and these
callbacks must be hooked up in the MyEvent constructor. All callbacks are optional, but make sure to include at
least one so that your event plug-in actually does something. For a list of all available callbacks, refer to the DeadlineEventListener class in the Deadline Scripting reference.
The CleanupDeadlineEventListener() function is also important, as it is necessary to clean up the event plug-in when
it is no longer in use. Typically, this is used to clean up any callbacks that were created when the event plug-in was
initialized.
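A minimal sketch of the alternative entry point might look like this. Only the function name and its jobs parameter come from the description above; the LoadedJobs attribute is purely hypothetical and just illustrates one way to keep the list around.

def GetDeadlineEventListenerWithJobs( jobs ):
    # 'jobs' is the list of Job objects the event plug-in is being loaded for.
    listener = MyEvent()
    # Stashing them on the listener is just one possible approach (hypothetical attribute).
    listener.LoadedJobs = jobs
    return listener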
After implementing a few functions, your MyEvent.py script file might look something like this:
from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return MyEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for MyEvent.
######################################################################
class MyEvent (DeadlineEventListener):
    def __init__( self ):
        # Set up the event callbacks here
        self.OnJobSubmittedCallback += self.OnJobSubmitted
        self.OnJobFinishedCallback += self.OnJobFinished

    def Cleanup( self ):
        del self.OnJobSubmittedCallback
        del self.OnJobFinishedCallback

    def OnJobSubmitted( self, job ):
        # TODO: Connect to pipeline site to notify it that a job has been submitted
        # for a particular shot or task.
        pass

    def OnJobFinished( self, job ):
        # TODO: Connect to pipeline site to notify it that the job for a particular
        # shot or task is complete.
        pass
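To tie this back to the custom settings in MyEvent.dlinit, the callbacks can read them with the event listener's config functions. This is a hedged sketch: it assumes GetConfigEntry()/GetConfigEntryWithDefault() are available on DeadlineEventListener (check the Scripting reference), and the actual pipeline call is left as a placeholder.

def OnJobSubmitted( self, job ):
    # These keys match the entries defined in MyEvent.dlinit.
    url = self.GetConfigEntry( "PipelineURL" )
    user = self.GetConfigEntryWithDefault( "PipelineUserName", "" )
    self.LogInfo( "Notifying " + url + " as " + user + " for job " + job.JobName )
    # TODO: make the actual HTTP call to the pipeline site here.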

The param File - Optional


The MyEvent.param file is an optional file that is used by the Event Configuration dialog in the Monitor. It declares
properties that the Monitor uses to generate a user interface for modifying custom settings in the MyEvent.dlinit file.
After you've created this file, open the Monitor and enter Super User mode. Then select Tools -> Configure Events
and look for your event plug-in in the list on the left.

The file might look something like:

[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this event plug-in should respond to events.
[PipelineURL]
Type=string
Label=Pipeline URL
Default=http://[myserver]/pipeline
Description=The URL for our pipeline website.
[PipelineUserName]
Type=string
Label=Pipeline User Name
Default=
Description=The user name for our pipeline website.
[PipelinePassword]
Type=string
Label=Pipeline Password
Default=
Description=The password for our pipeline website.

Comment lines are supported in the param file, and must start with either ; or #. For example:
# This is a comment about this PipelineURL property.
[PipelineURL]
Type=string
Label=Pipeline URL
Default=http://[myserver]/pipeline
Description=The URL for our pipeline website.

You'll notice that the property names between the square brackets match the custom keys we defined in our
MyEvent.dlinit file. This means that these controls will change the corresponding settings. The available key=value
pairs for the properties defined here are:

Key Name - Description
Category - The category the control should go under.
CategoryIndex - This determines the control's order under its category. This does the same thing as Index.
CategoryOrder - This determines the category's order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
Default - The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
DefaultValue - The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
Description - A short description of the property the control is for (displayed as a tooltip in the UI).
DisableIfBlank - If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
IgnoreIfBlank - If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
Index - This determines the control's order under its category. This does the same thing as CategoryIndex.
Label - The control label.
Required - If True, a control will be shown for this property even if it's not defined in the dlinit file (True/False).
Type - The type of control (see table below).

These are the available controls.

Control Type - Description
Boolean - A drop-down control that allows the selection of True or False.
Color - Allows the selection of a color.
Enum - A drop-down control that allows the selection of an item from a list.
Enumeration - Same as Enum above.
Filename - Allows the selection of an existing file.
FilenameSave - Allows the selection of a new or existing file.
Float - A floating point spinner control.
Folder - Allows the selection of an existing folder.
Integer - An integer spinner control.
Label - A read-only text field.
MultiFilename - Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
MultiLineMultiFilename - Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
MultiLineMultiFolder - Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
MultiLineString - A text field with multiple lines.
Password - A text field that masks the text.
SlaveList - Allows the selection of existing Slaves, which are then separated by commas in the text field.
String - A text field.

There are also key/value pairs for specific controls:

Key Name - Description
DecimalPlaces - The number of decimal places for the Float controls.
Filter - The filter string for the Filename, FilenameSave, or MultiFilename controls.
Increment - The value to increment the Integer or Float controls by.
Items - The semicolon separated list of items for the Enum control. This does the same thing as Values.
Maximum - The maximum value for the Integer or Float controls.
Minimum - The minimum value for the Integer or Float controls.
Validator - A regular expression for the String control that is used to ensure the value is valid.
Values - The semicolon separated list of items for the Enum control. This does the same thing as Items.

7.3.4 Event Plug-in and Error Reports


Logs and reports can be stored with the job or the slave, depending on the event type.
Job Event Reports
Event types that start with OnJob... will save reports with the corresponding job.
When an event plug-in that uses the LogInfo or LogWarning functions finishes executing, its log will be stored with
the job's other render logs, which you can view in the Monitor by right-clicking on the job and selecting View Job
Reports.
When an error occurs in an event plug-in, an error report will also be stored with the job's other render errors, which
you can view in the Monitor by right-clicking on the job and selecting View Job Reports.
Slave Event Reports
Event types that start with OnSlave... will save reports with the corresponding slave.
When an event plug-in that uses the LogInfo or LogWarning functions finishes executing, its log will be stored with
the slave's other render logs, which you can view in the Monitor by right-clicking on the slave and selecting View
Slave Reports.
When an error occurs in an event plug-in, an error report will also be stored with the slave's other render errors, which
you can view in the Monitor by right-clicking on the slave and selecting View Slave Reports.
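For example, a job event callback could write to the job report like this (a minimal sketch; the message text is arbitrary):

def OnJobFinished( self, job ):
    # Because this is an OnJob... event, these lines end up in the job's
    # reports (right-click the job and select View Job Reports in the Monitor).
    self.LogInfo( "Post-job hook ran for: " + job.JobName )
    self.LogWarning( "This warning is stored in the job report as well." )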

7.3.5 Quicktime Generation Example


An event plug-in can be used to automatically submit a Quicktime job to create a movie from the rendered images of
a job that just finished. An example of an event plug-in like this can be downloaded from the Miscellaneous Deadline
Downloads Page. To install the event plug-in, just unzip the downloaded file to your Repository's custom/events folder.
Configuration Files
The QuicktimeGen.dlinit and QuicktimeGen.param files define a couple of settings that can be configured from the
Monitor. Here you can specify a path to the Quicktime settings XML file you want to use. This settings file can be
generated from the Submit Quicktime Job To Deadline submitter in the Monitor.
The QuicktimeGen.dlinit file:
Enabled=True
QTSettings=\\ws-wpg-026\share\quicktime_export_settings.xml

The QuicktimeGen.param file:

[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this event plug-in should respond to events.
[QTSettings]
Type=filename
Label=QT Settings XML File
Default=
Description=The QT settings xml file.

7.3.6 Cron / Scheduled Event

A regular, time-interval-based event plug-in can be implemented by listening for Deadline's House Cleaning event.
This is ideal for executing a Deadline event plug-in at a regular time interval, when the Deadline
database is as up to date as possible. The time interval of the House Cleaning operation is controlled in the repository
options.
Deadline provides the possibility of integration with IT monitoring systems such as Zabbix, Zenoss, Nagios, OpenNMS,
SolarWinds, or indeed any other monitoring software, via the house cleaning event callback. As an example, this event
could be used to regularly inject Deadline job, slave, pulse, and balancer statistics or info/settings into
another database, thereby providing integration and consistency between separate information systems in different
departments in a company.
Building your own scheduled event script file might look something like this:
from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return ScheduledEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for ScheduledEvent.
######################################################################
class ScheduledEvent (DeadlineEventListener):
    def __init__( self ):
        # Set up the event callbacks here
        self.OnHouseCleaningCallback += self.OnHouseCleaning

    def Cleanup( self ):
        del self.OnHouseCleaningCallback

    def OnHouseCleaning( self ):
        # TODO: Execute generic pipeline duties here such as
        # reporting to an external studio database or injecting
        # Deadline Farm Stats into Zabbix, Zenoss, Nagios for IT
        pass

7.3.7 Software Configuration Management Integration

Deadline provides the possibility of integration with Software Configuration Management (SCM) systems such as
CFEngine, Puppet, SaltStack, Chef, SCCM, or indeed any SCM software, via the slave event callbacks. Deadline ships
with Puppet and Salt Maintenance Jobs, which can be submitted to Deadline via their Monitor submission scripts, and
also with Puppet and Salt slave-centric event plug-ins.
Building your own SCM event plugin might look something like this:
from Deadline.Events import *
from Deadline.Scripting import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return SoftwareEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for SoftwareEvent.
######################################################################
class SoftwareEvent (DeadlineEventListener):
    def __init__( self ):
        # Set up the event callbacks here
        self.OnSlaveIdleCallback += self.OnSlaveIdle
        self.OnSlaveStartedCallback += self.OnSlaveStarted
        self.OnSlaveStartingJobCallback += self.OnSlaveStartingJob

    def Cleanup( self ):
        del self.OnSlaveIdleCallback
        del self.OnSlaveStartedCallback
        del self.OnSlaveStartingJobCallback

    # This is called when a slave becomes idle.
    def OnSlaveIdle(self, string):
        # If a slave is IDLE, then it is not processing,
        # which might be an optimal time to check for
        # system updates.
        self.SoftwareUpdate()

    # This is called when a slave is started.
    def OnSlaveStarted(self, string):
        # If a slave has just started on a rendernode,
        # this can typically be a reliable and safe time
        # to carry out config/software deployment.
        self.SoftwareUpdate()

    # This is called when a slave starts a job.
    def OnSlaveStartingJob(self, string, job):
        # You could query the returned job object when a
        # slave first starts a job. Correct version of
        # renderer installed?
        self.SoftwareUpdate()

    def SoftwareUpdate(self):
        ClientUtils.LogText("Preparing for Software Update")
        # TODO: Execute command here to query your in-house
        # software deployment tool (SCM) to see if any new
        # software/sys env variables are required to be updated.
        pass

7.3.8 Migrating Event Plug-ins from Deadline 5


Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward compatible with event plugin scripts written for Deadline 5. However, migrating your scripts over is relatively
straightforward, and this guide will walk you through the API changes so that you can update your scripts as necessary.
Global Functions
In Deadline 6, all global API functions were removed, and replaced with DeadlineEventListener member functions, or
with static utility functions. See the Migrating Scripts From Deadline 5 section in the Scripting Overview documentation for more information, including replacement functions.
Almost all event plugin-specific global functions are now DeadlineEventListener member functions. For example, the
global LogInfo( message ) function has been replaced with a member function for the DeadlineEventListener class,
which you created in your event python file. So instead of:
LogInfo( "this is a test message" )

You would use this code:


self.LogInfo( "this is a test message" )

The only functions that aren't DeadlineEventListener member functions are listed below, along with their replacement
utility functions.

Original Global Function -> Replacement Function
CheckPathMapping( path ) -> RepositoryUtils.CheckPathMapping( path )
CheckPathMappingInFile( inFileName, outFileName ) -> RepositoryUtils.CheckPathMappingInFile( inFileName, outFileName )
CheckPathMappingInFileAndReplaceSeparator( inFileName, outFileName, separatorToReplace, newSeparator ) -> RepositoryUtils.CheckPathMappingInFileAndReplaceSeparator( inFileName, outFileName, separatorToReplace, newSeparator )
PathMappingRequired( path ) -> RepositoryUtils.PathMappingRequired( path )

Callbacks
You need to set up callbacks in the constructor of your DeadlineEventListener class that you created in your event
python file. Examples are shown in the documentation above, and you can look at the event plug-ins that ship with
Deadline for references as well. For example:
def __init__( self ):
    self.OnJobSubmittedCallback += self.OnJobSubmitted
    self.OnJobStartedCallback += self.OnJobStarted
    self.OnJobFinishedCallback += self.OnJobFinished
    self.OnJobRequeuedCallback += self.OnJobRequeued
    self.OnJobFailedCallback += self.OnJobFailed

Note that these callbacks need to be manually cleaned up when the event plug-in is no longer in use. See the documentation regarding the CleanupDeadlineEventListener function above for more information.
Deprecated Mode
As mentioned above, you can set the DeprecatedMode property in your dlinit file to True. This mode allows
Python.NET event plug-ins written for Deadline 5.1 or 5.2 to work with Deadline 6 and later, which can make the
transition to Deadline 6 easier if you have custom event plug-ins.
Note that when DeprecatedMode is enabled, all global functions will still be available, so if you have custom
Python.NET event plug-ins, you just need to drop them in the custom/events folder in the Repository, and add
DeprecatedMode=True to your dlinit file.
If you have custom IronPython event plug-ins from Deadline 5.2 or earlier, they will not work with Deadline 6 and
later.

7.4 Cloud Plugins


7.4.1 Overview
Cloud plug-ins can be created to allow Deadline to communicate with different cloud providers. All of Deadline's
cloud plug-ins are written in Python, which means that it's easy to create your own plug-ins or customize the existing
ones. You can also refer to the plug-ins in the Repository's cloud folder for examples of how they work. See the Scripting
Overview documentation for more information, and links to the Deadline Scripting reference.
Note that because the Python scripts for cloud plug-ins will be executed in a non-interactive way, it is important that
your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input.
When a cloud script is executed, the log will show where the script is being loaded from.

7.4.2 Creating a Cloud Plug-in


To create a custom cloud plug-in, you start by creating a folder in the Repository's custom\cloud folder and give it the
name of your cloud plug-in. See the Scripting Overview documentation for more information on the custom folder
in the Repository and how it's used.
For the sake of this document, we will call our new cloud plug-in MyCloud. All relevant script and configuration files
for this cloud plug-in are to be placed in this folder.
The py File
The first required file is MyCloud.py, which is the main cloud plug-in script file. It defines the main CloudPluginWrapper class that contains the necessary callbacks that will respond to specific commands. The template for this
script file might look like this:
from Deadline.Cloud import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main CloudPluginWrapper class.
######################################################################
def GetCloudPluginWrapper():
    return MyCloud()

######################################################################
## This is the function that Deadline calls when the cloud plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupCloudPlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main CloudPluginWrapper class for MyCloud.
######################################################################
class MyCloud (CloudPluginWrapper):
    # TODO: Place code here instead of "pass"
    pass

The GetCloudPluginWrapper() function is important, as it allows Deadline to get an instance of our MyCloud class
(which extends the abstract CloudPluginWrapper class). If this function isn't defined, Deadline will report an
error when it tries to load the cloud plug-in. Notice that we're importing the Deadline.Cloud namespace so that we
can access the CloudPluginWrapper class.
The MyCloud class will need to implement certain callbacks so that Deadline can get information from the cloud
provider, and these callbacks must be hooked up in the MyCloud constructor. For a list of all available callbacks, refer
to the CloudPluginWrapper class in the Deadline Scripting reference.
After implementing a few functions, your MyCloud.py script file might look something like this:
from Deadline.Cloud import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main CloudPluginWrapper class.
######################################################################
def GetCloudPluginWrapper():
    return MyCloud()

######################################################################
## This is the function that Deadline calls when the cloud plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupCloudPlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main CloudPluginWrapper class for MyCloud.
######################################################################
class MyCloud (CloudPluginWrapper):
    def __init__( self ):
        # Set up our callbacks for cloud control
        self.VerifyAccessCallback += self.VerifyAccess
        self.AvailableHardwareTypesCallback += self.GetAvailableHardwareTypes
        self.AvailableOSImagesCallback += self.GetAvailableOSImages
        self.CreateInstancesCallback += self.CreateInstances
        self.TerminateInstancesCallback += self.TerminateInstances
        self.CloneInstanceCallback += self.CloneInstance
        self.GetActiveInstancesCallback += self.GetActiveInstances
        self.StopInstancesCallback += self.StopInstances
        self.StartInstancesCallback += self.StartInstances
        self.RebootInstancesCallback += self.RebootInstances

    def Cleanup( self ):
        # Clean up our callbacks for cloud control
        del self.VerifyAccessCallback
        del self.AvailableHardwareTypesCallback
        del self.AvailableOSImagesCallback
        del self.CreateInstancesCallback
        del self.TerminateInstancesCallback
        del self.CloneInstanceCallback
        del self.GetActiveInstancesCallback
        del self.StopInstancesCallback
        del self.StartInstancesCallback
        del self.RebootInstancesCallback

    def VerifyAccess( self ):
        # TODO: Return True if connection to cloud provider can be verified.
        pass

    def GetAvailableHardwareTypes( self ):
        # TODO: Return list of HardwareType objects representing the hardware
        # types supported by this provider.
        # Must be implemented for the Balancer to work.
        pass

    def GetAvailableOSImages( self ):
        # TODO: Return list of OSImage objects representing the OS images
        # supported by this provider.
        # Must be implemented for the Balancer to work.
        pass

    def GetActiveInstances( self ):
        # TODO: Return list of CloudInstance objects that are currently active.
        pass

    def CreateInstances( self, hardwareID, imageID, count ):
        # TODO: Start instances and return list of CloudInstance objects that
        # have been started.
        # Must be implemented for the Balancer to work.
        pass

    def CloneInstance( self, instance, count ):
        # TODO: Return list of CloudInstance objects cloned from the given
        # instance (see the CloudPluginWrapper reference for details).
        pass

    def TerminateInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # terminated successfully.
        # Must be implemented for the Balancer to work.
        pass

    def StopInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # stopped successfully.
        pass

    def StartInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # started successfully.
        pass

    def RebootInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # rebooted successfully.
        pass

The param File

The MyCloud.param file is an optional file that is used by the Cloud Provider Configuration dialog in the Monitor. It
declares properties that the Monitor uses to generate a user interface for modifying settings for this provider, which are
then stored in the database. After you've created this file, open the Monitor and enter Super User mode. Then select
Tools -> Configure Cloud Providers and click the Add button under the Cloud Region box to see your cloud plug-in.

The file might look something like:


[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this cloud plug-in should be enabled.
[AccessID]
Type=string
Category=Options
CategoryOrder=0
Index=1
Label=Access ID
Default=
Description=Your Cloud Provider Access ID.
[SecretKey]
Type=password
Category=Options
CategoryOrder=0
Index=2
Label=Secret Key
Default=
Description=Your Cloud Provider Secret Key.

Comment lines are supported in the param file, and must start with either ; or #. For example:
# This is a comment about this Enabled property.
[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this cloud plug-in should be enabled.

The available key=value pairs for the properties defined here are:

Category: The category the control should go under.
CategoryIndex: This determines the control's order under its category. This does the same thing as Index.
CategoryOrder: This determines the category's order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
Description: A short description of the property the control is for (displayed as a tooltip in the UI).
DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
Index: This determines the control's order under its category. This does the same thing as CategoryIndex.
Label: The control label.
Required: If True, a control will be shown for this property even if it's not defined in the dlinit file (True/False).
Type: The type of control (see table below).

These are the available controls.


Boolean: A drop-down control that allows the selection of True or False.
Color: Allows the selection of a color.
Enum: A drop-down control that allows the selection of an item from a list.
Enumeration: Same as Enum above.
Filename: Allows the selection of an existing file.
FilenameSave: Allows the selection of a new or existing file.
Float: A floating point spinner control.
Folder: Allows the selection of an existing folder.
Integer: An integer spinner control.
Label: A read-only text field.
MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
MultiLineString: A text field with multiple lines.
Password: A text field that masks the text.
SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
String: A text field.

There are also key/value pairs for specific controls:

DecimalPlaces: The number of decimal places for the Float controls.
Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
Increment: The value to increment the Integer or Float controls by.
Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
Maximum: The maximum value for the Integer or Float controls.
Minimum: The minimum value for the Integer or Float controls.
Validator: A regular expression for the String control that is used to ensure the value is valid.
Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.
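
For example, a hypothetical Region property using the Enum control and a hypothetical InstanceLimit property using the
Integer control might be declared as shown below. The property names, values, and descriptions are illustrative only and
are not part of any shipping plugin:

[Region]
Type=enum
Values=us-east;us-west;eu-west
Category=Options
CategoryOrder=0
Index=3
Label=Region
Default=us-east
Description=The cloud region to launch instances in.

[InstanceLimit]
Type=integer
Minimum=0
Maximum=100
Increment=1
Category=Options
CategoryOrder=0
Index=4
Label=Instance Limit
Default=10
Description=The maximum number of instances to run at once.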

7.5 Balancer Plugins


7.5.1 Overview
Balancer plugins can be created to customize the algorithm logic for the Balancer application. Balancer plugins are
written in Python, which means that they can easily be created and customized. You can also refer to the default plugin
in the Repository's balancer folder for a full example of how it works. See the Scripting Overview documentation for
more information, and links to the Deadline Scripting reference.

7.5.2 Creating a Balancer Plug-in


To create a custom balancer plug-in, you start by creating a folder in the Repository's custom\balancer folder and give
it the name of your balancer plug-in. See the Scripting Overview documentation for more information on the custom
folder in the Repository and how it's used.
For the sake of this document, we will call our new balancer plug-in MyBalancerAlgorithm. All relevant script and
configuration files for this balancer plug-in are to be placed in this folder.
The py File
The first required file is MyBalancerAlgorithm.py, which is the main balancer plugin script. It defines the
BalancerPluginWrapper class that contains all the necessary callbacks that will be used during a balancer cycle. The
template for this script file might look like this:
from Deadline.Balancer import *

###########################################################################
## This is the function that Deadline calls to get an instance of the
## main BalancerPluginWrapper class.
###########################################################################
def GetBalancerPluginWrapper():
    return MyBalancerAlgorithm()

###########################################################################
## This is the main DeadlineBalancerListener class for MyBalancerAlgorithm.
###########################################################################
class MyBalancerAlgorithm (BalancerPluginWrapper):
    # TODO: Place code here instead of "pass"
    pass

The GetBalancerPluginWrapper() function is important, as it allows Deadline to get an instance of our
MyBalancerAlgorithm class (which extends the abstract BalancerPluginWrapper class). If this function isn't defined,
Deadline will report an error when it tries to load the balancer plug-in. Notice that we're importing the Deadline.Balancer
namespace so that we can access the BalancerPluginWrapper class.
The MyBalancerAlgorithm class will need to implement the BalancerAlgorithm callback so that Deadline knows how
to balance your farm, and this callback must be hooked up in the MyBalancerAlgorithm constructor.
After implementing a few functions, your MyBalancerAlgorithm.py script file might look something like this:
from Deadline.Balancer import *

###########################################################################
## This is the function that Deadline calls to get an instance of the
## main BalancerPluginWrapper class.
###########################################################################
def GetBalancerPluginWrapper():
    return MyBalancerAlgorithm()

###########################################################################
## This is the main DeadlineBalancerListener class for MyBalancerAlgorithm.
###########################################################################
class MyBalancerAlgorithm (BalancerPluginWrapper):
    def __init__( self ):
        self.BalancerAlgorithmCallback += self.BalancerAlgorithm

    def BalancerAlgorithm( self, stateStruct ):
        #TODO: Return a target struct to the Balancer.
        pass

Here's what a BalancerTargetStruct looks like:


/// <summary>
/// The BalancerTargetStruct indicates the ideal number of VM instances that should
/// be running in each enabled Group of each CloudRegion. The BalancerTargetStruct
/// is populated by a Balancer Logic Plug-in.
/// </summary>
public class BalancerTargetStruct
{
    public BalancerTargetStruct() { }

    // Logic plug-in can set this to true to indicate that an error occurred.
    public bool ErrorEncountered;

    // Logic plugin can convey an error message here
    // (ErrorEncountered should be set to true).
    public string ErrorMessage;

    // Logic plugin can convey a non-error message here.
    public string Message;

    // An array of cloud region targets.
    public CloudRegionTargetStruct[] CloudRegionTargets;

    // The time the structure was filled.
    public DateTime Time;
}

public class CloudRegionTargetStruct
{
    public CloudRegionTargetStruct() { }

    // The unique ID of the region.
    public string RegionID;

    // An array of Group targets.
    public GroupTargetStruct[] GroupTargets;
}

public class GroupTargetStruct
{
    public GroupTargetStruct() { }

    public GroupTargetStruct(string Name, int Count)
    {
        this.Name = Name;
        this.Count = Count;
    }

    // The name of the group.
    public string Name;

    // The target number of VM instances for the group.
    public int Count;
}
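
As a rough sketch only, a BalancerAlgorithm callback that always requests a fixed number of instances for a single group
might build its return value as follows. The region ID and group name are hypothetical, a real plug-in would derive its
targets from the stateStruct it receives, and it is assumed here that the target structs are exposed through the
Deadline.Balancer namespace already imported by the plug-in and that Python lists convert to .NET arrays via the
standard Python for .NET Array syntax.

from System import Array, DateTime

# This method belongs inside the MyBalancerAlgorithm class shown above.
def BalancerAlgorithm( self, stateStruct ):
    # Request 4 instances in the "cloud" group of a single hypothetical region.
    groupTarget = GroupTargetStruct( "cloud", 4 )

    regionTarget = CloudRegionTargetStruct()
    regionTarget.RegionID = "my-cloud-region"   # hypothetical region ID
    regionTarget.GroupTargets = Array[GroupTargetStruct]( [ groupTarget ] )

    target = BalancerTargetStruct()
    target.CloudRegionTargets = Array[CloudRegionTargetStruct]( [ regionTarget ] )
    target.Time = DateTime.Now
    target.ErrorEncountered = False
    target.Message = "Requesting 4 instances in group 'cloud'."

    return target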

The param File


The MyBalancerAlgorithm.param file is an optional file that is used in the Balancer Settings panel of the Repository
Options dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying
settings for this algorithm, which are then stored in the database. After you've created this file, open the Monitor
and enter Super User mode. Then select Tools -> Repository Options -> Balancer Settings and select
MyBalancerAlgorithm from the dropdown to see your settings. Comment lines are supported in the param file, and must
start with either ; or #.
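
For example, assuming the same property format as the cloud plugin param file shown earlier, a param file exposing a
single hypothetical setting for this algorithm (the property name is illustrative only) might look like:

[TargetInstanceCount]
Type=integer
Minimum=0
Maximum=100
Label=Target Instance Count
Default=10
Description=The number of cloud instances this algorithm should aim for.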
The dlinit File
The last required file is MyBalancerAlgorithm.dlinit, which is the main configuration file for this plugin. It is a plain
text file that defines a few general key=value plug-in properties, which include:
About: A short description of the plug-in.
ConcurrentTasks: Set to True or False (default is False). If tasks for this plug-in can render concurrently without interfering with each other, this can be set to True.
DebugLogging: Set to True or False (default is False). If set to True, then debug plug-in logging will be printed out during rendering.
DeprecatedMode: Set to True or False (default is False). Only set to True if you want a custom Python.NET plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be found later on.

It can also define key=value custom settings to be used by the plug-in. For this example, our MyBalancerAlgorithm.dlinit file might look like this:
About=My Example Plugin for Deadline
SomeSortOfScript=c:\path\to\my\script.py

Comment lines are supported in the dlinit file, and must start with either ; or #.

7.6 Monitor Scripts


7.6.1 Overview
There are several different types of Monitor scripts available. While the large majority of the ones shipping with
Deadline are Submission Scripts used to submit new Jobs to the farm, the Monitor has the capability of running utility
scripts in the context of specific Jobs, Tasks, Slaves, Limits, or even Reports.
Below, we go into more detail for each of the different types of Scripts, and how to create your own.

7.6.2 Scripting Reference


As with all other Deadline scripts, Monitor scripts use Python 2.7, which is supported using Python for .NET. This
means that in addition to typical cPython modules, Python for .NET allows your scripts to make use of .NET Libraries,
and Deadline's own internal functions.
The full Deadline Scripting Reference can be downloaded in CHM or PDF format from the Deadline Downloads page.
Particular functions of note relevant to Monitor Scripting can be found in the aforementioned Scripting Reference,
under the following sections:
Deadline.Scripting.MonitorUtils
Deadline.Scripting.JobUtils
Deadline.Scripting.SlaveUtils
It can also be very helpful when developing your own Monitor Script to take a look at how our built-in Monitor Scripts
of that type are structured.

7.6.3 General Script Template


We follow a fairly specific template when making any new built-in Monitor scripts. The template is loosely as follows:
Define your __main__ function: This is the function that Deadline will call when invoking your script. This is
mandatory, and your script will generate an error if it isn't done.
def __main__( *args ):
    #Replace "pass"
    pass

Build the submission UI: Typically done in the __main__ function by creating a ScriptDialog object, and
adding controls to it. Each control's name must be unique, so that each control can be identified properly. You
can also set the dialog's size (if not using a grid layout), the row and column (if using a grid layout), title, and a
few other settings. For more details, see the ScriptDialog and ScriptControl sections of the Reference Manual.
For an example on how to use the grid layout, see the Grid Layout Example Script documentation.

Define and Load Sticky Settings: Sticky settings are settings that persist after the dialog has been closed.
They are defined by creating a string array that contains the names of the controls for which you want the
settings to persist. After defining them, you can load them by calling the LoadSettings function of your
ScriptDialog (see the sketch below).
Show the Dialog: The last thing you should do in your __main__ function is to show your ScriptDialog,
by using its ShowDialog function.
Define Your Functions: Specify any functions that may be used by your script. These could just be helper
functions, or event handlers that do stuff when UI values are modified.
Note that you don't necessarily need to follow this template, but the closer you stick to it, the more examples you'll
have to draw on.
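
A minimal sticky settings and ShowDialog sequence might look like the sketch below. The control names and settings
file name are illustrative, and the ClientUtils.GetUsersSettingsDirectory() call and two-argument LoadSettings signature
follow the pattern used by the submission scripts that ship with Deadline; verify both against the Scripting Reference.

from System.IO import Path
from Deadline.Scripting import ClientUtils
from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog

def __main__( *args ):
    scriptDialog = DeadlineScriptDialog()
    # ... add controls (for example "PoolBox" and "PriorityBox") to scriptDialog here ...

    # Controls whose values should persist between runs of the script.
    stickySettings = ( "PoolBox", "PriorityBox" )
    settingsFile = Path.Combine( ClientUtils.GetUsersSettingsDirectory(), "MySubmissionSettings.ini" )
    scriptDialog.LoadSettings( settingsFile, stickySettings )

    # Showing the dialog should be the last step in __main__.
    scriptDialog.ShowDialog( True )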

7.6.4 Monitor Scripts


There are many different types of scripts you can write for the Monitor, which are listed below. It is recommended that
these scripts be created in the custom folder in the Repository to avoid issues when upgrading your Repository in the
future. See the Scripting Overview documentation for more information on the custom folder in the Repository and
how it's used.
When a Monitor script is executed, the log will show where the script is being loaded from.
Submission Scripts
Submission Scripts are used to create custom Submission dialogs, and ultimately submit new Jobs to Deadline. They
are located in the Submit menu of the Monitor's main menu bar, as well as the Submit menu in the Launcher.
Creating your own custom Submission dialog is quite simple, and the process is described below.
To create new submission scripts, simply navigate to the custom\scripts\Submission folder in your Repository. Then,
create a new Python file named MySubmissionScript.py, where MySubmissionScript is the name of your new
script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
General Scripts
General scripts are used to perform any sort of custom action by selecting them from the Monitor's (or Launcher's)
Scripts menu. Under the hood, there technically isn't anything different between General and Submission scripts.
The only real difference is that they show up under different menus, which is just to help keep scripts semantically
separated.
To create new General scripts, simply navigate to the custom\scripts\General folder in your Repository. Then, create
a new Python file named MyGeneralScript.py, where MyGeneralScript is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Job Scripts
Job Scripts are typically used to modify or to perform actions on a selected Job in the Monitor. They can be accessed
by right-clicking an existing Job in the Job Panel, under the Scripts sub-menu.
To create new Job scripts, simply navigate to the custom\scripts\Jobs folder in your Repository. Then, create a new
Python file named MyJobScript.py, where MyJobScript is the name of your new script.

Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Task Scripts
Task Scripts are typically used to modify or to perform actions on a selected Task in the Monitor. They can be accessed
by right-clicking an existing Task in the Task Panel, under the Scripts sub-menu.
To create new Task scripts, simply navigate to the custom\scripts\Tasks folder in your Repository. Then, create a
new Python file named MyTaskScript.py, where MyTaskScript is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Slave Scripts
Slave Scripts are typically used to modify or to perform actions on a selected Slave in the Monitor. They can be
accessed by right-clicking an existing Slave in the Slave Panel, under the Scripts sub-menu.
To create new Slave scripts, simply navigate to the custom\scripts\Slaves folder in your Repository. Then, create a
new Python file named MySlaveScript.py, where MySlaveScript is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Pulse Scripts
Pulse Scripts are typically used to modify or to perform actions on a selected Pulse in the Monitor. They can be
accessed by right-clicking an existing Pulse in the Pulse Panel, under the Scripts sub-menu.
To create new Pulse scripts, simply navigate to the custom\scripts\Pulse folder in your Repository. Then, create a
new Python file named MyPulseScript.py, where MyPulseScript is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Balancer Scripts
Balancer Scripts are typically used to modify or to perform actions on a selected Balancer in the Monitor. They can
be accessed by right-clicking an existing Balancer in the Balancer Panel, under the Scripts sub-menu.
To create new Balancer scripts, simply navigate to the custom\scripts\Balancer folder in your Repository. Then,
create a new Python file named MyBalancerScript.py, where MyBalancerScript is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Limit Scripts
Limit Scripts are typically used to modify or to perform actions on selected Limits in the Monitor. They can be
accessed by right-clicking an existing Limit in the Pulse Panel, under the Scripts sub-menu.
To create new Limit scripts, simply navigate to the custom\scripts\Limits folder in your Repository. Then, create a
new Python file named MyLimitScript.py, where MyLimitScript is the name of your new script.


Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Job Report Scripts
Job Report Scripts are typically used to modify or to perform actions on selected Job Reports in the Monitor. They
can be accessed by right-clicking an existing Job Report in the Job Report Panel, under the Scripts sub-menu.
To create new Job Report scripts, simply navigate to the custom\scripts\JobReports folder in your Repository. Then,
create a new Python file named MyJobReportScript.py, where MyJobReportScript is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Slave Report Scripts
Slave Report Scripts are typically used to modify or to perform actions on selected Slave Reports in the Monitor. They
can be accessed by right-clicking an existing Slave Report in the Slave Report Panel, under the Scripts sub-menu.
To create new Slave Report scripts, simply navigate to the custom\scripts\SlaveReports folder in your Repository.
Then, create a new Python file named MySlaveReportScript.py, where MySlaveReportScript is the name of your
new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.

7.6.5 Customizing Script Display


As with any built-in script, once you've created your new Monitor Script you can change its Display Name, Keyboard
Shortcut, Icon, and its position within the menu in the Repository Configuration.
You can also control who can see (and use) your Submission Script by tweaking its access level in User
Management. It is probably a good idea to disable access to it for most users until you have your new script in
working order.

7.6.6 Grid Layout Example Script


Grid layouts allow your script dialog to dynamically resize its contents to fit the size of the dialog. Below are some
examples of how to use the new grid layout to build a script dialog.
First, you must create a ScriptDialog object and start a grid. Once all controls have been added, you must end the grid:
dg = DeadlineScriptDialog()
dg.AddGrid()
#...
#Added controls go here
#...
dg.EndGrid()

Once you start a grid you can add controls to it by row and column. There is no need to specify how many rows or
columns you want the grid to have, just specify the row and column where you want the control to be and the grid will
grow to accommodate. Here is an example of adding a label and a text field to the dialog in the same row.


dg.AddGrid()
dg.AddControlToGrid("Label1", "LabelControl", "I'm a label.", 0,0, "A tooltip", False)
dg.AddControlToGrid( "TextBox1", "TextControl", "", 0, 1 )
dg.EndGrid()

Here is an example of what this dialog would look like:

It is not possible to specify the size of the controls you want to add to the grid; however, it is also not necessary to do
so. The contents of the grid(s) will automatically adjust themselves to share the size of the dialog. If you want certain
elements to not grow within a row, you can set the expand property to be disabled. If you want a control to take more
space, you can set the control to span multiple rows or columns using rowSpan and colSpan, respectively. By default,
controls have expand set and have their colSpan and rowSpan properties set to 1.
This is an example of a dialog with two rows and four columns. The first row contains a label in the first column that
is set to not grow any bigger than it needs to, and a text control that spans the next 3 columns and is allowed to grow.
The second row contains three labels that are not allowed to grow in the first three columns and a text control in the
fourth column that can grow as needed.
dg.AddGrid()
dg.AddControlToGrid( "L1", "LabelControl", "I'm a label.", 0, 0, "A tooltip", expand=False )
dg.AddControlToGrid( "TextBox1", "TextControl", "", 0, 1, colSpan=3 )
dg.AddControlToGrid( "L2", "LabelControl", "I'm another label.", 1, 0, "A tooltip", expand=False )
dg.AddControlToGrid( "L3", "LabelControl", "I'm another label.", 1, 1, "A tooltip", expand=False )
dg.AddControlToGrid( "L4", "LabelControl", "I'm another label.", 1, 2, "A tooltip", expand=False )
dg.AddControlToGrid( "TextBox2", "TextControl", "", 1, 3 )
dg.EndGrid()

Here is an example of what this dialog would look like:

When you expand the dialog horizontally, only the text controls will grow in the above example. Nothing will grow,
other than the dialog itself, when expanding vertically. Note that if you set all controls in a row to not expand, the
cells in the grid that the controls are in will expand without allowing any of the controls to expand with them.
This will result in the dialog losing its layout when it is expanded.


Here is an example of what this dialog would look like expanded horizontally:

Here is an example of what this dialog would look like expanded vertically:

Here is an example of what the dialog would look like expanded horizontally if all controls had expand=False set.

If you want to space controls out in the grid you can use labels filled with white space, or you can use horizontal
spacers. Here is an example of adding two buttons to a dialog and keeping them to the far right of the dialog.
dg.AddGrid()
dg.AddHorizontalSpacerToGrid( "DummyLabel", 0, 0 )
ok = dg.AddControlToGrid( "Ok", "ButtonControl", "OK", 0, 1, expand=False )
ok.ValueModified.connect( OkButtonPressed )
cancel = dg.AddControlToGrid( "Cancel", "ButtonControl", "Cancel", 0, 2, expand=False )
cancel.ValueModified.connect( CancelButtonPressed )
dg.EndGrid()

Here is an example of what this dialog will look like when expanded horizontally:


All together, here is an example of a basic script dialog using grid layouts.
from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog

########################################################################
## Globals
########################################################################
dg = None

########################################################################
## Main Function Called By Deadline
########################################################################
def __main__( *args ):
    global dg

    dg = DeadlineScriptDialog()
    dg.SetTitle( "Example Deadline Script" )

    dg.AddGrid()
    dg.AddControlToGrid( "L1", "LabelControl", "I'm a label.", 0, 0, "A tooltip", expand=False )
    dg.AddControlToGrid( "TextBox1", "TextControl", "", 0, 1, colSpan=3 )
    dg.AddControlToGrid( "L2", "LabelControl", "I'm another label.", 1, 0, "A tooltip", expand=False )
    dg.AddControlToGrid( "L3", "LabelControl", "I'm another label.", 1, 1, "A tooltip", expand=False )
    dg.AddControlToGrid( "L4", "LabelControl", "I'm another label.", 1, 2, "A tooltip", expand=False )
    dg.AddControlToGrid( "TextBox2", "TextControl", "", 1, 3 )
    dg.EndGrid()

    #Adds an OK and Cancel button to the dialog
    dg.AddGrid()
    dg.AddHorizontalSpacerToGrid( "DummyLabel", 0, 0 )
    ok = dg.AddControlToGrid( "Ok", "ButtonControl", "OK", 0, 1, expand=False )
    ok.ValueModified.connect( OkButtonPressed )
    cancel = dg.AddControlToGrid( "Cancel", "ButtonControl", "Cancel", 0, 2, expand=False )
    cancel.ValueModified.connect( CancelButtonPressed )
    dg.EndGrid()

    dg.ShowDialog( True )

def CloseDialog():
    global dg
    dg.CloseDialog()

def CancelButtonPressed():
    CloseDialog()

def OkButtonPressed( *args ):
    global dg
    dg.ShowMessageBox( "You pressed the OK button.", "Button Pressed" )

Here is what this dialog looks like:

7.6.7 Migrating Scripts from Deadline 5


Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and
this guide will walk you through the API changes so that you can update your scripts as necessary.
Global Functions
The globally defined functions are no longer available. See the Migrating Scripts From Deadline 5 section in the
Scripting Overview documentation for more information, including replacement functions.
User Interface
If you are creating a user interface using the ScriptDialog object, you can no longer get an instance of it from DeadlineScriptEngine using the following:
scriptDialog = DeadlineScriptEngine.GetScriptDialog()

Instead, you need to import the DeadlineScriptDialog class, and use its constructor to create an instance:
from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog
...
scriptDialog = DeadlineScriptDialog()

Another change is how the ValueModified event handlers are hooked up for the ScriptDialog controls. For example,
this is how the event was hooked up in Deadline 5:


compBox = scriptDialog.AddControl( "CompBox", "TextControl", "", dialogWidth-labelWidth-24, -1 )
compBox.ValueModified += CompChanged

Now, because the ScriptDialog object is a Qt object, you need to use the connect function to hook up events:
compBox = scriptDialog.AddControl( "CompBox", "TextControl", "", dialogWidth-labelWidth-24, -1 )
compBox.ValueModified.connect( CompChanged )

The File Browser based controls have also changed their file filter syntax. In Deadline 5, the file filter syntax looked
like this:
scriptDialog.AddRow()
scriptDialog.AddControl(
"FileLabel", "LabelControl", "Select File", labelWidth, -1)
scriptDialog.AddSelectionControl("FileBox", "FileBrowserControl", "",
"All Files (*.*)|*.*|CAD Files: JT (*.jt)|*.jt", dialogWidth-labelWidth-24, -1)
scriptDialog.EndRow()

Now, because the ScriptDialog object is a Qt object, you need to use the following syntax to filter files in any of the
browser controls. Note the replacement of the | character with ;;, and that there is no longer a requirement to provide
a file extension filter per file format entry, as the filter is taken from the text label, e.g. (*.txt) or (*.*), as per the
example below:
scriptDialog.AddRow()
scriptDialog.AddControl(
"FileLabel", "LabelControl", "Select File", labelWidth, -1)
scriptDialog.AddSelectionControl( "FileBox", "FileBrowserControl", "",
"Text Files (*.txt);;All Files (*.*)", dialogWidth-labelWidth-24, -1)
scriptDialog.EndRow()

7.7 Job Scripts


7.7.1 Overview
Job scripts and Dependency scripts can use Python to implement additional automation. Job scripts can be used to
perform additional tasks during rendering, and Dependency scripts can control when jobs start rendering.
Note that because the Python scripts will be executed in a non-interactive way, it is important that your scripts do not
contain any blocking operations like infinite loops, or interfaces that require user input. See the Scripting Overview
documentation for more information, and links to the Deadline Scripting reference.

7.7.2 Job Scripts


Job scripts can be assigned to Jobs in order to automate certain tasks before a Job starts rendering (Pre-Job Script),
after a Job finishes rendering (Post-Job Script), or before and after each individual Job Task is rendered (Pre-Task
and Post-Task Scripts).
After you create your scripts, you can assign them to a Job by right-clicking on the desired Job in the Monitor, and
selecting Modify Job Properties. The script options can be found under the Scripts section of the Job Properties
window. In addition to this, Job scripts can be specified by custom submitters by including them in the Job Info File
on submission. Note that a full path to the script is required, so it is recommended that the script file be stored in a
location that is accessible to all Slaves.
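
For example, assuming the standard script-related Job Info File keys (the key names should be verified against the Job
Submission documentation, and the paths below are illustrative only), the entries might look like:

PreJobScript=\\fileserver\scripts\pre_job.py
PostJobScript=\\fileserver\scripts\post_job.py
PreTaskScript=\\fileserver\scripts\pre_task.py
PostTaskScript=\\fileserver\scripts\post_task.py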
Creating Job Scripts
The only requirement for a Job script is that you define a __main__ function. This is the function that will be called
by Deadline when it comes time to execute the script, and an instance of the DeadlinePlugin object will be passed as
a parameter.
def __main__( *args ):
    #Replace "pass"
    pass

A common use for Post-Task scripts is to do some processing with the output image files. Here is a sample script that
demonstrates how to get the output file names for the current task, and print them out to the render log:
import re

from System.IO import *
from Deadline.Scripting import *

def __main__( *args ):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()

    outputDirectories = job.OutputDirectories
    outputFilenames = job.OutputFileNames
    paddingRegex = re.compile("[^\\?#]*([\\?#]+).*")

    for i in range( 0, len(outputDirectories) ):
        outputDirectory = outputDirectories[i]
        outputFilename = outputFilenames[i]

        startFrame = deadlinePlugin.GetStartFrame()
        endFrame = deadlinePlugin.GetEndFrame()
        for frameNum in range(startFrame, endFrame+1):
            outputPath = Path.Combine(outputDirectory,outputFilename)
            outputPath = outputPath.replace("//","/")

            m = re.match(paddingRegex,outputPath)
            if( m != None):
                padding = m.group(1)
                frame = StringUtils.ToZeroPaddedString(frameNum,len(padding),False)
                outputPath = outputPath.replace( padding, frame )

            deadlinePlugin.LogInfo( "Output file: " + outputPath )

7.7.3 Dependency Scripts


Dependency scripts can be used to control when a job starts rendering. For example, the script could connect to an
internal pipeline database to see if the job has been approved to start rendering.
After you create your dependency scripts, you can assign them to a Job by right-clicking on the desired Job in the
Monitor, and selecting Modify Job Properties. The Script Dependencies options can be found under the Scripts
section of the Job Properties window. In addition to this, dependency scripts can be specified by custom submitters
by including them in the Job Info File on submission. Note that a full path to the script is required, so it is
recommended that the script file be stored in a location that is accessible to all Slaves.
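
For example, assuming the ScriptDependencies Job Info File key (the key name should be verified against the Job
Submission documentation, and the path is illustrative only), the entry might look like:

ScriptDependencies=\\fileserver\scripts\check_approval.py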
Creating Dependency Scripts
The only requirement for a dependency script is that you define a __main__ function. This is the function that will be
called by Deadline when it comes time to execute the script to determine if a job should be released or not.
For jobs without Frame Dependencies enabled, only the job ID will be passed as a parameter. The __main__ function
should then return True if the job should be released or False if it shouldn't be.
For jobs with Frame Dependencies enabled, the job ID will be passed as the first parameter, and a list of pending task
IDs will be passed as the second parameter. The __main__ function should then return the list of task IDs that should
be released, or an empty list if none should be released.
Here is a very simple example that will work regardless of whether Frame Dependencies are enabled or not:
def __main__( jobId, taskIds=None ):
    if not taskIds:
        # Frame Dependencies are disabled
        releaseJob = False

        #figure out if job should be released

        return releaseJob
    else:
        # Frame Dependencies are enabled
        tasksToRelease = []

        #figure out which tasks should be released, and append their IDs to the array

        return tasksToRelease

By giving the taskIds parameter a default of None, it allows the script to function regardless of whether Frame Dependencies are enabled or not. You can check if taskIds is None, and if it is, you know that Frame Dependencies are
disabled.

7.7.4 Migrating Scripts from Deadline 5


Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and
this guide will walk you through the API changes so that you can update your scripts as necessary.
The only significant change is that the globally defined functions are no longer available. See the Migrating Scripts
From Deadline 5 section in the Scripting Overview documentation for more information, including replacement functions.

7.8 Web Service Scripts


7.8.1 Overview
Web service scripts allow you to retrieve data from Deadline and display it in any way you see fit. See the Web Service
documentation for more information on Deadline's web service feature and how you can use it to call scripts and commands.


7.8.2 Creating Web Service Scripts


Custom web service scripts can be created in the custom\scripts\WebService folder in your repository. See the
Scripting Overview documentation for more information on the custom folder in the Repository and how it's used.
Just place any new scripts directly into this folder, and they will be available to the Web Service. Script file names
should not contain any spaces, and should end in a .py extension (i.e., they must be Python scripts).
The __main__ Function
All web service scripts must define a __main__ function that accepts *args (a tuple containing 2 items). This is the
function that will be called when the web service executes the script. Note that if you decide not to accept args, and an
argument string is passed to your script in the URL, it will result in an exception being thrown. The function should
also return a string value, which is used to display the results. The string can be HTML, XML, plain text, etc.
def __main__( *args ):
    results = ""
    #...
    #append data to results
    #...
    return results

It is also possible for the web service script to set the HTTP status code. This can be done by including the status code
after the results in the return statement. For example:
def __main__( *args ):
    results = ""
    statusCode = "200"
    #...
    #append data to results, and set statusCode as necessary
    #...
    return results, statusCode

Finally, it is possible for the web service script to set additional headers to be included in the HTTP response. This
can be done by including an arbitrary number of key=value strings after the status code in the return statement. For
example:
def __main__( *args ):
    results = ""
    statusCode = "200"
    #...
    #append data to results, and set statusCode as necessary
    #...
    return results, statusCode, "header1=value1", "header2=value2"


Supporting Arguments
Arguments can be passed to web service scripts as a tuple with 2 items, and can be accepted in two different ways.
The first way is to simply accept args, which will be an array of length 2. The other way is to accept the tuple as two
separate variables, for instance (dlArgs, qsArgs) for Deadline arguments and query string arguments. In the first case,
args[0] is equivalent to dlArgs (Deadline arguments), and args[1] is equivalent to qsArgs (Query String Arguments).
Deadline Arguments
The web service will automatically pass your script a dictionary as the first item in the args tuple. The Dictionary
will contain at least one key (Authenticated), but may contain more if the user authenticated with the web service.
Currently, if the user has not authenticated, the Dictionary will only contain the Authenticated key, with a value
of False. However, if the user has authenticated, it will also contain the UserName key, with a value of the user
executing the script.
Query String Arguments
Arguments are passed to your script by a query string defined in the URL, and can be in one of the following forms:
Key/Value Pairs: This is the preferred method of passing arguments. Arguments in this form will look something like
this at the end of the URL:
?key0=value0&key1=value1

List of Values: Arguments in this form will instead look something like this:
?value0&value1

The query string will be passed to the Python script as a NameValueCollection, and it will be the second item of the
tuple passed to your script's __main__ function.
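
As a rough sketch, a script that accepts the two argument collections separately and echoes a query string value back
might look like the following. The jobName key is illustrative, and the indexing of qsArgs assumes the standard
NameValueCollection behaviour of returning None for missing keys.

def __main__( dlArgs, qsArgs ):
    # qsArgs holds the query string; for http://host:8080/MyScript?jobName=test,
    # qsArgs["jobName"] is "test".
    jobName = qsArgs["jobName"]
    if jobName is None:
        return "No jobName argument was provided.", "400"

    # dlArgs always contains the "Authenticated" key, plus "UserName" when the
    # caller has authenticated with the web service.
    return "Authenticated: %s. You asked about job '%s'." % ( dlArgs["Authenticated"], jobName ), "200"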
Relevant API Functions
For functions that will be relevant to most Web Service scripts, see the Deadline.PulseUtils section of the Deadline
Scripting Reference documentation. The full Deadline Scripting Reference can be found on the Thinkbox Software
Documentation Website. Offline PDF and HTML versions can be downloaded from here as well.

7.8.3 Calling Web Service Scripts


Once the script has been created, you can call it using the web service. See the Web Service Documentation for more
information on how to set this up. For example, if you have a Web Service script called GetFarmStatistics.py, you
would call it using the following URL (where [myhost] is the hostname pointing to your web service machine):
http://[myhost]:8080/GetFarmStatistics

Some scripts can take arguments, as detailed in the previous section. To include arguments, you need to place a ?
between the base URL and the first argument, with & separating additional arguments. Here is an example of how you
would pass arg1, arg2, and arg3 as a list of arguments to the GetFarmStatistics.py script:
http://[myhost]:8080/GetFarmStatistics?arg1&arg2&arg3

Here is an example of how you would pass values for arguments named arg1, arg2, and arg3 in the form of
key-value pairs:


http://[myhost]:8080/GetFarmStatistics?arg1=value1&arg2=value2&arg3=value3

The way the results of the script will be displayed is entirely dependent on the format in which the Script returns them.

7.8.4 Migrating Scripts from Deadline 5


Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and
this guide will walk you through the API changes so that you can update your scripts as necessary.
The only significant change is that the globally defined functions are no longer available. See the Migrating Scripts
From Deadline 5 section in the Scripting Overview documentation for more information, including replacement functions.

7.9 Standalone Python API


7.9.1 Overview
The Standalone Python API can be used in Python for communicating with the HTTP API (documented in REST
Overview). In order to use the HTTP API you must have the Web Service running on a machine whose address and
port number you know. For a list of the API's functions and how they are used, go to the Deadline Downloads page and
download the documentation. Essentially, our Standalone Python API is a Python wrapper API around our RESTful
HTTP API.
Note, as all communication to Deadline travels through the machine running the Web Service and not the local host,
there are consequences that should be considered carefully. Any file paths provided need to be valid on the Web Service
machine, including any differences between operating systems if, for example, your local host is running Windows but
the Web Service machine is Linux. In the case of submitting a job, the job's username will be the user account currently
running the Web Service, NOT the submitting local user, unless a UserName is provided in the job info.

7.9.2 Set-up
In order to use the Standalone Python API you must have Python 2.7 or later installed. Copy the Deadline
folder containing the Standalone Python API from \\your\repository\api\python to the site-packages folder of
your Python installation, and the API is ready to use.

7.9.3 Using the API


A DeadlineCon object must be created, which is used to communicate with the web service to send and receive requests.
First enter import Deadline.DeadlineConnect as Connect, then create your connection object with connectionObject =
Connect.DeadlineCon(PulseName, PulsePortNumber), where PulseName is the DNS name or IP address of the machine
currently running the web service and PulsePortNumber is the web service port number as configured in the Web Service
settings in the Repository Options (by default, 8080). The connectionObject variable can now be used to communicate
requests to the web service.
Example: Getting group names and suspending a job
>>> from Deadline.DeadlineConnect import DeadlineCon as Connect
>>> con = Connect('PulseName', 8080)
>>> con.Groups.GetGroupNames()


[u'none', u'group1', u'group2', u'group3']


>>> jobId = validjobID
>>> con.Jobs.SuspendJob(jobId)
'Success'

Documentation for all the possible API functions can be found at the Deadline Downloads page.

7.9.4 Authenticating
If your Web Service has authentication enabled, then you must set up authentication for the Python API. This can be
achieved through the EnableAuthentication and SetAuthenticationCredentials functions. Setting your authentication
credentials allows the Python API to use them for as long as that instance of Python is running.
>>> from Deadline.DeadlineConnect import DeadlineCon as Connect
>>> con = Connect('PulseName', 8080)
>>> con.Groups.GetGroupNames()
"Error: HTTP Status Code 401. Authentication with the Web Service failed.
Please ensure that the authentication credentials are set, are correct, and
that authentication mode is enabled."
>>> con.AuthenticationModeEnabled()
False
>>> con.EnableAuthentication(True)
>>> con.AuthenticationModeEnabled()
True
>>> con.SetAuthenticationCredentials("username", "password")
>>> con.Groups.GetGroupNames()
[u'none', u'group1', u'group2', u'group3']

By default, SetAuthenticationCredentials also enables authentication, so it is not actually necessary to explicitly call
EnableAuthentication as well. If you want to store your credentials without enabling authentication, you may do so
using the optional third parameter.
>>> con.SetAuthenticationCredentials("username", "password", False)

7.9.5 API Functions


All of the Standalone Python API functions return a Python dictionary, a Python list, or a Python string. Lists often
contain dictionaries.
Examples: Getting a list, a list containing dictionaries, a dictionary, and a string back.
>>> groupNames = con.Groups.GetGroupNames()
>>> groupNames[0]
group1
>>> jobs = con.Jobs.GetJobs()
>>> jobs[0]['FailedChunks']
12
>>> task = con.Tasks.GetJobTask(jobId, 0)
>>> task["Errs"]
8
>>> root = con.Repository.GetRootDirectory()
>>> root
'C:/DeadlineRepository'


Example: Getting a job, changing the pool and priority then saving it.
>>> job = con.Jobs.GetJob(jobId)
>>> str(job['Props']['Pool'])
none
>>> job['Props']['Pool'] = unicode('jobPool')
>>> str(job['Props']['Pool'])
jobPool
>>> print str(job['Props']['Pri'])
50
>>> job['Props']['Pri'] = 75
>>> str(job['Props']['Pri'])
75
>>> con.Jobs.SaveJob(job)
'Success'
>>> job = con.Jobs.GetJob(jobId)
>>> str(job['Props']['Pool']) + ' ' +str(job['Props']['Pri'])
jobPool 75

Example: Submitting a reserve VraySpawner job using Python dictionaries.


import Deadline.DeadlineConnect as Connect

if __name__ == '__main__':
    Deadline = Connect.DeadlineCon('PulseName', 8080)

    JobInfo = {
        "Name": "Submitted via Python",
        "UserName": "UserName",
        "Frames": "0-1",
        "Plugin": "VraySpawner"
    }

    PluginInfo = {
        "Version": "Max2014"
    }

    try:
        newJob = Deadline.Jobs.SubmitJob(JobInfo, PluginInfo)
        print newJob
    except:
        print "Sorry, Web Service is currently down!"

Note, when submitting a job, the JobInfo and PluginInfo dictionaries should contain ALL the minimum necessary
KEY=VALUE pairs to successfully run this plugin job type in Deadline. As the KEY=VALUE pairs are internal and
change depending on the application plugin, it is recommended you submit a job normally to Deadline and then inspect
the job's Submission Params to see what KEY=VALUE pairs should be submitted for this job type. You can also use
the Export button to take a copy of the JobInfo and PluginInfo files and submit the job using these files instead of via
Python dictionaries.


CHAPTER EIGHT: REST API

8.1 REST Overview


8.1.1 Overview
The RESTful HTTP API can be used to interact with an instance of the web service. HTTP requests can be made
to request information from the database, store new data, alter existing data, or remove entries from the database.
Requests to the API can be categorized by the type of data you are attempting to access and by the type of HTTP
request you are using to access said data. In order to use the HTTP API you must have the Web Service running on a
machine whose address and port number you know.
Note, as all communication to Deadline travels through the machine running the Web Service and not the local host,
there are consequences that should be considered carefully. Any file paths provided need to be valid on the Web Service
machine, including any differences between operating systems if, for example, your local host is running Windows but
the Web Service machine is Linux. In the case of submitting a job, the job's username will be the user account currently
running the Web Service, NOT the submitting local user, unless a UserName is provided in the job info.
Requests that alter data are primarily POST or PUT messages, and they typically return text stating whether they
succeeded or if there was an error. Requests made to retrieve data are done using GET messages and return JavaScript
Object Notation (JSON) formatted objects if successful, and text explaining the error if not. Some POST or PUT
messages will return JSON objects as well, but usually only if there is information about the action that the user may
need (an example of this would be a request to create a new object, where the object's primary key may be returned on
creation). Requests made to remove data are typically done using DELETE messages and return text stating whether
they succeeded or if there was an error, just like POST and PUT messages. In the event of an error message being
returned, the HTTP Status Code will also be set to describe the error.

8.1.2 Request Types


Jobs
Job Reports
Groups
Pools
Limits
Repository
Pulse
Slaves
Tasks


Task Reports
Users
Balancer

8.1.3 Request Formats and Responses


GET
Request for some data. These messages are constructed entirely within the URL. Successful requests
will usually return a JSON object and failed requests will return a brief error message along with the
HTTP Status Code. There are some GET requests that will return plain text for a successful request.
PUT
Typically a request to modify some data. These messages use the URL to specify what type of data you
wish to alter, and use the message body to carry the data being stored to the database. The message body
must be a JSON object, although how this object must be built depends on the data being modified. PUT
messages for data that does not exist will often fail, but in some cases will act as a POST. Successful
requests will usually return text stating success. Failed requests will return a brief error message along
with the HTTP Status Code. There are some PUT messages that return JSON objects, and this usually
occurs when data has been created instead of altered.
POST
Request to create some data. These messages use the URL to specify what type of data you wish to
create, and use the message body to carry the data being stored to the database. The message body must
be a JSON object, although how this object must be built depends on the data being created. POST
messages for data that already exists will fail. Successful requests will usually return text stating
success. Failed requests will return a brief error message along with the HTTP Status Code. There
are some POST messages that return JSON objects.
DELETE
Request to delete some data. These messages are constructed entirely within the URL. Successful
requests will usually return text stating success. Failed requests will return a brief error message
along with the HTTP Status Code.
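
As a rough illustration of the GET and PUT formats using Python 2.7's standard library, the snippet below retrieves the
list of job IDs and then suspends one of them through the web service. The hostname and job ID are placeholders, and
the two requests mirror the Jobs requests documented in the next section.

import json
import urllib2

base = "http://hostname:8080"   # the machine running the Web Service

# GET request: the message is constructed entirely within the URL.
jobIds = json.load( urllib2.urlopen( base + "/api/jobs?IdOnly=true" ) )

# PUT request: the URL selects the data type, and the message body is a JSON object.
body = json.dumps( { "Command": "suspend", "JobID": "validjobidhere" } )
request = urllib2.Request( base + "/api/jobs", data=body, headers={ "Content-Type": "application/json" } )
request.get_method = lambda: "PUT"
print urllib2.urlopen( request ).read()   # prints "Success" if the job was suspended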

8.1.4 HTTP Status Codes


The following are the HTTP Status Codes that can be returned, and what they signify in Deadline.
200 - OK
Request completed without error. Note that this does not always mean the request modified everything as intended. Example: trying to send a complete message to a completed job will do nothing
and return this status code. Another example: trying to release a job from pending when the job is
not pending will return this status code and do nothing.
400 - Bad Request
Request could not be completed due to incorrect request message structure in either the URL or the
body of the request message.
404 - Not Found
Requested data could not be found, or requested command could not be found.
405 - Method Not Allowed


Requested operation could not be completed using the request format given.
500 - Internal Server Error
Request message could not be interpreted properly, or the action being attempted caused an exception in Deadline.
501 - Not Implemented
Request type is not supported. For example, a JobReport PUT request would return this because only
GET is supported.

8.1.5 Additional Information


If a request is made for a JSON object, and an empty JSON object is returned, then the information provided for the
request did not match any entry in the repository.
Adding additional key-value pairs to a JSON object for a request that does not specify their use can have surprising
consequences. Keys that are not used by other commands will be ignored, but be sure to read the documentation
for each possible query for each request type before building a JSON object for your query, as some commands are
identical other than the presence of a single key and have vastly different effects.
If a documented query requires a JSON object that you do not know how to properly construct, it is often possible to
do a GET query for the same object type and receive the JSON format that the query expects.
A query that returns Success does not imply that the actions your query requested occurred. Some actions are
impossible, but do not warrant an error message. (For example, sending a Suspend message to a Suspended job, or
Deleting a Slave that does not exist or was already Deleted.)

8.2 Jobs
8.2.1 Overview
Job requests can be used to set and retrieve information for one or many jobs. Job requests support GET, PUT, POST
and DELETE request types. For more about these request types and their uses see the Request Formats and Responses
documentation.

8.2.2 Requests and Responses


List of possible requests for Jobs. All PUT and POST requests may also return a 400 Bad Request error if there was
no message body in the request. All PUT requests may also return a 400 Bad Request error message if the command
key is not present in the message body's JSON object. All PUT requests may also return a 500 Internal Server Error
error message if the command key in the message body contained an invalid command.
Get All The Jobs
URL: http://hostname:portnumber/api/jobs
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job information for every job in the repository.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.


Get Jobs In Specified State Gets jobs in the specified state(s). Valid states are Active, Suspended, Completed, Failed,
and Pending. Note that Active covers both Queued and Rendering jobs. Specify more than one state by separating them with commas (ie: Active,Completed,Suspended).
URL: http://hostname:portnumber/api/jobs?States=states
Request Type: GET
Message Body: N/A
Response: JSON object containing all the jobs in the specified state(s).
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All The Job IDs
URL: http://hostname:portnumber/api/jobs?IdOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job IDs in the repository.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Job Gets job info for the given job ID.
URL: http://hostname:portnumber/api/jobs?JobID=validjobidhere
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job information for the job ID provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Save Job Saves the job info provided. Job info must be in JSON format.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = save
Job = JSON object containing the job info
Response: Success
Possible Errors:
400 Bad Request: There was no Job entry in the JSON object in the message body.
500 Internal Server Error: An exception occurred within the Deadline code.
Suspend Job Puts the job with the matching ID into the Suspended state.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = suspend


JobID = the ID of the Job to be suspended


Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Suspend Job: Non-rendering Tasks Suspends the tasks of the job with the matching ID that are not currently rendering.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = suspendnonrendering
JobID = the ID of the Job to be suspended
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Resume Job Resumes the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = resume
JobID = the ID of the Job to be resumed
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Resume Failed Job Resumes the failed job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT

Message Body:
JSON object where the following keys are mandatory:
Command = resumefailed
JobID = the ID of the failed Job to be resumed
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Requeue Job Requeues the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = requeue
JobID = the ID of the Job to be requeued
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Archive Job Archives the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = archive
JobID = the ID of the Job to be archived
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.

Import Job Imports the archived job/s at the file path/s provided.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = import
File = the file location of the archived job/s (May be an array)
The following keys are optional:
DeleteFile = true (deletes the archive file/s after importing)
Response: The job IDs of the imported jobs and of the jobs that were not imported.
Possible Errors:
400 Bad Request: There was no File path provided.
500 Internal Server Error: An exception occurred within the Deadline code.
Pend Job Puts the job with the ID that matches the provided ID in the pending state.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = pend
JobID = the ID of the Job to be put in the pending state
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Release Pending Job Releases the job with the ID that matches the provided ID from the pending state.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = releasepending
JobID = the ID of the Job to be released from the pending state
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.

500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Complete Job Marks the job with the ID that matches the provided ID as complete.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = complete
JobID = the ID of the Job to be marked as complete
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Fail Job Marks the job with the ID that matches the provided ID as failed.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = fail
JobID = the ID of the Job to be marked as failed
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Update Job Submission Date Updates the Submission Date for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = updatesubmissiondate
JobID = the ID of the Job to have the submission date updated for

Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit Sets the Job Machine Limit for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = setjobmachinelimit
JobID = the ID of the Job
The following keys are optional:
Limit = the new job machine limit, must be an integer
SlaveList = the slave/s to be set as the slave list (May be an array)
WhiteListFlag = boolean : sets the whitelistflag to true or false
Progress = Floating point number for the release percentage
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Add Slaves To Job Machine Limit List Adds the provided Slaves to the Job Machine Limit List for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = addslavestojobmachinelimitlist
JobID = the ID of the Job
SlaveList = the slave/s to be added to the slave list (May be an array)
Response: Success
Possible Errors:
400 Bad Request:

There was no JobID entry in the JSON object in the message body, or
There needs to be at least one Slave passed.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Remove Slaves From Job Machine Limit List Removes the provided Slaves from the Job Machine Limit List for
the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = removeslavesfromjobmachinelimitlist
JobID = the ID of the Job
SlaveList = the slave/s to be removed from the slave list (May be an array)
Response: Success
Possible Errors:
400 Bad Request:
There was no JobID entry in the JSON object in the message body, or
There needs to be at least one Slave passed.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit Listed Slaves Sets provided Slaves as Job Machine Limit Listed Slaves for the Job whose
ID matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = setjobmachinelimitlistedslaves
JobID = the ID of the Job
SlaveList = the slave/s to be set as the slave list (May be an array)
Response: Success
Possible Errors:
400 Bad Request:
There was no JobID entry in the JSON object in the message body, or
There needs to be at least one Slave passed.
500 Internal Server Error:

An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit White List Flag Sets Job Machine Limit White List Flag for the job with the ID that matches
the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = setjobmachinelimitwhitelistflag
JobID = the ID of the Job
WhiteListFlag = boolean : sets the whitelistflag to true or false
Response: Success
Possible Errors:
400 Bad Request:
There was no JobID entry in the JSON object in the message body, or
Must pass a boolean WhiteListFlag.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit Maximum Sets the Job Machine Limit Maximum for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = setjobmachinelimitmaximum
JobID = the ID of the Job
Limit = the new job machine limit, must be an integer
Response: Success
Possible Errors:
400 Bad Request:
There was no JobID entry in the JSON object in the message body, or
Must pass an integer Limit
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.

Set Job Frame Range Sets the frame range for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = setjobframerange
JobID = the ID of the Job
FrameList = the new frame list
ChunkSize = the new chunk size
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Append Job Frame Range Appends frames to the job with the ID that matches the provided ID. This adds new tasks
without affecting the job's existing tasks.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = appendjobframerange
JobID = the ID of the Job
FrameList = the frame list to append to the job's existing frames
Response: Success
Possible Errors:
400 Bad Request: There was no JobID entry in the JSON object in the message body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Submit Job Submits a job using the job info provided.
URL: http://hostname:portnumber/api/jobs
Request Type: POST
Message Body:

JSON object where the following keys are mandatory:
JobInfo = JSON object containing the Job Info
PluginInfo = JSON object containing the Plugin Info
AuxFiles = Array of Auxiliary File paths (May be empty, but must be provided)
The following keys are optional:
IdOnly = Set to true to only return the job ID (defaults to false)
Response: JSON object containing the new Job that was submitted or the Job ID
Possible Errors:
400 Bad Request: Missing one or more of the mandatory keys listed above.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Could not access the file path specified in NetworkRoot.
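For example, a submission through this endpoint could be sketched as follows (Python with the requests library). The base URL is a placeholder, and the JobInfo and PluginInfo values shown are illustrative only; they must be replaced with keys appropriate for the plugin you are submitting to.

    import requests

    BASE = "http://hostname:portnumber/api"   # placeholder host and port

    body = {
        "JobInfo": {                          # illustrative JobInfo keys
            "Name": "REST submission example",
            "UserName": "artist",
            "Plugin": "CommandLine",
            "Frames": "0-9",
        },
        "PluginInfo": {},                     # plugin-specific keys go here
        "AuxFiles": [],                       # must be provided, even if empty
        "IdOnly": True,                       # optional: return only the new job ID
    }
    response = requests.post(BASE + "/jobs", json=body)
    print(response.json())                    # the new job ID (or full job object)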
Delete Jobs Deletes the jobs corresponding to the job IDs provided.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIdsToDelete
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors:
400 Bad Request: Need to provide at least one job ID to delete.
500 Internal Server Error: An exception occurred within the Deadline code.
Get Job Details Gets the Job Details, similar to the Job Details panel, for the Jobs corresponding to the provided Job
IDs.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIds&Details=true
Request Type: GET
Message Body: N/A
Response: A JSON object containing the Job Details.
Possible Errors:
400 Bad Request: Need to provide at least one job ID to get details for.
500 Internal Server Error: An exception occurred within the Deadline code.
Get Deleted Jobs Gets the Deleted Jobs that correspond to the provided Job IDs.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIds&Deleted=true
Request Type: GET
Message Body: N/A
Response: A JSON object containing the deleted Jobs.
Possible Errors:
400 Bad Request: Need to provide at least one deleted job ID.
500 Internal Server Error: An exception occurred within the Deadline code.

Get All Deleted Jobs Gets all the Deleted Jobs.
URL: http://hostname:portnumber/api/jobs?Deleted=true
Request Type: GET
Message Body: N/A
Response: A JSON object containing the deleted Jobs.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Deleted Jobs Purges the Deleted Jobs that correspond to the provided Job IDs.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIdsToDelete&Purge=true
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors:
400 Bad Request: Need to provide at least one job ID to delete.
500 Internal Server Error: An exception occurred within the Deadline code.
Undelete Jobs Undeletes the Deleted Jobs that correspond to the provided Job IDs.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = undelete
JobID/s = job ID/list of job IDs to undelete
Response: Success
Possible Errors:
400 Bad Request: Need to provide at least one job ID to undelete.
500 Internal Server Error: An exception occurred within the Deadline code.

8.2.3 Job Property Values


Values for some Job properties are represented by numbers. Those properties and their possible values are listed below.
Stat (Status)
0 = Unknown
1 = Active
2 = Suspended
3 = Completed
4 = Failed
6 = Pending

Note that an active job can either be idle or rendering. Use the RenderingChunks property to determine if anything is
rendering.
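For example, client code that consumes the job JSON returned by the requests above will typically translate these numbers back into labels. A small sketch in Python follows; the rendering_chunks argument is an illustrative stand-in for the job's RenderingChunks property.

    # Map the numeric Stat values documented above to readable labels.
    JOB_STATUS = {
        0: "Unknown",
        1: "Active",
        2: "Suspended",
        3: "Completed",
        4: "Failed",
        6: "Pending",
    }

    def describe_job_status(stat, rendering_chunks=0):
        # An Active job is only rendering if it has chunks currently rendering.
        if stat == 1:
            return "Rendering" if rendering_chunks > 0 else "Queued"
        return JOB_STATUS.get(stat, "Unknown")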
Timeout (OnTaskTimeout)
0 = Both
1 = Error
2 = Notify
OnComp (OnJobComplete)
0 = Archive
1 = Delete
2 = Nothing
Schd (ScheduledType)
0 = None
1 = Once
2 = Daily

8.3 Job Reports


8.3.1 Overview
Job Report requests can be used to retrieve Job Reports for a Job using the GET request type. PUT, POST and
DELETE are not supported and sending a message of any of these types will result in a 501 Not Implemented error
message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.3.2 Requests and Responses


List of possible requests for Job Reports. It is possible to get a 400 Bad Request error message for any of the requests
if the value for Data is incorrect.
Get All Job Reports Gets all the Job Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=all&JobID=validJobID
http://hostname:portnumber/api/jobreports?JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job reports for the requested job, or a message stating that there are
no reports for the job.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
The Job ID provided does not correspond to any Job in the repository.
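As a minimal sketch (Python with the requests library; the host, port, and job ID are placeholders), the reports for a job can be fetched like this:

    import requests

    BASE = "http://hostname:portnumber/api"   # placeholder host and port
    job_id = "aValidJobIDHere"                # placeholder job ID

    # Data=all (or omitting Data) returns every report type for the job.
    response = requests.get(BASE + "/jobreports",
                            params={"Data": "all", "JobID": job_id})
    print(response.json())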

Get Job Error Reports Gets all the Job Error Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=error&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job error reports for the requested job, or a message stating that there
are no error reports for the job.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
The Job ID provided does not correspond to any Job in the repository.
Get Job Log Reports Gets all the Job Log Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=log&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job log reports for the requested job, or a message stating that there
are no log reports for the job.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
The Job ID provided does not correspond to any Job in the repository.
Get Job Requeue Reports Gets all the Job Requeue Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=requeue&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job requeue reports for the requested job, or a message stating that
there are no requeue reports for the job.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
The Job ID provided does not correspond to any Job in the repository.
Get Job History Entries Gets all the Job History Entries for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=history&JobID=validJobID
Request Type: GET
Message Body: N/A

Response: JSON object containing all the job history entries for the requested job, or a message stating that
there are no history entries for the job.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
The Job ID provided does not correspond to any Job in the repository.

8.3.3 Job Report Property Values


Values for some Job Report properties are represented by numbers. Those properties and their possible values are
listed below.
Type (ReportType)
0 = LogReport
1 = ErrorReport
2 = RequeueReport

8.4 Tasks
8.4.1 Overview
Task requests can be used to set and retrieve Task information using GET and PUT request types. POST and DELETE
are not supported and sending a message of either of these types will result in a 501 Not Implemented error message.
For more about these request types and their uses see the Request Formats and Responses documentation.

8.4.2 Requests and Responses


List of possible requests for Tasks. For all PUT requests it is possible to return a 400 Bad Request error message if the
message body is empty or if no command key is provided. All requests may return a 400 Bad request error message
if no Job ID is provided or a 500 Internal Server Error if the Job ID provided does not correspond to any Job in the
repository.
Get Task IDs
Gets all the Task IDs for the Job that corresponds to the Job ID provided.
URL: http://hostname:portnumber/api/tasks?IdOnly=true&JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Task IDs for the Job.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Task

Gets the Task that corresponds to the Task ID provided for the Job that corresponds to the Job ID provided.
URL: http://hostname:portnumber/api/tasks?TaskID=oneValidTaskID&JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task information for the requested Task.
Possible Errors:
400 Bad Request:
No Task ID provided, or
Task ID must be an integer value.
500 Internal Server Error: An exception occurred within the Deadline code.
Get All Tasks
Gets the Tasks for the Job that corresponds to the Job ID provided.
URL: http://hostname:portnumber/api/tasks?JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task information for all the Job Tasks.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Requeue Tasks
Requeues the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be requeued.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = requeue
JobID = the id of the Job
The following keys are optional:
TaskList = integer Task ID/s (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: TaskList contains entries, but none of them are valid integers.
404 Not Found: Requested Task ID does not correspond to a Task for the Job.
500 Internal Server Error: An exception occurred within the Deadline code.
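For example, requeueing a handful of tasks could be sketched as follows (Python with the requests library; the base URL, job ID, and task numbers are placeholders):

    import requests

    BASE = "http://hostname:portnumber/api"   # placeholder host and port
    job_id = "aValidJobIDHere"                # placeholder job ID

    # Requeue tasks 0, 1 and 2; omit TaskList to requeue every task in the job.
    body = {"Command": "requeue", "JobID": job_id, "TaskList": [0, 1, 2]}
    response = requests.put(BASE + "/tasks", json=body)
    print(response.status_code, response.text)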
Complete Tasks
Completes the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be completed.
URL: http://hostname:portnumber/api/tasks

Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = complete
JobID = the id of the Job
The following keys are optional:
TaskList = integer Task ID/s (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: TaskList contains entries, but none of them are valid integers.
404 Not Found: Requested Task ID does not correspond to a Task for the Job.
500 Internal Server Error: An exception occurred within the Deadline code.
Suspend Tasks
Suspend the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be suspended.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = suspend
JobID = the id of the Job
The following keys are optional:
TaskList = integer Task ID/s (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: TaskList contains entries, but none of them are valid integers.
404 Not Found: Requested Task ID does not correspond to a Task for the Job.
500 Internal Server Error: An exception occurred within the Deadline code.
Fail Tasks
Fails the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job tasks will be failed.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = fail
JobID = the id of the Job

The following keys are optional:
TaskList = integer Task ID/s (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: TaskList contains entries, but none of them are valid integers.
404 Not Found: Requested Task ID does not correspond to a Task for the Job.
500 Internal Server Error: An exception occurred within the Deadline code.
Resume Failed Tasks
Resumes the Failed Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job
ID provided. If no Task IDs are provided, all Job failed tasks will be resumed.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = resumefailed
JobID = the id of the Job
The following keys are optional:
TaskList = integer Task ID/s (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: TaskList contains entries, but none of them are valid integers.
404 Not Found: Requested Task ID does not correspond to a Task for the Job.
500 Internal Server Error: An exception occurred within the Deadline code.
Pend Tasks
Pends the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be pended.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = pend
JobID = the id of the Job
The following keys are optional:
TaskList = integer Task ID/s (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: TaskList contains entries, but none of them are valid integers.

404 Not Found: Requested Task ID does not correspond to a Task for the Job.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Trying to pend a task for a Suspended Job.
Release Pending Tasks
Releases the pending Tasks that correspond to the Task IDs provided for the Job that corresponds to the
Job ID provided. If no Task IDs are provided, all Job pending tasks will be released.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = releasepending
JobID = the id of the Job
The following keys are optional:
TaskList = integer Task ID/s (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: TaskList contains entries, but none of them are valid integers.
404 Not Found: Requested Task ID does not correspond to a Task for the Job.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Trying to release a task from pending for a Suspended Job.

8.4.3 Task Property Values


Values for some Task properties are represented by numbers. Those properties and their possible values are listed
below.
Stat (Status)
1 = Unknown
2 = Queued
3 = Suspended
4 = Rendering
5 = Completed
6 = Failed
8 = Pending

8.5 Task Reports


8.5.1 Overview
Task Report requests can be used to retrieve Task Reports for a Job Task using the GET request type. PUT, POST and
DELETE are not supported and sending a message of any of these types will result in a 501 Not Implemented error
message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.5.2 Requests and Responses


List of possible requests for Task Reports. It is possible to get a 400 Bad Request error message for any of the requests
if the value for Data is incorrect. All requests may return a 400 Bad request error message if no Job ID is provided or
a 500 Internal Server Error if the Job ID provided does not correspond to any Job in the repository. All requests may
also return a 400 Bad Request error message if the Task ID was not provided, or was not valid, or was not an integer.
Get All Task Reports
Gets all the Task Reports for the Job Task that corresponds to the provided Job ID and provided Task ID.
URL: http://hostname:portnumber/api/taskreports?Data=all&JobID=validJobID&TaskID=validTaskID
http://hostname:portnumber/api/taskreports?JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Task reports for the requested Job Task, or a message stating
that there are no reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Task Error Reports
Gets all the Task Error Reports for the Job Task that corresponds to the provided Job ID and provided
Task ID.
URL: http://hostname:portnumber/api/taskreports?Data=error&JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task error reports for the requested Job Task, or a message stating
that there are no error reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Task Log Reports
Gets all the Task Log Reports for the Job Task that corresponds to the provided Job ID and provided Task
ID.
URL: http://hostname:portnumber/api/taskreports?Data=log&JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task log reports for the requested Job Task, or a message stating
that there are no log reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.

Get Task Requeue Reports
Gets all the Task Requeue Reports for the Job Task that corresponds to the provided Job ID and provided
Task ID.
URL: http://hostname:portnumber/api/taskreports?Data=requeue&JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task requeue reports for the requested Job Task, or a message
stating that there are no requeue reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.

8.5.3 Task Report Property Values


Values for some Task Report properties are represented by numbers. Those properties and their possible values are
listed below.
Type (ReportType)
0 = LogReport
1 = ErrorReport
2 = RequeueReport

8.6 Slaves
8.6.1 Overview
Slave requests can be used to set or retrieve Slave information. Slave requests support GET, PUT and DELETE request
types. POST is not supported and sending such a message will result in a 501 Not Implemented error message. For
more about these request types and their uses see the Request Formats and Responses documentation.

8.6.2 Requests and Responses


List of possible requests for Slaves. For all PUT requests it is possible to return a 400 Bad Request error message if
there is no message body or if the command key is not set. PUT requests may also return a 500 Internal Server Error
message if the command key is set to an invalid command.
Get Slave Names
Gets all the Slave names.
URL: http://hostname:portnumber/api/slaves?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Slave names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves InfoSettings

Gets the InfoSettings for every Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=infosettings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave InfoSettings for all the Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Slaves InfoSettings
Gets the InfoSettings for every Slave.
URL: http://hostname:portnumber/api/slaves?Data=infosettings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave InfoSettings for all the Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves Information
Gets the Slave Information for every Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=info
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Information for all the Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Slaves Information
Gets the Slave Information for every Slave.
URL: http://hostname:portnumber/api/slaves?Data=info
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Information for all the Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
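As a minimal sketch (Python with the requests library), the Slave Information for specific Slaves can be requested as shown below. The base URL and Slave names are placeholders, and the comma-separated Name value is an assumption based on the oneOrMoreSlaveNames placeholder used in these URLs.

    import requests

    BASE = "http://hostname:portnumber/api"   # placeholder host and port

    # Slave Information for two named Slaves (names are placeholders).
    response = requests.get(BASE + "/slaves",
                            params={"Name": "render01,render02", "Data": "info"})
    print(response.json())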
Get Slaves Settings
Gets the Slave Settings for every Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=settings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Settings for all the Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Slaves Settings

Gets the Slave Settings for every Slave.
URL: http://hostname:portnumber/api/slaves?Data=settings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Settings for all the Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Save Slave Information
Saves the Slave Information provided.
URL: http://hostname:portnumber/api/slaves
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = saveinfo
SlaveInfo = JSON object containing the Slave information to save.
Response: Success
Possible Errors:
400 Bad Request: JSON object containing Slave Information was not provided.
500 Internal Server Error: An exception occurred within the Deadline code.
Save Slave Settings
Saves the Slave Settings provided.
URL: http://hostname:portnumber/api/slaves
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = savesettings
SlaveInfo = JSON object containing the Slave Settings to save.
Response: Success
Possible Errors:
400 Bad Request: JSON object containing Slave Settings was not provided.
500 Internal Server Error: An exception occurred within the Deadline code.
Delete Slaves
Deletes every Slave that corresponds to a Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: Success

Possible Errors:
400 Bad Request: Need to provide at least one Slave name to delete.
500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves Reports
Gets all Slave Reports for all Slave names provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=reports
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave Reports for all Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slave Reports For All Slaves
Gets all Slave Reports for all Slaves.
URL: http://hostname:portnumber/api/slaves?Data=reports
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave Reports for all Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves History
Gets all Slave History Entries for all Slave names provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=history
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave History Entries for all Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slave History For All Slaves
Gets all Slave History Entries for all Slaves.
URL: http://hostname:portnumber/api/slaves?Data=history
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave History for all Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slave Names Rendering Job
Gets the names of all Slaves rendering the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/slavesrenderingjob?JobID=validJobID
Request Type: GET
Message Body: N/A

Response: JSON object containing all the Slave names rendering the Job.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Get Host Names of Machines Rendering Job
Gets the host names of all machines whose Slaves are rendering the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/machinessrenderingjob?JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the host names.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.
Get IP Address of Machines Rendering Job
Gets the IP addresses of all machines whose Slaves are rendering the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/machinessrenderingjob?JobID=validJobID&GetIpAddress=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the IP addresses.
Possible Errors:
400 Bad Request: No Job ID was provided.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Job ID provided does not correspond to a Job in the repository.

8.6.3 Slave Property Values


Values for some Slave Info, Settings, and Report properties are represented by numbers. Those properties and their
possible values are listed below.
Stat (SlaveStatus)
0 = Unknown
1 = Rendering
2 = Idle

3 = Offline
4 = Stalled
8 = StartingJob
Type (ReportType)
0 = LogReport
1 = ErrorReport
2 = RequeueReport

8.7 Pulse
8.7.1 Overview
Pulse requests can be used to set and retrieve Pulse information using GET and PUT. POST and DELETE are not
supported and sending a message of either of these types will result in a 501 Not Implemented error message. For
more about these request types and their uses see the Request Formats and Responses documentation.

8.7.2 Requests and Responses


List of possible requests for Pulse.
Get Pulse Names
Gets all the Pulse names.
URL: http://hostname:portnumber/api/pulse?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Pulse Information
Gets the Pulse information for the Pulse names provided.
URL:
http://hostname:portnumber/api/pulse?Info=true&Names=oneOrMorePulseNames OR
http://hostname:portnumber/api/pulse?Info=true&Name=onePulseName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse information for the requested Pulse names.
Possible Errors:
404 Not Found: Pulse name provided does not exist (can only occur if you use Name= )
500 Internal Server Error: An exception occurred within the Deadline code.
Save Pulse Information

Saves the Pulse information provided.
URL: http://hostname:portnumber/api/pulse
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = saveinfo
PulseInfo = JSON object containing all the Pulse information.
Response: Success
Possible Errors:
400 Bad Request: Did not provide a Pulse Information JSON object
500 Internal Server Error: An exception occurred within the Deadline code.
Get Pulse Settings
Gets the Pulse settings for the Pulse names provided.
URL:
http://hostname:portnumber/api/pulse?Settings=true&Names=oneOrMorePulseNames OR
http://hostname:portnumber/api/pulse?Settings=true&Name=onePulseName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse settings for the requested Pulse names.
Possible Errors:
404 Not Found: Pulse name provided does not exist (can only occur if you use Name= )
500 Internal Server Error: An exception occurred within the Deadline code.
Save Pulse Settings
Saves the Pulse settings provided.
URL: http://hostname:portnumber/api/pulse
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = savesettings
PulseSettings = JSON object containing all the Pulse settings.
Response: Success
Possible Errors:
400 Bad Request: Did not provide a Pulse Settings JSON object
500 Internal Server Error: An exception occurred within the Deadline code.
Get Pulse InfoSettings
Gets the Pulse information and settings for the Pulse names provided.
URL:
http://hostname:portnumber/api/pulse?Names=oneOrMorePulseNames OR
http://hostname:portnumber/api/pulse?Name=onePulseName

Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse information and settings for the requested Pulse names.
Possible Errors:
404 Not Found: Pulse name provided does not exist (can only occur if you use Name= )
500 Internal Server Error: An exception occurred within the Deadline code.

8.7.3 Pulse Property Values


Values for some Pulse properties are represented by numbers. Those properties and their possible values are listed
below.
Stat (PulseStatus)
0 = Unknown
1 = Running
2 = Offline
4 = Stalled

8.8 Balancer
8.8.1 Overview
Balancer requests can be used to set and retrieve Balancer information using GET and PUT. POST and DELETE are
not supported and sending a message of either of these types will result in a 501 Not Implemented error message. For
more about these request types and their uses see the Request Formats and Responses documentation.

8.8.2 Requests and Responses


List of possible requests for Balancer.
Get Balancer Names
Gets all the Balancer names.
URL: http://hostname:portnumber/api/balancer?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Balancer Information
Gets the Balancer information for the Balancer names provided.
URL:
http://hostname:portnumber/api/balancer?Info=true&Names=oneOrMoreBalancerNames OR
http://hostname:portnumber/api/balancer?Info=true&Name=oneBalancerName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer information for the requested Balancer names.
Possible Errors:
404 Not Found: Balancer name provided does not exist (can only occur if you use Name= )
500 Internal Server Error: An exception occurred within the Deadline code.
Save Balancer Information
Saves the Balancer information provided.
URL: http://hostname:portnumber/api/balancer
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = saveinfo
BalancerInfo = JSON object containing all the Balancer information.
Response: Success
Possible Errors:
400 Bad Request: Did not provide a Balancer Information JSON object
500 Internal Server Error: An exception occurred within the Deadline code.
Get Balancer Settings
Gets the Balancer settings for the Balancer names provided.
URL: http://hostname:portnumber/api/balancer?Settings=true&Names=oneOrMoreBalancerNames OR
http://hostname:portnumber/api/balancer?Settings=true&Name=oneBalancerName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer settings for the requested Balancer names.
Possible Errors:
404 Not Found: Balancer name provided does not exist (can only occur if you use Name= )
500 Internal Server Error: An exception occurred within the Deadline code.
Save Balancer Settings
Saves the Balancer settings provided.
URL: http://hostname:portnumber/api/balancer
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = savesettings
BalancerSettings = JSON object containing all the Balancer settings.
Response: Success

Possible Errors:
400 Bad Request: Did not provide a Balancer Settings JSON object
500 Internal Server Error: An exception occurred within the Deadline code.
Get Balancer InfoSettings
Gets the Balancer information and settings for the Balancer names provided.
URL:
http://hostname:portnumber/api/balancer?Names=oneOrMoreBalancerNames OR
http://hostname:portnumber/api/balancer?Name=oneBalancerName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer information and settings for the requested Balancer
names.
Possible Errors:
404 Not Found: Balancer name provided does not exist (can only occur if you use Name= )
500 Internal Server Error: An exception occurred within the Deadline code.

8.8.3 Balancer Property Values


Values for some Balancer properties are represented by numbers. Those properties and their possible values are listed
below.
Stat (BalancerStatus)
0 = Unknown
1 = Running
2 = Offline
4 = Stalled

8.9 Limits
8.9.1 Overview
Limit Group requests can be used to set and retrieve information about one or many Limit Groups. Limit Group
requests support GET, PUT, POST and DELETE request types. For more about these request types and their uses see
the Request Formats and Responses documentation.

8.9.2 Requests and Responses


List of possible requests for Limit Groups. All PUT and POST requests can return a 400 Bad Request error message
if no message body is passed, or if no command key is present in the message body. All PUT and POST requests may
also return a 500 Internal Server Error error message if the command key in the message body contained an invalid
command.

Get Limit Group Names Gets the names of all Limit Groups in the repository.
URL: http://hostname:portnumber/api/limitgroups?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Limit Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Limit Groups Gets the Limit Groups for the provided Limit Group names.
URL: http://hostname:portnumber/api/limitgroups?Names=listOfOneOrMoreLimitGroupNames
http://hostname:portnumber/api/limitgroups?Name=aSingleLimitGroupName
Request Type: GET
Message Body: N/A
Response: JSON object containing the requested Limit Group/s
Possible Errors:
404 Not Found: There is no Limit Group with provided Name (this can only occur if a single name is
passed)
500 Internal Server Error: An exception occurred within the Deadline code.
Get All Limit Groups Gets all the Limit Groups in the repository.
URL: http://hostname:portnumber/api/limitgroups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Limit Groups.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Limit Group Sets the Limit, Slave List, White List Flag, Release Percentage and/or Excluded Slaves for an
existing Limit Group, or creates a new Limit Group with the provided properties.
URL: http://hostname:portnumber/api/limitgroups
Request Type: PUT/POST
Message Body:
JSON object where the following keys are mandatory:
Command = set
Name = name of Limit Group
The following keys are optional:
Limit = integer limit
Slaves = list of slave names to include
SlavesEx = list of slave names to exclude
RelPer = floating point number for release percentage
White = boolean white list flag
Response: Success

Possible Errors:
400 Bad Request: No name provided for the Limit Group
500 Internal Server Error: An exception occurred within the Deadline code.
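For example, creating or updating a Limit Group could be sketched as follows (Python with the requests library; the base URL, Limit Group name, Slave names, and values are all illustrative):

    import requests

    BASE = "http://hostname:portnumber/api"   # placeholder host and port

    body = {
        "Command": "set",
        "Name": "example_limit",        # illustrative Limit Group name
        "Limit": 10,                    # optional integer limit
        "Slaves": ["render01"],         # optional Slave list (names are placeholders)
        "White": True,                  # optional white list flag
        "RelPer": 100.0,                # optional release percentage
    }
    response = requests.put(BASE + "/limitgroups", json=body)
    print(response.status_code, response.text)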
Save Limit Group Updates a Limit Group using a JSON object containing all the Limit Group information.
URL: http://hostname:portnumber/api/limitgroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = save
LimitGroup = JSON object containing all relevant Limit Group information
Response: Success
Possible Errors:
400 Bad Request: No valid Limit Group object provided.
500 Internal Server Error: An exception occurred within the Deadline code.
Reset Limit Group Resets the counts for a Limit Group.
URL: http://hostname:portnumber/api/limitgroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = save
Name = name of Limit Group
Response: Success
Possible Errors:
400 Bad Request: No name provided for the Limit Group
404 Not Found: Provided Limit Group name does not correspond to a Limit Group in the repository.
500 Internal Server Error: An exception occurred within the Deadline code.
Delete Limit Groups Deletes the Limit Groups for the provided Limit Group names.
URL: http://hostname:portnumber/api/limitgroups
Request Type: DELETE
Message Body: N/A
Response: JSON object containing the requested Limit Group/s
Possible Errors:
400 Bad Request: Must provide at least one Limit Group name to delete.
500 Internal Server Error: An exception occurred within the Deadline code.

8.9.3 Limit Group Property Values


Values for some Limit Group properties are represented by numbers. Those properties and their possible values are
listed below.
Type (LimitGroupType)
0 = General
1 = JobSpecific
2 = MachineSpecific
StubLevel (currently not used)
0 = Slave
1 = Task
2 = Machine

8.10 Users
8.10.1 Overview
User requests can be used to set and retrieve information for one or many Users. User requests support GET, PUT,
POST and DELETE request types. For more about these request types and their uses see the Request Formats and
Responses documentation.

8.10.2 Requests and Responses


List of possible requests for Users.
Get User Names
Gets all the User names.
URL: http://hostname:portnumber/api/users?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Users
Gets all the User information for the provided User names.
URL: http://hostname:portnumber/api/users?Name=oneOrMoreValidUserNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User information for the Users provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Users

Gets all the Users.
URL: http://hostname:portnumber/api/users
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User information for all the Users.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Save User
Saves the User Information provided.
URL: http://hostname:portnumber/api/users
Request Type: PUT/POST
Message Body: JSON object containing all the User Information to save.
Response: Success for PUT, the User name and ID for POST.
Possible Errors:
400 Bad Request:
No user information provided, or
No User name provided, or
User info already exists (POST error only).
500 Internal Server Error: An exception occurred within the Deadline code.
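As a minimal sketch (Python with the requests library), creating a user could look like the snippet below. The base URL is a placeholder, and the keys inside the user object are illustrative assumptions; the object must contain whatever User Information your repository expects, including at least the user name.

    import requests

    BASE = "http://hostname:portnumber/api"   # placeholder host and port

    # Illustrative user object; the field name is an assumption, not a schema.
    user = {"Name": "jsmith"}

    # POST creates the user and returns its name and ID; PUT saves/updates it.
    response = requests.post(BASE + "/users", json=user)
    print(response.status_code, response.text)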
Delete User
Deletes the Users corresponding to the User names provided.
URL: http://hostname:portnumber/api/users?Name=oneOrMoreValidUserNames
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors:
400 Bad Request:
No user information provided, or
No User names provided.
500 Internal Server Error: An exception occurred within the Deadline code.
Get User Group Names
Gets all the User Group names.
URL: http://hostname:portnumber/api/usergroups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.

Get User Names For User Group
Gets all the User names for the User Group that corresponds to the provided User Group name.
URL: http://hostname:portnumber/api/usergroups?Name=oneValidUserGroupName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User names in the User Group.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get User Group Names For User
Gets all the User Group names for the User corresponding to the provided User name.
URL: http://hostname:portnumber/api/usergroups?User=oneValidUserName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User Group names for the User.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Users To User Groups
Adds the Users corresponding to the User names provided to the User Groups corresponding with the
User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = add
User = the user name/s to add (May be an Array)
Group = the user group name/s to add to (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Command key does not contain a valid command string, or
None of the provided User names correspond to real Users, or
None of the provided User Group names correspond to real User Groups.
Remove Users From User Groups
Removes the Users corresponding to the User names provided from the User Groups corresponding with
the User Group names provided.
URL: http://hostname:portnumber/api/usergroups

Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = remove
User = the user name/s to remove (May be an Array)
Group = the user group name/s to remove from (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Command key does not contain a valid command string, or
None of the provided User names correspond to real Users, or
None of the provided User Group names correspond to real User Groups.
Set Users For User Groups
Sets the Users corresponding to the User names provided for the User Groups corresponding with the
User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
Command = set
User = the user name/s to set (May be an Array)
Group = the user group name/s to set the users for (May be an Array)
Response: Success
Possible Errors:
400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
500 Internal Server Error:
An exception occurred within the Deadline code, or
Command key does not contain a valid command string, or
None of the provided User names correspond to real Users, or
None of the provided User Group names correspond to real User Groups.
Create New User Groups

Creates and saves new user groups with the given names.
URL: http://hostname:portnumber/api/usergroups
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
Group = the user group name/s to create (array)
Response: Success
Possible Errors:
400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
500 Internal Server Error: An exception occurred within the Deadline code
Delete User Groups
Deletes the user group with the given name.
URL: http://hostname:portnumber/api/usergroups?Name=user+group+name+to+delete
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors:
400 Bad Request: Must provide a user group name to delete.
500 Internal Server Error: An exception occurred within the Deadline code

8.11 Repository
8.11.1 Overview
Repository requests can be used to retrieve Repository information, such as directories or paths, using the GET request
type. Repository requests can also be used for adding history entries for jobs, slaves or the repository using the POST
request type. PUT and DELETE are not supported and sending a message of either of these types will result in a
501 Not Implemented error message. For more about these request types and their uses see the Request Formats and
Responses documentation.

8.11.2 Requests and Responses


List of possible requests for the Repository.
Get Root Directory
URL: http://hostname:portnumber/api/repository?Directory=root
Request Type: GET
Message Body: N/A
Response: JSON Object containing the root directory, or a message stating that the directory is not set.

Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
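The directory queries in this section all follow the same pattern, varying only the Directory value. A minimal sketch of the root directory request, assuming the Python requests package:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    # Ask the Web Service for the Repository root directory.
    response = requests.get(BASE + "/api/repository", params={"Directory": "root"})
    if response.status_code == 200:
        print(response.json())  # JSON object containing the root directory
    else:
        print(response.status_code, response.text)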
Get Bin Directory
URL: http://hostname:portnumber/api/repository?Directory=bin
Request Type: GET
Message Body: N/A
Response: JSON Object containing the bin directory, or a message stating that the directory is not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Settings Directory
URL: http://hostname:portnumber/api/repository?Directory=settings
Request Type: GET
Message Body: N/A
Response: JSON Object containing the settings directory, or a message stating that the directory is not
set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Events Directory
URL: http://hostname:portnumber/api/repository?Directory=events
Request Type: GET
Message Body: N/A
Response: JSON Object containing the events directory, or a message stating that the directory is not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Custom Events Directory
URL: http://hostname:portnumber/api/repository?Directory=customevents
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom events directory, or a message stating that the directory is
not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.

Get Plugins Directory


URL: http://hostname:portnumber/api/repository?Directory=plugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugins directory, or a message stating that the directory is not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Custom Plugins Directory
URL: http://hostname:portnumber/api/repository?Directory=customplugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom plugins directory, or a message stating that the directory
is not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Scripts Directory
URL: http://hostname:portnumber/api/repository?Directory=scripts
Request Type: GET
Message Body: N/A
Response: JSON Object containing the scripts directory, or a message stating that the directory is not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Custom Scripts Directory
URL: http://hostname:portnumber/api/repository?Directory=customscripts
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom scripts directory, or a message stating that the directory is
not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Auxiliary Path

URL: http://hostname:portnumber/api/repository?AuxiliaryPath=job&JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON Object containing the auxiliary path for the provided job id, or a message stating that
the path is not set.
Possible Errors:
400 Bad Request:
Must provide a Directory or an Auxiliary Path to find, or
Must provide a Job ID.
404 Not Found:
Requested Directory could not be found, or
Job ID provided does not correspond to a Job in the repository.
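A sketch of the job auxiliary path request, assuming the Python requests package; the job ID below is a placeholder and should be replaced with a real ID from your Repository:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    # A 404 is returned if the placeholder job ID does not exist.
    params = {"AuxiliaryPath": "job", "JobID": "aValidJobID"}
    response = requests.get(BASE + "/api/repository", params=params)
    print(response.status_code, response.text)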
Get Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=alternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the alternate auxiliary path, or a message stating that the path is not
set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Windows Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=windowsalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the windows alternate auxiliary path, or a message stating that the
path is not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Linux Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=linuxalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the linux alternate auxiliary path, or a message stating that the path is
not set.
Possible Errors:

400 Bad Request: Must provide a Directory or an Auxiliary Path to find.


404 Not Found: Requested Directory could not be found.
Get Mac Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=macalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the mac alternate auxiliary path, or a message stating that the path is
not set.
Possible Errors:
400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
404 Not Found: Requested Directory could not be found.
Get Maximum Priority
URL: http://hostname:portnumber/api/maximumpriority
Request Type: GET
Message Body: N/A
Response: JSON Object containing the Maximum Priority.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Path Mapping
URL: http://hostname:portnumber/api/mappedpaths
Request Type: POST
Message Body:
JSON object that must contain the following keys:
OS = Operating system (Windows, Linux, or Mac).
Paths = Array of paths to map.
Region = The region to be used for mapping paths (optional, defaults to none).
Response: JSON Object containing the updated paths.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
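A hedged sketch of the path mapping request, assuming the Python requests package; the operating system and path values are placeholders:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    body = {
        "OS": "Linux",                               # Windows, Linux, or Mac
        "Paths": ["P:/projects/shot010/scene.max"],  # placeholder path(s) to map
        # "Region" is optional and defaults to none, so it is omitted here.
    }
    response = requests.post(BASE + "/api/mappedpaths", json=body)
    print(response.status_code, response.text)  # JSON object containing the updated paths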
Get Plugin Names
URL: http://hostname:portnumber/api/plugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugin names
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Plugin Event Names
URL: http://hostname:portnumber/api/plugins?EventNames=true
Request Type: GET

Message Body: N/A


Response: JSON Object containing the plugin event names
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Database Connection String
URL: http://hostname:portnumber/api/repository?DatabaseConnection
Request Type: GET
Message Body: N/A
Response: The Database Connection string in the form of: (server:port,server:port...).
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Job History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
Command = jobhistoryentry
JobID = The job id string.
Entry = The entry string to be added.
Response: Success
Possible Errors:
400 Bad Request:
JSON object was not provided in message body or,
The provided JSON object is missing some values.
500 Internal Server Error: An exception occurred within the Deadline code.
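A minimal sketch of this request, assuming the Python requests package; the job ID and entry text are placeholders:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    body = {
        "Command": "jobhistoryentry",
        "JobID": "aValidJobID",               # placeholder job ID
        "Entry": "Re-rendered frames 10-20",  # placeholder history text
    }
    response = requests.post(BASE + "/api/repository", json=body)
    print(response.status_code, response.text)  # "Success" is expected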
Add Slave History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
Command = slavehistoryentry
SlaveName = The slave name.
Entry = The entry string to be added.
Response: Success
Possible Errors:
400 Bad Request:
JSON object was not provided in message body or,
The provided JSON object is missing some values.

500 Internal Server Error: An exception occurred within the Deadline code.


Add Repository History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
Command = repositoryhistoryentry
Entry = The entry string to be added.
Response: Success
Possible Errors:
400 Bad Request:
JSON object was not provided in message body or,
The provided JSON object is missing some values.
500 Internal Server Error: An exception occurred within the Deadline code.

8.12 Pools
8.12.1 Overview
Pool requests can be used to set and retrieve information for one or many Pools. Pool requests support GET, PUT,
POST and DELETE request types. For more about these request types and their uses see the Request Formats and
Responses documentation.

8.12.2 Requests and Responses


List of possible requests for Pools
Get Pool Names Gets Pool Names.
URL: http://hostname:portnumber/api/pools
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pool names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves For Pools Gets all the Slave names for the provided Pool names.
URL: http://hostname:portnumber/api/pools?Pool=oneOrMorePoolNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave names that are in the provided Pools.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.

Add Pools Creates new Pools using the provided Pool names.
URL: http://hostname:portnumber/api/pools
Request Type: POST
Message Body:
JSON object that must contain the following keys:
Pool = pool name/s (May be an Array)
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
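A sketch of the add-pools request, assuming the Python requests package; the pool names are placeholders:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    # "Pool" may be a single string or an array of placeholder pool names.
    response = requests.post(BASE + "/api/pools", json={"Pool": ["comp", "fx"]})
    print(response.status_code, response.text)  # "Success" is expected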
Set Pools Removes all pools not provided and creates any provided pools that did not exist.
URL: http://hostname:portnumber/api/pools
Request Type: POST
Message Body:
JSON object that must contain the following keys:
Pool = pool name/s (May be an Array)
OverWrite = true
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Pools to Slaves Adds the provided Pools to the assigned pools for each provided Slave. For both Pools and
Slaves, only the names are required.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
Slave = slave name/s (May be an Array)
Pool = pool name/s (May be an Array)
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
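A sketch of assigning pools to slaves, assuming the Python requests package; the slave and pool names are placeholders:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    body = {
        "Slave": ["rendernode01", "rendernode02"],  # placeholder slave names
        "Pool": ["comp", "fx"],                     # placeholder pool names
    }
    response = requests.put(BASE + "/api/pools", json=body)
    print(response.status_code, response.text)  # "Success" is expected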
Set Pools for Slaves Sets provided Pools as the assigned pools for each provided Slave. For both Pools and Slaves,
only the names are required.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
Slave = slave name/s (May be an Array)
ReplacementPool = pool name to replace the pools being purged
OverWrite = true

Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Pools Purges all obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
OverWrite = true
ReplacementPool = pool name to replace the pools being purged
Response: Success
Possible Errors:
500 Internal Server Error: An exception occurred within the Deadline code, or
Replacement Pool name provided does not exist.
Set and Purge Pools Sets the list of pools to the provided list of pool names, creating them if necessary. Purges all
the obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
OverWrite = true
ReplacementPool = pool name to replace the pools being purged
Pool = the pool/s provided for setting, the replacement pool must be in this pool list or must be
none (May be an Array)
Response: Success
Possible Errors:
500 Internal Server Error: An exception occurred within the Deadline code, or
Replacement Pool name provided does not exist.
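A hedged sketch of the set-and-purge request, assuming the Python requests package; all names are placeholders, and the replacement pool is kept in the Pool list as required above:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    body = {
        "OverWrite": True,           # the manual specifies OverWrite = true
        "Pool": ["comp", "fx"],      # placeholder list of pools to set
        "ReplacementPool": "comp",   # must appear in the Pool list, or be none
    }
    response = requests.put(BASE + "/api/pools", json=body)
    print(response.status_code, response.text)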
Add and Purge Pools Adds the list of provided pools, creating them if necessary. Purges all the obsolete pools using
the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
OverWrite = true
ReplacementPool = pool name to replace the pools being purged
Pool = the pool/s provided for adding (May be an Array)
Response: Success

Possible Errors:
500 Internal Server Error: An exception occurred within the Deadline code, or
Replacement Pool name provided does not exist.
Delete Pools Deletes all Pools with the provided Pool names.
URL: http://hostname:portnumber/api/pools?Pool=oneOrMorePoolNames
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Pools From Slaves Deletes the provided Pools from each provided Slave's list of pools.
URL: http://hostname:portnumber/api/pools?Pool=oneOrMorePoolNames&Slaves=oneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
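A sketch of this DELETE request, assuming the Python requests package; single placeholder names are used here (see the URL pattern above for supplying more than one):

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    params = {"Pool": "fx", "Slaves": "rendernode01"}  # placeholder names
    response = requests.delete(BASE + "/api/pools", params=params)
    print(response.status_code, response.text)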

8.13 Groups
8.13.1 Overview
Group requests can be used to set and retrieve information for one or many Groups. Group requests support GET,
PUT, POST and DELETE request types. For more about these request types and their uses see the Request Formats
and Responses documentation.

8.13.2 Requests and Responses


List of possible requests for Groups
Get Group Names Gets Group Names.
URL: http://hostname:portnumber/api/groups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves For Groups Gets all the Slave names for the provided Group names.
URL: http://hostname:portnumber/api/groups?Group=oneOrMoreGroupNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave names that are in the provided Groups.

Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Groups Creates new Groups using the provided Group names.
URL: http://hostname:portnumber/api/groups
Request Type: POST
Message Body:
JSON object that must contain the following keys:
Group = group name/s (May be an Array)
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Groups Removes all groups not provided and creates any provided groups that did not exist.
URL: http://hostname:portnumber/api/groups
Request Type: POST
Message Body:
JSON object that must contain the following keys:
Group = group name/s (May be an Array)
OverWrite = true
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
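Group requests mirror the Pool requests of the previous section. As one hedged example (Python requests package, placeholder group names), the set-groups call could be sketched as:

    import requests

    BASE = "http://hostname:portnumber"  # replace with your Web Service host and port

    body = {
        "Group": ["workstations", "rendernodes"],  # placeholder group names
        "OverWrite": True,                         # the manual specifies OverWrite = true
    }
    response = requests.post(BASE + "/api/groups", json=body)
    print(response.status_code, response.text)  # "Success" is expected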
Add Groups to Slaves Adds the provided Groups to the assigned groups for each provided Slave. For both Groups
and Slaves, only the names are required.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
Slave = slave name/s (May be an Array)
Group = group name/s (May be an Array)
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Groups for Slaves Sets provided Groups as the assigned groups for each provided Slave. For both Groups and
Slaves, only the names are required.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
Slave = slave name/s (May be an Array)
ReplacementGroup = group name to replace the groups being purged
OverWrite = true

Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Groups Purges all obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
OverWrite = true
ReplacementGroup = group name to replace the groups being purged
Response: Success
Possible Errors:
500 Internal Server Error: An exception occurred within the Deadline code, or
Replacement Group name provided does not exist.
Set and Purge Groups Sets the list of groups to the provided list of group names, creating them if necessary. Purges
all the obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
OverWrite = true
ReplacementGroup = group name to replace the groups being purged
Group = the group/s provided for setting, the replacement group must be in this group list or must
be none (May be an Array)
Response: Success
Possible Errors:
500 Internal Server Error: An exception occurred within the Deadline code, or
Replacement Group name provided does not exist.
Add and Purge Groups Adds the list of provided groups, creating them if necessary. Purges all the obsolete groups
using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
OverWrite = true
ReplacementGroup = group name to replace the groups being purged
Group = the group/s provided for adding (May be an Array)
Response: Success

Possible Errors:
500 Internal Server Error: An exception occurred within the Deadline code, or
Replacement Group name provided does not exist.
Delete Groups Deletes all Groups with the provided Group names.
URL: http://hostname:portnumber/api/groups?Group=oneOrMoreGroupNames
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Groups From Slaves Deletes the provided Groups from each provided Slave's list of groups.
URL: http://hostname:portnumber/api/groups?Group=oneOrMoreGroupNames&Slaves=oneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: Success
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.

CHAPTER NINE: APPLICATION PLUGINS

9.1 3ds Command


9.1.1 Job Submission
You can submit jobs from within 3ds Max by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within 3ds Max, select the Deadline (3dsCmd) menu item that you created during the integrated
submission script setup.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The 3ds Command specific options are:
Force Build: You can force 32 bit or 64 bit rendering.
Path Config: Allows you to specify an alternate path file in the MXP format that the slaves can use to find
bitmaps that are not found on the primary map paths.
Show Virtual Frame Buffer: Enable the virtual frame buffer during rendering.
Apply VideoPost To Scene: Whether or not to use VideoPost during rendering.
Continue On Errors: Enable to have the 3ds command line renderer ignore errors during rendering.
Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network
location.
Gamma Correction: Enable to apply gamma correction during rendering.
Split Rendering: Enable split rendering. Specify the number of strips to split the frame into, as well as the
overlap you want to use.
VRay/Mental Ray DBR: Enable this option to offload a VRay or Mental Ray DBR render to Deadline. See the
VRay/Mental Ray DBR section for more information.
Run Sanity Check On Submission: Check for scene problems during submission.
VRay/Mental Ray off-load DBR
You can offload a VRay or Mental Ray DBR job to Deadline by enabling the Distributed Rendering option in your
VRay or Mental Ray settings, and by enabling the VRay/Mental Ray DBR checkbox in the submission dialog. With
this option enabled, a job will be submitted with its task count equal to the number of Slaves you specify, and it will
render the current frame in the scene file.
The slave that picks up task 0 will be the master, and will wait until all other tasks are picked up by other slaves.
Once the other tasks have been picked up, the master will update its local VRay or Mental Ray config file with the
names of the machines that are rendering the other tasks. It will then start the distributed render by connecting to the
other machines. Note that the render will not start until ALL tasks have been picked up by a slave.
It is recommended to set up VRay DBR or Mental Ray DBR for 3ds Max and verify it is working correctly prior
to submitting a DBR off-load job to Deadline. RTT (Render To Texture) is not supported with distributed bucket
rendering. If you run multiple Deadline Slaves on one machine, it is not supported for two or more of those
Slaves to concurrently pick up different DBR jobs, whether acting as master or slave.
Notes for VRay DBR:
Ensure VRay is the currently assigned renderer in the 3ds Max scene file prior to submission.
You must have the Distributed Rendering option enabled in your VRay settings under the Settings tab.
Ensure Save servers in the scene (Save hosts in the scene in VRay v2) option in VRay distributed rendering
settings is DISABLED as otherwise it will ignore the vray_dr.cfg file list!
Ensure Max servers value is set to 0. When set to 0 all listed servers will be used.
It is recommended to disable the Use local host checkbox to reduce network traffic on the master machine
when using a large number of slaves (5+). If disabled, the master machine only organises the DBR process,
sending rendering tasks to the Deadline slaves. This is particularly important if you intend to use the VRay v3+
Transfer missing assets feature. Note that the Windows 7 OS has a limitation of a maximum of 20 other machines
concurrently connecting to the master machine.
VRay v3.00.0x has a bug in DBR where, even when Use local host is unchecked, it still demands a render node license.
This is resolved in a newer version of VRay. Please contact Chaos Group for more information.
The slaves will launch the VRay Spawner executable found in the 3ds Max root directory. Do NOT install the
VRay Spawner as a service on the master or slave machines. Additionally, Drive Mappings are unsupported
when running as a service.
The vray_dr.cfg file in the 3ds Max plugcfg directory must be writeable so that the master machine can
update it. This file is typically located in the user profile directory, in which case it will be writeable already.
Chaos Group recommends that each machine to be used for DBR has previously rendered at least one other 3ds
Max job prior to trying DBR on the same machine.
Ensure all slaves can correctly access any mapped drives or resolve all UNC paths to obtain any assets required
by the 3ds Max scene file to render successfully. Use the Deadline Mapped Drives feature to ensure the necessary
drive mappings are in place.
Default lights are not supported by Chaos Group in DBR mode and will not render.

Ensure you have sufficient VRay DR licenses if processing multiple VRay DBR jobs through Deadline concurrently. Use the Deadline Limits feature to limit the number of licenses being used at any time.
Ensure the necessary VRay executables & TCP/UDP ports have been allowed to pass through the Windows
Firewall. Please consult the VRay user manual for specific information.
VRay in 3ds Max does NOT currently support dynamically adding or removing DBR slaves from a DBR render
once it has started on the master slave.
Notes for Mental Ray DBR:
Ensure Mental Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
You must have the Distributed Render option enabled in your Mental Ray settings under the Processing tab.
The Mental Ray Satellite service must be running on your slave machines. It is installed by default during the
3ds Max installation.
The max.rayhosts file must be writeable so that the master machine can update it. Its location is different for
different versions of 3ds Max:
2010 and earlier: It will be in the mentalray folder in the 3ds Max root directory.
2011 and 2012: It will be in the mentalimages folder in the 3ds Max root directory.
2013 and later: It will be in the NVIDIA folder in the 3ds Max root directory.
Ensure the Use Placeholder Objects checkbox is enabled in the Translator Options rollout of the Processing tab. When placeholder objects are enabled, geometry is sent to the renderer only on demand.
Ensure Bucket Order is set to Hilbert in the Options section of the Sampling Quality rollout of the
Renderer tab. With Hilbert order, the sequence of buckets to render uses the fewest number of data transfers.
Contour shading is not supported with distributed bucket rendering.
Autodesk Mental Ray licensing in 3ds Max is restricted. Autodesk says Satellite processors allow any owner
of a 3ds Max license to freely use up to four slave machines (with up to four processors each and an unlimited
number of cores) to render an image using distributed bucket rendering, not counting the one, two, or four
processors on the master system that runs 3ds Max. Mental Ray Standalone licensing can be used to go beyond
this license limit. Use the Deadline Limits feature to limit the number of licenses being used at any time if
required.
Ensure the necessary Mental Ray executables & TCP/UDP ports have been allowed to pass through the Windows Firewall. Please consult the Autodesk 3ds Max user manual for specific information.
Sanity Check
The 3ds Command Sanity Check script defines a set of functions to be called to ensure that the scene submission does
not contain typical errors like wrong render view and frame range settings, incorrect output path, etc.
The Sanity Check is enabled by the Run Sanity Check Automatically Before Submission checkbox in the User Options
group of controls in the Submit To Deadline (3dsmaxCmd) dialog. You can also run the Sanity Check manually
by clicking the Run Now! button.

The dialog contains the following elements:


The upper area (Error Report) lists the problems found in the current scene.
The lower area (Feedback Messages) lists the actions the Sanity Check performs and gives feedback to the user.
The latest message is always on top.
Between the two areas, there is a summary text line listing the total number of errors and a color indicator of the
current Sanity Check state. When red, the Sanity Check will not allow a job submission to be performed.
The Error Report
The left column of the Error Report displays a checkbox and the type of the error. The checkbox determines whether
the error will be taken into account by the final result of the check. Currently, there are 3 types of errors:
FATAL: The error cannot be fixed automatically and requires manual changes to the scene itself. A job submission with such an error would be pointless. The state of the checkbox is ignored and assumed always checked.
Can Be Fixed: The error can be fixed automatically or manually. If the checkbox is active, the error contributes
to the final result. If unchecked, the error is ignored and handled as a warning.
Warning: The problem might not require fixing, but could be of importance to the user. It is not taken into
account by the final result (the state of the checkbox is ignored and assumed always unchecked).
Repairing Errors
Right-clicking an Error Message in the Error Report window will cause an associated repair function to be executed
and/or a Report Message to be output in the Feedback Messages window. This behaviour was caused by the switch
to DotNet controls, which handle double-clicks as check events and would change the checkbox state in front of the
error instead.
Updating the Error Report
You can rerun/update the Sanity Check in one of the following ways:

Clicking the dialog anywhere outside of the two message areas will rerun the Sanity Check and update all
messages.
Double-clicking any Message in the Feedback Messages window will rerun the Sanity Check and update all
messages.
Repairing an error by double-clicking will also automatically rerun the Sanity Check.
Pressing the Run Now! button in the Submit To Deadline dialog will update the Sanity Check.
The following Sanity Checks are FATAL. These are errors that must be fixed manually before the job can be submitted.
Message: The scene does not contain ANY objects!
Description: The scene is empty and should not be sent to Deadline.
Fix: Load a valid scene or create/merge objects, then try again.

Message: Maxwell is the renderer and the current view is NOT a Camera.
Description: The Maxwell renderer must render through an actual camera and will fail through a viewport.
Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.

Message: The scene contains objects or groups with the same name as a camera!
Description: The scene contains objects or groups with a duplicate name to a camera, which could result in an incorrect object being used as the camera.
Fix: Ensure you remove any duplicate named objects from your scene.

The following Sanity Checks can be automatically fixed before the job is submitted.


Message: The current Scene Name is Untitled.
Description: The scene has never been saved to a MAX file. While it is possible to submit an untitled scene to Deadline, it is not a good practice.
Fix: Double-click the error message to open a Save As dialog and save to disk.

Message: The current view is NOT a camera.
Description: The active viewport is not a camera viewport.
Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.

Message: The Render Time Output is set to SINGLE FRAME!
Description: While it is ok to send a single frame to Deadline, users are sending animations in 99% of the cases.
Fix: Double-click the error message to set the Render Time Output to Active Time Segment. The Render Dialog will open so you can check the options and set to Range or Frames instead.

Message: The Render Output Path appears to point at a LOCAL DRIVE!
Description: While it is technically possible to save locally on each Slave, this is a bad idea - all Slaves should write their output to a central location on the network. Currently, disks C:, D: and E: are considered local and will be tested against the output path.
Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.

Message: The Render Output File Name ends with a DIGIT - trailing numbers might fail.
Description: The name to be saved to ends with one, two or three digits. Rendering to this file name will append 4 more digits and make loading sequential files in other applications hard or impossible. This check is performed only when the type is not AVI or MOV, and will ignore 4 trailing digits, which will be replaced by 3ds Max correctly when rendering to sequential files.
Fix: Double-click the error message to add an underscore _ to the end of the file name; for example z:\temp\test123.tga will be changed to z:\temp\test123_.tga.

Message: The Render Output will not be saved to a file.
Description: No renders will be saved because the Render Scene Dialog's Save File checkbox is currently disabled.
Fix: Double-click the error message to open the Render Dialog and enable the Save File checkbox.

Message: The Distributed Rendering option is enabled for this renderer.
Description: Checks if Distributed Rendering is enabled for the Mental Ray or V-Ray renderer.
Fix: Double-click the error message to disable Distributed Rendering.

The following Sanity Checks are simply warnings.


Message: The Render Output Path is NOT DEFINED!
Description: No frames will be saved to disk. This is allowed if you want to output render elements only.
Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.

Message: The Render Output is set to a MOVIE format.
Description: The file extension is set to an AVI or MOV format. In the current version of Deadline, this would result in a sequence of single frame MOV files rendered by separate slaves. In the future, the behaviour might be changed to render a single MOV or AVI file on a single slave as one Task.
Fix: Double-click the error message to open the Render Dialog and select a single frame output format, then double-click again to retest.

This list will be extended to include future checks and can be edited by 3rd parties by adding new definitions and
functions to the original script. Documentation on extending the script will be published later. Please email suggestions
for enhancements and additional test cases to Deadline Support.

9.1.2 Plug-in Configuration


You can configure the 3ds Command plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the 3ds Command plug-in from the list on the left.

Render Executables
3ds Max Cmd Executable: The path to the 3dsmaxcmd.exe executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.
Render Options
3ds Cmd Verbosity Level: The verbose level (0-5).
VRay DBR and Mental Ray Satellite Rendering
Use IP Addresses: If offloading a VRay DBR or Mental Ray Satellite render to Deadline, Deadline will update
the appropriate config file with the host names of the machines that are running the VRay Spawner or Satellite
service. If this is enabled, the IP addresses of the machines will be used instead.

9.1.3 Integrated Submission Script Setup


The following procedure describes how to install the integrated Autodesk 3ds Command submission script. The
integrated submission script allows for submitting 3ds Command Line render jobs to Deadline directly from within
the Max editing GUI. The integrated render job submission script and the following installation procedure have been
tested with Max versions 2010 and later (including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work.
However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this patch, it means that you
must submit your 3ds Max 2012 jobs from the Monitor.

You can either run the Submitter installer or manually install the submission script.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/3dsCmd/Installers
Manual Installation of the Submission Script
Copy [Repository]/submission/3dsCmd/Client/Deadline3dsCmdClient.mcr to [3ds Max Install Directory]/MacroScripts. If you don't have a MacroScripts folder in your 3ds Max install directory, check to
see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsCmdClient.mcr file there if you do.
Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms.

Launch 3ds Max, and find the new Deadline menu.

9.1.4 FAQ
Which versions of Max are supported?
The 3dsCommand plugin has been tested with 3ds Max 2010 and later (including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts
will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this
patch, it means that you must submit your 3ds Max 2012 jobs from the Monitor.
When should I use the 3dsCommand plugin to render Max jobs instead of the original?
This plugin should only be used when a particular feature doesn't work with our normal 3dsmax plugin.
For example, there was a time when using the 3dsCommand plugin was the only way to render scenes
that made use of VRay's Frame Buffer features.
Note that the 3dsCommand plugin has fewer features in the submission dialog, and the error handling isn't as
robust. In addition, using 3dsCommand causes Max to take extra time to start up because 3dsmaxcmd.exe
needs to be launched for each task, so renders might take a little extra time to complete.
Is PSoft's Pencil+ render effects plugin supported?
Yes. Ensure the render output and render element output directory paths all exist on the file server before
rendering commences. Please note that at least Pencil+ v3.1 is required if you are using the alternative 3dsmax (Lightning) plugin in Deadline. Note that you will require the correct network render license from PSoft
for each Deadline Slave, which is not the same as the full workstation license of Pencil+.

9.1.5 Error Messages And Meanings


This is a collection of known 3ds Command error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.2 3ds Max


9.2.1 Job Submission
You can submit jobs from within 3ds Max after installing the integrated Submit Max To Deadline (SMTD) script,
or you can submit them from the Monitor. The instructions for installing the integrated SMTD script can be found
further down this page. You can also submit jobs from within RPManager, the Render Pass Manager for 3ds Max.
The instructions for installing the integrated submitter for RPManager can also be found further down the page.
To submit from within 3ds Max, select the Deadline menu item that you created during the integrated submission
script setup.

If you are submitting from RPManager, just select the Network tab in RPManager after setting up the integrated
submitter.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The 3ds Max specific options are as follows.
Scene File Submission Options
SAVE and Submit Current Scene File with the Job to the REPOSITORY: The current scene will be saved
to a temporary file which will be sent with the job and will be stored in the Jobs folder in the Repository.
SAVE and Submit Current Scene File to GLOBAL NETWORK PATH: The current scene will be saved
to a temporary file which will be copied to a Globally-Defined Alternative Network Location (e.g. dedicated
file server). It is specified in [Repository]\submission\3dsmax\Main\SubmitMaxToDeadline_Defaults.ini under
[GlobalSettings] as the SubmitSceneGlobalBasePath key. It will be referenced by the Job via its path only. This
will reduce the load on the Repository server.
SAVE and Submit Current Scene File to USER-DEFINED NETWORK PATH: The current scene will be
saved to a temporary file which will be copied to a User-Defined Alternative Network Location (e.g. dedicated

file server) stored as a local setting. It will be referenced by the Job via its path only. This will reduce the load
on the Repository server.
DO NOT SAVE And Use Current Scene's ORIGINAL NETWORK PATH: The current scene will NOT be
saved, but the original file it was opened from will be referenced by the job. Assuming the file resides on a
dedicated file server, this will speed up submission and rendering significantly, but current changes to the scene
objects will be ignored.
Sanity Check
Run Sanity Check Automatically Before Submission: This option forces Submit To Deadline to perform a
Sanity Check before submitting the job. The Sanity Check is implemented as a separate set of scripted functions
which can be enhanced by 3rd parties to meet specific studio needs. For more information, please refer to the
Sanity Check section.
Run Sanity Check Now!: This button performs a Sanity Check without submitting a job. Any potential problems will be reported and can be fixed before actually submitting the job.
Job Tab
Job Options

Render Task Chunk Size (Frames Per Task): Defines the number of Frames to be processed at once by a Slave
as a single Task.
Limit Number of Machines Rendering Concurrently: When checked, only the number of Slaves specified
by the [Machines] value will be allowed to dequeue the job. When unchecked, any number of Slaves can work
on the job.
Machines: Defines the number of Slaves that will be allowed to dequeue the job at the same time.
Out-Of-Order Rendering Every Nth Frame: Deadline will render every Nth frame based on the order selected
in the drop down box. This option can be very useful when rendering long test animations - you can render a

rough animation containing every Nth frame early enough to detect any major issues before all frames have been
rendered, or in cases where the major action happens in the end of the sequence, reverse the rendering order.
Log: Print Frame Sequence to the Log File, then double-click the feedback window to open the Log, Copy &
Paste into Monitor > Job's Frame Range.
Render Preview Job First: When the checkbox is checked, two jobs will be submitted. The first job will have
[PREVIEW FRAMES] added to its name, have a priority of 100, and will render only N frames based on the
spinners value. The step will be calculated internally. If the spinner is set to 2, the first and the last frame will
be rendered. With a value of 3, the first, middle and last frames will be rendered and so on. The second job will
have [REST OF FRAMES] added to its name, and will be DEPENDENT on the first job and will start rendering
once the preview frames job has finished. It will have the priority specified in the dialog, and render all frames
not included in the preview job.
Priority+: Defines the Priority Increase for the PREVIEW job. For example if the Job Priority is set to 50 and
this value is +5, the PREVIEW job will be submitted with Priority of 55 and the REST job with 50.
Dependent: When checked, the [REST OF FRAMES] Job will be made dependent on the [PREVIEW
FRAMES] Job. When unchecked, the [REST OF FRAMES] Job will use the same dependencies (none or
custom) as the [PREVIEW FRAMES] Job.
Frames: Defines the number of frames to be submitted as a PREVIEW job. The frames will be taken at equal
intervals; for example, a value of 2 will send the first and last frames, a value of 3 will send the first, middle and
last frames, and so on (see the sketch after this options list).
Task Timeout: When checked, a task will be requeued if it runs longer than the specified time. This is useful
when the typical rendering time of the job is known from previous submissions and will prevent stalling.
Enable Auto Task Timeout: Enables the Auto Task Timeout option.
Restart 3ds Max Between Tasks: When unchecked (default), 3ds Max will be kept in memory for the duration
of the given job's processing. This can reduce render time significantly as multiple Tasks can be rendered in
sequence without reloading 3ds Max. When checked, 3ds Max will be restarted between tasks, thus releasing
all memory and resetting the scene settings at the cost of startup time.
Enforce Sequential Rendering: When checked, the Tasks will be processed in ascending order to
reduce the performance hit from History-Dependent calculations, for example from particle systems. When
unchecked, Tasks can be picked up by Slaves in any order. Recommended for Particle Rendering.
Submit Visible Objects Only: This option should be used at your own risk, as it is heavily dependent on the
content of your scene. In most cases, it can be used to submit only a subset of the current scene to Deadline,
skipping all hidden objects that would not render anyway. This feature will be automatically disabled if the
current scene contains any Scene XRefs. The feature will create an incorrect file if any of the scene objects
depend INDIRECTLY on hidden objects.
Concurrent Tasks: Defines the number of Tasks a single Slave can pick up at once (by launching multiple
instances of 3ds Max on the same machine). Note that only one Deadline license will be used, but if rendering
in Workstation Mode, multiple licenses of 3ds Max might be required. This is useful to maximize performance
when the tasks don't saturate all CPUs at 100% and don't use up all memory. Typically, as a rule of thumb, this
feature is NOT required as 3ds Max uses 100% of CPUs during rendering.
Limit Tasks To Slave's Task Limit: When checked, the number of Concurrent Tasks will be limited by the
Slave's Task Limit, which is typically set to the number of available CPUs. For example, if Concurrent Tasks
is set to 16 but a Slave has 8 cores, only 8 concurrent tasks will be processed.
On Job Completion: Defines the action to perform when the job has completed rendering successfully. The
job can be either left untouched, ARCHIVED to improve Repository performance, or automatically DELETED
from the Repository.
Submit Job As Suspended: When checked, the Job will be submitted to the Repository as Suspended. It will
require manual user intervention before becoming active.

Force 3ds Max Build: This drop-down list allows you to specify which build of 3ds Max (32 bit vs. 64 bit) to
use when rendering the job. The list will be greyed out when running in 3ds Max 8 or earlier.
Make Force 3ds Max Build Sticky: When the checkbox is unchecked, the Force 3ds Max Build dropdown list selection will NOT persist between sessions and will behave as documented above in the Default
section. When the checkbox is checked, the Force 3ds Max Build drop-down list selection will persist between
sessions. For example, if you are submitting from a 64 bit build of 3ds Max to an older network consisting of
only 32 bit builds, you can set the drop-down list to 32bit once and lock that setting by checking Make Force
3ds Max Build Sticky.
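To make the preview-frame selection described above concrete, here is a small illustrative Python sketch of picking N frames at roughly equal intervals from a frame range. It is not SMTD's actual code; the function name and rounding behaviour are assumptions for illustration only.

    def preview_frames(first_frame, last_frame, count):
        # Pick 'count' frames at (roughly) equal intervals, always including
        # the first and last frames, mirroring the behaviour described above.
        if count <= 1 or last_frame <= first_frame:
            return [first_frame]
        step = (last_frame - first_frame) / float(count - 1)
        return sorted({int(round(first_frame + i * step)) for i in range(count)})

    print(preview_frames(0, 100, 2))  # [0, 100] - first and last
    print(preview_frames(0, 100, 3))  # [0, 50, 100] - first, middle and last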
Job Dependencies
When the checkbox is checked and one or more jobs have been selected from the multi-list box, the job will be set
to Pending state and will start rendering when all jobs it depends on have finished rendering. Use the Get Jobs List
button to populate the Job List and the Filter options with job data from the Repository.

RPM Pass Dependencies - Global Setup


This option is ONLY available when submitting jobs from RPManager. If enabled, all passes that are submitted will
be dependent on the passes selected in this rollout.

Job Scheduling
Enable job scheduling. See the Scheduling section of the Modifying Job Properties documentation for more information on the available options.

Job Failure Detection


Override the job failure detection settings. See the Scheduling section of the Modifying Job Properties documentation
for more information on the available options.

Render Tab
3ds Max Rendering

Use Alternate Plugin.ini file: By default, 3ds Max will launch using the default plugin.ini file in the local
installation. You can use this option to select an alternative plugin.ini file to use instead. Alternative plugin.ini
files can be added to [Repository]\plugins\3dsmax, and then they will appear in the drop down box in the
submitter (see the Custom Plugin.ini File Creation section for more information). If you have the [Default]
option selected, it's equivalent to having this feature disabled.
Fail On Black Frames: This option can be used to fail the render if a certain portion of the output image or
its render elements is black. The Black Pixel % defines the minimum percentage of the image's pixels that
must be black in order for the image to be considered black. If each of R, G and B is less than or equal to the
Threshold, and the alpha is not between the Threshold and (1.0 - Threshold), then the pixel is considered black.
If the Threshold is greater than or equal to 0.5, then the alpha value has no effect. (A small sketch of this logic
follows at the end of this options list.)
Override Bitmap Pager Setting While Rendering: You can specify if you want the 3dsmax Bitmap Pager
setting to be enabled or disabled.
Submit External Files With Scene: Whether the external files (bitmaps, xrefs etc.) will be submitted with the
scene or not.
Merge Object XRefs: If object XRefs will be merged during submission.
Merge Scene XRefs: If scene XRefs will be merged during submission.
Force 3dsmax Workstation Mode (Uses up a 3dsmax License): Used mainly for testing and debugging
purposes and should be left unchecked. When this option is unchecked, 3ds max will be started in Slave mode
without the User Interface, which does not require a 3ds Max license. When checked, 3ds max will be launched
in full Interactive mode and will require a license. Note that Workstation mode is set automatically when
submitting MAXScripts to Deadline.
Enabled Silent Mode: This option is only available when Force Workstation Mode is checked. It can help
suppress some popups that 3ds Max displays (although some popups like to ignore this setting).
Ignore Missing External File Errors: Missing external files could mean that the 3ds Max scene will render
incorrectly (with textures missing etc.). In some cases though, missing external files could be ignored - for
example if the job is meant for test rendering only. If you want the job to fail if a missing external resource is
detected, uncheck this checkbox.
Ignore Missing UVW Errors: Missing UVWs could mean that some 3ds Max object would render incorrectly
(with wrong texture mapping etc). In some cases though, missing UVWs could be ignored (for example if the
job is meant for test rendering).
Ignore Missing XREF Errors: Missing XRefs could mean that the 3ds Max scene cannot be loaded correctly.
In some cases though, missing XRefs could be ignored. If you want the job to fail if a missing XRef message
is detected at startup, keep this checkbox unchecked.

Ignore Missing DLL Errors: Missing DLLs could mean that the 3ds Max scene cannot be loaded or rendered
correctly. In some cases though, missing DLLs could be ignored. If you want the job to fail if a missing DLL
message is detected at startup, keep this checkbox unchecked.
Do Not Save Render Element Files: Enable this option to have Deadline skip the saving of Render Element
image files during rendering (the elements themselves are still rendered).
Show Virtual Frame Buffer: If checked, the 3ds Max frame buffer will be displayed on the slave during
rendering.
Override Renderer Frame Buffer Visibility: If checked, the current renderer's frame buffer visibility will be
overridden by the next setting (Show Renderer Frame Buffer).
Show Renderer Frame Buffer: If checked, the current renderer's frame buffer will be made visible during
rendering (V-Ray and Corona Frame Buffers currently supported).
Disable Progress Update Timeout: Enable this option to disable progress update checking. This is useful for
renders like Fume FX sims that don't constantly supply progress to 3dsmax.
Disable Frame Rendering: Enable this option to skip the rendering process. This is useful for renders like
Fume FX sims that don't actually require any rendering.
Restart Renderer Between Frames: This option can be used to force Deadline to restart the renderer after each
frame to avoid some potential problems with specific renderers. Enabling this option has little to no impact on
the actual render times. This feature should be ENABLED to resolve V-Ray renders where typically the beauty
pass renders correctly but the Render Elements are all black or perhaps seem to be swapped around. When
enabled, the C++ Lightning plugin (unique to Deadline) will unload the renderer plugins and then reload them
instantly. This has the effect of forcing a memory purge and helps to improve renderer stability, as well as ensure
the lowest possible memory footprint. This can be helpful, when rendering close to the physical memory limit
of a machine. Ensure this feature is DISABLED if you are sending FG/LC/IM caching map type jobs to the
farm, as the renderer will get reset for each frame and the FG/LC/IM file(s) won't get incrementally increased
with the additional data per frame.
Disable Multipass Effects: Enable this option to skip over multipass effects if they are enabled for the camera
to be rendered.
V-Ray/Mental Ray DBR: Enable this option to offload a V-Ray or Mental Ray DBR render to Deadline. See
the V-Ray/Mental Ray DBR section for more information.
Job Is Interruptible: If enabled, this job will be cancelled if a job with higher priority is submitted to the queue.
Apply Custom Material To Scene: If checked, all geometry objects in the scene will be assigned one of the
user-defined materials available in the drop down box.
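The following is a small illustrative sketch of the black-frame test described under Fail On Black Frames above. It is not Deadline's actual implementation; the function names and the assumption that channel values are normalized to the 0.0-1.0 range are for illustration only.

    def is_black_pixel(r, g, b, a, threshold):
        # Black if each of R, G and B is at or below the threshold and, for
        # thresholds below 0.5, the alpha does not fall between threshold
        # and 1.0 - threshold.
        if r > threshold or g > threshold or b > threshold:
            return False
        if threshold >= 0.5:
            return True  # the alpha value has no effect in this case
        return not (threshold < a < 1.0 - threshold)

    def frame_is_black(pixels, threshold, black_pixel_percent):
        # 'pixels' is an iterable of (r, g, b, a) tuples in the 0.0-1.0 range.
        pixels = list(pixels)
        if not pixels:
            return False
        black = sum(1 for (r, g, b, a) in pixels
                    if is_black_pixel(r, g, b, a, threshold))
        return black >= len(pixels) * (black_pixel_percent / 100.0)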
3ds Max Gamma Options

Gamma Correction: Enable to apply gamma correction during rendering.


3ds Max Pathing Options

Remove Filename Padding: If checked, the output filename will be (for example) output.tga instead of
output0000.tga. This feature should only be used when rendering single frames. If you render a range of
frames with this option checked, each frame will overwrite the previous existing frame.
Force Strict Output Naming: If checked, the output image filename is automatically modified
to include the scene's name. For example, if the scene name was myScene.max and the output image path was \\myServer\images\output.tga, the output image path would be changed to \\myServer\images\myScene\myScene.tga. If the new output image path doesn't exist, it is created by the 3dsmax
plugin before rendering begins.
Purify Filenames: If checked, all render output including Render Elements will be purged of any illegal characters as defined by PurifyCharacterCodes in SubmitMaxToDeadline_Defaults.ini file.
Force Lower-Case Filenames: If checked, all render output including Render Elements will be forced to have
a lowercase filename.
Update Render Elements Paths: Each Render Element has its own output path which is independent from
the render output path. When this option is unchecked, changing the output path will NOT update the Render
Elements' paths and the Elements could be written to the wrong path, possibly overwriting existing passes
from a previous render. When checked, the paths will be updated to point at sub-folders of the current Render
Output path with names based on the name and class of the Render Element. The actual file name will be left
unchanged.
Also Update REs Filenames: If enabled, the Render Element file names will also be updated along with their
paths.
Include RE Name in Paths: If enabled, the new Render Element files will be placed in a folder that contains
the RE name.
Include RE Name in Filenames: If enabled, the new Render Element files will contain the RE name in the
file name.
Include RE Type in Paths: If enabled, the new Render Element files will be placed in a folder that contains the
RE type.

Include RE Type in Filenames: If enabled, the new Render Element files will contain the RE type in the file
name.
Permanent RE Path Changes: When this checkbox is checked and the above option is also enabled, changes
to the Render Elements' paths will be permanent (in other words, after the submission, all paths will point at
the new locations created for the job). When unchecked, the changes will be performed temporarily during the
submission, but the old path names will be restored right after the submission.
Rebuild Render Elements: If checked, Render Elements will be automatically removed and rebuilt during
submission to try and work around known 3dsMax issues.
Include Local Paths With Job: (Thinkbox internal use only) Currently not hooked up to any functionality.
Use Alternate Path: Allows you to specify an alternate path file in the MXP format that the slaves can use to
find bitmaps that are not found on the primary map paths.
Render Output Autodesk ME Image Sequence (IMSQ) Creation

Save File: Specify the render output. Note that this updates the 3ds Max Render Output dialog, and is meant as
a convenience to update the output file.
Create Image Sequence (IMSQ) File: If checked, an Autodesk IMSQ file will be created from the output files
at the output location.
Copy IMSQ File On Completion: If checked, the IMSQ file will be copied to the location specified in the text
field.
Options Tab
User Options


Enable Local Rendering: If checked, Deadline will render the frames locally before copying them over to the
final network location.
One Cpu Per Task: Forces each task of the job to only use a single CPU. This can be useful when doing single
threaded renders and the Concurrent Tasks setting is greater than 1.
Automatically Update Job Name When Scene File Name Changes: If checked, the Job Name setting in the
submission dialog will automatically match the file name of the scene loaded. So if you load a new scene, the
Job Name will change accordingly.
Override Renderer's Low Priority Thread Option (Brazil r/s, V-Ray): When checked, the Low Priority
Thread option of the renderers supporting this feature will be forced to false during the submission. Both
Brazil r/s and V-Ray provide the feature to launch the renderer in a low priority thread mode. This is useful
when working with multiple applications on a workstation and the rendering should continue in the background
without eating all CPU resources. When submitting a job though, this should generally be disabled since we
want all slaves to work at 100% CPU load.
Clear Material Editor In The Submitted File: Clears the material editor in the submitted file during submission.
Unlock Material Editor Renderer: If checked, the Material Editor's Renderer will be unlocked to use the
Default Scanline Renderer to avoid problems with some old versions of V-Ray.
Delete Empty State Sets In The Submitted File: Deletes any empty State Sets in the submitted file during
submission and the State Sets dialog/UI will be reset. This fixes an ADSK bug when running 3dsMax as a
service.
Warn about Missing External Files on Submission: When checked, a warning will be issued if the scene
being submitted contains any missing external files (bitmaps etc.). Depending on the state of the Ignore Missing
External File Errors checkbox under the Render tab, such files might not cause the job to fail but could cause
the result to look wrong. When unchecked, scenes with missing external files will be submitted without any
warnings.
Warn about Copying External Files with Job only if: the count is greater than 100 or the size is greater than
1024 MB. Both values can be configured to a studio's needs.
Override 3ds Max Language: If enabled, you can choose a language to force during rendering.
Export Renderer-Specific Advanced Settings


If this option is enabled for a specific renderer, you will be able to modify a variety of settings for that renderer after
submission from the Monitor. To modify these settings from the Monitor, right-click on the job and select Modify
Properties, then select the 3dsmax tab.
Submission Timeouts

Job Submission Timeout in seconds: This value spinner defines how many seconds to wait for the external
Submitter application to return from the Job submission before stopping the attempt with a timeout message.
Quicktime Submission Timeout in seconds: This value spinner defines how many seconds to wait for the
external Submitter application to return from the Quicktime submission before stopping the attempt with a
timeout message.
Data Collection Timeout in seconds: This value spinner defines how many seconds to wait for the external
Submitter application to return from data collecting before stopping the attempt with a timeout message. Data
collecting includes collecting Pools, Categories, Limit Groups, Slave Lists, Slave Info, Jobs etc.
Limits Tab
Blacklist/Whitelist Slaves
Set the whitelist or blacklist for the job. See the Scheduling section of the Modifying Job Properties documentation
for more information on the available options.


Limits


Set the Limits that the job requires. See the Scheduling section of the Modifying Job Properties documentation for
more information on the available options.


StateSets Tab

Select the State Sets you want to submit to Deadline. This option is only available in 3ds Max 2012 (Subscription
Advantage Pack 1) and later.
Integration Tab
Project Management Data
The available Integration options are explained in the Draft and Integration documentation.


Deadline Draft Post-Render Processing


The available Draft/Integration options are explained in the Draft and Integration documentation.

Extra Info
These are some extra arbitrary properties that can be set for the job. Note that some of these are reserved when enabling
the Shotgun, FTrack or Draft settings.

Scripts Tab
Run Python Scripts

Run Pre-Job Script: Specify the path to a Python script to execute when the job initially starts rendering.
Run Post-Job Script: Specify the path to a Python script to execute when the job finishes rendering.
Run Pre-Task Script: Specify the path to a Python script to execute before each task starts rendering.
Run Post-Task Script: Specify the path to a Python script to execute after each task finishes rendering.


Run Maxscript Script

Submit Script Job: This checkbox lets you turn the submission into a MAXScript job. When checked, the
scene will NOT be rendered, instead the specified MAXScript code will be executed for the specified frames.
Options that collide with the submission of a MAXScript Job like Tile Rendering and Render Preview Job
First will be disabled or ignored.
Single Task: This checkbox lets you run the MAXScript Job on one slave only. When checked, the job will
be submitted with a single task specified for frame 1. This is useful when the script itself will perform some
operations on ALL frames in the scene, or when per-frame operations are not needed at all. When unchecked,
the frame range specified in the Render Scene Dialog of 3ds Max will be used to create the corresponding
number of Tasks. In this case, all related controls in the Job tab will also be taken into account.
Workstation Mode: This checkbox is a duplicate of the one under the Render tab (checking one will affect the
other). MAXScript Jobs that require file I/O (loading and saving of 3ds Max files) or commands that require the
3ds Max UI to be present, such as manipulating the modifier stack, HAVE TO be run in Workstation mode (using
up a 3ds Max license on the Slave). MAXScript Jobs that do not require file I/O or 3ds Max UI functionality
can be run in Slave mode on any number of machines without using up 3ds Max licenses.
New Script From Template: This button creates a new MAXScript without any execution code, but with all
the necessary template code to run a MAXScript Job on Deadline.
Pick Script: This button lets you select an existing script from disk to use for the MAXScript Job. It is advisable
to use scripts created from the Template file using the New Script From Template button.


Edit MAXScript File: This button lets you open the current script file (if any) for editing.
Run Pre-Load Script: This checkbox lets you run a MAXScript specified in the text field below it BEFORE
the 3ds Max scene is loaded for rendering by the Slave.
Run Post-Load Script: This checkbox lets you run a MAXScript specified in the text field below it AFTER the
3ds Max scene is loaded for rendering by the Slave.
Run Pre-Frame Script: This checkbox lets you run a MAXScript specified in the text field below it BEFORE
the Slave renders a frame.
Run Post-Frame Script: This checkbox lets you run a MAXScript specified in the text field below it AFTER
the Slave renders a frame.
Post-Submission Function Call: This field can be used by TDs to enter an arbitrary user-defined MAXScript
Expression (NOT a path to a script!) which will be executed after the submission has finished. This can be used
to trigger the execution of user-defined functions or to press a button in a 3rd party script. In the screenshot, the
expression presses a button in a globally defined rollout which is part of an in-house scene management script.
If you want to execute a multi-line script after each submission, you could enter fileIn c:\temp\somescript.ms
in this field and the content of the specified file will be evaluated. The content of this field is sticky and saved in
the local INI file - it will persist between sessions until replaced or removed manually.
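For instance, a one-line expression along the following lines could be entered in this field. This is only an
illustrative sketch - MyStudio_OnSubmitDone is a hypothetical global function that would have to be defined by
your own in-house startup scripts:

    if MyStudio_OnSubmitDone != undefined do MyStudio_OnSubmitDone() -- hypothetical studio callback; does nothing if the function is not defined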
The MAXScript Job Template file is located in the Repository under \submission\3dsmax\Main\MAXScriptJobTemplate.ms. When the button is pressed, a copy of the template file with a name
pattern MAXScriptJob_TheSceneName_XXXX.ms will be created in the \3dsmax#\scripts\SubmitMaxToDeadline
folder where XXXX is a random ID and 3dsmax# is the name of the 3ds Max root folder. The script file will open in
3ds Max for editing. You can add the code to be executed in the marked area and save to disk. The file name of the
new template will be set as the current MAXScript Job file automatically. If a file name is already selected in the UI,
you will be prompted about replacing it first.
Deadline exposes an interface to MAXScript, which allows you to gather information about the job being rendered.
See the Maxscript Interface documentation for the available functions and properties.
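For illustration only - this is not the actual template shipped in the Repository - the per-frame portion of a
MAXScript Job could use that interface roughly as follows. The output path is a placeholder, and the DeadlineUtil
interface is only available while the script runs as a MAXScript Job on a Slave:

    (
        -- A rough sketch of per-task work in a MAXScript Job, assuming one task per frame.
        local f = DeadlineUtil.CurrentFrame
        DeadlineUtil.SetTitle ("Processing frame " + (f as string))
        DeadlineUtil.LogMessage ("Scene file: " + DeadlineUtil.SceneFileName)
        -- Render the current frame to a placeholder network path (adjust to your pipeline):
        render frame:f outputFile:(@"\\server\share\output\frame_" + (formattedPrint f format:"04d") + ".exr") vfb:false
        DeadlineUtil.SetProgress 100.0
    )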
Tiles Tab
Tile & Region Rendering Options

Region Rendering Mode: This drop-down list controls the various rendering modes:


FULL FRAME Rendering, All Region Options DISABLED - this is the default mode of the Submitter.
No region rendering will be performed and the whole image will be rendered.
SINGLE FRAME, MULTI-REGION Jigsaw Rendering - Single Job, Regions As Tasks - this mode
allows one or more regions to be defined and rendered on one or more network machines. Each region
can be optionally sub-divided to a grid of sub-regions to split between machines. The resulting fragments
will then be combined to a new single image, or optionally composited over a previous version of the full
image using DRAFT. This mode is recommended for large format single frame rendering. Note that the
current frame specified by the 3ds Max TIME SLIDER will be rendered, regardless of the Render Dialog
Time settings.
ANIMATION, MULTI-REGION Jigsaw Rendering - One Job Per Region, Frames As Tasks - this
mode allows one or more regions to be defined and rendered on one or more network machines. Each
region can be optionally sub-divided to a grid of sub-regions to split between machines. Each region
can be optionally animated over time by hand or by using the automatic tracking features. The resulting
fragments from each frame will then be combined to a new single image, or optionally composited over
a previous version of the full image using DRAFT. This mode is recommended for animated sequences
where multiple small portions of the scene are changing relative to the previous render iteration.
SINGLE FRAME TILE Rendering - Single Job, Tiles As Tasks - this mode splits the final single image
into multiple equally-sized regions (Tiles). Each Tile will be rendered by a different machine and the final
image can be assembled either using DRAFT, or by the legacy command line Tile Assembler. This mode
is recommended when the whole image needs to be re-rendered, but you want to split it between multiple
machines.
ANIMATION, TILE Rendering - One Job Per Tile, Frames As Tasks - this mode submits a job for each
tile and a post task maxscript will assemble the tiles once they are all rendered per frame for each job.
3DS MAX REGION Rendering - Single Job, Frames As Tasks - this mode allows for traditional 3ds
Max REGION, BLOWUP and CROP render modes to be used via Deadline.
Cleanup Tiles After Assembly: When checked, the Tile image files will be removed after the final image has
been assembled. Keep this unchecked if you intend to resubmit some of the tiles and expect them to re-assemble
with the previous ones.
Pixel Padding: Default is 4 pixels. This is the number of pixels to be added on each side of the region or tile to
ensure better stitching through some overlapping. Especially when rendering Global Illumination, it might be
necessary to render tiles with significant overlapping to avoid artefacts.
Copy Draft Config Files To Output Folder: When checked, the configuration files for Draft Assembly jobs
will be duplicated in the output folder(s) for archiving purposes. The actual assembling will be performed using
the copies stored in the Job Auxiliary Files folder. Use this option if you want to preserve a copy next to the
assembled frames even after the Jobs have been deleted from the Deadline Repository.
Draft Assembly Job Error On Missing Tiles: When unchecked, missing region or tile fragments will not
cause errors and will simply be ignored, leaving either a black background or the previous image's pixels in the
assembled image. When checked, the Assembly will only succeed if all requested input images have been found
and actually put together.
Override Pool, Group, Priority for Assembly Job: When enabled, the Assembly Pool, Secondary Pool, Group
and Priority settings will be used for the Assembly Job instead of the main job's settings.
The output formats that are supported by the Tile Assembler jobs are BMP, DDS, EXR, JPG, JPE, JPEG, PNG, RGB,
RGBA, SGI, TGA, TIF, and TIFF.
Jigsaw [Single-Frame | Animation] Multi-Region Rendering


This rollout contains all controls related to defining, managing and animating multiple regions for the Jigsaw modes.
The rollout title will change to include an ACTIVE: prefix and the Single-Frame or Animation token when the
respective mode is selected in the Region Rendering Mode drop-down list (see above).
UPDATE List: Press this button to refresh the ListView.
LOAD/SAVE File...: Click to open a menu with the following options:
LOAD Regions From Disk Preset File...: Selecting this option will open a file open dialog and let you
select a previously saved Regions Preset. Any existing regions will be replaced by the ones from the file.
MERGE Regions From Disk Preset File...: Selecting this option will open a file open dialog and let you
select a previously saved Regions Preset. Any existing regions will be preserved, and the file regions will
be appended to the end of the list.
SAVE Regions To Disk Preset File...: Only enabled if there are valid regions on the list. When selected,
a file save dialog will open and let you save the current regions list to a disk preset for later loading or
merging in the same or different projects.
GET From Camera...: If the current view is a Camera, a list of region definitions stored in the current view's
Camera will be displayed, allowing you to replace the current region list with the stored one. If the current view
is not a Camera view, a warning message will be shown asking you to select a Camera view. If the current view's
Camera does not have any regions stored in it, nothing will happen.
STORE In Camera...: If the current view is a Camera, a list of region definitions stored in the current view's
Camera will be displayed, with the added option to Save New Preset... in a new slot. Alternatively, you can
select any of the previously stored slots to override or update. The Notes text specified in the Notes: field
below will be used to describe the preset. Also, additional information including the number of regions, the
user, machine name, date and time and the MAX scene name will be stored with the preset.
Notes: Enter a description of the current Region set to be used when saving a Preset to disk or camera. When a
preset is loaded, the field will display the notes stored with the preset.
ADD New Region: Creates a new region and appends it to the list. If objects are selected in the scene, the
region will be automatically resized to frame the selection. If nothing is selected, the Region will be set to the
full image size.
CREATE From...: Click to open a context menu with several multi-region creation options:
Create from SCENE SELECTION...: Select one or more objects in the scene and pick this option to create one
region for each object in the selection. Note that regions might overlap or be completely redundant depending
on the size and location of the selected objects - use the OPTIMIZE options below to reduce.
Create from TILES GRID...: Pick this option to create one region for each tile specified in the Tiles rollout.
For example, if the Tiles in X is set to 4 and Tiles in Y is 3, 12 regions resembling the Tile Grid will be created.
Note that once the regions are created, some of them can be merged together, others can be subdivided or split
as needed to distribute regions with different content and size to different machines, providing more flexibility
than the original Tiles mode.
Create from 3DS MAX REGION...: Create a region with the size specified by the 3ds Max Region gizmo.
OPTIMAL FILL Of Empty Areas: After the grid is created, two passes are performed: first a Horizontal
Fill where regions are merged horizontally to produce wider regions, then a Vertical Fill merging regions with
shared horizontal edges. The result is the least amount of tiles and equivalent to manually merging any neighbor
tiles with shared edges in Maya Jigsaw. Thus, it is the top (recommended) option.
HORIZONTAL FILL Of Empty Areas: After creating the grid, a pass is performed over all regions to find
neighbors sharing vertical edges. When two regions share an edge and the same top and bottom corner, they
get merged. This is the equivalent to the Maya Jigsaw behavior, producing wider regions where possible, but
leaving a lot of horizontal edges between tiles with the same width.
VERTICAL FILL Of Empty Areas: After creating the grid, a pass is performed to merge neighboring regions
sharing a horizontal edge with the same left/right corners. The result is the opposite of the Horizontal Fill - a lot
of tall regions.
GRID FILL Of Empty Areas: Takes the horizontal and vertical coordinates of all tiles and creates a grid that
contains them all. No merging of regions will be performed.
OPTIMIZE Regions, Overlap Threshold > 25%: Compare the overlapping of all highlighted regions and if
the overlapping area is more than 25% of the size of the smaller one of the two, combine the two regions to a
single region. Repeat for all regions until no overlapping can be detected.
OPTIMIZE Regions, Overlap Threshold > 50%: Same as the previous option, but with a larger overlap
threshold.
OPTIMIZE Regions, Overlap Threshold > 75%: Same as the previous options, but with an even larger
overlap threshold.
Clone LEFT|RIGHT: Select a single region in the list and click with the Left Mouse Button to clone the region
to the left, or Right Mouse Button to clone to the right. The height will be retained. The width will be clamped
automatically if the new copy is partially outside the screen.
Clone UP|DOWN: Select a single region in the list and click with the Left Mouse Button to clone the region up,
or Right Mouse Button to clone down. The width will be retained. The height will be clamped automatically if
the new copy is partially outside the screen.


FIT to N Objects / Fit Padding Value: Highlight exactly one region in the list and select one or more objects
in the scene, then click with the Left Mouse Button to perform a precise vertex-based Fit to the selection, or
click with the Right Mouse Button to perform a quick bounding-box based Fit to the selection. Click the small
button with the number to the right to select the Padding Percentage to use when fitting in either mode.
TRACK Region...: Left-click to open the Track dialog in Vertex-based mode for the currently selected region
and scene objects. Right-click for Bounding Box-based mode. While you can switch the mode in the dialog
itself, both the radio buttons and the Padding % values will be adjusted for faster access according to the mouse
button pressed.
SELECT | INVERT: Left-click to highlight all regions on the list. Right-click to invert the current selection.
DELETE Regions: Click to delete the highlighted regions on the list.
SET Keyframe: Highlight one or more regions and click this button to set a keyframe with the current region
settings at the current time.
<< PREVIOUS Key: Click to change the time slider to the previous key of the highlighted region(s), in case
there are such keys.
NEXT Key >>: Click to change the time slider to the next key of the highlighted region(s), in case there are
such keys.
DELETE Keyframe: Click to delete the keys (if any) of the highlighted regions. If there is no key on the
current frame, nothing will happen. Use in conjunction with Previous/Next Key navigation to delete actually
existing keys.
Regions ListView: The list view is the main display of the current region settings. It provides several columns
and a set of controls under each column for editing the values on the list:
On # column: Shows a checkbox to toggle a region on and off for rendering, and the index of the region.
X and Y columns: These two columns display the coordinates of the upper left corner of the Region. Note
that internally the values are stored in relative screen coordinates, but in the list they are shown in current
output resolution pixel coordinates for convenience. Changing the output resolution in the Render Setup
dialog and pressing the UPDATE List button will recalculate the pixel coordinates accordingly.
Width and Height columns: These two columns display the width and height of the region in pixels. Like
the upper left corner's X and Y coordinates, they are stored internally as relative screen coordinates and
are shown as pixels for convenience.
Tiles column: Each region can be subdivided additionally horizontally and vertically into a grid of subtiles, each to be rendered by a different network machine. This column shows the number of tiles of the
region, default is 1x1.
Keys column: This column shows the number of animation keys recorded for the region. By default, regions have no animation keys; the column reflects this until the region is animated manually or via the Tracking
option.
Locked column: After Tracking, the region will be locked automatically to avoid accidental changes to
its position and size. You can also lock the region manually if you want to prevent it from being moved
accidentally.
Notes column: This column displays auto-generated or user-defined notes for each region. When a region
is created, it might be given a name based on the object it was fitted to, the original region it was cloned or
split from etc. You can enter descriptive notes to explain what every region was meant for.
UNDO... / REDO...: Most operations performed in the Multi-Region rollout will create undo records automatically. The Undo buffer is saved to disk in a similar form as the presets, and you can undo or redo individual
steps by left-clicking the button, or multiple steps at once by right-clicking and selecting from a list.


HOLD: Not all operations produce a valid undo record. If you feel that the next operation might be dangerous,
you can press the HOLD button to force the creation of an Undo record at the current point to ensure you can
return back to it in case the following operations don't produce desirable results.
SPLIT To Tiles: Pressing this button will split the highlighted region into new regions according to the Tiles
settings, assuming they are higher than 1x1 subdivisions. You can use this feature together with the Tiles
controls to quickly produce a grid of independent regions from a single large region. For example, if you create
a single region with no scene selection, it will have the size of the full screen. Enter Tile values like 4 and 3 and
hit the SPLIT To Tiles button to produce a grid of 12 regions.
MERGE Selected: Highlight two or more regions to merge them into a single region. The regions don't
necessarily have to touch or overlap - the minimum and maximum extents of all regions will be found and they will
be replaced by a single region with that position and size.
Summary Field: This field displays information about the number of regions and sub-regions (tiles), the number
of pixels to be rendered by these regions, and the percentage of pixels that would be rendered compared to the
full image.
Assemble Over... drop-down list: This list provides the assembly compositing options:
Assemble Over EMPTY Background: The regions will be assembled into a new image using a black
empty background with zero alpha.
Compose Over PREVIOUS OUTPUT Image: The regions will be assembled over the previously rendered (or assembled) image matching the current output filename (if it exists). If such an image does not
exist, the regions will be assembled over an empty background.
Compose Over CUSTOM SINGLE Image: The regions will be assembled over a user-defined bitmap
specified with the controls below. The same image will be used on all frames if an animation is rendered.
Compose Over CUSTOM Image SEQUENCE: The regions will be assembled over a user-defined image
sequence specified with the controls below. Each frame will use the corresponding frame from the image
sequence.
Pick Custom Background Image: Press this button to select the custom image or image sequence to be used in
the last compositing modes above. Make sure you specify a network location that can be accessed by the Draft
jobs on Deadline performing the Assembly!
[Single-Frame | Animation] Tile Rendering


Tiles In X / Tiles In Y: These values specify the number of tiles horizontally and vertically. The total number
of tiles (and jobs) to be rendered is calculated as X*Y and is displayed in the UI.
Show Tiles In Viewport: Enables the tile display gizmo.
Tile Pixel Padding: This value defines the number of pixels to overlap between tiles. By default it is set to
0, but when rendering Global Illumination, it might be necessary to render tiles with significant overlapping to
avoid artifacts.
Re-Render User-Defined Tiles: When checked, only user-defined tiles will be submitted for re-rendering. Use
the [Specify Tiles To Re-render...] check-button to open a dialog and select the tiles to be rendered.
Specify Tiles To Re-render: When checked, a dialog to select the tiles to be re-rendered will open. To close
the dialog, either uncheck the button or press the [X] button on the dialog's title bar.
Enable Blowup Mode: If enabled, tile rendering will work by zooming in on the region and rendering it at a
smaller resolution. Then that region is blown up to bring it to the correct resolution. This has been known to
help save memory when rendering large high resolution images.
Submit All Tiles As A Single Job: By default, a separate job is submitted for each tile (this allows for tile
rendering of a sequence of frames). For easier management of single frame tile rendering, you can choose to
submit all the tiles as a single job.
Submit Dependent Assembly Job: When rendering a single tile job, you can also submit a dependent assembly
job to assemble the image when the main tile job completes.
Use Draft For Assembly: If enabled, Draft will be used to assemble the images. Note that you'll need a Draft
license from Thinkbox.
Region Rendering


When enabled, only the specified region will be rendered and depending on the region type selected, it can be cropped
or blown up as well. If the Enable Distributed Tiles Rendering checkbox is checked, it will be unchecked. This
option REPLACES the Crop option in the Render mode drop-down list in the 3ds Max UI. In other words, the
3ds Max option does not have to be selected for Region Rendering to be performed on Deadline. The region can be
specified either using the CornerX, CornerY, Width and Height spinners, or by getting the current region from the
active viewport. To do so, set the Render mode drop-down list to either Region or Crop, press the Render icon and
drag the region marker to specify the desired size. Then press ESC to cancel and press the Get Region From Active
View to capture the new values.
Misc Tab
Quicktime Generation From Rendered Frame Sequence

Create a Quicktime movie from the frames rendered by a 3ds Max job. See the Quicktime documentation for more
information on the available options.
Render To Texture


This option enables texture baking through Deadline. Use the Add, Remove, and Clear All buttons to add and remove
objects from the list of objects to bake.
One Object Per Task: If enabled, each RTT object will be allocated to an
individual task thereby allowing multiple machines to carry out RTT processing simultaneously.
Batch Submission

Use Data from 3ds Max Batch Render: This checkbox enables Batch Submission using the 3ds Max Batch
Render dialog settings. If checked, a single MASTER job will be sent to Deadline which in turn will spawn
all necessary BATCH jobs.
Open Dialog: This button opens the 3ds Max Batch Render dialog in Version 8 and higher.
Update Info: This button reads the 3ds Max Batch Render dialog settings and displays the number of enabled
vs. defined Views.
Sanity Check
The 3ds Max Sanity Check script defines a set of functions to be called to ensure that the scene submission does not
contain typical errors like wrong render view and frame range settings, incorrect output path, etc.
The Sanity Check is enabled by the Run Sanity Check Automatically Before Submission checkbox in the User Options
group of controls in the Submit To Deadline (3dsmax) dialog. You can also run the Sanity Check manually at any
time by clicking the Run Now! button.


The dialog contains the following elements:


The upper area (Error Report) lists the problems found in the current scene.
The lower area (Feedback Messages) lists the actions the Sanity Check performs and gives feedback to the user.
The latest message is always on top.
Between the two areas, there is a summary text line listing the total number of errors and a color indicator of the
current Sanity Check state. When red, the Sanity Check will not allow a job submission to be performed.
The Error Report
The left column of the Error Report displays a checkbox and the type of the error. The checkbox determines whether
the error will be taken into account by the final result of the check. Currently, there are 3 types of errors:
FATAL: The error cannot be fixed automatically and requires manual changes to the scene itself. A job submission with such an error would be pointless. The state of the checkbox is ignored and considered always
checked.
Can Be Fixed: The error can be fixed automatically or manually. If the checkbox is active, the error contributes
to the final result. If unchecked, the error is ignored and handled as a warning.
Warning: The problem might not require fixing, but could be of importance to the user. It is not taken into
account by the final result (the state of the checkbox is ignored and considered always unchecked).
Repairing Errors
Right-clicking an Error Message in the Error Report window will cause an associated repair function to be executed
and/or a Report Message to be output in the Feedback Messages window. This difference was caused by the switch
to DotNet controls which handle double-clicks as checked events, changing the checkbox state in front of the error
instead.
Updating the Error Report
You can rerun/update the Sanity Check in one of the following ways:

Clicking the dialog anywhere outside of the two message areas will rerun the Sanity Check and update all
messages.
Double-clicking any Message in the Feedback Messages window will rerun the Sanity Check and update all
messages.
Repairing an error by double-clicking will also automatically rerun the Sanity Check.
Pressing the Run Now! button in the Submit To Deadline dialog will update the Sanity Check.
FATAL Sanity Checks
These are errors that must be fixed manually before the job can be submitted.
Message: The scene does not contain ANY objects!
Description: The scene is empty and should not be sent to Deadline.
Fix: Load a valid scene or create/merge objects, then try again.

Message: Maxwell is the renderer and the current view is NOT a Camera.
Description: The Maxwell renderer must render through an actual camera and will fail through a viewport.
Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.

Message: The scene contains objects or groups with the same name as a camera!
Description: The scene contains objects or groups with a duplicate name to a camera, which could result in an incorrect object being used as the camera.
Fix: Ensure you remove any duplicate named objects from your scene.

Message: Maxwell is the renderer and the Render Time Output is set to a SINGLE FRAME! (Check is currently disabled in SMTD)
Description: Maxwell has an issue with single frame rendering.
Fix: Double-clicking the error message will change the Rendering Output Time to animation with just the current frame.

Message: Render Output Path length exceeds 255 characters!
Description: Ensure the render output file save path is less than 255 characters in length.
Fix: Double-clicking the error message will open the Render Scene Dialog for you to manually shorten the path length.

Message: Render Elements Output Path length exceeds 255 characters!
Description: Ensure any Render Element file save path lengths are less than 255 characters in length.
Fix: Double-clicking the error message will open the Render Scene Dialog for you to manually shorten the RE path length.

Message: Duplicate Render Elements saving to same File Found!
Description: One or more Render Elements are saving to an identical file path and file name.
Fix: Double-clicking the error message will open the Render Scene Dialog for you to manually resolve the duplication.

Message: Scene Object(s) contain names > 255 characters!
Description: One or more objects in the scene have an object name which is greater than 255 characters in length, which will crash Max.
Fix: Shorten the character length of all the objects in your scene to ensure stability.

Message: Corrupt Group(s) detected in your Scene!
Description: One or more objects in your scene are a group head but have no child members!
Fix: Double-clicking the error message will automatically have these corrupt nodes deleted from the scene. Results are printed to the Sanity Check Window.

Message: Multi-Region Rendering Requested, But No Active Regions Found!
Description: Jigsaw Multi-Region Rendering has been enabled, but there are NO active regions enabled in the SMTD Tiles Tab UI.
Fix: Ensure at least 1 region is active in the Jigsaw Multi-Region Rendering UI in the Tiles tab of SMTD.

Message: V-Ray Save Raw Image File is Enabled, but Raw Image File Path is Empty!
Description: V-Ray VFB Save Raw Image File is enabled, but NO save file path has been declared!
Fix: Double-clicking the error message will open the Render Scene Dialog for you to manually enter a valid file save path.

Message: V-Ray Save Separate Render Channels is Enabled, but Separate Render Channels File Path is Empty!
Description: V-Ray VFB Save Separate Render Channels is enabled, but NO save file path has been declared!
Fix: Double-clicking the error message will open the Render Scene Dialog for you to manually enter a valid file save path.

Message: V-Ray VFB Save Raw Image File - [Generate preview] should be Disabled!
Description: V-Ray VFB [Generate Preview] must be disabled for network rendering.
Fix: Double-clicking the error message will disable the [Generate Preview] button in the VFB.

Message: V-Ray VFB - [Region render] button should be Disabled!
Description: V-Ray VFB [Region render] must be disabled for network rendering.
Fix: Double-clicking the error message will disable the [Region render] button in the VFB.

Message: V-Ray VFB - [Track mouse while rendering] button should be Disabled! (Check is currently disabled in SMTD)
Description: The V-Ray VFB [Track mouse while rendering] button must be disabled for network rendering.
Fix: Double-clicking the error message will disable the [Track mouse while rendering] button in the VFB.

Message: V-Ray RE: [Alpha, Reflection, Refraction] or [Save alpha] requires Draft Tile Assembler. NOT supported with TA.
Description: When using Jigsaw Single-Frame Tile Rendering with V-Ray REs such as Alpha, Reflection, Refraction OR [Save alpha] via the VFB, ensure you use the Draft Tile Assembler, which is able to support the higher bit depths created by these REs.
Fix: Double-clicking the error message will enable Draft as the Tile Assembler.

Fixable Sanity Checks


The following Sanity Checks can be automatically fixed before the job is submitted.


Message: The current Scene Name is Untitled.
Description: The scene has never been saved to a MAX file. While it is possible to submit an untitled scene to Deadline, it is not a good practice.
Fix: Double-click the error message to open a Save As dialog and save to disk.

Message: The current view is NOT a camera.
Description: The active viewport is not a camera viewport.
Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.

Message: The Render Time Output is set to SINGLE FRAME!
Description: While it is ok to send a single frame to Deadline, users are sending animations in 99% of the cases.
Fix: Double-click the error message to set the Render Time Output to Active Time Segment. The Render Dialog will open so you can check the options and set to Range or Frames instead.

Message: The Render Output Path appears to point at a LOCAL DRIVE!
Description: While it is technically possible to save locally on each Slave, this is a bad idea - all Slaves should write their output to a central location on the network. Currently, disks C:, D: and E: are considered local and will be tested against the output path.
Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.

Message: The Render Output File Name ends with a DIGIT - trailing numbers might fail.
Description: The name to be saved to ends with one, two or three digits. Rendering to this file name will append 4 more digits and make loading sequential files in other applications hard or impossible. This check is performed only when the type is not AVI or MOV, and will ignore 4 trailing digits which will be replaced by 3dsmax correctly when rendering to sequential files.
Fix: Double-click the error message to add an underscore _ to the end of the file name; for example, z:\temp\test123.tga will be changed to z:\temp\test123_.tga.

Message: The Render Output will not be saved to a file.
Description: No renders will be saved as the Render Scene Dialog checkbox is currently disabled.
Fix: Double-click the error message to open the Render Dialog and enable the Save File checkbox.

Message: The Distributed Rendering option is enabled for this renderer.
Description: Check if Distributed Rendering is enabled for the MR or V-Ray renderer.
Fix: Double-click the error message to disable Distributed Rendering.

Message: Workstation Mode must be enabled to use V-Ray Distributed Rendering.
Description: 3dsMax must use a workstation license to allow Distributed Rendering to work when it is being offloaded onto the farm.
Fix: Double-click the error message and Workstation Mode will be enabled in SMTD.

Message: The Render Time Output is NOT set to single frame, and Remove Filename Padding is enabled!
Description: When rendering animations, you should allow filename padding to ensure an image sequence is created during rendering.
Fix: Double-clicking the error message will change the Rendering Output Time to SINGLE FRAME.

Message: The current Renderer is Krakatoa and Particle Cache is ON!
Description: Particle and Lighting Cache should be disabled during SMTD submission to the Deadline queue.
Fix: Double-click the error message to set the PCache & LCache to be disabled.

Message: One or more Render Element Save File Paths are EMPTY! (V-Ray? - Disable the Individual RE)
Description: Ensure that a Render Element Output File has been selected for each Render Element! If using the V-Ray Frame Buffer and ALL REs have been Disabled, then IGNORE this Sanity Check!
Fix: Double-clicking the error message will open the Render Scene Dialog for you to manually resolve the issue, or it can be safely ignored.

Message: Camera Match Background Image(s) in your Scene. Right-click to REMOVE these ref. bitmaps!
Description: Surplus camera match background images in your scene cause unnecessary bitmap references in your scene file.
Fix: Double-clicking the error message will delete any background image in ALL cameras in your scene file.

Message: Alpha Channel will NOT be stored if saving *.tga file @ 16/24bit depth! Select 32bit for Alpha!
Description: Ensure you select 32bit in the TGA image plugin file format options to ensure an alpha channel is stored in the TGA file.
Fix: Double-clicking the error message will open the Render Scene Dialog for you to manually configure the bit depth of the TGA image file to be saved.

Warnings
The following Sanity Checks are simply warnings.
Message: The Render Output Path is NOT DEFINED!
Description: No frames will be saved to disk. This is allowed if you want to output render elements only.
Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.

Message: The Render Output is set to a MOVIE format.
Description: The file extension is set to an AVI or MOV format. In the current version of Deadline, this would result in a sequence of single frame MOV files rendered by separate slaves. In the future, the behaviour might be changed to render a single MOV or AVI file on a single slave as one Task.
Fix: Double-click the error message to open the Render Dialog and select a single frame output format, then double-click again to retest.

Message: Not rendering final image (GI) so Restart Renderer should be disabled, and Machine Limit set to 1.
Description: Don't render final image is enabled, so Restart Renderer should be disabled and the Machine Limit should be set to 1 in SMTD.
Fix: Double-click the error message and Restart Renderer will be disabled and Machine Limit enabled and set to 1 in SMTD.

Message: Restart Renderer Between Frames is disabled and V-Ray or Brazil is the selected renderer.
Description: The V-Ray & Brazil renderers need Restart Renderer to be enabled to ensure memory levels are purged during rendering.
Fix: Double-click the error message to enable Restart Renderer in the SMTD settings.

Message: Viewport is currently locked, which can result in incorrect renders with Deadline.
Description: The locked viewport setting in 3dsMax 2009-2014 is ignored in the SDK but is fixed in 3dsMax 2015 onwards.
Fix: Double-click the error message to disable the Locked Viewport (padlock) in the Render Scene Dialog.

Message: Tile Rendering is enabled and the V-Ray VFB is currently on.
Description: Unexpected results can occur when the V-Ray VFB is enabled and you are Tile Rendering. Consider where any Render Elements may be saving to, including the use of the VFB Split Channels and RAW output.
Fix: Double-click the error message to disable the V-Ray VFB output checkbox.

This list will be extended to include future checks and can be edited by 3rd parties by adding new definitions and
functions to the original script. Documentation on extending the script will be published later. Please email suggestions
for enhancements and additional test cases to Deadline Support.

9.2.2 V-Ray/Mental Ray off-load DBR


You can offload a V-Ray or Mental Ray DBR job to Deadline by enabling the Distributed Rendering option in your
V-Ray or Mental Ray settings, and by enabling the V-Ray/Mental Ray DBR checkbox in the submission dialog (under
the Render tab). With this option enabled, a job will be submitted with its task count equal to the number of Slaves
you specify, and it will render the current frame in the scene file.
The slave that picks up task 0 will be the master, and will wait until all other tasks are picked up by other slaves.
Once the other tasks have been picked up, the master will update its local V-Ray or Mental Ray config file with the
names of the machines that are rendering the other tasks. It will then start the distributed render by connecting to the
other machines. Note that the render will not start until ALL tasks have been picked up by a slave.
It is recommended to set up V-Ray DBR or Mental Ray DBR for 3ds Max and verify it is working correctly prior
to submitting a DBR off-load job to Deadline. RTT (Render To Texture) is not supported with distributed bucket
rendering. If you are running multiple Deadline slaves on one machine, it is not supported for two or more of those
slaves to concurrently pick up different DBR jobs, whether as master or slave.
Notes for V-Ray DBR:
You MUST have the Force Workstation Mode option enabled in the submission dialog (under the Render tab).
This means that the master will use up a 3ds Max license. If you don't want to use a 3ds Max license, you can
submit to the 3ds Command plugin instead.
Ensure V-Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
You must have the Distributed Rendering option enabled in your V-Ray settings under the Settings tab.
Ensure Save servers in the scene (Save hosts in the scene in V-Ray v2) option in V-Ray distributed rendering
settings is DISABLED as otherwise it will ignore the vray_dr.cfg file list!
Ensure Max servers value is set to 0. When set to 0 all listed servers will be used.
It is recommended to disable the Use local host checkbox to reduce network traffic on the master machine
when using a large number of slaves (5+). If disabled, the master machine only organises the DBR process,
sending rendering tasks to the Deadline slaves. This is particularly important if you intend to use the V-Ray v3+
Transfer missing assets feature. Note that the Windows 7 OS has a limitation of a maximum of 20 other machines
concurrently connecting to the master machine.
V-Ray v3.00.0x has a bug in DBR: when Use local host is unchecked, it still demands a render node license.
This is resolved in newer versions of V-Ray. Please contact Chaos Group for more information.
The slaves will launch the V-Ray Spawner executable found in the 3ds Max root directory. Do NOT install the
V-Ray Spawner as a service on the master or slave machines. Additionally, Drive Mappings are unsupported
when running as a service.
The vray_dr.cfg file in 3ds Max's plugcfg directory must be writeable so that the master machine can
update it. This is typically located in the user profile directory, in which case it will be writeable already.
Chaos Group recommends that each machine to be used for DBR has previously rendered at least one other 3ds
Max job prior to trying DBR on the same machine.
Ensure all slaves can correctly access any mapped drives or resolve all UNC paths to obtain any assets required
by the 3ds Max scene file to render successfully. Use the Deadline Mapped Drives feature to ensure the necessary
drive mappings are in place.
Default lights are not supported by Chaos Group in DBR mode and will not render.
Ensure you have sufficient V-Ray DR licenses if processing multiple V-Ray DBR jobs through Deadline concurrently. Use the Deadline Limits feature to limit the number of licenses being used at any time.
Ensure the necessary V-Ray executables & TCP/UDP ports have been allowed to pass-through the Windows
Firewall. Please consult the V-Ray user manual for specific information.


V-Ray in 3ds Max does NOT currently support dynamically adding or removing DBR slaves from a DBR render
that has already started on the master slave.
Notes for Mental Ray DBR:
Ensure Mental Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
You must have the Distributed Render option enabled in your Mental Ray settings under the Processing tab.
The Mental Ray Satellite service must be running on your slave machines. It is installed by default during the
3ds Max 2014 or earlier installation. Note that ADSK changed this default from 3dsMax 2015 onwards and the
Mental Ray Satellite Service is installed as part of the install process but is NOT automatically started, so you
will need to start it manually the very first time. See this AREA blog post about Distributed Bucket Rendering
in 3ds Max 2015.
The max.rayhosts file must be writeable so that the master machine can update it. Its location is different for
different versions of 3ds Max:
2010 and earlier: It will be in the mentalray folder in the 3ds Max root directory.
2011 and 2012: It will be in the mentalimages folder in the 3ds Max root directory.
2013 and later: It will be in the NVIDIA folder in the 3ds Max root directory.
Ensure the Use Placeholder Objects checkbox is enabled in the Translator Options rollout of the Processing tab. When placeholder objects are enabled, geometry is sent to the renderer only on demand.
Ensure Bucket Order is set to Hilbert in the Options section of the Sampling Quality rollout of the
Renderer tab. With Hilbert order, the sequence of buckets to render uses the fewest number of data transfers.
Contour shading is not supported with distributed bucket rendering.
Autodesk Mental Ray licensing in 3ds Max is restricted. Autodesk says Satellite processors allow any owner
of a 3ds Max license to freely use up to four slave machines (with up to four processors each and an unlimited
number of cores) to render an image using distributed bucket rendering, not counting the one, two, or four
processors on the master system that runs 3ds Max. Mental Ray Standalone licensing can be used to go beyond
this license limit. Use the Deadline Limits feature to limit the number of licenses being used at any time if
required.
Ensure the necessary Mental Ray executables & TCP/UDP ports have been allowed to pass-through the Windows Firewall. Please consult the Autodesk 3ds Max user manual for specific information.

9.2.3 Plug-in Configuration


You can configure the 3dsmax plugin settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the 3dsmax plugin from the list on the left.


3ds Max Render Executables


3ds Max Executable: The path to the 3ds Max executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.
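For example, a typical multi-version configuration might list the default install locations on separate lines (these
are the standard default paths; adjust them to match the versions actually installed on your render nodes):

    C:\Program Files\Autodesk\3ds Max 2014\3dsmax.exe
    C:\Program Files\Autodesk\3ds Max 2015\3dsmax.exe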
3ds Max Design Render Executables
3ds Max Design Executable: The path to the 3ds Max Design executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
Render Options
Alternate Plugin ini File: Location of alternate plugin ini file.
Fail On Existing 3dsmax Process: Prevent Deadline from rendering when 3dsmax is already open.
Run Render Sanity Check: If enabled, Deadline will do a quick sanity check with 3dsmaxcmd.exe prior to
rendering to ensure 3dsmax is properly set up for network rendering.
Kill ADSK Comms Center Process: If enabled, Deadline will kill the Autodesk Communications Center process
if it's running during network rendering.
Disable Saving Output To Alternate File Name: If enabled, Deadline won't try to rename the output file(s) if
it is unable to save the output to its default file name.
Timeouts
Timeout For Loading 3dsmax: Maximum time for 3dsmax to load, in seconds.
Timeout For Starting A Job: Maximum time for 3dsmax to start a job, in seconds.
Timeout For Progress Updates: Maximum time before progress update times out, in seconds.
V-Ray DBR and Mental Ray Satellite Rendering


Use IP Addresses: If offloading a V-Ray DBR or Mental Ray Satellite render to Deadline, Deadline will update
the appropriate config file with the host names of the machines that are running the V-Ray Spawner or Satellite
service. If this is enabled, the IP addresses of the machines will be used instead.

9.2.4 Firewall Considerations


Here is a non-exhaustive list of specific 3dsMax related application executables which should be granted access to
pass through the Windows Firewall for all applicable policy scopes (Windows - domain, private, public) and both
inbound & outbound rules (where <maxroot> is the 3dsMax install directory):
<maxroot>/3dsmax.exe
<maxroot>/3dsmaxcmd.exe
<maxroot>/maxadapter.adp.exe
<maxroot>/vrayspawnerYYYY.exe where YYYY is the year, such as 2015 (only applicable if V-Ray is
installed)
<maxroot>/python/python.exe
<maxroot>/python/pythonw.exe
Autodesk Communication Center (InfoCenter) Path (dependent on 3dsMax version being used):
3dsMax 2009-2010: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr1.exe
3dsMax 2011: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr\lib\WSCommCntr2.exe
3dsMax 2012: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr3\lib\WSCommCntr3.exe
3dsMax 2013-2015: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr4\lib\WSCommCntr4.exe
It is recommended to always start 3dsMax for the very first time with Administrative rights to ensure the application is
fully initialized correctly. This can also be achieved by right-clicking the 3dsmax.exe application and selecting Run
as administrator.

9.2.5 Integrated Submission Script Setup


The following procedures describe how to install the integrated Autodesk 3ds Max submission script. The integrated
submission script allows for submitting 3ds Max render jobs to Deadline directly from within the 3ds Max editing
GUI. The integrated render job submission script and the following installation procedure has been tested with 3ds
Max versions 2010 and later (including Design editions).
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/3dsmax/Installers
Manual Installation of the Submission Script
Copy [Repository]/submission/3dsmax/Client/Deadline3dsMaxClient.mcr to [3ds Max Install Directory]/MacroScripts. If you don't have a MacroScripts folder in your 3ds Max install directory, check to
see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsMaxClient.mcr file there if you do.
Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms.

Launch 3ds Max, and find the new Deadline menu.


RPManager Script Setup


To install the 3ds Max integrated submission script in RPManager, just follow these steps:
Create a Deadline7 folder in [maxdir]\scripts\RPManager
Copy [repo]\submission\3dsmaxRPM\Client\Deadline3dsMaxRPMClient.ms to
[maxdir]\scripts\RPManager\Deadline7\Deadline3dsMaxRPMClient.ms

In RPManager, select Customize -> Preferences to open the preferences window


In the Network Manager section, choose Custom Submit in the drop down, and then choose the Deadline3dsMaxRPMClient.ms script you copied over


Click OK to close the preferences, and then click on the Network tab to see the submitter

9.2.6 Advanced Features For Technical Directors


MAXScript Interface
When running a MAXScript job through Deadline, there is an interface called DeadlineUtil which you can use to get
information about the job being rendered. The API for the interface between MAXScript and Deadline is as follows:
Functions


string GetAuxFilename( int index ) - Gets the file with the given index that was submitted with the job.
string GetJobInfoEntry( string key ) - Gets a value from the plugin info file that was submitted with the job, and returns an empty string if the key doesn't exist.
string GetOutputFilename( int index ) - Gets the output file name for the job at the given index.
string GetSubmitInfoEntry( string key ) - Gets a value from the job info file that was submitted with the job, and returns an empty string if the key doesn't exist.
int GetSubmitInfoEntryElementCount( string key ) - If the job info entry is an array, this gets the number of elements in that array.
string GetSubmitInfoEntryElement( int index, string key ) - If the job info entry is an array, this gets the element at the given index.
void FailRender( string message ) - Fails the render with the given error message.
void LogMessage( string message ) - Logs the message to the slave log.
void SetProgress( float percent ) - Sets the progress of the render in the slave UI.
void SetTitle( string title ) - Sets the render status message in the slave UI.
void WarnMessage( string message ) - Logs a warning message to the slave log.

Properties

int CurrentFrame - Gets the current frame.
int CurrentTask - Gets the current task ID.
string JobsDataFolder - Gets the local folder on the slave where the Deadline job files are copied to.
string PluginsFolder - Gets the local folder on the slave where the Deadline plugin files are copied to.
string SceneFileName - Gets the file name of the loaded 3ds Max scene.
string SceneFilePath - Gets the file path of the loaded 3ds Max scene.
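
As a minimal sketch of how a MAXScript job might use this interface (the plugin info key MyCustomMode below is purely hypothetical and only included to illustrate GetJobInfoEntry), the body of a submitted script could look like this:

(
    local du = DeadlineUtil
    if du == undefined then
        format "Not running under Deadline - DeadlineUtil is unavailable, so output goes to the Listener.\n"
    else
    (
        du.SetTitle "MAXScript Job Example"
        du.LogMessage ("Scene: " + du.SceneFileName)
        du.LogMessage ("Job data folder: " + du.JobsDataFolder)
        -- GetJobInfoEntry returns an empty string if the key was not submitted with the job
        local mode = du.GetJobInfoEntry "MyCustomMode"
        if mode != "" do du.LogMessage ("Custom mode: " + mode)
        du.SetProgress 0.0
        -- ...the actual per-task work goes here, typically driven by du.CurrentFrame and du.CurrentTask...
        du.SetProgress 100.0
        du.LogMessage ("Finished task " + (du.CurrentTask as string))
    )
)

Checking whether DeadlineUtil is defined, as above, also makes it possible to test the same script in a regular 3ds Max session before submitting it as a MAXScript job.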

Submitter's Sticky Settings and Factory Defaults


The latest version of the Submit Max To Deadline script allows the user to control the stickiness of most User Interface
controls and, in the case of non-sticky settings, the defaults to be used. In previous versions of SMTD, both the
stickiness and the defaults were hard-coded.
Overview
Two INI files located in the Repository in the folder \submission\3dsmax control the stickiness and the defaults:
SubmitMaxToDeadline_StickySettings.ini - this file can be used to define which controls in the SMTD UI will
be stored locally in an INI file (sticky) and which will be reset to defaults after a restart of the Submitter.
SubmitMaxToDeadline_Defaults.ini - this file can be used to define the default settings of those controls set to
non-sticky in the other file.
In addition, a local copy of the SubmitMaxToDeadline_StickySettings.ini file can be saved in a user's application
data folder. This file will OVERRIDE the stickiness settings in the Repository and can contain a sub-set of the
definitions in the global file.
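
For illustration only (the section and key names below are hypothetical - the real keys correspond to SMTD UI setting names, so copy them from the shipping INI files rather than inventing them), a pair of files might look like this:

SubmitMaxToDeadline_StickySettings.ini:

[JobSettings]
SubmitSceneMode=false
ChunkSize=true

SubmitMaxToDeadline_Defaults.ini:

[JobSettings]
SubmitSceneMode=1

With definitions like these, ChunkSize would remain sticky (its last used value is restored when the Submitter reopens), while SubmitSceneMode would reset to the default of 1 defined in the Defaults file every time the Submitter is opened.
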
Details
When SMTD is initializing, it will perform the following operations:
1. The SMTDSettings Struct will be initialized to the factory defaults of all settings.
2. Each UI setting will be initially assumed to be sticky.
3. The global Stickiness definition file is searched for a key matching the current UI setting's name.

If the key is set to false, the setting is not sticky.


If the key is set to anything but false, the setting is sticky.
If the key does not exist, the stickiness still defaults to the initial value of true.
4. A local Stickiness definition file is searched for a key matching the current UI setting's name.
If the key is set to false, the setting is not sticky and overrides whatever was found in the global file.
If the key is set to anything but false, the setting is sticky, overriding whatever was found in the
global file.
If the key does not exist in the local file, the last known value (initial or from the global file) remains
in effect.
5. At this point, SMTD knows whether the setting is sticky or not. Now it gets the global default value:
If a matching key exists in the file SubmitMaxToDeadline_Defaults.ini, the setting is initialized to its
value.
If no matching key exists in the global defaults file, the original factory default defined in the SMTDStruct definition will remain in effect.
If the setting is sticky, SMTD loads the last known value from the local INI file. If the value turns out
to be invalid or not set, it uses the default instead.
If the setting is not sticky, the default loaded from the global defaults file or, if no such default was
loaded, the factory default, will be assigned to the setting.
When the User Interface is created, the stickiness info from the local and global files will determine whether a star (*)
character will be added to the control's name, reflecting the current stickiness settings.
Using this new feature, a facility can customize the submitter globally to default to the preferred settings and keep
certain settings sticky so their values can be determined by the artists. In addition, single users can override the
company-wide stickiness settings using a local file if they feel their workflow requires a different setup.
Custom Job Name Controls
There are two ways to customize the job name. You can use keys in the job name that are replaced with actual values
(like $scene), or you can have the job name be generated from a list of shows, shots, etc. You will then be able to use
the [>>] button to the right of the Job Name field to select these custom job names.
Generate Job Name From Keys
There is a file in the ..\submission\3dsmax\Main\ folder in your Repository called SubmitMaxToDeadline_NameFormats.ini. In addition, a local copy of the SubmitMaxToDeadline_NameFormats.ini file can be saved
in a user's application data folder. This file will OVERRIDE the name formats in the Repository and can contain a
sub-set of the definitions in the global file. This file will contain some key-value pairs such as:
$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$maxversion=(((maxVersion())[1]/1000) as string)

The key to the left of = is the string that will be replaced in the job name. The value to the right of the = is the maxscript
code that is executed to return the replacement string (note that the value must be returned as a string). So if
you use $scene in your job name, it will be swapped out for the scene file name. You can append additional key-value
pairs or modify the existing ones as you see fit.
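
For example, the $camera key used in the optional JobNames.ini sample below is not defined in the listing above; if your copy of the NameFormats file does not already include it, a line like the following could be appended (a sketch only - it assumes the active viewport is a camera view at submission time, and falls back to a placeholder string otherwise):

$camera=(if viewport.getCamera() == undefined then "nocam" else (viewport.getCamera()).name)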


By default, the [>>] button will already have $scene or $outputfilename as selectable options. You can then create an
optional JobNames.ini file in the 3dsmax submission folder, with each line representing an option. For example:
$scene
$outputfilename
$scene_$camera_$username
$maxversion_$date

These options will then be available for selection in the submission dialog. This allows for all sorts of customization
with regards to the job name.
Generate Job Name For Shows
This advanced feature allows the addition of custom project, sequence, shot and pass names to the [>>] list to the
right of the Job Name field. Producers in larger facilities could provide full shot lists via a central set of files in the
Repository to allow users to pick existing shot names, ensuring consistent naming conventions independent of
the 3ds Max scene naming.
To create a new set of files, go to the ..\submission\3dsmax\Main\ folder in your Repository and create the following
files:
Projects.ini - This file describes the projects currently available for Custom Job Naming. Each Project is defined as a
Category inside this file, with two keys: Name and ShortName.
For example:
[SomeProject]
Name=Some Project in 3D
ShortName=SP
[AnotherProject]
Name=Another Project
ShortName=AP

SomeProject.ini - This is a file whose name should match exactly the Category name inside the file Projects.ini and
contains the actual sequence, shot and pass description of the particular project. One file is expected for each project
definition inside the Projects.ini file.
For example:
[SP_SS_010]
Beauty=true
Diffuse=true
Normals=true
ZDepth=true
Utility=true
[SP_SS_150]
Beauty=true
Diffuse=true
Utility=true
[SP_SO_020]
Beauty=true
[SP_SO_030]
Beauty=true

The Submitter will parse this file and try to collect the Sequences by matching the prefix of the shot names. For example,
in the above file, it will collect two sequences - SP_SS and SP_SO - and build a list of shots within each sequence,
then also build a list of passes within each shot.


Then, when the [>>] button is pressed, the context menu will contain the name of each project and will provide a
cascade of sub-menus for its sequences, shots and passes.

If you selected the entry SomeProject>SP_SS>SP_SS_150>Diffuse, the resulting Job Name will be SP_SS_150_Diffuse.

You can enter as many projects into your Projects.ini file as you want and provide one INI file for each project
describing all its shots and passes. If an INI file is missing, no data will be displayed for that project.
Custom Comment Controls
Just like job names, you can use keys in the comment field that are replaced with actual values (like $scene). There is a
file in the ..\submission\3dsmax\Main\ folder in your Repository called SubmitMaxToDeadline_CommentFormats.ini.
In addition, a local copy of the SubmitMaxToDeadline_CommentFormats.ini file can be saved in a user's application
data folder. This file will OVERRIDE the comment formats in the Repository and can contain a sub-set of the definitions in the global file. This file will contain some key-value pairs such as:
$default=("3ds Max " + SMTDFunctions.getMaxVersion() + " Scene Submission")
$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$maxversion=(((maxVersion())[1]/1000) as string)

The key to the left of = is the string that will be replaced in the comment. The value to the right of the = is the maxscript
code that is executed to return the replacement string (note that the value must be returned as a string). So if
you use $scene in your comment, it will be swapped out for the scene file name. You can append additional key-value
pairs or modify the existing ones as you see fit.
By default, the [>>] button will already have $default. You can then create an optional Comments.ini file in the 3dsmax
submission folder, with each line representing an option. For example:


$default
$scene
$outputfilename
$scene_$camera_$username
$maxversion_$date

These options will then be available for selection in the submission dialog. This allows for all sorts of customization
with regards to the comment field.
Auto-Suggest Category and Priority Mechanism
This feature has been implemented to help Producers suggest categories and priorities based on Shots and Sequence
signatures which are part of the 3ds Max Scene Name.
This feature DOES NOT ENFORCE the Category and Priority for the job, it only suggests a value based on project
guidelines - the Category and Priority can be changed manually after the suggestion.
To use this feature, you have to edit the file called SubmitMaxToDeadline_CategoryPatterns.ms located in the Repository in the \submission\3dsmax folder. As a shortcut, you can press the button Edit Patterns... in the Options tab of
the Submitter - the file will open in the built-in MAXScript Editor.
The file defines a global array variable called SMTD_CategoryPatterns which will be used by the Submitter to perform
pattern matching on the Job Name and try to find a corresponding Category and optionally a priority value in the array.
The array can contain one or more sub-arrays, each one representing a separate pattern definition.
Every pattern sub-array consists of four array elements:
The first element is an array containing zero, one or more string patterns using * wildcards. These strings will
be used to pattern match the Job Name. If it matches, it will be considered for adding to the Category and for
changing the Priority. If the subarray is empty, all jobs will be considered matching the pattern.
The second element is also an array containing similar pattern strings. These strings will be used to EXCLUDE
jobs matching these patterns from being considered for this Category and Priority. If the subarray is empty, no
exclusion matching will be performed.
The third element contains the EXACT name (Case Sensitive!) of the category to be set if the Job Name matches
the patterns. If the category specified here does not match any of the categories defined via the Monitor, no action
will be performed.
The fourth element specifies the Priority to give the job if it matches the patterns. If the value is -1, the existing
priority will NOT be changed.
The pattern array can contain any number of pattern definitions. The higher a definition is on the list, the higher its
priority - if a Job Name matches multiple pattern definitions, only the first one will be used.
The pattern matching will be performed only if the checkbox Auto-Suggest Job Category and Priority in the Options
Tab is checked. It will be performed when the dialog first opens or when the Job Name is changed.
An example:
Let's assume that a VFX facility is working on a project called SomeProject with multiple sequences labelled
AB, CD and EF.
The network manager has created categories called SomeProject, AB_Sequence, CD_Sequence,
EF_Sequence and High_Priority via the Monitor.
The Producers have instructed the Artists to name their 3ds Max files SP_AB_XXX_YYY_ where SP stands
for SomeProject, AB is the label of the sequence, followed by the scene and shot numbers.


Now we want to set up the Submitter to suggest the right Categories for all Max files sent to Deadline based on
these naming conventions.
We want jobs from the CD sequence to be set to Priority of 60 unless they are from the scene with number
007.
We want jobs from the AB sequence to be set to Priority of 50
We don't want to enforce any priority on jobs for sequence EF.
Also we want shots from the AB sequence with scene number 123 and the EF sequence with shot
number 038 to be sent at the highest priority and added to the special High Priority category for easier filtering
in the Monitor.
Finally we want to make sure that any SP project files that do not contain a sequence label are added to the
general SomeProject category with lower priority.
To implement these rules, we could create the following definitions in the SubmitMaxToDeadline_CategoryPatterns.ms - press the Edit Patterns... button in the Options tab to open the file:
SMTD_CategoryPatterns = #(
#(#("*AB_123*","*EF_*_038*"),#(),"High_Priority",100),
#(#("*AB_*"),#(),"AB_Sequence",50),
#(#("*CD_*"),#("*CD_007_*"),"CD_Sequence",60),
#(#("*EF_*"),#(),"EF_Sequence",-1),
#(#("SP_*"),#(),"SomeProject",30)
)

The first pattern specifies that files from the AB sequence, scene 123 and EF sequence, shot 038 (regardless of scene number) will be suggested as Category High_Priority and set Priority to 100.
The second pattern specifies all AB jobs to have priority of 50 and be added to Category AB_Sequence. Since
the special case of AB_123 has been handled in the previous pattern, this will not apply to it.
The third pattern sets jobs that contain CD_ in their name but NOT the signature CD_007_ to the
CD_Sequence Category and sets the Priority to 60.
The fourth pattern sets jobs that contain EF_ in their name to the EF_Sequence Category but does not
change the priority (-1).
The fifth pattern specifies that any jobs that have not matched the above rules but still start with the SP_
signature should be added to the SomeProject Category and set to low priority of 30.
Note that since we used * instead of SP_ at the beginning of the first 4 patterns, even if the job is not named
correctly with the project prefix SP_, the patterns will still correctly match the job name.
Custom Plugin.ini File Creation
This section covers the Alternate Plugin.ini feature in the 3ds Max Rendering rollout (under the Render tab).
Alternate Plugin.ini File
The plugin.ini list will show a list of alternative plug-in configuration files located in the Repository. By default, there
will be no alternative plugin.ini files defined in the repository. The list will show only one entry called [Default],
which will cause all slaves to render using their own local plugin.ini configuration and is equivalent to having the Use
Custom Plugin.ini file unchecked.
To define an alternative plugin.ini, copy a local configuration file from one of the slaves to [Repository]\plugins\3dsmax in the repository. Edit the name of the file by adding a description of it. For example, plugin_brazil.ini, plugin_vray.ini, plugin_fr.ini, plugin_mentalray.ini, etc. Open the file and edit its content to include the
plug-ins you want and exclude the ones you don't want to use in the specific case. The next time you launch Submit
To Deadline, the list will show all alternative files whose names start with plugin and end with .ini. The list will
be alphabetically sorted, with [Default] always on top. You can then select an alternative plugin.ini file manually from
the list.
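
As a rough illustration only (the entries below are examples, not a definitive template - always start from a copy of a known-good local plugin.ini rather than writing one from scratch), a trimmed-down plugin_vray.ini might contain something like:

[Directories]
Standard MAX plug-ins=C:\Program Files\Autodesk\3ds Max 2015\stdplugs\
Additional MAX plug-ins=C:\Program Files\Autodesk\3ds Max 2015\plugins\
V-Ray for 3ds Max=C:\Program Files\Autodesk\3ds Max 2015\plugins\vrayplugins\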
Pressing the Edit Plugin.ini File button will open the currently selected alternative configuration file in a MAXScript
Editor window for quick browsing and editing, except when [Default] is selected. Pressing the Browse Directory
button will open Windows Explorer, taking you directly to the plug-ins directory containing the alternative plugin.ini
files. Note that if you create a new plugin.ini file, you will have to restart the Submit To Deadline script to update the
list.
Since the alternative plug-in configuration file is located in the Repository and will be used by all slave machines, the
plug-in paths specified inside the alternative plugin.ini will be used as LOCAL paths by each slave. There are two
possible installation configurations that would work with alternative plug-ins (you could mix the two methods, but it's
not recommended):
Centralized Plug-ins Repository: In this case, all 3dsmax plug-ins used in the network are located at a centralized location, with all Slaves mapping a drive letter to the central plug-in location and loading the SAME copy
of the plug-in. In this case, the alternative plugin.ini should also specify the common drive letter of the plug-in
repository.
Local Plug-in: To avoid slow 3dsmax booting in networks with heavy traffic, some studios (including ones we
used to work for) deploy local versions of the plug-ins. Every slave's 3dsmax installation contains a full set
of all necessary plug-ins (which could potentially be automatically synchronized to a central repository to keep
all machines up-to-date). In this case, the alternative plugin.ini files should use the LOCAL drive letter of the
3dsmax installation, and all Slaves' 3dsmax copies MUST be installed on the same partition, or at least have the
plug-ins directory on the same drive, for example, C:.
Auto-Detect Plugin.ini For Current Renderer
When enabled, the following operations will be performed:
1. When you check the checkbox, the current renderer assigned to the scene will be queried.
2. The first 3 characters of the renderer's name will be compared to a list of known renderers.
3. If the renderer is not on the list, the alternative list will be reset to [Default].
4. If the renderer is the Default Scanline Renderer of 3dsmax, the alternative list will be reset to [Default].
5. If the renderer is a known renderer, the plugin*.ini file that matches its name will be selected.
Supported renderers for auto-suggesting an alternative configuration are:
Brazil - plugin*.ini should contain brazil in its name (e.g. plugin_brazil.ini, plugin-brazil.ini, pluginbrazil_1_2.ini, etc).
Entropy - plugin*.ini should contain entropy in its name (e.g. plugin_entropy.ini, plugin-entropy.ini, pluginentropy.ini, etc).
finalRender - plugin*.ini should contain fr or final in its name (e.g. plugin_fr.ini, plugin-finalrender.ini, plugin_finalRender_Stage1.ini, etc).
MaxMan - plugin*.ini should contain maxman in its name (e.g. plugin_maxman.ini, plugin-maxman.ini, pluginmaxman001.ini, etc).
mentalRay - plugin*.ini should contain mr or mental in its name (e.g. plugin_mr.ini, plugin-mentalray.ini, plugin_mental33.ini, etc).
V-Ray - plugin*.ini should contain vray in its name (e.g. plugin_vray.ini, plugin-vray.ini, pluginvray109.ini, etc).
Notes:


In 3dsmax 5 and higher, opening a MAX file while the Auto-Detect option is checked will trigger a callback
which will perform the above check automatically and switch the plugin.ini to match the renderer used by the
scene.
In 3dsmax 6 and higher, changing the renderer via the Current Renderers rollout of the Render dialog will
also trigger the auto-suggesting mechanism.
You can override the automatic settings anytime by disabling the Auto-Detect option and selecting from the list
manually.
Custom Extra Info Controls
Just like job names and comments, you can use keys in the Extra Info 0-9 fields (under the Integration tab in SMTD)
that are replaced with actual values (like $scene). There is a file in the ..\submission\3dsmax\Main\ folder in your
Repository called SubmitMaxToDeadline_ExtraInfoFormats.ini. In addition, a local copy of the SubmitMaxToDeadline_ExtraInfoFormats.ini file can be saved in a user's application data folder. This file will OVERRIDE the Extra Info
formats in the Repository and can contain a sub-set of the definitions in the global file. This file will contain some
key-value pairs such as:
$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$maxversion=(((maxVersion())[1]/1000) as string)

The key to the left of = is the string that will be replaced in the Extra Info field. The value to the right of the = is the maxscript
code that is executed to return the replacement string (note that the value must be returned as a string). So if
you use $scene in an Extra Info field, it will be swapped out for the scene file name. You can append additional key-value
pairs or modify the existing ones as you see fit.
NOTE, if you are using Shotgun or FTrack Integration, ExtraInfo0 to ExtraInfo5 will be used automatically and take
precedence over any $keys in these particular fields.
As an example, you may wish to use the automatic SMTD BatchName functionality to group logical job submissions
together in your Deadline queue, but also use custom Extra Info fields to help track pipeline information such as
the Project, Sequence, Shot or Job Number of a particular 3dsMax/Jigsaw/Draft/Quicktime job submission, for example:
$project=[execute maxscript code here, returning a string value]
$sequence=123456
$shot=[use maxscript to get shot # from the current render output naming convention]
$jobnumber=[maxscript to query database and get project's job number as a string]
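
As one concrete (and purely hypothetical) way to fill in a placeholder like $shot above, assuming scene files follow the SP_AB_XXX_YYY naming convention described earlier, the shot token could be pulled straight out of the scene file name:

$shot=((filterstring (getFilenameFile maxFileName) "_")[4])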

Once this additional pipeline information is injected into your Deadline jobs, the Extra Info columns can be given
user-friendly names so that they can easily be identified and used to filter and sort jobs in the Monitor. See the Job
Extra Properties section for more information. NOTE, the Extra Info X columns are also injected into the Completed
Job Stats, thereby allowing you to store and later analyse/create reports against previous jobs by the data stored in your
Extra Info X columns.

9.2.7 FAQ
Which versions of 3ds Max are supported?
3ds Max versions 2010 and later are all supported (including Design editions).


Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts
will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this
patch, you must submit your 3ds Max 2012 jobs from the Deadline Monitor.
Which 3ds Max renderers are supported?
Deadline should already be compatible with all 3ds Max renderers, but it has been explicitly tested with
Scanline, MentalRay, Brazil, V-Ray, Corona, finalRender, and Maxwell. If you have successfully used a
3ds Max renderer that is not on this list, please email Deadline Support.
Does Backburner need to be installed to render with Deadline?
Yes. Backburner installs the necessary files that are needed for command line and network rendering, so
it must be installed to render with Deadline.
Does the 3ds Max plugin support Tile Rendering?
Yes. See the Tile Rendering section of the submission dialog documentation for more details.
Does the 3ds Max plugin support multiple arbitrarily sized, multi-resolution Tile Rendering for both stills and
animations, and automatic re-assembly, including the use of multi-channel image formats and Render Elements
(incl. V-Ray VFB specific image files)?
Yes. We call it Jigsaw and it's unique to the Deadline system! See the Tile Rendering section of the
submission dialog documentation for more details.
Does the 3ds Max plugin support Batch Rendering?
Yes. See the Batch Rendering section of the submission dialog documentation for more details.
Is PSoft's Pencil+ render effects plugin supported?
Yes. Please note that at least Pencil+ v3.1 is required to resolve an issue with the line element render element
failing to be rendered. Note, you will require the correct network render license from PSoft for each
Deadline Slave, or render with a Deadline Slave that already has a full workstation license of Pencil+
installed.
When I submit a render with a locked viewport, Deadline sometimes renders a different viewport.
Prior to the release of 3ds Max 2009, the locked viewport feature wasn't exposed to the 3ds Max SDK,
so it was impossible for Deadline to know whether a viewport is locked or not. Now that the feature has
been exposed, we are working to improve Deadline's locked viewport support. However, in the 3ds Max
2010 SDK, there is a bug that prevents us from supporting it completely (Autodesk is aware of this bug).
As of 3ds Max 2015, this bug is now resolved. For earlier versions, we can only continue to recommend
that users avoid relying on the locked viewport feature, and instead ensure that the viewport they want to
render is selected before submitting the job. The SMTD sanity check continues to provide a warning for
those versions of 3ds Max where the locked viewport SDK bug still exists.
When Deadline is running as a service, 3ds Max 2015 render jobs crash during startup.
This can happen if the new Scene (Content) Explorer is docked.
This is a known issue with 3ds Max network rendering when it is launched by a program running as a
service. See this AREA blog post about running 3ds Max 2015 as a service for a workaround and more
information.
Can I mix 3ds Max and 3ds Max Design jobs in Deadline?
Yes. ADSK have introduced (April 2014) a new system environment variable you can set which will
make all jobs from 3ds Max and 3ds Max Design appear as 3ds Max jobs: set MIX_MAX_DESIGN_BB
to 1 to enable this feature. Note, Windows typically requires a machine restart or log-off/log-on for
the new environment setting value to become available once set. ADSK have confirmed this works for
3ds Max 2015, 3ds Max Design 2015 with Backburner 2015.0.1. It may also work with 2014 SP5 version of 3ds Max and 3ds Max Design, with Backburner 2015.0.1. See this AREA blog post about mixing 3ds
Max and 3ds Max design on a render farm for more information. Note, Backburner Manager or Server
are NOT required to be running to make this system work in Deadline, although Backburner software still
needs to be installed on your render nodes.
When I submit a render job that uses more than one default light, only one default light gets rendered.
The workaround for this problem is to add the default lights to the scene before submitting the job. This
can be done from within 3ds Max by selecting Create Menu -> Lights -> Standard Lights -> Add Default
Lights to Scene.
Is it possible to submit MAXscripts to Deadline instead of just a *.max scene?
Yes. Deadline supports MAXscript jobs from the Scripts tab in the submission dialog.
Does Deadline's custom interface for rendering with 3ds Max use workstation licenses?
No. Deadline's custom interface for rendering with 3ds Max does not use any workstation licenses when
running on slaves. However, if you have the Force Workstation Mode option checked in the submission dialog,
a workstation license will be used.
Slaves are rendering their first frame/tile correctly, but subsequent frames and render elements have problems
or are rendered black.
Try enabling the option to Restart Renderer Between Frames in the submission dialog before submission, or in the job properties dialog after submission. We have found that this works 99% of the time in
these cases. When enabled, the C++ Lightning plugin (unique to Deadline) will unload the renderer plugins and then reload them instantly. This has the effect of forcing a memory purge and helps to improve
renderer stability, as well as ensuring the lowest possible memory footprint. This can be helpful when
rendering close to the physical memory limit of a machine. See the note below for when this feature should
be disabled.
V-Ray Light-Cache / Irradiance Maps are not the correct file size or seem to be getting reset between incremental frames on Deadline but calculate correctly when executed locally.
Ensure the option Restart Renderer Between Frames is DISABLED if you are sending FG/LC/IM
caching map type jobs to the farm, as the renderer will get reset for each frame and the FG/LC/IM file(s)
won't get incrementally increased with the additional data per frame and will only contain the data from
the last frame it calculated. (The resulting file size will be too small as well).
3dsMax Point Cache Files dropping geometry in renders randomly
Sometimes 3dsMax can drop point cache geometry in renders, in an almost random fashion that affects only certain
rigs. Typically, but not exclusively, this happens on the 2nd assigned frame processed by a particular
slave. Ensure the option Restart Renderer Between Frames is DISABLED in the submission dialog
before submission, or in the job properties dialog after submission. We have found that this works 99%
of the time in these cases.
When rendering with V-Ray/Brazil, it appears as if some maps are not being displayed properly.
Try enabling the option to Restart Renderer Between Frames in the submission dialog before submission, or in the job properties dialog after submission. We have found that this works 99% of the time in
these cases.
Tile rendering with a Mental Ray camera shader known as wraparound results in an incorrect final image.
How can I fix this?
This is another situation where enabling the option to Restart Renderer Between Frames in the submission dialog seems to fix the problem.
When tile rendering with a renderer that supports global/secondary illumination, I get bucket stamps (different
lighting conditions in each tile) on the final image.


Try calculating the irradiance/final gather light caching map first in one pass at full resolution. Then
perform your tile render on a scene that reads the irradiance/final gather map created at full resolution. If
creating the map at full resolution is impossible, then you can create it in tiles, but you need to make
sure the tiles overlap each other (use Deadline's tile/jigsaw padding to help here) and make sure to
use the irradiance/final gather map method that appends to the map file. Alternatively, you could consider
using the VRay/Mental Ray DBR off-load system to accelerate the calculation of the light caching map.
In summary: you create (pre-calculate) the secondary/global illumination map first, then run the final
render in tiles as a second job. Deadline job dependencies can be used here to release the second job
once the first job successfully completes the lighting pre-calculation.
Can I perform Distributed Bucket Rendering (DBR) with V-Ray or V-Ray RT?
Yes. A special reserve job is submitted that will run the V-Ray Spawner/V-Ray standalone process on
the render nodes. Once the V-Ray Spawner/V-Ray standalone process is running, these nodes will be able
to participate in distributed rendering. Please see the VRay Distributed Rendering (DBR) Plug-in Guide
for more information.
Can I fully off-load 3dsMax V-Ray or Mental Ray DBR rendering from my machine?
Yes, see the VRay/Mental Ray DBR section for more information. The advantages to off-loading a VRay DBR job fully from your workstation include: releasing your local workstation to carry out other
processing tasks, and helping to accelerate the irradiance map/photon cache calculation process, as the
V-Ray DBR system supports distributing this across multiple machines. A risk/disadvantage to this way
of working is that if a single machine currently being used to calculate a DBR bucket crashes/fails for an
unknown reason, the whole process will fail at its current stage and start from the beginning again.
Can I Perform Fume FX Simulations With Deadline?
Yes. To do so, follow these steps:
1. Your render nodes need to have Fume FX licensed properly, either with full or simulation
licenses. This requirement is the same as if you were rendering with Backburner.
2. Before you launch the 3dsmax submission script, make sure that the Fume FX NetRender toggle
button is ON in the Fume FX options in 3dsmax.
3. Before you submit the job, make sure the Disable Progress Update Timeout option is enabled
under the Render tab in the 3dsmax submission window.
4. Note that Fume FX uses its own frame range (in the Fume FX settings/prefs), so submit the
Max scene file to Deadline as a single frame/task.
Can I force a render to use a specific language?
Yes, using the option located in the User Options tab of SMTD, or in the Advanced
Options tab of the monitor submission (2013+ only). This will change the default on the machine it is rendered on to the chosen
language. Note that the change is permanent on the machine until such time as 3dsMax is restarted and the
language is forced to a different language. You can manually force the language to be changed back via
the language-specific shortcuts in the Start menu, which effectively start 3dsMax with the language flag.
In this example, EN-US (default) is forced: C:/Program Files/Autodesk/3ds Max 2015/3dsmax.exe
/Language=ENU
When submitting to Deadline, non-ASCII characters in output paths, camera names, etc, are not being sent to
Deadline properly.
You need to enable the Save strings in legacy non-scene files using UTF8 property in the Preference Settings in 3ds Max. After enabling this, the Deadline submission files will be saved as UTF8 and therefore
non-ASCII characters will be saved properly. See the Character Encoding Defaults in 3ds Max section in
the 3ds Max Character Encoding documentation for more information.
Why do 3ds Max jobs add a period delimiter to the output filename?


Deadline 7 introduced a new Delimiter option in the integrated 3ds Max submitter (SMTD) to avoid some
problems with the way render elements and other auto-generated names were formatted in previous versions. The Delimiter option is set to a factory default of . as this is the typical convention in VFX
pipelines, but it can be overridden via the Defaults INI file in the Repository. Since this setting is considered a company-wide pipeline value and should not be overridden by individual users, it is currently not
exposed in the SMTD UI.
To change the Delimiter to an empty string, you can do the following:
1. Navigate to your Repository folder
2. Go to ...\submission\3dsmax\Main\
3. Locate the SubmitMaxToDeadline_Defaults.ini file and open it in a text editor
4. Add the following to the [RenderingOptions] category:
[RenderingOptions]
Delimiter=

5. Make sure there is nothing after the = sign!


6. Save the file
7. Restart SMTD on your workstation
8. RESULT: At this point, SMTD should behave like it did in Deadline 6.x and earlier.
Note that in some cases some render element passes might be misformatted due to the lack of a delimiter - this was a known issue in Deadline 6.x and earlier. For example, if a V-Ray pass was named automatically
based on a TextureMap name ending with digits, the resulting file name could end up having too many
trailing digits, e.g. SomeMap_420000.exr instead of SomeMap_42.0000.exr. So in the Deadline Monitor, the filename could become SomeMap_######.exr instead of SomeMap_42.####.exr. If you want
to replace the . period character with a different character to fit your pipeline requirements (e.g. _
underscore), you can add the character to the INI file:
[RenderingOptions]
Delimiter=_

In summary, you can use the new Delimiter option to provide a consistent file naming convention across
your studio pipeline. A few caveats: the file naming convention for Thinkbox's tile, region and Jigsaw
remains unchanged, and V-Ray v3 has introduced a maxscript property #fileName_addDot which can
be accessed via
renderers.current.fileName_addDot

which by default is True, so it will also try to add a DOT character to its filenames if one is not present.
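
If your pipeline requires the dot to be suppressed on the V-Ray side as well, one possible approach (a sketch, not an official SMTD feature - run it manually in the scene or wire it into your own startup or submission scripting) is to turn the property off only when the current renderer actually exposes it:

-- only touch the property if the current renderer exposes it (V-Ray 3 and later)
if isProperty renderers.current #fileName_addDot do renderers.current.fileName_addDot = false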

9.2.8 Error Messages and Meanings


This is a collection of known 3ds Max error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Note that when an error occurs in a Max render, we parse the Max render log (Max.log) for any messages that might
explain the problem and include them in the error message. Some examples are:
ERR: An unexpected exception has occurred in the network renderer and it is terminating.

ERR: Missing dll: BrMaxPluginMgr.dlu


ERR: [V-Ray] UNHANDLED EXCEPTION: Preparing ray server Last marker is at .srcvrayrenderer.cpp
3dsmax startup: Error getting connection from 3dsmax: 3dsmax startup: Deadline/3dsmax startup error:
lightningMax*.dlx does not appear to have loaded on 3dsmax startup, check that it is the right version and
installed to the right place.
You likely need to install the appropriate Visual C++ Redistributable package, which is normally installed automatically by the Deadline Client installer. Try re-installing the Client software if you see this error.
3dsmax startup: Error getting connection from 3dsmax: Monitored managed process 3dsmaxProcess has
exited or been terminated.
Full error message:
3dsmax startup: Error getting connection from 3dsmax: Monitored managed...
2012/08/24 14:48:40 DBG: Starting network
2012/08/24 14:48:40 DBG: Calling NetRenderPreLoad
2012/08/24 14:48:40 DBG: in NetWorkerPreLoad. jobFile: ; jobname: C:\Users\...
2012/08/24 14:48:40 DBG: in NetWorkerPreLoad. LoadLib() failed
2012/08/24 14:48:40 DBG: NetRenderPreLoad failed
2012/08/24 14:48:40 ERR: Error loading *.max file
2012/08/24 14:49:10 INF: SYSTEM: Production renderer is changed to Default...
2012/08/24 14:49:10 DBG: Stop network

This is a known issue with 3ds Max, and can occur when IPv6 is enabled on the render node. The issue
can be fixed by disabling IPv6 on the machines, or by disabling the IPv6 to IPv4 tunnel. See this Area blog
post about IPv6 errors for more information.
Could not delete old lightning.dlx... This file may be locked by a copy of 3ds max
Usually this is because a 3dsmax.exe process didn't quit or get killed properly. The solution is to look in Task Manager
on the slaves reporting the message for a 3dsmax.exe process and kill it.
3dsmax crashed in GetCoreInterFace()->LoadFromFile()
There are a number of things that can be tried to diagnose the issue:
Try opening the file on a machine where it crashed. You may already have done this.
Try rendering a frame of it on a machine where it crashed, using the 3dsmaxcmd.exe renderer.
This will make it open the file in slave mode and possibly give an idea of what's failing.
Submit the job to run in workstation mode. In workstation mode there's often more diagnostic
output. There's a checkbox in the submission script for this.
If you're comfortable sending us the .max file which is crashing, we'd be happy to diagnose the
issue here.
Try stripping down the max file by deleting objects and seeing if it still crashes then.
Trapped SEH Exception in CurRendererRenderFrame(): Access Violation
An Access Violation means that when rendering the frame, Max either ran out of memory, or memory
became corrupted. The stack trace in the error message usually shows which plugin the error occurred in.
If that doesn't help track down the issue, try stripping down the max file by deleting objects and seeing if
the error still occurs.
3dsmax: Trapped SEH Exception in LoadFromFile(): Access Violation
An Access Violation means that when loading the scene, Max either ran out of memory, or memory
became corrupted. The stack trace in the error message usually shows which plugin the error occurred in.


If that doesn't help track down the issue, try stripping down the max file by deleting objects and seeing if
the error still occurs.
3dsmax: PNG Plugin: PNG Library Internal Error
3dsMax Render Elements can become corrupt or be placed in a bad state with regard to the image file format
plugin being used to save each Render Element to your file server. This issue is not limited to
the PNG file format (it also affects TGA, TIF, etc.) but is common. A fix that has been known to work
in most circumstances is to rebuild the render elements by deleting and re-creating them in the 3dsmax
scene file. This feature is automated in SMTD if you enable the checkbox Rebuild Render Elements
under the Render tab -> 3ds Max Pathing Options.
RenderTask: 3dsmax exited unexpectedly (it may have crashed, or someone may have terminated)
This generic error message means that max crashed and exited before the actual error could be propagated
up to Deadline. Often when you see this error, it helps to look through the rest of the error reports for that
job to see if they contain any information that's more specific.
RenderTask: 3dsmax may have crashed (recv: socket error trying to receive data: WSAError code 10054)
This generic error message means that max crashed and exited before the actual error could be propagated
up to Deadline. Often when you see this error, it helps to look through the rest of the error reports for that
job to see if they contain any information that's more specific.
3dsmax startup: Error getting connection from 3dsmax: 3dsmax startup: Deadline/3dsmax startup error:
lightningMax*.dlx does not appear to have loaded on 3dsmax startup, check that it is the right version and
installed to the right place.
This error is likely the side effect of another error, but the original error wasn't propagated to Deadline
properly. Often when you see this error, it helps to look through the rest of the error reports for that job to
see if they contain any information that's more specific.
3dsmax startup: Max exited unexpectedly. Check that 1) max starts up with no dialog messages and in the
case of 3dsmax 6, 2) 3dsmaxcmd.exe produces the message Error opening scene file: when run with no
command line arguments
This message is often the result of an issue with the way Max starts up. Try starting 3ds Max on the slave
machine that produced the error to see if it starts up properly. Also try running 3dsmaxcmd.exe from the
command line prompt to see if it produces the message Error opening scene file: when run with no
command line arguments. If it doesn't produce this message, there may be a problem with the Max install
or how it's configured. Sometimes reinstalling Max is the best solution.
The 3dsmax command line renderer, ...\3dsmaxcmd.exe, hung during the verification of the 3ds max install
Try running 3dsmaxcmd.exe from the command line prompt to see if it pops up an error dialog or crashes,
which is often the cause of this error message. If this is the case, there may be a problem with the Max
install or with how it is configured. Sometimes reinstalling Max is the best solution.
3dsmax: Failed to load max file: ...
There could be many reasons why Max would fail to load the scene file. Check for ERR or WRN messages
included in the error message for information that might explain the problem. Often, this error is the result
of a missing plugin or dll.
Error: 3ds Max The Assembly Autodesk.Max.Wrappers.dll encountered an error while loading
This is a specific 3ds Max 2015 crash when you try to launch the program. Ensure you perform a Windows
update and get the latest updates for Windows 7 or 8. Additionally, install the update for Autodesk 3ds Max
2015 Service Pack 1 and Security Fix. See this ADSK Knowledge post for more information.
Error message: 3dsmax adapter error : Autodesk 3dsMax 17.2 reported error: Could not find the specified file
in DefaultSettingsParser::parse() ; Could not find the specified file in DefaultSettingsParser::parse() ;


The error Could not find the specified file in DefaultSettingsParser::parse() ; occurs if you don't
have the Populate Data installed on each of your Deadline Slave machines. To resolve the issue you
need to ensure that the Populate Data is installed on all the render machines. You can run the 3dsMax_2015_PopulateData.msi installer from the \x64\PDATA\ folder of the 3ds Max 2015 installer. In
case there was a previous install of the Populate Data on the machine, please delete the following folder
before installing: C:\Program Files\Common Files\Autodesk Shared\PeoplePower\2.0\. See this Area
blog post for more information.
Error message: ERR: To use this feature, you need the Evolver data. Please check the Autodesk web site for
more information.
You may get the above error message when you try to run a Populate simulation in your 3dsMax scene file.
This is a known Autodesk bug and the fix is to install the Autodesk 3ds Max 2014 64-bit Populate Data
component. The actual file is 3dsMax_2014_PopulateData.msi which you can find in the \x64\PDATA\
folder of the install media. Note that if you're running 3ds Max Design, the filename will be 3dsMaxDesign_2014_PopulateData.msi. Similarly, the same bug in 3ds Max 2015 doesn't mention Evolver anymore.
Instead, it tells you to install the Populate data. See this Fixing missing Evolver data errors Area blog post
for more information.
Error message: ERROR: Please, make sure the Populate data is installed.
This is the same error message as the previous Populate FAQ entry and is fixed by installing the Autodesk Populate Data component. See this Fixing missing Evolver data errors Area blog post for more
information.
Unexpected exception (Error in bm->OpenOutput(): error code 12)
Ensure all instances of 3dsMax are running a consistent LANGUAGE. By default 3dsMax ships with the
LANGUAGE code set to ENU - US English and this is recommended for the majority of customers.
If you are using a 3rd party plugin in 3dsMax, please contact the plugin developer to verify that their
plugin is capable of running under a different language inside of 3dsMax. Note that the majority of 3rd party
plugins are still only developed to work in ENU. Please see this FAQ for more information regarding
options to control the LANGUAGE: 3dsMax Language Code FAQ.
Exception: Failed to render the frame.
There could be many reasons why Max would fail to render the frame. Check for ERR or WRN messages
included in the error message for information that might explain the problem.
DBG: in Init. nrGetIface() failed
This error message is often an indication that 3dsmax or backburner is out of date on the machine. Updating both to the latest service packs should fix the problem.
ERROR: ImageMagick: Invalid bit depth for RGB image [path to tile/region render output image]
This error is due to the old TileAssembler executable not supporting certain bit depth images, such as
V-Ray's REs Reflection, Refraction and Alpha when saved from the V-Ray Frame Buffer (VFB).
Please note that the Tile Assembler plugin is EOL (End-Of-Life/deprecated). Please use the newer Draft
Tile Assembler plugin (the Use Draft for Assembly checkbox option in SMTD) when rendering using the
older tile system to ensure all image types/bit depths are correctly assembled. Draft Tile Assembler jobs
can also be submitted independently if you already have the *.config file(s); this is explained further in the
Draft Tile Assembler documentation.
Error when using Mental Ray DBR in 3ds Max 2016: Could not locate MDL shared core library.
When you try to use DBR (Distributed Bucket Rendering) you will get the following error message:
Could not locate MDL shared core library.


To help Mental Ray satellite find this .dll, copy libmdl.dll from the main 3ds Max 2016 folder to the
NVIDIA/Satellite folder. Note that you have to do this on all the machines that will be used for DBR. See
this Error when using Mental Ray DBR in 3ds Max 2016 Area blog post for more information.

9.3 After Effects


9.3.1 Job Submission
You can submit jobs from within After Effects by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within After Effects, select File -> Run Script -> DeadlineAfterEffectsClient.jsx.


Project Configuration
In After Effects, place the comps you want to render in the Render Queue (CTRL+ALT+0). Due to an issue with the
Render Queue, if you have more than one comp with the same name, only the settings from the first one will be used
(whether they are checked or not). It is important that all comps in the Render Queue have unique names, and our
submission script will notify you if they do not. Each comp that is in the Render Queue and that has a check mark
next to it will be submitted as a separate job to Deadline.

Note that under the comp's Output Module settings, the Use Comp Frame Number check box must be checked. If this
is not done, every frame in the submitted comp will try to write to the same file.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. Note that the Draft/Integration options are only available in
After Effects CS4 and later.
The After Effects specific options are:
Use Comp Name As Job Name: If enabled, the job's name will be the Comp name.

Use Frame List From The Comp: Check this option to use the frame range defined for the comp.
Comps Are Dependent On Previous Comps: If enabled, the job for each comp in the render queue will be
dependent on the job for the comp ahead of it. This is useful if a comp in the render queue uses footage rendered
by a comp ahead of it.
Render The First And Last Frames Of The Comp First: Enable this option to render the first and last frames first,
followed by the remaining frames in the comp's frame list. Note that this ignores the Frame List setting in
the submission dialog.
Submit The Entire Render Queue As One Job With A Single Task: Use this option when the entire render
queue needs to be rendered all at once because some queue items are dependent on others or use proxies. Note
though that only one machine will be able to work on this job.
Multi-Process Rendering: Enable multi-process rendering.
Submit Project File With Job: If enabled, the After Effects Project File will be submitted with the job.
Ignore Missing Layer Dependencies: If enabled, Deadline will ignore errors due to missing layer dependencies.
Fail On Warning Messages: If enabled, Deadline will fail the job whenever After Effects prints out a warning
message.
Export XML Project File: Enable to export the project file as an XML file for Deadline to render (After Effects
CS4 and later). The original project file will be restored after submission. If the current project file is already an
XML file, this will do nothing.
Ignore Missing Effects References: If enabled, Deadline will ignore errors due to missing effect references.
Continue On Missing Footage: If enabled, rendering will not stop when missing footage is detected.
Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the
final network location.
Override Fail On Existing AE Process: If enabled, the global repository setting Fail on Existing AE Process
will be overridden.
Fail on Existing AE Process: If enabled, the job will be failed if any After Effects instances are currently
running on the slave. Existing After Effects instances can sometimes cause 3rd party AE plugins to malfunction
during network rendering.
The following After Effects specific options are only available in After Effects CS4 and later:
Multi-Machine Rendering: This mode submits a special job where each task represents the full frame range.
The slaves will all work on the same frame range, but if Skip existing frames is enabled for the comps, they
will skip frames that other slaves are already rendering.
This mode requires Skip existing frames to be enabled for each comp in the Render Queue.
Set the number of tasks to be the number of slaves you want working simultaneously on the render.
This mode ignores the Frame List, Machine Limit, and Frames Per Task settings.
This mode does not support Local Rendering or Output File Checking.
Minimum Output File Size: If an output image's file size is less than what's specified, the task is requeued
(specify 0 for no limit).
Enable Memory Management: Whether or not to use the memory management options.
Image Cache %: The maximum amount of memory After Effects will use to cache frames.
Max Memory %: The maximum amount of memory After Effects can use overall.


Layer Submission
In addition to normal job submission, you also have the option to submit layers in your After Effects project as separate
jobs. To do so, first select the layers you want to submit. Then run the submission script, set the submission options
mentioned above as usual, and press the Submit Selected Layers button. This will bring up the layers window.

The layer submission options are:


Render With Unselected Layers: Specify the unselected layers that will render with each of the selected layers.
Layer Name Parsing: Allows you to specify how the layer names should be formatted. You can
then grab parts of the formatting and stick them in either the output name or the subfolder format box
with square brackets. So, for example, if youre naming your layers something like ops024_a_diff,
you could put <graphic>_<layer>_<pass> in this box. Then in the subfoler box, you could put
[graphic]\[layer]\v001\[pass], which would give you ops024\a\v001\diff as the subfolder structure.
Render Settings: Which render settings to use.
Output Module: Which output module to use.
Output Format: How the output file name should be formatted.
Output Folder: Where the output files should be rendered to.


Use Subfolders: Enable this to render each layer to its own subfolder. If this is enabled, you must also specify
the subfolder format.

9.3.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with After Effects, you must setup Mapped Paths so that Deadline can
swap out the Project and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor
while in super user mode by selecting Tools -> Configure Repository. Youll find the Mapped Paths Setup in the list
on the left.
You then have two options on how to set up your After Effects project file. The traditional way is to ensure that your
After Effects project file is on a network shared location, and that any footage or assets that the project uses are in the
same folder or in sub-folders. Then when you submit the job, you must make sure that the option to submit the project
file with the job is disabled. If you leave it enabled, the project file will be copied to and loaded from the Slave's local
machine, and thus won't be able to find the footage.
You also have the option to save your After Effects project as an AEPX file, which is just an XML file. Deadline will
automatically detect that an AEPX file has been submitted, and will swap out paths within the file itself (because it is
just plain text). This way, you don't have to worry about setting up the project structure described in the first option.
Note though that all the asset paths still need to be network accessible.
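
To make the idea concrete, path mapping in an AEPX file is essentially text substitution on the XML. The following is a minimal, hypothetical Python sketch and not Deadline's actual implementation (the file names and the mapping table are made up; real mappings come from the Mapped Paths settings, and the real implementation also handles path separators and writes a temporary file on the Slave):

def map_aepx_paths( source_aepx, dest_aepx, mappings ):
    # An .aepx project is plain XML, so mapped path prefixes can be swapped as text.
    with open( source_aepx, "r" ) as f:
        contents = f.read()
    for old_prefix, new_prefix in mappings:
        contents = contents.replace( old_prefix, new_prefix )
    with open( dest_aepx, "w" ) as f:
        f.write( contents )

# Example only: a Windows share mapped to a Mac mount point.
map_aepx_paths( "shot010.aepx", "shot010_mapped.aepx",
                [ ( "\\\\server\\projects", "/Volumes/projects" ) ] )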

9.3.3 Plug-in Configuration


You can configure the After Effects plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the After Effects plug-in from the list on the left.

Render Executables


After Effects Executable: The path to the After Effects aerender executable file used for rendering. Enter
alternative paths on separate lines. Different executable paths can be configured for each version installed on
your render nodes.
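For example, a mixed farm might list entries along these lines (illustrative paths only; adjust them to the After Effects versions actually installed on your render nodes):

C:\Program Files\Adobe\Adobe After Effects CC 2014\Support Files\aerender.exe
C:\Program Files\Adobe\Adobe After Effects CS6\Support Files\aerender.exe
/Applications/Adobe After Effects CC 2014/aerender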
Render Options
Fail On Existing After Effects Process: Prevent Deadline from rendering when After Effects is already open.
Force Rendering In English: You can configure the After Effects plug-in to force After Effects to render in
English. This is useful if you are rendering with a non-English version of After Effects, because it ensures that
Deadline's progress gathering and error checking function properly (since they are currently based on English
output from the After Effects renderer).
Font Folder Synchronization
The new FontSync event plugin that ships with Deadline v7.1 can be used to synchronize fonts on Mac OS X and
Windows before the Slave application starts rendering any job, or when the Slave first starts up. This general FontSync
Python based event plugin replaces the font synchronization options here in the After Effects plugin and now works
for ALL plugin types in Deadline. This FontSync event plugin is located at <Repository>/events/FontSync
Path Mapping For aepx Project Files (For Mixed Farms)
Enable Path Mapping For aepx Files: If enabled, a temporary aepx file will be created locally on the slave for
rendering and Deadline will do path mapping directly in the aepx file.

9.3.4 Integrated Submission Script Setup


The following procedures describe how to install the integrated After Effects submission script. This script allows for
submitting After Effects render jobs to Deadline directly from within the After Effects editing GUI. The script and the
following installation procedure has been tested with After Effects CS3 and later.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/AfterEffects/Installers
Manual Installation of the Submission Script
Copy [Repository]\submission\AfterEffects\Client\DeadlineAfterEffectsClient.jsx to [After Effects Install Directory]\Support Files\Scripts
After starting up After Effects, make sure that under Edit -> Preferences -> General, the Allow Scripts to Write
Files and Access Network option is enabled. This is necessary so that the submission script can create the
necessary files to submit to Deadline.


Custom Sanity Check


A CustomSanityChecks.jsx file can be created alongside the main SubmitAEToDeadline.jsx submission script (in
[Repository]\submission\AfterEffects\Main), and will be evaluated if it exists. This script will let you set any of the
initial properties in the submission script prior to displaying the submission window. You can also use it to run your
own checks and display errors or warnings to the user. Here is a very simple example of what this script could look
like:
{
    initDepartment = "The Best Department";
    initPriority = 33;
    initConcurrentTasks = 2;
    alert( "You are in a custom sanity check!" );
}

9.3.5 FAQ
Which versions of After Effects are supported?
After Effects CS3 and later are supported.
Why is there no Advanced tab in the integrated submission script for After Effects CS3?


Tabs are only supported in CS4 and later, so the Advanced tab and its options are not available in CS3 and
earlier.
Does network rendering with After Effects require a full After Effects license?
In After Effects CS5.0 and earlier, a license is not required. In After Effects CS5.5, a full license is
required. In After Effects CS6.0 and later, a license isn't required if you enable non-royalty-bearing
mode.
Rendering through Deadline seems to take longer than rendering through After Effects locally.
After Effects needs to be restarted at the beginning of each frame, and this loading time results in the
render taking longer than expected. If you know ahead of time that your frames will render quickly, it is
recommended to submit your frames in groups of 5 or 10. This way, After Effects will only load at the
beginning of each group of frames, instead of at the beginning of every frame.
When rendering a job, only the images from the first task are saved, and subsequent tasks just seem to overwrite
those initial image files.
In the comp's Output Module Settings, make sure that the Use Comp Frame Number checkbox is
checked. Check out step 1 here for complete details.
I get the error that the specified comp cannot be found when rendering, but it is in the render queue.
This can occur for a number of reasons, most of which are related to the name of the comp. Examples are names with two spaces next to each other, or names with apostrophes in them. Try using only
alphanumeric characters and underscores in comp names and output paths to see if that resolves the issue.
Why do the comps in the After Effects Render Queue require unique names?
Due to an issue with the Render Queue, if you have more than one comp with the same name, only the
settings from the first one will be used (whether they are checked or not). It is important that all comps in
the Render Queue have unique names, and our submission script will notify you if they do not.
Understanding the different After Effects command line flags.
Adobe have a web page, Automated Rendering which explains the different network render command
line options and how they work. Deadline currently supports as many of these options as possible.
How can I optimize After Effects for high performance?
Adobe provide an excellent web page, Memory and Storage documenting different areas of After Effects
and what can be done by users to improve performance, particularly in the areas of disk storage/caching
& RAM.

9.3.6 Error Messages and Meanings


This is a collection of known After Effects error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
What does this After Effects error code mean?
A great resource describing a very large number of After Effects error codes, their meanings and possible
solutions can be found on the Mylenium Errors website. If this site helps you, please do consider donating
to keep the site going!
Exception during render: Renderer returned non-zero error code, -1073741819
The error code -1073741819 is equivalent to 0xC0000005, which represents a Memory Access Violation
error. So After Effects is either running out of memory, or memory has become corrupt. If you find
that your frames are still being rendered, you can modify the After Effects plugin to ignore this error.


Just add the following function to the AfterEffectsPlugin class in AfterEffects.py, which can be found in
[Repository]/plugins/AfterEffects.
def CheckExitCode( self, exitCode ):
    if exitCode != 0:
        if exitCode == -1073741819:
            LogInfo( "Ignoring exit code -1073741819" )
        else:
            FailRender( "Renderer returned non-zero error code %d." % exitCode )

You can find another example of the CheckExitCode function in MayaCmd.py, which can be found in
[Repository]/plugins/MayaCmd.
aerender ERROR: No comp was found with the given name.
This can occur for a number of reasons, most of which are related to the name of the comp. Examples are
names with two spaces next to each other, or names with apostrophes in them. Try using only alphanumeric
characters and underscores in comp names and output paths to see if that resolves the issue.

Exception during render: Renderer returned non-zero error code, 1


aerender ERROR: An existing connection was forcibly closed by the remote host. Unable to receive at line 287
aerender ERROR: After Effects can not render for aerender. Another instance of aerender, or another script,
may be running; or, AE may be waiting for response from a modal dialog, or for a render to complete. Try
running aerender without the -reuse flag to invoke a separate instance of After Effects.
It is unknown what the exact cause of this error is, but it is likely that After Effects is simply crashing
or running out of memory. If you are rendering with Concurrent Tasks set to a value greater than 1, try
reducing the number and see if that helps.
The Knoll Light Factory plugin has also been known to cause this error message when it can't get a
license.

9.4 Anime Studio


9.4.1 Job Submission
You can submit Anime Studio Standalone jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Anime Studio specific options are:

Anime Studio File: The scene file (*.anme) to be rendered.


Output File: The path to where the rendered images will be saved.
Add Format Suffix: If this option is enabled, the format name will be appended to the file name of the output
path. Version 9.5 and later.
Version: The version of Anime Studio to render with.
Layer Comp: Render a specific layer comp, or select All to render all layer comps to separate files.
Additional Rendering Options:
Antialiased Edges: Normally, Anime Studio renders your shapes with smoothed edges. Uncheck this box to
turn this feature off.
Apply Shape Effects: If this box is unchecked, Anime Studio will skip shape effects like shading, texture fills,
and gradients.
Apply Layer Effects: If this box is unchecked, Anime Studio will skip layer effects like layer shadows and
layer transparency.
Render At Half Dimensions: Check this box to render a smaller version of your movie. This makes rendering
faster if you just want a quick preview, and is useful for making smaller movies for the web.
Render At Half Frame Rate: Check this box to skip every other frame in the animation. This makes rendering
faster, and results in smaller movie files.
Reduced Particles: Some particle effects require hundreds of particles to achieve their effect. Check this box to
render fewer particles. The effect may not look as good, but will render much faster if all you need is a preview.
Extra-smooth Images: Renders image layers with a higher quality level. Exporting takes longer with this
option on.
Use NTSC-safe Colors: Automatically limits colors to be NTSC safe. This is only an approximation - you
should still do some testing to make sure your animation looks good on a TV monitor.
Do Not Premultiply Alpha Channel: Useful if you plan to composite your Anime Studio animation with other
elements in a video editing program.
QT Options:
Video Codec: The video codec (leave blank to not specify one). Version 10 and later.
Quality: The quality of the export. Version 10 and later. 0 = Minimum, 1 = Low, 2 = Normal, 3 = High, 4 =
Max, 5 = Lossless
Depth: The pixel depth of the export. Version 10 and later.
iPhone/iPad Movie Options:
Format: The available formats for m4v movies.
AVI Options:
Format: The available formats for avi movies.
SWF Options:
Variable Line Widths: Exports variable line widths to SWF.

9.4.2 Plug-in Configuration


You can configure the Anime Studio plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Anime Studio plug-in from the list on the left.


Render Executables
Anime Studio Executable: The path to the Anime Studio executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.

9.4.3 FAQ
Which versions of Anime Studio are supported by Deadline?
Anime Studio 8 and later are supported.

9.4.4 Error Messages And Meanings


This is a collection of known Anime Studio error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.5 Arion Standalone


9.5.1 Job Submission
You can submit Arion jobs from the Monitor. Note that Arion's RenderWarrior application does not support animations
and therefore only single Arion files may be submitted. Arion animations can be rendered through the Arion live
plugins.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Integration options are
explained in the Integration documentation. The Arion specific options are:
Arion File: The Arion scene that will be rendered. Can be a .rcs or .obj file.
LDR Output File: The name of the rendered LDR output file. If no output file is specified a default image file
will be saved beside the Arion file.
HDR Output File: The name of the rendered HDR output file. If no output file is specified a default image file
will be saved beside the Arion file.
Passes: If enabled, Arion will render until the specified number of passes have completed.
Minutes: If enabled, Arion will render until the specified number of minutes have passed.
Threads: The number of threads that Arion will use to render the input file. If no threads are specified, a default
of one will be used.
Command Line Args: Here you can specify additional command line arguments. Arion accepts command line
arguments in the format -arg:value.
Channels: Each channel enabled will generate a different image appended with the channel name.
If both Passes and Minutes are specified, Arion will finish rendering when the first limit is reached. If neither are
enabled, Arion will render indefinitely and the job will have to be stopped manually.

9.5.2 Plug-in Configuration


You can configure the Arion plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Arion plug-in from the list on the left.


Render Executables
Arion Engine Executable: The path to the Arion engine executable file used for rendering. Enter alternative
paths on separate lines.

9.5.3 FAQ
Which versions of Arion are supported?
Only the Arion 2 Standalone is supported.
Are there any issues with referencing a file in the global input folder when one or more other files exist with the
same name?
Yes. When there is a file in the scene that has the same name as a file in another subdirectory, the network
renderer will reference the first file with that name that it finds. It ignores the direct path to the correct
subdirectory.
Can I render multiple channels?
Yes! The Arion submitter supports the selection of individual channels.
How can I pass additional information to Arion?
The Command Line Args field allows you to specify additional arguments to Arion. For example, typing
-h:100 -w:100 in the Command Line Args field will tell Arion to change the image size to 100px by
100px. To find out more information about additional command line arguments, please visit Arion's
website.
Can I submit Arion animations?
The Arion 2 Standalone does not support animations and can only render single images. Arion does still
support animations through their Live plugins.

9.5.4 Error Messages and Meanings


This is a collection of known Arion error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.6 Arnold Standalone


9.6.1 Job Submission
You can submit Arnold Standalone jobs from the Monitor.


Setup your Arnold Files


Before you can submit an Arnold Standalone job, you must export your scene into .ass files.
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Arnold specific options are:
Arnold File: The Arnold file(s) to be rendered.
If you are submitting a sequence of .ass files, select one of the numbered frames in the sequence, and
the frame range will automatically be detected if Calculate Frames From Arnold File is enabled. The
frames you choose to render should correspond to the numbers in the .ass files.
Output File: The output file. If left blank, Arnold will save the output to the location defined in the .ass file.
Version: Choose the Beta or Release version of Arnold to render with (these can be configured in the Arnold
plugin configuration).
Threads: The number of threads to use for rendering.
Verbosity: The verbosity level for the render output.
Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the
final network location.
Command Line Args: Specify additional command line arguments you would like to pass to the Arnold renderer.
Additional Plugin Folders: Specify up to three additional plugin folders that Arnold should use when rendering.
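
Under the hood, these options are passed to the Arnold kick renderer. As a rough illustration only (the file paths are made up, and the exact argument list Deadline assembles may differ), the resulting command uses the standard kick flags -i (input), -o (output), -t (threads), -v (verbosity) and -l (plugin search path):

kick -i /renders/shot.0010.ass -o /renders/out/shot.0010.exr -t 8 -v 2 -l /tools/arnold/plugins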

9.6.2 Plug-in Configuration


You can configure the Arnold plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Arnold plug-in from the list on the left.


Render Executables
Arnold Kick Executable: The path to the Arnold kick executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.

9.6.3 FAQ
Is Arnold Standalone supported by Deadline?
Yes.
Can I submit a sequence of Arnold .ass files that each contain one frame?
Yes, this is supported.

9.6.4 Error Messages and Meanings


This is a collection of known Arnold error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.


9.7 AutoCAD
9.7.1 Job Submission
You can submit jobs from within AutoCAD by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within AutoCAD, press the Submit To Deadline button on the Deadline tab or run the command
SubmitToDeadline


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
AutoCAD has 3 types of submission jobs, each of which has its own specific options.
The render job options are:
Render Views: Which views to render, each one will be a separate frame in a single job.


Render Procedure: View or Selected - whether or not to render everything in the view or only the selected
objects.
The plotter job options are:
Plotter to use: Which plotter should be used.
Plot Area: Extents or Display - what area should be plotted, everything in the scene or what is currently
displayed.
Paper Size: The size of paper to plot to.
Paper Units: Which units to use for the paper.
Fit Plot Scale: Whether or not the plot should be scaled as much as possible to fit on the paper.
Plot Scale: The scale to use if Fit Plot Scale is not enabled.
Plot Style Table: Which plot style table should be used.
Use Line Weight: Whether or not the lines should have extra weight on them.
Scale Line Weights: Whether or not the lines should be scaled.
The export job options are:
Selection: Which objects should be exported. Only available in the integrated submitter.
Types to Export: Which types of objects should be exported.
Textures: How textures should be handled.
DGN Settings: DGN specific settings such as version and seed file.

9.7.2 Plug-in Configuration


You can configure the AutoCAD plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the AutoCAD plug-in from the list on the left.


Render Executables
AutoCAD 2015 Executable: The path to the AutoCAD 2015 executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
AutoCAD 2016 Executable: The path to the AutoCAD 2016 executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.

9.7.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated AutoCAD submission script. This script allows for
submitting AutoCAD render jobs to Deadline directly from within the AutoCAD editing GUI.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/AutoCAD/Installers
Manual Installation of the Submission Script
Copy [Repository]/AutoCAD/Client/AutoCADSubmitter.bundle to %APPDATA%/Autodesk/ApplicationPlugins

Restart AutoCAD and the Deadline tool bar should be available.


9.7.4 FAQ
Is AutoCAD supported by Deadline?
Yes.
AutoCAD 2016 requires signed DLLs. Are Deadline's plugins signed?
Yes, all of Deadline's plugins are signed. Due to the new system, though, you will have to add Thinkbox
as a trusted company on each of your machines. This can be done by opening AutoCAD 2016 on the
machines that have the plugins (including the render plugin) and then allowing the plugins to always load.

9.7.5 Error Messages and Meanings


This is a collection of known AutoCAD error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.8 Blender
9.8.1 Job Submission
You can submit jobs from within Blender by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Blender 2.5 and later, select Render -> Submit To Deadline. For previous versions of Blender,
you must submit from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Blender specific options are:

Threads: The number of threads to use for rendering.


Build To Force: You can force 32 bit or 64 bit rendering.

9.8.2 Plug-in Configuration


You can configure the Blender plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Blender plug-in from the list on the left.

Render Executables
Blender Executable: The path to the Blender executable file used for rendering. Enter alternative paths on
separate lines.
Output
Suppress Verbose Progress Output To Log: When enabled, this will prevent excessive progress logging to the
Slave and task logs.

9.8.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated Blender submission script. This script allows for
submitting Blender render jobs to Deadline directly from within the Blender editing GUI. Note that this script only
works with Blender 2.5 and later. You can submit to older versions of Blender from the Monitor.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Blender/Installers
In Blender, select File -> User Preferences, and then select the Add-Ons tab.

Click on the Render filter on the left, and check the box next to the Render: Submit To Deadline add-on.

Manual Installation of the Submission Script


In Blender, select File -> User Preferences, and then select the Add-Ons tab.


Click the Install Add-On button at the bottom, browse to [Repository]\submission\Blender\Client, and select
the DeadlineBlenderClient.py script. Then press the Install Add-On button to install it. Note that on Windows, you may not be able to browse the UNC repository path, in which case you can just copy [Repository]\submission\Blender\Client\DeadlineBlenderClient.py locally to your machine before pointing the Add-On
installer to it.

Then click on the Render filter on the left, and check the box next to the Render: Submit To Deadline add-on.


After closing the User Preferences window, the Submit To Deadline option should now be in your Render menu.

9.8.4 FAQ
Which versions of Blender are supported?
Blender 2.x is currently supported.

9.8.5 Error Messages And Meanings


This is a collection of known Blender error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.9 Cinema 4D
9.9.1 Job Submission
You can submit jobs from within Cinema 4D by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Cinema 4D, select Python -> Plugins -> Submit To Deadline.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Cinema 4D specific options are:
Threads To Use: The number of threads to use for rendering.
Build To Force: Force rendering in 32 bit or 64 bit.
Export Project Before Submission: If your project is local, or you are rendering in a cross-platform environment, you may find it useful to export your project to a network directory before the job is submitted.
Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network
location.

9.9.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with Cinema 4D, you must setup Mapped Paths so that Deadline can
swap out the Scene and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor
while in super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list
on the left.
When submitting the Cinema 4D job for rendering, you should enable the Export Project Before Submission option,
and choose a network location when prompted for the export path. This will strip any absolute asset paths and make
them relative to the scene file, and will also ensure the option to submit the Cinema 4D scene file with the job is
disabled.
If you don't enable the Export Project Before Submission option, you need to manually export the project to a network
location. Then, you must submit the exported scene file from the Submit menu in the Monitor and you need to specify
the output and/or multipass output paths in the submitter. Make sure the option to submit the Cinema 4D scene file
with the job is disabled. If you leave it enabled, the scene file will be copied to and loaded from the Slave's local
machine, which will break the relative asset paths.


9.9.3 Plug-in Configuration


You can configure the Cinema 4D plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Cinema 4D plug-in from the list on the left.

Render Executables
C4D Executable: The path to the C4D executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.

9.9.4 Integrated Submission Script Setup


The following procedures describe how to install the integrated Cinema 4D submission script. This script allows for
submitting Cinema 4D render jobs to Deadline directly from within the Cinema 4D editing GUI.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Cinema4D/Installers
Manual Installation of the Submission Script
Copy [Repository]/submission/Cinema4D/Client/DeadlineC4DClient.pyp to [Cinema 4D Install Directory]/plugins.
Restart Cinema 4D, and the Submit To Deadline menu should be available from the Python -> Plugins menu.
Custom Sanity Check
A CustomSanityChecks.py file can be created alongside the main SubmitC4DToDeadline.py submission script (in
[Repository]\submission\Cinema4D\Main), and will be evaluated if it exists. This script will let you set any of the
initial properties in the submission script prior to displaying the submission window. You can also use it to run your
own checks and display errors or warnings to the user. Here is a very simple example of what this script could look
like:
import c4d
from c4d import gui

def RunSanityCheck( dialog ):
    dialog.SetString( dialog.DepartmentBoxID, "The Best Department!" )
    dialog.SetLong( dialog.PriorityBoxID, 33 )
    dialog.SetLong( dialog.ConcurrentTasksBoxID, 2 )
    gui.MessageDialog( "This is a custom sanity check!" )
    return True

The available dialog IDs can be found in the SubmitC4DToDeadline.py script mentioned above. They are defined near
the top of the SubmitC4DToDeadlineDialog class. These can be used to set the initial values in the submission dialog.
Finally, if the RunSanityCheck method returns False, the submission will be cancelled.

9.9.5 FAQ
Which versions of Cinema 4D are supported?
Cinema 4D 12 and later are supported.
When I use Adobe Illustrator files as textures, the render fails with Asset missing
While Cinema 4D is able to use AI files in workstation mode, there are often problems when rendering
in command line mode. Convert the AI files to another known type such as TIFF or JPEG before using
them.
Sometimes when I open the submission dialog in Cinema 4D, the pool list or group list are empty.
Simply close the submission dialog and reopen it to repopulate the lists.
Does rendering Cinema 4D jobs with Deadline use up a full Cinema 4D license?
There are separate Cinema 4D command line licenses that are required to render with Deadline. Please
contact Maxon for more information regarding licensing requirements.
Can Deadline render with Cinema 4D's Net Render Client software?
No. It isn't possible for 3rd party software such as Deadline to control Cinema 4D's Net Render Client,
which is why it uses the command line renderer.
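
For reference, the command line renderer that Deadline drives is the regular Cinema 4D executable started with render flags. A hedged example of the general form (the install path, scene path, and frame range are made up, and the exact arguments Deadline builds may differ by version and job settings) is:

"C:\Program Files\MAXON\CINEMA 4D R15\CINEMA 4D 64 Bit.exe" -nogui -render "\\server\projects\scene.c4d" -frame 10 20 -oimage "\\server\renders\scene"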
I have copied over the SubmitToDeadline.pyp file but the integrated submission script does not show up under the
Python menu.
This is likely caused by some failure in the script. Check your repository path to ensure the client is able
to read and write to that folder. Using the Python console within C4D may provide more specific hints.
My frames never seem to finish rendering. When I check the slave machine, it doesn't appear to be doing
anything.
This can occur if Cinema 4D hasn't been licensed yet. Try starting Cinema 4D normally on the machine
and see if you are prompted for a license. If you are, configure everything and then try rendering on that
machine again.


9.9.6 Error Messages And Meanings


This is a collection of known Cinema 4D error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.10 Cinema 4D Team Render


9.10.1 Job Submission
You can submit jobs from within Cinema 4D by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Cinema 4D, select Python -> Plugins -> Submit Team Render To Deadline.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The Cinema 4D Team Render
specific options are:
Render Client Count: The number of render clients to use.
Security Token: The security token that the Team Render application will use on the slaves (it will be generated
automatically if left blank).


Rendering
After you've configured your submission options, press the Reserve Clients button to submit the Team Render job.
After the job has been submitted, you can press the Update Clients button to update the job's ID and Status in the
submitter. As nodes pick up the job, pressing the Update Clients button will also show them in the Active Servers list.
Cinema 4D's Team Render Machines window will also appear after pressing the Reserve Clients button, and will
show you the Team Render machines that are currently available. Before you can render with them though, you must
verify them by following these steps:
1. Copy the Security Token from the submitter to the clipboard (use the Copy to Clipboard button).
2. Right-click on each machine in the Team Render Machines window and select the Verify option, then paste the
Security Token and press OK.

When you are ready to render, select the Team Render To Picture Viewer option in C4D's Render menu to start
rendering.

9.10.2 Plug-in Configuration


You can configure the Cinema 4D Team Render plug-in settings from the Monitor. While in super user mode, select
Tools -> Configure Plugins and select the Cinema 4D plug-in from the list on the left.


Cinema 4D Options
C4D Team Render Executable: The path to the Cinema 4D Team Render Client executable file used for
rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each
version installed on your render nodes.

9.10.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated Cinema 4D Team Render submission script. This
script allows for submitting Cinema 4D Team Render render jobs to Deadline directly from within the Cinema 4D
editing GUI.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Cinema4DTeamRender/Installers
Manual Installation of the Submission Script
Copy [Repository]/submission/Cinema4DTeamRender/Client/DeadlineC4DTeamRenderClient.pyp to [Cinema
4D Install Directory]/plugins.
Restart Cinema 4D, and the Submit To Deadline menu should be available from the Python -> Plugins menu.

9.10.4 FAQ
Which versions of Cinema 4D are supported?

Cinema 4D 15 and later are supported.

9.10.5 Error Messages And Meanings


This is a collection of known Cinema 4D error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.11 Clarisse iFX


9.11.1 Job Submission
You can submit jobs from within Clarisse iFX by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Clarisse iFX, click on the custom toolbar item you created during the integrated submission
script setup. You will first be prompted to specify a file to export the render archive to.

After you specify the render archive file, the submitter will come up with the Render Archive and Frame List fields
already populated.


Note that if you are submitting from the Monitor, you will have to manually export your render archive from inside
Clarisse iFX, and then browse to the Render Archive file in the Monitor submitter.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Clarisse iFX specific options
are:
Threads: The number of threads to use for rendering. If set to 0, the value in the Clarisse configuration file will
be used.
Verbose Logging: Enables verbose logging during rendering.

9.11.2 Plug-in Configuration


You can configure the Clarisse iFX plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Clarisse plug-in from the list on the left.

Render Executables
CRender Executable: The path to Clarisse's crender executable file used for rendering. Enter alternative
paths on separate lines.
Configuration Options
Global Config File: A global configuration file to be used for rendering. If left blank, the Clarisse.cfg file in the
user home directory will be used instead.
Module Paths: Additional paths to search for modules (one path per line).
Search Paths: Additional paths to search for includes (one path per line).

9.11.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated Clarisse iFX submission script. This script allows for
submitting Clarisse iFX render jobs to Deadline directly from within the Clarisse iFX editing GUI.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Clarisse/Installers
Manual Installation of the Submission Script
In Clarisse iFX, right-click on the toolbar at the top and select Add Item.


In the Add New Item dialog, set the following properties:


Title: Submit To Deadline
Category: Custom
Category Custom: Deadline
Script Path: Choose the DeadlineClarisseClient.py script from [Repository]\submission\Clarisse\Client

Click Add, and you should now see a Deadline tab in the toolbar with a button that you can click on to submit
the job.

9.11.4 FAQ
Which versions of Clarisse iFX are supported?


The crender application is used for rendering, so any version of Clarisse iFX that includes this application is supported.

9.11.5 Error Messages and Meanings


This is a collection of known Clarisse iFX error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.12 Combustion
9.12.1 Job Submission
You can submit Combustion jobs from the Monitor.


Workspace Configuration
In Combustion, when you are ready to submit your workspace, open the Render Queue by selecting File ->
Render... (CTRL+R).
Select which items you want to render in the box on the left.

Configure your output settings under the tab Output Settings.

Under the tab Global Settings, specify an Input Folder (a shared folder where all the footage for your workspace
can be found) and an Output Folder (a shared folder where the output will be dumped). Note that Combustion
will search any subfolders in your Input Folder for footage as well.


Close the Render Queue and save your workspace.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Combustion specific options are:
Workspace File: The Combustion workspace file to be rendered.
Output Operator: Select the output operator in the workspace file to render. The render will fail if the operator
cannot be found.
Version: The version of Combustion to render with.
Skip Existing Frames: Skip over existing frames during rendering (version 4 and later only).
Use Only One CPU To Render: Limit rendering to one CPU (version 4 and later only).

9.12.2 Plug-in Configuration


You can configure the Combustion plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Combustion plug-in from the list on the left.

Render Executables
Combustion Executable: The path to the ShellRender executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.


9.12.3 FAQ
Which versions of Combustion are supported?
Combustion 4 and later are supported.
All my input footage is spread out over the network, so how do I specify a single Input Folder during submission?
When Combustion is given an Input Folder, it will search all subfolders for the required footage until
the footage is found. So if you have a root folder that all of your footage branches off from, you should
specify that root as the Input Folder.
Are there any issues with referencing a file in the global input folder when one or more other files exist with the
same name?
Yes. When there is a file in the scene that has the same name as a file in another subdirectory, the network
renderer will reference the first file with that name that it finds. It ignores the direct path to the correct
subdirectory.
Can Deadline render multiple outputs?
No. Only one output can be enabled in your Combustion workspace. If no outputs are enabled, or multiple
outputs are enabled, the workspace cannot be submitted to Deadline.
When rendering, I receive a pop up error message. Since rendering is supposed to be silent, should I not be
getting error messages like this in the first place?

Make sure that you're using ShellRenderer.exe as the render executable (combustion.exe starts up Combustion normally, while ShellRenderer.exe is the command line rendering application). You can make the
switch in the Plugin Configuration (Tools -> Configure Plugins in the Monitor while in super user mode).
Why isn't path mapping working properly between Windows and Mac?
On the Mac, the Combustion workspace file saves network paths in the form share:\\folder\..., so you have
to set up your Path Mapping settings in the Repository options accordingly.

9.12.4 Error Messages And Meanings


This is a collection of known Combustion error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.13 Command Line


9.13.1 Job Submission
Arbitrary command line jobs can be submitted to Deadline that will execute the same command line for each frame of
the job.
To submit arbitrary command line jobs, refer to the Manual Job Submission documentation. To submit from the
Monitor, refer to the documentation below.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The Command Line specific options
are:
Job Type: Choose a normal job or maintenance job. A normal job will let you specify an arbitrary frame list,
but a maintenance job requires a start frame and an end frame.
Executable: The executable to use for rendering.
Arguments: The arguments to pass to the executable. Use the Start Frame and End Frame buttons to add their
corresponding tags to the end of the current arguments. See the Manual Job Submission documentation for more
information on these tags.
Frame Tag Padding: Determines the amount of frame padding to be added to the Start and End Frame tags.
Start Up Folder: The folder that the executable will be started in. If left blank, the executable's folder will be
used instead.
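
For example, a job that drives a hypothetical command line renderer over a frame range might be filled in roughly as follows (the executable name and its -scene/-start/-end arguments are made up for illustration; the <STARTFRAME> and <ENDFRAME> tags shown are the ones inserted by the Start Frame and End Frame buttons, so use the buttons if you are unsure of the exact tag text):

Executable: C:\Renderers\MyRenderer\render.exe
Arguments: -scene "T:\projects\shot010\scene.xyz" -start <STARTFRAME> -end <ENDFRAME>
Frame Tag Padding: 0

With Frames set to 1-100 and Frames Per Task set to 10, each task would then run the executable once with its own start and end frames substituted into the tags.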

9.13.2 Plug-in Configuration


The Command Line plug-in does not require any configuration.

9.13.3 FAQ
How do I handle paths in the arguments with spaces in them?
Use double-quotes around the path. For example, "T:\projects\path with spaces\project.ext".
Do I need to use the <QUOTE> tags?
These are only needed when submitting manually from the command line. When using the Monitor
submitter, you can just type in the double-quote character in the Arguments field.

9.14 Command Script


9.14.1 Job Submission
You can submit Command Script jobs from the Monitor. Command Script can execute a series of command lines,
which can be configured to do anything from rendering to folder synchronization.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The Command Script specific
options are:
Commands To Execute: Specify a list of commands to execute by either typing them in, or by loading them
from a file. You also have the option to save the current list of commands to a file. To insert file or folder paths
into the Commands field, use the Insert File Path or Insert Folder Path buttons.
Startup Directory: The directory where each command will start up. This is optional, and if left blank, the
executable's directory will be used as the startup directory.
Commands Per Task: Number of commands that will be executed for each task.

9.14.2 Manual Submission


Command Script jobs can also be manually submitted from the command line.
Submission File Setup
Three files are required for submission:
the Job Info File
the Plugin Info File
the Command file
The Job Info file contains the general job options, which are explained in the Job Submission documentation.
The Plugin info file contains one line (this is the directory where each command will startup):
StartupDirectory=...

The Command file contains the list of commands to run. There should be one command per line, and no lines should
be left blank. If your executable path has a space in it, make sure to put quotes around the path. The idea is that
one frame in the job represents one command in the Command file. For example, let's say that your Command file
contains the following:
"C:\Program
"C:\Program
"C:\Program
"C:\Program
"C:\Program

Files\Executable1.exe"
Files\Executable1.exe" -param1
Files\Executable1.exe"
Files\Executable1.exe" -param1 -param2
Files\Executable1.exe"

Because there are five commands, the Frames specified in the Job Info File should be set to 0-4. If the Chunksize is set
to 1, then a separate task will be created for each command. When a slave dequeues a task, it will run the command
that is on the corresponding line number in the Command file. Note that the frame range specified must start at 0.
If you wish to run the commands in the order that they appear in the Command file, you can do so by setting the
MachineLimit in the Job Info File to 1. Only one machine will render the job at a given time, thus dequeuing each
task in order. However, if a task throws an error, the slave will move on to dequeue the next task.
To submit the job, run the following command (where DEADLINE_BIN is the path to the Deadline bin directory):
DEADLINE_BIN\deadlinecommand JobInfoFile PluginInfoFile CommandFile
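
For example, a minimal pair of submission files for the five-command Command file above might look like the following (the file names are arbitrary, and MachineLimit=1 is only needed if you want the commands executed in order, as described above; other job info keys are optional and documented in the Manual Job Submission documentation):

Job Info File (job_info.job):

Plugin=CommandScript
Name=Command Script Example
Frames=0-4
ChunkSize=1
MachineLimit=1

Plugin Info File (plugin_info.job):

StartupDirectory=C:\Temp

The job would then be submitted with:

DEADLINE_BIN\deadlinecommand job_info.job plugin_info.job commands.txt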


Manual Submission Example


This example demonstrates how you can render a single frame from a Maya scene with different options, and direct
the output to a specific location. To get the submission script, download Example Script For Command Script Plugin
from the Miscellaneous Deadline Downloads Page. To run the script, run the following command (you must have Perl
installed):
Perl SubmitMayaCommandScript.pl "SceneFile.mb" FrameNumber "OutputPath"

9.14.3 Plug-in Configuration


The Command Script plug-in does not require any configuration.

9.14.4 FAQ
Can I use executables with spaces in the path?
Yes, just add quotes around the executable path.

9.15 Composite
9.15.1 Job Submission
You can submit jobs from within Composite by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Composite, select the version you would like to submit, hit render, and choose the Background
option when prompted.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Integration options are
explained in the Integration documentation. The Composite specific options are:
Project File: The Composite .txproject file.
Composition: Path to the composition that you want to submit.
Composition Version: The version of the current composition selected.
Users ini file: The path to the user.ini file for this composition.
Version: The version of Composite to use.
Build to Force: Force 32 bit or 64 bit rendering.

9.15.2 Plug-in Configuration


You can configure the Composite plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Composite plug-in from the list on the left.


Render Executables
Composite Executable: The path to the txrender executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.15.3 Integrated Submission Script Setup


The following procedures describe how to setup the integrated Composite submission script. This script allows for
submitting Composite render jobs to Deadline directly from within the Composite editing GUI.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Composite/Installers
Manual Installation of the Submission Script
Copy [Repository]\submission\Composite\Client\DeadlineCompositeClient.py to [CompositeInstall Directory]\resources\scripts\
Setup the Custom Render Action.
In Composite under the Edit menu select Edit -> Project Preferences
In the opened dialog select the Render Actions tab
Under Render Actions, right click and select New
Name the new action Deadline
Enter the following for the Render Command (all on one line):


<PythonExec> <ScriptsFolder>/DeadlineCompositeClient.py -d <RenderProjectPath> -u <RenderUserPath> -c <Composition> -v <Version> -o <Outputs> -s <StartFrame> -e <EndFrame>

There are two additional options you can add to this line:
* -r COMPOSITE_VERSION (where COMPOSITE_VERSION is the version of Composite, like 2012)
* -b COMPOSITE_BUILD (where COMPOSITE_BUILD is the bitness of Composite, which can be set to None, 32bit, or 64bit)

In the Render window, select Deadline as the Action and press Start.

9.15.4 FAQ
Which versions of Composite are supported?
Composite 2010 and later are supported.

9.15.5 Error Messages and Meanings


This is a collection of known Composite error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.16 Corona Standalone


9.16.1 Job Submission
You can submit Corona jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Corona specific options are:
Corona Scene: The Corona scene that will be rendered. Must be a .scn file.
Output File Directory: The directory for the output to be saved to.
Output File Name: The prefix for the output file names. If not specified it defaults to output.
Frame List: The list of frames to be rendered. Each frame will be rendered to a separate output file.


Single Frame Job: If selected, the job is a single frame job.


Configuration File(s): Add any configuration files for Corona here. Configuration files are processed in the
order they are listed.
The Corona specific advanced options are:
Override maximum # of Passes: You can override the configuration file setting for the maximum number of
passes here if this is enabled.
Override maximum Render Time: You can override the configuration file setting for the maximum render
time here if this is enabled.
Override Threads: You can override the configuration file setting for the number of threads here if this is
enabled.

9.16.2 Plug-in Configuration


You can configure the Corona plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Corona plug-in from the list on the left.

Render Executables
Corona Executable: The path to the Corona Standalone executable file used for rendering. Enter alternative
paths on separate lines.
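For example, a Windows path and a Linux path can be listed together (the paths below are hypothetical; use the locations where the Corona Standalone executable is actually installed on your render nodes), and each slave will use the first path that exists on its system:

C:\Corona\Standalone\CoronaStandalone.exe
/opt/corona/standalone/CoronaStandalone.exe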

9.16.3 Error Messages and Meanings


This is a collection of known Corona error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.

Currently, no error messages have been reported for this plug-in.

9.17 Corona Distributed Rendering


9.17.1 Interactive Distributed Rendering
You can submit interactive Corona DR jobs from 3ds Max. The instructions for installing the integrated submission
script can be found further down this page.
The interactive submitter will submit a Corona DR job to reserve render nodes, and the submitter will automatically
update the Corona DR server list in the 3ds Max UI.
Do NOT execute Render Legion's Corona DR Server executable manually on each intended machine. Deadline is
more flexible here and will spawn the Corona DR Server standalone executable as a child process of the Deadline
Slave. This makes the system flexible and resilient to crashes: when the Corona DR job is terminated or completed
in the Deadline queue, the Deadline Slave application will cleanly shut down the DR Server and, more importantly,
any instances of 3dsMax which it in turn has spawned as child processes. This can be helpful if Corona DR or that
instance of 3dsMax becomes unstable and a user wishes to reset the system remotely; simply re-queue, delete/complete,
or re-submit the current Corona DR job.


Port Configuration
Here is a consolidated list of port requirements for Corona DR. Ensure any applicable firewalls are opened to allow
pass-through communication. If in doubt, opening TCP/UDP ports in the range 19660-19670 will typically cover all
Corona implementations for DR. During initial testing, it is recommended to open all ports in this range, verify that
everything works, and then consider tightening up security.


Protocol    Port Number    Application    Notes
UDP         19666          3dsMax         loopback
TCP         19667          3dsMax
TCP         19668          3dsMax
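As a sketch of one way to open this range on a Windows render node (the rule names below are arbitrary, and your studio's firewall policy may differ), the built-in firewall can be configured from an administrator command prompt:

netsh advfirewall firewall add rule name="Corona DR TCP" dir=in action=allow protocol=TCP localport=19660-19670
netsh advfirewall firewall add rule name="Corona DR UDP" dir=in action=allow protocol=UDP localport=19660-19670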

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Corona DR specific options
are:
Maximum Servers: The maximum number of Corona DR Servers to reserve for distributed rendering.
Enable Verbose Logging (Optional): When checked, Corona DR server will create verbose logs.
Use Server IP Address Instead of Host Name: If checked, the Active Servers list will show the server IP
addresses instead of host names.
Automatically Update Server List: When un-checked, this option stops the automatic refresh of the Active
Servers list based on the current Deadline queue.
Complete Job after Render: When checked, as soon as the DR session has completed (max quick render
finished), the Deadline job will be marked as complete in the queue.
Rendering
After you've configured your submission options, press the Reserve Servers button to submit the Corona DR job. The
job's ID and Status will be tracked in the submitter, and as nodes pick up the job, they will show up in the Active
Servers list. Once you are happy with the server list, press Start Render to start distributed rendering.
Note that the Corona DR Server process can sometimes take a little while to initialize. This means that a server in
the Active Servers list could have started the Corona DR Server, but it's not fully initialized yet. If this is the case, it's
probably best to wait a minute or so after the last server has shown up before pressing Start Render.
The Update Servers (3dsMax only) button will manually update the Active Servers list. Note that if you modify the
Maximum Servers value, the job's frame range will be updated when this button is pressed or if Automatically Update
Server List is enabled.
Whilst using the interactive Corona DR Server submission system in 3dsMax, it is recommended NOT to use the
Search LAN button or enable the Search LAN during render checkbox, as you risk accidentally selecting the wrong
Corona DR servers running on your network if another user in your studio is also running one or more Corona DR
servers for their own rendering needs.
After the render is finished, you can press Release Servers or close the submitter UI (Setup Corona DR With Deadline)
to mark the Corona DR job as complete so that the render nodes can move on to another job in your queue.

9.17.2 Corona DR Submission


You can also submit Corona DR jobs from the Monitor, which can be used to reserve render nodes for distributed
rendering. Note that if you submit the job via the Monitor submission script, you will need to manually configure/update
your local workstation settings to point to the corresponding Deadline slaves, either via IP address or hostname,
depending on your local network setup. See your local systems administrator if you're not sure whether to use
a hostname or IP address on your network.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The Corona DR specific options
are:
Maximum Servers: The maximum number of Corona DR Servers to reserve for distributed rendering.
Verbose Logging: Enable for verbose logging from the DrServer application.
Rendering
After you've configured your submission options, press the Submit button to submit the Corona DR job. Note that
this doesn't start any rendering; it just allows the Corona DR Server application to start up on nodes in the farm. Once
you're happy with the nodes that have picked up the job, you can initiate the distributed render manually from within
the application. This will likely require manually configuring your Corona Server list, or, conveniently, you could use
the Search LAN button to automatically find ANY Corona DR servers running on your network. Additionally,
Corona provides a Search LAN during render checkbox, which can be used to locate additional Corona DR Servers
whilst the render is progressing on your workstation; it also allows any errored or user-interrupted servers to re-join
the rendering session.
After the distributed render has finished, remember to mark the job as complete or delete it so that the nodes can move
on to other jobs. Alternatively, use the DR Session timeout functionality described below or the auto task timeout to
control whether these types of jobs are automatically completed after a certain period of time.

9.17.3 Plug-in Configuration


You can configure the Corona DR Server plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the CoronaDR plug-in from the list on the left.

Corona DrServer Executables


Here you can specify the Corona DR server executable used for rendering.
DR Process Handling
Handle Existing DR Process: Only one instance of the same DR process can run over the same port at a time.
This option lets Deadline either fail the task if such a process already exists, or attempt to kill the currently
running process so that the Deadline-managed DR process can run successfully.
DR Session Timeout
DR Session Auto Timeout Enable: If enabled, when a DR session has successfully completed on a slave, the
task on the slave will be marked as complete after the DR session auto timeout period in seconds has been
reached (Default: False).
DR Session Auto Timeout (Seconds): This is the timeout period (Default: 30 seconds) when a DR session will
timeout and be marked as complete by a slave.


9.17.4 Integrated Submission Script Setup


There is an integrated Corona DR submission script for 3ds Max. The installation process for this script can be found
below.
3ds Max
The following procedures describe how to install the integrated Corona DR submission script for 3ds Max. The
integrated submission script and the following installation procedure have been tested with Max versions 2012 and later
(including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work.
However, this bug has been addressed in 3ds Max 2012 Hotfix 1.
You can either run the Submitter installer or manually install the submission script:
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/3dsmaxCoronaDR/Installers
Manual Installation of the Submission Script
Copy [Repository]/submission/3dsmaxCoronaDR/Client/Deadline3dsmaxCoronaDRClient.mcr to [3ds Install
Directory]/MacroScripts. If you don't have a MacroScripts folder in your 3ds Max install directory, check to
see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsmaxCoronaDRClient.mcr file there
if you do.
Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms
Launch 3ds Max, and find the new Deadline menu.


9.17.5 FAQ
Is Corona Distributed Rendering (DR) supported?
Yes. A special reserve job is submitted that will run the Corona DR Server application on the render
nodes. Once the Corona DR Server process is running, these nodes will be able to participate in distributed
rendering.
Which versions of Corona DR are supported?
Corona interactive rendering is supported for 3ds Max 2012-2015.
Corona DR Server application fails to start manually?
During initial configuration of Corona DR Server & any future debugging, it is recommended to disable
any firewall & anti-virus software at both the DR master host machine as well as all render slave machines
which are intended to participate in the DR process. We suggest you manually get Corona DR up and
running in your studio pipeline to verify all is well before then introducing Deadline as a framework to
handle the DR Server application.
Is Backburner required for 3dsMax based Corona DR via Deadline?
Yes. Normal 3dsMax rendering via Deadline requires the Backburner dlls to be present on a system
and this is the same prerequisite for Corona DR rendering to work correctly. Ensure you have the latest/corresponding version of Backburner to ensure it supports the version of 3dsMax you are using. You
can submit a normal 3dsmax render job to verify that Backburner & 3dsMax rendering via Deadline are
all operating correctly before attempting to configure Corona DR rendering. Use the Deadline job report
to verify the correctly matched version of Backburner and 3dsMax are in order.
Do I need to run the Corona DR Server application executable on each machine?
Do NOT execute Render Legion's Corona DR Server executable manually on each intended machine.
Deadline will spawn the Corona DR Server standalone executable as a child process of the Deadline
Slave. This makes the system flexible and resilient to crashes: when the Corona DR job is terminated or
completed in the Deadline queue, the Deadline Slave application will cleanly shut down the DR Server
and, more importantly, any instances of 3dsMax which it in turn has spawned as child processes. This
can be helpful if Corona DR or that instance of 3dsMax becomes unstable and a user wishes to reset the
system remotely; simply re-queue, delete/complete, or re-submit the current Corona DR job.
Can I force Corona DR to run over a certain port?
No. Currently this is not possible and the ports used are fixed. Please see the Port Configuration table at
the top of this page for more information.
Corona DR rendering seems a little unstable sometimes or my machine slows down dramatically!
Depending on the number of slave machines being used (Win7 OS < 20), scene file sizes being moved
around together with asset files, and your network/file storage configuration, it may help to increase the
Synchronization interval [s]: 60 and decrease the Max pixels transfer at once: 500000 settings, which
can help to reduce the load on your local machine and network.

9.17.6 Error Messages and Meanings


This is a collection of known Corona error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.


9.18 CSiBridge
9.18.1 Job Submission
You can submit CSiBridge jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiBridge specific options are:
CSi Bridge Data File(s): The CSi Bridge Data File to be processed. CSi Bridge Files (*.BDB), Microsoft
Access Files (*.MDB), Microsoft Excel Files (*.XLS), CSi Bridge Text Files (*.$BR *.B2K) are supported.
Override Output Directory: If this option is enabled, an output directory can be used to re-direct all processed
files to.
Build To Force: You can force 32 or 64 bit processing with this option.
Submit Data File With Job: If this option is enabled, the Bridge file will be submitted with the job, and then
copied locally to the slave machine during processing.
Version: The version of CSiBridge to render with.
CSiBridge Process/Solver Options are:
Process Selection: Choose to execute inside of the existing Bridge application process or as a separate process.
Solver Selection: Select the Solver to perform the analysis on the data file.
CSiBridge Design Options are:


4 options are available to automatically perform design after the data file has been opened & analysis results are
available.
Steel Frame Design: Perform steel frame design after the analysis has completed.
Concrete Frame Design: Perform concrete frame design after the analysis has completed.
Aluminium Frame Design: Perform aluminium frame design after the analysis has completed.
Cold Formed Frame Design: Perform cold formed frame design after the analysis has completed.
CSiBridge Deletion Options are:
Temp File Deletion: Choose a deletion option to clean up the analysis/log/out files if required.
CSiBridge Additional Options are:
Include Data File: If enabled, the output zip file will contain the data file OR if outputting to a directory path,
the data file will be included.
Compress (ZIP) Output: Automatically compress the output to a single zip file.
Command Line Args: Additional command line flags/options can be added here if required.

9.18.2 Plug-in Configuration


You can configure the CSiBridge plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the CSiBridge plug-in from the list on the left.

Executables
Bridge 15 Executable: The path to the Bridge 15 executable file used for simulating. Enter alternative paths on
separate lines.


Bridge 2014 Executable: The path to the Bridge 2014 executable file used for simulating. Enter alternative
paths on separate lines.
Bridge 2015 Executable: The path to the Bridge 2015 executable file used for simulating. Enter alternative
paths on separate lines.

9.18.3 FAQ
Is CSiBridge supported by Deadline?
Yes.

9.18.4 Error Messages and Meanings


This is a collection of known CSiBridge error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.19 CSiETABS
9.19.1 Job Submission
You can submit CSiETABS jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiETABS specific options
are:
CSi ETABS Data File(s): The CSi ETABS Data File to be processed. CSi ETABS Files (*.EDB), Microsoft
Access Files (*.MDB), Microsoft Excel Files (*.XLS), CSi ETABS Text Files (*.$ET *.E2K) are supported.
Override Output Directory: If this option is enabled, an output directory can be used to re-direct all processed
files to.
Build To Force: You can force 32 or 64 bit processing with this option.
Submit Data File With Job: If this option is enabled, the ETABS file will be submitted with the job, and then
copied locally to the slave machine during processing.
Version: The version of CSi ETABS to render with.
CSiETABS Design Options are:
4 options are available to automatically perform design after the data file has been opened & analysis results are
available.
Steel Frame Design: Perform steel frame design after the analysis has completed.
Concrete Frame Design: Perform concrete frame design after the analysis has completed.
Composite Beam Design: Perform composite beam design after the analysis has completed.
Shear Wall Design: Perform shear wall design after the analysis has completed.
CSiETABS Deletion Options are:
Delete Analysis Results: Choose to delete the analysis results if required.
CSiETABS Additional Options are:
Include Data File: If enabled, the output zip file will contain the data file OR if outputting to a directory path,
the data file will be included.
Compress (ZIP) Output: Automatically compress the output to a single zip file.
Command Line Args: Additional command line flags/options can be added here if required.

9.19.2 Plug-in Configuration


You can configure the CSiETABS plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the CSiETABS plug-in from the list on the left.


Executables
ETABS 2013 Executable: The path to the ETABS 2013 executable file used for simulating. Enter alternative
paths on separate lines.
ETABS 2014 Executable: The path to the ETABS 2014 executable file used for simulating. Enter alternative
paths on separate lines.

9.19.3 FAQ
Is CSiETABS supported by Deadline?
Yes.

9.19.4 Error Messages and Meanings


This is a collection of known CSiETABS error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.


9.20 CSiSAFE
9.20.1 Job Submission
You can submit CSiSAFE jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiSAFE specific options are:
CSi SAFE Data File(s): The CSi SAFE Data File to be processed. CSi SAFE Files (*.FDB), Microsoft Access
Files (*.MDB), Microsoft Excel Files (*.XLS), CSi SAFE Text Files (*.$2K *.F2K) are supported.
Override Output Directory: If this option is enabled, an output directory can be used to re-direct all processed
files to.
Build To Force: You can force 32 or 64 bit processing with this option.
Submit Data File With Job: If this option is enabled, the SAFE file will be submitted with the job, and then
copied locally to the slave machine during processing.
Version: The version of CSi SAFE to process with.
CSiSAFE Analysis/Design/Detailing Option:


Run Method: Choose a run combination option such as Disabled, Run Analysis, Run Analysis & Design
or Run Analysis, Design & Detailing.
CSiSAFE Process/Solver Options:
Process Selection: Choose to execute inside of the existing SAFE application process or as a separate process.
Solver Selection: Select the Solver to perform the analysis on the data file.
Force 32bit Process: Force analysis to be calculated in 32 bit even when the computer is 64 bit.
CSiSAFE Report Option:
Create Report: Create a report based on the current report settings in the model file.
CSiSAFE Export Options:
File Export: Export the data to a Microsoft Access, Microsoft Excel, or text file.
DB Named Set (required): The name of the database tables named set that defines the tables to be exported.
This parameter is required.
DB Group Set (optional): The specified group sets the selection for the exported tables. This parameter is
optional. If it is not specified, the group ALL is assumed.
CSiSAFE Deletion Options:
Temp File Deletion: Choose a deletion option to clean up the analysis/output files if required, such as keep
everything, delete analysis & output files, delete analysis files only, or delete output files only.
CSiSAFE Additional Options:
Include Data File: If enabled, the output zip file will contain the data file OR if outputting to a directory path,
the data file will be included.
Compress (ZIP) Output: Automatically compress the output to a single zip file.

9.20.2 Plug-in Configuration


You can configure the CSiSAFE plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the CSiSAFE plug-in from the list on the left.


Executables
SAFE 12 Executable: The path to the SAFE 12 executable file used for simulating. Enter alternative paths on
separate lines.
SAFE 2014 Executable: The path to the SAFE 2014 executable file used for simulating. Enter alternative paths
on separate lines.

9.20.3 FAQ
Is CSiSAFE supported by Deadline?
Yes.

9.20.4 Error Messages and Meanings


This is a collection of known CSiSAFE error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.


9.21 CSiSAP2000
9.21.1 Job Submission
You can submit CSiSAP2000 jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiSAP2000 specific options
are:
CSi SAP2000 Data File(s): The CSi SAP2000 Data File to be processed. CSi SAP2000 Files (*.SDB), Microsoft Access Files (*.MDB), Microsoft Excel Files (*.XLS), CSi SAP2000 Text Files (*.$2K *.S2K) are
supported.
Override Output Directory: If this option is enabled, an output directory can be used to re-direct all processed
files to.
Build To Force: You can force 32 or 64 bit processing with this option.
Submit Data File With Job: If this option is enabled, the SAP2000 file will be submitted with the job, and then
copied locally to the slave machine during processing.
Version: The version of CSi SAP2000 to render with.
CSiSAP2000 Process/Solver Options are:
Process Selection: Choose to execute inside of the existing SAP2000 application process or as a separate
process.


Solver Selection: Select the Solver to perform the analysis on the data file.
CSiSAP2000 Design Options are:
4 options are available to automatically perform design after the data file has been opened & analysis results are
available.
Steel Frame Design: Perform steel frame design after the analysis has completed.
Concrete Frame Design: Perform concrete frame design after the analysis has completed.
Aluminium Frame Design: Perform aluminium frame design after the analysis has completed.
Cold Formed Frame Design: Perform cold formed frame design after the analysis has completed.
CSiSAP2000 Deletion Options are:
Temp File Deletion: Choose a deletion option to clean up the analysis/log/out files if required.
CSiSAP2000 Additional Options are:
Include Data File: If enabled, the output zip file will contain the data file OR if outputting to a directory path,
the data file will be included.
Compress (ZIP) Output: Automatically compress the output to a single zip file.
Command Line Args: Additional command line flags/options can be added here if required.

9.21.2 Plug-in Configuration


You can configure the CSiSAP2000 plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the CSiSAP2000 plug-in from the list on the left.

Executables

SAP2000 14 Executable: The path to the SAP2000 14 executable file used for simulating. Enter alternative
paths on separate lines.
SAP2000 15 Executable: The path to the SAP2000 15 executable file used for simulating. Enter alternative
paths on separate lines.
SAP2000 16 Executable: The path to the SAP2000 16 executable file used for simulating. Enter alternative
paths on separate lines.
SAP2000 17 Executable: The path to the SAP2000 17 executable file used for simulating. Enter alternative
paths on separate lines.

9.21.3 FAQ
Is CSiSAP2000 supported by Deadline?
Yes.

9.21.4 Error Messages and Meanings


This is a collection of known CSiSAP2000 error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.22 DJV
9.22.1 Job Submission
You can submit DJV jobs from the Monitor. You can use the Submit menu, or you can right-click on a job and select
Scripts -> Submit DJV Quicktime Job To Deadline to automatically populate some fields in the DJV submitter based
on the job's output.


Submission Options
The general submission options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. You can get more information about the DJV specific
options by hovering your mouse over the label for each setting. The Settings buttons can be used to quickly save and
load presets, or reset the settings back to their defaults.

9.22.2 Plug-in Configuration


You can configure the DJV plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the DJV plug-in from the list on the left.


DJV Executables
DJV Executable: The path to the djv_convert executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.22.3 FAQ
Is DJV supported by Deadline?
Yes.
Can I create Apple Quicktime mov files with DJV?
Yes. On Windows, you must use the 32-bit version of DJV only. The LibQuicktime based codecs are
only available in DJV v1.0.1 or later AND only on Linux. As an alternative, you can also use Thinkbox's
Draft product (image/movie creation automation toolkit) which is included in Deadline and is licensed
against your active Deadline support subscription. See Draft for more information.
Can I create EXR files compressed with DreamWorks Animation's DWAA or DWAB compression?
Yes, but this is only supported in DJV v1.0.01 or later.

9.22.4 Error Messages and Meanings


This is a collection of known DJV error messages and their meanings, as well as possible solutions. We want to keep
this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
[ -auto_tag] and [ -tag Name Value] options not working in DJV plugin


DJV has a bug causing it to crash, which is currently stopping these two command line flag options from
working. The code has been commented out in the DJV plug-in and can be re-enabled once the bug is
fixed by the DJV developer.
Various Command Line options failing in DJV
Many of the [djv_convert] command line flags are broken due to spaces being present between the
flag options in DJV versions earlier than v1.0.1. This is all resolved in DJV v1.0.1 and later, so it is
recommended to use at least this version (wrapping the flag options with additional quotation marks does
not resolve the issue, as it's a bug in the actual [djv_convert] command line args parser function).

9.23 Draft
9.23.1 Job Submission
There are many ways to submit Draft jobs to Deadline. As always, you can simply submit a Draft job from within the
Monitor from the Submit menu. In addition, we've also added a right-click job script to the Monitor, which will allow
you to submit a Draft job based on an existing job. This will pull over output information from the original job, and
fill in Draft parameters automatically where possible.


On top of the Monitor scripts, you can also get set up to submit Draft jobs directly from Shotgun. This will again
pull over as much information as possible, this time from the Shotgun database, in order to pre-fill several of the Draft
parameter fields. See the Integrated Submission Script Setup section below for more details on this.
We've also added a Draft section to all of our other submitters. Submitting a Draft job from any of these uses our
Draft Event Plug-in to submit a Draft job based on the job currently being submitted (this is similar in concept to the
right-click job script described above). The Draft job will get automatically created upon completion of the original
job.

9.23.2 Submission Options


The general Deadline options are available in the Draft submitters, and are explained in the Job Submission documentation. Draft-specific options are explained below. It should be noted, however, that given the nature of Draft scripts,
not all of these parameters will be used by all scripts. They can even feasibly be used for different purposes than listed
here.
Draft Script: This is the Draft script (or Template) that you want to run.
Input File: Indicates where the input file(s) for the Draft Script can be found. What kind of file this is will
depend entirely on the Draft Script itself. Passed to the Draft script as inFile.
Output Folder: Indicates where the output file(s) of the Draft Script will be placed. Can be a relative path, in
which case it will be relative to the input. This is passed to the Draft script as outFolder.
Output File Name: As above, the type of file this is will depend entirely on the selected Draft Script. Passed to
the Draft script as outFile.
Frame List: The list of Frames that the Draft Script should work with. Passed to the Draft Script as frameList,
firstFrame, and lastFrame.
User: The name of the user that is submitting the job. Typically used by the Draft script for frame annotations.
Passed to the Draft script as username.
Entity: The name of the entity being submitted. Typically used by the Draft script for frame annotations. Passed
to the Draft script as entity.
Version: The version of the entity being submitted. Typically used by the Draft script for frame annotations.
Passed to the Draft script as version.
Additional Args: Any additional command line arguments that you wish to pass to the Draft script should be
listed here. Appended to arguments listed above.
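As an illustration only (this is not the official Draft template API, and it assumes the parameters above arrive as key=value pairs on the command line), a custom Draft script could read the values like this:

import sys

# Collect key=value arguments such as inFile=..., outFolder=..., frameList=...
params = {}
for arg in sys.argv[1:]:
    key, _, value = arg.partition("=")
    params[key] = value

inFile = params.get("inFile")
outFolder = params.get("outFolder")
frameList = params.get("frameList")
print("Processing %s into %s for frames %s" % (inFile, outFolder, frameList))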

9.23.3 Plug-in Configuration


The Draft plug-in does not require any configuration.

9.23.4 Integrated Submission Script Setup


All of our integrated submission scripts have been updated to have a Draft section, in order to submit dependent Draft
jobs. In addition to this, we also have created scripts to allow you to submit a Draft job directly from Shotgun.
Shotgun Action Menu Item
The best way to install the Draft Submission menu item in Shotgun is to use the automated setup script included in the
Monitor. To access this, select Scripts -> Install Integration Submission Scripts from the Monitor's menu. From there,
click the Install button next to the Draft entry.
It should be noted that this functionality is currently only available on the Windows version, and requires administrator
privileges to run successfully. It should also be noted that while this script will create the Submit Draft Job entry in
Shotgun for everyone to see, this must still be done on each machine that will be submitting Draft jobs.


9.24 Draft Tile Assembler


9.24.1 Job Submission
You can submit Draft tile assembler jobs from the Monitor. Normally, these jobs are submitted as dependent jobs for
your original tile jobs, but you can submit them manually if you wish.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Draft Tile Assembler specific
options are:
Input Config File: The file that will control a majority of the assembly.


Error on Missing File: If enabled, the job will error if any of the tiles in the config file are missing.
Cleanup Tiles: If enabled, the job will delete all of the tile files after the assembly is complete.
Build To Force: You can force 32 bit or 64 bit rendering.
Config File Setup
The config file is a plain text file that uses Key/Value pairs (key=value) to control the draft tile assembly.
TileCount=<#>: The number of tiles that are going to be assembled
DistanceAsPixels=<true/false>: Distances provided in pixels or in a 0.0-1.0 percentage range (Defaults to
True)
BackgroundSource=<BackgroundFile>: If provided, the assembler will attempt to assemble the new tiles
over the specified image.
TilesCropped=<true/false>: If disabled, the assembler will crop the tiles before assembling them.
ImageHeight=<#>: The height of the final image. This will be ignored if a background is provided. If this is
not provided and the tiles are not cropped then the first tile will be used to determine the final image size.
ImageWidth=<#>: The width of the final image. This will be ignored if a background is provided. If this is
not provided and the tiles are not cropped then the first tile will be used to determine the final image size.
Tile<#>Filename=<FileName>: The file name of the tile to be assembled. (Only used if ImageFolder is not
included, 0 indexed)
Tile<#>X=<#>: The X coordinate for the tile that is to be assembled. 0 is at the left side.
Tile<#>Y=<#>: The Y coordinate for the tile that is to be assembled. 0 is at the bottom.
Tile<#>Width=<#>: The width of the tile that is to be cropped. (Only used if TilesCropped is false)
Tile<#>Height=<#>: The height of the tile that is to be cropped. (Only used if TilesCropped is false)
ImageFolder=<Folder>: The folder that you would like to assemble images from. (If included, the assembler
will assemble all tiles within the specified folder.)
ImagePadding=<#>: The amount of padding on the file names within the folder. (Only used if ImageFolder is
included)
ImageExtension=<ext>: The extension of the files to be assembled. (Only used if ImageFolder is included)
Tile<#>Prefix=<Prefix>: The prefix that the file name must contain. (Only used if ImageFolder is included)
Example Config Files
The first example config file will control a simple tile assembly.
#We are assembling 4 tiles into an image
TileCount=4
#The final image will have the following filename
ImageFileName=C:/ExampleConfig/outputFileName.png
#The final Image will have a resolution of 960x540
ImageWidth=960
ImageHeight=540
#The Images are already Cropped
TilesCropped=True
#What is the file that will be the first tile assembled
Tile0FileName=C:/ExampleConfig/_tile_1x1_2x2_sceneName.png
#Where should the first tile go
Tile0X=0
Tile0Y=0
#What is the file that will be the second tile assembled
Tile1FileName=C:/ExampleConfig/_tile_2x1_2x2_sceneName.png
#Where should the second tile go
Tile1X=480
Tile1Y=0
#What is the file that will be the third tile assembled
Tile2FileName=C:/ExampleConfig/_tile_1x2_2x2_sceneName.png
#Where should the third tile go
Tile2X=0
Tile2Y=270
#What is the file that will be the fourth tile assembled
Tile3FileName=C:/ExampleConfig/_tile_2x2_2x2_sceneName.png
#Where should the fourth tile go
Tile3X=480
Tile3Y=270

The second example config file controls a folder render. It will assemble all files within the folder C:/ExampleConfig/
that have the extension exr and the given prefixes. So if the folder contains the files region_0_test.exr, region_1_test.exr,
region_2_test.exr, and region_3_test.exr, then this config file will create the image test.exr:
#We are assembling 4 tiles into an image
TileCount=4
#In the config files we are using relative coordinates instead of pixel coordinates
DistanceAsPixels=0
#The tiles have not yet been cropped so the tile assembler has to crop each tile.
TilesCropped=false
#We are going to assemble all files within the specified folder.
ImageFolder=C:/ExampleConfig
#We are going to only assemble files with the following extension
ImageExtension=exr
#The first tile in each of the images will start with the following prefix
Tile0Prefix=region_0_
#Where should the tile go
Tile0X=0
Tile0Y=0
#Because we are cropping the tiles we need to give it a width and height to crop to
Tile0Width=0.5
Tile0Height=0.5
#The second tile in each of the images will start with the following prefix
Tile1Prefix=region_1_
#Where should the tile go
Tile1X=0.5
Tile1Y=0
#Because we are cropping the tiles we need to give it a width and height to crop to
Tile1Width=0.5
Tile1Height=0.5
Tile2Prefix=region_2_
Tile2X=0
Tile2Y=0.5
Tile2Width=0.5
Tile2Height=0.5
Tile3Prefix=region_3_
Tile3X=0.5
Tile3Y=0.5
Tile3Width=0.5
Tile3Height=0.5

9.24.2 Plug-in Configuration


The Draft Tile Assembler plug-in does not require any configuration.

9.24.3 FAQ
There are no FAQ entries at this time.

9.24.4 Error Messages and Meanings


This is a collection of known Draft Tile Assembler error messages and their meanings, as well as possible solutions.
We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please
email Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.25 EnergyPlus
9.25.1 Job Submission
You can submit EnergyPlus jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The EnergyPlus specific options
are:
EnergyPlus IDF File(s): The EnergyPlus IDF file(s) to be processed.
Weather EPW File(s): The Weather EPW File(s) to be referenced (Optional).
Override Output Directory: If this option is enabled, an output directory can be used to re-direct all processed
files to.
Build To Force: You can force 32 or 64 bit processing with this option.
Submit File(s) With The Job: If this option is enabled, the data file(s) will be submitted with the job, and then
copied locally to the slave machine during processing.
EnergyPlus Post-Process Options are:
../ReadVarsESO.exe Max.Columns: Limit the maximum number of columns used when calling readVarsESO.exe.
Execute ../convertESOMTR.exe: Execute the convertESOMTR.exe application as a post-process.
Execute ../CSVproc.exe: Execute the csvProc.exe application as a post-process.
EnergyPlus Processing Options are:


Multithreading: If enabled, EnergyPlus simulations will use multithreading. Ignored if Concurrent Tasks > 1.
Pause Mode (DEBUG only): Only for Debug purposes. Will PAUSE the program execution at key steps.
EnergyPlus Other Options are:
Include Data File: If enabled, the output zip file will contain the data file OR if outputting to a directory path,
the data file will be included.
Compress (ZIP) Output: Automatically compress the EP output to a single zip file.

9.25.2 Plug-in Configuration


You can configure the EnergyPlus plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the EnergyPlus plug-in from the list on the left.

Executables
EnergyPlus Executable: The path to the EnergyPlus executable file used for simulating. Enter alternative paths
on separate lines.

9.25.3 FAQ
Is EnergyPlus supported by Deadline?
Yes.


9.25.4 Error Messages and Meanings


This is a collection of known EnergyPlus error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.26 FFmpeg
9.26.1 Job Submission
You can submit FFmpeg jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The FFmpeg specific options are:


Input File: The input file.


Input Arguments: Additional command line arguments for the input file.
Replace Frame in Input File(s) With Padding: If enabled, the frame number in the file name will be replaced
by frame padding before being passed to FFmpeg. This should be enabled if you are passing a sequence of
images as input.
Output File: The output file.
Output Arguments: Additional command line arguments for the output file.
Additional Arguments: Additional general command line arguments.
Additional Input Files: Specify up to 9 additional input files. You can give each file their own arguments, or
use the same arguments as the main input file.
FFmpeg Preset Files: Specify preset files for video, audio, or subtitle.
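As a rough illustration of the kind of command line these fields produce (the paths, frame rate, and codec arguments below are hypothetical), an image-sequence input rendered to an H.264 movie might look like:

ffmpeg -framerate 24 -i /renders/shot_%04d.png -c:v libx264 -pix_fmt yuv420p /renders/shot.mov

Here -framerate 24 would come from the Input Arguments field, -c:v libx264 -pix_fmt yuv420p from the Output Arguments field, and the frame number in the input file name has been replaced by the %04d padding token.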

9.26.2 Plug-in Configuration


You can configure the FFmpeg plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the FFmpeg plug-in from the list on the left.

Render Executables
FFmpeg Executable: The path to the FFmpeg executable file used for rendering. Enter alternative paths on
separate lines.

9.26.3 FAQ
Currently, there are no FAQs for this plug-in.

9.26.4 Error Messages and Meanings


This is a collection of known FFmpeg error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.27 Fusion
9.27.1 Job Submission
You can submit jobs from within Fusion by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Fusion, select Script -> DeadlineFusionClient.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
Fusion Comp: The flow/comp file to be rendered.
Frame List: The list of frames to render.
Frames Per Task: This is the number of frames that will be rendered at a time for each job task.
Proxy: The proxy level to use (not supported in command line mode).
Version: The version of Fusion to render with.
Build: Force 32 or 64 bit rendering. Default is None.
Use Frame List In Comp: Enable this option to pull the frame range from the comp file.

Check Output: If checked, Deadline will check all savers to ensure they have saved their image file (not
supported in command line mode).
High Quality: Whether or not to render with high quality (not supported in command line mode).
Command Line Mode: Render using separate command line calls instead of keeping the scene loaded in
memory between tasks. Using this feature disables the High Quality, Proxy, and Check Saver Output options.
This uses the FusionCmd plug-in, instead of the Fusion one.
Submit Comp File: If this option is enabled, the flow/comp file will be submitted with the job, and then copied
locally to the slave machine during rendering.
The in-app submitter has the following additional submission options:
Render First And Last Frames First: The first and last frame of the flow/comp will be rendered first, followed
by the remaining frames in the sequence. Note that the Frame List above is ignored if this box is checked (the
frame list is pulled from the flow/comp itself).
Submit Comp File With Job: If this option is enabled, the flow/comp file will be submitted with the job, and
then copied locally to the slave machine during rendering.
Check Saver Output: If checked, Deadline will check all savers to ensure they have saved their image file (not
supported in command line mode).

9.27.2 Plug-in Configuration


You can configure the Fusion and FusionCmd plug-in settings from the Monitor. While in super user mode, select
Tools -> Configure Plugins and select the Fusion plug-in from the list on the left.
Fusion


Fusion Options
Fusion Render Executable: The path to the Fusion Render Slave executable used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
Fusion Wait For Executable: If you use a proxy RenderSlave.exe, set this to the name of the renamed original.
For example, it might be set to RenderSlave_original.exe. Leave blank to disable this feature.
Fusion Version To Enforce: Deadline will only render Fusion jobs on slaves running this version of Fusion.
Use a ; to separate alternative versions. Leave blank to disable this feature.
Fusion Slave Preference File: The path to a global RenderSlave.prefs preference file that is copied over before
starting the Render Slave. Leave blank to disable this feature.
General Fusion Options
Load Comp Timeout: Maximum time for Fusion to load a comp, in seconds.
Script Connect Timeout: Amount of time allowed for Fusion to start up and accept a script connection, in
seconds.
FusionCmd

Fusion Render Executable: The path to the Fusion Console Slave executable used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
Fusion Slave Preference File: The path to a global RenderSlave.prefs preference file that is copied over before
starting the Render Slave. Leave blank to disable this feature.


9.27.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated Fusion submission script. This script allows for
submitting Fusion render jobs to Deadline directly from within the Fusion editing GUI.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Fusion/Installers
Manual Installation of the Submission Script
Copy [Repository]/submission/Fusion/Client/DeadlineFusionClient.eyeonscript to [Fusion Install Directory]/Scripts/Comp
Restart Fusion to find the DeadlineFusionClient option in the Script menu.
Custom Sanity Check Setup
In the [Repository]/submission/Fusion/Main folder, you can create a file called CustomSanityChecks.eyeonscript.
This script will be called by the main Fusion submission script before submission, and can be used to perform sanity
checks. Within this script file, you must define this function, which is called by the main script:
function CustomDeadlineSanityChecks(comp)
    local message = ""
    ...
    return message
end

All your checks should be placed within this function. This function should return a message that contains the sanity
check warnings. If an empty message is returned, then it is assumed the sanity check was a success and no warning is
displayed to the user. Here is a simple example that checks if any CineFusion tools are being used in the comp file:
function CustomDeadlineSanityChecks(comp)
    local message = ""

    -----------------------------------------------------
    -- RULE: Check to make sure Cinefusion is disabled
    -----------------------------------------------------
    cinefusionAttrs = fusion:GetRegAttrs("CineFusion")
    if not (cinefusionAttrs == nil) then
        cinefusion_regID = cinefusionAttrs.REGS_ID
        local i = nil
        for i, v in comp:GetToolList() do
            if (v:GetID() == cinefusion_regID) then
                if (v:GetAttrs().TOOLB_PassThrough == false) then
                    message = message .. "CineFusion '" .. v:GetAttrs().TOOLS_Name .. "' should be disabled\n"
                end
            end
        end
    end

    return message
end

9.27.4 FAQ
Which versions of Fusion are supported?
Fusion 5 and later are supported.
What's the difference between the Fusion and FusionCmd plugins?
The Fusion plugin starts the Fusion Render Node in server mode and uses eyeonscript to communicate
with the Fusion renderer. Fusion and the comp remain loaded in memory between tasks to reduce overhead. This is usually the preferred way of rendering with Fusion.
The FusionCmd plugin renders with Fusion by executing command lines, and can be used by selecting
the Command Line mode option in the Fusion submitter. Because Fusion needs to be launched for each
task, there is some additional overhead when using this plugin. In addition, the Proxy, High Quality, and
Saver Output Checking features are not supported in this mode. However, this mode tends to print out
better debugging information when there are problems (especially when Fusion complains that it can't
load the comp), so we recommend using it to help figure out problems that may be occurring when using
the Fusion plugin.
Can I use both workstation and render node licenses to render jobs in Deadline?
You can use workstation licenses to render; you just need to do a little tweaking to get this to work nicely.
In the Plugin Configuration settings, you need to specify two paths for the render executable option. The
first path will be the render node path, and the second will be the actual Fusion executable path. You then
have to make sure that the render node is not installed on your workstations. Because you have specified
two paths, Deadline will only use the second path if the first one doesn't exist, which is why the render
nodes can't be installed on your workstations.
Why is it not possible to have 2 instances of Fusion running?
With Fusion there is only one tcp/ip port to which eyeonscript (the scripting language used to run Fusion
renders on a slave computer) can connect. If Fusion is open on a slave computer then the port will be in
use and the Fusion Render Node will have to wait for the port to become available before rendering of
Fusion jobs on that slave can begin.
Fusion alone renders fine, but with Deadline, the slaves are failing on the last frame.
This is usually accompanied by this error message:
INFO: Checking file \\path\to\filename####.ext
INFO: Saver "SaverName" did not produce output file.
INFO: Expected file "\\path\to\filename####.ext" to exist.

The issue likely has to do with the processing of fields as opposed to full frames. When processing your
output as fields, the frames are rendered in two halves (for example, frame 1 would be rendered as 1.0 and
1.5). This error often occurs when the Global Timeline is not set to include the second half of the final
frame. Simply adding a .5 to the Global End Time should resolve this issue.
For example, let us assume that you are processing fields and your output range is 0 - 100. If the Global
Timeline is set to be 0.0 - 100.0, Fusion will render everything, but Deadline will fail on the last frame. If
the Global Timeline is set to be 0.0 - 100.5, Deadline will render everything just fine.
Is there a way to increase Deadline's efficiency when rendering Fusion frames that only take a few seconds to
complete?

Rendering these frames in groups (groups of 5 for example) tends to reduce the job's overall rendering
time. The group size can be set in the Fusion submission dialog using the Task Group Size option.
Does Fusion cache data between frames on the network, in the same way it does when rendering sequences
locally?
Deadline renders each block of frames using the eyeonscript comp.render function. The Fusion Render
Node is kept running between each block rendered, so when Fusion caches static results, it can be used
by the next block of frames to be rendered on the same machine.
Fusion seems to be taking a long time to start up when rendering. What can I do to fix this?
If you are running Fusion off a remote share, this can occur when there is a large number of files in the
Autosave folder. If this is the case, deleting the files in the Autosave folder should fix the problem.
Can I use relative paths in my Fusion comp when rendering with Deadline?
If your comp is on a network location, and everything is relative to that network path, you can use relative
paths if you choose the option to not submit the comp file with the job. In this case, the slaves will load the
comp directly over the network, and there shouldn't be any problems with the relative paths. Just make
sure that your render nodes resolve the paths the same way your workstation does.

9.27.5 Error Messages and Meanings


This is a collection of known Fusion error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Exception during render: Failed to load the comp [flowname].comp in startjob.eyeonscript?
This error usually occurs because the render node is missing a plug-in that is referenced by the flow
in question. Often this is because there is a plug-in installed on the machine from which the job was
submitted that is not in the Fusion Render Node plug-in directory on the slave machine. It is important
to remember that the Fusion Render Node has a different plug-in store than Fusion even on the same
machine; thus, one should ensure that the needed plug-ins are copied/installed in both locations.
Exception during render: The fusion renderer reported that the render failed. Scroll down to the bottom of the
log below for more details.
This can occur for a number of reasons, but often Fusion will print out the cause for the error. In the
error log window, scroll to the end of the Slave Log capture which is near the bottom of the error message
window, and there will be a part which looks something like the following message. This particular
message indicates that a font was missing on the machine.
INFO: Render started at Wed 8:17PM (Range: 198 to 198)
INFO:
INFO: Comments: Could not find font SwitzerlandCondensed
INFO:
INFO: Saver 1 failed at time 198
INFO:
INFO: Render failed at Wed 8:18PM! Last frame rendered: (none)!
INFO:
INFO: Render failed
We've usually found that the problem behind this error was a plug-in that was installed for Fusion, but
not for the Fusion Render Node. Try updating your Fusion Render Node plug-ins to match your Fusion
plug-ins exactly, and check whether the error still occurs.
Exception during render: Eyeonscript failed to make a connection in startjob.eyeonscript - check that Eyeonscript is set to no login required?
In order to connect to the Fusion Render Node and communicate with it, Deadline uses eyeonscript,
the scripting language provided for Fusion. The script connects to the Fusion Render Node via a socket
connection, which by default requires a login username and password to connect.
In order for Deadline to be able to render using a given Fusion Render Node, you must change its settings
so that it no longer requires the username and password. This is done by running the Fusion Render Node,

right clicking on the icon it creates, and choosing preferences. From there, pick the Script option, and you
will see radio buttons, one of which says No login required. Make sure that is the option selected,
then click Save to save the preferences, and exit Fusion Render Node.

9.28 Fusion Quicktime


9.28.1 Job Submission
You can submit Fusion Quicktime jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
Fusion Options
Fusion Version: Select the version of Fusion to generate the Quicktime with.
Build: Force 32 or 64 bit rendering.
Load/Save Preset: Allows you to save your Fusion Quicktime options to a preset file, so that you can load them
again later.
Input/Output Options
Input Images: The frames you would like to generate the Quicktime from. If a sequence of frames exist in
the same folder, Deadline will automatically collect the range of the frames and will set the Frame Range field
accordingly.
Frames: The frame range used to generate the Quicktime.
Frame Rate: The frame rate of the Quicktime.
Override Start: Allows the starting frame in the Quicktime to be overridden. For example, if you are making
a Quicktime from images with a range 101-150, you can override the start frame to be 1, and the range in the
Quicktime will appear as 1-50.
Output Movie File: The name of the Quicktime to be generated.
Codec: The codec format to use for the Quicktime.
On Missing Frames: What the generator will do when a frame is missing or is unable to load. There are 4
options:
Fail: Nothing will be generated until the missing frame becomes available.
Hold Previous: The last valid frame will be included instead of the missing frame.
Output Black: A black frame will be included instead of the missing frame.
Wait: The generator will wait until the missing frame becomes available.
Quicktime Options
BG Plate: Specify an optional background plate. The Quicktime will render using the selected file as the background.
Template: Specify an optional comp template. See the Template documentation below for more information.
Artist Name: If you have a text tool with "artist" in its name in the selected template comp, its text will be set
to the name that is specified.
Curve Correction: Select to turn on the color curves tool (available when using templates only).
Quality %: The quality of the Quicktime.
Proxy: The ratio of pixels to render (for example, if set to 4, one out of every four pixels will be rendered).
Gamma: The gamma level of the Quicktime.
Exposure Compensation: The stops value used to calculate the gain parameter of the Brightness/Contrast
tool. The gain parameter is calculated by using the value pow(2,stops).
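As a quick illustration of that calculation (a minimal sketch; the stops values below are arbitrary examples, not defaults), the gain can be computed as follows:

# Illustrative only: derive the Brightness/Contrast gain from an exposure
# compensation value expressed in stops, as described above.
def gain_from_stops(stops):
    return pow(2, stops)

print(gain_from_stops(1.0))   # 2.0  -> one stop up doubles the gain
print(gain_from_stops(-2.0))  # 0.25 -> two stops down quarters the gain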

9.28.2 Quicktime Templates


A comp template can be specified to put all the messages and watermarks that you want into the Quicktime. It has
some standardized comp naming conventions so that the renderer can set some standard text tool values, as well as the
input and output images. Here is an example of a very simple template file.

As you can see, this simple template consists of a loader, a saver, a text tool, and a merge tool. This template simply
merges the text tool with the loader so that "This is a test" appears in your Quicktime. You can create your own
template files, but they must meet the following requirements. As long as these requirements are met, you can add
whatever you like between the loader and the saver.
There must be exactly one loader and one saver.
The loader must have a dummy file name specified (the file doesn't have to exist).

9.28.3 Plug-in Configuration


The Fusion Quicktime submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for information on
configuring the Fusion plug-in.

9.28.4 FAQ
Which versions of Fusion are supported?
Fusion 5 and later are supported.
How is this different than submitting regular Quicktime jobs?
Regular Quicktime jobs are more generic, and provide more general Quicktime options. Fusion Quicktime
jobs are more customizable (i.e. they can use templates), but they require Fusion to render.

9.28.5 Error Messages and Meanings


The Fusion Quicktime submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for Fusion error
messages and meanings.

9.29 Generation
9.29.1 Job Submission
You can submit comp jobs to Fusion from within Generation by installing the integrated submission script. The
instructions for installing the integrated submission script can be found further down this page.
In Generation, select the comp(s) you want to submit, and then right-click and select Submit.

This will bring up the submission window. Note that the submission window is only shown once, and all jobs that are
submitted will use the same job settings.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Fusion options are:
Use Frame List In Comp: Uses the frame list defined in the comp files instead of the Frame List setting. If you
are submitting more than one comp from Generation, you should leave this option enabled unless you want the
Frame List setting to be used for each comp.
Proxy: The proxy level to use.
High Quality Mode: Whether or not to render with high quality.
Check Output: If checked, Deadline will check all savers to ensure they have saved their image file.
Version: The version of Fusion to render with.
Build: Force 32 or 64 bit rendering.
Command Line Mode: Render using separate command line calls instead of keeping the scene loaded in
memory between tasks. Using this feature disables the High Quality, Proxy, and Check Saver Output options.
This uses the FusionCmd plug-in, instead of the Fusion one.

9.29.2 Plug-in Configuration


The Generation submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for information on configuring the Fusion plug-in.

9.29.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated Generation submission script. This script allows for
submitting Generation comp jobs to Deadline directly from within the Generation editing GUI.
Submitter Installer

Run the Submitter Installer located at <Repository>/submission/Generation/Installers


Manual Installation of the Submission Script
Copy [Repository]\submission\Generation\Client\DeadlineGenerationClient.lua to the Generation scripts folder
([Generation Install Folder]\scripts\generation).
In the Generation program data folder (%PROGRAMDATA%\eyeon\Generation), you'll need to edit your Generation.cfg file. If you currently do not have a Generation.cfg file, create an empty one. Open your Generation.cfg file and add this:
SCRIPT_FARMSUBMIT="scripts\generation\DeadlineGenerationClient.lua"

Save the file. The next time you start up Generation, this script will be used when you select the Submit option
for the selected comps.

9.29.4 FAQ
Which versions of Generation are supported?
Generation 2 and later are supported.

9.29.5 Error Messages and Meanings


The Generation submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for Fusion error messages
and meanings.

9.30 Hiero
9.30.1 Job Submission
You can submit transcoding jobs to Nuke from within Hiero by installing the integrated submission script. The
instructions for installing the integrated submission script can be found further down this page.
To submit from within Hiero, open the Export window from the File menu, or by right-clicking on a sequence. Then
choose the Submit To Deadline option in the Render Background Tasks drop down and press Export.

This will bring up the submission window. Note that the submission window is only shown once, and all jobs that are
submitted will use the same job settings.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Nuke specific options are:
Render With NukeX: Enable this option if you want to render with NukeX instead of Nuke.
Render Threads: The number of threads to use for rendering.
Continue On Error: If enabled, Nuke will attempt to keep rendering if an error occurs.
Maximum RAM Usage: The maximum RAM usage (in MB) to be used for rendering.
Use Batch Mode: If enabled, Deadline will keep the Nuke file loaded in memory between tasks.
Build To Force: Force 32 or 64 bit rendering.

9.30.2 Cross-Platform Rendering Considerations


The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for cross-platform rendering
considerations.

9.30.3 Plug-in Configuration


The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for information on configuring the
Nuke plug-in.

9.30.4 Integrated Submission Script Setup


The following procedures describe how to install the integrated Hiero submission script. This script allows for submitting Hiero transcoding jobs to Deadline directly from within the Hiero editing GUI. These jobs are then rendered
using the Nuke plugin.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Hiero/Installers
Manual Installation of the Submission Script
Go to your .hiero user folder (~/.hiero or %USERPROFILE%\.hiero) and create a folder called Python if it
doesn't exist.
Open the Python folder and create another folder called Startup if it doesn't exist.
Copy [Repository]\submission\Hiero\Client\DeadlineHieroClient.py to the Startup folder (~/.hiero/Python/Startup or %USERPROFILE%\.hiero\Python\Startup).

The next time you launch Hiero, there should be a Submit To Deadline option in the Hiero Export window, in the
Render Background Tasks drop down.
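If you prefer to script the manual copy step above, the following is a minimal Python sketch of it (the repository path shown is a hypothetical placeholder; substitute your own):

# Illustrative only: copy the Hiero submission script into the user's
# Startup folder, following the manual steps described above.
import os
import shutil

REPOSITORY = r"\\server\DeadlineRepository7"  # hypothetical repository root
source = os.path.join(REPOSITORY, "submission", "Hiero", "Client", "DeadlineHieroClient.py")
startup = os.path.join(os.path.expanduser("~"), ".hiero", "Python", "Startup")

if not os.path.isdir(startup):
    os.makedirs(startup)
shutil.copy(source, startup)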

9.30.5 FAQ
The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for additional FAQs related to
Nuke.
Which versions of Hiero are supported?
Hiero 1.0 and later are supported.
How does the Deadline submission script for Hiero work?
The submission script submits transcoding jobs from Hiero to Deadline, which are rendered with the Nuke
plugin.

9.30.6 Error Messages and Meanings


The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for Nuke error messages and
meanings.

9.31 Houdini
9.31.1 Job Submission
You can submit jobs from within Houdini by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Houdini, select Render -> Submit To Deadline.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Houdini specific options are:
ROP To Render:
Choose: Allows you to choose your ROP from the drop-down to the right.
Selected: Allows you to render each ROP that you currently have selected in Houdini (in the order that
you selected them).
All: Allows you to render every ROP in the Houdini file.
Ignore Inputs: If enabled, only the selected ROP will be rendered. No dependencies will be rendered.
Build to Force: Force 32 or 64 bit rendering.
Submit Wedges as Separate Jobs: If enabled, each Wedge in a Wedge ROP will be submitted as a separate job
with the current Wedge settings. This option is only enabled if the selected ROP is a Wedge ROP, or if all ROPs
are being rendered and at least one of them is a Wedge ROP.

Tile Rendering Options


Enable Tile Rendering to split up a single frame into multiple tiles.
Enable Tile Rendering: If enabled, the frame will be split into multiple tiles that are rendered individually and
can be assembled after.
Tiles In X: Number of horizontal tiles.
Tiles In Y: Number of vertical tiles.
Single Frame Tile Job Enabled: Enable to submit all tiles in a single job.
Single Job Frame: The frame that will be split up.
Submit Dependent Assembly Job: Submit a job dependent on the tile job that will assemble the tiles.
Cleanup Tiles after Assembly: If selected, the tiles will be deleted after assembly.
Error on Missing Tiles: If enabled, the assembly job will fail if any of the tiles are missing.
Assemble Over: Determines what the Draft Tile Assembler should assemble over, be it a blank image, previous
output, or a specified file.
Error on Missing Background: If enabled, the job will fail if the background file is missing.
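As a purely illustrative sketch of what the Tiles In X and Tiles In Y values imply (this is not Deadline's internal implementation, and the resolution used is an arbitrary example), the frame is divided into per-tile pixel regions along both axes:

# Illustrative only: compute the pixel region covered by each tile for a
# given image resolution and tile grid.
def tile_regions(width, height, tiles_in_x, tiles_in_y):
    regions = []
    for ty in range(tiles_in_y):
        for tx in range(tiles_in_x):
            left = tx * width // tiles_in_x
            right = (tx + 1) * width // tiles_in_x
            top = ty * height // tiles_in_y
            bottom = (ty + 1) * height // tiles_in_y
            regions.append((left, top, right, bottom))
    return regions

# A 1920x1080 frame split into a 2x2 grid yields four 960x540 regions.
for region in tile_regions(1920, 1080, 2, 2):
    print(region)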
IFD Exporting and Mantra Standalone
The Houdini submitter allows you to submit a job that will export the scene to IFD files, and then submit a dependent
Mantra Standalone job to render the exported IFD files.

When submitting from the Monitor, you just need to enable the Override Export IFD option. When submitting from
within Houdini using the integrated submission script, you must first make sure that the ROPs you wish to export have
the Disk File option enabled in their properties, and then enable the Submit Dependent Mantra Standalone Job option
in the submitter. Note that if a ROP does not have the Disk File setting enabled, it will simply render the image, and
no dependent Mantra Standalone job will be submitted.

The general Deadline options for the Mantra Standalone job are explained in the Job Submission documentation. The
Mantra Standalone specific options are:
Mantra Threads: The number of threads to use for the Mantra standalone job.

9.31.2 Plug-in Configuration


You can configure the Houdini plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Houdini plug-in from the list on the left.

Render Executables
Hython Executable: The path to the hython executable. It can be found in the Houdini bin folder. Enter
alternative paths on separate lines. Different executable paths can be configured for each version installed on
your render nodes.
Licensing Options
Slaves To Use Escape License: A list of slaves that should use a Houdini Escape license instead of a Batch
license. Use a comma (,) to separate multiple slave names, for example: slave001,slave002,slave003
Path Mapping (For Mixed Farms)
Enable Path Mapping: If enabled, Deadline will use Houdini's HOUDINI_PATHMAP environment variable
to perform path mappings on the contents of the Houdini scene file. This feature can be turned off if there are
no Path Mapping entries defined in the Repository Options.
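As a rough illustration of the idea (the paths below are hypothetical, and in practice the entries come from the Path Mapping settings in the Repository Options rather than being set by hand), HOUDINI_PATHMAP holds a dictionary-style string mapping one path prefix to another:

# Illustrative only: a hypothetical mapping from a Windows share to a Linux
# mount point, in the dictionary-style string format used by HOUDINI_PATHMAP.
import os

os.environ["HOUDINI_PATHMAP"] = '{ "Z:/projects" : "/mnt/projects" }'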

9.31.3 Integrated Submission Script Setup


The following procedures describe how to setup the integrated Houdini submission script for Deadline. This script has
been tested with Houdini 9 and later.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/houdini/Installers
Manual Installation of the Submission Script
On Windows or Linux, copy the client script to the Houdini install directory
If the folder [Houdini Install Directory]\houdini\scripts\deadline\ doesn't exist, create it.

Copy [Repository]\submission\Houdini\Client\DeadlineHoudiniClient.py to [Houdini Install Directory]\houdini\scripts\deadline\DeadlineHoudiniClient.py

On Mac OSX, copy the client script to the Houdini Framework folder
If the folder [Houdini Framework]/Versions/[Houdini Version]/Resources/houdini/scripts/deadline/ doesn't exist, create it.
Copy [Repository]\submission\Houdini\Client\DeadlineHoudiniClient.py to [Houdini Framework]/Versions/[Houdini Version]/Resources/houdini/scripts/deadline/DeadlineHoudiniClient.py

The Houdini Framework folder can typically be found in /Library/Frameworks/Houdini.Framework


Add a menu item to execute the script
Open the file [Houdini Install Directory]/houdini/MainMenuCommon in a text editor.
Add the following in between the <mainMenu> and </mainMenu> tags, and make sure it is added after the
</menuBar> closing tag.
<addScriptItem id="h.deadline">
    <parent>render_menu</parent>
    <label>Submit To Deadline</label>
    <scriptPath>$HFS/houdini/scripts/deadline/DeadlineHoudiniClient.py</scriptPath>
    <scriptArgs></scriptArgs>
    <insertAfter/>
</addScriptItem>

For example, this is what the last few lines of your MainMenuCommon file might look like:
</menuBar>
<addScriptItem id="h.deadline">
    <parent>render_menu</parent>
    <label>Submit To Deadline</label>
    <scriptPath>$HFS/houdini/scripts/deadline/DeadlineHoudiniClient.py</scriptPath>
    <scriptArgs></scriptArgs>
    <insertAfter/>
</addScriptItem>
</mainMenu>

9.31.4 FAQ
Which versions of Houdini are supported by Deadline?
Houdini 9 and later are supported. To render with Houdini 7 or 8, use the Mantra Plug-in.
Which Houdini license(s) are required to render with Deadline?
Deadline uses Hython to render, which uses hbatch licenses. If those are not available, it will try to use a
Master License instead.

9.31.5 Error Messages and Meanings


This is a collection of known Houdini error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.32 Lightwave
9.32.1 Job Submission
You can submit jobs from within Lightwave by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.

To submit from within Lightwave, select the Render Tab and click the SubmitToDeadline button on the left.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Lightwave specific options are:
Content Directory: The Lightwave Content directory. Refer to your Lightwave documentation for more information.
Config Directory: The Lightwave Config directory. Refer to your Lightwave documentation for more information.
Force Build: For Lightwave 9 and later, force rendering in 32 bit or 64 bit.
Use FPrime Renderer: If you want to use the FPrime renderer instead of the normal Lightwave renderer.
Use ScreamerNet Rendering: ScreamerNet rendering keeps the Lightwave scene loaded in memory between
frames, which reduces overhead time when rendering.
Notes:
At the moment, there is no support for rendering animation (movie) files. Any animation options will be ignored,
and an RGB output and/or Alpha output must be specified in order to submit to Deadline.

In the Scene file, some versions of Lightwave use a number to specify the output file type and some use the
actual file type extension (.tif, .tga, etc). In the versions that use the actual file type extension, individual
rendered images can be viewed from the Monitor task list by right-clicking on them.
For information on how to properly set up your network for Lightwave rendering, see the ScreamerNet section
of your Lightwave documentation. When Lightwave is properly configured for ScreamerNet rendering, it will
then render properly through Deadline.

9.32.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with Lightwave, you must setup Mapped Paths so that Deadline can swap
out file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by
selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the left.

From here, you can set up the mapped paths that will be used for path swapping. To get a more detailed description of
each setting, simply hover the mouse cursor over a setting and a tool tip will be displayed.

9.32.3 Integrated Submission Script Setup


This section describes how to install the integrated render job submission script for Lightwave. This script allows
for submitting Lightwave render jobs to Deadline directly from within the Lightwave editing GUI. Note that on Mac
OSX, this script is only supported by the Universal Binary versions of Lightwave.
Click the Utilities tab. Find the Plugins section on the right and click the Add Plug-ins button. Select the
DeadlineLightwaveClient.ls file found in [Repository]\submission\Lightwave\Client.

Click the Edit menu in the top-left corner and select the Edit Menu Layout... option.

In the Command list on the left, expand the Plug-ins section in Lightwave 8 or the Additional section in Lightwave 9 and later, and find the DeadlineLightwaveClient plugin. Drag and drop it into the Menus list in the
Render section. Click Done.

Click the Render tab. There should be a DeadlineLightwaveClient button on the right. If there is not, check to
make sure you placed the DeadlineLightwaveClient plugin in the correct section.

9.32.4 FAQ
Which versions of Lightwave are supported?
Lightwave versions 8 and later are supported. On Mac OSX, both the PPC and Universal Binary versions
work. However, the integrated Lightwave submission script only works with the Universal Binary version.
Lightwave 10 integrated submitter crashes with Deadline 5.0 and older on Mac OSX.
Due to an API change in LightWave, previous integrated submission scripts will not work under LightWave 10 on OSX. This is fixed in Deadline 5.1.
Does Deadline support the FPrime renderer?
Yes. FPrime has its own net rendering application called wsn.exe, which can be configured in the Lightwave plugin configuration. When you submit your Lightwave job, just make sure to have the Use FPrime
Renderer option checked.
When rendering with FPrime, I get an error that it can't create a temporary config directory.
This can occur when the job is using a shared Config folder on the network. FPrime tries to create a
temporary config directory in this shared folder, and this can fail if many slaves are trying to access that
Config folder at the same time.
To avoid this problem, we suggest enabling the FPrime Use Local Config option in the Lightwave Plugin
Configuration, which can be accessed from the Monitor while in Super User mode by selecting Tools ->
Configure Plugins. When this option is enabled, Deadline will copy the contents of the shared Config
folder to a local folder, and this is the Config folder that FPrime will use.
What does the Use ScreamerNet Rendering option in the submission dialog do?
When using ScreamerNet rendering, the Lightwave scene is kept loaded in memory between each frame
for a job, which greatly reduces the overhead of having to load the scene at the beginning of each frame.
Does Deadline work if one renames the Lightwave configuration files in the configuration directory?
Currently, Deadline assumes that you have not renamed the Lightwave configuration files in the Lightwave
configuration directory.

9.32.5 Error Messages and Meanings


This is a collection of known Lightwave error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.33 LuxRender
9.33.1 Job Submission
You can submit LuxRender jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The LuxRender specific options
are:
LXS File: The file to render.
Threads: The number of threads to use. Specify 0 to use the same number of threads as there are CPUs.

9.33.2 Plug-in Configuration


You can configure the LuxRender plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the LuxRender plug-in from the list on the left.

Render Executables
Luxrender Executable: The path to the luxconsole executable file used for rendering. Enter alternative paths
on separate lines.

9.33.3 FAQ
Is LuxRender supported by Deadline?
Yes.

9.33.4 Error Messages and Meanings


This is a collection of known LuxRender error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.34 LuxSlave
9.34.1 Job Submission
You can submit LuxRender Slave jobs from the Monitor, which can be used to reserve render nodes for distributed
rendering. Note that you will need to manually configure/update your locally running LuxRender UI network queue to
point to the corresponding Deadline slaves or IP addresses, using the same port number.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The LuxSlave specific options are:
Maximum LC Slaves: The maximum number of Luxconsole Slaves to reserve for distributed rendering.
Port Number: Override the default Luxconsole Slave TCP port number of 18018 to use.
Threads: The number of threads to use. Specify 0 to use the same number of threads as there are CPUs.
Verbosity Level: The level of verbosity to use.

9.34.2 Plug-in Configuration


You can configure the LuxSlave plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the LuxSlave plug-in from the list on the left.

Console Executables
Luxconsole Executable: The path to the luxconsole executable file used for rendering. Enter alternative paths
on separate lines.
Luxconsole Slave Options
Write film to disk before transmitting: Write film to disk before transmitting.
Specify the cache directory to use: Specify the local cache directory to use instead of the default (the local
user's temp directory).
Slave Process Handling
Handle Existing Slave Process: Either Do Nothing, FAIL on existing Slave process or KILL the existing Slave
process if already running.
Slave Session Timeout
Slave Session Auto Timeout Enable: Enable to force Slave Session to be marked as complete after a Slave
Session closes on a Deadline slave.
Slave Session Auto Timeout (Seconds): Slave Session minimum timeout before last closed Slave Session is
marked as complete on Deadline slave (seconds).

9.34.3 FAQ
Is Luxconsole DR Slave supported by Deadline?
Yes.

9.34.4 Error Messages and Meanings


This is a collection of known LuxSlave error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.35 Mantra Standalone


9.35.1 Job Submission
You can submit Mantra Standalone jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Mantra specific options are:
IFD File: Specify the Mantra IFD file(s) to render.
If you are submitting a sequence of .IFD files, select one of the numbered frames in the sequence, and
the frame range will automatically be detected if Calculate Frames From IFD File is enabled. The
frames you choose to render should correspond to the numbers in the .IFD files.
Output File: The output file path.
Version: The Mantra version to render with.
Threads: The number of threads to use for rendering.
Additional Arguments: Additional command line arguments to pass to the renderer.
Tile Rendering Options
Enable Tile Rendering to split up a single frame into multiple tiles.
Enable Tile Rendering: If enabled, the frame will be split into multiple tiles that are rendered individually and
can be assembled after.
Tiles In X: Number of horizontal tiles.
Tiles In Y: Number of vertical tiles.
Single Frame Tile Job Enabled: Enable to submit all tiles in a single job.
Single Job Frame: The frame that will be split up.
Submit Dependent Assembly Job: Submit a job dependent on the tile job that will assemble the tiles.
Cleanup Tiles after Assembly: If selected, the tiles will be deleted after assembly.
Error on Missing Tiles: If enabled, the assembly job will fail if any of the tiles are missing.
Assemble Over: Determines what the Draft Tile Assembler should assemble over, be it a blank image, previous
output, or a specified file.
Error on Missing Background: If enabled, the job will fail if the background file is missing.

9.35.2 Plug-in Configuration


You can configure the Mantra plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Mantra plug-in from the list on the left.

Render Executables
Mantra Executable: The path to the Mantra executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.
Path Mapping (For Mixed Farms)
Enable Path Mapping: If enabled, Deadline will use Houdini's HOUDINI_PATHMAP environment variable
to perform path mappings on the contents of the IFD file. This feature can be turned off if there are no Path
Mapping entries defined in the Repository Options.

9.35.3 FAQ
Which versions of Mantra are supported by Deadline?
Mantra for Houdini 7 and later is supported by Deadline.

9.35.4 Error Messages and Meanings


This is a collection of known Mantra error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.36 Maxwell
9.36.1 Job Submission
You can submit Maxwell jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Maxwell specific options are:
Maxwell Options
Maxwell File(s): The Maxwell files to be rendered. Can be a single file, or a sequence of files.
Version: The version of Maxwell to render with.
Verbosity: Set the amount of information that Maxwell should output while rendering.
Single Frame Job: This should be checked if you're submitting a single Maxwell file only.
Build To Force: Force 32 bit or 64 bit rendering.
Threads: The number of threads to use during rendering. Specify 0 to use the default setting.
Co-op Rendering
Cooperative Rendering: Enable this to use Maxwell's co-op rendering feature to render the same image across
multiple computers. You can then use Maxwell to combine the resulting output after the rendering has completed.
Split Co-op Renders Into Separate Jobs: By default, a co-op render is submitted as a single job, where each
task represents a different seed. If this option is enabled, a separate job will represent each seed.
Adjust Sampling Overrides For Cooperative Rendering: If this option is enabled, the sampling level given
to each slave will be reduced accordingly to ensure that the final merged sampling level will match the requested
one.
Number of Co-op Renders: The number of co-op render jobs to submit to Deadline.
Auto-Merge Files: Enable this option to auto-merge the co-op renders into the final image.
Fail On Missing Intermediate Files: If enabled, the auto-merge will fail if any co-op renders are missing.
Delete Intermediate Files: If enabled, the co-op renders will be deleted after the final image is merged together.
Output Options
Output MXI File: Optionally configure the output path for the MXI file which can be used to resume the render
later. Note that this is required for co-op rendering though.
Output Image File: Optionally configure the output path for the image file.
Render Camera: Optionally specify which camera to render with.
Enable Local Rendering: If enabled, Deadline will save the output locally and then copy it to the final network
location.
Resume Rendering From MXI File: If enabled, Maxwell will use the specified MXI file to resume the render
if it exists. If you suspend the job in Deadline, it will pick up from where it left off when it resumes.
Overrides
Override Time: Enable to override the Time setting in the Maxwell file.
Override Sampling: Enable to override the Sampling setting in the Maxwell file.
Extra Sampling (requires Maxwell 3.1 or later)
Override Extra Sampling: If the extra sampling settings should be overridden.
Enabled: If extra sampling is enabled.
Sampling Level: The extra sampling level.
Invert Mask: If the extra sampling alpha mask must be inverted.


Mask: The extra sampling mask.
Custom Alpha: The custom alpha name that will be used for the extra sampling mask (if Mask is set to Custom
Alpha).
Bitmap File: The bitmap file that will be used for the extra sampling mask (if Mask is set to Bitmap).
Command Line Options
Additional Arguments: Additional command line arguments to pass to the renderer.
Resuming a Render
When specifying an MXI file, you now have the option to have Maxwell use it to resume a render job if that MXI file
already exists. This means that if you suspend a Maxwell job from the Monitor mid-render, it will resume from where
it left off when you resume the job.

9.36.2 Plug-in Configuration


You can configure the Maxwell plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Maxwell plug-in from the list on the left.

General Maxwell Options


Slaves To Use Interactive License: A list of slaves that should use an interactive Maxwell license instead of a
render license. Use a comma (,) to separate multiple slave names, for example: slave001,slave002,slave003
Maxwell Version Options

Render Executable: The path to the Maxwell executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.
Merge Executable: The path to the Maxwell executable file used for merging. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.36.3 FAQ
Which versions of Maxwell are supported by Deadline?
Versions 2 and later are supported.
Is Co-op Rendering supported?
Yes.
Can I resume from a previous Maxwell render?
If you have the Resume Rendering From MXI File option enabled when submitting the job, Maxwell will
use the specified MXI file to resume the render if it exists. If you suspend the job in Deadline, it will pick
up from where it left off when it resumes.

9.36.4 Error Messages and Meanings


This is a collection of known Maxwell error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.37 Maya
9.37.1 Job Submission
You can submit jobs from within Maya by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.

To submit from within Maya, select the Thinkbox shelf and press the green button there. If the green icon is missing,
you can delete the shelf and restart Maya to get it back.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Maya specific options are:

Additional Frame Options


Render Preview Job First: When enabled, two jobs will be submitted - a PREVIEW job with a fraction of
the frames, and a REST job with all other frames. The PREVIEW job can be submitted with slightly higher
priority and will provide a glimpse into the final result. If its output looks incorrect, you can suspend the REST
job before wasting render time rendering a wrong submission.
Priority Offset: Specify a higher priority for the PREVIEW job.
Number of Preview Frames: Specify the number of frames to preview.
Submit Dependent Job With Remaining Frames: If enabled, the REST job will be dependent on the PREVIEW job.
Task Order: The order in which to render the frames for the job.
Out Of Order Step: Defines the Nth frame step to use for some of the Task Order options.

Render Options
Camera: Select the camera to render with. Leaving this blank will force Deadline to render using the default
camera settings (including multiple camera outputs).
Project Path: The Maya project folder (this should be a shared folder on the network).
Output Path: The folder where your output will be dumped (this should be a shared folder on the network).
Maya Build: Force 32 bit or 64 bit rendering.

Use MayaBatch Plugin: This uses our new MayaBatch plugin that keeps the scene loaded in memory between
frames, thus reducing the overhead of rendering the job. This plugin is no longer considered experimental.
Ignore Error Code 211: This allows a Maya task to finish successfully even if the Maya command line renderer
returns the non-zero error code 211 (not available when using the MayaBatch plugin). Sometimes Maya will
return this error code even after successfully saving the rendered images.
Startup Script: Maya will source the specified script file on startup (only available when using the MayaBatch
plugin).
Command Line Args: Specify additional command line arguments to pass to the Maya command line renderer
(not available when using the MayaBatch plugin).
Deadline Job Type: Select the type of Maya job you want to submit. The available options are covered in the
next few sections.
Maya Render Job
If rendering a normal Maya job, select the Maya Render Job type.

General Options
The following options are available:
Threads: The maximum number of CPUs per machine to render with.
Frame Number Offset: Uses Maya's frame renumbering option to offset the frames that are rendered.
Submit Render Layers As Separate Jobs: Enable to submit each layer in your scene as a separate job.
Override Layer Job Settings: If submitting each layer as a separate job, enable this option to override the job
name, frame list, and task size for each layer. When enabled, the override dialog will appear after you press
Submit.

Submit Cameras As Separate Jobs: Enable to submit each camera as a separate job.
Ignore Default Cameras: Enable to have Deadline skip over cameras like persp, top, etc, when submitting each
camera as a separate job (even if those cameras are set to renderable).
Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the
final network location. This has been known to improve the speed of Maya rendering in some cases.
Strict Error Checking: Enable this option to have Deadline fail Maya jobs when Maya prints out any error
or warning messages. If disabled, Deadline will only fail on messages that it knows are fatal.
Render Half Frames: If checked, frames will be split into two using a step of 0.5. Note that frame 0 will save
out images 0 and 1, frame 1 will save out images 2 and 3, frame 2 will save out images 4 and 5, etc.
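A small sketch of the image numbering pattern described above (purely illustrative arithmetic, not a Deadline API):

# Illustrative only: the image numbers written for a given frame when the
# Render Half Frames option is enabled (frame n renders sub-frames n and
# n + 0.5, saved as images 2n and 2n + 1).
def half_frame_images(frame):
    return (2 * frame, 2 * frame + 1)

for frame in range(3):
    print(frame, "->", half_frame_images(frame))  # 0 -> (0, 1), 1 -> (2, 3), 2 -> (4, 5)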

Region Rendering Options


If submitting a Maya Render Job or an Arnold Export Job, you can choose to submit a region rendering job. You can
also submit a dependent assembly job to assemble the image when the main region job completes. If using Draft for
the assembly, you will need a license from Thinkbox. Otherwise, the output formats that are supported are BMP, DDS,
EXR, JPG, JPE, JPEG, PNG, RGB, RGBA, SGI, TGA, TIF, and TIFF.

The following options are available:


Enable Region Rendering: If enabled, the frame will be split into multiple tiles that are rendered individually
and can be assembled after.
Region Render Type: If set to Jigsaw Rendering then the submission will use Jigsaw, otherwise it will use a
grid of tiles.
Submit All Tiles as a single Job: If enabled, a single frame will be submitted with all tiles in a single job,
otherwise each tile will be submitted as a separate job, with each frame being a separate task.
Submit Dependent Assembly Job: Submit a job dependent on the region job that will assemble the tiles. If
doing a Jigsaw animation, a separate job will be created for each differently named output file.
Assemble Using Draft: Draft is required when using Jigsaw Rendering. However, when Tile Rendering is the
chosen type, you can choose to assemble with Draft, or with the old Tile Assembler application.
Cleanup Tiles after Assembly: If enabled, the tiles will be deleted after assembly.
Error on missing Tiles: If enabled, the assembly job will fail if any of the tiles are missing.
Assemble Over: Determines what the Draft Tile Assembler should assemble over, be it a blank image, previous
output, or a specified file.
Error on Missing Background: If enabled, the Draft Tile Assembler job will fail if the background file is
missing.
Renderer Specific Options
If rendering with Mental Ray, there is an additional Mental Ray Options section under the Maya Options:

Mental Ray Verbosity: Set the verbosity level for Mental Ray renders.
Auto Memory Limit: If enabled, Mental Ray will automatically detect the optimal memory limit when rendering.
Memory Limit: Soft limit (in MB) for the memory used by Mental Ray (specify 0 for unlimited memory).
If rendering with VRay, there is an additional VRay Options section under the Maya Options:
Auto Memory Limit Detection: If enabled, Deadline will automatically detect the dynamic memory limit for
VRay prior to rendering.
Memory Buffer: Deadline subtracts this value from the system's unused memory to determine the dynamic
memory limit for VRay.
If rendering with Redshift, there will be an additional Redshift Options under the Maya Options:
GPUs Per Task: If set to 0 (the default), then Redshift will be responsible for choosing the GPUs to use for
rendering. If this is set to 1 or greater, then each task for the job will be assigned specific GPUs. This can be
used in combination with concurrent tasks to get a distribution over the GPUs.
For example:
if this is set to 1, then tasks rendered by the Slave's thread 0 would use GPU 0, thread 1 would use GPU
1, etc.
if this is set to 2, then tasks rendered by the Slave's thread 0 would use GPUs {0,1}, thread 1 would use
GPUs {2,3}, etc.
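A minimal sketch of that assignment pattern (illustrative arithmetic only; the thread and GPU indices follow the example above, and this is not a Deadline API):

# Illustrative only: which GPUs a given concurrent-task thread would be
# assigned when GPUs Per Task is greater than zero.
def gpus_for_thread(thread_index, gpus_per_task):
    start = thread_index * gpus_per_task
    return list(range(start, start + gpus_per_task))

print(gpus_for_thread(0, 2))  # [0, 1]
print(gpus_for_thread(1, 2))  # [2, 3]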
Mental Ray Export Job
If rendering a Mental Ray Export job, select the Mental Ray Export Job type.

The following options are available:


Output File: The full filename of the Mental Ray files that will be exported. Padding is handled automatically
by the exporter.
Export Settings: This opens up the Mental Ray export settings dialog where you can configure the remaining
settings. Note that this dialog must be open when you submit the job.

You have the option to submit a dependent Mental Ray Standalone job that will render the exported mi files after the
export job finishes. The Mental Ray specific job options are:
Threads: The number of threads to use for rendering.
Frame Offset: The first frame in the input MI file being rendered, which is used to offset the frame range being
passed to the mental ray renderer.
Mental Ray Build: You can force 32 or 64 bit rendering.
Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network
location.
Command Line Args: Specify additional command line arguments you would like to pass to the mental ray
renderer.
VRay Export Job


If rendering a VRay Export job, select the VRay Export Job type.

The following options are available:


Output File: The full file name of the VRay files that will be exported (padding is handled automatically by the
exporter).
You have the option to submit a dependent VRay Standalone job that will render the exported vrscene files after the
export job finishes. The VRay specific job options are:
Threads: The number of threads to use for rendering.
Vrimg2Exr Conversion Job: If you are submitting a dependent VRay Standalone job, and the output format is
vrimg, you have the option to submit a dependent job that will convert the vrimg files to exr files, using VRays
vrimg2exr application.
Renderman Export Job
If rendering a Renderman Export job, select the Renderman Export Job type.

The following options are available:


Threads: The number of threads to use for exporting. Specify 0 to automatically use the optimal number of
threads.
Render with RIS: If checked, the exported RIB files will have RIS set as the renderer instead of REYES.
You have the option to submit a dependent PRMan Render job that will render the exported rib files after the export
job finishes. The PRMan specific job options are:
Threads: The number of threads to use for rendering.
Command Line Args: Specify additional command line arguments you would like to pass to the PRMan
renderer.
Arnold Export Job
If rendering an Arnold Export job, select the Arnold Export Job type.

You have the option to submit a dependent Arnold Standalone job that will render the exported .ass files after the
export job finishes. The Arnold Standalone specific job options are:
Local Export to Arnold: If this option is set to true, the Arnold .ass files will be exported locally.
Threads: The number of threads to use for rendering.
Command Line Args: Specify additional command line arguments you would like to pass to the Arnold renderer.
Maxwell Export Job
If rendering a Maxwell Export job, select the Maxwell Export Job type.

The following options are available:


Maxwell Script Name: The path that the exported Maxwell MXS files will be saved to.
You have the option to submit a dependent Maxwell Standalone job that will render the exported .MXS files after the
export job finishes. The Maxwell Standalone specific job options are:
Local Export to Maxwell: If this option is set to true, the Maxwell MXS files will be exported locally.
Threads: The number of threads to use for rendering.
Command Line Args: Specify additional command line arguments you would like to pass to the Maxwell renderer.

9.37.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with Maya, you must setup Mapped Paths so that Deadline can swap out
the Project and Output paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super
user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the left.
As long as all paths used in your Maya scene are relative to the Project and Output paths, and those paths are network
accessible, you should have no problems performing cross-platform renders.

However, if you are using absolute paths in your Maya scene file, it is possible for Deadline to swap them as well, but
you must save your scene file as a Maya ASCII (.ma) file. Because .ma files are ASCII files, Deadline can read them and
swap out paths as necessary. If they're saved as Maya Binary (.mb) files, they can't be read, and can't have their paths
swapped.

9.37.3 Plug-in Configuration


You can configure the MayaBatch and MayaCmd plug-in settings from the Monitor. While in super user mode, select
Tools -> Configure Plugins and select the Maya plug-in from the list on the left.
MayaBatch

Render Executables
Maya Executable: The path to the Maya executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.
Maxwell For Maya (version 2 and later)
Slaves To Use Interactive License: A list of slaves that should use an interactive Maxwell license instead of a
render license. Use a comma (,) to separate multiple slave names, for example: slave001,slave002,slave003
Path Mapping For ma Scene Files (For Mixed Farms)
Enable Path Mapping For ma Files: If enabled, a temporary ma file will be created locally on the slave for
rendering and Deadline will do path mapping directly in the ma file.
Debugging
Log Script Contents To Render Log: If enabled, the full script that Deadline is passing to Maya will be written
to the render log. This is useful for debugging purposes.

MayaCmd

Render Executables
Maya Executable: The path to the Maya executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.
Maxwell For Maya (version 2 and later)
Slaves To Use Interactive License: A list of slaves that should use an interactive Maxwell license instead of a
render license. Use a comma (,) to separate multiple slave names, for example: slave001,slave002,slave003
Path Mapping For ma Scene Files (For Mixed Farms)
Enable Path Mapping For ma Files: If enabled, a temporary ma file will be created locally on the slave for
rendering and Deadline will do path mapping directly in the ma file.

9.37.4 Integrated Submission Script Setup


The following procedures describe how to install the integrated Maya submission script. This script allows for submitting Maya render jobs to Deadline directly from within the Maya editing GUI. The script and the following installation
procedure has been tested with Maya 2010 and later.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Maya/Installers
Manual Installation of the Submission Script
On Windows, copy the file [Repository]\submission\Maya\Client\DeadlineMayaClient.mel to [Maya Install Directory]\scripts\startup. If you do not have a userSetup.mel in [My Documents]\maya\scripts, copy the file [Repository]\submission\Maya\Client\userSetup.mel to [My Documents]\maya\scripts. If you have a userSetup.mel file, add
the following line to the end of this file:
source "DeadlineMayaClient.mel";

On Mac OS X, copy the file [Repository]/submission/Maya/Client/DeadlineMayaClient.mel to [Maya Install Directory]/Maya.app/Contents/scripts/startup.
If you do not have a userSetup.mel in /Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts, copy the file [Repository]/submission/Maya/Client/userSetup.mel to /Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts.
If you have a userSetup.mel file, add the following line to the end of this file:
source "DeadlineMayaClient.mel";

On Linux, copy the file [Repository]/submission/Maya/Client/DeadlineMayaClient.mel to [Maya Install Directory]/scripts/startup. If you do not have a userSetup.mel in /home/[USERNAME]/maya/scripts,
copy the file [Repository]/submission/Maya/Client/userSetup.mel to /home/[USERNAME]/maya/scripts. If you have
a userSetup.mel file, add the following line to the end of this file:
source "DeadlineMayaClient.mel";

The next time Maya is started, a Deadline shelf should appear with a green button that can be clicked on to launch the
submitter.
If you don't see the Deadline shelf, it's likely that Maya is loading another userSetup.mel file from somewhere. Maya
can only load one userSetup.mel file, so you either have to configure Maya to point to the file mentioned above, or you
have to modify the file that Maya is currently using as explained above. To figure out which userSetup.mel file Maya
is using, open up Maya and then open up the Script Editor. Run this command:
whatIs userSetup.mel

Custom Sanity Check


You can create a CustomSanityChecks.mel file alongside the main SubmitMayaToDeadline.mel in the [Repository]\submission\Maya\Main folder, and it can be used to set defaults in the submission script before it is displayed.
For example, here is a script that can set the default Limit Groups based on the renderer:
AddStringAttribute( "deadlineLimitGroups" );
if( GetCurrentRenderer() == "mentalRay" )
    setAttr defaultRenderGlobals.deadlineLimitGroups -type "string" "mray_for_maya";
else if( GetCurrentRenderer() == "vray" )
    setAttr defaultRenderGlobals.deadlineLimitGroups -type "string" "vray_for_maya";
else
    setAttr defaultRenderGlobals.deadlineLimitGroups -type "string" "";

The available Deadline globals are defined in the SavePersistentDeadlineOptions function in the SubmitMayaToDeadline.mel script. These can be used to set the initial values in the submission dialog.
You can also create a CustomPostSanityChecks.mel file alongside the main SubmitMayaToDeadline.mel in the
[Repository]\submission\Maya\Main folder. It can be used to run some additional checks after the user clicks the
Submit button in the submitter. It must define a global proc called CustomPostSanityCheck() that takes no arguments,
and must return 0 or 1. If 1 is returned, the submission process will continue, otherwise it will be aborted. Here is an
example script:
global proc CustomPostSanityCheck()
{
    // Don't allow mayaSoftware jobs to be submitted
    if( GetCurrentRenderer() == "mayaSoftware" )
        return 0;
    return 1;
}

9.37.5 FAQ
Do I need to install Maya on each machine that will render, along with all required 3rd party plugins?
Yes. Traditionally, Maya and all required scripts and 3rd party plugins should be installed and
licensed (where applicable) on each machine that will be used for network rendering. However, VFX
studios that operate on a Linux platform often install the software onto a centralized file server (one
that has the performance to support this configuration) and configure all local machines to point at
this central location. Additionally, 3rd party plugins/scripts can then be added to this central server
path in combination with floating licenses. This level of custom deployment and configuration is
beyond the scope of Thinkbox support, and you would be best advised to engage an approved Autodesk
reseller or Autodesk directly on best practices here.
The URL links below may be of assistance. If you are able to install and successfully run Maya and all your plugins/scripts
from a network location in your studio, then Deadline will be able to support network rendering from
this location as well. Simply update the MayaBatch and MayaCmd plugins with the new executable path
location using the Deadline Monitor: click on Tools > Super User Mode > Configure Plugins... >
MayaBatch or MayaCmd.
How to install Maya on a network share
Maya Environment Variables
Which versions of Maya are supported?
Maya versions 2010 and later are all supported.
Which Maya renderers are supported?
All Maya renderers should work fine with Deadline. The renderers that are known to work with Deadline are 3Delight, Arnold, Caustic Visualizer, Final Render, Gelato, Krakatoa, Maxwell, MayaSoftware,
MayaHardware, MayaVector, Mental Ray, Octane, RedShift, Renderman, Renderman RIS, Turtle, and
VRay. If you see a Maya renderer that's not on this list, email Deadline Support and let us know!
Does the Maya plugin support Tile Rendering?
Yes. See the Region Rendering Options section above for more details.
Does the Maya plugin support multiple arbitrarily sized, multi-resolution Tile Rendering for both stills and animations, with automatic re-assembly, including the use of multi-channel image formats and arbitrary Render
Passes (incl. V-Ray/Arnold/MR support)?
Yes. We call it Jigsaw and it's unique to the Deadline system! See the Region Rendering Options
section above for more details.
Which Maya application should I select as the render executable in the MayaCmd plugin configuration?
Select the Render.exe application. This is Maya's command line renderer.
Which Maya application should I select as the render executable in the MayaBatch plugin configuration?
Select the MayaBatch.exe application. This is Maya's batch renderer.
What is the MayaBatch plugin, and how is it different than the MayaCmd plugin?
This plugin keeps the Maya scene loaded in memory between frames, thus reducing the overhead of
rendering the job. This is the recommended plugin to use, but if you run into any problems, you can
always try using the MayaCmd plugin.
Why is each task of my job rendering the same frame(s)?
This happens if you have the Renumber Frames option enabled in your Maya render settings. Each task
is a separate batch, and if Renumber Frames is enabled, each batch will start at that frame number.
I have a multi-core machine, but the machine isn't using 100% of the CPU when rendering. What can I do?
When submitting the job, set the Threads option to 0. This will instruct Maya to use the optimal
number of threads when rendering based on the machine's core count.
Does Deadline support Maya render layers?
Yes. You can either submit one job that renders all the layers, or you can submit a single job per layer.
Can I render scenes that use Maya Fur?
A recommended setup for Maya is to have your project folder on a shared location that all of your machines can see (whether it be a Windows folder share or a mapped path), then create your Maya scene
in this project folder. This way, when you submit the job, you can specify the shared project path in the submission dialog, and all of your slave machines will be able to see it (and therefore see the Maya Fur
folders within the project folder).
Can I make use of the particle cache during network renders?
Yes, you can. All that is necessary to do this is to make your scene's project directory network-accessible
to your slaves. For a guide to setting up particle caches, check out this guide on the ResPower Website
that describes the proper set-up procedure for the Maya particle cache.
When clicking on one of the folder browser buttons in the Maya submission dialog, I sometimes get an error.
There is an article on this problem. It's a .NET problem that seems to randomly occur when the user
specifies a path of more than 130 characters, but it looks like Microsoft provides a hotfix for it.
When submitting the job from Maya, if I check the Submit Each Render Layer As A Separate Job box, no jobs
are submitted when I click submit.
The render layers you want to submit need to be set to renderable (the letter R needs to appear next to
the render layer) for the submitter to submit the layer. Note that render layers should not be confused with
display layers. Deadline only deals with render layers; it does not use the Maya option to render only the
content of a specific display layer.
I'm trying to render a certain frame range from Maya, but Deadline is rendering the entire frame range set in
the Maya render globals.
If you have the Submit Each Render Layer As A Separate Job box checked, Deadline grabs the frame
information from each individual layer's render globals when submitting the job. If unchecked, Deadline
will use the info from the Frame List in the submission dialog.
Rendering Maya scenes with Deadline is taking forever in comparison to a local render of the same file.
One thing you can try is ensuring that the Local Rendering option is enabled when submitting the job to
Deadline. This forces Maya to render the frame locally, then copy it to the final destination after. This has
been known to improve rendering speeds.
How do I configure Mental Ray Satellite to render Mental Ray for Maya jobs with Deadline?
1. Choose a satellite master machine, then modify the maya.rayhosts file of that machine so that it uses the slaves you want (see the example below).
2. Only put the master machine in Deadline.
3. Submit a job, and make sure that the job will be picked up by the master machine you have set up. Use pools to do so.
4. In the job property page of the Maya job, in the Maya tab, you could add the following line in the additional arguments field: -rnm 1
This -rnm 1 means "render no master" true, which will force the master not to participate in the rendering but only submit and receive the render tasks. You will get better results this way.
You could also use -rnm 0, which means "render no master" false, and force 1 CPU on the master (if your master is a dual CPU) so you have 1 CPU free on the master to dispatch the tasks. In short, you should always have 1 CPU free on the
master machine for dispatching, or else your render time will suffer.
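As a rough illustration only (the host names below are placeholders, and whether a :port suffix is required depends on your mental ray satellite setup), the maya.rayhosts file is simply a text file that lists the satellite slave machines, one per line:

rendernode01
rendernode02
rendernode03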
Can I submit MEL or Python Maya script files to Deadline?
Yes, you can submit your own custom scripts from the Advanced tab in the Maya submission script in the
Monitor Submit menu.
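For illustration, here is a minimal, hypothetical sketch of a Python script that could be submitted this way. It assumes the script is executed inside the Maya session on the render node, so the maya.cmds module is available:

# Hypothetical example script: report the Maya version and the currently loaded scene.
import maya.cmds as cmds

print("Maya version: %s" % cmds.about(version=True))
print("Current scene: %s" % cmds.file(query=True, sceneName=True))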
Can I Perform Fume FX Simulations With Deadline?
Yes, and it's supported by both our MayaBatch and MayaCmd plugins. To do so, follow these steps:
1. FumeFX for Maya v3.5.4 or later is required.


2. Ensure you have MayaSoftware selected as renderer in Maya.


3. Before you launch the Maya submission script, make sure that the Fume FX NetRender (Backburner)
toggle button is ON in the FumeFX options in Maya.
4. Fume FX output paths must be UNC paths or use a mapped drive letter (Windows).
5. The Deadline Slave must have either a Fume FX full or Fume FX simulation license available and
authorized. If you wish to use a Sim Only Mode license, you can switch via the Fume FX
Prefs Dialog in Maya prior to Deadline submission. Note that you must restart Maya for this
license mode change to be committed. Do this before submitting to Deadline if you need to use a
Sim Only license on a Deadline Slave.
6. Submit any arbitrary Maya single frame to begin the Fume FX Simulation (Fume FX uses its own
frame range). However, note that Maya will render whatever single frame the Fume FX job was
submitted on at the end of the simulation.
7. Please see the Fume FX for Maya help manual for more details on the above requirements.
How can I region render large VRay Scenes?
By changing the memory frame buffer on the VRay Common tab of the render settings to None, you will
be able to render larger tiles, since VRay does not crop the tiles.

9.37.6 Error Messages and Meanings


This is a collection of known Maya error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Error in StartJob: Error in RenderExecutable: 64 bit Maya ####_0 render executable was not found in the semicolon separated list C:/Program Files/Autodesk/Maya####/bin/MayaBatch.exe;/usr/autodesk/maya####/bin/maya.
The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor.
(Where #### is the year of the Maya release, such as 2015.) Using the Deadline Monitor, click on Tools
> Super User Mode > Configure Plugins... > MayaBatch or MayaCmd. The above error
message indicates that Deadline has been unable to locate the correct local install path of Maya on the
machine which generated the error. Deadline already ships with all the known, default install
path locations for Maya across the 3 supported operating systems, so you shouldn't have to edit these
paths unless you have installed Maya to a custom location. Please note the subtle differences between the
executables used by MayaBatch and MayaCmd, and also the slightly different file path
locations depending on OS. If in doubt, please contact Deadline Support for further assistance.
Error in RenderTasks: Monitored managed process MayaBatch has exited or been terminated.
This is the most common Maya error in Deadline. The MayaBatch plugin, which keeps the Maya
scene file open between tasks (frames), will sometimes not display the full stack trace (error message)
regarding the root cause of your issue. To obtain a full error message, re-submit the Maya
file via either Monitor > Submit Maya Job To Deadline > Advanced Options > uncheck the Use
MayaBatch Plugin checkbox, OR alternatively via the in-app Maya submission UI > Render Options
> uncheck the Use MayaBatch Plugin checkbox. Allow this job to run and fail in Deadline, and a more
comprehensive error message should now be available. It is this comprehensive error message that should
be sent to Deadline Support if further assistance is required.
Exception during render: Error: (Mayatomr) : could not get a license
Mental Ray is reporting that it can't find a license. Mental Ray requires an additional license for network
rendering, whereas renderers such as mayaSoftware and mayaHardware simply use your Maya license.


Certain versions of Maya come with satellite licenses for Mental Ray, but this requires some additional
setting up to enable network rendering. It's probably best to contact the Maya support team about this.
Exception during render: Renderer returned non-zero error code, 211
When Maya prints this error message, it usually means that Maya can't access a particular path, either
because it doesn't exist or because Maya doesn't have the necessary read/write permissions to access it. This error tends
to occur when Maya is loading the scene or other referenced data, or when saving the final output
images.
When you get this error, you should check the slave log that is included with the error report. If it is a path
problem, Maya shows which path it wasn't able to access. Check to make sure that the slave machine
rendering the job can see the path, and that it has the necessary permissions to read/write to it. If it's
not a path problem, the slave log should still provide some useful information that can help explain the
problem.
There is also the case where Maya exits with this error code after successfully rendering the images. If
this is the case, there are two things to try:
1. When you submit the job, enable the option to ignore error code 211.
2. When you submit the job, enable the MayaBatch option. Deadline doesn't check error codes in this
case.
Cannot open renderer description file vrayRenderer.xml
We are not sure if this is specific to a studio's installation of V-Ray for Maya on OSX (e.g. a studio's
custom environment variables might be confusing the V-Ray installer) or if this is just a bug in the V-Ray
installer specifically on OSX. The issue has been reported to support@chaosgroup.com. Currently, Maya
20xx on OSX (where xx indicates any year of Maya that V-Ray ships support for) has 3 rendererDesc
directories in the following locations:
1. /Applications/Autodesk/maya20xx/Maya.app/Contents/bin/rendererDesc/
2. /Applications/Autodesk/maya20xx/Maya.app/Contents/MacOS/rendererDesc/
3. /Applications/Autodesk/maya20xx/bin/rendererDesc
The V-Ray installer adds the vrayRenderer.xml file to locations (2) & (3). However, Maya requires this
file to reside primarily in location(s) (1) and/or (2).
There are a couple of ways to resolve this issue while we hope a fix will be provided by Chaos Group in the
future.
Ensure your slaves have the environment variable MAYA_RENDER_DESC_PATH defined and
pointing to: /Applications/Autodesk/maya20xx/Maya.app/Contents/MacOS/rendererDesc.
Alternatively, ensure the user shell of the Deadline Slave has this setting exported, such as:
export MAYA_RENDER_DESC_PATH=/Applications/Autodesk/maya20xx/Maya.app/Contents/MacOS/rendererDesc

Finally, another solution is to ensure that on each of your render nodes you copy the vrayRenderer.xml
from location (2) to location (1), as in the sketch below.
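A minimal sketch of that copy step, assuming Maya 2015 purely as an example year (adjust the year to match your install, and run it with sufficient permissions on each render node):

# Copy vrayRenderer.xml from location (2) to location (1) on this render node.
import shutil

maya_year = "2015"  # assumption for illustration; use the Maya year installed on your nodes
src = "/Applications/Autodesk/maya%s/Maya.app/Contents/MacOS/rendererDesc/vrayRenderer.xml" % maya_year
dst = "/Applications/Autodesk/maya%s/Maya.app/Contents/bin/rendererDesc/vrayRenderer.xml" % maya_year
shutil.copy2(src, dst)
print("Copied %s to %s" % (src, dst))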
Exception during render: Error: Cannot find procedure getStrokeUVFromPoly
This error can occur when rendering with Paint Effects. When you write prerender/postrender scripts, be
sure to use Maya commands and not the function wrappers that the GUI posts, since a huge number of functions
don't get loaded when rendering in batch mode.
For a quick fix, add the following before the call to the prerender script's main functions:


source "getStrokes";

Turtle: The system cannot find the path specified.


When Turtle is installed, it sets some environment variables. However, Deadline will not recognize these
variables until the Launcher (the application in the Windows tray) is restarted. Restarting the Launcher
will fix this problem.
Exception during render: Renderer returned non-zero error code, -1073741819
The error code -1073741819 is equivalent to 0xC0000005, which represents a Memory Access Violation
error. So Maya is either running out of memory, or memory is becoming corrupt. Take a look at the full
render log to see if Maya prints out any information prior to the crash that might explain the problem.
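As a quick illustration of that equivalence, the following one-line Python check converts the signed 32-bit exit code to its hexadecimal form:

print(hex(-1073741819 & 0xFFFFFFFF))  # prints 0xc0000005, the access violation status code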
mental ray: out of memory
Try tweaking your Memory and Performance settings in the mental ray tab in the Maya Render Settings
window. Try increasing the Physical memory setting (if you have the extra RAM). A common suggestion
is to set it to 80% of your available RAM. You could also try tweaking the Acceleration method settings.
Another thing you can try is trimming down your scene so that it uses less memory.
V-Ray error: [VFBCore::SetRegion] Error writing render region to raw image file.
Ensure V-Ray's VFB region render & track mouse buttons are disabled prior to submission to Deadline. It is also recommended to disable Hide V-Ray VFB in batch mode & Disable region rendering in
batch mode in the V-Ray Common settings under Render View in Maya prior to submission if you are
already using Jigsaw for region/tile rendering.

9.38 Media Encoder


9.38.1 Job Submission
You can submit Adobe Media Encoder jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Adobe Media Encoder specific options are:
Input Path: The path to the media to be encoded. It may be a media file in any of the formats supported by
AME, a Premiere Pro project (.prproj), or an FCP XML project (.xml).
Output Path: A path to a file that will contain the encoded result.
Preset File: A path to an AME preset .epr file.


Overwrite Output If Present: If enabled, Adobe Media Encoder will overwrite any existing file in the output
location that has the same name as the output file.
Submit Preset With Job: If enabled, the Preset File will be uploaded to the Deadline Repository with the Job.
Enable this if the preset file is local.

9.38.2 Plug-in Configuration


You can configure the Media Encoder plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Media Encoder plug-in from the list on the left.

Render Executables
Web Service Executable: The path to the Adobe Media Encoder Web Service executable file used for encoding.
Enter alternative paths on separate lines.
Web Service
Deadline Slaves that render Media Encoder jobs use the media encoder web service. To modify the web service port
number or address you need to modify the ame_webservice_config.ini file.
Example of the config file:
# leaving IP blank/commented out will default to whatever IP address the
# web service is able to sniff out
ip = 127.0.0.1
port = 8080
# restart_threshold: if this value is set, the AME engine will restart itself
# after x jobs are completed
#restart_threshold = 10
# job_history: if this value is set, the server will retain information on the
# last x completed jobs
job_history = 100

The ame_webservice_config.ini file is found in the same directory as the Adobe Media Encoder Web Service executable file. Note that the default port being used is 8080.

9.38.3 FAQ
Is Media Encoder supported by Deadline?
Yes.

9.38.4 Error Messages and Meanings


This is a collection of known Media Encoder error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.39 Mental Ray Standalone


9.39.1 Job Submission
You can submit Mental Ray Standalone jobs from the Monitor.


Setup your Mental Ray Files


Before you can submit a Mental Ray Standalone job, you must export your scene into .mi files. You can export into
either one .mi file with all your frames in it, or one .mi file per frame.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Mental Ray specific options (see the example sketch after this list) are:
Mental Ray File: The Mental Ray file(s) to be rendered.
If you are submitting with one frame per .mi file, select one of the numbered frames in the sequence,
and the Monitor will automatically detect the frame range. In this case, you should leave the
Separate Input MI Files Per Frame checkbox checked. The frames you choose to render should
correspond to the numbers on the .mi files.
If your .mi file contains all the frames you wish to render, you should leave the Separate Input MI
Files Per Frame box unchecked. In this case, you must specify the Input MI File Start Frame, which is
the first frame in the input MI file being rendered, as it is used to offset the frame range being passed
to the mental ray renderer. You may then specify the frame range as normal.
Output Folder: The location to which your output files will be written.
Separate Input MI Files Per Frame: Should be checked if you are submitting a sequence of MI files that
represent a single frame each.
Frame Offset: The first frame in the input MI file being rendered, which is used to offset the frame range being
passed to the mental ray renderer.
Threads: The number of threads to use for rendering.
Verbosity: Control how much information Mental Ray prints out during rendering.
Build To Force: You can force 32 or 64 bit rendering.
Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network
location.
Command Line Args: Specify additional command line arguments you would like to pass to the mental ray
renderer.
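For reference, the options above can also be expressed as a manual submission using deadlinecommand with a job info file and a plugin info file. The following is only a rough, hypothetical sketch: the job info keys are standard Deadline keys (and the Plugin value is assumed to match the MentalRay plugin name), but the plugin info key names shown here (MiFilename, OutputFolder, Threads, Verbose) are assumptions for illustration, so consult the Manual Job Submission documentation and the Mental Ray plugin for the exact names before using it.

Example job info file (job_info.job):

Plugin=MentalRay
Name=MI Sequence Render
Frames=1-100
ChunkSize=5
Priority=50
OutputDirectory0=\\server\renders\shot010

Example plugin info file (plugin_info.job), with hypothetical key names:

MiFilename=\\server\mi\shot010_0001.mi
OutputFolder=\\server\renders\shot010
Threads=0
Verbose=5

The job would then be submitted with:

deadlinecommand job_info.job plugin_info.job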

9.39.2 Plug-in Configuration


You can configure the Mental Ray plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Mental Ray plug-in from the list on the left.


Render Executables
Render Executable: The path to the Mental Ray Standalone executable file used for rendering. Enter alternative
paths on separate lines.
Render Options
Error Codes To Ignore: Mental Ray error codes that Deadline should ignore and instead assume the render has
finished successfully. Use a ; to separate the error codes.
Treat Exit Code 1 As Error: If set, Exit Code 1 will not be treated as success.

9.39.3 FAQ
Is Mental Ray Standalone supported by Deadline?
Yes.
Can I submit a sequence of MI files that each contain one frame, or must I submit a single MI file that contains
all the frames?
Deadline supports both methods.
When rendering a single MI file that contains all the frames, the frame range I tell Deadline to render doesn't
match up with the files that are actually rendered.
When submitting a single MI file that contains all the frames, make sure the Input MI File Start Frame
option is set to the first frame that is in the MI file. This value is used to offset the frame range being
passed to the mental ray renderer.
Mental Ray is printing out an error that is causing Deadline to fail the render, but when I render from the
command line outside of Deadline, the error is still printed out, but the render finishes successfully.


By default, Deadline fails a Mental Ray job whenever it prints out an error. However, you can configure
the Mental Ray plugin to ignore certain error codes, which are printed out alongside the error in the error
log.
After a frame is rendered, Deadline takes a long time releasing the task before it moves on to another. What's
going on?
This can occur when a single MI file that contains all the frames is submitted to Deadline. Try exporting
your frames to a sequence of MI files (one per frame) and submit the sequence of MI files to Deadline
instead.

9.39.4 Error Messages and Meanings


This is a collection of known Mental Ray error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.40 Messiah
9.40.1 Job Submission
You can submit jobs from within Messiah by installing the integrated submission plugin, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Messiah, select the Customize tab, and then from the drop down, select Submit To Deadline.
Click the Submit Messiah Job button to launch the submitter.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Messiah specific options are:
Messiah File: The scene file to render.
Content Folder: This is the folder that contains the Messiah scene assets. It is recommended that you have a
network accessible content folder when network rendering with Messiah.
Output Folder: The folder where the output files will be saved (including images from all enabled buffers). If
left blank, the output folders in the scene file will be used.
Threads: The number of threads to use for rendering.
Build To Force: The build of Messiah to force.
Frame Resolution: Override the width and height of the output images. If a value is set to 0, the value from the
scene file will be respected.
Antialiasing: Override the antialiasing settings in the scene file.

9.40.2 Plug-in Configuration


You can configure the Messiah plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Messiah plug-in from the list on the left.


Messiah Settings
Messiah Host Library: The path to the messiahHOST.dll library. Enter alternative paths on separate lines.

9.40.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated Messiah submission plugin. This plugin allows for
submitting Messiah render jobs to Deadline directly from within the Messiah editing GUI. Note that this has only been
tested with Messiah version 5.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Messiah/Installers
Manual Installation of the Submission Script
Messiah 32 Bit
Copy [Repository]\submission\Messiah\Client\DeadlineMessiahClient32.mp to [Messiah 32 Bit Install Directory]\Plugins.
Restart Messiah.
You'll find the Submit To Deadline option in the drop down under the Customize tab.
Messiah 64 Bit
Copy [Repository]\submission\Messiah\Client\DeadlineMessiahClient64.mp to [Messiah 64 Bit Install Directory]\Plugins.
Restart Messiah.
You'll find the Submit To Deadline option in the drop down under the Customize tab.


9.40.4 FAQ
Which versions of Messiah are supported by Deadline?
Messiah 5 is currently supported.

9.40.5 Error Messages and Meanings


This is a collection of known Messiah error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.41 MetaFuze
9.41.1 Job Submission
You can submit MetaFuze jobs from the Monitor.


Setup your MetaFuze Batch File


Export your MetaFuze transcode as a batch XML file. Multiple files may be submitted to Deadline at once. Be sure to
set the output path in MetaFuze to a network drive accessible by both yourself and your slave machines.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The MetaFuze specific options are:
Input File(s): The MetaFuze XML files to transcode. Multiple files may be selected in the file browser and
added to the list.
Submit Each File In Folder As A Separate Job: Submits each file (including non-XML files) in the folder of
the selected file as a job.


9.41.2 Plug-in Configuration


You can configure the MetaFuze plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the MetaFuze plug-in from the list on the left.

Render Executables
MetaFuze Executable: The path to the MetaFuze executable file used for rendering. Enter alternative paths on
separate lines.

9.41.3 FAQ
Is MetaFuze supported by Deadline?
Yes.

9.41.4 Error Messages and Meanings


This is a collection of known MetaFuze error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.


9.42 MetaRender
9.42.1 Job Submission
You can submit MetaRender jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The MetaRender specific options
are:
Input File: The input file. It could be a movie file or part of an image sequence.
Output File: The file or image sequence name that MetaRender will write to.
Encoding Profile: The path to the encoding profile saved with the Profile Editor.
Burn File (optional): Superimpose the specified burn-in template over the output frames.
Rendering Mode: Select CPU or GPU.
Strip Alpha Channel: Strips the alpha channel from the input sequence during conversion.
Threads: The number of render threads to use (CPU mode only).
Draft Mode: Speed up rendering for non-critical color work (GPU mode only).
Render CPU Masks: Uses high quality mask rendering instead of low quality GPU-based masks (GPU/Draft
mode only).
Write Flex File: Writes a flex file for the entire timeline.
Render Takes Into Subfolders: If the Flex File option is enabled, render takes into subfolders.
Core Command Args: Specify additional Core Command Line arguments (the basic command line options for
all IRIDAS applications).
MetaRender Args: Specify additional MetaRender-specific command line arguments.

9.42.2 Plug-in Configuration


You can configure the MetaRender plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the MetaRender plug-in from the list on the left.


Render Executables
Meta Render Executable: The path to the Meta Render executable file used for rendering. Enter alternative
paths on separate lines.

9.42.3 FAQ
Is MetaRender supported by Deadline?
Yes.

9.42.4 Error Messages and Meanings


This is a collection of known MetaRender error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.43 MicroStation
9.43.1 Job Submission
You can submit jobs from within MicroStation by installing the integrated submission script, or you can submit them
from the Deadline Monitor. The instructions for installing the integrated submission script can be found further down
this page.


To submit from within MicroStation (once the submitter has been installed), navigate to the Utilities->Render menu
and select Submit To Deadline. Alternatively, you can use the Key-In mdl load DLSubmit to bring up the submission UI (or dlsubmit open, once it's already been loaded).


Submission Options
The general Deadline options are explained in the Job Submission documentation. The MicroStation-specific options
are:
Operation: This is the type of MicroStation operation that will be performed by the Deadline Job. The different
options are described below:
Animation Render: This will render the currently active Animation Script through Deadline.
Single View Render: This will render a single view as an image through Deadline.
Save Multiple Images: This will submit the currently active Save Multiple Images script as a Deadline
Job, or use the specified SM file.
File Export: This will perform a File->Export operation as a Deadline Job (only a specific subset of these
operations are currently available).
Print: This will perform a Print operation as a Deadline Job using the current settings, or the specified
PSET file.
Mode: This option is dependent on the type of Operation selected. Will either specify the Render Mode, or type
of File Export to perform.
Color Model: This drop-down allows you to select the Color output of the Render (e.g. full RGB, GrayScale,
MonoChrome, etc.)
Design File: This option is only relevant to the Monitor Submitter, and specifies which Design File to use for
the selected operation. For the integrated Submitter, this will always be the DGN file that is currently open.
Submit Files with Job: This option, if checked, will submit files with the job, as opposed to leaving them in
their current location.
View Number: The number of the Viewport that will be used for rendering (1-8).
View Name: (Optional) The name of the Saved View that will be applied before rendering.
Output Size X: The X (horizontal) component of the output size. Set to 0 to use current value, or maintain
Aspect Ratio (depending on whether or not the Aspect is currently locked).
Output Size Y: The Y (vertical) component of the output size. Set to 0 to use current value, or maintain Aspect
Ratio (depending on whether or not the Aspect is currently locked).
Environment: The name of the Environment to use for Luxology Renders. If the specified Environment is not
found, the Untitled setup will be used.
Render Setup: The name of the Render Setup to use for Luxology Renders. If the specified Render Setup is
not found, the Untitled setup will be used.
Frame List: The list of Frames to render during Animation Renders.
Task Size: The number of Frames (Animation) or Script Entries (Save Multiple Images) to process per Deadline
Task.
Settings File: The path to an operation-specific file that will specify additional settings for the operation (e.g.
Print Settings file, DWG Export settings, etc.).
Use Current Settings: This checkbox is only available from the integrated submitter. If checked, a new settings
file will be created and submitted with the Job, based on the settings in the current MicroStation session.
Output Path: The Path to the output that will be created. Frame padding should be represented by either #s
or 0s. Unrecognized file formats for the current operation will be changed to a default known format at render
time.


Convert Network Paths to UNC: If this option is selected, Deadline will attempt to convert paths from using
Mapped Network Drives to using the full UNC network path.
Note that some of these parameters might not apply to all Operations/Modes. The Submitters will automatically
disable or hide controls that are not relevant to the currently chosen Operation/Mode.

9.43.2 Plug-in Configuration


You can configure the MicroStation plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the MicroStation plug-in from the list on the left.

MicroStation Executables
This section defines the possible locations for ustation.exe for different versions of MicroStation. The Deadline Slaves
will look for the executable in each of these locations (in order) when it tries to render a MicroStation job.

9.43.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated MicroStation submission script. This script enables
MicroStation jobs to be submitted to Deadline directly from the MicroStation GUI. The following procedure has been
tested in MicroStation v8i SS3 (08.11.09).
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/MicroStation/Installers

Manual Installation of the Submission Script


First, copy the following files to your [MicroStation install]\mdlapps directory:
[Repository]\submission\MicroStation\Client\DLSubmit.dll
[Repository]\submission\MicroStation\Client\TBUtils.dll
[Repository]\submission\MicroStation\Client\TBUtils.ma
[Repository]\submission\MicroStation\Client\Thinkbox.Common.dll
where the [MicroStation install] directory would typically be something like: C:\Program Files (x86)\Bentley\MicroStation V8i (SELECTseries)\MicroStation.

This should allow you to use the MDL KeyIn mdl load DLSUBMIT to load the Deadline Submitter within the
MicroStation GUI.
If you want to have a menu item to Submit to Deadline, you can append the file path [Repository]\submission\MicroStation\Client\DeadlineMenu.dgnlib to your MS_GUIDGNLIBLIST configuration variable
(under Workspace -> Configuration... in MicroStation), or you can manually create your own menu item in a
custom DGNLIB by following these instructions from Bentley.
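As a rough, hypothetical sketch only (the repository path is a placeholder, and the append syntax should be verified against your MicroStation workspace configuration documentation), the appended entry in a workspace .cfg file might look like this:

MS_GUIDGNLIBLIST > \\yourserver\DeadlineRepository\submission\MicroStation\Client\DeadlineMenu.dgnlib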
NOTE: the MicroStation Submitter Installer will, by default, install the DeadlineMenu.dgnlib file to this location: C:\ProgramData\Bentley\MicroStation V8i (SELECTseries)\WorkSpace\Interfaces\MicroStation\default,
which may be preferred instead of pointing all MicroStation machines to use the copy that is stored in the Deadline Repository. If you use the MicroStation Submitter Installer approach, note that it is NOT necessary to edit the
MS_GUIDGNLIBLIST configuration variable.


Sticky and Default Settings


The integrated MicroStation submitter has a feature that saves previously selected values for specific controls as
Custom Properties in the design file that was used for submission, so that they can be restored automatically for future
submissions.
The list of controls that are flagged to have sticky values is determined by the MicroStation_StickySettings.ini
file in the submission/MicroStation/Main folder of your repository. In addition to this, there is also a MicroStation_DefaultSettings.ini file alongside it, which will control the Default values for all controls, when there are no
sticky settings present/enabled. These files can be freely modified to fit your needs, but keep in mind that any changes
made will affect all users.
The priority of setting values is as follows:
1. Sticky Settings: For a given control, if there is a sticky value present in the scene file, it will be used first
if the control is flagged as sticky in MicroStation_StickySettings.ini.
2. Default Settings: For a given control, if a (valid) default value is provided in MicroStation_DefaultSettings.ini, it will then be used.
3. Factory Defaults: If neither of the above are present, hardcoded defaults are provided. These cannot be
changed.
Note that if any Sticky or Default value is invalid for any reason (e.g. a sticky Saved View was deleted), the control
will fall back to the next level of default.

9.43.4 FAQ
Is MicroStation supported by Deadline?
Yes.
Which versions of MicroStation are supported?
Currently, only MicroStation v8i SS3 (08.11.09) is officially supported. We will look to support different
versions of MicroStation as they come out in the future, or as demand dictates.
Does the MicroStation plugin support Tile Rendering?


Not currently. The plan is to investigate the possibility of including this feature in MicroStation going
forward.
How do I remove the multiple entries of the Submit to Deadline menu item in the MicroStation GUI?
Please ensure you do NOT store the DeadlineMenu.dgnlib file in this location on your local machine:
C:\ProgramData\Bentley\MicroStation V8i (SELECTseries)\WorkSpace\Interfaces\MicroStation\default. The only place the DeadlineMenu.dgnlib
file should be declared is in the MS_GUIDGNLIBLIST configuration variable, and it should
reference the file from the Deadline repository network path.
Do I need a DeadlineSubmission.dll file in my [MicroStation install]\mdlapps directory?
No. This is an old version which is now deprecated. Ensure you only have the files which are identified
above in the Integrated Submission Script Setup section as copied over to your \mdlapps directory.
Ensure you either use the DeadlineMenu.dgnlib UI configuration method described above or the manual
MDL KeyIn method to start the Deadline Submission UI: mdl load DLSUBMIT.

9.43.5 Error Messages and Meanings


This is a collection of known MicroStation error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.44 modo
9.44.1 Job Submission
You can submit jobs from within modo by using the integrated submitter (7xx and up) or by running the SubmitModoToDeadline.pl script, or you can submit them from the Monitor.


To run the integrated submitter within modo 7xx or later, after it's been installed:
Render -> Submit To Deadline
To run the integrated submitter within modo 6xx or earlier, after it's been installed:
Under the system menu, choose Run Script
Choose the DeadlineModoClient.pl script from [Repository]\submission\Modo\Client
Alternatively, you can also copy this script to your local machine and run it from there. You
should do this if the path to your Deadline repository is a UNC path and you are running modo
on Windows OS.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The modo specific options are:
Job Options
These are the general modo options:
Render With V-Ray: Enable this option to use V-Ray's renderer instead of modo's renderer. This requires the
V-Ray for modo plugin to be installed on your render nodes.
Pass Group: The pass group to render, or blank to not render a pass group.
Submit Each Pass Group As A Separate Job: If enabled, a separate job will be submitted for each Pass Group
in the scene.
Override Output
You have the option to override where the rendered images will be saved. If this is disabled, Deadline will respect
the output paths in the modo Output items in your scene file. If this is enabled, be sure to set the Output Pattern
appropriately if your scene has multiple passes, output items, or left and right eye views.
Override Render Output: Enable to override where the rendered images are saved.
Output Folder: The folder where the rendered images will be saved.
Output File Prefix: The prefix for the image file names (extension is not required).
Output Pattern: The pattern for the image file names.
Output Format: The format of the rendered images. Note that you can choose the layered PSD or EXR formats
here, and that Tile Rendering supports the layered EXR format.
Tile Rendering Options
Enable Tile Rendering to split up a single frame into multiple tiles.


Enable Tile Rendering: If enabled, the frame will be split into multiple tiles that are rendered individually and
can be assembled after.
Frame To Tile Render: The frame that will be split up.
Tiles In X: Number of horizontal tiles.
Tiles In Y: Number of vertical tiles.
Submit Dependent Assembly Job: Submit a job dependent on the tile job that will assemble the tiles.
Assemble With Draft: Draft is required when using Jigsaw Rendering. However, when Tile Rendering is the
chosen type, you can choose to assemble with Draft, or with the old Tile Assembler application.
Cleanup Tiles after Assembly: If selected the tiles will be deleted after assembly.
Error on Missing Tiles: If enabled, then if any of the tiles are missing the assembly job will fail.
Assemble Over: Determines what the Draft Tile Assembler should assemble over, be it a blank image, previous
output, or a specified file.
Error on Missing Background: If enabled, then if the background file is missing the job will fail.
Use Jigsaw: Enable to use Jigsaw for tile rendering.
Open Jigsaw Panel: Opens the Jigsaw UI.
Reset Jigsaw Background: Resets the background of the Jigsaw regions.
Save Jigsaw Regions: Saves the Jigsaw regions to the scene file.
Load Jigsaw Regions: Loads the saved Jigsaw regions and sends them to the open panel.

9.44.2 Interactive Distributed Rendering


You can submit interactive modo Distributed Rendering jobs to Deadline. The instructions for installing the integrated
submission script can be found further down this page. The interactive submitter will submit a special modo server
job to reserve render nodes.
Note that this feature is only supported in modo 7xx and later.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The modo Distributed Rendering
specific options are:
Maximum Servers: The maximum number of modo Servers to reserve for distributed rendering.
Use Server IP Address instead of Host Name: If checked, the Active Servers list will show the server IP
addresses instead of host names.
Rendering
After you've configured your submission options, press the Reserve Servers button to submit the modo Server job.
After the job has been submitted, you can press the Update Servers button to update the job's ID and Status in the
submitter. As nodes pick up the job, pressing the Update Servers button will also show them in the Active Servers list.
Once you are happy with the server list, press Start Render to start distributed rendering.
Note that the modo Server process can sometimes take a little while to initialize. This means that a server in the Active
Server list could have started the modo Server, but it's not fully initialized yet. If this is the case, it's probably best to
wait a minute or so after the last server has shown up before pressing Start Render.
After the render is finished, you can press Release Servers or close the submitter to mark the modo Server job as
complete so that the render nodes can move on to another job.

9.44.3 Network Rendering Considerations


This Article provides some useful information for setting up modo for network rendering.

9.44.4 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with modo, you must set up Mapped Paths so that Deadline can swap out
the Scene and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in
super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the left.
Note that Deadline supports path mapping for any texture paths within the modo scene file (see the Path Mapping
setting in the modo Plug-in Configuration section below). However, the modo scene file stores its paths a bit differently,
so you will likely have to add special Mapped Paths in the Repository Options for this to work.
For example, you might already have a Mapped Path like this to handle paths from Mac OS X to Windows:
Replace Path: /Volumes/share/
Windows Path: \\server\share\
Linux Path:
Mac Path:

However, the modo scene file will probably be storing texture paths as Volumes:share/ instead of /Volumes/share/.
This means youll need another Mapped Path entry that looks like this:
Replace Path: Volumes:share/
Windows Path: \\server\share\
Linux Path:
Mac Path:

If you wish to disable the Path Mapping setting in the modo Plug-in Configuration, but still wish to perform cross-platform rendering with modo, you must ensure that your modo scene file is in a network shared location, and that
any footage or assets that the project uses are in the same folder. Then, when you submit the job to Deadline, you must
make sure that the option to submit the scene file with the job is disabled. If you leave it enabled, the scene file will be
copied to and loaded from the Slave's local machine, and thus won't be able to find the footage.

9.44.5 Plug-in Configuration


You can configure the modo plug-in settings from the Monitor. While in super user mode, select Tools -> Plugins
Configuration and select the modo plug-in from the list on the left.


Render Executables
modo Executable: The path to the modo executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.
Geometry Cache
Auto Set Geometry Cache: Enable this option to have Deadline automatically set the modo geometry cache
before rendering (based on the geometry cache buffer below).
Geometry Cache Buffer (MB): When auto-setting the geometry cache, Deadline subtracts this buffer amount
from the system's total memory to calculate what the geometry cache should be set to (for example, with 32768 MB of system memory and a 4096 MB buffer, the geometry cache would be set to 28672 MB).
Path Mapping (For Mixed Farms)
Enable Path Mapping: If enabled, a temporary modo file will be created locally on the slave for rendering
because Deadline does the path mapping directly in the modo file. This feature can be turned off if there are no
Path Mapping entries defined in the Repository Options.

9.44.6 Integrated Submission Script Setup


This section describes how to install the integrated submission scripts for modo. The integrated submission scripts and
the following installation procedures have been tested with modo 7xx and later.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Modo/Installers
Manual Installation of the Submission Script
7xx or later:
Open modo, and select System -> Open User Scripts Folder.


Copy the DeadlineModo folder from \\your\repository\submission\Modo\Client to this User Scripts folder.
Restart modo, and you should find the Submit To Deadline menu item in your Render menu.
6xx or earlier:
Under the system menu, choose Run Script
Choose the DeadlineModoClient.pl script from [Repository]\submission\Modo\Client
Alternatively, you can also copy this script to your local machine and run it from there. You should do this
if the path to your Deadline repository is a UNC path and you are running modo on Windows OS.
Custom Sanity Check
A CustomSanityChecks.py file can be created in [Repository]\submission\Modo\Main, and will be executed if it exists
when the user clicks the Submit button in the integrated submitter. This script will let you override any of the properties
in the submission script prior to submitting the job. You can also use it to run your own checks and display errors or
warnings to the user. Finally, if the RunSanityCheck method returns False, the submission will be cancelled.
Here is a very simple example of what this script could look like:
import lx
import lxu
import lxu.command
import lxifc

def errordialog(title, message):
    lx.eval('dialog.setup error')
    lx.eval('dialog.title {%s}' % title)
    lx.eval('dialog.msg {%s}' % message)
    try:
        lx.eval('+dialog.open')
    except:
        pass

def RunSanityCheck():
    lx.eval( 'user.value deadlineDepartment {The Best Department!}' )
    lx.eval( 'user.value deadlinePriority 33' )
    lx.eval( 'user.value deadlineConcurrentTasks 2' )
    errordialog( 'Error', 'This is a custom sanity check!' )
    return True

You can open the LoadDeadlineModoUI.py file from [Repository]\submission\Modo\Client\DeadlineModo\pyscripts to see the available Deadline modo values.
Distributed Rendering Script Setup
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/ModoDBR/Installers
Manual Installation of the Submission Script
7xx or later only:
Open modo, and select System -> Open User Scripts Folder.


Copy the DeadlineModoDBR folder from \\your\repository\submission\ModoDBR\Client to this User Scripts folder.
Restart modo, and you should find the Submit To Deadline: Modo DBR menu item in your Render menu.

9.44.7 FAQ
Which versions of modo are supported?
Modo 3xx and later are supported.
Which versions of modo can I use for interactive distributed rendering?
Modo 7xx and later are supported.
When rendering with modo on Windows, it hangs after printing out @start modo_cl [48460] Luxology LLC.
We're not sure of the cause, but a known fix is to copy the perl58.dll from the extra folder into the
main modo install directory (e.g. C:\Program Files\Luxology\modo601).
When rendering with modo on Mac OSX, the Slave icon in the Dock changes to the modo icon, and the render
gets stuck.
This is a known problem that can occur when the Slave application is launched by double-clicking it in
Finder. There are a few known workarounds:
1. Start the Launcher application, and launch the Slave from the Launcher's Launch menu.
2. Launch the slave from the terminal by simply running DEADLINE_BIN/deadlineslave or DEADLINE_BIN/deadlinelauncher -slave, where DEADLINE_BIN is the Deadline bin folder.
3. Use modo as the render executable instead of modo_cl.
When tile rendering, each tile is rendered, but there is image data in the unrendered region of each tile.
This happens when there is a cached image in the modo frame buffer. Open up modo on the offending
render node(s) and delete all cached images to fix the problem.

9.44.8 Plugin Error Messages


This is a collection of known modo error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.45 Naiad
9.45.1 Job Submission
You can submit jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Naiad specific options are:
Naiad File: The Naiad file to simulate.
Naiad Simulation Job
Submit Simulation Job: Enable to submit a Simulation job to Deadline.
Run Simulation On A Single Machine: If enabled, the simulation job will be submitted as a single task
consisting of all frames so that a single machine runs the entire simulation.
Threads: The number of render threads to use. Specify 0 to let Naiad determine the number of threads to use.
Enable Verbose Logging: Enables verbose logging during the simulation.
EMP to PRT Conversion Job
Submit an EMP to PRT Conversion Job: Enable to submit a PRT Conversion job to Deadline.
If you are also submitting a simulation job, this job will use the EMP files created by the simulation job.
If you are not submitting a simulation job, the EMP files must already exist.
EMP Body Name: The EMP body name.
EMP Body File Name: The path to the EMP files to be converted.

9.45.2 Plug-in Configuration


You can configure the Naiad plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Naiad plug-in from the list on the left.


Naiad Executables
Simulation Executable: The path to the command line client executable file used for simulation. Enter alternative paths on separate lines.
Emp to Prt Executable: The path to the emp2prt executable file used for emp conversion. Enter alternative
paths on separate lines.

9.45.3 FAQ
Is Naiad supported by Deadline?
Yes.

9.45.4 Error Messages and Meanings


This is a collection of known Naiad error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.46 Natron
9.46.1 Job Submission
You can submit Natron jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Natron specific options are:
Writer Node To Render: A custom writer node to render can be specified here. This is optional and can be left
blank.
Frame List: Override the frame list of writer node frames to render. This is optional and can be left blank.
Frames Per Task: This is the number of frames that will be rendered at a time for each job task. Default is 1.

9.46.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with Natron, you must set up Mapped Paths so that Deadline can swap
out Read Node and Write Node file paths where appropriate. You can access the Mapped Paths Setup in the Monitor
while in super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list
on the left.

9.46.3 Plug-in Configuration


You can configure the Natron plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Natron plug-in from the list on the left.

Render Executables
Natron Executable: The path to the Natron executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.
Path Mapping (For Mixed Farms)
Enable Path Mapping: If enabled, a temporary Natron file will be created locally on the slave for rendering
because Deadline does the path mapping directly in the Natron file. This feature can be turned off if there are
no Path Mapping entries defined in the Repository Options.

9.46.4 FAQ
Which versions of Natron are supported?
Natron 0.9 and later are supported.
Why doesn't Deadline Slave/Monitor report Natron's task progress?
Currently (v1.0), Natron has limited task reporting. However, when a particular writer node and frame
list are specified, frame progress is reported.
How do I specify a frame range to be rendered?
Unfortunately, Natron does not currently support specifying a frame range to be rendered; by default it
renders using the settings within the Natron project file for each writer node. However, if you optionally
specify a writer node to be rendered under the advanced options in the Monitor submission UI, it is
possible to specify a particular frame range and number of frames per task for that writer node.

9.46.5 Error Messages and Meanings


This is a collection of known Natron error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.47 Nuke
9.47.1 Job Submission
You can submit jobs from within Nuke by installing the integrated submission script, or you can submit them from the
Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Nuke, select Submit To Deadline from the Thinkbox menu.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Nuke specific options are:
Render With NukeX: Enable this option if you want to render with NukeX instead of Nuke.
Use Batch Mode: If enabled, Deadline will keep the Nuke file loaded in memory between tasks.
Render Threads: The number of threads to use for rendering.
Use The GPU For Rendering: If Nuke should also use the GPU for rendering (Nuke 7 and later only).
Maximum RAM Usage: The maximum RAM usage (in MB) to be used for rendering.

Enforce Write Node Render Order: Forces Nuke to obey the render order of Write nodes.
Minimum Stack Size: The minimum stack size (in MB) to be used for rendering. Set to 0 to not enforce a
minimum stack size.
Continue On Error: If enabled, Nuke will attempt to keep rendering if an error occurs.
Use Performance Profiler: If enabled, Nuke will profile the performance of the Nuke script while rendering
and create a xml file per task for later analysis (Nuke 9 and later only).
XML Directory: If Use Performance Profiler is enabled, this is the directory on the network where the performance profile xml files will be saved.
Render in Proxy Mode: If enabled, Nuke will render using the proxy file paths.
Choose Views To Render: Enable this option to choose which view(s) to render. By default, all views are
rendered.
Submit Write Nodes As Separate Jobs: Each write node is submitted as a separate job.
Use Nodes Frame List: If submitting each write node as a separate job or task, enable this to pull the frame
range from the write node, instead of using the global frame range.
Set Dependencies Based on Write Node Render Order: When submitting write nodes as separate jobs, this
option will make the separate jobs dependent on each other based on write node render order.
Submit Write Nodes As Separate Tasks For The Same Job: Enable to submit a job where each task for the
job represents a different write node, and all frames for that write node are rendered by its corresponding task.
Selected Nodes Only: If enabled, only the selected Write nodes will be rendered.
Nodes With Read File Enabled Only: If enabled, only the Write nodes that have the Read File option
enabled will be rendered.
Render Precomp Nodes First: If enabled, all write nodes in precomp nodes will be rendered before the main
job.
Only Render Precomp Nodes: If enabled, only the Write nodes that are in precomp nodes will be rendered.
The Submit Write Nodes As Separate Tasks For The Same Job option can be useful if you have several write nodes in a Nuke
script that output different Quicktime movies. You can enable this option and bump up the Concurrent Tasks value to
allow machines to process multiple write nodes concurrently. Since Quicktime generation only uses a single thread,
you can get much better throughput with this option on multi-core machines.

9.47.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with Nuke, you must setup Mapped Paths so that Deadline can swap out
Read Node and Write Node file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while
in super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the
left.

9.47.3 Plug-in Configuration


You can configure the Nuke plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Nuke plug-in from the list on the left.

Render Executables
Nuke Executable: The path to the Nuke executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.
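For example, a hypothetical multi-platform configuration could list one path per line, such as the two entries below; these are illustrative install locations only, so substitute the paths that are actually used on your render nodes:
C:\Program Files\Nuke9.0v3\Nuke9.0.exe
/usr/local/Nuke9.0v3/Nuke9.0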
Licensing Options
Slaves To Use Interactive License: A list of slaves that should use an interactive Nuke license instead of a
render license. Use a , to separate multiple slave names, for example: slave001,slave002,slave003
OFX Cache
Prepare OFX Cache Before Rendering: If enabled, Deadline will try to create the temporary ofxplugincache
folder before rendering, which helps ensure that comps that use OFX plugins render properly.
Path Mapping (For Mixed Farms)
Enable Path Mapping: If enabled, a temporary Nuke file will be created locally on the slave for rendering
because Deadline does the path mapping directly in the Nuke file. This feature can be turned off if there are no
Path Mapping entries defined in the Repository Options.

9.47.4 Nuke Studio Sequence Submission


If you are using Nuke Studio you can submit individual comps to Deadline from the Integrated Submitter as you would
in Nuke or NukeX; however, there is also the option of submitting sequences of comps to Deadline as individual jobs.
You can submit all of the sequences' comps for a project by selecting the project and submitting.

You can also choose which sequences you want to submit comps for.

Note that this option is only available in the Integrated Submitter in Nuke Studio, and it requires a saved project
with sequences that contain comps.

9.47.5 Integrated Submission Script Setup


The following procedures describe how to install the integrated Nuke submission script. This script allows for submitting Nuke render jobs to Deadline directly from within the Nuke editing GUI. Note that this has only been tested with
Nuke version 6 and later.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Nuke/Installers
Manual Installation of the Submission Script
Copy [Repository]\submission\Nuke\Client\DeadlineNukeClient.py to your .nuke user folder (~/.nuke or
%USERPROFILE%\.nuke)
If menu.py does not exist in your .nuke user folder, copy [Repository]\submission\Nuke\Client\menu.py to your
.nuke user folder
If menu.py does exist, then open it in a text editor and add the following lines of code:
import DeadlineNukeClient
menubar = nuke.menu("Nuke")
tbmenu = menubar.addMenu("&Thinkbox")
tbmenu.addCommand("Submit Nuke To Deadline", DeadlineNukeClient.main, "")

The next time you launch Nuke, there should be a Thinkbox menu with the option to Submit Nuke to Deadline.
Custom Sanity Check
A CustomSanityChecks.py file can be created alongside the main SubmitNukeToDeadline.py submission script (in
[Repository]\submission\Nuke\Main), and will be evaluated if it exists. This script will let you set any of the initial
properties in the submission script prior to displaying the submission window. You can also use it to run your own
checks and display errors or warnings to the user. Here is a very simple example of what this script could look like:
import nuke
import DeadlineGlobals

def RunSanityCheck():
    DeadlineGlobals.initDepartment = "The Best Department!"
    DeadlineGlobals.initPriority = 33
    DeadlineGlobals.initConcurrentTasks = 2
    nuke.message( "This is a custom sanity check!" )
    return True

The DeadlineGlobals module can be found in the same folder as the SubmitNukeToDeadline.py script mentioned
above. It just contains the list of global variables that you can set, which are then used by the submission script to
set the initial values in the submission dialog. Simply open DeadlineGlobals.py in a text editor to view the global
variables.
Finally, if the RunSanityCheck method returns False, the submission will be cancelled.
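For instance, a minimal sketch of a CustomSanityChecks.py that cancels the submission when the Nuke script has not been saved could look like the following (the unsaved-script check and the priority value are illustrative assumptions, not part of the shipped scripts):
import nuke
import DeadlineGlobals

def RunSanityCheck():
    # Illustrative check: an unsaved Nuke script reports its name as "Root",
    # so warn the artist and cancel the submission by returning False.
    if nuke.root().name() == "Root":
        nuke.message( "Please save your Nuke script before submitting to Deadline." )
        return False

    # Initial submission values can still be set as in the example above.
    DeadlineGlobals.initPriority = 50
    return True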

9.47.6 FAQ
Which versions of Nuke are supported?

Nuke 6 and later are supported.


Can I render with NukeX instead of Nuke?
Yes. Simply enable the Render With NukeX option when submitting your Nuke job.
What's the benefit of using Batch Mode?
If enabled, Deadline will keep the Nuke file loaded in memory between tasks. This can help reduce
overhead, because the Nuke file is only loaded once when the job is started on the Slave.
Is Nuke Studio Supported?
Yes. It functions the same way as Nuke with the additions of Sequence Submission and Frame Server
rendering.
Can I use 3rd party plugins such as Peregrine Labs' Bokeh plugin with Nuke via Deadline?
Yes. Ensure all machines have the same plugin software installed locally or via a network where applicable (depending on your studio pipeline/software deployment management and whether the plugin in question
supports deployment in this manner). Ensure any necessary environment variables are also present on each
Slave. In the case of Peregrine Labs' Bokeh plugin, ensure the Slaves have the environment variable
PEREGRINE_LICENSE=port@hostname available, where port and hostname are your license server details.
Environment variables can be set for a job in Deadline, and these variables will be applied to the rendering
process environment. They can be set in the Job Properties in the Monitor, and they can be set during
Manual Job Submission.
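As a rough sketch of the manual submission route, the job info file passed to deadlinecommand can carry such a variable through an EnvironmentKeyValue entry. Everything below (file names, scene path, version, and the license server value) is an illustrative placeholder rather than a value taken from this manual:
# Hedged sketch: submit a Nuke job with an extra environment variable via deadlinecommand.
import subprocess

# Job info file: general Deadline settings, including the environment variable
# that will be applied to the rendering process environment on the Slave.
with open( "bokeh_job_info.job", "w" ) as jobInfo:
    jobInfo.write( "Plugin=Nuke\n" )
    jobInfo.write( "Name=Comp with Bokeh\n" )
    jobInfo.write( "Frames=1-100\n" )
    jobInfo.write( "EnvironmentKeyValue0=PEREGRINE_LICENSE=port@hostname\n" )

# Plugin info file: Nuke-specific settings (the keys shown here are placeholders).
with open( "bokeh_plugin_info.job", "w" ) as pluginInfo:
    pluginInfo.write( "SceneFile=//server/comps/shot010.nk\n" )
    pluginInfo.write( "Version=9.0\n" )

# deadlinecommand submits the job using the two files created above.
subprocess.call( [ "deadlinecommand", "bokeh_job_info.job", "bokeh_plugin_info.job" ] )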

9.47.7 Error Messages and Meanings


This is a collection of known Nuke error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.48 Nuke Frame Server


9.48.1 Reserving Slaves For Frame Server Render
You can reserve Deadline Slaves for a rendering over the Nuke Studio frame server from Nuke Studio by installing the
integrated submission script, or you can reserve them from the Monitor. The instructions for installing the integrated
submission script can be found further down this page.
To submit a Reserve Frame Server Slaves job from within Nuke Studio, select Submit to Deadline from the Thinkbox
menu.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Nuke specific options are:
Host: The IP address or Host Name of the Master Machine that the frame server slaves will connect to. This
is the machine that will start the actual Nuke render.
Port Number: The port number to use to connect to the Master Machine. Default is 5560. The Slave must NOT
run on the same machine as the Master Machine.

Worker Count: The number of workers that should be spawned on each Deadline Slave that is reserved. Each
worker runs an instance of Nuke and renders independently of other workers.
Worker Threads: The number of threads each worker should use.
Worker Memory: The amount of memory to reserve for each worker.
Reserving From Inside Nuke Studio
It is required that you have Nuke Studio 9.0v3 or newer installed in order to properly use the Frame Server with
Deadline. After you've configured your submission options, press the Reserve Machines button to submit the Nuke
Frame Server job. The job's ID and Status will be tracked in the submitter, and as nodes pick up the job, they will
show up in the Reserved Machines list. Once you are happy with the server list you can start rendering or exporting
over the frame server.
Note that the Nuke Frame Server process can sometimes take a little while to initialize. This means that a machine in
the Reserved Machines list could have started the Nuke Frame Server process, but it's not fully initialized yet. If this
is the case, it's probably best to wait a minute or so after the last server has shown up before starting the render.
After the render is finished, you can press Release Machines or close the submitter UI (Setup Frame Server Slaves
With Deadline) to mark the Frame Server job as complete so that the render nodes can move on to another job.
Note: Only one Slave per machine may pick up a Nuke Frame Server job, as allowing multiple Slaves on the same
machine to try to bind to the same port would not work. Deadline will also fail a render if a slave running on the
Master Machine tries to pick up the job, as it is already running an instance of the Frame Server and the same port
binding conflict can occur.
Reserving From The Monitor
After you've configured your submission options, press the Submit button to submit the Nuke Frame Server job. Note
that this doesn't start any rendering; it just allows the Nuke Frame Server to start up on nodes in the farm. Once you're
happy with the nodes that have picked up the job, you can initiate the distributed render manually from within Nuke
Studio.
After the distributed render has finished, remember to mark the job as complete or delete it so that the nodes can move
on to other jobs.

9.48.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with Nuke Studio, you must setup Mapped Paths so that Deadline can
swap out Read Node and Write Node file paths where appropriate. You can access the Mapped Paths Setup in the
Monitor while in super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in
the list on the left.

9.48.3 Plug-in Configuration


You can configure the Nuke Frame Server plug-in settings from the Monitor. While in super user mode, select Tools
-> Configure Plugins and select the NukeFrameServer plug-in from the list on the left.

Render Executables
Nuke Executable: The path to the Nuke executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.
Licensing Options
Slaves To Use Interactive License: A list of slaves that should use an interactive Nuke license instead of a
render license. Use a , to separate multiple slave names, for example: slave001,slave002,slave003
OFX Cache
Prepare OFX Cache Before Rendering: If enabled, Deadline will try to create the temporary ofxplugincache
folder before rendering, which helps ensure that comps that use OFX plugins render properly.

9.48.4 Integrated Submission Script Setup


The following procedures describe how to install the integrated Nuke Frame Server submission script. This script
allows for submitting Nuke Reserve Frame Server Slaves jobs to Deadline directly from within the Nuke Studio
editing GUI.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Nuke/Installers
Manual Installation of the Submission Script
Copy [Repository]\submission\Nuke\Client\DeadlineNukeFrameServerClient.py to your .nuke user folder
(~/.nuke or %USERPROFILE%\.nuke)
If menu.py does not exist in your .nuke user folder, copy [Repository]\submission\Nuke\Client\menu.py to your
.nuke user folder
If menu.py does exist, then open it in a text editor and add the following lines of code:

#For Nuke Frame Server and Nuke Submission
import DeadlineNukeClient
menubar = nuke.menu("Nuke")
tbmenu = menubar.addMenu("&Thinkbox")
tbmenu.addCommand("Submit Nuke To Deadline", DeadlineNukeClient.main, "")

#This is done to only add the Frame Server in Nuke Studio.
#Try-except is for older versions of Nuke.
try:
    if nuke.env[ 'studio' ]:
        import DeadlineNukeFrameServerClient
        tbmenu.addCommand("Reserve Frame Server Slaves",
            DeadlineNukeFrameServerClient.main, "")
except:
    pass

The next time you launch Nuke Studio, there should be a Thinkbox menu with the option to Reserve Frame Server
Slaves.
Custom Sanity Check
A CustomSanityChecks.py file can be created alongside the main ReserveFrameServerSlaves.py submission script (in
[Repository]\submission\Nuke\Main), and will be evaluated if it exists. This script will let you set any of the initial
properties in the submission script prior to displaying the submission window. You can also use it to run your own
checks and display errors or warnings to the user. Here is a very simple example of what this script could look like:
import nuke
import DeadlineFRGlobals

def RunSanityCheck():
    DeadlineFRGlobals.initDepartment = "The Best Department!"
    DeadlineFRGlobals.initPriority = 33
    DeadlineFRGlobals.initPort = 5570
    nuke.message( "This is a custom sanity check!" )
    return True

The DeadlineFRGlobals module can be found in the same folder as the ReserveFrameServerSlaves.py script mentioned
above. It just contains the list of global variables that you can set, which are then used by the submission script to
set the initial values in the submission dialog. Simply open DeadlineFRGlobals.py in a text editor to view the global
variables.
Finally, if the RunSanityCheck method returns False, the submission will be cancelled.

9.48.5 FAQ
Which versions of Nuke are supported?
Nuke Studio 9.0v3 and later are supported.
What Nuke license does Frame Server use?
Nuke Studio's Frame Server uses a standard Nuke render node (-r) license by default. Note that for every
license of Nuke Studio you own, a number of Nuke render licenses are included from The Foundry.

These licenses are intended to be used for local Nuke Studio background rendering using a Frame Server
running locally. Deadline's Frame Server jobs are for when additional processing power is required by
your locally running instance of Nuke Studio and its Master Frame Server functionality. Note that in Deadline's
Nuke Frame Server plugin configuration section, you can also provide a list of slaves that should use an
interactive Nuke license instead of a render license, although this is a somewhat expensive thing to do with
your Nuke GUI licenses!
Can I run Frame Server via Deadline Slave on the Master Machine?
No. You won't be able to run Frame Server via Deadline Slave on the same machine that is also acting
as the Master Machine (the machine currently running your session of Nuke Studio). Deadline will fail a
render if a slave running on the Master Machine tries to pick up the job, as it is already running an instance
of the Frame Server and a port binding conflict will occur. You will need to use a different machine even
for simple testing purposes.
If running multiple Deadline Slaves, can I run a normal Nuke network rendering job simultaneously with Nuke
Frame Server jobs?
Yes. You will want to consider using Deadline limits here to ensure you don't blow your Nuke license
budget. See our Limits documentation for how to implement limits for each of your software license
needs. Ensure you use Machine as the Usage Level in your Limits configuration, so that only one
Nuke license is used per physical/virtual machine. Don't forget to consider licensing implications
for any 3rd party Nuke plugins, such as Optical Flares, that you may be using.

9.48.6 Error Messages and Meanings


This is a collection of known Nuke Frame Server error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.49 Octane Standalone


9.49.1 Job Submission
You can submit Octane Standalone jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Octane specific options are:

Octane Scene File: Specify the Octane scene file(s) to render. If you have an animation with one OCS file per
frame, you just need to select one of the OCS files from the sequence.
Output File: The output file path. This is optional and can be left blank.
Render Target: Select the target to render. This list is automatically populated based on the selected OCS file.
Single Frame Job: Check this option if you are submitting a single frame to render, as opposed to an animation
consisting of a sequence of OCS files.
Override Sampling: Overrides the maximum samples in the OCS file.
Command Line Args: Additional command line arguments to pass to the renderer.

9.49.2 Plug-in Configuration


You can configure the Octane plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Octane plug-in from the list on the left.

Render Executables
Octane Executable: The path to the Octane executable file used for rendering. Enter alternative paths on
separate lines.

9.49.3 FAQ
Is Octane Standalone supported by Deadline?
Yes!

9.49.4 Error Messages and Meanings


This is a collection of known Octane error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.50 PRMan (Renderman Pro Server)


9.50.1 Job Submission
You can submit PRMan jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The PRMan specific options are:

RIB Files: The RIB files to be rendered (can be ASCII or binary formatted). These files should be network
accessible.
Working Directory: The working directory used during rendering. This is required if your RIB files contain
relative paths.
Threads: The number of threads to use for rendering. Set to 0 to let PRMan automatically determine the optimal
thread count.
Additional Arguments: Specify additional command line arguments you would like to pass to the PRMan
renderer.

9.50.2 Plug-in Configuration


You can configure the PRMan plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the PRMan plug-in from the list on the left.

Render Executables
PRMan Executable: The path to the PRMan executable file used for rendering. Enter alternative paths on
separate lines.

9.50.3 FAQ
Is PRMan supported by Deadline?
Yes.
Is PRMan's folder structure, where each frame has its own folder, supported by Deadline?

Yes. Deadline can render RIB files that are in separate folders per frame, and can also render RIB files that
are all stored in the same folder.

9.50.4 Error Messages and Meanings


This is a collection of known PRMan error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.51 Puppet
9.51.1 Job Submission
You can submit Puppet update jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Puppet specific options are:
Verbose Output: Prints very detailed output when the job is run.

9.51.2 Plug-in Configuration


You can configure the Puppet plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Puppet plug-in from the list on the left.

Options
Puppet Batch: The path to the Puppet executable file. Enter alternative paths on separate lines.

9.52 Python
9.52.1 Job Submission
You can submit Python jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Python specific options are:
Script File: The script you want to submit.
Arguments: The arguments to pass to the script. Leave blank if the script takes no arguments.
Version: The version of Python to use.
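For instance, a minimal script suited to this kind of job might do nothing more than report the arguments it was given; the script below is an illustrative sketch and is not shipped with Deadline:
# Illustrative example of a script submitted through the Python plug-in.
import sys

def main():
    # Anything entered in the Arguments field arrives on the command line.
    print( "Running with arguments: %s" % " ".join( sys.argv[1:] ) )

if __name__ == "__main__":
    main()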

9.52.2 Plug-in Configuration


You can configure the Python plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Python plug-in from the list on the left. To get a description of each setting, simply hover the
mouse cursor over a setting and a tool tip will be displayed.

Python Executables
Python Executable: The path to the Python executable file used. Enter alternative paths on separate lines.
Different executable paths can be configured for each version installed on your render nodes.

9.52.3 FAQ
Which versions of Python are supported?
Python 2.3 to 3.2 are all supported. Additional versions can be added when necessary.

9.53 Quicktime Generation


9.53.1 Job Submission - Apple Quicktime
Jobs can be submitted from the Monitor. You can use the Submit menu, or you can right-click on a job and select
Scripts -> Quicktime Submission to automatically populate some fields in the Quicktime submitter based on the job's
output.
Note that in order to use this Quicktime plugin, you MUST have Quicktime version 7.04 or later installed on your
slaves, as well as any workstations that Quicktime jobs will be submitted from.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Quicktime specific options are:
Input Images: The frames you would like to generate the Quicktime from. If a sequence of frames exists in
the same folder, Deadline will automatically collect the range of the frames and will set the Frame Range field
accordingly.

Output Movie File: The name of the Quicktime to be generated.


Frames: The frame range used to generate the Quicktime.
Codec: The codec format to use for the Quicktime.
Frame Rate: The frame rate of the Quicktime.
Audio File: Specify an audio file to be added to the Quicktime movie. Leave blank to disable this feature.
Settings File: The Quicktime settings file to use. Press the Create Settings button to create a new Quicktime
settings file.

9.53.2 Plug-in Configuration


The Apple Quicktime plug-in does not require any configuration.

9.53.3 FAQ
Which version of Apple Quicktime is required to create Quicktime movies with Deadline using the Apple Quicktime renderer?
Apple Quicktime version 7.04 or later is required. It must be installed on all slaves that will be rendering
Quicktime movies, as well as any machines from which Quicktime jobs will be submitted. You can
download the latest version of Quicktime from the Apple website.
Can I submit an Apple Quicktime job from Windows to run on Mac OSX, or vice versa?
No, because the export settings are saved out differently on each operating system. The Windows Quicktime generator doesn't recognize settings that are exported on a Mac, and vice versa. We hope to find a
solution for this in the future, but for now you should ensure that your Quicktime job renders on the same
operating system from which it was submitted (using groups, pools, machine lists, etc).
Can multiple machines work together to render a single movie file?
No, this is not possible. This is why Quicktime Generation jobs should always consist of a single task that
contains all the frames to be included in the movie file.
When submitting an Apple Quicktime, an error message pops up when I click the Submit button.

This error pops up when you have an older version of Apple Quicktime installed. Installing the latest
version should fix the problem.

9.53.4 Error Messages and Meanings


This is a collection of known Quicktime Generation error messages and their meanings, as well as possible solutions.
We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please
email Deadline Support and let us know.
Exception during render: Error: Class not registered - Make sure that QuickTime version 7.04 or later is
installed on this machine
This error occurs when you have an older version of Apple Quicktime installed. Installing the latest
version should fix the problem.
Exception during render: Renderer returned non-zero error code, 128
The Apple Quicktime renderer is crashing for some reason. Check to make sure you have the latest
version of Apple Quicktime installed.

9.54 Realflow
9.54.1 Job Submission
You can submit jobs from within RealFlow by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within RealFlow 5 or later, select Commands -> System Commands -> SubmitToDeadline.py.
To submit from within RealFlow 4, select Scripts -> User Scripts -> Deadline -> Submit To Deadline.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Realflow specific options are:
Submit IDOC Jobs: Enable to submit separate IDOC jobs for each IDOC name specified. Separate multiple
IDOC names with commas. For example: IDOC01,IDOC02,IDOC03
Start Rendering At [Start Frame - 1]: Enable this option if RealFlow rendering should start at the frame
preceding the Start Frame. For example, if you are rendering frames 1-100, but you need to pass 0-100 to
RealFlow, then you should enable this option.
Use One Machine Only: Forces the entire job to be rendered on one machine. If this is enabled, the Machine
Limit, Task Chunk Size and Concurrent Tasks settings will be ignored.
Version: The version of RealFlow to render with.
Build: Force 32 bit or 64 bit rendering.
Rendering Threads: The number of threads to use during simulation.
Reset Scene: If this option is enabled, the scene will be reset before the simulation starts.
Generate Mesh: This option will generate the mesh for a scene where particle cache files were created previously.
Use Particle Cache: If you have created particle cache files for a specific frame and you want to resume your
simulation from that frame, you must use this option. The starting cached frame is the Start Frame entered
above.
Render Preview: Enable this option to create a Maxwell Render preview.

9.54.2 Plug-in Configuration


You can configure the RealFlow plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the RealFlow plug-in from the list on the left.

Render Executables
Realflow Executable: The path to the Realflow executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.54.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated RealFlow submission script. This script allows for
submitting RealFlow render jobs to Deadline directly from within the RealFlow editing GUI.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/RealFlow/Installers
Manual Installation of the Submission Script
RealFlow 5 and Later:
Copy [Repository]\submission\RealFlow\Client\DeadlineRealFlowClient.py to [RealFlow Install Directory]\scripts.
Launch RealFlow.
Now you can select Commands -> System Commands -> DeadlineRealFlowClient.py.

RealFlow 4:
Copy [Repository]\submission\RealFlow\Client\DeadlineRealFlowClient.py to [RealFlow Install Directory]\scripts.
Launch RealFlow and select Scripts -> Add.

In the Add Script dialog, for the Name enter Submit To Deadline, and for the Script enter the path to the
DeadlineRealFlowClient.py file that you just copied over. Then click the New Folder button and name the
folder Deadline. Then select the Deadline folder and click OK.

Now you can select Scripts -> User Scripts -> Deadline -> Submit To Deadline to launch the submission
dialog.

9.54.4 FAQ
What versions of RealFlow are supported by Deadline?
RealFlow versions 3 and later are supported. The integrated submission script is only supported in RealFlow 4 and later. RealFlow 3 jobs can still be submitted from the Monitor.
Does rendering with RealFlow require a separate license?
Yes. You need separate command line licenses to render.
Can I render separate IDOCs from the same scene across different machines?
Yes. You can specify which IDOCs you want to render in the submitter, and a separate job will be
submitted for each one.
Why is RealFlow looking for the particle cache on the local C: instead of on the network?
This is likely happening because you are choosing to submit the RealFlow file with the job. This means
the file is copied locally to the slave machines, which is why they are looking for the cache locally. If
you disable the option to submit the file with the job, the slave machines should be able to find the cache
properly.

9.54.5 Error Messages and Meanings


This is a collection of known RealFlow error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Exception during render: [RealFlow Error]: License file not found.
RealFlow requires a separate license for network rendering, and that licensing system needs to be set up
before you can render through Deadline.

9.55 REDLine
9.55.1 Job Submission
You can submit REDLine jobs from the Monitor. REDLine is the command line tool that ships with Redcine-X, and
previously with REDAlert.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The REDLine specific options are:
Input R3D File: Specify the R3D file you want to render.
Output Folder: The folder where the output files will be saved.
Output Filename: The prefix portion of the output filename. It is not necessary to specify the extension.
Output Format: The output format. This will determine the filename extension.
Render Resolution: The resolution to render at.
Make Output Subfolder: Makes a subdirectory for each output.
Frame List: The list of frames to render if rendering an animation.
Renumber Start Frame: The new start frame number (optional).
Frames Per Task: The number of frames per task.
Submit Input R3D File With Job: If checked, the input file is submitted with the job to the repository.
Deadline supports essentially all of the options that are available in the Redcine-X application. It also supports the ability
to specify RSX files to use when rendering, so you can set your options in Redcine-X and then use them to render
the job through Deadline. Please refer to your Redcine-X documentation for more information about these additional
render options.

9.55.2 Plug-in Configuration


You can configure the REDLine plug-in settings from the Deadline Monitor. While in super user mode, select Tools
-> Configure Plugins and select the REDLine plug-in from the list on the left.

Render Executables
REDLine Executable: The path to the REDline executable file used for rendering. Enter alternative paths on
separate lines.

9.55.3 FAQ
Is Redcine-X/REDAlert supported by Deadline?
Yes. Both applications ship with a command line application called REDLine, which Deadline uses to
render.
Which Operating System(s) can I render REDLine jobs with?
Currently, REDLine is available on Windows and OSX, so you can render REDLine jobs on these operating systems.

9.55.4 Error Messages and Meanings


This is a collection of known REDLine error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.56 Renderman (RIB)


9.56.1 Job Submission
You can submit Renderman jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. Note that a Draft job can only be submitted if Deadline is
able to parse absolute Display paths from the selected RIB file. If it cannot extract the output paths, it will let you know
during submission so that you can disable the Draft job option.

The RIB specific options are:


RIB Files: The RIB files to be rendered (can be ASCII or binary formatted). These files should be network
accessible.
Renderer: The renderer that will be used to render the RIB files.
Additional Arguments: Specify additional command line arguments you would like to pass to the RIB renderer.
See the documentation for your particular RIB renderer for additional arguments.

9.56.2 Plug-in Configuration


You can configure the RIB plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the RIB plug-in from the list on the left.

Render Executables
Executable: The path to the RIB executable file used for rendering. Enter alternative paths on separate lines.
Different executable paths can be configured for each RIB renderer installed on your render nodes.

9.56.3 FAQ
Which RIB renderers are supported by Deadline?
The following renderers are supported:
3Delight
Air
Aqsis
BMRT
Entropy
Pixie
PRMan
RenderDotC
RenderPipe
If you use a RIB renderer that is not on this list, please contact Deadline Support and let us know.

9.56.4 Error Messages and Meanings


This is a collection of known RIB error messages and their meanings, as well as possible solutions. We want to keep
this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.57 Rendition
9.57.1 Job Submission
You can submit Rendition jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Rendition specific options are:
Input MI File: The MI file to render. This needs to be on a network location, since Rendition often saves the
output file(s) relative to the input MI file location.


Output File: Optionally override the output file.
Build To Force: Force 32 bit or 64 bit rendering.
Skip Existing Frames: If enabled, existing images will be skipped.
Additional Cmd Line Args: Additional command line arguments to pass to Rendition during rendering.
Tile Rendering Options: Set up a Rendition tile rendering job. Note that this requires you to override the output
file. Also make sure that the final image resolution settings are correct, because these are used to determine the
size of the tiles to render. The output formats that are supported are BMP, DDS, EXR, JPG, JPE, JPEG, PNG,
RGB, RGBA, SGI, TGA, TIF, and TIFF.

9.57.2 Plug-in Configuration


You can configure the Rendition plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Rendition plug-in from the list on the left.

Render Executables
Rendition Executable: The path to the Rendition executable file used for rendering. Enter alternative paths on
separate lines.

9.57.3 FAQ
Is Rendition supported by Deadline?
Yes.
Why do the image format options (like color depth) get reverted to defaults when rendering with Deadline?

This only happens when overriding the output file in the submission script. When we pass the output path
to Rendition, it uses the default image format options for the output type. If you don't want this to occur,
simply don't override the output file.

9.57.4 Error Messages and Meanings


This is a collection of known Rendition error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.58 Rhino
9.58.1 Job Submission
You can submit jobs from within Rhino by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.

To submit from within Rhino, left-click on the Deadline button you created during the integrated submission script
installation.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Rhino specific options are:
Rhino File: The Rhino file to be rendered.
Output File: The filename of the image(s) to be rendered.
Renderer: Specify the renderer to use.
Render Bongo Animation: If your Rhino file uses the Bongo animation plugin, you can enable a Bongo
animation job.
Tile Rendering
The following options are available for tile rendering. Note that tile rendering is only available when submitting from
within Rhino.
Enable Tile Rendering: If enabled, the image will be rendered in regions and automatically assembled by
Draft.
Use Jigsaw: Use Jigsaw to determine the regions.
Tiles in X and Tiles in Y: The number of tiles to divide the job into if not using Jigsaw.
Submit Dependent Assembly Job: If enabled, then a dependent job will be submitted to assemble the
tiles/regions into a single image.
Cleanup Tiles as Assembly: If enabled, then after assembly the tiles/regions will be deleted.
Error on Missing Tiles: If enabled, the assembly job will fail if any of the tiles/regions are missing.
Assemble Over: Determines what the tiles/regions will be assembled over: nothing, a single image, or the same
image as the final image.
Error on Missing Background: Determines if the assembler should fail if the background image is missing.

Supported Renderers
Deadline supports many of the Rhino renderers out of the box, including Rhino Render, Flamingo, VRay, Brazil,
Penguin, and TreeFrog. If you are using a renderer that Deadline does not currently support, please email Deadline
Support and let us know!

It is also possible to manually add new renderers to the list that Deadline supports. Go to
\\your\repository\script\Submission\RhinoSubmission and open Renderers.ini in a text editor. You'll see that this file
contains the list of renderers that Deadline currently supports, one per line. Just add the missing renderer as a new line
and save the file. Note that the name needs to match that of the renderer exactly!
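As a purely hypothetical illustration of the one-renderer-per-line format, the file might look like the following after adding a renderer called MyRenderer (the exact spelling of the built-in entries in your Renderers.ini may differ from what is shown here):
Rhino Render
Flamingo
VRay
Brazil
MyRenderer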

9.58.2 Plug-in Configuration


You can configure the Rhino plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Rhino plug-in from the list on the left.

Render Executables
Rhino Executable: The path to the Rhino executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.

9.58.3 Integrated Submission Script Setup


The following procedures describe how to install the integrated Rhino submission script for different versions of
Rhino. This script allows for submitting Rhino render Jobs to Deadline directly from within the Rhino editing GUI.
Rhino 5
The following installation procedure is intended for, and has been tested with, Rhino 5.0.
In Rhino, select Tools -> Toolbar Layout.

Select the Toolbar Collection file that you want to add the Deadline submission button to, and then select File
-> Import Toolbars.... Browse to [Repository]\submission\Rhino\Client\ and select the deadline.rui file.
Check the box next to Deadline and press OK.

There should now be a toolbar with a Deadline button on your screen, which you can dock anywhere you want.
Left-click on the button to submit a Rhino Job to Deadline.
Right-click on the button to launch the Monitor.
Rhino 4
The following installation procedure is intended for, and has been tested with, Rhino 4.0. It is largely similar to the procedure
described for Rhino 5 above, with some slight differences.
In Rhino, select Tools -> Toolbar Layout.

Select the Toolbar collection file that you want to add the Deadline submission button to, then select Toolbar ->
Import. Browse to [Repository]\submission\Rhino\Client\ and select the deadline.tb file.
Check the box next to Deadline and press Import.

Select File -> Save to save the changes to the selected Toolbar collection file.
There should now be a toolbar with a Deadline button on your screen, which you can dock.
Left-click on the button to submit a Rhino job to Deadline.
Right-click on the button to launch the Monitor.

9.58.4 FAQ
Which versions of Rhino are supported?
Rhino 4 and later are supported.
Does Rhino need to be licensed on each render node?
Yes.
Is the Bongo plugin for animation supported?
Yes. The Rhino submission dialog has the option to render a Bongo animation.
Is V-Ray for Rhino fully supported?
Yes. Please see the V-Ray Distributed Rendering plugin for details on how V-Ray interactive DBR in
Rhino operates.

9.58.5 Error Messages and Meanings


This is a collection of known Rhino error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.59 RVIO
9.59.1 Job Submission
You can submit RVIO jobs from the Monitor, or you can right-click on a job and select Scripts -> Submit RVIO Job
To Deadline to automatically populate some fields in the RVIO submitter based on the job's output.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
The RVIO submitter allows you to create and save Layers, each of which can contain one or two source images, an
arbitrary number of audio files, and a list of overrides.
Click the New button to add a new Layer.
Click the Rename button to rename the selected Layer.
Click the Remove button to remove the selected Layer.
Click the Clear All button to remove all Layers.
Click the Load Layers button to load saved Layers from disk.
Click the Save Layers button to save the list of current Layers to disk.
For Layers, the only required setting is the Source 1 file(s). If specifying a sequence, you can set the range to the right
of the file name (the same for the Source 2 file if specified). Note that the .rv file format is also supported as a Source
file. For Audio Files, a comma-separated list is used to allow the submission of multiple files. Other than submitting at
least one Layer, the only other required option is the Output File under the Output tab. See the RVIO Documentation
for more information about the available options and overrides.
Codec Lists
The RVIO submitter pulls its codec settings from the GetRawCodecText() function in
\\your\repository\scripts\submission\RVIOSubmission.py. The raw text was retrieved from running rvio.exe
-formats in a command prompt. This means that if your installation of RVIO supports additional codecs that aren't
available in the submitter, you can run the following and then copy the text in the resulting Codecs.txt file and paste it
between the triple quotes in GetRawCodecText():
rvio.exe -formats > Codecs.txt
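Conceptually, GetRawCodecText() then simply returns the pasted listing; a simplified sketch of the updated function, with the actual codec listing elided, would be:
def GetRawCodecText():
    # Paste the raw output of "rvio.exe -formats" between the triple quotes.
    return """
    ...codec listing copied from Codecs.txt...
    """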

9.59.2 Plug-in Configuration


You can configure the RVIO plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the RVIO plug-in from the list on the left.

Render Executables
RVIO Executable: The path to the rvio executable file used for rendering. Enter alternative paths on separate
lines.

9.59.3 FAQ
Is RVIO supported by Deadline?
Yes.

9.59.4 Error Messages and Meanings


This is a collection of known RVIO error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.60 Salt
9.60.1 Job Submission
You can submit Salt update jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The Salt specific options are:
Verbose Logging Level: The level of logging a Salt job will output.

9.60.2 Plug-in Configuration


You can configure the Salt plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Salt plug-in from the list on the left.

Options
Salt Executable: The path to the Salt Executable. Enter alternative paths on separate lines.

9.61 Shake
9.61.1 Job Submission
You can submit Shake jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Shake specific options are:
Shake Script File: The Shake script file to be rendered.

CPUs: The number of CPUs to use during rendering.


Additional Arguments: Additional arguments to pass to the Shake command line renderer.

9.61.2 Plug-in Configuration


You can configure the Shake plug-in settings from the Deadline Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Shake plug-in from the list on the left.

Render Executables
Shake Executable: The path to the Shake executable file used for rendering. Enter alternative paths on separate
lines.

9.61.3 FAQ
Is Shake supported by Deadline?
Yes.

9.61.4 Error Messages and Meanings


This is a collection of known Shake error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.62 SketchUp
9.62.1 Job Submission
You can submit jobs from within SketchUp by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within SketchUp, select Plugins -> Submit To Deadline.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The SketchUp specific options are:
SketchUp File: The file to be exported.
Export Directory: The export destination folder.


Export File Prefix: The file prefix of the export.
Export Type: The type of export (3D model, 2D image sequence or 2D image).
Export Format: The file format of the export file (e.g. .3ds, .png).
Frame Rate: Enabled if exporting an image sequence; determines the frequency of images (in frames per second).
Compression Rate: Float compression factor for JPEG images (between 0.0 and 1.0, lower is smaller size,
larger is better quality).
Width: Width of image in pixels (if 0, uses information from SketchUp file).
Height: Height of image in pixels (if 0, uses information from SketchUp file).
Anti-alias: Enables image anti-aliasing.
Transparent: Enables image Transparency.
Version: The version of SketchUp to use.

9.62.2 Plug-in Configuration


You can configure the SketchUp plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the SketchUp plug-in from the list on the left.

Render Executables
SketchUp Executable: The path to the SketchUp executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.


9.62.3 Integrated Submission Script Setup


The following procedures describe how to set up the integrated SketchUp submission script for Deadline. This script
has been tested with SketchUp 7 and later.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/SketchUp/Installers
Manual Installation of the Submission Script
Windows:
Copy [Repository]/submission/SketchUp/Client/DeadlineSketchUpClient.rb to [SketchUp Plugin Directory]
which will look different depending on your version of SketchUp.
SketchUp 8 and earlier, the plug-in directory may look something like this, C:\Program
Files\Google\Google SketchUp #\plugins
SketchUp 2013, the plug-in directory may look something like this, C:\Program Files\SketchUp\SketchUp
2013\plugins
SketchUp 2014, the plug-in directory may look something like this, C:\Users\[User]\AppData\Roaming\SketchUp\SketchUp 2014\SketchUp\Plugins

Mac OS X:
Copy [Repository]/submission/SketchUp/Client/DeadlineSketchUpClient.rb to [SketchUp Plugin Directory]
which will look different depending on your version of SketchUp.
SketchUp 8 and earlier, the plug-in directory may look something like this, /Library/Application Support/Google SketchUp #/SketchUp/plugins
SketchUp 2013 and later, the plug-in directory may look something like this (Note: it may have to be in
the specific user's /Library/ directory as of 2014), /Library/Application Support/SketchUp #/plugins

9.62.4 FAQ
Which versions of SketchUp are supported by Deadline?
The commercial versions of SketchUp 7 and later are supported.

9.62.5 Error Messages and Meanings


This is a collection of known SketchUp error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.63 Softimage
9.63.1 Job Submission
You can submit jobs from within Softimage by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Softimage, select the Render toolbar on the left and click Render -> Submit To Deadline.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Softimage specific options are:
Workgroup: Specify the workgroup that Softimage should use during rendering. Leave blank to ignore.
Force Build: Force 32 bit or 64 bit rendering.
Submit Softimage Scene File: The Softimage scene file will be submitted with the job. If your Softimage scene
is stored in a project folder on the network, it is recommended that you leave this box unchecked.
Threads: The number of render threads to use during rendering.
Use Softimage Batch Plugin: This plugin keeps Softimage and the scene loaded in memory between tasks.


Enable Local Rendering: If enabled, the frames will be rendered locally, and then will be copied to the final
network location. Note that this feature doesn't support the Skip Existing Frame option.
Skip Batch Licensing Check: If enabled, Softimage won't try to check out a Batch license during rendering.
This allows you to use 3rd party renderers like VRay or Arnold without using a Softimage batch license.
Selecting passes to render:
Select which passes you would like to render. A separate job is submitted for each pass. If no passes are selected,
then the current pass is submitted. Note that if you are using FxTree Rendering, the passes are ignored.

Setting up a tile rendering job:


Enable tile rendering to split up a frame into multiple tiles that are rendered individually. By default, a separate
job is submitted for each tile (this allows for tile rendering of a sequence of frames). For easier management of
single frame tile rendering, you can choose to submit all the tiles as a single job.
You can submit a dependent assembly job to assemble the image when the main tile job completes. If using
Draft for the assembly, you'll need a license from Thinkbox. Otherwise, the output formats that are supported
are BMP, DDS, EXR, JPG, JPE, JPEG, PNG, RGB, RGBA, SGI, TGA, TIF, and TIFF.
Note that the Error On Missing Tiles option only applies to Draft assemblies.
Note that if you are using FxTree Rendering, the tile rendering settings are ignored.

Setting up an FxTree rendering job:


Enable FxTree rendering to render a specific FxTree output node, which can be selected from the FxTree Output
dropdown box. You can also set the frame offset for the output files. Some things to note are:
The frame range to be rendered is pulled from the Frame List setting under the Submission
Options tab.
If you are rendering to a movie file, be sure to set the Group Size to the number of frames in
your animation.


Notes:
Softimage gives the option to specify file paths as being relative to the current directory or absolute. Deadline
requires that all file paths be absolute.
When specifying the image output, make sure to include the extension (.pic, .tga, etc) at the end so that you can
view the individual rendered images from the task list in the Monitor.
Redshift Renderer Options
If submitting a Softimage scene that uses the Redshift renderer, there will be an additional option in the integrated
submitter called GPUs Per Task. If set to 0 (the default), then Redshift will be responsible for choosing the GPUs to use
for rendering.
If this is set to 1 or greater, then each task for the job will be assigned specific GPUs. This can be used in combination
with concurrent tasks to get a distribution over the GPUs. For example:
if this is set to 1, then tasks rendered by the Slave's thread 0 would use GPU 0, thread 1 would use GPU 1, etc.
if this is set to 2, then tasks rendered by the Slave's thread 0 would use GPUs {0,1}, thread 1 would use GPUs
{2,3}, etc.
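To make that mapping concrete, here is a minimal illustrative sketch. This is not the Deadline plug-in code; the function name and variables are made up for illustration only:

def gpus_for_thread( thread_index, gpus_per_task ):
    # 0 means Redshift itself decides which GPUs to use.
    if gpus_per_task <= 0:
        return []
    start = thread_index * gpus_per_task
    return list( range( start, start + gpus_per_task ) )

# With 1 GPU per task: thread 0 -> [0], thread 1 -> [1]
# With 2 GPUs per task: thread 0 -> [0, 1], thread 1 -> [2, 3]
print( gpus_for_thread( 0, 2 ), gpus_for_thread( 1, 2 ) )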

9.63.2 Cross-Platform Rendering Considerations


In order to perform cross-platform rendering with Softimage, you must set up Mapped Paths so that Deadline can swap
out file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by
selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the left.

9.63.3 Plug-in Configuration


You can configure the Softimage and SoftimageBatch plug-in settings from the Monitor. While in super user mode,
select Tools -> Configure Plugins and select the Softimage or SoftimageBatch plug-in from the list on the left.


Softimage

Render Executables
Softimage Render Executable: The path to the XSIBatch.bat file used for rendering. Enter alternative paths
on separate lines. Different executable paths can be configured for each version installed on your render nodes.
Options
Enable Strict Error Checking: If enabled, Deadline will fail the job in almost all cases whenever Softimage prints out ERROR for whatever reason.
Return Codes To Ignore: Error codes (other than 0) that Deadline should ignore and instead assume the render
has finished successfully. Use a ; to separate the error codes.


SoftimageBatch

Render Executables
Softimage Render Executable: The path to the XSIBatch.bat file used for rendering. Enter alternative paths
on separate lines. Different executable paths can be configured for each version installed on your render nodes.
Options
Enable Strict Error Checking: If enabled, Deadline will fail the job in almost all cases whenever Softimage prints out ERROR for whatever reason.
Connection Timeout: The number of seconds to give the Deadline plugin and Softimage to establish a connection before the job fails.
Timeout For Progress Updates: The maximum number of seconds allowed between Softimage progress updates before the job
is failed. Set to 0 to disable this feature.

9.63.4 Integrated Submission Script Setup


The following procedures describe how to install the integrated Softimage submission script. This script allows for
submitting Softimage render jobs to Deadline directly from within the Softimage editing GUI.
Earlier versions of Softimage might not ship with Python out of the box. In this case, follow these steps:
Install the Python engine for Softimage. For more information, see the Softimage Python installation documentation.
Check that the Python engine has been installed correctly. This can be done by opening up Softimage and
selecting File -> Preferences. Under the Scripting preferences, you should have the option to select Python as
the Script Language. If you don't see this option, then Python has not been installed correctly, and you should
contact the Softimage support team.


Once Python is an available scripting option in Softimage, you can follow these steps to install the submission script:
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/Softimage/Installers
Manual Installation of the Submission Script
Copy the file [Repository]/submission/Softimage/Client/DeadlineSoftimageClient.py to the folder [Softimage
Install Directory]/Application/Plugins
Launch Softimage. The submission script is automatically installed when Softimage starts up. To make sure
the script was installed correctly, select the Render toolbar on the left and click the Render button. A Submit To
Deadline menu item should be available.


Custom Sanity Check


A CustomSanityChecks.py file can be created alongside the main Softimage submission scripts (in [Repository]\submission\Softimage\Main), and will be evaluated if it exists. This script will let you set any of the initial
properties in the submission script prior to displaying the submission window. You can also use it to run your own
checks and display errors or warnings to the user. Here is a very simple example of what this script could look like:
import win32com.client

# Connect to the running Softimage instance via COM.
Application = win32com.client.Dispatch( 'XSI.Application' )

def RunSanityCheck( opSet ):
    # Pre-populate some of the submitter fields before the dialog is shown.
    opSet.Parameters( "DepartmentTextBox" ).Value = "The Best Department!"
    opSet.Parameters( "PriorityNumeric" ).Value = 33
    opSet.Parameters( "BatchBox" ).Value = True
    Application.LogMessage( "This is a custom sanity check!" )
    # Returning True allows the submission to continue.
    return True

The opSet parameters can be found in the SoftimageToDeadline.py script in the Main folder mentioned above. Look
for the following line in the script:
opSet = Application.ActiveSceneRoot.AddProperty(
"CustomProperty",False,"SubmitSoftimageToDeadline")

After this line, all the available parameters are added to the opSet. These can be used to set the initial values in the
submission dialog.
Finally, if the RunSanityCheck method returns False, the submission will be canceled.
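As a further illustration, here is a minimal sketch of a CustomSanityChecks.py that cancels the submission when a condition is not met. It only uses the parameter names shown in the example above; the priority cap of 50 is an arbitrary assumption for illustration, not a Deadline default:

import win32com.client
Application = win32com.client.Dispatch( 'XSI.Application' )

MAX_PRIORITY = 50  # hypothetical studio-wide cap, adjust to taste

def RunSanityCheck( opSet ):
    # Clamp the initial priority shown in the submission dialog.
    if opSet.Parameters( "PriorityNumeric" ).Value > MAX_PRIORITY:
        Application.LogMessage( "Priority clamped to %d." % MAX_PRIORITY )
        opSet.Parameters( "PriorityNumeric" ).Value = MAX_PRIORITY
    # Require a department to be filled in; returning False cancels the submission.
    if not opSet.Parameters( "DepartmentTextBox" ).Value:
        Application.LogMessage( "Submission canceled: please set a department." )
        return False
    return True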

9.63.5 FAQ
Which versions of Softimage are supported?
Softimage versions 2010 and later are supported.
What is the difference between the Softimage and SoftimageBatch plug-ins?
The SoftimageBatch plug-in keeps the scene loaded in memory between subsequent tasks for the same
job. This saves on the overhead of having to load Softimage and the scene file for each task. The Softimage
plug-in uses standard command line rendering, and should only be used if you experience problems with
the SoftimageBatch plug-in.
Is FxTree rendering supported?
Yes. Simply enable FxTree rendering in the submission dialog and specify the FxTree and Output Node
you want to render.
Is the Arnold renderer for Softimage supported?
Yes. Deadline supports the Arnold plug-in for Softimage, as well as Arnold's standalone renderer
(kick.exe). For more information on rendering Arnold Standalone jobs, see the Arnold Standalone Plug-in
Guide.
Can Softimage script jobs be submitted to Deadline?
Yes. Deadline provides very basic support for script jobs, though there is currently no interface to submit
them. The option for submitting a script job can be specified in the plug-in info file.
After installing the Softimage integrated submission script, Softimage fails to load (it goes to a white screen
and hangs).
We have heard of this problem before, but we have not been able to reproduce it. The workaround for this
problem is to remove the script from the plugins folder, and manually path to the submission script plugin
after starting Softimage.
When Deadline renders the job, Softimage isn't able to find anything in the scene's project folder.
If your Softimage scene file is saved in a project folder on the network, leave the Submit Softimage
Scene File check box unchecked in the submission dialog. This allows Deadline to load the Softimage
scene in the context of its project folder.
I have Softimage configured to save output to a network share, but when Deadline renders the scene, the render
slaves save their output to their local C drive rather than to the network share.
There are two possible solutions:


1. If your Softimage scene file is saved in a project folder on the network, leave the Submit Softimage Scene File check box unchecked in the submission dialog. This is the recommended solution.
2. Specify the full resolved path for the scene output directory, instead of something like [Project
Path]\Render_Pictures.
Rendering with Deadline seems a lot slower than rendering through Softimage itself.
If you're submitting your jobs with the Use Softimage Batch option disabled, then Softimage needs to be
restarted and the scene needs to be reloaded for every task in the job, which can add a lot of overhead to
the render time, especially if cached data needs to be loaded.
To speed up your renders, you can increase the task group size (aka: chunk size) from 1 to 5 or 10. This
way, the scene is loaded once for every 5 or 10 frames. Increasing the chunksize like this is recommended
if you know ahead of time your frames will only take seconds to render, or if a large amount of cached
data needs to be loaded.

9.63.6 Error Messages and Meanings


This is a collection of known Softimage error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Exception during render: Renderer returned non-zero error code, -1073741819
The error code -1073741819 is equivalent to 0xC0000005, which represents a Memory Access Violation
error. So Softimage is either running out of memory, or memory is becoming corrupt. If you find that
your frames are still being rendered, you can configure the Softimage plug-in to ignore this error code
(see the Return Codes To Ignore option above).
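The signed-to-unsigned conversion mentioned above can be verified quickly in Python:

# -1073741819 reinterpreted as an unsigned 32-bit value is 0xC0000005,
# the Windows Access Violation status code.
print( hex( -1073741819 & 0xFFFFFFFF ) )  # prints 0xc0000005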
Exception during render: ERROR : 2000 - Library not found: ...
This can occur if the Slave application's environment variables haven't been updated. Try rebooting the
machine and see if that fixes the problem.
ERROR : 2004 - Invalid pointer - [line 2]
You can work around this by renaming the ICEFlow plugin (Application\Plugins\ICEFlow.dll). This plugin manages the transfer of data between Softimage and Maya (the one-click ICE workflow).

9.64 Terragen
9.64.1 Job Submission
You can submit Terragen jobs from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Terragen specific options are:
Project File: The Terragen project file to render.
Render Node: Select the render node to render. Leave blank to use the default in the project.
Output: Override the output path in the project file. If rendering a sequence of frames, remember to include the
%04d format in the output file name so that padding is added to each frame.
Extra Output: Override the extra output path in the project file. If rendering a sequence of frames, remember
to include the IMAGETYPE.%04d format in the output file name so that padding is added to each frame.
Enable Local Rendering: If enabled, the frames will be rendered locally, and will then be copied to the final
network location. Note that this requires that an Output file be specified above.
Version: The version of Terragen.

9.64.2 Plug-in Configuration


You can configure the Terragen plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Terragen plug-in from the list on the left.

Render Executables
Terragen CLI Executable: The path to the Terragen executable file used for rendering. Enter alternative paths
on separate lines. Different executable paths can be configured for each version installed on your render nodes.


9.64.3 FAQ
Which versions of Terragen are supported?
The commercial versions of Terragen 2 and later are supported.

9.64.4 Error Messages and Meanings


This is a collection of known Terragen error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.65 Tile Assembler


9.65.1 Job Submission
You can submit Tile Assembler jobs from the Monitor. Normally, these jobs are submitted as dependent jobs for your
original tile jobs, but you can submit them manually if you wish. Please note that the Tile Assembler plugin is EOL
(End-Of-Life/deprecated) and we recommend using the newer Draft Tile Assembler plugin for all tile/region assembly
duties.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The Tile Assembler specific options
are:
Input Tile Files: Select just one of your image tile files from a group to perform the tile assembly for. The files should have the format [PREFIX]_tile_[I]x[J]_[X]x[Y].[EXTENSION]. For example,
r:\projects\deadline\Tests\example_tile_1x1_2x1_0000.exr. Ensure the filenames match this naming convention (a validation sketch follows this list).
Tiles Are Uncropped: Enable this option if a tile consists of the full resolution of the image, with only a part
of it rendered.


Ignore Overlap: If assembling uncropped tiles, enable this option to ignore any overlap that exists for the given
tiles. For example, if two tiles share a few pixels between them.
Clean Up Tile Files After Assembly: Enable to automatically delete the tile files after successfully assembling
the final image.
Opaque Opacity: Use this option if non-exr tiles use opaque opacity in empty pixels.
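The following is a minimal validation sketch (not part of Deadline) for the Input Tile Files naming convention described above; the trailing frame number is treated as optional, matching the example path shown:

import re

# Matches [PREFIX]_tile_[I]x[J]_[X]x[Y](_[FRAME])?.[EXTENSION],
# e.g. example_tile_1x1_2x1_0000.exr
TILE_PATTERN = re.compile(
    r"^(?P<prefix>.+)_tile_(?P<i>\d+)x(?P<j>\d+)_(?P<x>\d+)x(?P<y>\d+)(_(?P<frame>\d+))?\.(?P<ext>\w+)$" )

def is_valid_tile_name( filename ):
    return TILE_PATTERN.match( filename ) is not None

print( is_valid_tile_name( "example_tile_1x1_2x1_0000.exr" ) )  # True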

9.65.2 Plug-in Configuration


The Tile Assembler plug-in does not require any configuration.

9.65.3 FAQ
Is the Tile Assembler plugin still officially supported in Deadline?
No. Please note that the Tile Assembler plugin is EOL (End-Of-Life) and we recommend using the newer
Draft Tile Assembler plugin for all tile/region assembly duties. The old Tile Assembler system is still
available in Monitor and via some of the in-app tile rendering submission scripts and will still work.
However, it is now deprecated, so please do not build any in-house tools around Tile Assembler. The
newer Draft Tile Assembler contains all the features of the old Tile Assembler and more! Tile Assembler
will be removed at an undetermined date in the future. You have been warned!

9.65.4 Error Messages And Meanings


This is a collection of known Tile Assembler error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know. Please note that the Tile Assembler plugin is EOL (End-Of-Life) and we recommend using
the newer Draft Tile Assembler plugin for all tile/region assembly duties.
ERROR: ImageMagick: Invalid bit depth for RGB image [path to tile/region render output image]
This error is due to the TileAssembler executable not supporting certain bit depth images, such as V-Ray's
render elements (Reflection, Refraction and Alpha) when saved from the V-Ray Frame Buffer (VFB). Please
use the newer Draft Tile Assembler plugin to ensure all image types/bit depths are correctly assembled.
Draft Tile Assembler jobs can also be submitted independently if you already have the *.config file(s);
this is explained further in the Draft Tile Assembler documentation.

9.66 V-Ray Distributed Rendering


9.66.1 Interactive Distributed Rendering
You can submit interactive V-Ray DBR jobs from 3ds Max, Maya, or Softimage. The instructions for installing the
integrated submission script can be found further down this page.
The interactive submitter will submit a V-Ray Spawner job (V-Ray standalone for Maya, Softimage, 3dsMaxRT,
Rhino, Sketchup) to reserve render nodes, and the submitter will automatically update the V-Ray server list.
Do NOT execute or install the Chaos Group V-Ray Spawner (V-Ray Spawner/V-Ray Spawner RT/V-Ray standalone)
executable as a background service (NT service/daemon). Deadline is more flexible here and will spawn the V-Ray
Spawner/standalone executable as a child process of the Deadline Slave. This makes our system more flexible and
resilient to crashes: when we terminate the V-Ray DBR job in the Deadline queue, the Deadline Slave application will
cleanly tidy up the V-Ray Spawner/standalone process and, more importantly, any DCC application (3dsMax/Maya) or
standalone instances which it in turn has spawned as child processes. This can be helpful if V-Ray DBR becomes
unstable and a user wishes to reset the system remotely: you can simply re-queue, delete/complete, or re-submit the
current DBR job.

Port Configuration
Here is a consolidated list of port requirements for the various versions of V-Ray. Ensure any applicable firewalls are
opened to allow pass-through communication. Typically, if in doubt, opening TCP/UDP ports in the range 20200-20300
will cover all V-Ray implementations for DBR. During initial testing, it is recommended to open all ports in
this range, verify, and then consider tightening up security.
Protocol | Default Port Number | Application | Notes
TCP/IP | 20204 | 3dsMax V-Ray Production | V-Ray 2.x, V-Ray 3.x - Production and Nightly beta builds (v2 & v3: 20204)
TCP/IP | FIXED | 3dsMax / V-Ray Spawner | Used by render servers to broadcast a message that they are ready to join an ongoing DR session (v2 & v3: 20205)
TCP/IP | 20206 | 3dsMax V-Ray RT/GPU | V-Ray 2.x, V-Ray 3.x RT/GPU - Production and Nightly beta builds (v2 & v3: 20206)
TCP/IP | 20207 | Maya | V-Ray 2.x and 3.x RT/GPU - Production and Nightly beta builds
TCP/IP | 20207 | Softimage | V-Ray 2.x and 3.x - Production
TCP/IP | 20207 | modo | V-Ray Standalone
TCP/IP | 20207 | Rhino | V-Ray Standalone
TCP/IP | 20207 | SketchUp | V-Ray Standalone
TCP/IP | 20207 | C4D | V-Ray Standalone
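When debugging DBR connectivity, a quick TCP connection test against a render node can confirm whether one of these ports is reachable before looking at V-Ray itself. This is a minimal sketch (not part of Deadline); the host name is a placeholder:

import socket

def can_connect( host, port, timeout = 3.0 ):
    # A successful TCP connection only proves the port is open; it does not
    # guarantee the V-Ray Spawner/standalone process is fully initialized.
    try:
        sock = socket.create_connection( ( host, port ), timeout )
        sock.close()
        return True
    except ( socket.error, socket.timeout ):
        return False

# Example: check the default 3dsMax V-Ray production DR port on a render node.
print( can_connect( "rendernode01", 20204 ) )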


Submission Options
The general Deadline options are explained in the Job Submission documentation. The V-Ray DBR specific options
are:
Maximum Servers: The maximum number of V-Ray Servers to reserve for distributed rendering.
Port Number (Softimage/Maya/3dsMax/3dsMaxRT only): The port number that V-Ray will use for distributed rendering. In the case of Softimage, this is necessary because Softimage uses V-Ray standalone for
distributed rendering and the default port number for V-Ray in Softimage is different from the default port
number in V-Ray standalone. The port number needs to be identical on all machines including the workstation machine for a particular DCC application to communicate correctly. It is recommended to disable any
client firewall whilst initial testing/configuration is carried out. Typically, opening TCP/UDP ports in the range:
20200-20300 will cover all V-Ray implementations for DBR.
Use Server IP Address instead of Host Name: If checked, the Active Servers list will show the server IP
addresses instead of host names.
Automatically Update Server List (3dsMax only): When unchecked, this option stops the automatic refresh
of the Active Servers list based on the current Deadline queue.
Complete Job After Render (3dsMax only): When checked, as soon as the DR session has completed (max
quick render finished), then the Deadline job will be marked as complete in the queue.
Active Servers (3dsMax only): Individual Deadline Slaves can be enabled/disabled here (V-Ray Spawner as a
job will still continue to run on the disabled slaves until the job is deleted/completed).
Check ALL/INVERT/Check NONE (3dsMax only): Easily check all, invert the selection of, or uncheck all of the
currently listed Deadline Slaves in the Active Servers list.
Rendering
After you've configured your submission options, press the Reserve Servers button to submit the V-Ray Spawner job.
The job's ID and Status will be tracked in the submitter, and as nodes pick up the job, they will show up in the Active
Servers list. Once you are happy with the server list, press Start Render (3ds Max and Maya) or Render Current
Pass/Render All Passes (Softimage) to start distributed rendering.
Note that the V-Ray Spawner/V-Ray standalone process can sometimes take a little while to initialize. This means that
a server in the Active Servers list could have started the V-Ray Spawner, but it's not fully initialized yet. If this is the
case, it's probably best to wait a minute or so after the last server has shown up before pressing Start Render.
The Update Servers (3dsMax only) button will manually update the Active Servers list. Note that if you modify the Maximum
Servers value, the job's frame range will be updated when this button is pressed or if Automatically Update Server
List is enabled.
After the render is finished, you can press Release Servers or close the submitter UI (Setup V-Ray DBR With Deadline)
to mark the V-Ray Spawner/V-Ray standalone job as complete so that the render nodes can move on to another job.

9.66.2 V-Ray Spawner/V-Ray standalone Submission


You can also submit V-Ray Spawner/V-Ray standalone jobs from the Monitor, which can be used to reserve render
nodes for distributed rendering. Note that if you submit the job via the Monitor submission script, you will need to
manually configure/update your local workstation settings to point to the correct, corresponding Deadline Slaves over
an identical port number.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The V-Ray DBR specific options
are:
Maximum Servers: The maximum number of V-Ray Servers to reserve for distributed rendering.
Application: The application you will be initiating the distributed render from.
Version: The version of the application, if applicable.
Port Number (Softimage/Maya/3dsMaxRT only): The port number that V-Ray will use for distributed rendering. In the case of Softimage, this is necessary because Softimage uses V-Ray standalone for distributed
rendering and the default port number for V-Ray in Softimage is different from the default port number in VRay standalone. The port number needs to be identical on all machines including the workstation machine for
a particular DCC application to communicate correctly. It is recommended to disable any client firewall whilst
initial testing/configuration is carried out. Typically, opening TCP/UDP ports in the range: 20200-20300 will
cover all V-Ray implementations for DBR.

Rendering
After you've configured your submission options, press the Submit button to submit the V-Ray Spawner/V-Ray standalone job. Note that this doesn't start any rendering; it just allows the V-Ray Spawner/V-Ray standalone application
to start up on nodes in the farm. Once you're happy with the nodes that have picked up the job, you can initiate the
distributed render manually from within the application (e.g. Rhino or SketchUp). This will likely require manually
configuring your V-Ray Server list.
After the distributed render has finished, remember to mark the job as complete or delete it so that the nodes can move
on to other jobs.

9.66.3 Plug-in Configuration


You can configure the V-Ray Spawner/V-Ray standalone plug-in settings from the Monitor. While in super user mode,
select Tools -> Configure Plugins and select the V-RaySpawner plug-in from the list on the left.

V-Ray Executables
Here you can specify the executable used for rendering for the different versions of V-Ray.
DR Process Handling
Handle Existing DR/DBR Process: Only one instance of the same DR process can run over the same port. If an
existing process is found, this option allows Deadline to either fail the task or attempt to kill the currently running
process so that the Deadline-managed DR process can run successfully. Note that if the option is set to kill and the
existing process is killed but seems to auto-restart, this indicates the process is already running as a service, and the
service will need to be stopped by your IT staff. Do NOT install it as a service, as Deadline can NOT support this
configuration.
DR Session Timeout (unsupported in 3dsMax)


DR Session Auto Timeout Enable: If enabled, when a DR session has successfully completed on a slave, the
task on the slave will be marked as complete after the DR session auto timeout period in seconds has been
reached (Default: False).
DR Session Auto Timeout (Seconds): This is the timeout period (Default: 30 seconds) when a DR session will
timeout and be marked as complete by a slave.

9.66.4 Integrated Submission Script Setup


There are integrated V-Ray DBR submission scripts for 3ds Max, Maya, and Softimage. The installation process for
these scripts can be found below.
You can also submit V-Ray Spawner jobs for Rhino and Sketchup from the Monitor. In this case, the render nodes
will simply be reserved for DBR, and the distributed rendering process itself will have to be initiated manually from
within Rhino or Sketchup.
3ds Max
The following procedures describe how to install the integrated V-Ray DBR submission script for 3ds Max. The
integrated submission script and the following installation procedure have been tested with Max versions 2012 and later
(including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work.
However, this bug has been addressed in 3ds Max 2012 Hotfix 1.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/3dsmaxVRayDBR/Installers
Manual Installation of the Submission Script
Copy [Repository]/submission/3dsmaxVRayDBR/Client/Deadline3dsmaxVRayDBRClient.mcr to [3ds Install
Directory]/MacroScripts. If you don't have a MacroScripts folder in your 3ds Max install directory, check to
see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsmaxVRayDBRClient.mcr file there
if you do.
Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms

Launch 3ds Max, and find the new Deadline menu.


Maya
The following procedure describes how to install the integrated V-Ray DBR submission script for Maya. The integrated submission script and the following installation procedure have been tested with Maya versions 2012 and later.
You can either run the Submitter installer or manually install the submission script
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/MayaVRayDBR/Installers
Manual Installation of the Submission Script
On Windows, copy the file [Repository]\submission\MayaVRayDBR\Client\DeadlineMayaVRayDBRClient.mel
to [Maya Install Directory]\scripts\startup. If you do not have a userSetup.mel in [My Documents]\maya\scripts,
copy the file [Repository]\submission\MayaVRayDBR\Client\userSetup.mel to [My Documents]\maya\scripts.
If you have a userSetup.mel file, add the following line to the end of this file:
source "DeadlineMayaVRayDBRClient.mel";

On Mac OS X, copy the file [Repository]/submission/MayaVRayDBR/Client/DeadlineMayaVRayDBRClient.mel
to [Maya Install Directory]/Maya.app/Contents/scripts/startup. If you do not have a userSetup.mel in
/Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts, copy the file [Repository]/submission/MayaVRayDBR/Client/userSetup.mel
to /Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts. If you have a userSetup.mel file, add
the following line to the end of this file:
source "DeadlineMayaVRayDBRClient.mel";


On Linux, copy the file [Repository]/submission/MayaVRayDBR/Client/DeadlineMayaVRayDBRClient.mel
to [Maya Install Directory]/Maya.app/Contents/scripts/startup. If you do not have a userSetup.mel in
/home/[USERNAME]/maya/scripts, copy the file [Repository]/submission/MayaVRayDBR/Client/userSetup.mel
to /home/[USERNAME]/maya/scripts. If you have a userSetup.mel file, add the following line to the end of this file:
source "DeadlineMayaVRayDBRClient.mel";

The next time Maya is started, a Deadline shelf should appear with an orange button that can be clicked on to
launch the submitter.

If you don't see the Deadline shelf, it's likely that Maya is loading another userSetup.mel file from somewhere.
Maya can only load one userSetup.mel file, so you either have to configure Maya to point to the file mentioned
above, or you have to modify the file that Maya is currently using as explained above. To figure out which
userSetup.mel file Maya is using, open up Maya and then open up the Script Editor. Run this command:
whatIs userSetup.mel

Softimage
The following procedure describes how to install the integrated V-Ray DBR submission script for Softimage. The
integrated submission script and the following installation procedure have been tested with Softimage versions 2012
and later.
Submitter Installer
Run the Submitter Installer located at <Repository>/submission/SoftimageVRayDBR/Installers
Manual Installation of the Submission Script
Copy the file [Repository]/submission/SoftimageVRayDBR/Client/DeadlineSoftimageVRayDBRClient.py to
the folder [Softimage Install Directory]/Application/Plugins
Launch Softimage. The submission script is automatically installed when Softimage starts up. To make sure the
script was installed correctly, select the Render toolbar on the left and click the Render button. A Setup V-Ray
DBR With Deadline menu item should be available.


9.66.5 FAQ
Is V-Ray Distributed Rendering (DBR) supported?
Yes. A special reserve job is submitted that will run the V-Ray Spawner/V-Ray standalone process on
the render nodes. Once the V-Ray Spawner/V-Ray standalone process is running, these nodes will be able
to participate in distributed rendering.
Which versions of V-Ray DBR are supported?
V-Ray DBR interactive rendering is supported for 3ds Max, Maya, and Softimage 2012-2015. You can
also submit V-Ray Spawner jobs for Rhino and Sketchup from the Monitor. In the latter case, the render
nodes will simply be reserved for DBR, and the distributed rendering process itself will have to be initiated
manually from within Rhino or Sketchup.
What if the V-Ray Slave or V-Ray Spawner application fails to start when run manually?
During initial configuration of V-Ray DBR & any future debugging, it is recommended to disable any
firewall & anti-virus software at both the DBR master host machine as well as all render slave machines
which are intended to participate in the DBR render. We suggest you manually get V-Ray DBR up and
running in your studio pipeline to verify all is well before then introducing Deadline as a framework to
handle the Spawner/Slave process.
Is Backburner required for 3dsMax based V-Ray DBR rendering via Deadline?
Yes. Normal 3dsMax rendering via Deadline requires the Backburner DLLs to be present on a system,
and this is the same prerequisite for V-Ray DBR rendering to work correctly. Ensure you have the
latest/corresponding version of Backburner so that it supports the version of 3dsMax you are using. You
can submit a normal 3dsMax render job to verify that Backburner & 3dsMax rendering via Deadline are
all operating correctly before attempting to configure V-Ray DBR rendering. Use the Deadline job report
to verify that correctly matched versions of Backburner and 3dsMax are in use.

3dsmax.exe starts (via vrayspawnerYYYY.exe) in the taskbar (minimized) but then instantly disappears?
V-Ray DBR rendering requires Deadline to have rendered at least one normal 3dsMax render job on the
slave machine prior to attempting DBR rendering via vrayspawnerYYYY.exe. Essentially, to test/debug if
this is an issue, try to manually start the vrayspawnerYYYY.exe program from the Start menu (Start menu
> Programs > Chaos Group > V-Ray for 3dsmax > Distributed rendering > Launch V-Ray DR spawner).
It will automatically try to find the 3dsmax.exe file and start it in server mode. You should end up with
3dsmax minimized in the task bar with the title vraydummyYYYY.max. If 3ds Max stays there alive
without closing then V-Ray DBR is working correctly. If you see the 3ds Max window flashing on the
taskbar and then instantly disappearing, right-click on the V-Ray DR spawner icon in the taskbar tray,
select Exit to close the DR spawner application, and try submitting a regular Deadline 3dsMax render
job with this machine running Deadline slave. After that, try to start the V-Ray DR spawner again.
Do I need to run the vrayspawner (or RT/vrayslave/vray standalone) application or install vrayspawner (or
RT/vrayslave/vray standalone) executable as a service/daemon on each machine?
No. Do NOT execute or install the Chaos Group V-Ray Spawner (V-Ray Spawner/V-Ray Spawner RT/V-Ray
standalone) executable as a background service (NT service/daemon). Deadline is more flexible here
and will spawn the V-Ray Spawner/standalone executable as a child process of the Deadline Slave. This
makes our system more flexible and resilient to crashes: when we terminate the V-Ray DBR job in the
Deadline queue, the Deadline Slave application will cleanly tidy up the V-Ray Spawner/standalone process
and, more importantly, any DCC application (3dsMax/Maya) or standalone instances which it in turn has
spawned as child processes. This can be helpful if V-Ray DBR becomes unstable and a user wishes to reset
the system remotely. You can simply re-queue, delete/complete, or re-submit the current DBR job.
Can I force V-Ray Spawner/Slave to run over a certain control port?
Yes. Set the system environment variable VRAY_DR_CONTROLPORT to the required port number
or where possible, in the case of some supported applications we expose the Port Number option in
our Monitor/in-app submitters. Please consult the V-Ray version 2 or version 3 user manual for more
information on TCP/IP Port Numbers.
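For illustration, here is a minimal sketch (not part of Deadline) of launching a spawner-style executable with VRAY_DR_CONTROLPORT set for that child process only; the executable path is a placeholder, and a system-wide setting would instead be made through the operating system:

import os
import subprocess

# Copy the current environment and force the V-Ray DR control port
# for the child process only.
env = os.environ.copy()
env[ "VRAY_DR_CONTROLPORT" ] = "20207"

# Placeholder path - replace with your actual V-Ray Spawner/standalone executable.
subprocess.Popen( [ r"C:\path\to\vrayspawner.exe" ], env = env )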
Can I force V-Ray DBR to run over a specific port for 3dsMax?
Yes. V-Ray production renderer specifically via 3dsMax uses a TCP port (default:20204), which can
be changed via the Port Number spinner. V-Ray RT as the renderer uses a different TCP port (default:20206). See here for more information on Port Configuration. Please consult the V-Ray version 2 or
version 3 user manual for more information on TCP/IP Port Numbers. Note, the Port Number can only
be controlled via the 3dsMax in-app submitter and NOT when reserving a V-Ray DBR job for 3dsMax
via the Deadline Monitor submission script.
V-Ray DBR rendering seems a little unstable sometimes or my machine slows down dramatically!
Depending on the number of slave machines being used (Win7 OS < 20), scene file sizes being moved
around together with asset files, and your network/file storage configuration, it may help to disable your
local machine from participating in the DR render process. Depending on your 3D application used and
the V-Ray version, there might be a Use local host or Don't use local machine checkbox option,
which can help to reduce the load on your local machine.
Can I fully off-load 3dsMax V-Ray or Mental Ray DBR rendering from my machine?
Yes, although please note, this is a different workflow and is supported directly in the 3dsmax plugin. See
the V-Ray/Mental Ray DBR section for more information.

9.66.6 Error Messages and Meanings


This is a collection of known V-Ray DBR error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.

error: Failed to start network server: Failed to open listening port (98)
VRay.exe/vrayslave has been configured as a service/daemon on the machine generating this error message, possibly during the V-Ray/Maya install process and this is conflicting with Deadline trying to
also spawn the same process on the same TCP port (default: 20207). On Linux, ensure you check
the contents of the file: /usr/autodesk/maya20##-x64/vray/bin/vrayslave for a line entry as follows:
/usr/autodesk/maya2014-x64/vray/bin/vray.bin $* -server -portNumber=20207 where ## is the Maya
version. This line entry should not be present. Note, we are unable to attach to an already running process
as part of the V-Ray Spawner Plugin, hence the V-Ray executable must NOT already be running. Do NOT
execute or install V-Ray as a service. Deadline is more flexible here and will spawn the executable as a
child process of the Deadline Slave.

9.67 VRay Standalone


9.67.1 Job Submission
You can submit VRay Standalone jobs from the Monitor.


Set Up your vrscene Files


Before you can submit a VRay Standalone job, you must export your scene into .vrscene files. You can export into
either one .vrscene file with all your frames in it, or one .vrscene file per frame.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The VRay specific options are:
VRay File: The VRay file (*.vrscene) to be rendered. If you are submitting a sequence of vrscene files (one file
per frame), you only need to select one vrscene file from the sequence.
Output File: Optionally override the output file name.
Separate Input vrscene Files Per Frame: Select this option if you are submitting a sequence of vrscene files
(one file per frame).
Threads: The number of threads to use for rendering. Specify 0 to use the optimal number of threads.
Command Line Args: Specify additional command line arguments you would like to pass to the V-Ray standalone renderer.
Vrimg2Exr Options: If you are saving out vrimg files, you can submit a dependent Vrimg2Exr job that will
convert the vrimg files to exr files.

9.67.2 Plug-in Configuration


You can configure the VRay plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the VRay plug-in from the list on the left.


Render Executables
VRay Executable: The path to the VRay executable file used for rendering. Enter alternative paths on separate
lines.
Path Mapping For vrscene Files (For Mixed Farms)
Enable Path Mapping For vrscene Files: If enabled, a temporary vrscene file will be created locally on the
slave for rendering and Deadline will do path mapping directly in the vrscene file.

9.67.3 FAQ
Is VRay Standalone supported?
Yes.

9.67.4 Error Messages and Meanings


This is a collection of known VRay error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.68 VRay Ply2Vrmesh


9.68.1 Job Submission
You can submit Ply2Vrmesh jobs from the Submit menu in the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The Ply2Vrmesh specific options
are:
Input File: The file to be converted.
Output File: Optionally override the output file name. If left blank, the output name will be the same as the
input name (with the vrmesh extension).
Append: appends the information as a new frame to the .vrmesh file
Merge Voxels: merge objects before voxelization to reduce overlapping voxels


Smooth Angle: a floating point number that specifies the angle (in degrees) used to distinguish if the normals
should be smoothed or not. If present it automatically enables the -smoothNormals flag.
Smooth Normals: generates smooth vertex normals. Only valid for .obj and .geo files; always enabled for .bin
files
Map Channel: stores the UVW coordinates to the specified mapping channel (default is 1). Only valid for .obj
and .geo files. When exporting a mesh that will be used in Maya, currently this must be set to 0 or the textures
on the mesh will not render properly
FPS: a floating-point number that specifies the frames per second at which a .geo or .bin file is exported, so that
vertex velocities can be scaled accordingly. The default is 24.0
Preview Faces: specifies the maximum number of faces in the .vrmesh preview information. Default is 9973
faces.
Faces Per Voxel: specifies the maximum number of faces per voxel in the resulting .vrmesh file. Default is
10000 faces.
Preview Hairs: specifies the maximum number of hairs in the .vrmesh preview information. Default is 500
hairs.
Segments Per Voxel: specifies maximum segments per voxel in the resulting .vrmesh file. Default is 64000
segments.
Hair Width Multiplier: specifies the multiplier to scale hair widths in the resulting .vrmesh file. Default is 1.0.
Preview Particles: specifies the maximum number of particles in the .vrmesh preview information. Default is
20000 particles.
Particles Per Voxel: specifies maximum particles per voxel in the resulting .vrmesh file. Default is 64000
particles.
Particle Width Multiplier: specifies the multiplier to scale particles in the resulting .vrmesh file. Default is
1.0.
Velocity Attr Name: specifies the name of the point attribute which should be used to generate the velocity
channel. By default the v attribute is used.
Disable Color Set Packing: only valid for .geo and .bgeo files; disables the packing of float1 and float2 attributes in vertex color sets.
Material IDs: only valid for .geo files; assigns material IDs based on the primitive groups in the file.
Flip Normals: reverses the face/vertex normals. Only valid for .obj, .geo and .bin files
Flip Vertex Normals: reverses the vertex normals. Only valid for .obj, .geo and .bin files
Flip Face Normals: reverses the face normals. Only valid for .obj, .geo and .bin files
Flip YZ: swap y/z axes. Needed for some programs, e.g. Poser, ZBrush. Valid for .ply, .obj, .geo and .bin files.
Flip Y Positive Z: same as -flipYZ but does not reverse the sign of the z coordinate.
Flip X Positive Z: same as -flipYPosZ but swaps x/z axes.

9.68.2 Plug-in Configuration


You can configure the Ply2Vrmesh plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Ply2Vrmesh plug-in from the list on the left.


Render Executables
Ply2Vrmesh Executable: The path to the ply2vrmesh.exe executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.

9.68.3 FAQ
Which versions of Ply2Vrmesh are supported?
Ply2Vrmesh for VRay 2 and 3 are currently supported.

9.68.4 Error Messages and Meanings


This is a collection of known Ply2Vrmesh error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.69 VRay Vrimg2Exr


9.69.1 Job Submission
You can submit Vrimg2Exr jobs from the Monitor. You can use the Submit menu, or you can use a job's right-click
Scripts menu to automatically populate some fields in the Vrimg2Exr submitter based on the job's output.


Submission Options
The general Deadline options are explained in the Job Submission documentation. The Vrimg2Exr specific options
are:
VRay Image File: The VRay Image file(s) to be converted. If you are submitting a sequence of files, you only
need to select one vrimg file from the sequence.
Output File: Optionally override the output file name (do not specify padding). If left blank, the output name
will be the same as the input name (with the exr extension).
Frame List: The list of frames to convert.
Specify Channel: Enable this option to read the specified channel from the vrimg file and write it as the RGB
channel in the output file.
Long Channel Names: Enable channel names with more than 31 characters. Produced .exr file will NOT be
compatible with OpenEXR 1.x if a long channel name is present.
Set Gamma: Enable this option to apply the specified gamma correction to the RGB colors before writing to
the exr file.
Crop EXR Data Window: Enable this option to auto-crop the EXR data window.
Set Buffer Size: Enable this option to set the maximum allocated buffer size per channel in megabytes. If the
image does not fit into the max buffer size, it is converted in several passes.
Store EXR Data as 16-bit (Half): Enable this option to store the data in the .exr file as 16-bit floating point
numbers instead of 32-bit floating point numbers.
Set Compression: Enable this option to set the compression type. The Zip method is used by default.
Separate Files: Writes each channel into a separate .exr file.
Threads: The number of computation threads. Specify 0 to use the number of processors available.
Multi Part: Writes each channel into a separate OpenEXR2 part.
Convert RGB Data to the sRGB Color Space: Enable this option to convert the RGB data from the vrimg
file to the sRGB color space (instead of linear RGB space) before writing to the exr file.
Delete Input vrimg Files After Conversion: Enable this option to delete the input vrimg file after the conversion has finished.

9.69.2 Plug-in Configuration


You can configure the Vrimg2Exr plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Vrimg2Exr plug-in from the list on the left.


Render Executables
Vrimg2Exr Executable: The path to the vrimg2exr.exe executable file used for rendering. Enter alternative
paths on separate lines.

9.69.3 FAQ
Is Vrimg2Exr supported?
Yes.

9.69.4 Error Messages and Meanings


This is a collection of known Vrimg2Exr error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.70 VRED
9.70.1 Job Submission
You can submit jobs for VRED from the Monitor.


Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
The VRED specific options are:
Version: Which version of VRED to use.
Job Type: What type of job to submit.


View: Which View to render from.


Animation Clip: Which animation to render.
Dimensions: The dimensions of the final rendered image.
Super Sampling Factor: The super sampling factor to use when rendering.
Render Quality: The render quality to render with.
Background Color: The color of the background to use when rendering.
Include Alpha Channel: Whether or not the alpha channel should be included.
Premultiply Alpha: Whether or not the alpha channel should be premultiplied.
Tonemap HDR: Whether or not tonemapping should be applied to .hdr files.
DPI: Dots per inch when rendering a still frame.
Export Render Passes: If the render quality is set to raytracing you can export separate render passes.
Export Meta Data: Whether or not Meta Data should be embedded into the rendered files.

9.70.2 Plug-in Configuration


You can configure the VRED plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the VRED plug-in from the list on the left.

Render Executables
VRED 2015 Executable: The path to the VRED 2015 executable file used for rendering. Enter alternative paths
on separate lines. Different executable paths can be configured for each version installed on your render nodes.


VRED 2016 Executable: The path to the VRED 2016 executable file used for rendering. Enter alternative paths
on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.70.3 FAQ
Is VRED supported by Deadline?
Yes.
Can VRED render non-GUI via Deadline?
Yes. We already pass an additional argument at render time (-nogui) to ensure a non-GUI session of VRED
Pro is run during network rendering. If you would prefer not to use a VRED Pro license for non-GUI
rendering, then please consider using VRED Server Node instead, which has no GUI. See the next FAQ
for more information.
Is VRED via Deadline able to render using VRED Server Node (render node) Licenses?
Yes. In order to render using VRED Server Node (render node) licenses (and avoid consuming expensive
VRED Pro licenses), you should edit the VRED Render Executable path to point to VREDServerNode.exe instead of
(where xx is the YEAR) to be able to use the VREDServerNode.exe executable as it does NOT use
the same license that VREDPro.exe uses. Please note that we believe these Autodesk VRED Render
Node licenses are actually referred to on the ADSK pricing list as Autodesk Raytracing Cluster Module
for Autodesk VRED 20xx if you have trouble finding them via your ADSK reseller.

9.70.4 Error Messages and Meanings


This is a collection of known VRED error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.71 VRED Cluster


9.71.1 Job Submission
You can submit jobs for VRED from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation.
The VRED Cluster specific options are:
Cluster Count: The number of tasks/maximum number of slaves to create in the cluster. Default: 1
Port Number: The port number to be used for the cluster service. Default: 8889. Ensure this port is open in your firewall.
VRED Version: The VRED application version to use.

9.71.2 Plug-in Configuration


You can configure the VRED Cluster plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the VRED Cluster plug-in from the list on the left.

Cluster Executables
VRED 2015 Cluster Executable: The path to the VRED 2015 Cluster executable file used for rendering. Enter
alternative paths on separate lines. Different executable paths can be configured for each version installed on
your render nodes.
VRED 2016 Cluster Executable: The path to the VRED 2016 Cluster executable file used for rendering. Enter
alternative paths on separate lines. Different executable paths can be configured for each version installed on
your render nodes.

9.71.3 FAQ
Is VRED supported by Deadline?
Yes.

9.71.4 Error Messages and Meanings


This is a collection of known VRED error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

9.72 Vue
9.72.1 Job Submission
You can submit jobs from within Vue, or you can submit them from the Monitor.

Submitting from Vue


If you are submitting a single frame from within Vue, select Render -> Render Options, then do the following:
Find the Renderer section, select RenderBull/RenderNode Network, then press the Edit button.
In the Options dialog that pops up, enter the submission command described below.
You can also enter the folder you want the temporary Vue scene file saved in during submission. By default, you
should be able to leave this blank. Press OK when finished.
Press Render to bring up the submission dialog.

If you are submitting an animation from within Vue, select Animation -> Animation Render Options, then do the
following:
Find the Renderer section, select Network Rendering/RenderNode Network, then press the Edit button.
In the Options dialog that pops up, enter the submission command described below.
You can also enter the folder you want the temporary Vue scene file saved in during submission. By default, you
should be able to leave this blank. Press OK when finished.
Press Render Animation to bring up the submission dialog.

This is the submission command to submit a job from within Vue. Make sure this is entered as one line, and make sure
to set the deadlinecommand.exe and repository paths correctly. Note that the last two arguments, 10 and 64bit, are
optional, and are used to automatically populate the Version and Build settings respectively. Check the Vue submission
dialog in the Monitor for the available options for Version and Build.
"[Client Bin Folder]\deadlinecommand.exe" -executescript
[Repository]\scripts\submission\VueSubmission\VueSubmission.py
"[FILE_PATH]" "[SCENE_NAME]" "[NUM_FRAMES]" 10 64bit

Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Vue specific options are:
Vue File: The Vue scene file to be rendered.
Render animation sequence: Whether or not to render the full animation.
Version: The version of Vue to render with.
Build To Force: Force 32 bit or 64 bit rendering.

9.72.2 Plug-in Configuration


You can configure the Vue plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Vue plug-in from the list on the left.

Render Executables
Vue Executable: The path to the Vue executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.

9.72.3 FAQ
Which versions of Vue are supported?
Vue 6 and later are supported (Infinite and xStream editions).
I have Vue render node licenses, but when I render with Deadline, I get the error No serial number found.

If you have render node licenses for Vue, you need to use the *RenderNode.exe executable (i.e. Vue 9
xStream RenderNode.exe) instead of the StandaloneRenderer.eon executable for rendering.

9.72.4 Error Messages and Meanings


This is a collection of known Vue error messages and their meanings, as well as possible solutions. We want to keep
this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Unable to initialize application - Check render log for more information.
Check the render log for the job to see if this additional information is printed out:
STDOUT: Initializing...Error
STDOUT: No serial number found
STDOUT: Unable to initialize application. Exiting.

If this is the case, it means that Vue can't get a license. If you have render node licenses for Vue, you need to use the
*RenderNode.exe executable (i.e. Vue 9 xStream RenderNode.exe) instead of the StandaloneRenderer.eon executable
for rendering.

9.73 xNormal
9.73.1 Job Submission
You can submit xNormal jobs from the Monitor.

Submission Options
The general Deadline options are explained in the Job Submission documentation. The xNormal specific options are:
XML File: The xNormal XML file to render.
Build To Force: Force 32 bit or 64 bit rendering.

9.73.2 Plug-in Configuration


You can configure the xNormal plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the xNormal plug-in from the list on the left.

Render Executables
xNormal Executable: The path to the xNormal executable file used for rendering. Enter alternative paths on
separate lines.

9.73.3 FAQ
Is xNormal supported?
Yes.

9.73.4 Error Messages and Meanings


This is a collection of known xNormal error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.

CHAPTER TEN: EVENT PLUGINS

10.1 Draft
10.1.1 Overview
Draft is a tool that provides simple compositing functionality. It is implemented as a Python library, which exposes
functionality for use in Python scripts. Draft is designed to be tightly integrated with Deadline, but it can also be used
as a standalone tool.
Using Deadline's Draft plugin, artists can automatically perform simple compositing operations on rendered frames
after a render job finishes. They can also convert the frames to a different image format, or generate Quicktimes for dailies.

10.1.2 Submitting Dependent Draft Jobs


When submitting jobs to Deadline through any of our integrated submitters, you now have the option to have Deadline
create a dependent Draft Job once the submitted job is done rendering; this is where the Draft Event Plugin comes into
play.

The options available here are similar to those discussed in the Draft Plugin section. Although it might appear as
though there are fewer options here than in the Monitor submitter, all the same information will get passed to the Draft
template. This approach just allows us to automatically pull a lot of the needed info directly from the scene file and
from information filled in elsewhere in the submitter.

10.1.3 Setup
Since Draft is being shipped alongside Deadline, there is not a whole lot of configuration that is needed for this event
plugin to work (beyond simply enabling it). There are, however, options that allow you to select the priority, group
and pool to which the Draft event plugin will submit Draft jobs.
To access these settings, simply enter Super User mode and select Tools -> Configure Events from the Monitor's menu.
From there, select the Draft entry from the list on the left.

The Draft event plugin settings are:


Enabled: If this event plugin is enabled.
Draft Pool: The Pool to which the Draft jobs will be submitted. If blank, the original job's Pool will be
re-used.
Draft Group: The Group to which the Draft jobs will be submitted. If blank, the original job's Group will be
re-used.
Draft Limit: The Limit that will be applied to the Draft jobs. If blank, no Limit will be used.
Priority Offset: This offset will be added to the original job's priority in order to determine the Draft job's
priority (see the example below).
Draft Output Folder: The folder in which to put the Draft output, relative to the Draft input folder.
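For example, assuming a hypothetical original job submitted with priority 50 and a Priority Offset of -10, the
dependent Draft job would be created with priority 40.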

10.2 FontSync
10.2.1 Overview
The FontSync event plugin can be used to synchronize fonts from a central server to Windows and Mac OS X render
nodes. It can be configured to synchronize the fonts when the Slave application is launched on the render node, or
before each job the Slave renders.
The font folder on the central server must be accessible by the render nodes, and it is recommended to use separate
folders for Windows and Mac OS X fonts.
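Conceptually, the synchronization amounts to copying any new or updated fonts from the network folder to a local
folder before rendering. The rough Python sketch below illustrates that idea only; the folder paths are placeholders
and this is not the plugin's actual implementation:
import os
import shutil

network_folder = r"\\server\fonts\windows"   # hypothetical Network Windows Font Folder
local_folder = r"C:\Windows\Fonts"           # hypothetical Local Windows Font Folder

for name in os.listdir(network_folder):
    src = os.path.join(network_folder, name)
    dst = os.path.join(local_folder, name)
    if not os.path.isfile(src):
        continue  # skip sub-folders
    # Copy fonts that are missing locally or are newer on the server.
    if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
        shutil.copy2(src, dst)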

10.2.2 Setup
Some configuration is needed to use the FontSync event plugin. To access these settings, simply enter Super User
mode and select Tools -> Configure Events from the Monitor's menu. From there, select the FontSync entry from the
list on the left.

General Options
Enabled: If this event plugin is enabled.
Perform Font Synchronization: If the event plugin should synchronize fonts when a slave starts up, or before
each job it renders.
Mac OSX Font Synchronization Options
Network Mac OSX Font Folder: The network Mac OSX font folder used for synchronization.

Local Mac OSX Font Folder: The local Mac OSX font folder to synchronize with the network font folder.
Enter alternative paths on separate lines.
Windows Font Synchronization Options
Network Windows Font Folder: The network Windows font folder used for synchronization.
Use Users Temp Folder as Font Folder: If enabled, the fonts will be copied to a DeadlineFonts folder in the
current user's TEMP folder. Using this option avoids having to create a font folder on each machine, and avoids
permission issues.
Local Windows Font Folder: The local Windows font folder to synchronize with the network font folder. Enter
alternative paths on separate lines. This is ignored if Use Users Temp Folder as Font Folder is enabled.
Timeout For Font Registration (ms): The number of milliseconds the event plugin will wait before timing out
per font when registering fonts.

10.3 ftrack
10.3.1 Overview
ftrack is a cloud-based Project Management tool that provides Production Tracking, Asset Management, and Team
Collaboration tools to digital studios; see the ftrack website for more information.
Using Deadline's ftrack event plugin, artists can automatically create new Asset Versions in ftrack when they submit
a render Job to the farm. When a Job completes, Deadline will automatically update associated Asset Versions with a
proper Status, Thumbnail, and Components (if the output location is known).

10.3.2 Creating Versions


Versions can either be created automatically on submission (using the ftrack Event Plugin), or done manually afterwards.
Automatic Version Creation
When you submit a new job to Deadline, you can have Deadline automatically create a new Asset Version in ftrack.
This is done by connecting to ftrack during the submission process, and selecting the Asset to which the Job should
be tied. The majority of the submission scripts that ship with Deadline include the ftrack connection option. For this
example, we will use Nuke, but the process is basically the same for each submission script.
First, find the tab or panel with the Integration settings. For Nuke, this is under the Integration tab.

Choose ftrack from the Project Management drop down, and then press the Connect button to bring up Deadline's
ftrack browser. Enter your ftrack Login Name and press Connect. If the connection is successful, Deadline will collect
the list of Projects and Tasks you are assigned to. If there are problems connecting, Deadline will try to display the
appropriate error message to help you diagnose the problem.

After you have selected a Task and Asset, you must specify a Version Description.

After you have configured the Version information, press OK to return to the Nuke submitter. The ftrack settings will
now contain the Version information you just specified. To include this information with the job, leave the Create
New Version option enabled. If you want to change the Version name or description before submitting, you can do so
without reconnecting to ftrack.

You can now press OK to submit the job. If the ftrack event plugin is configured to create the new version during
Submission, the log report from the ftrack event plugin will show the Version's ID. Otherwise, the Version won't be
created in ftrack until the job completes.
You can view the log report for the job by right-clicking on the job in the Monitor and selecting View Job Reports.

Manual Version Creation


You can also create an Asset Version and tie it to a Job after submission, from the Deadline Monitor. To do this, simply
right-click on the job and select Scripts -> Create FTrack Version. This will bring up an ftrack browser so that you
can connect, pick the appropriate asset, and set a description. After specifying the required information, just press OK
and the new Version should be created.

Selecting An Existing Version


Some of our submission scripts can edit an existing ftrack version. For example, our Quicktime submitter allows you
to select an existing ftrack Version to upload the movie to when the job completes.

Choose ftrack from the Project Management drop down, and then press the Connect button to bring up Deadline's
ftrack browser. Enter your ftrack Login Name and press Connect. If the connection is successful, Deadline will collect
the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to display the appropriate
error message to help you diagnose the problem.

After you have selected a Task and Asset, press OK to return to the Quicktime submitter. The ftrack settings will now
contain the Version information you just specified. To upload the movie file to the selected Version, leave the Create
New Version option enabled.

You can now press OK to submit the job. When the job finishes, the rendered movie will automatically be uploaded to
the selected Version.

10.3.3 Setup
In order to be able to create versions within Deadline, you must first follow the steps below to set up Deadline's
connection to ftrack.

Create API Key


The first thing you need to do is create an API Key in ftrack. This will be used by Deadline to authenticate when
connecting to the ftrack API.
To create a new API Key, you need to navigate to the API Keys page, located under the Security header of ftrack's
Settings section. Once this page is displayed, press the Create button to create a new key; while you could re-use an
existing key, it is recommended that you create a separate one for Deadline.

The name of the key doesn't matter much (as long as it's descriptive), but make sure Enabled is set to On and that
you select the API role. Once you've filled in all the values, click the Create button to finalize the key's creation.

Once you've created the new entry, take note of its Key value; you will need this when configuring Deadline in the
next step.
Configure Deadline
Once you've created an API Key as detailed above, you can now set up the Event Plugin to connect to ftrack. To
perform this setup, you need to enter Super User Mode (from the Tools menu), and then select Tools -> Configure
Events. Once in the Event Plugin Configuration window, select FTrack from the list on the left.

This is where you will configure all the ftrack-relevant settings in Deadline. There are several different categories of
settings you can configure; they are described in more detail below.
Options
This section contains general high-level options that control the behaviour of Deadline's ftrack integration.
Enabled: This will turn Deadline's ftrack integration on/off. In order for this feature to function properly, this
must be set to True.
Create Version On Submission: This setting controls when an Asset Version is created in ftrack. If this is
True, they will be created when a Job is submitted. On the other hand, if this is False, the Asset Version will
only be created when the Job is Completed.
Connection Settings
This section contains information that Deadline uses to connect to the ftrack API; these settings must be configured
properly in order for this feature to work at all.
FTrack URL: This is the URL you use to connect to your ftrack installation.
FTrack Proxy: The proxy you use to connect to ftrack. This is only relevant if you use a Proxy; if in doubt,
leave this field blank.
FTrack API Key: This is where you must enter the API Key created in the Create API Key step (a quick way to
sanity-check these values outside of Deadline is sketched below).
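If you want to verify the URL and API Key outside of Deadline before saving them here, a minimal sketch using the
legacy ftrack Python API is shown below. The environment variable names and functions are assumptions based on
that legacy API and may differ in your installation, so treat this as illustrative only:
import os

# Placeholders; substitute your own FTrack URL, API Key, and user name.
os.environ["FTRACK_SERVER"] = "https://yourstudio.ftrackapp.com"
os.environ["FTRACK_APIKEY"] = "your-api-key-here"
os.environ["LOGNAME"] = "your.ftrack.username"

import ftrack      # assumed legacy ftrack Python module
ftrack.setup()     # assumed to authenticate using the environment variables above
print("Connected; %d projects visible" % len(ftrack.getProjects()))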
Version Status Mappings

This section contains mappings from Deadline Job Statuses to ftrack Asset Version Statuses. These are not necessary, but if specified, Deadline will update the status of Asset Versions as Deadline Jobs change status (based on the
mappings provided).
Rename ExtraInfo Columns
The ftrack integration uses ExtraInfo columns 0-5 to display relevant information about the Asset Versions that are
tied to Deadline Jobs. Given that ExtraInfo0 isn't exactly a descriptive name for what that column is being used for
in this context, many people find it useful to rename these columns to be more descriptive.
To do so, you must be in Super User mode and select Tools -> Repository Options. You must then go to the Job
Settings section, and select the Extra Properties tab; from here you'll be able to change these column names to
something more appropriate.

10.4 Puppet
10.4.1 Overview
Puppet is a management system that can be used to keep applications and plugins synched across your render nodes.
See the Puppet Labs Website for more information.
The Puppet event plugin that ships with Deadline can be used to run a Puppet update on a slave when it starts and
when it becomes idle, thus allowing you to keep your render nodes in sync without interrupting jobs that are currently
rendering.
Note that Puppet must already be configured to work outside of Deadline. Once your Puppet system is set up, you can
then enable the Puppet event plugin for Deadline to automatically trigger Puppet updates.

10.4.2 Setup
Some configuration is needed to use the Puppet event plugin. To access these settings, simply enter Super User mode
and select Tools -> Configure Events from the Monitor's menu. From there, select the Puppet entry from the list on
the left.

The Puppet event plugin settings are:


Enabled: If this event plugin is enabled.
Puppet Path: The path to the Puppet executable file. Enter alternative paths on separate lines.
Verbose: If enabled, the puppet update will have verbose logging enabled.

10.5 Salt
10.5.1 Overview
Salt (or SaltStack) is a management system that can be used to keep applications and plugins synched across your render
nodes. See the SaltStack Website for more information.
The Salt event plugin that ships with Deadline can be used to run a Salt update on a slave when it starts and when it
becomes idle, thus allowing you to keep your render nodes in sync without interrupting jobs that are currently rendering.
Note that Salt must already be configured to work outside of Deadline. Once your Salt system is set up, you can then
enable the Salt event plugin for Deadline to automatically trigger Salt updates.

10.5.2 Setup
Some configuration is needed to use the Salt event plugin. To access these settings, simply enter Super User mode and
select Tools -> Configure Events from the Monitor's menu. From there, select the Salt entry from the list on the left.

The Salt event plugin settings are:


Enabled: If this event plugin is enabled.
Salt Exe: The path to the Salt Executable. Enter alternative paths on separate lines.
Logging: The level of verbose logging Salt will provide.

10.6 Shotgun
10.6.1 Overview
Shotgun is a customizable web-based Production Tracking system for digital studios, and is developed by Shotgun
Software.
Using Deadline's Shotgun event plug-in, artists can automatically create new Versions for Shots or Tasks in Shotgun
when they submit a render job to the farm. When the job finishes, Deadline can automatically update the Version by
uploading a thumbnail and marking it as complete or pending for review.

10.6.2 Creating Versions


Versions can either be created automatically on submission (using the Shotgun Event Plugin), or done manually
afterwards.
Automatic Version Creation
When you submit a new job to Deadline, you can have Deadline automatically create a new Version in Shotgun. This
is done by connecting to Shotgun prior to submitting the job and choosing the Task that the job is for. The majority of
the submission scripts that ship with Deadline include the Shotgun connection option. For this example, we will use
Nuke, but the process is basically the same for each submission script.
First, find the tab or panel with the Integration settings. For Nuke, this is under the Integration tab.

Choose Shotgun from the Project Management drop down, and then press the Connect button to bring up Deadline's
Shotgun browser. Enter your Shotgun Login Name and press Connect. If the connection is successful, Deadline
will collect the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to display the
appropriate error message to help you diagnose the problem.

After you have selected a Task, you must specify a Version name and a description. If you have configured Version
name templates in the Shotgun event plugin configuration, you can select one from the drop down. You can also
manually type in the version name instead.

After you have configured the Version information, press OK to return to the Nuke submitter. The Shotgun settings
will now contain the Version information you just specified. To include this information with the job, leave the Create
New Version option enabled. If you want to change the Version name or description before submitting, you can do so
without reconnecting to Shotgun.

You can now press OK to submit the job. If the Shotgun event plugin is configured to create the new version during
Submission, the log report from the Shotgun event plugin will show the Version's ID. Otherwise, the Version won't be
created in Shotgun until the job completes.
You can view the log report for the job by right-clicking on the job in the Monitor and selecting View Job Reports.

Manual Version Creation


To manually create a Version from a completed job, right-click on the job in the Deadline Monitor and select Scripts
-> Create Shotgun Version. This will bring up the Shotgun browser so that you can connect, pick the appropriate Task,
and specify a Version name and description. After specifying the appropriate information, press OK to create the new
version.

Selecting An Existing Version


Some of our submission scripts can edit an existing Shotgun version. For example, our Quicktime submitter allows
you to select an existing Shotgun Version to upload the movie to when the job completes.

Choose Shotgun from the Project Management drop down, and then press the Connect button to bring up Deadline's
Shotgun browser. Enter your Shotgun Login Name and press Connect. If the connection is successful, Deadline
will collect the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to display the
appropriate error message to help you diagnose the problem.

After you have selected a Task, you can select a Version for that Task. Then press OK to return to the Quicktime
submitter. The Shotgun settings will now contain the Version information you just specified. To upload the movie file
to the selected Version, leave the Create New Version option enabled.

You can now press OK to submit the job. When the job finishes, the rendered movie will automatically be uploaded to
the selected Version.

10.6.3 Advanced Workflow Mode


When setting up the Shotgun event plugin, you can enable an Advanced Workflow Mode. This mode allows you
to create Versions by selecting a Task, or by selecting a Project and Entity. Studios that don't use the Task-centric
approach will probably find the Advanced Workflow Mode more suitable to their needs.

10.6.4 Setup
Follow these steps to set up Deadline's connection to Shotgun.
Create the API Script in Shotgun
In Shotgun, you must first create a new API script so that Deadline can communicate with Shotgun. This can be done
from the Admin menu.

After the Scripts page is displayed, press the [+] button to create a new script, and enter the following information in
the window that appears. If you can't see one or more of the following fields, use the More Fields drop down to show
them.
Script Name: deadline_integration
Description: Script for Deadline integration
Version: 1.0
Permission Group: API Admin

After you have created the new script, click on the deadline_integration link in the Scripts list and note the value in
the Application Key field (it's a long key consisting of alphanumeric characters). You'll need this key when configuring
Deadline's Shotgun connection in the next step.

Configure the Shotgun Connection


After you have created the Deadline API Script in Shotgun, you can now configure the Shotgun event plug-in from the
Deadline Monitor. Enter Super User Mode from the Tools menu, and then select Tools -> Configure Events.

The event plugin settings are split up into a few sections. The most important sections are the Options and Connection
Settings, as these control how Deadline connects to Shotgun. In most cases, the Field and Value Mapping sections can
be left alone because they map to fields that exist in the default Shotgun installation. Only studios that have deeply
customized their Shotgun installations might have to worry about changing the Field and Value Mapping settings.
Options
Enabled: The Shotgun event plugin must be enabled before Deadline can connect to Shotgun.
Create Version On Submission: If enabled, Deadline will create the Shotgun Version at time of submission
and update its status as the job progresses. Otherwise, the Version will only be created once the job completes.
Enable Advanced Workflow: If enabled, the user can select a Project and Entity instead of just a Task.
Thumbnail Frame: The frame to upload to Shotgun as a thumbnail.
Convert Thumbnails with Draft: Whether or not to attempt to use Draft to convert the Thumbnail frame prior
to upload.
Thumbnail Conversion Format: The format to convert the Thumbnail to prior to upload (see above).
Version Templates: Presets for Version names that users can select from (one per line). Available tokens include
${project}, ${shot}, ${task}, ${user}, and ${jobid} (a worked expansion is shown after this list). For example:
${project} - ${shot} - ${task}
${project}_${shot}_${task} (${jobid})
Enable Verbose Errors: Whether or not detailed (technical) error information should be displayed when errors
occur while connecting to Shotgun.
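As a worked example, with the hypothetical values project "Spaceman", shot "sh010", task "comp", and job ID
"54f2a1b3", the second template above would expand to the Version name "Spaceman_sh010_comp (54f2a1b3)".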
Connection Settings
Shotgun URL: Your Shotgun URL.
Shotgun Proxy: Your proxy (if you use one).
API Script Name: The name of the API script you created in Shotgun earlier (deadline_integration).
API Application Key: The key from the script you created in Shotgun earlier (it's a long key consisting of
alphanumeric characters).
Shotgun Field Mappings
These are the Version fields that Deadline is expecting to exist in Shotgun. The default values match those from a
default Shotgun installation, so you will only have to edit these settings if you have customized the Version Field
names in your Shotgun installation.
Note that some of the Fields you can specify aren't created by default in Shotgun. You will have to manually create
those fields in Shotgun and specify their names here, if you wish to use them. An example of such fields would be
Deadline Job ID, and Average/Total Render Time.
Status Value Mappings
These are the Version status values that Deadline is expecting to exist in Shotgun. The default values match those from
a default Shotgun installation, so you will only have to edit these settings if you have customized the Version Status
values in your Shotgun installation.
Draft Field Mappings
Draft Template Field: The field code for a Task field that contains a Draft Template relevant to the Task. If this
is specified, Deadline can automatically pull in the specified template at submission time.
Test the Shotgun Connection
After you have configured the Shotgun connection, you can test it from the Deadline Monitor by selecting Scripts ->
TestIntegrationConnection. This will bring up the Test Integration Connection dialog.

Choose Shotgun from the Project Management drop down, and then press Connect. If the connection is successful,
Deadline will collect the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to
display the appropriate error message to help you diagnose the problem.
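If you prefer to verify the API script credentials outside of the Monitor, a minimal sketch using the bundled Python
Shotgun API (shotgun_api3) might look like the following; the URL and key are placeholders:
import shotgun_api3

sg = shotgun_api3.Shotgun("https://yourstudio.shotgunstudio.com",
                          script_name="deadline_integration",
                          api_key="your_application_key_here")
print(sg.info())  # returns basic server information if the credentials are accepted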

Set up Shotgun Columns in the Deadline Monitor


Deadline uses the job Extra Info properties 0 to 5 for Shotgun specific settings, and you can configure the columns in
the Job List in the Monitor to properly represent these settings. In the Monitor, enter Super User mode from the Tools
menu, and then select Tools -> Configure Repository Options. Find the Job Settings section and click on the Extra
Properties tab. It will show the following:

Rename the Extra Info properties as shown in the following image. After committing these changes, you will now be
able to see these Shotgun specific columns in the Job List in the Monitor.

10.6.5 FAQ
Which editions of Shotgun does Deadline support?
Deadline supports the Studio and Partner editions of Shotgun, because those editions include the necessary
API access.
Which versions of Shotgun does Deadline support?
Deadline supports Shotgun 2.3 and later.
Which version of the Shotgun API does Deadline use?
Deadline 7.1 ships with version 3.0.17 of the Python Shotgun API.

CHAPTER ELEVEN: CLOUD PLUGINS

11.1 Amazon EC2


11.1.1 Overview
The Amazon EC2 plugin for Deadline allows for communication between Deadline and the EC2 service. It works
with both the Cloud Panel in the Monitor and the Deadline Balancer application.

11.1.2 Configuration
Before you can configure the Amazon EC2 plugin for Deadline, you must add Amazon as a provider in the Cloud
Providers dialog in the Monitor. The Amazon EC2 plugin requires only a few credentials before it can be used in
Deadline. These can be collected from the Amazon EC2 web site (see the image below).

Configuration Settings

General
Enabled: Enables the cloud region for use in Deadline.
Options
Access Key ID: Your EC2 Access key.
Secret Access Key: Your EC2 secret key.
Region: The EC2 region you want to use.
Account Number: Your EC2 account number. Used to filter the image list.
VM Configuration
Key Pair Name: The Key Pair to be used for the instance.
Subnet ID: ID of the Subnet to start instances in.
Instance Types: List of the Hardware Types used on EC2. Make sure you use types that are supported by
Amazon. You can find a list of them here.
User Data: Any data you want to pass to an instance. It can be used to configure instances as part of startup
scripts. Running curl http://169.254.169.254/latest/user-data from within the instance will return everything you
put in here (see the sketch after this list).
Security Group: Start instances in this security group.
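As an illustration of the same query the curl command above performs, a minimal Python 2.7 sketch (run from inside
the instance, not part of Deadline) might look like this:
import urllib2

# The metadata endpoint below is the standard EC2 one; the timeout value is arbitrary.
user_data = urllib2.urlopen("http://169.254.169.254/latest/user-data", timeout=5).read()
print(user_data)  # prints whatever was entered in the User Data setting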
Customization
Instance Name: The name of the instances that are started by the Balancer. We add some random hex values to
the end for uniqueness.

11.1.3 FAQ
Is Amazon EC2 Cloud supported by Deadline?
Yes.

11.1.4 Error Messages and Meanings


This is a collection of known Amazon EC2 Cloud error messages and their meanings, as well as possible solutions.
We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please
email Deadline Support and let us know.
Currently, no error messages have been reported for this cloud plug-in.

11.2 Google Cloud


11.2.1 Overview
The Google plugin for Deadline allows for communication between Deadline and the Google Cloud service. It works
with both the Cloud Panel in the Monitor and the Deadline Balancer application.

11.2.2 Configuration
Before you can configure the Google Cloud plugin for Deadline, you must add Google Cloud as a provider in the
Cloud Providers dialog in the Monitor. The Google plugin requires only a few credentials before it can be used in
Deadline (see the image below). You can download the Client Secrets file from the API->Credentials section of the
Google Compute console.

11.2.3 Credentials
Here is a guide for how to get your client secrets file from the Google Cloud Console and verify your access to your
GCE project in Deadline.
Step 1: Get your project ID
This is used in Deadline to verify access to this project.

Step 2: Create your credentials


Click on Credentials under APIs & auth. Here we'll create a new Client ID.

Step 3: Creating a new Client ID


The first time you create a Client ID, GCE will ask you to configure your consent screen. Select Installed Application
and then click Configure consent screen. If you've already configured a consent screen, you should see a dialog
similar to Step 5.

Step 4: Consent Screen


Setting up a basic consent screen is pretty simple. All you need is an email address and a product name. Everything
else is optional.

Step 5: Creating a new Client ID (continued)


After setting up our Consent screen we'll be taken back to the Create Client ID screen. Make sure Installed Application
and Other are both selected. Then click Create Client ID.

Step 6: Downloading your Client Secrets File


To download your Client Secrets file, click on Download JSON. Ensure you click on Download JSON for your
Client ID for native application. Do NOT download the JSON for your Compute Engine and App Engine entry, as that
JSON file will NOT work.

11.2.4 Verifying Access


In order to use the Google Compute Engine, we need to authenticate with OAuth 2. What we're going to need is the
Project ID (found in Step 1), the path to the client secrets file you downloaded above, and the path to your oauth.dat file
(this file won't exist until we successfully connect to GCE). Fill in these fields in the Cloud Providers dialog and click
Verify Access. (Note: This is not the only way to get the oauth.dat file. The first time Deadline tries to connect to
GCE with either no oauth file or an expired oauth file, it will go through this process.)

A web browser will open, asking you to sign into your Google Account.

Next, you'll see the consent screen you created in Step 4. Here's what a basic consent screen looks like.

After choosing Accept, you should see this message:

Now your oauth.dat file should be downloaded and you should be all set to use your GCE project with Deadline.
Notes:
This verify access process is time sensitive. It will time out if you wait too long. The timeout is not very long,
only lasting about 30 seconds.
One issue that has come up is blocked ports. If you have a process running that blocks port 8080, this will not
work. You'll have to change the port number. (Programs such as Skype might do this.)
oauth.dat files do eventually expire. You will need to repeat this process to get a new one.

11.2.5 Configuration Settings

General
Enabled: Enables the cloud region for use in Deadline.
Credentials
Client Secrets *.json File: The path to your Client Secrets file. You can download it by going to your project,
clicking on Credentials (under the APIs & auth heading) and clicking download JSON. Note: Deadline requires
a Client ID for native applications.
OAuth 2.0 *.dat File: The path to your OAuth2 dat file. This file won't exist until you've Verified Access at
least once (or tried to use the plugin for the first time). The API will download it after you grant it access.
Project ID: The non-human readable ID of your Google Cloud Project. Found on the Overview tab of the
Cloud Console. See Step 1.
Options
Region: The GCE region you are using.
Network: The GCE network to spawn instances in.

Disk Size: The size of the Persistent Disk to start with your instance. The default is 10GB.
Port Number: The port number used for authentication.
Show Images in Cloud Panel: Show the image name in the Cloud Panel. Enabling this may cause performance
issues.
Customization
Instance Name: The name of the Instances that will be spawned. We add some random hex values on the end
to make them unique.
Tags: Firewall rule tags to apply to the instances that Deadline starts. Each tag should be on a new line.

11.2.6 FAQ
Does the Google Compute Engine API need to be enabled?
Yes, ensure that the Google Compute Engine API is enabled for the project. In the Google Developer
Console, click the project name. From the left-side menu, choose the APIs link under the APIs and auth
heading. If Google Compute Engine is not shown in the list of Enabled APIs at the top of the page, do
the following: Scroll down and find Google Compute Engine and enable it. If you have not previously
enabled billing for the account, you will be prompted to do so. After enabling billing, you will again need
to enable the Google Compute Engine API. It should now appear under the list of Enabled APIs at the
top of the page. Once Google Compute Engine appears in the list of Enabled APIs at the top of the page,
re-generate and re-save the Client Secret JSON file.

11.2.7 Error Messages and Meanings


This is a collection of known Google Cloud error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email
Deadline Support and let us know.
Error: No JSON object could be decoded
There is an issue with the JSON file you downloaded. Here's a checklist of possible issues that should be
considered:
The JSON file could be corrupt, poorly formatted, or may not have downloaded cleanly. Try pasting it into
an online JSON checker to verify it is valid, or try re-downloading the file (a quick local check is also sketched after this list).
Double-check you selected the *.json file for the Client Secrets *.json File and you selected the
oauth.dat file for the OAuth 2.0 *.dat File in the Configure Cloud Providers... dialog.
Double-check you clicked on Download JSON for your Client ID for native application. Do
NOT download the JSON for your Compute Engine and App Engine entry, as that JSON file will NOT work.
It's possible that the JSON credentials file was made (downloaded) before the Google GCE APIs
were enabled or had time to update/refresh after being enabled in the Google Console. So, ensure
the GCE Compute API is enabled and try waiting 5 minutes before re-downloading the JSON file,
perhaps trying a different internet browser in case of any intermediate download manager issue.
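If you would rather not paste credentials into an online checker, a quick local check using Python's json module
works as well; the file path below is a placeholder:
import json

with open(r"C:\Deadline\client_secrets.json") as f:
    json.load(f)  # raises a ValueError if the file is not valid JSON
print("client secrets file parsed successfully")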

11.3 Microsoft Azure


11.3.1 Overview
The Azure plugin for Deadline allows for communication between Deadline and the Azure service. It works with both
the Cloud Panel in the Monitor and the Deadline Balancer application.

11.3.2 Configuration
Before you can configure the Azure plugin for Deadline, you must add Azure as a provider in the Cloud Providers
dialog in the Monitor. The Azure plugin requires only a few credentials before it can be used in Deadline (see the
image below). You'll also have to create and upload a Management Certificate.

Configuration Settings

General
Enabled: Enables the cloud region for use in Deadline.
Credentials
Subscription ID: Your access ID for your Azure account.
Certificate Path: Path to your Azure Certificate.
VHD Blob Storage: The URL of your Blob Storage.
Blob Storage Password: Password for Blob Storage if you have one.
VM Config
Affinity Group: The Affinity Group to start instances in. Can be used instead of Location.
Location: The Location to start instances in. Can be used instead of Affinity Group.
Virtual Network: The virtual network that the instance will be a part of.
Subnet Name: Name of the subnet that the instance will be in.

VM Login User: The user name used to log in to the instance.
VM Login Password: The password used to log in to the instance.
Customization
Instance Name: Name used when starting new instances. We add some random hex values to the end for
uniqueness.

11.3.3 FAQ
Is Azure Cloud supported by Deadline?
Yes.

11.3.4 Error Messages and Meanings


This is a collection of known Azure Cloud error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this cloud plug-in.

11.4 OpenStack
11.4.1 Overview
The OpenStack plugin for Deadline allows for communication between Deadline and an OpenStack server. It works
with both the Cloud Panel in the Monitor and the Deadline Balancer application.

11.4.2 Configuration
Before you can configure the OpenStack plugin for Deadline, you must add OpenStack as a provider in the Cloud
Providers dialog in the Monitor. The OpenStack plugin requires only a few credentials before it can be used in
Deadline (see the image below).

Configuration Settings

General
Enabled: Enables the cloud region for use in Deadline.
Options
User Name: Your OpenStack user name.
Password: The password for your OpenStack account.
Keystone Endpoint: The endpoint of the OpenStack server. This is listed as Identity in the Access & Security
section of the OpenStack project.
Tenant Name: The Tenant name (aka Project Name).
Keypair Name: The key pair name that instances will be started with.
Security Group: Start instances in this security group.
Customization
Instance Name: The name of newly created instances. We add some random characters on the end for uniqueness.
11.4.3 FAQ
Is OpenStack Cloud supported by Deadline?
Yes.

11.4.4 Error Messages and Meanings


This is a collection of known OpenStack Cloud error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this cloud plug-in.

11.5 vCenter
The vCenter plugin for Deadline allows for communication between Deadline and a vCenter server. It only works
with the Cloud Panel in the Monitor. It does not work with the Deadline Balancer application.

11.5.1 Configuration
Before you can configure the vCenter plugin for Deadline, you must add vCenter as a provider in the Cloud Providers
dialog in the Monitor. The vCenter plugin requires only a few credentials before it can be used in Deadline.

Configuration Settings

General
Enabled: Enables the cloud region for use in Deadline.
Options
vCenter Server: The name of the vCenter Server you want to connect to.
User Name: Username for vCenter.
Password: Password for vCenter.
Customization
Instance Name: Name used when starting new instances. We add some random hex values to the end for
uniqueness.

11.5.2 FAQ
Is VMware vCenter supported by Deadline?

Yes, but not with Balancer. Only basic manual cloud instance starting/stopping/terminating is supported
via the cloud plugin architecture.

11.5.3 Error Messages and Meanings


This is a collection of known VMware vCenter error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this cloud plug-in.

CHAPTER TWELVE: RELEASE NOTES

12.1 Deadline 7.0.0.54 Release Notes


12.1.1 Overview
Deadline 7 is the latest version of Thinkbox Software's scalable high-volume compute management solution. It features built-in VMX (Virtual Machine Extension) capabilities, which allow artists, architects and engineers to harness
resources in both public and private clouds.
In addition to enhanced cloud support, Deadline 7 expands support for the Jigsaw multi-region rendering feature,
which can now be accessed in 3ds Max, Maya, modo, and Rhino. Deadline 7 also introduces Draft 1.2, an update to
Thinkboxs lightweight compositing and video processing plug-in designed to automate typical post-render tasks such
as image format conversion as well as the creation of animated videos and QuickTimes, contact sheets, and watermark
elements on exported images. Finally, Deadline 7 introduces a wealth of new features, enhancements, and bug fixes,
which are detailed below.
Note that a new 7.0 license is required to run this version. If you have a license for Deadline 6.2 or earlier, you will
need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you
have a license for Draft 1.1 or earlier, you will need an updated license.

12.1.2 Highlighted Features


VMX (Virtual Machine eXtension)
With built-in VMX (Virtual Machine eXtension) and pluggable cloud support, Deadline 7 can interact with private
and public cloud solutions out-of-the-box, including Amazon EC2, Microsoft Azure and OpenStack, among others.
The new Deadline Balancer application can start and shut down virtual instances on demand based on the jobs in the
queue, the current budget settings, or other custom algorithms. Multiple cloud solutions can be used simultaneously,
along with classic non-cloud render node and workstation rendering.
Updated to MongoDB 2.6.3
Deadline now ships with MongoDB 2.6.3, with version 2.6.1 being the new minimum requirement for Deadline 7.
Deadline utilizes MongoDB's new timestamp feature to significantly reduce the number of write queries performed
during normal operation. Not only does this improve performance under heavier loads, but it also allows Deadline to
support MongoDB's Sharding feature. Sharding can be used to create a cluster of MongoDB instances, allowing the
database server to scale horizontally by adding more nodes to the cluster.
Deadline's Replica Set support has been improved as well. Previously, you had to specify each node in your Replica
Set when specifying the MongoDB server name. Now, you can also include the Replica Set name.

Updated User Interface


Deadline's User Interface libraries have been updated to Qt 5, and the Deadline applications now use Qt's new Fusion
theme for a more modern look and feel. The Fusion theme provides better scaling at larger resolutions, and it also
provides more color contrast.
The Monitor also uses new progress bars to show the progress for jobs. The progress bars show the state of every task
for the job, not just the complete versus incomplete tasks. This allows you to see the overall state of all the tasks at a
glance.
Finally, updating to Qt 5 also addresses issues that Qt 4 had with Wacom tablets.
Python Upgraded to 2.7.8
Deadline now ships with Python 2.7.8. Note that this shouldn't affect any existing scripts that you use with Deadline.
In addition, the Deadline applications no longer set the PYTHONHOME and PYTHONPATH environment variables
for their current session. This means that any applications launched from a Deadline application will no longer inherit
these modified variables, which should avoid compatibility issues if those other applications use a different version of
Python.
Draft Upgraded to 1.2.3.57201
Deadline now ships with Draft 1.2.3.57201. Note that this shouldn't affect any existing Draft template scripts that you
use with Deadline. Also note that if you are using Draft 1.1 or earlier, you will need an updated Draft license. Below
is a list of what's new in Draft 1.2.3.57201:
Python Version
The Python version that Draft requires is now Python 2.7.
FFmpeg Version
FFmpeg libraries have been updated to version 2.3.
OpenColorIO Improvements
Use config.ocio and ColorSpaces / Roles to create OCIO color processors for color correcting images.
Create OCIO color processors directly from your favourite LUT files... see http://opencolorio.org/FAQ.html for
the full list of LUT formats supported.
ASC CDL Improvements
A fully standard-compliant implementation of ASC CDL LUTs. (The clamping steps in OCIO's ASC CDL
implementation are not currently standard-compliant.)
Added ASC CDL and OCIO lut example templates.
WebM Improvements
Added support for WebM files (vp8 video codec, vorbis audio).
EXR Improvements
Improved error message when trying to open an exr file that doesn't exist.
Unicode Improvements
Draft now supports unicode filenames and text annotations!

Note: We need to modify the DraftParamParser.py library so that unicode strings aren't mangled at the Deadline/Draft boundary, but once they're in, Draft handles them properly.
Licensing Improvements
Draft Licences are now more flexible! Most Draft features require only that a license be present. Actual checkout
of licenses now happens only while videos are being encoded or decoded.
A lost connection to the license server no longer pops up dialog boxes on Windows.
Mono Upgraded to 3.8
Deadline now runs against Mono 3.8 on Linux and Mac OSX, which helps improve stability. In addition, the Mac
OSX version of Mono is now 64-bit. This new version is bundled with the Linux and Mac OSX Client and Repository
installers.
Mono Included in Linux Installers
Mono is now installed automatically as part of the installation procedure on Linux. It is installed to the Deadline
installation folder, and won't impact any existing Mono installations. Now Mono no longer needs to be installed
manually on Linux prior to installing Deadline.
Updated Slave Licensing Model
When running multiple slaves on a single machine, they will now share a single license instead of needing one license
per slave instance. In addition, the slaves will only hold onto their license while they are rendering. When they become
idle, they will return their license.
Customizable Styles for Deadline Applications
The new Styles configuration panel in the Monitor options allows you to customize the color of the Deadline applications. Simply specify a palette color and the User Interface will automatically use lighter and darker variants of that
color where necessary. In addition, the font style and size can be configured as well. Finally, you can export styles and
share them with other users.
New Batch Property for Grouping Jobs
A new Batch property has been added to jobs that allows jobs to be grouped together in the Job List. All jobs with the
same Batch name will be grouped under that Batch name, and the Batch name can be expanded or collapsed to show
and hide all the jobs, respectively. Jobs in the same Batch will also be grouped together in the Job Dependency View.
Finally, the properties for the jobs in the same Batch can be modified by simply right-clicking on the Batch item in the
Job List or the Job Dependency View.
New Graphs in the Monitor
New graphs have been added to the Monitor. The Jobs panel can show pie charts based on the job pool, secondary
pool, group, user, and plugin. The Tasks panel can show graphs representing the task render times, image sizes, cpu
usage, and memory usage. The Slaves panel can now show bar charts that show how many slaves are in certain pools
and groups. The Job Reports panel can now show a pie chart that shows the percentage of errors generated by each
slave.


Customizable Default Layout for Panels in Monitor


A default layout for panels in the Monitor can now be saved, and when a new panel is opened, it will use the saved
default layout. So now you can set up your favourite default layouts for the Job list, Task list, etc., and not have to worry
about setting them up again when you open new panels.
In addition, you can now save the layout from a panel to disk and load it in again. This allows you to share a layout
from your Monitor with someone else.
Job Dependency Improvements
Job dependencies are now more flexible than ever. Individual dependencies can have notes attached to them, and they
can also have their own overrides for the Frame Offset and Resume On... settings.
The Job Dependency view in the Monitor has also been updated to show these per-dependency settings. In addition,
there is now a new feature in the Dependency View that allows you to test the dependencies and see which ones pass
and which ones do not. Finally, the look of the nodes in the Dependency View has been updated.
Limit Improvements
Limits are now much more flexible than they were before. Previously, one Limit Stub per Slave was used up when a
Slave rendered a job that required that Limit. This is still supported, but now, a Limit can be configured so that one
Limit Stub per Task is used up, or one Limit Stub per Machine is used up.
The per Task option is useful if you are rendering with an application that requires one license per instance, and you
are rendering more than one concurrent task at a time. The per Machine option is useful if you are rendering with
an application that only requires a single license per machine, regardless of how many instances are running on that
machine.
Improvements to Pool and Group Management
The Slave list in the Pool and Group Management dialogs can now be filtered, and all columns in the list are now
available. In addition, you can now right-click on specific slaves in the Slave list in the Monitor to modify Pools and
Groups for the selected slaves only.
Suspend Tasks
Deadline now supports the ability to suspend and resume individual tasks. This can be useful if you want to postpone
or skip the rendering of specific tasks.
Slave Scheduling Improvements and Idle Detection
Deadline's Slave Scheduling feature has undergone a major overhaul. Previously it was part of Power Management
and controlled by Pulse, but now it is a standalone feature that is controlled by the Launcher application that runs on
every Client machine. This means that Pulse is no longer required to use the Slave Scheduling feature.
There are also new features that have been added to Slave Scheduling. If a slave is scheduled to start on a machine, a
notification message will now pop up for 30 seconds indicating that the slave is scheduled to start. If someone is still
using the machine, they can choose to delay the start of the slave for a certain amount of time. Another addition is
the new option to enforce the slave schedule. If enabled, the Launcher will keep restarting the slave if it is shut down
during a period of time that it is supposed to be running.


Finally, Slave Scheduling can now be configured to launch the slave if the machine has been idle for a certain amount
of time (idle means no keyboard or mouse input). There are also additional criteria that can be checked before
launching the slave, including the machine's current memory and CPU usage, the currently logged-in user, and the
processes currently running on the machine. Finally, this system can stop the slave automatically when the machine is
no longer idle.
Note that Idle Detection can be set in the Slave Scheduling settings, or on a per-slave basis in the Slave Settings dialog
in the Monitor. It can also be set in the new Local Slave Control dialog so that users can configure if their local slave
should launch when the machine becomes idle.
Job Dequeueing Mode
Slaves now have a new Job Dequeuing mode that controls which jobs a slave dequeues based on how the job was
submitted. By default, a slave will dequeue any job, but it can be configured to only dequeue jobs submitted from the
same machine that the slave is running on, or submitted by specific users.
The Job Dequeuing Mode can be configured in the Slave Settings dialog in the Monitor. It can also be set in the new
Local Slave Control dialog so that users can configure if their local slave should only render their own jobs, or if they
want to help another user render their jobs.
Local Slave Controls
The Monitor and Launcher applications now have a new dialog that can be used to control the slave running on the
local machine. It can be used to start and stop the slave, or connect to the slave's log. This is useful if the slave is
running as a service on the machine.
In addition, you can set up the slave to launch if the machine has been idle for a certain amount of time (idle means
no keyboard or mouse input). It can also stop the slave automatically when the machine is no longer idle.
Finally, the slave's Job Dequeuing Mode can be configured here. By default, a slave will dequeue any job, but it can
be configured to only dequeue jobs submitted from the same machine, or submitted by specific users. This is useful if
a user wants their slave to only render their jobs, or they want to help another user render their jobs.
Note that the Idle Detection and Job Dequeuing Mode settings can also be changed by administrators for all slaves.
In addition, the Local Slave Controls feature can be disabled by administrators if they don't want users to be able to
control their local slaves.
Render As User
A new option has been added to Deadline to render jobs with the account that is associated with the job's user. The
account information can be configured in the Deadline user settings. On Windows, the user's login name, domain, and
password are required. On Linux and Mac OSX, just the user's login name is required, but the Slave must run as root
so that the Slave has permission to launch the rendering process as another user.
Improved Slave Statistics
Additional statistical information is now gathered for individual slaves, including the slave's running time, rendering
time, and idle time. It also includes information about the number of tasks the slave has completed, the number of
errors it has reported, and its average Memory and CPU usage. Like job statistics, Pulse does not need to be running
to gather this information.


Pulse Redundancy
You can now run multiple instances of Pulse on separate machines as backups in case your Primary Pulse instance goes
down. If the Primary Pulse goes offline or becomes stalled, Deadline's Repository Repair operation can elect another
running instance of Pulse as the Primary, and the Slaves will automatically connect to the new Primary instance.
Note that when multiple Pulse instances are running, only the Primary Pulse is used by the Slaves for Throttling.
In addition, only the Primary Pulse is used to perform Housecleaning, Power Management, and Statistics Gathering.
However, you can connect to any Pulse instance to use the Web Service.
New Events and Asynchronous Job Events
New events have been added to the Event Plugin system. The first is the OnHouseCleaning event, which triggers
whenever Deadline performs Housecleaning. This allows you to set up event plugins to do custom cron-job style
operations within Deadline.
In addition, there are four new events that trigger when a slave changes state: OnSlaveStarted, OnSlaveStopped,
OnSlaveRendering, and OnSlaveStartingJob. As an example, an event plugin could be written to have slaves automatically add themselves to Groups when they start up based on some custom criteria, or an event plugin could be written
to have slaves perform maintenance checks when they become idle.
Finally, there is now an option to process many types of job events asynchronously. The benefit is that job events will
no longer slow down batch operations in the Monitor (for example, deleting 1000 jobs will be much faster if you are
using event plugins because those events will be processed later). These job events are queued up in the Database
and Deadline's Pending Job Scan will process them at regular intervals. Because they are placed in a queue, they will
still be processed in the same order that they were triggered. Note that if this option is enabled, some events are still
processed synchronously, like the OnJobSubmitted and OnJobStarted events.
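As a sketch of how the new slave events could be used, the event plugin below logs a message whenever a slave starts or stops. It follows the usual event plugin layout (a DeadlineEventListener subclass returned by GetDeadlineEventListener), but the callback names and the callback argument (assumed here to be the slave name) should be treated as assumptions and checked against the Event Plugin documentation.

    from Deadline.Events import *

    def GetDeadlineEventListener():
        return SlaveStateListener()

    def CleanupDeadlineEventListener( eventListener ):
        eventListener.Cleanup()

    class SlaveStateListener( DeadlineEventListener ):
        def __init__( self ):
            # Hook two of the new slave-state events.
            self.OnSlaveStartedCallback += self.OnSlaveStarted
            self.OnSlaveStoppedCallback += self.OnSlaveStopped

        def Cleanup( self ):
            del self.OnSlaveStartedCallback
            del self.OnSlaveStoppedCallback

        def OnSlaveStarted( self, slaveName ):
            # A group assignment or maintenance check could be kicked off here.
            self.LogInfo( "Slave started: " + slaveName )

        def OnSlaveStopped( self, slaveName ):
            self.LogInfo( "Slave stopped: " + slaveName )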
Auto Configuration Overhaul
The Auto Configuration feature has undergone a couple of significant changes. The first is that all Deadline applications can now pull the Auto Configuration settings, instead of just the Slave. This means that Auto Configuration can
now be used to automatically configure workstations, not just render nodes.
The second change is with how Auto Configuration works. Previously, all Auto Configuration settings were pulled
from Pulse. Now, only the Repository Path is pulled from Pulse, and the other settings are pulled when the Deadline
application connects to the Repository. The benefit to this is that most of the Auto Configuration settings will work
without Pulse running.
Finally, Auto Configuration rule sets can now be enabled or disabled, so you no longer have to delete a rule set if you
want to remove it temporarily.
Region Awareness
Regions can now be configured in Deadline, and users and slaves can be assigned to a specific region. Currently, this
is useful for Path Mapping, and allows you to map paths differently based on the region that the users or slaves are in.
Note that when VMX launches a slave, it will automatically be added to the region associated with the cloud provider
settings.
Grid-Based Script Dialogs
New grid-based functions have been added to the DeadlineScriptDialog class, which make it easier to create custom
dialogs. Instead of setting the width and height when adding new controls to a row, you can instead add them to a grid
and indicate which row and column the control should go in. Optionally, you can also indicate how many rows and

columns the control should occupy. By being part of a grid, the controls will now grow and shrink dynamically based
on the size of the dialog and the size of the font.
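A minimal sketch of the grid-based layout is shown below. The calls (AddGrid, AddControlToGrid, EndGrid, ShowDialog) and the import path mirror the pattern used by the shipped Monitor scripts, but treat the exact names and argument order as assumptions and compare against one of the scripts in your Repository.

    from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog

    def __main__( *args ):
        scriptDialog = DeadlineScriptDialog()
        scriptDialog.SetTitle( "Grid Layout Example" )

        scriptDialog.AddGrid()
        # The two trailing numbers are the grid row and column for each control.
        scriptDialog.AddControlToGrid( "NameLabel", "LabelControl", "Job Name", 0, 0 )
        scriptDialog.AddControlToGrid( "NameBox", "TextControl", "Untitled", 0, 1 )
        scriptDialog.AddControlToGrid( "FramesLabel", "LabelControl", "Frame List", 1, 0 )
        scriptDialog.AddControlToGrid( "FramesBox", "TextControl", "1-100", 1, 1 )
        scriptDialog.EndGrid()

        scriptDialog.ShowDialog( True )

Because the controls live in a grid rather than fixed-size rows, they grow and shrink with the dialog and the configured font.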
FTrack Integration
The Deadline/FTrack integration enables a seamless render and review data flow. When Deadline starts a render, an
Asset Version is automatically created within FTrack using key metadata. When the render is complete, Deadline
automatically updates the created Version appropriately: a thumbnail image is uploaded, components are created
from the Job's output paths (taking advantage of FTrack's location plugins), and the Version is flagged for Review. In
doing so, Deadline provides a seamless transition from Job Submission to Review process, without artists needing to
monitor their renders.
Jigsaw for Maya, modo, and Rhino
Jigsaw, which was previously only available for 3ds Max, is now available for Maya, modo, and Rhino. It gives you
more control over the tiles and/or regions that you are submitting to Deadline. This feature uses Thinkbox Software's
Draft library to assemble the final image instead of the old TileAssembler.exe application. Note that Draft requires a
license, so contact Thinkbox Sales if you don't already have a Draft license.
Submission Script Installers
Submission script installers can now be found in each application folder in the Submission folder in the Repository.
These allow for most of the submission scripts to be installed automatically, instead of having to manually copy over
files.
Support for Salt and Puppet
Application and Event plugins have been added to support the Salt and Puppet automation applications. Jobs can be
submitted to the application plugin to update software and machine configurations on specific machines, while the
event plugins can be used to update all of your machines when the slave running on them becomes idle.
Updated Application Support
Support has been added for After Effects CC 2014, Arnold for Houdini, Cinema 4D 16, Corona, Fusion 7, Nuke 9,
Realflow 2014, and SketchUp 2015.

12.1.3 Complete Release Notes


General Improvements
Added the new VMX (Virtual Machine eXtension) system to Deadline.
Upgraded Python to version 2.7.8.
Added FTrack support, and updated many job submission scripts to connect to FTrack.
Added new event to Event Plugins that triggers every time Housecleaning is performed. This is useful for
performing custom cron-job style operations within Deadline.
Added new events to Event Plugins that trigger when a slave starts, stops, starts rendering, and becomes idle.


Added option to process many of the job events asynchronously to improve performance (particularly in the
Monitor).
Added application and event plugins for Puppet and Salt automation applications.
Users, slaves, and pulse can now be added to regions, which affects how Path Mapping is performed for them
(regions can be configured in the Repository Options).
Path mapping can now be associated with regions so that different path mappings can be set for different regions.
There is now an option in slave scheduling to keep the slave running during scheduled hours.
Housecleaning and the Pending Job scan are now performed on a more regular basis by the Slaves when Pulse
isn't running.
During the Pending Job Scan, the task dependency check now handles a missing __main__ function in the
dependency script properly.
Fixed a typo where the Pending Job Scan would refer to itself as Housecleaning.
Fixed an encoding issue when saving and loading job and slave reports.
Added new slave statistics gathering that logs more information about individual slaves.
Added new vCenter Cloud plugin.
Limits can now be configured with different usage levels. They can be per task, per slave, or per machine.
Previously, they could only be per slave.
Bumped up the maximum thread/cpu setting limit in the submission scripts.
The Deadline temp folder on the Client machines now gets cleaned up on a regular basis.
Split out the critical Housecleaning operations into a new Repository Repair operation (orphaned task and limit
stub checking, stalled slave checking, and available DB connection checking).
The randomness of the housecleaning checks has been removed to make the system more reliable and predictable.
Fixed some cases where timestamps were still using 12 hour clocks.
Added IP address/hostnames to the power management logging.
Fixed a bug that prevented Deadline from shutting down an OSX machine.
Most integrated submitter client scripts now print out where they're getting the main script file from prior to
running the main script.
Housecleaning can now detect if a task is waiting to start, but the slave hasn't updated its state to show that it's
rendering that task.
Fixed a bug that prevented the timeout from triggering when running the housecleaning operations as separate
processes.
Added an option for splitting the output from the different housecleaning operations to separate logs.
Fixed how the timestamps look when connecting to a remote slave/pulse/balancer log.
Job event triggers now fire properly when changing states of individual tasks.
Improved performance when checking pending jobs with frame dependencies.
The Deadline applications no longer set the PYTHONHOME and PYTHONPATH environment variables for
their current session.
The error message that is displayed when auto-archiving a job fails now shows the job ID instead of the job
name.


Housecleaning only loads event plugins once when deleting or archiving completed jobs.
When purging jobs in housecleaning, the event plugins are only loaded once per batch.
Added a Machine Startup option in Power Management to not send the command to the machine to launch the
slave.
Added user group permission option to disable job submission (enabled by default).
Added stalled Pulse and Balancer detection to housecleaning.
Removed a misleading message that was printed when getting the user from deadline.ini and one wasn't defined
yet.
Housecleaning, pending job scan, and repository repair are no longer run as a separate process by default.
Installer Improvements
Mono is now shipped with the Linux installers, so it is no longer required for Mono to be installed prior to
installing Deadline.
Added the major version number to the shortcuts created on Windows, and to the uninstaller shortcuts created
on all operating systems.
Added command line option to Client installer to set the NoGuiMode setting.
When setting up the database, the Repository installer now checks to make sure the database version is the
minimum supported version.
The Repository installer now checks to make sure it's not installing over an existing repository that's a different
version.
The Repository installer now sets the default database name to include the major Deadline version number.
The Repository installer now creates a repository.ini file in the repository install directory which contains the
Version information.
The Windows Repository installer now ships with both the standard and legacy versions of MongoDB. The
standard version will be installed on Windows Server 2008 R2 and later, and the legacy version will be installed
on older versions of Windows.
The Repository uninstaller now removes all subfolders except for the custom one.
Fixed a bug in the Client Installer that was causing the license server entry to be reset if the repository directory
was invalid.
The MongoDB service name and port can now be customized in the Repository installer, and its default is based
on the current Deadline version.
The Windows client installer now creates a DeadlineLauncher# registry key to start the Launcher on login (where
# is the major version number). This allows different versions of the Launcher to start on login.
Fixed a bug in the Repository installer that was causing "Password:" to be set for the user name in dbConnect.xml on OSX.
Fixed some errors when running the Repository installer in unattended mode.
Installers on OSX are now signed with codesign v2 so that Gatekeeper doesn't flag them on OSX 10.9.5.
The replica set name and mongo password fields in the Repository installer are now wider.
The Mono.Posix and Mono.Security dlls are no longer installed with the Linux version of Deadline.
The api, balancer, cloud, and draft folders in the repository are now backed up during an upgrade.


Windows installers are now code-signed.


The settings folder is now backed up by the Repository Installer.
The slavedatadir command line option for the client installer is now visible in the usage instructions.
Repository Improvements
Archived jobs are now stored in subfolders based on the year and month they are submitted.
Job reports are now stored in a subfolder with the job's ID, which improves performance when deleting reports
for a job.
The License Server is no longer installed in the Repository. It can be downloaded from the Thinkbox website.
There are now submission script installers in each application folder in the Submission folder in the Repository.
Lock files are no longer used in the Repository to ensure that operations like Housecleaning and Repository
Repair are only done by one application at a time.
There are now separate 32-bit and 64-bit versions of the Windows bin.zip file in the repository. This is so that we
can ship platform-specific libraries as part of the auto-upgrade in the future if necessary.
Database Improvements
Upgraded minimum MongoDB requirement to 2.6.1 (although the Repository installer ships with 2.6.3).
Deadline now uses MongoDB's new timestamp feature to reduce the number of write operations it performs.
Split many collections into separate databases to improve performance.
Using the new timestamp feature allows Deadline to support Sharding.
A Replica Set Name can now be specified when configuring the database connection settings.
Improved how passwords are saved in the database.
When saving a new job, if there is a job with that ID in the Deleted Job Collection, it is now removed from the
Deleted Job Collection.
A config file is now installed to the Database folder, and this can be modified to configure how MongoDB runs.
Unexpected mongodb exceptions now include the stacktrace and exception type.
Reduced the number of database writes that occur when deleting jobs, slaves, pulses, balancers, and limits.
Reduced bandwidth when checking if a job or slave exists in the database.
Fixed a bug where too many asynchronous calls to the database could result in connection errors.
When adding history entries, the saving of the new entries and the purging of the old ones is now done in one
query instead of two.
Added a locking collection to the database that is used instead of lock files to ensure that operations like
Housecleaning and Repository Repair are only done by one application at a time.
Job Improvements
Individual tasks for jobs can now be suspended or resumed.
Added ability to render jobs using the account for the user that submitted the job.
Job dependencies are more flexible, and can have per-dependency overrides and notes.

Added new OnTaskTimeout option to mark a task as complete.


Added optional timeout option for the Starting phase of a job.
Added a job timeout option to calculate the task timeout based on the number of frames for the current task.
Jobs with custom plugin locations specified no longer need the custom plugin to be in the repository to be
submitted and resubmitted.
Sequential jobs are no longer dropped for higher priority jobs. Once a slave picks up a sequential job, it will
keep rendering it until the job is complete or the render is canceled.
Added a job task buffer value that can be applied to balanced or weighted algorithms to help prevent slaves from
jumping between jobs to keep things balanced.
Fixed a bug where pre job scripts were not necessarily finishing before regular tasks were started.
Failed jobs with a post job script no longer remain stuck in the queued state.
Added option to submit a job with a start time delay by specifying a JobDelay=dd:hh:mm:ss value in the job
info file. The delay value is represented by the number of days, hours, minutes, and seconds, all separated by
colons (see the sample job info file at the end of this list).
Fixed a bug when undeleting a job that could cause the job's task counts to be incorrect.
Jobs now have a limit on the number of error reports that can be generated. A job with the maximum number
of errors will fail and cannot be resumed until some reports are deleted. This number is configurable in the Job
Settings in the Repository Options.
Fixed a bug that could cause interruptible jobs to be interrupted for another job of equal priority.
Improved the performance of how job tasks are updated in some cases.
Added support for jobs to have their own custom event plugin directory to load event plugins from.
Added a job option to override the number of days before the job is automatically cleaned up.
Added an option to completely override auto-cleanup settings, which means you can choose to disable auto-cleanup for a job if it's enabled in the Repository Options.
A history event is now logged when a job is failed because it reached the error limit.
If a job with a post job task is frame dependent, the post job task now only gets released if all the other tasks
are complete. This fixes the problem of the job showing up as Queued in the Monitor because the post task is
queued, but the rest of the tasks are a combination of pending/completed/failed.
During job submission, if the job's user doesn't exist, default user settings are now created for them.
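As an illustration of the JobDelay option mentioned above, the job info file sketch below delays the start of a hypothetical job by 1 hour and 30 minutes; the other keys are just a typical job info skeleton.

    Plugin=Nuke
    Name=Shot010 Comp (delayed)
    Frames=1-50
    JobDelay=00:01:30:00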
Client Application Improvements
Upgraded user interface libraries to Qt 5, which fixes some known Wacom Tablet issues.
The Deadline applications now use the new Qt Fusion theme for a more modern and scalable look.
All Deadline applications can now pull the Auto Configuration settings, instead of just the
slave.
All Auto Configuration settings, except for the Repository path, are now pulled directly from the Repository.
Only the Repository Path is still pulled from Pulse.
The color and font used in all Deadline applications can now be customized from the Monitor.
All Deadline application command line arguments now support any number of leading dashes (for example,
deadlinemonitor -console or deadlineslave --help).


Added a NoGuiMode setting to the deadline.ini file. It's set to False by default, but if True, then the launcher,
slave, and pulse will always run in nogui mode, regardless of whether the -nogui flag is passed (a sample deadline.ini entry follows at the end of this list).
All logs for the Deadline applications and for jobs now have timestamps.
The LaunchPulseAtStartup and LaunchBalancerAtStartup settings are now stored in the system deadline.ini file,
not the per user one.
The Monitor, Pulse, and Balancer listening ports and process IDs are now stored in separate ini files, not the
system deadline.ini file. This means that a symlinked deadline.ini file can now be shared between multiple
machines.
Added the major version number to the app packages on OSX.
Fixed some bad logic when the applications try to determine if they should run in GUI mode or not.
Fixed a typo in the dbConnect.xml error that would be shown if the Client application couldn't find or read the
dbConnect.xml file.
The look of the disabled text in labels now matches Qt's default look.
On OSX, any popups that appear when the splash screen is visible now appear in front of the splash screen.
On Windows, a task bar item is now visible when the splash screen is visible.
Improved the Connection Error message when a Deadline application cannot connect to the Repository or
Database.
Menus that are too long for the screen are now scrollable.
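A sample deadline.ini entry for the NoGuiMode setting mentioned in this list might look like the following; the [Deadline] section header reflects the usual layout of the file, but treat the exact layout as an assumption for your installation.

    [Deadline]
    ; Force the Launcher, Slave, and Pulse on this machine to run without a GUI.
    NoGuiMode=True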
Launcher Improvements
The Launcher now controls the scheduled starting and stopping of slaves.
The Launcher displays a popup message when a slave is scheduled to start, allowing a user to delay launching
the slave if they are still using the machine.
The Launcher can detect if the system is idle and launch the slave. It can also stop the slave when the system is
no longer idle.
Added new Local Slave Settings dialog to the Launcher menu to control the local slave and configure its Idle
Detection and Job Dequeuing Mode settings.
The Launcher system tray icon now shows the Deadline version number in the tooltip.
The launcher now waits 5 minutes after starting before it starts checking if it should restart a stalled slave. This
ensures that if the launcher is set to launch the slave at startup, and that slave previously stalled, the slave will
have a chance to clean up after itself. Otherwise, the launcher might try to launch the slave multiple times.
Added new -shutdownall command line option to launcher, which shuts down the slaves, pulse, and balancer
before shutting down the launcher.
On Linux, Deadline's init.d script now shuts down the slaves and the launcher during a reboot/shutdown, which
ensures the slaves check their licenses back in. Pulse and the balancer are shut down if they are running as well.
On Linux, fixed some other issues in Deadline's init.d script.
The Restart Slave If Stalled option is now disabled by default.
Fixed some bugs in the Launcher init script on Linux.
Cleaned up the output of a successful remote command.
The Launcher can now process multiple remote commands simultaneously.


Added the -balancer command line option to launch the Balancer through the Launcher.
A LaunchBalancerAtStartup=true entry can be added to the system deadline.ini file to have the Launcher start
the Balancer when the Launcher starts.
When running as a service on Windows, the Launcher now properly shuts down the slave when the machine is
shut down, which ensures the slaves check their licenses back in. Pulse and the balancer are shut down if they
are running as well.
Added new optional entries to the deadline.ini file to have the Launcher keep Pulse and the Balancer running (KeepPulseRunning=true and KeepBalancerRunning=true); a sample deadline.ini snippet follows at the end of this list.
Added a -slavenames command line option to the launcher to be used with -slave to launch slaves with
specific names by specifying a comma-separated list of slave names.
Added -upgrade command line option to launcher to simply trigger an upgrade if it's required.
Up to 5 attempts are now made during an auto-upgrade to copy over the binaries, with an increasing interval
between attempts.
When the launcher checks for upgrades, it now performs an upgrade if the local Version file is missing (but the
network one exists).
When doing an automatic upgrade, the launcher now copies the bootstrap files to the system's temp directory,
instead of using the Deadline temp directory.
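The launcher-related deadline.ini entries mentioned in this list would sit alongside the other settings in the system deadline.ini file; a sketch follows, with the exact section layout being an assumption.

    [Deadline]
    ; Start the Balancer with the Launcher, and restart Pulse or the Balancer if they stop.
    LaunchBalancerAtStartup=true
    KeepPulseRunning=true
    KeepBalancerRunning=true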
Monitor Improvements
General
The UI Lock can now be toggled on and off using the Shortcut ALT+.
Font sizes are now consistent for all column headers in the lists in the Monitor.
Added new graphs to the Monitor.
There are no longer artifacts in the images when saving graphs to disk.
Default list layouts can now be saved for each panel in the Monitor. These defaults are used when new panels
are opened.
List layouts for each panel in the Monitor can be saved to disk and opened again later.
The lists no longer auto-scroll horizontally when clicking on a column that is only partially visible.
Added ability to add Separators when customizing Script Menus.
The Monitor now gives the user the option to save the Location and Size when pinning a layout or saving a
layout to disk.
When right-clicking on the column headers for a list to show hidden columns, the column will now appear where
the mouse cursor is instead of at the end.
Added search history to the search boxes in the Monitor. The search history can be cleared from the down-arrow
menu for each list.
The default sizes for the Manage Pools and Manage Groups dialogs are now bigger.
The Slave list in the Pool and Group Management dialogs can now be filtered, and all columns in the list are
now available.
Fixed a bug when deleting groups and pools from the Manage Pool and Group Dialog that was preventing
deletion of a single pool or group, or deleting them all if one was selected for deletion.


The Slave Scheduling feature has been broken out of the Power Management dialog and now has its own
configuration dialog.
Added new Repository Options panel to create regions.
In Repository Options, moved the database threshold to the Notifications panel, and grouped it with the database
email address setting.
Added an option to the Email Notification panel in the Repository Options to enable/disable auto-generating
email addresses for new users. If enabled, the email address will be based on the SMTP server unless a postfix
override is specified.
The statistics panel in the repository options now has all of its settings in a group box.
Added a toggle to the FarmOverviewReport to switch between percentages and counts for the graphs.
The Repository Options dialog now notes that it can take up to 10 minutes for the settings to propagate.
Improved the tooltips in the Repository Options dialog.
Fixed a typo in the House Cleaning panel in the Repository Options.
Updated Repository Options, Job Properties, Slave Properties, and Monitor Options dialogs so that each panel
takes up a bit more space.
New rows created in the Path Mappings, Drive Mappings, and Monitor Layout panels in the Repository Options
now have the correct height.
Added a button in the Repository Options dialog to reset all settings back to factory defaults.
In the Repository Options, all performance-related settings are now on a new Performance panel. Use the new
Auto Adjust spinner control to automatically pick good default settings based on the number of Slaves in your
farm.
Fixed a bug in the Auto Configuration page in the Repository Options that occurred when the last entry in the
Auto Configuration list was deleted.
Auto Configuration rule sets can now be enabled or disabled.
Manage Users, Manage Groups, and Manage Pools dialogs no longer close the Name dialog if an invalid name is
entered.
When a new user group, pool or group is created, it is automatically selected.
Fixed an error that could occur when deleting multiple users at the same time.
The Farm Statistics dialog now has a drop down to choose an interval, rather than 4 separate buttons.
The Configure Cloud Providers dialog now initializes the cloud plugins before displaying to improve performance when viewing the settings for different cloud plugins.
Added Import Settings option to the Tools menu, which allows you to import settings from other Repositories
running a minimum of Deadline 6.
Added new Local Slave Settings dialog to the Tools menu and the main toolbar to control the local slave and
configure its Idle Detection and Job Dequeuing Mode settings.
Improved layouts of controls in Plugin and Event Plugin configuration dialogs.
Features that require Pulse now mention it in their respective property dialogs.
Updated all Monitor scripts to use the new grid-based system for the script dialogs.


All Monitor submission scripts now save their sticky settings if the dialog is closed using the X button, or if
Alt+F4 is pressed.
Added additional command line arguments for the Monitor to set specific Monitor Options at startup.
Fixed the filter types for some columns in the slave list, job report list, and slave report list.
If a Remote Control command succeeds, the result will now be "Connection Accepted" instead of just being
empty.
Added Monitor option to show when the last house cleaning and pending job scan operations were performed
in the Monitor status bar. If they haven't been performed for more than 10 minutes, they will be highlighted in
red.
Added Monitor option to enable slave pinging (it's now disabled by default).
Fixed some Remote Control commands that were not checking if they should be using the slave's IP address, or
a machine name or IP address override.
Fixed a bug where trying to send a Remote Command to an unknown host would hang indefinitely on Linux
and OSX.
When executing a remote command, if the process returns a non-zero exit code, then the result is returned as a
failure instead of a success.
Fixed a ManageListForm error.
The limit dialog and the power management dialogs now disable the name field instead of just making it read-only when in edit mode.
Added settings in the Repository Options to control how long the local Launcher and Balancer logs should be
kept for.
Fixed a layout issue in the multi-line file browser control in the Plugin Configuration dialogs.
Resetting the Repository Options in the dialog is now visually smoother.
Added a panel menu item to reset the default list layout back to the original default.
Fixed an error when removing users from the User Group permissions dialog.
When cloning an existing user group, the clone is selected automatically.
Increased the default height for the Manage Users dialog.
Added Monitor Option settings to configure the double click task behavior for rendering, completed and failed
tasks.
The script menus are now hidden when right-clicking on a panel with nothing selected.
The job scheduling weight settings in the Repository Options now have 4 decimal places instead of 2.
Updated the icon/script sync icon to be the refresh icon.
Added View menu option to show/hide the main toolbar.
Cleaned up the layout of the View menu a bit.
Graph names are now shown in the panel titles when they are showing a graph.
The splitter for job reports, slave reports, and remote command panels no longer moves when resizing the panel.
Fixed a leak caused by the context menus in the panels.
Fixed a bug in the Auto Job Timeout settings in the Repository Options that caused the Timeout Multiplier to
be disabled when it shouldn't be.


When restarting the Monitor, the location of the splitters for all panels is now restored properly from the previous
session.
When switching between saved layouts, the Monitor is now hidden and shown to ensure that the location of the
splitters is restored properly.
Fixed a bug in the Manage Users dialog where the password confirmation fields were not being verified on
accept.
Tweaked some labels in the idle shutdown and machine startup tabs in the power management dialog.
Cleaned up the error message when a job import fails due to the job already existing.
Deleting a ruleset in the auto configuration panel of the repository options now resets all controls to their defaults.
When creating new Path Mappings in the Repository Options, they are no longer case-sensitive by default.
Plugin and Event configuration settings are now sanitized when they are saved.
Added a new general TestIntegrationConnection script to the General script menu that can be used to test connecting to Shotgun or ftrack, and it shows the results.
Added stacktraces to the error messages if the Monitor can't update its data cache.
Added Repository Configuration settings for maximum repository, slave, job, pulse, and balancer history entries.
Double clicking the title bar of a floating panel in the Monitor now maximizes it on Windows.
Repository history entries are now logged when changing Repository Options.
Fixed a bug when collapsing and expanding group boxes in the Configure Plugins/Events dialogs.
Improved the performance of bulk delete operations in the Monitor.
Improved the default widths of some of the columns for lists in the Repository Options.
When switching between the global pinned monitor layouts, the local pinned layout settings (column layouts
and filters) are ignored so that they do not get clobbered.
Fixed a typo in the Application Logging panel in the Repository Options.
Fixed a layout bug in the Plugin Configuration if CategoryOrder was specified in the .options file of a plugin.
Fixed some errors when editing idle shutdown overrides, and when editing existing thermal shutdown sensors
and overrides.
When connecting to a remote log from the Monitor, it now connects to the correct machine if the Monitor is
connected to a different repository than the one stored in the deadline.ini file.
CMD+R shortcuts now work properly on OSX (ie: resume job, resume task).
Jobs and Tasks
Added new progress bars to the job list to show the state of all tasks for the job at a glance.
Jobs with only a single task now show better job progress in the Job list.
Fixed some issues that caused the job counts in the job list to be incorrect.
Fixed some issues where requeue reports weren't getting created properly for jobs.
Improved layout of controls in the plugin-specific properties in the Job Properties dialog.
Selecting multiple jobs and modifying their properties only overwrites shared properties for dependencies, extra
info variables and environment variables.
All dependency related job properties are now in the Dependencies panel in the job properties dialog, instead of
being spread across three separate panels.


The job timeout panel in the job properties now lets you specify a timeout in terms of hours, minutes, and
seconds.
Fixed a color control bug in the plugin-specific job properties that would cause the property to appear as modified
when pressing Cancel on the color picker dialog.
Split up the job history logging to be more granular when modifying certain Job Properties.
Jobs can now be grouped together in the Job list if they share the same Batch name.
Improved the performance of the Quick Filters for the job list.
User name quick filters now have "Me (userName)" as the entry for the current user, and it will be the first user in
the list.
Changed the right-click menu item text in the quick filters to avoid confusion.
Added an option when suspending a job to only suspend the non-rendering tasks for the job.
Updated the Transfer Job script to include some missing job properties that weren't getting transferred.
Fixed an error that could show up in the Console when closing the Job Details panel.
The Explore Output menu in the job and task list no longer shows any duplicate paths.
The task list now shows the current CPU and RAM information for a rendering task.
Added right-click menu item to task list to suspend/resume individual tasks.
Swapped the default location of the Startup Time and Render Time columns in the task list.
The Job Dependency nodes in the Monitor have also been updated to show per-dependency settings.
Added a new feature to the Job Dependency View to test the dependencies and see which ones pass and which
ones do not.
The backgrounds for the graphs and the Job Dependency View now match the look of the rest of the Monitor.
Jobs can now be grouped together in the Job Dependency View if they share the same Batch name.
The layout can now be pinned for the Job Dependency View panel.
You can now select multiple jobs in the job list and have them show up in the job dependency view.
The Job report list in the Monitor now has columns that show Memory and CPU usage information.
Moved the Explore Path menu for non-job nodes to the main context menu in the Job Dependency View, and
fixed a bug that caused it to be disabled when it shouldn't be.
Cleaned up the error message when changing the frame range for a job, and the new task count exceeds the
maximum allowed.
Added ability to pin and save quick filters.
The archive job path is now remembered within a session (it will revert back to the default repository folder the
next time the Monitor is restarted).
Added new Cleanup panel to job properties window (for auto-cleanup override settings).
Added option to auto-filter Job Reports based on the selected Task.
Added option to switch Job Reports panel to a horizontal orientation.
Added a Render Status column to task list, which shows the same information that the Task Render Status
column in the slave list shows.
Fixed some layout and font-size issues in the job dependency drag and drop dialog.
Fixed a bug that could cause output paths in the job/task context menus to show double path separators.


Event plugins are only loaded once when archiving a batch of jobs.
Fixed a bug when parsing the frame padding of an output path that contained multiple sections of padding
characters.
The Task ID columns in the Task and Job Report lists are now string filters instead of integers.
Capped the job and task sub-menu length for viewing output and auxiliary files to 50 menu items.
Deleting jobs from the monitor now logs to the repository history.
If a job report can't be loaded, the error message is now shown in the job report viewer.
Task progress bars are now only visible for completed and rendering tasks.
Task progress bars no longer change color based on the task's state, although they will still match the completed
job color when the task is complete.
Disabled ability to resubmit tasks for Tile and Maintenance jobs.
For tile jobs, the tile numbers under the Frame column in the Task List now start at 1 instead of 0.
Fixed a bug in Job Properties where editing a job's existing Script Dependencies wasn't being committed properly when pressing OK.
Fixed some errors when removing multiple asset or script dependencies from their respective lists in the job
properties.
Fixed spelling of interruptiple in the job properties dialog.
Fixed a bug in the Job Dependency View that could lock up the Monitor when clicking on different jobs.
When resubmitting a job that was scheduled to start at a certain time, the flag that indicates if the job has been
resumed already is now reset.
Slaves and Pulse
You can now right-click on specific slaves in the Slave list in the Monitor to modify Pools and Groups for the
selected slaves only.
Added job icon to the Job Name column in the slave list.
The Slave list now shows which Limits the slaves are whitelisted, blacklisted, and excluded for.
The Slave report list in the Monitor now has columns that show Memory and CPU usage information.
The utilization value in the slave list now takes into account rendering and idle slaves (necessary if there are
multiple slaves running on the same machine, but not all are rendering).
If the slave list is filtered, the utilization will show the total utilization, as well as the utilization for just the
visible slaves.
Fixed a bug where the utilization would only update if you click on a slave in the list.
Cleaned up the utilization text a bit so that it's easier to read.
Added option for viewing history to the Pulse list.
Moved the Modify Pools/Groups menu items in the slave list menu below the Modify Slave Properties menu
item.
The slave list now shows the time a slave has been in its current state for all states (previously it would only
show this for rendering slaves).
A warning now appears when trying to shut down the local machine from the slave list, instead of failing silently.
Added option to switch Slave Reports panel to a horizontal orientation.


If a slave report can't be loaded, the error message is now shown in the slave report viewer.
When deleting a pulse, the history entry is now logged in the repository history.
The Mark Slave As Offline menu item is now shown if the slave is in the StartingJob state.
Fixed a bug where history entries for saving slave settings weren't logged if only one slave was selected.
The Job Candidate Filter in the Slave list now handles jobs with empty whitelists properly.
The Slave Reports panel now shows render logs in addition to render errors.
Added new graphs to Slave Reports panel.
Added Connect Host, Primary, and Region columns to pulse list.
Pulse settings can now be modified from the pulse list.
The pulse list is now used to connect to the pulse log, instead of the Tools menu.
Limits and Cloud
The Limit list shows who the current stub holders are if that Limit is in use.
The Limit list now has a new column that shows the Usage Level for the Limit.
The Limit property dialog now has an option to use the Usage Level for the Limit.
Many context menu items in the Cloud panel (ie: starting and stopping instances) are now performed asynchronously.
User group permissions can now be set for the Cloud panel.
The cloud panel will show dialog boxes if an error occurs when interacting with the cloud instances.
Cloud plugin data is now only loaded and updated if the Cloud panel is being displayed.
Added some messages to the cloud commands so you get some feedback when a command is successful.
Console and Remote Commands
Fixed a timestamp bug in the Console panel.
The Remote Commands panel is now enabled by default in the User Group Permissions (so that the Monitor's
Local Slave Controls can display it).
Fixed a spacing inconsistency between the timestamp and the text in the Monitor's Console panel.
Slave Improvements
Multiple slaves on a single machine now share one license, instead of requiring one license each.
Slaves now return their license when they become idle.
New Idle Detection settings can be set per slave. They can be used to launch the slave when the machine is idle
and/or stop the slave when the machine is in use again.
New Job Dequeueing Mode settings can be set per slave. They can be used to force slaves running on workstations to only pick up jobs submitted from the same machine, or by specific users.
Slaves can now be added to regions, which mainly affect how the slave applies Path Mappings.
The slave system tray icon now shows the Deadline version number in the tooltip.
Added timestamps when streaming the slave log.
Fixed a startup bug on Linux and Mac OSX that could result in multiple slaves with the same name starting up
on the same machine.

Improved how the slave picks its IP address on Windows and Linux so that it picks a network interface with a
gateway (the Mac OSX version already did this).
If a slave is initially running in Free Mode and it later gets a license, the License information in the slave UI and
the slave list in the Monitor will be updated appropriately.
When a slave can't connect to a license server, it only tries to do auto-discovery every 5 minutes so that it doesn't
saturate the network.
The slave now queries the machine's CPU speed at regular intervals while it's running, instead of just caching
the value it gets at startup. This is useful for machines with CPU speeds that dynamically change while the
system is running.
Fixed a bug that was not checking the Job failure detection settings when a plugin failed to sync its files.
When searching for a job, we no longer prune jobs that have a QueuedChunk count less than or equal to 0. This
helps ensure that if a job's state gets messed up, queued tasks will still be dequeued for that job.
When searching for a job, the slave will now cache any Limits that it failed to acquire, and ignore other jobs that
require the same limit during that search.
The idle interval between job searches is now calculated based on the percentage of the idle slaves in the farm.
The interval increases as more slaves become idle.
Improved the message printed by the slave when it is doing a self-cleanup because it didn't close down properly in
the previous session.
Limit stub returning is now a little more robust.
Improved verbose log messages when the slave is looking for a higher priority job.
Fixed a bug that allowed the slave to move on to another task before finishing saving the log for the current task.
Significantly improved how the slave handles large amounts of stdout from the rendering process (improving both performance and memory usage).
Improved speed and reduced database load when a slave is processing limit groups while searching for a job to
render.
Fixed a null reference exception when the slave would check if it needed to return limit stubs based on progress,
and the limit no longer exists.
The check that the slave makes to see if it needs to return limit stubs based on progress is now done every few
minutes instead of every second.
If the dlinit file is not found after a plugin sync, the slave will try three more times and then throw an exception.
When dequeuing a job, the slave now returns job limit stubs immediately if it can't find any tasks for that job.
When dequeuing a job, the slave will check if the job has any queued tasks available before trying to get a task
for it.
When updating the job state information during rendering, the slave no longer reads the full job object back
from the database.
Slaves only do partial updating of their state when possible to reduce bandwidth.
Fixed a bug that could cause the slave to crash during shutdown.
Fixed a bug that would result in only partial logs for a task that rendered across different days.
The local task logs have been renamed to ensure they are unique to the slave and render thread that is rendering
them.
Any orphaned local task logs are now cleaned up the next time that render thread renders a task.


Fixed a bug that could cause a render to fail if the job's name changed between tasks.
Added some additional logging just before the slave exits.
Slaves now save their own copy of the task report, which can be viewed from the Slave Reports panel in the
Monitor.
Fixed some text fields in the Slave UI that weren't read-only.
Fixed some typos in some error messages.
During each job scan, the slaves now cache whether a plugin supports concurrent tasks to avoid repeatedly reloading
that information from the repository.
Pulse Improvements
A primary pulse can now be configured, which is the one that the slaves will connect to. Only the primary
instance of pulse will do things like housecleaning and the pending job scan.
If the primary pulse is offline or stalled, the repository repair operation can elect another running pulse as the
primary. This can be enabled in the repository repair settings in the repository options.
Fixed some text fields in the Pulse UI that weren't read-only.
Pulse no longer controls the Slave Scheduling feature. It is now handled by the Launcher.
Pulse now only sends the Repository Path for Auto Configuration requests. The other settings are pulled from
the Repository after the Deadline applications have connected to it.
The Pulse system tray icon now shows the Deadline version number in the tooltip.
In Power Management, Idle Shutdown now takes into account if there are multiple slaves running on the same
machine.
Added field to the Pulse UI that shows the state of the web service.
The slave can now be shut down with deadlineslave -s when it hasn't connected to a repository yet.
Added more information to the pulse throttling messages such as the slave name, job id, number of requests and
throttle limit.
Made some tweaks to the web service new and delete user groups functions to not return error codes for certain
outcomes.
Fixed bugs in some REST API functions that could cause Pulse to crash.
Added a catch to prevent REST API functions from causing Pulse to crash.
Changed some of the error messages that were inconsistent with the rest of the REST API.
Pulse no longer prints out an error when favicon.ico is requested from the web service.
Cleaned up the web service messages when the command is an invalid API command, and when no command
is specified.
Added the Access-Control-Allow-Origin header to Web Service responses.
The OPTIONS request type is now supported by the Web Service.
When deleting from the restful API, we now log to the repository history, not the job's history.
Added support to the restful API for only grabbing certain job properties in a request for jobs to reduce the
amount of data getting passed around.
Fixed a bug in the Machine Startup feature of Power Management that would result in no slaves being woken
up for a job with an empty whitelist.

Command Improvements
All command line options now support any number of leading dashes (for example, deadlinecommand.exe
-pools or deadlinecommand.exe --groups).
Added new commands to suspend/resume individual tasks.
Added a new command to suspend all non-rendering tasks for a job.
Fixed some bugs with the RenderJob command line option.
Fixed some bugs with the JobStatistics command line option.
Added some User Group command line options.
Fixed the RemoteControl command to properly print out results.
Updated the help text for the ChangeRepository command line option to mention the optional Repository Path
argument.
The RemoteControl command options are no longer case sensitive.
Added GetJobDetails command to print the job details that are shown in the Job Details panel in the Monitor (sample invocations follow at the end of this list).
Added GetVersion and GetMajorVersion commands.
Added commands that can be used to configure the Cloud plugins, group mappings, regions, etc.
Added DeadlineCommand commands for adding job, slave and repository history entries.
Added command line option to DoHouseCleaning and DoRepositoryRepair to choose which mode to run.
Added command line commands for performing path mapping.
Removed JobCleanup command line option, since the DoHouseCleaning command can do this.
The DoPendingJobScan command line option can now take an optional region parameter that is used for path
mapping when checking asset and script dependencies.
Added SlaveExists command to check if a slave exists.
Deadline Command no longer checks if the collection indices in the database need to be created (the other
Deadline applications still handle this).
The ChangeRepository command no longer tries to load the Qt libraries if it is being passed the repository path
as a command line option.
The ChangeLicenseServer command no longer tries to load the Qt libraries if it is being passed the license server
as a command line option.
The ChangeUser command no longer tries to load the Qt libraries if it is being passed the user name as a
command line option.
Fixed a bug with commands that accept a repository path as an argument. The bug would cause deadlinecommand to crash if the repository path was quoted and ended with a trailing backslash character (ie: "\\server\repository\").
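The informational commands above can be driven from an external pipeline script. The following is a minimal sketch and not from the manual; it assumes deadlinecommand is on the PATH, the job ID shown is a placeholder, and passing the job ID as the only argument to GetJobDetails is an assumption.

    import subprocess

    def run_deadline_command(*args):
        # Assumes "deadlinecommand" (deadlinecommand.exe on Windows) is on the PATH;
        # otherwise point this at the Deadline Client's bin folder.
        return subprocess.check_output(["deadlinecommand"] + list(args))

    # Print the installed Deadline version.
    print(run_deadline_command("GetVersion"))

    # Print the same details shown in the Monitor's Job Details panel for a job.
    job_id = "5507218d5a5f1c0f1c9d4c3a"  # placeholder job ID
    print(run_deadline_command("GetJobDetails", job_id))  # argument order assumed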
Scripting Improvements
Added new events to the Event Plugin API: OnHouseCleaning, OnSlaveStarted, OnSlaveStopped, OnSlaveRendering, and OnSlaveStartingJob (a minimal listener sketch follows at the end of this list).
Added new grid-based control options to the DeadlineScriptDialog class, which make it easier to create custom
interfaces in the Deadline scripts.
Updated the cloud plugins to not swallow their errors when creating instances.

Exposed some errors that happened when Cloud/Balancer plugin files were missing or spelled incorrectly.
Added function to get the database connection string.
Added function to change a job's frame list.
Default for ConcurrentTasks in a plugin's dlinit file is now True.
Added API commands to launch processes with a specific user account.
Made some improvements to the way Python exceptions are printed out.
Fixed some issues with how Python stdout and stderr redirection to the Deadline logs was working.
Added new API commands to suspend all non-rendering tasks for a job.
When a plugin, event, cloud, or Monitor script is executed, the log will now show where the script is being
loaded from.
Added EnabledStickySaving function to the DeadlineScriptDialog class that can be used to automatically save
sticky settings when the dialog is closed.
Improved some function documentation for the API.
The Slave Stdout Limit is now applied to ManagedProcess objects created in plugin scripts. Before, it was only
applied to the main DeadlinePlugin object.
Fixed a bug that prevented module import errors from showing the actual Python error.
Added some additional User Group functions.
The RGB spinners in the Color script control now resize when the control size changes.
Added ClientUtils.CreateScriptTempFolder() function to create a temporary folder for the script that is automatically cleaned up.
Fixed how the value is set for the RadioControl script control.
Fixed a bug with getting the disabled slave count in the GetFarmStatisticsEx.py web service script.
Added an OnJobPurged event trigger that gets called right before a job gets purged from the database.
Added OnSlaveStalled callback for event plugins.
Added functions to the REST API and the standalone Python module to get the job details that are shown in the
Job Details panel in the Monitor.
Added support to the REST API and the standalone Python module to undelete deleted jobs, purge deleted jobs,
get deleted jobs and get deleted job ids.
Added RepositoryUtils.GetJobDetails() function.
Added RepositoryUtils functions to get deleted job IDs and to undelete jobs.
Fixed a bug that prevented JobUtils.CalculateJobStatistics() from working in non-Monitor scripts.
PYTHONHOME and PYTHONPATH are now properly set to the system's values in RunProcess for the event plugins.
GetConfigEntry and GetConfigEntryWithDefault functions for plugins now trim whitespace off the values.
Added support to the Standalone Python API for doing basic authentication with the Web Service.
Added missing documentation for SlaveUtils.GetMachineIPAddresses() API function.
Added RepositoryUtils.SlaveExists() function to check if a slave exists.
Fixed a bug where the OnJobFinished callback for Event plugins wasn't always getting the updated job object.

Added some missing properties to the doxygen docs for BalancerInfo, PulseInfo, SlaveInfo, and SlaveSettings.
SlaveHostMachineIPAddressOverride in SlaveSettings now represents the correct value.
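The new slave lifecycle events listed above can be hooked up in a regular event plugin. Below is a minimal sketch rather than a shipped plugin; it assumes the usual event plugin structure (a GetDeadlineEventListener entry point and a DeadlineEventListener subclass), and both the ...Callback attribute names and the single slave-name argument passed to each callback are assumptions here.

    from Deadline.Events import DeadlineEventListener
    from Deadline.Scripting import ClientUtils

    def GetDeadlineEventListener():
        return SlaveLifecycleListener()

    def CleanupDeadlineEventListener(eventListener):
        eventListener.Cleanup()

    class SlaveLifecycleListener(DeadlineEventListener):
        def __init__(self):
            # Wire up two of the new slave callbacks added in this release.
            self.OnSlaveStartedCallback += self.OnSlaveStarted
            self.OnSlaveStalledCallback += self.OnSlaveStalled

        def Cleanup(self):
            del self.OnSlaveStartedCallback
            del self.OnSlaveStalledCallback

        def OnSlaveStarted(self, slaveName):
            ClientUtils.LogText("Slave started: %s" % slaveName)

        def OnSlaveStalled(self, slaveName):
            ClientUtils.LogText("Slave stalled: %s" % slaveName)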
Application Plugin Improvements
3ds Max Improvements
Updated SMTD version numbers to 7.0.
Fixed a SMTD initialization error.
When copying external files, SMTD no longer tries to copy over missing files.
3dsMax2015_sp2 & Extension_1 dictionary entry added to 3dsmax plugin.
Default/sticky settings can now be set in SMTD for the ExtraInfo fields.
Removed (x86) references in 3dsmaxcmd plugin for Max2014 & Max2015.
Made some improvements for the RTT (Render To Texture) feature in SMTD, including the option to bake one object per task.
Fixed bug in FumeFX string handling in 3dsmax plugin.
Updated SMTD to handle blowup mode properly.
Updated Region manipulator in SMTD to keep aspect ratio while in blowup mode.
When offloading Mental Ray DBR jobs, the job will now use a temporary max.rayhosts file, rather than modify
the original.
Added workaround to prevent 3ds Max 2015 from crashing when it's rendering as a service.
Fixed some layout issues in SMTD.
Fixed some layout issues in the VRay DBR submitter.
Added better error messages to SMTD if the main script from the repository can't be loaded.
Added some new SMTD sanity checks (CheckForOutputPathLength, CheckForREPathLength, CheckForDuplicateREPaths, CheckForObjectNames, CheckForCorruptGroup).
Fixed a bug in the 3ds Max 2015 workspace workaround that caused it to fail if the workspace directory doesn't exist.
Fixed a bug that affected the tile assembly of frames rendered using the VRay frame buffer.
Fixed a tile assembly issue with VRay MultiMatte render elements.
Updated 3dsmax plugin dict in 3dsmax.py to clearly inform users which versions of 3dsMax are broken with
Deadline.
Changed maxTileAssembler command to use HiddenDOSCommand to hide console window on slave.
SMTD - Added [PREVIEW] job ability to enable/disable its parent dependency to the [REMAINING] frames job.
SMTD - When rendering a single frame tile or single frame jigsaw job, OutputFilename# is now frame specific instead of ####.
SMTD - If VRay Separate Render Channels is enabled, RE paths were not output to the Monitor OutputFilename#.
SMTD - Re-worked logic for when VRay REs are output as Separate Render Channels via VFB to the
Monitor OutputFilename#.

SMTD - When the script file SubmitMaxToDeadline_RemapLocalToNetworkPath.ms is not found in the Repository, the option for network remap will be hidden from the UI.
SMTD - Improved Generate Quicktime .MOV File drop down UI positions.
SMTD - Last edited dates are now shown in the SMTD ABOUT dialog.
SMTD - 3dsmax.options - Size category re-named to Render Size so it appears next to the other Render...
categories in alphabetical order in job properties.
SMTD - Updated RebuildRenderElements function.
SMTD - Added ability to override Tile/Jigsaw Assembler Pool, Secondary Pool, Group & Priority to be different
from main 3dsMax job.
SMTD - Added sticky/default ini entries for Assembler Override settings.
SMTD - Empty State Sets are now purged from the scene during submission via SMTD.
SMTD - Submit a subset of Tiles for rendering with the Draft Assembler, and they will all assemble over a black background.
SMTD - Re-queue some of the Completed tasks of the above job to render more tiles that were not requested
as Custom Tiles during submission - the Draft Tile Assembler will successfully integrate them into the final
image(s).
SMTD - Submit with Clean Up Tiles checked multiple times - each time, it will successfully assemble the
new tiles over the previous output image used as background.
SMTD - Jigsaw standalone feature Fill Regions backported to SMTD Jigsaw MX version, with extended
functionality.
SMTD - Added an option to permanently rename render elements during submission.
SMTD - Fixed a bunch of bugs that could affect how the output paths are passed to Deadline.
SMTD - The State Sets dialog is now automatically closed during submission and re-opened afterwards.
SMTD - Docking of the Sequencer Mode State Set dialog is now supported while the State Sets dialog is automatically closed during submission.
SMTD - Properly handles versions earlier than Max 2015, which do not have the IsMainFrameVisible property available in the State Sets object.
After Effects Improvements
Added support for After Effects CC 2014.
Improved the error message if the wrong submission script is installed on the client machine.
Relaxed the output path sanity check in the integrated submitter so that it doesn't prevent you from submitting a job that is outputting to a folder that doesn't exist yet.
Arnold Standalone Improvements
Added the -dp flag to the render arguments to speed up the rendering.
Cinema 4D Improvements
All multi-pass paths are now included when submitting from the integrated submitter, allowing you to open
these output files from the Monitor.
Fixed a bug in how the integrated submitter gets the output file name in cases where the output name scheme doesn't start with a period.
A Team Render submitter is now available that lets you launch Team Render on slaves and connect to them to
perform an interactive render.

CommandLine Improvements
Path Mapping is now performed on the arguments for CommandLine jobs.
CommandScript Improvements
Path Mapping is now performed on the arguments for CommandScript jobs.
Corona Improvements
Added support for Corona standalone.
DJV Improvements
Re-worked the DJV plugin & submission script to handle the new DJV v1.0.1, which changed the majority of its command line flags.
Fixed a couple bugs when using the job right-click script to submit a DJV job.
Draft Improvements
Added Path Mapping support to the Draft tile assembler.
Updated Draft to version 1.2.3.57201. Also note that if you are using Draft 1.1 or earlier, you will need an
updated Draft license.
Updated Draft Tile assembler monitor submission script to be able to add all of the plugin submission options.
Updated Draft Tile submitter to fix a visual bug.
Improved the error message when the Draft Tile Assembler can't load input tiles.
FFmpeg Improvements
Path mapping is now applied to the preset files.
The FFmpeg plugin now enforces the correct path separators based on the OS.
Fixed some typos in the FFmpeg submitter in the Monitor.
Fusion Improvements
Added support for Fusion 7.
Updated the Fusion plugin icon.
Hiero Improvements
Fixed how we get the start and end frame for a clip in the Hiero submitter.
Houdini Improvements
Fixed some bad logic when checking the output file in the Houdini submitter.
Fixed an error when loading the sticky SubmitSuspended property in the integrated Houdini submitter.
The integrated submitter now includes the current ROP name with the job name.
Improved Arnold for Houdini support.
Lightwave Improvements
Updated the Path Mapping tooltip in the Lightwave plugin to mention that it can be disabled if there are no Path
Mapping entries defined in the Repository Options.
Jobs submitted from Lightwave 11.8 now render properly.
Mantra Standalone Improvements
The mantra: Bad Alembic Archive error message is now caught during rendering.

Updated the Path Mapping tooltip in the Mantra plugin to mention that it can be disabled if there are no Path
Mapping entries defined in the Repository Options.
Maya Improvements
Added Jigsaw support to Maya.
Removed unnecessary 32 bit paths from the MayaBatch and MayaCmd plugin configurations.
Added a new stdout handler to catch a Maya licensing error.
Fixed some text cutoff issues in the integrated submitter on Mac OSX Mavericks.
Added overrides for the height and width of the render output to the Monitor submitter.
Fixed FumeFX Wavelet Sim issue for MayaBatch & MayaCmd.
Fixed an Arnold for Maya verbosity flag bug.
Fixed some issues when using tile rendering with VRay.
VRay render elements are now supported when using the Draft Tile Assembler.
Arnold AOVs are now supported by tile rendering.
Added multichannel EXR support for Jigsaw and Draft Tile rendering.
Fixed the default Maya executable paths on OSX.
Added an explanation to the tooltip for the frame list control in the integrated submitter for why it would be
disabled.
Fixed some Vray related bugs in the integrated Maya submitter due to differences between Vray 2 and Vray 3.
Mental Ray Standalone Improvements
Added plugin configuration option to treat exit code 1 as error or success.
modo Improvements
Added Jigsaw support to modo.
Added option to modo Monitor submitter to specify the output pattern.
Added warning message to modo Monitor submitter that overriding output and using Tile Rendering has limitations, and that they should use the integrated submitter in certain cases.
Fixed a bug in the integrated modo submitter that prevented it from working in modo 801.
Nuke Improvements
Added support for Nuke 9.
Updated Nuke plugin to properly handle frame counts in batch node when given write node names.
Fixed a bug that could crop up when setting the environment in the nuke submitter prior to launching deadlinecommand.
Added Render Using Proxy Mode option to the Nuke submitter.
Removed Build option from Nuke submitter, since the versions of Nuke that Deadline supports are 64-bit only.
Fixed an error that could occur if PrepForOFX is not defined in the Nuke.dlinit file.
The integrated Nuke submitter now includes output paths for all Views so that they can be viewed from the
Monitor.
The integrated Nuke submitter now displays a warning if you are trying to submit a job that has no Views.

Updated the names given to the Knobs created by the integrated submitter, which seems to address some instability issues that could come up.
The secondary pool setting is now sticky in the integrated submitter.
Fixed a bug with Nuke path mapping that would mess up embedded TCL in the output path.
Updated the Path Mapping tooltip in the Nuke plugin to mention that it can be disabled if there are no Path
Mapping entries defined in the Repository Options.
The integrated Nuke submitter handles TCL embedded in the output path properly when passing the paths to
Deadline to view the output from the Monitor.
Fixed an error in the submitter when the Nuke comp has proxy mode enabled.
In the Nuke submitter, Deadline's settings are now created in a Deadline tab, instead of just using the default User tab. The settings have more readable names too.
Added Performance Profiling option to submitter (Nuke 9 and later).
Changed layout of submitter controls a bit.
Fixed an issue with loading Shotgun and FTrack KVPs from the Nuke script file.
Puppet Improvements
Added support for Puppet jobs.
Python Improvements
Path separators for the script path are now set per OS after Path Mapping has taken place.
Quicktime Improvements
Fixed an error in the job right-click script to submit a Quicktime job.
Realflow Improvements
Added support for Realflow 2014.
Improved Hybrido simulation progress reporting.
Rhino Improvements
Added Jigsaw support to Rhino.
Added Tile Rendering support to Rhino.
Updated the default Rhino 5 executable path.
When Rhino starts up, the Enter button is now pressed to work around a case where Rhino wouldn't start rendering.
Salt Improvements
Added support for Salt jobs.
SketchUp Improvements
Added support for SketchUp 2015.
Increased width of export directory and prefix fields in the submitter.
Vray DBR Improvements
Added a task timeout option to all the DBR submission scripts. When the timeout is reached, the task will be
marked as complete so that the slave can move on to something else.
In the Monitor submitter, the application version number is now sticky between sessions.

The 3ds Max and Maya DBR submitters now disable vray distributed rendering when closing if the submitter
had automatically enabled it.
The 3ds Max DBR submitter can now automatically mark the spawner job as complete when the rendering
finishes.
Fixed how the Maya Vray DBR submitter creates a new shelf if there isn't already a Deadline shelf.
In the Monitor submitter, the port label visibility is now toggled on/off based on the currently selected application, which properly refreshes the UI.
The default Vray spawner paths for 3ds Max Design are now included.
Added a timeout setting for all supported applications except 3ds Max (3ds Max RT is supported though).
Added an option for how to handle the case where a vray DR process is already running on the machine.
The Port number can now be specified for 3ds Max.
3ds Max RT is now properly supported.
Updated height of VRay dialog in Softimage.
In the Ply2Vrmesh submitter, the attribute field is now wider.
Event Plugin Improvements
ftrack Event Improvements
Added ftrack support to most of the submission scripts.
Shotgun Event Improvements
Updated Shotgun API to version 3.0.17.
Added functionality to upload a filmstrip and a H264 quicktime movie to Shotgun when a job finishes rendering.

12.2 Deadline 7.0.1.3 Release Notes


12.2.1 Overview
This is a patch release for Deadline 7.0. It fixes a few important bugs that were discovered shortly after Deadline 7.0
was released.
A bug with how the Slaves updated their state in the database had a significant impact on database performance.
In order to fix this bug, we had to change how the Slaves update their state, and as a result the Slave list in the
Monitor will show that your Slaves are in an Unknown state until all your machines (Slaves, Monitors, Pulse,
etc) are running Deadline 7.0.1. Once all machines are running the same version, the Slaves will appear properly
in the Monitor again.
See the Deadline 7.0.0.54 Release Notes for the full release notes.
Note that a 7.0 license is still required to run this version. If you have a license for Deadline 6.2 or earlier, you will
need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you
have a license for Draft 1.1 or earlier, you will need an updated license.

12.2.2 Complete Release Notes


Monitor Improvements
Fixed some bugs in the Dependency panel in the Job Properties dialog.
When resubmitting a job from the Monitor, you can no longer set the frames per task to 0, which would result in an error during submission.
Slave Improvements
Fixed a bug with how slaves update their state in the database, which had a negative impact on performance.
The slave no longer prints out logging before and after each successful license checkout (errors are still printed
out).
The slave no longer updates its state in the database a bunch of times when shutting down.
Reduced the frequency at which the slaves check if housecleaning needs to be done. Now, they only check at the
same interval that Pulse would be performing the housecleaning operations, instead of before each task search.
Application Plugin Improvements
CommandScript Improvements
Fixed a syntax error in the CommandScript plugin.
Maya Improvements
Fixed a couple bugs that affected how the integrated submitter handled some VRay render elements.
Mental Ray Standalone Improvements
Fixed a syntax error in the MentalRay plugin.

12.3 Deadline 7.0.2.3 Release Notes


12.3.1 Overview
This is the second patch release for Deadline 7.0, which fixes a few bugs, and adds support for Lightwave 2015.
See the following pages for the full release notes:
Deadline 7.0.0.54 Release Notes
Deadline 7.0.1.3 Release Notes
Note that a 7.0 license is still required to run this version. If you have a license for Deadline 6.2 or earlier, you will
need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you
have a license for Draft 1.1 or earlier, you will need an updated license.

12.3.2 Complete Release Notes


Installer Improvements
The Repository installer now sets the version number correctly in the repository.ini file.
The submission script installers no longer create a rollback folder in the Repository folder.
Launcher Improvements
Fixed an error on Linux when checking how long the system has been idle in a headless environment.
Slave Improvements
Fixed a bug that caused the slave to report that it had a permanent license in some cases when it couldn't check out a valid license.
Pulse Improvements
Fixed a bug that prevented a Primary Pulse from performing the Pending Job Scan on Linux and OSX.
Application Plugin Improvements
3ds Max Improvements
Fixed a bug for 3ds Max 2015 when checking the visibility of the SceneExplorer prior to rendering.
Cinema 4D Team Render Improvements
The C4D Team Render plugin now works properly with C4D 15 and 16.
Removed the security token file location options from the plugin configuration, since they aren't needed.
The security token file is now created in the correct location on OSX.
Improved the error message that occurs if the security token file can't be created (often due to permissions).
Moved the Copy to Clipboard button next to the security token field in the integrated submitter.
Increased the button widths at the bottom of the integrated submitter to fix some text cutoff issues.
If the security token is blank when submitting the job, it is now populated with the token that is automatically
generated.
The Team Render submission script installer now supports C4D 16.
The security token can no longer be modified from the Monitor after the job has been submitted.
Combustion Improvements
Path mapping is now performed on the scene file path (if the scene isn't being submitted with the job).
Lightwave Improvements
Added support for Lightwave 2015.
Fixed a bug that prevented the integrated submitter from working with Lightwave 2015.
modo Improvements
Permissions are now set properly by modo submitter installer, which allows modo to recognize the Deadline
submitter when loading.

12.4 Deadline 7.0.3.0 Release Notes


12.4.1 Overview
This is the third patch release for Deadline 7.0. It fixes a critical bug in the feature that allows you to pick an alternate
folder for job auxiliary files in the Job Settings in the Repository Options. Without this fix, Deadline can delete any
existing subfolders in the chosen folder if their name doesn't match an ID of a job that is still in the queue. This isn't a
problem if you choose an empty folder (which is recommended), but if you choose a folder with existing subfolders,
those subfolders will get deleted.
This fix ensures that only subfolders with names that represent a valid job ID can be deleted by Deadline.
See the following pages for the full release notes:
Deadline 7.0.0.54 Release Notes
Deadline 7.0.1.3 Release Notes
Deadline 7.0.2.3 Release Notes
Note that a 7.0 license is still required to run this version. If you have a license for Deadline 6.2 or earlier, you will
need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you
have a license for Draft 1.1 or earlier, you will need an updated license.

12.5 Deadline 7.1.0.35 Release Notes


12.5.1 Overview
Deadline 7.1 adds many new features to Deadline 7.0, including enhanced logic for balanced/weighted job scheduling, new Slave metrics, better font synchronization, and new application support. It also fixes some bugs that were
discovered after Deadline 7.0 was released.
Note that a new 7.1 license is required to run this version. If you have a license for Deadline 7.0 or earlier, you will
need an updated license. In addition, the version of Draft that ships with Deadline 7.1 needs a new 1.3 license. If you
have a license for Draft 1.2 or earlier, you will need an updated license.

12.5.2 Highlighted Features


Enhanced Logic for Balanced/Weighted Job Scheduling
An experimental Enhanced Balancing Logic option has been added for balanced/weighted scheduling options in the
Job Scheduling settings. When this option is enabled, the Slaves will use the new SlaveJobState collection in the
database to get a more accurate snapshot of all the rendering jobs in the farm, and use this information to make better
decisions about which job they should be rendering.
Testing has shown that when this option is enabled, a proper distribution of Slaves among jobs is much more consistent,
and Slaves no longer jump between jobs of the same priority. The result is more predictable behavior, and less wasted
time due to the overhead of switching between jobs that are expensive to start up.
New Slave Metrics
Slaves now report their Network I/O, Disk I/O, and Swap usage, which can be viewed from the Monitor. This information is also stored in the statistics that are gathered for the Slaves.

In addition, Swap usage for the rendering process is stored with a job's task when it completes, and is also stored in the statistics for the job when it completes.
Improved Slaves Statistics Reports
The Slave Resource Usage farm report is now called the Slaves Overview farm report, and shows additional statistics.
For example, the new Slaves Overview chart shows how many slaves were in each state (starting job, rendering, idle,
offline, stalled, and disabled). In addition, the new Available/Active Slaves charts show the number of slaves that are
available, and the number of available slaves that are active. Finally, the new Plugin Usage chart shows the overall
usage of the render plugins.
Both the Slaves Overview and Active Slaves Stats reports can also be shown for a given region. This allows you to see statistics for slaves in a specific Cloud region, or in specific areas in the office (ie: render nodes versus workstations).
Note that this requires you to set which regions your slaves belong to in their Slave Settings.
Improved Graphs in the Monitor
Line and Bar graphs in the Monitor now support panning and zooming, and a right-click option has been added to reset
the zoom level. In addition, individual series in some Line graphs can be shown/hidden from the right-click menu.
Finally, the axis labels in these graphs have been updated to properly represent integer and date/time values, which
makes them easier to read.
Expanded Font Synchronization
The new FontSync event plugin that ships with Deadline can be used to synchronize fonts on Mac OS X and Windows
before the Slave application starts rendering any job, or when the Slave first starts up. This general FontSync event
plugin replaces the font synchronization options in the After Effects plugin and now works for ALL plugin types in
Deadline.
Improved Job Batch Display
Deadline 7 introduced the ability to group jobs together in the Monitor by setting their Batch Name property. Now, all
Deadline submitters automatically set the Batch Name if multiple related jobs are being submitted at the same time.
For example, when submitting each render layer as a separate job in Maya, they will all be part of the same batch.
Another example is submitting a Jigsaw render with a dependent assembly job.
In addition, the Batch Row in the job list in the Monitor now shows information for all columns, depending on the
settings for the jobs in the batch. For numeric settings like priority or machine limit, the largest value for the jobs is
shown. For settings like pool and group, the value will be shown if all jobs have the same value, and if they don't, <batch> is shown instead. For all other columns, <batch> is simply shown.
Finally, the counts above the job list in the job panel now show the number of batches in the list, and the selected count
now ignores selected batches so that it properly represents the number of selected jobs.
Protected Jobs
Jobs now have a Protected property. When enabled, the job can only be deleted by the job's user, a super user, or a user that belongs to a user group that has permissions to handle protected jobs. Other users will not be able to delete the job, and the job will also not be cleaned up by Deadline's automatic house cleaning. This is useful if you have jobs
you need to keep around for testing or benchmark purposes.

Flexible Image Viewer Configuration


The Monitor has always had the option to specify up to three viewers to use when viewing images from the Task list.
Now, optional command line arguments can be set, which are then passed to the viewer when viewing images. There
are also special tags, which are replaced automatically with information about the image being viewed.
The following tags are supported in the command line options:
{FRAME}: This represents the task's frame file. For example: /path/to/image0002.png
{SEQ#}: This represents the task's frame sequence files, using # as the padding. For example: /path/to/image####.png
{SEQ?}: This represents the task's frame sequence files, using ? as the padding. For example: /path/to/image????.png
{SEQ@}: This represents the task's frame sequence files, using @ as the padding. For example: /path/to/image@@@@.png
{SEQ%}: This represents the task's frame sequence files, using %d as the padding. For example: /path/to/image%04d.png
The arguments default to {FRAME}, which keeps the default behavior from previous versions of Deadline intact.
In addition, proper names can be given to the viewers, which are shown in their corresponding menu items. Finally,
viewers can be configured to support chunked tasks (tasks which consist of more than one frame).
Standalone Web Service Application
A standalone Web Service application is now shipped with Deadline, and is called deadlineWebService.exe. It works
exactly the same as the Web Service feature that is built into Deadline Pulse, and both can be configured using the new
Web Service page in the Repository Options.
Install Launcher as Daemon on Mac OS X
The Deadline Client installer now has an option to install the Launcher as a Daemon on Mac OS X. This feature lets
you run the Launcher daemon as root, or as another user account.
Improved Submission Script Installers
The submission script installers now show what DEADLINE_PATH is set to (which is used by the submission scripts
to determine where the Deadline Client's bin folder is located). You then have the option to change it if it's incorrect, or set it if it doesn't exist. This is useful if you have multiple versions of Deadline installed on your system.
A side-effect of this improvement is that it allows you to update DEADLINE_PATH without having to reinstall the Deadline Client or manually change your system's environment. To do this, simply run any submission script
installer, change the DEADLINE_PATH value, and uncheck all options listed in the Components list. The installer
will then update the DEADLINE_PATH without installing the submission script files.
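As a rough illustration of how a submission script can use DEADLINE_PATH, the snippet below (not taken from the shipped installers or submitters) resolves the path to deadlinecommand from the environment variable; the real scripts may use additional fallbacks.

    import os

    def get_deadline_command_path():
        # DEADLINE_PATH points at the Deadline Client's bin folder.
        deadline_bin = os.environ.get("DEADLINE_PATH", "")
        exe_name = "deadlinecommand.exe" if os.name == "nt" else "deadlinecommand"
        # Fall back to relying on the system PATH if the variable is not set.
        return os.path.join(deadline_bin, exe_name) if deadline_bin else exe_name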
Draft Updated to Version 1.3.2.58232
This version of Draft requires a new Draft 1.3 license, and includes the following updates:
EXR Images:
Added support for EXR data and display windows (previously data windows were set to the same size as the
display windows).
Updated to OpenEXR 2.2.0.


LUT support:
Added ACES 1.0 LUTs to the included ocio-configs folder.
Improved the robustness of the Draft ASCCDL Reader. The reader can now handle different syntax in its input
file.
VideoEncoder:
Fixed a bug when encoding an image with VideoEncoder. The VideoEncoder was applying a bit of scaling to
the image.
Fixed bug on Mac OS X where encoding with certain dimensions (ie: 640 x 480) was causing a memory error
crash.
Draft Tile Assembler:
Added support for assembling big images by exposing a new class in Python called TileAssembler. Most of the
logic of an assembly job can now be handled internally.
New Application Support
Support has been added for many AEC (Architecture Engineering Construction) and Product Visualization products,
including AutoCAD, CSiBridge, CSiETABS, CSiSAFE, CSiSAP2000, EnergyPlus, MicroStation, VRED and VREDCluster.
Support has also been added for 3ds Max 2016, Anime Studio 11, Composite 2016, Corona distributed rendering,
Maya 2016, Media Encoder, Houdini 14, LuxSlave distributed rendering, Natron, Nuke Studio Frame Server distributed rendering, Renderman RIS for Maya, VRay for modo, and Vue 2015.

12.5.3 Complete Release Notes


General Improvements
A standalone Web Service application is now shipped with Deadline.
A new KeepWebServiceRunning property can be added to the system's deadline.ini file to restart the web service application if it is stopped or crashes (see the sketch at the end of this list).
Added a generic new event plugin to synchronize fonts, and removed the font synchronization option from the
After Effects plugin.
Regions can now be renamed.
On OSX, the Deadline applications now respond properly to SIGTERM and SIGINT signals.
All Deadline job submitters now set the jobs Batch Name automatically if multiple related jobs are being
submitted.
Passwords can now be longer than 15 characters (affects super user password, passwords for running jobs as
users on Windows, etc). Note that this does not break existing passwords that are already saved.
Fixed a bug that caused the Slave, Balancer, and Pulse to report their disk space incorrectly on Linux.
Added ability to import statistics from another repository.
Added options to Jigsaw window to pan the image using the middle mouse button.
If a non-pulse machine performs house cleaning, pending job scan, or repository repair, it logs it to the repository
history.
Updated submitters that support Draft tile assembly to add a new line to the start of Draft assembler config files
to work around potential encoding issues.
Fixed some rounding errors in Jigsaw that could occur with region sizes if the background image is a different
resolution than the rendered image.
Fixed some encoding issues that could occur in the assembly config files when submitting Jigsaw renders.
Added an Idle Detection option to only stop a slave when the machine is no longer idle if that slave was originally
started by idle detection.
Fixed a bug when doing path mapping that caused the whole file to be read into memory when there weren't any paths in the Mapped Paths settings in the Repository Options.
Deadline no longer tries to create the deadline.ini file if it doesn't exist, which can happen unintentionally when the deadline.ini file is being updated on Linux or OSX, and cause the file to get wiped.
The Deadline applications no longer cache the machines host name, which can cause problems when running
multiple instances of the Slave on the same machine.
Fixed a bug in how the default font for the Deadline applications was chosen on OSX, which could cause the
shortcuts in the menus to be displayed incorrectly.
Fixed issues with how all the Deadline applications handled startup errors.
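For reference, KeepWebServiceRunning (mentioned earlier in this list) is just a key/value entry in the system deadline.ini file. The sketch below reads it from Python; the [Deadline] section name and the Windows file location shown are assumptions based on a typical install, so adjust them for your setup.

    import configparser

    # Typical Windows location of the system deadline.ini; adjust per platform.
    INI_PATH = r"C:\ProgramData\Thinkbox\Deadline7\deadline.ini"

    parser = configparser.ConfigParser()
    parser.optionxform = str  # keep option names case-sensitive
    parser.read(INI_PATH)

    # When True, the web service application is restarted if it stops or crashes.
    keep_running = parser.get("Deadline", "KeepWebServiceRunning", fallback="False")
    print("KeepWebServiceRunning = " + keep_running)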
Installer Improvements
The Repository installer now ships with default script menu layouts for the Monitor (they are only applied if there aren't existing customizations to the script menus).
Added a backuprepo command line option to the Repository installer to specify if the repository should be
backed up or not (default is true).
Improved the speed of backing up the repository in the Repository installer.
Added command line option to the Windows Client installer to kill Deadline processes before proceeding with
installation.
Fixed an error in the OSX Client installer that occurred when trying to set up the Launcher Login Item in a
headless session.
The Repository installer now sets the label in the mongodb plist file correctly on OSX.
Added option to OSX Client installer to install launcher as a daemon.
The Submitter Installers no longer create empty folders in the Start Menu on Windows.
The Client uninstaller now checks if there is another Deadline installation prior to deleting DEADLINE_PATH,
and if there is, it prompts the user if they want to delete DEADLINE_PATH or change its value to something
else.
The submission script installers now show what DEADLINE_PATH is set to, giving you the option to change it
if it's incorrect.
Fixed some issues with the MongoDB init script that is installed by the Repository installer on Linux that could
cause it to conflict with a previous MongoDB installation.
Fixed how the LAUNCHERSERVICELOCK variable is set in the Launcher init script that is installed by the
Client installer on Linux.
Fixed a bug that caused the modo submitter installer to install into an extra DeadlineModo sub folder.
The DeadlineModoClient.pl script for modo 6xx and earlier is now shipped with the Repository installer again.

End of line characters are now removed when the Repository installer sets up the dbConnect.xml file in the Repository.
The SketchUp submitter installer now works even if SketchUp hasn't been installed on the machine yet.
Fixed the default install paths in the SketchUp submitter installer for SketchUp 7, 8, and 2013.
Added command line option to the Client installer to set the launcher daemon delay setting (Linux and OSX).
Permissions are now set properly on the integrated submission scripts that are installed by the Submitter Installers.
When installing the launcher as a service on Windows, the client installer now grants the SeServiceLogonRight right to the account name so that the service can start properly.
Job Improvements
Added an option to the job scheduling settings to set when the job should stop.
The swap usage of the rendering process is now stored with the task when it is complete, and is also stored in
the job statistics.
Added a job option to specify the rendering progress cut-off for interrupting a job.
Maintenance jobs now take the job's whitelist/blacklist into account when setting the number of tasks for the job.
Added an Enhanced Balancing Logic option for balanced/weighted scheduling options. It's an experimental feature that helps prevent slaves from jumping between jobs.
Added new OutputFilename#Tile? property to the job info file, which will keep track of images for tile jobs (# is for the output index, ? is for the tile index). A small job info file sketch follows at the end of this list.
Added a job property to protect jobs from being deleted or archived.
A history entry is now added to a job before it is deleted (in case the job gets undeleted later).
Fixed some warning messages that could appear when submitting a job that is frame dependent.
Fixed some bugs when submitting jobs with asset dependencies.
Updated the JobTransfer plugin to use the new RepositoryUtils.CreateJobSubmissionFiles function, which ensures that the transferred job's properties are set correctly.
The TransferSubmission script now sets the transfer job name based on the selected job name.
Trailing path separators are now stripped from the output directory when using OutputDirectory# in the job info
file during submission.
Added a Custom job scheduling option which lets you pick specific days to start and/or stop the job, just like in
the Slave Scheduling settings.
When setting the next start or stop date for Daily scheduled jobs, it is now relative to the current date, not the
date that the job was originally scheduled to be started or stopped.
Fixed a Daylight Savings Time related bug that affected job archiving and getting jobs from the REST API.
When archiving a job, any JSON errors are now written to the log.
Added option to use sudo or su on Linux and OSX when rendering the job as another user. Also added the
option to preserve the environment when using sudo or su.
Fixed an Automatic Job Timeout logic bug. Now, if both Automatic Job Timeout options are enabled in the
Repository Options, then both requirements must be met.

A job no longer fails to submit if an empty Username value is set in the job info file. Instead, the current
Deadline user on the machine is used.
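To put the job info file properties mentioned above (OutputDirectory#, OutputFilename#Tile?, Username) in context, here is a minimal, hypothetical manual submission sketch. The two-file deadlinecommand submission call is the standard workflow, but the plugin info keys and all values shown are placeholders; check the keys expected by the plugin you are submitting to.

    import os
    import subprocess
    import tempfile

    # Placeholder job info entries; for tile jobs, per-tile outputs could be
    # listed with the new OutputFilename#Tile? form (e.g. OutputFilename0Tile0=...).
    job_info_lines = [
        "Plugin=CommandLine",
        "Name=Job info file example",
        "Frames=1-10",
        "OutputDirectory0=//server/output/shots/shot010",
        "OutputFilename0=shot010_####.exr",
    ]

    # Placeholder plugin info entries; the required keys depend on the plugin.
    plugin_info_lines = [
        "Executable=/usr/bin/true",
        "Arguments=",
    ]

    def write_temp(lines):
        fd, path = tempfile.mkstemp(suffix=".job")
        with os.fdopen(fd, "w") as f:
            f.write("\n".join(lines) + "\n")
        return path

    job_file = write_temp(job_info_lines)
    plugin_file = write_temp(plugin_info_lines)

    # Submit with: deadlinecommand <job info file> <plugin info file>
    print(subprocess.check_output(["deadlinecommand", job_file, plugin_file]))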
Statistics Improvements
The Slaves Overview report now shows an overview of Available and Active slaves.
The Slaves Overview report now shows overall usage of the render plugins.
The Slaves Overview and Active Slave Stats reports can now be shown for a specific Region.
Added tables to the Slaves Overview report to show Last, Min, Avg, and Max values for the series in the graphs.
All farm reports pages now use splitters so that the graphs can be resized or hidden.
In the Farm Reports dialog, the date range boxes are now formatted to be consistent with how dates are formatted
elsewhere.
Power Management Improvements
When Power Management is starting up Slaves for a job, it now checks the job's Limits, and doesn't start up Slaves if they will exceed the maximum for those Limits.
When starting up Slaves for a job, the list of jobs for the secondary pool scan are now gathered properly.
The maximum number of Slaves that can be started for a specific job is now tracked between the primary and
secondary pool scan, in case the job shows up in both collections.
Fixed a bug in Idle Shutdown that would not shutdown an idle Slave on a machine if there were offline and/or
stalled Slave instances on the same machine.
The thermal shutdown sensor dialog in the Power Management window now ensures that the user enters a host
name or IP address for the sensor.
When setting up Thermal Shutdown, you can now specify Test sensors that can be used to test the Thermal
Shutdown functionality without connecting to real temperature sensors.
Added option to Power Management groups to simply include all slaves in the group (instead of having to add
each one manually).
Launcher Improvements
The option to change the user is now disabled if Deadline is configured to use the system's user.
Fixed a memory leak in the Launcher that occurred when it launched various Deadline applications through it.
When shutting down a Linux machine, the Launcher now tries to use sudo shutdown instead of just shutdown.
The deadlinelauncherservice script on Linux now returns the proper exit code when checking the status of the
Launcher.
Fixed a bug that prevented Idle Detection from working properly on Linux.
The launcher now passes a -service command line argument to the slave if it is running as a service on
Windows. This tells the slave that it is running as a service.
The init.d script (Linux) and launchd script (OSX) now pass a -daemon argument to the launcher.

When the launcher is running as a daemon (Linux and OSX), it will sleep the number of seconds specified for
the LauncherDaemonStartupDelay setting in the system deadline.ini file before starting up any other Deadline
applications. This delay helps ensure that the machine has its hostname set during startup before launching
Slave, Pulse, or Balancer.
The Launcher icon tooltip now shows that the repository is not set when the deadline.ini file doesn't exist.
Monitor Improvements
General
All remote viewer scripts (ARD, Radmin, RDC, and VNC) now use the hostname/IP override if necessary.
When managing the Script menus in the User Group Permissions dialog, scripts that are no longer in the repository no longer show up.
When auto-configuring the performance settings in the Repository Options, a preview of the current and new
performance settings is now shown.
The option to change the user is now disabled if Deadline is configured to use the system's user.
Added Import/Export/Default buttons to the Script Menus page in the Repository Options.
Fixed a bug in the Script Menus page where script items were losing their order when dragged and dropped as
a group.
Fixed a bug in the Script Menus page that could cause the Monitor to crash during some drag and drop operations.
When changing repositories, the Submission and Scripts menus in the Monitor are now updated.
Moved Image Viewer settings to separate page in Monitor options, and added the option to specify command
line arguments for them.
The Client Setup page in the Repository Options now explains that clients can automatically upgrade or downgrade.
Added a pinned filters button above the list in most panels to allow quicker switching between pinned filters.
Tooltips for spinner controls in the Repository Options now show the minimum and maximum values that are
supported.
The default balancing algorithm in the Balancer Settings in the Repository Options now has a Verbose option.
Made a slight improvement to the performance of the list panels.
The size of the box on the left of Repository Options, Plugin Configuration, etc, can now be resized.
The datetime values shown in the Monitor are no longer based on the system's region settings. This was breaking the Monitor datetime filters in some regions.
The first column in all the lists can now be moved using drag and drop.
The Monitor no longer asks for the super user password twice during startup if the Monitor is configured to start
in super user mode.
Fixed inconsistencies in the sort order of the job status column.
Added options to the House Cleaning settings in the Repository Options to disable having the Slaves perform
housecleaning, repository repair, and pending job scan.
Added option to Slave Scheduling groups to simply include all slaves in the group (instead of having to add each
one manually).
Removed the [?] button in the top right corner from a few dialogs, since it isn't used.
Added a green label to the Monitor status bar that makes it clear that you are only seeing your own jobs if you aren't allowed to see other users' jobs. The tooltip for this label explains why you can't see the other jobs.
Fixed a bug in the local slave controls where some options weren't disabled by default if Override Idle Detection Settings was disabled.
Added an option to the View menu to save all pinned layouts to a zip file.
The first tab in each group of tabs is now selected by default when opening the Monitor, instead of the last tab.
The pinned filter button menu for the list controls is now updated properly.
The Monitor is no longer hidden and restored when changing local pinned layouts.
The Execute Command remote control option now respects the IP Address or Host Name override of the machine it's connecting to.
Added user permission option to control if users in a group are allowed to handle protected jobs.
Fixed tooltips and tab order in New/Edit Path Mapping dialog.
The Reports panels in the Monitor now use a monospace font when using the default style (like the Console
panel does).
The Console font in a custom style is now applied properly when the monospace option is disabled.
The Log report row color in the Reports panels is now applied properly when using a custom style.
Line graphs now support panning and zooming, and a right-click option is now available to reset the zoom level.
Individual series in some Line graphs can be shown/hidden from the right-click menu.
Improved the axis labels in many graphs so that they properly represent integer or date/time values.
Fixed a display issue with the Find icon in the context menu for the Console and Reports panels.
Fixed a bug that could prevent jobs exported in one timezone from being imported into another timezone.
Fixed a bug that could prevent the report panel from displaying an error message if it can't connect to Pulse to stream logs (if Pulse log streaming is enabled in the Monitor options).
Fixed how the controls in the Override Idle Detection group box were enabled/disabled when enabling it.
Split the local slave dialog into 3 tabs so that it's not so tall.
Error and log reports now show if the slave was running as a service (Windows only).
Added a new CheckTemperatureSensors.py script that can be used to check the temperature of all sensors in
Power Management.
Added options for a custom viewer name (which is used for the menu item), and whether it supports chunked tasks. If an image viewer supports chunked tasks, the chunked task image viewer dialog won't be shown.
Added a better error message to deal with custom viewers that are pointing to directories instead of files.
When the Monitor is configured to start in super user mode, it no longer hides panels during startup that the user wouldn't normally have permissions to see.
The Monitor no longer loads the Monitor settings twice during startup.
When changing repositories while in super user mode, the Monitor will stay in super user mode if the new repository doesn't have a super user password, or will prompt for the password if it does have one.
Fixed a bug in the logic that determines if the Timeout Multiplier label in Automatic Job Timeout settings in the
Repository Options is enabled or disabled.
When streaming logs from Pulse, the Monitor will now only connect to the Primary Pulse if it is running.

Fixed an error in the Custom Farm Reports when creating a graph with a Time Span value for the group or value
column.
Increased the default width of the Edit Data Columns dialog when creating a Custom Farm Report.
Jobs and Tasks
The job batch expansion arrow in the job list is now always shown in the first column.
The job batch row now shows more information based on the jobs in the batch.
The batch row now shows the number of jobs that are in the batch.
The counts above the job list now show the number of batches, and the selected count now ignores batch rows
so that it properly shows the number of selected jobs.
The counts at the top of the job list now update properly when switching between graph and list displays.
The Job Batch setting in the Job Properties dialog is now called Batch Name.
The initial title for the charts in the job list is now set properly.
Fixed a bug where changing a job's state from the Monitor didn't update the job's Started and Finished date properties.
Fixed an error when updating a job's history after deleting all reports for the job.
When switching between being able to see other users' jobs and not being able to see them, the job counts are now updated properly.
The Scan For Missing Output option is now available for Tile jobs.
The Scan For Missing Output dialog now pulls the colors for the task output rows from the job list's color scheme.
Clicking No on the requeue confirmation in the Scan For Missing Output dialog no longer closes the dialog.
Output for Tile jobs can now be viewed from the Task list in the Monitor.
Added a new dialog to handle the resubmission of Tile jobs.
Fixed a bug in the Scan For Missing Output dialog that would always result in the whole job getting requeued.
Fixed some cases where job batch rows did not disappear properly when all their jobs were filtered out.
Removed an obsolete warning message that could appear when using quick filters in the job list.
Trailing path separators are now removed in the job/task context menu options to Explore Output so that duplicates can be removed properly.
Fixed a bug when whitelisting or blacklisting a slave from the task menu that prevented it from persisting.
The Scheduling page in the Job Properties now has a Custom option, which lets you pick days of the week to
start and/or stop the job.
The job properties dialog will now ask you to pend/release a job if the scheduling settings have changed.
Fixed a bug that prevented the job progress bar in the Monitor from updating when the progress for a single task
job is updated.
Changed the color for normalized render time line in the task render times graph.
Fixed an error in the Job Dependency View when changing the Elided Titles setting.
Slaves and Pulses
When starting Slave machines from the Slave list, the info dialog is now displayed immediately.
The initial title for the charts in the Slave list is now set properly.

The Slave list now shows the name and port of the Pulse instance that a Slave is connected to (it still shows No if it can't connect to Pulse).
The host name or IP address specified in the Pulse settings is now treated as an override. If left blank, the host
name or IP address shown in the Pulse list will be used (depending on the Pulse setting to use Pulse's IP address
in the Repository Options).
If a host name or IP address override is specified in the Pulse settings, it is now used when the Slaves connect to
Pulse, and for remote commands.
A MAC address override can now be specified in the Pulse settings.
Added Pulse remote commands to Monitor to perform housecleaning, pending job scan, repository repair, and
power management check.
Removed the filtered utilization calculation from the slave list because it impacted performance, and wasn't really that useful.
Fixed some display issues in the Slave list for slaves that were in the Starting Job state.
Fixed an onDataChanged error in the Slave list.
Removed some debug logging when starting slaves via Remote Control.
Disabled slaves now show their actual state in parentheses in the Status column in the slave list.
Disabled slaves are no longer included in the utilization values in the slave list.
Slaves that are starting a job are now included in the utilization values in the slave list.
When sending remote commands from the Monitor to the Slave, only the slave's postfix is sent, instead of the full slave name. This prevents slaves from starting up with the wrong slave name if the machine's host name changes.
When sending remote commands from the Monitor to the default slave instance on the machine, the ~ character
is used to represent the default instance (since the default instance has no postfix).
The CPU Affinity settings in the Slave Properties now mention that it's only supported on Windows and Linux.
Limits
Added a button to the Limit List Control to add a new Limit, which works the same as the existing right-clicking
option.
The list labels in the Limit Dialog are now set correctly when editing a Machine-level Limit.
Balancers and Cloud
Cloud Regions can now be renamed.
Added a Scripts menu to the Balancer list's right-click menu.
Added Balancer license information to the Balancer list.
Added Balancer remote commands to Monitor to perform balancing.
A host name or IP address override can now be specified in the Balancer settings, which is used for remote
control.
A MAC address override can now be specified in the Balancer settings.
Added a button and a right-click option in the Cloud Panel to add a new instance.
Some right-click options in the Cloud Panel are now asynchronous.
Fixed a bug that affected the updating of the Cloud panel.
Permissions for the cloud panel are now open by default.

Slave Improvements
Slaves now report their Network I/O, Disk I/O, and Swap usage, which can be viewed from the Monitor.
Fixed a bug that could cause the Slave to lock up when registering new fonts on Windows.
The Slave UI now shows the name and port of the Pulse instance that the Slave is connected to (it still shows
No if it can't connect to Pulse).
Fixed an Access Denied error that could occur when rendering as a user on Windows.
On Windows, a warning is now printed if the rendering process cannot be assigned to a job object instead of
failing the render. Note that this is only an issue on Windows 7 and earlier.
The Region name is now shown in the Slave UI.
The Slave now sets its Slave name when updating a requeue report.
Fixed a bug that prevented the Slave on Linux from getting the output image size correctly after it finished
rendering a task.
On Linux and Mac OSX, a SIGKILL signal is now sent to the rendering process when cancelling a task if it
doesn't shut down gracefully.
When finished rendering a task from a Tile job, the Slave now sets the output image file size for the task.
The slaves now report memory usage for a task more reliably on Linux.
Fixed an error on Windows when mapping drives to a remote path with / as the path separators.
Disabled slaves no longer perform house cleaning, pending job scan, or repository repair operations.
A slave now only triggers slave events when its state actually changes. Previously, the slave would trigger the
OnSlaveIdle event repeatedly when it didn't find any jobs to render, even if it was idle before looking for a job.
When the slave is shutting down, it skips the gathering of system info (cpu, ram, swap, network I/O and disk
I/O) when reporting the slave state because it can significantly slow down the shutdown of the slave.
The slaves now only try to connect to the Primary Pulse if it is running.
Pulse Improvements
Fixed a few bugs with how Slave names were processed by the web service (for example, there were issues with
case sensitivity).
Added functions to the REST API to get the contents of job, task, and Slave reports.
The confirmation dialog shown when shutting down Pulse now mentions if Pulse is the Primary.
The Region name is now shown in the Pulse UI.
Balancer Improvements
Added a text box to the Balancer UI that shows information from the previous balancing operation.
Added Balancer license information to the Balancer UI.
Added an option to change the license server from the Balancer's File menu.
The Balancer system tray icon is now hidden when the Balancer is closed.
The Balancer now responds to remote shutdown requests properly.
The Balancer UI and logs now show which logic plugin the Balancer is using.
The confirmation dialog shown when shutting down Balancer now mentions if Balancer is the Primary.
The Region name is now shown in the Balancer UI.
Added icon to the Perform Balancing menu item in the Balancer UI.
The primary balancer now tries to pull a license immediately after connecting to the repository. This ensures
that the license information in the balancer GUI is correct when it pops up.
The primary balancer will now check in its license if it is switched to standby mode while it's running.
The primary balancer will now explicitly check in its license when it is shut down.
The Balancer now shows regions as disabled if they are disabled or if they are disabled specifically for the
Balancer.
Fixed a bug that could cause the Slave from a terminated instance to still show up in the Slave list in the Monitor.
Command Improvements
Added RemoveCloudRegion command line option to remove a cloud region.
Fixed the help text for the CreateCloudRegion command.
Fixed the deadlinecommand shell script to properly set the LD_LIBRARY_PATH on Linux.
More job properties are now supported by the GetJobSetting and SetJobSetting commands.
Added GetJobExtraInfoKeyValue and SetJobExtraInfoKeyValue commands to get/set key/values in the job's Extra Info dictionary (see the sketch after this list).
Added GetJobPluginInfoKeyValue and SetJobPluginInfoKeyValue commands to get/set key/values in the job's Plugin Info dictionary.
Added an AppendJobFrameRange command to append frames to a job without affecting the job's existing tasks.
Improved stdout output when using commands that change a job's state (like RequeueJob).
The help messages for the SubmitMultipleJobs and Multi commands are now consistent.
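The new key/value and frame-range commands lend themselves to scripting. Below is a minimal sketch that drives them through deadlinecommand from Python; it assumes deadlinecommand is on the PATH and that the arguments are ordered job ID, key, and value, and the job ID shown is hypothetical.

    # Minimal sketch (assumptions noted above): calling the new deadlinecommand
    # options from Python via subprocess.
    import subprocess

    DEADLINE_COMMAND = "deadlinecommand"      # assumed to be on the PATH
    JOB_ID = "5553b4a7a3b0c43550e47a0c"       # hypothetical job ID

    # Store a custom key/value pair in the job's Extra Info dictionary.
    subprocess.check_call([DEADLINE_COMMAND, "SetJobExtraInfoKeyValue",
                           JOB_ID, "ShotName", "sq010_sh020"])

    # Read the value back.
    value = subprocess.check_output([DEADLINE_COMMAND, "GetJobExtraInfoKeyValue",
                                     JOB_ID, "ShotName"])
    print(value.decode().strip())

    # Append frames 101-110 to the job without affecting its existing tasks.
    subprocess.check_call([DEADLINE_COMMAND, "AppendJobFrameRange",
                           JOB_ID, "101-110"])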
Web Service Improvements
A standalone web service application is now included with Deadline.
Fixed a bug that could cause the web service to lock up on Linux and OSX.
Added REST API functions to get the Deadline version.
Added REST API functions to delete Pulse and Balancer instances.
Added REST API functions to perform path mapping.
Added a States parameter to the jobs API to get jobs in the specified state(s). It accepts a comma-separated list of states (see the sketch after this list).
Added REST functions to get the report contents for a job.
The web service now returns status code 500 when a web service script throws an error.
Web service scripts can now return a status code, as well as additional headers.
Added a REST function to append frames to a job without affecting the job's existing tasks.
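To illustrate the new States parameter, here is a minimal sketch of a query against the jobs REST endpoint; the host, port, and endpoint path are assumptions and should be adjusted to match your Web Service or Pulse configuration.

    # Minimal sketch (host, port, and endpoint path are assumptions): query jobs
    # in one or more states using the States parameter described above.
    import json
    import urllib.request

    WEB_SERVICE = "http://webservice-host:8082"     # hypothetical address
    url = WEB_SERVICE + "/api/jobs?States=Active,Suspended"

    with urllib.request.urlopen(url) as response:
        jobs = json.loads(response.read().decode("utf-8"))

    print("Matching jobs:", len(jobs))
    for job in jobs:
        print(job)      # each entry is a dictionary describing one job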
Scripting Improvements
The RepositoryUtils.CheckPathMappingInFileAndReplace() function no longer loads the entire file into memory for path mapping.
Popup detection in the Application Plugins now works on Qt popups and Windows 8 mobile popup dialogs.
Invalid DateTime values in Deadline objects passed via the standalone Python API no longer cause errors, and
instead are set to the minimum DateTime value.
Right-click scripts for the Balancer list in the Monitor can now be created.
Added new ReplaceFrameNumberWithPadding and ReplaceFrameNumberWithPrintFPadding functions to
FrameUtils.
Added functions to the standalone Python API to get the contents of job, task, and Slave reports.
Fixed the RemoveSlavesFroLimitGroupList function typo in RepositoryUtils.
ProcessUtils.IsProcessRunning() is now more reliable on OSX.
Added functions to JobUtils to test dependencies.
Added new Power Management events.
Added script API functions to modify a job's bad Slave list.
Added a GetEventDirectory() function to the event plugin class, which returns the directory path for the current
event plugin.
Added new API functions to get selected Pulse and Balancer settings objects in the Monitor.
Added Standalone Python API functions to get the Deadline version.
Added Standalone Python API functions to delete Pulse and Balancer instances.
Added functions to PathUtils to register or unregister a list of fonts (Windows only).
Added a RepositoryUtils.CreateJobSubmissionFiles function to create the submission files from the job.
Added new MappedPaths module to standalone Python API to perform path mapping.
Added a RepositoryUtils.GetJobsInState function to get jobs that are in the specified state(s).
Added Jobs.GetJobsInState and Jobs.GetJobsInStates functions to the standalone Python API to get jobs in specific states (see the sketch after this list).
Added JobBatchName to Job object in the script API.
Removed use of deprecated JobUtils and ScriptUtils functions from the Monitor scripts that ship with Deadline.
Added FrameUtils.ReplacePaddingWithFrameNumber() function to Script API.
Added standalone Python functions to get the report contents for a job.
Added IsRunningAsService() function to DeadlinePlugin to check if the slave is running as a service on Windows.
Added Limit properties LimitCurrentHolders, LimitInUse, and LimitStubLevel to scripting API.
Added RepositoryUtils.GetPowerManagementOptions function to scripting API.
Added PowerManagementGroup and PowerManagementOptions classes to scripting API.
The SetIniFileSetting function no longer changes the order of sections and keys in the ini file.
Added RepositoryUtils.GetPathMappings() function to get all the path mappings for the current OS and region.
Added a RepositoryUtils.AppendJobFrameRange() function to append frames to a job without affecting the job's existing tasks.
Added a standalone Python function Jobs.AppendJobFrameRange() to append frames to a job without affecting the job's existing tasks.
Comments are now supported in the param and options files for the plugins. A ";" or "#" can be placed at the start of a line to comment it out.
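As a rough illustration of the new standalone Python API calls listed above, the following sketch connects to the Web Service and uses Jobs.GetJobsInState and Jobs.AppendJobFrameRange; the connection details and job ID are hypothetical, and the exact parameter formats (state names, frame-range string, return types) are assumptions.

    # Minimal sketch (connection details, job ID, and parameter formats are
    # assumptions) of the new standalone Python API functions.
    from Deadline.DeadlineConnect import DeadlineCon

    conn = DeadlineCon("webservice-host", 8082)     # hypothetical host/port

    # Get jobs in a specific state (state names such as "Active" are assumed,
    # and the call is assumed to return a list of job descriptions).
    active_jobs = conn.Jobs.GetJobsInState("Active")
    print("Active jobs:", len(active_jobs))

    # Append frames to a job without affecting its existing tasks.
    job_id = "5553b4a7a3b0c43550e47a0c"             # hypothetical job ID
    conn.Jobs.AppendJobFrameRange(job_id, "101-110")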
Application Plugin Improvements
3ds Cmd Improvements
Added support for 3ds Max 2016.
Added Backburner system environment PATH checks to the 3dsCmd plugin.
Fixed an FTrack bug in the 3dsCmd integrated submitter.
Version info for 3dsmaxcmd.exe and 3dsmax.exe executables are now logged during rendering.
Added a new sanity check to the integrated submitter.
Updated version dictionary in 3dscmd.py.
3ds Max Improvements
Added support for 3ds Max 2016.
Many Corona renderer properties can now be modified from the Monitor after the job has been submitted.
Increased the communication timeout between Deadline and 3ds Max, which greatly reduces the occurrence of
timeout errors.
Lowered the timeout for customise.ms from 10 minutes down to 1 minute.
Improved the reliability of the Kill ADSK Comm Center feature.
Added Qt popup handling.
Fixed a bug with .mxp path config files for some submission modes.
The contents of the temporary dl.ini file are now printed to the render log.
Fixed a bug in SMTD that allowed you to click the Submit button before SMTD finished loading.
A warning message is now logged if the installed Backburner version doesn't match the version of 3ds Max being rendered with.
Sub State-Sets and Scripted State-Sets now supported in SMTD.
Unified color scheme in SMTD.
The blacklist/whitelist in SMTD now shows slave states by coloring the items in the list.
The Batch Name for jobs is now supported for all job type submissions in SMTD.
Updated support for some Corona advanced options that recently changed.
Fixed a bug with Quicktime job submission in SMTD.
Added a popup handler for the ADSK license dialog that appears when you borrow a license.
Added popup handler for Populate dialog.
Fixed some Tile/Jigsaw related bugs.
Added MAXScript Debugger popup ignorer.
SMTD now sets the tile output paths for a Tile job so that they can be viewed from the Monitor.
Fixed a bug in SMTD that prevented a couple Shotgun checkboxes from working properly.
Fixed how SMTD set the job batch name when using the Create/Upload checkboxes.
Added new sanity checks to SMTD.
Added customisable ExtraInfo MAXScript $tokens to SMTD.
Added an extra LogInfo line so that a crashing 3ds Max error/log report shows whether the slave is running as a service.
Added V-Ray & Corona VFB override checkbox option in SMTD.
Added V-Ray & Corona VFB enable/disable checkbox, only active if override option is enabled in SMTD.
Updated customize.ms to use DeadlineUtil.WarnMessage for warning messages.
Added support for V-Ray Image Sampling - Render Mask Type = current scene selection.
Updated V-Ray advanced renderer maxscript properties to support latest V-Ray v3.15 / nightly builds.
Updated iray advanced renderer maxscript properties to support most recent iray features introduced in 3dsMax
2014 onwards.
Added read-only labels to 3dsmax submission via SMTD to display the final, assembled image resolution when
tile/region/jigsaw rendering.
Changed the "V-Ray VFB [Region] button is enabled" sanity check from #fail to #fix.
Added a couple Maxwell popup ignorers.
Updated version dictionary in 3dsmax.py.
After Effects Improvements
Changed the wording of "Number of Tasks" to "Number of Machines" in the multi-machine rendering settings.
Enabling multi-machine mode in the Monitor submitter now disables the local rendering option.
Fixed a text cutoff issue in the integrated After Effects submitter.
Anime Studio Improvements
Added support for Anime Studio 11.
The submitter can now parse the Layer Comps from the new .anime and .animeproj scene files that were introduced in Anime Studio 11.
Arnold Standalone Improvements
Added path mapping support to the contents of the Arnold .ass files.
Added progress reporting to the Arnold plugin.
The Command Line field in the Monitor submitter is now sticky.
Cinema 4D Improvements
Added a stdout handler to catch the "The output resolution is too high for the selected render engine" error message.
Composite Improvements
Added support for Composite 2016.
Corona Improvements
Added support for Corona distributed rendering.
Updated the Corona icon.
Draft Improvements
Updated Draft to version 1.3.2.58232 (requires a new Draft 1.3 license).
Added the Use Shotgun Data button and its functionality to the job right-click Draft submission script.
Fixed a bug caused by trailing backslashes in paths being passed as command line arguments to Draft.
Fixed some bugs in the Draft Tile Assembler.
Updated the Draft Assembly plugin to use new Draft functions, which reduces memory usage and improves performance.
Fusion Improvements
Re-added the Submission and Job scripts to submit Fusion Quicktime jobs to Deadline (these were removed in Deadline 6).
Fusion Quicktime jobs are now submitted to the Fusion plugin, instead of having their own QuicktimeFusion
plugin.
The Fusion plugin now pulls from the Fusion render log correctly when Fusion.exe is chosen as the render
executable.
Fixed a bug in the integrated submitter that would prevent job submission from working for certain versions of
Fusion 7.x.
Hiero Improvements
When submitting a sequence with a custom in/out time, the integrated Hiero submitter now sets the end frame
properly.
Houdini Improvements
Added support for Houdini 14.
Added support to the integrated submitter for submitting Wedge ROP jobs.
Submission settings in the integrated submitter now get saved in the Houdini scene file so that settings are sticky
between individual scenes.
Removed auto-detection of Houdini install path in the Houdini submitter installer due to various bugs.
Fixed a bug with how the Houdini submitter installer installed the submission script on OSX.
Fixed an ftrack bug in the integrated submitter.
The integrated submitter now collects all ROPs in the scene, not just the ones in /out.
Added Tile rendering support when submitting from the integrated or Monitor submitters.
Improved path mapping support by using the HOUDINI_PATHMAP environment variable (see the sketch after this list).
Path mapping is now enabled by default in the Houdini plugin.
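For reference, the HOUDINI_PATHMAP mechanism mentioned above boils down to an environment variable containing a dictionary of path replacements. The sketch below shows the general idea only; the mapping syntax and example paths are assumptions for illustration, and in practice the Deadline plugin builds this value from the Repository's path mapping rules.

    # Minimal sketch (syntax and paths are assumptions): the kind of value the
    # plugin sets in HOUDINI_PATHMAP so Houdini remaps file paths at load time.
    import os

    os.environ["HOUDINI_PATHMAP"] = '{ "/mnt/projects" : "P:/projects" }'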
LuxRender Improvements
Added support for LuxSlave distributed rendering (works like the VRay Spawner and modo Distributed Rendering plugins).
Updated the default executable paths in the plugin configuration.
Mantra Standalone Improvements
Added support for Houdini 14.
Added Tile rendering support when submitting from the Monitor submitter.
Improved path mapping support by using the HOUDINI_PATHMAP environment variable.
Fixed a path mapping error that affected Houdini's crowd feature.
Path mapping is now enabled by default in the Mantra plugin.
Maxwell Improvements
Added support for Maxwell 3.1's new Extra Sampling options, which can be overridden in the Maxwell submitter.
Fixed an error when submitting Maxwell jobs that prevented them from being submitted.
The Minimum Sampling Level value is now 0 instead of 1.
Stripped out a bunch of unnecessary plugin info settings that were set by the submitter when submitting a merge
job.
Added option to Monitor submitter to specify the camera to render.
Maya Improvements
Added support for Maya 2016.
Added the out of order frame range and preview job options to the integrated Maya submitter.
Added configurable submission properties for the different kinds of dependent export jobs that can be submitted
from Maya.
Added additional Vrimg2Exr options to Maya submitter (when submitting a dependent vrimg2exr job).
VRay render elements can now be viewed from the Monitor when the job is submitted with the integrated Maya
submitter.
The Maya submitter now sets the tile output paths for a Tile job so that they can be viewed from the Monitor.
Fixed some bugs in Jigsaw for Maya.
The integrated submitter can now export Arnold .ass files locally and then submit an Arnold Standalone job to
render them.
The Jigsaw window now respects the selected camera in the integrated submitter.
Added option to the integrated submitter to submit a Maxwell export job to Deadline, as well as a dependent
Maxwell standalone job.
Added option to the integrated submitter to perform a local Maxwell export, and then submit a Maxwell standalone job to render the exported files.
The Command Line field in the Monitor submitter is now sticky.
The Jigsaw regions are now rendered properly when using Arnold.
Arnolds MergeAOV setting is now respected when Tile or Jigsaw rendering.
Fixed a bug with getting bounding boxes for Jigsaw when using a vertical camera.
The Threads setting is now disabled in the Monitor submitter if it isn't supported by the selected renderer.
Some settings in the Monitor submitter are now disabled when the File renderer is selected.
Added support for Renderman RIS for Maya.
When exporting Renderman RIB files in the integrated Maya submitter, there is now an option to set the Renderman renderer to RIS.
When exporting Renderman RIB files for a scene with layers, a dependent PRMan job is now submitted for
each layer.
Fixed offset issues when rendering tile and Jigsaw renders with Renderman for Maya.
Fixed inverted assembly issue when rendering animation tile jobs with Renderman for Maya.
Tile and Jigsaw rendering now work with render layers when using Renderman for Maya.
Fixed a bug that could cause Jigsaw animation jobs to not submit a dependent assembler job.
Jigsaw animation jobs now respect the frame list override when overriding layer settings during submission.
Draft Assembly config files are now created in the layer folder that the output is saved to. Before, they were
saved to the root image folder.
Removed some debugging print statements from the integrated Maya submitter.
Path mapping is now enabled by default in the MayaCmd and MayaBatch plugins.
Fixed the install path for the Maya integrated submitter installer on OSX.
Removed all frame borders from the integrated Maya submitter (since they are deprecated in Maya 2016).
Fixed an overlap issue with the integration Connect button in the Maya integrated submitter.
modo Improvements
Added path mapping support for assets in the modo scene file, and for the render output paths.
Added submission option to submit each render pass group as a separate job.
Native modo dialogs are now used for info, errors, and yes/no questions.
The local scene file warning is now only shown when the scene file is not being submitted with the job.
Added support for VRay for modo.
Fixed a typo in the description of the Geometry Cache Buffer setting in the modo plugin configuration.
The output format is now sticky in the modo submitter for the Monitor.
The modo submitter now sets the tile output paths for a Tile job so that they can be viewed from the Monitor.
Fixed a typo for one of the tabs in the modo submitter for the Monitor.
Added Output Override settings to the integrated modo submitter, which let you render to the Layered PSD or
EXR formats.
The browser buttons in the integrated submitter no longer clear their corresponding values if the user cancels
the browser window.
Fixed a bug that prevented Draft assembly from working when submitting modo Tile renders from the Monitor if a layered EXR format was not selected.
Added support for a modo sanity check script, which can be created in the submission/Modo/Main folder in the
Repository.
Nuke Improvements
Added support for Nuke Frame Server distributed rendering (for Nuke Studio).
Added support for sequence/container submission in Nuke Studio.
Added option to integrated submitter to submit only write nodes within precomp nodes.
Added option to integrated submitter to render the precomp nodes first.
Fixed an issue with how the Nuke integrated submitter handles write nodes with embedded TCL in the output path.
Fixed the Nuke integrated submitter to evaluate embedded TCL properly before checking for frame padding.
The integrated submitter now properly detects if Gizmos are selected when submitting the selected write nodes
only.
Octane Improvements
The Octane submitter can now parse render targets from Octane 2 .ocs files.
The Octane submitter now handles .ocs parsing errors better.
Ply2Vrmesh Improvements
Added support for handling multiple frames.
Added option to merge the outputs.
PRMan Improvements
The PRMan plugin now supports rendering RIB files with layers in the file name.
REDLine Improvements
Added support to REDLine for using RMD files for metadata, in addition to the existing RSX option.
Rhino Improvements
Added Qt popup handling.
Fixed a bug in the Rhino submitter.
The Rhino submitter now sets the tile output paths for a Tile job so that they can be viewed from the Monitor.
Fixed some layout issues in the Rhino submitter.
Tweaked Tile Rendering labels in the submitter, and added a label that explains that tile rendering is disabled
when submitting from the Monitor.
RIB Improvements
The RIB plugin no longer fails 3delight renders when they print error reporting to stdout.
RVIO Improvements
Re-added right-click job script for RVIO submission.
Softimage Improvements
The Softimage submitter now sets the tile output paths for a Tile job so that they can be viewed from the Monitor.
VRay Standalone Improvements
When rendering separate vrscene files per frame, the frame padding is now added to the output file name.
Path mapping is now enabled by default in the VRay plugin.
VRay DBR Improvements
Added support for VRay DBR for 3ds Max 2016 and Maya 2016.
The Monitor VRay Spawner submitter now defaults to the "none" pool like the other submitters.
The minimum value for the maximum number of servers in the Monitor submitter is now 1 instead of 0.
The Version and Port settings can now be seen in the job properties for VRay Spawner jobs.
Updated the default TCP ports for 3ds Max VRay and 3ds Max VRay RT.
The port number in the integrated submitter is now disabled after the render begins.
Updated the VRay Spawner Monitor submitter to add support for Cinema 4D.
The Monitor submitter now hides the port setting if it isn't applicable for the selected application.
The log box in the submitter for 3ds Max now has colored text.
Fixed a regression in the submitter for 3ds Max.
Added Check ALL, INVERT & None button to the submitter for 3ds Max to allow easy Active Server List
selection.
Added ability to select which Slaves are used for DBR in the submitter for 3ds Max. Disabled Slaves will still
continue to run the spawner job until it is deleted or completed.
Added a couple Maxwell popup ignorers when rendering with VRay for 3ds Max.
Fixed the install path for the Maya VRay DBR integrated submitter installer on OSX.
Increased the width of the Maya VRay DBR submitter, and removed all frame borders from the UI (since they
are deprecated in Maya 2016).
VRimg2Exr Improvements
Added additional submission options: Separate Files, Multi Part, Long Channel Names, and Threads.
Vue Improvements
Added support for Vue 2015.
Default Vue executable paths for Vue 2014 and 2015 now include the path to the Vue PLE executable.
Event Plugin Improvements
Draft
Fixed a bug where Pool and Group were switched in the Draft event plugin.
Fixed the Draft event plugin to pass on the contents of the DraftExtraArgs key-value pairs to the Draft job.
ftrack
You can now create new Assets from Deadline's ftrack UI.
The Asset list in the UI will now list all Assets belonging to the selected Task's parent (as opposed to only assets already tied to that Task).
The ftrack event plugin now uses a relative path to load the ftrack API.
The ftrack event plugin no longer adds the ftrack API path to sys.path if it's already there.
Fixed a bug in the ftrack event plugin where it would still try to create a thumbnail after determining that it shouldn't.
If there is only one output file, Deadline now creates the default "main" component instead of "Deadline_Output_0".
Upgraded the ftrack API.
Shotgun
The Shotgun event plugin now uses a relative path to load the Shotgun API.
Cleaned up some logging in the Shotgun event plugin.
Added an option to the Shotgun event plugin to specify the character that should be used for frame padding when uploading paths to Shotgun (the default is "#").
Cloud Plugin Improvements
Amazon EC2 Cloud Plugin Improvements
Added support for Security Groups.
Tags are now copied over when cloning an existing instance.
Fixed a typo in a log message.
Added support for user data.
Updated boto library, which added support for new AWS regions.
Google Cloud Plugin Improvements
Fixed a bug in the Google Cloud plugin that caused it to only show the first 500 instances in the Monitor.
Improved the tooltip for the Project ID field in the plugin configuration.
Set a minimum and maximum value for the Port field in the plugin configuration.
Cloning an existing instance now copies the persistent disk size.
Added support for starting and stopping instances.
A Save File dialog is now used when setting the path to the OAuth File in the plugin configuration.
Removed unnecessary deletion of disks in TerminateInstances because all instances the Balancer starts have
autodelete on.
Added regions that were missing.
Numbers can now be used in instance names (note that the first character in the instance name must be a
lowercase letter).
OpenStack Cloud Plugin Improvements
Added support for Security Groups.
Added support for Key Names.
Metadata is now copied over when cloning an existing instance.
Added a Region Name option to the OpenStack cloud plugin configuration.
Updated the version of the libcloud library that the OpenStack cloud plugin uses.