© 2015 SAS IP, Inc. All rights reserved. Unauthorized use, distribution or duplication is prohibited.
ANSYS, ANSYS Workbench, Ansoft, AUTODYN, EKM, Engineering Knowledge Manager, CFX, FLUENT, HFSS, AIM
and any and all ANSYS, Inc. brand, product, service and feature names, logos and slogans are registered trademarks
or trademarks of ANSYS, Inc. or its subsidiaries in the United States or other countries. ICEM CFD is a trademark
used by ANSYS, Inc. under license. CFX is a trademark of Sony Corporation in Japan. All other brand, product,
service and feature names or trademarks are the property of their respective owners.
Disclaimer Notice
THIS ANSYS SOFTWARE PRODUCT AND PROGRAM DOCUMENTATION INCLUDE TRADE SECRETS AND ARE CONFIDENTIAL
AND PROPRIETARY PRODUCTS OF ANSYS, INC., ITS SUBSIDIARIES, OR LICENSORS. The software products
and documentation are furnished by ANSYS, Inc., its subsidiaries, or affiliates under a software license agreement
that contains provisions concerning non-disclosure, copying, length and nature of use, compliance with exporting
laws, warranties, disclaimers, limitations of liability, and remedies, and other provisions. The software products
and documentation may be used, disclosed, transferred, or copied only in accordance with the terms and conditions
of that software license agreement.
For U.S. Government users, except as specifically granted by the ANSYS, Inc. software license agreement, the use,
duplication, or disclosure by the United States Government is subject to restrictions stated in the ANSYS, Inc.
software license agreement and FAR 12.212 (for non-DOD licenses).
Third-Party Software
See the legal information in the product help files for the complete Legal Notice for ANSYS proprietary software
and third-party software. If you are unable to access the Legal Notice, contact ANSYS, Inc.
ANSYS Release 17.0 - © SAS IP, Inc. All rights reserved. - Contains proprietary and confidential information of ANSYS, Inc. and its subsidiaries and affiliates.
Remote Solve Manager R17.0 Tutorials
Configuring Remote Solve Manager (RSM) to Submit Jobs to a
Microsoft HPC Cluster
Important
The instructions in this tutorial apply only when configuring RSM for direct job submission
from ANSYS Workbench. If you will be submitting jobs to RSM via an EKM Portal, refer to
Integrating EKM with Remote Solve Manager (RSM) in the EKM Administration Guide instead.
Introduction
This tutorial steps you through the configuration of ANSYS Remote Solve Manager (RSM), Solvers, and
Workbench so that solve jobs can be submitted to a Microsoft HPC 2008 or 2012 Server cluster via RSM.
In this tutorial, RSM is configured using the Remote Solve Manager Setup Wizard. For a quick-start
guide on using the wizard, select Start > All Programs > ANSYS 17.x > Remote Solve Manager >
Readme - RSM Setup Wizard 17.x.
Assumptions
These instructions assume the following:
• You have installed and configured a Microsoft HPC Server, and the compute nodes can access the cluster
head node. If Microsoft HPC is not configured properly, contact Microsoft for support before you attempt
to install ANSYS applications.
You can access a Getting Started Guide for Windows HPC Server at the following locations:
• You are a local administrator of the Microsoft HPC cluster and know how to share directories and map network
drives. If you do not know how to perform these tasks, contact your Systems Administrator for assistance.
You can also access help from the Start menu on your desktop.
• You know the machine name of the head node on the Microsoft Server HPC cluster.
• You are able to install and run ANSYS, Inc. products, including Licensing on Windows systems. For information
on installation and licensing, see the tutorials on the Downloads menu of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support Contacts
page of the ANSYS Customer Portal and submit a support request.
2. Ensure that the Microsoft HPC user account has Read & Execute permissions for this directory. Typically,
it is sufficient to add DOMAIN USERS to the list of users that have access to submit jobs to the compute
cluster.
When using the ANSYS installer to install a solver (Fluent, CFX, Mechanical, Polyflow), RSM and Workbench
will also be installed.
• Go to Start > All Programs > ANSYS 17.x > Remote Solve Manager.
• Right-click on RSM Setup Wizard 17.x and select Run as Administrator from the context menu.
Configuring RSM on the Cluster Head Node
2. Click Next. Complete the steps presented by the wizard, using the sections that follow as a guide.
3. To allow for auto-configuration of Workbench, leave Configure ANSYS Workbench when starting RSM
services checked.
4. Click Next. If you opted to configure ANSYS Workbench when starting RSM services, the HPC Setup
Prerequisites page will prompt you to cache your password with HPC:
2. On the Select a Compute Server screen, select Define a New Compute Server, then click Next.
3. On the Identify Machine screen, enter a Machine Name or IP Address for the server. This must be the
actual computer name or IP address of the head node. In this example we’ll enter headnode.
4. Enter a Display Name for the server. This can be any name that makes sense for you. In this example we’ll
enter MS Compute Server.
5. On the Set Cluster Information screen, specify whether you want to run jobs from a network share or
from the local disk, then click Next. In this example we’ll select Network Share.
6. On the next Set Cluster Information screen, enter the UNC path for your Shared Cluster Directory. This
is the directory that is shared out to all the cluster nodes from the head node. In this example we’ll use
the shared Temp directory as the shared cluster directory, so we’ll enter \\Headnode\Temp as our path.
Click Next.
7. On the Job Submission Settings screen, specify the Maximum Number of Jobs that can run concurrently
on this Compute Server, then click Next.
8. On the Save Compute Server Settings screen, select Yes, save all changes to save your Compute
Server settings, then click Next.
9. On the Set up Compute Server screen, specify whether you want to auto-configure Compute Server
directories. In this example we’ll select Yes, automatically configure directories, then click Next.
10. On the Additional Compute Servers screen, specify whether you want to create or modify another
Compute Server. In this example we’ll select No, then click Next.
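A valid Shared Cluster Directory path (step 6 above) must be a UNC path naming a host and at least one share component, such as \\Headnode\Temp. As an illustrative sketch only (the helper name is made up and is not part of RSM), such a path can be sanity-checked like this:

```python
def is_unc_path(path: str) -> bool:
    """Rough check that a string looks like a UNC path, e.g. \\\\Headnode\\Temp."""
    # A UNC path starts with two backslashes, then a host name,
    # then at least one non-empty share component.
    if not path.startswith("\\\\"):
        return False
    parts = path[2:].split("\\")
    return len(parts) >= 2 and all(parts)

print(is_unc_path("\\\\Headnode\\Temp"))  # True
print(is_unc_path("C:\\Temp"))            # False
```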
Adding a Queue
1. On the Define Queues screen, select Yes to define a new queue or modify an existing one, then click Next.
2. On the Select Queue screen, specify whether you want to create a new queue or modify one already in
the list. In this example we’ll select Define a new Queue, then click Next.
3. On the Queue Information screen, enter a Name for the queue. In this example we’ll enter MS Compute
Cluster Queue. The Compute Server you added previously (MS Compute Server, in this example)
appears in the list of Compute Servers. Select its check box to assign it to the new queue. Click Next.
4. On the Additional Queues screen, specify whether you want to define or modify another queue. In this
example we’ll select No, then click Next.
Defining Accounts
1. On the Define Accounts screen, specify whether you want to define new accounts or modify
passwords. In this example we’ll select Yes, then click Next.
2. On the Select Account screen, select an existing account to modify or specify that you want to define a
new account. In this example we’ll select Define a new account, then click Next.
3. On the Define Account screen, enter the Username and Password that you use to log into your Windows
machine, then confirm your password. Click Next.
4. On the Define More Accounts screen, specify if you want to define more accounts. In this example we’ll
select No, then click Next.
If the test succeeds, the Test Status will be Finished. If the test fails, the Test Status will be Test
Failed. Review the steps to make sure that you followed them all correctly. You can also check
Troubleshooting RSM Issues (p. 16) for information on adding firewall ports and so on.
3. Click Next.
Troubleshooting RSM Issues
4. Right-click on the log in the lower right pane and select Save Job Report.
Solution: Make sure that the RSM services on the manager machine (in other words, the head node)
were started as Administrator.
For Windows, you must either have Windows administrative privileges on the Solve Manager, have RSM
administrative privileges (as a member of the RSM Admins (p. 17) user group), or launch the RSM Admin
by right-clicking on it and selecting Run as administrator.
1. Log in as Administrator.
2. On the machine where RSM is set up, open a Command Prompt and change the directory (cd) to
C:\Program Files\Ansys Inc\v17x\RSM\bin.
4. In the New Group dialog, enter RSM Admins as the Group Name and add members by clicking Add.
5. In the Select Users, Computers, Service Accounts, or Groups dialog, type a user name in the editing
window and then click Check Names to search for a matching name in the current domain. When found,
the user name will be displayed in full syntax in the editing window.
Replace the x with the point version number of your R17 product. For example, add ports 8170 and
9170 for version 17.0, and 8171 and 9171 for version 17.1.
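The port-numbering pattern described above (817x and 917x, where x is the point version) can be expressed as a small helper. This is an illustrative sketch only; the function name is made up:

```python
def rsm_firewall_ports(point_version: int) -> tuple:
    """Return the two RSM firewall ports for release 17.x, following the
    pattern above: 8170/9170 for 17.0, 8171/9171 for 17.1, and so on."""
    if not 0 <= point_version <= 9:
        raise ValueError("expected a single-digit point version")
    return (8170 + point_version, 9170 + point_version)

print(rsm_firewall_ports(0))  # (8170, 9170)
print(rsm_firewall_ports(1))  # (8171, 9171)
```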
On the Client machine, ping the head node using the Fully Qualified Domain Name (FQDN). For example,
open up a Command Prompt and type: ping headnode.domain.com (where headnode is the
actual machine name of the head node). The ping command should return a statement similar to the
following:
Pinging headnode.domain.com [10.2.10.32] with 32 bytes of data:
Reply from 10.2.10.32: bytes=32 time=56ms TTL=61
Note
Take note of the IP address (10.2.10.32 in the above example). You will need this address in
the steps that follow.
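If you want to script this lookup rather than read the address off the screen, the IP can be pulled out of the ping output with a regular expression. The sketch below uses the sample output shown above; variable names are illustrative:

```python
import re

# Sample ping output from the example above.
ping_output = (
    "Pinging headnode.domain.com [10.2.10.32] with 32 bytes of data:\n"
    "Reply from 10.2.10.32: bytes=32 time=56ms TTL=61\n"
)

# The resolved address appears in square brackets on the first line.
match = re.search(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]", ping_output)
ip_address = match.group(1) if match else None
print(ip_address)  # 10.2.10.32
```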
1. Go to the head node and navigate to C:\Program Files\Ansys Inc\v17x\RSM\Config and locate the
Ans.Rsm.AppSettings.config file.
3. Locate the <appSettings name="Global"> section. If your text editor can show line numbers, this
section starts on line 3.
The RemotingMachineNameAttribute line in the sample code below shows what the line looks
like using our example IP address of 10.2.10.32:
<appSettings name="Global">
<add key="DiskSpaceLowWarningLimitGb" value="2.0"/>
<add key="PingServerTimeout" value="3000"/>
<add key="PingServerMaxRetries" value="4"/>
<add key="PortInUseTimeout" value="5000"/>
<add key="RemotingSecureAttribute" value="false"/>
<add key="EnablePerformanceLogging" value="false"/>
<!--This setting is sometimes required for machines with multiple network interface cards.
example value="1.2.3.4" or value="machine.mycompany.com"-->
<add key="RemotingMachineNameAttribute" value="10.2.10.32"/>
6. Go to Control Panel > Administrative Tools and restart the services ANSYS JobManager Service V17.x
and ANSYS ScriptHost Service V17.x. To restart a service, right-click on it and select Restart.
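The RemotingMachineNameAttribute edit described in step 3 above can also be scripted. The fragment below is a hypothetical, trimmed-down stand-in for the Global section of Ans.Rsm.AppSettings.config (the real file contains more settings), used only to illustrate the edit:

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal stand-in for the Global appSettings section.
sample = """<appSettings name="Global">
  <add key="PingServerTimeout" value="3000"/>
  <add key="RemotingMachineNameAttribute" value=""/>
</appSettings>"""

root = ET.fromstring(sample)
for setting in root.iter("add"):
    if setting.get("key") == "RemotingMachineNameAttribute":
        setting.set("value", "10.2.10.32")  # the head node IP noted earlier

updated = ET.tostring(root, encoding="unicode")
print('value="10.2.10.32"' in updated)  # True
```

In practice you would parse the real file with ET.parse, apply the same loop, and write it back, then restart the RSM services as described in step 6.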
2. Make sure that 127.0.0.1 is not commented out with a # sign. If it is, remove the # sign.
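The check in the step above can be scripted. This illustrative sketch (the function name is made up) scans hosts-file text for an active 127.0.0.1 entry:

```python
def localhost_entry_active(hosts_text: str) -> bool:
    """Return True if some line maps 127.0.0.1 and is not
    commented out with a leading '#'."""
    for line in hosts_text.splitlines():
        stripped = line.strip()
        # A line commented out with '#' no longer starts with the address.
        if stripped.startswith("127.0.0.1"):
            return True
    return False

print(localhost_entry_active("# 127.0.0.1 localhost"))  # False
print(localhost_entry_active("127.0.0.1 localhost"))    # True
```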
2. To turn off UAC, move the slider to the Never notify position, and then click OK.
Caught exception at user logon: A required privilege is not held by the client
Description: In the Windows Task Manager, on the Processes page, RSM is running as a user and not
as SYSTEM. This is incorrect. To submit jobs to another Windows machine the processes need to be
running as SYSTEM.
1. Log in as Administrator.
2. On the machine where RSM is set up, open a Command Prompt and change the directory (cd) to
C:\Program Files\Ansys Inc\v17x\RSM\bin.
Caught exception at user logon; logon failure: unknown user name or bad password
/ Account Password not Provided
Description: The following error is reported in the RSM log file:
Compute Server running as: DOMAIN\username
Caught exception at user logon: A required privilege is not held by the client.
Or
Account Password not Provided
In the Set Password dialog box, if your DOMAIN and username match the one shown, simply press
Enter.
If the Windows client account is different from the HPC Windows account, you will need to set up an
alternate account. You can do this after you have cached your primary Windows account with RSM by
selecting Set Password again, but this time enabling the This is the alternate account check box before
entering the credentials for the HPC Windows account. You can also set up an alternate account from
the Accounts dialog box (right-click on My Computer in RSM and select Accounts).
Error: A connection attempt failed because the connected party did not properly respond after a period of time. Or,
No connection could be made because the target machine actively refused it.
If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM.
Replace the x with the point version number of your R17 product. For example, add ports 8170 and
9170 for version 17.0, and 8171 and 9171 for version 17.1.
If that is not the case, check to see if IPv6 is enabled and if it is, disable it. See Disabling IPv6 (p. 19)
for details.
You can partly disable it by going to the network properties for the NIC and clearing the IPv6
check box, but you must also disable it in the registry.
Caught exception from script: Failed to find the TCP port from TaskHost run.
Solution 1: Restart the RSM services.
1. Go to Control Panel > System and Security > Administrative Tools > Services.
2. Restart the ANSYS RSM JHost and ANSYS RSM ScriptHost services.
Solution 2: Check for firewalls. Refer to Dealing with Firewalls (p. 17).
The submission of the requested job has been cancelled because the Solve Manager
“….” seems not fully initialized.
Solution: This is a dual network card issue. For instructions see Configuring Multiple Network Cards
(NIC) (p. 18).
You may also want to check for multiple RSM admins of the same version running concurrently.
Configuring Remote Solve Manager (RSM) to Submit Jobs to a Linux
LSF, PBS, Torque with Moab, or UGE (formerly SGE) Cluster
Important
The instructions in this tutorial apply only when configuring RSM for direct job submission
from ANSYS Workbench. If you will be submitting jobs to RSM via an EKM Portal, refer to
Integrating EKM with Remote Solve Manager (RSM) in the EKM Administration Guide instead.
Introduction
This tutorial steps you through the configuration of ANSYS Remote Solve Manager (RSM), solvers, and
Workbench so that solve jobs can be submitted to a Linux LSF, PBS, Torque with Moab, or UGE (formerly
SGE) cluster via RSM.
In this tutorial, RSM is configured using the Remote Solve Manager Setup Wizard. For a quick-start
guide on using the wizard, select Start > All Programs > ANSYS 17.x > Remote Solve Manager >
Readme - RSM Setup Wizard 17.x.
Assumptions
These instructions assume the following:
• You have installed and configured the Linux job scheduler, and the compute nodes can access the cluster
head node. If your cluster is not configured properly, please contact your hardware vendor or a third-party
consultant for assistance.
• You have passwordless ssh set up between the head node and compute nodes. Consult an IT professional
for assistance with setting up passwordless ssh.
• You know the machine name of the head node on the Linux cluster.
• You are able to install and run ANSYS, Inc. products, including Licensing on Windows systems. For information
on installation and licensing, see the tutorials on the Downloads menu of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support Contacts
page of the ANSYS Customer Portal and submit a support request.
Note that when using the ANSYS installer to install a solver (Fluent, CFX, Mechanical, Polyflow), RSM
and Workbench will also be installed.
1. Export the /ansys_inc directory by adding the following line to the /etc/exports file:
/usr/ansys_inc
2. The default behavior on Linux provides read-only access from all clients. To enable read/write permission
from all clients, use *(rw):
/usr/ansys_inc *(rw)
3. Run: exportfs -a
5. If you perform a network install where you want the clients to be able to modify the licensing configuration,
you need to consider the NFS write options for the exported file system as shown in the above examples.
You also need local permissions to the licensing directory (/shared_files/licensing/) if you want
to be able to create the install_licconfig.log that the license configuration produces.
6. If you need to transfer the files from a Windows machine with a DVD drive to a Linux machine without
one, copy the DVD contents using a Samba mount or some other transfer method that is safe to use
between Windows and Linux.
7. If sharing the ANSYS directory between Linux machines, you must use the same mount point for both
the client and server. For example, if you installed to a file server in a directory named /apps/ansys_inc
and you did not choose the symbolic link to /ansys_inc, then you must mount this directory on the
client machine using /apps/ansys_inc as the mount point. If you did choose the symbolic link to
/ansys_inc during installation on the file server, you must either use /ansys_inc as the mount point
on the client or you must create a symbolic link to /ansys_inc on the client machine. (The symbolic
link is created by default during installation if you installed as root.)
Manually:
1. Log in as ROOT (this is required initially to start the RSM daemons) and manually create a group called
rsmadmins.
2. Add the users who will be responsible for configuring RSM Admin to the rsmadmins group.
Automatically:
1. If you started the daemons as ROOT, log out as root. A local rsmadmin account and rsmadmins group
are automatically created when the daemons are started as ROOT.
2. If other users will be configuring RSM Admin, add their user names to the rsmadmins group (this also
requires ROOT permission). You can log out as root now.
If the user prefers to start the non-daemon services from the RSM Setup Wizard (as opposed to installing
and starting the services as daemons with a root account), then a user account from the rsmadmins
user group must be used. Note that if the RSM services are not installed as daemons, the rsmadmins
user group is not automatically created. Therefore, in order to start non-daemon services via the wizard,
prior to running the wizard your IT department must:
2. Add the users who will be running/starting non-daemon services to the rsmadmins group.
Note
If you start the services with an rsmadmins non-root user account, the service will be run
by that account in non-daemon mode. Root user privileges are required for starting RSM
services as daemons. If you start RSM services as daemons, any non-daemon services will be
killed.
c. Click Next.
4. Click Next.
a. When asked if you want to define new or modify existing Compute Servers, select Yes.
b. Click Next.
b. Click Next.
a. Type in a Machine Name or IP Address for the server. This must be the actual computer name or
IP address of the head node. In this example, we will use: headnode
b. Type in the Display Name. This can be any name that makes sense for you.
c. Click Next.
a. Specify whether you want to run jobs from a network share or from the local disk. In this example,
we will select Network Share.
b. Click Next.
a. Enter the local path for your Shared Cluster Directory. This is the directory that is shared out and
mounted to all the cluster nodes from the head node.
b. Enter the name of the network share. In this example, we will use the shared temp directory
/Headnode/Temp.
c. Click Next.
a. Specify the Maximum Number of Jobs that can run concurrently on this Compute Server.
b. Click Next.
b. Click Next.
a. Specify whether you want to auto-configure Compute Server directories. In this example, we will
select Yes, automatically configure directories.
b. Click Next.
a. Specify whether you want to create or modify another Compute Server. In this example, we will select
No.
b. Click Next.
b. Click Next.
a. Specify whether you want to create a new queue or modify one already in the list. In this example, we
will select Define a new Queue.
b. Click Next.
a. Enter a Name for the queue. In this example, we will enter Linux Cluster Queue. For your
configuration you can enter the actual cluster queue name that will be used to run jobs.
b. The Compute Server you added previously (Linux Cluster, in this example) appears in the list
of compute servers. Select its check box to assign it to the new queue.
c. Click Next.
a. Specify whether you want to define or modify another queue. In this example, we will select No.
b. Click Next.
b. Click Next.
a. Select an existing account to modify or specify that you want to define a new account. In this example,
we will select Define a new account.
b. Click Next.
a. Enter the Username that you use to log into your Linux machine.
b. Enter and confirm the Password that you use to log into your Linux machine.
Note
If you are going to later run a job from Windows to this Linux cluster machine, you
may need to also create an alternate Linux account that is associated with your
primary Windows account. For details refer to the Resolution in the troubleshooting
topic, Caught exception at user logon; logon failure: unknown user name or bad
password. Account password not provided. (p. 45)
c. Click Next.
a. Specify if you want to define more accounts. In this example, we will select No.
b. Click Next.
1. In the Queues drop-down, select the queue that you want to test.
If the test succeeds, the Test Status will be Finished. If the test fails, the Test Status will be Test
Failed. Review the steps to make sure you followed them all correctly. You can also check
Troubleshooting RSM (p. 41) for information on adding firewall ports, and so on.
3. Click Next.
Examples
The two examples below show the command line used to configure the Manager and Compute Server
service daemons via either the rsmconfig script or the install_daemon script.
tools/linux#> ./rsmconfig -mgr -svr
Once the daemon service is installed, the RSM service starts automatically without a reboot. Each
time the machine is rebooted thereafter, the installed RSM service starts automatically.
4. Troubleshooting RSM
Refer to the following topics should you encounter any issues with RSM.
4.1. Gathering RSM Job Logs for Systems Support
4.2. Issue: “My Computer” Disabled in RSM Manager
4.3. Configuring Multiple Network Cards (NIC)
4.4. Disabling IPv6
4.5. Cannot Resolve localhost
4.6. Common Errors Found in RSM Job Log
3. Select the failed RSM job in the job list view.
4. Right-click the log in the lower right pane and choose Save Job Report.
2. Open a terminal window and log in to the cluster head node that is running RSM.
3. Type cd /ansys_inc/v17x/RSM/Config/tools/linux
If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM, as follows:
Replace the x with the point version number of your R17 product. For example, add ports 8170 and
9170 for version 17.0, and 8171 and 9171 for version 17.1.
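The port-numbering rule above can be sketched as a small helper. This is an illustrative function (not part of RSM), written to match only the examples stated above for R17 point versions:

```python
def rsm_firewall_ports(point_version: str):
    """Return the two RSM firewall ports for a given R17 point version.

    Follows the pattern stated above: version 17.0 -> (8170, 9170),
    version 17.1 -> (8171, 9171). Illustrative only, and restricted
    to R17 because the rule is stated only for R17 products.
    """
    major, minor = (int(p) for p in point_version.split("."))
    if major != 17:
        raise ValueError("this rule is stated only for R17 products")
    return (8000 + major * 10 + minor, 9000 + major * 10 + minor)

print(rsm_firewall_ports("17.0"))  # (8170, 9170)
print(rsm_firewall_ports("17.1"))  # (8171, 9171)
```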
1. Make sure you can ping all of the nodes you want to use.
On the Client machine, ping the head node using the fully qualified domain name. For example, open
a Command Prompt and enter:
ping headnode.domain.com
Note
Record the IP address (10.2.10.32 in the above example). You will need this address in the
steps that follow.
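Instead of reading the address out of the ping output, you can also resolve the head node's fully qualified domain name directly. A minimal sketch; headnode.domain.com is the placeholder name from the example above:

```python
import socket

def head_node_ip(fqdn: str) -> str:
    """Resolve a host name to the IPv4 address reported by ping."""
    return socket.gethostbyname(fqdn)

# e.g. head_node_ip("headnode.domain.com") would return an address
# such as 10.2.10.32 in the example above.
print(head_node_ip("localhost"))
```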
2. Locate the <appSettings name="Global"> section. If your text editor can show line numbers this
section starts on line 3.
The last line in the sample code below shows what the setting looks like using our example IP
address of 10.2.10.32:
<appSettings name="Global">
<add key="DiskSpaceLowWarningLimitGb" value="2.0"/>
<add key="PingServerTimeout" value="3000"/>
<add key="PingServerMaxRetries" value="4"/>
<add key="PortInUseTimeout" value="5000"/>
<add key="RemotingSecureAttribute" value="false"/>
<add key="EnablePerformanceLogging" value="false"/>
<!--This setting is sometimes required for machines with multiple network interface cards.
example value="1.2.3.4" or value="machine.mycompany.com" -->
<add key="RemotingMachineNameAttribute" value="10.2.10.32"/>
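If you prefer not to hand-edit the file, the change can also be scripted with the standard library XML parser. This is a sketch that operates on the fragment shown above; a real RSM configuration file has more surrounding structure, so adapt the parsing to the actual file on your system:

```python
import xml.etree.ElementTree as ET

# A reduced stand-in for the <appSettings name="Global"> section.
SAMPLE = """<appSettings name="Global">
  <add key="PingServerTimeout" value="3000"/>
  <add key="RemotingMachineNameAttribute" value=""/>
</appSettings>"""

def set_remoting_machine_name(xml_text: str, address: str) -> str:
    """Set the RemotingMachineNameAttribute value in an <appSettings> block."""
    root = ET.fromstring(xml_text)
    for add in root.iter("add"):
        if add.get("key") == "RemotingMachineNameAttribute":
            add.set("value", address)
    return ET.tostring(root, encoding="unicode")

updated = set_remoting_machine_name(SAMPLE, "10.2.10.32")
print('value="10.2.10.32"' in updated)  # True
```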
Note
1. When the RSM services are installed and started as daemon services by ANSYS-provided service
scripts, an rsmadmins administrative user group is automatically created on the Solve Manager
machine. An rsmadmin user account is created in the new user group. This account has
administrative, non-root privileges and can be used to perform RSM administrative and configuration
tasks via the wizard on Linux.
2. On Linux, to provide additional users with RSM administrative privileges, you must add them
to the rsmadmins user group.
1. Make sure 127.0.0.1 is not commented out with a # sign. If it is, remove the # sign.
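The check above can be automated. A minimal sketch that scans the contents of an /etc/hosts-style file for an uncommented loopback entry:

```python
def loopback_entry_active(hosts_text: str) -> bool:
    """Return True if the text contains an uncommented 127.0.0.1 line."""
    for line in hosts_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("#"):
            continue  # commented out with a # sign
        if stripped.startswith("127.0.0.1"):
            return True
    return False

print(loopback_entry_active("127.0.0.1 localhost"))    # True
print(loopback_entry_active("# 127.0.0.1 localhost"))  # False
```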
4.6.1. Caught exception at user logon: A required privilege is not held by the client.
Resolution: Start the RSM Services manually.
2. Open a terminal window and log in to the machine that is running RSM.
3. Type : cd /ansys_inc/v17x/RSM/Config/tools/linux
4.6.2. Caught exception at user logon; logon failure: unknown user name or bad
password. Account password not provided.
You see one of the following errors in the RSM log file:
Compute Server running as: username
Caught exception at user logon: A required privilege is not held by the client.
or
Compute Server running as: username
Account Password not Provided
Resolution:
Right-click My Computer in the RSM Admin application and choose Set Password; this error indicates that your password is not set.
In the Set Password dialog box, if your user name matches the one shown, press Enter.
If jobs will be submitted from a Windows client, and that account is different from the Linux account,
you will need to set up an alternate account. You can do this after you have cached your primary
Windows account with RSM on your Windows client by selecting Set Password again, but this time enabling
the This is the alternate account check box before entering the credentials for the Linux account. You
can also set up an alternate account from the Accounts dialog box (right-click on My Computer in
RSM and select Accounts). If running on Linux, you do not need to enter a DOMAIN, just your username
and password.
or
No connection could be made because the target machine actively refused it.
Resolution: If you have a local firewall turned on for the server and/or RSM Client machines, you will
need to add two ports to the Exceptions List for RSM, as follows:
Replace the x with the point version number of your R17 product. For example, add ports 8170 and
9170 for version 17.0, and 8171 and 9171 for version 17.1.
If you do not have a local firewall turned on, check to see if IPv6 is enabled; if it is, disable it.
4.6.4. Failed to create Script Task: Access to the path “…” is denied.
Make sure all users have read/write access to the directory that the error is referencing, that the directory
is available on all nodes, and that it is shared.
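A quick way to confirm, on any given node, that the directory named in the error message is readable and writable by the current user (the helper name is illustrative, not part of RSM):

```python
import os
import tempfile

def check_rw_access(path: str) -> bool:
    """Return True if `path` exists as a directory and the current user
    can both read and write it. Run this on every node to confirm the
    shared directory is mounted and accessible."""
    return os.path.isdir(path) and os.access(path, os.R_OK | os.W_OK)

print(check_rw_access(tempfile.gettempdir()))  # True on a healthy system
```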
4.6.5. Caught exception from script: Failed to find the TCP port from TaskHost run.
Resolution 1: Restart the RSM services.
On Linux you can stop the RSM services manually by running the appropriate service script with the
command line option stop. The examples below illustrate how to stop the RSM services manually:
./rsmmanager stop
./rsmserver stop
You can start the RSM services manually by running the appropriate service script with the command
line option start. The examples below illustrate how to start each of the RSM services manually:
./rsmmanager start
./rsmserver start
If you have a local firewall turned on for the server and/or RSM Client machines, you will need to add
two ports to the Exceptions List for RSM, as follows:
Replace the x with the point version number of your R17 product. For example, add ports 8170 and
9170 for version 17.0, and 8171 and 9171 for version 17.1.
Try flushing iptables. Consult the iptables man page for instructions on how to do this.
Check the permissions on the RSM scratch directory and ensure that all users have write access to it.
4.6.6. The submission of the requested job has been cancelled because the Solve
Manager “….” seems not fully initialized.
Resolution:
This may be a dual NIC issue. Also check for multiple RSM admins of the same version running
concurrently. See the section Multiple Network Interface Cards (NIC) Issues in the Remote Solve Manager (RSM)
documentation for instructions.
Submitting CFX, Fluent and Mechanical Jobs to a Linux or Microsoft
HPC Cluster
This tutorial shows you how to configure Workbench to submit solve jobs via RSM, and provides
instructions on submitting CFX, Fluent and Mechanical jobs from Workbench to a Linux or Windows cluster.
If you will be submitting jobs to RSM via EKM, refer to Managing Jobs in the EKM User’s Guide. For
information on configuring RSM for use with EKM, refer to Integrating EKM with Remote Solve Manager
(RSM) in the EKM Administration Guide.
In this tutorial:
1. Configuring RSM on a Windows Client Machine Prior to Submitting Jobs to a Linux or Windows Cluster
2. Submitting a CFX Job from Workbench to a Linux or Windows Cluster
3. Submitting a Fluent Job from Workbench to a Linux or Windows Cluster
4. Submitting a Mechanical Job from Workbench to a Linux or Windows Cluster
5. Troubleshooting Job Failures
1. Install ANSYS, Inc. products on each Client machine that will be submitting RSM jobs to the cluster.
2. On the Client machine, open ANSYS Workbench (Start Menu → All Programs → ANSYS 17.x → Workbench
17.x).
4. From the Solve Manager drop-down, select the computer where Remote Solve Manager is installed.
If you are submitting a job to a Linux manager or compute server where the Windows client logon
credentials (user name or password) are different than the credentials used to log on to the Linux
manager or compute server, you will need to set up an alternate account.
To do this after you have cached your primary Windows account with RSM:
c. Enter the user name and password for your alternate account to log on to the remote manager or
compute server, then click OK. This launches the Alternate Account Settings dialog box:
d. Select the manager or compute server that you want to apply the alternate account to, then click
Done.
Proceed to the sections that follow to learn how to send your job to your Linux or Windows cluster.
3. In the CFX system, right-click the Solution cell and select Properties.
b. For Solve Manager, type the name of the Manager that will be used. (If you do not know the name
of the Solve Manager, contact your System Administrator for this information.)
c. For Queue, enter the name of the queue that will be used. (If you do not know the name of the
Queue, contact your System Administrator for this information.)
d. For automatic downloading of progress information, verify that Download Progress Information
is set to Always Download.
e. Leave the Download Progress Information at the default of 120 seconds (or set a different value
depending on how frequently you would like the solver to query RSM for output files in order to display
progress). Note that if the job finishes before the first interval is reached, you will not see progress
results until the end of the job.
3. In the Fluent system, right-click the Solution cell and select Properties.
c. For Solve Manager, type the name of the Manager that will be used. (If you do not know the name
of the Solve Manager, contact your System Administrator for this information.)
d. For Queue, enter the name of the queue that will be used. (If you do not know the name of the
Queue, contact your System Administrator for this information.)
f. Set the Progress Download Interval to the default of 120 seconds (or a different value depending
on how frequently you would like the solver to query RSM for output files in order to display progress).
Note that if the job finishes before the first download interval is reached, you will not see progress
results until the end of the job.
2. Add a Mechanical system and assign a geometry, establish all necessary loads, and so on.
3. On the analysis system on the Project Schematic, double-click either the Model or the Setup cell to launch
Mechanical.
b. Click OK. The Rename Solve Process Settings dialog box closes.
a. Select the solve process setting you just specified from the list on the left.
b. Under Computer Settings, enter the machine name of the Solve Manager. (If you do not know the
name of the Solve Manager, contact your System Administrator for this information.)
c. For Queue, enter the name of the queue that will be used. (If you do not know the name of the
Queue, contact your System Administrator for this information.)
d. Click Advanced.
9. In the Solve Process Settings dialog box, click OK. The dialog box closes and the solve process setup is
complete.
10. In Mechanical, finish setting up your analysis. When the model is set up and ready to solve, select
the Solve toolbar button drop-down arrow. You will see the solve process name you just defined (in this
example, Cluster). Select that process.
11. The solve commences. When the solution has completed, the Solution branch and the items underneath
it in the project tree will each have a down-arrow next to them.
12. Right-click Solution and select Get Results to bring the solution items to the local machine.
a. Choose Start Menu → All Programs → ANSYS 17.x → Remote Solve Manager → RSM 17.x, then
right-click the shortcut and choose Run as Administrator.
b. Select Tools > Options. In the Name field type the name of the Solve Manager. (If you do not know
the name of the Solve Manager, contact your System Administrator for this information.)
3. Right-click on the log in the lower right pane and select Debug Messages.
Remote Solve Manager Tutorial: Configuring Custom Client-Side
Cluster Integration R17
This tutorial walks you through the process of configuring Remote Solve Manager (RSM) to use a “Custom”
Linux cluster using the “Client-Side Integration” technique. This should be used only if your
environment:
• has special File Transfer requirements that RSM’s standard file transfer methods don’t meet, such as
HTTP file transfers that you want to integrate into RSM, or
• has restrictions on the installation of RSM services and is also an unsupported cluster type (that is, not
a standard LSF/PBS/UGE/MS HPC cluster type).
If your setup is similar but less restrictive, use the simpler setup method listed for each of the
following cases:
• You can install RSM services and file transfers can use RSM native or network file shares like Linux Samba
or Windows shares on your unsupported cluster type (server-side integration).
• You cannot install RSM services but you are using a supported Linux cluster type (standard SSH setup).
• The only special requirement for file transfers is SSH/SCP transfers to a supported Linux cluster type
(standard SSH setup).
For more information on server-side integration, see “Customizing Server-Side Integration” in the Remote
Solve Manager User's Guide and/or review “Remote Solve Manager Tutorial: Configuring Custom Server-
Side Cluster Integration R17.”
For more information on standard SSH setups, see “Appendix B. Integrating Windows with Linux using
SSH/SCP” in the Remote Solve Manager User's Guide.
This tutorial is not meant to replace the user’s guide; for more information on custom integration, see
“Custom Cluster Integration Setup” in Remote Solve Manager User's Guide.
If this scenario does not suit your needs, see the other tutorials available on the Tutorials and Training
Materials page of the ANSYS Customer Portal. For further information about tutorials and documentation
on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.
You can follow this tutorial while actually configuring RSM. To do so, simply make the selections that
are pertinent to you or insert your specific information where noted.
Once you’ve tested your configuration, you can submit a Fluent, CFX, or Mechanical job to RSM.
• Both the Windows and the Linux machines are set up correctly on the network.
• You have properly set up passwordless SSH from the Client to the Linux cluster.
For help with setting up SSH, see “Appendix B: Integrating Windows with Linux using SSH/SCP”
in the Remote Solve Manager User's Guide for instructions.
• Both ANSYS Workbench and RSM have been installed on the Windows Client machine.
• You are able to run ANSYS, Inc. products on the Linux cluster by submitting a command line (that is,
the ANSYS installation on the cluster has been verified).
For information on product and licensing installations, see the RSM tutorials on the Downloads page
of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support page of
the ANSYS Customer Portal and submit an online support request.
For further information about tutorials and documentation on the ANSYS Customer Portal, go to
http://support.ansys.com/docinfo.
2.1. Creating the RSM Compute Server for Custom Cluster Type “Keyword”
Perform the following steps on your Windows RSM Client machine to configure RSM to use a custom
client-side integrated cluster. In this section, we are adding a “Custom” Linux cluster as the Compute
Server which can have user-programmed inputs.
1. Underneath the local Manager (My Computer) in the RSM tree view, right-click the Compute Servers
folder and select Add.
2. On the General tab of the Compute Server Properties dialog box, set properties as follows:
a. For the Display Name property, enter a descriptive name for the Linux machine being defined as a
Compute Server. This example will use Client Side Integration Example.
b. In this example, the Compute Server services will be on the same machine as the Manager. Both of
them will run on your Client machine (My Computer), so we will set Machine Name to localhost.
c. Enable the This Compute Server is integrating with an HPC cluster check box.
Setting Up the RSM Client and Manager
f. If you do not have RSH enabled on the cluster, then check Use SSH protocol for intra-node
communication. This means that the remote scripts will use SSH to contact other machines in the cluster.
a. Set up the Cluster Node and Account Name in order to access the Remote Linux Cluster.
b. Set the Cluster Type property. In this example, we’ll select CUSTOM.
c. For the Custom Cluster Type property, enter a short, descriptive name. This is your keyword and will
need to be appended to some filenames later, so try to keep it simple; for ease of use, it should not
contain spaces. For this example, we’ll use CUS_CLIENT.
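The keyword determines the names of the custom files you will create later in this tutorial. A sketch of the naming pattern, derived from the CUS_CLIENT examples used throughout this tutorial (the helper itself is illustrative, not part of RSM):

```python
def custom_cluster_files(keyword: str):
    """List the per-keyword files used by a custom client-side
    integration, following the naming pattern in this tutorial."""
    if " " in keyword:
        raise ValueError("for ease of use, the keyword should not contain spaces")
    return [
        f"hpc_commands_{keyword}.xml",
        f"GenericJobCode_{keyword}.xml",
        f"submit_{keyword}.py",
        f"status_{keyword}.py",
        f"cancel_{keyword}.py",
        f"transfer_{keyword}.py",
        f"cleanup_{keyword}.py",
    ]

print(custom_cluster_files("CUS_CLIENT"))
```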
4. On the File Management tab, set properties as shown below. For more information on file management
and directory handling, see “Compute Server Properties Dialog: File Management Tab” in the Remote Solve
Manager User's Guide.
a. For the Remote Shared Cluster Directory property, enter the path to your central cluster file-staging
directory. This should be a directory that the cluster execution nodes share and all have mounted so
that every execution node can access the input files once they are moved there.
The Shared Cluster Directory is typically located on the machine defined on the General tab.
However, in this example, the General tab specifies localhost. Since we have set up and are
modifying the remote Manager from the Client machine, the directory reference here will be to
the remote machine. The RSM job needs to find this shared directory on the remote machine.
In this example, /path/to/shared/cluster/directory is a network share that all of the
cluster nodes have mounted.
5. Select the General tab again. Now we can set the location of the Working Directory, which is used to store
all of the client files before sending them to the remote machine.
For the Working Directory Location property, select Reuse Manager Storage. This will reuse the
RSM Manager's project storage directory as the Working Directory.
7. In the RSM tree view, expand the Compute Servers node to view the Compute Server you added (Client
Side Integration Example in this example).
2. Under General in the Queue Properties dialog box, enter a Name for this queue. In this example, we will
use Priority_Queue.
Note
The queue Name will be presented to the cluster directly, so this queue name should
match the desired submission queue name exactly. A Compute Server can be placed in
more than one queue in RSM, so you can submit to any number of queues enabled on
the cluster in this way.
3. The Compute Server you added previously (Client Side Integration Example in this example) appears
under Assigned Servers. Select the check box next to it to assign the server to this queue.
5. In the RSM tree view, expand the Queues node to view the queue you added (Priority_Queue in this
example).
Setting Up Custom Code References
3.2.1. Modifying the Job Configuration File for the New Cluster Type
As part of the setup, you must add an entry for your custom cluster keyword in the jobConfiguration.xml
file, and reference the HPC commands file that is needed for this cluster job type.
2. Open the jobConfiguration.xml file and add an entry for your custom cluster job type. The sample
entry below is for the CUS_CLIENT keyword that we established earlier, and points to the custom
hpc_commands_CUS_CLIENT.xml file. Use your own keyword and HPC commands file name where
appropriate.
<keyword name="CUS_CLIENT">
<jobCode name="GenericJobCode_CUS_CLIENT.xml">
<include name="GenericJobCode_base.xml"/>
</jobCode>
<hpcCommands name="hpc_commands_CUS_CLIENT.xml">
</hpcCommands>
</keyword>
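You can sanity-check the new entry before running a job. This sketch parses the fragment above with the standard library and confirms that the keyword references the expected HPC commands file:

```python
import xml.etree.ElementTree as ET

# The jobConfiguration.xml entry added above, as a standalone fragment.
ENTRY = """<keyword name="CUS_CLIENT">
  <jobCode name="GenericJobCode_CUS_CLIENT.xml">
    <include name="GenericJobCode_base.xml"/>
  </jobCode>
  <hpcCommands name="hpc_commands_CUS_CLIENT.xml">
  </hpcCommands>
</keyword>"""

node = ET.fromstring(ENTRY)
assert node.get("name") == "CUS_CLIENT"
assert node.find("hpcCommands").get("name") == "hpc_commands_CUS_CLIENT.xml"
print("entry OK")
```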
3.2.2. Modifying the Custom HPC Commands File to Reference Custom Scripts
Below is the entire hpc_commands_CUS_CLIENT.xml file in its unmodified form.
<environment>
<env name="RSM_HPC_PARSE">LSF</env>
<env name="RSM_HPC_PARSE_MARKER">START</env> <!-- Find "START" line before parsing according to parse type -->
<env name="RSM_HPC_SSH_MODE">ON</env>
<env name="RSM_HPC_CLUSTER_TARGET_PLATFORM">Linux</env> <!-- Still need to set RSM_HPC_PLATFORM=linx64
on Local Machine -->
</environment>
<submit>
<primaryCommand name="submit">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/submit_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
<postcommands>
<command name="parseSubmit">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/lsfParsing.py</pythonapp>
</application>
<arguments>
<arg>-submit</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_JOBID</variableName>
</outputs>
</command>
</postcommands>
</submit>
<queryStatus>
<primaryCommand name="queryStatus">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/status_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
<postcommands>
<command name="parseStatus">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/lsfParsing.py</pythonapp>
</application>
<arguments>
<arg>-status</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_STATUS</variableName>
</outputs>
</command>
</postcommands>
</queryStatus>
<cancel>
<primaryCommand name="cancel">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/cancel_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cancel>
<transfer>
<primaryCommand name="transfer">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/transfer_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_DIRECTORY_SHARED</variableName>
</outputs>
</primaryCommand>
</transfer>
<cleanup>
<primaryCommand name="cleanup">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/cleanup_CUS_CLIENT.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cleanup>
</jobCommands>
In the HPC Commands file shown above, you have only two steps to finish:
1. Referring to the example below, replace all of the Generic and SSH references with _CUS_CLIENT
references (or your specific keyword), as was done in Making a Copy of CIS Example Files from RSM
Directories (p. 67) above.
</primaryCommand>
<postcommands>
<command name="parseSubmit">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/lsfParsing.py</pythonapp>
</application>
<arguments>
<arg>-submit</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_JOBID</variableName>
</outputs>
</command>
</postcommands>
</submit>
<queryStatus>
<primaryCommand name="queryStatus">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/statusGeneric.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
<postcommands>
<command name="parseStatus">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/lsfParsing.py</pythonapp>
</application>
<arguments>
<arg>-status</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_STATUS</variableName>
</outputs>
</command>
</postcommands>
</queryStatus>
<cancel>
<primaryCommand name="cancel">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/cancelGeneric.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cancel>
<transfer>
<primaryCommand name="transfer">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/transferSSH.py</pythonapp>
</application>
<arguments>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_DIRECTORY_SHARED</variableName>
</outputs>
</primaryCommand>
</transfer>
<cleanup>
<primaryCommand name="cleanup">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/cleanupSSH.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cleanup>
</jobCommands>
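The find-and-replace described in step 1 can also be scripted. This sketch rewrites the Generic and SSH script references found in the HPC commands text above into per-keyword names; review the result by hand before using it:

```python
def apply_keyword(hpc_commands_text: str, keyword: str) -> str:
    """Replace the Generic/SSH script references with per-keyword names,
    e.g. statusGeneric.py -> status_CUS_CLIENT.py and
    transferSSH.py -> transfer_CUS_CLIENT.py."""
    replacements = {
        "submitGeneric.py": f"submit_{keyword}.py",
        "statusGeneric.py": f"status_{keyword}.py",
        "cancelGeneric.py": f"cancel_{keyword}.py",
        "transferSSH.py": f"transfer_{keyword}.py",
        "cleanupSSH.py": f"cleanup_{keyword}.py",
    }
    for old, new in replacements.items():
        hpc_commands_text = hpc_commands_text.replace(old, new)
    return hpc_commands_text

print(apply_keyword("<pythonapp>statusGeneric.py</pythonapp>", "CUS_CLIENT"))
# <pythonapp>status_CUS_CLIENT.py</pythonapp>
```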
Note
• If you want to use other types of code, such as C++, simply place your compiled (executable) code
in the <app> </app> section; arguments are not required. For Python, an interpreter is included in
the ANSYS Workbench installation, which is why Python is referenced here. To use Python, simply
replace <app> </app> with <pythonapp> </pythonapp> as shown and enter the Python code file name.
• Any custom code that you want to provide as part of the customization should also be located
in the [RSMInstall]\RSM\Config\scripts directory corresponding to your local (client) installation.
Alternatively, a full path to the script must be provided along with the name.
Important
The scripts submitGeneric.py and cancelGeneric.py that you have copied and renamed to
submit_CUS_CLIENT.py and cancel_CUS_CLIENT.py actually contain fully functional code. However,
the code is quite complex, and going over it in detail is beyond the scope of this tutorial. Those
scripts are intended for more advanced programmers to use as a basis for customization.
Here we have provided simpler, commented versions of these scripts with only basic func-
tionality, so that the scripts may be more easily understood by newer programmers. We have
illustrated the inner workings of these scripts so that you can modify them or write your
own scripts based on your specific needs.
If you want to use the simpler scripts, you can simply replace the content in the original scripts with the following examples for submit_CUS_CLIENT.py and cancel_CUS_CLIENT.py.
import sys
import ansLocale
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform
print('RSM_HPC_DEBUG=Submitting job...')
# See Below #1
print('Custom Coding goes here')
# RSM_HPC_PROTOCOL_OPTION2 is the name of the cluster node that was entered on the Cluster tab.
# We reference 'RSM_HPC_PROTOCOL_OPTION2' below, and the command will not succeed
# if it is not defined, so check it and report a specific error if it is not set.
if os.getenv("RSM_HPC_PROTOCOL_OPTION2") is None:
    print("RSM_HPC_ERROR=RSM_HPC_PROTOCOL_OPTION2 (Remote Cluster Node Name) not defined")
    sys.exit(1)
# See Below #2
# Basic LSF command-line starting point. 'bsub' is a placeholder here; the
# full script wraps the command in a plink/SSH invocation to the cluster node.
_plinkArgs = "bsub"
# See Below #3
_jobname = os.getenv("RSM_HPC_JOBNAME")
if _jobname is not None:
    _plinkArgs += " -J \\\"" + _jobname + "\\\""
_queue = os.getenv("RSM_HPC_QUEUE")
if _queue is not None and _queue != "":
    _plinkArgs += " -q " + _queue
_staging = os.getenv("RSM_HPC_STAGING")
if _staging is not None:
    _plinkArgs += " -cwd \"" + _staging + "\""
_distributed = os.getenv("RSM_HPC_DISTRIBUTED")
if _distributed is None or _distributed == "FALSE":
    _plinkArgs += " -R 'span[hosts=1]'"
_nativeOptions = os.getenv("RSM_HPC_NATIVEOPTIONS")
if _nativeOptions is not None:
    _plinkArgs += " " + _nativeOptions
_stdoutfile = os.getenv("RSM_HPC_STDOUTFILE")
if _stdoutfile is not None:
    _plinkArgs += " -o " + _stdoutfile
_stderrfile = os.getenv("RSM_HPC_STDERRFILE")
if _stderrfile is not None:
    _plinkArgs += " -e " + _stderrfile
# Some environment variables were written directly into the string '_plinkArgs'
# and we want to replace those references with their actual values before submission.
print('RSM_HPC_DEBUG=plink arguments: ' + _plinkArgs)
_plinkArgs = os.path.expandvars(_plinkArgs)
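The expansion step above can be illustrated in isolation. This sketch is not part of the shipped script, and the variable values are hypothetical; it only shows how os.path.expandvars substitutes environment-variable references that were written into the argument string:

```python
import os

# Hypothetical stand-ins for variables that RSM would normally set.
os.environ["RSM_HPC_QUEUE"] = "normal"
os.environ["RSM_HPC_JOBNAME"] = "demo_job"

# An argument string still containing variable references, as _plinkArgs might.
args = "bsub -q $RSM_HPC_QUEUE -J $RSM_HPC_JOBNAME"

# expandvars replaces each reference with the variable's current value.
expanded = os.path.expandvars(args)
print(expanded)  # bsub -q normal -J demo_job
```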
Note
This code references many RSM-set environment variables. For more information on what
environment variables are available and their contents, see “Environment Variables Set by
RSM” in the Remote Solve Manager User's Guide.
1. You can add any code you want to this section; code placed here will execute before the job is submitted.
Also, you can stop the job from submitting with some controls on the Submit command, if desired.
2. Basic LSF command line starting point; we will continuously append arguments to this line as necessary
to complete the command.
3. Most blocks are composed of three parts: storing an environment variable to a local variable, testing to
ensure that a variable either isn’t empty or contains a special value, and then appending some flag to the
command line based on the findings.
4. One of the final actions is to read the RSM_HPC_COMMAND variable and append it to the submission
command. This command is created by RSM and contains the command line to run the ClusterJobs
script which can complete the submission process. It creates the full command line for ANSYS by using
the controls file created by the individual add-ins. ANSYS suggests that you always use the
RSM_HPC_COMMAND to submit a job whenever possible because of the complexities of the ANSYS command
line for different solvers and on different platforms.
5. Popen finally “runs” the command we have been building. Then we wait for it to finish.
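The read/test/append pattern described in items 3 through 5 can be sketched as a small helper. This is an illustration only (the helper name and the values are hypothetical); the actual scripts inline this pattern once per variable:

```python
import os
import shlex

def append_flag(cmdline, envvar, flag):
    # Read the environment variable; append "flag value" only when it is
    # set and non-empty, mirroring the per-variable blocks in the script.
    value = os.getenv(envvar)
    if value:
        return cmdline + " " + flag + " " + value
    return cmdline

# Hypothetical stand-ins for variables that RSM would normally set.
os.environ["RSM_HPC_QUEUE"] = "normal"
os.environ.pop("RSM_HPC_STDOUTFILE", None)  # deliberately left unset

cmd = "bsub"
cmd = append_flag(cmd, "RSM_HPC_QUEUE", "-q")       # appended
cmd = append_flag(cmd, "RSM_HPC_STDOUTFILE", "-o")  # skipped: not set
print(shlex.split(cmd))  # ['bsub', '-q', 'normal']
```

shlex.split turns the finished string into the argument list that subprocess.Popen expects, which is the same final step the script performs before running the command.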
Because this script is a Submit script, there are many options for the bsub command, and because this is a custom client integration, the commands are wrapped in an SSH command so that they can be submitted from the local machine to the remote machine. Creating a custom script for the Cancel command is much simpler, although it contains the same basic parts. That process is addressed in the next section.
import sys
import ansLocale
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform
print('RSM_HPC_DEBUG=Cancelling job...')
# See Below #1
print('Custom Coding goes here')
# RSM_HPC_PROTOCOL_OPTION2 is the name of the cluster node that was entered on the Cluster tab.
# We reference 'RSM_HPC_PROTOCOL_OPTION2' below, and the command will not succeed
# if it is not defined, so check it and report a specific error if it is not set.
if os.getenv("RSM_HPC_PROTOCOL_OPTION2") is None:
    print("RSM_HPC_ERROR=RSM_HPC_PROTOCOL_OPTION2 (Remote Cluster Node Name) not defined")
    sys.exit(1)
# See Below #2
# Basic cancel command starting point. 'bkill' is a placeholder here; the
# full script wraps the command in a plink/SSH invocation to the cluster node.
_plinkArgs = "bkill"
# See Below #3
# Append the job ID that RSM recorded when the job was submitted.
_jobid = os.getenv("RSM_HPC_JOBID")
if _jobid is None:
    print("RSM_HPC_ERROR=RSM_HPC_JOBID (job to cancel) not defined")
    sys.exit(1)
_plinkArgs += " " + _jobid
# See Below #4
# Run the command we have built and wait for it to finish.
_process = subprocess.Popen(shlex.split(_plinkArgs), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
_process.wait()
_error = False
# See Below #5
for line in _process.stderr:
    print('RSM_HPC_ERROR=' + line)
    _error = True
if _error:
    sys.exit(1)
sys.exit(0)
Note
This code references many RSM-set environment variables. For more information on what
environment variables are available and their contents, see “Environment Variables Set by
RSM” in the Remote Solve Manager User's Guide.
1. You can add any code you want to this section; code placed here will execute before the job is cancelled. Some code could also run at the end of the script, just before sys.exit(0), if extra steps must be taken after the job has been cancelled through the scheduler.
2. Basic LSF command line starting point. You would type bkill <job ID> at the command line in order
to cancel a job in LSF. We will continuously append arguments to this line as necessary to complete the
command. In this case, it’s only the job number being added in block #4.
3. Most blocks are composed of three parts: storing an environment variable to a local variable, testing to
ensure that a variable isn’t empty, and then appending some flag to the command line (or stopping the
command if an error is found) based on the findings. This environment variable is set by RSM. A list of these
useful variables can be found in “Custom Integration Environment Variables” in the Remote Solve Manager
User's Guide.
4. Popen finally “runs” the command we have been building. Then we wait for it to finish.
5. Finally, we simply print out all of the output along with a line that says that the command has finished,
just so we know it has run properly through RSM. Unlike the Submit command, the Cancel command has
no output requirements, as shown in the “Cancel Command” section of the Remote Solve Manager User's
Guide.
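The error-relay shape at the end of the cancel script can be demonstrated with a harmless stand-in command. In this sketch the child process is a hypothetical placeholder for the real scheduler call; only the stderr loop and the exit-code logic mirror the script above:

```python
import subprocess
import sys

# Stand-in child process that writes to stderr, in place of the real
# scheduler command (for example, bkill run through plink).
process = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stderr.write('no such job\\n')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)
out, err = process.communicate()

# Relay each stderr line to RSM with the RSM_HPC_ERROR= prefix and record
# that an error occurred, as the cancel script does.
error = False
for line in err.splitlines():
    print("RSM_HPC_ERROR=" + line)
    error = True

exit_code = 1 if error else 0
print(exit_code)  # 1
```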
2. Right-click on the newly added Compute Server under the queue’s folder (Client Side Integration Example)
and select Test.
3. When the test job completes, you can view job details in the RSM Progress Pane.
a. Check to see if any firewalls are turned on and blocking the connection between the two machines.
b. Make sure you can reach the machine(s) via the network.
c. Attempt to use plink.exe from the command prompt and connect to the remote machine this way.
iii. Ensure that the KEYPATH variable has been set up for passwordless SSH.
Remote Solve Manager Tutorial: Configuring Custom Server-Side
Cluster Integration R17
This tutorial walks you through the process of configuring Remote Solve Manager (RSM) for a “Custom”
Linux cluster using the “Server-Side Integration” technique. Use this technique only if your cluster has
one or more of these customization requirements:
• You need to run some additional custom code or make command line modifications that cannot be
easily accomplished through the load scheduler or RSM UI (or)
• The cluster has a customized interface so that the built-in default command line commands for the
above cluster types are not accepted (or)
• You are not using a supported load scheduler (LSF, PBS Pro, SGE/UGE, or Microsoft HPC) but you have
an open source cluster or a proprietary cluster that you want to integrate.
If none of these is the case, use the standard cluster setups, as they are easier to set up and support.
For more information, see the following appendices in the Remote Solve Manager User's Guide:
• “Appendix C. Integrating RSM with a Linux Platform LSF, PBS, or SGE (UGE) Cluster”
If your requirements are even stricter than those noted above, such as running an unsupported OS like
AIX or proprietary file transfer methods to the cluster, then see “Customizing Client-Side Integration”
in the Remote Solve Manager User's Guide and/or review “Remote Solve Manager Tutorial: Configuring
Custom Client-Side Cluster Integration R17.”
This tutorial is not meant to replace the user’s guide; for more information on custom integration, see
“Custom Cluster Integration Setup” in Remote Solve Manager User's Guide.
If this scenario does not suit your needs, see the other tutorials available on the ANSYS Customer Portal.
For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.
You can follow this tutorial while actually configuring RSM. To do so, simply make the selections that
are pertinent to you or insert your specific information where noted.
Once you’ve tested your configuration, you can follow the steps for submitting a Fluent, CFX, or
Mechanical job to RSM.
1. Before You Begin
These instructions assume the following:
• Both the Windows and the Linux machines are set up correctly on the network.
• You are not using the optional SSH file transfer protocol but instead are using native RSM communication
or OS File Transfer. For information on these file transfer types, see “Setting Up RSM File Transfers” in
the Remote Solve Manager User's Guide.
Note
If you are using SSH, see “Appendix B: Integrating Windows with Linux using SSH/SCP”
in the Remote Solve Manager User's Guide for instructions.
– RSM has been installed and both RSM Manager and Compute Server services have been started.
Pay particular attention to “Installing RSM Automatic Startup (Daemon) Services for Linux” in
the Remote Solve Manager User's Guide.
– You have at least RSM administrative privileges through the rsmadmins group, if not root privileges.
Pay particular attention to “Installing RSM Automatic Startup (Daemon) Services for Linux” in
the Remote Solve Manager User's Guide.
– You must be able to use password-less RSH (or SSH) from every node in the cluster to every other
node in the cluster.
• You are able to run ANSYS, Inc. products on the Linux cluster by submitting a command line (that is, the
ANSYS installation on the cluster has been verified).
For information on product and licensing installations, see the RSM tutorials on the Downloads page
of the ANSYS Customer Portal.
If you have any problems with, or questions about, the installation process, go to the Support page of
the ANSYS Customer Portal and submit an online support request.
For further information about tutorials and documentation on the ANSYS Customer Portal, go to http://support.ansys.com/docinfo.
Setting Up the RSM Client and Manager
3. Enter the Linux Cluster Name (or IP address) into the Name field and click Add.
4. Select both the local and new remote Manager and then click OK.
Check in the UI to verify that the new machine has shown up. The first time you connect to it,
it should prompt you to set a password (covered in step 5).
5. Cache your login on this machine to gain access to change the properties. Your system administrator
must have added your login to the rsmadmins group. You will set up the Manager service on the Linux
machine remotely from your Client machine to make the process easier. If you get a credentials error,
review the Before You Begin (p. 80) section and/or have your system administrator set up the cluster
as described and add you to the rsmadmins group.
6. The remote Manager is now added to the Client and can be configured.
Note
This tutorial will use Tester1rsm as the remote Manager in the examples. We will be
configuring this REMOTE Manager, not “My Computer” from now on.
2.2. Creating the RSM Compute Server for Custom Cluster Type “Keyword”
Perform the following steps on your Windows RSM Client machine to configure RSM to use a custom
server-side integrated cluster. In this section, we are adding a “Custom” Linux cluster as the Compute
Server that can have user-programmed inputs.
1. Underneath the remote Manager in the RSM tree view, right-click the Compute Servers folder and select
Add.
2. On the General tab of the Compute Server Properties dialog box, set properties as follows:
a. For the Display Name property, enter a descriptive name for the Linux machine being defined as a
Compute Server. This example will use Tester1 Custom Cluster.
b. In this example, the Compute Server services will be on the same machine as the Manager. Both are
on the custom cluster, so in this example we will set Machine Name to localhost.
e. If you do not have RSH enabled, check Use SSH protocol for inter and intra-node communication
(Linux only). This means that the local scripts will use SSH to contact other machines in the cluster.
3. On the Cluster tab of the Compute Server Properties dialog box, set properties as follows:
a. Set the Cluster Type property. In this example, we’ll select CUSTOM.
b. For the Custom Cluster Type property, enter a short, descriptive name. This is your keyword and will
need to be appended to some filenames later, so try to keep it simple. For this example we will use
SHEF01.
c. In this example, we use the optional Job Submission Arguments to override the queue name and force
it to be all.q, regardless of the queue Name created in the next section. This is not required; it is
shown here only as an example of the functionality. Often, this box is left blank so that any number
of queues can be set up for this Compute Server, as shown in the next section. Refer to your specific
cluster's documentation for the exact commands that can be used here.
4. On the File Management tab of the Compute Server Properties dialog box, set properties as follows:
a. Before we look at the Shared Cluster Directory, we should decide on a File Management method. For
this example, we will choose to run the job In the Shared Cluster Directory.
For more information on file management and directory handling, see “Compute Server Properties
Dialog: File Management Tab” in the Remote Solve Manager User's Guide.
b. For the Shared Cluster Directory property, enter the path to your central cluster file-staging directory.
This should be a directory that the cluster execution nodes share and all have mounted so that every
execution node can access the input files once they are moved there.
The Shared Cluster Directory is located on the machine defined on the General tab. In this
example, the General tab specifies localhost, and we have set up and are modifying the remote
Manager from the Client machine. So the directory reference will be to the remote machine. The
RSM job needs to find the shared directory there. In this example,
/path/to/shared/cluster/directory is a network share that all of the cluster nodes
have mounted.
Note
The directories you enter here must match the directory names exactly (capitalization
carries over to Linux). If the directory names do not match exactly, the process will fail.
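The exact-case requirement can be verified with a few lines of Python. This is an illustrative check only (the directory name is hypothetical); it relies on the fact that os.listdir returns entries with their on-disk spelling, so the string comparison is case-sensitive regardless of the filesystem:

```python
import os
import tempfile

def exact_case_match(parent, name):
    # True only when 'name' exists under 'parent' with identical
    # capitalization.
    return os.path.isdir(parent) and name in os.listdir(parent)

base = tempfile.mkdtemp()
os.mkdir(os.path.join(base, "Staging"))

print(exact_case_match(base, "Staging"))  # True
print(exact_case_match(base, "staging"))  # False: capitalization differs
```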
6. In the RSM tree view, expand the Compute Servers node to view the Compute Server you added (Tester1
Custom Cluster in this example).
2. Under General in the Queue Properties dialog box, enter a Name for this queue. In this example, we will
use Custom_Queue.
3. The Compute Server you added previously (Tester1 Custom Cluster in this example) appears under
Assigned Servers. Select the check box next to it to assign the server to this queue.
5. In the RSM tree view, expand the Queues node to view the queue you added (Custom_Queue in this example).
Note
If this is not your configuration as stated in Before You Begin (p. 80), then this scripting
method could fail. This method ensures that all users use the same scripts. A method for
applying different scripts for different groups is also allowed, but not covered in this tutorial
and is not the preferred method.
Setting Up Custom Code References
Recall your keyword from the section Creating the RSM Compute Server for Custom Cluster Type “Keyword”,
step 3b (p. 83). Now we need to find the files for the supported cluster type that is most like the cluster
you are customizing. The base scripts for all of the supported clusters are located in the directory in
step 1 (below); choose the scripts to copy to your <keyword> files based on your specific needs. In
this example, our cluster is actually a UGE cluster that we are customizing to add some extra features,
so we will start from the SGE version of the scripts (denoted by “_SGE” in their names):
2. Make a copy of hpc_commands_SGE.xml and call the copy hpc_commands_SHEF01.xml. For this
example, the <keyword> is SHEF01.
3.3.1. Modifying the Job Configuration File for the New Cluster Type
As part of the setup, you must add an entry for your custom cluster keyword in the jobConfiguration.xml file, and reference the HPC commands file that is needed for this cluster job type.
2. Open the jobConfiguration.xml file and add an entry for your custom cluster job type. The sample
entry below is for the SHEF01 keyword that we established earlier, and points to the custom
hpc_commands_SHEF01.xml file. Use your own keyword and HPC commands file name where appropriate.
<keyword name="SHEF01">
<jobCode name="GenericJobCode_SHEF01.xml">
<include name="GenericJobCode_base.xml"/>
</jobCode>
<hpcCommands name="hpc_commands_SHEF01.xml">
</hpcCommands>
</keyword>
3.3.2. Modifying the Custom HPC Commands File to Reference Custom Scripts
As part of the setup, you must edit the cluster-specific HPC Commands file provided as part of the RSM
installation. A reference example of an unmodified HPC Commands file will be followed by instructions
on how to modify it and an example of the completed HPC Commands file.
Note
Commands files for different cluster types can differ considerably, so this file may not look like
yours if you started from the LSF or PBS scripts. You should still find similarly named sections,
even if the actual commands differ from the SGE/UGE commands shown.
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-submit</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_JOBID</variableName>
</outputs>
</command>
</postcommands>
</submit>
<cancel>
<primaryCommand name="cancel">
<application>
<app>qdel</app>
</application>
<arguments>
<arg>%RSM_HPC_JOBID%</arg>
</arguments>
</primaryCommand>
</cancel>
<queryStatus>
<primaryCommand name="queryStatus">
<application>
<app>qstat</app>
</application>
<arguments>
<arg>-u %RSM_HPC_USER%</arg>
<arg noSpaceOnAppend="true">
<value>,%RSM_HPC_PROTOCOL_OPTION1%</value>
<condition>
<env name="RSM_HPC_PROTOCOL_OPTION1">ANY_VALUE</env>
</condition>
</arg>
</arguments>
</primaryCommand>
<postcommands>
<command name="parseStatus">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-status</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_STATUS</variableName>
</outputs>
</command>
</postcommands>
</queryStatus>
<queryQueues>
<primaryCommand name="queryQueues">
<application>
<app>qconf</app>
</application>
<arguments>
<arg>-sql</arg>
</arguments>
</primaryCommand>
<postcommands>
<command name="checkQueueExists">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-queues</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_QUEUE_DEFINED</variableName>
</outputs>
</command>
</postcommands>
</queryQueues>
<queryPe>
<primaryCommand name="queryPe">
<application>
<app>qconf</app>
</application>
<arguments>
<arg>-spl</arg>
</arguments>
</primaryCommand>
<postcommands>
<command name="checkPeExists">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-pe</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_PE_DEFINED</variableName>
</outputs>
</command>
</postcommands>
</queryPe>
<!--<queryQacct>
<primaryCommand name="queryQacct">
<application>
<app>qacct</app>
</application>
<arguments>
<arg>-j %RSM_HPC_JOBID%</arg>
</arguments>
</primaryCommand>
<postcommands>
<command name="parseQacct">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/sgeParsing.py</pythonapp>
</application>
<arguments>
<arg>-qacct</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_QACCT</variableName>
</outputs>
</command>
</postcommands>
</queryQacct>-->
</jobCommands>
In the HPC Commands file shown above, you need to do two things:
1. Replace the entire Submit command, between <primaryCommand name="submit"> and
</primaryCommand>, with the new (much shorter) code reference to
%RSM_HPC_SCRIPTS_DIRECTORY%/CustomSubmissionCode.py as shown below.
2. Replace the entire Cancel command, between <primaryCommand name="cancel"> and
</primaryCommand>, with the new code reference to
%RSM_HPC_SCRIPTS_DIRECTORY%/CustomCancelCode.py as shown below. Modifications are in bold text.
Note
Replacing the references here means that when RSM needs to submit or cancel a job, it will
use this new code to do so. Changes made to these scripts take effect in RSM immediately.
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-submit</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_JOBID</variableName>
</outputs>
</command>
</postcommands>
</submit>
<cancel>
<primaryCommand name="cancel">
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY%/CustomCancelCode.py</pythonapp>
</application>
<arguments>
</arguments>
</primaryCommand>
</cancel>
<queryStatus>
<primaryCommand name="queryStatus">
<application>
<app>qstat</app>
</application>
<arguments>
<arg>-u %RSM_HPC_USER%</arg>
<arg noSpaceOnAppend="true">
<value>,%RSM_HPC_PROTOCOL_OPTION1%</value>
<condition>
<env name="RSM_HPC_PROTOCOL_OPTION1">ANY_VALUE</env>
</condition>
</arg>
</arguments>
</primaryCommand>
<postcommands>
<command name="parseStatus">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-status</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_STATUS</variableName>
</outputs>
</command>
</postcommands>
</queryStatus>
<queryQueues>
<primaryCommand name="queryQueues">
<application>
<app>qconf</app>
</application>
<arguments>
<arg>-sql</arg>
</arguments>
</primaryCommand>
<postcommands>
<command name="checkQueueExists">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-queues</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_QUEUE_DEFINED</variableName>
</outputs>
</command>
</postcommands>
</queryQueues>
<queryPe>
<primaryCommand name="queryPe">
<application>
<app>qconf</app>
</application>
<arguments>
<arg>-spl</arg>
</arguments>
</primaryCommand>
<postcommands>
<command name="checkPeExists">
<properties>
<property name="MustRemainLocal">true</property>
</properties>
<application>
<pythonapp>%RSM_HPC_SCRIPTS_DIRECTORY_LOCAL%/ugeParsing.py</pythonapp>
</application>
<arguments>
<arg>-pe</arg>
<arg>
<value>%RSM_HPC_PARSE_MARKER%</value>
<condition>
<env name="RSM_HPC_PARSE_MARKER">ANY_VALUE</env>
</condition>
</arg>
</arguments>
<outputs>
<variableName>RSM_HPC_OUTPUT_PE_DEFINED</variableName>
</outputs>
</command>
</postcommands>
</queryPe>
</jobCommands>
Note
• You can use other types of code, such as C++; simply place the compiled (executable) code in the <app> </app> section. Arguments are not required. For Python, an interpreter is included in the ANSYS Workbench installation, which is why it is referenced here. To use Python, replace <app> </app> with <pythonapp> </pythonapp> as shown and enter the Python code file name.
• Any custom code that you provide as part of the customization must also be located
in the [RSMInstall]\RSM\Config\scripts directory of the remote (Manager machine)
installation. Alternatively, enter a full path to the script along with its name.
"""
Copyright (C) 2015 ANSYS, Inc. and its subsidiaries. All Rights Reserved.
import sys
import ansLocale
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform
print('RSM_HPC_DEBUG=Debug statements need to be turned on in the RSM job window')
# See Below #1
print('RSM_HPC_WARN=This is what a warning looks like')
print('RSM_HPC_ERROR=This is what an error message looks like')
print("Standard output looks like this; you don't need the special RSM tags")
print('End custom coding')
# See Below #2
# Code below is for Clusterjobs submission to a standard SGE cluster.
# Arguments are appended to the variable _ClusterjobsSubmit one block at a
# time to incorporate all of the variables that "might" be set by RSM.
# These can be modified if necessary.
_ClusterjobsSubmit = "qsub -S /bin/sh -V -R y"
# See Below #3
# Check that the Jobname Exists, if so Add it to the command line.
_jobname = os.getenv("RSM_HPC_JOBNAME")
if _jobname is not None:
    _ClusterjobsSubmit += " -N \\\"" + _jobname + "\\\""
# Parallel environment names are site-specific; adjust them to match your SGE configuration.
# (_SharedMemoryEnvironmentName is referenced below; 'pe_smp' is a placeholder value.)
_SharedMemoryEnvironmentName = 'pe_smp'
_DistributedMemoryEnvironmentName = 'pe_mpi'
# The number of cores should always be defined by RSM, but check anyway.
# Check whether the job is distributed and choose the parallel environment accordingly.
_numcores = os.getenv("RSM_HPC_CORES")
_distributed = os.getenv("RSM_HPC_DISTRIBUTED")
if _numcores is not None:
    if _distributed is None or _distributed == "FALSE":
        _ClusterjobsSubmit += " -pe " + _SharedMemoryEnvironmentName + " " + _numcores
    else:
        _ClusterjobsSubmit += " -pe " + _DistributedMemoryEnvironmentName + " " + _numcores
_nativeOptions = os.getenv("RSM_HPC_NATIVEOPTIONS")
if _nativeOptions is not None:
    _ClusterjobsSubmit += " " + _nativeOptions
# Check whether the staging directory is defined. If not, log an error but do not exit.
# If it is, add it as the qsub working directory.
_staging = os.getenv("RSM_HPC_STAGING")
if _staging is None:
    print("RSM_HPC_ERROR=RSM_HPC_STAGING is not defined; please define it and restart the RSM services")
else:
    _ClusterjobsSubmit += " -wd " + _staging
# See Below #4
# Don't expand RSM_HPC_COMMAND here, since $AWP_ROOTxxx needs to be expanded later, on the cluster.
_qsubCommand = os.getenv("RSM_HPC_COMMAND")
if _qsubCommand is not None:
    _ClusterjobsSubmit += " " + _qsubCommand
# Split the string into a list of arguments that subprocess.Popen can read.
_argList = shlex.split(_ClusterjobsSubmit)
# Printing START tells RSM to ignore all output above this point,
# i.e. not to look for SGE submit output before this line. If START is not printed,
# RSM assumes the start is at the top of the output and tries to interpret everything.
print('START')
# See Below #5
# Run the command we created.
process = subprocess.Popen(_argList, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=os.getcwd())
# Wait for the command to finish.
try:
    while process.poll() is None:
        time.sleep(1)
except:
    pass
print("RSM_HPC_DEBUG=qsub command finished")
# See Below #6
# Just dump the standard output to print. RSM can interpret SGE output directly,
# as long as the correct parser is selected in <parseSubmit> in the HPC commands file.
for line in process.stdout:
    print(line)
# Job is finished with no errors; exit 0 means everything is fine.
sys.exit(0)
Note
This code references many RSM-set environment variables. For more information on which
environment variables are available and what they contain, see “Environment Variables Set by
RSM” in the Remote Solve Manager User's Guide.
1. You can add any code you want to this section; code placed here will execute before the job is submitted.
You can also stop the job from submitting by adding checks around the Submit command, if desired.
2. Basic SGE command line starting point. We will continuously append arguments to this line as necessary
to complete the command.
3. Most blocks are composed of three parts: storing an environment variable to a local variable, testing to
ensure that a variable either isn’t empty or contains a special value, and then appending some flag to the
command line based on the findings.
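The three-part pattern described in note 3 can be sketched in isolation. The helper and variable names here are illustrative, not part of the tutorial script:

```python
import os

def append_flag(command, env_var, flag):
    """Read an RSM environment variable; if it is set, append
    the given qsub flag and its value to the command line."""
    value = os.getenv(env_var)                # part 1: store the variable locally
    if value is not None:                     # part 2: test that it is set
        command += " " + flag + " " + value   # part 3: append the flag and value
    return command

# Simulate an RSM-set variable for demonstration.
os.environ["RSM_HPC_JOBNAME"] = "demo_job"
print(append_flag("qsub", "RSM_HPC_JOBNAME", "-N"))  # qsub -N demo_job
```

If the variable is unset, the command line is returned unchanged, which is why a missing optional variable never breaks the submission.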
4. One of the final actions is to read the RSM_HPC_COMMAND variable and append it to the submission
command. This command is created by RSM and contains the command line to run the ClusterJobs
script that can complete the submission process. It creates the full command line for ANSYS by using the
controls file created by the individual add-ins. ANSYS suggests that you always use the RSM_HPC_COMMAND
to submit a job whenever possible because of the complexities of the ANSYS command line for different
solvers and on different platforms.
5. Popen finally “runs” the command we have been building. Then we wait for it to finish.
6. Finally, print any output that came from it so RSM can interpret it and obtain the job #.
Since this script is a Submit script, there are many options for the qsub command. The custom script
for the Cancel command is much simpler, although it contains the same basic parts. That script is
covered in the next section.
"""
Copyright (C) 2015 ANSYS, Inc. and its subsidiaries. All Rights Reserved.
import sys
import ansLocale
import os
import tempfile
import os.path
import shutil
import glob
import shlex
import subprocess
import time
import platform
print('RSM_HPC_DEBUG=Custom Cancel command running')
print('Begin Custom Coding')
# See Below #1
print('RSM_HPC_WARN=Warning test')
print('RSM_HPC_ERROR=Error Test')
print('End custom coding')
# See Below #2
# Code below cancels a job on a standard SGE cluster. The cancel command is built up
# in the variable _SGEjobsCancel by appending any needed variables from RSM.
# These can be modified in any way necessary.
_SGEjobsCancel = "qdel"
# See Below #3
# Check that the job ID exists. If not, log an error and exit with a failure code.
# If it does, append it to the qdel command as its argument.
_jobid = os.getenv("RSM_HPC_JOBID")
if _jobid is None or _jobid == ' ':
    print("RSM_HPC_ERROR=RSM_HPC_JOBID is not defined; there has been an error in the job submission")
    sys.exit(1)
else:
    _SGEjobsCancel += " " + _jobid
# Split the string into a list of arguments that subprocess.Popen can read.
_argList = shlex.split(_SGEjobsCancel)
# Printing START tells RSM to ignore all output above this point,
# i.e. not to look for SGE cancel output before this line. If START is not printed,
# RSM assumes the start is at the top of the output and tries to interpret everything.
print('START')
# See Below #4
# Run the command we created.
process = subprocess.Popen(_argList, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                           cwd=os.getcwd())
# Wait for the command to finish.
try:
    while process.poll() is None:
        time.sleep(1)
except:
    pass
print("RSM_HPC_DEBUG=cancel command finished, printing output")
# See Below #5
# Just dump the standard output to print.
for line in process.stdout:
    print(line)
# Script is finished with no errors; exit 0 means everything is fine.
sys.exit(0)
Note
This code references many RSM-set environment variables. For more information on what
environment variables are available and their contents, see “Environment Variables Set by
RSM” in the Remote Solve Manager User's Guide.
1. You can add any code you want to this section; code placed here will execute before the job is cancelled.
Some code could also run at the end of the script, just before sys.exit(0), if extra steps must be
taken after the job has been cancelled through the scheduler.
2. Basic SGE command line starting point: qdel is what you would type at the command line in order to
cancel a job in SGE. We will continuously append arguments to this line as necessary to complete the
command.
3. Most blocks are composed of three parts: storing an environment variable to a local variable, testing to
ensure that a variable isn’t empty, and then appending some flag to the command line (or stopping the
command if an error is found) based on the findings. This environment variable is set by RSM. A list of these
useful variables can be found in “Custom Integration Environment Variables” in the Remote Solve Manager
User's Guide.
4. Popen finally “runs” the command we have been building. Then we wait for it to finish.
5. Finally, print any output that came from it so RSM can interpret it if needed.
2. Right-click the newly added Compute Server under the queue's folder (Tester1 Custom Cluster) and
select Test.
3. When the test job completes, you can view job details in the RSM Progress Pane.
a. Check to see if any firewalls are turned on and blocking the connection between the two machines.
b. Make sure you can reach the machine(s) via the network.
c. Add RSM ports to the firewall as needed. If a local firewall is turned on (on the Compute Server and
RSM Client machines), you will need to add two ports, 817x and 917x, to the Exceptions List for RSM.
Replace the x with the point version number of your R17 product. For example, add ports 8170
and 9170 for version 17.0, and 8171 and 9171 for version 17.1.
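The port numbering above follows a simple pattern, which can be expressed as a small helper (the function name is illustrative):

```python
def rsm_firewall_ports(point_version):
    """Return the two RSM ports to open for a given R17 point version
    (0 for 17.0, 1 for 17.1, and so on), per the 817x/917x pattern."""
    return (8170 + point_version, 9170 + point_version)

print(rsm_firewall_ports(0))  # (8170, 9170) for version 17.0
print(rsm_firewall_ports(1))  # (8171, 9171) for version 17.1
```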