GPAC_DAMI
DM Tool
User Guide
Table of Contents

Table of Contents ...................................... 1
Document History ....................................... 4
Overview of this guide ................................. 5
Assumptions ............................................ 5
Document Support ....................................... 5
Abbreviations used in this Guide ....................... 5
T24 Modules Supported .................................. 5
1. Introduction ........................................ 6
2. Product Overview
3. Workflow Diagram
4.1. DM.INFORMATION .................................... 8
4.2. DM.DATA.TRANSLATION ............................... 8
4.3. DM.PREREQUISITE ................................... 10
4.4. DATA.MAPPING.VERIFICATION ......................... 11
4.5. DM.MAPPING.DEFINITION ............................. 12
4.6. DM.SERVICE.CONTROL ................................ 14
4.7. DM.SERVICE.SCHEDULER .............................. 16
4.8. DM.AA.MAPPING ..................................... 17
4.9. DM.DATA.COMPARE ................................... 17
4.10. DM.EXTRACTOR ..................................... 18
4.11. DM.MAPPING.EXTRACT ............................... 21
4.12. DM.AA.EXTRACT .................................... 21
4.13. T24.SAB.INFO ..................................... 22
5. Process workflow
5.1. Menus ............................................. 24
5.1.2.8 AA.MAPPING ..................................... 93
5.2.3. T24.AA.INT.ACT.DEFAULT .......................... 231
5.2.9. DM.DATA.TRANSLATION ............................. 261
Document History

Author           Version   Date                  Comments
Gpack            1.0       11th April 2013
Gpack            2.0       5th August 2013
Data Migration   3.0       13th September 2013
Assumptions
This reference guide was prepared across different T24 releases, so the screenshots may be taken from any release. Wherever a specific release is mentioned, the content applies equally to the actual release in use.
Document Support
Any queries about any part of this document can be addressed to gpack_pack.requests
<gpack_pack.requests@temenos.com>
T24 Modules Supported

CUSTOMER
ACCOUNT
COLLATERAL
CURRENCY
SECURITY.MASTER
AA.ARRANGEMENT.ACTIVITY, etc.
1. Introduction
The T24 Data Migration tool was originally written using the EB.PHANTOM option. Phantom processing in T24 is gradually moving from EB.PHANTOM to TSA.SERVICE, and the Data Migration tool has likewise been modified to run as a Service. The original functionalities of the tool are retained.
2. Product Overview
The T24 Data Migration tool is used during the pre-Live phase of any site implementation. It covers the following functionalities:
- Uses the standard OFS module to perform the load. The option is given to use either OFS or the standard jBase WRITE function to perform the load.
- Accepts data in any layout and format, including double-byte characters (provided the jBase release used for the implementation is 4.1 or higher).
- Performs all standard T24 validations and also handles special cases to perform local validations on the data being loaded.
- Exception handling reports any erroneous data present in the data file. The report includes a detailed description of the error raised during the load.
3. Workflow Diagram
4.1. DM.INFORMATION
Counting the number of records for every application, before and after migration, is a vital part of Data Migration. The count per application is available in the live file DATA.MIG.COUNT. This application keeps track of the changes in the load count. The entire operation can be completed using the Browser.
No.   Field Name    Description
      @ID           ID must be SYSTEM.
1.1   CLIENT.NAME
2.1   DESCRIPTION
      MIG.TYPE
      SELECT.LIST
      MNEMONIC
6.1   APPLICATION
7.1   LIVE.COUNT
8.1   NAU.COUNT
9.1   HIS.COUNT
10    OVERRIDE

Table 1 DM.INFORMATION
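The before/after count check that DM.INFORMATION drives can be illustrated with a small sketch. The application names and counts below are hypothetical; in T24 the tool stores these figures in the live file DATA.MIG.COUNT rather than in plain dictionaries.

```python
# Sketch: compare per-application record counts before and after a load.
# The application names and counts are illustrative stand-ins for the
# LIVE.COUNT / NAU.COUNT / HIS.COUNT figures tracked in DATA.MIG.COUNT.

def count_diff(before: dict, after: dict) -> dict:
    """Return the change in record count per application."""
    return {app: after.get(app, 0) - before.get(app, 0)
            for app in set(before) | set(after)}

before = {"CUSTOMER": 0, "ACCOUNT": 0}
after = {"CUSTOMER": 1500, "ACCOUNT": 4200}
print(count_diff(before, after))
```

A non-zero difference for an application is what the verify step surfaces as a changed load count.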
4.2. DM.DATA.TRANSLATION
The data provided by the client will contain values in the legacy data format, which cannot be loaded into T24 without translation. A valid translation must therefore be performed so that the data is converted into T24 format and can be loaded. The application DM.DATA.TRANSLATION is used for converting legacy data into T24 format.
No.     Field Name       Description
        @ID
        APPLICATION
        SOURCE.DIR
        SOURCE.FNAME
        TARGET.DIR
        TARGET.FNAME
        FM.DELIM         The marker used in the data file to delimit field values. By default this is ::, which can be modified by the user.
        VM.DELIM         The marker used in the data file to delimit multi-values. By default this is !!, which can be modified by the user.
        SM.DELIM         Any concat file name that will be used for the conversion can be provided in this field. This is a multivalue field.
9.1     MAPPING.TABLE
10.1.1  POS.CONVERT
11.1.1  POS.VALUE
12.1.1  POS.ROUTINE
13      NO.OF.SERVICE

Table 2 DM.DATA.TRANSLATION
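A minimal sketch of the translation step, using the default delimiters described above (:: between fields, !! between multi-values). The record layout and the legacy-to-T24 mapping table are hypothetical; in the real tool the mapping comes from the MAPPING.TABLE / POS.CONVERT setup.

```python
# Sketch of legacy-to-T24 value translation (illustrative only).
# "::" delimits fields and "!!" delimits multi-values, matching the
# DM.DATA.TRANSLATION defaults.

FM, VM = "::", "!!"

def translate_record(record: str, position: int, mapping: dict) -> str:
    """Translate the value(s) at one field position via a mapping table."""
    fields = record.split(FM)
    values = fields[position].split(VM)
    fields[position] = VM.join(mapping.get(v, v) for v in values)
    return FM.join(fields)

legacy = "100045::M!!F::EN"            # hypothetical legacy record
codes = {"M": "MALE", "F": "FEMALE"}   # legacy code -> T24 value
print(translate_record(legacy, 1, codes))  # 100045::MALE!!FEMALE::EN
```

Values with no entry in the mapping table pass through unchanged, which mirrors how untranslated data would flow straight into the converted file.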
4.3. DM.PREREQUISITE
This table is used to create folders, copy data files to the respective folders, and create the DM.SERVICE.CONTROL (DSC) records that are necessary for loading data into T24. Using the application DM.PREREQUISITE we can create the necessary folders and move the data files to their respective paths. We can also create folders for placing reconciliation extracts from T24.
No.    Field Name           Description
       @ID
       DESCRIPTION
       PROCESS
       DMD.PREFIX
       DIRECTORY.NAME
       SELECT.LIST
6.1    DMD.NAME
7.1    FILE.NAME            E.g.: CUSTOM.01.201203031653.txt
12     RECONCILIATION
13     RECON.DIR
14.1   RECON.SPEC.NAME
15.1   DM.EXTRACTOR.NAME
16     FILE.CONV.REQD
17     FILE.LOAD.TYPE
18     CONVERSION.TYPE

Table 3 DM.PREREQUISITE
4.4. DATA.MAPPING.VERIFICATION
Verifying the correctness of the Data Mapping Table and the DM.MAPPING.DEFINITION is a vital part of Data Migration; without it, the data load could go wrong. DATA.MAPPING.VERIFICATION verifies whether the Data Mapping Table and the DM.MAPPING.DEFINITION are in synchronisation.
This application also produces an output file in the specified DMV directory with the status of the verification.
No.    Field Name           Description
       @ID                  The id of this application is free format text, 12 characters long.
       DESCRIPTION
       APPLICATION.NAME
       DMD.NAME
       DIRECTORY.NAME
       MAPPING.STATUS       If the fields in the DMD and those in the save list are the same, without any mismatch, this is updated as PASS.
       CORE.LIST            The name of the file where the core fields are given for comparison. This file is to be placed in DATA.IN.
       LOCAL.LIST
8.1    CORE.FIELDS
9.1    CO.FIELD.STATUS
10.1   LOCAL.FIELDS
11     DATAFILE.NAME        Specifies the data file name. The file has to be placed in the DMV directory.
12     VERIFICATION.TYPE
13     FIRST.POSITION
14     LAST.POSITION
15     DATAFILE.COUNT
16     ERROR.DESC

Table 4 DATA.MAPPING.VERIFICATION
4.5. DM.MAPPING.DEFINITION
This application allows the user to define the format of the data in the incoming tape. The following provides the list of fields, positions and the associated descriptions.

No.     Field Name        Description
        @ID               The id of this application is free format text, 12 characters long. E.g. R12.V.CUSTOM, R12.A.CUSTOM, R12.I.CUSTOM.
        DESCRIPTION
        APPLICATION.NAME
        LOAD.TYPE
        OFS.ACTION
        OFS.FUNCTION
        FILE.TYPE
        OFS.VERSION
        IN.DATA.DEF
        FM.DELIM
        VM.DELIM
        SM.DELIM
        ESC.SEQ.FR
        ESC.SEQ.TO
14      ID.TYPE
15      ID.ROUTINE
16      ID.POSITION
17      ID.LENGTH         The length of the ID in each record in the data file. A value is allowed only when ID.TYPE is set to DATA.
18.1    APPL.FIELD.NAME
19.1    FIELD.POSITION
20.1    FIELD.LENGTH
21.1    FIELD.ATTRIB
22.1    FIELD.VALUE
23      POST.UPDATE.RTN
24      OFS.SOURCE

Table 5 DM.MAPPING.DEFINITION
4.6. DM.SERVICE.CONTROL

No.     Field Name        Description
        @ID
        UPLOAD.COMPANY
        FLAT.FILE.DIR     The directory where the flat file resides. This must be a valid directory name, e.g. DATA.BP. The actual path or relative path of the directory must be provided.
        FLAT.FILE.NAME    The flat file name of the incoming data. Must be an existing file name.
        SERVER.NAME
        USER              The user name which will be used for the load process. This username will be updated in the TSA.SERVICE record related to the DM.SERVICE.CONTROL.
        NO.OF.SESSIONS
        MOVE.TO.HIST
10      RUN.STATUS
11      DSC.TYPE
12      PRE.PROCESS
15.1.1  CONTROL.LOG
16.1    ERROR.DETAILS     The errors generated after completion of the process are captured in this field. Auto updated.
17.1    NO.OF.ERROR
18.1    TYPE.OF.ERROR
19      ERROR.FILE.PATH   If an error is generated, this field indicates the path where the error log is stored. Auto updated.
20      ERROR.FILE.NAME   If an error is generated, this field indicates the file name under which the error log is stored. Auto updated.
21      DATE              The current date of the load process is updated in this field. Auto updated.
22      STARTED
23      STOPPED
        ELAPSED

Table 6 DM.SERVICE.CONTROL
4.7. DM.SERVICE.SCHEDULER
The services for validating, loading and authorising data in T24 need to be scheduled and automated. Most of these tasks were performed manually, which needed to be eliminated. The multithreaded service performs the tasks in an automated way, eliminating the manual effort to a large extent.

No.    Field Name          Description
       @ID                 The ID for this application must be a user-defined name for the SCHEDULER, e.g. R11.DSS.STATIC.
1.1    DESCRIPTION
       DSS.TYPE
       VERIFICATION.DSS
       RUN.STATUS
5.1    DM.SERVICE.ID
6.1    VERIFICATION
7.1    CURRENT.DIRECTORY   Stores the specified data file path for the services. This is an associated multi-value field.
8.1    DATA.FILE.NAME      Specifies the data file name used to validate or load the data. This is an associated multi-value field.
9.1    NEW.DIRECTORY       Specifies the new data file path for the services. This is an associated multi-value field.
10.1   ERR.THRESHOLD
11     NO.OF.ERRORS
12.1   SKIP.SERVICE
13.1   DSC.STATUS
14     DSS.STATUS
15     STATUS.DETAIL

Table 7 DM.SERVICE.SCHEDULER
4.8. DM.AA.MAPPING

No.  Field Name     Description
     @ID            The id for this application must be in the format: a 3-digit number followed by an asterisk (*), followed by a valid user name that must be present in the USER table. If only the USER is provided, the 3-digit number is defaulted automatically.
1    DESCRIPTION
     EXTRACT.TYPE
     VERSION.CHK
     PRODUCT.LINE

Table 8 DM.AA.MAPPING
4.9. DM.DATA.COMPARE
The reconciliation extract is taken from T24 after completion of the load, as per the process, and the reconciliation extract from the legacy system is also extracted and provided in the DATA.IN directory.
This table is used to compare both extracts and to provide the output of the comparison.
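The comparison can be pictured with a small sketch. The record layout and delimiter are illustrative; the real application reads the files from the configured source and target paths and keys each record on the field at ID.POSITION.

```python
# Sketch of the extract comparison performed by DM.DATA.COMPARE
# (illustrative). Records are keyed on the field at ID.POSITION and
# compared line by line; differences are classified.

FM = "::"  # field delimiter, as in the FM.DELIM examples in this guide

def compare_extracts(source: list, target: list, id_pos: int = 0) -> list:
    """Return (id, status) pairs describing differences between extracts."""
    def index(lines):
        return {l.split(FM)[id_pos]: l for l in lines}
    src, tgt = index(source), index(target)
    diffs = []
    for key, line in src.items():
        if key not in tgt:
            diffs.append((key, "MISSING IN TARGET"))
        elif tgt[key] != line:
            diffs.append((key, "MISMATCH"))
    diffs += [(k, "MISSING IN SOURCE") for k in tgt if k not in src]
    return diffs

legacy = ["100045::1000.00", "100046::250.00"]
t24 = ["100045::1000.00", "100047::99.00"]
print(compare_extracts(legacy, t24))
```

An empty result means the legacy and T24 extracts reconcile; anything else would go into the output file in the configured output path.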
No.   Field Name         Description
      @ID
      DESCRIPTION
      COMPARE.TYPE
      ID.POSITION        The field position of the primary key (ID) in the file that is provided for comparison.
      FM.DELIM
      VM.DELIM
      SM.DELIM
      HEADER             Flags whether the source and target extracts for the file compare have headers or not.
      LEGACY.FILE.NAME   Used for information only, to store the legacy data extract.
      SOURCE.FILE.PATH
10    SOURCE.FILE.NAME
11    TARGET.FILE.PATH
12    TARGET.FILE.NAME
13    OUTPUT.FILE.PATH
14    OUTPUT.FILE.NAME
15    NO.OF.AGENTS
16    APPLICATION.NAME
17.1  FIELD.POSITION     Specifies the position of the field; applicable only for environment compare.
18.1  SOURCE.FIELD

Table 9 DM.DATA.COMPARE
4.10. DM.EXTRACTOR
Extraction of data from T24 is necessary when a migration takes place from an older release of T24 to a newer release, or for reconciliation extracts.
The application DM.EXTRACTOR is used for extracting data from T24.
No.     Field Name        Description
        @ID               The id of this application is free format text, 12 characters long.
        DESCRIPTION
        APPLICATION
        MULTI.COMPANY     Whether the extraction is multi-company or not.
        FILE.TYPE
        OFS.ACTION
        SEL.DESC
7.1     SEL.FLD
8.1     SEL.CRIT
9.1     SEL.VALUE
10.1    SEL.COMB
11      SELECT.LIST       The save list name must be entered in this field.
12      TARGET.FILE
13      OUTPUT.DIR
14      FM.DELIM
15      VM.DELIM
16      SM.DELIM          The sub-value marker that will be used in the extract to separate the sub-values.
17      LOCAL.CCY
18      DC.TOA.CATEGORY
19      DC.DAO.START.ID
20      FILE.HEADER
21.1    FLD.LABEL         The field description for the field to be extracted.
22.1    FLD.APP           The application name from which the concurrent extract of the value is done.
23.1    FLD.TYPE          The extract type for the field is specified here.
24.1.1  FLD.NAME
25.1    TRANS.CONDITION
26.1    TRANS.VALUE       The value provided depends on the type mentioned in the field TRANS.CONDITION. This field becomes mandatory when TRANS.CONDITION is provided.
27.1    FLD.RTN
28.1.1  FLD.APP.FLD
29      DATACAPTURE       Whether the extract is done for the DATA.CAPTURE application or not.
30      TRANSACTION.CR
31      TRANSACTION.DR

Table 10 DM.EXTRACTOR
4.11. DM.MAPPING.EXTRACT

No.  Field Name      Description
     @ID
     APPL.NAME
     CORE.FLD.NAME   The core field name of the application for which the extraction has to be done.
     SYS.PROG        Indicates what type of core validation check is used for that particular field.
     FMT
     SYS.FLD.NO
     USR.FLD.NAME    The local field name of the application for which the extraction has to be done.
     USR.SYS.PROG
     USR.FLD.NO
     USR.FMT

Table 11 DM.MAPPING.EXTRACT
4.12. DM.AA.EXTRACT

No.  Field Name    Description
     @ID
     DESCRIPTION
     PRODUCT.LINE  Specifies the product line for which the extract is to be taken. Only two options are valid: Deposits or Lending.
3    FM.DELIM
     VM.DELIM
     SM.DELIM
     OUTPUT.DIR
     EXTRACT.NAME

Table 12 DM.AA.EXTRACT
4.13. T24.SAB.INFO

No.  Field Name           Description
     @ID
     DAB.INFO.DETAILS
     SOURCE.SERVER
     COMPANY.ID
     SOURCE.SL.ID
     INFO.DATE
     INFO.STATUS
     DAB.SOURCE.DETAILS
     SOURCE.DATE
     SOURCE.STATUS
10   DAB.TARGET.DETAILS
11   TARGET.SERVER
12   TARGET.DATE
13   TARGET.STATUS
14   DAB.COMPARE.DETAIL
15   REVERSE.COMPARE
16   COMPARE.DATE
17   COMPARE.STATUS
18   OAB.SOURCE.DETAILS
19   OAB.SOURCE.SL.ID
20   OAB.SOURCE.DATE
21   OAB.SOURCE.STATUS
22   OAB.TARGET.DETAILS
23   OAB.TARGET.SL.ID
24   OAB.TARGET.DATE
25   OAB.TARGET.STATUS
26   OAB.COMPARE.DETAILS
27   OAB.COMPARE.DATE
28   OAB.COMPARE.STATUS

Table 13 T24.SAB.INFO
5. Process workflow
This section covers the functionalities involved in the DMIG tool. Each of the tables and their related functionalities is explained, and sample setups are given. Please use the menu MB.DMMAIN for operations using the DM Tool.
It has three menus, namely Migration Pre-Requisite Check, Client System to T24 Migration and T24 to T24 Migration, to facilitate efficient working with the DM tool.
A sample screenshot of the screen is as shown.
5.1. Menus
System Dates
Table
DATES
Purpose
The DATES application is checked for the current system Date before start of Data Migration.
ENQUIRY DM.TOOL.DETAILS
Purpose
Display the details of the DM Tool that will be used for the validation and Load.
5.1.1.1
Table
DM.INFORMATION
Purpose
The application is used to summarise the number of records loaded. The count per application is available in the live file DATA.MIG.COUNT. This application keeps track of the changes in the load count.
This is the table where the parameters are provided. It can be accessed from the command line of the Browser.
The ID for this application must be only SYSTEM.
After selecting the correct ID you will get a screen as below. Input the application name and commit it.
DM.INFORMATION Verify
1. The count for each application is done and updated in the fields LIVE Count, NAU Count and HIS Count.
2. The value is updated in the live file DATA.MIG.COUNT.
After committing the record we need to verify it. Select the record name to be verified and click the button as shown.
The latest count of the application in the area will be the first multivalue set.
This menu contains applications that are used while migrating data from a Client's system to T24.
It consists of T24 Data Loading and T24 Recon Data Extract.
A sample screenshot of the Screen is as shown.
5.1.2.2
Table
DM.PREREQUISITE
Purpose
Creating folders, copying data files to the respective folders and creating the DM.SERVICE.CONTROL (DSC) records are necessary steps for loading data into T24. DM.PREREQUISITE lets us create the necessary folders and move the data files to their respective paths. We can also create folders for reconciliation.
The DSC is an integral part of data load. This application also creates specified DSCs
with appropriate field values through Open Financial Service (OFS).
OFS.SOURCE
The following OFS.SOURCE record DSC.CREATE must be available in the deployed area.
DM.PREREQUISITE Parameters
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
The ID for this application must be in the format:
A 3 digit number followed by an asterisk (*), followed by a valid user name that must be present in
the USER table. (The format is mandatory).
Process: The task that will be done when the application is verified. Either
creation of DSCs or Only creation of Directories or Both can be specified here.
Directory Name: The directory name where the data files are stored; the other folders used as the Path Name in DM.SERVICE.CONTROL will be created under this directory. The value DATA.IN is defaulted, and can be changed to any other directory as needed.
Select List: This is an optional field, where a valid &SAVEDLISTS& name can be given.
Dmd Name: This is an Associated Multivalue set which will hold the DMD name
for which the DM.SERVICE.CONTROL will be created followed by a dot (.) and a
two digit sequence number. (The format is mandatory). The folders that need to
be created for the data file will be based on the input to this field.
File Name: The data file that will be used for the Load will be selected
automatically based on the input in the previous field. The data file must be
present in the Directory that has been specified in the field Directory Name. The
format of the data file must be as follows.
E.g.: CUSTOM.01.201003031653.TXT
Recon Dir: The Directory name must be specified here where the reconciliation
folders will be created. By default RECON.OUT is populated in this field.
Recon Spec Name: This is a multivalue field that is used to provide the Names of
the folders that need to be created under the Directory specified in the field
Recon Dir.
DM Extractor Name: This is a multivalue field that contains DM Extractor Id. The
value entered must exist in the DM.EXTRACTOR table.
File Conv Reqd: File conversion will be done if this field is set to YES.
File Load Type: This field specifies whether the file load is from Server or Local
host.
Conversion Type: The conversion type is specified here. Valid values are NT or Unix.
Local Ref: Currently non-inputtable; reserved for future developments.
DM.PREREQUISITE Verify
After the initial parameter setup, DM.PREREQUISITE is verified. On verification four processes are triggered:
Creation of the folders VALIDATE01, VALIDATE02, VALIDATE03, INPUT01, AUTHORISE01 and DMV under the specified folders.
Copying of the data file to the above folders in their respective paths.
Creation of DSC records via OFS.
Creation of reconciliation folders.
If DMD records with function R or D are not available, the DSC will not be created; this is captured as an override. For other DMDs with function V, I or A, a missing record is an error.
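The folder-creation part of the verification can be sketched as below. This is illustrative only: the folder names follow the guide, but the real application also creates the DSC records via OFS and applies the DMD-based naming.

```python
# Sketch of the folder preparation triggered when DM.PREREQUISITE is
# verified (illustrative). The load folders are created under the
# configured directory and the data file is copied into each of them;
# DMV holds verification output rather than a copy of the data file.

import os, shutil, tempfile

FOLDERS = ["VALIDATE01", "VALIDATE02", "VALIDATE03",
           "INPUT01", "AUTHORISE01", "DMV"]

def prepare_folders(base_dir: str, data_file: str) -> list:
    """Create the load folders and copy the data file into each one."""
    created = []
    for name in FOLDERS:
        path = os.path.join(base_dir, name)
        os.makedirs(path, exist_ok=True)
        if name != "DMV":
            shutil.copy(data_file, path)
        created.append(path)
    return created

# Demo in a throwaway directory with a dummy data file.
base = tempfile.mkdtemp()
datafile = os.path.join(base, "CUSTOM.01.201203031653.txt")
open(datafile, "w").write("100045::MALE::EN\n")
print([os.path.basename(p) for p in prepare_folders(base, datafile)])
```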
After committing the record we need to verify it. Type the record name to be verified and click the button as shown.
The folders are created for Validate, Input and Authorise, and the reconciliation folders are also created as shown below; the data file is also copied to the respective path.
Purpose
The menu is used to do the initial setup for the verification of the Data Mapping Sheet and the DM.MAPPING.DEFINITION.
Detailed Working Procedure
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
The ID for this application can be up to 65 characters.
After committing the record we need to verify it. Type the record name to be verified and click the button as shown.
The compared DMV fields and the DMD fields are populated in the file as below.
By providing the data file name only, the values in the data file will be validated and the PASS/FAIL status will be updated based on the validation status of each record. The validation status of each record is stored in a separate file as a result.
The record values in the data file are validated based on the verification type provided by the user. The verification types are:
FIRST
POSITION
RANDOM
1. FIRST
If the data file name is given and the verification type is FIRST, only the first record in the data file will be validated and the result for that particular record will be stored in a separate file in the DMV folder. A sample record is provided below for reference.
On verifying the DMV, the PASS/FAIL status will be updated and the result will be stored in the DMV folder. Only the file for the data file will be created, as field verification is not provided in this case.
The status of the validated data file will be populated in the file as below.
2. POSITION
If the data file name is given and the verification type is POSITION, it is mandatory to provide values for the fields FIRST.POSITION and LAST.POSITION. The records between the two position values in the data file will be validated and the result for those records will be stored in a separate file in the DMV folder.
On verifying the DMV, the PASS/FAIL status will be updated and the result will be stored in the DMV folder. Only the file for the data file will be created, as field verification is not provided in this case.
3. RANDOM
If the data file name is given and the verification type is RANDOM, a record will be picked randomly from the data file and validated. The result for that particular record will be stored in a separate file in the DMV folder. A sample record is provided below for reference.
On verifying the DMV, the PASS/FAIL status will be updated and the result will be stored in the DMV folder. Only the file for the data file will be created, as field verification is not provided in this case.
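The record selection behind the three verification types can be sketched as follows. Only the selection logic is shown; the PASS/FAIL validation and the output file in the DMV folder are omitted.

```python
# Sketch of how FIRST, POSITION and RANDOM pick records from the data
# file, per the VERIFICATION.TYPE descriptions above (illustrative).

import random

def select_records(records: list, vtype: str,
                   first: int = 0, last: int = 0) -> list:
    """Pick the records to validate according to VERIFICATION.TYPE."""
    if vtype == "FIRST":
        return records[:1]                       # only the first record
    if vtype == "POSITION":
        return records[first - 1:last]           # 1-based inclusive range
    if vtype == "RANDOM":
        return [random.choice(records)]          # one random record
    raise ValueError("unknown verification type: " + vtype)

data = ["rec1", "rec2", "rec3", "rec4"]
print(select_records(data, "FIRST"))             # ['rec1']
print(select_records(data, "POSITION", 2, 3))    # ['rec2', 'rec3']
```

FIRST and RANDOM always yield a single record, which is why only one result file appears in the DMV folder for those types.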
5.1.2.3
Table
DM.MAPPING.DEFINITION
Purpose
This is the base mapping definition table, which maps the incoming data to T24 fields. The following items are defined here:
- T24 table to be loaded
- Type of load to be done (for example OFS or flat jBase write)
- Type of action to be performed (for example only validate the data, or actually update)
- Function to be used (allowed functions are Input, Authorise, Delete, Reverse); if not mentioned, the Input function is used by default
- Way to identify each field in a transaction/record (for example by delimiter or by position)
- Associated multi-value sets to map the T24 field with data, by giving the position or delimiter to identify the data from the source file
- Any post-update routine that will be called for every transaction/record being loaded
5. Defining Incoming Data: Generally the incoming data from the client, received in the data file, will comprise several records in the form of text strings. For loading, these records have to be segregated, and this segregation can be done using different methods: (i) use of delimiters for identifying the records and the fields within a record; (ii) use of the position of the different fields, e.g. field 7 in a record can be of length 6 starting at position 25. In this way a field can be defined in a record. With positions it is not possible to define multi-value and sub-value fields, since the position for mapping is fixed and variable field lengths cannot be supported for multi-value and sub-value fields.
This option is handled through the mandatory field IN.DATA.DEF, which has two options, viz. DELIM or POSITION. The option DELIM is used for defining the delimiters to be used, whereas POSITION is used for defining the positions of the fields.
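The two IN.DATA.DEF modes can be sketched side by side. The delimiter and the position layout below are illustrative values, not the tool's actual parameters.

```python
# Sketch of the two IN.DATA.DEF parsing modes described above: DELIM
# (split the record on FM.DELIM) and POSITION (cut fixed-length slices).

def parse_record(record: str, mode: str,
                 fm_delim: str = "::", layout: list = ()) -> list:
    """Split one incoming record into field values."""
    if mode == "DELIM":
        return record.split(fm_delim)
    if mode == "POSITION":
        # layout is a list of (start, length) pairs with 1-based starts,
        # e.g. field starting at position 7 with length 5 is (7, 5)
        return [record[s - 1:s - 1 + l] for s, l in layout]
    raise ValueError("IN.DATA.DEF must be DELIM or POSITION")

print(parse_record("100045::SMITH::GB", "DELIM"))
# ['100045', 'SMITH', 'GB']
print(parse_record("100045SMITHGB", "POSITION",
                   layout=[(1, 6), (7, 5), (12, 2)]))
# ['100045', 'SMITH', 'GB']
```

The POSITION branch shows why variable-length multi-values cannot be supported in that mode: every slice is fixed in advance.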
5. Delimiter Definitions: It is possible to parameterise the different delimiters used for identifying the records, multi-values and sub-values. These definitions can be provided in the fields FM.DELIM, VM.DELIM and SM.DELIM, as each of these fields represents one type of delimiter.
6. ID Generation:
In T24, for some applications the ID has to be system generated, whereas for certain other applications the ID has to be provided manually. Accordingly, three options are provided in the mandatory field ID.TYPE, viz. AUTO, DATA and ROUTINE.
When the option AUTO is selected, the ID will be automatically generated by the system.
For the option DATA, the record ID has to be part of the incoming data.
In case an application requires a routine for generating the IDs, the option ROUTINE can be used. In this case a single argument is passed to, and returned from, the routine used for forming the record ID. The name of the ID routine can be defined in the field ID.ROUTINE.
The ID position has to be defined in case the data is segregated by fixed positions. The field ID.POSITION becomes mandatory if the field IN.DATA.DEF has a value of POSITION.
7. Escape Sequence:
There is a provision to allow the user to change certain characters in the data from one kind to another. For example, if a "," has to be changed to ":", this can be done using the associated multi-value fields ESC.SEQ.FR and ESC.SEQ.TO. The value defined in the field ESC.SEQ.FR will be changed to the value in the field ESC.SEQ.TO.
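The escape-sequence substitution above amounts to a pairwise replace, which can be sketched as follows. The comma-to-colon pair is the example from the text; any other ESC.SEQ.FR/ESC.SEQ.TO pairs would work the same way.

```python
# Sketch of the ESC.SEQ.FR -> ESC.SEQ.TO substitution described above:
# each "from" value in the record is replaced by its "to" counterpart,
# pair by pair across the associated multi-value fields.

def apply_escape_seq(record: str, esc_fr: list, esc_to: list) -> str:
    """Replace every ESC.SEQ.FR value with its ESC.SEQ.TO counterpart."""
    for fr, to in zip(esc_fr, esc_to):
        record = record.replace(fr, to)
    return record

print(apply_escape_seq("SMITH,JOHN,GB", [","], [":"]))  # SMITH:JOHN:GB
```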
8. Mapping:
These are a set of associated multi-value fields, from APPL.FIELD.NAME to FIELD.VALUE, that are used for mapping with either OFS or flat write. The set comprises APPL.FIELD.NAME, FIELD.POSITION, FIELD.LENGTH, FIELD.ATTRIB and FIELD.VALUE.
9. The field APPL.FIELD.NAME defines the name of the field in the given T24 application (either core or local). It can take up to 65 characters. The field has to be defined in the STANDARD.SELECTION record of the application. Either the field name or the field number can be input in this field.
It is possible to provide the company for which the record is loaded. To do this, the field name must be set to LOAD.COMPANY and the company code must be provided in the associated field position.
For LD.LOANS.AND.DEPOSITS, an option to provide both the LD.LOANS.AND.DEPOSITS record and the LD.SCHEDULE.DEFINE record is available. The values must be separated using a message separator, defined using MESSAGE.SEPARATOR.
10. The field FIELD.POSITION is used to define the position of the field in the record given by the client. For example, if there is a local reference field in the customer record defined as field number 40.2 (MERCHANT.NUMBER) and this has been provided at the 25th position in the data given by the client, then the field APPL.FIELD.NAME should have the value MERCHANT.NUMBER and the field FIELD.POSITION should have the value 25. This is used when the option DELIM or POSITION is chosen in the field IN.DATA.DEF.
11. The field FIELD.LENGTH is mandatory when the field IN.DATA.DEF has the value POSITION; in other cases it is a no-input field.
12. The field FIELD.ATTRIB has two options, viz. CONSTANT and ROUTINE. When the option CONSTANT is selected, the value in the field FIELD.VALUE is mapped to the application field as a constant value. When the option ROUTINE is selected, the name of the routine should be defined in the field FIELD.VALUE. The name of the routine should be preceded by @ and should be defined in PGM.FILE as type S.
14. The field POST.UPDATE.RTN is used to perform post-update processing for a record. In applications like LD, which are single-authoriser, this routine is used to perform the authorisation of an unauthorised record. If any routine is attached to the field POST.UPDATE.RTN, it can be removed even after the record is authorised.
15. The field OFS.SOURCE is used to define the OFS.SOURCE record to be used when loading the data into T24 via OFS. Four OFS.SOURCE records DM.OFS.SRC, DM.OFS.SRC.VAL, DM.OFS.SRC.ORD and DM.OFS.SRC.VAL.ORD are provided. In cases where field validation needs to be triggered for the incoming OFS message, the DM.OFS.SRC.VAL option needs to be used, and if OFS.REQUEST.DETAIL
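The FIELD.ATTRIB behaviour described in item 12 can be sketched as below. The routine registry and the UPCASE.NAME routine are hypothetical stand-ins for PGM.FILE type-S routines; only the CONSTANT/ROUTINE dispatch follows the text.

```python
# Sketch of FIELD.ATTRIB handling (illustrative): CONSTANT maps
# FIELD.VALUE straight into the application field, while ROUTINE
# (name prefixed by @) calls a conversion routine on the raw data.

ROUTINES = {"UPCASE.NAME": lambda v: v.upper()}   # hypothetical routine

def resolve_field(attrib: str, field_value: str, raw: str) -> str:
    """Return the value to assign to the T24 application field."""
    if attrib == "CONSTANT":
        return field_value                # constant taken from FIELD.VALUE
    if attrib == "ROUTINE":
        name = field_value.lstrip("@")    # routine names are prefixed by @
        return ROUTINES[name](raw)
    return raw                            # plain mapping from the data file

print(resolve_field("CONSTANT", "GB", ""))                # GB
print(resolve_field("ROUTINE", "@UPCASE.NAME", "smith"))  # SMITH
```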
DMD Record:
DMD Record:
The below screenshot shows the $ARC file of the FUNDS.TRANSFER record
600000001
600000002 to 600000010
Data File
The first block is Group 1 and the second block is Group 2 (any number of groups can be included in the data file). The first field in the data file must be the Group ID, followed by the second field, which must be the sequence number for the respective group.
The OFS.REQUEST.DETAIL (ORD) lists all the records in the data file, as processed.
Processed OFS.REQUEST.DETAIL
Uploaded Records:
Global Limit: 600000001.0010000.01
5.1.2.4
Application name
Once the user provides the appropriate values, the program takes care of creating the respective versions (if the record does not exist in the environment) for the corresponding DMD, and also the DMDs in the environment.
Prerequisites
The following records need to exist as prerequisites for successful execution of this routine.
Process Flow:
Once the application name is keyed, all inputtable fields are extracted from the STANDARD.SELECTION record for that particular application, and then the DMD is created.
Below are a few pages of sample screenshots of the automatically created DMD for the CUSTOMER application.
5.1.2.5
DM Service Control
Table
DM.SERVICE.CONTROL
Purpose
This work-file application is used to control the actual load process. The following details must be provided when the DM.SERVICE.CONTROL is set up.
Load Company: The company for which the data is loaded. If this information is not provided in the incoming tape, it must be provided here.
Data file directory and file name: A valid directory name and file name.
User: The user name which will be used for the load process. This username will be updated in the TSA.SERVICE record related to the DM.SERVICE.CONTROL. Initially DMUSER was hardcoded as the USER.
Error Details, No Of Error and Type Of Error: The details of the errors generated during validation/processing will be updated in these fields.
Date, Started, Stopped and Elapsed: The current system date, the start time, the end time and the elapsed time of the validation/process activity will be updated in these fields.
Batch
A batch record with the id DM.SERVICE-<<DM.MAPPING.DEFINITION name>> is created. This contains three jobs, namely:
DM.SERVICE.PRE.PROCESS: A single-threaded process which reads the incoming sequential file and writes to DM.SERVICE.DATA.FILE.
DM.SERVICE: A multi-threaded job which reads DM.SERVICE.DATA.FILE and updates the job list. Each record is then processed using OFS.GLOBUS.MANAGER and the error output is logged into <<DM.MAPPING.DEFINITION name>>.LOG.<<Session Number>>.
TSA.WORKLOAD.PROFILE record
TSA.SERVICE record
As can be seen, the user in the TSA.SERVICE record is updated from the
DM.SERVICE.CONTROL.
On verification, the status of the DM.SERVICE.CONTROL record is marked as START. The
same is updated into TSA.SERVICE record as well. The control log is updated with the starting
request details.
Data File:
Output:
To mark the DM.SERVICE stop, the DM.SERVICE.CONTROL record must be opened in Input
mode and the RUN.STATUS must be set to STOP.
Immediately the TSA.SERVICE is marked for STOP. The agents will stop after processing the
current records.
record in Verify mode. Update the number of sessions as required. This will be updated to the TSA.WORKLOAD.PROFILE application. Once the TSM reaches its review time, it will either spawn more or fewer agents as required (in Phantom mode) or request that more or fewer agents be run (in Debug mode).
At present this option is available only in Classic.
DSC has been enhanced to capture the Start and Stop fields when the service is started. The field DSC.TYPE has been included in DM.SERVICE.CONTROL and can be set to either DME or DMD: if the id of the DM.SERVICE.CONTROL record comes from the DM.MAPPING.DEFINITION application, the value DMD is given; if the id comes from DM.EXTRACTOR, DME is given in the field DSC.TYPE.
DM.SERVICE.CONTROL ID from DM.MAPPING.DEFINITION
After the extraction is complete we will find that the time fields in DSC have been
updated.
Accept Override.
5.1.2.6
Service Control
Table
TSA.SERVICE
Purpose
This menu is used to start or stop the TSA.SERVICE record TSM, or any other service that needs to be started or stopped manually.
Service Status
Enquiry
ENQUIRY DM.ENQ.TSA.STATUS
Purpose
This enquiry provides the information for all Data Migration services as well as the TSM status.
Once the TSA.SERVICE has been marked as START, the TSA (agent) will be started automatically if the TSM is running in phantom mode. If the TSM is running in debug mode, the agents will have to be started manually to begin the actual process.
We need to refresh the enquiry regularly to know the status of the services. For this we need
to enable auto refresh as shown below.
The result will be as below and the query will refresh every 5 seconds.
5.1.2.7
Application backup
Table
DM.APP.BACKUP
Purpose
Taking a backup of the application is a necessary step in the migration process. The following steps explain the usage of the application DM.APP.BACKUP, with which a backup of any application can be taken.
DM.APP.BACKUP - parameters
This is the table where the parameters are provided. It can be accessed from the MENU line
of the Browser.
Application: This field is defaulted with the application for which the backup is to be taken.
Backup Dir: The Backup Directory where the folders will be created for creating the
backup.
Backup Folder: The folder name where the APPLICATION records will be moved is
defaulted here.
No Of Agents: The number of agents that will be required to run the service to clear and
backup the APPLICATION.
Local Ref: Currently not inputtable; reserved for future developments.
Override: No overrides are generated with this application.
1.
DOB-<ID of DM.APP.BACKUP>
2.
DOB-<ID of DM.APP.BACKUP>
3.
DOB-<ID of DM.APP.BACKUP>
The folder for taking the application backup will be created under DM.BACKUP by the application DM.APP.BACKUP.
2.
3. Start the TSM by inputting and authorizing the TSA.SERVICE record TSM to START, or initiate the TSM in classic mode as below.
4. The records from the APPLICATION will be moved to the backup folder.
Before :
After
Count the number of records in the backup folder to verify that all the records have been moved.
5.1.2.8
AA.MAPPING
Table
DM.AA.MAPPING
Purpose
This application is used to extract the AA products with their respective Properties
and Property class.
Input a record
The output is stored in the AA.DMS.OUT directory under the particular product line name.
5.1.2.9
DM.PRE.PROCESS (Multi-Threading)
Table
DM.PRE.PROCESS
Purpose
The scope of this table is to explain multi-threading in DM.PRE.PROCESS in detail.
Service Control
Table
TSA.SERVICE
Purpose
To start and stop the services during extraction process.
Service Status
Table
DM.PREREQUISITE
Purpose
To monitor the services status during extraction process.
5.1.3.1
DM.PREREQUISITE
This is the table where the parameters are provided. It can be accessed from the command line of the Browser.
The ID for this application must be in the format: a 3-digit number followed by an asterisk (*), followed by a valid user name that must be present in the USER table. (The format is mandatory.)
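The mandatory id format can be sketched as a simple check. This is illustrative only: the character set assumed for user names and the plain-set stand-in for the USER table are assumptions, not the tool's actual validation code.

```python
import re

# DM.PREREQUISITE id format described above:
#   3-digit number + "*" + a user name that exists in the USER table.
# The USER table lookup is mocked with a plain set; the allowed user-name
# characters below are an assumption for illustration.
ID_PATTERN = re.compile(r"^\d{3}\*[A-Z0-9.]+$")

def valid_prerequisite_id(rec_id, users):
    """Return True when the id matches the format and names a known user."""
    if not ID_PATTERN.match(rec_id):
        return False
    return rec_id.split("*", 1)[1] in users

users = {"DMUSER", "INPUTTER"}
```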
Process: The task that will be done when the application is verified. Either creation of DSCs, only creation of directories, or both can be specified here.
Dmd Prefix: The prefix that will be used to identify the DM.MAPPING.DEFINITION for creation of the respective DM.SERVICE.CONTROL.
Directory Name: The directory name where the data files are stored; the other folders, used as Path Name in DM.SERVICE.CONTROL, are created under this directory. The value DATA.IN is defaulted, which can be changed to any other directory as needed.
Select List: This is an optional field, where a valid &SAVEDLISTS& name can be given.
Dmd Name: This is an associated multivalue set which holds the DMD name for which the DM.SERVICE.CONTROL will be created, followed by a dot (.) and a two-digit sequence number. (The format is mandatory.) The folders that need to be created for the data file are based on the input to this field.
File Name: The data file that will be used for the Load is selected automatically based on the input in the previous field. The data file must be present in the directory specified in the field Directory Name. The format of the data file must be as follows.
Recon Dir: The directory name where the reconciliation folders will be created must be specified here. By default RECON.OUT is populated in this field.
Recon Spec Name: This is a multivalue field used to provide the names of the folders that need to be created under the directory specified in the field Recon Dir.
Local Ref: Currently not inputtable; reserved for future developments.
DM.PREREQUISITE verify
After committing the record we need to verify the record. We type the record name to be
verified and click the button as shown.
5.1.3.2
Table
DM.EXTRACTOR
Purpose
The application DM.EXTRACTOR is used for extraction of data from T24.
DM.EXTRACTOR parameters
This is the table where the parameters are provided. It can be accessed from the command line of the Browser.
There are no specifications for the ID of this application.
multivalue field.
Sel Value: The Selection Value must be specified here. This is an associated
multivalue field.
Sel Comb: The Selection Combination must be specified here. This is an associated
multivalue field.
Select List: The save list name must be entered in this field.
Target File: The output filename for the Extraction.
Output Dir: The output directory where the extracted file will reside.
FM Delim: The Field Marker that will be used in the extract to separate the fields.
VM Delim: The Value Marker that will be used in the extract to separate the Multi
values.
SM Delim: The Sub value Marker that will be used in the extract to separate the Sub
values.
Fld Label: The field description for the field to be extracted.
Fld Type: The extract type for the field is specified here.
Fld Name: The field name from the application which is to be extracted.
Fld Rtn: A subroutine can be attached to this field that will be used for different
extraction logic
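The FM/VM/SM delimiters above define a three-level structure in each extracted line. The sketch below is illustrative only: the "|", "::" and "!!" markers and the sample line are assumptions, since the actual markers are whatever is configured in the DM.EXTRACTOR record.

```python
# Sketch of how one extract line decomposes under the FM/VM/SM delimiters
# described above (markers here are assumed, not fixed values).
def parse_extract_line(line, fm="|", vm="::", sm="!!"):
    """Return fields -> multivalues -> subvalues as nested lists."""
    return [[value.split(sm) for value in field.split(vm)]
            for field in line.split(fm)]

# Hypothetical line: one single-valued field, one multivalued field,
# and one field whose first value carries two subvalues.
parsed = parse_extract_line("100069|JOHN::DOE|A!!B::C")
```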
DM.EXTRACTOR authorize
DM.EXTRACTOR-<ID of DM.EXTRACTOR>
2.
DM.EXTRACTOR-<ID of DM.EXTRACTOR>
3.
DM.EXTRACTOR-<ID of DM.EXTRACTOR>
TSA.WORKLOAD.PROFILE
TSA.SERVICE
BATCH
The output file specified in the application is generated in the output directory.
Input the necessary details in the respective fields and OFS.ACTION must be
VALIDATE.
After authorising the record, the service has to be started as per the below screen shot.
Once the service is complete, the OFS will be listed as shown below.
Extraction can also be done for multibook or multicompany via DME as below
OUTPUT:
Any specific conversion that is required for extraction can also be done.
Trial balance for the GL in foreign currency can be extracted through DME.
TSA.SERVICE record:
On authorisation of the DM.EXTRACTOR record a record in TSA.SERVICE will be created.
Sample Output:
Any transformation required, such as appending values or trimming a value, can be done via DME as below.
Append:
Append is the condition which is used to add a value in the first position or in the last position
of the field value. Also a value can be added at a position mentioned within the field value.
The field values are provided as given in the below screen shot to extract from more than one
application using the append condition.
After running the service, the target file is created in the output directory.
Extracted file
Trim:
Trim is the condition which is used to remove a value from the position which is mentioned
within the field value.
After running the service, the target file is created in the output directory.
Extracted file
Replace:
Replace is the condition which is used to replace one value with another value in the position
mentioned in the field value.
The format is VALUE[POSITION]
E.g.:
VALUE [number1, number2]
Or VALUE[number]
The field values are provided as given in the below screen shot to extract from more than one
application using the replace condition.
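The three transformations (Append, Trim, Replace) can be sketched in plain code. These helpers only mirror the behaviour described above under the assumption that positions in the VALUE[POSITION] notation are 1-based; the actual logic is configured inside the DME record, not written by the user.

```python
# Illustrative sketches of the Append / Trim / Replace transformations
# described above. Positions are assumed 1-based, as in VALUE[POSITION].
def append_value(field, value, pos=None):
    """Add `value` at the end (default) or at a 1-based position."""
    if pos is None:
        return field + value
    return field[:pos - 1] + value + field[pos - 1:]

def trim_value(field, pos, length=1):
    """Remove `length` characters starting at a 1-based position."""
    return field[:pos - 1] + field[pos - 1 + length:]

def replace_value(field, value, pos):
    """Overwrite characters starting at a 1-based position with `value`."""
    return field[:pos - 1] + value + field[pos - 1 + len(value):]
```

For example, appending at position 1 prefixes the field, while trimming two characters from position 1 undoes that prefix.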
After running the service, the target file is created in the output directory.
Extracted file
Specific Conversions
For specific conversions, the FLD.TYPE field should be set to MANIPULATE.
System Variables
Variables like !TODAY, !USER, !NEXT.WORKING.DAY etc. can be used
DM.EXTRACTOR Record
EXTRACTED DATA:
Arithmetic Operation
The following operations can be performed.
+, - , * , /
DM.EXTRACTOR Record
EXTRACTED DATA:
Absolute value
Absolute value can be extracted. FLD.NAME should be set to A
DMFLD.ONLINE.BALANCE.
DM.EXTRACTOR Record
EXTRACTED DATA:
Substitute
DM.EXTRACTOR Record
EXTRACTED DATA:
Convert
This will replace every occurrence of the character X with Y.
DM.EXTRACTOR Record:
EXTRACTED DATA:
Extract
The values in the record are extracted from position N1 to N2.
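The Convert and Extract manipulations can be sketched as follows. These are illustrative stand-ins only: the positions are assumed to be 1-based and inclusive (matching the N1/N2 wording above), and the sample values are hypothetical.

```python
# Minimal sketches of the Convert and Extract manipulations described above.
def convert(field, x, y):
    """Replace every occurrence of character x with y."""
    return field.replace(x, y)

def extract(field, n1, n2):
    """Take the characters from position N1 to N2 (1-based, inclusive)."""
    return field[n1 - 1:n2]
```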
DM.EXTRACTOR Record:
EXTRACTED DATA:
The value OPEN.ACTUAL.BAL in the field FLD.NAME will extract the values
for the fields Sign, Transaction Reference and Amount FCY/Amount LCY as
mentioned below.
DC.ORDER.LIST.txt
DC.BALANCED.LIST.txt
Before extracting the data, the LD.SCHEDULE.DEFINE fields in the DME record must be in the format below. The @id is only required for extracting the future repayment data. Here the LD records with a future repayment schedule will be extracted.
DM.EXTRACTOR Record:
After starting the service, the LD contracts with future repayments defined in the DME record will be extracted.
Extracted file:
The FILE.TYPE field must be provided with the value $ARC for extracting archived
records. If the field FILE.TYPE is left blank the live file will be extracted for the
particular application.
After starting the service, the records defined in the DME record will be extracted.
Extracted file:
Service Control
Table
TSA.SERVICE
Purpose
This menu is used to start or stop the TSA.SERVICE record TSM, or any other service that needs to be started or stopped manually.
Service Status
Enquiry
ENQUIRY DM.ENQ.TSA.STATUS
Purpose
This enquiry provides the information for all Data Migration services as well as the TSM status.
Once the TSA.SERVICE has been marked as START, the TSA (agent) will be started automatically if the TSM is running in phantom mode. If the TSM is running in debug mode, the agents will have to be started manually to begin the actual process.
We need to refresh the enquiry regularly to know the status of the services. For this we need
to enable auto refresh as shown below.
The result will be as below and the query will refresh every 5 seconds.
T24 to T24 migration has the same menus as in Client System to T24 Migration and the
procedure is also the same as seen above.
T24.SAB.INFO
T24.SAB.INFO.FIELDS
T24.SAB.INFO.ID
T24.SAB.INFO.VALIDATE
T24.DAB.INFO
T24.DAB.INFO.FIELDS
T24.DAB.INFO.BUILD
T24.DAB.INFO.BUILD.LOAD
T24.DAB.INFO.BUILD.SELECT
T24.DAB.INFO.BUILD.POST
The routines below create the dump file from the source environment.
T24.DAB.SOURCE.BUILD
T24.DAB.SOURCE.BUILD.LOAD
T24.DAB.SOURCE.BUILD.SELECT
T24.DAB.SOURCE.BUILD.POST
The routines below create the dump file from the target environment.
T24.DAB.TARGET.BUILD
T24.DAB.TARGET.BUILD.LOAD
T24.DAB.TARGET.BUILD.SELECT
T24.DAB.TARGET.BUILD.POST
The routines below create the output dump file, which is a comparison of the source and target dumps, in the target environment.
T24.DAB.COMPARE.BUILD
T24.DAB.COMPARE.BUILD.LOAD
T24.DAB.COMPARE.BUILD.SELECT
T24.DAB.COMPARE.BUILD.POST
The Data Audit Build tool is developed for all the data tables. In this document, examples are shown for both AUDIT and NON-AUDIT field data tables; the same procedure can be used for all the data tables.
Browser View:
Putty View:
SOURCE.SL.ID - The savedlists ID with the selected applications has to be provided in this field. If not, the build will be created for all the applications.
The table names have to be provided in the savedlists as shown in the screenshot below.
To view records from the template T24.DAB.INFO, give S as shown in the screenshot below.
The data record from the template T24.DAB.INFO is shown below.
Putty View
Putty View:
The T24.DAB.SOURCE.BUILD service picks the data from T24.SAB.INFO and T24.DAB.INFO.
Start the service T24.DAB.SOURCE.BUILD to create the source build.
Check whether the date and status have been updated in T24.SAB.INFO.
STEP 5: Check the source dump in the T24.DAB/SOURCE folder under bnk.run.
Select the data table dump folder from the source dump files.
The record contents of the data dump are shown in the screenshot below.
Putty View:
Check whether the date and status have been updated in T24.SAB.INFO.
To view records from the template T24.DAB.INFO, give S as shown in the screenshot below.
STEP 9: Check the target dump in the T24.DAB/TARGET folder under bnk.run.
The dump will be created in the TARGET path with the ID T24.DAB.TARGET.BUILD (Current System Date & Time).
Please select the particular data table dump folder from Target dump files.
View of the record contents in the target data dump is shown in the below screenshot.
Steps to be done for comparison between Source dump and Target dump
STEP 1: Clear files
Before running every service, the files mentioned below have to be cleared:
CLEAR.FILE F.TSA.STATUS
CLEAR.FILE F.BATCH.STATUS
CLEAR.FILE F.EB.EOD.ERROR
Before running the service, confirm in the screenshot below that SOURCE.STATUS and TARGET.STATUS are completed, and then input REVERSE.COMPARE - Y/N.
Since only test data is used here, the volume is much smaller than client data; with larger volumes the reverse comparison will take more time. If you wish to continue, press:
Y - Run the reverse comparison, i.e. compare the target dump to the source dump.
N - Skip the reverse comparison.
STEP 2: Start the TSM
Start the TSM first and then start the service T24.DAB.COMPARE.BUILD in AUTO mode
The screenshot below shows that after the dump creation has completed, the T24.SAB.INFO application includes the date and the completed status.
STEP 6: Check the result dump in the T24.DAB/RESULT folder under bnk.run.
The dump will be saved in the RESULT path with the ID T24.DAB.RESULT.BUILD (Current System Date & Time).
1) T24.DAB.MISMATCH.BASE
2) T24.DAB.SOURCE.BASE
3) T24.DAB.TARGET.BASE
4) T24DAB.AA.ACTIVITY.CLASS.SOURCE.NEW
5) T24DAB.AA.ACTIVITY.CLASS.TARGET.NEW
T24.DAB.MISMATCH.BASE - The field-level mismatches between source and target will be captured in this file.
T24.DAB.TARGET.BASE - The new records in the target environment will be captured in this
file
5.1.5.2
T24.SAB.INFO
T24.SAB.INFO.FIELDS
T24.SAB.INFO.ID
T24.SAB.INFO.VALIDATE
T24.OAB.SOURCE.BUILD
T24.OAB.SOURCE.BUILD.LOAD
T24.OAB.SOURCE.BUILD.SELECT
T24.OAB.SOURCE.BUILD.POST
T24.OAB.TARGET.BUILD
T24.OAB.TARGET.BUILD.LOAD
T24.OAB.TARGET.BUILD.SELECT
T24.OAB.TARGET.BUILD.POST
The routines below create the output dump file, which is a comparison of the source and target dumps, in the target environment.
T24.OAB.COMPARE.BUILD
T24.OAB.COMPARE.BUILD.LOAD
T24.OAB.COMPARE.BUILD.SELECT
T24.OAB.COMPARE.BUILD.POST
The Object Audit Build tool is developed for all Local and Core source code.
Putty View:
Browser View:
Putty View:
Step 8: Check the source dump in the T24.OAB/SOURCE folder under bnk.run.
The output structure of the base from the Source environment is as follows:
The base will be saved in the SOURCE path as T24.OAB.BASE.SOURCE, stamped with the current system date and time.
To view the source base, select the file using the following command in the specified path.
Create a saved list (User defined name) and give the list of BP names.
JED &SAVEDLISTS& JSHOW.COMMAN.BP
Start the TSM first and then start the service T24.OAB.TARGET.BUILD in AUTO mode.
Putty View:
Start the service T24.OAB.TARGET.BUILD to create the dump with the source provided in the savedlists (if a savedlists is provided) or with all the BP lists.
Step 6: Start the TSM service
Check whether the date and status have been updated in T24.SAB.INFO.
Step 8: Check the target dump in the T24.OAB/TARGET folder under bnk.run.
The output structure of the base from the Target environment is as follows:
The target base file will be saved in the below directory:
../bnk.run/T24.OAB/TARGET
The base will be saved in the TARGET path as T24.OAB.BASE.TARGET, stamped with the current system date and time.
To view the target base, select the file using the following command in the specified path.
Move the source base file to ../bnk.run/T24.OAB/ in the target area.
Browser View:
Start the TSM first and then start the service T24.OAB.COMPARE.BUILD in AUTO mode.
Putty View:
Start the service T24.OAB.COMPARE.BUILD to compare the routines
The screenshot below shows that after the dump creation has completed, the T24.SAB.INFO application includes the date and the completed status.
Step 7: Check the result dump in the T24.OAB/RESULT folder under bnk.run.
The output structure of the comparison result from the Target environment is as follows:
T24.OAB.MATCH - Based on the comparison result between the Source and Target base. The result contains the matched contents and also the new routines (routines found in the Source base and not in the Target base).
Where:
<DD MTH YYYY> is the date when the routine was last compiled.
<HH:MM:SS> is the time when the routine was last compiled.
<NEW> applies only when a new routine exists and has been compiled in the target environment.
T24.OAB.UNMATCH - Based on the comparison result between the Source and Target base. The result contains the unmatched contents. Below is a sample of the unmatched object audits.
T24.OAB.NEW - Based on the comparison result between the Source and Target base. The result contains the new routines (routines found in the Target base and not in the Source base). Below is a sample of the new object audits.
5.2.
5.2.1. DM.REF.TABLE
It is the data migration reference table for AA Migration. It contains three fields: DESCRIPTION, APPLICATION and FIELD.NAME. To extract any specific data (field value) from any valid T24 table, specify the application name and the field names for which the values have to be extracted.
5.2.2. DM.AA.CUSTOMER.PROXY
Description:
The DM.AA.CUSTOMER.PROXY is a concat table containing the arrangement id and the proxy customer id. It is a multivalue set. The concat file id must be the customer id provided in the AA.ARRANGEMENT.ACTIVITY record. Before the concat file is updated, the required fields must be specified in DM.REF.TABLE (Ref 4.4.1). Based on the reference table, the DM.AA.CUSTOMER.PROXY table is updated.
Functional Process:
The arrangement id and customer id are extracted from the AA.ARRANGEMENT.ACTIVITY table based on certain conditions. First the AA records are selected where the ACTIVITY field has PROXY.SERVICES-NEW-ARRANGEMENT and FIELD.NAME has PROXY.
If all the records have the value PROXY in the field FIELD.NAME, the corresponding proxy customer and arrangement id are updated in the concat file.
5.2.3. T24.AA.INT.ACT.DEFAULT
The subroutine is attached as an input routine in the specified version, with the creation of a PGM.FILE and EB.API entry. This routine defaults the value PROXY.ARRANGEMENT in the FIELD.NAME fields and updates the corresponding arrangement id in the FIELD.VALUE field of the AA.ARRANGEMENT.ACTIVITY table (selected from DM.AA.CUSTOMER.PROXY). It is a multivalue set.
Prerequisites:
Create the PGM.FILE entry for the T24.AA.INT.ACT.DEFAULT
While inputting or Loading the data via this version the specified fields will be defaulted
from the concat table DM.AA.CUSTOMER.PROXY.
5.2.4. DM.AA.CUSTOMER
Description:
The DM.AA.CUSTOMER table is a concat file containing the arrangement id for the corresponding PROXY CUSTOMER. The arrangement id field is a multivalue set. The concat file id must be the customer id provided in the AA.ARRANGEMENT.ACTIVITY (Proxy) record. Before the concat file is updated, the required fields must be specified in DM.REF.TABLE (Ref 4.4.1). Based on this reference table, the DM.AA.CUSTOMER table is updated.
Functional Process:
The arrangement id is extracted from the AA.ARRANGEMENT.ACTIVITY table based on certain conditions. First the AA records are selected where the ACTIVITY field has
5.2.5. T24.EB.EXT.ARR.ACT.DEFAULT
The subroutine is attached as an input routine in the specified version, with the creation of a PGM.FILE and EB.API entry. This routine defaults the values in the ARRANGEMENT fields from the concat table DM.AA.CUSTOMER in the EB.EXTERNAL.USER table.
Prerequisites
Create the PGM.FILE entry for the T24.EB.EXT.ARR.ACT.DEFAULT.
While inputting or Loading the data via this version the arrangement id will be defaulted from
the concat table DM.AA.CUSTOMER.
This is used to convert the data file of SC.PERF.DETAIL into the required utility format file and to update the concat file SC.PERF.DETAIL.CONCAT.
Routine Updation
DM.SC.FORMAT:
This utility is used to convert the provided data file into RUN.PERFORMANCE.TAKEON execution format.
Required Input for DM.SC.FORMAT:
Steps to be followed for Program execution.
1. Create a directory PERF.DATA in ...bnk.run path
2. Enter the path and file name
3. In PERF.DATA directory create a file data.txt and save it
4. Run the program DM.SC.FORMAT; it will create a new data file in data.txt.
Expected output
Data file format before processing DM.SC.FORMAT:
DM.SC.PERF.CONCAT.UPDATE
The routine will update the concat file SC.PERF.DETAIL.CONCAT based on the loaded
SC.PERF.DETAIL record. It contains the SC.PERF.DETAIL id with corresponding date
(YYYYMM).
SC.PERF.DETAIL:
Table
DM.DATA.COMPARE
Purpose
The scope of the table is to compare both the Extracts and to provide
the output of the comparison.
Id Key: This field will hold the Primary key (ID) position of the record.
Legacy File Name: This field is used for information only to store the Legacy
Data Extract.
Source File Path: The directory Name where the Extract is provided by the
bank.
Source File Name: The File Name of the Extract provided by the bank.
Target File Path: The Directory Name where the Recon Extract from T24 is
located.
Target File Name: The File Name of the Recon Extract from T24.
Output File Path: The Output File Path where the reconciliation extract is
available after the comparison.
Output File Name: The Output File name of the reconciliation extract after the
comparison.
No. Of Agents: The number of agents that will be required to run the
comparison can be
Field Position.1: This field specifies the position of the field and is applicable only for environment compare.
Source Field.1: This field specifies the first field of the source data file.
Data File:
In the source file below, the first two string ids end with 004 and 005, whereas in the target file they end with 005 and 004.
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
Output:
The new source and target records are created in separate files.
Mismatch records
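The comparison DM.DATA.COMPARE performs can be sketched as follows. This is a simplified illustration, not the tool's code: it assumes the first "|"-delimited field is the primary key (Id Key) and uses in-memory lists where the real run reads the configured source and target files.

```python
# Simplified sketch of the DM.DATA.COMPARE comparison described above.
# Assumes the first "|"-delimited field of each line is the primary key.
def compare(source_lines, target_lines, delim="|"):
    src = {line.split(delim)[0]: line for line in source_lines}
    tgt = {line.split(delim)[0]: line for line in target_lines}
    mismatches = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    source_new = sorted(src.keys() - tgt.keys())   # only in the source extract
    target_new = sorted(tgt.keys() - src.keys())   # only in the target extract
    return mismatches, source_new, target_new

# Hypothetical data: one mismatching record and one new record on each side.
mis, s_new, t_new = compare(
    ["100069004|A", "100069005|B", "100070|C"],
    ["100069005|B", "100069004|X", "100071|C"],
)
```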
In the source file below, the first two string ids end with 004 and 005, whereas in the target file they end with 005 and 004.
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
The new source and target records are created in separate files.
Mismatch records
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
After updating the error log file, the source and target concat files are cleared by the routine.
5.2.8. DM Scheduler
Table
DM.SERVICE.SCHEDULER
Purpose
The services for validating, loading and authorising data in T24 need to be scheduled and automated. Most of these tasks are performed manually, which needs to be eliminated. The multithreaded service performs the tasks in an automated way to eliminate the manual work to an extent.
Err Threshold: To specify the maximum number of errors that will be tolerated before marking the service as failed. This is an associated multivalue field.
No Of Errors: Non-input field updated with the number of records in error from the corresponding DM.SERVICE.CONTROL after the service is complete. This is an associated multivalue field.
Skip Service: To specify if the service needs to be run or can be skipped. The two options are YES and NO.
Dsc Status: Non-input field which will be updated with values 0, 1, 2 or 3 based on the current status of the service associated with the field DM Service Id.
Dss Status: Non-input field which will be updated with values 0, 1, 2 or 3 based on the current status of all the services in the SCHEDULER.
Status Detail: Non-input field which will be updated with details for information only.
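The interplay of Err Threshold and Skip Service can be sketched for a single scheduler step. The field names mirror the list above, but the control logic and the status labels are assumptions for illustration, not the scheduler's actual implementation.

```python
# Illustrative sketch of one scheduler step using the Err Threshold and
# Skip Service settings described above (logic and labels are assumed).
def run_step(no_of_errors, err_threshold, skip_service):
    if skip_service == "YES":
        return "SKIPPED"          # service configured to be skipped
    if no_of_errors > err_threshold:
        return "FAILED"           # errors exceed the tolerated maximum
    return "COMPLETED"
```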
Data File:
Output:
Once the Scheduler has completed, the DSS updates the Scheduler Status to Scheduler Completed, and also indicates No Of Errors as 0 for both (positive test case).
5.2.9. DM.DATA.TRANSLATION
Table
DM.DATA.TRANSLATION
Purpose
The data provided by the Client will contain values that correspond to the
Legacy Data Format which cannot be loaded into T24 without any
TRANSLATION. Hence a valid TRANSLATION must be done so that the
Data is converted into a T24 Data format and can be loaded.
Source Dir: The source directory of the data file consisting of Legacy ID is
provided in this field.
(Please note that any error log generated from the TRANSLATION will
be available in this directory)
Target Dir: The name of the output directory where the converted file will be
available after the TRANSLATION is over. Any error log generated will be
available as mentioned above.
Target Fname: The name of the converted data file with T24 Ids must be
given here.
FM Delim: The Field Marker used in the data file to delimit field values. The default is |, which can be modified by the user.
VM Delim: The Value Marker used in the data file to delimit multivalues. The default is ::, which can be modified by the user.
SM Delim: The Sub Value Marker used in the data file to delimit subvalues. The default is !!, which can be modified by the user.
Mapping Table: Any concat file name that will be used for the TRANSLATION can be provided in this field. This is a multivalue field.
Pos Convert: The positions in the data file where the TRANSLATION has to be done and replaced with T24 ids. This is a subvalue field associated with the Mapping Table field.
Pos Value: Fixed values to replace at the positions in the data file where the TRANSLATION has to be done. This is an associated subvalue set with the Pos Check field.
Pos Routine: A subroutine can be attached in this field that will be used for the TRANSLATION logic. This is an associated subvalue set with the Pos Check field.
No. Of Service: The number of agents required to run the TRANSLATION can be mentioned in this field.
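The position-based TRANSLATION described above can be sketched as follows. This is illustrative only: the mapping contents and sample legacy id are hypothetical, and the real run reads the configured concat (mapping) table rather than a plain dictionary.

```python
# Sketch of the position-based TRANSLATION described above: at each
# configured position, the legacy value is replaced with its T24 id
# from the mapping (concat) table. Delimiter and data are illustrative.
def translate_line(line, mapping, positions, fm="|"):
    fields = line.split(fm)
    for pos in positions:                  # positions are assumed 1-based
        legacy = fields[pos - 1]
        fields[pos - 1] = mapping.get(legacy, legacy)  # unknown ids pass through
    return fm.join(fields)

mapping = {"LEG001": "100069"}             # legacy id -> T24 id (hypothetical)
out = translate_line("LEG001|JOHN DOE|1000", mapping, positions=[1])
```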
Input a record.
Output
Thus any changes that are to be done in the data file can be made through this table.
5.2.10.
When loading a UTF-converted data file, or a data file whose records exceed 1024 characters, there should not be any broken records in DM.SERVICE.DATA.FILE; that is, no record should be broken into two.
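The integrity expectation above can be illustrated with a small check: records read record by record stay whole even when they exceed 1024 characters or contain special characters. This is only an analogy in plain Python (the sample record is hypothetical); no fixed 1024-byte buffer is involved.

```python
import io

# One hypothetical record, well over 1024 characters, with a non-ASCII
# character at the end; reading line by line must keep it whole.
record = "FT0001|" + "X" * 2000 + "|Ω"
stream = io.StringIO(record + "\n")
records = [line.rstrip("\n") for line in stream]
```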
After starting the service, listing the OFS.REQUEST.DETAIL shows that there is no breakage of records even when they contain special characters or more than 1024 characters.
In the above screenshots, it can be seen that there is no truncation of records even when they contain special characters or exceed 1024 characters.
5.2.11.
DM.AA.LENDING.BAL.EXTRACT
OUTPUT File:
Sample Output:
5.2.12.
DM.AA.LENDING.SCHEDULE.EXTRACT
Execution:
OUTPUT File:
Sample Output:
5.2.13.
DM.DC.BATCH.UPDATE
DM.DC.BATCH.UPDATE can be run via DSC; it also checks that
DC.BATCH.CONTROL and DC.DEPT.BATCH.CONTROL are updated.
It also validates the ID format for Data Capture: the number of characters, the
DC prefix and the current Julian date in T24.
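The ID format check described above could be sketched as follows. The exact layout of a Data Capture batch ID is an assumption here: the sketch takes it to be "DC" followed by a 5-digit Julian date (YYDDD) and a 3-digit batch number, which should be confirmed against the target T24 environment.

```python
import re
from datetime import date

# Illustrative validation of a Data Capture batch ID, assuming the format
# "DC" + 5-digit Julian date (YYDDD) + 3-digit batch number. The exact
# format must be confirmed against the target T24 environment.
def valid_dc_batch_id(batch_id, today=None):
    today = today or date.today()
    julian = today.strftime("%y%j")          # e.g. "24032" for 1 Feb 2024
    return re.fullmatch(r"DC" + julian + r"\d{3}", batch_id) is not None

print(valid_dc_batch_id("DC24032001", today=date(2024, 2, 1)))  # → True
print(valid_dc_batch_id("DX24032001", today=date(2024, 2, 1)))  # → False
```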
DSC record:
If the batch already exists in the environment, the error below is thrown.
5.2.14.
Mnemonic duplication:
On loading the file, any duplicated mnemonics are captured and an error
message is created for each of them.
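The duplicate-mnemonic check can be illustrated with a short sketch. The position of the mnemonic within the record (here, the second field) is an assumption for illustration, not the tool's actual mapping.

```python
from collections import Counter

# Sketch of the duplicate-mnemonic check: while loading the file, any
# mnemonic that appears more than once is reported as an error.
# The mnemonic's field position (index 1) is an illustrative assumption.
def duplicate_mnemonics(lines, mnemonic_pos=1, fm="|"):
    counts = Counter(line.rstrip("\n").split(fm)[mnemonic_pos]
                     for line in lines)
    return sorted(m for m, n in counts.items() if n > 1)

lines = ["100100|ACME|...", "100101|GLOBEX|...", "100102|ACME|..."]
print(duplicate_mnemonics(lines))   # → ['ACME']
```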
5.2.15.
During an AA load, if the AA product is proofed but not published, an error log
is created.
After running the service, if the product used in the data file is
SMALL.BUSINESS.LOAN, an error log is created as shown below.
5.2.16.
TDM.DATA.STRUCTURE.EXTRACT
It is a program used to extract all the fields and properties of any table given in
the saved list.
Create the saved list by entering the name of a specific application.
Once the saved-list name is given, the program extracts the field properties of that
application and stores them in the T24.DATA.STRUCTURE.LIST directory in .xls format.
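The shape of such an extract can be sketched as below: one tab-separated row per field, saved with an .xls extension so Excel can open it. The field list shown for CUSTOMER is hypothetical, not the real T24 dictionary, and the output directory here is a temporary one rather than T24.DATA.STRUCTURE.LIST.

```python
import csv
import os
import tempfile

# Minimal sketch of a field-properties extract: one row per field of an
# application, written tab-separated so that Excel can open the .xls file.
# The field list for "CUSTOMER" is hypothetical, not the real T24 dictionary.
fields = [
    {"name": "MNEMONIC",   "type": "A",   "length": 10},
    {"name": "SHORT.NAME", "type": "ANY", "length": 35},
]

out_path = os.path.join(tempfile.mkdtemp(), "CUSTOMER.xls")
with open(out_path, "w", newline="") as out:
    writer = csv.writer(out, delimiter="\t")
    writer.writerow(["FIELD.NAME", "TYPE", "LENGTH"])
    for f in fields:
        writer.writerow([f["name"], f["type"], f["length"]])
```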
5.2.17.
ENQUIRY DM.ENQ.ISSUE.LOG
Selection Criteria
Output of the selection, showing the number of errors, the number of records
validated and the error details
5.2.18.
UTF8 conversion
The UTF-8 conversion of the data file for SWIFT character handling has to be
incorporated in the generic DM Tool.
After verifying the DMP, specify the value Yes to convert the data file; the
converted data file is generated in the same path.
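The conversion step amounts to re-reading the file in its source encoding and writing a UTF-8 copy alongside it. In this sketch the source encoding (cp1252) and the ".utf8" output suffix are assumptions for illustration.

```python
import os
import tempfile

# Sketch of the data-file UTF-8 conversion step: read the original file in
# its source encoding and write a UTF-8 copy in the same path. The source
# encoding (cp1252) and the ".utf8" suffix are illustrative assumptions.
def convert_to_utf8(path, src_encoding="cp1252"):
    with open(path, "r", encoding=src_encoding) as f:
        data = f.read()
    out_path = path + ".utf8"          # generated in the same directory
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(data)
    return out_path

# Demo on a throwaway file containing a non-ASCII SWIFT-style character.
src = os.path.join(tempfile.mkdtemp(), "DATA.FILE")
with open(src, "w", encoding="cp1252") as f:
    f.write("100069|M\u00fcller\n")    # "ü" is a single byte in cp1252
converted = convert_to_utf8(src)
```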
5.2.19.
DM.CLEANUP
1. The purpose of the routine is to clean up the existing DMDs, DSCs, TSAs
(DM.SERVICE) and the BATCH records related to the DM SERVICE and
VERSIONS. The lib and bin files (GR0800005bin, GR0800005lib) and the
DM.BP and T24MIG.BP folders are decataloged and deleted.
2. To do this, run the program DM.CLEANUP.
Prerequisites
To export the corresponding LIB and BIN:
export JBCDEV_LIB=GR0800005lib
export JBCDEV_BIN=GR0800005bin
The DM.CLEANUP routine has to be run after exporting the LIB and BIN.
When run, the routine reads the DMD and gets the version ID from the DMD
(OFS.VERSION).
It then reads the version record and gets the CURR.NO; based on the CURR.NO,
the version records are deleted from the LIVE, NAU and HISTORY files.
The DM.BP and T24MIG.BP routines are decataloged and the folders are deleted.
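The deletion sequence can be sketched with plain dictionaries standing in for the T24 files. The record shapes, IDs and the "id;curr.no" HISTORY key convention are illustrative assumptions, not the tool's internal representation.

```python
# Sketch of the clean-up sequence: for each DMD, find its version, read
# CURR.NO, then delete the version from the HISTORY, LIVE and NAU files.
# All data here is illustrative only.
dmd          = {"DM.CUSTOMER": {"OFS.VERSION": "CUSTOMER,DM"}}
versions     = {"CUSTOMER,DM": {"CURR.NO": 3}}             # LIVE file
version_nau  = {"CUSTOMER,DM": {}}                         # NAU file
version_hist = {"CUSTOMER,DM;1": {}, "CUSTOMER,DM;2": {}}  # HISTORY file

for rec in dmd.values():
    ver_id = rec["OFS.VERSION"]              # version ID from the DMD
    curr_no = versions[ver_id]["CURR.NO"]    # CURR.NO from the live record
    for n in range(1, curr_no):              # HISTORY ids assumed "<id>;<n>"
        version_hist.pop(f"{ver_id};{n}", None)
    versions.pop(ver_id, None)               # delete LIVE record
    version_nau.pop(ver_id, None)            # delete NAU record

print(versions, version_nau, version_hist)   # → {} {} {}
```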
Clean-up Process:
6. Conclusions:
This document explains the functionality of the non-core T24 Data Migration Tool,
which is used both to extract data from T24 and to load data into T24 from any
client system. It can serve as a base for using the Data Migration Tool in any T24
data extract and load process.