Table of Contents
1. Extraction................................................................................................................................................. 8
1.1 Introduction ......................................................................................................................................... 8
1.2 Step-by-step control flow for a successful data extraction with SAP BW: .......................................12
2. Data Extraction from SAP Source Systems.......................................................................................... 12
2.1 Introduction ...................................................................................................................................... 12
2.1.1 Process ....................................................................................................................................... 13
2.1.2 Plug-in for R/3 Systems..............................................................................................................14
2.2 Transfer Method - PSA and IDoc....................................................................................................... 14
2.2.1 Introduction ...............................................................................................................................14
2.2.2 Persistent Staging Area (PSA)..................................................................................................... 15
2.2.2.1 Definition ............................................................................................................................ 15
2.2.2.2 Use ...................................................................................................................................... 15
2.2.3 IDocs..........................................................................................15
2.2.3.1 Definition ............................................................................................................................ 15
2.2.3.2 Example: ............................................................................................................................. 15
2.2.4 Two Methods to transfer data...................................................................................................15
2.2.4.1 Differences and advantages:...............................................................................................16
2.2.4.1.1 PSA ...............................................................................................................................16
2.2.4.2 ALE (data IDoc)................................................................................................................ 16
2.3 Data Source....................................................................................................................................... 16
2.3.1 Assigning DataSources to InfoSources and Fields to InfoObjects.............................................. 17
2.3.2 Maintaining DataSources...........................................................................................................17
2.3.3 Transferring Business Content DataSources into Active Version .............................................. 18
2.3.4 Extraction Structure ...................................................................................................................18
2.3.5 Transfer Structure ......................................................................................................................18
2.3.6 Replication of DataSources ........................................................................................................19
2.3.6.1 Replication of the Entire Metadata ....................................................................................19
2.3.6.2 Replication of the Application Component Hierarchy of a Source System ........................ 19
2.3.6.3 Replication of the Metadata ...............................................................................................19
2.3.6.4 Replication of a DataSource of a Source System ................................................................19
2.4 Data Extraction Logistics ...................................................................................................................20
2.4.1 Data extraction Illustration ........................................................................................................20
2.4.1.1 Full Load:............................................................................................................................. 20
2.4.1.2 Delta Load: .......................................................................................................................... 22
2.5 LO Cockpit Functions......................................................................................................................... 23
2.5.1 Maintain Extract Structures ....................................................................................................... 23
2.5.2 Maintain Data Sources............................................................................................................... 23
2.5.3 Activating update.......................................................................................................................24
2.5.4 Controlling update .....................................................................................................................24
2.5.5 Setup Tables...............................................................................................................................24
2.5.6 Serialized V3...............................................................................................................................24
2.5.7 Queued Delta (the third update method).................................................................................. 25
2.5.8 Direct Delta (2nd delta update method in our list) .................................................................... 25
2.5.9 Unserialized V3: (The last one) ..................................................................................................25
2.6 LO Data Sources Data Flow in R/3 :................................................................................................... 25
2.6.1 Filling up the Appended Structure.............................................................................................30
2.6.2 Regenerate & Check the Customized Objects ...........................................................................34
2.7 Structure of Delta Method for LO Cockpit Data Sources..................................................................36
2.7.1 Delta Management in extraction...............................................................................................36
2.7.2 Step-by-Step Maintenance ........................................................................................................37
2.8 Delta Method ....................................................................................................................................46
2.8.1 Master Data ...............................................................................................................................46
2.8.2 Transactional Data ..............................................................47
2.8.3 Delta Process..............................................................................................................................48
2.9 Delta Method Properties .................................................................................................................. 49
2.9.1 Delta Initialization ......................................................................................................................49
2.9.2 Delta Extraction.........................................................................49
2.9.3 Update Modes ........................................................................................................................... 50
2.9.3.1 V1 Update ........................................................................................................................... 50
2.9.3.2 V2 Update ........................................................................................................................... 51
2.9.3.3 V3 Update ........................................................................................................................... 51
2.10 Delta Queue Functions....................................................................................................................51
2.10.1 Direct Delta (V1 update) ..........................................................................................................52
2.10.2 Queued delta (V1 + V3 updates)..............................................................................................53
2.10.2.1 Benefits ............................................................................................................................. 53
2.10.2.2 Limits.................................................................................................................................54
2.10.3 Un-serialized V3 Update (V1/V2 + V3 Updates) ...................................................................... 54
2.11 Generic extraction...........................................................................................................................55
2.11.1 Create Generic extraction [Master data]..................................................................56
2.12 Generic Data Types .........................................................................................................................59
2.12.1 Master Data ............................................................................................................................. 59
2.12.1.1. Texts.................................................................................................................................59
2.12.1.2. Attributes .........................................................................................................................59
2.12.1.3. Hierarchies .......................................................................................................................59
2.12.2 Functions..................................................................................................................................59
2.12.2.1 Time-dependent Attributes ..............................................................................................59
2.12.2.2 Time-dependent Texts ...................................................................................................... 59
2.12.2.3 Time-dependent Texts and Attributes.............................................................................. 60
2.12.2.4 Language-dependent Texts...............................................................................................60
2.12.3 Transactional data....................................................................................................................60
2.13 Generic Data sources ......................................................................................................................61
2.13.1 Extraction Structure ................................................................................................................. 62
2.13.2 Editing the DataSource in the Source System..........................................................................62
2.13.3 Replication of DataSources...................................................................................................... 62
2.13.3.1 Replication Process Flow ..................................................................................................62
2.13.3.2 Deleting DataSources during Replication .........................................................................63
2.13.3.3 Automatic Replication during Data Request.....................................................................63
2.14 Enhancing Business Content...........................................................................................................63
3. Extraction with Flat Files......................................................................................................................69
3.1 Data from Flat Files (7.0)...................................................................................................................69
3.2 Data from Flat Files (3.x) ...................................................................................................................70
3.3 Extracting Transaction and Master Data using Flat Files ..................................................................70
3.4 Data Types that can be extracted using Flat Files.............................................................................86
3.4.1 Basic Steps of Data Flow (ETL process): .....................................................................................87
3.4.2 Step-by-Step to upload Master Data from Flat File to InfoObjects ...........................................87
4. DB Connect...........................................................................................................................................106
4.1 Introduction ....................................................................................................................................106
4.2 Loading data from SAP Supporting DBMS into BI...........................................................................107
4.2.1 Process Description.................................................................................................................. 107
5. Universal Data Integration ................................................................................................................. 130
5.1 Introduction ....................................................................................................................................130
5.2 Process Flow....................................................................................................................................130
5.3 Creating UD source system.............................................................................................................130
5.4 Creating a DataSource for UD Connect...........................................................................................131
5.5 Using Relational UD Connect Sources (JDBC) ................................................................................. 133
5.5.1 Aggregated Reading and Quantity Restriction ........................................................................133
5.5.2 Use of Multiple Database Objects as UD Connect Source Object ...........................................133
5.6 BI JDBC Connector...........................................................................................................................133
5.6.1 Deploy the user data source’s JDBC driver to the server: .......................................................134
5.6.2 Configuring BI Java Connector ...............................................................................................134
5.6.2.1 Testing the Connections................................................................................................135
5.6.2.2 JNDI Names.................................................................................................................. 135
5.6.2.3 Cloning the Connections...............................................................................................135
5.6.3 Connector Properties................................................................................................................ 135
5.7 BI XMLA Connector .........................................................................................................................137
5.7.1 Using InfoObjects with UD Connect......................................................................................... 138
5.7.2 Using SAP Namespace for Generated Objects.........................................................................139
6. XML Integration...................................................................................................................................140
6.1 Introduction ....................................................................................................................................140
6.2 Benefits of XML Integration ............................................................................................................140
6.2.1 End-to-End Web Business Processes .......................................................................................140
6.2.2 Open Business Document Exchange over the Internet ........................................................... 141
6.2.3 XML Solutions for SAP services ................................................................................................141
6.3 Business Integration with XML ....................................................................................................... 141
6.3.1 Incorporating XML Standards ..................................................................................................142
6.3.2 SAP’s Internet Business Framework ........................................................................................142
6.3.3 SAP applications with XML....................................................................................................... 143
6.3.4 Factors leading to emergence of XML-enabled SAP solutions ................................................ 144
6.3.4.1 Changing Business Standards and their adoption ............................................................144
6.3.4.2 Internet Security Standards ..............................................................................................144
6.4 Web-based business solutions........................................................................................................144
6.4.1 Components of Business Connector........................................................................................144
6.5 How to Customize Business Connector (BC)...................................................................................145
6.5.1 Add New Users to BC ............................................................................................................... 145
6.5.2 Add SAP Systems......................................................................................................................145
6.5.3 Add Router Tables....................................................................................................................146
6.5.4 Access functionality in the Business Connector....................................................................... 146
7. Data Mart Interface ............................................................................................................................. 147
7.1 Introduction ....................................................................................................................................147
7.2 Special Features ..............................................................................................................................147
7.3 Data Mart Interface in the Myself System......................................................................................148
7.4 Data Mart Interface between Several Systems .............................................................................. 148
7.4.1 Architectures............................................................................................................................ 149
7.4.1.1 Replicating Architecture....................................................................................................149
7.4.1.2 Aggregating Architecture..................................................................................................150
7.4.2 Process Flow.............................................................................................................................151
7.4.2.1 In the Source BI................................................................................................................. 151
7.4.2.2 In the Target BI.................................................................................................................. 151
7.4.3 Generating Export DataSources for InfoProviders...................................................................152
7.4.4 Generating Master Data Export DataSources..........................................................................152
7.4.5 Transactional Data Transfer Using the Data Mart Interface....................................................153
7.4.5.1 Delta Process.....................................................................................................................153
7.4.5.2 Restriction.........................................................................................................................153
7.4.6 Transferring Texts and Hierarchies for the Data Mart Interface ............................................. 153
7. Virtual InfoCubes.................................................................................................................................154
7.1 Introduction ....................................................................................................................................154
7.2 Create Virtual Infocube...................................................................................................................154
7.3 Different Types................................................................................................................................154
7.3.1 SAP RemoteCube .....................................................................................................................155
7.3.1.1 Creating a SAP RemoteCube.............................................................................................155
7.3.1.2 Structure ........................................................................................................................... 156
7.3.1.3 Integration ........................................................................................................................156
7.3.2 Remote Cube............................................................................................................................ 156
7.3.2.1 Structure ........................................................................................................................... 157
7.3.2.2 Integration ........................................................................................................................157
7.3.3 Virtual InfoCubes with Services ...............................................................................................158
7.3.3.1 Structure ........................................................................................................................... 158
7.3.3.2 Dependencies....................................................................................................................159
7.3.3.2.1 Description of the interfaces for user-defined function modules .............................160
7.3.3.2.2 Additional parameters for variant 2 for transferring hierarchy restrictions ............. 161
7.3.3.3 Method for determining the correct variant for the interface.........................................161
1. Extraction
1.1 Introduction
Extraction programs that read data from extract structures and send it, in
the required format, to the Business Information Warehouse are part of the data staging
mechanisms in the SAP R/3 system, as well as in SAP Strategic Initiative products such as
APO, CRM, and SEM. The IDoc structures or tRFC data record structures (if the user
chooses to use the PSA, the Persistent Staging Area) that are generated from the transfer
structures for the Business Information Warehouse are used for this on the source system
side. These extraction tools are implemented on the source system side during
implementation and support various releases. For non-SAP applications, similar extraction
programs can be implemented with the help of third-party providers. These then collect the
requested data and send it, in the required transfer format, to the SAP Business
Information Warehouse using BAPIs.
The OLTP extraction tables form the basis of a DataSource on an R/3
OLTP system. The structure is written in the OLTP using the data elements that describe
the available data, usually from a table view. For an R/3 OLTP source system, the
'DataSource Replication' step is provided to duplicate the DataSource, with its relevant
properties, in BW. Once there, the user can assign it to an InfoSource. The user can
request the metadata for a single DataSource, for an application component, or all the
metadata of a source system, in the BW Administrator Workbench:
To replicate the metadata of a DataSource, choose Source System Tree → Source System
→ DataSource Overview → Application Components → DataSource → Context Menu
(right mouse click) → Replicate DataSources.
To replicate the metadata of an application component from a source system into BW,
choose Source System Tree → Source System → DataSource Overview → Application
Components → Context Menu (right mouse click) → Replicate DataSources.
To update all the metadata of a source system, choose Source System Tree → Source
System → Context Menu (right mouse click) → Replicate DataSources.
All application data must be described in SAP BW using metadata. The
InfoObjects used for this cover not just transaction and master data but also relationship
sets such as attributes or hierarchies for master data.
Virtually any source of data can be extracted for use in the Business
Information Warehouse.
Two options for data transfer are possible: the PSA (via tRFC) and IDocs. One benefit of
TRFC is an API to access the data stored in the ODS (read and update).
1.2 Step-by-step control flow for a successful data extraction with SAP
BW:
1. In the SAP BW system, a data request is defined and scheduled for a specific point of
time.
2. Once the defined point of time is reached, the SAP BW system starts a batch job that
sends a request IDoc to the SAP source system.
3. The request IDoc arrives in the source system and is processed by the IDoc dispatcher,
which calls the BI Service API to process the request.
4. The BI Service API checks the request for technical consistency. Possible error
conditions include specification of DataSources unavailable in the source system and
changes in the DataSource setup or the extraction process that have not yet been
replicated to the SAP BW system.
5. The BI Service API calls the extractor in initialization mode to allow for extractor-
specific initializations before actually starting the extraction process. The generic
extractor, for example, opens an SQL cursor based on the specified DataSource and
selection criteria.
6. The BI Service API calls the extractor in extraction mode. One data package per call is
returned to the BI Service API, and customer exits are called for possible
enhancements. The extractor takes care of splitting the complete result set into data
packages according to the IDoc control parameters. The BI Service API continues to
call the extractor until no more data can be fetched.
7. Finally, the BI Service API sends a status IDoc notifying the target system that
request processing has finished (successfully, or with errors specified in the status
IDoc).
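The calling sequence in steps 5 and 6 can be pictured as a simple loop. The following is an illustrative Python sketch, not SAP code: the class and function names, the package size, and the row data are all invented for the example.

```python
# Sketch of the control flow: the BI Service API calls the extractor once
# in initialization mode, then repeatedly in extraction mode, one data
# package per call, until no more data can be fetched.

class Extractor:
    def __init__(self, rows, package_size=3):
        self.rows = rows
        self.package_size = package_size
        self.cursor = 0

    def init(self, selection=None):
        # extractor-specific initialization, e.g. opening an SQL cursor
        self.cursor = 0

    def extract(self):
        # return one data package per call; an empty list means "no more data"
        package = self.rows[self.cursor:self.cursor + self.package_size]
        self.cursor += len(package)
        return package


def run_request(extractor):
    extractor.init()
    packages = []
    while True:
        package = extractor.extract()
        if not package:
            break
        packages.append(package)
    return packages  # followed by a final status IDoc in the real flow


packages = run_request(Extractor(rows=list(range(7)), package_size=3))
# 7 rows split into packages of 3 -> [0, 1, 2], [3, 4, 5], [6]
```

The splitting of the complete result set into packages corresponds to the IDoc control parameters mentioned in step 6.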
2.1 Introduction
Extractors are one of the data retrieval mechanisms in the SAP source
system. An extractor can fill the extract structure of a DataSource with the data from SAP
source system datasets.
In a metadata upload, the DataSource, including its relevant properties,
is replicated in the BW. Once there the user can assign it to an InfoSource. The DataSource
fields are made available to be assigned to BW InfoObjects.
2.1.1 Process
In addition, there are generic extractors, with which the user can
extract further data from the SAP source system and transfer it into BW. Only when the
generic extractor is called with the name of a DataSource does it know which data is to be
extracted, from which tables it should read, and in which structure. In this way it fills
different extract structures and DataSources.
The user can run generic data extraction in the R/3 source system
application areas such as LIS, CO-PA, FI-SL and HR. This is how LIS, for example, uses
generic extraction to read info structures. DataSources are generated on the basis of these
(individually) defined info structures. We speak of customer-defined DataSources with
generic data extraction from applications.
The DataSource data for these types are read generically and
transferred into the BW. This is how generic extractors allow the extraction of data that
cannot be made available within the framework of Business Content.
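As a rough illustration of this lookup, the sketch below resolves a DataSource name to the table and field list it should read; all DataSource names, tables, and fields here are invented, not real SAP objects.

```python
# Hypothetical sketch: a generic extractor knows nothing until it is given
# a DataSource name; the name resolves to the table and the extract
# structure (field list) it should read, so one program can serve many
# DataSources.

DATASOURCE_REGISTRY = {
    "ZDS_CUSTOMER": {"table": "KNA1", "fields": ["KUNNR", "NAME1"]},
    "ZDS_MATERIAL": {"table": "MARA", "fields": ["MATNR", "MTART"]},
}

TABLES = {
    "KNA1": [{"KUNNR": "1000", "NAME1": "ACME", "LAND1": "US"}],
    "MARA": [{"MATNR": "M-01", "MTART": "FERT", "MEINS": "ST"}],
}

def generic_extract(datasource_name):
    meta = DATASOURCE_REGISTRY[datasource_name]
    rows = TABLES[meta["table"]]
    # project each row onto the fields of the extract structure
    return [{f: row[f] for f in meta["fields"]} for row in rows]
```

Calling `generic_extract("ZDS_CUSTOMER")` returns only the fields offered by that DataSource, which mirrors how the same generic program fills different extract structures.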
2.2.2.1 Definition
The Persistent Staging Area (PSA) is the initial storage area for
requested transaction data, master data attributes, and texts from various source systems
within the Business Information Warehouse.
2.2.2.2 Use
2.2.3 IDocs
2.2.3.1 Definition
The IDoc interface exchanges business data with an external system. The
IDoc interface consists of the definition of a data structure, along with processing logic for
this data structure. The business data is saved in IDoc format in the IDoc Interface and is
forwarded as IDocs. If an error occurs, exception handling is triggered using SAP tasks.
The agents who are responsible for these tasks and have the relevant authorizations are
defined in the IDoc Interface. An IDoc (Intermediate Document) is the standard SAP
format for electronic data interchange between systems. Different message types (for
example, delivery confirmations or purchase orders) normally represent the different
specific formats, known as IDoc types. Multiple message types with related content can be
assigned to one IDoc type.
2.2.3.2 Example:
The IDoc type ORDERS01 transfers the logical message types ORDERS
(purchase order) and ORDRSP (order confirmation). Among other areas, IDocs are used
both in Electronic Data Interchange (EDI) and for data distribution in a system group (ALE).
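The many-to-one assignment of message types to an IDoc type can be pictured as a small mapping. The sketch below only mirrors the ORDERS01 example from the text; the helper function itself is hypothetical, not an SAP API.

```python
# Several logical message types can be assigned to one IDoc type.
# The mapping reproduces the ORDERS01 example given in the text.

IDOC_TYPE_MESSAGES = {
    "ORDERS01": ["ORDERS", "ORDRSP"],  # purchase order, order confirmation
}

def idoc_type_for(message_type):
    # look up which IDoc type carries a given logical message type
    for idoc_type, messages in IDOC_TYPE_MESSAGES.items():
        if message_type in messages:
            return idoc_type
    return None
```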
2.2.4.1.1 PSA
The user cannot view the data in IDocs while the data is being transferred. The main
advantage of the PSA is that the data remains visible, and any erroneous records can be
viewed and edited. That is not the case with IDocs.
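The editing advantage of the PSA can be shown with a minimal sketch: because records are persisted and visible, an erroneous record can be located and corrected before further processing. The record structure and field names below are invented for illustration.

```python
# Simplified picture of a PSA request: records are stored and stay
# visible, so a faulty record can be found and fixed in place.

psa_request = [
    {"KUNNR": "1000", "AMOUNT": 250.0},
    {"KUNNR": "", "AMOUNT": 99.0},  # erroneous record: missing key field
]

def find_errors(records):
    # report indices of records with an empty key field
    return [i for i, r in enumerate(records) if not r["KUNNR"]]

errors = find_errors(psa_request)
for i in errors:
    psa_request[i]["KUNNR"] = "9999"  # manual correction in the PSA
```

With IDocs this inspect-and-edit step is not available, which is the contrast the paragraph above draws.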
Data that logically belongs together is stored in the source system in the
form of DataSources. A DataSource consists of a set of fields that are offered for data
transfer into BW. The DataSource is technically based on the fields of the extraction
structure. By defining a DataSource, these fields can be enhanced as well as hidden (or
filtered) for the data transfer. It also describes the properties of the extractor belonging to it,
as regards the data transfer into BW. During a Metadata upload, the properties of the
DataSource relevant to BW are replicated in BW.
DataSources are used for extracting data from a source system and for transferring data into
the BW. DataSources make the source system data available on request to the BW in the
form of the (if necessary, filtered and enhanced) extraction structure. Data is transferred
from the source system into the SAP Business Information Warehouse in the Transfer
Structure. In the transfer rules maintenance, the user determines how the fields of the
transfer structure are transferred into the InfoObjects of the Communication Structure. The
user assigns DataSources to InfoSources, and fields to InfoObjects, in the transfer rules
maintenance.
Using this function, the user can also replicate an individual DataSource that so far did not
exist in the BI system. This is not possible in the view for the DataSource tree since a
DataSource that has not been replicated so far will not be displayed.
2.4 Data Extraction Logistics
1. Serialized V3
2. Queued Delta
3. Direct Delta
4. Unserialized V3
Here the user can add additional fields from the communication
structures available to the extract structure.
In the Data source maintenance screen, the user can customize the
data source by using the following fields: field name, short text, selection, hide field,
inversion or cancellation field or reverse posting, and field only known in customer exit.
2.5.3 Activating update
This concerns the delta update mode the user is using and how the user
controls the data load based on the volume of data. The LO Cockpit supports four
update modes (delta modes, which we have already discussed): Serialized V3 update,
Direct Delta, Queued Delta, and Unserialized V3 update.
Direct access to the application tables is not permitted, so setup tables
exist to collect the required data from them. When a load fails, the user can re-run the
load and pull the data from the setup tables, where it is still available. Setup tables are
used to initialize delta loads and for full loads; they are part of the LO extraction scenario.
With this option, the user avoids pulling from R/3 directly, which would require reading
field values from multiple tables. The user can view the data in the setup tables. The setup
table name is the extract structure name followed by SETUP; extract structure names start
with 'MC', followed by the application component ('01', '02', etc.) and the last digits of the
DataSource name. The extract structure (which the user can also check in LBWE on the
R/3 side) is thus the name followed by 'SETUP'.
The setup tables are the base tables for a DataSource used for full
upload, so full update is possible in LO extractors. In a full update, whatever data is
present in the setup tables (from the last setup run) is sent to BW. However, the setup
tables do not receive the delta data posted after the init, so a full update that must return
ALL data from the source system requires deleting and re-filling the setup tables.
2.5.6 Queued Delta
With the queued delta update mode, the extraction data (for the relevant
application) is written to an extraction queue (instead of to the update data, as in V3) and
can be transferred to the BW delta queues by an update collective run, as previously
executed during the V3 update. After activating this method, up to 10,000 delta extractions
of documents are cumulated to one LUW per DataSource in the BW delta queues.
If the user uses this method, it is necessary to schedule a job to regularly transfer the
data to the delta queues. As always, the simplest way to perform the scheduling is via the
"Job control" function in LBWE. SAP recommends scheduling this job hourly during normal
operation after a successful delta initialization, but there is no fixed rule: it depends on the
specifics of every situation (business volume, reporting needs and so on).
With this update mode, which we can consider the serialized V3's sibling,
the extraction data continues to be written to the update tables using a V3 update module
and is then read and processed by a collective update run (through LBWE). But, as the
name of this method suggests, the unserialized V3 delta drops the main characteristic
of its sibling: data is read in the update collective run without taking the sequence into
account and then transferred to the BW delta queues. Issues: it is only suitable for data
targets for which the correct sequence of changes is not important (e.g. material
movements), and the V2 update has to be successful.
1. Logon to BW
a. Go to Administrator Workbench (RSA1).
b. Go to Source systems.
c. Choose the R/3 Source System (where the user's DataSource resides), right-click and
go to "Customizing for Extractors".
5. Click on the DataSource.
6. Scroll down to reach the appended structure.
7. Double-click on the appended structure – ZBIW_KNA1_S1 (or the one the user chose) –
and add the fields the user wishes to add to the DataSource.
8. Check, save and activate – this appends it to the extract structure of the DataSource.
9. Go Back
10. Click on 0CUSTOMER_ATTR (DataSource)
11. Now Click on Extract Structure – BIW_KNA1_S (in my case), check out the
appended field below append structure.
2.6.1 Filling up the Appended Structure
1. After appending the structure, we need to fill this append with data. For this:
a. Use enhancement RSAP0001.
b. Click on Components.
1. Now the final step – we need to regenerate the DataSource and make sure the newly
added attributes (starting with ZZ) are not set to HIDE.
2. Logon to BW System – Go to Administrator Workbench (RSA1) – Go to Source
systems – Choose Source System (R/3 in our case) - Right Click & go to
“Customizing for Extractors”.
3. Choose “Edit DataSources and Application Component Hierarchy”
4. Go to “0CUSTOMER_ATTR” Customer Number DataSource & Click on CHANGE.
Scroll Down.
5. Keep clicking on Right Sign if it prompts for any confirmation.
6. Scroll down & Go to Fields starting with ZZ & make sure these are not HIDE
(remove the sign from HIDE check box).
The delta InfoPackage is executed which extracts data from the delta
queue to the SAP BI system and the same is scheduled as a part of the process chain. The
data is extracted to the persistent staging area, which forms the first physical layer in BI
from where data is further staged into the DSO, which can be a part of a pass through layer
or an EDW layer. Note the negative values for the key figures for the before image record
in the PSA table. The same can be updated to a DSO, in overwrite and summation mode,
and an infocube.
When the data from the LO DataSource is updated to a DSO with the setting "unique
data records", the before image is ignored.
Generic extractors of type (extraction method) 'F2' with delta process AIE (After Image
via Extractor) use a pull delta model. 'F2': the data is extracted by means of a
function module that, in contrast to 'F1', uses a simplified interface (see the documentation
for data element ROFNAME_S).
Whenever we request delta data from BW, the data is pulled via the delta queue; the delta
LUWs are saved in the repeat delta table, and only the repeat delta LUWs are visible in
RSA7. For normal 'F1'-type extractors, both the delta and the repeat delta LUWs are
visible in RSA7.
In the screenshot below, Total = 2 refers to the number of LUWs in the repeat delta table.
In the screenshot above: click on the Update Overview text to reach the following
screen. This takes the user to SM13 for any related table updates; then Execute.
Now go back to the previous screen and click on BW Maintenance Delta Queue.
This takes the user to transaction RSA7 to view the delta queues, if any.
Click Back to reach this pop-up.
Click on Run; it will prompt for confirmation of the entries in the extract structure.
Assign a request so that it generates the extract structure successfully.
Now, on the main LBWE screen, the user can see a RED status next to the DataSource.
Now click on the DataSource as below.
Assign a request to open the DataSource screen, where properties related to fields can be
modified.
After assigning this and going back, the user will see the status color change to
YELLOW.
Now go to the BW system and replicate the related DataSource from the exact source
system.
Now go back to the R/3 system, click on the ACTIVE parameter under Job
Control, and assign a request.
Now the user will see the status color turn GREEN, and can then
assign the update mode as well.
Now in the BW system create transformations from the datasource to the Infoprovider.
Create an infopackage and DTP to load the data.
In contrast with other Business Content and generic DataSources, the
LO DataSources use the concept of setup tables to carry out the initial data extraction
process. The data extractors for HR, FI etc. extract data by directly accessing the
application tables, but the LO extractors do not access the application tables
directly. The restructuring/setup tables prevent the BI extractors from directly
accessing the frequently updated, large logistics application tables, and are only used for
the initialization of data to BI. To load data into the BI system for the first time, the setup
tables have to be filled. The restructuring/setup tables are cluster tables that hold the
respective application data; the BI system extracts the data as a one-time activity for the
initial data load, and the data can be deleted from the setup tables after successful data
extraction into BI to avoid redundant storage.
The setup tables in SAP follow the naming convention <Extraction
structure>SETUP, and the compressed data from the application tables stored there can be
viewed through SE11. Thus the DataSource 2LIS_11_VAITM, with extract structure
MC11VA0ITM, has the setup table MC11VA0ITMSETUP. A job is executed to fill the
setup tables, and the init InfoPackage extracts the initial data into BI.
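The naming convention above can be expressed as a simple rule. The following sketch (plain Python for illustration, not an SAP API) derives the setup table name from an extract structure name:

```python
def setup_table_name(extract_structure: str) -> str:
    """Setup table = extract structure name followed by 'SETUP'."""
    return extract_structure + "SETUP"

# DataSource 2LIS_11_VAITM uses extract structure MC11VA0ITM.
name = setup_table_name("MC11VA0ITM")
print(name)  # MC11VA0ITMSETUP
```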
The after image provides the status after the change, the before image gives
the status before the change with a minus sign, and the reverse image sends the record
with a minus sign for deleted records. Serialization plays an important role if the delta
records have to be updated into a DSO in overwrite mode. For example, in sales document
1000, if the quantity of the ordered material is changed from 10 to 14, the data gets
extracted as shown in the table.
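The image logic can be sketched as follows (a plain Python illustration with hypothetical field names, not SAP code): a change from quantity 10 to 14 produces a before image carrying -10 and an after image carrying +14, so that summation in an InfoCube yields the net change of +4.

```python
def delta_images(doc, old_qty, new_qty):
    """Return the (before image, after image) pair for a quantity change.
    The before image carries the old key figure with a minus sign."""
    before = {"doc": doc, "qty": -old_qty, "image": "before"}
    after = {"doc": doc, "qty": new_qty, "image": "after"}
    return [before, after]

images = delta_images(1000, 10, 14)
# Summed into an InfoCube, the pair yields the net change: -10 + 14 = +4
net = sum(rec["qty"] for rec in images)
print(net)  # 4
```

In overwrite mode, by contrast, only the after image value (14) would survive, which is why the record sequence matters there.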
The type of delta provided by the LO datasources is a push delta, i.e. the delta data records
from the respective application are pushed to the delta queue before they are extracted to BI
as part of the delta update. The fact whether a delta is generated for a document change is
determined by the LO application. It is a very important aspect for the logistic datasources
as the very program that updates the application tables for a transaction triggers/pushes the
data for information systems, by means of an update type, which can be a V1 or a V2
update.
2.9.3 Update Modes
Before elaborating on the delta methods available for LO datasources it is necessary to
understand the various update modes available for the logistics applications within the SAP
ECC 6.0 system.
The following three update methods are available:
a) V1 Update
b) V2 Update
c) V3 Update
While carrying out a transaction, for e.g. the creation of a sales order, the user enters data
and saves the transaction. The data entered by the user from a logistics application
perspective is directly used for creating the orders, having an integrated controlling aspect,
and also indirectly forms a part of the information for management information reporting.
The data entered by the user is used by the logistics application to achieve both of the
above aspects, but the former, i.e. the creation of the order, takes higher priority than the
result calculations triggered by the entry. The latter are often termed statistical updates.
The SAP system treats both these events generated by the creation of order with different
priorities by using two different update modes for achieving the same, the V1 update and
the V2 update, with the former being a time critical activity. Apart from these two update
modes SAP also supports a collective run, called the V3 update, which carries out updates
in the background. The update modes are separately discussed below.
2.9.3.1 V1 Update
A V1 update is carried out for critical or primary changes, and these
affect objects that have a controlling function in the SAP system, for example the creation
of a sales order (VA01). These updates are time-critical and synchronous. With V1
updates, the program issues the statement COMMIT WORK AND WAIT, which waits
until the update work process returns the status of the update. The program then responds
to errors separately.
The V1 updates are processed sequentially in a single update work process, and they
belong to the same database LUW. These updates are executed under the SAP locks of the
transaction that creates the update, thereby ensuring the consistency of the data and
preventing simultaneous updates. The most important aspect is that the V1 synchronous
updates can never be processed a second time. During the creation of an order, the V1
update writes data into the application tables and the order gets processed. The V1 updates
are carried out as a priority in contrast to V2 updates, though the V2 updates are usually
also processed straight away.
2.9.3.2 V2 Update
A V2 update, in contrast to V1, is executed for less critical
secondary changes; these are purely statistical updates resulting from the transaction. They
are carried out in a separate LUW and not under the locks of the transaction that creates
them. They are often executed in the work process specified for V2 updates. If this is not
the case, the V2 components are processed by a V1 update process, but the V1 updates
must be processed before the V2 update. They are asynchronous in nature.
2.9.3.3 V3 Update
Apart from the above mentioned V1 and V2 updates, the SAP
system also has another update method called the V3 update which consists of collective
run function modules. Compared to the V1 and V2 updates, the V3 update is a batch
asynchronous update, which is carried out when a report (RSM13005) starts the update (in
background mode). The V3 update does not happen automatically unlike the V1 and V2
updates.
All function module calls are then collected, aggregated and updated together, and are
handled in the same way as V2 update modules. Suppose one of the function modules
increments a statistical entry by one and is called 10 times during the course of the
transaction. Implemented as a V2 update, it runs 10 times after the V1 update for the
transaction has completed, i.e. the database is updated 10 times. But when executed as a
V3 update, the update can be executed at any time in one single operation, carried out
in one database operation at a later point in time. This largely reduces the load on the
system.
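The load reduction described above can be sketched as a small simulation (plain Python, not SAP code): ten V2-style increments mean ten physical database updates, while a V3-style collective run aggregates them into a single one.

```python
class Database:
    """Toy stand-in for an application/statistics table."""
    def __init__(self):
        self.value = 0
        self.update_count = 0  # number of physical database operations

    def update(self, delta):
        self.value += delta
        self.update_count += 1

increments = [1] * 10  # the module increments a statistical entry 10 times

# V2 style: each call updates the database individually, after the V1 update.
v2_db = Database()
for inc in increments:
    v2_db.update(inc)

# V3 style: calls are collected and aggregated, then applied later
# in one single database operation by the collective run.
v3_db = Database()
v3_db.update(sum(increments))

print(v2_db.update_count, v3_db.update_count)  # 10 1
```

Both databases end up with the same value; only the number of operations differs, which is exactly the saving the V3 collective run provides.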
a. Writing to the delta queue within the V1 posting process ensures serialization
by document.
b. Recommended for customers with fewer documents.
c. Extraction is independent of V2 updating.
d. No additional monitoring of update data or extraction queue required.
When using this update mode, no document postings should be carried out in the
concerned logistics application during delta initialization, from the start of the
recompilation run in the OLTP until all delta init requests have been successfully updated
in BW. Data from documents posted during the re-initialization process is completely
lost.
2.10.2 Queued delta (V1 + V3 updates)
In the queued delta update mode the logistic application pushes the
data from the concerned transaction into an extraction queue by means of the V1 update.
The data is collected in the extraction queue and a scheduled background job transfers the
data in the extraction queue to the delta queue, in a similar manner to the V3 update, with
an update collection run. Depending on the concerned application, up to 10,000 delta
extractions of documents can be aggregated in an LUW in the delta queue for a datasource.
The data pushed by the logistics application can be viewed in the logistics queue overview
function in the SAP ECC 6.0 system (transaction LBWQ). SAP recommends the queued
delta process for customers with a high volume of documents, with the collection job for
extraction from the extraction queue scheduled on an hourly basis.
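The flow just described (the V1 update fills an extraction queue; a scheduled collective run moves the queue contents into one LUW in the delta queue) can be sketched as a simulation. Queue names and record structures here are illustrative, not SAP internals.

```python
from collections import deque

extraction_queue = deque()  # filled synchronously by the V1 update
delta_queue = []            # later read by the BW delta request

def v1_update(document):
    """The V1 posting writes the document change into the extraction queue."""
    extraction_queue.append(document)

def collective_run(batch_limit=10_000):
    """Scheduled job (e.g. hourly): moves up to batch_limit queue entries
    into one LUW in the delta queue."""
    luw = []
    while extraction_queue and len(luw) < batch_limit:
        luw.append(extraction_queue.popleft())
    if luw:
        delta_queue.append(luw)

for doc in range(3):
    v1_update({"doc": doc})
collective_run()
print(len(delta_queue), len(delta_queue[0]))  # 1 3
```

The `deque` preserves posting order, mirroring how the enqueue mechanism keeps the queued delta serialized.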
2.10.2.1 Benefits
When the user needs to perform a delta initialization in the OLTP, the logic of this method
allows document postings (relevant for the involved application) to be reopened as soon
as the execution of the recompilation run (or runs, if several run in parallel) ends, that is,
when the setup tables are filled and a delta init request is posted in BW. The system is able
to collect new document data during the delta init upload as well, with one strong
recommendation: avoid running the update collective run before all delta init requests
have been successfully updated in the user's BW system.
By writing to the extraction queue within the V1 update process (which is more heavily
loaded than with V3), serialization is ensured by using the enqueue concept; the collective
run clearly performs better than the serialized V3, and in particular the slow-down caused
by documents posted in multiple languages does not apply with this method.
In contrast to direct delta, this process is especially recommended for
customers with a high occurrence of documents (more than 10,000 document
changes - creation, change or deletion - performed each day for the application in
question).
Extraction is independent of V2 update.
In contrast to the V3 collective run, event handling is possible here, because a
definite end of the collective run is identifiable: when the collective run for
an application ends, an event (&MCEX_nn, where nn is the number of the
application) is automatically triggered and can thus be used to start a subsequent
job.
2.10.2.2 Limits
The job uses the report RMBWV311 for collection run and the function module will have
the naming convention MCEX_UPDATE_<Application>, MCEX_UPDATE_11 for sales
orders. In the initialization process, the collection of new document data during the delta
initialization request can reduce the downtime on the restructuring run. The entire
extraction process is independent of the V2 update process.
2.10.3 Un-serialized V3 Update (V1/V2 + V3 Updates)
In this mode of delta update, the concerned logistics application writes
data to update tables, which in turn transfer the data to the delta queue by means of a
collective run, the V3 update. Once the data is written to the update tables by the logistics
application, it is retained there until it is read and processed by a collective update
run – a scheduled background job, the V3 update job – which updates all the entries in the
update tables to the delta queue.
As the name suggests, the update is unserialized, i.e. this mode of update does not ensure
the serialization of documents posted to the delta queue. This means that the entries in the
delta queue need not correspond to the actual sequence of updates that happened in the
logistics application. This is important if the data from the DataSource is further updated
to a DSO in overwrite mode, as the last entry would overwrite the previous entries,
resulting in erroneous data. An unserialized delta update should therefore only update data
either to an InfoCube or to a DSO with key figures in summation mode. It is also advisable
to avoid the unserialized V3 update for documents subject to a large number of changes
when it is necessary to track those changes.
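Why the missing serialization matters for overwrite mode can be shown with a small sketch (plain Python, hypothetical records): if two changes to the same document arrive out of order, overwrite mode keeps the stale value.

```python
def apply_overwrite(dso, records):
    """DSO in overwrite mode: the last record processed wins."""
    for rec in records:
        dso[rec["doc"]] = rec["qty"]

# Actual change sequence for document 1000: qty 10, then corrected to 14.
in_order = [{"doc": 1000, "qty": 10}, {"doc": 1000, "qty": 14}]
out_of_order = list(reversed(in_order))  # unserialized V3 may deliver this

dso_ok, dso_bad = {}, {}
apply_overwrite(dso_ok, in_order)
apply_overwrite(dso_bad, out_of_order)
print(dso_ok[1000], dso_bad[1000])  # 14 10  <- the stale value survives
```

Summation of additive deltas, on the other hand, is order-independent, which is why an InfoCube or a DSO with key figures in summation mode is the safe target here.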
Generic R/3 data extraction allows us to extract virtually any R/3 data.
Generic data extraction is a function in Business Content that supports the creation of
DataSources based on database views or InfoSet queries. An InfoSet is similar to a view
but allows outer joins between tables. The generic delta service supports delta extractors
on monotonic 'delta attributes' like timestamp, calendar day, or a numeric pointer (e.g.
document number, counter), which must be strictly monotonically increasing with time.
Only one attribute can be defined as the delta attribute. For extracting data from the VBAK
table, the Logistics Extraction Cockpit is the recommended method.
2.11.1Create Generic extraction [Master data]
1. Under Transaction SBIW – This step gives the user the option of creating and
maintaining generic Data Sources for transaction data, master data attributes or
texts from any kind of transparent tables, database views or SAP query functional
areas or via a function module, regardless of application. This enables the user to
use the generic extraction of data.
3.
a) Choose an application component to which the DataSource is to be assigned.
b) Enter the descriptive texts. The user can choose these freely.
c) Choose Generic Delta.
4. Specify the delta-specific field and the type of this field. Maintain the
settings for the generic delta: specify a safety interval. The safety interval should be set
so that no document is missed, even if it was not yet stored in the DB table when the
extraction took place.
5. Select the delta type: 'New status for changed records' (i.e. after image, AIE), which
can be used with an ODS data target, or 'Additive Delta' (i.e. aggregated data records,
ADD). Then choose Save.
6. After step 5, the screen of step 3 comes back. Now choose Save again.
This generates the DataSource. After generating the DataSource, the user will
see the Delta Update flag selected. In systems as of basis release 4.0B, the user can
display the current value of the delta-relevant field in the delta queue.
Delta attributes can be monitored in the delta queue (RSA7). Also note that the LUW count
does not equal the number of changed records in the source table; most of the time it will be
ZERO, since the delta is enabled by the data selection logic.
The LUW count can also have the value 1. Whenever a delta is extracted, the extracted data
is stored in the delta queue tables to serve as a fallback in case an error occurs during
the update of the BW system. The user will then see a '1' in this field (the extract counts
as one LUW), and the data can even be displayed in a detail screen.
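The safety-interval logic for a monotonic delta attribute can be sketched as follows (plain Python with illustrative field names): each delta run selects records above the stored pointer minus the safety interval, so records committed late are not missed.

```python
def delta_select(rows, last_pointer, safety_interval):
    """Select rows whose monotonic delta field (e.g. a timestamp or counter)
    lies above the stored pointer minus the lower safety interval."""
    lower = last_pointer - safety_interval
    picked = [r for r in rows if r["changed_at"] > lower]
    new_pointer = max((r["changed_at"] for r in rows), default=last_pointer)
    return picked, new_pointer

rows = [{"doc": d, "changed_at": t} for d, t in [(1, 100), (2, 105), (3, 98)]]
# Pointer from the previous run was 100; doc 3 was committed late at 98.
picked, ptr = delta_select(rows, last_pointer=100, safety_interval=5)
print([r["doc"] for r in picked], ptr)  # [1, 2, 3] 105
```

A lower safety interval can re-select records already transferred; with an after-image delta (AIE) such duplicates are harmless because the DSO simply overwrites them.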
2.12.1.3. Hierarchies
Hierarchies can be used in the analysis to describe alternative views of
the data. A hierarchy consists of a set of nodes that have a parent-child relationship
with one another. The structures can be defined in a version-specific as well as a time-
dependent manner. An example of this is the cost center hierarchy.
2.12.2 Functions
If the texts and the attributes are time-dependent, the time intervals do
not have to agree.
2.12.2.4 Language-dependent Texts
If they are language-dependent, the user has to upload all texts with a language indicator.
1. Based on table/view
2. Based on Infoset Query
3. Based on Function module
2.13 Generic Data sources
The user can edit DataSources in the source system, using transaction
SBIW.
2.13.3 Replication of DataSources
In the first step, the D versions are replicated. Here, only the
DataSource header tables of BI Content DataSources are saved in BI as the D version.
Replicating the header tables is a prerequisite for collecting and activating BI Content.
In the second step, the A versions are replicated. DataSources (R3TR RSDS) are saved in
the M version in BI with all relevant metadata. In this way, the user avoids generating too
many DDIC objects unnecessarily as long as the DataSource is not yet being used – that is,
as long as no transformation exists for the DataSource. 3.x DataSources (R3TR
ISFS) are saved in BI in the A version with all the relevant metadata.
13. Assign the DataSource. Map the fields manually; since ZZBOM is an enhancement, it
should be mapped to the InfoObject created by us.
14. Create ODS.
15. Assign the objects corresponding to the Key fields and Data fields.
16. Create Update rules for the ODS.
17. Create the InfoSource and InfoPackage. The data is extracted from SAP R/3 when
we schedule the load from the InfoPackage. The data is then loaded into the
ODS, which is the data target for the InfoPackage.
19. The data is loaded into the ODS, where it can be monitored. The data load is
successful.
20. Create InfoCube.
21. Assign the corresponding characteristics, Time Characteristics and Key figures.
22. Create relevant Dimensions to the cube.
23. Activate the InfoCube and create update rule in order to load the data from the
ODS.
24. Update ODS and the InfoCube is successfully loaded. Manage the InfoCube to view
the available data.
26. As per the requirement the General Ledger InfoCube is created and ready for
Reporting. Create a new query using a query designer.
27. Query designer for 0_FI_GL_4.
28. General Ledger Balance Sheet for the period 1995, 1996 and 1997.
The SAP Business Information Warehouse allows the user to analyze data from operative
SAP applications as well as all other business applications and external data sources such
as databases, online services and the Internet.
The SAP Business Information Warehouse enables Online Analytical Processing (OLAP),
which processes information from large amounts of operative and historical data. OLAP
technology enables multi-dimensional analyses from various business perspectives. The
Business Information Warehouse Server for core areas and processes, pre-configured with
Business Content, enables the user to look at information within the entire enterprise. In
selected roles in a company, Business Content offers the information that employees need
to carry out their tasks. As well as roles, Business Content contains other pre-configured
objects such as InfoCubes, queries, key figures, and characteristics, which make BW
implementation easier.
BI supports the transfer of data from flat files, files in ASCII format
(American Standard Code for Information Interchange) or CSV format (Comma Separated
Value). For example, if budget planning for a company’s branch offices is done in
Microsoft Excel, this planning data can be loaded into BI so that a plan-actual comparison
can be performed. The data for the flat file can be transferred to BI from a workstation or
from an application server.
1. The user defines a file source system.
2. The user creates a DataSource in BI, defining the metadata for the user file in BI.
3. The user creates an InfoPackage that includes the parameters for data transfer to the
PSA.
The structure of the flat file and the metadata (transfer structure of the DataSource) defined
in SAP BW have to correspond to one another to enable correct data transfer. Make
especially sure that the sequence of the InfoObjects corresponds to the sequence of the
columns in the flat file.
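That check can be illustrated with a short sketch (plain Python, hypothetical field names): compare the header of the CSV file against the field sequence of the transfer structure before loading.

```python
import csv
import io

# Hypothetical transfer structure: InfoObject sequence defined in SAP BW.
transfer_structure = ["CUSTOMER_ID", "LAND", "CITY"]

# A flat file whose columns must appear in exactly that order.
flat_file = io.StringIO("CUSTOMER_ID,LAND,CITY\n1000,US,Boston\n")
reader = csv.reader(flat_file)
header = next(reader)

# The column sequence must match the transfer structure exactly.
print(header == transfer_structure)  # True
rows = list(reader)
print(rows[0])  # ['1000', 'US', 'Boston']
```

If the comparison fails, the columns of the file (or the InfoObject sequence) must be reordered before the InfoPackage is scheduled.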
The transfer of data to SAP BW takes place via a file interface. Determine the parameters
for data transfer in an InfoPackage and schedule the data request. The user can find more
information under Maintaining InfoPackages Procedure for Flat Files.
For flat files, delta transfer is supported in the case of flexible updating. The user can
establish whether, and which, delta processes are supported during maintenance of the
transfer structure. With additive deltas, the extracted data is added in BW; DataSources
with this delta process type can supply both ODS objects and InfoCubes with data. During
transfer of the new status for modified records, the values are overwritten in BW;
DataSources with this delta process type can write the data into ODS objects and master
data tables.
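The two delta process types just mentioned can be sketched side by side (plain Python, illustrative keys and modes): with an additive delta the extracted value is added to the existing one, while with "new status for changed records" (after image) it overwrites it.

```python
def apply_delta(target, record, mode):
    """mode 'ADD' = additive delta; mode 'AIE' = new status (after image)."""
    key, value = record
    if mode == "ADD":
        target[key] = target.get(key, 0) + value  # added in BW
    else:  # "AIE"
        target[key] = value                       # overwritten in BW

cube, ods = {"1000": 10}, {"1000": 10}
apply_delta(cube, ("1000", 4), mode="ADD")   # additive delta: 10 + 4
apply_delta(ods, ("1000", 14), mode="AIE")   # after image: overwrite with 14
print(cube["1000"], ods["1000"])  # 14 14
```

Both targets reach the same final value, but the additive record carries only the change (+4), while the after-image record carries the full new status (14).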
Go to transaction RSO2 (generic DataSource maintenance), select the radio button
TRANSACTION DATA, give some technical name, and click on the CREATE button.
Give the APPLICATION COMPONENT and TABLE NAME, and enter the descriptions.
Then click on the SAVE button.
Select the fields as for master data; otherwise click on Hide. For transaction data, KEY
FIGURES and a REFERENCE are compulsory, so select some key figures along with the
reference values and click on the SAVE button. And then
-> Go to the BW side and replicate the user's DataSource by selecting, in the source
system tab, the application component that we already assigned on the R/3 side.
-> Click on Assign InfoSource in the context menu of the user's replicated DataSource.
Flat files are data files that contain records with no structured
relationships. Additional knowledge is required to interpret these files such as the file
format properties. Modern database management systems use a more structured approach
to file management (such as one defined by the Structured Query Language) and therefore
have more complex storage arrangements.
Many database management systems offer the option to export data to comma delimited
file. This type of file contains no inherent information about the data and interpretation
requires additional knowledge. For this reason, this type of file can be referred to as a flat
file.
For example, .csv is a comma-separated flat file; .txt, .lis and .lst are other examples.
3.4.1 Basic Steps of Data Flow (ETL process):
In the Attribute tab, enter the fields (non-key fields) that need to be maintained
within the Master Data table.
Ex: Customer Id – primary key field
Land
City
On entry of each field, hit [Enter] and in the pop-up box, select the 1st option – “Create
Attribute as Characteristic”.
Proposal tab – provides a quick view of the data in the flat file to be uploaded
Click on the ‘Load Example’ tab. The details can be viewed in the below pane.
Fields tab – provides details of all fields used to create a data structure, to map with
the fields from Source file.
In the Template Info field, enter the field names (characteristics/attributes).
[Enter] or [Save]
[Copy]: all properties of the characteristics (e.g. data types, length, etc.) are
copied here.
Note: the fields 0LANGU and 0TXTSH are the standard SAP defined characteristics
InfoObjects.
[Save]
[Activate]
Preview tab – gives a preview of data on flat file loaded onto a structure.
Click on “Read Preview Data”
Info Package: An InfoPackage helps to transport data from the Data Source structures into
the PSA tables. (This is similar to transfer of data from Work areas to Internal Tables in
R/3 ABAP).
Rt. Click on the Data Source and Create InfoPackage.
In the pop-up, defining the number of records per request, click OK.
This leads to the PSA display screen, with the table and the respective data in them.
Click on [Back] and come back to the DataSource screen.
Transformation: This is a rule that is defined, for mapping the source fields in the
DataSource to the final Target (basically, the Info Providers from where BI extracts the
reports).
In this case, our target is the InfoObject (Master data table), which has been already
created.
Note: Normally, the Transformation can be created on the Data Source or the Target. Here,
we are creating a Transformation on the DataSource.
Right click on DataSource Create Transformation
In the pop-up,
For the Target of Transformation,
Enter Object Type: InfoObject (this is our target in this scenario)
Enter the subtype of the object: Attributes (as we are considering master data only,
not transactional data).
Enter the name of the InfoObject we created.
For the Source of Transformation,
Enter Object Type: DataSource (this is our source structure)
Enter name of Data Source used
Enter the Source System used.
Click on OK
Note:
While uploading Master Data, select Sub-type of Object : ‘Attributes’ (IP1 for DS1)
While uploading Text desc. For this Master Data, select Subtype of Object : ‘Texts’ (IP2
for DS2)
In the next screen pop-up, we get the no. of proposals (rules) generated to map source
fields with target fields. Click on Ok.
On the next window we can see a graphical representation of the mapping of Source fields
to Target fields.
Save and activate, if the mapping is done correctly.
4. DB Connect
4.1 Introduction
There are two types of DBMS involved: the BI DBMS and the source DBMS. The
key point is that both of these DBMSs are supported on their respective operating system
versions only if SAP has released a DBSL; if not, they do not meet the requirements and
DB Connect cannot be performed. In this process we use a DataSource to make the data
available to BI and transfer it to the respective InfoProviders defined in the BI system.
Further, using the usual data acquisition process, we transfer data from the databases to
the BI system.
In this way SAP provides options for extracting data from external systems: in addition to
extracting data using the standard connection, the user can extract data from tables and
views in other database management systems (DBMS).
Now, Under DB Connect, we can see the name of our Source System (MS SQL DB
Connect)
The type of DataSource that we have here is “Master Data Attributes”
The screenshot below describes how to perform extraction or loading using a
Table/View. As the standard adapter is “Database Table” (by default), we can
specify the Table/View here
Now, choose the data source from the DB Object Names.
Region ID
Region Description
The DataSource has to be activated before it is loaded, so we “ACTIVATE” it
once.
After activation, the data records (4) are displayed. Eastern, Western, Northern &
Southern
Now, we need to create an Info Area to create an Info provider (like Info Cube)
After creating the info cube we check for the data in the PSA by “Manage the PSA”
This can also be done using the keyboard shortcut (Ctrl + Shift + F6)
The number of records displayed: 4
1. Status
2. Data Packet
3. Data records
4. REGION ID
5. REGION Description
The Table/View “CUSTOMERS” is now chosen for extraction. The next tab is
“PROPOSAL”, which describes all the database fields; here we have to specify
the DataSource fields, types and lengths.
Now, we create an Info package → IP_TEST_CUST
Now, go to RSA1 → Info objects → Info object (Test) → Create Info Object Catalog
Now, we can preview the Region ID & Region Description.
We now create 2 Info objects & pass the Region ID & Region Description to the 2
objects.
Region (REGION2)
Region ids (REG_ID)
We create a Characteristic as InfoProvider for the Master Data loading in the “Info
Provider” section → Insert Characteristic as Info Provider
After checking the Transformation mappings on the Region ID, we now perform a
DTP Creation on the same Region ID (Attribute)
We choose the Target system (default) as Info object Region ID REG_ID &
the Source Type as Data source with the Source System MSSQL
After this step, we proceed with creating an Info package IP_DS_TEDDY which
has the source system as MSSQL. Further we start the scheduling of the Info
package. Once the info package has been triggered we can go to “Maintain PSA” &
monitor the status of data in PSA
Further, we EXECUTE the DTP and can monitor the transfer of data from the PSA to
the InfoCube.
5. Universal Data Integration
5.1 Introduction
UD Connect Source
UD Connect sources are the instances that can be addressed as DataSources using the
BI JDBC Connector.
1. Create the connection to the data source with the relational or multidimensional
source objects (relational database management system with tables and views) on the
J2EE Engine.
2. Create RFC destinations on the J2EE Engine and in BI to enable communication
between the J2EE Engine and BI, as described in the Implementation Guide for SAP
NetWeaver → Business Intelligence → UDI Settings by Purpose → UD Connect Settings.
3. Model the InfoObjects required in accordance with the source object elements in BI.
4. Define a DataSource in BI.
1. Select the application component where the user wants to create the DataSource and
choose Create DataSource.
2. On the next screen, enter a technical name for the DataSource, select the type of
DataSource and choose Copy. The DataSource maintenance screen appears.
3. Select the General tab.
A connection to the UD Connect source is established. All source objects available in the
selected UD Connect source can be selected using input help.
a) Check the mapping and change the proposed mapping as required. Assign the
non-assigned source object elements to free DataSource fields. The user cannot
map elements to fields if the types are incompatible. If this happens, the system
displays an error message.
b) Choose Copy to Field List to select the fields that the user wants to transfer to
the field list for the DataSource. All fields are selected by default.
Here, the user can edit the fields transferred to the field list of the DataSource
from the Proposal tab. If the system detects differences between the proposal and the
field list when switching from the Proposal tab to the Fields tab, a dialog box is
displayed where the user can specify whether to copy the changes from the proposal to
the field list.
a) Under Transfer, specify the decision-relevant DataSource fields that the user
wants to be available for extraction and transferred to BI.
b) If required, change the values for the key fields of the source. These fields are
generated as a secondary index in the PSA. This is important in ensuring good
performance for data transfer process selections, in particular with semantic
grouping.
c) If required, change the data type for a field.
d) Specify whether the source provides the data in the internal or external format.
e) If the user chooses an External Format, ensure that the output length of the
field (external length) is correct. Change the entries if required.
f) If required, specify a conversion routine that converts data from an external
format to an internal format.
g) Select the fields that the user wants to be able to set selection criteria for when
scheduling a data request using an InfoPackage. Data for this type of field is
transferred in accordance with the selection criteria specified in the
InfoPackage.
h) Choose the selection options (such as EQ, BT) that the user wants to be
available for selection in the InfoPackage.
i) Under Field Type, specify whether the data to be selected is language-
dependent or time-dependent, as required.
If the user did not transfer the field list from a proposal, the fields of the
DataSource can be defined directly. Choose Insert Row and enter a field name. The user
can specify InfoObjects in order to define the DataSource fields: under Template
InfoObject, specify InfoObjects for the fields of the DataSource. This allows the
technical properties of the InfoObjects to be transferred to the DataSource fields.
Entering InfoObjects here does not equate to assigning them to DataSource fields.
Assignments are made in the transformation. When the user defines the transformation,
the system proposes the InfoObjects entered here as candidates to assign to the fields.
7. Check, save and activate the DataSource.
8. Select the Preview tab.
If the user selects Read Preview Data, the number of data records specified in the
field selection is displayed in a preview. This function allows the user to check
whether the data formats and data are correct.
In order to keep the amount of data generated during UD Connect access to a JDBC data
source as small as possible, each SELECT statement generated by the JDBC adapter receives
a GROUP BY clause that uses all recognized characteristics. The recognized key figures are
aggregated. What is recognized as a key figure or characteristic, and which methods are
used for aggregation, depends on the properties of the associated InfoObjects modeled in
SAP BW for this access.
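The statement generation described above can be sketched as follows. This is a minimal illustration, not the actual JDBC adapter code; the table and field names are invented for the example.

```python
def build_select(table, characteristics, key_figures, agg="SUM"):
    """Build the kind of SELECT the JDBC adapter generates: every
    recognized characteristic goes into the GROUP BY clause and every
    recognized key figure is aggregated."""
    select_cols = characteristics + [f"{agg}({kf}) AS {kf}" for kf in key_figures]
    return (f"SELECT {', '.join(select_cols)} FROM {table} "
            f"GROUP BY {', '.join(characteristics)}")

stmt = build_select("CUSTOMERS", ["REGION_ID"], ["SALES_AMOUNT"])
# SELECT REGION_ID, SUM(SALES_AMOUNT) AS SALES_AMOUNT FROM CUSTOMERS GROUP BY REGION_ID
```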
The amount of extracted data is not restricted. To prevent exceeding the storage limitations
of the J2EE server, packages with around 6,000 records are transferred to the calling ABAP
module.
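The packaging behavior can be illustrated with a short sketch; the 6,000-record package size is taken from the text, everything else is an assumption for the example.

```python
def transfer_in_packages(records, package_size=6000):
    """Yield successive packages of at most package_size records, mirroring
    how data is handed to the calling ABAP module to avoid exceeding the
    storage limitations of the J2EE server."""
    for start in range(0, len(records), package_size):
        yield records[start:start + package_size]

packages = list(transfer_in_packages(list(range(14000))))
# 14,000 records are split into 3 packages of 6000, 6000 and 2000 records
```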
Currently, only one database object (table or view) can be used as a UD Connect source.
The JDBC scenario does not support joins. However, if multiple objects are to be used in
the form of a join, a database view should be created that provides this join, and this
view is then used as the UD Connect source object. A view offers further benefits:
The database user selected from SAP BW for access is only permitted to access
these objects.
Using the view, type conversions can be run that cannot be made by the
adapter (generation of the ABAP data types DATS, TIMS, etc.)
Sun's JDBC (Java Database Connectivity) is the standard Java API for Relational
Database Management Systems (RDBMS). The BI JDBC Connector allows the user to
connect
applications built with the BI Java SDK to over 170 JDBC drivers, supporting data sources
such as Teradata, Oracle, Microsoft SQL Server, Microsoft Access, DB2, Microsoft Excel,
and text files such as CSV. This connector is fully compliant with the J2EE Connector
Architecture (JCA).
The user can also use the BI JDBC Connector to make these data sources available in SAP
BI systems using UD Connect. The user can also create systems in the portal that are based
on this connector.
The connector adds the following functionality to existing JDBC drivers:
• Standardized connection management that is integrated into user management in the
portal
• A standardized metadata service, provided by the implementation of JMI capabilities
based on CWM
• A query model independent of the SQL dialect in the underlying data source
The JDBC Connector implements the BI Java SDK's IBIRelational interface.
5.6.1 Deploy the data source’s JDBC driver to the server:
To prepare a data source for use with the BI Java SDK or with UD
Connect, the user first needs to configure the properties in the BI Java Connector used
to connect to the data source. This is done in the SAP NetWeaver Application Server’s
Visual Administrator.
In the service Connector Container, configure a reference to the JDBC driver of the
data source.
After the user has configured the BI Java Connector, a rough installation check can be
performed by displaying the page for the connector in the server. Perform the tests for
the connector by visiting the URLs
When creating applications with the BI Java SDK, refer to a connector by its
JNDI name: The BI JDBC Connector has the JNDI name SDK_JDBC.
The user can clone an existing connection by using the Clone button in the
toolbar.
Refer to the table below for the required and optional properties to
configure for the user connector: BI JDBC Connector Properties
5.7 BI XMLA Connector
The XMLA Connector sends commands to an XMLA-compliant OLAP data source in order
to retrieve the schema rowsets and obtain a result set. The BI XMLA Connector allows the
user to connect applications built with the BI Java SDK to data sources such as Microsoft
Analysis Services, Hyperion, MicroStrategy, MIS, and BW 3.x. This connector is fully
compliant with the J2EE Connector Architecture (JCA). The user can also use the BI
XMLA Connector to make these data sources available in SAP BI systems via UD
Connect, or create systems in the portal based on this connector. The BI XMLA
Connector implements the BI Java SDK's IBIOlap interface.
5.7.1 Using InfoObjects with UD Connect
When modeling InfoObjects in BI, note that the InfoObjects have to
correspond to the source object elements with regard to type description and length
description. For more information, see the documentation on data type compatibility.
The following restrictions apply when using InfoObjects:
• Alpha conversion is not supported
• The use of conversion routines is not supported
• Upper and lower case must be enabled
These InfoObject settings are checked when they are generated.
5.7.2 Using SAP Namespace for Generated Objects
System changeability
Key
Because the generated objects are ABAP development objects, the user
must be authorized as a developer. A developer key must be procured and entered; it
requires the customer-specific installation number and can be generated online.
The system administrator knows this procedure and should be included in the procurement.
The key has to be procured and entered exactly once per user and system. Because the
generated objects are created in the SAP namespace, an object key is also required.
Like the developer key, this is customer-specific and can also be procured online. The key
is to be entered exactly once per object and system. Afterwards, the object is released for
further changes as well. No further effort is required for repeated changes to the
field list or similar.
6. XML Integration
6.1 Introduction
XML Based communication over the Internet is achieved through SAP's Business
Connector
Figure 1
6.3.4 Factors leading to emergence of XML-enabled SAP solutions
The requirements for XML interface certification for SAP's complementary software
include:
Use of HTTP and HTTPS protocols with the SAP Business Connector
Customization for a specific Business Application
Sending and receiving the communication objects (i.e. Idocs, BAPIs or RFCs)
2. Open the Business Connector administration screen in a web browser window. Enter
the username and corresponding password.
3. If the user wants to transmit data from an SAP system to the BC, the same user needs
to exist in both systems. To create users in the BC, click Security > Users and
Groups > Add and Remove Users.
4. Enter the desired SAP user, assign the corresponding password and click Create
Users. This creates a user. Select the newly created user in the Groups box section and
make sure that the ‘Select Group’ ="Administrators". Now, add the user to the
Administrators group by clicking the button below the right selection box. Click Save
Changes to save the settings.
5. All the proposed SAP system(s) should be added within the Business Connector. To
achieve this, click Adapters > SAP which opens a new window. In the new window click
SAP > SAP Servers > Add SAP Server.
6. Enter the necessary information for the SAP server (System, Login Defaults, Server
Logon, and Load Balancing) and click "Save" (as illustrated in screen 1).
6.5.3 Add Router Tables
7. All incoming calls to the Business Connector are scrutinized, and routing information
about the ‘Sender’, ‘Receiver’ and ‘Msg Type’ is extracted. Using the rules, the
Business Connector finds the recipient and the format that should be used to send the
call to this particular recipient. Clicking Adapters > Routing opens a new window.
Enter information such as ‘Sender’, ‘Receiver’ and ‘MsgType’. With "Add Rule" the
rule is created; further details such as "Transport" and "Transport Parameters" must
then be provided.
8. After entering all the details, click Save and enable the rule by clicking No under the
"Enabled?" column.
9. Post a document containing the XML format of the IDoc or BAPI/RFC call to the
Business Connector service.
For example: Use the following statements to post a document to the /sap/InboundIdoc
service of the Business Connector.
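As a rough sketch, such a post could be built as follows. Only the service path /sap/InboundIdoc comes from the text; the host, port, credentials, and IDoc payload are placeholder assumptions, and the request is constructed here but not sent.

```python
import base64
import urllib.request

def build_idoc_post(host, xml_payload, user, password):
    """Build (without sending) an HTTP POST request that carries an IDoc
    XML document to the Business Connector's /sap/InboundIdoc service.
    Host and credentials are placeholders, not real values."""
    url = f"http://{host}/sap/InboundIdoc"
    # The Business Connector authenticates service calls via HTTP basic auth
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=xml_payload.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "Authorization": f"Basic {credentials}",
        },
        method="POST",
    )

# Placeholder host and payload; urllib.request.urlopen(req) would send it
req = build_idoc_post("bc-host:5555", "<ORDERS05>...</ORDERS05>", "SAPUSER", "secret")
```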
The user cannot use the SAP Business Connector in a web application (e-Commerce,
Procurement, etc), but can use it to facilitate a business-to-business transaction in an EDI-
like manner. For example: The user can send an XML document to the user vendor, and
they send the user an XML packet back.
7. Data Mart Interface
7.1 Introduction
The data mart interface makes it possible to update data from one
InfoProvider to another. In a data exchange between multiple BI systems, the
data-delivering system is referred to as the source BI and the data-receiving system as
the target BI. The individual BI systems arranged in this way are called data marts. The
InfoProviders of the source BI are used as the sources of data here. Data exchange is
also possible between BI systems and other SAP systems.
A BI system defines itself as the source system for another BI system by:
Providing metadata
Providing transaction data and master data
An export DataSource is needed to transfer data from a source BI into a target BI. Export
DataSources for InfoCubes and DataStore objects contain all the characteristics and key
figures of the InfoProvider. Export DataSources for master data contain the metadata for all
attributes, texts, and hierarchies for an InfoObject.
Changes to the metadata of the source system can only be added to the export
DataSources by regenerating the export DataSources.
The Delete function is not supported at this time.
The user can only generate an export DataSource from an InfoCube if:
The InfoCube is activated
The name of the InfoCube is at least one character shorter than the
maximum length of a name, since the DataSource name is made up of the
InfoCube name and a prefix.
7.3 Data Mart Interface in the Myself System
The data mart interface in the Myself system is used to connect the
BW system to itself. This means the user can update data from data targets into other data
targets within the same system. The user can import InfoCube data via an InfoSource into
BW again and thereby fill another InfoCube. A data cleanup can be carried out, for
example, using transfer rules and update rules.
The user can update data directly from ODS objects into other data
targets, since the ODS data can be used directly as a DataSource in the same system.
Data marts are found both in the maintenance and in the definition, the
same as in the SAP source system. Here too, the user can group together data from one or
more source systems in a BI system, or continue to work in several BI systems.
The data mart interface can be used between two BI Systems or between another SAP
system and a BI System:
To enable large amounts of data to be processed
To increase the respective speeds of data transfer, logging and analysis.
To achieve improved concision and maintenance of the individual Data
Warehouses.
To enable data separation relating to the task area, on the one hand, and to be able
to perform analyses of the state of the entire dataset on the other.
To make it possible to have fewer complexities when constructing and
implementing a Data Warehouse.
To construct hub and spoke scenarios in which a BI System stands in the middle
and the data from distributed systems runs together and is standardized.
7.4.1 Architectures
Replication Architecture
Aggregation Architecture
7.4.1.1 Replicating Architecture
If the user selects this architecture, the data for a BI server is available
as source data and can be updated in further target BI systems.
7.4.1.2 Aggregating Architecture
If required, define the source system. This is only necessary if the source system
has not yet been created in the target BI.
Create the InfoProvider into which the data is to be loaded.
Replicate the metadata of the export DataSource from the source BI into
the target BI. Using the source system tree, the user can replicate all the
metadata of a source system, or only the metadata of the DataSource.
Activate the DataSource.
Create an InfoPackage at the DataSource using the context menu. The
source BI system is specified as the source by default.
Using the context menu for the DataSource, create a data transfer process with
the InfoProvider into which the data is to be loaded as the target. A default
transformation is created at the same time.
The complete data flow is displayed in the InfoProvider tree under the
InfoProvider.
Schedule the InfoPackage and the data transfer process. We recommend that the
user always use process chains for loading processes.
7.4.3 Generating Export DataSources for InfoProviders
The technical name of the export DataSource is made up of the number 8 followed by the
name of the InfoProvider. Example: technical name of an InfoCube: COPA; technical name
of the export DataSource: 8COPA
1. Select the InfoObject tree in the Data Warehousing Workbench in the source BI
system.
2. Generate the export DataSource from the context menu of the InfoObject. To
do this, choose Additional Functions → Generate Export DataSource.
When the user creates an InfoObject or a master data InfoSource in the source BI system,
the user must therefore make sure that the length of the technical name of each object is
no longer than 28 characters.
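The naming rule and the length restriction above can be captured in a small sketch; the prefix and the 28-character limit are taken from the text, while the function itself is purely illustrative.

```python
MAX_OBJECT_NAME = 28  # limit stated in the text for objects in the source BI system

def export_datasource_name(infoprovider_name):
    """Derive the export DataSource name: the digit 8 followed by the
    InfoProvider name (e.g. COPA -> 8COPA)."""
    if len(infoprovider_name) > MAX_OBJECT_NAME:
        raise ValueError(
            f"'{infoprovider_name}' exceeds {MAX_OBJECT_NAME} characters; "
            "the '8' prefix would make the DataSource name too long")
    return "8" + infoprovider_name

print(export_datasource_name("COPA"))  # 8COPA
```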
7.4.5 Transactional Data Transfer Using the Data Mart Interface
The data transfer is the same as the data transfer in an SAP system.
The system reads the data, taking into account the specified dimension-specific selections,
from the fact tables of the delivering BI system.
Using the data mart interface, the user can transfer data by full upload as well as by delta
requests. A distinction is made between InfoCubes and DataStore objects.
The InfoCube that is used as an export DataSource is first initialized, meaning that
the current status is transferred into the target BI system. When the next upload is
performed, only those requests that have come in since initialization are transferred.
Different target systems can also be filled in this way. Only those requests that have
been rolled up successfully in the aggregates are transferred. If no aggregates
are used, only those requests that are set to Quality OK in the InfoCube
administration are transferred.
For DataStore objects, the requests in the change log of the DataStore object are
used as the basis for determining the delta. Only the change log requests that have
arisen from reactivating the DataStore object data are transferred.
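The delta determination for InfoCubes described above can be sketched as follows. The request structure (id, rolled_up, quality_ok) is a simplified assumption for illustration, not an actual SAP table layout.

```python
def delta_requests(requests, last_transferred_id, aggregates_in_use):
    """Select the requests to transfer as a delta: only requests newer than
    the last transferred one, and of those only the requests rolled up into
    aggregates (or, if no aggregates are used, those with quality status OK)."""
    new_requests = [r for r in requests if r["id"] > last_transferred_id]
    if aggregates_in_use:
        return [r for r in new_requests if r["rolled_up"]]
    return [r for r in new_requests if r["quality_ok"]]

requests = [
    {"id": 1, "rolled_up": True,  "quality_ok": True},   # already transferred
    {"id": 2, "rolled_up": False, "quality_ok": True},
    {"id": 3, "rolled_up": True,  "quality_ok": False},
]
```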
7.4.5.2 Restriction
Only one selection can be made per target system for the
delta. Suppose the user first makes a selection using cost center 1 and loads deltas for
this selection. Later on, the user also decides to load a delta for cost center 2 in
parallel to the cost center 1 delta. The delta can then only be requested in full for
both cost centers, meaning that it is impossible to execute deltas separately for the
different selections.
7.4.6 Transferring Texts and Hierarchies for the Data Mart Interface
When the user loads hierarchies, the available hierarchies are reached
from the Hierarchy Selection tab page using the pushbutton Available Hierarchies in
OLTP. Select the hierarchy and schedule the loading.
7. Virtual InfoCubes
7.1 Introduction
Virtual InfoCubes only store data virtually, not physically, and are made available
to reporting as InfoProviders. A virtual remote cube can be used to 'act like an
InfoCube, feel like an InfoCube, but not be an InfoCube'. The user can create this
virtual InfoProvider with a structure similar to an InfoCube, with dimensions,
characteristics and key figures. But how the data for this cube is actually generated
depends on the implementation the user chooses.
The user can choose to create a virtual provider with 'direct access' to
R/3. This means that upon viewing the data in this virtual provider, a remote function
call goes directly to R/3 to get the values on the fly. The user can also choose to get
values from a function module of the user's choosing, so the implementation details are
really up to the user.
In BI7, just right-click on any InfoArea and choose Create 'Virtual
Provider'. For direct access, the user chooses a Virtual Provider with 'direct access'.
There is an option to use the old R/3 InfoSource. This means that the user can create an
InfoSource with transfer rules and such, and when the virtual InfoProvider is created, it
will use the structure in the InfoSource automatically, and the flow of data
automatically links R/3, the transfer rules and the virtual InfoProvider. Note that no
update rules exist with this kind of setup.
But in BI7, the user has a more flexible approach, in that a transformation can be
created to configure the 'transfer logic' of the virtual InfoProvider, along with a
start routine, end routine or any other technique available with a transformation rule.
With the BI7 setup, the user needs to create a sort of 'pseudo' DTP,
which does not actually do anything, meaning the user does not 'execute' it to start a
data transfer. After all is done, the user needs to right-click on the virtual
InfoProvider and choose 'Activate Direct Access'. If using an InfoSource, go to the
InfoSource tab and choose the InfoSource. If using the BI7 setup, choose the DTP related
to the transformation and save it.
An SAP RemoteCube is a RemoteCube that allows the definition of queries with direct
access to transaction data in other SAP systems.
An SAP RemoteCube is not recommended in the following cases:
The user requests a large amount of data in the first query navigation step, and no
appropriate aggregates are available in the source system
A lot of users execute queries simultaneously
The user frequently accesses the same data
The SAP RemoteCube can be used in reporting in cases where it does not differ from other
InfoProviders.
7.3.1.3 Integration
BW Service API functions (contained in the SAP R/3 plug-in) are installed.
The Release status of the source system is at least 4.0B
In BW, a source system ID has been created for the source system
DataSources from the source system that are released for direct access are assigned
to the InfoSource of the SAP RemoteCube. There are active transfer rules for these
combinations.
7.3.2 Remote Cube
When reporting using a RemoteCube, the Data Manager, instead of using a Basic Cube
filled with data, calls the RemoteCube BAPI and transfers the parameters.
Selection
Characteristics
Key figures
As a result, the external system transfers the requested data to the OLAP Processor.
7.3.2.2 Integration
To report using a RemoteCube the user has to carry out the following steps:
1. In BW, create a source system for the external system that the user wants to use.
2. Define the required InfoObjects.
3. Load the master data:
Create a master data InfoSource for each characteristic
Load texts and attributes
4. Define the RemoteCube
5. Define the queries based on the RemoteCube
7.3.3 Virtual InfoCubes with Services
The user uses a virtual InfoCube with services to display data from non-BW data sources
in BW without having to copy the data set into the BW structures. The data can be local
or remote. The user's own calculations can also be used to change the data before it is
passed to the OLAP processor.
7.3.3.1 Structure
When creating an InfoCube, the user can specify its type. If the
user chooses Virtual InfoCube with Services as the type, an extra
Detail pushbutton appears on the interface. This pushbutton opens an additional dialog
box, in which the user defines the services.
1. Enter the name of the function module that the user wants to use as the data
source for the virtual InfoCube. There are different default variants for the
interface of this function module. One method for defining the correct variant,
together with the description of the interfaces, is given at the end of this
documentation.
2. The next step is to select options for converting/simplifying the selection
conditions. The user does this by selecting the Convert Restrictions option. These
conversions only change the transfer table in the user-defined function module.
The result of the query is not changed because the restrictions that are not
processed by the function module are checked later in the OLAP processor.
Options:
No restrictions: if this option is selected, no restrictions are passed to the
function module.
Only global restrictions: If this option is selected, only global restrictions
(FEMS = 0) are passed to the function module. Other restrictions (FEMS > 0)
that are created, for example, by setting restrictions on columns in queries, are
deleted.
Simplify selections: Currently this option is not yet implemented.
Expand hierarchy restrictions: If this option is selected, restrictions on
hierarchy nodes are converted into the corresponding restrictions on the
characteristic value.
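Two of the conversion options above, keeping only global restrictions and expanding hierarchy-node restrictions, can be sketched like this. The restriction structure (fems, char, value, hierarchy_node) is an illustrative assumption, not the real transfer-table layout.

```python
def convert_restrictions(restrictions, only_global=False,
                         expand_hierarchies=False, hierarchy_members=None):
    """Sketch of the restriction conversion: optionally keep only global
    restrictions (FEMS = 0), and optionally replace hierarchy-node
    restrictions with restrictions on the node's characteristic values."""
    if only_global:
        # Drop column restrictions (FEMS > 0); keep global ones (FEMS = 0)
        restrictions = [r for r in restrictions if r["fems"] == 0]
    converted = []
    for r in restrictions:
        if expand_hierarchies and r.get("hierarchy_node"):
            # Expand the node into restrictions on its characteristic values
            for value in hierarchy_members[r["hierarchy_node"]]:
                converted.append({"fems": r["fems"], "char": r["char"],
                                  "value": value})
        else:
            converted.append(r)
    return converted
```

For example, with restrictions on REGION (global) and YEAR (column-level), only_global keeps just the REGION entry; a hierarchy-node restriction expands into one entry per member value.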
3. Pack RFC: This option packs the parameter tables in BAPI format before the
function module is called, and unpacks the data table returned by the
function module after the call is performed. Since this option is only useful in
conjunction with a remote function call, the user has to define a logical system
that determines the target system for the remote function call if this option is
selected.
4. SID support: If the data source of the function module can process SIDs, the user
should select this option.
If this is not possible, the characteristic values are read from the data source and
the data manager determines the SIDs dynamically. In this case, wherever
possible, restrictions that are applied to SID values are converted automatically
into the corresponding restrictions for the characteristic values.
7.3.3.2 Dependencies
If the user uses a remote function call, SID support must be switched
off and the hierarchy restrictions must be expanded.
7.3.3.2.1 Description of the interfaces for user-defined
function modules
Variant 1:
Variant 2:
7.3.3.2.2 Additional parameters for variant 2 for transferring hierarchy restrictions
Variant 3:
7.3.3.3 Method for determining the correct variant for the interface
The following list describes the procedure for determining the correct
interface for the user-defined function module. Go through the list from top to
bottom. The first appropriate case is the variant that the user should use: