Source: http://help.sap.com/saphelp_nw70/helpdata/en/43/7a69d9f3897103e10000000a1553f7/content.htm
Use
You can use the open hub destination to extract data to non-SAP systems. Various APIs allow
you to connect a third-party tool to the BI system and to use this third-party tool to distribute data
to other non-SAP systems.
Features
First you extract the data from BI InfoProviders or DataSources into a database table in the BI
system. The third-party tool receives a message when the extraction process is complete. You can
define parameters for the third-party tool. You can also use the monitor to oversee the process.
You can connect one or more data transfer processes to an open hub destination of type Third-Party Tool.
You can use a process chain to start the extraction process not only in the BI system itself, but
also using the third-party tool.
The following APIs are available:
RSB_API_OHS_DEST_SETPARAMS: You use this API to transfer the parameters of the third-party tool that are required for the extraction to the BI system. These parameters are saved in a
parameter table within the BI system, in the metadata for the open hub destination.
RSB_API_OHS_3RDPARTY_NOTIFY: This API sends a message to the third-party tool after
extraction. It transfers the open hub destination, the request ID, the name of the database table,
the number of extracted data records and the time stamp. In addition, you can add another
parameter table that contains the parameters that are only relevant for the third-party tool.
RSB_API_OHS_REQUEST_SETSTATUS: This API sets the status of the extraction to the third-party tool in the monitor. Red means that the existing table is not overwritten. If the status is
green, the request is processed further. Normally the user can change the status manually in the
monitor or in the maintenance screen for the data transfer process. However, these manual
functions are deactivated with open hub destinations of type Third-Party Tool.
RSB_API_OHS_DEST_GETLIST: This API delivers a list of all open hub destinations.
RSB_API_OHS_DEST_GETDETAIL: This API gets the details of an open hub destination.
RSB_API_OHS_DEST_READ_DATA: This API reads data from the database table in the BI
system.
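Taken together, these APIs imply a simple control flow for the third-party tool. The Python sketch below simulates that flow; `call_rfc` is a hypothetical stand-in for a real RFC call (for example, through the SAP RFC SDK), and all parameter and field names are illustrative assumptions rather than the actual RFC signatures.

```python
# Illustrative sketch of the third-party control flow around the Open Hub APIs.
# call_rfc simulates the BI system; it is NOT a real RFC client, and the
# parameter and field names are hypothetical.

def call_rfc(func, **params):
    """Simulated BI system: returns canned responses per API name."""
    if func == "RSB_API_OHS_DEST_GETLIST":
        return {"DESTINATIONS": ["ZOHD_SALES"]}
    if func == "RSB_API_OHS_DEST_GETDETAIL":
        return {"TABLE_NAME": "/BIC/OHZOHD_SALES"}
    if func == "RSB_API_OHS_DEST_READ_DATA":
        return {"ROWS": [{"MATERIAL": "M1", "AMOUNT": 100}]}
    if func == "RSB_API_OHS_REQUEST_SETSTATUS":
        return {"STATUS": params["STATUS"]}
    raise ValueError(func)

def pull_request(destination, request_id):
    """After the BI system signals completion (RSB_API_OHS_3RDPARTY_NOTIFY),
    read the extracted rows and confirm the request as green."""
    detail = call_rfc("RSB_API_OHS_DEST_GETDETAIL", DESTINATION=destination)
    data = call_rfc("RSB_API_OHS_DEST_READ_DATA",
                    DESTINATION=destination, REQUEST_ID=request_id)
    call_rfc("RSB_API_OHS_REQUEST_SETSTATUS",
             DESTINATION=destination, REQUEST_ID=request_id, STATUS="G")
    return detail["TABLE_NAME"], data["ROWS"]
```

In a real integration the tool would first receive the RSB_API_OHS_3RDPARTY_NOTIFY message from the BI system, then run a routine like pull_request with the notified request ID.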
See also:
For detailed information about certification and the scenario, see the SDN at www.sdn.sap.com
under Partners and ISVs → SAP Integration and Certification Center → Integration Scenarios →
Business Intelligence Interface: BW-OHS.
Microsoft Connector 1.0 for SAP BI is an add-in for SQL Server Integration Services.
It provides an efficient and streamlined solution for integrating non-SAP data
sources with SAP BI. It also enables the construction of data warehouse solutions for
SAP data in SQL Server 2008, where SAP BI is exposed as a data source of SQL
Server.
Microsoft Connector 1.0 for SAP BI has the following requirements:
SQL Server 2008 Integration Services. Microsoft Connector 1.0 for SAP BI
needs to be installed on the same computer where Integration Services is
installed.
Extracting data from an SAP BI system by using Microsoft Connector 1.0 for SAP BI
requires the SAP Open Hub license. For more information about SAP licensing,
consult your SAP representative.
On the SAP BI system, SAP_BW component support package level 16 (as part
of SAP NetWeaver Support Pack Stack 14) is required. SAP_BW component
support package level 17 or higher is strongly recommended.
To use Microsoft Connector 1.0 for SAP BI in 32-bit (64-bit) mode on any 32-bit (64-bit) operating system, the 32-bit (64-bit) version of librfc32.dll needs
to be copied to the following location: %windir%\system32.
To use Microsoft Connector 1.0 for SAP BI in 32-bit mode on a 64-bit operating
system, the 32-bit librfc32.dll needs to be copied to the following location:
%windir%\SysWow64.
Notes
Microsoft Connector 1.0 for SAP BI can only be used with SQL Server 2008
Integration Services. However, you can load data from or extract data to SQL
Server 2008, SQL Server 2005, or SQL Server 2000 databases.
Microsoft Connector 1.0 for SAP BI does not support SAP BW 3.5 and earlier
versions.
Extracting data from an SAP BI system by using Microsoft Connector 1.0 for
SAP BI only supports Open Hub Destinations. It does not support InfoSpokes,
because InfoSpokes are obsolete in SAP NetWeaver BI.
With Microsoft Connector 1.0 for SAP BI, it is now possible to use components of the
SQL Server platform to move data in and out of SAP BI.
Figure 9: The two nodes that are the minimum requirement for a process chain in
SAP BI
The process chain must contain at least these two nodes:
Start node with the scheduling option Start Using Meta Chain or API (Figure
10)
Data transfer process node for the Open Hub destination
Figure 11: Adding the SAP BI source to the Toolbox in Business Intelligence
Development Studio
Now SAP BI Source is available in Data Flow Sources (Figure 12).
Figure 12: The list of Data Flow Sources in the Toolbox in Business Intelligence
Development Studio after adding the SAP BI source
Figure 15: The representation of the SAP BI source in the data flow of a package
Edit the source by choosing the appropriate SAP BI connection manager, specifying
the RFC destination, and choosing the previously-created process chain (Figure 16).
Figure 16: Configuring the SAP BI source on the Connection Manager page of the
SAP BI Source Editor
Note the different execution modes that are available:
W Wait for Notify: No process chain is started; instead the tool only waits
until it is notified that the extraction is complete. Someone else is
responsible for starting the extraction (for example, SAP's own scheduler).
E Extract Only: A process chain is not started, and the source does not wait
for notification. Instead, the value entered in the Request ID field is used to
retrieve the data behind that request.
P Trigger Process Chain: The process chain configured in the source is started
in SAP BI, and the tool then waits for notification that the extraction is
complete.
If the Integration Services package will initiate the ETL process from SAP BI, then
the mode P should be chosen to trigger the SAP BI process chain for data
movement through Open Hub. This is the most suitable option for a pull pattern.
The mode W is the best for a push pattern. In this mode, SAP BI schedules its
own internal ETL, and then it starts the Open Hub DTP to push data to SQL Server.
The mode E is used when there is an error during the ETL and a particular request
needs to be reprocessed. This is mostly useful during testing, or in production
during a data recovery process.
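The choice among these modes can be summarized in a small lookup. The helper name below is hypothetical, not part of the connector:

```python
# Hypothetical helper mapping an integration pattern to the connector's
# execution mode, summarizing the discussion above.

def choose_execution_mode(pattern):
    """P = trigger a process chain (pull), W = wait for notify (push),
    E = extract only (reprocess a specific request)."""
    modes = {
        "pull": "P",       # SSIS starts the SAP BI process chain
        "push": "W",       # SAP BI schedules its own ETL; SSIS just waits
        "reprocess": "E",  # re-read one request by its Request ID
    }
    return modes[pattern]
```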
Note that the Extract-Only mode will fail if there are multiple packages within one
request. This failure occurs because the SAP BI system does not provide the number
of packets correctly when the Read function of the Open Hub API is called. To work
around this limitation and support Extract-Only mode, increase the package size in
the DTP of the Open Hub Destination to a value greater than the number of rows
that will be extracted. As a result, only one package is created.
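As a rough sketch of this workaround (the 10 percent headroom is an arbitrary illustrative margin, not a documented value):

```python
# Pick a DTP package size strictly greater than the expected row count,
# so that Extract-Only mode sees exactly one package.

def safe_package_size(expected_rows, headroom=0.1):
    return int(expected_rows * (1 + headroom)) + 1
```

For example, for an expected one million rows, the DTP package size in the Open Hub Destination would be set to at least safe_package_size(1_000_000) rows.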
Configuring the Advanced Settings
The main options available on the Advanced page of the SAP BI Source
Editor include:
Timeout setting
Request ID reset
Figure 17: Configuring advanced options for the SAP BI source on the Advanced
page of the SAP BI Source Editor
Of these, the Timeout and Request ID settings deserve particular attention.
Timeout specifies how long the SAP BI source should wait for the SAP BI system
before the package fails with a timeout error. If an Open Hub DTP is expected to
run for a long time, as in a full initial extraction, increase the timeout to a value
large enough to avoid the error. For routine delta loads, where the duration is
shorter, enter a realistic timeout value; any value between 300 and 3600 seconds
should be acceptable under normal delta circumstances.
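The timeout behavior can be pictured as a polling loop of the following shape. This is a minimal sketch, not the connector's implementation; check_done stands for whatever call reports that the request is complete, and the clock and sleep parameters exist only to make the sketch testable.

```python
import time

# Illustrative polling loop for the timeout behavior described above:
# wait for the extraction to finish, failing once the timeout elapses.

def wait_for_extraction(check_done, timeout_seconds, poll_interval=1.0,
                        clock=time.monotonic, sleep=time.sleep):
    deadline = clock() + timeout_seconds
    while clock() < deadline:
        if check_done():
            return True          # extraction finished within the timeout
        sleep(poll_interval)
    raise TimeoutError("SAP BI extraction did not finish in time")
```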
Request ID can be used to reset a DTP that encountered a problem. If a DTP load is
stuck in Yellow status in SAP BI, the request can be reset to Green. After a request is
successfully reset, it can be deleted in SAP BI in the Admin Workbench Monitor. For
more information about DTP request status, check the SAP system table
RSBKREQUEST on SAP BI, and look at the columns USTATE (User-Defined
Processing Status for a DTP Request) and TSTATE (Technical Processing Status for a
DTP Request). The overall DTP status will be successful when both USTATE and
TSTATE of a DTP request indicate success (value 2). Figure 18 shows all available
values of USTATE and TSTATE.
Figure 18: The available values for the status of a DTP request in SAP BI
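The overall-status rule described above reduces to one line; the helper name is hypothetical:

```python
# Minimal sketch of the overall-status rule from table RSBKREQUEST:
# a DTP request counts as successful only when both the user-defined
# status (USTATE) and the technical status (TSTATE) are 2.

def dtp_request_successful(ustate, tstate):
    return ustate == 2 and tstate == 2
```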
Adding and Configuring the Destination
After you set up the SAP BI source, define the destination in the package. An OLE
DB destination is commonly used for this purpose. Based upon the metadata from
the SAP BI source, the system may propose a table creation script if the target table
is not available in the database. After the column mapping is done, the Integration
Services package is ready to run (Figure 19).
Figure 19: A data flow for extracting from an SAP BI source to a non-SAP
destination
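What "propose a table creation script from the source metadata" amounts to can be sketched as follows. The helper and the type mapping are illustrative assumptions; the actual script is generated by the OLE DB destination designer.

```python
# Hypothetical sketch: map each source column to a SQL Server type and
# emit a CREATE TABLE statement for the staging table.

def create_table_script(table, columns):
    """columns: list of (name, sql_server_type) pairs."""
    cols = ",\n    ".join(f"[{name}] {sqltype}" for name, sqltype in columns)
    return f"CREATE TABLE [{table}] (\n    {cols}\n)"
```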
Figure 21: Configuring a source system on the RFC Destination screen in SAP BI
The InfoSource and InfoPackage can either be set up within SAP BI's Admin
Workbench, or in Integration Services from within the SAP BI Destination Editor
dialog box.
Figure 22: Creating SAP BI objects directly from the SAP BI Destination Editor
dialog box
Note that the objects created from the SAP BI Destination Editor dialog box are
placed under the Unassigned Nodes application area in the SAP Admin Workbench. If
you prefer a dedicated application area, consider creating the objects in the SAP BI
Admin Workbench instead.
Configuring the Integration Services package in Business Intelligence Development
Studio involves three main steps:
1. Add the SAP BI Destination as a destination in the data flow.
2. Set up the connection manager for SAP BI.
3. Configure the destination in the SAP BI Destination Editor dialog box.
Figure 23: Adding the SAP BI destination to the Toolbox in Business Intelligence
Development Studio
Create a new connection manager for SAP BI first. The details can be found in the
setup steps for Application Scenario 1. For more information, see Setting Up the
Connection Manager for SAP BI earlier in this white paper.
After the InfoPackage and InfoSource are available, add the SAP BI destination to
the data flow of the package. Then configure the destination in the SAP BI
Destination Editor dialog box.
Figure 24: Configuring the SAP BI destination on the Connection Manager page
of the SAP BI Destination Editor dialog box
The data flow of the package now looks like this.
Figure 25: A data flow for loading from a non-SAP source to an SAP BI destination
A compelling use case is to leverage Microsoft Connector 1.0 for SAP BI to move the
multidimensional data in SAP BI's InfoCubes to SQL Server Analysis Services cubes,
with all the dimensional structures and content intact. The main objective is to
migrate SAP BI InfoCubes to SQL Server cubes efficiently, in order to construct an
Analysis Services-based enterprise data warehouse. This use case demonstrates
that this objective can be achieved with stability, quality, and performance, and
with a relatively small amount of effort.
When SAP BI Open Hub processes InfoCube data, it flattens the multidimensional
structure into a relational structure. So the design idea is to mirror the same flat
structure first in a staging table, then reconstruct the dimensions in the Analysis
Services cube.
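The flattening idea can be illustrated with a small sketch (hypothetical helper; the real join is performed by Open Hub inside SAP BI): each fact row is widened with the attributes of its dimension members.

```python
# Sketch of the flattening step: join each fact row with its dimension
# attributes to produce one wide, relational row, mirroring what Open Hub
# does when it exports an InfoCube.

def flatten(fact_rows, dimensions):
    """dimensions: {dim_key_column: {key_value: {attribute: value}}}"""
    flat = []
    for row in fact_rows:
        wide = dict(row)
        for key_col, lookup in dimensions.items():
            wide.update(lookup.get(row[key_col], {}))
        flat.append(wide)
    return flat
```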
Figure 27: Metadata for the dimensions and fact tables in standard SAP BI
InfoCube 0FIAP_C03
The flattened Open Hub structure is shown in Figure 28.
Figure 29: The configuration of the process chain in SAP BI, and of the data flow in
the SQL Server Integration Services package
The column mappings in the Integration Services package are shown in Figure 30.
Figure 30: The column mappings between the SAP BI source and the destination
on the Mappings page of the OLE DB Destination Editor dialog box
The matching structure of the data in SQL Server Analysis Services is shown in
Figure 31.
Figure 31: The structure of the SQL Server Analysis Services cube based on the
data extracted from SAP BI to SQL Server
After the Analysis Services cube is set up, it needs to be deployed. Then each
dimension, and the cube itself, can be processed to dispatch data from the staging
table to each dimension respectively.
An easy way to validate the data quality after the cube migration is to run and
compare reports on SAP BI and Analysis Services.
Here is the result of an SAP BI BEx query against the SAP BI InfoCube, and here is
the matching data from the Analysis Services cube viewed in Excel.
Figure 33: Viewing the data from the SQL Server Analysis Services cube in an Excel
PivotTable report
Here is a SQL Server Reporting Services report against the Analysis Services cube.
Figure 34: Viewing the data from the Analysis Services cube in a Reporting
Services report
The query results on SAP BI and in the Analysis Services cube match precisely.
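Such a comparison can be automated along these lines; the helper is hypothetical, and the tolerance exists only to absorb rounding differences between the two report engines.

```python
# Illustrative validation helper: compare aggregated measures from the
# SAP BI BEx report with the same query against the Analysis Services
# cube, member by member, within a small tolerance.

def reports_match(bex_rows, ssas_rows, tolerance=0.01):
    """Each argument maps a dimension-member tuple to a measure value."""
    if bex_rows.keys() != ssas_rows.keys():
        return False  # different member sets already means a mismatch
    return all(abs(bex_rows[k] - ssas_rows[k]) <= tolerance for k in bex_rows)
```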
This paper has described the functionality of Microsoft Connector 1.0 for SAP BI,
and provided detailed step-by-step instructions on how to use the connector in SQL
Server Integration Services. A realistic use case was presented along with its design
highlights and rationale. Overall, the connector bridges the gap to support building
an enterprise data warehouse solution centered on Microsoft SQL Server 2008 in a
heterogeneous environment with a heavy SAP BI presence. It offers great flexibility
and efficiency for extracting non-SAP data into SAP BI, and for extracting SAP BI
data into a SQL Server data warehouse.
By using Microsoft Connector 1.0 for SAP BI effectively, enterprises running SAP can
now construct a streamlined end-to-end data warehouse and business intelligence
solution based upon Microsoft technologies, with lower TCO, better design, and
more flexibility.
For more information:
http://www.microsoft.com/sqlserver/: SQL Server Web site
http://technet.microsoft.com/en-us/sqlserver/: SQL Server TechCenter
http://msdn.microsoft.com/en-us/sqlserver/: SQL Server DevCenter
http://www.microsoft.com/sqlserver/2008/en/us/integration.aspx: SQL
Server Integration Services Web site
http://technet.microsoft.com/en-us/sqlserver/cc510302.aspx: SQL Server
Integration Services TechCenter
http://msdn.microsoft.com/en-us/sqlserver/cc511477.aspx: SQL Server
Integration Services DevCenter
Did this paper help you? Please give us your feedback. On a scale of 1 (poor) to 5
(excellent), how would you rate this paper, and why? For example:
Are you rating it high because it has good examples, excellent screen shots,
clear writing, or another reason?
Are you rating it low because of poor examples, fuzzy screen shots, or unclear
writing?
This feedback will help us improve the quality of white papers we release.