
What is an Alias?

The alias technique has been around far longer than Business Objects has. Whenever a SQL
script needs to access the same table more than once, an alias is used. One of the easiest
examples is an employee table that contains a key to the employee's manager. In order to
return a list of employees and who they work for, I might write SQL code something like this:
SELECT emp.name
, mgr.name
FROM employees emp
, employees mgr
WHERE emp.mgr_id = mgr.emp_id

There is only one table in the code: the employees table. However, to get both the employee
name and the manager name, I need to join the table back to itself. To accomplish that,
I have to give each side of the relationship a unique name. That is the SQL alias for each table,
and the same technique is used in building universes.
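The self-join above can be run end to end. Here is a minimal sketch using SQLite; the sample data and column values are invented for illustration:

```python
import sqlite3

# In-memory database with a hypothetical employees table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id INTEGER, name TEXT, mgr_id INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [(1, "Alice", None), (2, "Bob", 1), (3, "Carol", 1)],
)

# The same table joined to itself under two aliases, emp and mgr
rows = conn.execute("""
    SELECT emp.name, mgr.name
    FROM employees emp, employees mgr
    WHERE emp.mgr_id = mgr.emp_id
    ORDER BY emp.name
""").fetchall()

print(rows)  # [('Bob', 'Alice'), ('Carol', 'Alice')]
```

Note that Alice, who has no manager, drops out of the result: the inner self-join only returns rows where `mgr_id` matches an `emp_id`.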
Aliases are more typical in a universe environment with lookup tables. For example, in the
screen shot shown below (taken from my presentation) the COUNTRY table serves two distinct
purposes. In one case it is used to show where a Resort is located. In the other case it contains
the Region for each customer. An alias is indicated by the fact that the COUNTRY table is always
on the one side of a one-to-many join. This issue (and solution) can be found in the Island
Resorts universe.
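In SQL terms, the aliased COUNTRY lookup would generate something like the query below. The schema is a simplified sketch of the Island Resorts layout; table and column names are assumptions for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE country  (country_id INTEGER, country TEXT);
    CREATE TABLE resort   (resort_id INTEGER, resort TEXT, country_id INTEGER);
    CREATE TABLE customer (cust_id INTEGER, name TEXT, country_id INTEGER);
    INSERT INTO country  VALUES (1, 'France'), (2, 'US');
    INSERT INTO resort   VALUES (10, 'French Riviera', 1);
    INSERT INTO customer VALUES (100, 'Baker', 2);
""")

# COUNTRY appears twice, once per purpose, under two aliases
rows = conn.execute("""
    SELECT cust.name, customer_country.country, res.resort, resort_country.country
    FROM customer cust, country customer_country, resort res, country resort_country
    WHERE cust.country_id = customer_country.country_id
      AND res.country_id  = resort_country.country_id
""").fetchall()

print(rows)  # [('Baker', 'US', 'French Riviera', 'France')]
```

Each alias plays exactly one role, so the customer's country and the resort's country can both appear in a single row without ambiguity.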

To summarize so far: an alias is used when the same table is used more than once, and is
typically indicated by that table always being on the one side of a one-to-many join.

What is a Context?
A context, on the other hand, is a concept that probably originated within a universe. If a SQL
developer decides that he or she needs two scripts to get the job done, then that's what they
write. But a universe is supposed to be able to generate SQL for a non-developer. So how can it
have the intelligence to know when to split something into multiple SQL passes? That is the role
of a context.
Here is another screen shot borrowed from my presentation.

In the diagram shown above, the two end points are CUSTOMER and SERVICE. There are two
ways to figure out which service a customer is related to. The first relationship goes through the
reservations tables and is used to show which services a customer has requested. The second
relationship goes through the sales tables and is used to bill the customer for services used.
These relationships are equally important.
A context is indicated in this case because there is a loop. (Please note, however, that contexts
also solve fan and chasm traps, which do not involve loops.) A loop in a universe is found when
a sequence of joins can be selected that starts and ends at the same table. In this case, I could
map from the CUSTOMER table through the INVOICE_LINE table, the SERVICE table, back through
the reservations tables and end up back where I started. That's a loop. If the loop is not resolved
in some way, Desktop Intelligence and Web Intelligence will issue an "incompatible combination
of objects" message rather than run the invalid SQL.
An alias was indicated by tables on the one side of every join. Contexts are indicated by tables
appearing only on the many side of a one-to-many join. In this loop, those are
the RESERVATION_LINE and INVOICE_LINE tables. Those tables will ultimately drive my context
creation process.
To summarize contexts: they are used to resolve loops (and other SQL traps) and are indicated
by tables always appearing on the many side of a one-to-many join.
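Conceptually, a context resolves the loop by issuing one SQL statement per path and keeping the results separate. Here is a hedged sketch of the two passes, with the Island Resorts schema heavily simplified and the data invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer         (cust_id INTEGER, name TEXT);
    CREATE TABLE service          (service_id INTEGER, service TEXT);
    CREATE TABLE reservation_line (cust_id INTEGER, service_id INTEGER, future_guests INTEGER);
    CREATE TABLE invoice_line     (cust_id INTEGER, service_id INTEGER, amount REAL);
    INSERT INTO customer VALUES (1, 'Baker');
    INSERT INTO service  VALUES (7, 'Sports');
    INSERT INTO reservation_line VALUES (1, 7, 2);
    INSERT INTO invoice_line     VALUES (1, 7, 120.0);
""")

# Pass 1: the Reservations context (customer -> reservation_line -> service)
reserved = conn.execute("""
    SELECT c.name, s.service, r.future_guests
    FROM customer c, reservation_line r, service s
    WHERE c.cust_id = r.cust_id AND r.service_id = s.service_id
""").fetchall()

# Pass 2: the Sales context (customer -> invoice_line -> service)
billed = conn.execute("""
    SELECT c.name, s.service, i.amount
    FROM customer c, invoice_line i, service s
    WHERE c.cust_id = i.cust_id AND i.service_id = s.service_id
""").fetchall()

print(reserved)  # [('Baker', 'Sports', 2)]
print(billed)    # [('Baker', 'Sports', 120.0)]
```

Running the two paths in one statement would join reservations to invoices and inflate the numbers; keeping one pass per context is exactly what prevents that.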

Context Versus Alias


I have shown examples of where each technique would be appropriate. Is it possible to use an
alias to solve a context issue, or a context to solve an alias challenge? To be honest, yes. But
you would end up creating more problems than you solve. Here's a summary of the two
features:



Alias Features
o Aliases are simple to create
o Objects for each alias can be created, for example Resort Country and Customer Country
from my example given above
o Once created, no further maintenance is required
o Aliases eliminate (break) loops

Context Features
o Contexts are more complex to create
o Every join must exist in at least one context; if not, it becomes an isolated join
o Loops are not eliminated; two (or more) paths are identified and used, but the loop remains
o Contexts can also resolve fan or chasm traps

SAP BW Interview Questions 2


1) What is a process chain? How many types are there? How many do we use in a real-time scenario? Can we define
interdependent processes with tasks like data loading, cube compression, index maintenance, master data & ODS
activation with the best possible performance & data integrity?
2) What is data integrity and how can we achieve it?
3) What is index maintenance and what is the purpose of using it in real time?
4) When and why do we use InfoCube compression in real time?
5) What is meant by data modelling and what does the consultant do in data modelling?
6) How can we enhance Business Content, and for what purpose do we enhance it (given that we can simply
activate Business Content)?
7) What is fine-tuning, how many types are there, and for what purpose do we do tuning in real time? Can tuning
only be done through InfoCube partitions and creating aggregates, or is there anything else?
8) What is meant by MultiProvider and for what purpose do we use a MultiProvider?
9) What are scheduled and monitored data loads, and for what purpose?
Ans # 1:
Process chains exist in the Administrator Workbench. Using them we can automate ETL processes. They allow BW
administrators to schedule all activities and monitor them (T-code: RSPC).
PROCESS CHAIN - Before defining PROCESS CHAIN, let us define PROCESS in any given process chain. A process is a
procedure, either within SAP or external to it, with a start and an end. This process runs in the background.
A PROCESS CHAIN is a set of such processes that are linked together in a chain. In other words, each process is
dependent on the previous process, and the dependencies are clearly defined in the process chain.
This is normally done in order to automate a job or task that has to execute more than one process in order to
complete the job or task.
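The dependency idea itself is generic and can be sketched outside SAP: each step runs only if its predecessor succeeded, and a failure stops everything downstream. The step names below are invented for illustration, not real BW process types:

```python
# Generic sketch of chained, dependent steps: each step runs only if the
# previous one succeeded, mirroring how a process chain sequences jobs.
def run_chain(steps):
    results = []
    for name, step in steps:
        try:
            step()
            results.append((name, "OK"))
        except Exception:
            results.append((name, "FAILED"))
            break  # downstream processes do not run after a failure
    return results

def fail():
    raise RuntimeError("simulated failure")

chain = [
    ("load_master_data", lambda: None),
    ("load_transaction_data", lambda: None),
    ("rollup_aggregates", fail),
    ("compress_cube", lambda: None),  # never reached once a step fails
]

results = run_chain(chain)
print(results)
```

The point of the sketch is only the sequencing contract: `compress_cube` never runs because `rollup_aggregates` failed, just as a chain step waits on its predecessor's success event.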
To trace the job that a process chain triggers in the source system (and cancel it if necessary):
1. Check the source system for that particular process chain.
2. Select the request ID of the process chain (it will be in the Header tab).
3. Go to SM37 in the source system.
4. Double-click the job.
5. You will navigate to a screen.
6. There, click the "Job Details" button.
7. A small pop-up window appears.
8. In the pop-up screen, take note of:
a) Executing server
b) WP number/PID
9. Open a new SM37 session (/OSM37).
10. In it, click the "Application Servers" button.
11. You can see the different application servers.
12. Go to the executing server and double-click it (point 8 (a)).
13. Go to the PID (point 8 (b)).
14. On the left-most side you can see a check box.
15. Check the check box.
16. On the menu bar you can see "Process".
17. Under "Process" you have the option "Cancel with Core".
18. Click that option.
* -- Ramkumar K
Ans # 2:
Data Integrity is about eliminating duplicate entries in the database and achieving normalization.
Ans # 4:
InfoCube compression creates a condensed version of the cube by eliminating duplicates: rows that differ only by
request ID are aggregated together. Compressed InfoCubes require less storage space and are faster for retrieval of
information. The catch is: once you compress, you can no longer delete data from the InfoCube by request ID.
You are safe as long as you don't have any error in modeling.
This compression can be done through a process chain and also manually.
Tips by: Anand
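Leaving SAP internals aside, the effect of the compression described in Ans # 4 can be illustrated generically: rows that differ only by request ID collapse into one aggregated row. The fact rows and field names below are invented for illustration:

```python
from collections import defaultdict

# Uncompressed fact rows: (request_id, material, amount).
# Compression drops the request ID and aggregates the measure.
fact_rows = [
    (101, "M-01", 10.0),
    (102, "M-01", 5.0),
    (102, "M-02", 7.0),
]

compressed = defaultdict(float)
for _request_id, material, amount in fact_rows:
    compressed[material] += amount

print(dict(compressed))  # {'M-01': 15.0, 'M-02': 7.0}
```

Three rows become two, and because the request IDs are gone after aggregation, deleting "just request 102" is no longer possible, which is exactly the caveat noted above.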
Ans#3
Indexing is a process where data is stored together with an index that makes it quick to locate. E.g. a phone book:
when we write down somebody's number, Prasad's number goes under "P" and Rajesh's number goes under "R".
The phone book's organisation is an index; similarly, storing data by creating indexes is called indexing.
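The phone-book idea maps directly onto a lookup structure. A generic sketch (not SAP-specific; names and numbers invented):

```python
# Build an "index" on the first letter of the name, like a phone book:
# a lookup goes straight to one bucket instead of scanning every entry.
contacts = [("Prasad", "555-0101"), ("Rajesh", "555-0202"), ("Priya", "555-0303")]

index = {}
for name, number in contacts:
    index.setdefault(name[0], []).append((name, number))

print(index["P"])  # [('Prasad', '555-0101'), ('Priya', '555-0303')]
print(index["R"])  # [('Rajesh', '555-0202')]
```

Database indexes use more sophisticated structures (typically B-trees), but the principle is the same: extra storage organised by key in exchange for faster lookups.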
Ans#5
Data modelling is a process where you collect the facts, the attributes associated with the facts, navigational
attributes, etc., and after you collect all these you need to decide which ones you will be using. This collection is
done by interviewing the end users, the power users, the stakeholders, etc. It is generally done by the team lead,
project manager or sometimes a senior consultant (4-5 years of experience). So if you are new you don't have to
worry about it, but do remember that it is an important aspect of any data warehousing solution, so make sure that
you have read about data modelling before attending any interview or even starting to work.
Ans#6
We can enhance Business Content by adding fields to it. Since Business Content is delivered by SAP, it may not
contain all the InfoObjects, InfoCubes, etc. that you want to use according to your company's data model. E.g. you
have a customer InfoCube (in Business Content) but your company uses an attribute for, say, apartment number;
then instead of constructing the whole InfoCube you can add that field to the existing Business Content InfoCube
and get going.

Ans#7
Tuning is the most important process in BW. Tuning is done to increase efficiency: lowering the time to load data
into a cube, lowering the time to access a query, lowering the time to do a drill-down, etc. Fine-tuning = lowering
time (for everything possible). Tuning can be done by many things, not only by partitions and aggregates; there are
various other things you can do, e.g. compression.
Ans#8
A MultiProvider can combine various InfoProviders for reporting purposes. For example, you can combine 4-5
InfoCubes, or 2-3 InfoCubes and 2-3 ODS objects, or an InfoCube, an ODS and master data, etc. You can refer to
help.sap.com for more info.
Ans#9
A scheduled data load means you have scheduled the loading of data for a particular date and time; you can do
this in the scheduler tab of the InfoPackage. Monitored means you are monitoring that particular data load, or some
other loads, using transaction RSMON.

Hi Pradeep,
1. Procedure for repeat delta?
You need to set the request status to red in the monitor screen and then delete it from the ODS/cube. When you
open the InfoPackage again, the system will prompt you for a repeat delta.
Also:
Go to RSA7 -> F2 -> Update Mode -> Delta Repetition.
Delta repetition is done based on the type of upload you are carrying out.
1. If you are loading master data, then most of the time you will change the QM status to red and then repeat the
delta. The repeat of the delta is allowed only if you make this change.
Sometimes you need to investigate further if the repeat of the delta is not allowed even after the QM status has
been set to red.
If this is not the case, the source system and therefore also the extractor, have not yet received any information
regarding the last delta and you must set the request to GREEN in the monitor using a QM action.
The system then requests a delta again since the last delta request has not yet occurred for the extractor.
Afterwards, you must reset the old request that you previously set to GREEN to RED since it was incorrect and it
would otherwise be requested as a data target by an ODS.
Caution: If the terminated request was a REPEAT request itself, always set this to RED so that the system tries to
carry out a repeat again.
To determine whether a delta or a repeat are to be requested, the system ONLY uses the status of the monitor.
It is irrelevant whether the request is updated in a data target somewhere.

When activating requests in an ODS, the system checks delta repeat requests for completeness and the correct
sequence.
Each green delta/repeat request in the monitor that came from the same DataSource/source system combination
must be updated in the ODS before activation, which means that in this case, you must set them back to RED in the
monitor using a QM action when using the solution described above.
If the source of the data is a DataMart, it is not just the DELTARNR field that is relevant (in the roosprmsc table in the
system in which the source DataMart is, which is usually your BW system since it is a Myself extraction in this case),
rather the status of the request tabstrip control is relevant as well.
Therefore, after the last delta request has terminated, go to the administration of your data source and check whether
the DataMart indicator is set for the request that you wanted to update last.
If this is NOT the case, you must NOT request a repeat since the system would also retransfer the data of the last
delta but one.
This means, you must NOT start a delta InfoPackage which then would request a repeat because the monitor is still
RED. For information about how to correct this problem, refer to the following section.
For more information about this, see also Note 873401.
Proceed as follows:
Delete the rest of this request from ALL updated data targets, set the terminated request to GREEN IN THE
MONITOR and request a new DELTA.
Only if the DataMart indicator is set does the system carry out a repeat correctly and transfers only this data again.
This means that only in this case can you leave the monitor status as it is and restart the delta InfoPackage. This
then creates a repeat request.
In addition, you can generally also reset the DATAMART indicator and then work using a delta request after you have
set the incorrect request to GREEN in the monitor.
Simply start the delta InfoPackage after you have reset the DATAMART indicator AND after you have set the last
request that was terminated to GREEN in the monitor.
After the delta request has been carried out successfully, remember to reset the old incorrect request to RED since
otherwise the problems mentioned above will occur when you activate the data in a target ODS.
What is a process chain and how have you used it?
A) Process chains are a tool available in BW for automating the upload of master data and transaction data while
taking care of the dependencies between processes.
B) In one of our scenarios we wanted to upload a wholesale price InfoObject which holds the wholesale price for all
materials. Then we wanted to load transaction data. While loading transaction data, in order to populate the
wholesale price, there was a lookup in the update rule on this InfoObject's master data table. This dependency of
first uploading master data and then uploading transaction data was handled through the process chain.
What is a process chain and how have you used it?
A) We have used process chains to automate the delta loading process. Once you are finished with your design and
testing you can automate the processes listed in RSPC. I have a real-time example in the attachment.
For more detail:
Collecting Process Chain Statistics
/thread/235805 [original link is broken]
Advice regarding process chains
Creation of process chains

Hi ,
<b>1. What is a process chain and how have you used it?</b>
Process chains are a tool available in BW for automating the upload of master data and transaction data while
taking care of the dependencies between processes.
<b>2. What is the transaction for creating process chains?</b>
RSPC.
<b>3. Explain Collector Process?</b>
Collector processes are used to manage multiple predecessor
processes that feed into the same subsequent process. The collector
processes available for BW are:
AND:
All of the direct predecessor processes must raise an event in order for subsequent processes to be executed.
OR:
At least one predecessor process must send an event. The first predecessor process that sends an event triggers
the subsequent process.
Any additional predecessor processes that send an event will again trigger the
subsequent process (only if the chain is planned as periodic).

EXOR: Exclusive OR
Similar to regular OR, but there is only ONE execution of the successor
processes, even if several predecessor processes raise an event.
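The three collector variants can be sketched as simple predicates over the set of predecessor events received so far. This is a generic illustration of the semantics, not SAP code, and the process names are invented:

```python
def and_collector(raised, predecessors):
    # AND: fire only once every predecessor has raised its event
    return set(predecessors) <= set(raised)

def or_collector(raised):
    # OR: fire each time at least one event has arrived
    return len(raised) >= 1

def exor_collector(raised, already_fired):
    # EXOR: fire on the first event only, ignore any later ones
    return len(raised) >= 1 and not already_fired

preds = ["load_a", "load_b"]
print(and_collector(["load_a"], preds))                  # False: still waiting for load_b
print(and_collector(["load_a", "load_b"], preds))        # True: all predecessors done
print(exor_collector(["load_a"], already_fired=False))   # True: first event fires
print(exor_collector(["load_a", "load_b"], already_fired=True))  # False: fires only once
```

The contrast is in when the successor runs: AND waits for everyone, OR runs per event, EXOR runs exactly once.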
<b>4. What are application processes?</b>
Application processes represent BW activities that are typically
performed as part of BW operations.
Examples include:
Data load
Attribute/Hierarchy Change run
Aggregate rollup
Reporting Agent Settings
<b>5. Tell some facts about process chains</b>
o Process chains are transportable; there is a button for writing to a change request when
maintaining a process chain in RSPC
o Process chains are available in the transport connection wizard (Administrator Workbench)
o If a process dumps, it is treated in the same manner as a failed process
o Graphical display of process chain maintenance requires the 620 SAPGUI and the SAP BW 3.0B frontend GUI
o A special control background job runs to facilitate the execution of the other batch jobs of the process chain
o Note your BTC process distribution, and make sure that an extra BTC process is available so the supporting
control job can run immediately
<b>6. What happens when a chain is activated?</b>
When a chain gets activated, it is copied into the active version. The processes are planned in batch as program
RSPROCESS, with type and variant given as parameters, under the job name BI_PROCESS_<TYPE>, each waiting
for its event, except the trigger. The trigger is planned as specified in its variant; if started via a meta-chain, it is
not planned in batch.
<b>7. Steps in process chains?</b>
Go to transaction code RSPC.
Follow the basic flow of a process chain:
1. Start chain
2. Delete BasicCube indexes
3. Load data from the source system into the PSA
4. Load data from the PSA into the ODS object
5. Activate data in the ODS object
6. Load data from the ODS object into the BasicCube
7. Create indexes after loading for the BasicCube
