
Techno Level 1: Basic Level

1. Overview of OBIEE Real Time Project


1.1. Why OBIEE?
For the Consultant (you):
 An excellent route to get a job quickly: opportunities are plentiful. Ex: Naukri (Fig 1-5)
 An excellent, handsome salary, for the same reason. Ex: Gmail and Naukri
 The future is secure and steady in the OBIEE stream.
 It helps you settle quickly.
For the Employer (Company):
 You get more than you pay for with the OBIEE suite.
 It is easy to upgrade and to integrate with third-party tools.
 The full version is free for training users.
 Oracle support.
For the MSIT Techno:
 We have real-time experience of what an employer needs from a consultant.
 After the course, a consultant will have the working knowledge and skills of a real-time
experienced OBIEE principal consultant.
 After the course, a consultant can crack an interview in two or three attempts.
1.2. What is OBIEE?
Oracle Definition:
Oracle Business Intelligence Enterprise Edition 11g (OBIEE) is an unmatched and
comprehensive business intelligence and analytics platform that delivers a full range
of capabilities - including interactive dashboards, ad hoc queries, mobile analytics,
notifications and alerts, enterprise and financial reporting, scorecard and strategy
management, business process invocation, unstructured search and collaboration,
integrated systems management and more. OBIEE 11g is built on a proven and
modern technological foundation that supports the highest workloads and most
complex deployments, while providing timely insights to users across an enterprise at
a low overall total cost of ownership.
MSIT Definition:

OBIEE (or OBI) is a reporting tool or application used by business users to query an
organization's data, stored over a period of time in a data warehouse (OLAP system), to
obtain answers, results, or reports for analyzing the business and making better decisions.

Fig6

Open BI Analytics from the web and show how it is opened and accessed.

1.3. OBIEE Real Time Project Types


1.3.1. OBIEE without ETL (Directly from OLTP or any Database)
Fig7

1.3.2. OBIEE with ETL(OLAP using ETL Tools like Informatica, DAC, and ODI)
Fig8

1.3.3. OBIEE with OBIA or Oracle BI Apps(Using OBIA 7.9.6.4, OBIA 11.1.1.10.1)
Fig9

1.4. OLTP vs OLAP

1. OLTP: The source of data is transactions.
   OLAP: The source of data is various OLTP databases.
2. OLTP: Useful to store transactional or operational data.
   OLAP: Useful to store consolidated data.
3. OLTP: The nature of the data is current and detailed.
   OLAP: The nature of the data is historical and summarized.
4. OLTP: Useful to RUN the business.
   OLAP: Useful to ANALYZE the business.
5. OLTP: Supports CRUD (Create, Read, Update, Delete).
   OLAP: Supports read-only access.
6. OLTP: Volatile.
   OLAP: Non-volatile.
7. OLTP: Querying or processing is very fast.
   OLAP: Querying or processing time depends on the amount of data; it is improved by creating indexes.
8. OLTP: Uses normalized schemas.
   OLAP: Mostly uses de-normalized schemas.
9. OLTP: Isolated; depends on the application.
   OLAP: Integrated per subject area.
10. OLTP: Accessed by a large number of users. Ex: customers, employees…
    OLAP: Accessed by a smaller number of users. Ex: CEO, CFO, COO, GM, managers…
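The contrast above can be sketched with a toy example. This is an illustrative sketch only, using Python with an in-memory SQLite database (not OBIEE code); the table, columns, and data are invented for the example.

```python
import sqlite3

# Illustrative sketch only (not OBIEE code); table and data are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")

# OLTP style: current, detailed, row-at-a-time CRUD to RUN the business.
cur.execute("INSERT INTO orders VALUES (1, 'Acme', 100.0)")          # Create
cur.execute("UPDATE orders SET amount = 120.0 WHERE order_id = 1")   # Update
cur.execute("SELECT amount FROM orders WHERE order_id = 1")          # Read one row
print(cur.fetchone()[0])                                             # 120.0

# OLAP style: consolidated, read-only summaries to ANALYZE the business.
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(2, 'Acme', 50.0), (3, 'Globex', 75.0)])
cur.execute("SELECT customer, SUM(amount) FROM orders "
            "GROUP BY customer ORDER BY customer")
print(cur.fetchall())  # [('Acme', 170.0), ('Globex', 75.0)]
```

The same engine serves both styles here only because the data is tiny; the point is the difference in access pattern, not the storage technology.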
I1. What is a multidimensional data source?
1.5. OBIEE Installation (11g/12c)
1.5.1. Certification Matrix
Fig 10
1.5.2. Downloads (OTN & eDelivery)
oracle.com (OTN) and edelivery.oracle.com
1.5.3. Installation Types
Fig11
1.5.4. Installation
1.5.4.1. On Windows Server
Visit website
1.5.4.2. On Linux Server
Visit website
1.5.5. Bugs and Patches (Oracle Support)
Open oracle support site

ID: Msit.techno@gmail.com

Pwd:Msit123456

I2. Did you work on any Sev-1 issue?


Try Interview Level – Expected Questions
Section: I.Introduction
2. OBIEE Introduction
2.1. History of OBIEE

Siebel Systems Inc. acquired privately-held nQuire Software Inc. and


integrated and shipped nQuire's scalable analytic server and intelligent Web products with
Siebel 7, the e-business application suite.
San Mateo, California and Minnesota-based nQuire launched its nQuire suite of products in
late 1999 to monitor and deliver real-time intelligence based on any fact-based problem or
opportunity by initiating an e-mail or page to any device, such as a laptop, pager, PDA, RIM
device, or U.S. and European mobile phones.

Siebel 7 is an integrated suite of applications for customer relationship management (CRM) --
including applications for partner relationship management (PRM) and employee relationship
management (ERM) -- all based on a common Web architecture.

Siebel Enterprise Analytic Platform 7.7 includes Siebel Analytics Server for calculation and
integration, logical business model and metadata management, and caching. Other
components include the Intelligent Interaction Management module for data mining;
Interactive Dashboard, which provides a portal interface to the data; Siebel Answers for ad
hoc querying; and Siebel Delivers for sending alerts of changes in data.
In 2006, Oracle purchased Siebel Analytics and renamed it Oracle Business Intelligence 10g
(OBIEE 10g).
OBIEE 10g:

OBIEE 11g:
Summary:

 2000: nQuire formed


 2001: nQuire courted by Siebel.
 2002: Siebel purchases nQuire
 2005: Oracle purchases Siebel 7.8.3.1
 2006: OBIEE 10G or Siebel 7.8.4
 2010: OBIEE11g Version Released
 2015: OBIEE 12c Version Released

Following are major releases:

Siebel Analytics 7.0 – 2002


Siebel Analytics 7.5 – 2003
Siebel Analytics 7.7 – 2004
Siebel Analytics 7.8.2 and 7.8.3 – 2005
Siebel Analytics / Oracle Business Intelligence 7.8.4 and 7.8.5 – 2006
OBIEE 10gR3, 10.1.3.2 – Jan 2007 (MAUI)
OBIEE 10gR3, 10.1.3.2.1 – Apr 2007
OBIEE 10gR3, 10.1.3.3.0 – Aug 2007
OBIEE 10gR3, 10.1.3.3.1 – Oct 2007
OBIEE 10gR3, 10.1.3.3.2 – Dec 2007
OBIEE 10gR3, 10.1.3.3.3 – May 2008
OBIEE 10gR3, 10.1.3.4 – Aug 2008
OBIEE 10gR3, 10.1.3.4.1 – Apr 2009
OBIEE 11g-July 2010
OBIEE 12c-Oct 2015

OBIEE 11g versions


1. OBIEE 11.1.1.3.0
2. OBIEE 11.1.1.5.0
3. OBIEE 11.1.1.6.0
4. OBIEE 11.1.1.7.0
5. OBIEE 11.1.1.9.0

OBIEE 12c versions


1. OBIEE 12.2.1.0.0
2. OBIEE 12.2.1.1.0 (latest)

2.2. Components and Types

OBIEE components are mainly divided into two types of components:

 Server Components
 Client Components

Server Components:

Discussed during OBIEE Architecture

Client Components:

Windows-based (non-web, non-browser-based) clients

1. BI Administration Tool:
The Oracle BI Administration Tool is a Windows application that you can use
to create and edit repositories (RPDs). The Administration Tool can connect
directly to the repository in offline mode, or it can connect to the repository
through the Oracle BI Server (online mode).
Fig14
2. Catalog Manager:

Catalog Manager is a tool that lets you perform online and offline management of
Presentation Catalogs and catalog items such as reports, dashboards, and permissions,
including renaming reports, migration, etc.
3. Job Manager:

The Job Manager is a Windows tool that is the interface with the Oracle BI Scheduler.

Through Job Manager, you can connect to, start and stop the Oracle BI Scheduler, add

and manage jobs, and manage job instances.

4. ODBC Client:

Open Database Connectivity (ODBC) is an industry-standard interface for
connecting to databases. A Data Source Name (DSN) is used to store the
information about connecting to a given database as a given database user over
ODBC.
Ex: nqcmd, BI Analytics, etc.
Fig14

Web-based or browser-based clients

1. BI Analytics (11g) / BI Answers (10g):

The Oracle BI Analysis Editor is a set of graphical tools used to build, view, and
modify Oracle BI analyses. Analyses are queries against an organization's data. You
can include the views that you create in an analysis for display in dashboards.

2. BI Interactive Dashboards:
Dashboards provide personalized views of corporate and external information.
A dashboard consists of one or more pages. Pages can display anything that
you can access or open with a web browser, including the results of analyses,
images, text, etc.
3. BI Delivers:

BI Delivers is the interface used to create alerts based on Oracle
Business Intelligence analyses. You can use Delivers to detect specific results
and notify appropriate persons or groups via web, wireless, mobile, and
other communication channels.
4. BI Publisher:

Oracle BI Publisher is the reporting solution for authoring, managing, and delivering
pixel-perfect standardized reports and documents, more easily and faster than
traditional reporting tools.

5. OFMW Enterprise Manager:

Fusion Middleware Control is a web-based graphical user interface that is used to centrally
manage, monitor, and configure Oracle Business Intelligence system components, for
example the Oracle BI Server, Oracle BI Presentation Services, and Oracle BI Scheduler.

6. OWLS Administration Console:

The Administration Console is a web-browser-based graphical user interface that you
use to manage a WebLogic Server domain. The WebLogic Server Administration Console
enables you to monitor status, configure security, manage the Administration Server and
Managed Servers, and more.

2.3. Data Modelling Elements
2.3.1. Columns
String Columns:
Dimensional Columns
Dimensional Columns or Dimensions are categories of attributes by which the
business is defined.
EX: Time periods,Products,Customers
Within a given dimension, there may be many attributes.
For example, the time period dimension can contain the attributes day, week,
month, quarter, and year. Exactly what attributes a dimension contains depends
on the way the business is analyzed.
I3. What is a dimension?
I4. What is a conformed dimension?
I5. What is an SCD (slowly changing dimension)?
I6. What is a degenerate dimension?
I7. What is a junk dimension?
Dimensional Hierarchical Columns
Dimensional Hierarchical Columns typically contain hierarchies, which are sets of top-
down relationships between members within a dimension.

Ex: Year → Quarter → Month → Week → Day
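A hierarchy like this supports "rolling up" detail rows to any level. The sketch below is illustrative Python only (not OBIEE code), with invented sample data, showing how day-grain revenue rolls up a Year > Quarter > Month hierarchy.

```python
from collections import defaultdict

# Illustrative sketch only: rolling detail rows up a Year > Quarter > Month
# hierarchy. The sample data is invented.
rows = [  # (year, quarter, month, revenue)
    (2010, "Q1", "Jan", 100), (2010, "Q1", "Feb", 150),
    (2010, "Q2", "Apr", 200), (2011, "Q1", "Jan", 300),
]

def rollup(rows, depth):
    """Sum revenue at a hierarchy depth: 1 = year, 2 = quarter, 3 = month."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[:depth]] += row[3]
    return dict(totals)

print(rollup(rows, 1))  # {(2010,): 450, (2011,): 300}
print(rollup(rows, 2))  # {(2010, 'Q1'): 250, (2010, 'Q2'): 200, (2011, 'Q1'): 300}
```

In OBIEE this rollup is declared in logical dimension hierarchies rather than coded by hand; the sketch only illustrates what the server computes.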

Measure or Numeric Columns:

Fact Columns or Measures:
Facts are the metrics that business users use for making business decisions, by
joining them with their associated dimensions.

Generally, facts are plain numbers.

Ex: revenue, shipped quantity, billed quantity

I8. What is a fact or measure?

I9. What are fact types?

The volume of data in dimension tables is small compared to the data in fact tables.
The data in a dimension table is static and descriptive in nature, whereas a fact
table contains numeric data that changes regularly. Fact tables hold the key
performance indicators of the business.

I10. What is an implicit fact column?

Aggregated Columns:
Aggregated columns store precomputed (typically summed) or calculated results of facts.
Ex: total revenue, total shipped quantity
Key Columns:
Key columns are used to identify each row/record in a database table.
Put another way: if a numeric column produces a meaningful value when summed, it is a
fact column; otherwise it is a key column.
Ex: primary key, foreign key, surrogate key, chronological key
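Surrogate keys (the ROW_WID pattern asked about below) are meaningless integers assigned during a warehouse load. The sketch below is illustrative Python only, with invented names, showing the basic idea of mapping natural business keys to surrogate keys.

```python
# Illustrative sketch only: assigning surrogate keys (ROW_WID-style integers)
# to natural business keys during a warehouse load. Names are invented.
def assign_surrogate_keys(natural_keys):
    """Map each distinct natural key to a stable, meaningless integer key."""
    mapping = {}
    for nk in natural_keys:
        if nk not in mapping:
            mapping[nk] = len(mapping) + 1  # next available surrogate
    return mapping

keys = assign_surrogate_keys(["CUST-A", "CUST-B", "CUST-A", "CUST-C"])
print(keys)  # {'CUST-A': 1, 'CUST-B': 2, 'CUST-C': 3}
```

Because the surrogate carries no business meaning, the natural key can change (mergers, re-keying) without rewriting the fact rows that reference it.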

I11. What is primary key and foreign key?


I12. What is Surrogate key or ROW_WID, Candidate Key and Super Key?
I13. What is Chronological Key?
Cubes:
Cubes are made up of measures and organized by dimensions. Because they are
already dimensional, each cube maps easily to the logical fact and dimension tables in
the business model.

I14. What is a cube? Have you used cubes?


2.3.2. Tables
Dimensional Tables:
Dimension tables contain attributes that describe business entities (such as customer name,
region, address, country, and so on). Dimension tables also contain primary keys that
identify each member.

Ex: W_CUSTOMER_D, W_ORGANIZATION_D, W_PO_VENDORS_D

Dimensional Hierarchical Tables

Dimensional hierarchical tables have columns that typically contain hierarchies: sets of
top-down relationships between members within a dimension.

Ex: W_CALENDER_DH, W_PRODUCTS_DH

I15. What are the dimension or fact tables used in your project or in your data
warehouse?
Fact Tables:
Fact Tables contains facts or measures which are the key performance
indicators of the business when joined with related dimensions.

Ex: W_AP_INVOICE_F, W_GL_COGS_F,W_ORDERS_F

I16. Did you create any dummy tables in the RPD? Why?


I17. What is Factless fact table?

Aggregated Tables:
Aggregate tables store precomputed results: measures aggregated (typically summed) over
a set of dimensional attributes. Using aggregate tables is a typical technique for improving
query response times in decision-support systems.

Ex: W_AP_INVOICE_A, W_GL_COGS_A, W_ORDERS_A
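The idea of "precompute once, query many times" can be sketched in miniature. This is illustrative Python/SQLite only; the W_ORDERS_* names echo the examples above, but the columns and data are invented, not actual BI Apps DDL.

```python
import sqlite3

# Illustrative sketch only: building an aggregate table from a detail fact
# table. Table names echo the examples above; columns and data are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE w_orders_f (prod_key INTEGER, order_day TEXT, revenue REAL)")
cur.executemany("INSERT INTO w_orders_f VALUES (?, ?, ?)", [
    (1, '2010-01-01', 10.0), (1, '2010-01-02', 20.0), (2, '2010-01-01', 5.0),
])

# Precompute the sums once; later queries read the small aggregate table
# instead of scanning the (potentially huge) fact table.
cur.execute("CREATE TABLE w_orders_a AS "
            "SELECT prod_key, SUM(revenue) AS revenue "
            "FROM w_orders_f GROUP BY prod_key")
cur.execute("SELECT prod_key, revenue FROM w_orders_a ORDER BY prod_key")
print(cur.fetchall())  # [(1, 30.0), (2, 5.0)]
```

In OBIEE the aggregate table is mapped as an additional logical table source, and the BI Server decides when to use it; the sketch shows only what the table contains.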

2.3.3. Schemes
Fig13
OBIEE Schemes
1. Star Schema:
A star schema is a set of dimensional schemas (stars), each of which has
a single fact table with foreign-key join relationships to several dimension tables.
Fig15
2. Snowflake Schema:
A snowflake schema is a set of dimensional schemas in which each has a single fact
table with foreign-key join relationships to several dimension tables, and those
dimension tables in turn have primary-key joins to further dimension tables.
Fig16
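A typical query against a star joins the fact table to its dimensions on foreign keys and aggregates the measure. The sketch below is illustrative Python/SQLite only; table names and data are invented.

```python
import sqlite3

# Illustrative sketch only: a star-schema query, one fact table joined to a
# dimension on a foreign key. Table names and data are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE d_product (prod_key INTEGER PRIMARY KEY, brand TEXT)")
cur.execute("CREATE TABLE f_revenue (prod_key INTEGER, revenue REAL)")
cur.executemany("INSERT INTO d_product VALUES (?, ?)", [(1, 'BizTech'), (2, 'FunPod')])
cur.executemany("INSERT INTO f_revenue VALUES (?, ?)", [(1, 100.0), (1, 50.0), (2, 30.0)])

# The classic star query: group by a dimension attribute, sum the fact measure.
cur.execute("SELECT d.brand, SUM(f.revenue) "
            "FROM f_revenue f JOIN d_product d ON f.prod_key = d.prod_key "
            "GROUP BY d.brand ORDER BY d.brand")
print(cur.fetchall())  # [('BizTech', 150.0), ('FunPod', 30.0)]
```

A snowflake simply adds one more join hop (dimension to sub-dimension) to the same pattern.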
Mixed schemes
3. Normalized Schemas:
Normalized schemas (OLTP) distribute data entities into multiple
tables to minimize data storage redundancy and optimize data updates
Fig17
4. Fully Denormalized Schemas:
This type of dimensional schema combines the facts and dimensions as columns in one
table.
Ex: flat files (Excel), prebuilt summarizations or OLAP cubes, star schema, snowflake
schema

I18. What are the schemas used in OBIEE? What is the difference between star and snowflake?

Try Interview Level – Expected Questions


Section: I.Introduction
3. BI ADMIN TOOL
I19. On a scale of 5, how do you rate yourself in the BI Admin Tool? [OR]
On a scale of 10, how do you rate yourself in OBIEE?
3.1. Physical Layer
The Physical layer defines the data sources to which the Oracle BI Server submits queries,
and the relationships between physical databases and other data sources that are used to
process multiple data source queries. The recommended way to populate the Physical layer
is by importing metadata from databases and other data sources. The data sources can be of
the same or different varieties. You can import schemas or portions of schemas from existing
data sources. Additionally, you can create objects in the Physical layer manually. When you
import metadata, many of the properties of the data sources are configured automatically
based on the information gathered during the import process. After import, you can also
define other attributes of the physical data sources, such as join relationships, that might not
exist in the data source metadata. There can be one or more data sources in the Physical
layer, including databases, flat files, XML documents, and so forth. Here we will import and
configure tables from the BISAMPLE schema of an Oracle Database.

3.1.1. Hands on Main Objectives


To build the Physical layer of a repository, you perform the following steps:
1. Creating a New Repository
2. Importing Metadata
3. Verifying Connection
4. Creating Aliases
5. Creating Physical Keys and Joins

Creating a New Repository


1. Select Start > Programs > Oracle Business Intelligence > BI Administration to open the Administration Tool.

2. Select File > New Repository

3. Select the Binary method

4.Enter the Name for the Repository i.e., BISAMPLE.


5. Leave the default location as is. It points to the default repository directory.

C:\OBIEE11g_HOME\instances\instance1\bifoundation\OracleBIServerComponent\coreapplication_obis1\repository
6. Leave Import Metadata set to Yes.
7. Enter and retype a password for the repository. BISAMPLE123 is the repository password.

8. Click Next.
Importing Metadata
1. Change the Connection Type to OCI 10g/11g. The screen displays connection fields based on the connection type you
selected.

2.Enter a data source name. In this example the data source name is orcl. This name is the same as the tnsnames.ora entry for
this Oracle database instance.

3.Enter user name and password for the data source. In this example the username and password
are both BISAMPLE.

4.Click Next.

5.Accept the default metadata types and click Next.

6.In the Data source view, expand the BISAMPLE schema.

7.Use Ctrl+Click to select the following tables from BISAMPLE schema:

SAMP_ADDRESSES_D
SAMP_CUSTOMERS_D
SAMP_PRODUCTS_D
SAMP_REVENUE_F
SAMP_TIME_DAY_D

8.Click the Import Selected button to add the tables to the Repository View.

9.The Connection Pool dialog box appears. Accept the defaults and click OK.

10.The Importing message appears.

11.When import is complete, expand BISAMPLE in the Repository View and verify that
the five tables are visible.

12.Click Finish to open the repository

13.Expand orcl > BISAMPLE and confirm that the five tables are imported into the
Physical layer of the repository

Verifying Connection
1. Select Tools > Update All Row Counts.

2.When update row counts completes, move the cursor over the tables and observe that row
count information is now visible, including when the row count was last updated.
3.Expand tables and observe that row count information is also visible for individual
columns.

4.Right-click a table and select View Data to view the data for the table.

5.Close the View Data dialog box when you are done. It is a good idea to update row counts
or view data after an import to verify connectivity. Viewing data or updating row count, if
successful, tells you that your connection is configured correctly.

Creating Aliases
1. It is recommended that you use table aliases frequently in the Physical layer, to eliminate
extraneous joins and to apply best-practice naming conventions for physical table names.
Right-click SAMP_TIME_DAY_D and select New Object > Alias to open the Physical
Table dialog box.

2. Enter D1 Time in the Name field

3.In the Description field, enter Time Dimension Alias at day grain. Stores one record for
each day.

4.Click the Columns tab. Note that alias tables inherit all column definitions from the source
table

5.Click OK to close the Physical Table dialog box.

6.Repeat the steps and create the following aliases for the remaining physical tables.

SAMP_ADDRESSES_D = D4 Address
SAMP_CUSTOMERS_D = D3 Customer
SAMP_PRODUCTS_D = D2 Product
SAMP_REVENUE_F = F1 Revenue

Creating Keys and Joins


1. Select the five alias tables in the Physical layer

2.Right-click one of the highlighted alias tables and select Physical Diagram > Selected
Object(s) Only to open the Physical Diagram. Alternatively, you can click the Physical
Diagram button on the toolbar.

3.Rearrange the alias table objects so they are all visible

4.You may want to adjust the objects in the Physical Diagram. If so, use the toolbar buttons
to zoom in, zoom out, fit the diagram, collapse or expand objects, select objects, and so forth:

5.Click the New Join button on the toolbar.

6.Click the F1 Revenue table and then the D1 Time table. The Physical Foreign Key dialog
box opens. It matters which table you click first. The join creates a one-to-many (1:N)
relationship that joins the key column in the first table to a foreign key column in the second
table.
7.Select the D1 Time.CALENDAR_DATE column, and then select F1
Revenue.BILL_DAY_DT to join the tables. Ensure that the Expression edit box (at the
bottom) contains the following expression:

"orcl".""."BISAMPLE"."D1 Time"."CALENDAR_DATE" = "orcl".""."BISAMPLE"."F1 Revenue"."BILL_DAY_DT"

8.Click OK to close the Physical Foreign Key dialog box. The join is visible in the Physical
Diagram.

9.Repeat the steps to create joins for the remaining tables. Use the following expressions as a
guide. Please notice that D4 Address joins to D3 Customer.

"orcl".""."BISAMPLE"."D2 Product"."PROD_KEY" = "orcl".""."BISAMPLE"."F1 Revenue"."PROD_KEY"

"orcl".""."BISAMPLE"."D3 Customer"."CUST_KEY" = "orcl".""."BISAMPLE"."F1 Revenue"."CUST_KEY"

"orcl".""."BISAMPLE"."D4 Address"."ADDRESS_KEY" = "orcl".""."BISAMPLE"."D3 Customer"."ADDRESS_KEY"

10.Click the Auto Layout button on the toolbar.

11.Your diagram should look similar to the screenshot:


12.Click the X in the upper right corner to close the Physical Diagram.

13.Select File > Save, or click the Save button on the toolbar, to save the repository.

14.Click No when prompted to check global consistency. Some of the more common checks
are done in the Business Model and Mapping layer and the Presentation layer. Since these
layers are not defined yet, bypass this check until the other layers in the repository are built.
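The join paths defined in this section can be sanity-checked outside the tool. The sketch below is illustrative Python/SQLite (not anything the Administration Tool produces): minimal stand-ins for the five alias tables, with invented columns and data, queried across the same four join paths.

```python
import sqlite3

# Illustrative sketch only: minimal stand-ins for the five alias tables, with
# invented sample data, joined on the same columns as the physical joins above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE d1_time     (calendar_date TEXT PRIMARY KEY);
CREATE TABLE d2_product  (prod_key INTEGER PRIMARY KEY);
CREATE TABLE d4_address  (address_key INTEGER PRIMARY KEY, city TEXT);
CREATE TABLE d3_customer (cust_key INTEGER PRIMARY KEY, address_key INTEGER);
CREATE TABLE f1_revenue  (prod_key INTEGER, cust_key INTEGER,
                          bill_day_dt TEXT, revenue REAL);
INSERT INTO d1_time     VALUES ('2010-01-01');
INSERT INTO d2_product  VALUES (10);
INSERT INTO d4_address  VALUES (7, 'San Mateo');
INSERT INTO d3_customer VALUES (5, 7);          -- D4 Address joins via D3 Customer
INSERT INTO f1_revenue  VALUES (10, 5, '2010-01-01', 99.0);
""")

# One row should survive all four join paths (time, product, customer, address).
cur.execute("""
SELECT a.city, SUM(f.revenue)
FROM f1_revenue f
JOIN d1_time     t ON t.calendar_date = f.bill_day_dt
JOIN d2_product  p ON p.prod_key      = f.prod_key
JOIN d3_customer c ON c.cust_key      = f.cust_key
JOIN d4_address  a ON a.address_key   = c.address_key
GROUP BY a.city
""")
print(cur.fetchall())  # [('San Mateo', 99.0)]
```

If a join column is wrong, the query silently returns no rows, which is the same symptom you would chase later in the BI Server query log.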
3.2. BMM Layer

1.The Business Model and Mapping layer (BMM) of the Administration Tool defines the
business, or logical model of the data and specifies the mappings between the business model
and the Physical layer schemas.

2.This layer is where the physical schemas are simplified to form the basis for the users’ view
of the data.

3.The Business Model and Mapping layer of the Administration Tool can contain one or
more business model objects.

4.A business model object contains the business model definitions and the mappings from
logical to physical tables for the business model.

5.The main purpose of the business model is to capture how users think about their business
using their own vocabulary.

6.The business model simplifies the physical schema and maps the users’ business
vocabulary to physical sources. Most of the vocabulary translates into logical columns in the
business model.

3.2.1. Hands on Main Objects

To build the Business Model and Mapping layer of a repository, you perform the following
steps:

 Creating a Business Model

 Examining Logical Joins

 Examining Logical Columns

 Examining Logical Table Sources

 Renaming Logical Objects Manually

 Renaming Logical Objects Using the Rename Wizard

 Deleting Unnecessary Logical Objects

 Creating Simple Measures


Creating a Business Model

1. Right-click the white space in the Business Model and Mapping layer and select
New Business Model to open the Business Model dialog box.

2.Enter Sample Sales in the Name field. Leave Disabled checked.

3.Click OK. The Sample Sales business model is added to the Business Model and Mapping
layer.

4. In the Physical layer, select the following four alias tables:

D1 Time
D2 Product
D3 Customer
F1 Revenue

Do not select D4 Address at this time.

5.Drag the four alias tables from the Physical layer to the Sample Sales business model in the
Business Model and Mapping layer. The tables are added to the Sample Sales business
model.

Note:

Notice that the three dimension tables have the same icon, whereas the F1 Revenue table
has an icon with a # sign, indicating it is a fact table.

Examining Logical Joins

1. Right-click the Sample Sales business model and select Business Model Diagram >
Whole Diagram to open the Business Model Diagram.

Note:

If necessary, rearrange the objects so that the join relationships are visible.
Because you dragged all tables simultaneously from the Physical layer onto the business
model, the logical keys and joins are created automatically in the business model. This is
because the keys and join relationships were already created in the Physical layer. However,
you typically do not drag all physical tables simultaneously, except in very simple models.
Later in this tutorial, you learn how to manually build logical keys and joins in the Business
Model and Mapping layer. The process is very similar to building joins in the Physical layer.
2. Double-click any one of the joins in the diagram to open the Logical Join dialog box. In this
example the join between D1 Time and F1 Revenue is selected.
Note:Notice that there is no join expression. Joins in the BMM layer are logical joins. Logical
joins express the cardinality relationships between logical tables and are a requirement for a
valid business model. Specifying the logical table joins is required so that Oracle BI Server
has necessary metadata to translate logical requests against the business model into SQL
queries against the physical data sources. Logical joins help Oracle BI Server understand the
relationships between the various pieces of the business model. When a query is sent to
Oracle BI Server, the server determines how to construct physical queries by examining how
the logical model is structured. Examining logical joins is an integral part of this process. The
Administration Tool considers a table to be a logical fact table if it is at the “many” end of all
logical joins that connect it to other logical tables.

3.Click OK to close the Logical Join dialog box.

4. Click the X to close the Business Model Diagram.

Examining Logical Columns

Expand the D1 Time logical table. Notice that logical columns were created
automatically for each table when you dragged the alias tables from the Physical layer
to the BMM layer.

Examining Logical Table Sources (LTS)

1. Expand the Sources folder for the D1 Time logical table. Notice there is a logical
table source, D1 Time. This logical table source maps to the D1 Time alias table in
the Physical layer.

2.Double-click the D1 Time logical table source (not the logical table) to open the Logical Table
Source dialog box.

3.On the General tab, rename the D1 Time logical table source to LTS1 Time. Notice that the logical
table to physical table mapping is defined in the "Map to these tables" section.

4.On the Column Mapping tab, notice that logical column to physical column mappings are defined.
If mappings are not visible, select Show mapped columns

5.You learn more about the Content and Parent-Child Settings tabs later in this tutorial when you
build logical dimension hierarchies. Click OK to close the Logical TableSource dialog box. If desired,
explore logical table sources for the remaining logical tables.

Renaming Logical Objects Manually

1. Expand the D1 Time logical table.


2. Click the first logical column, BEG_OF_MONTH_WID, to highlight it.
3. Click BEG_OF_MONTH_WID again to make it editable.
4. Rename BEG_OF_MONTH_WID to Beg of Mth Wid. This is the manual method for renaming
objects. You can also right-click an object and select Rename.

Renaming Objects Using the Rename Wizard

1. Select Tools > Utilities > Rename Wizard > Execute to open the Rename Wizard.

2.In the Select Objects screen, click Business Model and Mapping in the middle pane.
3.Expand the Sample Sales business model.

4.Expand the D1 Time logical table.

5.Use Shift+Click to select all of the logical columns except for the column you already
renamed, Beg of Mth Wid.

6.Click Add to add the columns to the right pane.

7.Repeat the steps for the three remaining logical tables so that all logical columns from the
Sample Sales business model are added to the right pane.

8.Click Next to move to the Select Types screen. Notice that Logical Column is selected.
If you had selected other object types, such as logical tables, those types would have
appeared here.

9.Click Next to open the Select Rules screen.

10.In the Select Rules screen, select All text lowercase and click Add to add the rule to the lower
pane.

11.Add the rule Change each occurrence of '_' into a space.

12.Add the rule First letter of each word capital.

13.Click Next to open the Finish screen. Verify that all logical columns will be named according to
the rename rules you selected.

14.Click Finish.

15. In the Business Model and Mapping layer, expand the logical tables and confirm that all
logical columns have been renamed as expected.

16.In the Physical layer, expand the alias tables and confirm that all physical columns have not been
renamed. The point here is you can change object names in the BMM layer without impacting object
names in the Physical layer. When logical objects are renamed, the relationships between logical
objects and physical objects are maintained by the logical column to physical column mappings.

Deleting Unnecessary Logical Objects

1. In the BMM layer, expand Sample Sales > F1 Revenue.

2.Use Ctrl+Click to select all F1 Revenue logical columns except for Revenue and Units.

3.Right-click any one of the highlighted logical columns and select Delete. Alternatively you
can select Edit > Delete or press the Delete key on your keyboard.
4.Click Yes to confirm the delete
5.Confirm that F1 Revenue contains only the Revenue and Units columns.
Creating Simple Measures

1. Double-click the Revenue logical column to open the Logical Column dialog box.


2. Click the Aggregation tab.
3. Change the default aggregation rule to Sum.
4. Click OK to close the Logical Column dialog box. Notice that the icon has changed
for the Revenue logical column indicating that an aggregation rule has been applied.
5. Repeat the steps to define the SUM aggregation rule for the Units logical column.

Note: Measures are typically data that is additive, such as total dollars or total
quantities. The F1 Revenue logical fact table contains the measures in your business
model. You aggregated two logical columns by summing the column data.

6. Save the repository without checking global consistency.


Congratulations! You have successfully built a business model in the Business Model
and Mapping layer of a repository and created business measures.

3.3. Presentation Layer


1.You have created the initial Sample Sales business model in the repository.
2. You now create the Presentation layer of the repository.

3.The Presentation layer exposes the business model objects in Oracle BI user interfaces so that
users can build analyses and dashboards to analyze their data.

3.3.1. Hands on Main Objects

To build the Presentation layer you perform the following steps:

 Creating a Subject Area


 Creating Presentation Tables
 Creating Presentation Columns
 Renaming Presentation Columns
 Reordering Presentation Columns

Creating a Subject Area

1. Right-click the white space in the Presentation layer and select New Subject Area to
open the Subject Area dialog box.

2.On the General tab, enter Sample Sales as the name of the subject area.

3.Click OK to close the Subject Area dialog box. The Sample Sales subject area is added to
the Presentation layer.

Creating Presentation Tables


1. Right-click the Sample Sales subject area and select New Presentation Table to open
the Presentation Table dialog box.

2.On the General tab, enter Time as the name of the presentation table.

3.Click OK to close the Presentation Table dialog box. The Time presentation table is added
to the Sample Sales subject area

4.Repeat the process and add three more presentation tables: Products, Customers, and
Base Facts.

Note:

Please note that you are using the manual method for creating Presentation layer objects.
For simple models it is also possible to drag objects from the BMM layer to the Presentation
layer to create the Presentation layer objects. When you create presentation objects by
dragging from the BMM layer, the business model becomes a subject area, the logical tables
become presentation tables, and the logical columns become presentation columns. Note
that all objects within a subject area must derive from a single business model.

Creating Presentation Columns

1. In the BMM layer, expand the D1 Time logical table


2. Use Ctrl+ Click to select the following logical columns:
Calendar Date
Per Name Half
Per Name Month
Per Name Qtr
Per Name Week
Per Name Year.
3. Drag the selected logical columns to the Time presentation table in the Presentation
layer.
4. Repeat the process and add the following logical columns to the remaining
presentation tables:
Products: Drag Brand, Lob, Prod Dsc, Type from D2 Product.
Customers: Drag Cust Key, Name from D3 Customer.
Base Facts: Drag Revenue, Units from F1 Revenue.

Renaming Presentation Columns

1. In the Presentation layer, expand the Products presentation table. Double-click the
Lob presentation column to open the Presentation Column dialog box.
2. On the General tab, notice that "Use Logical Column Name" is selected. When you
drag a logical column to a presentation table, the resulting presentation column
inherits the logical column name by default. In this example the Lob presentation
column inherits the name of the logical column "Sample Sales"."D2
Product"."Lob".

3.Deselect Use Logical Column Name. The Name field is now editable.
4.Enter Line of Business in the Name field.
5.Click OK to close the Presentation Column dialog box. Notice that the presentation column
name is now changed to Line of Business in the Presentation layer.
6.In the BMM layer, expand D2 Product. Notice that the Lob logical column name is not
changed. The point here is that you can change object names in the Presentation layer without
impacting object names in the BMM or Physical layers.
7.In the BMM layer, rename the Prod Dsc logical column to Product. Notice that the name
change is inherited by the corresponding presentation column

8.Make the following name changes to logical objects in the BMM layer so that the names of
the corresponding presentation columns are also changed:

For the D3 Customer logical table:

Change Cust Key to Customer Number.


Change Name to Customer Name

9.Confirm that the corresponding presentation column names are changed.

Reordering Presentation Columns

1. In the Presentation layer, double-click the Time presentation table to open the
Presentation Table dialog box.

2.Click the Columns tab.

3.Select columns and use the up and down arrows, or drag the columns, to rearrange the
presentation columns into the following order from top to bottom:

Per Name Year


Per Name Half
Per Name Qtr
Per Name Month
Per Name Week
Calendar Date

4.Click OK to close the Presentation Table dialog box and confirm that the presentation column
order is changed in the Presentation layer.

5.Repeat the steps to reorder the columns in the Products presentation table:

Brand
Line of Business
Type
Product

6.Save the repository without checking global consistency.

Congratulations! You have successfully built the Presentation layer of a repository.

Note:
Create another subject area, Sample Sales DM, and copy Time, Product, and Revenue from
the Sample Sales subject area and paste them into Sample Sales DM. Save the RPD.

3.4. Testing and Validating a RPD

1.You have finished building an initial business model and now need to test and validate the
repository before continuing.

2.You begin by checking the repository for errors using the consistency checking option.
3.Next you load the repository into Oracle BI Server memory.

4.You then test the repository by running an Oracle BI analysis and verifying the results.
5.Finally, you examine the query log file to observe the SQL generated by Oracle BI Server.

To test and validate a repository you perform the following steps:

 Checking Consistency
 Loading the Repository

 Disabling Cache

 Setting Up Query Log

 Creating and Running Analysis

 Checking the Query Log

Checking Consistency

1. Select File > Check Global Consistency

2.You should receive the message Business model "Sample Sales" is consistent. Do you
want to mark it as available for queries?

3.Click Yes. You should receive the message: Consistency check fixed certain object(s);
there are no errors, warnings or best practice violations left to report.

Note: If you do not receive this message, you must fix any consistency check errors or
warnings before proceeding.

4.Click OK. Notice that the Sample Sales business model icon in the BMM layer is now green,
indicating it is available for queries.

5. Save the repository without checking global consistency again.


6. Select File > Close to close the repository. Leave the Administration Tool open.
4. OFMW Enterprise Manager
4.1. Overview
Fusion Middleware Control is used to centrally manage, monitor, and configure Oracle
Business Intelligence system components. For example, the Oracle BI Server, Oracle BI
Presentation Services, and Oracle BI Scheduler.

Fusion Middleware Control enables you to manage system components by performing tasks
such as monitoring status, starting and stopping processes, scaling out, resolving issues, and
configuring components. You can also manage some aspects of Java components. For
example, you can monitor their status and start and stop them.

Locking mechanism
With large deployments, you might have multiple administrators accessing the system
concurrently to view the state of the system while other administrators might want to make
configuration changes. Fusion Middleware Control and Oracle WebLogic Server prevent
concurrent updates of the same configuration settings by multiple administrators by using a
locking mechanism that allows only one administrator to make changes at any one time.
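The one-writer-at-a-time behavior described above can be illustrated with a minimal sketch. This is only a conceptual analogy of a "Lock and Edit" style lock (it is not Oracle's actual implementation); the lock file name and location are made up for the example.

```python
import os
import tempfile

# Conceptual sketch of an exclusive configuration lock: the first
# administrator to create the lock file may edit; others are rejected
# until the lock is released. Illustration only, not Oracle's code.

LOCK_PATH = os.path.join(tempfile.gettempdir(), "config_edit.lock")

# Clean up any stale lock left over from a previous run.
if os.path.exists(LOCK_PATH):
    os.remove(LOCK_PATH)

def lock_and_edit(admin):
    """Try to take the exclusive configuration lock for one administrator."""
    try:
        # O_CREAT | O_EXCL fails atomically if the lock file already exists.
        fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.write(fd, admin.encode())
        os.close(fd)
        return True      # lock acquired; this admin may edit
    except FileExistsError:
        return False     # another admin already holds the lock

def activate_changes():
    """Release the lock once the changes are activated."""
    os.remove(LOCK_PATH)

print(lock_and_edit("weblogic"))   # first admin acquires the lock
print(lock_and_edit("admin2"))     # second admin is blocked
activate_changes()
print(lock_and_edit("admin2"))     # lock is free again
```

The key point the sketch mirrors is that acquisition is atomic: two administrators cannot both succeed in creating the same lock.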

4.2 Loading the Repository

1.Open a browser and enter the following URL to navigate to Oracle Enterprise Manager:

http://<machine name>:7001/em

Ex: http://msit:7001/em

Log in as an administrative user. Typically you use the administrative user name and
password provided during the Oracle BI installation. In this example the user name is
weblogic and password is weblogic123.

2. In the left navigation pane, expand Business Intelligence and click coreapplication.
3. In the right pane, click the Deployment tab.
4. Click the Repository subtab.
5. Click Lock and Edit Configuration.
6. Click Close when you receive the confirmation message "Lock and Edit Configuration -
Completed Successfully."
7. In the "Upload BI Server Repository" section, click Browse to open the Choose file dialog
box.
8. By default, the Choose file dialog box should open to the repository directory. If not,
navigate to the repository directory containing the BISAMPLE repository, for example:
D:\OBIEE11g_Home\instances\instance1\bifoundation\OracleBIServerComponent\corea
pplication_obis1\repository.
9. Select the BISAMPLE.rpd file and click Open.
10. Enter BISAMPLE123 as the repository password and confirm the password.
Loading the Catalog
11.In the "BI Presentation Catalog" section, click in the catalog location field and move the
cursor to the end of the path.
12.Remove SampleAppLite and enter MSITCAT as the name.

13.Click Apply

14.In the BI Server Repository section, confirm that the Default RPD is now BISAMPLE with an
extension. In this example the file name is BISAMPLE_BIXXXX.

15.Click Activate Changes.

16.Allow Active Changes processing to complete. Click Close when you receive the confirmation
message Activate Changes - Completed Successfully.

17.On the Availability > Processes page, select Restart all.

18.Click Yes when you receive the message “Are you sure you want to restart all components?”

19. Allow the “Restart All – In Progress” processing to complete. This may take a few
moments.

20.Click Close when you receive the confirmation message "Restart All – Completed
Successfully".

21.Confirm that all components are running. The BISAMPLE repository is now loaded into
the BI Server.

22.Leave Oracle Enterprise Manager open.

23.Go to the catalog folder using the below path and check that the catalog folder MSITCAT is created.

Path:

Ex:C:\OBIEE11g_HOME\instances\instance1\bifoundation\OracleBIPresentationServicesComponent
\coreapplication_obips1\catalog

4.3 Disabling Cache

1. In the right pane of Oracle Enterprise Manager, click the Capacity Management
tab.
2. Click the Performance subtab.
3. Locate the Enable BI Server Cache section. Cache is enabled by default.
4. Click Lock and Edit Configuration
5. Click Close when you receive the confirmation message "Lock and Edit Configuration
- Completed Successfully."
6. Deselect Cache enabled. Caching is typically not used during development. Disabling
cache improves query performance.
7. Click Apply.
8. Click Activate Changes.
9. Click Close when you receive the confirmation message Activate Changes -
Completed Successfully.
4.4 Setting Up Query Log

1. Return to the Administration Tool, which should still be open.


2. Select File > Open > Online to open the repository in online mode. You use online
mode to view and modify a repository while it is loaded into the Oracle BI Server.
The Oracle BI Server must be running to open a repository in online mode.
3. Enter BISAMPLE123 as the repository password and enter your administrative user
name and password.
4. Click Open to open the repository in online mode
5. Select Manage > Identity to open Identity Manager.
6. In the left pane, select BI Repository.
7. Select Action > Set Online User Filter.
8. Enter an asterisk and click OK to fetch users from the identity store.
9. In the right pane, double-click your administrative user to open the User dialog box. In this
example the administrative user is weblogic
10. In the User dialog box, on the User tab, set Logging level to 2.
11. Click OK to open the Check Out Objects dialog box.
12. In the Check Out Objects dialog box, click Check Out. When you are working in a
repository open in online mode, you are prompted to check out objects when you
attempt to perform various operations.
13. Select Action > Close to close Identity Manager.
14. Select File > Check In Changes. Alternatively, you can click the Check In Changes
icon on the toolbar.
15. Save the repository. There is no need to check consistency.
16. Select File > Copy As to save a copy of the online repository with the security changes
17. In the Save Copy As dialog box, save the file as BISAMPLE.rpd, replacing the existing
BISAMPLE repository.
18. Click Yes when asked if you want to replace the existing BISAMPLE repository. This will
create a new BISAMPLE repository with query logging set for the weblogic user.
19. Select File > Close to close the repository.
20. Click OK when you receive the following message:
"In order for your online changes to take effect, you will have to manually restart each
non-master Oracle BI Server instance in the cluster."
21. Leave the Administration Tool open.
22. Open Enterprise Manager and restart all components.
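With the logging level set, the BI Server writes each user query to its query log (nqquery.log, typically under the instance's diagnostics/logs directory in an 11g install). As a rough sketch, a script to pull the logical SQL out of "SQL Request" sections might look like the following; the marker format in SAMPLE_LOG is an assumption modeled on a typical log, so verify it against your own nqquery.log before relying on it.

```python
import re

# Sketch: extract logical SQL from an nqquery.log-style query log.
# The "SQL Request" marker format below is assumed, not guaranteed.

SAMPLE_LOG = """\
-------------------- SQL Request:
SELECT "Customers"."Customer Name", "Base Facts"."Revenue"
FROM "Sample Sales"
-------------------- General Query Info:
Repository: Star, Subject Area: Sample Sales
"""

def extract_sql_requests(log_text):
    """Return the text of each section that follows a 'SQL Request:' marker."""
    pattern = r"-{5,}\s*SQL Request:\n(.*?)(?=\n-{5,}|\Z)"
    return [m.strip() for m in re.findall(pattern, log_text, flags=re.DOTALL)]

for sql in extract_sql_requests(SAMPLE_LOG):
    print(sql)
```

On a real system you would read the log file from disk instead of the embedded sample string.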

5. BI Analytics
5.1. Accessing Analytics for First Time
The Oracle BI Analysis Editor is a set of graphical tools used to build, view, and
modify Oracle BI analyses. Analyses are queries against an organization's data. The
editor lets you include the views that you create in an analysis for display in dashboards.

Logging In

1.Open a web browser (Mozilla Firefox, Chrome, or Internet Explorer) and enter one of the addresses below.

a. In a browser window, enter


http://localhost:9704/analytics for Enterprise Installation

http://localhost:7001/analytics for Simple Installation

Ex:http://msit:9704/analytics

b. The Oracle Business Intelligence Sign In page is displayed. Enter your User ID and
Password and click Sign In.

2.When you sign in, the Home page is displayed.

The Home page is a task-oriented, centralized workspace combined with a global
header, allowing access to Oracle BI EE objects, their respective editors, help
documentation, and so on.

3.Home Page contains global header, Create New section, Catalog Management
section, Get Started section with links to additional help and BI tools, Recent section
displaying the recently viewed or created analysis or dashboards, and Most Popular
section. You can always operate these features from the global header as well.

5.2. Creating or Developing First Analysis

Creating an Analysis and Using the Analysis Editor

1.To build an analysis, do the following:

From the home page, click New > Analysis.

2.The Select Subject Area pop-up appears.

3.In the Select Subject Area pop-up, select Sample Sales. The Analysis Editor is
displayed.

Note:

1.The Analysis Editor is composed of tabs and panes, as shown in the screen shot,
representing the subject area (columns), available catalog objects, selected columns
for the analysis, and filters (which limit the selected data).

2.A subject area contains folders, measure columns, attribute columns, hierarchical
columns, and hierarchy levels that represent information about the areas of an
organization's business.

3.In this example:


o The selected subject area is Sample Sales
o The four tabs - Criteria, Results, Prompts and Advanced are displayed at the
top of the Editor
o Selected columns pane is empty as you are yet to choose the columns
o Filters is empty as well waiting for the column selections and further criteria

a. Select the following columns for your analysis.

Folder Columns
Customers Customer Name
Products Product
Base Facts Revenue

b. While selecting the columns, click the plus sign to expand the folders and double-
click the required column names to get them into the Selected Columns section. In this
example, expand the Customers folder, and then double-click Customer Name to get
it into the Selected Columns section.

Note: In the Selected Columns section, you can reorder the columns in your analysis
by clicking and dragging them.

4.Click the Results tab. The default Compound Layout is displayed.

Note1:The Compound Layout is a composition of many views. By default, both a


Title and Table view are defined for you when using attribute and measure columns.
A Pivot Table view is automatically created when using hierarchical columns in your
analysis.

Note2: In the Compound Layout, you can create different views of the analysis results
such as graphs, tickers, and pivot tables. These are covered in this tutorial going
forward.

Filtering, Sorting, and Saving your Analysis

Filtering An Analysis:

1.Perform the following steps to filter, sort and save the previously created analysis.

2.Click the Criteria tabbed page. Select the Customer Name column to create a filter.
You can create a filter by hovering over the specific column's toolbar and selecting the
More drop-down menu.

3. In the More drop-down menu, select Filter.


The column selected for this example is Customer Name.

4.The New Filter dialog box is displayed. Accept the default value for the operator,
that is, is equal to / is in, and enter a column value (or a range of column values) for
this condition. To do this, click the drop-down list for Value, and select the desired
check boxes.

5.Click OK.

The Filters pane displays the newly created filter.

6.Click OK.

Saving an Analysis:

7.Click the Save icon to save your analysis.

8.Navigate to My Folders and click the New Folder icon. The New Folder dialog box
appears.

9.Name the folder Revenue Details and click OK.

10.Name the analysis Revenue by Customer and click OK.

11.The analysis is saved to the catalog folder Revenue Details

Sorting An Analysis:
Go to the Home page, and in the Recent area, click the Edit link for the Revenue by
Customer analysis.

Now you will add a sort to this analysis.

1. On the Criteria tabbed page, click the More Options icon for Revenue column.

2. Select Sort > Sort Descending.

Note:

Observe that a sort icon is added to Revenue. The order of the sort is indicated by an
arrow; in this case, the arrow points down, indicating that it is descending.
Additionally, if multiple sorts are added, a subscript number will also appear,
indicating the sequence for the sort order.

3. Save your analysis again.

Click the Results tabbed page to verify the filter and sort are being applied to your
analysis. The Compound Layout displays the filtered and sorted analysis.
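Conceptually, the filter and sort you just defined behave like a membership test followed by a descending sort over the result rows. The Python below is only an analogy with made-up sample data; the BI Server actually translates the criteria into SQL.

```python
# Analogy of the analysis criteria: filter Customer Name with an
# "is equal to / is in" condition, then sort Revenue descending.
# Illustration only; not what the BI Server literally executes.

rows = [
    {"Customer Name": "Anna", "Product": "TV",    "Revenue": 500.0},
    {"Customer Name": "Ben",  "Product": "Audio", "Revenue": 900.0},
    {"Customer Name": "Anna", "Product": "Audio", "Revenue": 250.0},
]

selected_values = {"Anna"}   # the check boxes ticked in the filter dialog

# "is equal to / is in" = keep rows whose value is in the selected set.
filtered = [r for r in rows if r["Customer Name"] in selected_values]

# Sort Descending on the Revenue column.
result = sorted(filtered, key=lambda r: r["Revenue"], reverse=True)

for r in result:
    print(r["Customer Name"], r["Revenue"])
```
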
5.3. Enhancing an Analysis by Adding Views

1.Create a new analysis by using new columns. Click New > Analysis on the global
header. Use Sample Sales Subject Area.

2.Add Per Name Year from Time, Product from Products, and Revenue from Base
Facts to Selected Columns.

3.Save the analysis to your Revenue Details folder, with Revenue by Year as the
analysis name.

4.You will now add a graph to this analysis.

Click the Results tabbed page, and click the New View icon.

5.Select Graph > Bar > Default (Vertical) from the menus.

The default Graph view appears below the Table view.

6.Click the Remove View from Compound Layout icon for both Title and Table
views.

7.Both views are removed from the Compound Layout. Note however, that they are
still available for use from the Views pane.

8.Save the analysis.

6. BI Interactive Dashboards

6.1. Exploring and Editing My Dashboard

To open My Dashboard, perform the following steps:

1.Open a web browser (Mozilla Firefox, Chrome, or Internet Explorer).

In a browser window, enter http://localhost:9704/analytics.

2.Click the Dashboards link on the global header and then click My Dashboard.

An empty My Dashboard page appears.

Note:

When you open a dashboard, including My Dashboard, the content appears in one or
more dashboard tabbed pages. Pages contain the columns and sections that hold the
content of a dashboard, and every dashboard has at least one page. Multiple pages are
used to organize content.
This example shows an empty My Dashboard page with no content. Hover over the
Edit icon to edit the dashboard and add content.

3.Click the Edit icon to add content to your empty dashboard page.

The Dashboard Builder appears and automatically creates page 1 of your dashboard
(the first tabbed page).

4.Using the Dashboard Builder, you can add pages and objects to a dashboard and
control the page layout. The Dashboard Builder is composed of the following:

o Dashboard Toolbar: The toolbar allows you to perform tasks such as adding
or deleting pages, previewing, saving, and so on.
o Dashboard Objects pane: Items that are used only in a dashboard. Examples
of dashboard objects are sections to hold content, action links, and embedded
content that is displayed in a frame on a dashboard
o Catalog pane: Items that you or someone else has saved to the Catalog, for
example, analyses, prompts, and so on. In a dashboard, the results of an
analysis can be shown in various views, such as a table, graph, and gauge.
(The results of an analysis are the output that is returned from the Oracle BI
Server that matches the analysis criteria.) Users can examine and analyze
results, save or print them, or download them to a spreadsheet.
o Page Layout pane: This is a workspace that holds your objects, which are
displayed on the dashboard.

5.In the Dashboard Toolbar, the Tools toolbar button provides options to set
dashboard properties, set page report links, and so on.

6.As mentioned above, the Dashboard Objects pane provides you with a list of objects
to add as content to a dashboard page. You will have to drag the object to the Page
Layout pane on the right.

7.Drag the Column object onto the Page Layout pane. The Column object appears on
the Page Layout pane.

8.In the Catalog pane, navigate to the folder where you saved your analysis.

9.Drag the Revenue by Year analysis to Column 1.

Revenue by Year appears in the column.

Note:

Observe that a Section is automatically created for you. You can also drag an
analysis directly onto an empty Layout Pane without first creating a column. The
Dashboard Builder automatically creates the column for you. You can then add
sections automatically to that column by dragging analyses below the existing
sections.
10.Click the Save icon to save the dashboard page and then click the Run icon.

My Dashboard appears with the selected analysis Revenue by Year.

Note:

1.Before creating a dashboard, click Catalog, click My Folders, click the Expand
option under the Revenue Details folder, and then click Edit under the Revenue by
Customer analysis.

2.Then click Save As. In the Save As dialog box, select Shared Folders in the left
pane, create a new folder named Revenue Analysis, and click OK.

6.2. Building or Developing First Dashboard

1.Click the New > Dashboard in the global header.

The New Dashboard dialog box appears.

2.Enter Revenue Summary in the name text box. Notice that you can also enter a
description of your choice.

3.Click on Location, select Browse Catalog, and select Shared Folders.

4. Create a new folder Revenue Details

This Folder will appear under Dashboards Link

5. Accept the default to Add content now.

6.Click OK. The Dashboard Builder appears.

7.Navigate to the Revenue By Customer analysis and drag it from the Catalog to the
Page Layout pane.

8. Save and run the dashboard. The Revenue Summary dashboard appears.

Note:

Now you can access this dashboard from Dashboards link on global header.

6.3. Catalog or BI Presentation Catalog


The Oracle BI Presentation Catalog stores the content that users create in a directory
structure of individual files. This content includes folders, shortcuts, Oracle BI EE objects
(such as analyses, filters, prompts, and dashboards), and Oracle BI Publisher objects (such
as reports and templates).
Each object is composed of two files:

 Own file: For example, an analysis called Analysis1 would be stored in a file
named Analysis1.
 Attributes file: Each object has a corresponding attribute file. For example, the analysis
Analysis1 would have a corresponding attribute file named Analysis1.atr. The attribute
file contains the object's full name, access control list (ACL), description, and so on. To
access an object in the catalog, users must have appropriate ACL entries for that object.
All objects in the catalog use ACL entries.

Lock Files :

To guarantee that only one user can write to a file at one time, a lock file is created when an
object is being written to. On rare occasions (for example, after a power outage), temporary
lock files in the catalog might not be removed completely. If Presentation Services reports
such a lock file, you must delete it manually.

The web catalog is located at:

$ORACLE_INSTANCE\bifoundation\OracleBIPresentationServicesComponent\core
application_obips1\catalog\XXXXX(catalogname)
EX:
C:\OBIEE11g_HOME\instances\instance1\bifoundation\OracleBIPresentationServ
icesComponent\coreapplication_obips1\catalog\MSITCAT
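The object/.atr pairing described above can be sanity-checked with a short script. This is a sketch only: it uses a temporary directory in place of a real catalog root (such as the MSITCAT path above), and simply reports object files that lack a matching .atr attribute file.

```python
import os
import tempfile

# Sketch: verify that every catalog object file has its .atr attribute
# file, per the two-file layout described above. A temporary directory
# stands in for the real catalog root (e.g. ...\catalog\MSITCAT).

def unpaired_objects(root):
    """Return object files that lack a matching .atr attribute file."""
    missing = []
    for dirpath, _dirs, files in os.walk(root):
        names = set(files)
        for f in files:
            if not f.endswith(".atr") and f + ".atr" not in names:
                missing.append(os.path.join(dirpath, f))
    return missing

demo = tempfile.mkdtemp()
open(os.path.join(demo, "Analysis1"), "w").close()        # object file
open(os.path.join(demo, "Analysis1.atr"), "w").close()    # its attribute file
open(os.path.join(demo, "Analysis2"), "w").close()        # object missing .atr

bad = unpaired_objects(demo)
print(bad)
```

Pointing `unpaired_objects` at an actual catalog directory would flag objects whose attribute file has gone missing.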

Directory Structure:

When opening a web catalog, you will always find the following folders:

 The shared folder— Contains content that is shared among catalog users.
Dashboards that are shared across users are saved in a Dashboards subfolder
under a common subfolder under the /Shared Folders folder.
 The system folder — Contains administrative elements of Presentation
Services. Some of these elements are distributed with the product, and
others are set up by the administrator, such as privileges.
 The users folder — Contains content that catalog users with the appropriate
permissions have saved to their personal folders, such as individual analyses.
Includes a Subject Area Contents folder where you save objects such as
calculated items and groups.

Use of BI Presentation Catalog:


The BI Presentation Catalog is managed by BI Presentation Services; through it, the
following things can be managed:

 Permissions on objects to respective users, groups and roles


 Access to objects globally
 Scheduling Analysis
 Rename objects
 Copy, save as and deleting objects
 Creating shortcuts

Techno Level 2: Experienced Level


7. OBIEE Architecture
7.1. Physical Architecture
Fig12
7.2. Logical Architecture
Fig12
8. Oracle Weblogic Server (OWLS) Administration Console
8.1. Overview
Oracle WebLogic Server is a scalable, enterprise-ready Java Platform, Enterprise Edition (Java EE) application
server. The Administration Console is a Web browser-based, graphical user interface that you
use to manage a WebLogic Server domain. A domain includes one or more WebLogic Servers and may also
include WebLogic Server clusters. Clusters are groups of WebLogic Server instances that work together to
provide scalability and high availability for applications. You deploy and manage your applications as part of a
domain.

8.2. Creating a user


1. Open the web browser and enter the below url
http://<hostname>:port/console
Ex:http://<msittest>:7001/console
And press enter
Fig18
2. Enter the user name and password and click Login; the Home page is displayed.
Fig19
3. In the left pane of the Home page, under Domain Structure, click Security
Realms.
Fig20
4. In the Summary of Security Realms window, click myrealm.
Fig21
5. In the Settings for myrealm window, click the Users and Groups tab.
Fig22
6. Under Users and Groups tab, click on Users tab and click New button.
Fig23
Note:
Under the Users tab, some users are already displayed in the Users table; they are
system default users such as weblogic and BISystemUser.
7. The Create a New User window is displayed. Enter the name (case sensitive),
description (optional), and password (minimum 8 characters).
Click OK.
Fig 24
8. A "user created" message is displayed and the user appears in the Users
table.
Fig25
Note:
There is no need to restart any BI server or services after creating a new user.
9. To test the new user, open Analytics and log in using the new user name and
password.
Fig26
8.3. Creating a group
Follow steps 1 to 5 of the Creating a New User topic, then:
6. Under Users and Groups tab, click on Groups tab and click New button.
Fig27
Note:
Under the Groups tab, some groups are already displayed in the Groups table;
they are system default groups such as BIAdministrators, BIAuthors, and Operators.
7. The Create a New Group window is displayed. Enter the name (case sensitive)
and description (optional).
Click OK.
Fig 28
8. A "group created" message is displayed and the group appears in the
Groups table.
Fig29
Note:
There is no need to restart any BI server or services after creating a new
group.
8.3.1. Assigning a user or users to a group or groups
Follow steps 1 to 5 of the Creating a New User topic, then:
1. Under the Users and Groups tab, click the Users tab.
2. In the Users table, click the user (Ex: msit).
Fig30
3. The Settings for msit window opens; click the Groups tab in the window.
Fig31
4. Choose the groups to be added to the user and click Save.
Fig32
5. You will then get a "successfully updated" message.
Fig33
Note:
You can assign same group to any number of users.
8.4. Creating an application role in OFMW EM
1. Open Oracle Fusion Middleware Enterprise Manager.
2. In the left pane, click Business Intelligence  coreapplication.
3. In the right pane, at the top under coreapplication, click Business Intelligence
Instance  Security  Application Roles.

Or

In the right pane, click the Security tab and, at the bottom of the tab under the Application Policies
and Roles title, click Configure and Manage Application Roles.

Fig 34

4. The Application Roles window opens; click the Create button.

Fig35

5. Enter the role name and display name, and click the Add button to add groups, users, or other roles.

Fig36

6. A window is displayed; select the type as Group and the principal name as msit, and click the
Search button.

Fig37

7. The group name (or any name starting with 'm') will appear. Select your group and click OK.

Fig 38
Click OK at the top right corner.

8. A "new role created" message is displayed and your application role appears in the
Application Roles list.

Fig39

8.4.1. Assigning a group or groups to an application role or roles


Follow all the steps above, except at step 4 select any existing role and click
the Edit button.
Note:
Using the Create Like option, you can create an application role identical to an existing role.

8.5. Deleting users, groupsand roles


Deleting a role:
Open EM, click coreapplication  Security tab  Configure and Manage
Application Roles (bottom), select a role, click the Delete button, and click OK.
Fig40
Deleting a group:
Open the WLS Console  Security Realms (left pane)  myrealm  Users and Groups
tab  Groups tab  under the Groups table select the group check box  Delete  OK.
Fig41
Deleting a user:
Open the WLS Console  Security Realms (left pane)  myrealm  Users and Groups
tab  Users tab  under the Users table select the user check box  Delete  OK.
Fig42
9. BI ADMIN TOOL
9.1. Adding Multiple Sources
1.Open Oracle BI Administration tool,
Start > Programs > Oracle Business Intelligence > BI Administration.

2. File openonlineRPD Password and username and password to open BISAMPLE RPD

Which is loading into BI server memory in basic level.

3.In the BMM layer, expand Sample Sales> D3 Customer > Sources. Notice that
the D3 Customer logical table has one logical table source named D3 Customer.

4. Rename the D3 Customer logical table source (not the logical table) to LTS1 Customer.

5. Double-click LTS1 Customer to open the Logical Table Source dialog box.

6. Click the Column Mapping tab and notice that all logical columns map to physical
columns in the same physical table: D3 Customer. It may be necessary to scroll to the right
to see the Physical Table column. Make sure "Show mapped columns" is selected.

7.Click OK to close the Logical Table Source dialog box.


8. In the Physical layer, expand orcl > BISAMPLE.

9. Drag D4 Address from the Physical layer to the D3 Customer logical table in the BMM
layer. Notice this creates a new logical table source named D4 Address for the D3 Customer
logical table. It also creates new logical columns that map to the D4 Address physical table.

Note:

1. If data is not duplicated, the physical table can be added to the LTS or the LT (as a best
practice, add it to the LTS).
2. If data is duplicated, the physical table must be added to the LT.
3. To add a physical table to an LTS, the physical table being added and the existing
physical table of the LTS must have a direct join.

10. In the BMM layer, double-click the new D4 Address logical table source to open the
Logical Table Source dialog box.

11. On the General tab, enter LTS2 Customer Address in the Name field.

12. Click the Column Mapping tab and notice that all logical columns map to physical
columns in the same physical table: D4 Address. If necessary, select Show mapped
columns and deselect Show unmapped columns.

13. Click OK to close the Logical Table Source dialog box.

14. Confirm that the D3 Customer logical table now has two logical table sources: LTS1
Customer and LTS2 Customer Address. A single logical table now maps to two physical
sources.

15. Right-click the new ADDRESS_KEY column and select Delete. This is a duplicate
column and is not needed.

16. Click Yes to confirm the delete.

17.Use the Rename Wizard or a manual renaming technique to rename the new address
logical columns (which appear in uppercase letters) in D3 Customer.

18. Rename the remaining logical table sources according to the following table. Recall that
logical table sources are located in the Sources folder for a logical table. For example: D2
Product > Sources.

19. In the Presentation layer, right-click the Sample Sales subject area and select New
Presentation Table to open the Presentation Table dialog box.

20. On the General tab, enter Customer Regions in the Name field.

21. Click OK to close the Presentation Table dialog box. Confirm that the Customer
Regions presentation table is added to the Sample Sales subject area in the Presentation
layer.
22. In the BMM layer, expand Sample Sales > D3 Customer.

23. Drag the following logical columns from D3 Customer to Customer Regions in the
Presentation layer:

Address 1
Address 2
Area
City
Country Name
Estab Name
Postal Code
Region
State Province
State Province Abbrv

Your column names may be slightly different depending on how you renamed them.

24.Reorder the Customer Regions presentation columns in the following order, from top to
bottom:

Region
Area
Country Name
State Province
State Province Abbrv
City
Postal Code
Address 1
Address 2
Estab Name

25. Double-click the Sample Sales subject area in the Presentation layer to open the Subject
Area dialog box.

26. Click the Presentation Tables tab.

27. Reorder the presentation tables so that Customer Regions appears after Customers.

28. Click OK to close the Subject Area dialog box. Confirm that the presentation tables
appear in the expected order.

You now have two presentation tables, Customers and Customer Regions, mapped to the
same logical table, D3 Customer. The D3 Customer logical table is mapped to two physical
sources: D3 Customer and D4 Address.

29.Save the repository and check global consistency when prompted. You should receive a
message that there are no errors, warnings, or best practice violations to report.

30.Click OK to close the consistency check message.


31.Open BI Analytics. In the left navigation pane, under Create... Analysis and Interactive
Reporting, select Analysis.

32. Select the Sample Sales subject area.

33. Select “Reload Server Metadata”.

34. In the left navigation pane, expand the folders and confirm that the Customer Regions
folder and corresponding columns appear.

35. Create the following analysis by double-clicking column names in the Subject Areas
pane:

Customer Regions.Region
Customers.Customer Name
Products.Type
Base Facts.Revenue

36. Click Results to view the analysis results. Use the Get more rows button at the bottom of
the results screen to see more rows.

37. Click Administration (at the top right corner).

38. Click Leave Page when prompted with the message: “Are you sure?”

39. On the Administration page, under Session Management, select Manage Sessions.

40. In the Cursor Cache section, locate your query and select View Log.

41.Locate the SQL Request section. This section contains the logical SQL issued by the
query.

42. Click the browser back button to return to the Manage Session page.

43. Click the browser back button to return to the Administration screen.

44. Click Home to return to the Home page.

9.2. Calculations
1. In general, if the calculations are complex and large, they are done during the ETL stage.
2. If the calculations are reusable and reliable, they are done in the RPD BMM layer.
3. If the calculations are specific, personal, and temporary, they are done at the report level.
4. In the OBIEE BMM layer, we can create calculations in 3 ways:
i. Creating calculations based on logical columns.
ii. Creating calculations based on physical columns.
iii. Creating calculations by using the Calculation Wizard.

9.2.1. Calculations based on logical columns

1. In the BMM layer, expand Sample Sales > F1 Revenue.

2. Right-click F1 Revenue and select New Object > Logical Column to open the Logical
Column dialog box.

3. On the General tab, enter Actual Unit Price in the Name field.

4. Click the Column Source tab.

5. Select Derived from existing columns using an expression.

6. Click the Edit Expression button to open Expression Builder.

7. In the left pane select Logical Tables > F1 Revenue > Revenue.

8. Click the Insert selected item button to move the Revenue column to the right pane.

9. Click the division operator to add it to the expression.

10. In the left pane select Logical Tables > F1 Revenue and then double-click Units to add
it to the expression.

11.Click OK to close Expression Builder. Notice that the formula is added to the Logical
Column dialog box.

12. Click OK to close the Logical Column dialog box. The Actual Unit Price calculated
measure is added to the business model.

13. Drag Actual Unit Price from the BMM layer to the Base Facts presentation table in the
Presentation layer.

14.Save the repository and check consistency. Fix any errors or warnings before proceeding.
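A point worth noting before the next section: a measure derived from logical columns (as above) is computed after aggregation, whereas a measure derived from physical columns (Section 9.2.2) is computed per row before its own aggregation rule is applied. The difference can be seen in this minimal Python sketch; the numbers are made up for illustration and this is not OBIEE code:

```python
# Sketch (illustrative numbers, not SampleApp data): why the two
# calculation styles can return different results for a unit price.
rows = [
    {"revenue": 100.0, "units": 10},   # per-row unit price 10.0
    {"revenue": 300.0, "units": 10},   # per-row unit price 30.0
]

# Logical-column style (Section 9.2.1): the division happens AFTER the
# SUM aggregation of each operand -> SUM(Revenue) / SUM(Units).
logical_style = sum(r["revenue"] for r in rows) / sum(r["units"] for r in rows)

# Physical-column style (Section 9.2.2): the division happens per row,
# then the SUM aggregation rule is applied -> SUM(Revenue / Units).
physical_style = sum(r["revenue"] / r["units"] for r in rows)

print(logical_style)   # 400 / 20 = 20.0
print(physical_style)  # 10.0 + 30.0 = 40.0
```

For a unit price, the logical-column form yields a units-weighted average, which is usually what the business expects.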

9.2.2. Calculations based on Physical columns


1. In the BMM layer, expand Sample Sales > F1 Revenue > Sources and double-click LTS1 Revenue.

Fig43

2. Go to the Column Mapping tab, then click the Add New Column icon to open the Logical Column dialog box.
Fig44

3. In the Logical Column dialog box, enter Actual Units Price in the Name field and click OK.

4. Check Show unmapped columns. The new column is displayed below; select it and click the Expression Builder button.

Fig45

5. In the Expression Builder left pane, select Physical Tables at the top and, at the bottom, select the REVENUE column.

6. Click the division operator and then select the UNITS column as the denominator in the expression.

Fig46

7. Click OK to close Expression Builder, and click OK to close the Logical Column dialog box.

8. Set aggregation for the new column: double-click the new column > Aggregation tab > select the SUM aggregation rule, and click OK.

9. Drag Actual Units Price from the BMM layer to the Base Facts presentation table in the Presentation layer.

Fig48

10. Save the repository and check consistency. Fix any errors or warnings before proceeding.

9.2.3. Calculations using the Calculation Wizard

1. In the Physical layer, expand BISAMPLE > F1 Revenue and select the columns COST_FIXED and COST_VARIABLE.
Fig49
2. Drag these columns to F1 Revenue in the BMM layer and rename them Fixed Cost and Variable Cost.
Fig50
3. Double-click the Fixed Cost column and set the SUM aggregation rule on the Aggregation tab; do the same for the Variable Cost column.
4. Right-click the Variable Cost column and select Calculation Wizard > click Next > check the Fixed Cost column > click Next.
Fig51 and 52
5. Maximize the Calculation Wizard box. In the Generate Calculation section, check only the Change option and uncheck the other options.
Fig53
6. Under Calculation Name, enter Change In Costs, click Next, click Next again, and then click Finish.
7. Drag Change In Costs, Variable Cost, and Fixed Cost from the BMM layer to the Base Facts presentation table in the Presentation layer.

Fig54

8. Save the repository and check consistency. Fix any errors or warnings before proceeding.

9. Open BI Analytics and test your work.

Develop a report using the Per Name Year, Type, Revenue, Units, and Actual Unit Price columns.

Develop a report using the Per Name Year, Type, Variable Cost, Fixed Cost, and Change In Costs columns.

Develop a report using the Product, Revenue, Revenue Rank, Units, and Actual Unit Price columns.

9.3. Dimensional Hierarchies


In OBIEE, there are two types of logical dimensions:

1. Dimensions with level-based hierarchies (structure hierarchies)
i. Ragged and skipped hierarchies
ii. Time dimension hierarchies

2. Dimensions with parent-child hierarchies (value hierarchies)

9.3.1. Level based Hierarchy


In level-based hierarchies, members can be of different types, and members of the same type occur only at a single level. A level-based hierarchy has two mandatory levels: the Grand Total level and the Detail level.

Creating a Logical Dimension for Product

1. In the BMM layer, right-click the Sample Sales business model and select New
Object > Logical Dimension > Dimension with Level-Based Hierarchy to open
the Logical Dimension dialog box.
2. Name the logical dimension H2 Product.
3. Click OK. The logical dimension is added to the Sample Sales business model.

Creating Logical Levels

1. Right-click H2 Product and select New Object > Logical Level.


2. Name the logical level as Product Total.
3. Because this level represents the grand total for products, select the Grand total
level check box. Note that when you do this, the Supports rollup to higher level of
aggregation field is grayed out and protected.
4. Click OK to close the Logical Level dialog box. The Product Total level is added
to the H2 Product logical dimension.
5. Right-click Product Total and select New Object > Child Level to open the
Logical Level dialog box.
6. Name the logical level Product Brand.
7. Click OK to close the Logical Level dialog box. The Product Brand level is added
to the logical dimension.
8. Repeat the steps to add the following child levels:

Product LOB as a child of Product Brand


Product Type as a child of Product LOB
Product Detail as a child of Product Type

Associating Logical Columns with Logical Levels

1. Expand the D2 Product logical table.


2. Drag the Brand column from D2 Product to the Product Brand level in H2
Product.
3. Continue dragging logical columns from the D2 Product logical table to their
corresponding levels in the H2 Product logical dimension:

Logical Column Logical Level


Lob Product LOB
Type Product Type
Product Product Detail
Prod Key Product Detail

Setting Logical Level Keys

1. Double-click the Product Brand logical level to open the Logical Level dialog
box. On the General tab, notice that the Product LOB child level is displayed.
2. Click the Keys tab.
3. Enter Brand for Key Name.
4. In the Columns field, use the drop down list to select D2 Product.Brand.
5. Check Use for Display. When this is selected, users can drill down to this column
from a higher level.
6. Set Brand as the Primary key.
7. Click OK to close the Logical Level dialog box. The icon changes for Brand to
show that it is the key for the Product Brand level.
8. Use a different technique to create a logical level key: Right-click Lob for the
Product LOB level and select New Logical Level Key to open the Logical Level
Key dialog box.
9. In the Logical Level Key dialog box, accept the defaults and click OK.
10. The icon changes for Lob to show that it is the key for the Product LOB level.
11. Use either method to set the remaining keys for the H2 Product logical
dimension:

Logical Level    Logical Level Key   Use for Display

Product Type     Type                Yes
Product Detail   Product             Yes
Product Detail   Prod Key            No

Note: Please note that the Detail level (lowest level of the hierarchy) must have the
column that is the logical key of the dimension table associated with it and it must be the
key for that level: Prod Key in this example.

12. Set Prod Key as the primary key for the Product Detail level. Hint: Double-click
the level and select the Keys tab.

Creating a Logical Dimension for Time (Time Dimension Hierarchy)

1. Use a different technique to create a logical dimension for Time. Right-click the
D1 Time logical table and select Create Logical Dimension > Dimension with
Level-Based Hierarchy.
2. A new logical dimension, D1 TimeDim in this example, is automatically added to
the business model.
3. Rename D1 TimeDim to H1 Time.
4. Expand H1 Time . Notice that two levels were created automatically: D1 Time
Total and D1 Time Detail. D1 Time Detail is populated with all of the columns
from the D1 Time logical table.
5. Rename D1 Time Total to Time Total, and rename D1 Time Detail to Time
Detail.
6. Right-click Time Detail and select New Object > Parent Level to open the
Logical Level dialog box.
7. On the General tab, name the logical level Week, and check Supports rollup to
higher level of aggregation.
8. Click OK to close the Logical Level dialog box. The Week level is added to the
H1 Time logical dimension.
9. Repeat the steps to add the remaining logical levels:

Month as a parent of Week


Quarter as a parent of Month
Half as a parent of Quarter
Year as a parent of Half

Associating Time Logical Columns with Logical Levels

1. Use a different technique to associate logical columns with logical levels. Drag
the logical columns from the Time Detail logical level (not from the D1 Time
logical table) to their corresponding levels in the H1 Time logical dimension. This
is a convenient technique when logical columns are buried deep in the business
model.
Logical Column Logical Level
Per Name Year Year
Per Name Half Half
Per Name Qtr Quarter
Per Name Month Month
Per Name Week Week

2. Delete all remaining columns from the Time Detail level except for Calendar
Date so that only Calendar Date is associated with the Time Detail level. Notice
that deleting objects from the hierarchy does not delete them from the logical
table in the business model.
3. Set the logical keys for the H1 Time logical dimension according to the following
table:

Logical Level Level Key Use for Display


Year Per Name Year Yes
Half Per Name Half Yes
Quarter Per Name Qtr Yes
Month Per Name Month Yes
Week Per Name Week Yes
Time Detail Calendar Date Yes

Creating a Logical Dimension for Customer

1. Use either technique to create a logical dimension with a level-based hierarchy


named H3 Customer for the D3 Customer logical table with the following levels,
columns, and keys.

Level                  Column            Key               Use for Display

Customer Total         <none>            <none>            <none>
Customer Region        Region            Region            Yes
Customer Area          Area              Area              Yes
Customer Country       Country Name      Country Name      Yes
Customer State         State Province    State Province    Yes
Customer City          City              City              Yes
Customer Postal Code   Postal Code       Postal Code       Yes
Customer Detail        Customer Name     Customer Name     Yes
Customer Detail        Customer Number   Customer Number   No

2. Set Customer Total as the grand total level.


3. Set Customer Number as the primary key for the Customer Detail level.

Setting Aggregation Content for Logical Table Sources

1. Expand D1 Time > Sources.


2. Double-click the LTS1 Time logical table source to open the Logical Table
Source dialog box.
3. Click the Content tab.
4. Confirm that Aggregation content, group by is set to Logical Level and the
logical level is set to Time Detail for the H1 Time logical dimension.
5. Click OK to close the Logical Table Source dialog box.
6. Repeat to verify or set content settings for the remaining logical table sources
using the table :

Logical Table Source    Logical Dimension   Logical Level

LTS1 Product            H2 Product          Product Detail
LTS1 Customer           H3 Customer         Customer Detail
LTS2 Customer Address   H3 Customer         Customer Detail
LTS1 Revenue            H1 Time             Time Detail
LTS1 Revenue            H2 Product          Product Detail
LTS1 Revenue            H3 Customer         Customer Detail

7. Save the repository and check global consistency. Fix any errors or warnings before
proceeding. Notice that you did not have to make any changes to the Presentation layer.

Testing Your Work

1. Return to Oracle BI, which should still be open, and sign in if necessary.
2. Create the following analysis to test the Product hierarchy.

Products.Brand
Base Facts.Revenue

3. Click Results.
4. Click on the BizTech brand and verify that you can drill down through the
hierarchy to see revenue data at each level.
5. Select New > Analysis > Sample Sales.
6. Click Leave Page when prompted with the message: “Are you sure?”
7. Create the following analysis:

Time.Per Name Year


Base Facts.Revenue

8. Click Results and verify that you can drill down through the Time hierarchy.
9. Repeat the steps and create the following analysis to test the Customers
hierarchy:

Customer Regions.Region
Base Facts.Revenue
10. Click Results and verify that you can drill down through the Customers
hierarchy.
11. Sign out of Oracle BI. Click Leave Page when prompted about navigating away
from this page. Leave the Oracle BI browser page open.

9.3.2. Time Dimension Hierarchy


A hierarchy in which at least one level of the time dimension has a chronological key is called a time dimension hierarchy.
Refer to the sections titled "Creating a Logical Dimension for Time" and "Modelling Time Series Data".
9.3.3. Unbalanced Hierarchies (Ragged and Skipped)
1. Ragged Hierarchy: A hierarchy in which branches end at different depths, that is, the leaf members are not all at the same level.
2. Skipped Hierarchy: A hierarchy in which the immediate parent level of a child level is skipped or not present.

Importing Metadata and Defining Physical Layer Objects

1. Open the BISAMPLE repository in online mode.


2. In the Physical layer, expand orcl.
3. Right-click Connection Pool and select Import Metadata to open the Import
Wizard.
4. In the Select Metadata Types screen, accept the defaults and click Next.
5. In the Select Metadata Objects screen, in the data source view, expand
BISAMPLE.
6. In the data source view, select the SAMP_PRODUCTS_DR table for import.
7. Click the Import Selected button to move the table to the Repository View.
8. Expand BISAMPLE in the Repository View and confirm that the
SAMP_PRODUCTS_DR table is visible.
9. Click Finish to close the Import Wizard.
10. Confirm that the SAMP_PRODUCTS_DR table is visible in the Physical layer of
the repository.
11. Create the following alias for the table: D20 Product
12. Use the Physical Diagram to create the following physical join for the alias table:

"orcl".""."BISAMPLE"."D20 Product"."PROD_KEY" =
"orcl".""."BISAMPLE"."F1 Revenue"."PROD_KEY"

13. Right-click D20 Product and select View Data.

Note: There are skipped levels in the hierarchy. For example, brand A - Brand2
has a NULL value for LOB for the product D - Product 8.

14. Close View Data.

Creating Logical Table and Logical Columns

1. Drag D20 Product from the Physical layer to the Sample Sales business model in
the BMM layer to create a D20 Product logical table. The logical join to F1
Revenue is created automatically based on the join in the Physical layer.
2. Rename the D20 Product logical columns:

Old Name   New Name

BRAND      Brand
LOB        LOB
PROD_DSC   Product
PROD_KEY   Product Number
TYPE       Product Type

3. Rename the D20 Product logical table source to LTS1 Product (Ragged).

Creating a Ragged/Skipped Levels Logical Dimension

1. Right-click the D20 Product logical table and select Create Logical Dimension >
Dimension with Level-Based Hierarchy to automatically create a logical
dimension named D20 ProductDim.
2. Rename D20 ProductDim to H20 Product.
3. Double-click the H20 Product logical dimension to open the Logical Dimension
dialog box.
4. On the General tab, select both Ragged and Skipped Levels.
5. Click OK to close the Logical Dimension dialog box.
6. Expand H20 Product.
7. Create the following hierarchy:

Level            Column           Key              Use for Display

Product Total    <none>           <none>           <none>
Product Brand    Brand            Brand            Yes
Product LOB      LOB              LOB              Yes
Product Type     Product Type     Product Type     Yes
Product Detail   Product          Product          Yes
Product Detail   Product Number   Product Number   Yes

Creating Presentation Layer Objects

1. Drag the D20 Product logical table to the Sample Sales subject area in the
Presentation layer.
2. In the Presentation layer, rename D20 Product to Products (Ragged) and move
Products (Ragged) to appear after Products.
3. Expand Products (Ragged) and notice that the H20 Product logical dimension is
automatically added to the Presentation layer.
4. Save the repository and check consistency. Fix any errors or warnings before
proceeding.
5. Close the repository. Leave the Administration Tool open.

Testing Your Work

1. Return to Oracle BI, which should still be open, and sign in.
2. Create the following analysis to test the ragged / skipped level hierarchy:

Products (Ragged).Brand
Products (Ragged).LOB
Products (Ragged).Product Type
Products (Ragged).Product
Base Facts.Revenue

3. Click Results.

The results display correctly even though there are skipped levels (levels with
NULL values) and ragged levels (leaves with varying depth).

4. Sign out of Oracle BI. Click Leave Page when prompted about navigating away
from this page. Leave the Oracle BI browser page open.
9.3.4. Parent-Child Hierarchy

A parent-child hierarchy is a hierarchy of members that all have the same type. This
contrasts with level-based hierarchies, where members of the same type occur only at
a single level of the hierarchy.

Importing Metadata and Defining Physical Layer Objects

1. Open the BISAMPLE repository in online mode.


2. In the Physical layer, expand orcl.
3. Right-click Connection Pool and select Import Metadata to open the Import
Wizard.
4. In the Select Metadata Types screen, accept the defaults and click Next.
5. In the Select Metadata Objects screen, in the data source view, expand
BISAMPLE and select the following tables for import:

SAMP_EMPL_D_VH
SAMP_EMPL_PARENT_CHILD_MAP
SAMP_EMPL_POSTN_D

6. Click the Import Selected button to move the tables to the Repository View.

7. Click Finish to close the Import Wizard.

8. Confirm that the three tables are visible in the Physical layer of the repository.

9. Right-click SAMP_EMPL_PARENT_CHILD_MAP and select View Data.

Note:

This is an example of a parent-child relationship table with rows that define the
inter-member relationships of an employee hierarchy. It includes a Member Key
column, which identifies the member (employee); an Ancestor Key, which
identifies the ancestor (manager) of the member; a Distance column, which
specifies the number of parent-child hierarchy levels from the member to the
ancestor; and a Leaf column, which indicates if the member is a leaf member.
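To make the structure of this table concrete, the four columns described above can be derived from a plain member-to-manager mapping. The following Python sketch uses hypothetical employee keys, not the SAMP data, and spells out what the wizard-generated SQL effectively produces:

```python
# Sketch (hypothetical keys): deriving the rows of a parent-child
# relationship table like SAMP_EMPL_PARENT_CHILD_MAP from an
# employee -> manager mapping. One row is emitted per (member, ancestor)
# pair, at every distance up the chain.

manager_of = {      # member key -> manager (ancestor) key; None = top
    1: None,        # employee 1 is the top of the hierarchy
    2: 1,
    3: 1,
    4: 2,
}
has_reports = {m for m in manager_of.values() if m is not None}

rows = []  # (MEMBER_KEY, ANCESTOR_KEY, DISTANCE, IS_LEAF)
for member in manager_of:
    is_leaf = 0 if member in has_reports else 1   # 1 = no direct reports
    ancestor, distance = manager_of[member], 1
    while ancestor is not None:                   # walk up to every ancestor
        rows.append((member, ancestor, distance, is_leaf))
        ancestor, distance = manager_of[ancestor], distance + 1

for row in sorted(rows):
    print(row)
```

For example, member 4 gets two rows: one to its parent (distance 1) and one to its grandparent (distance 2). Some implementations also add a self-row at distance 0; the layout of the actual SAMP table is authoritative.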

10. Create the following aliases for the tables:

Table                        Alias

SAMP_EMPL_D_VH               D50 Sales Rep
SAMP_EMPL_PARENT_CHILD_MAP   D51 Sales Rep Parent Child
SAMP_EMPL_POSTN_D            D52 Sales Rep Position
11.Use the Physical Diagram to create the following physical joins for the alias tables:

"orcl".""."BISAMPLE"."D52 Sales Rep Position"."POSTN_KEY" =


"orcl".""."BISAMPLE"."D50 Sales Rep"."POSTN_KEY"

"orcl".""."BISAMPLE"."D50 Sales Rep"."EMPLOYEE_KEY" =


"orcl".""."BISAMPLE"."D51 Sales Rep Parent Child"."ANCESTOR_KEY"

"orcl".""."BISAMPLE"."D51 Sales Rep Parent Child"."MEMBER_KEY" =


"orcl".""."BISAMPLE"."F1 Revenue"."EMPL_KEY"

Creating Logical Table and Logical Columns

1. In the BMM layer, right-click the Sample Sales business model and select New
Object > Logical Table to open the Logical Table dialog box.
2. On the General tab, name the logical table D5 Sales Rep.
3. Click OK to add the logical table to the business model.

Note: The D5 Sales Rep icon has a # sign because you have not yet defined the logical join relationship.

4. Drag all six columns from D50 Sales Rep in the Physical layer to D5 Sales Rep in
the BMM layer. This action creates logical columns and adds a D50 Sales Rep
logical table source to D5 Sales Rep.
5. Rename the D50 Sales Rep logical table source to LTS1 Sales Rep.
6. In the Physical layer, expand D52 Sales Rep Position.
7. Drag POSTN_DESC and POSTN_LEVEL from D52 Sales Rep Position to LTS1
Sales Rep. Note that you are dragging the columns to the logical table source, not
the logical table. Dragging to the logical table would create a second logical table
source.
8. Drag DISTANCE from D51 Sales Rep Parent Child to LTS1 Sales Rep. Again,
you drag the column to the logical table source, not the logical table.
9. Rename the logical columns:

Old Name New Name


POSTN_KEY Position Key
TYPE Sales Rep Type
EMPL_NAME Sales Rep Name
EMPLOYEE_KEY Sales Rep Number
HIRE_DT Hire Date
MGR_ID Manager Number
POSTN_DESC Position
POSTN_LEVEL Position Level
DISTANCE Closure Distance
Creating a Logical Join

1. In the BMM layer, select D5 Sales Rep and F1 Revenue.


2. Right-click either highlighted table and select Business Model Diagram >
Selected Tables Only to open the Business Model Diagram.
3. Create a logical join between D5 Sales Rep and F1 Revenue with F1 Revenue at
the many end of the join.
4. Close the Business Model Diagram. Notice that the icon for the D5 Sales Rep table has changed: it no longer has the # sign.

Creating a Parent-Child Logical Dimension

1. Right-click the D5 Sales Rep logical table and select Create Logical Dimension >
Dimension with Parent-Child Hierarchy.
2. In the Logical Dimension dialog box, on the General tab, name the logical
dimension H5 Sales Rep.
3. Click Browse next to Member Key. The Browse window shows the physical table
and its corresponding key.
4. Click View to open the Logical Key dialog box. Confirm that the Sales Rep
Number column is selected.
5. Click Cancel to close the Logical Key dialog box.
6. Click OK to close the Browse window.
7. Click Browse next to Parent Column. The Browse window shows the columns
other than the member key.
8. Deselect Show Qualified Names and select Manager Number as the parent
column for the parent-child hierarchy.
9. Click OK to close the Browse window, but do not close the Logical Dimension
dialog box.

Defining Parent-Child Settings

1. Click Parent-Child Settings to display the Parent-Child Relationship Table


Settings dialog box. Note that at this point the Parent-Child Relationship Table is
not defined.

Note:

For each parent-child hierarchy defined on a relational table, you must explicitly
define the inter-member relationships in a separate parent-child relationship
table. In the process of creating the parent-child relationship table, you may
choose one of the following options:

1. Select a previously-created parent-child relationship table.

2. Use a wizard that will generate scripts to create and populate the parent-child
relationship table. In the next set of steps you select a previously created and
populated parent-child relationship table.
For your information only: To start the wizard you would click the Create
Parent-Child Relationship Table button. The wizard creates the appropriate
repository metadata objects and generates SQL scripts for creating and
populating the parent-child relationship table. At the end of the wizard, Oracle
BI Server stores the scripts into directories chosen during the wizard session.
The scripts can then be run against the database to create and populate the
parent-child relationship table. Running the wizard is not necessary in this
tutorial because the parent-child relationship table is already created and
populated.

2. Click the Select Parent-Child Relationship Table button to open the Select
Physical Table dialog box.
3. In the Select Physical Table dialog box, select the D51 Sales Rep Parent Child
alias you created.
4. The D51 Sales Rep Parent Child alias is now displayed in the Parent-Child
Relationship Table column.
5. In the Parent-Child Table Relationship Column Details section, set the
appropriate columns:

Member Key MEMBER_KEY


Parent Key ANCESTOR_KEY
Relationship Distance DISTANCE
Leaf Node Identifier IS_LEAF

Explanation:

Member Key identifies the member.


Parent Key identifies an ancestor of the member. The ancestor may be the
parent of the member or a higher-level ancestor.
Relationship Distance specifies the number of parent-child hierarchical levels
from the member to the ancestor.
Leaf Node Identifier indicates if the member is a leaf member (1=Yes, 0=No).

6. Click OK to close the Parent-Child Relationship Table Settings dialog box.


7. Click OK to close the Logical Dimension dialog box.
8. Right-click H5 Sales Rep and select Expand All. Note that a parent-child logical
dimension has only two levels.
9. Delete all columns from the Detail level except for Sales Rep Name and Sales Rep
Number.
10. Double-click the Detail level to open the Logical Level dialog box.
11. On the Keys tab, create a new key named Display Key that maps to the Sales
Rep Name column.
12. Deselect Use for Display for the Sales Rep Number column and select Use for
Display for the Sales Rep Name column.
13. Make sure that Member Key is still set to D50 Sales Rep_Key.
14. Click OK to close the Logical Level dialog box.
15. Expand F1 Revenue > Sources and double-click LTS1 Revenue to open the
Logical Table Source dialog box.
16. On the Content tab, set the logical level to Detail for the H5 Sales Rep logical
dimension.
17. Click OK to close the Logical Table Source dialog box.
18. Expand D5 Sales Rep > Sources and double-click LTS1 Sales Rep to open the
Logical Table Source dialog box.
19. On the Content tab, set the logical level to Detail for the H5 Sales Rep logical
dimension.
20. On the Parent-Child Settings tab, click Browse and select D51 Sales Rep Parent Child.
21. In the Parent-Child Table Relationship Column Details section, set the
appropriate columns:

Member Key MEMBER_KEY


Parent Key ANCESTOR_KEY
Relationship Distance DISTANCE
Leaf Node Identifier IS_LEAF

Creating Presentation Layer Objects

1. Drag the D5 Sales Rep logical table from the BMM layer to the Sample Sales
subject area in the Presentation layer.
2. Rename the D5 Sales Rep presentation table to Sales Reps.
3. Move the Sales Reps presentation table above the Base Facts table.
4. Expand the Sales Reps presentation table and notice that the H5 Sales Rep
parent-child logical dimension is automatically included as a presentation
hierarchy.
5. Double-click the H5 Sales Rep presentation hierarchy to open the Presentation
Hierarchy dialog box.
6. On the Display Columns tab, confirm that Sales Rep Name is set as the display
column.
7. Click OK to close the Presentation Hierarchy dialog box.
8. Save the repository and check consistency. Fix any errors or warnings before
proceeding.
9. Close the repository. Leave the Administration Tool open.

Testing Your Work

1. Return to Oracle BI, which should still be open, and sign in.
2. Create the following analysis to test the parent-child logical dimension.

Sales Reps.H5 Sales Rep


Sales Reps.Position
Base Facts.Revenue

3. Click Results.
4. Expand the pivot table to view data at different levels of the hierarchy. Notice
that the Revenue measure rolls up through each level.
5. Sign out of Oracle BI. Click Leave Page when prompted about navigating away
from this page. Leave the Oracle BI browser page open.

9.4. Level Based Measures (LBM)


1. A level-based measure is a column whose values are always calculated at a specific
level of aggregation in a logical dimension hierarchy.

2. To create an LBM, a logical dimension hierarchy must already exist.

3. Level-based measures are useful for calculating grand totals, shares, or percentages.

4. Nested aggregate functions, for example SUM(SUM(Revenue)), are not supported in
the RPD; level-based measures are the way to achieve this kind of calculation.
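As a concrete illustration of point 1, the following Python sketch (made-up numbers, not SampleApp data) shows the effect of pinning a revenue measure to the grand total level of the product hierarchy: every row of a per-product query carries the same overall total, which is the result a nested SUM(SUM(Revenue)) would otherwise be needed for:

```python
# Sketch (illustrative numbers): a level-based measure pinned to the
# Product Total (grand total) level. The grand total is computed once
# and repeated on every detail row of the report.

sales = [("ProdA", 100.0), ("ProdB", 300.0), ("ProdC", 600.0)]

grand_total = sum(rev for _, rev in sales)   # level: Product Total

# Each report row: (Product, Revenue, Product Total Revenue)
report = [(prod, rev, grand_total) for prod, rev in sales]
for row in report:
    print(row)
```

Pinning to an intermediate level (e.g. Product Type) works the same way, except the total is computed per group at that level instead of once overall.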

Creating Level-Based Measures

1. Open RPD in Online Mode.


2. In the BMM layer, right-click the F1 Revenue table and select New Object >
Logical Column to open the Logical Column dialog box.
3. On the General tab, enter Product Total Revenue in the Name field.
4. Click the Column Source tab.
5. Select Derived from existing columns using an expression.
6. Open the Expression Builder.
7. In the Expression Builder, add Logical Tables > F1 Revenue > Revenue to the
expression. Recall that the Revenue column already has a default aggregation
rule of Sum.
8. Click OK to close Expression Builder.
9. Click the Levels tab.
10. For the H2 Product logical dimension, select Product Total from the Logical
Level drop-down list to specify that this measure should be calculated at the
grand total level in the product hierarchy.
11. Click OK to close the Logical Column dialog box. The Product Total Revenue
measure appears in the Product Total level of the H2 Product logical dimension
and the F1 Revenue logical fact table.
12. Repeat the steps to create a second level-based measure:

Name Logical Dimension Logical Level


Product Type Revenue H2 Product Product Type

13. Expose the new columns to users by dragging Product Total Revenue and
Product Type Revenue to the Base Facts presentation table in the Sample Sales
subject area in the Presentation layer. You can drag the columns from either the
H2 Product logical dimension or the F1 Revenue logical table.
Creating a Share Measure

1. In the BMM layer, right-click the F1 Revenue table and select New Object >
Logical Column to open the Logical Column dialog box.
2. On the General tab, name the logical column Product Share.
3. On the Column Source tab, select "Derived from existing columns using an
expression".
4. Open the Expression Builder.
5. In the Expression Builder, Select Functions > Mathematic Functions > Round.
6. Click Insert selected item. The function appears in the edit box.
7. Click Source Number in the formula.
8. Enter 100* followed by a space.
9. Insert Logical Tables > F1 Revenue > Revenue.
10. Using the toolbar, click the Division button. Another set of angle brackets,
<<expr>>, appears.
11. Click <<expr>>.
12. Insert Logical Tables > F1 Revenue > Product Total Revenue. Recall that this is
the total measure for the hierarchy.
13. Click between the last set of angle brackets, <<Digits>>, and enter 1. This is the
number of decimal digits to which the result is rounded.
Round(100* "Sample Sales"."F1 Revenue"."Revenue" / "Sample Sales"."F1
Revenue"."Product Total Revenue", 1)

This share measure will allow you to run an analysis that shows how revenue of a
specific product compares to total revenue for all products.
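The arithmetic of this share measure can be sketched in Python. The numbers below are illustrative, mirroring the Round(100 * Revenue / Product Total Revenue, 1) formula checked above:

```python
# Sketch (illustrative numbers): the Product Share calculation.
# share = Round(100 * Revenue / "Product Total Revenue", 1)

sales = {"ProdA": 100.0, "ProdB": 300.0, "ProdC": 600.0}
total = sum(sales.values())   # the level-based grand total measure

share = {prod: round(100 * rev / total, 1) for prod, rev in sales.items()}
print(share)   # each product's percentage of total revenue
```

The shares sum to 100, which is a quick sanity check to run on the analysis results as well.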

15. Click OK to close the Expression Builder. The formula is visible in the Logical
Column dialog box.
16. Click OK to close the Logical Column dialog box. The Product Share logical
column is added to the business model.
17. Add the Product Share measure to the Base Facts presentation table.
18. Save the repository and check consistency. If there are consistency errors or
warnings, correct them before you proceed.

19. Close the repository.

Testing Your Work

1. Return to Oracle BI, which should still be open, and sign in.
2. Create the following analysis to test the level-based and share measures.

Products.Product
Base Facts.Revenue
Base Facts.Product Type Revenue
Base Facts.Product Share

3. For the Product Share column, select Column Properties.


4. On the Data Format tab, select Override Default Data Format.
5. Change Treat Numbers As to Percentage and set Decimal Places to 2. Deselect
Use 1000's separator.
6. Click OK to close the Column Properties dialog box.
7. Sort Product Share in descending order.
8. Click Results. Notice that Product Type Revenue returns dollars grouped by
Type even though the query is at a different level than Type; Product in this
example. Product Share shows the percent of total revenue for each product
sorted in descending order.
9. Sign out of Oracle BI. Click Leave Page when prompted about navigating away
from this page. Leave the Oracle BI browser page open.

9.5. Modelling Time Series Data

1. Time series functions provide the ability to compare business performance with previous time periods,
allowing you to analyze data that spans multiple time periods.
For example, time series functions enable comparisons between current sales and sales
a year ago, a month ago, and so on.

2. There are three types of time series functions.


1) Ago function
i) The AGO function offsets the time dimension to display data from a past period.
ii) Examples: 1 Year Ago Revenue, 3 Months Ago Revenue, 6 Months Ago Revenue
iii) Syntax: Ago(<<Measure>>, <<Level>>, <<Number of Periods>>)

Ex: Ago("Sample Sales"."F1 Revenue"."Revenue", "Sample Sales"."H1 Time"."Per name month", 3)

iv) The Ago function uses the DENSE_RANK analytic function to achieve the calculation.
2) ToDate function
i) Aggregates a measure from the beginning of a specified time period up to the
currently displayed time.
ii) Examples: YTD (Year To Date), QTD (Quarter To Date), etc.
iii) Syntax: ToDate(<<Measure>>, <<Level>>)

Ex: ToDate("Sample Sales"."F1 Revenue"."Revenue", "Sample Sales"."H1 Time"."Per name Year")

iv) The ToDate function uses the ROW_NUMBER, MIN, and RANK analytic functions to achieve the
calculation.
3) PeriodRolling function
i) Computes the aggregate of a measure over a period starting x units of time
and ending y units of time from the current time.
ii) Example: 3-month rolling sum (current month + two previous months)
iii) Syntax: PeriodRolling(<<Measure>>, <<Starting Period Offset>>, <<Ending Period
Offset>>)

Ex: PeriodRolling("Sample Sales"."F1 Revenue"."Revenue", -2, 0)

iv) The PeriodRolling function uses the DENSE_RANK analytic function to achieve the
calculation.
3. Time series functions operate on time-oriented dimensions. To use these functions on a particular dimension,
you must designate the dimension as a Time dimension and set one or more keys at one or more levels as
chronological keys.
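The behaviour of the three functions can be pictured with a small sketch over a toy monthly series. This is plain Python for illustration only; it is not how the BI Server executes them (the server rewrites the query using the analytic SQL noted above):

```python
# Toy monthly revenue series, in chronological order (invented values).
months = ["2008-01", "2008-02", "2008-03", "2008-04"]
revenue = {"2008-01": 100, "2008-02": 120, "2008-03": 90, "2008-04": 110}

def ago(month, n):
    """Ago(Revenue, Month, n): revenue n months before the given month."""
    i = months.index(month) - n
    return revenue[months[i]] if i >= 0 else None

def to_date(month):
    """ToDate(Revenue, Year): running sum from the start of the year."""
    year = month[:4]
    return sum(v for m, v in revenue.items() if m[:4] == year and m <= month)

def period_rolling(month, start, end):
    """PeriodRolling(Revenue, start, end): sum over the offset window."""
    i = months.index(month)
    lo, hi = max(i + start, 0), min(i + end, len(months) - 1)
    return sum(revenue[months[j]] for j in range(lo, hi + 1))

print(ago("2008-03", 1))                 # revenue one month before March
print(to_date("2008-03"))                # Jan + Feb + Mar
print(period_rolling("2008-04", -2, 0))  # Feb + Mar + Apr
```

Note how all three walk the time dimension by position, which is exactly why the dimension must be marked as a Time dimension with chronological keys, as described next.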

Identifying a Logical Dimension as a Time Dimension

1. Return to the Administration Tool and open the BISAMPLE repository in online
mode.
2. In the BMM layer, double-click the H1 Time logical dimension to open the
Logical Dimension dialog box.
3. In the Structure section, select Time.
4. Click OK to close the Logical Dimension dialog box.

Identifying Level Keys as Chronological Keys

1. Expand the H1 Time logical dimension and double-click the Time Detail level to
open the Logical Level dialog box.
2. Click the Keys tab.
3. Select the Chronological Key check box for Calendar Date.
4. Click OK to close the Logical Level dialog box.
5. Repeat and set chronological keys for the following levels:

Logical Level Chronological Key


Year Per Name Year
Half Per Name Half
Quarter Per Name Qtr
Month Per Name Month
Week Per Name Week

6. It is best practice to designate a chronological key for every level of a time logical
dimension.

Creating a Measure Using the AGO Function

1. Right-click the F1 Revenue logical table and select New Object > Logical
Column.
2. On the General tab, name the column Month Ago Revenue.
3. On the Column Source tab, select "Derived from existing columns using an
expression".
4. Open the Expression Builder.
5. Select Functions > Time Series Functions > Ago.
6. Double-click Ago or click Insert selected item to add the Ago function to the
Expression Builder.
7. Click <<Measure>> in the expression.
8. Select Logical Tables > F1 Revenue and then double-click Revenue to add it to
the expression.
9. Click <<Level>> in the expression.
10. Select Time Dimensions > H1 Time and then double-click Month to add it to the
expression.
11. Click <<Number of Periods>> and enter 1. The Ago function will calculate the
Revenue value one month before the current month.
12. Click OK to close the Expression Builder. Check your work in the Logical
Column dialog box:
13. Click OK to close the Logical Column dialog box. The Month Ago Revenue time
series measure is added to the F1 Revenue logical table.
14. Drag the Month Ago Revenue logical column to the Base Facts presentation
folder.

Creating a Measure Using the TODATE Function

1. Right-click the F1 Revenue logical table and select New Object > Logical
Column.
2. On the General tab, name the new logical column Year To Date Revenue.
3. On the Column Source tab, select "Derived from existing columns using an
expression".
4. Open the Expression Builder.
5. Select Functions > Time Series Functions and double-click ToDate to insert the
expression.
6. Click <<Measure>> in the expression.
7. Select Logical Tables > F1 Revenue and then double-click Revenue to add it to
the expression.
8. Click <<Level>> in the expression.
9. Select Time Dimensions > H1 Time and then double-click Year to add it to the
expression.
10. Click OK to close the Expression Builder.
11. Check your work in the Logical Column dialog box.
12. Click OK to close the Logical Column dialog box.
13. Drag the Year To Date Revenue logical column to the Base Facts presentation
folder.

Creating a Measure Using the PERIODROLLING Function

1. Right-click the F1 Revenue logical table and select New Object > Logical
Column.
2. On the General tab, name the new logical column Revenue 3-Period Rolling
Sum.
3. On the Column Source tab, select "Derived from existing columns using an
expression".
4. Open the Expression Builder.
5. Select Functions > Time Series Functions and double-click PeriodRolling to
insert the expression.
6. Click <<Measure>> in the expression.
7. Select Logical Tables > F1 Revenue and then double-click Revenue to add it to
the expression.
8. Click <<Starting Period Offset>> in the expression.
9. Enter -2. This identifies the first period in the rolling aggregation.
10. Click <<Ending Period Offset>>.
11. Enter 0. This identifies the last period in the rolling aggregation.

These integers are the relative number of periods from a displayed period. In
this example, if the query grain is month, the 3 month rolling sum starts two
months in the past (-2) and includes the current month (0).

12. Click OK to close the Expression Builder.


13. Check your work in the Logical Column dialog box.
14. Click OK to close the Logical Column dialog box.
15. Drag the Revenue 3-Period Rolling Sum logical column to the Base Facts
presentation folder.
16. Save the repository and check consistency. Fix any errors or warnings before
you proceed.
17. Close the repository. Leave the Administration Tool open.

Testing Your Work

1. Return to Oracle Enterprise Manager and load the BISAMPLE repository.


2. Return to Oracle BI and sign in.
3. Create the following analysis to test AGO and TODATE functions:

Time.Per Name Month


Time.Per Name Year
Base Facts.Revenue
Base Facts.Month Ago Revenue
Base Facts.Year to Date Revenue

4. Set the following filter for the analysis:

Per Name Year is equal to / is in 2008.

5. For the Per Name Year column, select Column Properties > Column Format >
Hide. This will prevent Per Name Year from displaying in the analysis results.
6. Sort Per Name Month in ascending order.
7. Click Results.

Month Ago Revenue displays revenue from the previous month. Year To Date
Revenue calculates a running sum of revenue for the year on a monthly basis.

8. Create the following new analysis and filter to test the PERIODROLLING
function at the month grain:

Time.Per Name Month


Time.Per Name Year
Base Facts.Revenue
Base Facts.Revenue 3-Period Rolling Sum

Per Name Year is equal to / is in 2008


9. For the Per Name Year column, select Column Properties > Column Format >
Hide. This will prevent Per Name Year from displaying in the analysis results.
10. Sort Per Name Month in ascending order.
11. Click Results.

Revenue 3-Period Rolling Sum is calculated based on the month grain.

12. Create the following new analysis and filter to test the PERIODROLLING
function at the year grain:

Time.Per Name Year


Base Facts.Revenue
Base Facts.Revenue 3-Period Rolling Sum

13. Sort Per Name Year in ascending order.


14. Click Results.

9.6. Variables
1. Variables in Oracle BI are used to streamline administrative tasks and to dynamically modify metadata
content to adjust to a changing data environment.

2. There are five types of variables that you can use:


1. Repository Variables
2. Session Variables
3. Presentation Variables
4. Request Variables
5. Global Variables
3. Repository variables and session variables are defined in the RPD and can be used in:
1. RPD calculations
2. RPD filters
3. Report calculations
4. Report filters
5. Dashboard prompts
4. Presentation variables and request variables are defined in dashboard prompts, and global
variables in analyses; they can be used only in:
1. Report calculations
2. Report filters

9.6.1. Repository Variables


1. A repository variable, also called an Oracle BI Server variable, is a variable that has a single value
at any point in time.
2. There are two types of repository variables:
i) Static — repository variables whose values persist and do not change until the
administrator decides to change them.
ii) Dynamic — repository variables whose values are refreshed by data returned from queries.
3. The administrator creates repository variables using the Oracle BI Administration Tool.

Syntax: VALUEOF(<VARIABLE_NAME>)

Ex: VALUEOF(DBUSER)

4. Repository variable values are refreshed whenever the Oracle BI Server is restarted
and on the schedule defined in their initialization blocks.

i) Static Repository variable:

1. Open the RPD in online mode > click the Manage menu > click Variables to open the Variable
Manager dialog box.

2. Click Action > New > Repository > Variable to open the Static Repository Variable dialog box.

3. Enter the Name as DSN, select the Type as Static, and enter the Default Initializer as 'orcl'
(the quotation marks are mandatory). Click OK.

Note:
The value of a static repository variable is initialized in the Variable dialog box. This value
persists, and does not change until an administrator decides to change it.

4. Similarly, create another variable with the Name DBUSER, Type Static, and Default
Initializer 'BISAMPLE'. Click OK and close the Static Repository Variable dialog box.

5. Expand the orcl database in the Physical layer and double-click the Connection Pool.

6. Under Data Source Name, remove orcl and enter VALUEOF(DSN); under User Name, enter
VALUEOF(DBUSER).

7. Similarly, create another variable with the Name CURRENT_YEAR, Type Static, and Default
Initializer '2008'. Click OK and close the Static Repository Variable dialog box.

8. Save the RPD and check consistency.

9. Open BI Analytics and create a new analysis from Sample Sales with the Per Name Year, Type, and
Revenue columns.

10. Click the Per Name Year column filter > click Add More Options > click Repository
Variable > enter the name CURRENT_YEAR > click OK and see the results.

ii) Dynamic Repository variable:

1. These variables are associated with an initialization block.

2. An initialization block contains the SQL statement that is executed to initialize or
refresh the variables associated with that block.
3. These initialization blocks are executed when the Oracle BI Server starts and based
on the schedule of the initialization block.
4. The Schedule option is available only for dynamic repository variable initialization blocks.
Creating an Initialization Block

1. Select Manage > Variables to open the Variable Manager.


2. Select Action > New > Repository > Initialization Block.
3. Name the initialization block Current Periods.
4. Click the Edit Data Source... button to open the Repository Variable
Initialization Block Data Source dialog box.
5. Click the Browse button to open the Select Connection Pool dialog box.
6. Double-click the Connection Pool object to select it.

The connection pool is added.

7. Enter the following SQL to determine the value of the current day, month, and
year by finding the maximum value of the period key (BILL_DAY_DT) in the
fact table:

SELECT CALENDAR_DATE, PER_NAME_MONTH, PER_NAME_YEAR


FROM BISAMPLE.SAMP_TIME_DAY_D WHERE CALENDAR_DATE =
(SELECT MAX(BILL_DAY_DT) FROM BISAMPLE.SAMP_REVENUE_F)

8. Click Test and confirm the expected results are returned. In this example, the
results are determined by the data in the sample database used for this tutorial,
which holds data through December 2010.
9. Close the Results window.
10. Click OK to close the Repository Variable Initialization Block Data Source
dialog box.
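What the initialization block computes can be mimicked in a few lines. This sketch uses made-up dates: it takes the maximum billing date from a fact table (the SELECT MAX(BILL_DAY_DT) in the SQL above) and derives the current day, month, and year from it:

```python
from datetime import date

# Hypothetical BILL_DAY_DT values from the fact table.
bill_days = [date(2010, 11, 30), date(2010, 12, 15), date(2010, 12, 31)]

# SELECT MAX(BILL_DAY_DT): the latest period loaded into the fact table.
current_day = max(bill_days)
# Formatted like the PER_NAME_MONTH values used elsewhere in this course.
current_month = f"{current_day.year} / {current_day.month:02d}"
current_year = str(current_day.year)

print(current_day, current_month, current_year)
```

Driving "current" off the fact table rather than the system clock is what lets the variables track the latest loaded data (December 2010 in the sample set).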

Creating Variables

1. Click Edit Data Target to open the Repository Variable Initialization Block
Variable Target dialog box.
2. Use the New button to create three new variables: CurrentDay, CurrentMonth,
CurrentYear. The order is important. The value returned from the first column
in the initialization block SQL, CALENDAR_DATE, is assigned to the
CurrentDay variable. The value of the second column, PER_NAME_MONTH, is
assigned to CurrentMonth (the second variable), and the value of the third
column, PER_NAME_YEAR, is assigned to CurrentYear (the third variable). If
necessary, use the Up and Down buttons to arrange the variables.
3. Click OK to close the Repository Variable Initialization Block Variable Target
dialog box.
4. Leave the default Refresh Interval set to every hour. This means that the
variables will be reinitialized every hour.
5. Click the Test button and check the results:

In this example, the results are determined by the data in the sample database
used for this course, which holds data through December 2010.

6. Close the Results window.


7. Click OK to close the Repository Variable Initialization Block dialog box.
8. Check your work in the Variable Manager
9. Close the Variable Manager.
10. Save the repository and check consistency. Fix any errors or warnings before
proceeding.
11. Close the repository. Leave the Administration Tool open.

Testing Your Work

1. Return to Oracle Enterprise Manager and load the BISAMPLE repository.


2. Return to Oracle BI and sign in.
3. Create the following analysis to test the variables.

Time.Per Name Year


Time.Per Name Month
Time.Calendar Date
Base Facts.Revenue

4. Click Filter for the Per Name Year column. The New Filter dialog box opens.
5. Select Add More Options > Repository Variable.
6. In the Repository Variable field, enter CurrentYear to create a filter for the Per
Name Year column using the CurrentYear repository variable.
7. Click OK to close the New Filter dialog box. The filter is added to the Filters
pane.
8. Repeat the steps to add the CurrentMonth and CurrentDay repository variables
as filters for Per Name Month and Calendar Date columns, respectively.
9. Click Results and confirm that data only for the current year, month, and day is
returned (based on the sample data set).
10. Sign out of Oracle BI. Click Leave Page when prompted about navigating away
from this page. Leave the Oracle BI browser page open.

9.6.2. Session Variables

1. A session variable is a variable that is initialized at login time for each user. When a user begins a
session, the Oracle BI Server creates a new instance of the session variable and initializes it.
2. There are as many instances of a session variable as there are active sessions on the Oracle BI Server.

3. Each instance of a session variable could be initialized to a different value.


4. There are two types of session variables:
System — a session variable that the Oracle BI Server and Oracle BI Presentation Services use for specific
purposes.
System session variables have reserved names that cannot be used for other kinds of variables (such as
static or dynamic repository variables and non-system session variables).
Non-system — a session variable that the administrator creates and names.
5. The administrator creates non-system session variables using the Oracle BI Administration Tool.

6. Syntax: VALUEOF(NQ_SESSION.<variable name>)

Ex: VALUEOF(NQ_SESSION.USER)

System session variables:


1. These are predefined session variables used by the Oracle BI Server for specific
purposes, such as authenticating users.
2. The predefined system variables are listed below (the names are case sensitive and must be in capitals):
1. USER
2. PASSWORD
3. DISPLAYNAME
4. GROUP
5. WEBGROUP
6. LOGLEVEL
7. ROLES
8. PERMISSIONS
9. USERLOCALE
10. TIMEZONE
11. PORTALPATH
12. DISABLE_CACHE_HIT
13. DISABLE_CACHE_SEED
3. These variables are useful in special cases such as authenticating users. They
should not be used for any other purpose (for example, as the name of a static
variable, a dynamic variable, or a non-system session variable).

Using a system variable


1. Open BI Analytics and create a new analysis from Sample Sales with the Per Name Year and Revenue columns.
2. Click Per Name Year > Edit Formula > remove Per Name Year and click the Variable option.

3. Variable > click Session > type USER > click OK > again OK > click Results.

4. Observe that the results display the current session user name, that is, weblogic.

5. Now sign out as weblogic, sign in as the msit user, create the same report, and observe
that the results display msit.

Non-system session variables:


1. These are application-specific, customized variables.
2. These variables require session initialization blocks.
3. Session initialization blocks are executed whenever a user logs in to the Analytics
application.

Using system and non-system variables to enable data-level security or row-level security

1. It is useful to hide some of the data based on the user.

2. Here we will enable data security so that each customer sees only the data for the corresponding region.
3. Data filters/security can be applied on BMM layer or Presentation layer objects.

Creating users, groups, and roles:

Refer to section 8, Weblogic Server (WLS) Admin Console, for creating users, groups, and roles.

Users Group Application role


Robin Fisher customer customers
Fay Mills customer customers
Linda Larson customer customers
Synchronize roles in the BI Admin Tool:

1. Open the above RPD in online mode.

2. Go to the Manage menu > click Identity > go to the Action menu > click Synchronize Application
Roles.

3. The new roles are displayed in the Identity Manager.

Create a dedicated connection pool

1. In the Physical layer, right-click the orcl database object > click New Object > click
Connection Pool.

2. Enter the connection pool name as IB_SESSION > Data Source Name: orcl > User Name: BISAMPLE >
Password: BISAMPLE > click OK > again OK.

Create an initialization block (IB) and assign values to non-system variables

1. Create the non-system variable and initialization block.

2. Go to the Manage menu > click Variables > click the Action menu > click New > Session >
Variable > name it V_USER_REGION > type the default initializer 'A' > click New and name the
initialization block IB_USER_REGION > Edit Data Source > type the SQL below:

select region from samp_customers_d c, samp_addresses_d a
where a.address_key = c.address_key and name = ':USER'

3. Click the connection pool Browse > select the IB_SESSION connection pool > click Select > click
OK > again OK.

Create a data filter on a role

1. Go to the Manage menu > click Identity > select BI Repository > click the Application Roles tab.

2. On the Application Roles tab, double-click customers > click Permissions > click Data Filters.

3. Under Data Filters > click Add > expand Sample Sales > select the customer Region column > click
Select > click OK.

4. Under the data filter > click in the row > click Edit Expression > select the D3 Customer
table > double-click Region > type = > select Session Variable > double-click
V_USER_REGION > click OK > again OK > click Close.

"Sample Sales"."D3 Customer"."Region" = VALUEOF(NQ_SESSION.V_USER_REGION)
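The effect of this data filter is that each user's queries are silently restricted to their own region. A toy sketch of the behaviour (invented rows; in reality V_USER_REGION is populated by the session initialization block at login):

```python
# Toy fact rows: (region, revenue).
rows = [("AMERICAS", 500), ("EMEA", 300), ("APAC", 200)]

def visible_rows(session_region):
    """Apply "Region" = VALUEOF(NQ_SESSION.V_USER_REGION) to the result set."""
    return [r for r in rows if r[0] == session_region]

print(visible_rows("EMEA"))      # a user whose region is EMEA sees only EMEA data
print(visible_rows("AMERICAS"))  # a user in AMERICAS sees only AMERICAS data
```

The same analysis therefore returns different rows for Robin Fisher, Fay Mills, and Linda Larson, which is exactly what the next test demonstrates.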

Test your work

1. Log in to BI Analytics as weblogic, create a new analysis with the Region and Revenue
columns, and observe the results.
2. Log in to BI Analytics as Robin Fisher, create a new analysis with the Region and Revenue
columns, and observe the results.
3. Repeat for Fay Mills and Linda Larson and observe the results.
Row Wise Initialization Blocks

1. These are used to retrieve and initialize a list of values into a non-system session variable.
2. They return a list of values separated by colons.
Ex: AMERICAS:EMEA

3. They are used for implementing data security when one user normally belongs to more than one
group.
Enabling Row Wise Initialization

1. In the above process, modify the session initialization block query as shown below:

select 'V_USER_REGION', region from samp_customers_d c, samp_addresses_d a
where a.address_key = c.address_key and name = ':USER'

2. Click Edit Data Target, select V_USER_REGION, and remove it.

3. Below, enable Row-wise Initialization and click OK.
4. For some reason, the moment you delete the V_USER_REGION session variable, a static
repository variable is created (it is a bug). Delete this static repository variable
V_USER_REGION (in the Variable Manager, click Static in the left pane) > check in > click Yes > save the
RPD.

5. Because we deleted the V_USER_REGION session variable, the data security filter was removed as well.

6. So go to the Manage menu > Identity > select the customers role > click Permissions > Data
Filters > check the filter expression > replace the entire filter expression, or replace MISSING with
V_USER_REGION, to make it right.

7. Close these windows > check in > click Yes > save the RPD and check consistency.
Test Your Work

1. Open SQL Developer or Toad and insert records into the samp_customers_d and
samp_addresses_d tables:
insert into samp_customers_d (name, address_key) values ('Fay Mills', '1234');
insert into samp_addresses_d (region, address_key) values ('APAC', '1234');
delete from samp_addresses_d where region = 'EMEA' and address_key = '1234';
commit;
Verify:
select region, name, a.address_key from samp_customers_d c, samp_addresses_d a

where a.address_key = c.address_key and name in ('Robin Fisher', 'Fay Mills', 'Linda Larson');

2. Now log in to BI Analytics as Fay Mills and observe the results; the report now shows
both EMEA and APAC region data.
3. Log out of BI Analytics.
9.6.3. Presentation Variable
Refer to the section titled 11.2. Dashboard Prompts.

9.6.4. Request Variable


Refer to the section titled 11.2. Dashboard Prompts.

9.6.5. Global Variable

1. A global variable is a column created by combining a specific data type with a value. The value can be a
string, number, date, time, expression, formula, and so on.

2. The global variable is then saved in the catalog and made available to all other analyses.

3. Global variables can be of the following types:


■ Date
■ Date and Time
■ Number

■ Text

■ Time

Note:

Global variables apply to Oracle BI EE 11.1.1.7.10 and later versions, and might not be available in earlier
versions.

Creating a global variable

1. Open BI Analytics and create a new analysis with the Per Name Year, Revenue, and
Revenue (a second time) columns.

2. Click the first Revenue column > select Edit Formula > remove Revenue, click the
Variable button, and select Global.

3. The Insert Global Variable dialog is displayed.

4. Click the Add New Global Variable button. The New Global Variable dialog is
displayed.
5. Enter a unique name such as gv_txt_multiply_rev.
6. Select the data type Text.
7. Enter the value "Base Facts"."Revenue"*3.1415.
8. Click OK. The new global variable is added to the Insert Global Variable dialog.
9. Select the new global variable that you just created, and then click OK. The Edit
Column Formula dialog is displayed with the global variable inserted in the Column
Formula pane.
Note:

The Custom Headings check box is automatically selected.

10. Enter a new name for the column to which you have assigned the global variable,
to more accurately reflect the variable.
11. Click OK. The global variable is evaluated at the time the analysis is executed, and the
value of the global variable is substituted appropriately. Only users with appropriate
privileges can manage (add, edit, and delete) global variables.

10. BI Analytics
10.1. Working with analyses
Develop a report with the columns Per Name Year, Region, and Revenue. Place the cursor on any column
and observe the properties below.

Fig 55

1. Column properties.
2. Edit formula.
3. Filters.
4. Sorting.
5. Delete.

10.1.1. Column properties

Column properties are useful to set below options:

Fig 56
1. Style:
i. It is useful to format data, such as the font, cell, and borders.

ii. We can add default images or custom images.

Fig 57
2. Column Format
Useful to change headings, hide columns, and suppress values.
Headings
1. We can customize table and column titles
2. In OBIEE 11g, partial title formatting is possible with the help of HTML tags.
For example Calendar <font size="3" color="red">Year</font>

Hide columns

It is useful to hide unnecessary columns

Example:
1. Create a report with the columns Per Name Year, Product Number, Product, and Revenue.

2. On the Criteria tab, go to Column Properties of the Product Number column and select the Column Format tab.

3. Check the Hide option on the Column Format tab, click Results, and observe the output.

Interview question: how will you sort a column and use it in any report?

Suppress:
It is useful to hide duplicate values in report.

Example:

1. In the above example, go to Column Properties of the Per Name Year column and select the Column
Format tab.

2. On the Column Format tab, first check the Repeat option and observe the results, and then check the
Suppress option and observe the results.

3. Data Format:
It is useful to convert one format of data into another format

Example:

1. Create a report with the columns Per Name Year, Per Name Week, Revenue, Change In Cost, and Units.

2. Go to Column Properties of Revenue and select the Data Format tab.

3. Assign values as shown in the figure and observe the results.

4. Conditional format:
It is useful to format data based on some condition.

Example:

1. Create a report with the columns Per Name Year, Region, Product Type, and Revenue.

2. Go to Column Properties of Revenue and select the Conditional Format tab.

3. Assign values as shown in the figure and observe the results.

5. Interaction:

Interview question: did you create a master-detail report?


6. Write back:

Interview question: what is meant by write back?

10.1.2. Edit formula:


i. Column formula

It is useful to create any report level calculations.

Example:

1. Create a report with the columns Per Name Year, Region, Product Type, Revenue, and the Revenue
column one more time.
2. Go to Edit Formula, open the Column Formula tab of the second Revenue column, and rename the
column Tax.

3. Calculate the tax as shown in the figure and observe the results.

ii. Bins

1. Bins are useful to combine the values of a column into sets.

2. Bins are useful to group the data.

Example:

1. Create a report with the columns Product Type, the Product Type column one more time, and Revenue.

2. Go to Edit Formula and the Bins tab of the second Product Type column.

3. Assign values as shown in the figure and observe the results.
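A bin behaves like a CASE expression over the raw column values. As a sketch (the grouping below is invented for illustration), binning product types into named sets works like this:

```python
def bin_product_type(product_type):
    """Mimic a Bins definition: map raw column values into named sets."""
    if product_type in ("Audio", "Camera", "LCD"):
        return "Electronics"      # hypothetical first bin
    return "Everything Else"      # catch-all bin for remaining values

row_types = ["Audio", "Install", "LCD", "Maintenance"]
print([bin_product_type(t) for t in row_types])
```

After binning, aggregates such as Revenue group by the bin names instead of the original product types.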

10.2. Working with filters and Prompts


Filters and Prompts:
1. Filters and prompts are useful to restrict the data in OBIEE.
2. They are equivalent to the WHERE and HAVING clauses of SQL.
3. A filter is a condition created with a column, an operator, and a value.
Example:

Per Name Year = 2008


We can develop filters in below ways:
1. Creating filter using criteria column.
2. Creating filter using subject area column.
3. Use a saved filter in a report (reusability purpose).
4. Use one report output as filter in another report.
5. Using variables.
6. Using SQL expression.
7. Using group filters.
8. Add a column filter prompt to a request.

9. Add a variable filter prompt to a request.


10. Global Filter or RPD LTS Filter.

1. Creating a filter using a criteria column:

Create a report with the columns Per Name Year, Region, and Revenue. Click the Year column filter and
give the value 2008 as in the figure. Click OK and click Results.

2. Creating a filter using a subject area column:

1. In the above report, delete the Year filter.

2. Click the filter button in the Filter pane > click More Columns > expand Product > select the
Product Type column > click the down arrow of Value > select Audio, Camera, and LCD as shown
in the figure > click OK and click Results.

3. Use a saved filter in a report (for reusability):

If we need to use a filter in multiple reports, then as a best practice we save that filter and
use it in all the reports.

Example:

Step 1: Creating the saved filter

1. Click New > click Filter > click Sample Sales > from the Subject Areas pane click
Per Name Year > type 2008 > click OK.

2. From the Subject Areas pane click the customer Region > from the drop-down select AMERICAS and
APAC > click OK.

3. Click Save > select any folder > name the filter "2008 and AMERICAS & APAC filters" as shown in the
figure > click OK.

Step 2: Using the saved filter

1. Click New > Analysis > click Sample Sales > double-click Per Name Year, Region, and Revenue.

2. Navigate to our saved filter as shown in the figure, select it, and click the plus symbol > click
OK > click Results.

4. Use one report's output as a filter in another report:

Requirement: Always show the current month's data from the database.

Example:

Step 1: Calculating the current month from the database

1. Develop a report with the Per Name Month column.
2. Click the Per Name Month column > click Edit Formula > type the formula as shown in the figure > click
OK.

3. Click Save > select any folder > name the report current_month > click OK.

Step 2: Develop another report

1. Develop another report with the columns Per Name Year, Per Name Month, Region, and Revenue.

2. Click the Per Name Month filter > select the operator "is based on results of another analysis" > click
the Saved Analysis Browse > select our current_month report > click OK > again OK > click
Results.

Exercise 1: Develop a report with the columns Year, Month, Region, and Revenue and show only the
data corresponding to the max month of each year.

Hint: Add the Year column to the current_month report.

Exercise 2: Get the max revenue district from each region.


5. Using variables:

Create the variable:
1. Open the RPD in online mode > go to the Manage menu > Variables > select Static > right-click in the
work area and select New Repository Variable > name it current_month with the default initializer
'2008 / 04' > OK > check in changes > Save.

Using the variable as a filter:
1. Develop a report with Per Name Year, Per Name Month, Region, and Revenue.

2. Click the Month filter > click Add More Options > select Repository Variable > type the repository
variable current_month > click OK > click Results.

6) Using a SQL expression:

1. It is equivalent to the database subquery concept.
2. Develop a report with Year, Month, Region, and Revenue > click the Region filter > click Add More
Options > SQL Expression > develop the expression below:

i) SELECT "Supplier Sales"."Customers"."Region" FROM "Supplier Sales" WHERE "Supplier
Sales"."Customers"."Region" = 'Central'

3. Click OK > click Results.

Convert this filter to SQL (Advanced SQL Filter)

It is useful to customize the WHERE clause by using SQL syntax or functions.

Process:
Develop a report with Year, Region, and Revenue > click the Region filter > select Convert This
Filter to SQL > click OK > develop the expression UPPER("Customers"."Region") = 'APAC' > click
OK > click Results.

Exercise: Filter the data where the phone number length = 14.

7) Using group filters:

1. Develop a report with Year, Region, Type, and Dollars. Develop the three filters below:
1) Year = 2008
2) Region = APAC
3) Type = Audio, LCD, Install

2. Click the first AND operator and observe that it changes to OR; the 1st and 2nd filters (Year,
Region) are now called a group filter. If you want to ungroup, click the 2nd filter > Edit Filter
Group > click Ungroup.

8) Add a column filter prompt to a request:

A user-friendly filter, run-time filter, or dynamic filter is called a prompt.
Example:
1. Develop a report with Region, District, and Revenue.

2. Click the Prompts tab > click New > Column Prompt > Region > click OK.

3. Again click New > Column Prompt > District > click Options and expand > select Limit Values
By Region > click OK > click Preview > select APAC > select a District > click OK.

9) Add a variable filter prompt to a request:

1. A Presentation Variable is useful to capture the user's response.
2. In a Variable Prompt, we use a Presentation or Request Variable (in 11.1.1.7).
3. Presentation Variable syntax: @{<variable name>}{<value>}[<format>]
   a. Here <value> and <format> are optional.
   b. Variable Name is the name of the Presentation Variable and it is case sensitive.
   c. Value is the default value.
   d. Format converts one format to another, for example
      @{system.currentTime}[YYYY] or @{system.currentTime}[MMM]
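To illustrate the syntax above, a hedged example of the variable being referenced inside a filter expression (the variable and column names here are assumptions):

```sql
-- @{v_region}{Central} expands to the user's prompt selection,
-- falling back to the default value 'Central' when nothing is selected
"Customers"."Region" = '@{v_region}{Central}'
```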
Process:
1. Develop a Report with columns Region, District, Dollars
2. Click on Prompts
3. Click on New → Variable Prompt → name it as V_USER_SELECTION → click on OK.
4. Go to the Criteria → click on the Region filter → click on add more options → Presentation
Variable → type the variable expression as V_USER_SELECTION → click on OK.
5. Click on the District filter → add more options → Presentation Variable → type the variable
expression as V_USER_SELECTION → click on OK.
6. Click on the AND operator in the Filter pane and change it to the OR operator.
7. Click on Prompts → click on the filter → type Gulf in the text box.
8. Click on OK and observe the Gulf data.

Note: Real-time projects do not use report-level prompts; these three prompts are delivered as
part of the syllabus for testing purposes only. If a project needs a prompt, we will go for
DASHBOARD PROMPTS.

10) Global Filter or RPD LTS Filter

A filter applied on an LTS (Logical Table Source) of the RPD is called a global filter. It is
normally used to permanently filter unwanted data out of the BI application.

Example: As shown in the figure.
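As a hedged illustration of such an LTS content filter (the figure is not reproduced here; the physical database, schema, table and column names below are assumptions):

```sql
-- Entered in the LTS > Content tab, "Use this WHERE clause" box,
-- to permanently exclude rows with no revenue (names are hypothetical)
"orcl".""."SUPPLIER2"."SALES"."DOLLARS" IS NOT NULL
```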

10.3. Working with Views

1. A view is a template.
2. A view is useful to present data as per the business requirement.
3. Title and Table are called the default views.
4. Views are integrated in the compound layout.
5. We have 22 types of views; some of those are:
1) Title
2) Table.
3) Pivot table.
4) Graph.
5) Gauge.
6) Funnel.
7) Map (new in 11g).
8) Filters.
9) Selection Steps
10) Column selector.
11) View selector.
12) Legend.
13) Narrative.
14) Ticker.
15) Static text.
16) Logical SQL
17) No Results
1. Title
1) It is useful to display:
   1. The name of the report
   2. The logo of the client
   3. The run date & time of the report
   4. A URL, e.g. http://rritec.com/
2) We must copy the client's image into the below location:
   C:\OBIEE_HOME\Oracle_BI1\bifoundation\web\appv2\res\s_blafp\images
   Syntax for the logo: fmap:images/<name of image>.jpg
   where fmap = <OBIEE_HOME>\Oracle_BI1\bifoundation\web\appv2\res\s_blafp\images
2. Table
1) A table is useful to project data in 2-D format (rows × columns).
2) In OBI 11g, Table Prompts, Sections and Excluded are newly introduced.
3) Can create subtotals, grand totals, alternate row colors, and set the maximum number
of rows per page.
Process:

1) Develop a report with year, month code, region, type, and dollars. Click on Results.
2) Click on table edit (pencil symbol) → drag and drop region into Table Prompts and type into Sections.
3) Click on the year "sigma" symbol → select After → click on the Columns & Measures "sigma"
symbol → click After → click Done.
4) Click on table edit → click on Table View Properties → beside Row Styling enable Alternate
Styling → enable Repeat Cell Values → click on Done.
5) Click on table edit → drag and drop type beside the month column → click on
Table View Properties → select Content Paging → rows per page as 20 → click on
Done.

Note:
1) In 11.1.1.7 Fixed headers with scrolling content option introduced
2) In 11.1.1.7 Include rows with only Null values option introduced

3. Pivot table
1) Provides the ability to rotate rows, columns and section headings to obtain different
perspectives of the same data.
2) A pivot table has 6 parts:
1. Pivot Table Prompts
2. Sections
3. Excluded
4. Rows
5. Columns
6. Measures
Process
1) Develop a report with columns year, month code, region, sales district, type, dollars → click
on Results.
2) Click on New View → click on Pivot Table → delete the table view.
3) Click on pivot table edit → drag and drop region into Pivot Table Prompts → district into
Sections → month into Excluded → type into Columns.
Duplicate layer
1) Click on Dollars → Duplicate Layer.
2) Click on the 2nd Dollars → More Options → Format Headings → name it as Market Share.
3) Again click on More Options → Show Data As → Percent Of → Row.
4) Drop Measure Labels under Products → click on Done → observe the output.

Table vs Pivot Table: refer to the figure.

4. Graph

Graphs make data easy and quick to understand for any audience, regardless of language or literacy.

Process:
1) Develop a report with region, district, year, type, dollars → click on Results.
2) Click on New View → click on Graph → Bar → Default (Vertical) → click on edit graph → drag and
drop region into Graph Prompts.
3) Drag & drop sales district into Sections & enable Display as Slider; drag & drop year
into Legend → click on Done & observe the output.

5. Gauge chart:
1) It is nothing but a speedometer. It is designed to compare one dimension with one
measure.
Process:
1) Develop a report with region, dollars.
2) Click on Results → click on New Gauge → Default (Dial) → observe the output.

6. Funnel chart:
1) It is useful to compare one dimension with two measures, and those two measures must
be actuals versus targets.
Process:
1) Develop a report with region, units shipped, units ordered → click on Results.
2) Click on New View → Funnel (Default (Standard)) → under Settings click on High
Values Properties → Custom Value → type 97.6 → click on Low Values Properties → Custom
Value → type 97 → click on Done → observe the output.

8. Filters:
1) It is useful to capture the criteria filter conditions automatically.

Process:
1) Develop a report with year, region, and dollars.
2) Click on Create Filter → More Columns → expand Products → select Type → click on OK →
select Beef, Bread, Cheese → click on OK → click on Results.
3) Click on New View → click on Filters → drag and drop the filters view between the title & table.

9. Selection Steps:
1) It is useful to do custom calculations / row-wise calculations.
2) It is a new feature in OBIEE 11.1.1.6.
3) Selection steps are also useful to develop filter conditions.
Process:
1) Develop a report.
2) Click on Results.
3) Select Selection Steps → expand it → under Products – Type, click on
Then, New Step → select as per below.
4) Type the display label as Non-Veg Revenue → select Beef, Pork, Lamb, Seafood → click on Move →
click on OK.
5) Observe the output.
6) Click on New View → add Selection Steps → click on Non-Veg Revenue in the selection steps
and observe the output.

Scenario: Filter out data if the dollar amount is less than 1000.

On the Results page → select Selection Steps → click on Then, New Step → click on Apply
a Condition → condition type as X >= value → select as shown below → click on OK.

10. Column selector

1) It is useful to select columns dynamically.
2) The column selector affects all the views of the compound layout.
Process:
1) Develop a report with year, region, dollars.
2) Click on Results.
3) Click on New View → Other Views → Column Selector → edit the column selector.
4) Select Column 1 → double click on month; select Column 2 → double click on district; select
Column 3 → double click on units ordered, units shipped.
5) Click on Done.
6) Drag and drop the column selector between the title & table → select district → observe the
output.

11. View selector:
1) It is useful to select views dynamically.
2) The view selector is useful to integrate other views.

Process:
1) In the above report, click on New View → Graph → Pie; again click on New View → Graph → Pie.
2) Click on the second graph's edit. Drag and drop year into Excluded → drag and drop region onto
Slices → click on Done.
3) Click on New View → View Selector → click on edit view selector → select Graph 1,
Graph 2 → click on the arrow mark.
4) Click on Graph 1 edit → click on Rename → name it as Year versus Dollars → click on OK.
5) Similarly rename Graph 2 as Region versus Dollars → click on Done → observe the output.

12. Legend
1. It is useful to decode color coding or acronyms.
   a. Example: OBIA = Oracle Business Intelligence Applications, and color
      RED = poor business.
Process:
1. Develop a report with year, region, type, and revenue.
2. Do conditional formatting as per below:
   i. Revenue < 10,000: red color.
   ii. Revenue between 10,000 & 20,000: yellow.
   iii. Revenue > 20,000: green.
3. Click on Results → click on New View → click on Legend.
4. Click on edit legend → type Good → click on Format Text → select background color green.
5. Click on Add Caption → type Avg → click on Format Text → select background color yellow.
6. Click on Add Caption → type Poor → click on Format Text → select background color red.
7. Click on Done → drag and drop the legend view between the title & table.

13. Narrative:
It is useful to design paragraph-style reports.

Process:
1. Develop a report with year, units ordered, units shipped → click on Results → click on
New View → Narrative → click on edit narrative → in the Narrative part, type as per
below (@n refers to the nth column; [br/] gives a line break):
The year @1 orders @2 shipment is @3 [br/]
Click on Done.

Exercise: Develop a report to capture the data refresh date.

14. Ticker:
A ticker is useful to scroll information on a report.

Process:
In the above report, click on New View → Ticker.
15. Static text:

It is useful to provide some comments in a report. In the above report, click on New View → Static
Text → click on edit static text → type THE REVENUE ANALYSIS → click on Done.

16. Logical SQL:
1. It is only for developer purposes; business users cannot understand this logical SQL.
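As a rough illustration, the Logical SQL view for a year/region/dollars report typically shows something like the following (the subject area, table and column names are assumptions based on the Supplier Sales examples used elsewhere in this material):

```sql
-- Sketch of the logical query the BI Server receives for the report
SELECT
   0 s_0,
   "Supplier Sales"."Calendar"."Year" s_1,
   "Supplier Sales"."Customers"."Region" s_2,
   "Supplier Sales"."Sales Facts"."Dollars" s_3
FROM "Supplier Sales"
ORDER BY 1, 2, 3
```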
17. No Results:
1. It is useful to display a custom message whenever a report has no results.
Process:
1. Develop a report with year, region and dollars.
2. Click on the year filter and create the condition year = 420.
3. Click on Results.
4. Click on edit analysis properties.
5. Select as per below and type the message:
   No Results Settings: Display Custom Message
6. Click on OK → observe the output.

10.4. Advanced features of Analysis

10.4.1. Combining similar requests
Combining analyses or reports from similar or different subject areas.
1. In this concept we use the below set operators:
   1. Union
   2. Union All
   3. Intersect
   4. Minus
2. To work with set operators we need to make sure:
   i. The number of columns in the two requests is the same.
   ii. The corresponding columns' data types are the same.
3. Union will not provide duplicates.
4. Union All will provide duplicates.
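The combined request is issued as a single logical query joined by the chosen set operator. A sketch of what this looks like for the process below (the table and column names are assumptions; the subject area names Supplier Sales and Supplier Sales DM are from the process steps):

```sql
-- Two requests with matching column counts and data types, combined with UNION
SELECT "Customers"."Customer", "Sales Facts"."Dollars"
FROM "Supplier Sales"
WHERE "Sales Facts"."Dollars" BETWEEN 5000 AND 15000
UNION
SELECT "Customers"."Customer", "Sales Facts"."Dollars"
FROM "Supplier Sales DM"
WHERE "Sales Facts"."Dollars" BETWEEN 10000 AND 20000
```

Swapping UNION for UNION ALL, INTERSECT or MINUS changes only the operator keyword; the column-count and data-type rules above still apply.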

Process:
1. From the subject area pane, click on customers, revenue.
2. Click on revenue → Column Filter → select operator Is Between → values 5000 & 15000 → click on OK.
3. Click on Combine Results → select the Supplier Sales DM subject area → click on customer,
revenue → click on the dollars column filter → select the Is Between operator → values
10000, 20000 → OK → Results.

Testing: Click on Criteria, change the set operator as per the requirement and observe the output as
shown in the figure. Verify nqquery.log.

10.4.2. Cross Subject Analysis


Creating an analysis from multiple subject areas is known as Cross-Subject Analysis.
Requirement:
The subject area columns used in the analysis should have a join condition or a conformed
dimension.
Process:
1. Create a report with columns per name year, product from sample sales.
2. Click on the Add/Remove subject areas as shown in figure and select sample
sales DM and click ok.
3. Add columns region, revenue from sample sales DM to analysis and click on
results.
11. BI Interactive Dashboards
11.1. Dashboard Properties
1) Dashboard object is useful to organize reports, KPI, Score Cards or dashboard prompts.
2) We have below dashboard objects.
1. Column.
2. Section.
3. Link or image.
4. Embedded Content.
5. Text.
6. Folder.
7. Alert section(New in 11G)
8. Action link(New in 11G)
9. Action link menu(New in 11G)
Note: The GUIDED NAVIGATION dashboard object was removed; however, the same
functionality is achievable using a condition.
1. Column:
1) Column is the biggest object in a dashboard page.
2) It is useful to organize reports vertically or horizontally within a dashboard page.
3) With the column object we get a collapse/expand button.
2. Section:
1) Section is the biggest object in a column.
2) It is useful to organize reports vertically or horizontally within the column.
Scenario 1: Hiding or showing sections based on permissions
Step 1: Create user RRITECDB.
Step 2: Provide permissions for user RRITECDB such that the R1 section is hidden.
1. Edit dashboard RRITEC_DASHBOARD → click on the properties of the section containing R1
→ click on Permissions → search and select the user → provide Deny permission →
OK → OK → Save.
2. Logout and login as the RRITECDB user → click on RRITEC_DASHBOARD → observe that the
R1 report is not available.
Scenario 2: Hiding or showing sections based on a condition
a. Example: Show the section only if one particular report contains at least one record.
b. In 10g we achieved this functionality by using GUIDED NAVIGATION.

Process to conditionally hide or show sections

Step 1: Develop a Negative Business report with columns customer name, salesrep, region,
district, dollars; put a filter on dollars as less than zero.
Step 2: Integrate it in the dashboard.
1) Open dashboard RRITEC_DASHBOARD → drag and drop the above report into a separate
section.
Step 3: Implement conditional hiding.
1) Click on section properties → Condition → select Analysis → click on Browse → select the
above report Negative Business → click on OK → test the results by running the dashboard.
3. Link or image
1) It is useful to navigate from:
   a. One dashboard to another dashboard
   b. One dashboard to a report
   c. One dashboard to a webpage
Process:
2) Create a new dashboard or use an existing dashboard.
3) Drag and drop the Link or Image dashboard object into the dashboard.
4) Click on link or image properties → click on Browse → select any one report/dashboard.
5) In the image box type fmap:images/report-bad-percentage.jpg, where
fmap = <ORACLE_INSTANCE>\bifoundation\OracleBIPresentationServicesComponent\
coreapplication_obips1\analyticsRes\s_mobile\images
6) Click on OK → click on Save → click on Run → click on the image → observe the report → click
on Return.
7) If we need to navigate from the dashboard to a website, provide the URL http://www.rritec.com/
4. Embedded content
1) It is useful to insert one webpage into another webpage.
2) It is useful to get latest information from any website to dashboard.
Process:
1) Drag and drop the Embedded Content dashboard object into the work area → click on
embedded content properties.
2) Type http://localhost:7001/console/login/loginform.jsp
3) Click OK → Save → Run.
5. Text:
1) It is useful to provide some comment lines in a dashboard.
2) It also accepts scripts (JavaScript), and scripts are useful to get some extra
functionality in the dashboard (for example, auto-refresh of reports).
Process:
1) Drag and drop the Text object into the dashboard → type Welcome to RR ITEC (Information
Technology Education Center) → Preview → OK.
6. Folder:
1) It is useful to present saved content in a dashboard.
2) It helps business users access reports & dashboards easily.
Process:

1. Drag and drop the Folder object below the Text object → click on folder properties →
Browse → select Shared Folders → RRITEC → OK → Save → Run.
Note: Alert, Action Link and Action Link Menu will be discussed in later chapters.

11.2. Dashboard Prompts

1. These are useful to filter dashboard data as per client requirements.
2. These are dashboard-level user-friendly filters.
3. These are of four types:
   1. Column prompt
   2. Image prompt
   3. Variable prompt
   4. Currency prompt
1. Column prompt:
Step 1: Create a column dashboard prompt.
1. Click on New → Dashboard Prompt → click on Supplier Sales → click on New →
Column Prompt → select Region → OK → again OK → click on Save → select the
RRITEC folder → name it as Region Prompt → click on OK.
Step 2: Integrate the dashboard prompt into the dashboard.
1. Click on Dashboards → RRITEC_DASHBOARD → Page Options → Edit Dashboard.
2. Shared Folders → RRITEC → drag and drop Region Prompt on top of the sections →
click on Save → click on Run.
3. In the dashboard prompt, select Central → click on Apply & observe the result in page 1 &
page 2.
4. The page 1 and page 2 reports are both affected. If you want the dashboard prompt to affect
only one page, then click on dashboard prompt properties → Scope → Page → click on
Save → click on Run.
5. Select East → click on Apply. Notice that page 1 is affected but page 2 is not.
6. Click on Edit Dashboard → select R1 report properties → Edit Analysis → change the filter
to region equal to Central → select Protect Filter → Save.
Note: A protected filter is useful to develop constant filters in a report. These filters will not
be affected by a dashboard prompt.
2. Image prompts:
1) This is new in OBI 11g.
Step 1: Develop a report.
1) Develop a report with columns region, district and dollars.
2) Click on the region filter and select the operator as Is Prompted.
3) Save it as image_source_report.
Step 2: Develop an image prompt.
1) From the RRitec lab data folder, copy USAmap.jpg into E:/inetpub/wwwroot (OS-installed
drive).
2) Click on New → Dashboard Prompt → select subject area Supplier Sales → click on
New → Image Prompt.
3) Image URL: http://localhost/usamap.jpg
4) From the RRitec lab data folder → double click on the usamap.txt file → copy the entire code.
Paste it into HTML Image Map → click on Extract Image Map from HTML.
Type as per below:
   Area title       Column                  Value
   a. Select central   "Customers"."Region"   Central
   b. Select east      "Customers"."Region"   East
   c. Select west      "Customers"."Region"   West
→ OK.
5) Save it as Image_prompt.
Step 3: Integrate the report and image prompt.
1) In the above RRITEC_DASHBOARD, create a new dashboard page.
2) Drag and drop the above report and prompt → click on Save → Run.
3) Click on any part of the image and observe the corresponding result.
Note: Image prompt values cannot be stored or captured in a presentation variable.
3. Variable prompt:
1) It is useful to store the user's response in a variable, and we can call this variable in any
calculations, filters, views (title, narrative, ticker, etc.).
2) This variable is called a presentation variable.
3) It is new in OBI 11g.
Step 1: Develop a variable prompt.
1) Click on New → click on Dashboard Prompt → select the Supplier Sales subject area →
click on New → select Variable Prompt → give the name as v_region → label as Select
Region → user input as Radio Buttons → radio button values as All Column Values →
click on Select Column → expand Customers → select Region → click on OK → again OK →
click on Save → select the Shared Folders/RRITEC path and name it as Region Variable
Prompt → click on OK.
Step 2: Develop a report.
1) Develop a report with columns region, district and dollars.
2) Click on the region filter → operator Is Equal To / Is In → click on add more options →
select Presentation Variable → variable expression: v_region (case sensitive) → default
value: Central → click on OK → click on Results.
3) Click on the title view → click on Edit → type Region: @{v_region}{Central}
4) Save it as Variable_source_report.
Step 3: Integrate the report and variable prompt.
1) Click on Dashboards → select RRITEC_DASHBOARD → click Edit Dashboard.
2) Create a new page and name it as PV_page → drag and drop Region Variable Prompt
into PV_page.
3) Drag and drop the above report Variable_source_report.
4) Click on Save → click on Run → select East → click on Apply → observe the result.

11.3. Catalog Manager


Catalog Manager is a tool that lets you perform online and offline management of Oracle BI
Presentation Catalogs. It should be installed on a secure computer that is accessible only to Oracle BI
Administrators.

You can use Catalog Manager to:

 Manage folders, shortcuts, and objects (analyses, filters, prompts, dashboards, and so
on). For example, you can rename and delete objects, and you can move and copy
objects within and between catalogs.
 View and edit catalog objects in Extensible Markup Language (XML).
 Preview objects, such as analyses and prompts.
 Search for and replace catalog text.
 Search for catalog objects.
 Create analyses to display catalog data.
 Localize captions.
 Catalog Migration

Caution:

 Always make backup copies of the Oracle BI Presentation Catalogs that you are
working with.
 Be sure of changes that you plan to make. Catalog Manager commits changes
immediately. There is no undo function nor are there any error messages to tell you
that a particular change does not display well to users. However, if you do make any
unwanted changes, then you can revert to your latest saved backup.
 Do not copy and paste catalog contents into e-mail, as this is not supported.

Starting Catalog Manager

Use the following procedure to start Catalog Manager.

To start Catalog Manager:

1. Using the command line, change to the following directory:

ORACLE_INSTANCE\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obipsn\catalogmanager

then run the appropriate script:

runcat.cmd (on Windows)

runcat.sh (on UNIX)

(OR)

2. Windows Start All ProgramsOBIEE FolderExpand  select catalog manager.

Opening an Oracle BI Presentation Catalog

To open an Oracle BI Presentation Catalog:

1. In Catalog Manager, from the File menu, select Open Catalog.


2. Complete the necessary fields, as described in the following list:
   o Type — Select the mode (online or offline) in which to open the catalog.
   o Path — If you are opening the catalog in offline mode, then enter the path to
     the catalog folder on the local file system, for example:

     C:\ORACLE_INSTANCE\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obipsn\catalog\default

     Click Browse to display a dialog for locating the catalog.

o URL — If you are opening the catalog in online mode, then enter the URL to
Oracle BI Presentation Services, for example:

https://hostname/analytics/saw.dll

o User — If you are opening the catalog in online mode, then enter the user
name for the host URL (disabled in offline mode).
o Password — If you are opening the catalog in online mode, then enter the
password for the host URL (disabled in offline mode).
o Locale — Select the locale to use for user interface elements in Catalog
Manager and for objects in the catalog, when opening the catalog in online
mode.
o Read-Only — Select this field to open the catalog in read-only mode (disabled
in offline mode).
3. Click OK.

12. BI Delivers
12.1. Deployment of Mail Server and Scheduler
1. Configuring scheduler tables:
1) In 11g, configuration of the Oracle BI Scheduler server is done with the installation
of the OBI 11g product (in 10g it was a manual process).
2) In 11g, the scheduler tables are by default created in the DEV_BIPLATFORM schema during
installation with the RCU (Repository Creation Utility).
3) The scheduler tables are:
   1. S_NQ_JOB
   2. S_NQ_JOB_PARAM
   3. S_NQ_INSTANCE
   4. S_NQ_ERR_MSG
2. The scheduler tables schema is configured in EM. To observe it, navigate to
Deployment → Scheduler.
3. To send mails from the Analytics application, we need an SMTP server. Go to
Deployment → Mail.
12.2. Creating a Delivery Device
1) A device is created by each and every user on his own.
2) As developers, we need to give KT (Knowledge Transfer) to business users.
3) Examples of devices: email, phone, pager, etc.

Process:
1) Login to Analytics using the weblogic user.
2) Click on weblogic → My Account → Delivery Options → Create Device → name:
RRITECDevice → Device Type: HTML Email → Address/Number:
dwramreddy@gmail.com → click on OK.
3) Similarly add a phone by selecting as per below.
Note: An admin can create mail IDs, phones, etc. from the console portal.

12.3. Creating a Delivery Profile

1) Determines which device receives content.
2) Examples:
   a. An Office profile that delivers content to the web browser and office email.
   b. An "On the Road" profile that delivers content to a pager or PDA depending on
      priority.
Process:
Click on weblogic → My Account → Delivery Options → Delivery Profile → Create Delivery
Profile → name it as Office Profile → select the mail ID as high priority and the phone number as
normal priority. Similarly create another profile (On Road).

12.4. Creating and Scheduling an Agent (iBot)

4. Agent:
1) In OBIEE 10g it was called an iBot.
2) It is useful to schedule a report, a dashboard page or a dashboard (in 11.1.1.7.1).
3) It is useful to generate alerts.
4) Agents can be created in three ways:
   a. By using a scheduled time
   b. By using a scheduled time and a condition
   c. As a chain of agents
Process
Method 1: Develop an agent based on a scheduled time.
Step 1: Develop a report.
1) Develop a report with columns year, region, and dollars.
2) Click on Save → select the Shared Folders/RRITEC folder → name it as Agent Report →
click on OK.
Step 2: Create an agent based on a scheduled time.
1) Click on New → under Actionable Intelligence click on Agent → click on the Schedule tab →
select frequency as Once → select date & time 2 mins forward from the current time →
click on the Delivery Content tab → select Analysis → click on Browse → select Agent
Report → click on OK → save into the Shared Folders/RRITEC folder with the name
First Agent.
2) Click on Home → after 2 mins click on Alerts → click on First Agent → observe the output.
Note 1: Impersonating: running a report as a different user at run time.
Example: run a report using the weblogic user and send the output to the ram user.
2. Can you schedule N reports using one agent?
Yes. We need to integrate them in a dashboard page/dashboard and schedule the dashboard
page/dashboard.
3. Can you send report output to non-OBIEE users?
Yes, by using direct mail IDs or by using a conditional report.
Method 2: Develop an agent based on time and condition.
Requirement: Send a report of negative-business customers on a daily basis.
Step 1: Develop the report as shown below.
Step 2: Develop the agent based on time and condition.
1. Go to New → click on Agent → click on the Condition tab → select the Use a Condition
radio button → click on Browse → select the -ve Business report (developed in Step 1) →
select True If Row Count is greater than value 0 → click on OK.
2. Click on the Delivery Content tab → click on Content Browse → select the -ve Business
report (developed in Step 1).
3. Go to the Schedule tab → select frequency as Once → click on Start → set the time 2
mins forward from the current time → click on Save As →
RRITEC_TIME_CONDITON_AGENT.
4. After 2 mins, observe the Alerts button beside Home.
Method 3: Chain of agents.
Running one agent after another agent is called a chain of iBots or agents.
Process
Open any one of the above agents → go to the Actions tab of the agent → select Invoke Agent →
select another agent.
5. Alerts dashboard object:
It is useful to capture all active alerts.
Process:
1) Click on Dashboards → click on the RRITEC_DASHBOARD link → Page Options → Edit
Dashboard → drag and drop the Alert Section into the 1st dashboard column → click on
Save → click on Run → observe our First Agent.
12.5. Job Manager

1) It is a Windows-based component.
2) It is useful to monitor the status of agents.
Process:
1) Start → All Programs → Oracle Business Intelligence → Job Manager.
2) Go to the File menu → Open Scheduler Connection → provide the administrator name as
weblogic, password: RRitec123 → OK.
3) Select the weblogic user & observe all our agents & their status conditions.

13. Security
13.1 Authentication
1. Authentication is a process by which a system verifies (with a user ID and password)
that a user has the necessary permissions and authorizations to log on and access data.
2. In simple words, validating the username & password is called authentication.
   a. E.g.: Analytics application username/password validation; the RRITEC website login.
3. The Oracle BI Server authenticates each connection request that it receives.
4. Authentication is of 2 types:
   1. LDAP
   2. External table

13.1.1. LDAP Authentication

1. LDAP (Lightweight Directory Access Protocol)
1) LDAP is an integrated part of OBIEE 11g.
2) It can be accessed using the console (http://localhost:7001/console).
3) Creating users & groups in LDAP was already completed in the BASIC LEVEL.

13.1.2. External Table Authentication

1) Validating the username and password against an external table is called external table
authentication.
2) In any project, first priority goes to LDAP authentication. If the customer rejects LDAP, then
we go for external tables.
Process:
Step 1: Create the external table.
1) Observe the securitytable data, which is already created in the RR ITEC Supplier2 schema:
   Select * from securitytable
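For reference, a hypothetical shape of SECURITYTABLE consistent with the initialization-block query used in Step 3 below (the datatypes and sizes are assumptions; only the column names come from the query):

```sql
-- Sketch of the security table the init block reads from
CREATE TABLE securitytable (
  username VARCHAR2(30),  -- login name, stored in caps
  pwd      VARCHAR2(30),  -- password, stored in caps
  salesrep VARCHAR2(50),  -- mapped to the DISPLAYNAME variable
  grp      VARCHAR2(30)   -- mapped to the GROUP variable
);
```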
Step 2: Create a dedicated connection pool.
1. Open the RPD in online mode → in the Physical layer, right click on the ORCL database →
click on New Object → click on Connection Pool → name it as IB_Authentication →
Data Source Name: ORCL → username and password as supplier2 → click on OK →
check in changes.
Step 3: Create a session initialization block.
1. Go to the Manage menu → click on Variables → click on Session → click on
Initialization Blocks.
2. Right click in the work area → New Initialization Block → name it as
ib_authentication → click on Edit Data Source → develop the SQL query:
select username, salesrep, grp from securitytable where username=':USER' and
pwd=':PASSWORD'
Note: Capturing the username is mandatory.
3. In the SQL query's WHERE clause, USER and PASSWORD must be in caps.
4. Click on the connection pool Browse → select the IB_Authentication connection pool →
OK → select Edit Data Target → click on New → name it as USER → click on OK →
again OK.
5. Similarly create the variables DISPLAYNAME and GROUP → check in changes.
Step 4: Testing
1. Open Analytics → login as AZIFF, password: az → Sign In.
Note:
Enable "Required for authentication": enable it if the external table initialization block must
succeed for the user to log in.
Edit Execution Precedence: used to maintain the execution order of session initialization blocks.
Example: an authorization initialization block needs to run after the authentication
initialization block.
Creating a user in the web catalog:
1. An LDAP user automatically appears in the web catalog.
2. An external table user is created whenever the user first logs in to the Analytics
application.
Creating catalog groups
1) Useful to group 'n' number of users in the web catalog.
2) The catalog group name & console group name should be the same (it's a best practice).
3) The catalog group is useful if we want to provide permissions to 'n' number of users.
4) We can map 'n' number of users to a group & provide permissions on the catalog group
instead of on each individual user.
Process:

Login as the weblogic user → go to Administration → click on Manage Catalog Groups →
click on Create a New Catalog Group → name it as Customers → List: ALL →
click on Search → select any users & click on Move → click on OK.

13.1.3. SSL and SSO Authentication


Secure Sockets Layer (SSL) Authentication:

You can configure Oracle Fusion Middleware to secure communications between
Oracle Fusion Middleware components using SSL, which is an industry standard for
securing communications. Oracle Fusion Middleware supports SSL version 3, as well
as TLS version 1.

SSL secures communication by providing message encryption, integrity, and
authentication. The SSL standard allows the involved components (such as browsers
and HTTP servers) to negotiate which encryption, authentication, and integrity
mechanisms to use.

SSO Authentication:
1.Integrating a single sign-on (SSO) solution enables a user to log on (sign-on) and be
authenticated once. Thereafter, the authenticated user is given access to system
components or resources according to the permissions and privileges granted to that
user.
2. When Oracle Business Intelligence is configured to use SSO authentication, it accepts
authenticated users from whatever SSO solution Oracle Fusion Middleware is
configured to use. If SSO is not enabled, then Oracle Business Intelligence challenges
each user for authentication credentials.

13.2 Authorization
1. Once a user logs in to the application, what he can access is controlled by authorization.
2. Authorization is enforced at two levels:
   1. Object level
   2. Data level (or row level; already completed in the Variables chapter)
13.2.1. Object Level
Object level security can be divided into 2 types:
1. RPD level objects
2. Presentation catalog or web catalog objects
1. RPD level objects
1. We can control the below objects from the Presentation layer of the RPD:
   1. Subject area
   2. Presentation table
   3. Presentation column
   4. Hierarchy object
Process:
Step 1: Create users.
1. Use the user created in the DDR hands-on (BxxxDDR).
Step 2: Provide security on RPD objects.
1. Open the RPD in online mode → double click on the subject area (or presentation table,
presentation column or hierarchy object) → in the General tab click on Permissions →
click on Set Online User Filter → type * → click on OK.
2. Select BxxxDDR → enable the No Access radio button → click on OK → again OK →
check in changes → save.
3. Open Analytics → logout → login as weblogic → reload server metadata.
4. Login as BxxxDDR → click on New → click on Analysis → observe that the objects are
hidden.
Note: Subject area object-level permissions can also be provided using the below navigation:
Administration → Manage Privileges → click on the entry opposite the subject area → select users
and provide permissions.
2. Presentation catalog (or) web catalog objects
1. We can control below objects of web catalog.
1. Folder.
2. Dashboard.
3. Dashboard page.
4. Section
5. Report.
6. Saved filter
7. KPI
8. KPI Watch List
9. Scorecard…….etc
2. In OBIEE 11G we have 6 types of permissions:
1. No access.
2. Traverse.
3. Open [read + Traverse]
4. Modify [read + delete + write + rename+ traverse]
5. Full control [modify + permission]
6. Custom (new in OBIEE 11G)

Providing permissions at the level of folder (or) dashboard (or) dashboard page (or)
report (or) saved filter…etc
1. Login to analytics as weblogic user → develop a report with columns region,
dollars → click on Results → click on Save → select Shared Folders → click on New
Folder → name it as Object Level → click on OK → name the report as Object_R1 →
click on OK.
2. Click on Catalog → expand Shared Folders → expand the Object Level folder → on the
Object_R1 report click on the More option → click on Permissions → click on Add user/role →
in the list select Users → type RRREP → click on Search → select RRREP → click
on Move → in the Set Permissions drop-down select the required permission as No
Access → click on OK → again OK → sign out & sign in as RRREP.
3. Notice that the RRREP user is not able to see the report Object_R1.
Note: Similarly we can provide permissions on any object (dashboard,folder …etc)
Section level permission:
1. Open any dashboard → click on Section Properties → click on Permissions →
from here follow the steps as above.
Exercise: Develop a dashboard with two sections and configure section level
permissions in such a way that each user will be able to see only his own section.
Traverse:
1. Create a folder with the name Test, create a report with the name R1, and save it
into the Test folder.
2. Provide Traverse permission on the Test folder for any one user (ex: DDR).
3. Create a dashboard DB1 in another folder called Test1.
4. Drag an action link into the dashboard and point it to R1.
5. Logout and login as DDR, open the dashboard, click on the action link and observe the
report. But this report and the Test folder are not available in the Catalog folders window.
Privileges:
We can control the inbuilt options of the OBIEE tool using privileges.
Step 1: Creating a user RRITEC1 and assign to BI Authors group
1) Create a user RRITEC1
2) Map RRITEC1 user to BI Authors group
3) Login to Analytics application and observe that all options (analysis,dashboard …etc )
are available
Step 2: Denying Analysis option to RRITEC1 user
1) Login to analytics as weblogic user → click on Administration → click on Manage
Privileges → opposite to Access to Answers click on '+' → in the list select Users →
type RRITEC1 → click on Search → select RRITEC1 → click on Move → click on
Denied → OK → logout & login as RRITEC1.
2) Notice that Analysis is not available.
Providing Permissions to External DB groups
OR
Mapping External Table groups with Application Roles
Process:
Ground Work
1) Update the securitytable table's GRP column with your batch number:
Update securitytable set grp='bxxxsalesrep';
Commit;
2) Login to the analytics application with any user of securitytable → My Account →
notice that our role bxxxsalesrep is not mapped.
Step 1: Creating a role
1. Login to EM (Enterprise Manager) → navigate to the Security tab.
2. Click on Configure & Manage Roles → click on Create → name it bxxxsalesrep (it
must be the same as the group name in the external table).
3. Login to the analytics application with any user of securitytable → My Account →
notice that our role bxxxsalesrep is mapped.
Step 2: Providing No Access to any one folder
1. Login as weblogic user.
2. Go to the required folder → click on More → Permissions → Add → select our role
bxxxsalesrep.
3. Provide No Access permission.
Step 3: Testing
1. Login to analytics → username: GSMITH, password: gs.
2. Click on My Account → click on Roles & Catalog Groups.
3. Observe that the bxxxsalesrep role is automatically mapped to this user.
4. Make sure the folder is not available.
Setting Query Limits and Timing Restrictions
1. Start the Oracle BI Server service and open the ABC repository in online mode. Log in as
Weblogic with password RRitec123.
2. Disallow queries that may consume too many system resources by setting query limits for
the user or group
a. Open the RPD in online mode.
b. Go to Manage → Identity Manager.
c. Double-click on RRREP user
d. Click the Permissions button.
e. Click the Query Limits tab.
f. Locate the ORCL database and change its Max Rows value to 5. This specifies the
maximum number of rows each query can retrieve from the ORCL database for user
RRREP
g. In the Status Max Rows column, select Enable from the drop-down list.
h. In the Max Time (Minutes) column, change the value to 1. This specifies the maximum
time a query can run on the ORCL database.
i. In the Status Max Time column, select Enable from the drop-down list.
3. Restrict the time period for which users can access specified repository resources from
midnight Sunday to 7:00 AM Sunday:
a. Click the button in the Restrict column of the ORCL database.
b. Highlight the blocks from Sunday at midnight to 7 AM Sunday. Hint: With the first block
selected, press [Shift] and select the 7 AM block, or click the first box and drag to the 7
AM block.
c. Click the Disallow button.
If a time period is not highlighted, the access rights remain unchanged. If access is
allowed or disallowed explicitly to one or more groups, the user is granted the least
restrictive access for the time periods that are defined.
d. Click OK to close the Restrictions dialog box.
e. Click OK to close the User/Group Permissions dialog box.
f. Click OK to close the Group dialog box.
g. Close Identity Manager.
h. Check in changes.
i. Save the repository.
4. Log in to Oracle BI as Weblogic and register your changes on the server by reloading the
server metadata.
5. Ensure that the changes you made to the maximum number of rows allowed per query work
correctly.
a. Log in to Oracle BI as RRREP .
b. Click the Analysis link.
c. Click the SupplierSales subject area.
d. Select Customers.Customer, SalesFacts.Dollars, and click Results.
select operator ase. Click the Dollars filter is in top  value as 5 click Results.

13.2.2. Data Level or Row Level


Already completed while discussing variables.
13.2.3. Precedence and Inheritance
Refer to doc titled : 13.2.3. Precedence and Inheritance of Privileges and Permissions
13.3 Direct Database Request (DDR)
Users with the appropriate privileges can create and issue a direct database request
directly to a physical back-end database. The results of the request can be displayed
and manipulated within the Analysis editor, and subsequently incorporated into
dashboards and agents

Process:
Step 1: Creating a user or take existing user

Step 2: Assigning DDR required privileges to User


1) Login to Analytics as weblogic user.
2) In the global header, click on Administration (top right corner), then click on Manage Privileges.
3) Scroll down to Answers and add the user to the BI Administrator Role as shown in the figure.
4) Click on BI Administrator Role and a dialog box will open; click on the green plus to search and add
users; click OK.

Step 3: Providing executing DDR privileges to User in RPD
1) Open RPD in online mode → Manage → Identity → Action → Set Online User Filter →
type * → click on OK → double click on the RRITECDDR user → click on Permissions →
Query Limits tab → change Execute Direct Database Requests to Allow → click on OK →
click on OK → check in → reload server metadata.
Step 4: Creating DDR report
1) Login to Analytics as the user → click on New → Analysis → click on Create
Direct Database Request → provide the connection pool name as Connection Pool (the RPD physical
layer connection pool name) → type the SQL statement: select * from scott.emp.
2) Click on Validate SQL & Retrieve Columns → click on Results and observe the output as in the figure.

14. Performance Tuning


14.1. Cache management
The Oracle BI Server provides the query cache, a feature that enables it to save the results of a query
in cache files and then reuse those results later when a similar query is requested. Using cache, the
cost of database processing must be paid only once for a query, not every time the query is run,
which dramatically improves the apparent performance of the system for users.

1) In OBIEE we have two types of caches

1. Oracle BI Presentation Server Cache

2. Oracle BI Server Cache


2) The Report order of execution is

1. Oracle BI Presentation server cache

2. Oracle BI server cache

3. Database cache

as shown in figure
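The execution order above can be sketched as a chain of lookups, where a hit at an earlier cache stops the search and a miss everywhere falls through to the database. This is a simplified model; the real OBIPS/OBIS caches key entries on the full logical SQL, session, and security context, which is glossed over here.

```python
def answer_query(query, obips_cache, obis_cache, run_on_database):
    """Resolve a query through OBIPS cache, then OBIS cache, then the DB.

    Each cache is a plain dict keyed by the query text; a hit at an
    earlier level stops the search, matching the execution order above.
    """
    if query in obips_cache:                 # 1. Presentation Server cache
        return obips_cache[query], "OBIPS"
    if query in obis_cache:                  # 2. BI Server cache
        obips_cache[query] = obis_cache[query]
        return obis_cache[query], "OBIS"
    result = run_on_database(query)          # 3. Database (and its own cache)
    obis_cache[query] = result               # populate both cache layers
    obips_cache[query] = result
    return result, "DB"
```

Running the same query twice in one session answers from "OBIPS"; clearing the per-session OBIPS cache (a logout) makes the next run answer from "OBIS", mirroring the activity below.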
3) Activity to understand execution order
1. Clear the OBIPS and OBIS caches.
2. Develop a report with year, region and dollars, save it and run it; notice that OBIPS
and OBIS caches are created.
3. Run the same report in the same session one more time; observe that the OBIS
cache Last Updated and Use Count parameters are not changed, hence we can confirm the
result came from the OBIPS cache.
4. Logout and login with the weblogic user; notice that the session id changed.
5. Run the same report and observe that the OBIS cache Use Count parameter is updated from 0 to 1;
hence we can confirm the report result came from the OBIS cache.
6. Logout and login with any other userid (Ex: B205DDR), run the same report, and notice
that the Use Count parameter is updated from 1 to 2.
Oracle BI Presentation Server Cache

1. It is a temporary cache; it is stored in the below location:
C:\OBIEE_HOME\instances\instance1\tmp\OracleBIPresentationServicesComponent\coreapplication_obips1\sawrptcache
2. It is unique per session (deleted automatically on logout or restart).
Purging:
1. If we click the report Refresh button, it will not read the OBIPS cache.
2. To skip the Oracle BI Presentation Server cache at the report level, go to the
Advanced tab and enable the option to bypass the presentation server cache.
3. Click on Administration, select Manage Sessions, and close all cursors.
4. If you want to disable the OBIPS cache for the entire environment, change the
instanceconfig.xml file and restart OBIPS:
<ServerInstance>
———
<Cursors>
<ForceRefresh>true</ForceRefresh>
</Cursors>
——–
</ServerInstance>

Oracle BI Server Cache


1. In OBIEE 11g cache parameters are managed using EM.
2. EM changes the NQSConfig.INI file parameters.
3. In the NQSConfig file, if ENABLE = YES then cache is enabled, else cache is disabled.
4. By default OBIS cache files are stored in the below path:
E:\bi11g\instances\instance1\bifoundation\OracleBIServerComponent\coreapplication_obis1\cache
5. The Oracle BI Server cache is not deleted when the BI Server starts or restarts.
Advantage of cache:
■ Dramatic improvement of query performance
■ Less network traffic
■ Reduction in database processing

■ Reduction in Oracle BI Server processing overhead

Disadvantage of cache:
■ Disk space for the cache
■ Administrative costs of managing the cache
■ Potential for cached results being stale (Old data).

■ CPU and disk I/O on server computer

With cache management, the benefits typically far outweigh the disadvantages.

Enabling cache
1. Open EM → click on Capacity Management → click on Performance → click on Lock & Edit
Configuration → select Cache Enabled.
2. Click on Apply → click on Activate Changes (observe the ENABLE parameter in the
NQSConfig.INI file) → click on Availability → restart the Oracle BI Server → click on Close.
3. Run any one report with columns customers and dollars → go to the cache
location → observe that a file is created.
4. Open RPD in online mode → go to Manage → Cache → observe all the cache parameters.
5. Sign out from analytics → sign in with any other user (RRREP), password: RRitec123.
6. Run a report with columns customer and dollars → go to RPD → go to Manage →
go to Cache → notice that the Use Count is updated to '1'.

Note:
Because of cache, query performance will be improved but there may be stale data (old data).
Example: if we ran a report having columns product and revenue at 10 AM in the
morning, and we run the same report at 5 PM in the evening, we get the same (old) data in the
report as at 10 AM, even though business was done between 10 AM and 5 PM. This is because
the query fetches data not from the database but from the cache. To overcome this we go for
purging of the cache.
Purging:

1. Cleaning cache files is called purging.
2. We have four types of purging:
1. Manual.
2. Persistence time.
3. Event polling table (EPT).
4. ODBC functions.

1) Manual
a. After enabling cache, run any one report and make sure the cache is created.
b. Open RPD in online mode → go to the Manage menu → click on Cache → right click
on the cache entry → click on Purge → click on OK → click on Close.
c. We can purge the cache of a specific user, or we can purge a specific physical table.
d. We can use this method in development and testing environments.
e. As human interaction is required, we will not use this method in production.
2) Persistency:
a. It is useful to purge cache based on a fixed time. To provide this setting we need to
know accurately the tables' refresh frequency.
Process:
a. Open RPD in online mode → double click on the D1_PRODUCTS physical table.
b. Go to the General tab → select Cache Persistence Time → type 2 mins → click on OK.
c. Check in → save → reload server metadata.
d. Develop a report with specific columns, run it & notice that the cache is created.
e. After 2 mins the cache will be purged automatically.
f. Can we control cache at the table level?
Yes. In the physical layer, double click on any table → General tab → enable/disable Cacheable.
g. By default all tables are cacheable and the cache never expires.
h. If we develop a report with two tables, and out of these two tables one is cacheable and
another is not cacheable, then the resultant report cache will not be created.
i. Once we observe the result, revert your work by making the D1_PRODUCTS table
cacheable again.
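Conceptually, cache persistence time just stamps each cache entry with its creation time and treats the entry as purgeable once the configured interval elapses. A rough Python model of that behaviour follows; the entry layout and helper names are illustrative, not OBIEE internals.

```python
import time

class CacheEntry:
    """A cached result with a persistence time, after which it is stale."""

    def __init__(self, result, persistence_seconds, created_at=None):
        self.result = result
        self.persistence_seconds = persistence_seconds
        self.created_at = time.monotonic() if created_at is None else created_at

    def is_expired(self, now=None):
        now = time.monotonic() if now is None else now
        return now - self.created_at > self.persistence_seconds

def lookup(cache, key, now=None):
    """Return a live entry's result, purging it if its persistence time passed."""
    entry = cache.get(key)
    if entry is None:
        return None
    if entry.is_expired(now):
        del cache[key]          # automatic purge on expiry
        return None
    return entry.result
```

With a 2-minute persistence time, a lookup after 1 minute still hits the cache, while a lookup after 3 minutes purges the entry and forces a fresh database query.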
3) Event Polling Table (EPT)
Step 1: Creating the event polling table
a. Login to DB as supplier2 user
DROP TABLE S_NQ_EPT;
CREATE TABLE S_NQ_EPT AS SELECT * FROM DEV_BIPLATFORM.S_NQ_EPT
WHERE 1=2;
SELECT * FROM S_NQ_EPT;
Step 2: Creating a trigger
1. Create the below trigger to monitor D1_PRODUCTS and insert records into the S_NQ_EPT
table whenever some change (insert, update, delete) occurs on D1_PRODUCTS table
data.
2. Open SQL*Plus → login as supplier2 user → copy and paste the below code and make sure the
trigger is created.
create or replace trigger event_table_update_trigger
after
insert or update or delete on d1_products
for each row
begin
insert into s_nq_ept
(update_type, update_ts, table_name) values (1,sysdate, 'd1_products');
end ;
3. Confirm the trigger is working by executing the below statement:
Update D1_PRODUCTS set GENERICDESCRIPTION = 'Potato Chip'
where GENERICDESCRIPTION = 'Potato Chips';
4. Make sure some record was inserted into the S_NQ_EPT table:
Select * from S_NQ_EPT;
Step 3: Configuring the EPT table
1. Import the table S_NQ_EPT in the physical layer:
In the physical layer right click on the supplier2 connection pool → Import Metadata →
select the s_nq_ept table → import → finish.
2. Define the table S_NQ_EPT as the event polling table:
Go to the Tools menu → Utilities → select Oracle BI Event Tables → select Execute →
select s_nq_ept → click on > → provide the polling frequency as 2 mins (the minimum
recommended is 60 mins; only in the training room do we provide 2 mins) → check in changes →
OK.
Step 4: Testing
1. Clear all caches, if any, left from the previous exercise.
2. Truncate the table S_NQ_EPT.
3. Develop a report with a specific description column & make sure the cache is created.
4. Update some data in D1_PRODUCTS; then after 2 mins (the polling frequency) the cache
will be cleaned.
Note: If the database administrator is not happy to create triggers, then we can incorporate the insert
statement in the target table post-SQL of ETL tools like Informatica or ODI.
4. ODBC functions (programmatic purging)
1. The Oracle BI Server provides ODBC-extension functions for the Oracle BI Administrator
to use for purging cache entries.
2. Some of these functions are particularly useful for embedding in an Extract, Transform,
and Load (ETL) task.
3. For example, 1. After a nightly ETL is performed, all Oracle BI Server cache can be purged.
2. If only the fact table was modified, only cache related to that table can be purged.
3. In some cases, you might need to purge the cache entries associated with a
specific database.
4. We have four ODBC functions that are useful to purge cache programmatically;
those functions can be used in an ETL tool's post-load SQL or in DAC as a SQL
script:
1. SA purge all cache.
2. SA purge cache by DB.
3. SA purge cache by table.
4. SA purge cache by query.
5. It is the most accurate method to purge the cache. Normally these functions will be executed by
the ETL team using their post-load option.
1. SAPurgeAllCache. Purges all cache entries.
Call SAPurgeAllCache();
2. SAPurgeCacheByDatabase. Purges all cache entries associated with a specific physical
database name. A record is returned as a result of calling any of the ODBC procedures to purge
the cache. This function takes one parameter that represents the physical database name and
the parameter cannot be null. The following shows the syntax of this call:
Call SAPurgeCacheByDatabase( 'DBName' );
3. SAPurgeCacheByTable. Purges all cache entries associated with a specified physical table
name (fully qualified) for the repository to which the client has connected. This function takes up
to four parameters representing the four components (database, catalog, schema and table
name proper) of a fully qualified physical table name. For example, you might have a table with
the fully qualified name of DBName.CatName.SchName.TabName. To purge the cache entries
associated with this table in the physical layer of the Oracle BI repository, execute the following
call in a script
Call SAPurgeCacheByTable( 'DBName', 'CatName', 'SchName', 'TabName' );
4. SAPurgeCacheByQuery. Purges a cache entry that exactly matches a specified query. For
example, using the following query, you would have a query cache entry that retrieves the
names of all employees earning more than $100,000:
select lastname, firstname from employee where salary > 100000;
The following call purges the cache entry associated with this query:
Call SAPurgeCacheByQuery('select lastname,firstname from employee where salary > 100000');
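In an ETL post-load step these calls are issued as plain statements over an ODBC connection to the BI Server. The helper below only assembles the call text shown above; actually executing it would need a BI Server ODBC DSN (for example via pyodbc's cursor.execute), which is outside this sketch.

```python
def purge_all():
    """Build the call that purges every cache entry."""
    return "Call SAPurgeAllCache();"

def purge_by_database(db_name):
    """Build the call that purges entries for one physical database."""
    return "Call SAPurgeCacheByDatabase('%s');" % db_name

def purge_by_table(db_name, cat_name, sch_name, tab_name):
    """Build the call for one fully qualified physical table."""
    return ("Call SAPurgeCacheByTable('%s', '%s', '%s', '%s');"
            % (db_name, cat_name, sch_name, tab_name))

def purge_by_query(logical_sql):
    # the query text must exactly match the cached entry's query
    return "Call SAPurgeCacheByQuery('%s');" % logical_sql
```

An ETL post-load script would then pick the narrowest call that covers what the load changed: purge_by_table after refreshing one fact table, purge_all after a full nightly load.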
Process for understanding:
1) Develop a report → make sure the cache is created.
2) Go to the Administration option in analytics → under Maintenance & Troubleshooting
click on Issue SQL → type Call SAPurgeAllCache() → click on Issue SQL → go to the
cache folder and see that the cache is removed.
Note: These ODBC functions are normally called from the below interfaces:
1) Informatica post SQL
2) Calling a batch file in DAC
3) We can even create an Informatica command task and call these
functions.
Seeding:
1. Inserting cache into the cache folder is called seeding.
2. Seeding is the process of pre-populating the cache with queries that are known to
generate cache hits.
3. It helps improve query performance.
 Use queries that heavily consume database processing and are likely to be
reused.
4. It is performed by running prebuilt queries during off hours of business or immediately
after purging.
5. There are two types:
 Manually in Answers
 Automatically, using Oracle BI Delivers to schedule queries to run at a
specified time
1. Manual seeding
Navigate to the report in the catalog → click on Open report → then automatically the cache
will be created.
2. Automatic seeding by using a schedule or agent:
1. Log in to analytics → develop a report with region, dollars → save the report →
name: seeding.
2. Go to New → click on Agent → click on the Schedule tab → frequency: Once → date &
time: set 3 min forward from the current time.
3. Click on the Delivery Content tab → select content as Analysis → click on Browse →
select the seeding report (saved in the first step).
4. Click on the Destinations tab → select Oracle BI Server Cache → save agent →
name: seed.
5. After 3 mins go to the cache folder & notice that a cache file is available.

14.2. Connection Pools


The connection pool is an object in the Physical layer that describes access to the data
source. It contains information about the connection between the Oracle BI Server and
that data source.
The Physical layer in the Administration Tool contains at least one connection pool for
each database. When you create the Physical layer by importing a schema for a data
source, the connection pool is created automatically. You can configure multiple
connection pools for a database. Connection pools allow multiple concurrent data
source requests (queries) to share a single database connection, reducing the overhead
of connecting to a database.

For each connection pool, you must specify the maximum number of concurrent
connections allowed. After this limit is reached, the connection request waits until a
connection becomes available.
Increasing the allowed number of concurrent connections can potentially increase the
load on the underlying database accessed by the connection pool. Test and consult
with your DBA to make sure the data source can handle the number of connections
specified in the connection pool. Also, if the data sources have a charge back systembased on the number of
connections, you might want to limit the number of
concurrent connections to keep the charge-back costs down.
In addition to the potential load and costs associated with the database resources, the
Oracle BI Server allocates shared memory for each connection upon server startup.
Raising the number of connections therefore also increases Oracle BI Server memory usage.
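The "maximum connections, then wait" behaviour described above is the classic bounded-pool pattern. A compact Python sketch using a thread-safe queue follows; the connection objects here are stand-ins for real database handles.

```python
import queue

class ConnectionPool:
    """A bounded pool: at most max_connections are handed out at once.

    When the limit is reached, acquire() blocks until a connection is
    released, which is the waiting behaviour described above for
    BI Server connection pools.
    """

    def __init__(self, max_connections, connect):
        self._pool = queue.Queue(maxsize=max_connections)
        for i in range(max_connections):
            self._pool.put(connect(i))   # pre-open the allowed connections

    def acquire(self, timeout=None):
        return self._pool.get(timeout=timeout)  # blocks while the pool is empty

    def release(self, conn):
        self._pool.put(conn)             # returning a connection wakes a waiter
```

Sizing the pool is the trade-off in the text: more connections mean less waiting but more load on the database and more BI Server memory.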

It is recommended that you create a dedicated connection pool for initialization blocks.
This connection pool should not be used for queries.
Additionally, it is recommended that you isolate the connection pools for different
types of initialization blocks. This also makes sure that authentication and
login-specific initialization blocks do not slow down the login process. The following
types should have separate connection pools:
■ All authentication and login-specific initialization blocks such as language,

externalized strings, and group assignments.


■ All initialization blocks that set session variables.

■ All initialization blocks that set repository variables. These initialization blocks

should always be run using credentials with administrator privileges.


Be aware of the number of these initialization blocks, their scheduled refresh rate,
and when they are scheduled to run.

14.3. Aggregate tables


It is extremely important to use aggregate tables to improve
query performance. Aggregate tables contain precalculated summarizations of
data. It is much faster to retrieve an answer from an aggregate table than to
recompute the answer from thousands of rows of detail.

Aggregate tables store pre-computed results, which are measures that have been aggregated (typically
summed) over a set of dimensional attributes. Using aggregate tables is a popular technique for speeding
up query response times in decision support systems. This eliminates the need for run-time calculations
and delivers faster results to users. The calculations are done ahead of time and the results are stored in
the tables. Aggregate tables typically have many fewer rows than the non-aggregate tables and, therefore,
processing is faster.

To set up and use aggregate tables, perform the following steps:

 Importing Metadata

 Creating New Logical Table Sources

 Setting Aggregate Content

 Testing Your Work

Importing Metadata
1. Return to the Administration Tool and open the BISAMPLE repository in offline
mode.
2. In the Physical layer, expand orcl.
3. Right-click Connection Pool and select Import Metadata to open the Import
Wizard.
4. In the Select Metadata Types screen, select Views and click Next.
5. In the Select Metadata Objects screen, in the data source view, expand BISAMPLE.
6. In the data source view, select the following for import:

SAMP_REVENUE_FA2
SAMP_TIME_QTR_D

7. Click the Import Selected button to move the objects to the Repository View.
8. Expand BISAMPLE in the Repository View and confirm that the objects are visible.
9. Click Finish to close the Import Wizard.
10. Confirm that the objects are visible in the Physical layer of the repository.
11. Create the following aliases:

Table Alias
SAMP_REVENUE_FA2 F2 Revenue Aggregate
SAMP_TIME_QTR_D D1 Time Quarter Grain

12. Right-click F2 Revenue Aggregate and select View Data. F2 Revenue Aggregate
stores aggregated fact information for revenue and units at the quarter and product
grain.
13. Right-click D1 Time Quarter Grain and select View Data. D1 Time Quarter Grain
stores time data at the quarter grain. It stores one record for each quarter beginning
with Q4 2006 and ending with Q4 2011.
14. Use the Physical Diagram to create the following physical joins:

"orcl".""."BISAMPLE"."D2 Product"."PROD_KEY" = "orcl".""."BISAMPLE"."F2


Revenue Aggregate"."PROD_KEY"

"orcl".""."BISAMPLE"."D1 Time Quarter Grain"."QTR_KEY" =


"orcl".""."BISAMPLE"."F2 Revenue Aggregate"."BILL_QTR_KEY"

Creating New Logical Table Sources


1. In the Physical layer, expand D1 Time Quarter Grain.
2. In the BMM layer, expand D1 Time.
3. Drag the following columns from D1 Time Quarter Grain to their corresponding
columns in D1 Time: Note: Make sure to drag them to their corresponding columns.

D1 Time Quarter Grain D1 Time


CAL_HALF Cal Half
CAL_QTR Cal Qtr
CAL_YEAR Cal Year
DAYS_IN_QTR Days in Qtr
JULIAN_QTR_NUM Julian Qtr Num
PER_NAME_HALF Per Name Half
PER_NAME_QTR Per Name Qtr
PER_NAME_YEAR Per Name Year

This action creates a new logical table source named D1 Time Quarter Grain for D1
Time.

4. Rename the D1 Time Quarter Grain logical table source to LTS2 Time Quarter
Grain.
5. Double-click LTS2 Time Quarter Grain to open the Logical Table Source dialog
box.
6. On the Column Mapping tab make sure Show mapped columns is selected and note
the column mappings. The logical columns now map to columns in two physical
tables: D1 Time and D1 Time Quarter Grain.
7. Click OK to close the Logical Table Source dialog box.
8. In the Physical layer expand F2 Revenue Aggregate.
9. In the BMM layer expand F1 Revenue.
10. Drag the following physical columns from F2 Revenue Aggregate to their
corresponding logical columns in F1 Revenue: Note: Do not add them as new
columns.

F2 Revenue Aggregate F1 Revenue


UNITS Units
REVENUE Revenue

11. This action creates a new logical table source named F2 Revenue Aggregate for F1
Revenue.
12. Rename the F2 Revenue Aggregate logical table source to LTS2 Revenue
Aggregate.
13. Double-click LTS2 Revenue Aggregate to open the Logical Table Source dialog
box.
14. On the Column Mappings tab make sure show mapped columns is selected and note
the column mappings. The Revenue and Units logical columns now map to columns
in two physical tables: F1 Revenue and F2 Revenue Aggregate.
15. Leave the Logical Table Source - LTS2 Revenue Aggregate dialog box open.

Setting Aggregate Content


1. Click the Content tab.
2. Set the following logical levels for the logical dimensions:

Logical Dimension Logical Level


H1 Time Quarter
H2 Product Product Total
H20 Product Product Total
H3 Customer Customer Total
H5 Sales Rep Total

Explanation: You are setting aggregation content for the fact table to the corresponding levels in
the dimension hierarchies. In a subsequent step, you set similar levels for the aggregate logical
table source for the Time dimension. Note that all levels are set to the total level except for the H1
Time logical dimension, which is set to Quarter. The result is, when a user queries against a
particular level, Oracle BI Server will “know” to access the aggregate tables instead of the detail
tables.
For example, if a user queries for total sales by product by quarter, the server will access the F2
Revenue Aggregate fact table and the corresponding aggregate dimension table, D1 Time Quarter
Grain. If a user queries for a level lower than the level specified here, for example Month instead of
Quarter, then the server will access the detail tables (F1 Revenue and D1 Time). If a user queries
for higher level (year instead of quarter) the aggregate tables will be used, because whenever a
query is run against a logical level or above, the aggregate tables are used.

3. Click OK to close the Logical Table Source dialog box.


4. Double-click the LTS2 Time Quarter Grain logical table source to open the Logical Table Source
dialog box.
5. On the Content tab, set the logical level to Quarter.
6. Click OK to close the Logical Table Source dialog box.
7. Save the repository and check global consistency. Fix any errors or warnings before proceeding.
8. Close the repository. Leave the Administration Tool open. Note that you did not need to change
the Presentation layer. You made changes in the business model that impact how queries are
processed and which sources are accessed. However, the user interface remains the same, so
there is no need to change the Presentation layer. Oracle BI Server will automatically use the
appropriate sources based on the user query.
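The aggregate navigation just configured reduces to a simple rule: if the query's requested level is at or above the grain of the aggregate source, use the aggregate tables; otherwise fall back to the detail tables. A toy model of that decision, with the level ordering hard-coded for the H1 Time hierarchy used in this practice:

```python
# Higher number = higher (more summarized) level in the H1 Time hierarchy.
TIME_LEVELS = {"Day": 0, "Month": 1, "Quarter": 2, "Year": 3}

def choose_source(requested_level, aggregate_grain="Quarter"):
    """Pick logical table sources the way the BI Server navigates aggregates.

    Queries at the aggregate's grain or above hit the aggregate tables;
    anything below the grain must use the detail tables.
    """
    if TIME_LEVELS[requested_level] >= TIME_LEVELS[aggregate_grain]:
        return ("F2 Revenue Aggregate", "D1 Time Quarter Grain")
    return ("F1 Revenue", "D1 Time")
```

This matches the test analyses that follow: Quarter and Year queries hit F2 Revenue Aggregate, while a Month query falls back to F1 Revenue and D1 Time.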

Testing Your Work


1. Return to Oracle Enterprise Manager and load the BISAMPLE repository.
2. Return to Oracle BI, which should still be open, and sign in.
3. Create the following analysis to test the aggregate tables.

Time.Per Name Qtr


Base Facts.Revenue

4. Click Results.
5. Leave Oracle BI open.
6. Use the Administration link to check the query log.
7. Inspect the log. Notice that the query uses the expected tables: D1 Time Quarter Grain and F2
Revenue Aggregate.
8. Return to Oracle BI.
9. Click New > Analysis > Sample Sales.
10. Create the following analysis to test the aggregate tables.

Time.Per Name Year


Base Facts.Revenue

11. Click Results.


12. Use the Administration link to check the query log.
13. Inspect the log. Notice that the query uses the same tables: D1 Time Quarter Grain and F2
Revenue Aggregate. This is because Per Name Year is at a higher level than Per Name Quarter in
the logical dimension hierarchy, so the aggregate tables are still used.
14. Return to Oracle BI.
15. Click New > Analysis > Sample Sales.
16. Create one more analysis to test the aggregate tables.

Time.Per Name Month


Base Facts.Revenue

17. Click Results.


18. Inspect the log. Notice that this time the query uses the detail tables: D1 Time and F1 Revenue.
This is because the requested data (revenue by month) is at a lower level than what is contained
in the aggregate tables. The aggregate tables do not contain the data and, therefore, the detail
tables are used in the query. This aggregate navigation is controlled by the aggregate content
levels you set in the logical table sources.
19. Sign out of Oracle BI. Click Leave Page when prompted about navigating away from this page.
Leave the Oracle BI browser page open.
14.4. Partitioning and Fragmenting of Data
1. Splitting one large table into 'n' smaller tables is called partitioning.
2. We can partition a table logically or physically. If we do it logically, then no
change is required at either the ETL or the reporting end.
3. Partitions are of three types:
1. Value based.
2. Fact based.
3. Level based.
4. Partitions are useful to improve the performance of querying data.

Value based partitioning


Process:
Step 1: Creating Partition Tables
1. Log into the database with the supplier2 username and password → type:
create table rritec_cust1 as select * from d1_customer2 where substr(name,1,1) < 'N';
select min(substr(name,1,1)), max(substr(name,1,1)) from rritec_cust1;
create table rritec_cust2 as select * from d1_customer2 where substr(name,1,1) >= 'N';
select min(substr(name,1,1)), max(substr(name,1,1)) from rritec_cust2;
Step 2: Import Metadata
1) Right click on connection pool  import metadata  next
2) Expand supplier2 schema  select RRITEC_CUST1 , RRITEC_CUST2  Click on
import(>)  click on finish.
Step 3: Creating Physical Joins
1) Select RRITEC_CUST1, RRITEC_CUST2, D1_ORDERS2 → click on Physical Diagram
→ click on New Join → drag & drop from D1_ORDERS2 to RRITEC_CUST1 → select the new
key and cust key → click OK.
2) Similarly create another join between D1_ORDERS2 and RRITEC_CUST2.
3) Close the physical diagram.
Step 4: map physical columns to logical columns
1) Drag and drop RRITEC_CUST1.ADDRESS (physical column) on
CUSTOMERS.ADDRESS (logical column).
2) Similarly map all RRITEC_CUST1 & RRITEC_CUST2 columns to CUSTOMERS logical
table columns.
Step 5: Disable D1_CUSTOMER2 Logical Table Source
1) Expand customers logical table  Double click on D1_CUSTOMER2 LTS  click on
general tab  select disabled  click on ok.
Step 6: Defining Content Fragmentation
1) Double click on RRITEC_CUST1 LTS  click on content tab  under fragmentation
content  click on edit expression  select customers table  double click on
customer column  type < 'N'  click on ok  enable "This source should be
combined with other sources at this level"  enable distinct value  assign logical
level as customer detail  click on ok.
2) Double click on RRITEC_CUST2 LTS  click on content tab  set logical level to customer
detail  click on fragmentation content  edit  select customers table  double click
on customer column  type >= 'N'  click on ok  enable "This source should be combined
with other sources at this level"  enable distinct value  click on ok.
3) Double click on customers logical table  keys tab  delete any two key rows
3) Double click on customers logical table  keys tab  delete any two key rows
Step 7: Testing
Test Case 1: Execute SQL against RRITEC_CUST1 table
1) Load the RPD into OBIS and develop a report with columns customer, dollars.
2) Create a filter on the customer column as customer < 'C'  click on results  observe the
output & physical SQL.
Conclusion:
1) Query is executed against RRITEC_CUST1 table.
Test Case 2: Execute SQL against RRITEC_CUST2 table
1) In the above report change the filter condition to customer > 'X'  click on results 
observe the output & physical SQL.
Conclusion:
Query is executed against RRITEC_CUST2.
Test Case 3: Execute SQL against the RRITEC_CUST1 and RRITEC_CUST2 tables
1) In the above report remove the filter & run it  observe output & sql.
Conclusion:
Query is executed against the RRITEC_CUST1 and RRITEC_CUST2.
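The three test cases above can be summarized with a small Python sketch of fragment elimination. The routing logic is a simplified stand-in for what the BI Server does with the fragmentation content defined in Step 6; only the table names come from the practice.

```python
# Sketch: deciding which fragment tables a filter can be routed to.
# Ranges are over the first letter of the customer name; '[' is used
# as a sentinel character one past 'Z' in ASCII.

FRAGMENTS = [
    ("RRITEC_CUST1", "A", "N"),  # fragmentation content: customer < 'N'
    ("RRITEC_CUST2", "N", "["),  # fragmentation content: customer >= 'N'
]

def tables_for_filter(lo: str = "A", hi: str = "[") -> list:
    """Fragments whose letter range overlaps the requested range [lo, hi)."""
    return [name for name, flo, fhi in FRAGMENTS if lo < fhi and flo < hi]

print(tables_for_filter(hi="C"))   # customer < 'C'  -> only RRITEC_CUST1
print(tables_for_filter(lo="X"))   # customer > 'X'  -> only RRITEC_CUST2
print(tables_for_filter())         # no filter       -> both fragments (union)
```

A filter that falls entirely inside one fragment's range hits only that table; with no filter, the BI Server queries both fragments and merges the results, matching Test Cases 1 through 3.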
Modeling Fact Based Partition
1. Create an ODBC data source for an Excel spreadsheet.
a. Double-click Data Sources (ODBC) on your desktop to open the ODBC Data Source
Administrator.
b. Click the System DSN tab.
c. Click Add.
d. Select the Microsoft Excel Driver and click Finish.
e. Enter xls_quota as the data source name.
f. Click Select Workbook.
g. Navigate to D:\Labs and select the quota.xls workbook.
h. Click OK.
i. Click OK to close the ODBC Microsoft Excel Setup dialog box.
j. Click OK to close the ODBC Data Source Administrator.
2. Import the xls_quota data source into the ABC repository.
a. Return to the ABC repository, which should still be open in online mode.
b. Select File > Import Metadata.
c. Select the xls_quota data source, and enter weblogic as the username; no password is
needed.
d. Click Next, then Next again. The Import dialog box opens.
e. Expand E:\Labs\quota.
f. Select and import the following tables:
Item Types
Quarters
Regions
regiontypequota
g. Click on finish.
3. Because Excel has limited data types, check or change the data types to conform to existing
data types for the relevant fields. Expand the physical tables and double-click the physical
columns to open the properties dialog box.
ItemType: VARCHAR 20
Quarter: DOUBLE
YR: DOUBLE
Region: CHAR 16
Dollars: DOUBLE
4. Specify joins and keys.
a. For each table, double-click the table, click the Keys tab, and use the New button to
specify the table keys as follows:

b. Use the Physical Table Diagram to specify the following new joins:
ItemTypes.ItemType = regiontypequota.ItemType
Regions.Region = regiontypequota.Region
Quarters.YR = regiontypequota.YR AND Quarters."Quarter" =
regiontypequota."Quarter"

Hint: Press and hold [Ctrl] and create the multicolumn join for the Quarters columns or
enter the join expression in the Expression edit field:

c. Check your work:

d. Close the Physical Diagram.
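The multicolumn join from step 4b can be sketched in Python: a regiontypequota row matches a Quarters row only when both YR and Quarter agree. The sample rows are made up for illustration.

```python
# Sketch of the multicolumn join Quarters.YR = regiontypequota.YR
# AND Quarters.Quarter = regiontypequota.Quarter. Sample data is invented.

quarters = [(2023, 1), (2023, 2), (2024, 1)]   # (YR, Quarter)
quota = [(2023, 2, 500.0), (2024, 1, 750.0)]   # (YR, Quarter, Dollars)

joined = [(yr, q, d)
          for yr, q in quarters
          for qyr, qq, d in quota
          if yr == qyr and q == qq]            # BOTH columns must match

print(joined)  # [(2023, 2, 500.0), (2024, 1, 750.0)]
```

Joining on either column alone would pair unrelated quarters, which is why the practice has you hold [Ctrl] to build the join across both columns at once.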


5. Create a new measure for quotas.
a. Create a new logical column in the SalesFacts table and name it Quota.

b. Set the aggregation rule to SUM.


6. Create a new logical table source for SalesFacts by dragging the Targets physical column
from the regiontypequota physical table onto the Quota logical column that was just
created. This automatically creates a new regiontypequota logical table source.
7. Specify the content for the new regiontypequota logical table source on the Content tab of
the Logical Table Source properties dialog box:

8. Create new logical table sources for the dimension tables.


a. Drag the YR physical column from the Quarters physical table onto the Year logical
column in the Periods logical table. This automatically creates a new Quarters logical
table source for the Periods logical table.

b. Repeat the steps for the following columns:


Physical Table.Column Logical Table.Column
Quarters.Quarter into Periods.Quarter
Regions.Region into Customers.Region
ItemTypes.ItemType into Products.Type
This automatically creates a Regions logical table source for the Customers logical table
and an ItemTypes logical table source for the Products logical table.
9. Specify the content level for the Quarters, Regions, and ItemTypes logical table sources.
Use the screenshots as a guide.

It is best practice to set the aggregation level of these dimension sources. You specify the

content of a dimension table source in terms of the hierarchy of that dimension only. The
content of a fact table source is specified in terms of the content of all dimensions.
10. Drag the Quota logical column into the SalesFacts presentation table.
11. Double click on the Customers logical table and delete the key corresponding to Region.
12. Double click on the Products logical table and delete the key corresponding to ItemType.
13. Check in changes.
14. Check consistency. If you get a warning that the features in database xls_quota do not
match the defaults, you can ignore the warning for the purposes of this practice. If you want
to prevent the warning message from appearing, click Options > Warnings > Database,
and disable Check Features Match Default. Fix any other errors or warnings before you
proceed.
15. Close the Consistency Check Manager.
16. Save the repository and leave it open for the next practice.
17. Develop a report with columns year, quarter, region, type, dollars, targets  click on
Results  observe the output and back-end SQLs.
Note: The merging process works similarly to an outer join.
Level Based Partition
Level-based partitioning is nothing but the aggregate tables technique, which was already completed earlier.

14.5. Tuning Thread pooling using WLS Admin Console


A thread is an independent path of execution within a program. Many threads can run concurrently
within a program.

WebLogic Server automatically detects when a thread in an execute queue becomes “stuck.”
Because a stuck thread cannot complete its current work or accept new work, the server logs
a message each time it diagnoses a stuck thread. A thread might get stuck due to various
reasons.

For example: when a large BI report is running and takes, say, 800 seconds to
complete, then, because the default stuck-thread timeout in WebLogic Server is 600
seconds, the thread allocated for that query runs past 600 seconds and goes to the stuck state.
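The stuck-thread bookkeeping described above can be sketched as a simple timeout check; the 600-second default matches WebLogic's StuckThreadMaxTime, while the classification function itself is an illustrative simplification.

```python
# Sketch: a thread is flagged "stuck" once its current request has been
# running longer than the configured StuckThreadMaxTime (default 600s).

STUCK_THREAD_MAX_TIME = 600  # seconds; WebLogic Server default

def classify(running_seconds: int, max_time: int = STUCK_THREAD_MAX_TIME) -> str:
    """Label a worker thread by how long its current request has run."""
    return "STUCK" if running_seconds > max_time else "OK"

# The 800-second BI report from the example is flagged after 600 seconds:
print(classify(800))  # STUCK
print(classify(300))  # OK
```

Raising the timeout for known long-running reports (or tuning the report itself) prevents healthy-but-slow queries from being flagged.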

As a best practice, we changed the basic thread parameters as below in the Console. This
needs to be done on all the clustered managed server nodes, including the Admin
server.
A useful way to find stuck threads: go to Console  Environment  Servers 
click each managed server (bi_serverx)  click Monitoring  Threads.

Three important parameters on this page indicate server health, including the hogging
thread count. If the hogging thread count is high, you are likely heading toward a
stuck-thread problem. An ideal managed server node shows no hogging or stuck threads
at all, whereas a node with a high count of waiting threads signals real trouble.
When the hogging thread count exceeds 20+, an error appears on the dashboard page for
all sessions already signed in, and no new session will be able to open a connection:
login attempts fail with an error.

14.6. Indexing on Tables


For Oracle BI Server database
queries to return quickly, the underlying databases must be configured, tuned, and
indexed correctly. Note that different database products have different tuning
considerations.
If there are queries that return slowly from the underlying databases, then you can
capture the SQL statements for the queries in the query log and provide them to
the database administrator (DBA) for analysis.

Using Bitmap Indexes in Data Warehouses

Bitmap indexes are widely used in data warehousing environments. The environments
typically have large amounts of data and ad hoc queries, but a low level of concurrent DML
transactions. For such applications, bitmap indexing provides:

 Reduced response time for large classes of ad hoc queries.


 Reduced storage requirements compared to other indexing techniques.
 Dramatic performance gains even on hardware with a relatively small number of
CPUs or a small amount of memory.

Benefits for Data Warehousing Applications

Bitmap indexes are primarily intended for data warehousing applications where users query
the data rather than update it. They are not suitable for OLTP applications with large numbers
of concurrent transactions modifying the data.

Parallel query and parallel DML work with bitmap indexes. Bitmap indexing also supports
parallel create indexes and concatenated indexes.

This section contains the following topics:

 Cardinality
 Using Bitmap Join Indexes in Data Warehouses
Cardinality

The advantages of using bitmap indexes are greatest for columns in which the ratio of the
number of distinct values to the number of rows in the table is small. This ratio is referred to
as the degree of cardinality. A gender column, which has only two distinct values (male and
female), is optimal for a bitmap index. However, data warehouse administrators also build
bitmap indexes on columns with higher cardinalities.
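The degree-of-cardinality rule of thumb above is easy to compute. In this sketch the 1% threshold is an assumption chosen for illustration, not a fixed Oracle rule.

```python
# Sketch: degree of cardinality = distinct values / total rows.
# Low-ratio columns (e.g. gender) are the best bitmap-index candidates.
# The 1% cutoff below is an illustrative assumption.

def cardinality_ratio(values: list) -> float:
    return len(set(values)) / len(values)

def bitmap_candidate(values: list, threshold: float = 0.01) -> bool:
    """True when the column's cardinality ratio is low enough for a bitmap index."""
    return cardinality_ratio(values) <= threshold

gender = ["M", "F"] * 500         # 2 distinct values in 1000 rows -> ratio 0.002
order_id = list(range(1000))      # unique per row -> ratio 1.0

print(bitmap_candidate(gender))   # True  -> good bitmap candidate
print(bitmap_candidate(order_id)) # False -> better served by a B-tree index
```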

Using Bitmap Join Indexes in Data Warehouses

In addition to a bitmap index on a single table, you can create a bitmap join index, which is a
bitmap index for the join of two or more tables. In a bitmap join index, the bitmap for the
table to be indexed is built for values coming from the joined tables. In a data warehousing
environment, the join condition is an equi-inner join between the primary key column or
columns of the dimension tables and the foreign key column or columns in the fact table.

A bitmap join index can improve the performance by an order of magnitude. By storing the
result of a join, the join can be avoided completely for SQL statements using a bitmap join
index

Example: Bitmap Join Index: Multiple Dimension Tables Join One Fact Table
CREATE BITMAP INDEX sales_c_gender_p_cat_bjix
ON sales(customers.cust_gender, products.prod_category)
FROM sales, customers, products
WHERE sales.cust_id = customers.cust_id
AND sales.prod_id = products.prod_id
LOCAL NOLOGGING COMPUTE STATISTICS;
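Conceptually, the bitmap join index above materializes, for each dimension value, the set of fact-table rows that join to it, so the join itself is avoided at query time. The sketch below models that idea with Python sets standing in for bitmaps; the row data is made up.

```python
# Sketch: what a bitmap join index stores. For each cust_gender value,
# the set of FACT-table rowids that join to a customer of that gender.
# Sets stand in for the compressed bitmaps Oracle actually uses.

from collections import defaultdict

customers = {101: "M", 102: "F"}                  # cust_id -> cust_gender
sales = [(0, 101), (1, 102), (2, 101), (3, 102)]  # (fact rowid, cust_id)

# Build the "index" once: gender value -> bitmap (set of fact rowids).
bitmap = defaultdict(set)
for rowid, cust_id in sales:
    bitmap[customers[cust_id]].add(rowid)

# "Sales rows for female customers" now touches only the index, no join:
print(sorted(bitmap["F"]))  # [1, 3]
```

This is why the text says the join "can be avoided completely": the equi-join result is precomputed at index-build time and merely looked up at query time.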

14.7. Usage tracking


1. The business wants to know the top n users of the application.
2. The business wants to submit application usage to federal or local governments.
3. The admin or development team wants to know the top n reports taking the longest time, etc.
4. By default, user logs are written to the NQQuery.log file.
5. If we want to insert these logs into a database table, then we need to use usage
tracking.
6. Once data is loaded into the usage tracking table (S_NQ_ACCT), we can develop user-
friendly reports.
7. Usage tracking collects usage statistics on each logical query submitted to the server.
7. Collects usage statistics on each logical query submitted to the server.
Process:
Step 1: Creating table
1. The usage tracking table is created in the DEV_BIPLATFORM schema automatically
during the OBIEE RCU installation.
2. Log in to the database as the supplier2 user.
3. Execute the SQL below:
4. DROP TABLE S_NQ_ACCT
5. CREATE TABLE S_NQ_ACCT AS SELECT * FROM DEV_BIPLATFORM.S_NQ_ACCT
WHERE 1=2
6. SELECT * FROM S_NQ_ACCT
Step 2: Importing table into RPD
1. Open RPD in offline mode
2. Right click in the Physical layer  click on new database  name it UT_DB  select
database as Oracle 11g  click on the connection pools tab  click on add  name it
UT_CP  Data source name: ORCL  Username: supplier2  Password:
supplier2  click on OK  type the supplier2 password  click on OK  again OK
3. Right click on the connection pool UT_CP  click on import  follow the wizard and
import the S_NQ_ACCT table
4. Double click on the S_NQ_ACCT table  click on the Keys tab  name it UT_KEY 
select the ID column  click on OK
Step 3: Creating business model
1. Right click in the BMM layer  new business model  name it UT
2. Drag and drop the S_NQ_ACCT table onto the UT business model  right click on the
S_NQ_ACCT table  click on duplicate  right click on the UT business model  select
business model diagram  whole diagram  join both the tables.
Step 4: Creating subject area
1. Drag and drop UT business model into presentation layer  in presentation layer delete
duplicate table and keep only original table
2. Save RPD
Step5: Set Usage tracking parameters
1. Up to version 11.1.1.5, setting these parameters is a manual process.
2. From 11.1.1.6 onward, we configure them using EM.
3. Log in to EM as the weblogic user.
4. Expand Business Intelligence  select coreapplication  click on Lock and Edit
Configuration  click on OK
5. Expand Weblogic Domain  right click on bifoundation_domain  click on System
MBean Browser
6. Expand Application Defined MBeans  expand oracle.biee.admin  expand
Domain:bifoundation_domain  expand BIDomain.BIInstance.ServerConfiguration
 click on BIDomain.BIInstance.ServerConfiguration  on the right side, observe the
usage tracking parameters
1. Configuring the ENABLE parameter
Click on 20 UsageTrackingEnabled  select the value true  click on Apply  click
on Return
2. Configuring the DIRECTINSERT parameter
Click on 19 UsageTrackingDirectInsert  select the value true  click on Apply 
click on Return
3. Configuring the CONNECTIONPOOL parameter
Click on 18 UsageTrackingConnectionPool  provide "UT_DB"."UT_CP"  click on
Apply  click on Return
4. Configuring the PHYSICALTABLENAME parameter
Click on 21 UsageTrackingPhysicalTableName  provide
"UT_DB"."supplier2"."S_NQ_ACCT"  click on Apply  click on Return
7. Click on Business Intelligence  coreapplication  click on Activate Changes
Step 6: Load RPD in OBIS
1. Point above RPD in EM and restart OBIS
2. Open RPD in online mode
3. Set weblogic user log level as 2 (optional)
4. Close RPD.
Step 7: Testing:
1. Develop any report & run it.
2. Go to the database and run SELECT * FROM S_NQ_ACCT; notice that one or more
records are available.
3. Develop another report using the UT subject area; the usage data appears there.
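Once logical queries land in S_NQ_ACCT, point 3 from the motivation list (top n slowest reports) is a simple aggregation. The sketch below models it in Python; the column name mirrors the table's TOTAL_TIME_SEC, but the sample rows are made up.

```python
# Sketch: average runtime per report from usage-tracking rows, slowest first.
# Sample rows are invented; total_time_sec mirrors S_NQ_ACCT.TOTAL_TIME_SEC.

rows = [
    {"report": "Revenue by Region", "total_time_sec": 42},
    {"report": "Quota vs Actual",   "total_time_sec": 180},
    {"report": "Revenue by Region", "total_time_sec": 58},
    {"report": "Daily Orders",      "total_time_sec": 7},
]

def top_n_slowest(rows, n=2):
    """Average runtime per report name, sorted slowest first."""
    totals, counts = {}, {}
    for r in rows:
        totals[r["report"]] = totals.get(r["report"], 0) + r["total_time_sec"]
        counts[r["report"]] = counts.get(r["report"], 0) + 1
    avg = {name: totals[name] / counts[name] for name in totals}
    return sorted(avg.items(), key=lambda kv: kv[1], reverse=True)[:n]

print(top_n_slowest(rows))  # [('Quota vs Actual', 180.0), ('Revenue by Region', 50.0)]
```

The same aggregation is what a "top n longest-running reports" analysis built on the UT subject area computes, with the BI Server generating the equivalent GROUP BY SQL.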

15. Troubleshooting
15.1. Troubleshooting using logs
15.2. Troubleshooting using BI Admin Tool
15.3. Troubleshooting using OFMW EM
15.4. Troubleshooting using WLS Admin Console
15.5. Troubleshooting using Configuration files
