
Use Essbase as an aggregate table and drill to relational

In my two previous blog posts I wrote about how to add aggregated tables into your OBI data model and how to use the Aggregate Persistence Wizard to manage your aggregates more efficiently. This is the last post in the series of three, and in it I will focus on using a MOLAP (Essbase) cube as an aggregate for your base data model.

The very first step, before you do any modelling, is to import the Essbase metadata into the BI repository. Once you've done that, you should follow the standard method of setting up the cube in the physical layer of the RPD: for example, define the Measure dimension, flatten the measures, and so on. These tasks are certainly different from those you perform when importing and modelling relational database metadata, but once they are finished the Essbase cube is practically ready to be used. There aren't many changes required in the logical and presentation layers.

And even when you want to use Essbase "only" as an aggregated store, you would use it in basically the same way as any other database. It is true that when you want to drill from Essbase to a relational database, data at the shared levels must be of the same type, and the dimension member values must match at those levels. For example, if a month is stored as "2015 - 01" in the relational database, then the month must carry exactly the same value in Essbase.
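One way to verify the relational side of this alignment is a distinct-values query against the time dimension table (the table and column names here are taken from the SampleApp schema used in the SQL log later in this post):

```sql
-- List the month values that the Essbase Months dimension must mirror.
-- Table/column names from the SampleApp schema (SAMP_TIME_DAY_D).
SELECT DISTINCT PER_NAME_MONTH
FROM   SAMP_TIME_DAY_D
ORDER  BY PER_NAME_MONTH;
-- Each returned value (e.g. '2015 - 01') must appear verbatim as a
-- member at the month level of the Essbase Months dimension, or drills
-- between the two sources will not line up.
```

You would compare this output against the member names in the Essbase outline (for example, via an outline export) before wiring the cube into the business model.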

And if this holds, then Essbase is just like any other database you would use in your data model. The technology is different, but when you model it in the BI Administration Tool, the modelling is basically the same regardless of the underlying database.

The process is the same as described in my two previous posts. Basically, measures and dimension members have to be mapped to the existing business model, and once mapped, the Essbase dimensions and measures are added as additional logical table sources in the business model.



As in the previous examples, click on one of the measures, drag it to the business model, and drop it onto the corresponding logical column. For example, in the picture above, Fixed Costs from the flattened physical measures is mapped to the Fixed Costs logical column. If you explored further, you would find that the Fixed Costs logical column now has more than one logical table source, one of which is the Essbase cube from the physical layer.

Similarly, you have to map the physical dimensions to the logical tables that act as logical dimension tables. Again, simply drag and drop the physical column onto the logical column. But be careful how you map the columns: "Member Key" columns usually map to "ID" columns, and "Default" columns map to "description" columns in the business model. You should also pay attention to the hierarchy in the physical cube.











Now you just have to save your repository, upload it to the BI Server (or reload the metadata if you are working in online mode), and test.

In our simple example, when you create the following query:
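(The query itself is shown in the screenshot. As a hedged sketch only, judging from the MDX below it selects Gross Revenue by month, customer segment, and product; the actual presentation column names in the subject area may differ.)

```sql
-- Hypothetical logical SQL for the analysis; names are illustrative
SELECT
    "Time"."Month"            ,
    "Customers"."Segment"     ,
    "Products"."Product"      ,
    "Revenue"."Gross Revenue"
FROM "Sample Sales"
```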








And when you run it, you get the standard output, as expected.



The query log reveals the following MDX:

With   
  set [_Months5]  as 'Generate([Months].Generations(2).members, Descendants([Months].currentmember, [Months].Generations(6), leaves))'  
  set [_Customers2]  as '[Customers].Generations(2).members'  
  set [_Products5]  as 'Generate([Products].Generations(2).members, Descendants([Products].currentmember, [Products].Generations(5), leaves))'
select   
  { [Measures].[Gross Revenue]  } on columns,
  NON EMPTY {crossjoin({[_Months5]},crossjoin({[_Customers2]},{[_Products5]}))} properties MEMBER_NAME, GEN_NUMBER, [Customers].[Default], [Customers].[MEMBER_UNIQUE_NAME], [Months].[Default], [Products].[Default], [Products].[MEMBER_UNIQUE_NAME] on rows from [BISAMPLE.Sample]

I believe there is no surprise here: the BI Server has generated an MDX query, which is executed against the Essbase cube. Now, if you drill down on any of the selected dimensions, say from month to week, our expectation is that the query will be a regular SQL query and the data will be retrieved from the relational database.


The log confirms our expectation, as the following SQL was generated behind the scenes:

WITH SAWITH0 AS
(select sum(T4535.REVENUE) as c1,
        T4488.SEGMENT_DESC as c2,
        T4522.PROD_DSC as c3,
        T4563.PER_NAME_MONTH as c4,
        T4563.PER_NAME_WEEK as c5,
        T4488.SEGMENT_KEY as c6,
        T4522.PROD_KEY as c7
from
        SAMP_TIME_DAY_D T4563 /* D_Time */ ,
        SAMP_PRODUCTS_D T4522 /* D_Products */ ,
        SAMP_CUST_SEGMENTS_D T4488 /* D_Customer_segments */ ,
        SAMP_CUSTOMERS_D T4491 /* D_Customers */ ,
        SAMP_REVENUE_F T4535 /* F_Revenue */
where …