Saturday, November 21, 2015

Planning (BSO) to Reporting (ASO) Replicated Partition Issue (Not Really an Issue) - Solution

A user reported that a small sheet, 15 rows by 6 columns, retrieved from the reporting cube (ASO) was taking around 6 minutes and timing out.

We tried the retrieval a couple of times from our end and found that the problem was sporadic: sometimes the sheet came back very quickly, and sometimes it took a very long time. We tried every option we could think of to fix it.

Option 1: The outline was around 200 MB, and compacting it brought the size down to 22 MB. But this didn't work.

Option 2: We cleared the aggregations (we were fairly sure this would not improve performance). It didn't work.

Option 3: We have a database in which we capture all our metadata and data loads with start time, end time, and elapsed time. We noted the times when retrievals were fast and looked in that database to see which processes had run before those retrievals.

We found that retrievals were fast right after a dimension build completed, but retrieval times then increased gradually. We have a process that runs every 10 minutes: it does currency conversion in Planning (BSO) and refreshes a replicated partition to push data to the reporting (ASO) cube. We kept retrieving after each refresh completed and could see retrieval times growing steadily. We also noticed that the Plan-to-Report process time was increasing gradually.

This identified the Plan-to-Report process as the culprit. After a lot of digging and analysis, we found that every time the partition is refreshed, a new data slice gets created in the reporting (ASO) cube. When a user retrieves a combination whose data lives in both the main slice and the incremental slices, Essbase has to check all of them and do extra internal processing to return the right numbers.
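To see how many incremental slices have piled up after repeated refreshes, MaxL can report the database's size statistics. This is a sketch from memory of the MaxL reference, so verify it against your version; the login credentials and the application/database names (Plan.Rpt) are placeholders, and on an ASO database `get cube_size_info` reports, among other things, the incremental data slices and their cell counts:

```
/* Check incremental slice build-up on the ASO reporting cube.   */
/* Credentials and app/db names are placeholders; run in essmsh. */
login admin password on myserver;
query database Plan.Rpt get cube_size_info;
logout;
```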

As the solution (it's not really an issue, but normal Essbase behavior), we merge the slices. This improved retrieval times from 6 minutes to 30 seconds. It also sped up our Plan-to-Report process and made its timings very consistent at 6 minutes, where it had previously taken anywhere between 6 and 15 minutes depending on the number of slices.

We also had to look at it from a performance standpoint to decide which type of slice merge to use:

  • Merge all slices into the main database
  • Merge the incremental slices into a single slice

Merging into the main database gave us two issues, so we dropped that option:

  • The merge took longer because we have aggregations on the cube. We would have had to drop the aggregations to speed up the merge, which was not an option
  • Even when we tried merging without the aggregations, it took more than a minute

Merging into a single slice was the way to go: since it just creates a new slice, the merge takes around 25 seconds with or without the cube aggregations.
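The two merge options map to two MaxL statements. A sketch with placeholder app/database names, recalled from the MaxL reference rather than copied from our scripts, so double-check the syntax:

```
/* Option A: fold every incremental slice into the main database   */
/* slice. This was the slower path for us because aggregate views  */
/* existed on the cube.                                            */
alter database Plan.Rpt merge all data;

/* Option B (what we chose): combine the incremental slices into   */
/* one new incremental slice, leaving the main slice untouched.    */
alter database Plan.Rpt merge incremental data;
```

Running Option B right after each partition refresh keeps the slice count at two (main plus one incremental), which is what made our retrieval and Plan-to-Report timings consistent.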

It was very difficult to identify what was making the retrievals slow, but now everything works normally.

Monday, August 10, 2015

Analytics outside of my work

My first encounter with Essbase was in 2007, when I started my career, and I never looked back. I loved working in Essbase and how different it was from other databases. With my knowledge of relational DBs, and being a quick learner, the switch-over wasn't that difficult for me.

It's not just Essbase that I love; I am in love with data and anything related to data and analytics. At work I build Essbase applications, and outside of work I analyze publicly available data. That isn't easy, because the available data is rarely complete, and that's exactly what I love: finding where the gaps are, filling them, and coming up with solutions.
If the data isn't available at all, I scrape it from websites and social networks and look at what people are talking about.
Sounds good so far, but what tools and programming languages do I use?
I mainly use R for analyzing data, as I find it comfortable and relatively simple.
I use Python when there is something I can't find in R. Mostly I have used Python where a huge amount of data is involved and R tends to give me problems, as my system is not highly configured.

I have been busy applying something of personal interest to what I am working on. I should be able to post about it soon, within a couple of days to a month.

Monday, July 27, 2015

ASO Report Optimization - Short Tip

Another short post on ASO Report Script Learning

I was working on a project with an ASO cube, and we had to extract data from it for a different team. There were multiple requirements: they wanted level 0 of a few dimensions and summary levels of the other dimensions.

I reused an existing report script from a BSO cube, updated the dimensions accordingly, and to my surprise the script ran for around 2 hours to generate a roughly 20 MB file.

I have written report scripts before and never had one run so long. It took me a while to figure out that I was using LEV; when I replaced it with LEAVES, the report script finished in 40 seconds.

I then expanded the scope of the report script and was able to generate a 400 MB file in 4 minutes.

Always use LEAVES when you need the level 0 members of a dimension in an ASO cube.
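For illustration, the change was essentially one member-selection command. The dimension name "Product" is a placeholder, and the syntax is recalled from the Essbase Technical Reference, so double-check it for your release:

```
// Slow in ASO: LEV selects the level 0 members by walking the outline
<LEV ("Product", 0)

// Fast in ASO: LEAVES is optimized to return level 0 members in bulk
<LEAVES ("Product")
```

Note that LEAVES is an ASO-only report command; on a BSO cube you still need LEV.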

Happy Learning!!!

Wednesday, January 28, 2015

Oracle EPM Released

Finally, Oracle EPM is released!!!
I wish it were, but it's just the Beta that was released, and it is available Here

It is not yet available on eDelivery, and the documentation is not available yet either. Both should be up in a couple of days, so wait for the eDelivery version.

I thought that this time, at least, I would be the first one to post, but Celvin has already blogged Here, followed by John Goodwin on the installation and configuration (Beta version) Here

I am late to the game :( Nevertheless, I will wait for the eDelivery version :)