Tag Archives: Oracle BI Suite EE

Using OBIEE against Transactional Schemas Part 4: Complex Dimensions

You guessed it… more on transactional schemas. For a quick recap, here are the posts to date:

Using OBIEE against Transactional Schemas Part 1: Introduction

Using OBIEE against Transactional Schemas Part 2: Aliases

Using OBIEE against Transactional Schemas Part 3: Simple Dimensions and Facts

Combining Entities

In the last post on Simple Dimensions and Facts, we discussed logical facts and dimensions, factless fact tables, and fake count measures. In constructing the simple dimension in that post, we saw the collapsing of normalized relationships into a single conformed dimension table when building Dim – Customer. Another important entity for reporting in the Customer Tracking application is the Contact entity. At first glance, it might seem that we should simply build another dimension called Dim – Contact, and use analyses to combine our Customer and Contact dimensions along with our Activity fact table to analyze Customer and Contact behavior. When we look again at the data model for the Customer Tracking application below, Customer and Contact are clearly two separate entities in this application, and so this probably means that we should continue to report on them this way, and structure them as such in the OBIEE Business Model and Mapping layer. Right?

Is this the way that business users think about their data? Although it makes perfect sense in a normalized data model to have separate entities for Customer and Contact, the business user likely won’t see it this way, especially when it comes to defining reporting requirements. Contacts are clearly specific instances of a Customer at a lower level in the Customer-Contact hierarchy, and our end users will expect to be able to drill cleanly from Customer on a report down to the specific Contact that is associated with a particular activity. Although OBIEE supports building hierarchies across multiple logical dimension tables so we can provide the drill-down behavior mentioned here, the end user is still confronted with two entities when building the report. This is perhaps the most challenging part of reporting against transactional schemas: deciding not to settle for the model that the OLTP designers have given us. Instead of resigning to have both Dim – Customer and Dim – Contact in our logical BMM, we should attempt to build the appropriate dimensional rendering of this model: a single Dim – Contact dimension of which Customer is simply another element in the hierarchy.

The first step in this process is to rename Dim – Customer to Dim – Contact, as it’s always best to name dimension tables according to the lowest level in the hierarchy: the declared grain of the dimension table. Then I add the physical contact tables to the Dim – Contact logical table source (renaming that as well for clarity purposes), and map all the new Contact-specific attributes to the appropriate physical table.

I also do some renaming of the Customer dimension, and include Contact Detail as the new, lowest level in the hierarchy.

I consider this merging of entities into a single logical dimension table to be more profound than our original construction of Dim – Customer in the Simple Dimensions and Facts post. Although we combined multiple normalized tables to get our original Customer dimension, those tables represented labels and descriptors, such as Customer Category and Customer Status. They didn’t represent true entities, which I argue is the distinction that exists when combining Customer and Contact. This type of high-level distinction comes into play when we try to think about our source model in dimensional terms instead of transactional ones.

An important consideration when deciding whether to build logical table sources containing ever-greater numbers of physical tables is performance. It might seem that combining the Customer and Contact entities in this way would have a negative impact on performance, because of all the tables that have to be joined to return records for the Dim – Contact dimension. Perhaps keeping the two dimension tables separate would mean less work for the database. What if our query only requires Contact or Customer attributes in the result set, but not both? Doesn’t the BI Server have to include all the joins in the logical table source (in this case six tables) every time this dimension is used in an analysis? The answer is no. The BI Server is very, very smart, and in many ways acts like a database optimizer when structuring the physical SQL for an analysis. Through intelligent request generation, it works out the smallest number of physical tables required to produce the result set for the logical query, and only includes those tables in the physical SQL. For instance, a sample analysis that returns only Customer attributes from our new Dim – Contact logical dimension only includes three tables in the SQL issued against the database.
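The generated SQL would look something like the hedged sketch below. EBA_CUST_CUSTOMERS is the real base table from the application, but the lookup table names and join columns are my assumptions about its schema:

-- Hedged sketch: only Customer-level attributes were requested, so the
-- BI Server prunes the logical table source down to three tables.
-- EBA_CUST_CATEGORIES, EBA_CUST_STATUS and the join columns are assumed names.
SELECT c.name   AS customer_name,
       cat.name AS customer_category,
       st.name  AS customer_status
FROM   eba_cust_customers  c,
       eba_cust_categories cat,
       eba_cust_status     st
WHERE  c.category_id = cat.id
AND    c.status_id   = st.id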

Building a Date Dimension

One consistent element of every BI system is the need to handle dates effectively. For this purpose, when building a conformed data warehouse, we always build a date dimension of some kind or another. Ralph Kimball describes the special place that a date (or time) dimension plays in the data warehouse. We may have to split the difference here with Kimball, as you will see later. Regardless, we usually don’t have a date dimension table in our transactional schema. If you recall from earlier posts in this series, the application code for transactional systems usually exists in APIs designed specifically for single-record “gets” and “puts”. For this reason, it’s very easy to develop a series of APIs designed to insert or return specific date attributes, and rarely do these systems deal with date conceptually as a hierarchy. It’s simply not part of what they are designed for. It’s not just dimensional models that require date dimensions: OBIEE as a product requires them in particular situations. If we want to capitalize on the built-in feature of automatic drill-down, then we have to build a logical dimension table that contains our levels: year to quarter to month to day, for example. Additionally, OBIEE’s time-series capabilities also require that our date attributes be configured in a logical hierarchy. We have two options for producing the non-existent date dimension:

  • We can build a date dimension in our transactional database, or in our foundation layer, and create aliases for this physical table in the OBIEE physical layer. We can then use these aliases in our BMM to form one or more role-playing dimension tables for our facts.
  • We can use the functionality in the OBIEE business model to logically “construct” a date dimension using a DATE attribute from the source table for the logical fact, and then use calculated attributes to flesh out the remaining level-specific attributes.

The first bullet point is an attractive option, especially if the reporting environment is a replicated version of the transactional system, or better yet, a foundation layer (described in this post), where we are able to inject additional physical structures specifically for reporting purposes. The join between the date dimension and the source table is constructed using a CAST of DATETIME to DATE for Oracle databases, which will generate the necessary TRUNC command in the join syntax. Although I recommend using DATE datatypes as the surrogate key in date dimensions, it’s not required, as date dimensions usually have a DATE datatype in them somewhere, and this column can be used as the join criteria in the physical model. Don’t be tempted to use OBIEE’s federation capabilities to pull in a date dimension from another database for this purpose. Though it may seem like the ideal solution on the surface, keep in mind that date-based filters would not be pushed down to the reporting database in this scenario, as these filters would have to be applied logically in memory by the BI Server. This produces a performance nightmare, and inevitably, the production DBA will blame OBIEE for not applying date filters on its queries.
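As a sketch of that CAST-based join (DIM_DATE and its column names are hypothetical; ACTIVITY_DATE is the real source column used later in this post), the physical SQL generated against Oracle would look roughly like this:

-- Hedged sketch: the CAST in the physical complex join becomes a TRUNC
-- in the generated Oracle SQL. DIM_DATE and its columns are assumed names.
SELECT d.calendar_year,
       COUNT(*) AS activity_count
FROM   eba_cust_cust_activity a,
       dim_date               d
WHERE  TRUNC(a.activity_date) = d.calendar_date
GROUP  BY d.calendar_year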

Though it’s tempting to dive into the first bullet point further (feel free to leave comments about this), it seems like a fairly standard approach for developers familiar with using OBIEE against traditional data warehouses. We will instead focus on the second bullet point: constructing a date dimension out of sheer nothingness. We start by building an empty dimension table called Dim – Activity Date and mapping only one single attribute from our source table containing the DATE column, in this case EBA_CUST_CUST_ACTIVITY.ACTIVITY_DATE. This will be the only attribute that we map back to a source table; all the remaining attributes will be derived columns in the BMM. The OBIEE SQL language contains a series of date conversion functions specifically designed to extract date elements from a DATE datatype. Anything that OBIEE can’t extract on its own can be extracted using the EVALUATE function to push database-specific syntax transparently down to the underlying database engine. I prefer to use the logical option Derived from existing columns using an expression as opposed to Derived from physical mappings whenever possible, because we can reuse the Dim – Activity Date dimension with other facts, and we will only have to map a single column in the logical table source: the DATE attribute.
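For example, a few of the derived logical columns might be defined with OBIEE’s built-in calendar functions like this (a sketch; the logical table and column names are illustrative, and the full set would cover every level in the hierarchy):

-- Derived logical column expressions, all built from the single mapped
-- "Activity Date" column. Names are illustrative.
YEAR("Dim - Activity Date"."Activity Date")             -- Year
QUARTER_OF_YEAR("Dim - Activity Date"."Activity Date")  -- Quarter of Year
MONTH("Dim - Activity Date"."Activity Date")            -- Month Number
MONTHNAME("Dim - Activity Date"."Activity Date")        -- Month Name
DAYOFMONTH("Dim - Activity Date"."Activity Date")       -- Day of Month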

If we choose to use physical mappings instead, we will have to map all the attributes each time we add an additional logical table source to the logical dimension table. When our date dimension is complete, we’ll have a complete logical date dimension with accompanying dimension (hierarchy) ready to be used for drill-down, time-series, etc.

When we are all done with the BMM mapping, we can see how the BI Server generates the physical SQL against an Oracle Database. As we saw earlier, the BI Server doesn’t need to join to the EBA_CUST_CUST_ACTIVITY table more than once, even though it’s now part of the logical table source for three logical tables: Dim – Customer Activity, Dim – Activity Date and Fact – Customer Activity.

In the next post, we’ll describe Complex Fact Tables… and see some really, really cool stuff. I promise!

Update on my OBIEE / Exalytics Books

Just a quick update on my two recently published books, “Oracle Business Intelligence Developers Guide” and “Oracle Exalytics Revealed”, both of which are now available to purchase around the world in printed and electronic formats. Here are some links to the two books on Amazon.com, Amazon.co.uk, Google Play and the Apple iTunes Store.


The main Oracle Business Intelligence 11g book has had some great, five-star reviews on Amazon.com and Amazon.co.uk, with comments such as:

  • “To put it straight: Many years will pass before we will witness another masterpiece like this. This book is epic and a total MUST for anyone who works with OBIEE 11g” (5-star review on Amazon.co.uk)
  • “It has taken a long time for a good OBIEE 11 book to come out, but well worth the wait. This is not just a “warmed over” documentation book, this book has “Meat” and is a must buy for all OBIEE 11g developers. This is one book that covers it all” (5-Star Review on Amazon.com)
  • “Although it is applicable to all developers, the people who would benefit most from this book are 1) developers who are new or relatively new to Oracle BI 2) developers who are experienced with 10g and are looking to upgrade to 11g and 3) developers experienced in one or two areas (such as dashboards) and are now looking to expand their knowledge. It would also be useful to team members such as Tech Team Leads, Solution Architects and Operations staff who are not full-time Oracle BI developers but who still need to understand the concepts and “lingo”” (5-Star Review on Amazon.com)
  • “If you’ve followed OBIEE during its evolution from Siebel to Oracle, then you have indubitably heard of Mark, and have likely read something he’s written: a blog, a magazine article, etc. Therefore, you know there is no better source in the world for a book on this subject.” (5-Star Review on Amazon.com)

I’ve also recorded a podcast about the book with Oracle’s VP in charge of Oracle Business Intelligence product management, Paul Rodwick, which can be accessed in MP3 format from here. In the podcast we talk about how I ended up writing the book, my favourite chapters and why I chose Oracle Business Intelligence 11g to write about – it’s about 10 minutes in length and available for download now.

I’d encourage you particularly to take a look at the standalone “Oracle Exalytics Revealed” ebook in Kindle and Apple iBooks format – this was an experiment by McGraw Hill (Oracle Press) and myself to see if shorter, focused ebooks might be popular – we’ve taken the existing Exalytics chapter from the main book (which was based on the initial 11.1.1.6 release) and then extended it, covering 11.1.1.6.2 BP1 and adding additional content around testing approaches, management using Integrated Lights-Out Management, and the new Presentation Server features aimed at Exalytics that became available with the 11.1.1.6.2 BP1 patchset (trellis charts etc). The Exalytics book is only $9.99 and is a great complement to the main book, or just something you’d buy if you’re contemplating buying an Exalytics server. Here are a couple of review comments, also from Amazon.


  • “Nowhere will you find a more comprehensive look at Oracle Exalytics from start to finish than in this book … extremely thorough and easy to follow … and I find this narrative incredibly useful to explore the product.” (5-Star Review on Amazon.com)
  • “The book … delivers what it promises on its title, a thorough review on the most important aspects of this engineered system created by Oracle. Either if you are working with Exalytics or if you are going to do an implementation in the near future, you should have this book around as it’s packed with valuable information in a much easy way to digest than the official Exalytics Documentation” (5-Star Review on Amazon.co.uk)

The sample data for the main book is also now available from the Oracle Press / McGraw Hill website along with a sample chapter, whilst Amazon and Apple also offer sample pages from the book if you want to try it out electronically.

Finally, a bit more news about our training courses – going forward, we’ll be giving away a copy of the book to every attendee of our OBIEE 11g courses in the UK, USA and India, and to clients who hire us to deliver training for their team on-site in their offices. Full details of our training courses can be found on our Training page; the key benefit to trainees is that the examples in the book are based around the same ones we used for our OBIEE 11g hands-on labs, so you’ll be able to read up on areas you’re particularly interested in after the course finishes, and practice the examples using the downloadable sample data that comes with the book.

Using OBIEE against Transactional Schemas Part 3: Simple Dimensions and Facts

This could be the longest series of blog posts in Rittman Mead history… not in number of posts (this is only the third), but in length of time from start to finish. I don’t think that’s a record anyone is actively pursuing, nor am I proud to (possibly) be the record holder, so my apologies to those of you waiting anxiously for each new installment. To reset… I’m discussing using OBIEE to report against transactional schemas. I started with an introduction, and followed up with a few words on aliases. Now I’d like to discuss a general approach to defining logical fact and dimensional tables in the OBIEE Business Model and Mapping layer.

Finding logical tables buried away in a highly normalized model is really as simple as taking a business view of the data. With Customer data, for instance, business users don’t think about (or even understand) normalized models with separate entities for Customer, and Customer Category, and Customer Type, and Customer Status, etc. To the business users, these are all attributes of a single entity: Customer. If we think about the BMM layer in OBIEE as a way to present a unified model to a business user – or at least to a basic user with no knowledge of the underlying 3NF model used for running the business – then we should ask ourselves a simple question: how would a business user want to see this data?

So our job is to undo the brilliant normalization work performed by the transactional data modelers, and combine all of these separate entities into a single, conformed dimension table or fact table. In its purest form, I am describing nothing more than real-time ETL, because the logic applied in the BMM is the same sort of logic we would use in an ETL tool to construct facts and dimensions in the physical world.
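To make the real-time ETL analogy concrete, here’s a hedged sketch of the sort of statement an ETL tool might use to materialize the conformed dimension physically; the lookup table names and join columns are my assumptions, and in the BMM this logic exists only as logical table source mappings rather than as a physical table:

-- Hedged sketch of the physical-world equivalent of the BMM mappings.
-- EBA_CUST_CATEGORIES, EBA_CUST_STATUS and the join columns are assumed names.
CREATE TABLE dim_customer AS
SELECT c.id     AS customer_key,
       c.name   AS customer_name,
       cat.name AS customer_category,
       st.name  AS customer_status
FROM   eba_cust_customers c
LEFT   JOIN eba_cust_categories cat ON cat.id = c.category_id
LEFT   JOIN eba_cust_status     st  ON st.id  = c.status_id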

Thinking only about logical dimension tables for a moment, we start with the most basic and granular entity that composes the heart of the logical dimension, and start joining outward to include all the other ancillary entities needed to form a complete picture of that dimension. With the Customer Tracking application, the logical place to start is the Dim – Customer logical dimension, and the most basic table to use as a starting point is EBA_CUST_CUSTOMERS. This table contains the main attributes of Customer: the name, the address, etc. So I’ll construct a logical table source and start with the EBA_CUST_CUSTOMERS table, and then I’ll add additional physical entities using the (+) button, as depicted below.

Multiple tables in a single logical table source

Once we have all the required tables joined into the logical table source, we can start mapping the physical columns to our desired logical columns, using either the Column Mapping tab in the logical table source, or by simply clicking and dragging physical columns over onto the logical dimension table.

Once all the columns have been mapped to one of the entities in our logical table source, we can then start building the hierarchy, or dimension in OBIEE terms, that defines the relationship among levels in the dimension. When we are reporting against a pure star schema, this kind of information doesn’t exist in the table definitions, though hopefully it exists in the project documentation. However, when reporting off a transactional schema, we have a leg-up here. Although our logical dimension table presents as a single, conformed entity, our physical model is still 3NF, and a 3NF model gives us some hints regarding hierarchies, as those relationships are usually normalized out into separate tables. All in all, the process of building the hierarchy is no different from pure star schemas, because the logical dimension table shields us from the complexity of navigating multiple physical tables to define our hierarchy.

Shifting gears for a moment and thinking about fact tables, the process for mapping simple logical fact tables is at its core very similar to the process for logical dimension tables. When our source system has transactional data with many measures, such as transaction amount, transaction quantity, etc., then defining the fact table is relatively simple, even though the logical table source for the logical fact table may contain a join across multiple source entities, similar to how we built the simple dimension table above. A common scenario for this is with a Sales Order, where there is a Sales Order Detail table and a Sales Order Header table.

The first approach I want to demonstrate is not as easy as the Sales Order example; it involves a more operational reporting requirement, and how to arrange our logical fact table when we really have no measures at all. In this scenario, what we have is a sort of relationship table, where we require a fact table, but the purpose of that fact table is only to form a bridge between various dimensional tables. Ralph Kimball calls this form of relationship table a factless fact table. As Kimball explains, the event is the fact, and he uses the example of the student attendance fact, where the only thing that is being recorded in the fact is the presence of a particular student, in a particular class, with a particular professor, on a particular day, etc. For the Customer Tracking application, we want to report on customer activities stored in the EBA_CUST_CUST_ACTIVITY table:

The Activity Table

This table stores our basic CRM events: phone calls, meetings, delivered proposals, etc. But true to Kimball’s factless fact pattern, this table doesn’t have any explicit facts such as amount sold, quantity sold, etc. We need a way to model these activities in the semantic layer in a way that makes sense to the BI Server, and what makes sense to the BI Server is the presence of at least one measure. This measure is required to provide the BI Server with the ability to generate aggregate queries involving SUM, AVG, etc. When we build a factless fact table in a data warehouse, we typically load each row with the appropriate surrogate keys to the dimension tables, along with a single, manufactured measure, which is usually a value of 1.
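In DDL terms, a physical factless fact of this kind might look like the hypothetical sketch below: nothing but surrogate keys, plus the manufactured measure.

-- Hypothetical warehouse-style factless fact: surrogate keys only, plus a
-- manufactured measure that is always 1. All names are illustrative.
CREATE TABLE fact_customer_activity (
  customer_key       INTEGER NOT NULL,
  contact_key        INTEGER NOT NULL,
  activity_date_key  INTEGER NOT NULL,
  activity_type_key  INTEGER NOT NULL,
  activity_count     INTEGER DEFAULT 1 NOT NULL
)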

We use this value as the activity count, and we can use the measure as a basis for aggregate functions across different dimensions of the data. In building a logical factless fact table, we need to do the same thing. So we start with a logical source for the fact table, even though it doesn’t have any actual measures we will use. In the case of the Customer Tracking application, our relationship table, or factless fact, will be based on EBA_CUST_CUST_ACTIVITY, which is a table with a single record for each activity that we want to record for a customer. Once we have defined the physical table for our logical table source, we map the source column for the Activity Count measure, which in our case will always be a value of 1:

Once we have the new “fake” measure, we can set the aggregation rule, in this case SUM:
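With the SUM rule in place, an analysis of activity counts by activity type would be pushed down roughly as in the sketch below; EBA_CUST_CUST_ACTIVITY and EBA_CUST_ACTIVITY_TYPE are the real tables, but the join and name columns are my assumptions:

-- Hedged sketch: the constant 1 becomes SUM(1) in the generated SQL.
-- Join and name columns are assumed.
SELECT t.name AS activity_type,
       SUM(1) AS activity_count
FROM   eba_cust_cust_activity a,
       eba_cust_activity_type t
WHERE  a.activity_type_id = t.id
GROUP  BY t.name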

It’s also common to have interesting textual attributes existing in the same physical table that we are using as the source for our logical fact table. In our case, this is the EBA_CUST_CUST_ACTIVITY table, and the attribute that we are interested in is called SUMMARY. This attribute holds a description of the activity that has been performed. Although it’s debatable from a pure analytic perspective whether this attribute has value, it certainly does from an operational perspective, especially as an attribute when drilling down to the detail level of a report. When deciding what to do with this attribute, and how to make it available to the end user, we have a series of options. The first decision is a logical one: where do we put the logical column? We could put it in the logical fact table as a non-aggregating attribute, meaning an attribute that doesn’t participate in aggregate functions, but instead forms part of the GROUP BY clause. Although I think this is usually the wrong placement, it is the solution I see more than any other. It is difficult to educate an end user about aggregating and non-aggregating attributes in the same table, and some users will never fully get the distinction. The appropriate solution, then, is to place these interesting textual attributes, such as SUMMARY, in their own logical dimension table, even though the attributes exist in the “fact table”, so to speak. We can see the SUMMARY column, which has been placed in the logical dimension table Dim – Customer Activity and renamed to Description. I also add an additional table, EBA_CUST_ACTIVITY_TYPE, to provide an Activity Type attribute for use in the logical dimension table.

The reason we put this attribute in its own dimension is the importance of separating attributes from measures. Even though OBIEE supports delivering GROUP BY attributes directly in a logical fact table, that doesn’t mean we should use that feature. Placing the attribute in its own dimension also gives us the ability to reuse that dimension table as a conformed dimension with other logical fact tables that may themselves have their own Description attribute that needs to be exposed, as Venkat describes in his post on modeling degenerate dimensions.

Once we have decided to place the Description attribute in the logical dimension table called Dim – Customer Activity, instead of placing it in Fact – Customer Activity, we now have a decision to make about how to construct the logical table source. The common approach I see is to construct two aliases of the source table, in this example EBA_CUST_CUST_ACTIVITY: one to use as the dimension component of the table, and the other to use as the fact component. Even though I often see this approach, I still haven’t figured out why it is common. The preferred approach is to use the same physical source, either a single alias or a single physical table, as the source for both the dimension logical table source and the fact logical table source.

To compare these two approaches, I build the analysis using Dim – Customer, Dim – Customer Activity, and Fact – Customer Activity.

Using the two-alias approach, we would get a self-join to the EBA_CUST_CUST_ACTIVITY table: once as a dimension, and once as a fact:

When using the same physical source for the fact table and the dimension, our report would display the same results, but we would get a single scan against the EBA_CUST_CUST_ACTIVITY table. The BI Server understands that the separation of fact and dimension attributes here is purely logical, and is capable of rendering the correct results with the fewest number of physical scans.
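Side by side, the two approaches look roughly like this (a hedged sketch; the ID join column is an assumption):

-- Two-alias approach (sketch): the same physical table is scanned twice,
-- once for the dimension attributes and once for the fact measure.
SELECT d.summary,
       SUM(1) AS activity_count
FROM   eba_cust_cust_activity d,
       eba_cust_cust_activity f
WHERE  d.id = f.id
GROUP  BY d.summary

-- Single-source approach (sketch): one scan serves both logical tables.
SELECT a.summary,
       SUM(1) AS activity_count
FROM   eba_cust_cust_activity a
GROUP  BY a.summary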

Oracle ACE Directors Nordic Tour 2012 : Venues and BI Presentations


Next week I’m honoured to be taking part in the Oracle ACE Directors Nordic Tour, taking in Copenhagen, Stockholm, Oslo and Helsinki over four days. Organized by Heli Helskyaho (Finnish Oracle Users Group) and Sten Vesterli (Danish Oracle Users Group) along with the leaders of the Norwegian and Swedish User Groups, and with the kind support of the OTN and the ACE Director program, we’re running four conference days in four countries covering all of Oracle tech from database through BI to ApEx and Fusion Middleware.

As well as myself, the tour features Carl Dudley (University of Wolverhampton), Julian Dontcheff (Accenture Finland), Alex Nuijten (AMIS), Tanel Poder (Enkitec) and Sten Vesterli (Scott/Tiger). Each country event will have a different agenda selected by the local user group, with my own sessions covering a wide range of topics from ODI through OBIEE to Fusion Middleware and Endeca. Each presenter is an expert in their field, all proceeds go to the local user groups, and many of the events have drinks, dinner or other social meet-ups after the main sessions end.

Here are the dates and locations of each event in the tour, along with the sessions I’ll be delivering (links are to the full agenda for each event):

Copenhagen, Denmark, October 23rd 2012

  • Oracle Endeca Information Discovery for Oracle BI and DW Developers

Stockholm, Sweden, October 24th 2012

  • Integrating Oracle Business Intelligence with Oracle Fusion Middleware: Tips and Techniques
  • High-Speed, In-Memory Big Data Analysis with Oracle Exalytics

Oslo, Norway, October 25th 2012

  • Integrating Oracle Business Intelligence with Oracle Fusion Middleware: Tips and Techniques
  • Oracle Endeca Information Discovery for Oracle Business Intelligence/Data Warehouse Developers

Helsinki, Finland, October 26th 2012

  • Introduction to ODI for Oracle DW developers
  • Oracle Endeca Information Discovery for Oracle Business Intelligence/Data Warehouse Developers

As well as using slides I’ll also be demonstrating OBIEE 11.1.1.6.2 and Exalytics, along with Endeca Information Discovery, ODI, WebCenter and Oracle R Enterprise. To get everything working in an integrated way I’ve put together the “mother of all demo environments”, running all of these products (apart from EID) in the same Oracle WebLogic Server domain, managed by the same EM instance, running on my 16GB RAM MacBook Pro with an SSD disk – fingers crossed all is working together properly, so I should be able to demo all the integration features working together.


That’s it for now – next stop, Copenhagen.

ADF View Objects as a Repository Data Source using OBIEE 11.1.1.6 and JDeveloper 11.1.1.6

The Application Development Framework (ADF) within Oracle Fusion Middleware 11g provides a data abstraction technology called ADF Business Components, which provides a Java object layer over database SQL queries. Since the 11g release of OBIEE, the Oracle BI Repository can access ADF Business Components through what are called “view objects”, giving you the ability to create repository subject areas using the same Java data abstraction objects that your Fusion Middleware applications use (which is how Oracle have put together the new Oracle Transactional BI element of the BI Apps 11g), as shown in the diagram below.


In the following examples I’ll be using Oracle JDeveloper 11g along with OBIEE 11.1.1.6, both on Windows, to connect OBIEE’s BI Repository to an ADF application in order to use it as a data source. Let’s start by opening up JDeveloper 11g and creating a new Fusion Web Application (ADF), so that we get a Model project for developing the ADF Business Components, and a ViewController project for providing some web pages for testing. Stepping through the various wizard pages and accepting the default choices, I then end up with a basic ADF Fusion Web project, looking like this:


In this first example we’ll create a couple of ADF business components that provide firstly, a view over the SH.PRODUCTS table, and then another view that combines this table with the SH.SALES and SH.CUSTOMERS tables to create a set of data suitable for pivot tables or graphs. Let’s start though by navigating to the Application Resources pane and adding a new JDBC connection through to the SH schema in an Oracle database.


Now we can create our first ADF Business Component, to expose the SH.PRODUCTS table using an ADF view object. To do so, I right-click on the Model project within the application and then select ADF Business Component > Entity Object, then step through the wizard pages, selecting the database table to base the object on, and using the options at the end to create view objects for these entity components, with attributes for each of the columns in the source table.


I then repeat this for the SH.CUSTOMERS, SH.COUNTRIES and SH.SALES tables, creating ADF business component entity and view objects for each one, and then create one more view object that joins the customer and countries view objects into a single, combined view object. Once this is done, my ADF project has a set of business components listed in the Projects panel and the connection through to the database in the Application Resources panel, and if I double-click on the AppModule application module that provides access to all of these business components, you can see the joins that are automatically put in place to link customers to sales, products to sales, countries to customers and so on.


The AppModule application module object is the data model through which OBIEE will access the ADF view objects, and if you look carefully at the screenshot, you might notice that all of the joins (or “view links”) created automatically by JDeveloper point the opposite way to what we’d need for OBIEE – nothing actually joins from the sales table through to the dimension tables; instead, the joins go the other way. As we’ll see later on, joins set up in this way will cause us a problem when we import this data source into the BI Repository. So, as an extra step for this phase, I’m going to define a set of additional view links that join the SalesView view object through to the ProductsView and CustomersAndCountries view objects, by right-clicking on the SalesView object, selecting File > New > Business Tier > View Link, and then defining a view link that joins the Sales view to the Products view, and another that joins the Sales view to the CustomersAndCountries view, so that my Application Module looks like this once I add all the view objects to the Data Model panel on the right:


The list of view objects in the Data Model panel are what the BI Administration tool will import metadata for when we connect to the ADF application later on, and as you can see there’s many references to the various underlying database tables within the list. We’ll see later on what this means in terms of which items we eventually work with, and why we just carried out that additional step to create joins from the sales to the dimension tables (view objects).

So far we’ve made a couple of adjustments to the ADF application to make it suitable for using as an OBIEE data source (creating joins between the fact and dimension sources that point in the right direction), and the process of making this ADF application’s view objects available as a BI Repository data source involves a few more steps; we’ll need to install a special shared library on the WebLogic server that hosts the ADF application, and we’ll also need to make a couple more changes to the ADF application to work with this library. As such, at a high-level enabling OBIEE view object data access for a particular ADF application involves the following steps:

  • Deploy the special shared library “OBIEEBroker” to the WebLogic domain that hosts the application, which the ADF application then uses to expose its data to the BI Repository
  • Configure the ADF application’s Application Module to use a JDBC datasource, then create a JDBC datasource of that name in the OBIEE WebLogic domain
  • Within the ADF application, add a servlet that uses the OBIEEBroker shared library to expose the view objects to OBIEE’s BI Server
  • Redeploy the ADF application to the application server, now with the new servlet, and then use the BI Administration tool to connect to this and import the view objects’ metadata into the physical layer of the BI Repository.

When it’s all set up, the data access path from OBIEE through to the view objects in the ADF application will look like the diagram below, with the BI Server connecting to the OBIEEBroker servlet to get access to the ADF view objects within the application.

To keep my example simple, I’m going to use the WebLogic domain that hosts OBIEE to then host the ADF application, but bear in mind that in real-life, the WebLogic Server instance that comes with OBIEE is limited-use, meaning that you can’t use it to host real production applications unless you convert the license to a full-use license.

Deploying the OBIEEBroker shared library to the WebLogic domain (which also needs to contain the Oracle Application Core (Webapp) and Oracle JRF libraries, both included by default with OBIEE 11.1.1.6’s WebLogic domain) is typically done through the WebLogic Administration Console, with the oracle.bi.integration.adf.ear file that contains the library found at [middleware_home]\Oracle_BI1\bifoundation\javahost\lib\obisintegration\adf.


The next step is to create a configuration entry for the application module in the ADF application that provides access to the view objects, so that the module can also use a JDBC datasource defined in the WebLogic domain to connect to the underlying source database (by default the module just uses a direct JDBC URL connection, but OBIEE needs it to use a proper, defined JDBC connection hosted in WebLogic). Make a note of the name you give this datasource (for example, java:comp/env/jdbc/SH_ConnectionDS) as you’ll need the JNDI name element of this (jdbc/SH_ConnectionDS) later on when creating the datasource using WebLogic Server Administration Console.


We’ll create the JDBC datasource on the WebLogic server shortly, but for now we need to do some configuration work within the ADF project to enable data access from OBIEE using the OBIEEBroker shared library. The first step is to take the ADF business component definitions within the Model project and bundle them up into a business component archive, which then gets deployed into a couple of JAR files that then provide access to the components for other modules in the final application.


So at this point we’ve taken the application’s business components and packaged them up into JAR files for reusability, and now we’re going to create the servlet that will use these business components and provide access to them via the OBIEEBroker shared library.

To do this, I create a new project within the application in JDeveloper, choose the Web Project option, and then choose Servlet 2.5\JSP2.1 as the servlet type. The context root that you assign to the servlet at this point defines the application name by which the BI Server will access the application, as opposed to its usual name (defined by the context root assigned to the ViewController module).


Next we need to configure this new project to include a reference to the business components archive JAR files we created earlier, so that it can make use of these data access components (which is the whole purpose of this servlet, obviously).


Then we have to edit the web.xml configuration file within the project to register the servlet provided by the OBIEEBroker shared library, replacing the entries within the <web-app></web-app> element. In the example I’ve used below, I’ve disabled the restriction that makes these servlets available only to a set of named application roles, by entering DISABLE_BI_WHITELIST_ROLE_CHECK as the value of the application role whitelist parameter; if you want to keep this restriction, you can list out the application roles between the <param-value></param-value> tags within the <context-param></context-param> element.

<?xml version = '1.0' encoding = 'windows-1252'?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
         version="2.5" xmlns="http://java.sun.com/xml/ns/javaee">
  <context-param>
  <description>This holds the Principals (CSV) that a valid end user should 
have (at least one) in order to query the ADF layer from BI.</description>
  <param-name>oracle.bi.integration.approle.whitelist</param-name>
  <param-value>DISABLE_BI_WHITELIST_ROLE_CHECK</param-value>
</context-param>
<filter>
  <filter-name>ServletADFFilter</filter-name>
  <filter-class>oracle.adf.share.http.ServletADFFilter</filter-class>
</filter>
<filter-mapping>
  <filter-name>ServletADFFilter</filter-name>
  <servlet-name>OBIEEBroker</servlet-name>
  <dispatcher>FORWARD</dispatcher>
  <dispatcher>REQUEST</dispatcher>
</filter-mapping>
<servlet>
  <servlet-name>OBIEEBroker</servlet-name>
  <servlet-class>oracle.bi.integration.adf.v11g.obieebroker.OBIEEBroker</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>OBIEEBroker</servlet-name>
  <url-pattern>/obieebroker</url-pattern>
</servlet-mapping>
</web-app>

Once complete, your web.xml file should look like the screenshot below, and you’re now at the point where you can deploy the updated ADF application, with this new data access servlet, to the target application server.


There are now just two more configuration steps to do with the ADF application. The first creates a deployment profile for this updated version of the application, and includes a reference to the new web app and the OBIEEBroker servlet, whilst the second registers a dependency on the two key WebLogic shared libraries that provide the main ADF data access framework, and the specifics required for OBIEE to access these ADF view object data sources. Note that the first of these two libraries (oracle.applcore.model) seemed to be missing in the 11.1.1.5 version of OBIEE, meaning that we could never get this feature to work, but it’s back in 11.1.1.6 and seems to work fine.


Now we’re at the point where we can deploy the updated ADF application to the WebLogic domain, which in this case is the one also used by our OBIEE 11.1.1.6 installation. Before we do this though, we need to create a JDBC datasource with a name corresponding to the JNDI name for the datasource specified in the new Application Module configuration defined earlier, which in the case of this example was “jdbc/SH_ConnectionDS”, connecting through to the SH sample schema. This is done through the WebLogic Administration Console, with the final JDBC datasource deployed to all servers in the domain and listed alongside any other JDBC datasources within the domain.


Now you can deploy the application to the WebLogic server. Right-click on the global application and select Deploy, then choose the deployment profile you created earlier. When you come to choose the targets to deploy to, make sure you only deploy to those servers that have got the oracle.applcore.model and oracle.bi.integration.adf shared libraries deployed to them – on my installation, oracle.applcore.model is only deployed to the managed server cluster, and you’ll get a deployment error if you try and deploy the application to the administration server as well.


So now we can move over to the BI Administration tool, and try and connect to this new ADF View Object data source. There’s a couple of bits of information we’ll need to create the connection:

  • The Application Module name that provides access to the view objects, which in our case was called AppModule and needs to be referenced using its fully qualified Java package name, which is “my.sampleADF.app.model.AppModule”
  • The Application Module Configuration name, the extra configuration that we added previously to the application module that referenced the new JDBC datasource we just configured using WebLogic Administration Console. In this example, the configuration name is “AppModule_JDBC_DS”
  • The JDBC Data Source name that the Application Module configuration uses (or you can leave blank if you just want to use the default JDBC datasource for that application module, which is what we’ll do)
  • The URL to access the application’s OBIEEBroker servlet, which in this case is http://10.10.10.10:9704/SimpleADFAppVO/obieebroker
  • A username and password to connect to the application server; in this case, weblogic/welcome1

Let’s open up the BI Administration tool now and create a new connection through to the ADF application, using these details. Select File > Import Metadata, choose OracleADF_HTTP as the Connection Type, and then enter in the connection details. Once complete, press Next to proceed to the screen where you select the view objects whose metadata you wish to import; in most cases, there will be several views with similar names, so just select one of each, then press Next to proceed. On subsequent pages you can use some new (experimental, to my mind) features which map these objects into a business model, configure the repository for the BI Apps 11g and so on – ignore these and just press Finish to complete the import.


At this point your repository’s physical model will have a number of references to ADF view objects, sharing the same name but with numbers after them. So which ones do you import, and why are there so many?

It comes back to the Application Module that we created earlier on, with its multiple references to view objects caused by the view links (joins) between them. When the customers view object links to the sales view object, a reference to each is created in the application module, and another reference to the sales view object is created when the products view object references it. Crucially though, OBIEE considers these two sales view objects to be different physical layer tables, so you can’t use them to join both products and customers to the same sales source. This is why we added joins from the sales view object through to products, and through to the combined customers and countries view object: it gives us a single reference to sales that then joins out to the dimension source objects. So, when we now match the view object references in the application module to what’s been imported into the BI Repository physical layer, it’s easy to pick out the imported objects that will become the data source for our business model, as these are the ones that, when checked, have the right joins set up between them.


So now we can create the business model, and then the presentation model, for this data, defining the default aggregation for the two measures as I do so. When you try and validate the business model though, you’ll notice that the joins are back to front – OBIEE thinks the sales table is the dimension table whilst the two dimensions are facts. This is because the join I set up in JDeveloper makes the sales table the master table in the master/detail relationship. To fix this, I go into the business model and re-join the sales logical table to the two dimension logical tables, this time setting the direction of the join properly. There may be a more efficient way of doing this, and if anyone knows ADF better than I do, please feel free to suggest how to set up the joins in JDeveloper so that they don’t need this last bit of adjustment (I think it comes down to how you set the 1:* cardinality of the join, but I’d be interested to hear if there’s a way to avoid creating these additional joins in the first place).


At this point if you check global consistency for the repository you’ll hit two more issues, one you need to fix and one that we’ll ignore for a moment. The view objects that represent fact table sources (SalesView1, 2 and 3) don’t have primary keys defined for them but the BI Administration tool insists you do, so create them as appropriate (in this case, a composite primary key based on the product and customer IDs); the other warning is about a SQL Bypass Database but we’ll cover that in a moment. Once done, save your repository, place it online if it’s not already, and we’ll try out a query.


If everything’s set up correctly you should get some numbers back, as per the screenshot above. What you’ll notice, though, is that… it’s really slow. The reason why will become apparent when you look in the query log file: what you’ll see is the logical SQL query coming through, and then the call to the ADF application to retrieve the required attributes. However, the ADF application doesn’t perform any aggregation, instead sending back all of the detail-level rows and having the BI Server perform the aggregation itself.

]]
[2012-10-14T09:52:22.000+00:00] [OracleBIServerComponent] [TRACE:2] [USER-0] [] [ecid: 45c97e564b475761:22a69728:13a5bf84463:-8000-0000000000001621] [tid: 14d4] [requestid: 42220011] [sessionid: 42220000] [username: weblogic] ############################################## [[
-------------------- SQL Request, logical request hash:
49b4d059
SET VARIABLE QUERY_SRC_CD='Report';SELECT
   0 s_0,
   "Sales History from ADF"."AppModule.CustomersAndCountriesView2"."CustCity" s_1,
   "Sales History from ADF"."AppModule.ProductsView2"."ProdCategory" s_2,
   "Sales History from ADF"."AppModule.SalesView3"."AmountSold" s_3
FROM "Sales History from ADF"
ORDER BY 1, 2 ASC NULLS LAST, 3 ASC NULLS LAST
FETCH FIRST 65001 ROWS ONLY


]]
[2012-10-14T09:52:22.000+00:00] [OracleBIServerComponent] [TRACE:2] [USER-23] [] [ecid: 45c97e564b475761:22a69728:13a5bf84463:-8000-0000000000001621] [tid: 14d4] [requestid: 42220011] [sessionid: 42220000] [username: weblogic] -------------------- General Query Info: [[
Repository: Star, Subject Area: Sales History from ADF, Presentation: Sales History from ADF

]]
[2012-10-14T09:52:22.000+00:00] [OracleBIServerComponent] [TRACE:2] [USER-18] [] [ecid: 45c97e564b475761:22a69728:13a5bf84463:-8000-0000000000001621] [tid: 14d4] [requestid: 42220011] [sessionid: 42220000] [username: weblogic] -------------------- Sending query to database named my.sampleADF.app.model.AppModule_AppModule_JDBC_DS (id: <<5118>>), connection pool named Connection Pool, logical request hash 49b4d059, physical request hash 28c5caab: [[
<?xml version="1.0" encoding="UTF-8" ?>



<ADFQuery queryid="11942-4827" locale="en">

<Parameters>
</Parameters>
<Projection>
			<Attribute><Name><![CDATA[CustCity]]></Name><ViewObject><![CDATA[AppModule.CustomersAndCountriesView2]]></ViewObject></Attribute>
			<Attribute><Name><![CDATA[ProdCategory]]></Name><ViewObject><![CDATA[AppModule.ProductsView2]]></ViewObject></Attribute>
			<Attribute><Name><![CDATA[AmountSold]]></Name><ViewObject><![CDATA[AppModule.SalesView3]]></ViewObject></Attribute>
</Projection>
<JoinSpec>
	<ViewObject>
		<Name><![CDATA[AppModule.CustomersAndCountriesView2]]></Name>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesCustomersAndCountriesStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.CustomersAndCountriesView2]]></ComplexJoinName>
		</ViewLink>
	</ViewObject>
	<ViewObject>
		<Name><![CDATA[AppModule.ProductsView2]]></Name>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesProductStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.ProductsView2]]></ComplexJoinName>
		</ViewLink>
	</ViewObject>
	<ViewObject>
		<Name><![CDATA[AppModule.SalesView3]]></Name>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesCustomersAndCountriesStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.CustomersAndCountriesView2]]></ComplexJoinName>
		</ViewLink>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesProductStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.ProductsView2]]></ComplexJoinName>
		</ViewLink>
	</ViewObject>
</JoinSpec>
</ADFQuery>

]]

Why is this? It’s because the ADF application isn’t designed to respond to aggregation requests from the BI Server (function push-down, as we refer to it), so any functions required by the BI Server have to be performed in memory after the ADF application returns all of the data required for the function.

The way around this is to define an SQL Bypass Database for the connection pool used by the ADF data source in the repository, one that points to the same physical database that the ADF view objects connect to. When you configure an SQL bypass database in this way, the BI Server extracts the SQL behind the ADF view objects when it connects to the ADF application, wraps any aggregation and other functions around that SQL, and sends it directly to the view objects’ database, bypassing ADF and running much more efficient queries.
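Conceptually, the rewritten query looks like this hedged, simplified sketch (the real version, taken from the actual query log, appears further below and also inlines the CUSTOMERS-to-COUNTRIES join):

-- Simplified sketch of the SQL bypass query: the view objects' SQL is
-- inlined as a subquery, and the aggregation is wrapped around it.
SELECT cust_city,
       prod_category,
       SUM(amount_sold) AS amount_sold
FROM   (SELECT c.cust_city, p.prod_category, s.amount_sold
        FROM   sales s, customers c, products p
        WHERE  s.cust_id = c.cust_id
        AND    s.prod_id = p.prod_id)
GROUP  BY cust_city, prod_category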


To define an SQL bypass database, first create a connection to it within the Physical layer of the repository, defining the physical database and a connection pool setting. Then, edit the connection pool for the ADF physical database and select this database as the SQL bypass database, like this:


Once you’ve set the SQL bypass database for the connection pool, you can re-run the query again, and this time it should return results within a few seconds. If you look at the query log again, you’ll see this time a reference to an SQL bypass database being defined, and the actual SQL query sent to this database, based on the SQL used to define the view objects used in the query.

]]
[2012-10-14T10:37:57.000+00:00] [OracleBIServerComponent] [TRACE:2] [USER-23] [] [ecid: 45c97e564b475761:22a69728:13a5bf84463:-8000-0000000000002114] [tid: 1754] [requestid: 5a130001] [sessionid: 5a130000] [username: weblogic] -------------------- General Query Info: [[
Repository: Star, Subject Area: Sales History from ADF, Presentation: Sales History from ADF

]]
[2012-10-14T10:37:57.000+00:00] [OracleBIServerComponent] [TRACE:2] [USER-18] [] [ecid: 45c97e564b475761:22a69728:13a5bf84463:-8000-0000000000002114] [tid: 1754] [requestid: 5a130001] [sessionid: 5a130000] [username: weblogic] -------------------- Sending query to database named my.sampleADF.app.model.AppModule_AppModule_JDBC_DS (id: SQLBypass Gateway), connection pool named Connection Pool, logical request hash 2415f206, physical request hash 8b5cc1ff: [[
<?xml version="1.0" encoding="UTF-8" ?>



<ADFQuery mode="SQLBypass" queryid="24464-5705" locale="en">

<Parameters>
</Parameters>
<Projection>
			<Attribute><Name><![CDATA[CustCity]]></Name><ViewObject><![CDATA[AppModule.CustomersAndCountriesView2]]></ViewObject></Attribute>
			<Attribute><Name><![CDATA[ProdCategory]]></Name><ViewObject><![CDATA[AppModule.ProductsView2]]></ViewObject></Attribute>
			<Attribute><Name><![CDATA[QuantitySold]]></Name><ViewObject><![CDATA[AppModule.SalesView3]]></ViewObject></Attribute>
			<Attribute><Name><![CDATA[AmountSold]]></Name><ViewObject><![CDATA[AppModule.SalesView3]]></ViewObject></Attribute>
</Projection>
<JoinSpec>
	<ViewObject>
		<Name><![CDATA[AppModule.CustomersAndCountriesView2]]></Name>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesCustomersAndCountriesStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.CustomersAndCountriesView2]]></ComplexJoinName>
		</ViewLink>
	</ViewObject>
	<ViewObject>
		<Name><![CDATA[AppModule.ProductsView2]]></Name>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesProductStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.ProductsView2]]></ComplexJoinName>
		</ViewLink>
	</ViewObject>
	<ViewObject>
		<Name><![CDATA[AppModule.SalesView3]]></Name>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesCustomersAndCountriesStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.CustomersAndCountriesView2]]></ComplexJoinName>
		</ViewLink>
		<ViewLink>
			<Name><![CDATA[AppModule.SalesProductStarFK1]]></Name>
			<ComplexJoinName><![CDATA[AppModule.SalesView3_AppModule.ProductsView2]]></ComplexJoinName>
		</ViewLink>
	</ViewObject>
</JoinSpec>
</ADFQuery>

]]
[2012-10-14T10:37:58.000+00:00] [OracleBIServerComponent] [TRACE:2] [USER-18] [] [ecid: 45c97e564b475761:22a69728:13a5bf84463:-8000-0000000000002114] [tid: 1754] [requestid: 5a130001] [sessionid: 5a130000] [username: weblogic] -------------------- Sending query to database named ADF VO SQL Bypass DB (id: <<1106>>), connection pool named SHConn, logical request hash 2415f206, physical request hash 1ecc951f: [[
WITH 
SAWITH0 AS (select T46492.C304898130 as c3,
     T46492.C100315219 as c4,
     T46492.C426174690 as c6,
     T46492.C111943045 as c7
from 
     (SELECT V11235427.CUST_CITY AS C304898130,         V423473769.PROD_CATEGORY AS C100315219,         V295179950.QUANTITY_SOLD AS C426174690,         V295179950.AMOUNT_SOLD AS C111943045,         V295179950.ROWID AS PKA_RowID0,         V11235427.CUST_ID AS PKA_CustId0,         V423473769.PROD_ID AS PKA_ProdId0 FROM (SELECT Sales.PROD_ID,         Sales.CUST_ID,         Sales.TIME_ID,         Sales.CHANNEL_ID,         Sales.PROMO_ID,         Sales.QUANTITY_SOLD,         Sales.AMOUNT_SOLD,         Sales.ROWID FROM SALES Sales) V295179950, (SELECT Customers.CUST_ID,         Customers.CUST_FIRST_NAME,         Customers.CUST_LAST_NAME,         Customers.CUST_GENDER,         Customers.CUST_YEAR_OF_BIRTH,         Customers.CUST_MARITAL_STATUS,         Customers.CUST_STREET_ADDRESS,         Customers.CUST_POSTAL_CODE,         Customers.CUST_CITY,         Customers.CUST_CITY_ID,         Customers.CUST_STATE_PROVINCE,         Customers.CUST_STATE_PROVINCE_ID,         Customers.COUNTRY_ID,         Customers.CUST_MAIN_PHONE_NUMBER,         Customers.CUST_INCOME_LEVEL,         Customers.CUST_CREDIT_LIMIT,         Customers.CUST_EMAIL,         Customers.CUST_TOTAL,         Customers.CUST_TOTAL_ID,         Customers.CUST_SRC_ID,         Customers.CUST_EFF_FROM,         Customers.CUST_EFF_TO,         Customers.CUST_VALID FROM CUSTOMERS Customers, COUNTRIES Countries WHERE Customers.COUNTRY_ID = Countries.COUNTRY_ID) V11235427, (SELECT Products.PROD_ID,         Products.PROD_NAME,         Products.PROD_DESC,         Products.PROD_SUBCATEGORY,         Products.PROD_SUBCATEGORY_ID,         Products.PROD_SUBCATEGORY_DESC,         Products.PROD_CATEGORY,         Products.PROD_CATEGORY_ID,         Products.PROD_CATEGORY_DESC,         Products.PROD_WEIGHT_CLASS,         Products.PROD_UNIT_OF_MEASURE,         Products.PROD_PACK_SIZE,         Products.SUPPLIER_ID,         Products.PROD_STATUS,         Products.PROD_LIST_PRICE,         Products.PROD_MIN_PRICE,         Products.PROD_TOTAL,         Products.PROD_TOTAL_ID,         Products.PROD_SRC_ID,         Products.PROD_EFF_FROM,         Products.PROD_EFF_TO,         Products.PROD_VALID FROM PRODUCTS Products) V423473769 WHERE V295179950.CUST_ID = V11235427.CUST_ID AND V295179950.PROD_ID = V423473769.PROD_ID) T46492),
SAWITH1 AS (select sum(D1.c6) as c1,
     sum(D1.c7) as c2,
     D1.c3 as c3,
     D1.c4 as c4
from 
     SAWITH0 D1
group by D1.c3, D1.c4)
select D1.c1 as c1, D1.c2 as c2, D1.c3 as c3, D1.c4 as c4, D1.c5 as c5 from ( select distinct 0 as c1,
     D1.c3 as c2,
     D1.c4 as c3,
     D1.c2 as c4,
     D1.c1 as c5
from 
     SAWITH1 D1
order by c3, c2 ) D1 where rownum <= 65001

]]

Apart from using ADF view objects as a data source for OBIEE, you can also use the BI Repository as a data source for ADF business components, and you can combine the two in an ADF application to return, say, actuals numbers from an ADF application with forecast numbers coming from Essbase via the BI Repository. For now though, what are the conclusions from the above exercise?

Firstly – it works, which is more than can be said for the 11.1.1.5.x releases of OBIEE, which for some reason broke the link between ADF and the BI Server such that, whilst metadata import worked, you couldn’t actually retrieve any data. Once you know what you’re doing it’s reasonably straightforward to do (the docs detail the setup process well), but it took me a long while to work out what was wrong with the joins between the view objects and how to make them work properly with OBIEE. The SQL bypass database feature is essential, as otherwise, unless you’ve got a very small dataset behind the ADF application, it’ll just run too slowly. But the key takeaway for me is that you need to specifically enable this feature for your ADF application, through the OBIEEBroker web application and the changes to the VOs and the Application Module, before it’ll work. Nothing too complicated or dramatic, but something to bear in mind when considering using ADF applications as a data source.

Well that’s it for me for a while now around OBIEE; for a few months now I’ve actually been working more with Oracle Data Integrator than OBIEE, so I’ve got a bunch of blog posts in mind about running ODI in a complex, high-availability environment with release management and deployment requirements; check back in a while for some new posts around this different type of topic.