Tag Archives: Oracle BI Suite EE

Using DAC11g to Replicate BI Apps DW Tables into TimesTen

A couple of months ago Peter Scott and I wrote a series of posts on using Exalytics with the Oracle BI Apps, with the idea being to use Exalytics’ in-memory database cache to speed up BI Apps dashboards. We looked at a number of approaches, including using OBIEE11g’s Summary Advisor to replicate commonly-used BI Apps aggregates into TimesTen, whilst Pete looked at the more manual route of using TimesTen’s ttimportfromOracle utility, which I expanded on a couple of weeks later. But there’s another way to replicate data into TimesTen that doesn’t involve the Summary Advisor, scripts or indeed anything outside of a regular BI Apps installation – the DAC itself, which in its 11g release has a feature for replicating tables into the TimesTen in-memory database. So how does this work, and is it a viable solution for moving your BI Apps data warehouse in-memory?

As a bit of background for anyone new to Exalytics and TimesTen but who knows the BI Apps, there are two typical use-cases for TimesTen when used in the context of OBIEE:

  • As an in-memory aggregate cache, automatically populated and managed by OBIEE’s Summary Advisor, as part of the Exalytics hardware and software package
  • As a regular database but stored in-memory, used to hold detail-level data warehouse data plus any aggregate tables you choose to add in

Strictly speaking, the second use-case can be done outside of Exalytics, but you’ll most probably need the TimesTen table compression feature, only available within Exalytics, if you want to store any significant volume of data in TimesTen. In fact, given the typical size of data warehouses, including the BI Apps data warehouse, it’s unlikely that you’ll ever get your whole dataset into TimesTen as it has to store everything in-memory, so you then have to choose one of the following:

  • Just store aggregates in TimesTen, which is the route Oracle went down with Exalytics
  • Only try this with data marts or other small data warehouses
  • Put just the “hot” data from your data warehouse into TimesTen – for example, the last couple of years’ data across the data warehouse, or a particular fact table and its dimensions

It’s this third approach that the new DAC11g is most suited to, with functionality to mark one or more BI Apps DW tables as “in-memory”, then replicate them into TimesTen using an ETL engine actually built into the DAC, so there’s no need to add new Informatica routines or otherwise customise your main load routine. It does, however, come with some significant limitations, some of which you can work around but others you’d just need to live with. Let’s go through what’s involved though, using BI Apps 7.9.6.4 and DAC 11g as currently downloadable from OTN.

DAC 11g introduces a number of new concepts in this release, and the one we’re interested in for this example is the “external executor”. This allows the DAC to use ETL engines other than Informatica to do work for it, so you could use ODI, for example, to do some ELT-centric work in support of a wider Informatica load if that made more sense. Another type of external executor you can register, though, is called “Data Copy”, and this is a special additional data movement feature that’s actually built into DAC 11g and uses parallel Java processes to copy data from one table to another. The TimesTen replication feature in DAC 11g uses this to move data from the BI Apps data warehouse into TimesTen tables, so let’s register this Data Copy external executor before we do anything else, by switching to the Setup view within the DAC console and creating a new entry within the External Executors section.

Notice the ODI 11g Embedded Agent option that’s also there, but for now I give the external executor a name, press the Generate button to create the parameters for this particular external executor type, then review the default settings for the parallel data load engine. DAC 11g executes the data load using multiple Java JDBC calls that each take a subset of the source table’s data and copy it into the target table, and you’ll most probably need to fiddle around with these settings for your own environment, to get the best mix of concurrent data load vs. overall capacity vs. avoiding TimesTen table locks.
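
To make the mechanism a little more concrete, here’s a rough sketch of what each parallel copy thread is conceptually doing – purely an illustration of ranged copying between the two connections, not the DAC’s actual internal code, and the bind variable names are mine:

    -- Illustration only: each worker reads one slice of the source table over
    -- JDBC from the Oracle data warehouse connection...
    SELECT *
    FROM   W_AR_BALANCE_F
    WHERE  ROW_WID BETWEEN :range_start AND :range_end;

    -- ...and then batch-inserts that slice into the identically-named table
    -- over the TimesTen connection (one JDBC bind placeholder per column;
    -- only three shown here).
    INSERT INTO W_AR_BALANCE_F VALUES (?, ?, ?);

The number of workers, batch sizes and so on are what the external executor parameters above control.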

Now we can go and mark some tables as being “in-memory”, ready for replicating them into TimesTen using this new feature.

Space is of course at a premium with TimesTen, so it makes sense to start off with a small source BI Apps DW table and try replicating that, to get a sense of the space it takes up and how that might extrapolate to the wider source data set. W_AR_BALANCE_F in my sample BI Apps data warehouse has just 62 rows in it, and it joins to a number of dimension tables, one of which (W_PROFIT_CENTER_D) has just 162 rows of data in it. So let’s start with these two, and see where things go.

To select tables for replicating into TimesTen, switch to the Design view, then use the Tables tab to locate the tables, and check the In Memory checkbox, like this:

I then do the same for the W_PROFIT_CENTER_D table, with the DAC Console creating copies of the table metadata in the new source system container I’d set up prior to this exercise.

For each of these tables, I also need to define the datatype lengths and precisions of each column. To do this, with each of the two tables selected in turn, click on the Columns tab and then navigate to the In Memory Length and In-Memory Precision columns, and either copy across the default values for each column, or use a utility such as ttimportfromOracle (if your data warehouse source is the Oracle Database) to first generate optimal datatype lengths, then copy those settings into the in-memory column definitions.
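
As a hypothetical example of why this matters (the column names and sizes below are made up for illustration, not taken from the real BI Apps schema), a column declared generously in the Oracle data warehouse can usually be declared far more tightly in TimesTen once you know the longest value actually stored, which in turn reduces the memory each row consumes:

    -- Oracle DW definition (generous defaults):
    --   ROW_WID              NUMBER
    --   PROFIT_CENTER_NAME   VARCHAR2(240)
    --
    -- Possible TimesTen definition once the actual data has been profiled
    -- (for example with ttimportfromOracle) and the In Memory Length and
    -- In-Memory Precision values set accordingly:
    CREATE TABLE W_PROFIT_CENTER_D_EXAMPLE (
        ROW_WID              NUMBER(10)    NOT NULL,
        PROFIT_CENTER_NAME   VARCHAR2(30)
    );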

Now that the TimesTen tables are defined in the DAC repository, I need to set up a connection from the DAC to the TimesTen server. To do so, I perform the following steps:

1. Ensure TimesTen 11.2.2.4+ (ideally 11.2.2.5+) is installed, either with the TimesTen server on the machine running OBIA, or with the server installed somewhere else and the TimesTen client installed on the OBIA server.

2. Copy across the TimesTen library files from $TIMESTEN_HOME/lib to $DAC_HOME/dac/lib – I just copied and pasted all of the .jar and .lib files from the TimesTen /lib directory into the DAC one.

3. If you’re running the DAC server on a 64-bit machine, and you’ve got 64-bit TimesTen installed, alter the DAC config file so that it points to a 64-bit JVM rather than the default 32-bit one that gets installed with the DAC client – I pointed it to the one that ships with 64-bit OBIEE 11.1.1.7, as also installed on this server.

4. Create the required TimesTen client ODBC DSN on your OBIA server, and the TimesTen Data Manager server DSN if you’ve got both the TimesTen server and client on your OBIA machine.

5. Go into the DAC client, select Setup, then the Physical Data Sources tab, then create a new data source that points through to the TimesTen Client ODBC DSN.

At the end of these steps you should have an additional physical datasource connection registered in the DAC console, one that successfully connects through to your TimesTen database, like this:

Now you need to create a logical data source that will later be mapped to this physical connection. To do so, from the DAC Client select Tools > Seed Data > Logical Data Sources, then create a new logical data source called DBConnection_TimesTen.

To create the data copy tasks, I do the following:

1. Switch to the DAC Client Design view, select the Tables tab, press Query, check the In Memory checkbox, then press Go. The list of tables marked as in-memory, to be replicated into TimesTen, is then displayed. In this case, it’s the W_AR_BALANCE_F table and the W_PROFIT_CENTER_D one.

2. Right-click anywhere in this list of tables, and select Generate Data Copy Tasks. When prompted, select the All Records in the List radio button.

3. When the Generate Data Copy Tasks dialog is shown, select the following values:

Primary Logical Source: DBConnection_OLAP
Primary Logical Target: DBConnection_TimesTen
Task Phase: Post Load

Note that the Task Phase can be changed to suit where you want the data copying to happen in a real-life project, and that you can check the Enable Incremental Data Copy checkbox if you want to update just the new and changed data in your TimesTen tables – see the online docs on this feature for more details. Once done, press OK to have the DAC Client create the data copy tasks for you.

You’ll also need to create the corresponding TimesTen tables, a task you can also do from the DAC Client. With the same set of In Memory-flagged tables displayed, this time select the Generate DW Table Scripts option, like this:

When the Generate DW Table Scripts dialog is shown, select TimesTen as the Physical Data Source Type, check the Execute checkbox, and then select the tt_dac option for the Physical Data Source, like this:

The DAC Client will now generate the TimesTen table DDL scripts and execute them, to create the tables to hold your replicated DW data. Assuming all is well, the Generate DW Table Scripts dialog should show a success message like this:

You should then repeat this to create the DW Index scripts, which the DAC Client will base on the indexes on the original BI Apps DW tables. As I talked about in my blog post on indexing TimesTen tables a few weeks ago, you should follow up this replication process with the TimesTen Index Advisor, which generates further index recommendations based on the actual query workload on your system, and can dramatically improve the performance of queries against TimesTen tables compared to what you’d get with the default DAC-generated indexes.
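
If you want to try the Index Advisor against these replicated tables, the basic pattern run from ttIsql (or any SQL tool connected to TimesTen) looks something like the following – the built-in procedure names are from the TimesTen 11.2.2 Index Advisor, but the parameter values are just an assumption on my part, so check the TimesTen documentation for the capture level and mode that suit your setup:

    -- Start capturing index advice (the two arguments select connection- vs
    -- database-level capture and the capture mode -- values assumed here).
    call ttIndexAdviceCaptureStart(0, 0);

    -- ...run a representative set of dashboard queries against TimesTen...

    -- Stop the capture and retrieve the recommended CREATE INDEX statements.
    call ttIndexAdviceCaptureEnd(0);
    call ttIndexAdviceCaptureOutput(0);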

Now back to the data copy tasks. These new tasks now need to be added to an existing, or a new, subject area so that you can then include them in an execution plan. In this instance, I’m going to create a new subject area for the tasks, then add the new tasks to this subject area – in reality, you might choose to include these tasks in with your existing DAC subject areas, so that the TimesTen replication happens as the rest of the load takes place, but in my example I’m going to keep this replication separate from my main data load to keep things simple. To create this subject area and add the data copy tasks to it, I do the following:

1. From the DAC Client, select the Design view, then the Subject Areas tab. Press New to create a new subject area and call it Data Copy Tasks, then press Save.

2. With the subject area still open for editing, select the Tasks tab, press the Add / Remove button, then type in “DataCopy*” into the Name field and press Go. The two data copy tasks should then be displayed. Press the Add All button to add these tasks to the right-hand list of selected tasks, then press OK.

Now it’s a case of either adding this subject area to an existing DAC execution plan or creating a new execution plan just for this data copy subject area. I’m going to do the latter.

1. Switch to the Execute view, select the Execution Plans tab, then press New, and call the new execution plan DataCopy Tasks, and press Save.

2. Now with that execution plan still open for editing, switch to the Subject Areas tab and use the Add / Remove button to add the new subject area you just created to this execution plan.

3. Now switch to the Connectivity Parameters tab and press the Generate button. Once the parameter generation process completes, select the following values for the two parameters:

DBConnection_OLAP: DataWarehouse
DBConnection_TimesTen: tt_dac

This connects the tasks’ logical data source connections to the physical ones, including the physical TimesTen database connection I created earlier in the process.

4. Finally, for this stage, press the Build button above the list of execution plans with this one still selected, to create the list of ordered tasks for this new execution plan.

Now we’re ready to run the data copy process, and replicate these two DW tables into the TimesTen database. To do so, again within the Execute view and with the new execution plan selected, press Run. Once the execution plan completes, you should see it listed under the Run History tab, hopefully with a success message next to it.

Finally, if you move over to SQL*Developer and create a connection through to the TimesTen database, you should be able to see the replicated data in these two tables.
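
A quick sanity check at this point is to compare row counts between the source tables and their TimesTen copies – something like this, run over the TimesTen connection:

    -- Row counts should match the source BI Apps DW tables quoted earlier.
    SELECT COUNT(*) FROM W_AR_BALANCE_F;      -- expecting 62 rows
    SELECT COUNT(*) FROM W_PROFIT_CENTER_D;   -- expecting 162 rows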

So there you have it – replication of BI Apps DW data into equivalent TimesTen tables, all handled by the DAC and its internal JDBC-based ETL engine. But … what happens next, and how do you get these tables into the BI Apps repository and make them available to the BI Apps dashboards, so that they can be used in place of the regular Oracle tables to speed up queries? Well – you’ll have to wait for the follow-up…. ;-)

New Feature in OEM Cloud Control 12cR3 – Exalytics Server Management

A few months ago I wrote a series of posts about the new 12cR2 release of Oracle Enterprise Manager Cloud Control, which re-instated the BI Management Pack used for monitoring OBIEE 11g systems. Since then, the 12cR3 update for Cloud Control has been released, and one of the new features it added to the BI Management Pack was support for adding Exalytics systems and monitoring them as a whole. So how does this new feature work, and what’s involved in setting it up?

If you read my previous post on registering host systems for use with the BI Management Pack, you’ll find the process for registering an Exalytics system is more or less the same, except that there’s now a specific target type of “Exalytics System” that can be used with the “Add Targets using Guided Process” registration step. In this instance, I’m using EM12cR3 to register an Exalytics server that’s running OBIEE 11.1.1.7 and the rest of the Exalytics v1.0.0.5 stack, including Essbase and TimesTen.

Then, assuming the Exalytics server doesn’t currently have the EM Management Agent already installed, one is installed on the Exalytics server automatically by the EM Management Server, so that the Exalytics server can then be monitored at a hardware and software level from the main Enterprise Manager web console.

Then, you can either use the auto-discovery feature in EM to “sniff-out” the various middleware components on the Exalytics server, or you can register them manually, this time registering the WebLogic domain on the Exalytics server and automatically picking-up all of the OBIEE components on the server.

Essbase Server gets picked up as part of the discovery process for the OBIEE Fusion Middleware components, but TimesTen has to be separately registered through a plug-in that you have to download into your EM environment. I covered the TimesTen plug-in download, registration and deployment process back in my earlier post on EM12cR2 and it’s the same with 12cR3, with the TimesTen plugin deployed first to the Oracle Management Server in the EM environment, and then to the Management Agent on the Exalytics server. Once you’ve registered both the OBIEE elements and the TimesTen elements for your Exalytics server, you can see them together as part of the overall Exalytics “system”, as shown in the screenshot below.

The key addition here in 12cR3 is that Exalytics is now a first-class EM target type, and each Exalytics server is registered as an EM “system”, grouping together all of the hardware and software elements into a single management group.

Some basic host (server) monitoring comes out-of-the-box with the BI Management Pack, and there are additional plug-ins and management packs you can use to get further insights into the hardware side, ILOM and so on.

Then, for each of the OBIEE11g and TimesTen elements, you can drill further into their various monitoring and management pages, just as you would with the 12cR2 version of Cloud Control and the BI Management Pack.

As I mentioned before, registering Exalytics Server targets is part of the new 12cR3 release of Enterprise Manager Cloud Control and is part of the pay-extra BI Management Pack, which also gives you additional insights and management features for OBIEE and Essbase. All versions of Exalytics can be registered and managed by EM using this feature, but earlier (pre v1.0.0.3) releases will need a configuration file added to their servers to enable the EM discovery process to recognise them as proper Exalytics servers. All in all, it’s not a bad extra feature for the BI Management Pack, and it brings Exalytics into line with the other Oracle Engineered Systems in terms of EM target registration – and it’s also a pre-requisite if you’re planning to virtualise Exalytics using OVM and the “trusted partitions” licensing scheme.

Using Non-Oracle, and Cloud-Hosted Databases with OBIEE 11g

The vast majority of our customers use Oracle Database as the underlying data source for OBIEE 11g, sometimes on Exadata but almost always the Enterprise Edition of the database. OBIEE typically sells best into customers that already extensively use Oracle products, and it’s often bundled in with a wider technology or apps deal including database and middleware products. But for some customers, the cost of the Oracle Database as a data source can be prohibitive, and in many cases they’re also looking at purchasing and deployment options that don’t involve upfront perpetual license deals and running the software on-premise. One of our longest-standing customers came to us with just such a scenario a few weeks ago, and so I thought it’d be interesting to take a look at what alternatives there are to licensing Oracle Database Enterprise Edition, and whether running these alternatives in the cloud was a practical option.

To set the context: imagine that you wanted to deploy a small-to-medium sized data warehouse on two servers:

  • A “production” server with 64GB RAM, 4 x 4-core Intel x86_64 CPUs and running Oracle Linux 6 (for licensing purposes, 8 processors)
  • A “dev/test” server with the same spec, onto which we’d install two separate databases

For now we’ll ignore the ETL element, but if we were licensing Oracle Data Integrator Enterprise Edition, we’d need to license it for these two servers too at around $23k/processor.

So what are the alternatives to Oracle Database Enterprise Edition as a data source? To my mind, there are three main options:

  • Stay with Oracle, but drop down to one of the cheaper editions of the database (SE or SE1)
  • Move to a non-Oracle RDBMS – either a general-purpose open-source database such as PostgreSQL, or an “analytical” database such as Greenplum or Vertica
  • Run the database in the cloud, for example Oracle Database on Amazon RDS, Postgres Plus Advanced Server on AWS, or a cloud-native analytic database such as Amazon Redshift

Let’s take a look at the Oracle options first of all. Based on 8 processors (4 x 4 Intel x86_64 cores, x0.5 multiplication factor), the cost of licensing Oracle DB EE + commonly-used database options based on the current Tech Price List would be as follows (at list price):

  1. Oracle Database Enterprise Edition ($47.5k x 8 = $380k)
  2. Partitioning Option ($11.5k x 8 = $92k)
  3. EM Tuning Pack ($5k x 8 = $40k)
  4. EM Diagnostics Pack ($5k x 8 = $40k)

Giving us a total of $552k for 8 processors (covering both of our servers). Add onto that the standard 22% support and software updates, and you’re looking at an initial payment of $673k for the perpetual license, and an ongoing $121k/year for annual maintenance. Factor in a DBA and the hardware as well, and you’ll probably not get much change out of $800k in year one, and $250k in the years afterwards.

You could switch down to one of the cheaper editions of Oracle Database, which don’t come with the add-on packs and could take you down to $5.8k/processor for SE1 or $17.5k for SE, but you’ll then have to do without most of the high-end data warehousing features in the Oracle database, and work within socket restrictions when it comes to SE1. Now of course you get what you pay for with Oracle Database, including potentially lower ongoing admin costs because much of the tuning and optimisation work is either unnecessary or automated, but if price is the issue and the alternative is a cut-down version of Oracle, what else is on offer – and that works with OBIEE?

If you start to look around at the alternatives to Oracle, and assuming you’re discounting products like IBM DB/2 or Microsoft SQL Server, you’ll find there are two main “conventional” types of RDBMS that you could use with OBIEE:

  1. General-purpose RDBMSs, such as MySQL and PostgreSQL
  2. “Analytical” databases, typically in-memory, column-store and shared-nothing, such as Greenplum and Vertica

If you look under the covers with the analytical databases, in most cases you’ll find that the core RDBMS technology they use is in fact PostgreSQL, with the (open-source) code then forked and adapted to work with the particular technology angle each vendor chooses to emphasise. Most developers tend to think that PostgreSQL is a more “enterprise-scale” free RDBMS than MySQL, and so given that it also provides the core DB technology in the other products, how might this work with OBIEE?

As a bit of background, PostgreSQL is an open-source database that’s of a similar vintage to Oracle Database, and you can download it for free or install it as part of most Linux distributions. Many developers swear by PostgreSQL itself as being all they need for data warehouse scenarios, but a couple of vendors have taken the PostgreSQL code and commercialised it, adding their own management tools and support options around the core open-source code. The most famous of these is EnterpriseDB, who package up PostgreSQL into a number of options, one of which is called Postgres Plus Advanced Server. If you’re looking for an open-source based, low-cost but mature RDBMS engine for your database, this is more or less the flagship option, so I set about creating a new OBIEE 11.1.1.7 VM for testing purposes, with the aim of using Postgres Plus Advanced Server as my data source.

In fact, the first issue you hit when trying to go “non-Oracle” for the database is that you need an Oracle, SQL Server or IBM DB/2 database to install the RCU schemas into. PostgreSQL isn’t an option, but in reality most customers using OBIEE will have a small Oracle database available somewhere, so I cheated here and installed the RCU schemas into a spare Oracle database, and then went on with the install.

Installing Postgres Plus Advanced Server itself was pretty straightforward, with the installer separately downloading the core PostgreSQL files and the Apache HTTP server used for its web-based admin tools. On Windows, once you’ve installed the software there’s a new Start Menu entry with the EnterpriseDB-specific (value-add) tools, including a console-based Enterprise Manager-type tool called pgAdmin. 

Opening up pgAdmin gives you an environment that’s similar to what you’d find in tools such as TOAD, SQL*Developer and the old Java thick-client based Enterprise Manager, and using the tool I was able to connect to my PostgreSQL database, create a few tables, and run some test SQL queries.
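
The tables and queries themselves were nothing special – just throwaway objects to prove the database works end-to-end, something along these lines (the table and column names here are made up for illustration):

    -- Create a simple test table, load a couple of rows, and query it back.
    CREATE TABLE sales_test (
        sale_id   serial PRIMARY KEY,
        product   varchar(50),
        amount    numeric(10,2)
    );

    INSERT INTO sales_test (product, amount) VALUES ('Widget', 100.00);
    INSERT INTO sales_test (product, amount) VALUES ('Gadget', 250.00);

    SELECT product, SUM(amount) AS total_sales
    FROM   sales_test
    GROUP  BY product;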

It’s pretty basic stuff compared to Oracle’s Enterprise Manager product, but it may be all you need if you’re looking for a tactical or departmental solution. EnterpriseDB price Postgres Plus Advanced Server at just under $5k/processor/year for the core database (one processor = one socket, however many cores), with their Enterprise Manager product costing $1,500/year for two monitored hosts, and their “Solution Pack” another $5k/year, so based on the same server sizing as we used for the Oracle costings, the price of an EnterpriseDB solution looks a bit like this:

  1. Postgres Plus Advanced Server ($5k x 8 = $40k)
  2. Enterprise Manager ($1.5k)
  3. Solutions Pack ($5k)

Coming out at around $47k/year for the two servers. Now the Oracle licenses were perpetual not annual, but even so, the total cost of the on-premise PostgreSQL solution is about 1/3rd of just the annual maintenance cost of the Oracle software, let alone the software license cost, so it’s considerably cheaper. So what don’t you get with PostgreSQL – either core, or commercialised via EnterpriseDB, that you get with Oracle?

Now as a disclaimer, I’m mainly an Oracle person and can’t really speak with any great depth on PostgreSQL, but the major things that are missing in PostgreSQL compared to Oracle Database 11g are:

  • Proper, in-built support for parallel query – PostgreSQL has an open-source GridSQL project but this looks more like a federated, shared-nothing solution rather than PQ within a single server
  • Built-in ETL in the form of Oracle Warehouse Builder
  • Anything like Oracle’s built in MOLAP server, Advanced Analytics and the like
  • All the DR options, security options and so forth

And no doubt the core DW features – bitmap indexing, cost-based optimiser, support for VLDBs and so on – are not nearly as developed, and easy to manage, as with the Oracle database. But – given that not all customers need these features, and that many customers even on 10g and 11g are still using the database as if it were Oracle 7 or 8i – lack of such features may not be a show-stopper if money is tight.

Now so far we’ve been talking about running PostgreSQL “on-premise”, more or less as a direct substitute for Oracle Database; however, it’s also possible to run Postgres Plus Advanced Server on Amazon’s AWS cloud platform, with no local install of the software, simplified administration, but access to the same core Postgres Plus Advanced Server features, billed at an hourly rate, giving you an architecture like the diagram below:

EnterpriseDB and AWS provide a free 24-hour trial, so I set up another database, this time on the Amazon cloud, provisioned it and created the same set of tables.

Connecting to the cloud-based Postgres Plus database was the same as connecting to a regular, on-premise one, and in fact this is more-or-less just commercial PostgreSQL on Amazon AWS. Instead of charging for software and support up-front as with regular Oracle or on-premise Postgres Plus Advanced Server, you’re instead charged, AWS-style, an hourly rate based on the size of instance – one comparable to one of the Oracle servers mentioned beforehand comes in at $6.48/hour or $56k/year, or $112k for the two servers we’d need, still overall less than even the Oracle software maintenance charge but covering hosting as well (though you’ll need to pay Amazon storage and bandwidth charges on top, too).

Connecting OBIEE 11g to these two databases is also fairly straightforward – the Data Direct Postgres ODBC drivers that come with OBIEE work fine, with the connection process involving creating an ODBC DSN to the Postgres database either on-premise or on AWS, then importing the tables into the BI Repository and modelling them as normal. Connecting to the AWS cloud as a data source worked fine and wasn’t laggy, but you’d obviously need to test it at greater volume and with a realistic network connection. But – it works and seems to perform OK, at least with three rows…

So something like Postgres Plus Advanced Server in the cloud would probably appeal to a customer with a temporary requirement, a departmental budget, or an infrastructure that’s already in the cloud, most probably on Amazon AWS. But you can also run Oracle Database on Amazon’s AWS cloud as well, through a partnership between Oracle and Amazon’s RDS (“Relational Database Service”). In this case, you go to the Amazon AWS website, sign-up for the service, put in your credit card details and within a few minutes, you’ve got a provisioned, Standard Edition One database ready to connect to via all the usual tools. And what this is, again, is the standard Oracle Database (or with EnterpriseDB’s offering, Postgres Plus Advanced Server) but hosted on AWS, charged hourly, with two main options for pricing:

  • A license-included option (Oracle Database Standard Edition One) at $3.14/hour for a reasonably-specced instance, coming out at about $27k/year, or $54k for the two servers we’d need
  • A “bring-your-own licence” (BYOL) option, where you provide the Oracle licenses and Amazon AWS hosts it, at around $19k/year for the same size instance, $38k/year for the two servers we’d need

So if you work on the assumption that Oracle Database SE1 is roughly equivalent to Postgres Plus Advanced Server in terms of DW features, it’s not a bad option, and probably a better one if you’re already heavily invested in Oracle and have lots of skills there. The BYOL option is really just about hosting, but at $38k/year to host the two servers, back them up and so on, it’s not a bad option compared to buying some servers and putting them somewhere.

So, with Amazon RDS or Postgres Plus Advanced Server in the cloud, if you’re happy to take the hit on features and scalability it’s possible to bring your database costs down considerably, which may be an option when your project first starts off, with the database platform reviewed in a couple of years’ time to see whether it’s worth trading up to Oracle Database Enterprise Edition, or bringing the whole thing back in-house. But Postgres and Oracle aren’t the only players in the analytical database market – there are other on-premise vendors such as Greenplum and Vertica, who’ll sell you a full-priced, Oracle-comparable database and who also have cut-down community editions available for single-node installs, and there are new players such as Amazon again, who are bringing out cloud-native analytic databases based on instance sizes and hourly charging – so let’s look at one of them, Amazon’s “Redshift”.

Amazon Redshift is actually based on an analytic database product called ParAccel, a column-store database in the same vein as Vertica but with major bits cut out in its Amazon Redshift form. That said, it’s a lot more of an interesting starting point than core PostgreSQL or Oracle Database SE1, and given that it’s a core Amazon AWS product, it’s no doubt going to receive a lot of investment, developer evangelism and usage over the next few years. It’s also Postgres-compatible in terms of ODBC drivers, so viable for OBIEE, though like Postgres unsupported, so you’ll not get much in the way of BI Server optimisations or help from Oracle Support if it goes wrong.
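
To give a flavour of how it differs from plain PostgreSQL, Redshift DDL is PostgreSQL-like but adds distribution and sort keys that control how rows are spread across the cluster nodes and ordered on disk – the MPP, column-store side of the product. A hypothetical example (the table and column names are mine, not from any BI Apps schema):

    -- The distribution key spreads rows across the cluster's compute nodes;
    -- the sort key orders the column-store blocks on disk, which helps
    -- range-filtered queries such as date restrictions.
    CREATE TABLE sales_fact (
        sale_date    date           NOT NULL,
        product_id   integer        NOT NULL,
        amount       numeric(10,2)
    )
    DISTKEY (product_id)
    SORTKEY (sale_date);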

To connect OBIEE to Amazon Redshift there’s a set of ODBC and JDBC drivers that you can download, and once you’ve opened up the port to Redshift in the AWS firewall, you can create an ODBC connection on the OBIEE side just like most other PostgreSQL connections, and then import the Redshift metadata into OBIEE’s BI Repository.

Redshift is priced around the same mark as Postgres Plus Advanced Server, but without the Amazon AWS storage and bandwidth charges. It’s a bit higher-risk than Postgres Plus, and it’s not got the Oracle Database compatibility features EnterpriseDB added to core Postgres, but it’s column-store, cloud-native and backed fully by Amazon.

So – with all of this in mind, what are the conclusions? Well it’s fair to say that, based on what’s turned up today and OBIEE’s pre-existing heterogeneous database support, you can connect non-Oracle databases to OBIEE as well as Oracle ones, but you’ll need to bear in mind that the newer cloud-based ones, and the open-source ones, won’t necessarily come with support from Oracle. If you’re looking to keep as close to an Oracle solution as possible but cut costs dramatically, using Amazon’s RDS service, particularly with the license-included SE1 option, could bring the cost down considerably, from $800k or so to around $50k/year, but you’ll need to work within the product constraints of SE/SE1 – so no materialised views, partitioning option and the like – so it’s probably a matter of preference whether you’d go for Oracle SE1 or a Postgres-based solution such as EnterpriseDB’s Postgres Plus Advanced Server (I’d probably go with Amazon RDS and Oracle Database SE1, if only because I can upgrade in time to Oracle Database Enterprise Edition and my Oracle skills can be re-used).

Going forward – analytic databases-as-a-service certainly sound interesting, with Amazon Redshift looking particularly promising. Customer take-up of cloud-based data warehousing databases in general, though, will probably be limited until OBIEE itself goes into the cloud – either through cloud-friendly licensing policies that charge by the hour and instance capacity, or with a cloud-native version of OBIEE designed to interoperate with Oracle’s Cloud Database offering, which up until now doesn’t provide external SQL*Net access and is really designed for powering cloud-based Java apps. Watch this space as they say – and it’ll be interesting to hear what Oracle announces in this area at next month’s Oracle OpenWorld.

A First Look at the OBIEE11g Mobile App Designer

A few days ago you might have noticed the flurry of activity from Oracle around a new OBIEE 11g add-on called the “Oracle BI Mobile App Designer”. Designed to complement Oracle BI Mobile HD, Oracle’s native-client mobile BI app, BI Mobile App Designer instead creates HTML5-based web apps that work not only with the iPad and iPhone, but with all HTML5-capable browsers, including those on recent Android phones and tablets. So what is Oracle BI Mobile App Designer, how do you get it installed, and how do its mobile BI apps differ from Oracle BI Mobile HD?

To use the BI Mobile App Designer, all of the installation and configuration happens server-side, as the apps it creates are 100% thin-client, web-based apps. Therefore, there are no App Store downloads or additional software that you’ll need to get cleared with your security team, but you’ll need to ensure you’re on the very latest version of OBIEE 11g – 11.1.1.7.1 – and also apply some patches on top to enable the BI Mobile App Designer feature. From what I can see, the complete set of patches is only currently available for Linux x86_64 and Windows x64, and assuming you’re on OBIEE 11.1.1.7 at the moment, the patches you’ll need to apply are as follows:

  • 16556157 (OBIEE 11.1.1.7.1 patchset, which also requires you to download another patch, 16569379)
  • 17004920, which enables support for BI Mobile App Designer in the catalog, and the Presentation Services UI
  • 17220944, the actual BI Mobile App Designer app, enabled through extending the OBIEE WebLogic domain

As the first patch actually contains six sub-patches, it’s quite a marathon to get them all installed, but on my Oracle Linux 5 installation it all went smoothly and I was up and running in a couple of hours. Once you’ve installed everything, applied the security settings and started the BI Mobile App Designer application for the first time, you’ll see it listed as a “create” option in the OBIEE 11g web interface, like this:

Access to BI Mobile App designer apps is granted through the usual BIConsumer application role, with the ability to create new apps granted through BIAuthor (all of which, of course, can be customised). So let’s get on then and create our first mobile app.

The first choice you’re prompted with after selecting New > Mobile App is the device type, either Phone or Tablet:

I select Tablet, and I’m then prompted to select the data source type. At this point, it’s clear what technology BI Mobile App designer is based on – BI Publisher 11g; this ability to source data either from a custom data model, or directly from the OBIEE BI Repository, is a feature of BI Publisher (direct access to the RPD came with the 11.1.1.7 release), and in fact, the best way to understand the role of BI Mobile App Designer compared to BI Mobile HD, is to consider how BI Publisher complements OBIEE’s regular dashboards and analyses. BI Publisher accesses the same data sources as analyses and dashboards (more, in fact) and gives you a lot more flexibility and room to be creative compared to regular dashboards, with regular dashboards having the edge though when it comes to “power-user” analysis and access to features such as action links, maps, KPIs and scorecards and the like. By basing BI Mobile App Designer on BI Publisher technology though (and keeping BI Mobile HD as it is), some additional interesting mobile analytics possibilities are opened up:

  • You’ve got a lot more control and flexibility over the layout, allowing you to create “executive information system”-type dashboards, or ones focused on a particular role or LOB
  • You can include images, text, custom navigation features
  • You can support Android and other non-iOS devices, with no need for App Store downloads
  • There’s no need to get the mobile app signed or approved by corporate security, as it’s a 100% thin-client, no-install application, with no data stored on the device

Let’s move on then, and create the application proper.

After you’ve initially saved the application to the Presentation Services catalog, you’re then presented with a cover page, a list of data items down the left-hand side, and options to create new pages and application elements across the top – you can see how this editor is based on the Online Layout Editor used by BI Publisher 11g. As an aside, I used the latest version of Safari on the Mac in this session, and didn’t hit any compatibility problems creating and viewing reports.

A BI Mobile App typically contains a number of pages, including this cover page (for which you can change the image, title and so on). As with BI Publisher templates, each page you create can have a grid-type layout, to which you then add components such as charts, tables, pivot tables and navigation items. In the example below, I’ve added a page using the Navigation page style, which automatically adds a filter object to the left-hand side, and allows me to drop charts and other components to the right of it, each of which automatically gets filtered by the selected column value.

A new component type introduced in this application is the Tile component, which takes a grouping column, then displays a set of charts or other components grouped by that column. The screenshot below on the left is the page in design mode, whilst on the right I’ve previewed it in display mode, showing all of the tiles for that set of LOBs.

As well as filtering data using Filter components, there’s also what’s called the “Accordion” component – this lists out a set of column values plus subtotals, but when you click on a column value, a component such as a chart is then displayed for that value. In the screenshot below I’ve created an accordion navigation component for LOB, with each click of a column value showing sales broken down by product type within that LOB (and five points for whoever remembers what the computer on the right-hand side of the screen is).

There’s also the concept of “sub-pages”, pages which are linked to from the master page and which filter on the value that’s clicked-on, similar to the click-filtering that you get with BI Publisher but across two pages. In the example page above, the bar chart shown within the LOB accordion chart can now be clicked on to link to the LOB Detail Analysis page I’ve also set up, like this:

So once you’ve created your application, it’s time to try it out. You can either preview the app in your regular desktop web browser, or open it on your tablet using a QR code. Authentication is carried out as per OBIEE’s regular dashboard web application, but you can create a link directly to the BI Mobile App rather than have the user go through the standard OBIEE login page – or, you can use a new URL, http://[server_name:port]/mobile/appstore, which presents them with the BI Mobile App Designer login page, and then an “App Store” of mobile apps they can select and “subscribe” to, adding them to their personal app page.

Then, it’s a case of viewing and navigating around the mobile app, using the navigation items added earlier on, and viewing the charts, tables and other components contained within each page.

So – not bad at all, and a very interesting complement to the native iOS clients, especially if you’re on Android, you want a more custom, LOB-style app or you’re concerned about the security implications of a thick-client app on your mobile device. No doubt we’ll see more imaginative mobile app layout on the web and from Rittman Mead in the future, but for now – looks good.

Three Ways to Improve Efficiency With Better Documentation

I was at a client site recently, and as part of a front-end development effort we had added some documentation within the web catalog. They had never seen anything like it and started referring to it as “smart” documentation. Quite frankly, I was more than a bit taken aback by all the positive reactions. I didn’t start out in the field of business intelligence and analytics; I got my start as a software developer programming in C, C++, and Java. In software development, commenting your work was expected… it was just something you did. In the world of dashboards and analyses however, more often than not, documenting your work proves to be the exception and not the norm. It tends to be an afterthought, and even then it’s usually shoved into some document stored on an obscure shared drive that no one ever has time to update.

But that shouldn’t be the case! Don’t get me wrong, I don’t find documentation all that exciting. “Hate” is a strong word, but I for one certainly don’t “enjoy” documenting. I haven’t come across all that many technologists that do. But I also don’t enjoy repeatedly answering the same questions time and time again; and I don’t enjoy endlessly maintaining projects (one of the draws of consulting is that each project has a finite end). But, if there is one thing I truly hate, it’s doing things twice.

I like efficiency. And in my experience, smart documentation leads to efficiency.

1. Naming conventions. This one’s a no-brainer. Before you start developing anything, put some thought into how your objects will be named and organized. Give the objects in your web catalog a logical name and group them in a way that makes sense. If someone new joins the development team and the analyses are all named Finance Report 1, Finance Report 2, etc., how are they supposed to determine what’s what? Hopefully I’m exaggerating, but you get the idea. Everything should have a semantic name that gives a bit of insight into what its purpose is.

I like to set up my web catalog with a folder for each object type (e.g. a folder for Analyses, Prompts, Actions, etc.). I might also prepend the name of the dashboard to the name of the analysis. To me, this makes total sense. At a glance, anyone can figure out where to look for a particular object.

This might not work everywhere. We typically migrate an entire subject area at once; however, if you migrate individual dashboards you’ll want to organize your catalog differently. Every company, every implementation, even every project is different. There is no one hard-and-fast rule with regard to naming conventions. The real point I want to make is to set up some naming standards and enforce them.

2. Developer Notes. I make it a point to create a Static Text view within each and every analysis I create. I rename the view to “Developer Notes” and jot down a few words about anything that might become a question later on, especially anything I suspect I’ll question myself. Explain the logic behind any complex filters you’ve used, how conditional formatting has been set up, why certain columns appear as part of the criteria, etc. Organize the notes enough that someone won’t have to read them in their entirety to make some basic modifications, but otherwise don’t spend a whole lot of time worrying about the format. I promise that this will save you time down the road, and it shouldn’t take more than a few minutes’ time.

I don’t typically add this to the compound layout, as it’s generally not something I want all the users to see. But later on, if I need to see my notes, I can add them in with just a few clicks from the Edit Dashboard screen.

Case in point… we’ve all used conditional formatting; it’s a great tool to highlight certain information. Personally, I like to apply conditional formatting based upon a value in a particular column. But do you know what happens if that column is removed (which could easily happen, especially if you have a shared environment with multiple developers)?

And there’s no easy way to “undo” either. Within the XML the columns are referenced using a hashed ID, so you can’t just add the column back in. You’d need to either roll back to a saved version, recreate the conditional formatting from scratch, or manually make changes to the XML.

But not to worry: if I just include a sentence or two in the developer notes, I can let everyone know not to remove this column for any reason. Make it a standard operating procedure when working with other developers that everyone should at least glance through the developer notes prior to making any modifications.

3. Administration Page. How often do you need to look up a variable or create a quick analysis to verify the value of a variable? It’s not that big of a deal if you have access to the Admin tool, but maybe you don’t. Or maybe you have access to the Admin tool and others don’t… do they constantly pester you for what variables are available? There’s no reason to hoard all this information; empower your users. Give them an administration page with all this information so they can figure it out for themselves, and by doing so, you’ve streamlined the entire process. Create an analysis that contains all your important variables. Provide the appropriate variable name and save. Throw that on a separate Administration dashboard page, secure it to your developers if you wish, and you’re good to go.
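
If you haven’t built one of these before, the trick is simply to add a column to any small subject area and override its column formula with a VALUEOF() expression per variable – for example (USER is a standard OBIEE session variable; the repository variable name below is just a made-up placeholder):

    -- Example column formulas for a simple "variables" analysis:
    VALUEOF(NQ_SESSION.USER)             -- a session variable
    VALUEOF("CURRENT_FISCAL_YEAR")       -- a repository variable (hypothetical name)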

While you’re at it, put together some notes around your naming standards and stash those on the dashboard as well!

What are some of the ways you document your web catalog?