
OBIEE “Act As” vs “Impersonate”

There will come a point in the lifecycle of an OBIEE deployment when one user needs to access another user’s account. This may be to provide cover whilst a colleague is on leave, or for support staff trying to reproduce a reported error.

Password sharing aside (it’s zero-config! but a really really bad idea), OBIEE supports two methods for one user to access the system as if they were another: Impersonation and Act As.

This blog article is not an explanation of how to set these up (there are plenty of blogs and rip-off blogs detailing this already); instead, it explains the difference between the two options.

First, a quick look at what they actually are.

Impersonation

Impersonation is where a “superuser” (one with the oracle.bi.server.impersonateUser application policy grant) can log in to OBIEE as another user, without needing their password. It is achieved in the front end by constructing a URL specifying:

  • The superuser’s login name and password (NQUser and NQPassword)
  • The login ID of the user to impersonate (Impersonate)

For example:

http://server:port/analytics/saw.dll?Logon&NQUser=weblogic&NQPassword=Password01&Impersonate=FSmith_FundY

The server will return a blank page to this request, but you can then submit another URL to OBIEE (e.g. the OBIEE catalog page or home page) and you will already be authenticated as the Impersonate user – without having specified their password.

From here you can view the system as they would, and carry out whatever support or troubleshooting tasks are required.
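As a sketch, the whole flow can be scripted with curl. The host, port and credentials below are hypothetical (following the example URL above); against a live server, the cookie jar is what carries the impersonated session between requests:

```shell
# Hypothetical server and credentials -- substitute your own.
HOST="http://server:9704/analytics/saw.dll"
URL="${HOST}?Logon&NQUser=weblogic&NQPassword=Password01&Impersonate=FSmith_FundY"
echo "$URL"

# Against a live server: log in and keep the session cookie, then make
# further requests as the impersonated user.
#   curl -s -c /tmp/obiee.cookies "$URL"
#   curl -s -b /tmp/obiee.cookies "${HOST}?Catalog"
```

Note that this sketch inherits the same weakness described below: the superuser password is in the URL in clear text.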

Caution: Impersonation is disabled by default, even for the weblogic Administrator user, and it is a good idea to leave it that way. If you do decide to enable it, make sure that the user to whom you grant it has a secure password that is not shared or known by anyone other than the account owner. Also, you will see from the illustration above that the password is submitted in plain text, which is not good from a security point of view. It could be “sniffed” along the way or, more easily, extracted from the browser history.

Act As

Whilst Act As is a very similar concept to Impersonation (allow one user to access OBIEE as if they were another), Act As is much more controlled in how it grants the rights. Act As requires you to specify a list of users who may use the functionality (“Proxy users”), and for each of the proxy users, a list of users (“Target users”) who they may access OBIEE as.

Act As functionality is accessed from the user dropdown menu:

From here, a list of users that the logged-in user (the “proxy user”) has been configured to be able to access is shown:

Selecting a user switches straight to it:

In addition to this fine grained specification of user:user relationships, you can specify the level of access a Proxy user gets – full, or read-only. Target users (those others can Act As) can see from their account page who exactly has access to their account, and what level of access.

So what’s the difference?

Here’s a comparison I’ve drawn up:

Here are a couple of examples to illustrate the point:

 

Based on this, my guidelines for use would be:

  • As an OBIEE sysadmin, you may want to use Impersonate to be able to test and troubleshoot issues. However, it is functionality much more intended for systems integration than front-end user consumption. It doesn’t offer anything that Act As doesn’t, except fewer configuration steps. It is less secure than Act As, and could even be seen as a “backdoor” option. At companies where audit/traceability is important in particular, it should be left disabled.
  • Act As is generally the better choice in all scenarios of an OBIEE user needing the ability to access another’s account, whether between colleagues, L1/L2 support staff, or administrators.
    Compared to Impersonation, it is more secure, more flexible, and more granular in whose accounts can be accessed by whom. It is also fully integrated into the user interface as standard functionality of the tool.

Reference

Thanks to Christian Berg and Gianni Ceresa for reading drafts of this article and providing valuable feedback.

SampleApp v309R2 Now Available For Download on OTN

The new v309R2 version of the OBIEE 11g SampleApp is now available for download on OTN, based on the 11.1.1.7.1 version of OBIEE and with a number of new dashboards, analyses and integration examples.


OBIEE 11.1.1.7.1 is of course the version that supports the new Mobile App Designer, Oracle’s new HTML5-based mobile BI authoring tool. I covered this new mobile BI option a few weeks ago, and the new SampleApp includes a number of Mobile App Designer demos that you can access either from the main dashboard, or on your mobile device through the new “App Store”.


SampleApp v309R2 also comes with the back-end Oracle Database upgraded to 12cR1, which means there are some examples of the temporal queries, pattern matching queries and so on that we covered during the 12c launch.


This new version also comes with a couple of “tips and tricks” features that you might want to look into further, to see how they were done. The first one is having two RPDs and two catalogs running on the same install, as you can see from the screenshots below – one is on port 9704 whilst the other is on 9502, but they’re both on the same IP address.


This isn’t quite the “holy grail” of hosting multiple RPDs and catalogs on the same installation though – the way it’s actually done is by creating a second BI instance within the same middleware home, so you’ve got two WebLogic domains and therefore two admin servers, two managed servers and so on. It’s still useful if you’re looking to host multiple demos on the same VM (bearing in mind each install will take another 2GB of RAM because of the WebLogic JVMs), and it also helps illustrate the difference between middleware homes, domains and instances.

The other “tip and trick” that I noticed was the example of displaying image files uploaded into the catalog directly in the analysis results, rather than having to expose them through a WebLogic folder (the res folders that John Cook talks about in this blog post). In the screenshots below, you can see the final dashboard page with a number of catalog items displayed in it, and then the underlying analysis that accesses them via the saw.dll?downloadfile call.


SampleApp v309R2 is available for download on OTN as a Virtualbox VM, along with the what’s new guide and a deployment guide for getting the VM up and running.

Tips and Tricks for the OBIEE linux sysadmin

As well as industry-leading solution architecture, consultancy and training on Oracle BI, here at Rittman Mead we also provide expert services in implementation and support of such systems. In this blog I want to share some of the things I find useful when working with OBIEE on a Linux system.

OBIEE Linux start up script / service

Ticking both the OBIEE and Linux boxes, this script that I wrote is probably top of my list of recommendations (he says modestly…). It enables you to start and stop OBIEE using the Linux standard service command, integrate it into system startup/shutdown (through init.d), and also supports an advanced status command which does its very best to determine the health of your OBIEE system.

Start up

Shutdown

Status

You can get the script from the Rittman Mead public GitHub repository, or directly download the script here (but don’t forget to check out the README).
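For illustration, the bare bones of such a wrapper look something like this. This is a greatly simplified sketch – the real script in the repo adds health checks, init.d integration and much more – and the opmnctl calls are shown only as comments:

```shell
# Simplified sketch of a service-style control function for OBIEE.
# The real script wraps opmnctl and the WebLogic servers; here the
# actual calls are indicated in comments only.
obiee_ctl() {
  case "$1" in
    start)  echo "Starting OBIEE..." ;;  # e.g. opmnctl startall, start managed server
    stop)   echo "Stopping OBIEE..." ;;  # e.g. opmnctl stopall, stop managed server
    status) echo "OBIEE status:" ;;      # e.g. opmnctl status
    *)      echo "Usage: obiee_ctl {start|stop|status}" ;;
  esac
}

obiee_ctl start
```

Installed as /etc/init.d/obiee (as the real script is designed to be), this shape is what lets Linux drive OBIEE through the standard service command and at system startup/shutdown.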

screen

GNU screen is one of the most useful programs that I use on Linux. I wrote extensively about it in my blog post screen and OBIEE. It enables you to do things such as:

  • Run multiple commands simultaneously in one SSH session
  • Disconnect and reconnect (deliberately, or from dropped connections, e.g. on unreliable wi-fi or 3G) and pick up exactly where you left off, with all processes still running
  • Share your SSH view with someone else, for remote training or a second pair of eyes when troubleshooting
  • Search through screen scroll back history, cut and paste
  • …a lot more!

There are other screen multiplexers such as tmux, but I’ve found that screen is the most widely available by default. Since they all have quite steep learning curves and esoteric key shortcuts to operate them, I tend to stick with screen.

SSH keys

Like screen, nothing to do with OBIEE per se, but an important part of Linux server security to understand, IMNSHO (In My Not-So Humble Opinion!).

Maybe I’m overly simple but I like pretty pictures when I’m trying to grasp concepts, so here goes:


  • You create a pair of keys using ssh-keygen. These are plain text and can be cut and pasted, copied, as required. One is private (e.g. id_rsa); you need to protect this as you would any other security artifact such as server passwords, and you can optionally secure it with a passphrase. The other is public (e.g. id_rsa.pub), and you can share it with anyone.

  • Your public key is placed on any server you need access to, by the server’s administrator. It needs to go in the .ssh folder in the user’s home folder, in a file called authorized_keys. As many public keys as need access can be placed in this file. Don’t forget the leading dot on .ssh.
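To make the pictures concrete, here is the flow in commands. The demo key is written to /tmp so as not to clobber a real one, and the server and user names are hypothetical:

```shell
# 1. Generate a key pair. -N "" means no passphrase, for brevity only --
#    protecting the private key with a passphrase is recommended.
ssh-keygen -t rsa -b 2048 -f /tmp/id_rsa_demo -N "" -q

# The public half is the bit that is safe to hand out:
cat /tmp/id_rsa_demo.pub

# 2. Install the public key on the target server (hypothetical host),
#    either with ssh-copy-id where available:
#      ssh-copy-id -i /tmp/id_rsa_demo.pub oracle@myserver
#    ...or by hand:
#      cat /tmp/id_rsa_demo.pub | ssh oracle@myserver \
#        'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys'
```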

Why are SSH keys good?

  • You don’t need a password to login to a server, which is a big time saver and productivity booster.
  • Authentication becomes about “this is WHO may access something” rather than “here is the code to access it, we have no idea who knows it though”.
  • It removes the need to share server passwords
    • Better security practice
    • Easier auditing of exactly who used a server
  • It enables the ability to grant temporary access to servers, and precisely control when it is revoked and from whom.
  • Private keys can be protected with a passphrase, without which they can’t be used.
  • Using SSH keys to control server access is a lot more secure since you can disable server password login entirely, thus kiboshing any chance of brute force attacks
  • SSH keys can be used to support automatic connections between servers for backups, starting jobs, etc, without the need to store a password in plain text

Tips

  • SSH keys are just plain text, making them dead easy to backup in a Password Manager such as LastPass, KeePass, or 1Password.
  • SSH keys work just fine from Windows. Tools such as PuTTY and WinSCP support them, although you need to initially change the format of the private key to ppk using PuTTYGen, an ancillary PuTTY tool.
  • Whilst SSH keys reside by default in your user home .ssh folder, you can store them on a cloud service such as Dropbox and then use them from any machine you want.
    • To make an ssh connection using a key not in the default location, use the -i flag, for example
      ssh -i ~/Dropbox/ssh-keys/mykey foo@bar.com
      
  • To see more information about setting up SSH keys, type:
    man ssh
    
  • The authorized_keys file is space separated, and the last entry on each line can be a comment. This normally defaults to the user and host name where the key was generated, but can be freeform text to help identify the key more clearly if needed. See man sshd for the full spec of the file.
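For example, an authorized_keys file has this shape – one key per line, with the trailing comment free for you to repurpose (the key material here is truncated and purely illustrative):

```
ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB...truncated... robin@rnm-laptop
ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB...truncated... temporary L2 support access, revoke 2013-12-31
```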

Determining your server’s public IP / validating Internet connectivity

Dead simple, this one – if you’re working on a server, or maybe a development VM, and need to check it has an internet connection or want to know what the IP address is:

curl -s http://icanhazip.com/

This command returns just the IP and nothing else. Of course if you don’t have curl installed then it won’t work, so you’re left with the ever-reliable

ping google.com

Learn vi

…or emacs, or whatever your poison is. My point is that if you are going to be spending any serious time as an admin you need to be able to view and edit files locally on the Linux server. Transferring them to your Windows desktop with WinSCP to view in Notepad is what my granny does, and even then she’s embarrassed about it.

Elitism and disdain aside, the point remains. The learning curve of these console-based editors repays itself many-fold in time and thus efficiency savings in the long run. It’s not only faster to work with files locally, it reduces context-switching and the associated productivity loss.

Compare:

  1. I need to view this log file
  2. vi log.txt
  3. Done

with

  1. I need to check this log file for an error
  2. Close terminal window
  3. Start menu … now where was that program … hey fred, what’s the program … yeh yeh WinSCP that’s right
  4. Scroll through list of servers, or find IP to connect to
  5. Try to remember connection credentials
  6. Hey I wonder if devopsreactions has anything cool on it today
  7. Back to the job in hand … transfer file, which file?
  8. Hmmm, what was that folder called … something something logs, right?
  9. Dammit, back to the terminal … pwd, right, gotcha
  10. Navigate to the folder in WinSCP, find the file
  11. Download the file
  12. That dbareactions is pretty funny too, might just have a quick look at that
  13. Open Notepad (or at least Notepad++, please)
  14. Open file … where did Windows put it? My Documents? Desktop? Downloads? Libraries, whatever the heck they are ? gnaaargh
  15. Wonder if those cool guys at Rittman Mead have posted anything on their blog, let’s go have a look
  16. Back to Notepad, got the log file, now …… what was I looking for?
  17. Soddit

Silent Installs

This has to be my #1 tip in the Work Smarter, Not Harder category of OBIEE administration, and is as applicable to OBIEE on Windows as it is to OBIEE on Linux. Silent installs are where you run the installer “hands off”. You create a file in advance that describes all the configuration options to use, and then crank the handle and off it goes. You can use silent installs for

  • OBIEE Enterprise install
  • OBIEE Software Only install
  • OBIEE domain configuration
  • WebLogic Server (WLS) install
  • Repository Creation Utility (RCU), both Drop and Create

The advantages of silent installs are many:

  • Guaranteed identical configurations across installations
  • No need to waste time getting an X Server working for non-Windows installs to run the graphical install client
  • Entire configuration of a server can be pre-canned and scripted
  • Running the graphical installer is TEDIOUS the first time, the second, third, tenth, twentieth … kill me. Silent installs make the angels sing and new born lambs frolic in the virtual glades of OBIEE grass meadows heady with the scent of automatically built RCU schemas

To find out more about silent installations, check out:

We’ve shared some example response files on the Rittman Mead public GitHub repository, or you can generate your own by running the installer once in GUI mode and selecting the Save option on the Summary screen of an installation. You can just run the installer to generate the response file – you don’t have to actually proceed with the installation if all you want to do is generate the response file.
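As a sketch, a scripted silent install then boils down to something like this. Every path and the response file name are hypothetical, and the exact flag names can vary between installer versions, so check them against the installer’s own help and documentation first:

```shell
# Hypothetical paths -- point these at your own media and response file.
RSP=/home/oracle/install/obiee_enterprise.rsp
INSTALLER=/home/oracle/media/bishiphome/Disk1/runInstaller

if [ -f "$RSP" ]; then
  # -silent suppresses the GUI; -response supplies the pre-canned answers.
  "$INSTALLER" -silent -response "$RSP" -invPtrLoc /etc/oraInst.loc
else
  echo "Response file $RSP not found - generate one from the GUI installer first"
fi
```

The same pattern (one response file, one hands-off command) applies to the WLS installer and the RCU, each with their own response file format.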

opatch napply

I wrote about this handy little option for opatch in a blog post here. Where you have more than one patch to apply (as happens frequently with OBIEE patch bundles) this can be quite a time saver.

Bash

Bash is the standard command line that you will encounter on Linux. Here are a few tricks I find useful:

Ctrl-R – command history

This is one of those shortcuts that you’ll wonder how you did without. It’s like going through your command history by pressing the up/down arrows (you knew about that one, right?), but on speed.

What Ctrl-R does is let you search through your command history and re-use a command just by hitting enter.

How it works is this:

1) Press Ctrl-R. The bash prompt changes to

    (reverse-i-search)`':

2) Start entering part of the command line entry that you want to repeat. For example, I want to switch back to my FMW config folder. All I type is “co” to match the “co” in config, and bash shows me the match:

    (reverse-i-search)`co': cd /u01/dit/fmw/instances/instance1/config/

3) If I want to amend the command, I can press left/right arrows to move along the line, or just hit enter and it gets re-issued straight off

4) If there are multiple matches, either keep typing to narrow the search down, or press Ctrl-R to show the next match or Shift-Ctrl-R to show the previous match

Another example, I want to repeat my sqlplus command, I just press Ctrl-R and start typing sql and it’s matched:

(reverse-i-search)`sq': sqlplus / as sysdba

Finally, repeat the restart of Presentation Services, just by entering ps:

(reverse-i-search)`ps': ./opmnctl restartproc ias-component=coreapplication_obips1

time

If you prefix any command by time you get a nice breakdown of how long it took to run and where the time was spent after it completes. Very handy for quick bits of performance testing etc, or just curiosity :-)

$ time ./opmnctl restartproc ias-component=coreapplication_obis1
opmnctl restartproc: restarting opmn managed processes...

real    0m14.387s
user    0m0.016s
sys     0m0.031s

watch

This is a fantastic little utility that will take the command you pass it and repeatedly issue it, by default every two seconds.

You can use it to watch disk space, directory contents, and so on.

watch df -h


sudo !!

Not the exclamation “sudo!”, but sudo !!, meaning, repeat the last command but with sudo.

$ tail /var/log/messages
tail: cannot open `/var/log/messages' for reading: Permission denied

$ sudo !!
sudo tail /var/log/messages
Sep 26 18:18:16 rnm-exa-01-prod kernel: e1000: eth1 NIC Link is Up 1000 Mbps Full Duplex, Flow Control: RX
Sep 26 18:18:16 rnm-exa-01-prod avahi-daemon[4965]: Invalid query packet.


What is sudo? Well I’m glad you asked:

xkcd sudo
(credit: XKCD)

Over to you!

Which commands or techniques are you flabbergasted aren’t on this list? What functionality or concept should all budding OBI sysadmin padawans learn? Let us know in the comments section below.

Openworld is Over – Now We’re Coming to India..!

Oracle Openworld 2013 is now over, but no sooner have we unpacked from that trip than we’re packing again for our next one – our BI Masterclass Tour of India, starting in a few weeks’ time in Bangalore.

Running in partnership with ODTUG and with myself, Venkat Janakiraman and Stewart Bryson leading the sessions, we’re looking forward to sharing the news from Openworld, talking about the latest in Oracle BI and EPM development, and meeting Oracle BI enthusiasts at each event.

The event is taking place over three cities – Bangalore, Hyderabad and Mumbai, with each masterclass running for a full day. We’ll go to Bangalore on Tuesday 15th October, Hyderabad on Thursday 17th October and then fly up to Mumbai for Saturday, 19th October 2013. Full details are on the event home page including details on how to register, with each masterclass’s agenda looking like this:

  • 9.30am – 10.00am: Registration and Welcome
  • 10.00am – 10.30am: Oracle BI, Analytics and EPM Product Update – Mark Rittman
  • 10.30am – 11.30am: Extreme BI: Agile BI Development using OBIEE, ODI and Golden Gate – Stewart Bryson
  • 11.30am – 12.30pm: OBIEE 11g Integration with the Oracle EPM Stack – Venkatakrishnan J
  • 12.30pm – 1.30pm: Lunch & Networking
  • 1.30pm – 2.30pm: OBIEE and Essbase on Exalytics Development & Deployment Best Practices – Mark Rittman
  • 2.30pm – 3.30pm: Oracle BI Multi-user Development: MDS XML versus MUDE – Stewart Bryson
  • 3.30pm – 4.00pm: Coffee Break & Networking
  • 4.00pm – 5.00pm: Intro and tech deep dive into BI Apps 11g + ODI
  • 5.00pm – 6.00pm: Metadata & Data loads to EPM using Oracle Data Integrator - Venkatakrishnan J

The dates, locations and registration links for the three events are as follows:

We’re also investigating the idea of bringing our Rittman Mead BI Forum to India in 2014, so this would be a good opportunity to introduce yourself to us and the other attendees if you’d like to present at that event, and generally let us know what you’re doing with Oracle’s BI, EPM, analytics and data warehousing tools. There’ll also be lots of ODTUG goodies and giveaways, and a social event in the evening after the main masterclass finishes.

Numbers are limited though, and places are going fast – check out the event page for full details, and hopefully we’ll see some of you in either Bangalore, Hyderabad or Mumbai!

Oracle Openworld 2013 : Reflections on Product Announcements and Strategy

I’m sitting writing this at my desk back home, with a steaming mug of tea next to me and the kids pleased to see me after having been away for eight days (or at least my wife pleased to hand them over to me after looking after them for eight days). It was an excellent Oracle Openworld – probably the best in the ten years I’ve been going in terms of product announcements, and if you missed any of my daily updates, here are the links to them:

We also delivered sixteen sessions over the week, and whilst a few of them can’t be circulated because they contain details on beta or forthcoming products, here’s links to the ones that we can post:

So then, on reflection, what did I think about the various product announcements during the week? Here’s my thoughts now I’m back in the UK.


First off – Exalytics. Clearly there’s a lot of investment going into the Exalytics offering, both from the hardware and the software sides. For hardware, it’s just really a case of Oracle keeping up with additions to Sun’s product line, and with the announcement of the T5-8 model we’re now up to 4TB of RAM and 128 SPARC CPU cores – aimed at the BI consolidation market, where 1 or 2TB of RAM quickly goes if you’re hosting a number of separate BI systems. Cost-wise – it’s correspondingly expensive, about twice the price of the X3-4 machine, but it’s got twice the RAM, three times the CPU cores and runs Solaris, so you’ve got access to the more fine-grained workload separation and virtualisation that you get on that platform. Not a machine that I can see us buying for a while, but there’s definitely a market for this.

With Exalytics though you could argue that it’s been the software that’s underwhelmed so far, as opposed to the hardware. The Summary Advisor is good, but it doesn’t really handle the subsequent incremental refresh of the aggregate tables, and TimesTen itself, whilst fast and powerful, hasn’t had a great “out of the box” experience – in the wrong hands, it can give misleadingly slow response times, something I found myself a few months back on the blog. So it was interesting to hear some of the new features that we’re likely to see in “Exalytics v2.0”, probably late in calendar year 2014: an updated aggregate refresh mechanism based on DAC Server technology and with support for GoldenGate; new visualisations including data mash-up capabilities that I’m guessing we’ll see as exclusives on Exalytics and Oracle’s cloud products; enhancements coming for Essbase that’ll make it easier to spin off ASO cubes from an OBIEE repository; and of course, the improvements to TimesTen to match those coming in the core Oracle database – in-memory analytics.


And what an announcement that was – in-memory column-store technology within the Oracle database, not predicated on using Exadata, and all running transparently in the background with minimal DBA setup required. Now in reality, not only is this not the first in-memory Oracle database offering – the Exadata boxes in previous Openworld presentations were also positioned as in-memory, but that was flash memory, not DRAM – and Oracle is not the first vendor to offer an in-memory column store as a feature, but given that it’ll be available to all Oracle 12.1.2 databases that license the In-Memory option, and it’ll be so easy to administer – in theory – it’s a potential industry game-changer.

Of course the immediate questions on my lips after the in-memory Oracle Database announcement were “what about TimesTen”, and “what about TimesTen’s role in Exalytics”, but Oracle played this in the end very well – TimesTen will gain similar capabilities, implemented in a slightly different way as TimesTen already stores its data in memory, albeit in row-store format – and in fact TimesTen can then take on the role of a business-controlled, mid-tier analytic “sandbox”, probably receiving new in-memory features faster than the core Oracle database as it has fewer dependencies and a shorter release cycle, but complementing the Oracle database with its own, more large-scale in-memory features. And that’s not forgetting those customers with data from multiple, heterogeneous sources, or those that can’t afford to stump up for the In-Memory option for all of the processors in their data warehouse database server. So – fairly internally consistent, at least at the product roadmap level, and we’ll be looking to get on any betas or early adopter programs to put both products through their paces.

The other major announcement that affects OBIEE customers is, of course, OBIEE in the Cloud – or “Reporting-as-a-Service” as Oracle referred to it during the keynotes. This is one of the components of Oracle’s new “platform-as-a-service” or PaaS offering, alongside a new, full version of Oracle 12c based on its new multitenant architecture, identity-management-as-a-service, documents-as-a-service and so on. What Reporting-as-a-Service will give us isn’t quite “OBIEE in the cloud”, or at least, not as we know it now; Oracle’s view on platform-as-a-service is that it should be consumer-level in terms of simplicity to set up and the quality of the user interface, it should be self-service and self-provisioning, and simple to sign up for with no separate need to license the product. So in OBIEE terms, what this means is a simplified RPD/data model builder, simple options to upload and store data (also in Oracle’s cloud), and automatic provisioning using just a credit card (although there’ll also be options to pay by PO number etc, for larger customers).


And there are quite a few things that we can draw out of this announcement. First, it’s squarely aimed – at least at the start – at individual users, departmental users and the like looking to create sandbox-type applications, most probably also linking to Oracle Cloud Database, Oracle Java-as-a-Service and the like. It won’t, for example, be possible to upload data to this service’s datastore using conventional ETL tools, as the only datasource it will connect to, at least initially, will be Oracle’s Cloud Database schema-as-a-service, which only allows access via ApEx and HTTP, because it’s a shared service and giving you SQL*Net access could compromise other users. In the future, it may well connect to Oracle’s full DBaaS, which gives you a full Oracle instance, but for now (as far as I’ve heard) there’s no option to connect to an on-premise data source, or Amazon RDS, or whatever. And for this type of use-case that may be fine – you might only want a single data source, and you can still upload spreadsheets which, if we’re honest, is where most sandbox-type applications get their data from.

This Reporting-as-a-Service offering might well be where we see new user interface innovations coming through first, though. I get the impression that Oracle plan to use their Cloud OBIEE service to preview and test new visualisation types first, as they can iterate and test faster, and the systems running on it are smaller in scope and probably more receptive to new features. Similar to Salesforce.com and other SaaS providers, it may well be the case that there’s a “current version” and a “preview version” available at most times, with the preview becoming the current after a while, and the current being something you’ve got 6-12 months to switch from after that point. And given that Oracle will know there’s an Oracle database schema behind the service, it’s going to make services such as the proposed “personal data mashup” feature possible, where users can upload spreadsheets of data through OBIEE’s user interface, with the data then being stored in the cloud and the metrics then being merged in with the corporate dataset, with the source of each bit of data clearly marked. All this is previews and speculation though – I wouldn’t expect to see this available for general use until the middle of 2014, given the timetable for previous Oracle cloud releases.


The final product area that I was particularly interested in hearing future product direction about was Oracle’s data integration and data quality tools. We’ve been on the ODI 12c beta for a while and we’re long-term users of OWB, EDQ, GoldenGate and the other data integration tools; moreover, on recent projects, and in our look at the cloud as a potential home for our BI, DW and data analytics projects, it’s become increasingly clear that database-to-database ETL is no longer what data integration is solely about. For example, if you’re loading a data warehouse in the cloud, and the source database is also in the cloud, does it make sense to host the ETL engine, and the ETL agents, on-premise, or should they live in the cloud too?

And what if the ETL source is not a database, but a service, or an application such as Salesforce.com that provides a web service / RESTful API for data access? What if you want to integrate data on-the-fly, like OBIEE does with data federation but in the cloud, from a wide range of source types including services, Hadoop, message buses and the like? And where do replication, quality-of-service management, security and so forth come in? In my view, ODI 12c and its peers will probably be the last of the “on-premise”, “assumed-relational-source-and-target” ETL tools, with ETL instead following apps and data into the cloud, assuming that sources can be APIs, messages, big data sources and so forth as well as relational data, and it’ll be interesting to see what Oracle’s Fusion Middleware and DI teams come up with next year as their vision for this technology space. Thomas Kurian’s keynote touched on this as a subject, but I think we’re still a long way from working out what the approach will be, what the tooling will look like, and whether this will be “along with”, or “instead of”, tools like ODI and Informatica.

Anyway – that’s it for Openworld for me; back to the real world now and time to see the family. Check back on the blog next week for normal service, but for now – laptop off, kids’ time.