Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration – Part 4: Initial Load and Replication
This is the final post in my series on Oracle BI Apps 11.1.1.7.1 and GoldenGate integration. If you have been following along up to this point, we have the Source Dependent Data Store schema set up and ready to accept data from the OLTP source via replication, the GoldenGate installations complete on both the source and target servers, and the GoldenGate parameter files set up and ready to roll. Before replication is started, an initial load of data from source to target must be performed.
Initial Load
As I mentioned in my previous posts, I plan on performing the initial load of the SDS a slightly different way than described in the Oracle BI Applications documentation. Using the process straight out-of-the-box, we must schedule downtime for the source application, as we do not want to skip any transactions that occur during the processing of data from source to SDS target. With a slight customization to the OBIA-delivered scripts, we can ensure the initial load and replication startup will provide a contiguous flow of transactions to the SDS schema.
Oracle BI Applications Scripts
As with the other set-up processes for the SDS schema and GoldenGate parameter files, there is an ODI Scenario available to execute that will generate the initial load scripts. In ODI Studio, browse to BI Apps Project > Components > SDS > Oracle > Copy OLTP to SDS. Expand Packages > Copy OLTP to SDS > Scenarios and you will find the Scenario “COPY_OLTP_TO_SDS Version 001”.
The Scenario calls an ODI Procedure named “Copy SDS Data”. When executed, it generates a script with an insert statement for each target SDS table, using a select over a database link to the OLTP source. The link must be created manually and named exactly DW_TO_OLTP, as the ODI Procedure has the dblink name hard-coded. This means the link will need to be modified for each additional source, should there be multiple GoldenGate OLTP to SDS replication processes set up.
-- Drop and recreate the database link.
drop database link DW_TO_OLTP;

create database link DW_TO_OLTP
connect to SYSADM identified by SYSADM
using 'PSDEMO';

-- Test the link.
select * from dual@DW_TO_OLTP;
The standard process would then be to execute the Scenario to generate the insert statements, schedule downtime for the OLTP application, and run the initial load scripts. Rather than go through those steps, let’s take a look at how to eliminate the source system unavailability with a slight change to the code.
Zero Downtime
The initial load process will be customized to use the Oracle database flashback query capability, selecting data as of a specific point in time (served from undo) based on the source SCN (system change number). Before the initial load is run, the GoldenGate extract process will be started to capture any transactions that occur during the data load. Finally, when the initial load completes, the GoldenGate replicat process will be started so that it applies only transactions committed after the initial load SCN, eliminating the chance of skipping or duplicating transactions from the source.
To perform the customizations, I recommend copying the entire “Copy OLTP to SDS” folder and pasting it in a new location. I simply pasted it in the same folder as the original and renamed it “RM Copy OLTP to SDS”. One thing to note: the Scenario will not be copied, since Scenario names must be unique throughout the work repository. We will generate a Scenario with a new name after we make our changes.
Open up the ODI Procedure “Copy SDS Data” from the copied directory. Click on the “Details” tab to review the steps. We will need to modify the step “Copy Data”, which generates the DML script to move data from source to target. A review of the code will show that it uses the dictionary views on the source server, across the database link, to get all table and column names that are to be included in the script. The construction of the insert statement is the bit of code we will need to modify, adding the Oracle database flashback query syntax.
...
l_trunc_stmt := 'truncate table <$=jmodelSchema$>.' || col_rec.table_name;
l_sql_stmt := 'INSERT /*+ APPEND */ INTO <$=jmodelSchema$>.' || col_rec.table_name
    || ' (' || rtrim(l_column_list, ', ') || ') '
    || 'SELECT ' || rtrim(l_column_expr, ', ')
    || ' FROM ' || col_rec.table_name || '@<%=DBLinkName%> as of scn #INITIAL_LOAD_SCN';
...
As you can see, #INITIAL_LOAD_SCN is a placeholder for an ODI Variable. I chose to use a variable to refresh the SCN from the source rather than hard-code the SCN value. I created a variable named INITIAL_LOAD_SCN and set the query on the Refreshing tab to execute from the data warehouse over the database link, capturing the current SCN from the source database.
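Here is the refreshing query as a simple sketch; it assumes the DW_TO_OLTP database link created earlier and executes against the data warehouse:

SELECT current_scn FROM v$database@DW_TO_OLTP;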
The user set up to connect to the OLTP source will temporarily need to be granted the “select any dictionary” privilege in order to allow the select from V$DATABASE.
SQL> grant select any dictionary to SYSADM;
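Once the variable refresh has been verified, the privilege can be revoked:

SQL> revoke select any dictionary from SYSADM;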
Now that the Variable is set and the Procedure code has been modified, we just need to put it all together in a Package and generate a Scenario. The Package “Copy OLTP to SDS” is already set up to call the Procedure “Copy SDS Data”, so we can simply add the ODI Variable as a refresh step at the beginning of the Package.
After saving the Package, we need to generate a Scenario to execute. When generating, be sure to set all Variables except for INITIAL_LOAD_SCN as Startup Variables, as their values will be set manually during the execution of the Scenario. Also, remember to provide a different name than the original Scenario.
GoldenGate Startup and Script Execution
All of the pieces are in place to kick off the initial load of the Source Dependent Data Store and fire up the GoldenGate replication. Even though the goal is to have zero downtime for the OLTP application, it is best to complete the process during a “slow” period, when few transactions are being processed, if possible.
First, let’s get the GoldenGate extract and data pump processes running and capturing source transactions. On the source server, browse to the GoldenGate directory and run the GGSCI application. Ensure the Manager is running, and execute the “start extract” command for each of the processes that need to be kicked off.
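As a sketch, using the process group naming convention from the previous post with our data source number of 20 (the actual group names depend on your configuration):

GGSCI (pssrc) 1> start mgr
GGSCI (pssrc) 2> start extract EXT_20
GGSCI (pssrc) 3> start extract DP_20
GGSCI (pssrc) 4> info all

Note that the data pump is also started with the “start extract” command, since a data pump is just a specialized extract process.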
Now that the transactions are flowing into the source trail and across the network to the target trail, we can execute the Scenario to generate the initial load script files. When executed, a prompt will appear, allowing you to enter the appropriate value for each variable. The script can be filtered down by a specific list of tables, if necessary, by adding a comma-delimited list to the TABLE_LIST variable. We’ll just use a wildcard value to generate the script for all tables. Other options are to generate a script file (Y or N) and to execute the DML during the execution of the Scenario (even though the Variable is named RUN_DDL). I have chosen to create a script file and run it manually.
The script, named “BIA_SDS_Copy_Data_<session_number>.sql”, will disable constraints, drop indexes, and truncate each table in the SDS prior to loading the data from the source system. After executing the copy data script, we will want to run the “BIA_SDS_Schema_Index_DDL_<session_number>.sql” script to recreate the indexes.
SQL> @BIA_SDS_Copy_Data_885500.sql
SQL> @BIA_SDS_Schema_Index_DDL_880500.sql
One thing to note: in the SDS Copy Data script, the primary key constraints are disabled for a more performant insert of the data. But the SDS Schema Index DDL code creates the constraints via an alter table script rather than enabling the existing constraints. To work around this bug, I opened the Copy Data script in SQL Developer, copied all of the lines that disable the constraints, pasted them into a new window, switched the “disable” keyword to “enable” with a simple find and replace, and then executed the script against the SDS tables.
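As a sketch of that workaround, with a hypothetical SDS table and constraint name, the find and replace turns the first statement below into the second:

-- As generated in the Copy Data script (table and constraint names are hypothetical):
ALTER TABLE BIAPPS_SDS_PSFT_90_CS_20.PS_LEDGER DISABLE CONSTRAINT PS_LEDGER_PK;
-- After switching "disable" to "enable", run against the SDS:
ALTER TABLE BIAPPS_SDS_PSFT_90_CS_20.PS_LEDGER ENABLE CONSTRAINT PS_LEDGER_PK;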
After copying the data and recreating the indexes in the SDS (and enabling the PK constraints), we can finally start up the replicat GoldenGate process on the target server. Again, log in to GGSCI and ensure the Manager process is running. This time, when we start the process we will use the AfterCSN option, ensuring the replicat only picks up transactions from the trail file that occurred after the initial load SCN.
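Assuming the replicat group REP_20 from our naming convention, and with placeholder values for the prompt hostname and the SCN captured by the INITIAL_LOAD_SCN variable, the startup command looks something like this:

GGSCI (pstrg) 1> start replicat REP_20, aftercsn 5965123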
We now have our initial load of data to the SDS schema completed and GoldenGate replication started, all without any impact to the source OLTP application. The next time the Source Data Extract (SDE) Load Plan is executed, it will be just as if it were running directly against the source database – only faster – since it’s pulling data from the SDS schema on the same server.
Be on the lookout for more blog posts on OBIA 11.1.1.7.1 in the future. And if you need a fast-track to Oracle BI Applications implementation, feel free to drop us a line here at Rittman Mead at info@rittmanmead.com.
Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration
1. Introduction
2. Setup and Configuration
3. SDS and Parameter File Setup
4. Initial Load and Replication
Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration – Part 3: SDS and Parameter File Setup
This is the third post in the four part series on Oracle BI Applications 11.1.1.7.1 and Oracle GoldenGate 11g. Last time, we made it through the initial setup of GoldenGate and created the necessary schemas for the Source Dependent Data Store (SDS) and GoldenGate on the source and target. Now it’s time to configure Oracle Business Intelligence Applications (OBIA) to use the SDS schema, setup the tables in the SDS schema, and generate the GoldenGate parameter files.
Enable the SDS
Set the SDS Parameter
The first step in configuration is to enable the SDS in the OBIA Configuration Manager. This indicator, set for each source instance, will be used throughout the Oracle Data Integrator (ODI) ETL processing to determine whether to connect directly to the source or to the SDS. Choose “Manage Data Load Parameters” from the Tasks list. In the Manage Data Load Parameters window, select the appropriate Source Instance (in this example we are using PeopleSoft Campus Solutions – so PS_CS is chosen) and search for the Parameter name IS_SDS_DEPLOYED.
Select the parameter and click the Edit button. There will be a warning that this parameter will be set globally for this Source Instance. Since we plan to pull all of our data from Campus Solutions using OGG, click Yes to continue (otherwise, you could choose a specific fact group on which to set the parameter). In the Edit Dialog, change the parameter value to YES.
Create the SDS Physical Schema
Now we need to create the Physical Schema for the SDS schema in the Oracle Data Integrator Topology metadata. Open ODI Studio and select the Topology Navigator. Under Physical Architecture, drill-down into Oracle and find the Data Server that corresponds to the data warehouse target for OBIA. Right-click and choose “New Physical Schema”. Set both the Schema and Work Schema values to the SDS database schema that we created in the previous post.
Next, click on the FlexFields tab. We need to set the DATASOURCE_NUM_ID FlexField to match that of the source instance for which the SDS schema is a target. If you recall from the previous post, the DataSource Number, or DSN, was set to 20 for my OBIA setup of the PeopleSoft Campus Solutions source instance. Set this value to the DSN and save the Physical Schema.
How It Works…
It’s great that once the SDS parameter is set in the Configuration Manager and the SDS Physical Schema is created in ODI (both steps we just completed), the Load Plans will just “know” to extract data from the SDS rather than the source itself. But I like to dig a little deeper to find out what’s really going on behind the scenes, especially when it comes to Oracle Data Integrator.
Thinking about how I would implement this feature in ODI, I first started looking for an ODI Variable…and sure enough, I found one named IS_SDS_DEPLOYED. The Variable refresh code calls a procedure which accepts a DSN and returns a true or false. Next, I checked the Loading Knowledge Module used in the Interfaces that extract from PeopleSoft Campus Solutions, “LKM BIAPPS SQL to Oracle (Multi Transport)”, as this is where the connection to the source is made during execution. In the list of LKM steps is one titled “Get SDS Schema”. The code checks to ensure the IS_SDS_DEPLOYED variable is true, then runs a SQL query against the ODI repository to lookup the SDS schema name based on the DSN in the physical schema flexfield.
String sqlSelSDSSchema = "SELECT PS.SCHEMA_NAME, PS.I_PSCHEMA, PS.EXT_NAME, FFV.SHORT_TXT_VALUE "
    + " FROM SNP_PSCHEMA PS INNER JOIN SNP_FF_VALUEM FFV ON FFV.I_INSTANCE = PS.I_PSCHEMA "
    + " INNER JOIN SNP_FLEX_FIELD FF ON FF.FF_CODE=FFV.FF_CODE AND FF.I_OBJECTS=FFV.I_OBJECTS "
    + " INNER JOIN SNP_CONNECT C ON C.I_CONNECT=PS.I_CONNECT "
    + " WHERE FF.FF_CODE='DATASOURCE_NUM_ID' AND FF.I_OBJECTS=1500 "
    + " AND C.CON_NAME='" + tgtDataServerName + "' AND FFV.SHORT_TXT_VALUE='" + "#DATASOURCE_NUM_ID" + "'";
The SDS schema is then used in the next step, “Create SDS work view”, to generate the C$ view code against the SDS schema rather than the source instance schema. A bit of Java pattern matching is used on the standard “odiRef.getFrom()” substitution API call to perform the replacement.
String fromClause = odiRef.getFrom();
String sdsSchema = "[_SDS_]";
Pattern pattern = Pattern.compile("(=snpRef.getObjectName([^,]*),\s*\x22([^,]*)\x22,([^\)]*)\))");
Matcher matcher = pattern.matcher(fromClause);
String squote = new Character((char)34).toString();
String replacefromClause = matcher.replaceAll("out.print(" + squote + sdsSchema + ".$3 " + squote + ");");

// Replace for Qualify function since this runs on target
pattern = Pattern.compile("(\bQUALIFY\(([^\)\s]*)\))");
matcher = pattern.matcher(replacefromClause);
replacefromClause = matcher.replaceAll(sdsSchema + ".$2 ");
%>
create or replace view <%=odiRef.getTable("L", "COLL_NAME", "A")%> as
... from <%=replacefromClause.replace("[_SDS_]","<$=sdsSchemaName$>")%> ...
It looks as though the variable is also used in many other places, but now we know exactly how the ODI Interfaces used in OBIA’s extract Load Plans can so easily switch to using the Source Dependent Data Store. The customized code written in ODI specifically for the SDS is definitely worth a look.
Generate and Execute SDS DDL Scripts
Now we have Oracle BI Apps configured to use the SDS schema as a source rather than the actual source. GoldenGate will keep the SDS schema up-to-date with the latest source changes in real-time. Next, we will execute an ODI Procedure to generate the DDL scripts for the SDS schema tables and indexes, as well as the grants on the source tables for the GoldenGate user. I like how the OBIA developers made this easy. The only way to make it even simpler would be to call this procedure from the Configuration Manager (maybe in the next release?).
To execute the Scenario, open ODI Studio, go to the Designer Navigator, and in the Projects browse to BI Apps Project > Components > SDS > Oracle > Generate SDS DDL. Expand Packages > Generate SDS DDL > Scenarios and you will find Scenario “GENERATE_SDS_DDL Version 001”.
Right-click the Scenario and select Execute. After selecting the Context and Agent, you will be presented with a list of ODI Variables used in the Scenario. These provide input into the script generation Procedure and are described in detail in section 5.4.4, “Setup Step: Generate, Deploy, and Populate the Source Dependent Schema Tables on Target Database”, of the OBIA documentation.
I’m generating the DDL script for the first time, so I will choose “FULL” for the REFRESH_MODE variable, which will attempt to perform a drop and recreate of all objects. I have also decided to run the scripts manually, so I set the CREATE_SCRIPT_FILE equal to “Y” and RUN_DDL to “N”. I could actually have the Scenario execution run the script for me by setting RUN_DDL equal to “Y”, which is a nice feature. After setting all variables appropriately, click OK to execute the Scenario. Browse to the temp directory specified to review the different scripts:
- BIA_SDS_Schema_Table_DDL_<session_no>.sql and BIA_SDS_Schema_Index_DDL_<session_no>.sql, which can now be executed against the SDS schema to create the SDS tables and apply indexes
- BIA_SDS_Schema_Source_Grants_DDL_<session_no>.sql, which can be run against the source schema to apply grants to the source GoldenGate user
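With CREATE_SCRIPT_FILE set to “Y”, the scripts can then be run manually from SQL*Plus; for example (session number placeholders left as generated):

-- Against the SDS schema on the target:
SQL> @BIA_SDS_Schema_Table_DDL_<session_no>.sql
SQL> @BIA_SDS_Schema_Index_DDL_<session_no>.sql
-- Against the source database, as a privileged user:
SQL> @BIA_SDS_Schema_Source_Grants_DDL_<session_no>.sql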
Generate the Initial Load Script
The OBIA-GoldenGate documentation describes the setup and configuration of the initial load process next. But, since I plan on doing this a slightly different way, I’ll hold off on describing it until the next post.
Generate GoldenGate Parameter Files and Perform Configuration
Generate Parameter Files
Finally we’re at the point of generating the parameter files for the GoldenGate extract, data pump, and replicat processes that we set up in the previous post. Here is where we might expect to see the “JKM Oracle to Oracle Consistent (OGG)” Journalizing Knowledge Module and the ODI CDC Framework put to use… but that is not the case. For starters, the CDC Framework is not part of the OBIA implementation for change data capture. The OBIA developers did it a different way, comparing the “Last Extract Date” to the “Last Update Date” to identify new and changed records, always pulling from the fully replicated table in the SDS. They also created an ODI Procedure to generate the parameter files rather than using a JKM.
Just as we did with the Generate DDL Script Scenario, browse to BI Apps Project > Components > SDS > Generate SDS OGG Param Files. Expand Packages > Generate SDS OGG Param Files > Scenarios and find Scenario “GENERATE_SDS_OGG_PARAM_FILES Version 001”. Right-click and select Execute. We are again provided with a list of Variables which need the appropriate values added prior to starting the process. Many of them should look familiar if you have worked with the GoldenGate JKM – but with much more intuitive names.
Set the appropriate values for the variables, described in detail in section 5.4.5, “Setup Step: Generate and Deploy Oracle GoldenGate Parameter Files to Source and Target Machines”, of the OBIA docs, and click OK to execute the Scenario. Once the Scenario has completed, browse out to the temporary directory and you should find a folder named DSN_<DATASOURCE_NUM_ID>. Within this directory are source and target folders, with the generated parameter files inside each. Copy the .prm files from the source folder to the source GoldenGate dirprm directory, and the ADD_TRANDATA.txt file to the GoldenGate home directory. From the target folder, copy the parameter file to the target dirprm folder.
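As a rough sketch of the file copies, assuming a hypothetical GoldenGate home on each server and our DSN of 20:

# On the source server (paths are assumptions):
cp /tmp/DSN_20/source/*.prm /u01/app/oracle/product/ogg/dirprm/
cp /tmp/DSN_20/source/ADD_TRANDATA.txt /u01/app/oracle/product/ogg/
# On the target server:
cp /tmp/DSN_20/target/*.prm /u01/app/oracle/product/ogg/dirprm/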
GoldenGate Configuration
To begin our configuration, let’s first edit our extract, pump, and replicat parameter files to add any options or commands necessary, such as trail file encryption. This can be done using the local system text editor or via GGSCI (the GoldenGate Software Command Interpreter). If there are no special options to add to the parameter files, they will work just fine without any modification.
Next, we’ll add table-level supplemental logging on the source by running the ADD_TRANDATA.txt script file. This file is just a list of GoldenGate commands that can be executed as a batch by calling the obey command. Log in to GGSCI and run the following:
GGSCI (pssrc) 3> obey ADD_TRANDATA.txt
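The generated file is simply a series of ADD TRANDATA commands, one per replicated table; here is a sketch of its contents, with assumed Campus Solutions table names:

add trandata SYSADM.PS_LEDGER
add trandata SYSADM.PS_STDNT_ENRL
...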
Once completed, remember to revoke the ALTER TABLE privilege from the source GoldenGate user.
REVOKE ALTER ANY TABLE FROM ogg_user;
The final piece of configuration is to generate the source definitions file. This file provides the GoldenGate replicat process on the target with the definitions of the source tables, so it can interpret the table and column mappings in the replicat parameter file. It must be used for replication to the SDS because three additional columns are mapped to each target table.
CDC$_RPL_LAST_UPDATE_DATE = @DATENOW (),
CDC$_SRC_LAST_UPDATE_DATE = @GETENV ("GGHEADER", "COMMITTIMESTAMP"),
CDC$_DML_CODE = "I" --"U" or "D"
The replicat last update date (CDC$_RPL_LAST_UPDATE_DATE) is the timestamp when the record is updated in the target table. The source last update date (CDC$_SRC_LAST_UPDATE_DATE) is the timestamp when the change occurred in the source. And finally, the DML code (CDC$_DML_CODE) represents the type of change: insert, update, or delete.
To generate the source definitions file, exit GGSCI and run the following command:
./defgen paramfile ./dirprm/DEFGEN.prm
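The DEFGEN.prm parameter file is generated along with the rest; as a sketch of what it contains (the defs file location and table list are assumptions):

DEFSFILE ./dirdef/DEFGEN.def
USERID ogg_user, PASSWORD Password01
TABLE SYSADM.*;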
Once completed, copy the source definitions file from the source server to the target server dirdef directory.
We’ve now configured the GoldenGate parameter files on both the source and target. In the next, and final, post in the series, I’ll discuss the initial load process and we will finally start the source to SDS GoldenGate replication.
Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration
1. Introduction
2. Setup and Configuration
3. SDS and Parameter File Setup
4. Initial Load and Replication
Rittman Mead / ODTUG India BI Masterclass Tour Roundup
Over the past week, Venkat, the Rittman Mead India team and I have been running a series of BI Masterclasses at locations in India, in conjunction with ODTUG, the Oracle Development Tools User Group. Starting off in Bangalore, then traveling to Hyderabad and Mumbai, we presented on topics ranging from OBIEE through Exalytics to EPM Suite and BI Applications, with networking events at the end of each day.
Around 50 attended in Bangalore, 30 in Hyderabad and 40 in Mumbai, and at the last event we were joined by Harsh Bhogle from the local Oracle office, who presented on Oracle’s high-level strategy around business analytics. Thanks to everyone who attended, thanks to ODTUG for sponsoring the networking events, and thanks especially to Vijay and Pavan from Rittman Mead India, who organised everything behind the scenes. If you’re interested, here’s a Flickr set of photos from all three events (plus a few from the start of the week, when I visited our offices in Bangalore).
For anyone who couldn’t attend the events, or if you were there and you’d like copies of the slides, the links below are for the PDF versions of the sessions we presented at various points over the week.
- Oracle BI, Analytics and EPM Product Update
- Extreme BI: Agile BI Development using OBIEE, ODI and Golden Gate
- OBIEE 11g Integration with the Oracle EPM Stack
- OBIEE and Essbase on Exalytics Development & Deployment Best Practices
- OBIEE 11g Security Auditing
- Intro and tech deep dive into BI Apps 11g + ODI
- Metadata & Data loads to EPM using Oracle Data Integrator
So I’m writing this in my hotel room in Mumbai on Sunday morning, waiting for the airport transfer and then flying back to the UK around lunchtime. It’s been a great week but my only regret was missing the UKOUG Apps’13 conference last week, where I was also supposed to be speaking but managed to double-book myself with the event in India.
In the end, Mike Vickers from Rittman Mead in the UK gamely took my place and presented my session, which was put together as a joint effort with Minesh Patel, another member of the UK team and one of our BI Apps specialists. Entitled “Oracle BI Apps – Giving the Users the Reports they *Really* Want”, it’s a presentation around the common front-end customisations that we typically carry out for customers who want to move beyond the standard, generic dashboards and reports provided by the BI Apps. Again, if you missed the session or you’d like to see the slides, they’re linked to below:
That’s it for now – and I’ll definitely be at Tech’13 in a few weeks’ time, if only because I’ve just realised I’m delivering the BI Masterclass sessions on the Sunday, including a session on OBIEE/ODI and Hadoop integration - I’ve been saying to myself I’d like to get these two tools working with Impala as an alternative to Hive, so that gives me something to start looking at on the flight back later today.
Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration – Part 2: Setup and Configuration
In my previous post, I introduced the steps necessary for integrating Oracle BI Applications 11.1.1.7.1 and GoldenGate (OGG). Now, I’m going to dive into the details and describe how to complete the setup and configuration of GoldenGate and the Source Dependent Data Store schema. As I mentioned before, this process will closely follow Oracle’s documentation on “Administering Oracle GoldenGate and Source Dependent Schemas”, providing additional information and insight along the way.
User and Schema Setup
The first step is to manually create the GoldenGate user on the source and target databases. These users, along with the Source Dependent Data Store schema, are not created by the BI Apps installer like the other standard schemas. This will be a dedicated user for OGG, and will have privileges specific to the needs of the extract process on the source and the replicat process on the target.
Create Source GoldenGate User
Beginning with the source, create the user and grant the initial privileges. Be sure your tablespace has already been created.
-- Create OGG User on the source
CREATE USER ogg_user IDENTIFIED BY Password01
DEFAULT TABLESPACE ggs_data QUOTA UNLIMITED ON ggs_data;

GRANT CREATE SESSION TO ogg_user;
GRANT ALTER SESSION TO ogg_user;
GRANT SELECT ANY DICTIONARY TO ogg_user;
GRANT FLASHBACK ANY TABLE TO ogg_user;
The specific table grants will not be made until later on via a script generated by an ODI Procedure, as the GoldenGate user does not need SELECT ANY TABLE privileges. On the other hand, the user does temporarily need ALTER ANY TABLE in order to set up supplemental logging for individual tables. Later on, this privilege can be revoked.
GRANT ALTER ANY TABLE TO ogg_user;
Finally, we’ll set up supplemental logging at the database level, ensuring the necessary information is logged for each transaction.
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
Create Target GoldenGate User
Next, we’ll go out to the target server and create the GoldenGate user with target-specific privileges. Since GoldenGate performs the DML on the target based on the changes made in the source database, the user will need INSERT, UPDATE, and DELETE privileges. Again, rather than granting INSERT ANY TABLE, etc., the specific table grants will be generated as a script via an ODI Procedure.
-- Create OGG User
CREATE USER ogg_target IDENTIFIED BY Password01
DEFAULT TABLESPACE USERS QUOTA UNLIMITED ON USERS;

GRANT CREATE SESSION TO ogg_target;
GRANT ALTER SESSION TO ogg_target;
GRANT SELECT ANY DICTIONARY TO ogg_target;
We’ll be creating the checkpoint table via GoldenGate, so this user will temporarily need to be granted the CREATE TABLE privilege. The checkpoint table will keep track of the latest position in the target trail file, allowing a clean recovery should the target database go offline.
GRANT CREATE TABLE TO ogg_target;
Create SDS User
Now we’ll create the SDS user and schema. A separate SDS schema must be created for each OLTP source application, as the SDS schema will essentially act as the source schema. We’ll follow the recommended naming convention for the schema: <BIAPPS>_SDS_<Model Code>_<DSN>. BIAPPS is a user-defined code signifying this is a BI Applications schema; to keep it simple, we’ll use BIAPPS. The Model Code is the unique code assigned to the data source, and the DSN is the data source number for that source application.
In this example using PeopleSoft Campus Solutions, the SDS schema name is BIAPPS_SDS_PSFT_90_CS_20. Not a very friendly name to type, but it serves its purpose in identifying the source of the schema data.
-- Create tablespace.
CREATE TABLESPACE BIAPPS_SDS_PSFT_90_CS_20_TS
DATAFILE '/u01/app/oracle/oradata/orcldata/BIAPPS_SDS_PSFT_90_CS_20.dbf' SIZE 100M
AUTOEXTEND ON NEXT 10M
LOGGING
DEFAULT COMPRESS FOR OLTP;

-- Create SDS User
CREATE USER BIAPPS_SDS_PSFT_90_CS_20 IDENTIFIED BY Password01
DEFAULT TABLESPACE BIAPPS_SDS_PSFT_90_CS_20_TS
QUOTA UNLIMITED ON BIAPPS_SDS_PSFT_90_CS_20_TS;

-- Required Grants
GRANT CREATE SESSION TO BIAPPS_SDS_PSFT_90_CS_20;
GRANT CREATE TABLE TO BIAPPS_SDS_PSFT_90_CS_20;
Finally, the GoldenGate target user must be granted access to use the SDS tablespace for inserts/updates.
-- OGG user must be granted quota to insert and update data
ALTER USER ogg_target QUOTA UNLIMITED ON BIAPPS_SDS_PSFT_90_CS_20_TS;
Install and Configure GoldenGate
The schemas are in place, so the next part of the setup is to install and configure the GoldenGate application on both the source and target servers. The GoldenGate installation process is pretty well documented on Gavin Soorma’s blog, so I won’t go into much detail here. The Oracle BI Applications documentation also has some example scripts, which take you through the setup of the extract, data pump, and replicat group processes.
The naming standards for the parameter files are fairly straightforward, with DSN being the same data source number we used in the SDS schema name.
- Extract: EXT_DSN
- Data Pump: DP_DSN
- Replicat: REP_DSN
Following the OBIA documentation examples, you will end up with each process group set up and the checkpoint table created in the target GoldenGate schema. I prefer to create an obey file for both the source and target setup scripts, similar to the following target-side example.
--stop manager on target db
dblogin userid ogg_target, password Password01
stop mgr

--stop gg processes
stop rep_20
delete rep_20

--delete CHECKPOINTTABLE
DELETE CHECKPOINTTABLE ogg_target.OGGCKPT

--delete previous trail files
SHELL rm ./dirdat/*

--start manager on target db
start mgr

--create CHECKPOINTTABLE in target db
dblogin userid ogg_target, password Password01
ADD CHECKPOINTTABLE ogg_target.OGGCKPT
add replicat rep_20, exttrail ./dirdat/tr, CHECKPOINTTABLE ogg_target.OGGCKPT
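The source-side obey file follows the same pattern; here’s a sketch, assuming extract EXT_20 and data pump DP_20 per the naming convention above, a local trail at ./dirdat/ts, and the remote trail ./dirdat/tr used by the replicat:

--stop and remove any existing source processes
dblogin userid ogg_user, password Password01
stop mgr
stop ext_20
delete ext_20
stop dp_20
delete dp_20

--delete previous trail files
SHELL rm ./dirdat/*

--start manager and recreate the process groups
start mgr
add extract ext_20, tranlog, begin now
add exttrail ./dirdat/ts, extract ext_20
add extract dp_20, exttrailsource ./dirdat/ts
add rmttrail ./dirdat/tr, extract dp_20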
Using an obey script allows me to rerun the process should there be any sort of issue or failure, and it also provides a template I can reuse for additional sources and SDS targets. The result should be process groups set up and ready to roll (once the parameter files are in place, of course).
Remember to revoke the CREATE TABLE privilege from the target GoldenGate user once the checkpoint table has been created.
REVOKE CREATE TABLE FROM ogg_target;
In the next post, I’ll walk through the SDS setup in OBIA and ODI, as well as the ODI Procedures that help generate the GoldenGate parameter files, SDS schema DDL, and initial load scripts.
Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration
1. Introduction
2. Setup and Configuration
3. SDS and Parameter File Setup
4. Initial Load and Replication
Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration – Part 1: Introduction
The release of Oracle Business Intelligence Applications 11.1.1.7.1 includes a major change in components, with Oracle Data Integrator replacing Informatica as the ETL application. The next logical step was to integrate Oracle’s data replication tool, Oracle GoldenGate, for a real-time load of source system data to the data warehouse. Using GoldenGate replication rather than a conventional extract process, contention on the source is essentially eliminated and all of the source OLTP data is stored locally on the data warehouse, eliminating network bottlenecks and allowing ETL performance to increase. In this series of posts, I’m going to walk through the architecture and setup for using GoldenGate with OBIA 11.1.1.7.1.
GoldenGate and the Source Dependent Data Store
For those of you not familiar with Oracle GoldenGate (OGG), it is the standard Oracle product for data replication, providing log-based change data capture, distribution, and delivery in real-time.
GoldenGate captures transactional data changes from the source database redo log and loads the changes into its own log file, called a Trail File, using a platform-independent universal data format. The Extract process understands the schemas and tables from which to capture changes based on the configuration set in the Extract parameter file. The data is then read from the Source Trail File and moved across the network to the Target Trail File using a process called a Data Pump, also driven by a parameter file. Finally, the transactions are loaded into the target database tables using the Replicat parameter file configuration, which maps source tables and columns to their target. The entire process occurs with sub-second latency and minimal impact to the source and target systems.
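To give a flavour of how the parameter files drive each process, here’s a minimal Extract parameter sketch; the group name, trail path, and table are assumptions, and in OBIA the real files are generated for us (as we’ll see later in the series):

EXTRACT EXT_20
USERID ogg_user, PASSWORD Password01
EXTTRAIL ./dirdat/ts
TABLE SYSADM.PS_LEDGER;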
In my previous blog posts regarding Oracle GoldenGate, I described how to replicate changes from the source to the Staging and Foundation layers of the Oracle Reference Architecture for Information Management. In OBIA, GoldenGate is used for pure replication from the source database to the target data warehouse, into what is known as the Source Dependent Data Store (SDS) schema.
The SDS is set up to look exactly like the source schema, allowing the pre-built Oracle Data Integrator Interfaces to change which source they use from within the Knowledge Module by evaluating a variable (IS_SDS_DEPLOYED) at various points throughout the KM (we’ll look at this in more detail later on). Using this approach, the GoldenGate integration can be easily enabled at any point, even after initial configuration. In fact, that is exactly what I did, making the switch to using OGG after my first full data load from the source without GoldenGate. The Oracle BI Apps team did a great job of utilizing the features of ODI that allow the logical layer to be abstracted from the physical layer and data source connection.
Getting Started – High Level Steps
To begin, we must first install Oracle BI Applications 11.1.1.7.1, if it is not already up and running in your environment. I followed the recently published OTN article, “Cookbook: Installing and Configuring Oracle BI Applications 11.1.1.7.1”, written by Mark Rittman and Kevin McGinley. Rather than use Windows, though, I decided to go with Linux as my host operating system for OBIA. This had its own challenges, but nothing’s worth doing if there isn’t a bit of learning involved! After generating the “Source Extract and Load” Load Plan, it’s time to set up GoldenGate.
Before we dig into the details of the GoldenGate integration, let’s review the necessary steps at a high-level. The process follows Oracle’s documentation on administering GoldenGate and OBIA Source Dependent Schema.
1. Configure the source and target database schemas.
We need to create a GoldenGate user on both the source and target databases, as well as the Source Dependent Data Store schema on the target, along with the appropriate grants, etc.
2. Install Oracle GoldenGate on the source and target servers.
Download and install GoldenGate on each server. The Oracle BI Applications documentation shows an example on how to get the configuration started.
3. Configure the Source Dependent Data Store.
Enable the SDS in the OBIA Configuration Manager and create a new Physical Schema for the SDS in Oracle Data Integrator.
4. Generate and execute the DDL to create tables in the SDS schema on the target database.
As part of the OBIA installation, many “standard” ODI Packages and Procedures were created, including GENERATE_SDS_DDL. When executed with the appropriate values entered into its Options, the process will generate a SQL script that can then be executed against the SDS.
5. Generate the initial load script.
Yet another OBIA-delivered Procedure will generate a SQL script for the initial load from the source to the SDS schema. The script will contain INSERT statements using a database link from target to source. This script is useful if an outage can be scheduled for the source application during the OBIA and GoldenGate integration setup. But if transactions are still flowing into the source application, a different approach will need to be used. We’ll get into more detail on this later (hint: it involves the source SCN).
6. Generate and deploy the GoldenGate parameter files.
This is where we might expect to see the “JKM Oracle to Oracle Consistent (OGG)” Journalizing Knowledge Module put to use, correct? But no, the OBIA product team decided to go with a custom Procedure rather than the JKM, as ODI Change Data Capture (aka Journalizing) is not used. Execute the GENERATE_SDS_OGG_PARAM_FILES Scenario, copy the parameter files to the appropriate locations, and complete the GoldenGate configuration.
7. Start GoldenGate replication.
Again, if there is not a source outage we will probably need to customize our start replicat statement to ensure we do not miss any transactions from the source.
Once GoldenGate replication has begun, you can continue to use the same “Source Extract and Load” Load Plan that was originally generated to pull data from the source database into the data warehouse. But now, instead of reaching out to the source database, the Load Plan will pull data from the SDS schema into the staging area.
In the next post, we’ll dive deeper into the setup and configuration details, working through each of the steps listed above.
Oracle BI Apps 11.1.1.7.1 – GoldenGate Integration
1. Introduction
2. Setup and Configuration
3. SDS and Parameter File Setup
4. Initial Load and Replication