
OBIEE Enterprise Security

The Rittman Mead Global Services team have recently been involved in a number of security architecture implementations and have produced a security model that meets a diverse set of requirements.  Using our experience and standards, we have been able to deliver a robust model that addresses the common questions we routinely receive around security, such as:

“What considerations do I need to make when exposing Oracle BI to the outside world?”

or

“How can I make a flexible security model which is robust enough to meet the demands of my organisation but easy to maintain?”

The first question concerns a standard enterprise security model where the Oracle BI server is exposed through a web host, with SSL enabled and access security tightened up.  This can be complex to achieve but is something we have now implemented many times.

The second question is much harder to answer, but our experience has led us to develop a multi-dimensional inheritance security model, which we have implemented with numerous clients with excellent results.

What is a Multi-dimensional Inheritance Security Model?

The wordy title is actually a simple concept that incorporates five key qualities:

  • Easy to setup and maintain
  • Flexible
  • Durable
  • Expandable
  • Consistent throughout the product

While there are numerous ways of implementing a security model in Oracle BI, sticking to the key concepts above ensures we get it right.  The largest challenge we face in BI is the different types of security required, and all three need to work in harmony:

  • Application security
  • Content security
  • Data security

Understanding the organisation makeup

The first step is to consider the makeup of a common organisation and build our security around it.

1

This diagram shows different departments (Finance, Marketing, Sales) whose data is specific to them: normally, departmental users should only see the data relevant to their own department.  In contrast, the IT department who are developing the system need visibility across all data, and so do the Directors.

 

What types of users do I have?

Next is to consider the types of users we have:

  1. BI Consumer: This will be the most basic and common user who needs to access the system for information.
  2. BI Analyst: The Analyst will be expected to generate more bespoke queries and will need ways to present them, as well as an area to save these reports.
  3. BI Author: The BI Author will be able to create content and publish that content for the BI Consumers and BI Analysts.
  4. BI Department Admin: The BI Department Admin will be responsible for permissions for their department as well as act as a focal point user.
  5. BI Developer: The BI Developer can be thought of as the person(s) who creates models in the RPD and will need additional access to the system for testing of their models. They might also be responsible for delivering Answers Requests or Dashboards in order to ‘Prove’ the model they created.
  6. BI Administrator: The Administrator will be responsible for the running of the BI system and will have access to every role.  Most administrator tasks do not require SQL or data-warehousing skills, so this role is generally separated from the BI Developer role.

The types of users here combine every requirement we have seen and might not all be required by every client.  The order shows the implied inheritance: the BI Analyst inherits permissions and privileges from the BI Consumer, and so on.
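The inheritance chain above can be sketched in code. This is a minimal Python illustration of the idea, where each role accumulates the permissions of the roles before it; the role ordering matches the list above, but the permission names are illustrative examples, not actual OBIEE privilege names:

```python
# Sketch of implied permission inheritance between user types.
# Permission names are examples, not actual OBIEE privilege names.

ROLE_CHAIN = ["BI Consumer", "BI Analyst", "BI Author",
              "BI Department Admin", "BI Developer", "BI Administrator"]

# Permissions granted directly at each level.
DIRECT_PERMISSIONS = {
    "BI Consumer": {"view dashboards"},
    "BI Analyst": {"access answers", "save own reports"},
    "BI Author": {"create dashboards"},
    "BI Department Admin": {"manage department permissions"},
    "BI Developer": {"access all departments"},
    "BI Administrator": {"administer system"},
}

def effective_permissions(role):
    """Union of the role's own permissions and everything it inherits."""
    perms = set()
    for r in ROLE_CHAIN[: ROLE_CHAIN.index(role) + 1]:
        perms |= DIRECT_PERMISSIONS[r]
    return perms

# A BI Author inherits the Consumer and Analyst permissions as well.
print(sorted(effective_permissions("BI Author")))
```

The point of the chain is that granting a permission at a low level automatically makes it available to every level above, so each permission only ever needs to be defined once.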

What Types do I need?

The size of the organization determines which types of user groups are required. By default Oracle ships with:

  1. BI Consumer
  2. BI Author
  3. BI Administrator

Typically we would recommend inserting the BI Analyst into the default groups:

  1. BI Consumer
  2. BI Analyst
  3. BI Author
  4. BI Administrator

This works well when there is a central BI team who develop content for the whole organization. The structure would look like this:

2

 

For larger organizations where dashboard development and permissions are handled across multiple BI teams, the BI Department Admin group can be used to locally manage permissions for each department.  Typically we see the BI team as a central Data Warehouse team who deliver the BI model (RPD) to the multiple BI teams.  In a large organization the administration of Oracle BI should be handled by someone who isn’t the BI Developer, so the structure could look like:

3

 

 

Permissions on groups

Each of the groups will require different permissions; at a high level they would be:

 

BI Consumer:
  • View Dashboards
  • Save User Selections
  • Subscribe to iBots

BI Analyst:
  • Access to Answers and a standard set of views
  • Some form of storage
  • Access to Subject Areas

BI Author:
  • Access to create/modify Dashboards
  • Save predefined sections
  • Access to Action Links
  • Access to Dashboard Prompts
  • Access to BI Publisher

BI Department Admin:
  • Ability to apply permissions and manage the Web Catalog

BI Developer:
  • Advanced access to Answers
  • Access to all departments

BI Administrator:
  • Everything

 

Understanding the basic security mechanics in 10g and 11g

In Oracle BI 10g the majority of the security is handled in the Oracle BI Server.  This would normally be done through initialisation blocks, which would authenticate the user against an LDAP server, then run a query against database tables to populate the user into ‘Groups’ used in the RPD and ‘Web Groups’ used in the Presentation Server.  These groups would have to match at each level: database, Oracle BI Server and Oracle BI Presentation Server.

With the addition of Enterprise Manager and WebLogic, the security elements in Oracle BI 11g changed radically.  Authenticating the user in the Oracle BI Server is no longer the recommended way and is limited on Linux. While the RPD Groups and Presentation Server Web Groups still exist, they don’t need to be used.  Users are now authenticated against WebLogic, either using WebLogic’s own users and groups or by plugging it into a choice of LDAP servers.  The end result will be Groups and Users that exist in WebLogic.  The groups then need to be mapped to Application Roles in Enterprise Manager, which can be seen by the Oracle BI Presentation Services and Oracle BI Server.  It is recommended to create a one-to-one mapping for each group.

4

 

What does all this look like then?

Assuming this is for an SME-size organization where the dashboard development (BI Author) is done by the central BI team, the groups would look like:

 

5

 

The key points are:

  • The generic BI Consumer/Analyst groups give their permissions to the department versions
  • No users should be in the generic BI Consumer/Analyst groups
  • Only users from the BI team should be in the generic BI Author/Administrator group
  • New departments can be easily added
  • The lines denote the inheritance of permissions and privileges

 

What’s next – the Web Catalog?

The setup of the web catalog is very important to ensure it does not become unwieldy; it needs to reflect the security model, and we would recommend setting up some base folders which look like:

6

 

Each department has its own folder and four sub-folders. The permission applied to each department’s root folder gives BI Administrators full control across the top.  This is also true for every folder below; however, those folders have additional explicit permissions, described below, to ensure the department cannot create any more than the four sub-folders.

  • The Dashboard folder is where the dashboards go. The department’s BI Developers group will have full control and the department’s BI Consumers will have read access. This allows the department’s BI Developers to create dashboards, the department’s BI Department Admins to apply permissions, and the department’s Consumers and Analysts to view.
  • The same permissions are applied to the Dashboard Answers folder, to the same effect.
  • The Development Answers folder gives full control to the department’s BI Developers and no access to the department’s BI Analysts or BI Consumers. This folder is mainly for the department’s BI Developers to store answers that are still in development.
  • The Analyst folder is where the department’s BI Analysts can save Answers, so they will need full control of this folder.

I hope this article gives some insight into Security with Oracle BI.  Remember that our Global Services products offer a flexible support model where you can harness our knowledge to deliver your projects in a cost effective manner.

All You Need, and Ever Wanted to Know About the Dynamic Rolling Year

DRY_1

Among the many reporting requirements that have comprised my last few engagements has been what, from now on, I’ll be referring to as the ‘Dynamic Rolling Year’: through a dashboard prompt, the user picks a year-month combination and the chosen view, in this case a line graph, dynamically displays a year’s worth of figures with the chosen year-month as the most current point. The picture above demonstrates this feature. You can see that we’ve chosen the ‘201212’ year-month combination and, as a result, our view displays revenue data from 201112 through to 201212. Let’s choose another combination to make sure things work properly.

DRY_2

From the results in the picture above, it seems our view does as expected. We’ve chosen ‘201110’ as our starting year-month combination and the view has changed to display the revenue trend back to ‘201010’. Now how do we do this? A quick web search will offer up myriad complex and time-consuming solutions which, while completely feasible, can be onerous to implement. While it was tempting to reap the benefits of someone else’s perhaps hard-fought solution, I just knew there had to be an easier way. As a note of caution, I’ll preface the next bit by saying that this only works on a dashboard, using a dashboard prompt in tandem with the target analysis. This solution was tested with a report prompt; however, the desired results were not obtained, even when ‘hard coding’ default values for our presentation variables. As well, I’m making a few assumptions about what kinds of columns you have in your subject area’s time dimension, namely that you have some sort of Year and Month column within your hierarchy.

1. Establish Your Analysis and Prompt

DRY_3

I’ve brought in the Month column (renamed Year – Month) from Sample App’s Sample Sales subject area along with the Revenue fact. For the sake of this exercise, we’re going to keep things fairly simple and straightforward. Our Month column exists in the ‘YYYY / MM’ format, so we’re going to have to dice it up a little to get it where we need it. To get our dynamic rolling year to work, all we’re going to do is a little math. In a nutshell, we’re going to turn our month into an integer as ‘YYYYMM’ and simply subtract a hundred in a filter, using the presentation variables that correspond to our new month column. To make the magic happen, bring in a dummy column from your time dimension, concatenate the year and month substrings together and cast the whole statement as an integer, as in the following formula: CAST(SUBSTRING("Time"."T02 Per Name Month", 1, 4) || SUBSTRING("Time"."T02 Per Name Month", -2, 2) AS INT). Then, apply a ‘between’ filter to this column using two presentation variables.

DRY_4

Don’t worry that we’ve yet to establish them on our dashboard prompt; that will happen later. In the picture above, I’ve set the filter to be between two presentation variables entitled ‘YEARMONTH_VAR’. As you can see, I’ve subtracted a hundred because, in this case, my year-month concatenation exists in the ‘YYYYMM’ format. You may alter this number as needed. For example, a client already had this column established in their time dimension, yet it was in the ‘YYYYMMM’ format with a leading zero before the month; in that case, we had to subtract 1000 in order to establish the rolling year. Moving on… after you’ve set up the analysis, save it off and let’s build the dashboard prompt. The results tab will error out; this is normal. We’re going to be essentially doing the same thing there.
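The arithmetic behind the filter is worth spelling out: treated as an integer, subtracting 100 from a YYYYMM value moves the year back by one while leaving the month untouched. A small Python sketch of the rolling-year bounds, purely to illustrate the math:

```python
def rolling_year_bounds(yyyymm, span=100):
    """Return the (lower, upper) YYYYMM bounds for a rolling year.

    Subtracting 100 from an integer in YYYYMM form decrements the year
    and leaves the month untouched, which is exactly the 'between'
    filter used in the analysis. For a column with a leading zero
    before the month, the span would be 1000 instead.
    """
    return yyyymm - span, yyyymm

print(rolling_year_bounds(201212))  # → (201112, 201212)
print(rolling_year_bounds(201110))  # → (201010, 201110)
```

This is why no date functions are needed in the filter at all: the ‘between’ comparison on plain integers does the work.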

We need to get our analysis and prompt to work together (as they should), so we’ll need to, once again, format the year-month column on the prompt the same way we did on our analysis, through the ‘prompt for column’ link. You can simply copy and paste the formula you used in the ‘Edit Formula’ window on your analysis into this screen. Don’t forget to assign a presentation variable to this column with whatever name you used in the filter on your analysis (in this case, YEARMONTH_VAR).

DRY_6

DRY_7

DRY_8

2. Build the Dashboard

2014-12-04_10-04-56

Starting a new dashboard, bring in your prompt and analysis from the Shared Folder. In this exercise, I’ve placed the prompt below the analysis and done some custom formatting. If you’d like to recreate this example exactly, then in addition to the default table view you’ll need to create a line-graph view that groups by our year-month column and uses the revenue column as the measure. And voila. Save and run the dashboard. Note that if you didn’t set a default value for your dashboard prompt, it will error out before you pick and apply a value from the prompt drop-down. Admittedly, the closer you already have to a concatenated year-month (YYYYMM) column in your time dimension, the easier this method will be to implement. You could even set your default value to an appropriate server/repository variable so that your dashboard opens to the most current and/or a desired year-month combination. Now that you’ve seen this, hopefully it’s sparked some inspiration into how you might apply this technique to other facets of your work and future reporting.


Private or Public Training? (and a Discount Code)

Rittman Mead offers world class training that covers various Business Intelligence, Data Warehousing and Data Integration tools.

Rittman Mead currently offers two types of training, Public and Private – so what is the best option for your organization?

Both Private and Public options are staffed by the same highly qualified, experienced instructors. One thing to note: our instructors aren’t merely technology instructors, they are real-world consultants.

What does that mean for you? It means questions aren’t answered by the instructor flipping to a slide in a deck and reading off the best practice. You’ll probably hear an answer more like “I was on site with a company a few weeks ago who had the same issue. Let me tell you how we helped that client!”

Let’s start with Public courses. Our offerings include an OBIEE Bootcamp, an ODI Bootcamp, Exalytics for System Administrators, RPD Modeling and our all new OBIEE Front End Development, Design and Best Practices.

Why Public? If you have four or fewer attendees, the public classes are typically the most cost effective choice. You’ll attend the course along with other individuals from various public and private sector organizations. There’s plenty of time to learn the concepts, get your hands dirty by completing labs as well as getting all of your questions answered by our highly skilled instructors.

Now, our Private courses. There are primarily two reasons a company would choose private training: cost and the ability to customize.

Cost: Our Private Training is billed based on the instructor’s time, not the number of attendees. If you have five or more individuals, it will most likely be more cost effective for us to come to you. Also, you’ll be covering travel costs for one instructor rather than your entire team.

Customization: Once you select the course that is best for your team, we’ll set up a call with a trainer to review your team’s experience level and your main goals. Based on that conversation, we can tweak the syllabus to make sure the areas where you need extra focus are thoroughly covered.

One thing to note is we do offer a few courses (OWB 11gR2 & ODI 11g) privately that aren’t offered publicly.

We also offer a Custom Course Builder (http://www.rittmanmead.com/training/customobiee11gtraining/) which allows you to select modules out of various training classes and build a customized course for your team – this flexibility is a great reason to select Private Training.

Some of our clients love having their team get away for the week and focus strictly on training while others prefer to have training on site at their offices with the flexibility to cut away for important meetings or calls. Whichever program works best for your team, we have got you covered!

For more information, reach out to us at info@rittmanmead.com. If you’ve made it all the way to the end of this blog post, congratulations! As a way of saying thanks, you can mention the code “RMTEN” to save 10% on your next private or public course booking!

UKOUG Partner of the Year Awards

A few days ago Rittman Mead won 5 awards at the UKOUG Partner of the Year Awards.

  • Business Intelligence Partner of the Year (Silver)
  • Training Partner of the Year (Silver)
  • Managed Services (Outsourcing and Operations) Partner of the Year (Silver)
  • Emerging (New Products) Partner of the Year (Silver)
  • Operating Systems, Storage and Hardware Partner of the Year (Gold)

We have consistently done well at this event and feel the awards reflect the effort we put in to both our clients and the user community alike.

The number and diversity of the awards demonstrates how much Rittman Mead has grown over the years. Enterprise Business Intelligence is still at the heart of the work we do, however awards for Operating Systems, Storage and Hardware Partner of the Year and Emerging (New Products) Partner of the Year show the investment we have made in the engineered systems, big data and cloud markets.

Likewise the Training Partner of the Year and Managed Services (Outsourcing and Operations) Partner of the Year awards show the end to end services we now offer.

I would like to say thank you very much to everyone who voted for us and most of all thanks to all our staff who have put in the effort to make this happen.

Oracle BI Cloud Service for SaaS Application Reporting Part 1: Integrating BICS to Salesforce.com using REST APIs

Last month Mark Rittman covered a series of posts detailing the Oracle BI Cloud Service (BICS), aimed at departmental users who want the power of OBIEE 11g without the need to stand-up their own infrastructure. If you’re coming in late, here’s the link to the series.

Before GA, Rittman Mead participated in the beta program for release one of Oracle’s Business Intelligence Cloud Service (BICS). The overall aim of the beta was both to understand the capabilities and to identify potential use cases of the new BICS platform.  As Mark wrote, an excellent use case for BICS is to report on top of any SaaS application that exposes its data (stored in the cloud) using REST APIs, by taking advantage of the ApEx capabilities hosted in the Database Schema Service that comes with BICS. SaaS applications like Oracle’s Fusion CRM, Taleo or Salesforce.com – the one used during the beta program – can easily be integrated and queried with BICS.

The technical goal of our beta program was to check the features, options and limitations of Oracle BICS by connecting it to a Salesforce.com instance: accessing the data exposed through REST APIs using the ApEx native functions, storing the data in the Database Schema Service, creating the Repository using the new Model Editor, and showing the data in dashboards while keeping the same data security settings configured in the source platform.

Salesforce.com to Oracle BI Cloud Service

This series of posts explains all the details of our successful PoC. Over the next few days we’ll be covering the following topics, and we’ll update the list with hyperlinks once the articles are published:

  • Oracle BI Cloud Service for SaaS Application Reporting Part 1: Integrating BICS to Salesforce.com using REST APIs
  • Oracle BI Cloud Service for SaaS Application Reporting Part 2: BICS ApEx components
  • Oracle BI Cloud Service for SaaS Application Reporting Part 3: BICS Repository and Front-end configurations

In this first post of the series we’re going to define the steps required to set up a Salesforce.com demo environment that can be accessed with REST APIs. Later in the post we analyse the most interesting Salesforce.com REST APIs used to extract data from the platform. Before digging into the BICS–Salesforce.com integration, some definitions may be needed:

  • RESTful APIs: A Web service API can be defined as RESTful if it conforms to the REST architectural constraints. Some more details about RESTful APIs can be found here.
  • Salesforce.com: Salesforce.com is a company specialised in software as a service (SaaS) and one of the major providers of CRM in the cloud. In addition to CRM, Salesforce.com offers Force.com, a cloud platform as a service (PaaS) that developers can use to build multitenant applications hosted on Salesforce.com servers. All Force.com applications can be accessed using RESTful APIs.

Environments Setup

In a new BI project it is never safe to extract data directly from a live (production) environment. Thus the first step needed to test the BICS–Salesforce.com connection is to obtain a Salesforce.com environment that we can use without the risk of slowing it down or, even worse, crashing it and impacting users. A free demo Salesforce.com platform can be obtained by subscribing to the developer program and can be customised as needed. After requesting the demo environment, a security token must be created; the token is associated with the user invoking the RESTful APIs. To generate the token, log in to Salesforce.com with the specific user, then click on the username -> My Settings -> Reset My Security Token. The new token is then sent to the user’s email.

Salesforce Reset token Procedure

The third step needed to access Salesforce.com data with REST APIs is to define a Connected App. The creation of a Connected App generates a Consumer Key and a Consumer Secret that will be used later during the REST login calls. Once the Connected App is defined, the last bit missing is to populate the Salesforce.com instance, since it’s empty by default; it can be populated manually or with population scripts based on the create REST API statements explained here. An Oracle BICS instance is needed in order to analyse the Salesforce.com data; all the info regarding Oracle BICS and how to activate an instance can be found at this link. Having covered all the basics, it’s time to start analysing how the Salesforce.com data can be extracted.

Salesforce.com REST APIs

In the following sections we analyse some of the REST APIs used in our process. The Salesforce.com REST APIs will be called with cURL commands; cURL is a command-line tool and library for transferring data with URL syntax.

Salesforce.com Authentication

The first step in downloading data from any Salesforce.com instance is to authenticate and retrieve the access token, which will then be used in all the following REST calls. There are various authentication mechanisms that can be used against Salesforce.com; for the aims of the beta program we used the one called “Username-Password OAuth Authentication Flow”, described in detail here and in the image below.

Salesforce.com User Password OAuth

The REST API authentication command is

curl -v https://login.salesforce.com/services/oauth2/token -d "grant_type=password" \
    -d "client_id=CLIENT_ID" -d "client_secret=CLIENT_SECRET" \
    -d "username=USERNAME" -d "password=PASSWORD_TOKEN"

where the following parameters must be assigned:

  • CLIENT_ID:  The client ID generated during the creation of the Connected App (step described in Basic Setup)
  • CLIENT_SECRET: The client Secret generated during the creation of the Connected App (step described in Basic Setup)
  • USERNAME: The username you want to use in order to access Salesforce.com data
  • PASSWORD_TOKEN: The concatenation of password and user security token (e.g. if the password is ABC and the security token is 123 then PASSWORD_TOKEN is ABC123)
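The same request body can be assembled programmatically. Here is a minimal Python sketch (standard library only, with placeholder credential values) that builds the form body for the token request, showing the password/token concatenation:

```python
from urllib.parse import urlencode

def build_auth_payload(client_id, client_secret, username, password, security_token):
    """Build the form body for the Username-Password OAuth flow.

    The 'password' field sent to Salesforce.com is the user's password
    concatenated with the security token (e.g. 'ABC' + '123' -> 'ABC123').
    All values passed in here are placeholders.
    """
    return urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password + security_token,
    })

# This body would be POSTed to https://login.salesforce.com/services/oauth2/token
body = build_auth_payload("CLIENT_ID", "CLIENT_SECRET", "user@example.com", "ABC", "123")
print(body)
```

In real use, the resulting body is what the `-d` flags in the cURL command above send.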

The response (if all the parameters are correct) should look like the following:

{  
   "id":"https://login.salesforce.com/id/00Dx0000000BV7z/005x00000012Q9P",
   "issued_at":"1278448832702",
   "instance_url":"https://na1.salesforce.com",
   "signature":"0CmxinZir53Yex7nE0TD+zMpvIWYGb/bdJh6XfOH6EQ=",
   "access_token":"00Dx0000000BV7z!AR8AQAxo9UfVkh8AlV0Gomt9Czx9LjHnSSpwBMmbRcgKFmxOtvxjTrKW19ye6PE3Ds1eQz3z8jr3W7_VbWmEu4Q8TVGSTHxs"
}

where the most interesting parameters are:

  • instance_url: defines the salesforce instance to use.
  • access_token: defines the time-limited security token to use for all the following calls.

Salesforce.com List of Objects and Retrieve Metadata for an Object

Two of the Salesforce.com REST APIs were very useful when trying to build a general ETL that could be applied to any customised Salesforce.com instance:

  • Get list of Objects: retrieves the list of all objects available in Salesforce.com (custom or not) with some additional information for each object.
  • Retrieve Metadata for an Object: retrieves the list of columns for the selected object with related data types and additional information.

The Get list of Objects call is the following:

curl INSTANCE/services/data/VERSION/sobjects/ -H "Authorization: Bearer TOKEN"

where

  • INSTANCE is the instance_url parameter retrieved from the authentication call.
  • TOKEN is the access_token parameter retrieved from the authentication call.
  • VERSION is the REST API version used; in our beta test we used v29.0.

The response should be similar to the following, in which you can see, for each object (identified by the “name” field), all the available metadata.

{  
   "encoding":"UTF-8",
   "maxBatchSize":200,
   "sobjects":[  
      {  
         "name":"Account",
         "label":"Account",
         "keyPrefix":"001",
         "labelPlural":"Accounts",
         "custom":false,
         "layoutable":true,
         "activateable":false,
         "urls":{  
            "sobject":"/services/data/v26.0/sobjects/Account",
            "describe":"/services/data/v26.0/sobjects/Account/describe",
            "rowTemplate":"/services/data/v26.0/sobjects/Account/{ID}"
         },
         "searchable":true,
         "updateable":true,
         "createable":true,
         "deprecatedAndHidden":false,
         "customSetting":false,
         "deletable":true,
         "feedEnabled":true,
         "mergeable":true,
         "queryable":true,
         "replicateable":true,
         "retrieveable":true,
         "undeletable":true,
         "triggerable":true
      },
      ...
   ]
}
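When building a general ETL over this response, it’s handy to filter the object list down to those that can actually be queried and replicated. A minimal Python sketch (the selection criteria here are our own choice, not a Salesforce.com recommendation):

```python
import json

def extractable_objects(sobjects_response):
    """Return names of objects worth replicating: queryable, replicateable
    and not deprecated/hidden (selection criteria chosen for illustration)."""
    return [o["name"] for o in sobjects_response["sobjects"]
            if o.get("queryable") and o.get("replicateable")
            and not o.get("deprecatedAndHidden")]

# A trimmed-down, made-up sample standing in for the real response:
sample = json.loads("""
{"encoding": "UTF-8",
 "sobjects": [
   {"name": "Account", "queryable": true, "replicateable": true,
    "deprecatedAndHidden": false},
   {"name": "LegacyThing", "queryable": false, "replicateable": false,
    "deprecatedAndHidden": true}
 ]}
""")
print(extractable_objects(sample))  # → ['Account']
```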

The Retrieve Metadata for an Object call can be made for each object listed in the Get list of Objects response; the code is the following:

curl INSTANCE/services/data/VERSION/sobjects/OBJECT/ \
    -H "Authorization: Bearer TOKEN"

where:

  • INSTANCE, TOKEN and VERSION are the same parameters defined for the Get list of Objects call
  • OBJECT is the Salesforce.com object to retrieve

The response for the Account object is similar to the following

{  
   "objectDescribe":{  
      "name":"Account",
      "updateable":true,
      "label":"Account",
      "keyPrefix":"001",
      ...
      "replicateable":true,
      "retrieveable":true,
      "undeletable":true,
      "triggerable":true
   },
   "recentItems":[  
      {  
         "attributes":{  
            "type":"Account",
            "url":"/services/data/v20.0/sobjects/Account/001D000000INjVeIAL"
         },
         "Id":"001D000000INjVeIAL",
         "Name":"asdasdasd"
      },
      ...
   ]
}

Interesting parameters for each column are:

  • name: the column’s name.
  • retrievable: if true, the column can be part of a query.
  • soapType and length or byteLength: these fields provide the field’s type information and length.

With the three REST APIs analysed we can recreate all the objects (custom or not) of any Salesforce.com (or Force.com) instance as tables in the Oracle Database Schema Service. In the next section we will see how to extract the data from the list of objects using the Salesforce.com query capabilities.
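As a sketch of that last step, the describe response can drive DDL generation for the Database Schema Service. The soapType-to-Oracle mapping below is a deliberate simplification for illustration, not a complete or official mapping:

```python
# Rough sketch: map a few Salesforce.com soapType values to Oracle types.
# This mapping is a simplification chosen for illustration only.
SOAP_TO_ORACLE = {
    "tns:ID": "VARCHAR2(18)",
    "xsd:string": "VARCHAR2({length})",
    "xsd:boolean": "CHAR(1)",
    "xsd:double": "NUMBER",
    "xsd:dateTime": "TIMESTAMP",
}

def create_table_ddl(object_name, fields):
    """Build a CREATE TABLE statement from describe-call field metadata."""
    cols = []
    for f in fields:
        col_type = SOAP_TO_ORACLE.get(f["soapType"], "VARCHAR2(4000)")
        col_type = col_type.format(length=f.get("length", 255))
        cols.append('  "{}" {}'.format(f["name"], col_type))
    return 'CREATE TABLE "{}" (\n{}\n)'.format(object_name, ",\n".join(cols))

# Made-up field metadata in the shape returned by the describe call:
fields = [{"name": "Id", "soapType": "tns:ID"},
          {"name": "Name", "soapType": "xsd:string", "length": 255}]
print(create_table_ddl("Account", fields))
```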

Salesforce.com query

Salesforce provides two methods of querying objects: including or excluding deleted objects. The difference in the code is minimal: the first uses the /queryAll suffix while the second uses /query. A Salesforce.com REST API query call is:

curl INSTANCE/services/data/VERSION/query/?q=SELECT+LIST_OF_COLUMNS+from+OBJECT \
    -H "Authorization: Bearer TOKEN"

where:

  • INSTANCE and TOKEN parameters are the ones retrieved from the Authentication call
  • VERSION is the Salesforce.com REST API version used
  • OBJECT is the Salesforce.com object to query
  • LIST_OF_COLUMNS is the comma delimited list of columns to retrieve
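The query URL can also be assembled programmatically; a small Python sketch using only the standard library (the instance, version and object values are placeholders):

```python
from urllib.parse import quote

def soql_query_url(instance, version, columns, sf_object, include_deleted=False):
    """Build the REST query URL for a simple SOQL select.

    /queryAll includes deleted records, /query excludes them.
    """
    endpoint = "queryAll" if include_deleted else "query"
    soql = "SELECT %s FROM %s" % (",".join(columns), sf_object)
    return "%s/services/data/%s/%s/?q=%s" % (instance, version, endpoint, quote(soql))

url = soql_query_url("https://na1.salesforce.com", "v29.0", ["Name"], "Account")
print(url)
```

Note that `quote` encodes spaces as %20 rather than the + signs shown in the cURL example; both are accepted in the query string.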

The response of a query “select+Name+from+Account” is:

{  
   "done":true,
   "totalSize":14,
   "records":[  
      {  
         "attributes":{  
            "type":"Account",
            "url":"/services/data/v20.0/sobjects/Account/001D000000IRFmaIAH"
         },
         "Name":"Test 1"
      },
      {  
         "attributes":{  
            "type":"Account",
            "url":"/services/data/v20.0/sobjects/Account/001D000000IomazIAB"
         },
         "Name":"Test 2"
      },
      ...
   ]
}

In the JSON result you can find:

  • totalSize: the number of records retrieved
  • Name: for each record the name of the Account

If the resultSet is too big, Salesforce.com will start paging results. If the initial query returns only part of the results, the end of the response will contain a field called nextRecordsUrl. For example:

"nextRecordsUrl" : "/services/data/v20.0/query/01gD0000002HU6KIAW-2000"

In order to get the next page of data, a call like the following is needed, passing the access token in the request header.

curl INSTANCE/services/data/v20.0/query/01gD0000002HU6KIAW-2000 \
    -H "Authorization: Bearer TOKEN"
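Putting the paging together, the logic is a loop that follows nextRecordsUrl until it is absent. A Python sketch with the HTTP call abstracted behind a caller-supplied function, so the loop can be shown without a live instance (the mock pages below are made-up data):

```python
def fetch_all_records(first_path, fetch_page):
    """Follow nextRecordsUrl links until the full resultset is collected.

    `fetch_page` is any callable that takes a path (e.g.
    '/services/data/v20.0/query/...') and returns the decoded JSON
    response; in real use it would perform the authenticated HTTP call.
    """
    records, path = [], first_path
    while path:
        page = fetch_page(path)
        records.extend(page["records"])
        path = page.get("nextRecordsUrl")  # absent on the last page
    return records

# Example with two mock pages standing in for real responses:
pages = {
    "/q1": {"records": [{"Name": "Test 1"}],
            "nextRecordsUrl": "/q2", "done": False},
    "/q2": {"records": [{"Name": "Test 2"}], "done": True},
}
print(fetch_all_records("/q1", pages.get))
```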

Salesforce.com limits

It’s important to be aware of the limits set by Salesforce.com on the number of REST API calls per day. To check the remaining number of calls available at any time, execute the following:

curl INSTANCE/services/data/VERSION/limits/ -H "Authorization: Bearer TOKEN" -H "X-PrettyPrint: 1"

The DailyApiRequests -> Remaining field in the response shows the number of calls still available.

{
    "DailyApiRequests":
    {
        "Remaining":"4980",
        "Max":"5000"
    },
    "DailyAsyncApexExecutions":
    {
        "Remaining":"250000",
        "Max":"250000"
    },
    ...
}

Resultset output format

Salesforce.com’s default output format is JSON, which is a common format for most cloud applications. However, the results can also be retrieved in XML format by setting the HTTP Accept header to “application/xml”. This first release is based on Oracle DB 11g, which supports XML natively but not JSON; for this reason, during our beta program with Oracle BI Cloud Service we decided to use the XML output. Once BICS is bundled with Oracle DB 12c, which supports both JSON and XML natively, the resultset output format could be kept at the JSON default.

In this post we defined the steps required to set up a Salesforce.com demo environment that can be accessed with REST APIs, and looked at the most interesting Salesforce.com REST APIs that we used to extract the data for analysis with the Oracle BI Cloud Service platform. In the next post we will look in more detail at the BICS ApEx side, analysing how the Salesforce.com REST APIs can be called from ApEx.