Interfacing to EPBCS with OIC

Oracle Enterprise Planning and Budgeting Cloud Service is effectively Hyperion Planning hosted in Oracle’s Cloud. The traditional way to load data and execute jobs is to use the epmautomate command line utility. However, in a Cloud environment, you may not have a suitable platform from which to run the command line, so you need to look for an alternative. Fortunately, Oracle provides a comprehensive set of REST services that allow us to create a full end-to-end integration.

The Scenario

We need to extract Project hierarchy metadata and EVM data from Fusion PPM and load the two sets of data into EPBCS. Currently, the users are manually running two BI Publisher reports in Fusion, downloading the data to files to their PCs, uploading the files to the EPBCS Inbox and finally running a few jobs that have been created by the EPBCS specialist. The aim is to automate the whole process, orchestrated using Oracle Integration Cloud.


We will be using both the Migration and Planning REST APIs in EPBCS. As with many well-implemented REST APIs, both of these require a Version Number to be included in the URL. To fetch the current active Version Numbers, we first need to make the following REST calls against the EPBCS server:

  • GET /interop/rest/ (Migration API)
  • GET /HyperionPlanning/rest/ (Planning API)

Both of these APIs return a fairly simple JSON response; to get the Version Number we need to extract the following item from both Responses and save them to variables for later use:

  • items[lifecycle="active"]/version
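To illustrate what that extraction looks like outside of OIC, here is a minimal Python sketch that pulls the active Version Number out of a response; the sample payload below is only shaped like the items/lifecycle/version structure described above, not a real server response:

```python
import json

def get_active_version(response_text):
    """Return the version of the item whose lifecycle is 'active'."""
    for item in json.loads(response_text).get("items", []):
        if item.get("lifecycle") == "active":
            return item["version"]
    raise ValueError("no active API version in response")

# Sample response shaped like the version-discovery payload described above
sample = json.dumps({"items": [
    {"version": "11.1.2.3.600", "lifecycle": "active"},
    {"version": "v1", "lifecycle": "deprecated"},
]})
print(get_active_version(sample))  # 11.1.2.3.600
```

In an Integration you would do the same thing with an XPath-style expression in the Mapper and assign the result to a variable.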

Extracting the Metadata and Data

The next step is to run the BI Publisher reports that the users are already using to extract the metadata and data from Fusion PPM. If you don’t already have a suitable OIC Connection to the BIP SOAP Service, you can create one and point it to /xmlpserver/services/v2/ReportService?wsdl on the Fusion server.

Using that Connection you can execute a BI Publisher report and capture the output to a file on your OIC server. You’ll need to use the ReportService Service, the ReportService Port and the runReport Operation. As a minimum, in the Mapper you will have to provide values for the following Request Body attributes:

  • /runReport/userID
  • /runReport/password
  • /runReport/reportRequest/reportAbsolutePath

Obviously, having the Password encoded into the Integration is not ideal. At present, I don't know of any way around this, so please leave a Comment if you do. To mitigate any maintenance issues, I suggest reading the values of the Username and Password from a common OIC Lookup.

The response from the SOAP call contains the whole BIP report output as a byte stream in /runReportResponse/runReportReturn/reportBytes. We can use a Stage File, Write File Action to write this to a file on the OIC server using the Opaque Element XSD:

<?xml version="1.0" encoding="UTF-8"?>
<schema targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/opaque/"
        xmlns="http://www.w3.org/2001/XMLSchema">
  <element name="opaqueElement" type="base64Binary"/>
</schema>
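The Stage File action handles the decoding inside OIC, but for clarity, the step amounts to the following standalone Python sketch (the file name and sample content are illustrative only):

```python
import base64

def stage_report_output(report_bytes_b64, path):
    """Decode the base64 reportBytes value and write it to a staging file."""
    with open(path, "wb") as f:
        f.write(base64.b64decode(report_bytes_b64))

# reportBytes arrives base64-encoded in runReportReturn
encoded = base64.b64encode(b"Project,Amount\nP100,5000\n").decode("ascii")
stage_report_output(encoded, "evm_data.csv")
```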

Upload the Files

Now that we have both the Metadata and Data files staged in OIC, we need to upload them to the EPBCS Inbox. To do that we need to do a POST on the EPBCS Migration API /interop/rest/{version}/applicationsnapshots/{snapshot}/contents, where {version} is the value saved from earlier and {snapshot} is what you want the file to be called when it gets to the Inbox.
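In code form, building that target URL is just a string substitution over the saved Version Number and the desired Inbox file name (a sketch; the base URL is a placeholder for your EPBCS server):

```python
def snapshot_contents_url(base_url, version, snapshot):
    """Build the Migration API contents URL, used for both the POST
    (upload) and the DELETE described below."""
    return f"{base_url}/interop/rest/{version}/applicationsnapshots/{snapshot}/contents"

print(snapshot_contents_url("https://epbcs.example.com", "11.1.2.3.600", "EVM_Data.csv"))
```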

The POST will fail if the file already exists in the Inbox, so it’s probably a good idea to try and delete it first. Perform a REST DELETE against the same URL as above. The Response includes a status – if it is 0 the delete succeeded; if 8 the file did not exist. Any other values should cause us to throw a Fault.
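That delete-status logic can be sketched as a small helper (Python; the 0 and 8 values are as described above, and anything else raises a fault):

```python
def handle_delete_status(status):
    """Interpret the Migration API delete status:
    0 = file deleted, 8 = file was not in the Inbox (also fine);
    any other value is treated as a fault."""
    if status not in (0, 8):
        raise RuntimeError(f"Inbox delete failed with status {status}")
    return "already absent" if status == 8 else "deleted"
```

In OIC the equivalent is a Switch on the returned status, with a Throw New Fault action on the default branch.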

Execute Jobs

The final step is to run a sequence of EPBCS Jobs. The REST API to run a Job in EPBCS is /HyperionPlanning/rest/{version}/applications/PerfMan/jobs where {version} is the value saved from earlier. We need to POST a small payload to this API to provide the details of the Job to be executed:

   "jobType" : "Rules",
   "jobName" : "Clear_EVMData"

The jobType attribute can take a variety of values, which allows us to execute a Rule or a RuleSet, import or export Metadata and Data, and perform other useful tasks such as refreshing and compacting Cubes. The REST call submits the job to run and returns immediately – it doesn't wait for completion. To find the status of the Job we need to make another REST call. However, to use the status API we will need to pass in a Job ID, which we can retrieve from the jobId attribute in the Response to the previous call.

To get the Status, we need to perform a REST GET on /HyperionPlanning/rest/{version}/applications/PerfMan/jobs/{id} – the replacement values should be fairly obvious by now. If the returned status is -1 the job is still running. Once the job is complete you’ll either get a 0 to indicate a successful execution, or a positive integer to indicate some form of error.
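Putting those status codes together, the polling logic looks roughly like this (a Python sketch; `get_job_status` is an assumed helper that performs the GET above and returns the numeric status):

```python
import time

def wait_for_job(get_job_status, job_id, poll_seconds=10):
    """Poll the job status API until the job leaves the running state.
    -1 = still running, 0 = success, any positive value = error."""
    while True:
        status = get_job_status(job_id)
        if status == -1:
            time.sleep(poll_seconds)
            continue
        if status == 0:
            return "success"
        raise RuntimeError(f"job {job_id} failed with status {status}")
```

In OIC this maps naturally to a While loop containing the GET, a Wait action, and a Switch on the returned status.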

As I had four jobs to run, I decided that it would be a good idea to store the details of the jobs in an OIC Lookup and to use a loop to iterate over the records in the Lookup. This allowed me to easily add or remove jobs, or even temporarily disable one. I'll cover the details of how I used a Lookup in this way in a future article.


Interfacing Metadata and Data to EPBCS in this way might seem like a lot of hard work, when you’re used to running a few batch scripts. But when your cloud environment doesn’t allow the use of epmautomate, using OIC to orchestrate the extract and load is a very clean and “cloud native” way of doing things.
