Saturday, 24 December 2011

JMS Queue vs Topic

JMS Queue

> A staging area that contains messages that have been sent and are waiting to be read. As the name queue suggests, the messages are delivered in the order sent. A message is removed from the queue once it has been read.

> point-to-point

> Only one consumer will get the message

> The producer does not have to be running at the time the consumer consumes the message, nor does the consumer need to be running at the time the message is sent

> Every message successfully processed is acknowledged by the consumer

JMS Topic

> A distribution mechanism for publishing messages that are delivered to multiple subscribers

> Multiple consumers can get the message

> There is a timing dependency between publishers and subscribers. The publisher has to create a subscription in order for clients to be able to subscribe. The subscriber has to remain continuously active to receive messages, unless it has established a durable subscription. In that case, messages published while the subscriber is not connected will be redistributed whenever it reconnects.
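The delivery difference can be sketched in a few lines of Python. This is a toy broker, not a real JMS provider; the class and method names are illustrative:

```python
class ToyBroker:
    """Illustrates JMS delivery semantics: queue = point-to-point, topic = pub/sub."""
    def __init__(self):
        self.queue = []          # messages wait here until one consumer reads them
        self.subscribers = []    # topic subscribers currently connected

    def send_to_queue(self, msg):
        self.queue.append(msg)   # producer and consumer need not run at the same time

    def receive_from_queue(self):
        # FIFO order; the message is removed once read, so only one consumer gets it
        return self.queue.pop(0) if self.queue else None

    def publish_to_topic(self, msg):
        # every active subscriber receives a copy; nothing is kept for absent ones
        # (a durable subscription would also store msg for disconnected subscribers)
        for inbox in self.subscribers:
            inbox.append(msg)

broker = ToyBroker()
broker.send_to_queue("order-1")
print(broker.receive_from_queue())  # order-1 (delivered once, in order sent)
print(broker.receive_from_queue())  # None - already consumed

a, b = [], []
broker.subscribers += [a, b]
broker.publish_to_topic("price-update")
print(a, b)  # both subscribers got their own copy
```

The point-to-point vs. publish/subscribe contrast is exactly the queue.pop versus the loop over subscribers.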

Oracle AIA (Application Integration Architecture)

Oracle AIA combines the power of Oracle Fusion Middleware with a set of best-in-class application Enterprise Business Objects and Services.

> Enterprise Business Object (EBO)
A data model consisting of standard business data object definitions and reusable data components representing a business object

> Enterprise Business Services (EBS)
The application independent web service definition for performing a business task.

> SOA Governance
- Business Service Repository
Catalog of the objects, messages, and services that compose the integration scenarios in your Oracle applications
- Composite Application Validation System
Test integration web services without the deployed participating applications in place.
- Composite Application Error Management and Resolution
Route the error back to the correct application and to the right application user.

> Reference Architecture
Comprehensive documentation to assist in integration development.

OWSM : Applying Policies

To make message flows between client and server secure,
we need to enforce the following policies:
1> Authentication - check whether the user is authenticated
2> Integrity - check whether the message has been altered
3> Confidentiality - check whether the message is encrypted

Using OWSM you can apply all of the above policies.

Use any one of the steps below to apply authentication:
1> Active Directory Authenticate
2> File Authenticate
3> Ldap Authenticate and others...

There are two things to consider here:
one is authentication and the other is authorization.
Authentication is simply checking whether the user has provided the right username and password. Authorization, on the other hand, is checking whether that user has permission to access a particular operation.
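The distinction can be made concrete with a toy check. The user store, permission table, and function names below are purely illustrative; in OWSM these checks are delegated to Active Directory, a file realm, or LDAP:

```python
# Toy credential store and permission table (illustrative data, not OWSM APIs).
USERS = {"alice": "s3cret"}
PERMISSIONS = {"alice": {"getQuote"}}

def authenticate(user, password):
    """Authentication: is this really the user? (right username/password)"""
    return USERS.get(user) == password

def authorize(user, operation):
    """Authorization: may this user invoke this particular operation?"""
    return operation in PERMISSIONS.get(user, set())

print(authenticate("alice", "s3cret"))    # True  - identity verified
print(authorize("alice", "getQuote"))     # True  - permitted operation
print(authorize("alice", "cancelOrder"))  # False - authenticated, but not permitted
```

Note the last line: a user can pass authentication and still be denied an operation, which is why the two policies are applied separately.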

OWSM allows you to manage authorization also using:
1> Active Directory Authorize
2> File Authorize
3> Ldap Authorize
4> Oracle Access Manager Authenticate Authorize and others...

All of this happens inside OWSM, so your services don't need to worry about it...

Next comes Integrity.
Sign the message so nobody can alter it undetected, enforcing integrity.
OWSM provides an option to sign the message:
SIGN it with the sender's private key on one side;
on the other side, VERIFY SIGNATURE with the sender's public key.
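To show the sign/verify idea in code, here is a minimal integrity check using an HMAC over the message body. This is only a stand-in: real OWSM policies use XML Signature with a private/public key pair, not a shared secret. The key and message below are invented:

```python
import hashlib
import hmac

SECRET = b"shared-signing-key"  # stand-in for real key material

def sign(message: bytes) -> bytes:
    # sender attaches a signature computed over the message body
    return hmac.new(SECRET, message, hashlib.sha256).digest()

def verify(message: bytes, signature: bytes) -> bool:
    # receiver recomputes and compares; a tampered message fails verification
    return hmac.compare_digest(sign(message), signature)

msg = b"<payment><amount>100</amount></payment>"
sig = sign(msg)
print(verify(msg, sig))                          # True  - message intact
print(verify(msg.replace(b"100", b"999"), sig))  # False - altered in transit
```

The second check is the whole point of the Integrity policy: any change to the signed bytes, even a single digit of the amount, is detected.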

Next is Confidentiality.
The message should not be readable by anyone spying on the wire.
Encrypt it using XML Encryption in OWSM,
and decrypt it on the other side using XML Decryption.

Applying the above policies makes your transactions highly secure...

Just do it...

ESB : Error Hospital

It's time to handle errors that occur in Oracle ESB.
I would like to describe the Error Hospital using a simple example.

RS - Routing Service
WS - Web Service

Suppose some error occurred while invoking callWebService;
since we are calling the web service asynchronously, the call can be retried.

To automate the process of retrying the web service, we need to implement an
Error Hospital, which is nothing but a BPEL process.

Why a BPEL process and not ESB?
1> If some error occurs in an ESB-based Error Hospital process, the Error Hospital can be invoked recursively again and again, which can make our server unstable.
2> In BPEL we have the option of a Human Task; errors of this kind need human intervention, which is not possible in ESB.

Solution :
1> Create an empty BPEL process.
2> Create a JMS Adapter as follows:
a> Service Name : ReadESBError
b> JMS provider : Oracle Enterprise Messaging Service(OEMS) Memory/File
c> Connection : <your application server>
d> Operation Type : Consume Message
e> Consume Operation Parameters :
Destination Name :
Message Body Type : TextMessage
Message Selector : This is an important parameter that filters your errors.
The criteria can be given as the operation name (in our case: "execute")
and the system guid, which can be fetched from the oraesb schema with a query.
This will filter out everything except the errors you care about.
Hence the value will be:
ESB_SYSTEM_GUID = '<system guid>' AND ESB_EVENT_KEY = 'execute'

f> Schema will be the same as used while executing RS.
g> Finish
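A JMS message selector is an SQL-92-style predicate evaluated against message header properties, so only matching messages are delivered to the consumer. A rough Python equivalent of the selector above (toy message dicts, not a real JMS client; the body strings are invented):

```python
# Toy messages carrying the two properties the selector filters on.
messages = [
    {"ESB_SYSTEM_GUID": "ABC-123", "ESB_EVENT_KEY": "execute",  "body": "error 1"},
    {"ESB_SYSTEM_GUID": "ABC-123", "ESB_EVENT_KEY": "validate", "body": "error 2"},
    {"ESB_SYSTEM_GUID": "XYZ-999", "ESB_EVENT_KEY": "execute",  "body": "error 3"},
]

def matches(msg, system_guid, event_key):
    """Equivalent of: ESB_SYSTEM_GUID = '<system guid>' AND ESB_EVENT_KEY = 'execute'"""
    return msg["ESB_SYSTEM_GUID"] == system_guid and msg["ESB_EVENT_KEY"] == event_key

consumed = [m["body"] for m in messages if matches(m, "ABC-123", "execute")]
print(consumed)  # only the errors of interest reach the Error Hospital
```

With a real JMS adapter the broker applies this filter, so the Error Hospital instance is only created for the errors you want to retry.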

3> Create a Receive operation for the incoming JMS Adapter.
Add a header variable in the Adapters tab of the Receive activity.
The header variable will be of message type InboundHeader_msg, which is defined in jmsAdapterInboundHeader.

4> Create one more JMS Adapter
a> Service Name : RetryESBError
b> Operation Type : Produce Message
c> Produce Operation Parameter
Destination name :
everything else default
d> same schema as above.
e> Finish

5> Create an Invoke activity for RetryESBError.
Attach a header variable in the Adapters tab.
The header variable will be of message type OutboundHeader_msg, which is defined in jmsAdapterOutboundHeader.

6> Add a Human Task activity and attach the input variable to be reviewed by a human. Assign the reply from the Human Task to the outputVariable.

7> Header Transformation
This is an important part of the Error Hospital.
Create a transformation from inputHeader to outputHeader.
Use the automap feature: assign all input properties to the output properties as they are.

8> Now we need to modify some of the output header properties.
If you don't know the ESB system name, it can be found by querying the oraesb schema;
a database adapter can be used for this.

Extract the flow id from the inboundHeader property ESB_FLOW_ID.
The value will then be 'Resubmitted-<flowid>'.

9> The Error Hospital is complete.

Try invoking the RS with some invalid values; you will see a new BPEL Error Hospital instance. Update the values using the workflow console and resubmit it.

Thursday, 22 December 2011

How to set security credentials dynamically in Oracle BPEL

A few months ago I wrote a post on invoking WS-Security-compliant services. In Oracle BPEL you can either propagate the security credentials coming from the caller process or hard-code the tokens in the partner link properties.
If you want to invoke a WS-Security-compliant web service and pass user-supplied security tokens, Oracle BPEL does not let you set the security credentials dynamically. You need to manually create a UsernameToken and then pass the token as a SOAP header.
Follow the steps given below to change and pass security credentials dynamically:
  • Create three variables as given below:
<variable name="securityContext" element="ns2:Security"/>
<variable name="userNameToken" element="ns2:UsernameToken"/>
<variable name="pswd" element="ns2:Password"/>
  • Assign incoming security credentials to these variables:
<assign name="AssignSecurityCredentials">
<copy>
<from variable="inputVariable" part="payload"/>
<to variable="pswd" query="/wsse:Password"/>
</copy>
<copy>
<from variable="inputVariable" part="payload"/>
<to variable="userNameToken"/>
</copy>
<bpelx:append>
<bpelx:from variable="pswd" query="/wsse:Password"/>
<bpelx:to variable="userNameToken" query="/wsse:UsernameToken"/>
</bpelx:append>
<bpelx:append>
<bpelx:from variable="userNameToken" query="/wsse:UsernameToken"/>
<bpelx:to variable="securityContext" query="/wsse:Security"/>
</bpelx:append>
</assign>
  • Pass the security credentials to the invoked service by attaching the securityContext variable as a SOAP header on the invoke activity, as in the snippet below:
<invoke name="InvokeAxisService" partnerLink="PartnerLinkAxisService"
portType="ns1:sample03PortType" operation="echo"
bpelx:inputHeaderVariable="securityContext"/>
Complete your BPEL process by adding the required functionality, then deploy and test it.

Deploy Soa Suite 11g composite applications with Ant scripts

With SOA Suite 11g you can deploy your composite applications from JDeveloper or with Ant. In this blog I will do it with the SOA 11g Ant scripts. These Ant scripts can only deploy one project, so I wrote an Ant script around them that can deploy multiple composite applications to different SOA environments. You can use it to automate your deployments or plug it into your build tool. My Ant script deploys to the MDS; compiles, builds, and packages the composite application; deploys it to the SOA server; then runs the unit tests and generates a JUnit result XML; and finally it can optionally disable the composite.
This JUnit XML can be used in your continuous build system. You can easily extend this build script to manage the composite applications.
For more info on Ant deployment, see the official deployment documentation.
The official Ant scripts are located in the jdeveloper\bin folder. Here is a summary of what they are and what they can do:
  • ant-sca-test.xml: starts the test suites of a composite and generates a JUnit report. Contrary to the official documentation, it does not attach, extract, generate, or validate configuration plans for a SOA composite application; that description is not correct.
  • ant-sca-compile.xml: compiles a SOA composite application. This script is also called by the package script, so we don't need to call it directly.
  • ant-sca-package.xml: packages a SOA composite application into a composite SAR file, and also validates and builds the composite application.
  • ant-sca-deploy.xml: deploys a SOA composite application.
  • ant-sca-mgmt.xml: manages a SOA composite application, including starting, stopping, activating, retiring, assigning a default revision version, and listing deployed SOA composite applications.

Here is the main properties file, where you have to define the JDeveloper home and your application home, which composite applications you want to deploy, and the target environment (dev or acc).

# global

# temp




# dev deployment server weblogic

# acceptance deployment server weblogic

Every application can have one or more SOA projects, so the main Ant script loads the application properties file, which contains all the projects with their revision numbers.
Here is an example of the file:



Because my example has two SOA environments, I need to create two configuration plans. With such a plan (which looks like a WLS deployment plan) you can change the endpoint URLs so they match the environment.
Select the composite application XML and generate a configuration plan.
Add the dev or acc extension to the file name.
Here is what the plan looks like.

And here is the main Ant build script, which does it all and calls the Oracle Ant scripts.

<?xml version="1.0" encoding="iso-8859-1"?>
<project name="soaDeployAll" default="deployAll">
<echo>basedir ${basedir}</echo>

<property environment="env"/>
<echo>current folder ${env.CURRENT_FOLDER}</echo>

<property file="${env.CURRENT_FOLDER}/"/>

<taskdef resource="net/sf/antcontrib/"/>

<import file="${basedir}/ant-sca-deploy.xml"/>
<import file="${basedir}/ant-sca-package.xml"/>
<import file="${basedir}/ant-sca-test.xml"/>
<import file="${basedir}/ant-sca-mgmt.xml"/>

<target name="deployAll">
<if>
<equals arg1="${mds.enabled}" arg2="true"/>
<then>
<antcall target="deployMDS" inheritall="true"/>
</then>
</if>
<foreach list="${applications}" param="application" target="deployApplication" inheritall="true" inheritrefs="false"/>
</target>

<target name="unDeployMDS">
<echo>undeploy MDS</echo>
<foreach list="${mds.applications}" param="mds.application" target="undeployMDSApplication" inheritall="true" inheritrefs="false"/>
</target>

<target name="deployMDS">
<echo>undeploy and deploy MDS</echo>
<if>
<equals arg1="${mds.undeploy}" arg2="true"/>
<then>
<foreach list="${mds.applications}" param="mds.application" target="undeployMDSApplication" inheritall="true" inheritrefs="false"/>
</then>
</if>
<foreach list="${mds.applications}" param="mds.application" target="deployMDSApplication" inheritall="true" inheritrefs="false"/>
</target>

<target name="deployMDSApplication">
<echo>deploy MDS application ${mds.application}</echo>

<echo>remove and create local MDS temp</echo>
<property name="mds.deploy.dir" value="${tmp.output.dir}/${mds.application}"/>

<delete dir="${mds.deploy.dir}"/>
<mkdir dir="${mds.deploy.dir}"/>

<echo>create zip from file MDS store</echo>
<zip destfile="${mds.deploy.dir}/${mds.application}_mds.jar" compress="false">
<fileset dir="${mds.reposistory}" includes="${mds.application}/**"/>
</zip>

<echo>create zip with MDS jar</echo>
<zip destfile="${mds.deploy.dir}/${mds.application}" compress="false">
<fileset dir="${mds.deploy.dir}" includes="*.jar"/>
</zip>

<propertycopy name="deploy.serverURL" from="${deployment.plan.environment}.serverURL"/>
<propertycopy name="deploy.overwrite" from="${deployment.plan.environment}.overwrite"/>
<propertycopy name="deploy.user" from="${deployment.plan.environment}.user"/>
<propertycopy name="deploy.password" from="${deployment.plan.environment}.password"/>
<propertycopy name="deploy.forceDefault" from="${deployment.plan.environment}.forceDefault"/>

<echo>deploy MDS app</echo>

<echo>deploy on ${deploy.serverURL} with user ${deploy.user}</echo>
<echo>deploy sarFile ${mds.deploy.dir}/${mds.application}</echo>

<antcall target="deploy" inheritall="false">
<param name="wl_home" value="${wl_home}"/>
<param name="oracle.home" value="${oracle.home}"/>
<param name="serverURL" value="${deploy.serverURL}"/>
<param name="user" value="${deploy.user}"/>
<param name="password" value="${deploy.password}"/>
<param name="overwrite" value="${deploy.overwrite}"/>
<param name="forceDefault" value="${deploy.forceDefault}"/>
<param name="sarLocation" value="${mds.deploy.dir}/${mds.application}"/>
</antcall>
</target>

<target name="undeployMDSApplication">
<echo>undeploy MDS application ${mds.application}</echo>

<propertycopy name="deploy.serverURL" from="${deployment.plan.environment}.serverURL"/>
<propertycopy name="deploy.overwrite" from="${deployment.plan.environment}.overwrite"/>
<propertycopy name="deploy.user" from="${deployment.plan.environment}.user"/>
<propertycopy name="deploy.password" from="${deployment.plan.environment}.password"/>
<propertycopy name="deploy.forceDefault" from="${deployment.plan.environment}.forceDefault"/>

<echo>undeploy MDS app folder apps/${mds.application} </echo>
<antcall target="removeSharedData" inheritall="false">
<param name="wl_home" value="${wl_home}"/>
<param name="oracle.home" value="${oracle.home}"/>
<param name="serverURL" value="${deploy.serverURL}"/>
<param name="user" value="${deploy.user}"/>
<param name="password" value="${deploy.password}"/>
<param name="folderName" value="${mds.application}"/>
</antcall>
</target>

<target name="deployApplication">
<echo>deploy application ${application}</echo>
<property file="${env.CURRENT_FOLDER}/${applications.home}/${application}/"/>
<foreach list="${projects}" param="project" target="deployProject" inheritall="true" inheritrefs="false"/>
</target>

<target name="deployProject">
<echo>deploy project ${project} for environment ${deployment.plan.environment}</echo>

<property name="proj.compositeName" value="${project}"/>
<property name="proj.compositeDir" value="${env.CURRENT_FOLDER}/${applications.home}/${application}"/>
<propertycopy name="proj.revision" from="${project}.revision"/>
<propertycopy name="proj.enabled" from="${project}.enabled"/>

<echo>deploy compositeName ${proj.compositeName}</echo>
<echo>deploy compositeDir ${proj.compositeDir}</echo>

<antcall target="package" inheritall="false">
<param name="compositeDir" value="${proj.compositeDir}/${project}"/>
<param name="compositeName" value="${proj.compositeName}"/>
<param name="revision" value="${proj.revision}"/>
<param name="oracle.home" value="${oracle.home}"/>
<param name="java.passed.home" value="${java.passed.home}"/>
<param name="wl_home" value="${wl_home}"/>
<param name="sca.application.home" value="${proj.compositeDir}"/>
<param name="scac.application.home" value="${proj.compositeDir}"/>
<param name="scac.input" value="${proj.compositeDir}/${proj.compositeName}/composite.xml"/>
<param name="scac.output" value="${tmp.output.dir}/${proj.compositeName}.xml"/>
<param name="scac.error" value="${tmp.output.dir}/${proj.compositeName}.err"/>
<param name="scac.displayLevel" value="3"/>
</antcall>

<property name="deploy.sarLocation" value="${proj.compositeDir}/${proj.compositeName}/deploy/sca_${proj.compositeName}_rev${proj.revision}.jar"/>
<property name="deploy.configplan" value="${proj.compositeDir}/${proj.compositeName}/${proj.compositeName}_cfgplan_${deployment.plan.environment}.xml"/>

<propertycopy name="deploy.serverURL" from="${deployment.plan.environment}.serverURL"/>
<propertycopy name="deploy.overwrite" from="${deployment.plan.environment}.overwrite"/>
<propertycopy name="deploy.user" from="${deployment.plan.environment}.user"/>
<propertycopy name="deploy.password" from="${deployment.plan.environment}.password"/>
<propertycopy name="deploy.forceDefault" from="${deployment.plan.environment}.forceDefault"/>
<propertycopy name="deploy.server" from="${deployment.plan.environment}.server"/>
<propertycopy name="deploy.port" from="${deployment.plan.environment}.port"/>

<echo>deploy on ${deploy.serverURL} with user ${deploy.user}</echo>
<echo>deploy sarFile ${deploy.sarLocation}</echo>

<antcall target="deploy" inheritall="false">
<param name="wl_home" value="${wl_home}"/>
<param name="oracle.home" value="${oracle.home}"/>
<param name="serverURL" value="${deploy.serverURL}"/>
<param name="user" value="${deploy.user}"/>
<param name="password" value="${deploy.password}"/>
<param name="overwrite" value="${deploy.overwrite}"/>
<param name="forceDefault" value="${deploy.forceDefault}"/>
<param name="sarLocation" value="${deploy.sarLocation}"/>
<param name="configplan" value="${deploy.configplan}"/>
</antcall>

<echo>unit test sarFile ${proj.compositeName} </echo>

<antcall target="test" inheritall="false">
<param name="scatest.input" value="${project}"/>
<param name="scatest.format" value="junit"/>
<param name="scatest.result" value="${env.CURRENT_FOLDER}/${junit.output.dir}"/>
<param name="" value="${deployment.plan.environment}"/>
</antcall>

<echo>disable composite ${proj.compositeName} </echo>

<if>
<equals arg1="${proj.enabled}" arg2="false"/>
<then>
<antcall target="stopComposite" inheritall="false">
<param name="host" value="${deploy.server}"/>
<param name="port" value="${deploy.port}"/>
<param name="user" value="${deploy.user}"/>
<param name="password" value="${deploy.password}"/>
<param name="compositeName" value="${proj.compositeName}"/>
<param name="revision" value="${proj.revision}"/>
</antcall>
</then>
</if>
</target>
</project>


For development testing environment I need to have


And finally the cmd script to run this Ant script. To make this work we need the ant-contrib library on the classpath.

set ORACLE_HOME=C:\oracle\MiddlewareJdev11gR1PS1
set ANT_HOME=%ORACLE_HOME%\jdeveloper\ant
set PATH=%ANT_HOME%\bin;%PATH%
set JAVA_HOME=%ORACLE_HOME%\jdk160_14_R27.6.5-32


ant -f build.xml deployAll -Dbasedir=%ORACLE_HOME%\jdeveloper\bin

Here is the zip with all the files; extract it and put everything in the jdeveloper/bin folder.

11g SOA : How to set Title for Composite

When you have thousands of transactions processed through a SOA composite, it becomes almost impossible to find or audit any particular instance through the EM console unless you have a way to query a transaction by its business key.
One solution is to set the business key of the transaction as the title of the composite instance. Once this is done, you can use the built-in search functionality in the Instances tab of the EM console.
In 11g, you can set the composite title without adding any Java embedding activity to the process definition.
Add an assign activity and set the property as follows.
In the From section:
1. Select Type as "expression".
2. Look for the function called setCompositeInstanceTitle in "Mediator Extension Functions":
med:setCompositeInstanceTitle(concat("",<xpath for your variable>))
Note : For this to work, make sure you pass a string to this function. You can either use string() or concatenate with "" to make it a string, as shown above.
In the To section:
1. Select Type as "property".
2. Select/enter: tracking.compositeInstanceTitle
Note : This property is not visible in the available properties list, but it will work once you add it.

Asynchronous vs. Synchronous BPEL process

This article explains the difference between an asynchronous and a synchronous process.

I have tried to explain the difference with the help of a simple example below:

Suppose there are two processes, SynchronousBPELProcess and AsynchronousBPELProcess. As the names suggest, the former is a synchronous and the latter an asynchronous BPEL process. There is also a third process, which we'll call Client, that invokes the above processes.

Case 1: Client invokes SynchronousBPELProcess.
  1. Client invokes SynchronousBPELProcess.
  2. SynchronousBPELProcess gets instantiated and starts its operations while Client waits for the response from the SynchronousBPELProcess.
  3. SynchronousBPELProcess completes its operations and sends a response back to Client.
  4. Client continues and completes its processing.
Case 2: Client invokes AsynchronousBPELProcess
  1. Client invokes AsynchronousBPELProcess.
  2. AsynchronousBPELProcess gets instantiated and starts its operations while Client also continues to perform its operations.
  3. AsynchronousBPELProcess completes its operations and callback the Client with the response message.
Notice that when a synchronous process is invoked, its operations have to complete first and only then can the client resume its own, while in the asynchronous case both processes continue to perform their operations in parallel.

Fig1: An image showing bpel diagram of an asynchronous and a synchronous process.

What makes the difference?

Synchronous Process:

The synchronous process defines one two-way operation port to receive the request and send the response. Using the invoke activity, the client invokes the synchronous BPEL process on this port and waits to receive the response on the same port. As soon as the client receives the response from the BPEL process, it continues with its flow. On the BPEL process side, the synchronous BPEL process is instantiated on receiving the client request and sends back the reply, using the reply activity, on the same port on which the client is waiting.

Asynchronous Process:

In the asynchronous process, two one-way operation ports are defined: one to receive the request and one to send the response. On the client side, the client uses the invoke activity to invoke the asynchronous BPEL process and continues with its flow; it uses the receive activity to receive the response later in the flow. The asynchronous BPEL process receives the request on one port and sends back the reply from the other port (the callback port). To send the response, the asynchronous BPEL process invokes the client on the callback port.

Fig 2: An image showing the wsdl of an asynchronous and a synchronous process.

Fig 3: An image showing a call to asynchronous and synchronous process.

We also find different operation names like initiate, onResult and process in the .bpel file. These are just labels to differentiate between sync and async processes.

* A port is nothing but a method with input and output. A two-way operation port has both an input and an output, while a one-way operation port has only an input or an output.
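The two interaction styles can be sketched in plain Python, with a thread standing in for the BPEL engine and a callback function standing in for the callback port. All names are illustrative:

```python
import threading
import time

def bpel_process(request):
    time.sleep(0.1)           # simulate work inside the BPEL process
    return request.upper()

# Synchronous: one two-way operation - the client blocks until the reply arrives.
def invoke_sync(request):
    return bpel_process(request)   # caller waits right here

# Asynchronous: two one-way operations - invoke returns immediately,
# and the result arrives later on a callback (the onResult "port").
def invoke_async(request, callback):
    t = threading.Thread(target=lambda: callback(bpel_process(request)))
    t.start()                 # client is NOT blocked while the process runs
    return t

print(invoke_sync("order"))   # available only after the process finishes

results = []
t = invoke_async("order", results.append)
print("client keeps working...")   # runs before the callback delivers
t.join()                           # later, the callback has fired
print(results)                     # the response, delivered via callback
```

The sync call returns the answer directly; the async call hands it to the callback while the client carries on, which mirrors the invoke/receive split described above.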

Batch Processing with BPEL

Reading the File

The first task is to create a new BPEL project
within JDeveloper.  I decided to create a new BPEL project with an
empty BPEL process.  This was because I wanted to control the partner
links.  I needed control of the partner links because I was not going
to use SOAP over HTTP bindings, which are generated for a default synch
or asynch process.  Rather I wanted to use file and database adapters
for the bindings.  This gave me the blank process as shown

I added a partner link to this blank process and then used the file
adapter wizard to specify how I wanted it to behave.  To begin with I
specified that I wanted to read from a file.

Having specified the read operation, I then told the adapter where to find the
input files.  I used a physical path, but better practice is to use a
logical name which can be altered via the console, making it easier to
move the process between environments.  In addition to the directory I
also specified that I wanted the file to be copied to an archive
directory after processing.  Finally I specified that I wanted the file
deleted from its original location after processing.  These choices are
reflected in the screen below.

I now needed to provide some more information about the input files.  I
needed to provide a pattern to identify which files I wanted to
process; I chose a simple wildcard expression rather than a regular
expression.  Note that the file adapter uses the term message rather
than record, which doesn't entirely make sense as a message may contain
many records (as determined by the batch size) - nevertheless I will
use message to be consistent with the screens in the adapter wizard
even though this confuses records and messages.  It is at this point in
the process that the number of records in the file is
identified (1 or many) and also the batching of messages is
determined.  To start with I set the number of messages in a batch to
be 1.  This means that for every message (record) in the file a new
BPEL process will be created.  More on this later.

Having specified batch sizes and file matching expressions, I now needed to set
the frequency of checking the directory for new files (the Polling
Frequency) and also the minimum file age.  Minimum file age means that
a file must be at least this old before it can be processed.  This
allows for files that take a long time to create: the minimum file age
can be set to a large value to give the source system time to finish
creating the file.

Next I needed to specify the format of the records/messages in the file.  To
do this I used the "Define Schema for Native Format" wizard.

I said that I wanted to create a new native format; in my case the file
was made up of fixed-length fields, but it could have been in a delimited
format such as CSV (comma separated values).

I then provided a sample input file that I could use to identify the
field boundaries, and chose which parts of the file I wanted to
read in as a sample.

My file had more than one record type; in fact it had a header, an
arbitrary number of records, and a trailer.  With this in mind I chose
the appropriate type of file organisation.

Before specifying the individual fields I needed to choose a namespace and a
root element name.

I could now identify the discriminator for each record type - in this
case it was the first character of the record.  The wizard identified
the three record types and I changed the record type names to sensible
names that meant something to me.

I now used the wizard to identify the individual record field
boundaries.  Each type of record needs specifying separately.  I found
it best to do this with a mouse, as I had some problems when entering
numbers directly into the list.

Having identified the field boundaries, I then needed to identify the type of the
fields.  Again this has to be done for each record.

Finally we can review the generated XML Schema and select a filename to save it.

We have now defined our file format and mapped it onto an XML record
structure.  We can now return to the file adapter.

The file adapter is now set up to read a single record at a time from the
file.  I could now add a Receive to the process and use the newly
created FileInputService partner link.  When creating the receive I
marked the "Create Instance" check box, as I needed to create a new
instance when a message was received.

Writing to the Database

Having created a partner link and a receive activity
to read the input file, I now needed to create a partner link to write
it to the database.  To do this I used the database adapter and then
created a simple assign statement to set up the call to the database.
The resultant process shown below was then deployed.

Running the Process

Having deployed the process to the BPEL Process Manager server, the next step
was to run it.  To do this I dropped a sample file into the specified
input directory.  The file adapter detected the file, read it, and
created a process for each message in the file.  This resulted in a
large number of processes being created.  The process had two major problems:
  • Two processes had errors because
    they had no data record (the header and trailer records were passed to
    these processes)
  • A single file had created more
    than 200 processes, which made it difficult to see what was happening
An obvious solution to these
problems would be to batch multiple records into a single process.

Batching the Data

In order to
receive multiple records into a single process I went back to the File
Adapter wizard and altered the "Publish Messages in Batches of" field
to have a larger number than one; I chose fifty.

If I had deployed and
tested the process without any other changes I would have found only
one record being processed in each batch.  This is because I had yet
to modify the rest of the process to iterate over multiple records.
The first step is to figure out how many records I
need to process.  To calculate this I created a new integer variable
"NumberOfRecords" to hold the number of records and initialised it with
an assign statement as shown below.

The assignment calculates the number of Data elements underneath the
document root element Records.
I then created another integer
variable "CurrentRecord", initialised to 0, to act as a for loop
counter.  I then created a while statement to loop over all the Data
records in the message.  The while condition was
"CurrentRecord<NumberOfRecords" as shown below

Within the while loop I then created an assignment statement to set up the
write to the database.  Unfortunately it is not possible to use the wizard to completely set up the copy operation, so I followed this process:
  • Create assignment as if there were a single record rather than multiple. This gave me the following xpath expression
    • /ns3:Records/ns3:Data
  • Manually edit the assignment statement to add an array index to identify the specific record.  The expression now looked like this:
    • /ns3:Records/ns3:Data[bpws:getVariableData('CurrentRecord')]
When incrementing the CurrentRecord variable it is important to remember that, unlike Java and C/C++, XPath indices start at 1, not 0, so the CurrentRecord value must be incremented before using it as an index into the record set.
After deploying this process a 200 record file would now produce just 4 processes, making it much more manageable.
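The 1-based indexing and the count-then-loop pattern are easy to verify with Python's xml.etree, whose limited XPath support includes positional predicates. The element names mirror the Records/Data structure above; the record contents are invented:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<Records>"
    "<Data>first</Data><Data>second</Data><Data>third</Data>"
    "</Records>"
)

# Equivalent of counting the Data elements under the Records root.
count = len(doc.findall("Data"))
print(count)  # 3

# XPath positions start at 1, so Data[1] is the FIRST record, not the second.
print(doc.find("Data[1]").text)  # first
print(doc.find("Data[3]").text)  # third

# The while loop: increment CurrentRecord BEFORE using it as an index,
# exactly as in the BPEL process above.
current = 0
out = []
while current < count:
    current += 1                 # 1-based index, ready for use
    out.append(doc.find(f"Data[{current}]").text)
print(out)  # all three records, in order
```

Incrementing before indexing is what keeps the loop aligned with XPath's 1-based positions; incrementing after would raise an error on `Data[0]`.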

Reasons for Batching

So why would you want to batch messages?
  • Easier to track what is happening as fewer processes to manage.
  • Slightly more efficient within the BPEL PM - some testing I did indicated it was a little more efficient to batch, presumably because of the reduced process creation overhead.
How do I process a file with multiple records as a single process?
  • Treat it as a single message by not checking the "Files contain Multiple Messages" check box in step 5 of the File Adapter Configuration wizard.
  • Individual records in the message are indexed in the same way as multiple records in the multiple records example above.
Why would you not batch?
  • Really want each individual record to initiate a new process that is to be tracked individually, for example equipment provisioning requests, each of which is part of a separate customer order.
  • Want to simplify process by only handling a single record.
  • Too idle to deal with looping over sets of records in a message (oops, that was the same as the last one!)

Key Points to Remember

  • Use XPath to index into multiple records
    • Use wizard to create copy rules then add indexing afterwards
  • Use a while loop to iterate over records
  • XPath indices start at 1
So happy batching!

BPEL File Adapter Tutorial

Overview: In this tutorial I will explain how to read a CSV file using a ReadFileAdapter and then write a CSV file using a WriteFileAdapter. There are not two separate FileAdapters; depending on whether it performs a read or a write operation, we refer to the one FileAdapter as a ReadFileAdapter or a WriteFileAdapter.

The ReadFileAdapter receives the input data from a file, translates it according to the translation logic defined, and posts XML messages.

The WriteFileAdapter receives XML messages, translates them back into native data, and writes the result to a file.

The FileAdapter supports the following file formats.

1. XML
2. Delimited (the delimiter can be anything)
3. Fixed position
4. Binary data
You don’t need to use both a ReadFileAdapter and a WriteFileAdapter in a single process. For example, you can read data from a file using a ReadFileAdapter and write it to a database using a DBAdapter; conversely, you can fetch data from a database and write it to a file using a WriteFileAdapter. How you combine them is purely driven by the business requirement.
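As a rough sketch of what read-translate-write means, here is the same round trip in plain Python (this is only an analogy; the field names are invented, and the real adapter exchanges XML messages between the two steps):

```python
import csv
import io

# A delimited "file" of the kind the ReadFileAdapter would poll for.
raw = "empno,ename\n100,Smith\n101,Jones\n"

# Read side: translate the delimited data into structured records.
records = list(csv.DictReader(io.StringIO(raw)))

# Write side: translate the records back into delimited data.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["empno", "ename"], lineterminator="\n")
writer.writeheader()
writer.writerows(records)

print(out.getvalue() == raw)  # True: the translation round-trips losslessly
```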

System Requirements:

1. jDeveloper 10.1.3
2. SOA Suite 10.1.3
Connections Required:

1. Start the SOA Suite
2. Create Application Server Connection
3. Create Integration Server Connection
Start the SOA Suite:

If you don’t have SOA Suite already installed on your machine click here to download and install it.

If you have SOA Suite installed on your machine, navigate to Start >> All Programs >> Oracle >> Start SOA Suite

Once your Application Server starts successfully you will see the screen below

Create Application Server Connection

Navigate to the Connections Navigator by clicking View >> Connections Navigator or by pressing Ctrl+Shift+O

Right click on “Application Server” and click New Application Server Connection

Wizard opens with welcome screen

Click Next

Connection Name: AppServerConnection1 (you can change it to anything)

Connection Type: Oracle Application Server 10g 10.1.3

Click Next

Username: oc4jadmin

Password: welcome1(this is the default password, if you have changed it use your own password)

Click next

Leave everything as it is and click next

And then press Test Connection

If the status shows Success! you are good to proceed. If you get any errors, make sure your application server is running and try again.

Click Finish to close the wizard and return to Connection Navigator.

Create Integration Server Connection

Go to Connection Navigator by pressing Ctrl+Shift+o

Right click on “Integration Server”

Click “New Integration Server Connection”; the wizard opens with a welcome screen.

Click Next

You can leave the Name as it is or change it if needed

Click Next

Application Server: Select the Application Server you created in first step. If you have only one application server it will be automatically selected.

Hostname: localhost

Port Number: 8888

Click Next

Click ‘Test Connection’

If you receive the following message, your connection is successful.

Application Server: OK

BPEL Process Manager Server: OK

ESB Server: OK

Click finish to close the wizard

Create New Application

Open the Application Navigator by pressing Ctrl+Shift+A or go to View >> Application Navigator

If an application already exists you can create your BPEL process under it; if not, create a new application by right clicking Applications and selecting New

Change the application name as you need and leave all the remaining things as default and click OK

Click Cancel to skip project creation here. We will create a new project manually by selecting the BPEL Process Project in the next step.

Create New BPEL Process

Right Click on the Application that we created and click New Project

Wizard opens as shown

Select BPEL Process Project and click OK

Change the Name(optional) and

Template: Empty BPEL Process

Click Finish.

If you expand the BPEL Process you can see the files below which are created by default.

The middle panel of JDeveloper opens the BPELProcess.bpel file automatically; if not, open it by double clicking the file shown in the figure above.

In the .bpel file you can see 3 regions

1. Services(Left Side)
2. Main Activity Region
3. Services(Right Side)
On the Right Side of your screen you can see the component palette.

From the dropdown select ‘Services’ if it is not selected.

From services select “File Adapter”

Drag it to the Services region on the left and drop it.

Automatically ‘Adapter Configuration Wizard’ will pop up showing the welcome screen

Click Next

Enter Service Name as ‘ReadFileAdapter’ since we are going to use this adapter for reading the file.

Click Next

Select ‘ReadFile’ and click Next

Select the ‘Physical Path’ option and select the path where your CSV file is placed.

I have placed my .CSV file in the ‘C:\readfile’ directory

Deselect the option ‘Delete files after successful retrieval’

Click Next

Enter the pattern name in ‘Include Files with Name Pattern’

I have entered erpschools*.txt, which means that all files whose names start with the prefix ‘erpschools’ will be picked up for processing.


erpschools.txt will be picked

erpschools1.txt will be picked

erpschools2.txt will be picked

Any file that does not match the pattern will not be picked.
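The pattern behaves like ordinary filename globbing, which you can check with Python's fnmatch module (the file names here are just illustrations):

```python
from fnmatch import fnmatch

pattern = "erpschools*.txt"
for name in ["erpschools.txt", "erpschools1.txt", "schools.txt"]:
    print(name, "->", "picked" if fnmatch(name, pattern) else "not picked")
# erpschools.txt -> picked
# erpschools1.txt -> picked
# schools.txt -> not picked
```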

Click Next

Leave the default values and

Click Next

Click on ‘Define Schema for Native Format’ button

The Native Format Builder wizard will pop up with a welcome screen.

Click Next

Select ‘Delimited’ option and

Click Next

Select your delimited file.

My Delimited file looks like this

Click Next

Since our file contains only one record type, select the first option and

Click Next

Namespace: leave default value

Enter a name for element that will represent record: readrecord

Click Next

Click Next

Change the column names and types as needed and

Click Next

A .xsd (XML Schema Definition) file will be created by default

Click next

Click Finish to go back to Adapter Wizard

Schema Location and Schema Element will be populated with the values that we created just now.

Click next

Click Finish

Create Partner Link window will show up as above

Click OK

Now you should be able to see the ReadFileAdapter in the services region as shown above

Now in the component Palette select ‘Process Activities’

Select ‘Receive’ component and drag it to main region

When you move the component into the main region it changes color to yellow; then you can drop it.

Double click on Receive_1 component to edit

Click the ‘flashlight’ icon to the right of the ‘Partner Link’ field

Select ‘ReadFileAdapter’ and click OK

Now Partner Link and operation fields will be populated as shown

Click the ‘Auto Create’ button to the right of the Variable field to create a new variable. If you are not sure which icon to click, hover your cursor over the icons to see the hint text

Click OK

Check ‘Create Instance’ box and click apply

Click OK

Now the figure looks like this

There is a link created from ReadFileAdapter to Receive_1 component

Create WriteFileAdapter

Go to component palette and select services from drop down

Select and drag the FileAdapter to the Services region on the right side

FileAdapter Wizard will open with welcome screen as shown below

Click Next

ServiceName: WriteFileAdapter

Click Next

Select ‘Write File’ option

Click Next

Enter Directory path and file naming convention as shown above.

%SEQ% will increase the number from 1 onwards, which means that if you have 10 files to write your file names will be erp_1.txt, erp_2.txt, erp_3.txt, ..., erp_10.txt
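A quick sketch of how the %SEQ% placeholder expands (the substitution itself is done by the adapter at runtime; this just reproduces the resulting names):

```python
convention = "erp_%SEQ%.txt"

# Expand %SEQ% for ten files, numbered from 1 onwards.
names = [convention.replace("%SEQ%", str(seq)) for seq in range(1, 11)]

print(names[0], names[4], names[-1])  # erp_1.txt erp_5.txt erp_10.txt
```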

Click Next

Click Browse to select the schema file that we already created before

Select readrecord

Click OK

Click Next

Click Finish to close the wizard

Click Apply and OK

The WriteFileAdapter will show up in the Services region on the right as shown above

Now go to the Component Palette and select ‘Process Activities’

Select the ‘Invoke’ component from the component palette and drop it below the ‘Receive_1’ component

Double click the ‘Invoke_1’ component to edit it

Click the flashlight icon to select the partner link

Click ok

Click the ‘Automatically create input variable’ icon to create a new variable

Click ok

Click Apply and OK

Now drag and drop the ‘Transform’ component below the ‘Receive_1’ component and above the ‘Invoke_1’ component

Double click the ‘Transform_1’ component to edit its settings

Select as follows

Source variable: Receive_1_Read_InputVariable

Target variable: Invoke_1_Write_InputVariable

Click Apply and OK

The Transformation_1.xsl file will open as shown above

Select ‘tns:readrecord’ in the source and drag it to ‘tns:readrecord’ on the target to map the fields.

Auto Map Preferences will pop up as shown above

Click OK to auto map

All fields will be mapped automatically as shown above

Now select BPELProcess.bpel file on top and click validate icon to validate the process

Once it validates without errors we are ready to deploy the process

Deploy BPEL Process

Right click on BPEL Process and select Deploy option as shown below

If this process already exists on the server, a pop up will ask for a new version number; if not, deployment will start right away

If your deployment is successful you will see the message


Initiate BPEL Process

Open your browser and type the following URL


If your hostname is different change it in the URL

Username: oc4jadmin

Password: welcome1(default)


You can see the list of deployed BPEL Process on the left panel as shown below

Click on the process we deployed right now

You will be taken to Initiate screen where you have option to post XML Message

Now your process has been initiated

Go to your writefile directory to see your file