Contents

HTTP
    HTTP Listener
    HTTP Request
HTTPS
    Consuming Secured Service
        Using HTTP Request Connector with SSL enabled
        HTTP Request Connector Configuration to access SSL enabled service
        Using Web Service Consumer
Routing
    Splitters
        Collection Splitter and Collection Aggregator
        Message Chunk Splitter and Message Chunk Aggregator
    Scatter Gather
    For Each
    Main Flow, Sub Flow and Flow Reference
Filters
Data Mapper
    DataMapper Concepts
        Basic Usage with Example
        1. Viewing sample mapping values
        2. Input and Output Metadata
        3. Propagating DataSense data
        4. Mapping only even numbered values
        5. Streaming large files through DataMapper
        6. Using MEL to invoke Java functions
        7. Using Flows as Lookup Tables
JMS
    Queues
        Configuration with Example
    Topics
        Example
    Example to understand how JMS uses serializing and de-serializing objects
Database
    Database URL
    INSERT using Template Query
    INSERT using Parameterized Query
    INSERT using Dynamic Query
    UPDATE using Parameterized Query
    UPDATE using Bulk Mode
    Execute DDL
    Bulk Execute
    Stored Procedure
    DELETE
    SELECT
Building SOAP web services in Mule
    Simple class as a web service
    Consuming using Simple Client
    Creating a service using JAX-WS service
    Creating Client using JAX-WS client
    Securing Web services
    Consuming using Web service consumer
HTTP
HTTP Listener
The HTTP Listener connector provides a way to listen for HTTP requests. The figure below shows the HTTP Listener.
Figure-3 shows the HTTP Listener configuration. Protocol, Host and Port are required fields. If we do not supply values for any of these, defaults are applied. The figure below shows the defaults.
Figure-4 shows the configuration for the Path element in the HTTP Listener (Figure-2), highlighted in green. All flows configured to use the same HTTP Listener connector (Figure-3) share the same base URL. The path is appended to the end of that URL and is used to reach a specific application/flow.
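The listener setup described above can be sketched in XML as follows (a minimal sketch; the connector name, port and path are illustrative):

```xml
<http:listener-config name="HTTP_Listener_Configuration"
    host="0.0.0.0" port="8081" doc:name="HTTP Listener Configuration"/>

<flow name="listenerDemoFlow">
    <!-- The path is appended to the listener's base URL: http://localhost:8081/demo -->
    <http:listener config-ref="HTTP_Listener_Configuration" path="/demo" doc:name="HTTP"/>
    <set-payload value="Hello from the listener" doc:name="Set Payload"/>
</flow>
```

Several flows can reference the same listener-config, each with its own path.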
HTTP Request
The HTTP Request connector provides the most practical way to consume an external HTTP service. When sending HTTP requests, you can choose which method to use (GET, POST, etc.) and may include a body, headers, attachments, query parameters, form parameters and URI parameters. The response is then received by the connector and passed on to the next element in the flow.
Figure-6 shows the HTTP Request Configuration. Like HTTP Listener, HTTP Request can also have a global
connector defined. This global connector is similar to the HTTP Listener connector created.
Parameters let us supply the values that the service we are invoking expects. These parameters can be headers, query parameters, URI parameters and so on; we can choose from the list of options provided. We need to provide a Name and a Value for each parameter we create; both fields also accept dynamic (MEL) expressions.
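The request configuration and parameters described above can be sketched as follows (host, path and parameter names are illustrative):

```xml
<http:request-config name="HTTP_Request_Configuration"
    host="example.org" port="80" doc:name="HTTP Request Configuration"/>

<flow name="requestDemoFlow">
    <http:request config-ref="HTTP_Request_Configuration"
                  path="/api/employees" method="GET" doc:name="HTTP">
        <http:request-builder>
            <!-- Name/Value pairs; values may be dynamic MEL expressions -->
            <http:query-param paramName="id" value="#[flowVars.employeeId]"/>
            <http:header headerName="Accept" value="application/json"/>
        </http:request-builder>
    </http:request>
</flow>
```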
Figure-9: Response
Text highlighted in red shows the custom message that we have set as shown in Figure-5.
Text highlighted in blue shows the response generated from our service invocation.
HTTPS
The HTTPS connector is similar to the HTTP connectors shown above; the only difference is that it is SSL enabled and uses https instead of http as the protocol. Its configuration is otherwise the same as the HTTP connector's.
Figure-11 shows the TLS/SSL tab in Connector Configuration popup for HTTPS.
There are two ways to provide the certificate and keystore files needed to access an application over HTTPS.
1. Use TLS Config: This option creates a TLS configuration for the specified listener; it is not accessible outside the HTTP Listener in which it was created. Trust Store Configuration and Key Store Configuration details need to be provided. The trust store accepts a .cer file path and the password for that certificate. The key store accepts a .jks file path, the key password and the keystore password that were used while generating the keystore.
2. Use TLS Global Config: This option creates a global TLS configuration that can be used by any HTTP connector to enable HTTPS. It likewise requires the Key Store and Trust Store files and the passwords for those files.
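The global variant (option 2) can be sketched as a reusable TLS context referenced from an HTTPS listener configuration; file names and passwords below are placeholders:

```xml
<!-- Global TLS context; usable by any HTTP connector -->
<tls:context name="globalTlsContext">
    <tls:trust-store path="truststore.jks" password="trustStorePassword"/>
    <tls:key-store path="keystore.jks"
                   keyPassword="keyPassword" password="keystorePassword"/>
</tls:context>

<http:listener-config name="HTTPS_Listener_Configuration" protocol="HTTPS"
    host="0.0.0.0" port="8443" tlsContext-ref="globalTlsContext"/>
```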
We can either create the certificate and keystore ourselves, or obtain the certificate from the HTTPS service we are invoking.
Figure-14: WSDL
Figure-14 shows the WSDL rendered after choosing the Continue to this website option shown in Figure-13.
Figure-16 shows the Configuration XML for the secured service consumer shown in the figure above.
Figure-18 shows sample request and response from SOAPUI for the secured service consumer.
Figure-21 shows the Request and Response for the Secured service consumer using the SOAPUI.
Routing
The Routing module reviews the different types of routers and how they control the way messages are sent and received by components. A message can be routed in different ways; the following are covered in this example:
Scatter gather
For each
Filters
Splitters
Splitters are used to split a message and process the split messages in parallel. After processing completes, the messages are aggregated by aggregator components. Below is the main flow diagram for splitters.
The flow above exposes an HTTP service that implements a collection splitter and a message chunk splitter. It expects a query parameter named splitter. If the splitter parameter value is collection, the choice
router routes the flow to the collection splitter implementation; if the value is chunk, it routes to the message chunk splitter implementation.
In the flow above, after the Logger component (which logs the payload), there are two important message processors: Resequencer and Collection Aggregator. While the elements of the List are processed individually, they may change order. The Resequencer is used to restore the order of the List elements, and the Collection Aggregator aggregates the individually processed message payloads.
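The splitter/aggregator chain described above can be sketched as a single flow (flow and component names are illustrative):

```xml
<flow name="collectionSplitterFlow">
    <!-- Split the List payload; each element is then processed individually -->
    <collection-splitter doc:name="Collection Splitter"/>
    <logger message="#[payload]" level="INFO" doc:name="Logger"/>
    <!-- Restore the original element order, then rebuild the List -->
    <resequencer doc:name="Resequencer"/>
    <collection-aggregator doc:name="Collection Aggregator"/>
</flow>
```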
This splitter first converts the message into a byte array and then splits the array into chunks. Each chunked message is routed to another flow via a VM queue in one-way mode. A Message Chunk Aggregator is used to aggregate the chunked messages; a Byte Array to String transformer is then needed to convert the aggregated payload back into a String.
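A minimal sketch of the chunking pair, assuming a VM queue named chunks and an illustrative chunk size:

```xml
<flow name="chunkSplitterFlow">
    <!-- Split the message bytes into fixed-size chunks -->
    <message-chunk-splitter messageSize="10" doc:name="Message Chunk Splitter"/>
    <vm:outbound-endpoint path="chunks" exchange-pattern="one-way" doc:name="VM"/>
</flow>

<flow name="chunkAggregatorFlow">
    <vm:inbound-endpoint path="chunks" exchange-pattern="one-way" doc:name="VM"/>
    <!-- Reassemble the chunks, then turn the byte array back into a String -->
    <message-chunk-aggregator doc:name="Message Chunk Aggregator"/>
    <byte-array-to-string-transformer doc:name="Byte Array to String"/>
</flow>
```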
Scatter Gather
Scatter Gather is used to send a message to multiple routes concurrently. It collects the responses from all the routes and aggregates them into a single message.
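A minimal sketch of a Scatter-Gather with two routes (the referenced flow names are illustrative):

```xml
<flow name="scatterGatherFlow">
    <scatter-gather doc:name="Scatter-Gather">
        <!-- Each child element is a route, executed concurrently -->
        <flow-ref name="inventoryServiceFlow"/>
        <flow-ref name="pricingServiceFlow"/>
    </scatter-gather>
    <!-- The payload is now the aggregated result of all routes -->
    <logger message="#[payload]" level="INFO" doc:name="Logger"/>
</flow>
```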
For Each
The Foreach scope splits a collection into elements and processes them iteratively through the
processors embedded in the scope, then returns the original message to the flow.
Since For Each expects a collection object, a Java component is used to generate a List object.
The properties above are available in the For Each scope. Collection accepts a MEL expression that provides a collection object for the scope to iterate over. Counter Variable Name names the variable that stores the count of iterations. Batch Size partitions the collection into sub-collections of the specified size. Root Message Variable Name holds the message as it was before being split.
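The properties map onto attributes of the foreach element; a sketch with illustrative values:

```xml
<foreach collection="#[payload]" counterVariableName="counter"
         batchSize="2" rootMessageVariableName="rootMessage" doc:name="For Each">
    <!-- Inside the scope, payload is the current element (or sub-collection when
         batchSize > 1) and #[counter] holds the iteration count -->
    <logger message="Item #[counter]: #[payload]" level="INFO" doc:name="Logger"/>
</foreach>
```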
Filters:
Filters are used to filter messages using Mule expressions.
The flow above accepts an HTTP request and filters the message using an Expression filter; by wrapping it in a Message filter, it also throws an exception when the Expression filter is not satisfied.
The Expression filter allows you to write a Mule expression. If the expression returns true, processing continues to the next message processor; otherwise the flow is discarded without throwing any exception. Here the condition checks whether the payload is an instance of java.util.List.
If we need to throw an exception when the Expression filter returns false, the Expression filter must be wrapped in a Message filter with the throwOnUnaccepted attribute set to true, as shown in the snippet below.
<message-filter throwOnUnaccepted="true" doc:name="Message-filter-throw-exception">
    <expression-filter expression="#[payload instanceof com.techm.splitters.SplitterCollections]"/>
</message-filter>
Data Mapper
DataMapper is a Mule transformer that delivers simple, yet powerful, visual design of complex data transformations for use in Mule flows, including:
Filtering, extraction and transformation of input data using XPath and powerful scripting
Augmenting data with input parameters and lookups from other data sources
Inputs and outputs can be flat (that is, row-structured) data such as CSV files or Excel spreadsheet data, or structured data in the formats supported throughout Mule: XML, JSON, key/value Maps and trees of Plain Old Java Objects (POJOs).
DataMapper Concepts:
Anypoint DataMapper takes data in a specific format and outputs the same data in the format
of your choice. For example, you can take data stored as XML and output the same data in JSON
format. Both the input and the output can be in any of the formats supported by Mule:
Flat, row-oriented formats:
CSV
Fixed-width
MS Excel sheets
Structured formats:
XML
JSON
Key-value Maps
You configure DataMapper using its GUI, called the graphical mapping editor. This editor has
two panes: an Input pane and an Output pane, where you define your input metadata (format,
names of fields, etc.) and your output metadata respectively.
In the image above, you select XML from the Type drop-down menu in the Input pane and
provide an .xsd file to generate the structure, and JSON in the Output pane.
3. Click Create mapping (see image above) to create an initial data mapping. DataMapper will
automatically map corresponding fields between the input and output data and will leave any
other fields unmapped.
4. If necessary, graphically modify the mapping, defining input elements and attributes to
output elements and attributes:
Note: unlike most components in Anypoint Studio, the DataMapper doesn't offer a way of being configured via XML code. Mappings must always be created via the GUI; they are then stored as .grf files in the /mappings folder. All you can do via your XML code is reference one of these existing mapping .grf files.
Reload Metadata:
Step 1: Right-click your main input mapping item (in the example above, companies2), and
select Add field. Enter a name for your new field, use the drop-down to define the type, then
click OK to save.
Step 2: Click the magic wand, then select Reload Metadata.
Step 3: Watch as DataMapper loads a sample value for your new field; in this case, the value is null. The example below has a new field for has_given_contact_permission.
Recreate Metadata:
Step 1: Add an input field to your CSV.
Step 2: In your Input panel, click Re-Create Metadata. Browse to select your newly modified
CSV example file, and then click OK. The new field appears in the Input panel.
Step 2: Configure each Salesforce connector, testing the connectivity of each. See Testing
Connections for details.
Step 3: Drop a DataMapper between the Salesforce connectors.
Step 4: Double-click to open the DataMapper. DataSense has already populated the input and output configurations, pulled automatically from each connector.
Step 5: Click Finish and watch all necessary input and output fields appear, ready for drag-and-drop mapping.
Example:
4. Mapping only even numbered values:
Here is a sample to illustrate this activity: consider the following XML as input; the expected output is the XML containing only the even ids:
Then use a Rule to map the input values to the output only when the id is even.
The HTTP endpoint accepts a message containing a large file, which it passes into a DataMapper. Passing through a Logger, the message then reaches a Foreach scope that wraps a Database endpoint. DataMapper must create iterable objects from the file so that the Foreach can process the items iteratively and push them into the database. To manage the processing of this large file, you can enable streaming on DataMapper.
Step 1: To enable streaming, click to open the DataMapper Properties (upper right hand corner
of the DataMapper console).
Step 2: Create any mapping you want, then click Script (upper right corner of the DataMapper
console) to view the script of the mapping which looks something like this: output.name =
input.name.
Step 3: Click to set your cursor just after input.name then add .toLowerCase() . This
modification invokes a Java function to change the input name to lowercase. See example
below.
Step 4: We can also call a java class in the script tag and check the example below:
TIP! We can also use auto-complete to invoke a Java function. Set your cursor at the end of input.name, then hit Ctrl + Space to display a list of auto-complete options.
JMS
JMS (Java Message Service) is a widely-used API for Message Oriented Middleware. It allows
communication between different components of a distributed application to be loosely
coupled, reliable, and asynchronous.
JMS supports two models for messaging:
Queues - point-to-point
Topics - publish/subscribe
Mule's JMS transport lets you easily send and receive messages to queues and topics for any message service which implements the JMS specification.
Queues:
In the point-to-point or queuing model, a sender posts messages to a particular queue and a
receiver reads messages from the queue. Here, the sender knows the destination of the
message and posts the message directly to the receiver's queue. It is characterized by the
following:
The producer does not have to be running at the time the consumer consumes the message, nor does the consumer need to be running at the time the message is sent.
Mule will initialize the ActiveMQ connector with a default instance of the ActiveMQ connection factory and establish a TCP connection to the remote standalone broker running on localhost and listening on port 61616.
3. Enqueue to JMS Queue
We will use the request payload received from an HTTP inbound endpoint to seed the
ActiveMQ Queue. Open the jms message flow and drag and drop an HTTP endpoint on to the
flow.
Double-click on the HTTP endpoint to bring up the properties dialog. Specify jms_queue for
Path. This will make the HTTP endpoint accessible using URL http://localhost:7777/jms_queue.
Set a payload that you want to add to the queue.
Drag and drop a JMS endpoint next to the HTTP inbound endpoint.
Double-click the JMS endpoint to bring up the properties dialog. Specify queue for Queue
name.
Select Active_MQ for Connection Reference in the Connector Configuration that we created
in Step 2.
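Steps 1-3 can be sketched in XML as follows (a minimal sketch assuming the Active_MQ connector from Step 2 and the default ActiveMQ broker URL):

```xml
<jms:activemq-connector name="Active_MQ" brokerURL="tcp://localhost:61616"
    validateConnections="true" doc:name="Active MQ"/>

<flow name="jmsQueueProducerFlow">
    <!-- Reachable at http://localhost:7777/jms_queue -->
    <http:inbound-endpoint host="localhost" port="7777" path="jms_queue" doc:name="HTTP"/>
    <set-payload value="Message for the queue" doc:name="Set Payload"/>
    <!-- Enqueue the payload on the queue named "queue" -->
    <jms:outbound-endpoint queue="queue" connector-ref="Active_MQ" doc:name="JMS"/>
</flow>
```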
4. Create a JMS receiver
Use a JMS endpoint to receive the messages from the queue. Its configuration is as follows:
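The receiver side can be sketched as its own flow, assuming the same Active_MQ connector defined earlier:

```xml
<flow name="jmsQueueReceiverFlow">
    <!-- Consume messages from the queue named "queue" -->
    <jms:inbound-endpoint queue="queue" connector-ref="Active_MQ" doc:name="JMS"/>
    <logger message="Received: #[payload]" level="INFO" doc:name="Logger"/>
</flow>
```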
The output you receive after execution is the payload set by the JMS client.
Note: only one client reads a given message from a queue; messages read from the queue are removed from it. If you want to perform transactions on top of JMS, the Transaction settings come in handy.
Topics:
The publish/subscribe model supports publishing messages to a particular message topic.
Subscribers may register interest in receiving messages on a particular message topic. In this
model, neither the publisher nor the subscriber knows about each other. A good analogy for
this is an anonymous bulletin board. The following are characteristics of this model:
There is a timing dependency between publishers and subscribers. The publisher has to
create a message topic for clients to subscribe.
The subscriber has to remain continuously active to receive messages, unless it has established a durable subscription. In that case, messages published while the subscriber is not connected are redelivered when it reconnects.
Note: the configuration is the same as for the queue, except that we use a topic in the JMS Connector Configuration.
Example:
JMS Publisher Flow Configuration:
Open the jms message flow and drag and drop an HTTP endpoint on to the flow. Double-click
on the HTTP endpoint to bring up the properties dialog. Specify jms_topic for Path. This will
make the HTTP endpoint accessible using URL http://localhost:7777/jms_topic.
Set a payload that you want to add to Publish.
Drag and drop a JMS endpoint next to the HTTP inbound endpoint.
Double-click the JMS endpoint to bring up the properties dialog. Specify topic for Topic name.
Select Active_MQ for Connection Reference in the Connector Configuration that we created
earlier.
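The publisher and a subscriber can be sketched as follows, assuming the Active_MQ connector defined earlier:

```xml
<flow name="jmsTopicPublisherFlow">
    <!-- Reachable at http://localhost:7777/jms_topic -->
    <http:inbound-endpoint host="localhost" port="7777" path="jms_topic" doc:name="HTTP"/>
    <set-payload value="Message for all subscribers" doc:name="Set Payload"/>
    <!-- Publish to the topic named "topic" -->
    <jms:outbound-endpoint topic="topic" connector-ref="Active_MQ" doc:name="JMS"/>
</flow>

<flow name="jmsTopicSubscriberFlow">
    <jms:inbound-endpoint topic="topic" connector-ref="Active_MQ" doc:name="JMS"/>
    <logger message="Subscriber got: #[payload]" level="INFO" doc:name="Logger"/>
</flow>
```

Every active (or durable) subscriber flow receives its own copy of each published message, unlike the queue case.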
Specify jms_serializable_queue for Path. This will make the HTTP endpoint accessible using URL http://localhost:7777/jms_serializable_queue.
2. Create a Java Class that implements the Serializable interface as below:
4. Drag and drop a JMS endpoint next to the HTTP inbound endpoint.
Double-click the JMS endpoint to bring up the properties dialog.
Specify serial_queue for queue name. Select Active_MQ for Connection Reference in the
Connector Configuration that we created earlier.
6. Use a JMS endpoint to receive the messages on the destination, with the configuration below; Active_MQ was configured earlier.
Database
The Database connector replaces the JDBC connector. It allows us to connect to a database and run different SQL operations against it: SELECT, INSERT, UPDATE, DELETE, stored procedures and DDL. The connector lets us perform predefined queries as well as queries that take the connector's input to supply variable parameters, or even to construct sections of the query dynamically. All the examples shown in this document are executed against a PostgreSQL database.
Figure-38 shows the Database configuration dialog, which opens when we click the + symbol highlighted in red in Figure-37. There are two ways to configure a database for access through the Database connector:
1. Database URL
2. Configure via spring-bean
Database URL
The screenshot below shows the configuration using a Database URL. It requires values for two attributes: URL and Driver Class Name.
URL is the connection string. We can include the user name and password, if required, to access the database. This is similar to obtaining a connection in Java using JDBC.
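A sketch of the generic Database configuration for PostgreSQL (database name, user and password are placeholders):

```xml
<db:generic-config name="Database_Configuration"
    url="jdbc:postgresql://localhost:5432/testdb?user=postgres&amp;password=secret"
    driverClassName="org.postgresql.Driver"
    doc:name="Generic Database Configuration"/>
```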
Figure-40 shows the type of statements available for the selected operation.
Figure-41 shows the Advanced tab and the options available in it. In this tab we can provide the Auto
generated Keys so that we need not include them in INSERT, UPDATE statements. These columns will
have auto generated value or a default value. In the example shown, id is the Auto generated/Auto
Incremented column hence we need not supply a value while INSERTing a row. created column is
TIMESTAMP and the default value given for this is CURRENT_TIMESTAMP. So whenever a row is created
or modified current TIMESTAMP will be saved into this column against the row that is created or
modified.
Transactional Action is optional; it offers a list of actions from which we can select one. The default is JOIN_IF_POSSIBLE; the other options are ALWAYS_JOIN and NOT_SUPPORTED.
Figure-42 shows how to insert a record in database table using Template Query (shown in Figure-40).
The Database configuration is same as shown in Figure-38.
Figure-43 shows the Database connector configuration for INSERT using Template Query
Figure-44 shows the Template Query global configuration; this window opens when we click the + symbol highlighted in red (Figure-43).
Query Type is the type of query we want to execute; there are two options: Parameterized Query and Dynamic Query.
Parameterized Query with named parameters is the SQL statement we want to run. We can either provide values directly or use named parameters; in this case, it accepts named parameters. Input parameters are given in the Input Parameters section as shown in Figure-44, which defines four parameters (firstname, lastname, email, phone) whose values are assigned from flow variables; the same parameters are used in the parameterized query.
Dynamic Query can accept a query prepared outside the connector. There are no input parameters for this option, since the query is prepared outside the connector.
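The named-parameter setup described above can be sketched as a global template query referenced from a db:insert (table and parameter names follow the example; syntax is a sketch of the Mule 3 db module):

```xml
<db:template-query name="insertEmployee">
    <db:parameterized-query><![CDATA[
        INSERT INTO employee (firstname, lastname, email, phone)
        VALUES (:firstname, :lastname, :email, :phone)
    ]]></db:parameterized-query>
    <!-- Named parameters resolved from flow variables -->
    <db:in-param name="firstname" defaultValue="#[flowVars.firstname]"/>
    <db:in-param name="lastname"  defaultValue="#[flowVars.lastname]"/>
    <db:in-param name="email"     defaultValue="#[flowVars.email]"/>
    <db:in-param name="phone"     defaultValue="#[flowVars.phone]"/>
</db:template-query>

<db:insert config-ref="Database_Configuration" doc:name="Database">
    <db:template-query-ref name="insertEmployee"/>
</db:insert>
```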
Figure-45 shows the Expression component used to parse payload and assign the values to flow
variables required to insert a record in a database table.
Figure-46 shows the Configuration XML for the INSERT using Template Query
Figure -47 shows the request and response for Insert using Template Query
Figure-48 shows the flow configuration for inserting a record using a Parameterized Query. The flow configuration is similar to the one shown in INSERT using Template Query; the only change is the Database connector.
Figure-49 shows the Database connector configuration that uses a parameterized query to insert a record into a database table. Values for the flow variables are set in the Expression component used in the flow; this is the same as the one used for INSERT using Template Query.
Type - Parameterized
Operation - Insert
Figure-50 shows the Configuration XML for INSERT using Parameterized Query.
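The connector element can be sketched inline, with MEL expressions as parameter values (assuming the Database_Configuration and flow variables from the earlier example):

```xml
<db:insert config-ref="Database_Configuration" doc:name="Database">
    <!-- Parameter values are supplied directly as MEL expressions -->
    <db:parameterized-query><![CDATA[
        INSERT INTO employee (firstname, lastname, email, phone)
        VALUES (#[flowVars.firstname], #[flowVars.lastname],
                #[flowVars.email], #[flowVars.phone])
    ]]></db:parameterized-query>
</db:insert>
```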
Figure-54 shows the use of a dynamic query to insert a record into a database table. In this example, the query is prepared in the Expression component and stored in a flow variable; the same flow variable, dynamicInsertStmt, is given as input to the Dynamic Query.
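A sketch of the pair: the Expression component builds the SQL string and the connector executes it (payload field names are illustrative):

```xml
<expression-component doc:name="Expression"><![CDATA[
    // Build the full INSERT statement as a string (no parameter binding)
    flowVars.dynamicInsertStmt = "INSERT INTO employee (firstname, lastname, email, phone) VALUES ('"
        + payload.firstname + "', '" + payload.lastname + "', '"
        + payload.email + "', '" + payload.phone + "')";
]]></expression-component>

<db:insert config-ref="Database_Configuration" doc:name="Database">
    <db:dynamic-query>#[flowVars.dynamicInsertStmt]</db:dynamic-query>
</db:insert>
```

Note that dynamic queries concatenate values into the SQL text, so they should only be built from trusted input.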
Figure-55 shows the Configuration XML to insert a record using Dynamic Query.
Figure-56 shows the request and response to insert a record using Dynamic Query.
Figure-58 shows the Database connector configuration to update a record using Parameterized query
Type-Parameterized
Operation-Update
Figure-59 shows the configuration XML for updating a record's data using a Parameterized query.
Figure-60 shows the request and response for updating a record using a Parameterized query. The response for this operation is the number of rows updated; in this example, the response is 1.
Figure-63 shows the Expression component used to fetch data from the payload. In the code shown below, a map is prepared for each employee from the input payload, and the resulting collection is set as the payload, which the Database connector uses to update the data in a database table.
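The bulk-mode connector can be sketched as follows; in bulk mode the payload is expected to be a collection, and the statement's MEL expressions are resolved once per element (column names are illustrative):

```xml
<db:update config-ref="Database_Configuration" bulkMode="true" doc:name="Database">
    <!-- Runs once per element of the collection payload -->
    <db:parameterized-query><![CDATA[
        UPDATE employee SET phone = #[payload.phone] WHERE email = #[payload.email]
    ]]></db:parameterized-query>
</db:update>
```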
Figure-64 shows the configuration XML for a Database update using Bulk Mode.
Figure-65 shows a sample request and response for updating multiple records using Bulk Mode. The response shows whether each record was updated: 1 indicates the update succeeded, 0 indicates failure.
Execute DDL
Using this option we can perform a DDL operation. The connector configuration is similar to the one shown in INSERT using Template Query; the only change is in the Database connector.
Figure-67 shows DDL. The ALTER statement shown adds a new column lastModified to the employee
table.
Figure-68 shows the configuration XML for the Execute DDL operation.
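The ALTER statement from the example can be sketched in the connector like this (the DDL goes directly in the element body):

```xml
<db:execute-ddl config-ref="Database_Configuration" doc:name="Database">
    <![CDATA[ALTER TABLE employee ADD COLUMN lastModified TIMESTAMP]]>
</db:execute-ddl>
```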
Figure-69 shows the Request and Response for the Execute DDL flow. Response 0 indicates the
operation is successful.
Bulk Execute
The Bulk Execute operation available in the Database connector lets us execute multiple SQL statements through a single connector. This is different from the Bulk Mode seen in UPDATE using Bulk Mode: Bulk Mode executes the same statement against different sets of data provided as a collection, whereas Bulk Execute lets us specify multiple SQL statements in the same query text and executes them all.
Figure-71 shows the Database connector for the Bulk Execute operation. In the query text field we have provided three SQL statements, each terminated with a semicolon (;). In this example we execute an INSERT, an UPDATE and a DELETE. Values for the INSERT statement are set using an Expression component: the input payload is parsed there and the required values are stored in flow variables referenced in the query text.
Figure-72 shows the Configuration XML for the Bulk Execute operation.
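A sketch of the query text with three semicolon-terminated statements (table, ids and the assumption that flow-variable expressions are resolved in the text follow the example above):

```xml
<db:bulk-execute config-ref="Database_Configuration" doc:name="Database"><![CDATA[
    INSERT INTO employee (firstname, lastname, email, phone)
        VALUES (#[flowVars.firstname], #[flowVars.lastname],
                #[flowVars.email], #[flowVars.phone]);
    UPDATE employee SET phone = '999-0000' WHERE id = 2;
    DELETE FROM employee WHERE id = 3;
]]></db:bulk-execute>
```

The response is an array with the affected-row count of each statement, in order.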
Figure-73 shows sample request and response for Bulk Execute operation. Response indicates the
number of rows created, deleted and updated by executing the 3 statements.
Stored Procedure
The Database connector provides an option to execute stored procedures that are stored on the database server. This is similar to calling a stored procedure using CallableStatement in Java. The Database connector configuration is similar to the one shown in INSERT using Template Query; the only change is in the operation.
Figure-74 shows the Flow configuration to call a stored procedure using Database connector.
77
Database
Figure-75 shows the Database connector configuration to execute a stored procedure. We can choose any Query Type from the drop-down. In this example we have chosen Dynamic; the other options are Parameterized Query and Template Query. The configuration for these query types is the same as shown in INSERT using Template Query, INSERT using Parameterized Query, and INSERT using Dynamic Query.
Figure-76 shows the SQL for the stored procedure get_emp_details. This stored procedure takes an employee id as an IN parameter and returns employee information as an OUT parameter.
Figure-77 shows the configuration XML for the Stored Procedure operation using the Database connector.
Figure-78 shows a sample request and response for the stored procedure flow.
DELETE
The Database connector provides an option to delete record(s) from a database table using the DELETE operation. The database configuration is similar to the ones shown above; the change is in the Database operation. Figure-79 shows the flow configuration for the DELETE operation.
Figure-80 shows the Database connector configuration to perform the DELETE operation. Bulk Mode and the Query Types (Dynamic, Parameterized, Template Query) shown in previous sections apply here as well; the configuration remains the same for all of them.
Figure-81 shows the configuration XML for the DELETE operation using the Database connector.
Figure-82 shows a sample request and response for the DELETE operation; the response shows the number of rows deleted.
SELECT
The Database connector provides an option to fetch record(s) from a database table using the SELECT operation. The database configuration is similar to the ones shown above; the change is in the Database operation. Figure-83 shows the flow configuration for the SELECT operation.
Figure-86 shows a sample request and response for the SELECT operation in the Database connector.
The above figure shows the flow configuration to build a SOAP web service using the CXF connector provided by Mule.
As shown in Figure-2, click the Generate from WSDL button if you are building a WSDL-first service. Give the WSDL location and the package name (for the generated source files) in the popup; CXF will generate the source files in the specified package.
The above figure shows the XML configuration to consume a service using simple-client. Here, all the Java classes used to create the service need to be copied to the client application. As with service creation, we provide the interface as the value of the serviceClass attribute in <cxf:simple-client>; no implementation class is required. After configuring the simple-client, we invoke the service using the outbound endpoint.
The configuration shown in the above figure exposes a WSDL (generated by the service) to work as a proxy. In the above figure, the SOAP component is configured as a Proxy Service.
The above figure shows the details configured in CXF. The values for Port, Namespace, and Service are the same as those in the WSDL.
The above figure shows the properties available for the Web Service Consumer. The connector configuration is shown in Figure-9 (Web Service Consumer properties). The Operation field is populated after the connector is configured. If more than one operation is available, the drop-down lets us choose the operation we are interested in; if only one operation is available on the service we want to invoke, it is selected by default.
UsernameToken authentication:
Figure-39: Response
Once you invoke the service, the response looks similar to the one shown in the figure.
Consuming a service enabled with UsernameToken using the Web Service Consumer:
Java Component
Invoke Component
Java Transformer
Below is the main flow, which exposes an HTTP service and references multiple sub-flows one after another to cover all of the above concepts.
Java Component:
The Java component is used to reference a class that contains complex code.
Example:
Below is a sub-flow named 'simple-java-component' which has a Set Payload and a Java component.
The Java component references a custom class (UsingCallable) which implements the Callable interface. This class prints the current payload, the number of inbound properties, and the number of invocation properties.
import org.mule.api.MuleEventContext;
import org.mule.api.MuleMessage;
import org.mule.api.lifecycle.Callable;

public class UsingCallable implements Callable {
    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        MuleMessage message = eventContext.getMessage();
        System.out.println("Payload: " + message.getPayloadAsString());
        System.out.println("No of Inbound Properties: " + message.getInboundPropertyNames().size());
        System.out.println("No of Variables: " + message.getInvocationPropertyNames().size());
        return null;
    }
}
Click the "Advanced" tab, create the following three properties using "+" as shown below, and click Finish.
name
dept
location
The same properties, with the same names, need to be created along with setters and getters in the UsingSingletonObject class, so that the property values specified in the Java component are assigned to the Java class properties. Below is the code to create a map object with these three properties.
import java.util.HashMap;
import java.util.Map;
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

// Note: the class body was split across pages in the original; the fields
// and getters are reconstructed here, with the setters omitted for brevity.
public class UsingSingletonObject implements Callable {
    private String name;
    private String dept;
    private String location;

    public String getName() { return name; }
    public String getDept() { return dept; }
    public String getLocation() { return location; }
    // matching setters (setName, setDept, setLocation) omitted for brevity

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        Map<String, Object> employee = new HashMap<String, Object>();
        employee.put("name", getName());
        employee.put("department", getDept());
        employee.put("location", getLocation());
        return employee;
    }
}
Invoke component:
The Invoke component is used to invoke a method of a given object (bean). The flow below has 3 Invoke components that refer to 3 different methods of a bean.
// Note: the class and method signatures were lost in extraction; they are
// reconstructed below. The class name is illustrative; the method names
// match those referenced later in the text (add, substract, multiply).
public class MathOperationsBean {
    public int add(int a, int b) {
        System.out.print("Addition: ");
        System.out.println(a + b);
        return a + b;
    }

    public int substract(int a, int b) {
        System.out.print("Substraction: ");
        System.out.println(a - b);
        return a - b;
    }

    public int multiply(int a, int b) {
        System.out.print("Multiply: ");
        System.out.println(a * b);
        return a * b;
    }
}
A bean needs to be created in Global Elements to use the Invoke component. Create a bean that refers to the custom Java class: in the "Global Elements" tab, click the "Create" button. Click the "..." button next to the "Class" field to select the custom Java class. Provide meaningful names in the "ID" and "Name" fields and click the OK button.
Drag an Invoke component and double-click it to bring up its properties. Fill in the required fields as shown below.
Name
Object Ref
Method
Method Arguments
In the same way, two more Invoke components are created for the other two methods (substract and multiply).
The sub-flow below uses Java components to implement the Reflection Entry Point Resolver.
Java class:
The Java class EntryPointResolver below has three methods with different argument types.
public class EntryPointResolver {
No-arguments method:
Drag a Set Payload component and set its value to the string "#[null]" so that the payload becomes null. Drag a Java component and reference the EntryPointResolver class as shown earlier.
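The Reflection Entry Point Resolver picks the method whose argument type matches the incoming payload. A rough, self-contained Java sketch of that matching follows; the EchoService class and dispatch helper are hypothetical illustrations, not Mule API:

```java
import java.lang.reflect.Method;

// Rough illustration of how a Reflection Entry Point Resolver chooses a
// method: it matches the payload's type against the method argument types.
public class ReflectionResolverSketch {
    public static class EchoService {
        public String handle(String s) { return "string:" + s; }
        public String handle(Integer i) { return "int:" + i; }
    }

    public static Object dispatch(Object service, Object payload) {
        try {
            for (Method m : service.getClass().getDeclaredMethods()) {
                Class<?>[] params = m.getParameterTypes();
                if (params.length == 1 && params[0].isInstance(payload)) {
                    return m.invoke(service, payload); // first matching method wins
                }
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        throw new IllegalArgumentException("no method accepts " + payload.getClass());
    }
}
```

Passing a String payload resolves to handle(String), while an Integer payload resolves to handle(Integer), mirroring how the resolver selects among EntryPointResolver's three methods.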
Java class:
import java.util.Map;
import org.mule.api.annotations.param.OutboundHeaders;
import org.mule.api.annotations.param.Payload;
In the same way, all outbound properties are matched to the argument 'dept', which is of type java.util.Map:
@OutboundHeaders Map<String, Object> dept
Java class:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
// Note: the method signature and connection setup were lost at a page
// break; they are reconstructed below in the standard HttpURLConnection
// style, and the method name is illustrative.
public String invokeService(String serviceUrl) {
    String output = "";
    try {
        URL url = new URL(serviceUrl);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        if (conn.getResponseCode() != 200) {
            throw new RuntimeException("Failed : HTTP error code : "
                    + conn.getResponseCode());
        }
        BufferedReader br = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        String line;
        while ((line = br.readLine()) != null) {
            output += line;
        }
        conn.disconnect();
    } catch (MalformedURLException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return output;
}
Example:
Consider a message from a source system that contains a zip code, while the target system needs the two-letter state. A message enricher can be used to look up the state for the zip (postal) code from an enrichment resource. The enricher calls out to the enrichment resource with the current message (containing the zip code) and then enriches the current message with the result.
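The core of the pattern can be sketched in plain Java: look up the state and attach it as a variable while leaving the payload untouched. The zip-to-state table, class and method names below are invented sample data, not part of Mule:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the enrichment pattern: look up the state for a zip code and
// attach it as a flow variable without replacing the original payload.
public class EnricherSketch {
    private static final Map<String, String> ZIP_TO_STATE = new HashMap<String, String>();
    static {
        ZIP_TO_STATE.put("10001", "NY"); // sample data only
        ZIP_TO_STATE.put("94105", "CA");
    }

    /** Returns the flow variables after enrichment; the payload is untouched. */
    public static Map<String, Object> enrich(String zip, Map<String, Object> flowVars) {
        String state = ZIP_TO_STATE.get(zip); // the call-out to the enrichment resource
        flowVars.put("state", state);         // the target: a new flow variable
        return flowVars;
    }
}
```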
This is a very simple flow with one-way inbound and outbound endpoints that acts as part of an order-processing pipeline. The flow uses an enricher to add a state flow variable to the current message, with the state that the flow ref returns. The target attribute defines how the current message is enriched, using a MessageEnricher that uses the same syntax as expression evaluators.
Description:
1. The HTTP endpoint receives an XML input payload with H-No, street, city and zip elements.
2. In the message enricher we extract the zip from the payload and forward it to a sub-flow to retrieve the state for that particular zip.
3. The flow reference in the processor chain of the enricher receives the state as the payload, which the enricher assigns to a new target flow variable named state.
4. The payload sent out of the enricher is the same as the input payload, and the new state variable is added to the XML using DataMapper.
Output:
In this particular example the Get State endpoint receives the full message, but we only need part of the payload. We specify that part in the Source section of the Message Enricher, and the result is saved in the Target section.
The enrichment resource can be any message processor, outbound connector, processor chain or flow-ref. If using an outbound connector, it should of course have a request-response exchange pattern.
Expressions
Mule Expression Component:
The Expression Transformer executes one or more expressions on the current message.
The result of these expressions becomes the payload of the current message.
For each return argument, you enter or select its expression evaluator from the pull-down list, then enter the expression to use. If you set Evaluator to custom, you also need to specify the custom evaluator; a custom expression evaluator must first be registered with the Expression Evaluator Manager. Expression syntax varies depending on the evaluator.
When you have multiple expressions for return arguments, by default expression evaluation returns an error and stops when an expression evaluates to null. Check the Optional box if you want evaluation to continue to the next expression when an expression evaluates to null.
The evaluators should be one of this enumeration: [attachment, attachments, attachments-list, bean, endpoint, exception-type, function, groovy, header, headers, headers-list, json, json-node, jxpath, map-payload, message, mule, ognl, payload, process, regex, string, variable, xpath, xpath2, xpath-node, custom].
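The Optional behaviour described above can be sketched in plain Java. The class and method names below are ours, purely to illustrate the stop-on-null rule; Mule's actual evaluation is internal to the Expression Transformer:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Sketch: evaluate return arguments in order. A null result stops
// evaluation with an error unless that argument is marked optional,
// in which case it is skipped and evaluation continues.
public class ReturnArgumentSketch {
    public static List<Object> evaluate(List<Supplier<Object>> expressions,
                                        List<Boolean> optional) {
        List<Object> results = new ArrayList<Object>();
        for (int i = 0; i < expressions.size(); i++) {
            Object value = expressions.get(i).get();
            if (value == null) {
                if (optional.get(i)) {
                    continue; // optional: skip and move on to the next expression
                }
                throw new IllegalStateException("expression " + i + " evaluated to null");
            }
            results.add(value);
        }
        return results;
    }
}
```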
Example Flow:
Description:
1. Use an HTTP connector to trigger the flow.
2. An Expression component sets the Employee Details as outbound properties.
3. A Message Properties component sets a session variable.
4. Pass all the properties to another flow using an HTTP outbound endpoint, and add the session properties to the header, as the session expires after every flow.
5. The data received will be a byte-array stream, so use an Object to String transformer.
6. Check the attached session variable using the "#[message]" MEL expression in a Logger component.
7. Get all the details from the inbound properties and use a map object to set the payload. In a similar fashion, a List can also be used.
8. Transform the payload type from Object to String.
9. Evaluate whether the payload is of type String using an Expression Filter. If the payload is of type String, the flow execution continues.
10. Use a Choice router to check for specific text in the payload and print the server IP using the Mule Expression transformer.
11. Refer to ExpressionExample.zip for the example flow and the SOAP UI test XML.
Properties
A properties file is a simple collection of key-value pairs that can be parsed by the java.util.Properties class. Properties files are often used to store configuration or localization data. In Mule, properties can be configured using property placeholders and system properties.
Property Placeholders:
Property placeholders allow you to load parameters from a properties file. This enables you, for example, to have different property files for different environments (Dev, QA, and Prod), or to reuse the same value in different parts of your configuration.
A very simple example shows how to use the property placeholders.
The values for these placeholders can be made available in a variety of ways, as described in the
sections below.
Global Properties:
You can use the <global-property> element to set a placeholder value from within your Mule
configuration, such as from within another Mule configuration file:
Properties Files:
To load the properties from a file, you can use the standard Spring element
<context:property-placeholder>.
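Conceptually, the placeholder mechanism substitutes ${key} tokens with values loaded from the properties file. A minimal sketch of that substitution, with a hypothetical resolve helper (the real Spring/Mule resolver also supports defaults and nested placeholders):

```java
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Simplified model of property placeholder resolution: each ${key} in a
// configuration value is replaced with the matching properties entry.
public class PlaceholderSketch {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)\\}");

    public static String resolve(String text, Properties props) {
        Matcher m = PLACEHOLDER.matcher(text);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            // unknown keys are left as-is rather than failing
            String value = props.getProperty(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

With a properties file containing smtp.host=mail.example.com, the configuration value "host=${smtp.host}" resolves to "host=mail.example.com".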
System Properties:
The placeholder value can come from a JDK system property. If you start Mule from the
command line, you would specify the properties as follows:
Environment Variables:
There is no standard way in Java to access environment variables, but they can be set in the Run Configurations window, on the Environment tab. Add an environment variable by pressing the button:
Mule-app.properties:
The property can be configured in mule-project.xml as below:
Example:
The example above tries to display the property name, which is a common property defined in various sources; the observation is as below:
Observation:
The property in mule-app.properties has the highest priority, followed by global properties, then the runtime arguments, then environment variables, and finally the property files in alphabetical order.
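That precedence can be modelled as an ordered lookup in which the first source defining the key wins. The class below is purely illustrative (our own names, not Mule API):

```java
import java.util.List;
import java.util.Map;

// Models the observed precedence: sources are consulted in priority order
// (mule-app.properties, global properties, runtime arguments, environment
// variables, property files) and the first one defining the key wins.
public class PropertyPrecedenceSketch {
    public static String lookup(String key, List<Map<String, String>> sourcesInPriorityOrder) {
        for (Map<String, String> source : sourcesInPriorityOrder) {
            if (source.containsKey(key)) {
                return source.get(key); // first hit wins; later sources are shadowed
            }
        }
        return null; // not defined anywhere
    }
}
```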
REST
Creating a REST Service using REST Component
Use this component to publish a RESTful web service. The REST component publishes a RESTful web service via JAX-RS annotations using Jersey; Mule hosts RESTful web services with Jersey, which is a JAX-RS implementation. JAX-RS is a specification that provides a series of annotations and classes that make it possible to build RESTful services.
Figure-23 shows the REST service flow created using the REST component.
Figure-24 shows the REST component configuration. Component is the required element: a Java class with JAX-RS annotations.
Figure-25 shows the Java class annotated with the JAX-RS annotations @Path, @GET, @Produces and @Consumes.
@Produces defines the media type(s) that the methods of a resource class can produce.
@Consumes defines the media types that the methods of a resource class can accept.
Figure-26 shows the configuration XML for the flow shown in Figure-23.
Figure-27 shows the request and response for the REST service created when accessed using SOAPUI.
Listener.
The second component is the CXF component. It is optional if we do not want to expose a WSDL or do not want to access the service in SOAP style.
Figure-30 shows the Choice block, which routes to a particular flow based on the result of the condition under test. In this example, we'll use SOAPAction to identify a particular operation from the service we have published; the Choice router routes to the corresponding flow based on the incoming SOAPAction.
Figure-31 shows the getuser flow from Figure-28 (highlighted in red). Three variables and one property are set in the flow shown below.
Set UserId sets the value of userId from the request into a variable:
#[xpath3('//user:userDeailsRequest/userId')]
Set Path sets the URI we want to invoke. This is the same as defined in @Path(uri). For example, if there are 2 resources (user, users) published on the same URL (http://localhost:8088), we can access user using http://localhost:8088/user and users using http://localhost:8088/users. The path variable we set here holds the path, i.e. user, if we want to access user.
Set Operation sets the HTTP method with which we want to invoke the service. The service should support this operation; in our case we invoke the GET method (as shown in Figure-25).
Set Content Type sets the Content-Type property accepted by the method we are invoking. In our example, the getUserDetails method accepts either XML or JSON, so if we send a content type, it should be one of them.
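Putting the pieces together, the dynamically set path, method and query parameter amount to building a request URL like the one below. The buildUrl helper is hypothetical; the HTTP Request connector assembles this internally:

```java
// Sketch of how the flow's dynamic values combine into the outgoing
// request: base URL + @Path resource + @QueryParam userid.
public class RestRequestSketch {
    public static String buildUrl(String baseUrl, String path, String userId) {
        return baseUrl + "/" + path + "?userid=" + userId;
    }
}
```

For the path variable "user" and a userid of 101, the request goes to http://localhost:8088/user?userid=101 with the GET method.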
Figure-32 shows the REST service invocation using HTTP Request; values for a few attributes are set dynamically, as shown in Figure-30.
Figure-33 shows the HTTP Request configuration. The connector configuration for this is similar to the one shown in HTTP Request.
Values for Path and Method are set dynamically in the flow, as shown in Figure-31. As shown in Figure-25, the getUserDetails method expects a QueryParam, i.e. userid. Using HTTP Request we can provide it with the query-param option shown in the figure below. The Content-Type header can be sent using the header option, also shown below.
Figure-34 shows the configuration XML for the REST service consumer flow.
Figure-35 shows the request and response of the REST service consumer.
Transactions
A transaction is an operation that must succeed or fail as a complete unit; it can never be only partially complete. Mule applies transactions to a series of steps in a flow that must succeed or fail as one unit. We can apply a transaction to a connector to enable transactions. If a flow begins with a transaction-supporting connector, Mule can start a new transaction and manage the entire flow as a transaction. If we use a transactional outbound connector, Mule manages that outgoing operation as part of the transaction as well. With both a transactional inbound and outbound connector, Mule executes the outgoing operation as part of the transaction initiated by the inbound connector.
The following connectors in Mule support transactions:
1. JMS
2. VM
3. Database Connector
A Mule flow may begin with a non-transactional inbound connector such as HTTP or SFTP. In such situations, we can use Mule's Transactional scope to group the processors into one transactional unit, so that they all succeed or fail together. If a flow begins with one of the connectors that support transactions, the entire flow is considered transactional, including transactional outbound connectors.
Mule supports three different types of transactions: Single Resource, Multiple Resource, and XA. In Mule, transactions can be configured either by applying a transaction to a transaction-supporting endpoint or by wrapping message processors in Mule's Transactional scope.
Each of these transactions has an action attribute that needs to be specified to work with transactions.
These actions include ALWAYS_BEGIN, ALWAYS_JOIN, BEGIN_OR_JOIN, JOIN_IF_POSSIBLE, NONE,
NOT_SUPPORTED.
ALWAYS_BEGIN - begins a new transaction for every request.
ALWAYS_JOIN - always joins an ongoing transaction; throws an error if no transaction is in progress.
BEGIN_OR_JOIN - joins an ongoing transaction if one exists, otherwise begins a new one.
JOIN_IF_POSSIBLE - joins an ongoing transaction if one exists.
NONE - operates non-transactionally.
NOT_SUPPORTED - executes outside any transaction.
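These action semantics can be sketched with a small decision function. The enum and resolve helper below are our own illustration, not Mule API; the function returns whether a request would begin a transaction, join one, or run outside any:

```java
// Sketch of the transaction action semantics described above.
// "inTransaction" says whether a transaction is already in progress
// when the endpoint is reached.
public class TransactionActionSketch {
    public enum Action { ALWAYS_BEGIN, ALWAYS_JOIN, BEGIN_OR_JOIN, JOIN_IF_POSSIBLE, NONE, NOT_SUPPORTED }

    /** Returns "BEGIN", "JOIN" or "NONE"; throws if the action cannot be honoured. */
    public static String resolve(Action action, boolean inTransaction) {
        switch (action) {
            case ALWAYS_BEGIN:
                return "BEGIN";
            case ALWAYS_JOIN:
                if (!inTransaction) throw new IllegalStateException("no transaction in progress");
                return "JOIN";
            case BEGIN_OR_JOIN:
                return inTransaction ? "JOIN" : "BEGIN";
            case JOIN_IF_POSSIBLE:
                return inTransaction ? "JOIN" : "NONE";
            case NONE:             // non-transactional
            case NOT_SUPPORTED:    // executes outside any transaction
            default:
                return "NONE";
        }
    }
}
```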
We can configure an exception strategy on the transactional scope. With this scope-specific error handling we can manage transactional exceptions. If we have a flow-level exception strategy, the transactional exception strategy is optional, as the flow-level strategy can handle all exceptions thrown while executing the flow. If no exception strategy is configured, Mule uses the default exception strategy.
Figure-87 shows the transactional scope in mule.
Figure-88 shows a flow configuration for a transaction. In this example configuration, we'll see how the transactional block helps maintain database state. To demonstrate how the transactional block works, we take a shopping-cart example: we receive a request with details of which items have been added to the cart, the quantity of each item, the total price for all items, and the account number and account holder name.
In our example flow, once we receive the billing request, we check:
1. whether enough items are available;
2. whether enough funds are available in the account.
If both conditions are met, we update the database tables according to the request we received.
If either condition is not met, the database tables are not updated and a corresponding error message is sent back to the user (or invoking service) stating the reason for failure.
Figure-89 shows part of the transactional flow configuration shown in Figure-88. In this flow, we retrieve the details required for processing the request, such as the user id, account number and billing amount. This is done in a sub-flow.
Figure-89: Parsing request and fetching the required data for processing
The FetchItems expression component (highlighted in red) fetches the item details (item id, requested quantity) from the request and creates a collection. The created collection is given as the input payload to the next processor in the flow (the For-each inside the transactional block); For-each accepts a collection and iterates over its elements.
Figure-90 shows the sub-flow to process the request and fetch the userid, account number and billing
amount.
Figure-91 shows the For-each scope (highlighted in red). Inside the For-each scope, we have a Database connector whose configuration is similar to the one shown in the Database section. Using this database connector, we call a stored procedure to check the quantity of items and update the table according to the quantity in the payload.
Figure-92 shows the Database configuration for the one highlighted in Figure-91.
Figure-93 shows the update_shopping_items stored procedure, which is called from the database connector.
Figure-94 shows the flow to verify the user's account details. It calls a sub-flow to set the properties required to process the account information.
Figure-95 shows the properties set in the sub-flow (highlighted in green in Figure-94).
Figure-96 shows the Account Details flow called over VM (highlighted in red in Figure-94). In the flow shown below, the account details are verified and updated.
Figure-98 shows the update_account stored procedure to verify and update the account details.
Figure-99 shows the database connector that updates the transaction reference number and the status of the transaction. The Choice block at the beginning of the flow routes to one of the flows based on the message received from the Account Details flow shown in Figure-96.
If we receive a success response from the Account Details flow, we update the transaction status and transaction reference in the userinfo table. If the response is a failure, we simply return the error message received from the Account Details flow.
Figure-101 shows sample request and response for the transaction flow.
Cache scope
The Cache scope is used to store responses for frequently requested data, saving time and processing load. We can configure a caching strategy to store the responses, and the Cache scope can contain any message processors to process the request. The cached response contains the payload produced by the processing that occurs within the scope. The caching strategy tells Mule how to store the data; if we do not specify one, Mule uses the default caching strategy.
When a Mule message reaches the Cache scope, the scope processes the message, sends the output to the next processor, and saves the output. The next time Mule sends the same kind of message into the Cache scope, the scope returns the cached response rather than processing the message again. If the Cache scope finds a match for the incoming request, it is a hit; if not, it is a miss. On a hit, the processors in the cache block are not executed and the cached response is sent as output. On a miss, the message processors in the cache block execute, the response is sent as output to the next processing element in the flow, and the response is cached.
By default, Mule stores all cached responses in an InMemoryObjectStore. If we want to provide our own custom store, we can do so using the custom-object-store option. There are 4 ways Mule can store cached responses:
1. In-memory object store
2. Custom object store
3. Managed store
4. Simple text file store
We can provide some options regarding cache updates while configuring the object store. Below are some of the attributes we can include in the object store configuration.
maxEntries - the maximum number of entries the object store can cache; if this limit is exceeded, the first (oldest) cached entries are trimmed.
entryTTL - the number of milliseconds a cached response lives before it is trimmed.
expirationInterval - the frequency with which the object store checks for cached responses it should trim.
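A minimal in-memory sketch shows how maxEntries and entryTTL interact. This is illustrative only (our own class, not Mule's InMemoryObjectStore), and it trims expired entries lazily on read rather than on a timer:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of an object store honouring maxEntries and entryTTL.
public class CacheStoreSketch {
    private final int maxEntries;
    private final long entryTtlMillis;
    // insertion-ordered, so the first key is the oldest entry
    private final Map<String, Entry> store = new LinkedHashMap<String, Entry>();

    private static class Entry {
        final Object value;
        final long storedAt;
        Entry(Object value, long storedAt) { this.value = value; this.storedAt = storedAt; }
    }

    public CacheStoreSketch(int maxEntries, long entryTtlMillis) {
        this.maxEntries = maxEntries;
        this.entryTtlMillis = entryTtlMillis;
    }

    public void put(String key, Object value) {
        if (store.size() >= maxEntries) {
            // limit exceeded: trim the first (oldest) cached entry
            store.remove(store.keySet().iterator().next());
        }
        store.put(key, new Entry(value, System.currentTimeMillis()));
    }

    /** A hit returns the cached value; an absent or expired entry is a miss (null). */
    public Object get(String key) {
        Entry e = store.get(key);
        if (e == null) return null;
        if (System.currentTimeMillis() - e.storedAt > entryTtlMillis) {
            store.remove(key); // lived past its TTL: trim it
            return null;
        }
        return e.value;
    }
}
```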
Figure-104 shows the Cache scope configuration. We can use the default caching strategy, or create a new caching strategy using the options provided; click + (highlighted in red) to create a new strategy reference. Using the filter configuration (highlighted in green), we can filter the incoming messages by providing an expression, so that only messages satisfying the filter expression are cached. A message that does not satisfy the filter expression is still processed by the message processors inside the cache block, but the cache block never stores its response.
Figure-105 shows the caching strategy configuration referenced in Figure-104 (highlighted in red). We can provide a key under which to store the response: we can use the default key, or generate a key using the Key Expression or Key Generator options. In this example, we used a Key Expression; once the expression is evaluated, the result is used as the key to store the response.
Figure-108 shows the console output for caching. In the example shown, the time to clear the object store is set to 3 seconds. In the console output below we can see the service was invoked 3 times; the second time the service was invoked, Mule returned the cached response (highlighted in red) instead of fetching from the database.
Figure-110 shows sample request and response for the custom cache.
Figure-111 shows sample request and response for the custom cache flow.
Batch Processing
The Batch component is used to process large messages in batches. A batch job has 3 phases:
1. Input
2. Process Records
3. On Complete
Input
The Input phase prepares a collection object from the input message, because the Process Records phase expects a collection object.
Process Records
The Process Records phase expects a collection object and processes each record of the collection individually and in parallel. Each object in the collection is a record.
On Complete
The On Complete phase summarizes the job. The following variables are available in the On Complete phase to get the status of the job.
Example
The following example explains how to transform CSV to XML using batch. The example exposes an HTTP REST service.
In the main flow, the input CSV file path is set as the payload, and the flow then references a batch job.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.UnsupportedEncodingException;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
In the Process Records phase, we have two batch steps: one transforms the payload from CSV to XML using DataMapper, and the other writes the XML data to a file. The second batch step contains a Batch Commit; the message processors inside the Batch Commit scope execute depending on the size of the batch commit.
<batch:commit size="5" doc:name="Batch Commit">
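The effect of size="5" can be sketched as chunking the record stream so that the processors inside the commit scope run once per chunk. The helper class below is illustrative, not Mule's batch engine:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of batch-commit grouping: records accumulate until the commit
// size is reached (the last chunk may be smaller), and the commit-scope
// processors run once per chunk.
public class BatchCommitSketch {
    public static <T> List<List<T>> chunk(List<T> records, int commitSize) {
        List<List<T>> chunks = new ArrayList<List<T>>();
        for (int i = 0; i < records.size(); i += commitSize) {
            chunks.add(new ArrayList<T>(
                    records.subList(i, Math.min(i + commitSize, records.size()))));
        }
        return chunks;
    }
}
```

With 7 records and a commit size of 5, the commit scope runs twice: once for the first 5 records and once for the remaining 2.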