
Creating an HTTP Outbound Logger
Using Mule’s Server Notifications

Marcos Villalba

INITIAL REQUIREMENTS

The task at hand required us to create some kind of listener/interceptor that would be triggered whenever we were about to hit an HTTP outbound component. This logger was supposed to record the URL and payload of the outgoing HTTP call.

Other requirements were that it be reusable, easy to add to any project, and possible to enable or disable via configuration files. All of this combined would make for quite a useful tool for testing and troubleshooting.

FIRST APPROACH

We started with an out-of-the-box interceptor – this approach is quite useful for attaching behavior to Mule components, and it is covered in the official Mule documentation on interceptors.

By simply adding an interceptor component to our flow and pointing it at a Java class that implements Mule's Interceptor interface (for example via one of the abstract interceptor base classes), the flow will be intercepted.

<flow name="interceptorFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/input" doc:name="HTTP"/>
    <set-payload value="#['sample payload']" doc:name="Set Payload"/>
    <custom-interceptor class="com.ms3.logging.interceptor.LoggingInterceptor"/>
    <http:request config-ref="HTTP_Request_Configuration" path="destination_path" method="GET" doc:name="HTTP"/>
</flow>

With an implementation class:

public class LoggingInterceptor extends AbstractInterceptingMessageProcessor implements Interceptor {

    @Override
    public MuleEvent process(MuleEvent event) throws MuleException {
        // flow intercepted!

        // let's do something here!

        // then hand the event on so the rest of the flow still runs
        return processNext(event);
    }
}

At this point, we are free to recover the message context, flow variables, payload, etc., and process them however we see fit. However, this approach was not successful in providing the URL and payload content we needed.

So why was this approach unsuccessful? Our objective was to intercept the flow and access the HTTP outbound transport parameters (host, port, path, etc.). These values belong to the HTTP component, not to the flow or the message, so an alternative approach was needed.

In conclusion: we need to go deeper and listen to the HTTP component directly!


MULE SERVER NOTIFICATIONS

Mule provides an internal notification mechanism that you can use to access changes that occur on the Mule Server, such as a flow component being added, a Mule Model being initialized, or Mule being started. You can set up your agents or flow components to react to these notifications.

Message notifications provide a snapshot of all information sent into and out of the Mule server, and are fired whenever a message is received or sent. The full details are in the official Mule documentation on notifications.

In other words, our developer friends at MuleSoft fire thousands of notifications as different events happen deep inside the code. We can catch these notifications and use them to our advantage to gain insight into what is happening within our applications.

THE ArganoMS3 IMPLEMENTATION

In our particular case, we want to listen for and catch the notification fired by the HTTP component (DefaultHttpRequester) just before the actual network call. At that moment, we can ask the component, “hey, what URL, port, and path are you about to hit?”, and it will provide the information we need to record. So let's get to work and show you how it is done.

All of these notifications are disabled by default because, if left on, they would have a significant performance impact. So we need to enable them and then inject a Java class that will handle the notifications. Here are the Mule configuration elements that enable the events and register the receiving class.

<spring:beans>
    <spring:bean name="messageProcessorListener" class="com.ms3.httpListener.HttpNotificationListener"/>
</spring:beans>

<notifications>
    <notification-listener ref="messageProcessorListener"/>
</notifications>
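
One hedged note from our experience: because these notifications are off by default, some Mule 3.x versions also require enabling the message processor notification type itself inside the same element, along the lines of:

<notifications>
    <!-- Dynamically enable message processor notifications, which are off by default -->
    <notification event="MESSAGE-PROCESSOR"/>
    <notification-listener ref="messageProcessorListener"/>
</notifications>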

Our class HttpNotificationListener will be listening for notifications fired from deep within the core Mule code, but which notifications will we get? There can be hundreds fired per second, so we need a way of selecting the ones we are looking for and filtering the rest of the noise out of the class.

For this, we need to choose which interface our class will implement. MuleSoft gives developers a list of all the possible NotificationListener interfaces, each of which is triggered under a particular situation, change, or scenario; the complete list is in the official documentation.

Some dev notes here:

  • We selected MessageProcessorNotificationListener, which is triggered whenever a message processor is invoked. Since this is quite generic, we narrow it down by checking for the DefaultHttpRequester processor.
  • All classes implementing a NotificationListener interface must implement the onNotification() method.
  • The input parameter of this method will depend on the NotificationListener used.
  • Once the method is triggered, we can access all the parameters of the current message processor. In our case, the desired DefaultHttpRequester.

public class HttpNotificationListener implements MessageProcessorNotificationListener<MessageProcessorNotification> {

    @Override
    public void onNotification(MessageProcessorNotification notification) {
        MessageProcessor messageProcessor = notification.getProcessor();

        if (messageProcessor instanceof DefaultHttpRequester) {
            DefaultHttpRequester httpRequester = (DefaultHttpRequester) messageProcessor;

            String msgId = notification.getSource().getMessage().getMessageRootId();

            String outboundUrl = httpRequester.getHost() + "_" +
                                 httpRequester.getPort() + "_" +
                                 httpRequester.getPath() + "_" +
                                 httpRequester.getMethod();

            // log all this info somewhere!
        }
    }
}
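
The original requirement also asked for the payload, not just the URL. As a minimal sketch (the helper class, its SLF4J logger, and the use of MuleMessage.getPayloadAsString() are our additions, not part of the listener above), the body of the instanceof block could hand the details off like this:

import org.mule.api.MuleMessage;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class OutboundHttpLogger {

    private static final Logger LOGGER = LoggerFactory.getLogger(OutboundHttpLogger.class);

    /** Logs the outbound URL together with the current payload, swallowing read errors. */
    public void log(String msgId, String outboundUrl, MuleMessage message) {
        try {
            // getPayloadAsString() may consume a streaming payload; acceptable for troubleshooting.
            String payload = message.getPayloadAsString();
            LOGGER.info("Outbound HTTP call [{}] -> {} with payload: {}", msgId, outboundUrl, payload);
        } catch (Exception e) {
            LOGGER.warn("Could not read payload for message [{}]", msgId, e);
        }
    }
}

From inside the instanceof block this could be invoked as, for example, new OutboundHttpLogger().log(msgId, outboundUrl, notification.getSource().getMessage());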

CONCLUSION

Mule server notifications are an extremely flexible and powerful Mule capability. Since there is not much documentation out there, you will need to dive into Mule's source code to see which notification is fired in the part of the code you are interested in interacting with.

Don't be shy!
Contact ArganoMS3 today to see how we can help you with your MuleSoft project: contact@ms3-inc.com

 


Filed Under: Integration, Mulesoft Tagged With: Anypoint, Debug, HTTP, HTTPS, Interceptor, MS3, Mulesoft, Outbound HTTP, Testing

Mule Credentials Vault

Robert Whitmer

REVIEW

There is an age-old question that people often ask: what is the single most valuable thing to society? Some of the top responses may be things like gold, silver, and money. Personally, I believe that in the 21st century the correct answer is information. Just as the gold in Fort Knox is protected and secured, we must do the same with our sensitive data to keep it from falling into the wrong hands. The next question becomes: how do we protect and store our sensitive data? The answer is encryption, and Mule offers us a great way of encrypting and storing our data securely with the Mule Credentials Vault. The Mule Credentials Vault is very flexible and offers 19 different encryption algorithms by default!

REQUIREMENTS

To correctly secure a .properties file using Mule, the following requirements must be met:

  • A Mule Credentials Vault (an encrypted .properties file)
  • A Secure Property Placeholder global element
  • An encryption key for opening the vault

HOW IT WORKS

When using the Mule Credentials Vault, sensitive data such as usernames and passwords is stored as key-value pairs inside a .properties file. Once a .properties file is encrypted, the file is referred to as a Credentials Vault. Mule then uses a Secure Property Placeholder global element to point to the Credentials Vault, decrypt the stored data, and return it, but only if it has the correct key to the vault. This can be thought of like using a key to open the front door of your house. The key to a Mule Credentials Vault is held only in runtime memory; it is never written to disk. The key is gathered when the Mule application starts, kept in memory for the lifecycle of the application, and cleared from memory once the application's session ends.
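
As a concrete illustration (the property names and the encrypted strings below are made up), an encrypted vault file might look like this:

# credentials.properties after encryption – encrypted values are wrapped in ![...]
db.username=![nHWo5JhNAYM+TzxqeHdRDXx15Q==]
db.password=![Ncb2Hv+0vMMWX1p+5NpDuQ==]

Elsewhere in the application these entries are referenced exactly like ordinary placeholders, for example ${db.password} in a connector configuration; decryption happens transparently once the Secure Property Placeholder described below is in place.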

WALKTHROUGH

Before we can begin to use the Mule Credentials Vault, we first must install Anypoint Enterprise Security for Anypoint Studio. Start by opening your project in Anypoint Studio.


  • From the file menu bar select:
    • Help > Install New Software…


  • Install Anypoint Enterprise Security for Anypoint Studio
    • The latest version and update site can be found at https://docs.mulesoft.com/release-notes/anypoint-enterprise-security-release-notes


  • Anypoint Enterprise Security for Anypoint Studio will then begin to install.


  • After the Anypoint Enterprise Security for Anypoint Studio installation completes, restart Anypoint Studio


  • Next we must create a .properties file that will later become our Credentials Vault by:
    • (R-Click) src/main/resources > New > File from the Package Explorer


  • Open the newly created .properties file with the Mule Properties Editor
    • (R-Click) *.properties file* > Open With > Other from the Project Explorer
      • Select Mule Properties Editor
  • After opening the Mule Properties Editor:
    • (L-Click) Add Properties Button
    • Input field values:
      • Key
      • Value
    • (L-Click) Encrypt
      • Select your Encryption Algorithm & Key
  • Your value will then be encrypted

  • Next we must create our Secure Property Placeholder Global Element


  • The Secure Property Placeholder global element supplies the vault key and points Mule at the encrypted .properties file; a minimal configuration is sketched below
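
A minimal sketch of that global element, assuming a vault file named credentials.properties, the Blowfish algorithm chosen in the Mule Properties Editor, and the secure-property-placeholder namespace from Anypoint Enterprise Security declared in the configuration file:

<secure-property-placeholder:config name="Secure_Property_Placeholder"
    key="${prod.key}"
    location="credentials.properties"
    encryptionAlgorithm="Blowfish"/>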

FORCE MULE RUNTIME KEY

Once you are ready to move your application into production, configure Mule to demand the vault key at runtime by passing it in as a system property when the application starts (for example on the command line or in the Studio run configuration):

  • -M-Dprod.key=uniquepassword -M-Denv=prod


  • For development purposes the same key and environment properties can instead be defined in the src/main/app/mule-app.properties file, as sketched below
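
A hypothetical mule-app.properties for local development, reusing the prod.key and env names from the runtime arguments above (never ship a real key this way to production):

# src/main/app/mule-app.properties – development only
prod.key=uniquepassword
env=prod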

RELATIONSHIP STRUCTURE

As previously stated, there are three requirements that must be met to properly secure our sensitive data. These key ingredients can be structured in many ways depending on the use case of your application, but typically an application will use one of three. The structure can be thought of in much the same way as a MySQL database table relationship. The relationships are as follows (the last variant is sketched after the list):

  • ONE to ONE to ONE Relationship


  • ONE to ONE to MANY Relationship


  • (MANY) ONE to ONE to ONE Relationship

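For example, the last variant, several independent placeholder/vault/key triples, could be sketched as separate global elements, one per vault (the names and file names below are hypothetical):

<secure-property-placeholder:config name="HR_Vault"
    key="${hr.key}" location="hr-credentials.properties"/>

<secure-property-placeholder:config name="Finance_Vault"
    key="${finance.key}" location="finance-credentials.properties"/>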

Filed Under: Integration, Mulesoft

AP Studio Documentation Generation

Mark J. Norton

**This blog post has been updated. An updated version of the AP Studio Documentation post is available on the ArganoMS3 blog.**

Documentation is something most programmers would willingly avoid given even the smallest chance.  Most see it as a distraction from what they do, which is write code.  If asked directly, however, programmers will usually admit the importance of documentation for the client, for team members, for those who follow, and even (whisper it) for themselves.  With a very little bit of extra effort, MuleSoft developers can take advantage of the built-in documentation generation in Anypoint Studio.

The concept of documentation embedded in code has been around for quite some time.  Since MuleSoft uses Java extensively in developing their own products, most MuleSoft developers are familiar with JavaDoc, a documentation generation utility introduced in 1995 (https://en.wikipedia.org/wiki/Javadoc).  JavaDoc generates high quality documentation for Java interfaces, classes, and objects formatted as a set of HTML pages.

MuleSoft Anypoint Studio provides documentation generation in a similar manner.  Once code is complete, select the AP Studio project and use the file menu to select Export Studio Documentation.


You will be prompted for a location to save the results.  To illustrate them, a simple application was created with three elements: an HTTP listener, a database request, and a data formatter.

The result is an index.html page, laid out as follows.


All flows are listed on the left.  There is only a single flow in this example, but if there were more, they would be included here.

At the top right, a graphical representation of the example flow is shown, the same one you would see in AP Studio. Each element is then broken out below for further detail.  One of these elements is a database operation that fetches records based on a simple query.  It is labeled as part of the “get_persons_Flow”, along with the label specified by the developer, “All Persons” in this case.  Its section ends with a descriptive note that reads “To protect privacy …”, information the developer added using the Notes tab of the element in AP Studio.


This is very similar to the annotations supported by JavaDoc, which allow additional information to be included in the code to further clarify the documentation generated later.

Each element in the documentation includes a clickable link that reveals the XML associated with it.


The expanded XML also shows how the description note is included in the flow definition.

Overall, the results are clean and professional looking.  While all of the important information is included, there is room for improvement.  For example, error handling does not get any documentation generated, though it can be seen by viewing the whole flow XML.  I would love to see additional resources included, even if just as a list: POM files, schemas, API definitions, property files, and so on.  Finally, there should be an option to generate specifically for PDF so the documentation can be printed and/or distributed.  Fields can be expanded manually and the page printed to PDF, but these extra steps shouldn't be needed.

While not perfect, documentation generated by AP Studio is a quick and easy solution for documentation required by a client.  A little descriptive work by the developer results in a doc set that describes the application, how it is broken down, and (roughly) how it works.

Filed Under: Integration, Mulesoft

Database Connection Check

Mark J. Norton

I was recently given the task to create a uniform approach to checking the status of all running applications.  These checks use an HTTP listener to provide status information back to the caller. Currently, this information is returned in JSON format, but a shared formatter flow is called to provide a global method to transform it into another form, such as HTML.

The basic application check information includes a timestamp, the name of the application, what host, port, and cluster id it is running on, etc.  Additionally, checks are made for the status of any third party resources being used, such as a database, a REST or WSDL service, message queue, or others.  In particular, I’d like to share how the database connection check works.
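
For illustration only, here is a hypothetical example of what such a JSON status payload might look like; the field names are ours, drawn from the items listed above, not a fixed contract:

{
  "timestamp": "2017-03-08T12:00:00Z",
  "application": "customer-api",
  "host": "worker-1.example.com",
  "port": 8081,
  "clusterId": "cluster-01",
  "dependencies": {
    "database": "Up"
  }
}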

The usual way to check to see if a particular database connection is up and running is to run a query against it.  Perhaps something like this:

<db:select config-ref="MySQL_Configuration" doc:name="Database">
    <db:parameterized-query>
        <![CDATA[select sysdate();]]>
    </db:parameterized-query>
</db:select>

If the connection is down, an exception is thrown that can be caught and status rendered appropriately.  However, there are times when this approach will fail even though the connection is up and running.  Databases containing sensitive information, such as those protected by HIPAA, may disallow select operations.  A more general approach can be developed by accessing the database connector object directly.

The simplest way to access the database connector object directly is to define it as a bean in a Mule global configuration file or a domain configuration file.

<spring:beans>
    <spring:bean id="Connector" name="conName"
                 class="org.apache.commons.dbcp2.BasicDataSource">
        <spring:property name="driverClassName" value="oracle.jdbc.driver.OracleDriver"/>
        <spring:property name="url" value="${db.url}"/>
        <spring:property name="username" value="${db.username}"/>
        <spring:property name="password" value="${db.password}"/>
    </spring:bean>
</spring:beans>

Once the bean is defined, we can access existing methods such as isClosed() to determine if the connection is open or closed.  An active database connection must be open to be used.  This example shows how to capture the database connector status using a MEL statement.

<set-variable doc:name="isClosed"
    value="#[app.registry.conName.getConnection().isClosed() ? 'Down' : 'Up']"
    variableName="isClosed"/>
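
One caveat worth hedging: if the pool cannot hand out a connection at all, getConnection() itself throws, and the expression fails rather than returning 'Down'. A minimal sketch of how the check could be wrapped in a private flow with a catch exception strategy (the flow name is hypothetical):

<flow name="databaseStatusCheckFlow">
    <!-- Reports 'Up' or 'Down' based on the pooled connection state -->
    <set-variable variableName="isClosed"
        value="#[app.registry.conName.getConnection().isClosed() ? 'Down' : 'Up']"
        doc:name="isClosed"/>
    <catch-exception-strategy doc:name="Connection unavailable">
        <!-- getConnection() threw, so treat the database as down -->
        <set-variable variableName="isClosed" value="Down" doc:name="isClosed"/>
    </catch-exception-strategy>
</flow>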

Depending on the database driver you are using, the method name might be slightly different.  Still, this is a more reliable and consistent way to determine whether a database connection is viable.

Filed Under: Integration, Mulesoft
