# Using Camel and NiFi in one solution

Both Camel and NiFi are Apache projects. Both are written mostly in Java, and both target data processing and integration. There are also many differences, however. One is that NiFi is a platform, while Camel is a framework.

For NiFi this means it’s a software solution in which you centrally build dataflows. The concept of a dataflow lets you chain multiple processors to process data; together the processors form a dataflow. NiFi has around 200 processors, most of them built-in.

Camel is mostly used at the code level. On top of the framework, companies have built platforms such as Talend ESB and Red Hat Fuse. However, you can just as easily use it in your own application code, or build an API, an integration or a microservice with it.

Camel supports all kinds of integration patterns and components. A developer takes the core engine of the framework and can add more than 300 components to it. Together, the components and patterns form a route.

# Combining superpowers

There is currently no NiFi component in Camel and no Camel processor in NiFi. The difficulty is that both implement lots of protocols, but neither provides one for external parties. It’s like having a gasoline engine and an electric engine: they can work together in all kinds of hybrid ways, but they aren’t easy to combine.

## The example

Moving files between directories.

As source directory we use C:\in and as destination C:\out.
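Before bringing in either tool, it helps to see how small the task itself is. Here is a plain-Java sketch of the same move; temporary directories stand in for C:\in and C:\out so the snippet is self-contained:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class MoveFiles {

    // Move every regular file from the source directory to the destination directory
    public static int moveAll(Path source, Path destination) throws IOException {
        int moved = 0;
        Files.createDirectories(destination);
        try (DirectoryStream<Path> files = Files.newDirectoryStream(source)) {
            for (Path file : files) {
                if (Files.isRegularFile(file)) {
                    Files.move(file, destination.resolve(file.getFileName()),
                               StandardCopyOption.REPLACE_EXISTING);
                    moved++;
                }
            }
        }
        return moved;
    }

    public static void main(String[] args) throws IOException {
        // The article uses C:\in and C:\out; temp directories keep the sketch portable
        Path in = Files.createTempDirectory("in");
        Path out = Files.createTempDirectory("out");
        Files.writeString(in.resolve("message.txt"), "hello");
        System.out.println("Moved " + moveAll(in, out) + " file(s)");
    }
}
```

Both NiFi and Camel will do exactly this, but with polling, error handling and delivery guarantees on top.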

How would one create a pure NiFi solution? Just use the GetFile and PutFile processors:

And how would this work in Camel? This can be done by using the Camel DSL:

```java
from("file://C:/in").to("file://C:/out");
```

Both provide a simple and sufficient solution, and nobody would complicate things by using multiple technologies. But that is exactly what we will do :)

Keep in mind that there are many more complex situations where it makes sense to use both; we’ll come back to that later. First we will create a demo that combines Camel and NiFi in one solution, both on a software (tooling) level and on a code level.

# One solution on software level

In NiFi the normal approach is to create a flow in the user interface. As we don’t want to hand-code our Camel route either, we use Assimbly Gateway to configure the route in a browser. Assimbly lets you create connections with Camel and ActiveMQ.

The next step is to find a matching protocol to connect the two technologies. A good candidate is JMS, which is well-supported by both Camel and NiFi. Here is the combined flow:

Let’s check the JMS example in more detail.

1. Camel

The Camel route running in Assimbly picks up a file from C:\in and puts it as a message on the JMS queue ‘in’ on ActiveMQ (also running in Assimbly).
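In Camel DSL terms, the route configured in Assimbly boils down to something like this (a sketch; it assumes the ActiveMQ component is registered with the broker’s connection details):

```java
// Pick up files from C:\in and publish each one as a message on the JMS queue "in"
from("file://C:/in").to("activemq:queue:in");
```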

You can find out how to run this Assimbly flow with Camel and ActiveMQ on the Assimbly wiki. There is a quick-start and also a tutorial on message queueing.

2. NiFi

Apache NiFi gets the message from the queue ‘in’ with the ConsumeJMS processor and publishes it on the queue ‘out’ with the PublishJMS processor.
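The two processors only need a handful of properties. Roughly like this (property names may differ slightly between NiFi versions; the controller service is set up in the next step):

```
ConsumeJMS
  Connection Factory Service : JMSConnectionFactoryProvider
  Destination Name           : in
  Destination Type           : QUEUE

PublishJMS
  Connection Factory Service : JMSConnectionFactoryProvider
  Destination Name           : out
  Destination Type           : QUEUE
```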

To accomplish this, we first create a controller service for JMS:

The ActiveMQ Artemis client library (JMS Client Libraries) is downloaded directly from Maven:

The next step is to configure the ConsumeJMS processor:

And the PublishJMS processor:

Last, but not least, we start the flow:

3. Camel

Another Assimbly flow lets Camel consume the message from the queue ‘out’ and save it as a file in the directory C:\out. For this flow we clone the first flow and configure it in reverse:
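As Camel DSL, the reversed route looks roughly like this (again a sketch assuming a configured ActiveMQ component):

```java
// Consume messages from the JMS queue "out" and write each one as a file to C:\out
from("activemq:queue:out").to("file://C:/out");
```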

When we test it, the flow still functions the same way as the NiFi-only and Camel-only solutions did, but now both technologies are combined in one solution.

## More complex stuff

In this simple example the combination is overkill, but in more complex situations it brings real benefits:

• Separation of concerns: let NiFi run the flow logic and Camel run the connections (so applications don’t need to do a lot of integration themselves).
• Let NiFi work centrally and Camel distributed.
• Enhanced functionality: combine NiFi’s processors with Camel’s components.
• A clear transport layer (MQ).

It also makes it possible for completely different teams or engineers to work on each of these tools.

## Other options

There are many other ways to use NiFi and Camel (through Assimbly Gateway) together. For example, use an Apache Kafka broker with topics instead of ActiveMQ, or use their REST interfaces. The key takeaway is that you get separation of concerns, and this setup supports all kinds of use cases.

# One solution on code level

Combining NiFi and Camel at the code level has also come up on the mailing list. It has not materialized yet, though, and there is not a lot of code to find on this topic. Therefore, I created two experimental custom NiFi processors that combine NiFi and Camel code.

## How do they work?

First we create a new ‘ConsumeWithCamel’ processor with the following properties:

1. From URI (the URI of the Camel component for consuming)
2. Error URI (The URI of the Camel component for errors)
3. LogLevel (the log level the Camel component uses when writing to the NiFi log)
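In NiFi code, such properties are declared as PropertyDescriptors. A sketch of what the ‘From URI’ property could look like (the names and validator here are illustrative, not the exact code from the repository):

```java
public static final PropertyDescriptor FROM_URI = new PropertyDescriptor.Builder()
        .name("From URI")
        .description("The URI of the Camel component for consuming")
        .required(true)
        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
        .build();
```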

Then we add the Camel code that:

1. Starts a CamelContext
2. Configures the route
3. Creates a consumer template
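Without Assimbly, these three steps in plain camel-core would look roughly like this (a sketch, not the actual Assimbly Connector implementation):

```java
// 1. Start a CamelContext
CamelContext camelContext = new DefaultCamelContext();

// 2. Configure the route: consume from the From URI and hand over to a direct endpoint
camelContext.addRoutes(new RouteBuilder() {
    @Override
    public void configure() {
        from(fromUri).to("direct:nifi-" + flowId);
    }
});
camelContext.start();

// 3. Create a consumer template to read messages from the direct endpoint later on
ConsumerTemplate template = camelContext.createConsumerTemplate();
```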

We let Assimbly Connector handle the Camel code. This API is also used in Assimbly Gateway. It takes a convention-over-configuration approach and already has a lot of Camel components (like the File component) built-in.

Here is the code that runs when the NiFi processor is started:

```java
// Use Assimbly Connector to manage Apache Camel (https://github.com/assimbly/connector)
@OnScheduled
public void onScheduled(final ProcessContext context) {

    getLogger().info("Starting Apache Camel");

    // Start Apache Camel
    try {
        startCamelConnector();
    } catch (Exception e2) {
        getLogger().error("Can't start Apache Camel.");
        e2.printStackTrace();
    }

    // Create an Assimbly flow ID
    UUID uuid = UUID.randomUUID();
    flowId = context.getName() + uuid.toString();

    // Configure the flow (Camel route)
    try {
        configureCamelFlow(context);
    } catch (Exception e1) {
        getLogger().error("Can't configure Apache Camel route.");
        e1.printStackTrace();
    }

    // Start the flow (Camel route)
    try {
        connector.startFlow(flowId);
    } catch (Exception e1) {
        getLogger().error("Can't start Apache Camel.");
        e1.printStackTrace();
    }

    // Create the endpoint
    try {
        template = connector.getConsumerTemplate();
    } catch (Exception e) {
        getLogger().error("Can't create Apache Camel endpoint.");
        e.printStackTrace();
    }
}
```

The last step is to get the messages from Camel with the help of the ConsumerTemplate and pass them through to the NiFi processor.

The code to process a message:

```java
@Override
public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {

    // Get the message from the Camel route
    Object output = template.receiveBody("direct:nifi-" + flowId);

    if (output == null) {
        return;
    }

    FlowFile flowfile = session.create();

    // Write the result back out to a flow file
    flowfile = session.write(flowfile, new OutputStreamCallback() {
        @Override
        public void process(OutputStream out) throws IOException {
            out.write(output.toString().getBytes());
        }
    });

    session.transfer(flowfile, SUCCESS);
}
```

You can find the complete code on GitHub:

## ProduceWithCamel

Note: These are experimental processors created only for this demo.

# Testing the code

Now we can use the new ConsumeWithCamel processor and configure it:

The Error URI is left empty, which means errors will be logged to the NiFi log file.

Secondly, we configure the ProduceWithCamel processor:

Finally we connect both processors with each other and start the flow.

The file will be picked up and stored just like in all other examples.

# More possibilities

The first process group uses the ProduceWithCamel processor with the URI vm://secondProcessGroup.

The second process group consumes this message:

Now both flows move the file from one directory to another, but the process groups aren’t connected in the usual way. The new solution acts like a ‘wormhole’ between them.

Though every example had the same result, there were many paths. Within integration it’s good to use open source as well as an open mind. Together they’re unstoppable, whatever path you are on.
