ActiveMQ as a Message Broker for Logstash

When scaling Logstash it is common to add a message broker that temporarily buffers incoming messages before they are processed by one or more Logstash nodes. Data is pushed to the broker either by a shipper like Beaver, which reads logfiles and sends each event to the broker, or directly by the application, using something like a Log4j appender.

A common option is Redis, a broker that stores the data in memory, but other options like Apache Kafka are possible as well. Sometimes organizations are not that keen to introduce lots of new technology and prefer to reuse existing infrastructure. ActiveMQ is a widely used messaging and integration platform that supports different protocols and looks like a perfect fit as a message broker. Let's look at the options for integrating it.

Setting up ActiveMQ

ActiveMQ can easily be set up using the scripts that ship with it. On Linux it's just a matter of executing ./activemq console. Using the admin console at http://127.0.0.1:8161/admin/ you can create new queues and even enqueue messages for testing.

Consuming messages with AMQP

An obvious way to try to connect ActiveMQ to Logstash is using AMQP, the Advanced Message Queuing Protocol. It's a standard protocol that is supported by different messaging platforms.

There used to be a generic AMQP input for Logstash, but unfortunately it has been renamed to rabbitmq because RabbitMQ is the main system it supports.

Let's see what happens if we try to use the input with ActiveMQ.

input {
    rabbitmq {
        host => "localhost"
        queue => "TestQueue"
        port => 5672
    }
}

output {
    stdout {
        codec => "rubydebug"
    }
}

We tell Logstash to connect to localhost on the standard port and consume from a queue named TestQueue. The result should just be dumped to standard output. Unfortunately Logstash only issues errors because it can't connect.

Logstash startup completed
RabbitMQ connection error: . Will reconnect in 10 seconds... {:level=>:error}

In the ActiveMQ logs we can see that our connection attempt arrives, but unfortunately the two systems speak different dialects of AMQP: the rabbitmq input talks AMQP 0-9-1 while ActiveMQ's AMQP transport only accepts AMQP 1.0.

 WARN | Connection attempt from non AMQP v1.0 client. AMQP,0,0,9,1
org.apache.activemq.transport.amqp.AmqpProtocolException: Connection from client using unsupported AMQP attempted
...
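The numbers in the warning become clearer when you know that an AMQP client starts the handshake by sending an 8-byte protocol header, and ActiveMQ is simply printing the bytes it received. A rough sketch of decoding such a header (the decode_amqp_header helper is made up for illustration; the header constants follow the public AMQP specifications):

```python
def decode_amqp_header(header: bytes) -> str:
    """Return a human-readable AMQP version from an 8-byte protocol header."""
    if len(header) != 8 or header[:4] != b"AMQP":
        raise ValueError("not an AMQP protocol header")
    proto_id, major, minor, revision = header[4:8]
    if (proto_id, major, minor, revision) == (0, 1, 0, 0):
        return "AMQP 1.0"    # what ActiveMQ's amqp transport expects
    if (proto_id, major, minor, revision) == (0, 0, 9, 1):
        return "AMQP 0-9-1"  # what the rabbitmq input sends
    return f"AMQP {proto_id}.{major}.{minor}.{revision}"

# The header behind the "AMQP,0,0,9,1" in the log line above:
print(decode_amqp_header(b"AMQP\x00\x00\x09\x01"))  # AMQP 0-9-1
```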

So bad luck with this option.

Consuming messages with STOMP

The aptly named Simple Text Oriented Messaging Protocol is another option that is supported by ActiveMQ. Fortunately there is a dedicated input for it. It is not included in Logstash by default but can be installed easily.

bin/plugin install logstash-input-stomp

Afterwards we can just use it in our Logstash config.

input {
    stomp {
        host => "localhost"
        destination => "TestQueue"
    }
}

output {
    stdout {
        codec => "rubydebug"
    }
}

This time we are better off: Logstash really does connect and dumps our message to standard output.

bin/logstash --config stomp.conf 
Logstash startup completed
{
       "message" => "Can I kick it...",
      "@version" => "1",
    "@timestamp" => "2015-07-22T05:42:35.016Z"
}
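Part of why this works so smoothly is how simple STOMP is on the wire: every message is a plain text frame. As a rough sketch, this is roughly the SEND frame a client would use to enqueue our test message (the build_send_frame helper is hypothetical; the frame layout follows the STOMP specification):

```python
def build_send_frame(destination: str, body: str) -> bytes:
    """Build a minimal STOMP SEND frame: command line, headers, blank line, body, NUL byte."""
    payload = body.encode("utf-8")
    headers = [
        f"destination:{destination}",
        f"content-length:{len(payload)}",
    ]
    return ("SEND\n" + "\n".join(headers) + "\n\n").encode("utf-8") + payload + b"\x00"

frame = build_send_frame("TestQueue", "Can I kick it...")
print(frame)
```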

Consuming messages with JMS

Though the stomp input works there is another option that is not released yet but can already be tested: the jms input supports the Java Message Service, the standard way of doing messaging on the JVM.

Currently you need to build the plugin yourself (which didn't work on my machine, probably because of my outdated local JRuby installation).

Getting data into ActiveMQ

Now that we know ways to consume data from ActiveMQ it is time to think about how to get data in. When using Java you can use something like a Log4j or Logback appender that pushes the log events directly to the queue using JMS.
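Whatever transport pushes the events, the payload itself can be kept very simple. A minimal sketch of a Logstash-style event a shipper could enqueue (assuming the input is configured with a json codec; the make_logstash_event helper is made up, and the field names mirror the rubydebug output we saw above):

```python
import json
from datetime import datetime, timezone

def make_logstash_event(message: str) -> str:
    """Serialize a minimal Logstash-style event as a JSON document."""
    timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"
    event = {
        "message": message,
        "@version": "1",
        "@timestamp": timestamp,
    }
    return json.dumps(event)

print(make_logstash_event("Can I kick it..."))
```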

When it comes to shipping data, unfortunately none of the more popular solutions seems to be able to push data to ActiveMQ. If you know of one that can, it would be great if you could leave a comment.

All in all I think it is possible to use ActiveMQ as a broker for Logstash, but it might require some more work when it comes to shipping data.

About Florian Hopf

I work as a freelance software developer and consultant in Karlsruhe, Germany and have written a German book on Elasticsearch. If you liked this post you can follow me on Twitter or subscribe to my feed to get notified of new posts. If you think I can help you and your company and you'd like to work with me, please contact me directly.