Logstash versions 7.13 and later are known to be incompatible with Humio. If you use one of these versions, you may receive the error 'Could not connect to a compatible version of Elasticsearch.'

Logstash is an established open source tool for collecting logs, parsing them, and outputting them to other systems. You can use Logstash alongside Humio to process and analyze logs together. In this scenario, you use Logstash as the log collection and parsing agent and instruct it to send the data to Humio.

Humio supports the Elasticsearch bulk insertion API, but due to recent changes by Elastic the Logstash Elasticsearch output plugin is no longer compatible. The best option for sending data to Humio is to use the HTTP output option (built-in) and send the data to the Splunk HEC ingest API.

The benefit of this approach is that you can take advantage of the extensible architecture of Logstash to parse many kinds of data:

  • You can install one of the many available plugins that can parse many well-known data formats.

  • You can use the Grok language to build custom parsers for unusual data formats. Grok has many built-in patterns to help parse your data.
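As a sketch, a grok filter can extract structured fields from an unstructured log line using the built-in patterns. The log format and field names below are hypothetical:

    filter {
      grok {
        # Hypothetical log line: "2016-08-25 10:34:37 ERROR Disk full"
        # TIMESTAMP_ISO8601, LOGLEVEL, and GREEDYDATA are built-in grok patterns
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
      }
    }

Each named capture (timestamp, level, msg) becomes a field on the event and is included in the JSON sent to Humio.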

Installation & Configuration

To download Logstash, visit the Logstash downloads page.

You can find the complete documentation for Logstash at the Logstash website.

Because Humio supports an open HTTP based ingest API, you can use the built-in http output in the Logstash configuration.

The following example shows a very simple Logstash configuration that sends data to Humio:

    input {
      exec {
        command => "date"
        interval => "5"
      }
    }

    output {
      http {
        format => "json"
        content_type => "application/json"
        http_compression => true
        http_method => "post"
        url => "$YOUR_HUMIO_URL/services/collector/raw"
        headers => ['Authorization', 'Splunk $INGEST_TOKEN']
      }
    }

The $YOUR_HUMIO_URL variable is the base URL of your Humio server, for example https://cloud.humio.com:443 or http://localhost:8080. The $INGEST_TOKEN is the ingest token for your repository, a string such as 5413ab7e-3ae3-116b-9b90-da446e01e131.

The port should be the same as the one used for the Humio web UI, typically 443 (HTTPS) or 80 (HTTP). It is recommended that you include the port explicitly in $YOUR_HUMIO_URL.

In the above example, Logstash calls the Linux date command every five seconds. It passes the output from this command to Humio.
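In a real deployment you would typically replace the exec input with one that reads your actual logs, for example a file input. The path below is a hypothetical placeholder:

    input {
      file {
        path => "/var/log/myapp/*.log"     # hypothetical path; point this at your own logs
        start_position => "beginning"      # read existing content on first run
      }
    }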

Field Mapping

With the http output configured to use `format => "json"`, Logstash sends events as JSON objects. The JSON for an event sent to Humio with the above configuration looks like this:

    {
      "@timestamp": "2016-08-25T08:34:37.041Z",
      "message": "Thu Aug 25 10:34:37 CEST 2016\n",
      "command": "date"
    }

Humio can map each JSON object into an Event. Each field in the JSON object becomes a field in the Humio Event. You will need to pair your ingest token with a parser to process the Logstash timestamps and parse the fields from your events, see parsing.

Create a parser named logstash-json with the following content:

    // Parse the event as JSON
    parseJson()

    // Convert the timestamp to UNIX time
    | parseTimestamp("yyyy-MM-dd'T'HH:mm:ss[.SSS]XXX", field=@timestamp)

Depending on the configuration, the timestamp can be either the time at which Logstash handles the event or the actual timestamp in the data. If the timestamp is present in the data, you can configure Logstash to parse it, for example by using the date filter. Another option is to parse the timestamp in Humio by connecting a parser to the ingest token.
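As a sketch of the Logstash-side approach, a date filter can parse a timestamp that has already been extracted into a field. The field name logtime and its format are assumptions for illustration:

    filter {
      date {
        # "logtime" is a hypothetical field holding the raw timestamp string
        match => [ "logtime", "yyyy-MM-dd HH:mm:ss" ]
        # On success, the date filter writes the parsed time to @timestamp
      }
    }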

Adding Parsers in Humio

Humio can do further parsing/transformation of the data it receives by connecting a parser to the ingest token. For more information, see the documentation on parsers.

Dropping Fields

Logstash often adds fields such as host and @version to events. You can remove these fields in Logstash using the mutate filter's remove_field option.
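A minimal mutate filter that strips these fields before the events are shipped looks like this:

    filter {
      mutate {
        # Remove fields added by Logstash that you do not want stored in Humio
        remove_field => [ "host", "@version" ]
      }
    }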