Infoblox NIOS

The parser normalizes Infoblox NIOS DDI logging event data to a common schema.

Breaking Changes

This update includes parser changes, which means that data ingested after upgrade will not be backwards compatible with logs ingested with the previous version.

Updating to version 1.0.0 or newer will therefore result in issues with existing queries, for example in dashboards or alerts, created prior to this version.

See CrowdStrike Parsing Standard (CPS) for more details on the new parser schema.

Follow the CPS Migration to update your queries to use the fields and tags that are available in data parsed with version 1.0.0.

Installing the Package in LogScale

Find the repository where you want to send the Infoblox NIOS DDI events, or create a new one.

  1. Navigate to your repository in the LogScale interface, click Settings and then Packages on the left.

  2. Click Marketplace and install the LogScale package for Infoblox NIOS DDI (i.e. infoblox/nios).

  3. When the package has finished installing, click Ingest tokens on the left, still under Settings.

  4. In the right panel, click + Add Token to create a new token. Give the token an appropriate name (e.g. the name of the server the token is ingesting logs for), and leave the parser unassigned.

    Before leaving this page, view the ingest token and copy it to your clipboard so you can save it temporarily elsewhere.

  5. Configure the syslog server to send the Infoblox NIOS DDI events to LogScale.

  6. Add the following YAML snippet to your LogScale Collector configuration. See Configuring LogScale Collector for more information on the LogScale Collector. A sketch of a more complete configuration, including the sink that holds your ingest token, is shown after this list.

    infoblox_nios:                          # add under the top-level sources: section
      type: syslog                          # the Collector acts as the syslog receiver
      mode: udp
      port: 1514                            # UDP port the Collector listens on for syslog
      sink: logscale                        # must match a sink defined in the sinks: section
      parser: "infoblox/nios::syslog-utc"
  7. Follow Using a Syslog Server (https://docs.infoblox.com/space/NAG8/22252249/Using+a+Syslog+Server) to redirect logs to your LogScale log collector.

  8. Now that you have a repository set up in LogScale along with an ingest token, you're ready to send events to LogScale.
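
Tying the ingest token from step 4 and the source snippet from step 6 together, here is a minimal sketch of a complete LogScale Collector configuration. The data directory path, sink name, URL, and token placeholder are illustrative assumptions; substitute your own values and consult Configuring LogScale Collector for the full set of options.

  dataDirectory: /var/lib/logscale-collector          # assumed path for the Collector's state
  sources:
    infoblox_nios:
      type: syslog
      mode: udp
      port: 1514
      sink: logscale
      parser: "infoblox/nios::syslog-utc"
  sinks:
    logscale:
      token: <ingest token from step 4>                # the ingest token you copied earlier
      url: https://your-logscale-instance.example.com  # base URL of your LogScale deployment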

Verify Data is Arriving in LogScale

Once you have completed the above steps, the Infoblox NIOS DDI data should be arriving in your LogScale repository.

You can verify this by doing a simple search for #Vendor = "infoblox" | Product = "nios" to see the events.
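
As a quick check beyond a plain search, here is a minimal sketch of a verification query (assuming the #Vendor and Product fields shown above) that charts incoming event volume over time:

#Vendor = "infoblox" | Product = "nios"
| timeChart(span=1h, function=count())

If the chart shows events, ingestion and parsing are working; an empty result usually points back to the syslog or Collector configuration.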

Package Contents Explained

This package parses incoming data and normalizes it as part of that parsing. The parser normalizes the data to the CrowdStrike Parsing Standard (CPS) schema, which is based on OpenTelemetry standards, while still preserving the original data.

If you want to search using the original field names and values, you can access those in the fields whose names are prefixed with the word "Vendor". Fields which are not prefixed with "Vendor" are standard fields which are either based on the schema (e.g. source.ip) or on LogScale conventions (e.g. @rawstring).

The fields which the parser currently maps the data to were chosen based on what seems most relevant, and the mapping may be expanded in the future. However, the parser won't necessarily normalize every field that could be normalized.
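
As an illustrative sketch of the difference, the same client address could be searched through the normalized field or through the preserved original field. The Vendor-prefixed field name below is hypothetical; the actual name depends on what the original Infoblox log calls the field:

// Normalized CPS field
client.ip = "192.0.2.10"

// Preserved original field (hypothetical name)
Vendor.client_ip = "192.0.2.10"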

Event Categorisation

As part of the schema, events are categorized by four different fields:

  • event.kind

  • event.category

  • event.type

  • event.outcome

event.category is an array, so it needs to be searched like so:

array:contains("event.category[]", value="authentication")
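
For example, to narrow this package's events to those categorized as authentication, combining the tags from the verification search with the value used above:

#Vendor = "infoblox" | Product = "nios"
| array:contains("event.category[]", value="authentication")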

Normalized Fields

Here are some of the normalized fields that are set by this parser:

  • host.* (e.g. host.name, host.ip)

  • user.* (e.g. user.name)

  • dns.* (e.g. dns.question.ip, dns.question.name)

  • client.* (e.g. client.ip, client.port)

  • interface.* (e.g. interface.name)
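
As a final hedged sketch (assuming the tags from the verification search and the dns.question.name field listed above), a query that summarizes DNS lookups by queried name:

#Vendor = "infoblox" | Product = "nios"
| dns.question.name = *
| groupBy(dns.question.name, function=count())
| sort(_count, order=desc)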