Imperva Cloud Web Application Firewall

Web application attacks disrupt important transactions and steal sensitive data.

Imperva Cloud Web Application Firewall (WAF) stops these attacks with near-zero false positives and a global SOC to ensure your organization is protected from the latest attacks minutes after they are discovered in the wild.

The parser normalizes data to a common schema based on CrowdStrike Parsing Standard (CPS). This schema allows you to search the data without knowing the data specifically, and just knowing the common schema instead. It also allows you to combine the data more easily with other data sources which conform to the same schema.

Breaking Changes

This update includes parser changes, which means that data ingested after upgrade will not be backwards compatible with logs ingested with the previous version.

Updating to version 1.0.0 or newer will therefore cause issues with existing queries in, for example, dashboards or alerts created prior to this version.

See CrowdStrike Parsing Standard (CPS) for more details on the new parser schema.

Follow the CPS Migration to update your queries to use the fields and tags that are available in data parsed with version 1.0.0.

Installing the Package in LogScale

Find the repository where you want to send the Imperva Cloud Web Application Firewall events, or create a new one (see Creating a Repository or View).

  1. Navigate to your repository in the LogScale interface, click Settings and then Packages on the left.

  2. Click Marketplace and install the LogScale package for Imperva Cloud Web Application Firewall (i.e. imperva/cloud-waf).

Configurations and Sending the Logs to LogScale

  1. First, configure Imperva Cloud WAF log integration. The preferred option for sending logs from Imperva to LogScale is Push Mode to Amazon S3.

  2. Once this configuration is completed, your logs will be automatically transferred from the Imperva cloud repository to your pre-defined AWS S3 bucket.

  3. Then configure LogScale to collect data from the AWS S3 bucket using the cwaf-cef parser. See the documentation for cloud (Ingesting Data from AWS S3) and self-hosted (Ingesting Data from AWS S3) deployments to send logs directly from the S3 bucket into a LogScale repository.

Refer to Imperva log file structure to get additional details about recorded events.

Verify Data is Arriving in LogScale

Once you have completed the above steps, the Imperva Cloud Web Application Firewall data should be arriving in your LogScale repository.

You can verify this by doing a simple search to see the events:

logscale
#Vendor = "imperva"
| Product="cloud-waf"

Package Contents Explained

This package contains:

Package Contents - Dashboards

Note that you can narrow the dataset used by the widgets to only specific values of certain fields, e.g. select all (*) or a specific value for common fields, such as fqdn, using the parameter selection at the top of the dashboards.

Once you make parameter selections, click Apply and the widgets will update to reflect only the data matching the selected parameters. (When you click in the parameter selection, all widgets on the dashboard that make use of the parameters are shown with a blue outline.)

The package contains the following dashboards:

  • Account Overview - a high-level overview of the traffic across all the web applications and websites, including the number of allowed, blocked, cached and bad requests. It also visualizes the source traffic to onboarded web applications and websites.

  • Search - for a given fully qualified domain name (FQDN) and source IP address, it shows the events over time, what actions have been taken, and streams the events in real time.

  • WAF Overview - it reports security events over time, illegal resource access, cross-site scripting, attackers and much more.

Package Contents - Parsers

This package contains the following parsers: cwaf-cef - a parser for Imperva Access and Security events sent in the CEF format. Sample logs can be found here.

This package parses incoming data, normalizing it as part of that parsing. The parser normalizes the data to the CrowdStrike Parsing Standard (CPS) schema, based on OpenTelemetry standards, while still preserving the original data.

If you want to search using the original field names and values, you can access those in the fields whose names are prefixed with the word Vendor. Fields which are not prefixed with Vendor are standard fields which are either based on the schema (e.g. source.ip) or on LogScale conventions (e.g. @rawstring).
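As an illustrative sketch of this convention, the following query filters on a normalized field while still displaying an original vendor field alongside it (the Vendor.act field name is an assumption here; the exact Vendor-prefixed field names depend on the CEF data your parser receives):

logscale
#Vendor = "imperva"
| source.ip = "192.0.2.1"
| select([@timestamp, source.ip, Vendor.act])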

The fields to which the parser currently maps the data are chosen based on what seems most relevant, and may be expanded in the future. The parser will not necessarily normalize every field that could potentially be normalized.

Event Categorisation

As part of the schema, events are categorized by fields:

  • event.kind

  • event.category

  • event.type

  • #event.outcome

(#event.outcome is a tag, hence the "#")

event.kind and #event.outcome can be searched using Field Filters, but event.category and event.type are arrays, so need to be searched using the following syntax:

logscale
array:contains("event.category[]", value="network")

This will find events where some event.category[n] field contains the value network, regardless of what n is.

Note that not all events will be categorized to this level of detail.

The same syntax applies to event.type. For example, the following will find events where some event.type[n] field contains the value network, regardless of what n is:

logscale
array:contains("event.type[]", value="network")
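Putting the categorization fields together, a sketch of a search combining a tag filter with an array filter might look like the following (the outcome value "failure" is illustrative; substitute the outcomes present in your data):

logscale
#event.outcome = "failure"
| array:contains("event.category[]", value="network")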

Normalized Fields

Here are some of the normalized fields which are being set by this parser:

  • source.* (e.g. source.ip, source.port )

  • related.* (e.g. related.ip )

  • destination.* (e.g. destination.port)
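As a sketch of how these normalized fields can be used, the following query counts events per source IP address, assuming the fields above are populated for your events:

logscale
#Vendor = "imperva"
| groupBy([source.ip], function=count())
| sort(_count, order=desc)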

Next Steps and Use Cases

You can get actionable insights from your Imperva Cloud Web Application Firewall data by searching for suspicious activity in LogScale using the search UI, dashboards or alerts.