This is a filter function that lets you completely drop an event, and it can be used both during queries and within the parser pipeline. When an event is dropped it is removed entirely. If used during parsing, the event data is not stored in LogScale. When used within a query, the event is removed from the event stream, effectively removing the entire event from the query output.
Note
The dropEvent() function does not accept any arguments.

dropEvent()
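As described above, dropEvent() can also be used at search time. The following is a minimal sketch of selectively dropping events within a query, assuming a hypothetical loglevel field; in a query context this has the same effect as filtering the matching events out of the result set.

case {
  loglevel = "DEBUG" | dropEvent();  // loglevel is a hypothetical field; all other events pass through the wildcard match
  * }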
Examples
Ignoring Commented Lines During Parsing by Dropping Events
case {
  @rawstring="#*" | dropEvent();
  * }
When parsing incoming data, the data may include 'commented' lines; for example, the # character is often used to identify comments in files rather than real data. This example removes those lines from the ingest process during parsing, using the dropEvent() function to drop the entire event from the ingest pipeline.
Starting with the source repository events
Start a case statement, with the first matching expression looking for the hash symbol in a line to indicate that it could be removed, then dropping the entire event using dropEvent():

case { @rawstring="#*" | dropEvent();
For all other lines, the case expression matches all other events and lets them through:

* }
This is an example of removing data during ingest that matches a typical source construct (the comment).
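A regex-based variant of the same approach is sketched below. It assumes the comment marker always appears at the start of the raw string, which may not hold for every source format.

case {
  @rawstring = /^#/ | dropEvent();  // drop lines beginning with the comment character
  * }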
Dropping Events Based on a Parsed JSON Value
parseJson()
| case { someField = "some_value" | dropEvent(); * }
| parseTimestamp(field=@timestamp)
The dropEvent() function is often used within parsers to drop events during parsing that do not need to be ingested. The following example shows how to filter events as part of a parser, preventing events with a particular field value from being ingested.
Starting with the source repository events
First, parse the incoming data to identify JSON values and convert them into usable fields.
parseJson()
Start a case statement, with the first matching expression identifying a field value in the extracted JSON field from the previous step, and then dropping the event. This has the effect of terminating the parsing for this event, as there is no more data to be processed:

| case { someField = "some_value" | dropEvent(); * }
For all other events that don't match the JSON value, parse the timestamp from the @timestamp field.
| parseTimestamp(field=@timestamp)
dropEvent() is a simple and practical way of eliminating events while parsing incoming data.
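When events with any of several field values need to be dropped, the same case pattern can be combined with the in() function. The sketch below assumes the same hypothetical someField and example values as above.

parseJson()
| case { in(someField, values=["some_value", "another_value"]) | dropEvent(); * }  // drop any event whose someField matches one of the listed values
| parseTimestamp(field=@timestamp)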