Parsing Data


When you send logs and metrics to LogScale for ingestion, they need to be parsed before they are stored in a repository. This is the case for all input channels except LogScale's structured ingest endpoint, which stores data as-is.

A parser takes text as input; the text can be structured text (like JSON) or unstructured text (like syslog or application stdout). It then extracts fields which are stored along with the original text in the repository.
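To illustrate, a minimal parser for key-value formatted lines might look like the following sketch. It assumes LogScale's kvParse() and parseTimestamp() functions; the field name ts, the timestamp format, and the sample log line are illustrative:

```logscale
// Sketch: parse lines such as
//   2025-01-01T12:00:00Z level=info msg="server started"
// Extract the leading timestamp into the field "ts" (name is illustrative),
// then split the remaining key=value pairs into their own fields.
/^(?<ts>\S+)\s/
| parseTimestamp(field=ts, format="yyyy-MM-dd'T'HH:mm:ss'Z'")
| kvParse()
```

Events parsed this way would carry fields such as level and msg alongside the original text.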

Install and Configure LogScale → Create a Repository → Configure Data Ingest → Parse and Filter Data (this stage) → Enrich Data → Query Data

Figure 39. Flow

Choosing a Parser

A client sending data to LogScale must specify which repository to store the data in and which parser to use when ingesting it. You do this either by setting the special #type field to the name of the parser, or by assigning a parser to the ingest token used to authenticate the client. Assigning the parser to the ingest token is the recommended approach, since it lets you change the parser in LogScale without reconfiguring the client.

Built-In Parsers

LogScale supports common log formats through its built-in parsers. These include formats such as json and accesslog and are a good fit when starting out with LogScale. Once you are better acquainted with your data and with how parsers work, you will likely want to create your own custom parsers.

Creating a Custom Parser

Writing a custom parser allows you to take full control of which fields are extracted during ingest and which datasources the events are stored in.

Parsers are written in the LogScale query language (the same language you use for searching). Next, continue with Creating a Parser.
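As a preview, a custom parser is simply a query pipeline that runs at ingest time. A hypothetical parser for JSON application logs might look like the sketch below; parseJson(), parseTimestamp(), and rename() are LogScale query functions, while the field names and timestamp format are assumptions about the incoming data:

```logscale
// Hypothetical parser for JSON events such as:
//   {"ts": "2025-01-01T12:00:00Z", "level": "info", "message": "ready"}
parseJson()                                        // turn JSON keys into fields
| parseTimestamp(field=ts, format="yyyy-MM-dd'T'HH:mm:ss'Z'")  // set the event timestamp
| rename(field=message, as=msg)                    // normalize a field name (illustrative)
```

A pipeline like this gives you full control over the extracted fields before the event is stored.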