Example Queries
Let's look at different kinds of log events, interpret some example queries, and inspect them line by line.
Example 1
status >=400
| method != PATCH
| top(method, limit=3)
We can break this down to:
Search for status codes greater than or equal to 400:
status >=400
Select all events having any HTTP method (GET, POST, etc.) except PATCH.
| method != PATCH
Select the three HTTP methods with the highest number of events matching the two conditions above, and report the total count for each.
| top(method, limit=3)
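The three steps above can be sketched in plain Python as a filter-then-count pipeline. This is an illustrative stand-in, not LogScale itself; the sample events are hypothetical.

```python
from collections import Counter

# Hypothetical sample events; field names mirror the query.
events = [
    {"status": 404, "method": "GET"},
    {"status": 500, "method": "POST"},
    {"status": 403, "method": "GET"},
    {"status": 200, "method": "GET"},
    {"status": 418, "method": "PATCH"},
    {"status": 502, "method": "DELETE"},
]

# status >=400
filtered = [e for e in events if e["status"] >= 400]
# | method != PATCH
filtered = [e for e in filtered if e["method"] != "PATCH"]
# | top(method, limit=3): count events per method, keep the three largest
top3 = Counter(e["method"] for e in filtered).most_common(3)
print(top3)  # [('GET', 2), ('POST', 1), ('DELETE', 1)]
```

As in the query, each stage consumes the output of the previous one, so the order of the filters matters only for efficiency, not for the result.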
Example 2
#type=humio #kind=metrics name=load-segment-total
| timeChart(#vhost, function=max(m1), limit=30)
We can break this down to:
Select all events having field #type equal to humio, #kind equal to metrics, and name equal to load-segment-total.
#type=humio #kind=metrics name=load-segment-total
Draw a line chart where the X-axis displays the time values grouped into buckets and the Y-axis shows results with one line per #vhost, limiting the output to the 30 vhosts with the highest maximum values.
| timeChart(#vhost, function=max(m1), limit=30)
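The bucketing behind timeChart can be sketched in Python: group samples into fixed-width time buckets and keep max(m1) per bucket, with one series per vhost. This is a simplified illustration with hypothetical samples and a hypothetical 60-second bucket width; real bucket sizes depend on the search time span.

```python
from collections import defaultdict

# Hypothetical metric samples: (timestamp in seconds, vhost, m1 value)
samples = [
    (0, "vhost1", 1.2), (30, "vhost1", 3.4),
    (10, "vhost2", 0.5), (70, "vhost2", 2.0),
    (65, "vhost1", 0.9),
]

BUCKET = 60  # assumed bucket width in seconds

# One series per vhost; within each series, max(m1) per time bucket
series = defaultdict(dict)
for ts, vhost, m1 in samples:
    bucket = ts // BUCKET
    prev = series[vhost].get(bucket)
    series[vhost][bucket] = m1 if prev is None else max(prev, m1)

print(dict(series))
# {'vhost1': {0: 3.4, 1: 0.9}, 'vhost2': {0: 0.5, 1: 2.0}}
```

Each inner dict is one line on the chart: bucket index on the X-axis, the bucket's maximum m1 on the Y-axis.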
Example 3
#host=github #parser=json
| repo.name=docker/*
| groupBy(repo.name, function=count())
| sort()
We can break this down to:
Narrow the search to events in which the #host equals github, and the #parser used was json.
#host=github #parser=json
Limit results to events from GitHub repositories whose names start with docker/, using a filter expression. This is useful when you're searching a view based on multiple joined repositories.
repo.name=docker/*
Aggregate the filtered results by first grouping by the repository name and then counting the number of events (github and json events only) from each docker repository.
groupBy(repo.name, function=count())
Sort by the default field _count. Results are sorted numerically in descending order, so the most frequently occurring repo.names appear at the top of the list.
sort()
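The filter, groupBy, and sort stages can be sketched in Python as follows. This is an illustrative stand-in with hypothetical events, not the LogScale implementation.

```python
from collections import Counter

# Hypothetical GitHub events; only the repo.name field matters here.
events = [
    {"repo.name": "docker/compose"},
    {"repo.name": "docker/cli"},
    {"repo.name": "docker/compose"},
    {"repo.name": "kubernetes/kubernetes"},
    {"repo.name": "docker/compose"},
]

# repo.name=docker/* -- keep only the docker repositories
docker_events = [e for e in events if e["repo.name"].startswith("docker/")]

# groupBy(repo.name, function=count()) -- count events per repository,
# then sort() on the resulting _count field, descending
counts = Counter(e["repo.name"] for e in docker_events)
result = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
print(result)  # [('docker/compose', 3), ('docker/cli', 1)]
```

The kubernetes event never reaches the aggregation, which is why filtering before grouping keeps the counts scoped to the docker repositories.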
Example 4
url=/^\/add_to_cart\/(?<product_id>\d+)/
| match(file="products.csv", column=product_id, field=product_id)
| sum(product_price, as="Total revenue")
Suppose our logs contain URLs of the form /add_to_cart/<product_id>, and that we have enriched these logs with product_name and product_price fields by importing a file named products.csv.
We can break this down to:
Match URLs beginning with /add_to_cart/ and capture the digits that follow into a field named product_id.
url=/^\/add_to_cart\/(?<product_id>\d+)/
Using the product_id, look up the product to get product_name and product_price from the products.csv file.
| match(file="products.csv", column=product_id, field=product_id)
Sum all the product_price values and report the result in a field named Total revenue.
| sum(product_price, as="Total revenue")
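The extract, look-up, and sum stages can be sketched in Python. The URLs, product IDs, and prices below are hypothetical, and a dict stands in for the products.csv lookup table.

```python
import re

# Hypothetical log URLs and a lookup table standing in for products.csv
urls = ["/add_to_cart/101", "/add_to_cart/202", "/checkout", "/add_to_cart/101"]
products = {
    "101": {"product_name": "Widget", "product_price": 9.99},
    "202": {"product_name": "Gadget", "product_price": 24.50},
}

# url=/^\/add_to_cart\/(?<product_id>\d+)/ -- extract product_id;
# Python uses (?P<name>...) for named groups
pattern = re.compile(r"^/add_to_cart/(?P<product_id>\d+)")

total = 0.0
for url in urls:
    m = pattern.match(url)
    if not m:
        continue  # events that don't match the URL filter are dropped
    # match(file="products.csv", column=product_id, field=product_id):
    # events with no matching row in the file are also dropped
    row = products.get(m.group("product_id"))
    if row:
        total += row["product_price"]

print(f"Total revenue: {total:.2f}")  # Total revenue: 44.48
```

The /checkout event fails the regex filter, so only the three matched add-to-cart events contribute their looked-up prices to the sum.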