My latest product Apex Logs is now in open beta! Apex Logs is a structured and plain-text log management solution, with a minimal design, simple API, a rich query language, and flexible alerting integrations.

On top of that, Apex Logs currently has the most competitive pricing in the industry: at only $0.35/GB ingested, it's up to 10 times more cost-effective than the offerings from Papertrail, Loggly, Datadog, and others.

Let’s take a look!

Exploring the API

The API has a simple contract: every request is a POST to a method such as add_events, with a JSON request body as input and a JSON response body as output. That's it!

Here’s an example from the terminal using curl:

$ curl \
-H "Authorization: Bearer <TOKEN>" \
-H "Content-Type: application/json" \
-d '{ "project_id": "<PROJECT_ID>", "events": [{ "level": "info", "message": "Sign in", "fields": { "email": "tobi@example.com" } }] }' \
https://<ENDPOINT>/add_events

Here's the same event from Node.js using the apex/logs-winston transport:

logger.info('Sign in', { email: 'tobi@example.com' })

And finally from Go using the apex/log package:

log.WithField("email", "tobi@example.com").Info("Sign in")

See the API documentation for more information regarding language clients and integrations.
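If your language doesn't have a client yet, the contract is easy to call directly. Here's a minimal Go sketch, mirroring the curl example above, that builds an add_events payload and POSTs it; the endpoint, token, and project ID placeholders are yours to fill in, and error handling is kept deliberately simple.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// Event mirrors the event shape accepted by the add_events method.
type Event struct {
	Level   string                 `json:"level"`
	Message string                 `json:"message"`
	Fields  map[string]interface{} `json:"fields,omitempty"`
}

// payload builds the JSON request body for add_events.
func payload(projectID string, events []Event) ([]byte, error) {
	return json.Marshal(struct {
		ProjectID string  `json:"project_id"`
		Events    []Event `json:"events"`
	}{projectID, events})
}

// addEvents POSTs a batch of events, matching the curl example:
// an Authorization bearer token and a JSON body.
func addEvents(endpoint, token, projectID string, events []Event) error {
	body, err := payload(projectID, events)
	if err != nil {
		return err
	}
	req, err := http.NewRequest("POST", endpoint+"/add_events", bytes.NewReader(body))
	if err != nil {
		return err
	}
	req.Header.Set("Authorization", "Bearer "+token)
	req.Header.Set("Content-Type", "application/json")
	res, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer res.Body.Close()
	if res.StatusCode >= 300 {
		return fmt.Errorf("add_events: unexpected status %s", res.Status)
	}
	return nil
}

func main() {
	// Print the request body we'd send, without performing the request.
	body, err := payload("<PROJECT_ID>", []Event{
		{Level: "info", Message: "Sign in", Fields: map[string]interface{}{"email": "tobi@example.com"}},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```

In practice you'd batch events and flush them periodically rather than making one request per log call, which is what the official clients handle for you.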

Let’s dive into the interface next.

Exploring the UI

How you split logs into projects is up to you. For example, you may have a single project for each product such as MyApp and use the event fields environment=staging and program=api for filtering. Alternatively, you could create a project per environment, for example “MyApp Production”, or take it further with projects per-product, per-team: “MyApp Mobile Production” and “MyApp Backend Production”.

Project listing for Apex Software products

The JSON event fields are presented as clickable in the UI, allowing you to filter the search query further based on the value, in this case showing or hiding all regions matching “us-east-1”.

Apex Logs is index-free and schema-less, so you can update your logging conventions as you go without breaking anything. The “Discovered fields” panel on the right updates to reflect the fields available for the current search query, along with their percentage of occurrences.

Clicking an event brings up a detailed view of each field:

Or if you prefer, click over to the JSON tab to view the raw event; Apex Logs will remember your preference for the next time you view a log event.

Fields in this view are clickable as well, with options reflecting the type of value; for example, numeric fields provide further options to filter results above or below the selected value.

Clicking a field in the “Discovered fields” panel such as the AWS Lambda “function” will show you the values, filtered against the current query and time range.

Here we can see that 65% of AWS Lambda function calls were made to alert_processor, by far the busiest function.

Hitting the button next to the search input will bring up a list of recent & saved searches, where you can star any recent searches. Recent searches are personal, while saved searches are available to your entire team.

The last bit of user interface I wanted to touch on is alerting: just give your alert a name, a search query, and a threshold (the number of matched events), then select a Slack, Email, SMS, Webhook, or PagerDuty integration and you're done!

There are three display modes for those who prefer line-wrapping or expanded logs.

Now that we’ve checked out the interface, let’s look at the query language.

Custom query language

Here’s the example structured log event we’ll use for the upcoming queries. Note that there’s no rigid schema; you can name the fields however you want.

{
  "level": "info",
  "message": "uploaded file",
  "fields": {
    "file": "sloth.png",
    "type": "image/png",
    "duration": 2502,
    "size": 43008,
    "user": {
      "name": "Tobi",
      "email": "tobi@example.com"
    },
    "source": {
      "host": "api-01"
    }
  }
}

The syntax for querying plain-text logs also works for structured logs; for example, the following search terms are treated as uploaded AND tobi@example.com:

uploaded tobi@example.com

To be more specific you can quote whitespace and special characters for an exact match:

"uploaded file" tobi@example.com

The previous examples search against the entire log event text, so the terms “uploaded” or “tobi@example.com” could be anywhere in the event. Let’s get more specific by referencing fields:

message = "uploaded file" user.email = "tobi@example.com"

The AND operator is implied, but we can add it for readability:

message = "uploaded file" and user.email = "tobi@example.com"

The great thing about structured logging is that we can perform more complex comparisons. For example, the following query gives us large images uploaded by anyone with an “@example.com” email address, using the * wildcard:

message = "uploaded file"
user.email = "*@example.com"
size >= 400kb
type = "image/*"

Note the use of size >= 400kb — Apex Logs has built-in units for durations and byte-sizes:

file.size >= 10mb
request.size > 100kb
request.size <= 0
response.duration >= 300ms
response.duration >= 1.5s
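To illustrate how such unit suffixes can normalize to plain numbers, here's a small Go sketch of a byte-size parser. This is purely illustrative, not Apex Logs' actual implementation, and it assumes power-of-1024 units (1kb = 1024 bytes); the real query language may define its units differently.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseSize converts strings such as "400kb" or "1.5gb" into byte counts.
// Suffixes are assumed to be powers of 1024; a plain number is raw bytes.
func parseSize(s string) (int64, error) {
	s = strings.ToLower(strings.TrimSpace(s))
	units := []struct {
		suffix string
		factor float64
	}{
		// Longer suffixes first, so "kb" isn't mistaken for a bare "b".
		{"gb", 1 << 30},
		{"mb", 1 << 20},
		{"kb", 1 << 10},
		{"b", 1},
	}
	for _, u := range units {
		if strings.HasSuffix(s, u.suffix) {
			n, err := strconv.ParseFloat(strings.TrimSuffix(s, u.suffix), 64)
			if err != nil {
				return 0, err
			}
			return int64(n * u.factor), nil
		}
	}
	n, err := strconv.ParseFloat(s, 64)
	if err != nil {
		return 0, err
	}
	return int64(n), nil
}

func main() {
	for _, s := range []string{"400kb", "10mb", "1.5gb"} {
		n, err := parseSize(s)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%s = %d bytes\n", s, n)
	}
}
```

Durations such as 300ms or 1.5s would normalize the same way, just with time units instead of byte units.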

You can read more about the query language in the documentation. If you have any suggestions I’d love to hear them!

What makes Apex Logs different?

Logs can be ingested from anywhere, including AWS CloudWatch, DigitalOcean, or any other source via the API integrations. It's important to note that even though Apex Logs is built on Google Cloud's offerings, you can use it as the logging solution for anything from mobile applications, games, and browser apps to complex infrastructure on AWS or Azure.

If you’d like to stay up to date, make sure to follow me on Twitter or subscribe to the public roadmap on GitHub.
