
Logger

Stack9 uses Pino as its logger because it is a very low-overhead Node.js logger.

Due to Node's single-threaded event loop, it is highly recommended that log processing, alert triggering, and reformatting are done in a separate process. In Pino, these log processors are called transports; each runs as a separate process, with the stdout of the application piped to the stdin of the transport.

Environment variables

The exported pino function takes two optional arguments, options and destination, and returns a logger instance.

These optional environment variables can be set to customize the logger options:

// Enables pretty printing of logs
LOGGING_PRETTY_PRINT // TRUE or FALSE

// Configures the minimum level of logging
LOGGING_MINIMUM_LEVEL // One of 'fatal', 'error', 'warn', 'info', 'debug', 'trace' or 'silent'

The default log level is info.

The default pretty print value is false.

This environment variable is required:

// Sets the destination log service
LOG_ADAPTOR // 'default', 'seq', or 'splunk' 

The default adaptor sends logs to the console.
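As an illustration only, the variables above could be mapped to a Pino-style options object as sketched below. The function name and option shape are assumptions for this sketch; Stack9 performs this wiring internally.

```javascript
// Sketch only: map the environment variables above to a Pino-style
// options object. Names here are illustrative, not Stack9's internals.
function loggerOptionsFromEnv(env) {
  return {
    // LOGGING_MINIMUM_LEVEL falls back to the documented default, 'info'
    level: env.LOGGING_MINIMUM_LEVEL || 'info',
    // LOGGING_PRETTY_PRINT falls back to the documented default, FALSE
    prettyPrint: (env.LOGGING_PRETTY_PRINT || 'FALSE').toUpperCase() === 'TRUE',
  };
}

console.log(loggerOptionsFromEnv({}));
// → { level: 'info', prettyPrint: false }
console.log(loggerOptionsFromEnv({ LOGGING_MINIMUM_LEVEL: 'debug', LOGGING_PRETTY_PRINT: 'TRUE' }));
// → { level: 'debug', prettyPrint: true }
```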

Stack9-core

The logger instance is the object returned by the main exported pino function.

The primary purpose of the logger instance is to provide logging methods. The default methods are trace, debug, info, warn, error, and fatal.

Usage

Each logging method has the following signature: ([mergingObject], [message], [...interpolationValues]).

  • mergingObject: An object can optionally be supplied as the first parameter. Each enumerable key and value of the mergingObject is copied into the JSON log line.
logger.info({MIX: {IN: true}})
// {"level":30,"time":1531254555820,"pid":55956,"hostname":"x","MIX":{"IN":true}}
  • message: A message string can optionally be supplied as the first parameter, or as the second parameter after supplying a mergingObject.

By default, the contents of the message parameter will be merged into the JSON log line under the msg key:

logger.info('hello world')
// {"level":30,"time":1531257112193,"msg":"hello world","pid":55956,"hostname":"x"}

The message parameter takes precedence over the mergingObject. That is, if a mergingObject contains a msg property and a message parameter is also supplied, the msg property in the output log will be the value of the message parameter, not the value of the msg property on the mergingObject.
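This precedence rule can be sketched with a simplified model of the merge (real Pino output also includes level, time, pid, and hostname, which are omitted here):

```javascript
// Simplified model of how Pino resolves msg: the message parameter
// overwrites any msg key supplied on the merging object.
function buildLogLine(mergingObject, message) {
  const line = { ...mergingObject };
  if (message !== undefined) line.msg = message; // parameter wins
  return line;
}

const out = buildLogLine({ msg: 'from object', requestId: 42 }, 'from parameter');
console.log(out.msg);
// → from parameter
```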

The messageKey option can be used at instantiation time to change the key from msg to another string as preferred.

The message string may be a printf-style format string with support for the following placeholders:

  • %s – string placeholder

  • %d – digit placeholder

  • %O, %o and %j – object placeholder

  • ...interpolationValues: All arguments supplied after message are serialized and interpolated according to any supplied printf-style placeholders (%s, %d, %o|%O|%j) to form the final output msg value for the JSON log line.

logger.info('%o hello %s', {worldly: 1}, 'world')
// {"level":30,"time":1531257826880,"msg":"{\"worldly\":1} hello world","pid":55956,"hostname":"x"}

Other examples:

logger.trace({ fieldKey: key }, 'No update required {fieldKey}');
logger.debug('Debugging service');
logger.info('Starting migration');
logger.warn(message, 'KNEX: warn');
logger.error(error, 'Error with validating schemas');
logger.fatal(error, "Tried to clean up relationship type that doesn't work");

pino-http

Stack9 also exports an instance of pino-http. The pinoHttp instance has a logger property, which references the actual logger instance used by pinoHttp. This instance will be a child of the instance passed as opts.logger.
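Conceptually, pino-http binds request details into a request-scoped child logger. The sketch below models that idea only; it is not the actual pino-http API (which assigns a real Pino child logger to req.log):

```javascript
// Sketch only: a request-scoped "child" that merges bound request
// context into every log line, mimicking what pino-http sets up.
function childForRequest(baseFields, req) {
  const bound = { ...baseFields, req: { method: req.method, url: req.url } };
  return {
    // Each call merges the bound request context into the log line.
    info: (msg) => ({ ...bound, msg }),
  };
}

const reqLog = childForRequest({ service: 'stack9' }, { method: 'GET', url: '/health' });
console.log(reqLog.info('request completed'));
// → { service: 'stack9', req: { method: 'GET', url: '/health' }, msg: 'request completed' }
```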

Logging to Seq

In order to send logs to Seq you must set the following environment variables:

LOG_ADAPTOR=seq
LOG_SERVER_URI=<seq-endpoint>
LOG_SERVER_API_KEY=<seq-api-key>

Follow the instructions here to create a Seq API key.

Logging to Splunk

In order to send logs to Splunk you must set the following environment variables:

LOG_ADAPTOR=splunk
LOG_SERVER_URI=<splunk-endpoint> // must be in the form https://<host>:8088
LOG_SERVER_API_KEY=<splunk-token>

Follow the instructions here to create a Splunk token. Optional fields and input settings can be left at their defaults.

References