The multi-channel retailing platform we are building at GALERIA Kaufhof provides centralised logging for all deployed applications and services. To leverage this common logging facility for upcoming metrics and analytics use cases, we picked JSON as the agreed-upon log format across all domains.
We are primarily using Scala and the Play Framework for our application development and therefore had to get Play to log in JSON. This note briefly describes our solution.
The logstash team provides a JSON formatter for logback logs. You can add it to your sbt project with the following dependency in build.sbt:
libraryDependencies += "net.logstash.logback" % "logstash-logback-encoder" % "4.0"
Then configure logging in logger.xml by adding this encoder to the desired appender:
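A minimal appender configuration might look like the following sketch (the appender name FILE and the log file path are illustrative, not prescribed):

```xml
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <file>logs/application.log</file>
  <!-- the logstash encoder emits each event as a one-line JSON document -->
  <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>
```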
Logback will now start to write one-line JSON documents to your log file, properly escaping double quotes and newlines.
Just logging a bunch of rather technical fields and one field ‘message’ containing what you actually log in the code doesn’t make for a good analytics harvest later on. Let’s try to add some structure to our logs.
The facility the Logstash Logback Encoder uses to add user-defined fields to the emitted JSON is slf4j Markers:
import net.logstash.logback.marker.Markers._
import scala.collection.JavaConversions._ // appendEntries() expects a Java Map

logger.info(append("name", "value"), "log message")
logger.info(append("name1", "value1").and(append("name2", "value2")), "log message")

val fields = Map(
  "productId" -> "123456",
  "traceId" -> "98765"
)
logger.info(appendEntries(fields), "log message")
Given we are using Play, the straightforward way to get hold of a Logger instance is to import Play’s play.api.Logger:
import play.api.Logger
import net.logstash.logback.marker.Markers._

Logger.info(append("name", "value"), "log message")
When you try to compile this, the compiler will complain that Logger.info cannot be applied to a Marker argument. The reason is that LoggerLike simply does not define the corresponding Marker methods of the underlying Logger interface.
One solution is to augment LoggerLike with an implicit trait, but we found that this messes up the source location information in error log events. Not good.
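Such an augmentation could be sketched roughly like this (illustrative only; MarkerLogger is a made-up name, and this is the approach we rejected):

```scala
import org.slf4j.Marker
import play.api.LoggerLike

// Illustrative sketch of the rejected approach: expose the Marker
// variant of info() on Play's LoggerLike by delegating to the
// underlying slf4j Logger. Because the call now goes through this
// wrapper, logback attributes the log event to the wrapper instead
// of the real call site, losing the source location information.
implicit class MarkerLogger(val underlying: LoggerLike) extends AnyVal {
  def info(marker: Marker, message: String): Unit =
    underlying.logger.info(marker, message)
}
```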
An alternative to augmenting Play’s logger is to use a logging library that properly implements the Marker-variants of the log methods.
Adding Scala Logging to the project solved the issue right away:
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0"
In addition, Scala Logging comes with nice features such as the LazyLogging trait.
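LazyLogging mixes a lazily initialised logger, named after the enclosing class, into any class, so no logger boilerplate is needed. A short sketch (the ProductService class and its field names are made up for illustration):

```scala
import com.typesafe.scalalogging.LazyLogging
import net.logstash.logback.marker.Markers._
import scala.collection.JavaConversions._ // appendEntries() expects a Java Map

// LazyLogging provides a `logger` member; no explicit declaration needed.
class ProductService extends LazyLogging {
  def lookup(productId: String): Unit =
    logger.info(appendEntries(Map("productId" -> productId)), "product lookup")
}
```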
Logstash JSON Encoder is a little verbose (to say the least):
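A single event rendered with the default settings looks roughly like this (pretty-printed here for readability; the encoder emits it on one line, the field values are made up, and the exact set of fields depends on version and configuration):

```json
{
  "@timestamp": "2015-02-23T15:30:00.123+01:00",
  "@version": 1,
  "message": "log message",
  "logger_name": "application",
  "thread_name": "play-akka.actor.default-dispatcher-2",
  "level": "INFO",
  "level_value": 20000,
  "HOSTNAME": "web01",
  "application.home": "/opt/app"
}
```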
Besides reducing the amount of data logged, we also wanted to rename some fields, both to avoid overlap with fields added by other tools in the logging chain and to shorten the names.
Logstash JSON Encoder lets you do this in the logger configuration. For example:
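An illustrative configuration (the renamed values ts and msg are our own choices; the special value [ignore] drops a field from the output entirely):

```xml
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
  <fieldNames>
    <timestamp>ts</timestamp>
    <message>msg</message>
    <thread>[ignore]</thread>
    <levelValue>[ignore]</levelValue>
    <version>[ignore]</version>
  </fieldNames>
</encoder>
```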
It took some digging to find the corresponding XML tag names, but this removes most of the clutter.
There are also context fields such as application.home, which you can remove from the JSON output by disabling the context:
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
  <includeContext>false</includeContext>
  ...
</encoder>
Finally, an issue came up when using the Marker-based variants of the log methods:
logger.info(append("name", "value"), "log message")
logger.info(append("name1", "value1").and(append("name2", "value2")), "log message")
For Scala these calls are ambiguous with other overloaded logging methods, so the above did not compile in our case (using Scala up to 2.11.5). Thinking about it, this might just be the reason why Play’s LoggerLike does not provide the methods in the first place…
Anyhow, using only the
val fields = Map(
  "productId" -> "123456",
  "traceId" -> "98765"
)
logger.info(appendEntries(fields), "log message")
variants of the logging methods solves this problem, and they seem like the nicer methods to use in most cases anyway.
That’s your JSON logging in Play Framework.