It’s easy to add console.log statements to log information from a Lambda function into CloudWatch, but then it can be a challenge to find the information we want. Logs for each Lambda are broken out into different groups, and logs within each group are further broken out into different streams. CloudWatch is a powerful tool but has limited search capabilities. Looking at logs while editing Lambda functions can involve quite a lot of clicking around.
As a result, we’ve started experimenting with logging JSON objects that are streamed into Elasticsearch. For example, our getAccount query, which retrieves information about a specific Workspace, might log the following JSON if a user tries to load the wrong Workspace:
```json
{ "od_message": "Unauthorized: mismatched clientUUID" }
```
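A log line like this can be produced with a tiny wrapper around console.log. Here's a minimal sketch — the `logEvent` helper name and the `od_event: true` marker field are our own illustration, not the exact production code:

```javascript
// Minimal structured-logging helper for a Node.js Lambda.
// console.log writes each line to the function's CloudWatch log stream;
// emitting one JSON object per line keeps the logs machine-parseable.
// The "od_event" marker makes these lines easy to filter on later.
function logEvent(fields) {
  console.log(JSON.stringify({ od_event: true, ...fields }));
}

// Example: inside the getAccount handler, when the Workspace check fails
logEvent({ od_message: 'Unauthorized: mismatched clientUUID' });
```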
We’ve configured a subscription (filtered on the string “od_event” to minimize cruft) from the getAccount CloudWatch log group to a Lambda provided by AWS that inserts the JSON into Elasticsearch.
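Wiring that up looks roughly like the following AWS CLI call — the log group name, filter name, and destination ARN here are placeholders, and the real values depend on your account and region:

```shell
# Subscribe the getAccount log group to the Elasticsearch-loading Lambda,
# forwarding only log lines that contain the string "od_event".
aws logs put-subscription-filter \
  --log-group-name "/aws/lambda/getAccount" \
  --filter-name "od-event-to-elasticsearch" \
  --filter-pattern "od_event" \
  --destination-arn "arn:aws:lambda:us-east-1:123456789012:function:LogsToElasticsearch"
```

Note that the destination Lambda also needs a resource policy allowing CloudWatch Logs to invoke it (via `aws lambda add-permission`).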
The benefit of passing the JSON into Elasticsearch, rather than trying to review it in CloudWatch, is that the key/value pairs in the JSON are broken out into searchable fields that are easy to filter and count. For example, we can check whether we’re seeing an unusual number of errors for a particular user or customer.
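Concretely, counting errors per user becomes a single aggregation query. Here's a sketch of the query body — the `od_user` field name is an assumption about the log schema, not a field from the example above:

```javascript
// Build an Elasticsearch query that filters to a specific error message
// and counts matching log entries per user. The body is plain JSON, so it
// can be sent with any HTTP client or the official Elasticsearch client.
function errorsByUserQuery(message) {
  return {
    size: 0, // we only want the aggregation buckets, not the documents
    query: {
      match_phrase: { od_message: message },
    },
    aggs: {
      errors_by_user: {
        // Bucket and count matching entries per user; the .keyword
        // sub-field is the default non-analyzed mapping for strings.
        terms: { field: 'od_user.keyword' },
      },
    },
  };
}
```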
We’re also building visualizations with Kibana that we aim to cover in a future blog post.
With a new product in early Beta and a team building a serverless application using the latest and greatest, our logging will continue to evolve rapidly. For the moment we’re super stoked about how sending logs from CloudWatch to Elasticsearch gives us far more visibility than CloudWatch alone and, in doing so, lets us increase development velocity.