Photo by Markus Spiske on Unsplash

Logging with log-levels and to different sink targets in Azure Functions

Tauseef Malik
4 min read · Mar 10, 2021


Most enterprise-grade applications need some form of logging mechanism, whether traditional rolling log text files, console logs, or log monitors in the cloud.

Logging is a must in most applications as it helps in the most dreaded situation. Yeah, you are right: ‘Debugging’!

Problems with traditional loggers in serverless

In most serverless boilerplate templates, a log context is provided which logs to the default log service of the respective cloud provider, i.e. Azure Application Insights or CloudWatch in AWS, using the LambdaLogger or Microsoft.Extensions.Logging classes.
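
For reference, the default C# HTTP-trigger template logs through the injected Microsoft.Extensions.Logging.ILogger roughly like this (a minimal sketch; the function name is just a placeholder):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class DefaultLoggingSample
{
    [FunctionName("DefaultLoggingSample")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
        ILogger log) // injected by the Functions runtime
    {
        // Goes to the provider's default sink (Application Insights / log stream)
        log.LogInformation("C# HTTP trigger function processed a request.");
        return new OkObjectResult("done");
    }
}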

However, they do not provide the ability to log messages with different log levels, which is really necessary for large, distributed, microservice-based systems wherein you need to drill down to a particular warning or debug message.

Without a log level, it becomes tedious to filter out the types of messages that really interest us.

Also, most of the time you will come across some third-party centralized log aggregation service used in your organization to collect logs from all applications in one place, e.g. Splunk or Datadog. These logging services might have their own required format for log inputs, to be able to parse and display the messages and filter based on attributes.

How to overcome these challenges?

There are a lot of readily available libraries which provide a means to perform structured event logging with context and push event messages in a custom formatted template. We will explore one such open-source library called Serilog.

Serilog provides a mechanism to perform structured logging and a way to switch between different log levels. It also supports a bunch of different target sinks.
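
Before wiring it into a Function, here is a minimal stand-alone sketch of how a LoggingLevelSwitch controls the minimum level at runtime (it uses the console sink, which assumes the Serilog.Sinks.Console package, purely for demonstration):

using Serilog;
using Serilog.Core;
using Serilog.Events;

class LevelSwitchDemo
{
    static void Main()
    {
        // Runtime-adjustable minimum level
        var levelSwitch = new LoggingLevelSwitch(LogEventLevel.Information);

        var logger = new LoggerConfiguration()
            .MinimumLevel.ControlledBy(levelSwitch)
            .WriteTo.Console() // assumes the Serilog.Sinks.Console package
            .CreateLogger();

        logger.Debug("Filtered out while the switch is at Information");
        logger.Information("Processed order {OrderId}", 42); // structured property

        levelSwitch.MinimumLevel = LogEventLevel.Debug;
        logger.Debug("Now visible after lowering the switch");
    }
}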

Some of the popular target sinks which are available to use with Serilog:

SERVICE                  NUGET PACKAGE
Application Insights     Serilog.Sinks.ApplicationInsights
Azure Analytics          Serilog.Sinks.AzureAnalytics
Event Grid               Serilog.Sinks.EventGrid
Azure Event Hub          Serilog.Sinks.AzureEventHub
Azure Web Jobs           Serilog.Sinks.AzureWebJobsTraceWriter
Amazon CloudWatch        Serilog.Sinks.AwsCloudWatch
Datadog                  Serilog.Sinks.DataDog
Splunk                   Serilog.Sinks.Splunk
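
As an illustration, each sink plugs in through its own WriteTo extension method. A minimal sketch for the Application Insights sink (assuming the Serilog.Sinks.ApplicationInsights package and an instrumentation key or connection string available in your app settings) could look like:

using Microsoft.ApplicationInsights.Extensibility;
using Serilog;

static class AppInsightsSinkDemo
{
    public static void Configure()
    {
        // Picks up the instrumentation key / connection string from the environment
        var telemetryConfiguration = TelemetryConfiguration.CreateDefault();

        Log.Logger = new LoggerConfiguration()
            .WriteTo.ApplicationInsights(telemetryConfiguration, TelemetryConverter.Traces)
            .CreateLogger();

        Log.Information("Pushed to Application Insights as a trace event");
    }
}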

How to use Serilog in your Azure Functions

We will use the Serilog library to perform logging in our HTTP-triggered Azure Function. The function simply returns the name passed in the query parameter as a response.

If you are using VS Code, you can follow the instructions in this blog to develop, debug, and test an HTTP-triggered Azure Function locally and publish it to a Function App service.

Steps:

Import the required Serilog libraries using the dotnet add package command, e.g.:

dotnet add package Serilog
dotnet add package Serilog.Sinks.AzureWebJobsTraceWriter

Your .csproj package references will look like below:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
    <AzureFunctionsVersion>v3</AzureFunctionsVersion>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="3.0.11" />
    <PackageReference Include="Newtonsoft.Json" Version="12.0.3" />
    <PackageReference Include="Serilog" Version="2.10.0" />
    <PackageReference Include="Serilog.Formatting.Compact" Version="1.1.0" />
    <PackageReference Include="Serilog.Sinks.AzureWebJobsTraceWriter" Version="1.0.22" />
  </ItemGroup>
  <ItemGroup>
    <None Update="host.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="local.settings.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <CopyToPublishDirectory>Never</CopyToPublishDirectory>
    </None>
  </ItemGroup>
</Project>

And the function business logic is as below:

SeriloggerTest.cs

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Newtonsoft.Json;
using Serilog;
using Serilog.Context;
using Serilog.Core;
using Serilog.Events;
// Uncomment to format log messages as JSON
// using Serilog.Formatting.Compact;
using Serilog.Sinks.AzureWebJobsTraceWriter;
using Microsoft.Azure.WebJobs.Host;

namespace Mslearn
{
    public static class SeriloggerTest
    {
        [FunctionName("SeriloggerTest")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
            TraceWriter log)
        {
            // Runtime-adjustable minimum log level
            var levelSwitch = new LoggingLevelSwitch();
            levelSwitch.MinimumLevel = LogEventLevel.Debug;

            // Serilog logger writing to the WebJobs TraceWriter sink
            var logger = new LoggerConfiguration()
                .Enrich.FromLogContext()
                .MinimumLevel.ControlledBy(levelSwitch)
                .WriteTo.TraceWriter(log)
                .CreateLogger();

            // Attach a contextual property to every event logged in this scope
            using (LogContext.PushProperty("level", "warning"))
            {
                logger.Warning("This is a warning level message");
            }

            string name = req.Query["name"];
            logger.Information("Query param 'name': {Name}", name);

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;

            string responseMessage = string.IsNullOrEmpty(name)
                ? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
                : $"Hello, {name}. This HTTP triggered function executed successfully.";

            return new OkObjectResult(responseMessage);
        }
    }
}

Once published, test the deployed Azure Function using cURL or Postman, passing the required query parameter. You can find the function URL in the Overview blade of the Azure Function.
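
For example, with cURL (the host name and function key below are placeholders for your own deployment):

curl "https://<your-function-app>.azurewebsites.net/api/SeriloggerTest?code=<function-key>&name=Azure"

The response should be "Hello, Azure. This HTTP triggered function executed successfully."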

Switch to the Monitor section of the function and check the App Insights logs to view the logs in near real-time. Notice how the logs are colored based on log severity, i.e. warning, information, error, etc.

Azure function log Monitor

You can also use formatters to push log events as JSON or in another custom template form, which some third-party log aggregator services and sink targets require.
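
For instance, the Serilog.Formatting.Compact package already referenced in the .csproj above provides compact JSON formatters. A minimal sketch (using the console sink, which assumes the Serilog.Sinks.Console package, purely for demonstration):

using Serilog;
using Serilog.Formatting.Compact;

class JsonFormattingDemo
{
    static void Main()
    {
        // RenderedCompactJsonFormatter emits one JSON object per log event
        var logger = new LoggerConfiguration()
            .WriteTo.Console(new RenderedCompactJsonFormatter())
            .CreateLogger();

        logger.Information("Order {OrderId} processed in {Elapsed} ms", 42, 17);
        // Output is a single JSON line per event, e.g. {"@t":"...","@m":"Order 42 processed in 17 ms",...}
    }
}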
