Using Dependency Injection in Azure Functions

Summary

In Azure Functions 1.0, it was not possible to use Dependency Injection (DI), a technique for achieving Inversion of Control (IoC) between classes and their dependencies. Azure Functions 2.0 supports DI, and with it come loosely coupled architectures, separated responsibilities, manageable units of code, and reduced cyclomatic complexity.

In this post, I’ll demonstrate how to implement DI in a small project that uses Azure Blob Storage.

Azure Application Architecture

The purpose of this post is to demonstrate DI. I will not be creating any business logic.

The architecture is as follows:

  1. A file is uploaded to the raw Blob container.
  2. An Azure Blob Storage trigger binds the function to blob-created events in that container.
  3. Business logic runs, and a transformed file is created in the transformed container.

Note that I will not be using Azure Event Grid, which is an event-driven (Publisher/Subscriber) architecture. I'll be using Blob Triggers, which follow a Polling Consumer architecture: the runtime periodically scans the container for new blobs, so it may take up to 10 minutes or so after an upload before the function is triggered and the output file is generated.

Azure Blob Storage Project

  • Create a new Azure Resource Group.
  • Create a new Azure Storage Account in the Resource Group.
  • In the Storage Account, create two containers: raw and transformed. These become the ‘zones’ for managing raw and transformed data.
  • Copy the storage connection string from the Access Keys blade (Key1). A SAS token can be used if preferred.
  • Create a new Azure Functions Visual Studio project with the BlobTrigger event type, and name the project AzureDataTransformation. This scaffolds a new base Functions class.
  • The base class should be created as follows. Note that the Connection property must name an app setting (here, BlobConnection); never paste a raw connection string containing an account key into source code:
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace AzureDataTransformation
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([BlobTrigger("raw/{name}", Connection = "DefaultEndpointsProtocol=https;AccountName=bscdatatransform;AccountKey=KaGxiw4mLVFdwaU7ZR3Vom39DCPf/SDP8os2YFWbKjvYQ4GdijDNAXnej0DliViPFeyWaBR0ZiCes1gE9d19+A==;EndpointSuffix=core.windows.net")]Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }
}
  • Build the project to ensure that assemblies have been added correctly.
  • Refactor so that the namespace is AzureDataTransformation, the class is DataTransformFunction, and the function name is Transform.
  • Add the Serilog NuGet package for better logging of events (a minimal configuration sketch follows the code block below).
  • Update any dependent packages.
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace AzureDataTransformation
{
    public static class DataTransformFunction
    {
        [FunctionName("Transform")]
        public static void Run([BlobTrigger("raw/{name}", Connection = "DefaultEndpointsProtocol=https;AccountName=bscdatatransform;AccountKey=KaGxiw4mLVFdwaU7ZR3Vom39DCPf/SDP8os2YFWbKjvYQ4GdijDNAXnej0DliViPFeyWaBR0ZiCes1gE9d19+A==;EndpointSuffix=core.windows.net")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }
}
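Serilog is not configured anywhere else in this post; as a minimal sketch, assuming the Serilog and Serilog.Sinks.Console NuGet packages, a logger can be bootstrapped once at startup:

using Serilog;

// Minimal Serilog bootstrap (assumes the Serilog and Serilog.Sinks.Console
// NuGet packages). Call once at startup before any logging occurs.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    .WriteTo.Console()
    .CreateLogger();

Log.Information("Serilog configured for AzureDataTransformation");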
  • Add the connection string to the local.settings.json file, which initially resembles the following. Different connection string values can be used per environment (Dev, Staging, Integration, etc.); once deployed, these entries surface as the Function App's application settings/environment variables.
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet"
    }
}

Update the file to include a named connection string. The key name (BlobConnection) must match the name passed to the StorageAccount attribute added below; each environment supplies its own value for the same key:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "BlobConnection": "DefaultEndpointsProtocol=https;AccountName=rg01bd17;......"
  }
}
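Entries under Values surface as environment variables both locally and once deployed, so the raw value can also be read in code when it is needed outside a binding; a small sketch, usable inside any method:

using System;

// "Values" entries (locally) and App Settings (in Azure) surface as
// environment variables, so the raw string can be read directly if needed.
string blobConnection = Environment.GetEnvironmentVariable("BlobConnection");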
  • Reference the connection setting at the class level with the StorageAccount attribute:
[StorageAccount("BlobConnection")]
  • The base class should now look like this:
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace AzureDataTransformation
{
    [StorageAccount("BlobConnection")]
    public static class DataTransformFunction
    {
        [FunctionName("Transform")]
        public static void Run([BlobTrigger("raw/{name}", Connection = "DefaultEndpointsProtocol=https;AccountName=bscdatatransform;AccountKey=KaGxiw4mLVFdwaU7ZR3Vom39DCPf/SDP8os2YFWbKjvYQ4GdijDNAXnej0DliViPFeyWaBR0ZiCes1gE9d19+A==;EndpointSuffix=core.windows.net")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }
}
  • Add a new folder called Services to the Visual Studio project.
  • Create a new C# interface called IDataTransform.cs in the Services folder.
  • Make the interface public:
using System;
using System.Collections.Generic;
using System.Text;

namespace AzureDataTransformation.Services
{
    public interface IDataTransform
    {
    }
}
  • Add a DataTransformationService method to the IDataTransform interface. The method takes two parameters (Stream input, TextWriter output):
using System.IO;

namespace AzureDataTransformation.Services
{
    public interface IDataTransform
    {
        void DataTransformationService(Stream input, TextWriter output);
    }
}
  • Create a new C# implementation class called DataTransform.cs that implements the IDataTransform interface. Save it in the Services folder and set the class to public.
using System.IO;

namespace AzureDataTransformation.Services
{
    public class DataTransform : IDataTransform
    {
        public void DataTransformationService(Stream input, TextWriter output)
        {
            // Add Business Logic Here
        }
    }
}
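The service body is intentionally left empty. As a placeholder, here is a minimal sketch that upper-cases the incoming blob line by line; substitute whatever transformation your scenario actually requires:

using System.IO;

namespace AzureDataTransformation.Services
{
    public class DataTransform : IDataTransform
    {
        // Placeholder logic: copies the input blob to the output blob,
        // upper-casing each line. Replace with real business logic.
        public void DataTransformationService(Stream input, TextWriter output)
        {
            using (var reader = new StreamReader(input))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    output.WriteLine(line.ToUpperInvariant());
                }
            }
        }
    }
}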

Add Dependency Injection to Startup.cs

We are going to add dependency injection to ensure that our classes are not tightly coupled.

  • Add dependency injection by adding a new class called Startup.cs to the project.
  • Add the NuGet packages that enable dependency injection: Microsoft.Azure.Functions.Extensions and Microsoft.Extensions.DependencyInjection.
  • In the new Startup.cs class, add the assembly-level FunctionsStartup attribute:
[assembly: FunctionsStartup(typeof(AzureDataTransformation.Startup))]
  • Update the Startup.cs code with the following:
using AzureDataTransformation.Services;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(AzureDataTransformation.Startup))]

namespace AzureDataTransformation
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddSingleton<IDataTransform, DataTransform>();
        }
    }
}
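AddSingleton is appropriate here because DataTransform is stateless. If a service held per-invocation state, a different lifetime could be registered instead; for example, inside Configure:

// Alternative service lifetimes (pick one based on whether the service holds state):
builder.Services.AddScoped<IDataTransform, DataTransform>();    // one instance per function invocation scope
builder.Services.AddTransient<IDataTransform, DataTransform>(); // new instance on every resolve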

DataTransformFunction.cs

  • Change the DataTransformFunction.cs class to be non-static by removing the static modifier from both the class and the Run method.

From this:

using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace AzureDataTransformation
{
    [StorageAccount("BlobConnection")]
    public static class DataTransformFunction
    {
        [FunctionName("Transform")]
        public static void Run([BlobTrigger("raw/{name}", Connection = "DefaultEndpointsProtocol=https;AccountName=bscdatatransform;AccountKey=KaGxiw4mLVFdwaU7ZR3Vom39DCPf/SDP8os2YFWbKjvYQ4GdijDNAXnej0DliViPFeyWaBR0ZiCes1gE9d19+A==;EndpointSuffix=core.windows.net")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }
}

To this:

using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace AzureDataTransformation
{
    [StorageAccount("BlobConnection")]
    public class DataTransformFunction
    {
        [FunctionName("Transform")]
        public void Run([BlobTrigger("raw/{name}", Connection = "DefaultEndpointsProtocol=https;AccountName=bscdatatransform;AccountKey=KaGxiw4mLVFdwaU7ZR3Vom39DCPf/SDP8os2YFWbKjvYQ4GdijDNAXnej0DliViPFeyWaBR0ZiCes1gE9d19+A==;EndpointSuffix=core.windows.net")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }
}
  • Inject the service into the Function class through its constructor:

private readonly IDataTransform _dataTransform;

public DataTransformFunction(IDataTransform dataTransform)
    => _dataTransform = dataTransform;
  • Wire the injected service into the Run method, together with a Blob output binding for the transformed container. The final code should look like this:
using AzureDataTransformation.Services;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using System;
using System.IO;

namespace AzureDataTransformation
{
    [StorageAccount("BlobConnection")]
    public class DataTransformFunction
    {
        private readonly IDataTransform _dataTransform;

        public DataTransformFunction(IDataTransform dataTransform)
           => this._dataTransform = dataTransform;

        [FunctionName("Transform")]
        public void Run(
            [BlobTrigger("raw/{name}")] Stream inputBlob,
            [Blob("transformed/{name}", FileAccess.Write)] TextWriter outputBlob,
            string name,
            ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {inputBlob.Length} Bytes");

            try
            {
                _dataTransform.DataTransformationService(inputBlob, outputBlob);

                log.LogInformation("File completed processing");
            }
            catch (Exception e)
            {
                log.LogError("Processing failed", e);
            }
        }
    }
}
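A payoff of constructor injection is testability: the function can be exercised without touching Azure at all. A minimal sketch, assuming xUnit and a hand-rolled fake (the test names below are illustrative):

using System.IO;
using AzureDataTransformation;
using AzureDataTransformation.Services;
using Microsoft.Extensions.Logging.Abstractions;
using Xunit;

public class DataTransformFunctionTests
{
    // Hand-rolled fake standing in for the real service.
    private class FakeDataTransform : IDataTransform
    {
        public bool WasCalled;

        public void DataTransformationService(Stream input, TextWriter output)
            => WasCalled = true;
    }

    [Fact]
    public void Run_InvokesTheTransformService()
    {
        var fake = new FakeDataTransform();
        var function = new DataTransformFunction(fake);

        using (var input = new MemoryStream(new byte[] { 1, 2, 3 }))
        using (var output = new StringWriter())
        {
            // NullLogger satisfies the ILogger parameter without real logging.
            function.Run(input, output, "sample.txt", NullLogger.Instance);
        }

        Assert.True(fake.WasCalled);
    }
}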

Once complete, deploy to Azure.

Publish

  1. Publish the Azure Functions to Azure.
  2. Configure Azure Application Insights.

Environment Configurations

In the Function App's Configuration blade, add a new application setting whose name matches the binding's setting name (BlobConnection) and whose value is the production storage connection string.

Save and close.

Testing

Upload a document to the raw container. Once the trigger fires, the business logic you added to the DataTransform.cs class executes and a new file is created in the ‘transformed’ container.
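To script the upload instead of using the portal, here is a small sketch using the v11 storage SDK (assuming the Microsoft.Azure.Storage.Blob NuGet package; the class and parameter names are illustrative):

using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

public static class UploadSample
{
    // Uploads a local file to the 'raw' container, which fires the Transform
    // function once the blob trigger's polling picks the new blob up.
    public static async Task UploadAsync(string connectionString, string localPath, string blobName)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("raw");
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

        await blob.UploadFromFileAsync(localPath);
    }
}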