Azure Active Directory – Capturing Events in Azure Event Hubs


Summary

Azure Active Directory doesn’t emit events when objects are created. This means that if you want to run an event-based action on the creation of a new User or Group in AAD, you have to use other Azure resources to accomplish this.

A recent project that I worked on required an event to be triggered when new Users were created in Azure Active Directory. Microsoft’s Power Automate and Logic Apps do not have triggers that capture events emitted by Azure Active Directory, so a few alternatives were considered:

  1. Use Azure Event Hubs to capture new User creation and act on the output.
  2. Use a scheduled polling approach with Logic Apps or Power Automate.
  3. Use Azure Monitor to parse logs.

In this post, I’m going to demonstrate how to use Event Hubs to stream log data from Azure AD. This data will then be polled at intervals using Logic Apps to extract the details of newly created Users.

Using Azure Event Hubs

Create a Resource Group

Create a new Resource Group in Azure. The resource group location doesn’t strictly matter, but the closer the region is to your location, the better.

Create an Event Hubs Namespace

Create a new Event Hubs namespace. For this example, I’ve created a namespace on the lowest tier.

Once the Event Hubs namespace has been created, you should be presented with a screen that looks like this:

The message count should be zero at this point. We’re going to populate this soon with log data. Depending on the throughput units, messages may take anywhere from 1 to 15 minutes to appear.

Export the Audit Logs to Event Hubs

We’re going to query Audit log data using the event-processing query tool. First, the data needs to be exported: go to Azure Active Directory > Monitoring > Export Settings.

Once in the Export Data Settings, click + Add diagnostic setting and then configure as required. Note that I am only streaming AuditLogs and not the SigninLogs.

This is all that is required in Active Directory. Future logs should now be sent to the Event Hubs namespace.

Process Data in Events Hub (optional)

This is an optional step. If you want to export the messages into persistent storage such as Azure Blob Storage or Azure Data Lake Storage, then you would need to set this up here.

Go to the Event Hubs namespace that you created earlier. Click on Process data and then Explore.

Wait for the Input data to load. Once the data has loaded, you should see the log files in the input preview.

The Process data experience uses the Stream Analytics query language. The documentation for this language can be found on the Microsoft Azure Docs site.

Audit Logs hold very specific data. You can read more about the data here: Stream Azure platform logs to Azure Event Hubs

For this example, we need the value that sits in userPrincipalName:

  {
    "records": [
      {
        "time": "2020-05-05T10:45:04.9999380Z",
        "resourceId": "/tenants/cb5b6a12-9b46-476e-b5a7-6ff821a37838/providers/Microsoft.aadiam",
        "operationName": "Add user",
        "operationVersion": "1.0",
        "category": "AuditLogs",
        "tenantId": "cb5b6a12-9b46-476e-b5a7-6ff821a37838",
        "resultSignature": "None",
        "durationMs": 0,
        "callerIpAddress": "40.81.156.112",
        "correlationId": "be897c47-73e1-4952-ada3-2725410d7464",
        "level": "Informational",
        "properties": {
          "id": "Directory_be897c47-73e1-4952-ada3-2725410d7464_O3LFF_92495974",
          "category": "UserManagement",
          "correlationId": "be897c47-73e1-4952-ada3-2725410d7464",
          "result": "success",
          "resultReason": "",
          "activityDisplayName": "Add user",
          "activityDateTime": "2020-05-05T10:45:04.9999380",
          "loggedByService": "Core Directory",
          "operationType": "Add",
          "initiatedBy": {
            "user": {
              "id": "2da1f70c-5e5f-4f05-8c96-39a670d9ea09",
              "displayName": null,
              "userPrincipalName": "pnixon2_outlook.com#EXT#@pnixon2outlook.onmicrosoft.com",
              "ipAddress": "40.81.156.112",
              "roles": []
            }
          },
          "targetResources": [
            {
              "id": "29f083e6-1088-4c60-942e-7b222263e10c",
              "displayName": null,
              "type": "User",
              "userPrincipalName": "Marcus.Aurelio@pnixon2outlook.onmicrosoft.com",
              "modifiedProperties": [
                {
                  "displayName": "AccountEnabled",
                  "oldValue": "[]",
                  "newValue": "[true]"
                },
                {
                  "displayName": "StsRefreshTokensValidFrom",
                  "oldValue": "[]",
                  "newValue": "[\"2020-05-05T10:45:04Z\"]"
                },
                {
                  "displayName": "UserPrincipalName",
                  "oldValue": "[]",
                  "newValue": "[\"Marcus.Aurelio@pnixon2outlook.onmicrosoft.com\"]"
                },
                {
                  "displayName": "UserType",
                  "oldValue": "[]",
                  "newValue": "[\"Member\"]"
                },
                {
                  "displayName": "Included Updated Properties",
                  "oldValue": null,
                  "newValue": "\"AccountEnabled, StsRefreshTokensValidFrom, UserPrincipalName, UserType\""
                }
              ],
              "administrativeUnits": []
            }
          ],
          "additionalDetails": []
        }
      }
    ],
    "EventProcessedUtcTime": "2020-05-05T10:57:36.5035270Z",
    "PartitionId": 2,
    "EventEnqueuedUtcTime": "2020-05-05T10:49:17.7730000Z"
  }
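To make the extraction concrete, here is a minimal Python sketch (not part of the Azure pipeline, just an illustration) that pulls the new User’s userPrincipalName out of a record shaped like the one above, trimmed to the relevant fields:

```python
import json

# Sample shape based on the AuditLogs record above, trimmed to the fields we care about.
event = json.loads("""
{
  "records": [
    {
      "operationName": "Add user",
      "properties": {
        "targetResources": [
          {
            "type": "User",
            "userPrincipalName": "Marcus.Aurelio@pnixon2outlook.onmicrosoft.com"
          }
        ]
      }
    }
  ]
}
""")

def new_user_upns(event):
    """Return the userPrincipalName of each User created in an AuditLogs event."""
    upns = []
    for record in event.get("records", []):
        if record.get("operationName") != "Add user":
            continue
        for target in record["properties"].get("targetResources", []):
            if target.get("type") == "User":
                upns.append(target["userPrincipalName"])
    return upns

print(new_user_upns(event))  # → ['Marcus.Aurelio@pnixon2outlook.onmicrosoft.com']
```

Note that the created User’s UPN lives under targetResources, while initiatedBy.user holds the administrator who performed the creation.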

Here is the query to capture the data we need.

SELECT
    records.ArrayValue.properties AS properties
INTO
    [output-a]
FROM
    [insights-logs-auditlogs] AS e
CROSS APPLY GetArrayElements(e.records) AS records

SELECT
    properties.targetResources
INTO
    [output-b]
FROM
    [output-a]
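If it helps to see what the two steps are doing outside of Stream Analytics, here is a rough Python equivalent (the input shape is a hypothetical, trimmed version of the sample record above): step 1 flattens the records array, much like CROSS APPLY GetArrayElements, and projects the nested properties object; step 2 projects targetResources from that result.

```python
# Hypothetical batch of events in the shape produced by the AuditLogs export.
events = [
    {"records": [
        {"properties": {"targetResources": [{"type": "User",
            "userPrincipalName": "Marcus.Aurelio@pnixon2outlook.onmicrosoft.com"}]}},
    ]},
]

# Step 1: flatten the "records" array and keep each record's "properties"
# (the analogue of [insights-logs-auditlogs] -> [output-a]).
output_a = [record["properties"] for event in events for record in event["records"]]

# Step 2: project targetResources (the analogue of [output-a] -> [output-b]).
output_b = [props["targetResources"] for props in output_a]

print(output_b)
```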

The Test Results should be successful.

Event Hubs Message Consumption

Messages can be consumed from Event Hubs using a variety of strategies, but to keep it simple I am going to use Logic Apps.

My Logic Apps design is as follows:

I set the interval frequency to 1 minute, but this can easily be amended to cater for a variety of scenarios.

I added an extra Action (Initialise variable) to capture the message contents. This is again for demonstration purposes:

With the Logic App now complete, I ran a simple test by creating a new User in Azure AD.

Shortly after, I noticed that my Logic App had triggered, and my message was visible in the Content variable.

The content included both the Add user and UserPrincipalName objects. At this point, I am free to do what I need to with the full JSON object.

Final thoughts

In my use case, I needed to capture new Users created in Azure AD without having to poll Azure AD every few minutes. Event Hubs enables this by explicitly exporting log data that can then be consumed by a variety of tools.

I chose the lowest Event Hubs tier to keep my costs relatively low. However, I did notice a delay between object creation in Azure AD and my Logic App being triggered. The delay was around 6 minutes, even with the Logic App set to a 1-minute polling frequency.

What are the costs associated with the above? I ran this through the Azure Calculator based on my needs. The cost estimation was:

From previous experience, the indicative costs seem very close.

There are ways to optimise this. For example, I could create a query that sends only User creation events to Azure Blob Storage, rather than everything. From that point, I could use Logic Apps or Azure Functions to execute commands based on those logs.
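As a sketch of that optimisation, a consumer could drop everything except "Add user" operations before anything is written to storage. This is a hypothetical helper, assuming records shaped like the sample shown earlier:

```python
def user_creation_only(records):
    """Keep only AuditLogs records for new User creation ('Add user')."""
    return [r for r in records if r.get("operationName") == "Add user"]

# Hypothetical mix of audit records; only the first should survive the filter.
records = [
    {"operationName": "Add user", "category": "AuditLogs"},
    {"operationName": "Update group", "category": "AuditLogs"},
]

print(user_creation_only(records))  # only the "Add user" record remains
```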

A final question that some would ask: why couldn’t Logic Apps simply poll Azure Active Directory directly at regular intervals to pick up newly created Users?
