API Gateway Design Patterns: Proxy & Edge

Summary

The API Proxy Pattern serves as an intermediary, enhancing individual API interactions with functionality such as rate limiting, logging, and security checks to ensure smoother, more secure data exchanges between clients and target APIs. The API Edge Pattern, by contrast, is common in microservices architectures: it offers a unified entry point for external clients, abstracts the system’s underlying services, and handles everything from routing and request composition to resilience measures. Both patterns play pivotal roles in streamlining and securing API communications, but they cater to slightly different architectural needs and challenges.

Proxy Pattern

The Proxy Pattern involves the API Gateway acting as a proxy between the client and backend services. The API Gateway receives requests from the client, forwards them to the appropriate backend service, and returns the response to the client.

First, create a new project using the .NET 6 Minimal API template.

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.ApiManagement.AspNetCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddAzureApiManagement();

var app = builder.Build();

app.UseAzureApiManagement();

// Catch-all proxy endpoint: {**path} captures everything after /proxy/.
app.MapGet("/proxy/{**path}", async (HttpContext context, string path) =>
{
    // Resolve the backend URL for this request via the API Management service.
    var apiManagement = context.RequestServices.GetRequiredService<IApiManagementService>();
    var request = context.Request.ToHttpRequestMessage();
    request.RequestUri = new Uri(apiManagement.GetBackendUrl(request, path));

    // Forward the request to the backend and relay its response to the client.
    // Note: in production, prefer IHttpClientFactory over per-request HttpClient instances.
    using var httpClient = new HttpClient();
    using var response = await httpClient.SendAsync(request);
    var responseContent = await response.Content.ReadAsStringAsync();

    context.Response.StatusCode = (int)response.StatusCode;
    await context.Response.WriteAsync(responseContent);
});

app.Run();

In this example, we’re defining a single endpoint for requests to the “/proxy” URL using the MapGet method. The {**path} parameter captures the entire URL path after “/proxy”.

Inside the endpoint handler, we’re using the IApiManagementService interface provided by the Azure API Management middleware to get the backend URL for the request. We’re then forwarding the request to the backend API using an HttpClient instance and returning the response to the client.

This will start the app and listen for requests on http://localhost:5000. If you navigate to http://localhost:5000/proxy/<path>, where <path> is the route that API Management maps to your backend API, you should see the response from the backend API returned.

You can add additional endpoints and middleware to the app using the fluent API provided by the WebApplication class. For example, you could add authentication middleware, routing middleware, or logging middleware.
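For instance, a minimal logging middleware can be registered inline with `app.Use` in Program.cs, before the MapGet call (a sketch using only the built-in ASP.NET Core hooks; the "ProxyGateway" logger category name is illustrative):

```csharp
// Placed before app.MapGet(...), this inline middleware logs every
// request before it is proxied and every response status afterwards.
// Requires: using Microsoft.Extensions.DependencyInjection;
//           using Microsoft.Extensions.Logging;
app.Use(async (context, next) =>
{
    var logger = context.RequestServices
        .GetRequiredService<ILoggerFactory>()
        .CreateLogger("ProxyGateway"); // illustrative category name

    logger.LogInformation("Incoming {Method} {Path}", context.Request.Method, context.Request.Path);
    await next();
    logger.LogInformation("Responded with status {StatusCode}", context.Response.StatusCode);
});
```

Because middleware runs in registration order, anything logged here brackets the proxying work done by the endpoint handler.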

Microsoft Azure & APIM Implementation

  1. Intermediary Layer – Azure API Gateway acts as an intermediary layer between clients and backend services. To get started, you can create an API gateway instance on the Azure portal and define API endpoints that map to your backend services. It supports various protocols, including HTTP, HTTPS, gRPC, and WebSocket, making it easy to integrate with different types of services.
  2. Protocol Translation – With Azure API Gateway, you can seamlessly translate between different protocols. For instance, if your backend services use gRPC, but your clients communicate over HTTP, the API gateway can handle the conversion between the two. This ensures compatibility and a smooth communication flow between clients and services.
  3. Security and Authentication – Security is of paramount importance in any API management strategy. Azure API Gateway offers robust authentication mechanisms, allowing you to secure your APIs with API keys, OAuth, or Azure Active Directory (Azure AD) tokens. You can easily configure access controls to restrict access to authorized users only.
  4. Rate Limiting and Throttling – To prevent abuse and ensure fair usage of resources, you can implement rate limiting and throttling policies in Azure API Gateway. With just a few clicks, you can set limits on the number of requests a client can make within a specific time window, protecting your backend services from overloading.
  5. Caching – Caching can significantly improve API performance by reducing the load on backend servers. Azure API Gateway supports response caching, allowing you to cache API responses at the gateway level. You can customize caching behaviour based on your API requirements and control cache duration to optimize data delivery.
  6. Monitoring and Analytics – Azure API Gateway provides comprehensive monitoring and analytics features. You can leverage Azure Monitor to gain insights into API usage, performance metrics, and error rates. Additionally, Application Insights integration enables detailed logging and troubleshooting capabilities.
  7. Transformation and Aggregation – Implementing data transformation and aggregation is made easy with Azure API Gateway’s policy-based approach. You can use policies to modify requests and responses, aggregate data from multiple backend services, and present it to clients in a unified format, reducing client-side complexities.
  8. Error Handling – Azure API Gateway allows you to define custom error-handling policies to provide meaningful error messages to clients. You can handle exceptions gracefully and ensure a positive user experience even in case of failures.
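Several of the capabilities above (rate limiting, caching, error handling) are configured declaratively in API Management’s policy XML. A representative policy document might look like the following sketch; the limits and durations are illustrative, not recommendations:

```xml
<policies>
    <inbound>
        <base />
        <!-- Rate limiting: at most 100 calls per 60-second renewal window -->
        <rate-limit calls="100" renewal-period="60" />
        <!-- Caching: serve a cached response when one is available -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- Cache successful responses at the gateway for 5 minutes -->
        <cache-store duration="300" />
    </outbound>
    <on-error>
        <base />
        <!-- Error handling: return a meaningful message instead of a raw failure -->
        <set-status code="500" reason="Internal Server Error" />
        <set-body>{"error": "The request could not be processed. Please try again later."}</set-body>
    </on-error>
</policies>
```

Policies are scoped, so the same XML structure can be applied globally, per API, or per operation.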

Proxy Design Pattern using Azure HTTP Function:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.ApiManagement.AspNetCore;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class HttpTrigger
{
    [FunctionName("HttpTrigger")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", "put", "delete", Route = "{**path}")] HttpRequest req,
        ILogger log)
    {
        // Resolve the backend URL for this request via the API Management service.
        var apiManagement = req.HttpContext.RequestServices.GetRequiredService<IApiManagementService>();
        var request = req.ToHttpRequestMessage();
        request.RequestUri = new Uri(apiManagement.GetBackendUrl(request, req.Path));

        // Forward the request to the backend API.
        using var httpClient = new HttpClient();
        using var response = await httpClient.SendAsync(request);
        var responseContent = await response.Content.ReadAsStringAsync();

        // Relay the backend's content, content type, and status code to the client.
        return new ContentResult
        {
            Content = responseContent,
            ContentType = response.Content.Headers.ContentType?.MediaType,
            StatusCode = (int)response.StatusCode
        };
    }
}

The above code implements the API Proxy pattern using Azure Functions and the Azure API Management service.

Here’s a brief explanation of what is happening in the code:

  1. The [HttpTrigger] attribute on the Run method specifies that this is an HTTP trigger function that can handle requests for any HTTP method and any URL path using the Route parameter.
  2. The IApiManagementService interface provided by the Azure API Management middleware is used to get the backend URL for the request.
  3. The incoming HTTP request is converted to an HttpRequestMessage object, and the backend URL obtained from the IApiManagementService interface is used to set the RequestUri property of the HttpRequestMessage.
  4. A new HttpClient instance is created, and the SendAsync method is called to send the HttpRequestMessage to the backend API.
  5. The response from the backend API is converted to a string using the ReadAsStringAsync method.
  6. The response is returned to the client in the form of an IActionResult, which is created using the ContentResult class. The ContentResult contains the response content, content type, and status code obtained from the response from the backend API.

In summary, the code is using the Azure API Management middleware to proxy incoming HTTP requests to a backend API. The middleware is responsible for obtaining the backend URL, while the Azure Function is responsible for forwarding the request and returning the response to the client.

Edge Gateway Pattern

The Edge Gateway Pattern involves placing the API Gateway at the edge of the network, typically in a DMZ, to provide a secure entry point for external clients to access internal services.
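As a sketch of what such an edge gateway can look like in .NET, the open-source YARP reverse proxy (the `Yarp.ReverseProxy` NuGet package) can serve as the single external entry point; the "ReverseProxy" configuration section name below follows YARP’s conventions, while the route-to-service mappings themselves are assumed to live in appsettings.json:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Routes and clusters are loaded from the "ReverseProxy" configuration section,
// e.g. appsettings.json entries mapping /orders/* to the internal orders service.
builder.Services.AddReverseProxy()
    .LoadFromConfig(builder.Configuration.GetSection("ReverseProxy"));

var app = builder.Build();

// All external traffic enters here; internal services in the private network
// are never exposed to clients directly.
app.MapReverseProxy();

app.Run();
```

Deployed in the DMZ, this process is the only host reachable from outside, which is exactly the secure-entry-point property the Edge Gateway Pattern calls for.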

Microgateway API Architecture

The Microgateway Architecture is a design approach where the traditional monolithic API gateway is broken down into smaller, lightweight components known as “microgateways.” Each microgateway is deployed in close proximity to individual microservices, providing a dedicated entry point for a specific service.

Key Characteristics:

  1. Lightweight and Focused: Unlike a centralized API gateway that handles multiple responsibilities for all microservices, a microgateway is lightweight and focuses only on the specific concerns of its associated microservice.
  2. Decentralized: In a microgateway architecture, the API gateway functionality is distributed across the microservices. Each microservice has its own microgateway, and they work independently.
  3. Single Responsibility Principle (SRP): Following the SRP, each microgateway is responsible for a specific set of tasks related to its microservice, such as authentication, request/response transformation, and rate limiting.
  4. Reduced Bottlenecks: Since each microservice has its own microgateway, the potential performance bottlenecks of a centralized API gateway are mitigated, and the microgateways can scale independently based on the load of their associated microservice.
  5. Focused Security Concerns: Microgateways handle security concerns at the individual service level, allowing for fine-grained control over authentication and authorization for each microservice.
  6. Isolation: A failure in one microgateway does not affect other microgateways, providing better isolation and resilience.

Use Cases: The microgateway architecture is particularly suitable for complex microservices architectures and large-scale systems where there is a need for fine-grained control over each service’s API management. It can be beneficial when:

  • Each microservice requires different authentication and authorization mechanisms.
  • Specific microservices need customized rate limiting or caching strategies.
  • Different microservices communicate over various protocols or message formats.
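To make the per-service customization concrete, here is a minimal sketch of a rate limiter that a single microgateway might apply to its own service; the class name and limits are illustrative, not part of any Azure SDK:

```csharp
using System;

// A simple fixed-window rate limiter: at most `limit` requests per `window`.
// A real microgateway would keep one instance per client key
// (e.g. API key or client IP) for its single associated service.
public class FixedWindowRateLimiter
{
    private readonly int _limit;
    private readonly TimeSpan _window;
    private DateTime _windowStart; // defaults to DateTime.MinValue, so the first call opens a window
    private int _count;

    public FixedWindowRateLimiter(int limit, TimeSpan window)
    {
        _limit = limit;
        _window = window;
    }

    public bool TryAcquire(DateTime now)
    {
        if (now - _windowStart >= _window)
        {
            // New window: reset the counter.
            _windowStart = now;
            _count = 0;
        }

        if (_count >= _limit)
            return false; // over the limit: the gateway would respond 429 Too Many Requests

        _count++;
        return true;
    }
}
```

A microgateway for one service could instantiate this with limits tuned to that service alone, while a neighboring service’s microgateway uses entirely different settings — the fine-grained control described above.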
