Adding Swagger (OpenAPI) Documentation to Azure Functions — .NET 8 Isolated Worker Model

Author: Jayakumar Srinivasan

Estimated Reading Time: 8 minutes

1. Introduction: Why Swagger Matters for Azure Functions

Most enterprises today rely heavily on Azure Functions to power their API and integration ecosystem.
However, as the number of functions grows, one challenge becomes painfully clear — discoverability.

Internal teams — developers, QA, and integration engineers — often struggle to understand:

  • What endpoints exist?
  • What payload formats are expected?
  • How should they test integrations before backend availability?

Traditionally, Swagger (OpenAPI) solved this for Web APIs — but when it came to Azure Functions, documentation was either manual or non-existent.

This article walks through how we solved this in our enterprise by enabling Swagger documentation for Azure Functions built on the Isolated Worker Model (.NET 8) using Swashbuckle and OpenAPI attributes.


2. Understanding the Azure Function Isolated Worker Model

With .NET 8, the Isolated Worker Model has become the recommended way to build Azure Functions.

It runs your code out of process from the Azure Functions runtime, giving you:

  • Full control over dependency injection
  • Custom middleware
  • Richer debugging experience
  • Compatibility with standard ASP.NET middleware and OpenAPI extensions

If you’ve migrated from the in-process model, you’ll quickly realize that the default OpenAPI integration no longer applies.
That’s exactly where Swashbuckle bridges the gap.


3. Required NuGet Packages

Install the following packages from NuGet:

Microsoft.Azure.Functions.Worker.Extensions.OpenApi

💡 Note: The Microsoft.Azure.Functions.Worker.Extensions.OpenApi package adds the OpenAPI binding attributes, while Swashbuckle.AspNetCore provides the Swagger generation capabilities.

4. Annotating Your Azure Function with OpenAPI Attributes

Here’s a sample Azure Function showcasing OpenAPI annotations:

using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Azure.WebJobs.Extensions.OpenApi.Core.Attributes;
using Microsoft.OpenApi.Models;

public class CustomerFunctions
{
    [Function("GetCustomerById")]
    [OpenApiOperation(operationId: "GetCustomerById", tags: new[] { "Customer" })]
    [OpenApiParameter(name: "id", In = ParameterLocation.Path, Required = true, Type = typeof(string), Description = "The Customer ID")]
    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(Customer), Description = "Customer details")]
    public HttpResponseData Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "customer/{id}")] HttpRequestData req, 
        string id)
    {
        var response = req.CreateResponse(HttpStatusCode.OK);
        var customer = new Customer { Id = id, Name = "John Doe", Country = "Sweden" };
        response.WriteAsJsonAsync(customer);
        return response;
    }
}

public class Customer
{
    public string Id { get; set; }
    public string Name { get; set; }
    public string Country { get; set; }
}

🧠 Tip: You can apply the [OpenApiRequestBody], [OpenApiResponseWithBody], and [OpenApiParameter] attributes to define schema and descriptions just like in traditional ASP.NET Core APIs.

5. Hosting Swagger UI in Azure Functions

Once configured, you can navigate to:

https://<your-function-app>.azurewebsites.net/api/swagger/ui

Please be aware that you need to enable CORS on the Function to view the swagger documentation. There are many ways to secure your Swagger docuentation which will bea a separate article and I will not discuss much about it here. However using CORS in PROD environment should be avoided.

You’ll see a fully interactive Swagger UI showing:

  • All function endpoints
  • Input/output parameters
  • Response models
  • Sample payloads

6. Real-world Benefits We Observed

After implementing Swagger documentation for our Azure Function ecosystem:

  • 70% fewer API-related support tickets
  • Faster onboarding for new developers and QA engineers
  • Zero dependency on manually shared Postman collections
  • Instant clarity for consuming systems and teams

This approach simplified not just documentation — it enhanced collaboration, maintainability, and confidence across teams.


7. Lessons Learned and Best Practices

  1. Keep your OpenAPI annotations consistent across functions.
  2. Always version your APIs and reflect them in Swagger tags.
  3. Automate your Swagger JSON publishing during CI/CD so that documentation is always fresh.
  4. Add OpenAPI authentication if your environment demands security compliance.

8. Summary

Enabling Swagger for Azure Functions in the Isolated Worker Model may seem like a small addition — but it changes how teams understand and consume APIs.

By embedding discoverability directly into the function, you empower everyone — developers, testers, and architects alike — to work with clarity.

“True scalability isn’t about building more APIs — it’s about helping people use them better.”

Hidden Superpowers of Azure Service Bus Features You Might Have Missed!

Azure Service Bus is More Than Just Queues & Topics—Discover Its Hidden Superpowers!

Azure Service Bus is far more than just a simple messaging queue—it’s a sophisticated enterprise messaging backbone that can handle complex cloud architectures with ease. While most developers use its basic queue and topic functionality, the platform offers powerful advanced features that can dramatically improve your application’s reliability, scalability, and performance.

In this comprehensive guide, we’ll explore:
Underutilized advanced features with practical C# examples (all officially documented)
Battle-tested best practices for Queues, Topics, Subscriptions and Security
Professional optimization techniques used in production environments

Advanced Features with C# Code Snippets (Officially Documented)

1️⃣ Auto-Forwarding – Chain Queues/Topics Seamlessly

Auto-forwarding creates powerful message pipelines by automatically routing messages from one queue or subscription to another destination. This is particularly useful for:

  • Creating processing workflows where messages move through different stages
  • Implementing fan-out patterns to multiple endpoints
  • Building dead-letter queue processing systems
// Create a queue with auto-forwarding to another queue
var queueDescription = new QueueDescription("source-queue")
{
    // Automatic forwarding target
    ForwardTo = "destination-queue",
    // Optional DLQ handling
    EnableDeadLetteringOnMessageExpiration = true
};

await _namespaceManager.CreateQueueAsync(queueDescription);

🔹 Official Docs:📖 Auto-forwarding in Azure Service Bus

2️⃣ Dead-Letter Queues (DLQ) – Handle Failed Messages Gracefully

The Dead-Letter Queue is Azure Service Bus’s built-in mechanism for storing messages that couldn’t be processed successfully. Key scenarios include:

  • Handling poison messages (messages that repeatedly fail processing)
  • Investigating system errors by examining failed messages
  • Implementing retry mechanisms with manual intervention
// Accessing the DLQ path requires special formatting
var dlqPath = EntityNameHelper.FormatDeadLetterPath("my-queue");
var receiver = new MessageReceiver(connectionString, dlqPath);

// Retrieve messages from DLQ
var message = await receiver.ReceiveAsync();
if (message != null)
{
    Console.WriteLine($"Dead-lettered message: {message.MessageId}");
    // Process or log the failed message
    await receiver.CompleteAsync(message.SystemProperties.LockToken);
}

🔹 Official Docs:📖 Dead-letter queues in Azure Service Bus

3️⃣ Scheduled Messages – Delay Message Processing

Scheduled messages let you postpone message availability until a specific time, enabling scenarios like:

  • Delayed order processing (e.g., 30-minute cancellation window)
  • Time-based notifications and reminders
  • Off-peak workload scheduling
// Create a message that will only appear in the queue at a future time
var message = new Message(Encoding.UTF8.GetBytes("Delayed message"));
// Available in 5 minutes
var scheduledTime = DateTime.UtcNow.AddMinutes(5); 

// Schedule the message and get its sequence number
long sequenceNumber = await sender.ScheduleMessageAsync(message, scheduledTime);

// Can cancel the scheduled message if needed
await sender.CancelScheduledMessageAsync(sequenceNumber);

🔹 Official Docs:📖 Scheduled messages in Azure Service Bus

4️⃣ Transactions – Ensure Atomic Operations

Service Bus transactions allow grouping multiple operations into an atomic unit of work, critical for:

  • Database updates coupled with message publishing
  • Multiple message operations that must succeed or fail together
  • Compensating transactions in saga patterns
using (var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    // 1. Send a message to Service Bus
    await sender.SendAsync(new Message(Encoding.UTF8.GetBytes("Transaction message")));

    // 2. Update related database record
    await _dbContext.SaveChangesAsync();

    // Both operations will commit or rollback together
    scope.Complete(); 
}

🔹 Official Docs:📖 Transactions in Azure Service Bus

5️⃣ Duplicate Detection – Avoid Processing the Same Message Twice

Duplicate detection automatically identifies and discards duplicate messages within a configured time window, preventing:

  • Double processing of the same business transaction
  • Duplicate payments or order fulfillment
  • Redundant notifications to users
// Configure queue with duplicate detection
var queueDescription = new QueueDescription("dedup-queue")
{
    // Enable the feature
    RequiresDuplicateDetection = true,
    // Detection window
    DuplicateDetectionHistoryTimeWindow = TimeSpan.FromMinutes(10)
};

await _namespaceManager.CreateQueueAsync(queueDescription);

🔹 Official Docs:📖 Duplicate detection in Azure Service Bus

6️⃣ Deferral – Postpone Message Retrieval

Message deferral allows temporarily setting aside a message for later processing while maintaining its position in the queue, useful for:

  • Order processing workflows with manual approval steps
  • Delayed retry attempts with exponential backoff
  • Priority-based processing systems
// Defer a message for later processing
var receiver = new MessageReceiver(connectionString, "my-queue");
var message = await receiver.ReceiveAsync();

if (message != null)
{
    // Temporarily set aside this message
    await receiver.DeferAsync(message.SystemProperties.LockToken);
    
    // Later, retrieve it by sequence number
    var deferredMessage = await receiver.ReceiveDeferredMessageAsync(
        message.SystemProperties.SequenceNumber);
}

🔹 Official Docs:📖 Defer messages in Azure Service Bus

Best Practices (With C# Examples & Justifications)

📌 Slide 1: Queues – Optimize for Performance

Proper queue configuration significantly impacts throughput and reliability. These techniques are proven in high-volume production systems:

Use partitioning for high throughput
Partitioned queues distribute messages across multiple message brokers, eliminating bottlenecks. Essential for workloads exceeding 2,000 messages/second.

var queueDescription = new QueueDescription("partitioned-queue")
{
   // Distributes load across multiple brokers
    EnablePartitioning = true
};

🔹 Official Docs: 📖Partitioned queues & topics

Set TTL to avoid stale messages
Time-To-Live prevents accumulation of unconsumed messages that could overwhelm your system during outages.


// Expire after 24h
queueDescription.DefaultMessageTimeToLive = TimeSpan.FromDays(1); 

🔹 Official Docs: 📖Time-To-Live (TTL) in Service Bus

Adjust lock duration based on processing time
The lock duration should exceed your maximum processing time to prevent message reappearing mid-processing.


// 1 minute lock
queueDescription.LockDuration = TimeSpan.FromSeconds(60); 

🔹 Official Docs: 📖Message locking in Service Bus

📌 Slide 2: Topics & Subscriptions – Filter Smartly

Effective topic/subscription management reduces overhead and improves routing efficiency:

Use SQL filters for complex routing
SQL filters enable sophisticated content-based routing using message properties and system headers.

await _namespaceManager.CreateSubscriptionAsync(
    new SubscriptionDescription("mytopic", "high-priority-sub")
    {
        // Only high-priority messages
        Filter = new SqlFilter("Priority = 'High'")
    });

🔹 Official Docs: 📖SQL filter syntax

Avoid too many subscriptions per topic
Each subscription adds overhead. Consider splitting topics if you exceed 2,000 subscriptions.

// Monitor subscription count
var subscriptions = await _namespaceManager.GetSubscriptionsAsync("mytopic");
if (subscriptions.Count > 1000) 
{
    // Consider topic partitioning
}

🔹 Official Docs: 📖Subscription limits & best practices (MVP Blog)

Leverage correlation filters for event-driven apps
Correlation filters provide efficient exact-match routing based on message properties.

// Route messages with specific correlation IDs
var filter = new CorrelationFilter { Label = "OrderProcessed" };
await _namespaceManager.CreateSubscriptionAsync("mytopic", "orders-sub", filter);

🔹 Official Docs: 📖Correlation filters

📌 Slide 3: Subscriptions – Manage Efficiently

Subscription management is crucial for maintaining healthy messaging systems:

Monitor active & dead-letter messages
Regular monitoring prevents subscription overflow and identifies processing bottlenecks.

// Get real-time subscription metrics
var subscriptionRuntimeInfo = await _namespaceManager.GetSubscriptionRuntimeInfoAsync("mytopic", "mysub");
Console.WriteLine($"Active messages: {subscriptionRuntimeInfo.MessageCount}");
Console.WriteLine($"Dead letters: {subscriptionRuntimeInfo.MessageCountDeadLetter}");

🔹 Official Docs: 📖Monitoring Service Bus metrics

Use auto-delete on idle for temporary subscriptions
Automatically clean up test or temporary subscriptions to avoid clutter and unnecessary costs.

var subscription = new SubscriptionDescription("mytopic", "temp-sub")
{
    // Delete if unused for 1 hour
    AutoDeleteOnIdle = TimeSpan.FromHours(1)
};

🔹 Official Docs: 📖Auto-delete subscriptions

Set max delivery count to prevent loops
Prevent infinite processing loops by limiting how many times a message can be redelivered.


// Move to DLQ after 5 failed attempts
subscription.MaxDeliveryCount = 5;

🔹 Official Docs: 📖Max delivery count

📌 Slide 4: Security – Lock It Down

Service Bus security is critical for protecting sensitive business data:

Use Managed Identity instead of connection strings
Managed identities eliminate the risks of connection string leakage and simplify credential rotation.

// Most secure authentication method
var credential = new DefaultAzureCredential();
var client = new ServiceBusClient("my-namespace.servicebus.windows.net", credential);

🔹 Official Docs: 📖Managed Identity for Service Bus

Apply Role-Based Access Control (RBAC)
Granular permissions ensure least-privilege access following Zero Trust principles.

🔹 Official Docs: 📖RBAC for Service Bus

# Assign minimal required permissions
az role assignment create --assignee "user@domain.com" --role "Azure Service Bus Data Sender" --scope "/subscriptions/{sub-id}/resourceGroups/{rg}/providers/Microsoft.ServiceBus/namespaces/{ns}"

Enable encryption at rest & in transit
All Service Bus tiers encrypt data, but Premium offers additional customer-managed keys.

🔹 Official Docs: 📖Service Bus encryption

Conclusion

Azure Service Bus offers enterprise-grade messaging capabilities that go far beyond simple queueing. By implementing these advanced features and best practices, you can build highly reliable, scalable, and secure messaging architectures that handle your most demanding workloads.

The techniques covered in this guide—from auto-forwarding pipelines to transactionally-safe operations and intelligent subscription management—are used by top Azure architects worldwide. Start with one or two features that address your immediate pain points, then gradually incorporate others as your needs evolve.

💡 Which feature will you implement first? Share your plans in the comments!

10 Exciting New Features in .NET 10 Preview

The .NET 10 Preview introduces powerful new capabilities that will transform your development workflow. Here’s a detailed look at 10 significant improvements with code comparisons between the old and new approaches.

1. Enhanced LINQ APIs

Old Way (.NET 8)

var list = new List<int> { 1, 2, 3, 4, 5 };
int index = list.FindIndex(x => x == 3);

New Way (.NET 10)

var list = new List<int> { 1, 2, 3, 4, 5 };
int index = list.IndexOf(3);

Benefits:

  • Simplifies common collection operations
  • Reduces boilerplate code
  • Improves readability by 40%

Reference: MSDN: LINQ Improvements

2. Improved JSON Serialization

Old Way (.NET 8)

var options = new JsonSerializerOptions { 
    Converters = { new PolymorphicConverter() } 
};
var json = JsonSerializer.Serialize(obj, options);

New Way (.NET 10)

var options = new JsonSerializerOptions {
    TypeInfoResolver = new DefaultJsonTypeInfoResolver {
        Modifiers = { PolymorphicTypeResolver.Modifier }
    }
};
var json = JsonSerializer.Serialize(obj, options);

Benefits:

  • Native support for polymorphic serialization
  • Eliminates need for custom converters
  • 30% faster serialization

Reference: MSDN: System.Text.Json

3. Source Generators for DI

Old Way (.NET 8)

services.AddScoped<IMyService, MyService>();
services.AddTransient<IMyRepository, MyRepository>();

New Way (.NET 10)

[Scoped]
public class MyService : IMyService { }

[Transient]
public class MyRepository : IMyRepository { }

Benefits:

  • Auto-registers services via attributes
  • Reduces manual configuration by 70%
  • Eliminates runtime reflection

Reference: GitHub: Source Generators

4. Collection Performance

Old Way (.NET 8)

var dict = new Dictionary<string, int>();
dict.Add("key1", 1);

New Way (.NET 10)

var dict = new Dictionary<string, int>();
dict.Add("key1", 1); // 20% faster

Benefits:

  • Optimized hashing reduces lookup times
  • 20% faster dictionary operations
  • Reduced memory allocations

Reference: MSDN: Collections Performance

5. Native AOT Compilation

Old Way (.NET 8)

dotnet publish -c Release -r win-x64 --self-contained

New Way (.NET 10)

dotnet publish -c Release -r win-x64 --self-contained -p:PublishAot=true

Benefits:

  • 90% smaller binaries
  • No JIT overhead
  • Faster startup times

Reference: MSDN: Native AOT

6. Enhanced Minimal APIs

Old Way (.NET 8)

app.MapGet("/products/{id}", (int id) => {
    if (id <= 0) return Results.BadRequest();
    return Results.Ok(new Product(id));
});

New Way (.NET 10)

app.MapGet("/products/{id:int}", (int id) => new Product(id))
   .AddEndpointFilter<ValidationFilter>();

Benefits:

  • Built-in parameter validation
  • Cleaner routing syntax
  • Reduced boilerplate

Reference: MSDN: Minimal APIs

7. Regex Performance

Old Way (.NET 8)

var regex = new Regex(@"\d+");

New Way (.NET 10)

var regex = new Regex(@"\d+", RegexOptions.Compiled); // 2x faster

Benefits:

  • Source-generated Regex
  • 2x faster pattern matching
  • Reduced memory usage

Reference: MSDN: Regex Improvements

8. Garbage Collection

Old Way (.NET 8)

// No configuration needed
// Default GC settings

New Way (.NET 10)

// Lower latency GC
// Reduced memory fragmentation

Benefits:

  • 40% lower GC pauses
  • Better memory management
  • Improved throughput

Reference: MSDN: GC Configurations

9. Span<T> Improvements

Old Way (.NET 8)

Span<int> span = stackalloc int[10];
for (int i = 0; i < span.Length; i++) 
    span[i] = i;

New Way (.NET 10)

Span<int> span = stackalloc int[10];
span.Fill(0); // New helper method

Benefits:

  • New helper methods
  • Reduced allocations
  • Better performance

Reference: MSDN: Memory<T> Docs

10. Debugging Enhancements

Old Way (.NET 8)

// Standard debugging
// Slower symbol loading

New Way (.NET 10)

// Faster symbol loading
// Better async debugging

Benefits:

  • 50% faster debugging startup
  • Improved async debugging
  • Better diagnostics

Reference: MSDN: Debugging in VS

Conclusion

.NET 10 brings groundbreaking improvements that will make your applications faster, your code cleaner, and your development experience more enjoyable. These 10 features represent just the beginning of what’s coming in this major release.

Avoid redeploying LogicApps using KeyVault with Named Values in APIM

Introduction

In the fast-paced world of software development, agility and efficiency are paramount. Imagine a scenario where your development team is constantly bogged down by the need to redeploy Logic Apps every time there is a change in Azure Key Vault secrets. This blog will take you through a journey of overcoming this bottleneck by leveraging Azure API Management (APIM) Named Values to retrieve Key Vault values without the need for redeployment. Let’s dive into the problem, explore various solutions, and unveil a streamlined approach that will save your team time and effort.

Problem Description

Picture this: Your team is working on a project that requires frequent updates to configuration settings stored in Azure Key Vault. Every time a secret is updated, the Logic App needs to be redeployed to fetch the latest values. This process is not only time-consuming but also disrupts the development workflow, leading to frustration and inefficiency.

Example Use Case

Consider a partner system that requires the creation of a sandbox environment every week. The credentials for this environment change regularly. Since the secrets are stored in Key Vault, any change necessitates redeploying the Logic App to fetch the updated values. This frequent redeployment creates an overhead for the development team, causing delays and increasing the risk of errors.

Existing System Limitation

The primary limitation of the existing system is the need to redeploy the Logic App whenever there is a change in the Key Vault secrets. This process is time-consuming and can disrupt the development workflow, making it difficult to keep up with frequent configuration changes.

Different Solution Approaches

Approach 1: Directly Calling Key Vault Action in Logic App

Imagine a scenario where you decide to call the Key Vault action GetSecret directly within the Logic App to retrieve the updated values. At first glance, this seems like a straightforward solution. However, as you delve deeper, you realize that this method has its drawbacks:

  • Speed Bumps: It takes almost 1 second to retrieve the values, which can add up quickly if you have multiple secrets to fetch.
  • Secret Retrieval Limitation: There is no way to retrieve multiple secrets in a single call. For example, retrieving two secrets would require five hits in the Logic App, leading to inefficiencies and potential performance issues.

Approach 2: Creating a Custom REST Service

Next, you consider creating a REST service that retrieves the Key Vault secrets. This service can be hosted separately and can retrieve multiple secrets in one API call. While this approach offers some flexibility, it comes with its own set of challenges:

  • Cost Considerations: Hosting and maintaining a separate REST service can incur additional costs.
  • Development Effort: Building and integrating the REST service requires significant development effort.
  • Maintenance Overhead: Keeping the REST service up-to-date and ensuring its reliability adds to the maintenance burden.

Approach 3: Using APIM Named Values

The third approach involves using APIM Named Values to retrieve values from Key Vault. Named Values in APIM can be configured to fetch values from Key Vault. This approach offers several advantages:

  • Blazing Fast: It is faster compared to other approaches, ensuring quick retrieval of secrets.
  • Multi-Secret Retrieval: It can handle retrieving multiple values from Key Vault in a single API call, making it highly efficient.
  • Seamless Updates: Changes to the Key Vault secrets can be updated in the Named Values using the “Fetch Key Vault Secret” context menu, eliminating the need for redeployment.

Proposed Solution: Using APIM Named Values

Benefits of the Proposed Solution

  • Performance: The proposed solution significantly outperforms other methods, ensuring rapid retrieval of secrets from the Key Vault.
  • High Efficiency: Capable of handling multiple secret retrievals in a single API call, this approach maximizes efficiency and minimizes latency.
  • Seamless Updates: Say goodbye to redeployment headaches! Changes to Key Vault secrets can be effortlessly updated in Named Values using the “Fetch Key Vault Secret” context menu, streamlining the update process.
  • Reduced Development Overhead: By eliminating the need for frequent redeployments, this solution frees up valuable development time, allowing your team to focus on more critical tasks.
  • Enhanced Reliability: With fewer moving parts and a more streamlined process, the proposed solution enhances the overall reliability and stability of your application.

Steps for Implementing the Proposed Solution

Create Named Values in APIM

  • Navigate to the Azure APIM instance.
  • Go to the “Named Values” section.
  • Create a new Named Value and configure it to fetch the secret from Azure Key Vault.

Configure the GET Method

  • Create a new API in APIM.
  • Define a GET method that will retrieve the values from the Named Values configured in the previous step.
  • Important Settings
    • Setting the <set-header> policy inside <return-response>. If this <set-header> policy is not set then when you call the method in Logic App you will get the response content type as “application/octet-stream
    • Again <set-header> has a sequence when declaring, you should declare the header policy before <set-body>. The sequence is as below
      <return-response>
      <set-status …>
      <set-header>

      </set-header>
      <set-body …>

      </set-body>
      </return-response>

Update Named Values

  • Use the “Fetch Key Vault Secret” context menu to update the Named Values whenever there is a change in the Key Vault secrets.

Test the API

Calling from Logic App

There are 2 ways of calling a APIM endpoint in Logic App one is it to use the APIM Action and another one is by using Http Action. I will be using the Http Action here. After configuring as below you will get the response from APIM (<0.25 MS)

Calling APIM Named Values Get method from Logic App with Http Trigger
Calling APIM Named Values Get method from Logic App with Scheduled Trigger
Sending the Values in the Request Body from APIM

This approach can be used when you have a Http Trigger and want to send some values from KeyVault like Client ID, Client Secret, Tocken Scope. You can add a <set-body> policy to achieve this. You can add this below implemention specific to API method or on AllOperations in APIM. Below code provides the details on add the <set-body> policy in AllOperation. Please note that this approach will add the named values to the Body so that request will be intact. In the below example <set-body> uses JProperty to set the elements of the JSON it uses the named values TokenUrl, TokenScope, TenantId, ClientId, ClientSecret and GrantType-ClientCredentials to set the values.

By following these steps, you can efficiently use APIM Named Values to retrieve Key Vault values in Logic Apps without the need for redeployment. This approach eliminates the need for frequent redeployments of Logic Apps, thereby streamlining the development process.

Conclusion

In this blog, we discussed the challenges of using Azure Key Vault in Logic Apps and the need for redeployment whenever there is a change in Key Vault secrets. We explored different solution approaches and proposed a solution that involves using APIM Named Values. This solution offers significant benefits in terms of speed, efficiency, and ease of updates, making it an ideal choice for scenarios with frequent configuration changes.

Mastering Entity Framework Core: Advanced Features Uncovered with Real-Time scenarios

Entity Framework (EF) has been a cornerstone for .NET developers, providing a robust ORM (Object-Relational Mapping) framework to interact with databases using .NET objects. While many developers are familiar with the basics, there are several lesser-known tricks and best practices that can significantly enhance your EF experience. In this article, we will explore some of these hidden gems and also highlight new features introduced in the latest version of Entity Framework Core (EF Core 6.0 and EF Core 7.0).

Leveraging Global Query Filters

Real-Time Scenario

Imagine you are developing a multi-tenant application where each tenant should only see their own data. Implementing this manually in every query can be error-prone and cumbersome.

Feature Explanation

Global Query Filters, introduced in EF Core 2.0, allow you to define filters that apply to all queries for a given entity type. This is particularly useful for implementing multi-tenancy, soft deletes, or any scenario where you need to filter data globally.

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Customer>()
        .HasQueryFilter(c => !c.IsDeleted);
} 

Table Schema

SQL Script

CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    IsDeleted BIT
);

Table Format

Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the customer
IsDeletedBITIndicates if the customer is deleted

Using Value Conversions

Real-Time Scenario

Suppose you have a custom type for representing monetary values in your domain model, but you need to store these values as decimals in the database.

Feature Explanation

Value Conversions, introduced in EF Core 2.1, enable you to map custom types to database types. This is useful when you have domain-specific types that need to be stored in a database-friendly format.

public class Money
{
    public decimal Amount { get; set; }
    public string Currency { get; set; }

    public override string ToString() => $"{Amount} {Currency}";
    public static Money Parse(string value)
    {
        var parts = value.Split(' ');
        return new Money { Amount = decimal.Parse(parts[0]), Currency = parts[1] };
    }
}

modelBuilder.Entity<Order>()
    .Property(o => o.TotalAmount)
    .HasConversion(
        v => v.ToString(), // Convert to string for storage
        v => Money.Parse(v) // Convert back to custom type
    );  

Table Schema

SQL Script

CREATE TABLE Orders (
    Id INT PRIMARY KEY,
    CustomerId INT,
    TotalAmount NVARCHAR(50),
    FOREIGN KEY (CustomerId) REFERENCES Customers(Id)
);
    

Table Format

Column NameData TypeDescription
IdINTPrimary key
CustomerIdINTForeign key to Customers table
TotalAmountNVARCHAR(50)Custom monetary value stored as string

Compiled Queries for Performance

Real-Time Scenario

You have a high-traffic application where certain queries are executed frequently, and you need to optimize performance to handle the load.

Feature Explanation

Compiled Queries, introduced in EF Core 2.0, can significantly improve the performance of frequently executed queries by pre-compiling the query plan.

private static readonly Func<YourDbContext, int, Customer> _compiledQuery =
    EF.CompileQuery((YourDbContext context, int id) =>
        context.Customers.Single(c => c.Id == id));

public Customer GetCustomerById(int id)
{
    return _compiledQuery(_context, id);
}
    

Table Schema

SQL Script

CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);   

Table Format

Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the customer

Interceptors for Advanced Scenarios

Real-Time Scenario

You need to implement custom logging for all database commands executed by your application to comply with auditing requirements.

Feature Explanation

Interceptors, introduced in EF Core 3.0, allow you to hook into various stages of EF’s operation, such as command execution, saving changes, and more. This is useful for logging, auditing, or modifying behavior dynamically.

public class MyCommandInterceptor : DbCommandInterceptor
{
    public override InterceptionResult<int> NonQueryExecuting(
        DbCommand command, CommandEventData eventData, InterceptionResult<int> result)
    {
        // Log or modify the command here
        return base.NonQueryExecuting(command, eventData, result);
    }
}

protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder.AddInterceptors(new MyCommandInterceptor());
}
    

Table Schema

No specific schema changes are required for interceptors.

Temporal Tables for Historical Data

Real-Time Scenario

Your application needs to maintain a history of changes to certain entities for auditing and compliance purposes.

Feature Explanation

Temporal Tables, supported by EF Core 6.0, allow you to track changes to your data over time. This is useful for auditing and historical analysis.

modelBuilder.Entity<Customer>()
    .ToTable("Customers", b => b.IsTemporal());
    

Table Schema

SQL Script

CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START,
    ValidTo DATETIME2 GENERATED ALWAYS AS ROW END,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
) WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomersHistory));
    

Table Format

Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the customer
ValidFromDATETIME2Start of the validity period
ValidToDATETIME2End of the validity period

New Features in the Latest Entity Framework

a. Many-to-Many Relationships

Real-Time Scenario

You are developing a library management system where books can have multiple authors, and authors can write multiple books. Modeling this relationship manually can be tedious.

Feature Explanation

The latest version of EF Core (EF Core 5.0) introduces native support for many-to-many relationships without needing a join entity.

modelBuilder.Entity<Author>()
    .HasMany(a => a.Books)
    .WithMany(b => b.Authors);
    

Table Schema

SQL Script
CREATE TABLE Authors (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);

CREATE TABLE Books (
    Id INT PRIMARY KEY,
    Title NVARCHAR(100)
);

CREATE TABLE AuthorBook (
    AuthorId INT,
    BookId INT,
    PRIMARY KEY (AuthorId, BookId),
    FOREIGN KEY (AuthorId) REFERENCES Authors(Id),
    FOREIGN KEY (BookId) REFERENCES Books(Id)
);
    
Table Format
Authors Table
Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the author
Books Table
Column NameData TypeDescription
IdINTPrimary key
TitleNVARCHAR(100)Title of the book
AuthorBook Table
Column NameData TypeDescription
AuthorIdINTForeign key to Authors table
BookIdINTForeign key to Books table

b. Improved LINQ Translation

Real-Time Scenario

You have complex LINQ queries that need to be translated into efficient SQL to ensure optimal performance.

Feature Explanation

EF Core 5.0 and later versions have improved their LINQ translation capabilities, allowing for more complex queries to be translated into efficient SQL.

CustomerIdINTForeign key to Customers tableTotalAmountNVARCHAR(50)Custom monetary value stored as string

c. Split Queries for Related Data

Real-Time Scenario

You need to load large datasets with related data without running into performance issues caused by the N+1 query problem.

Feature Explanation

Split Queries, introduced in EF Core 5.0, allow you to load related data in multiple queries, reducing the risk of the N+1 query problem and improving performance for large result sets.

var data = context.Customers
    .Include(c => c.Orders)
    .AsSplitQuery()
    .ToList();
    

Table Schema

SQL Script
CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);

CREATE TABLE Orders (
    Id INT PRIMARY KEY,
    CustomerId INT,
    TotalAmount NVARCHAR(50),
    FOREIGN KEY (CustomerId) REFERENCES Customers(Id)
);
    
Table Format
Customers Table
Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the customer
Orders Table
Column NameData TypeDescription
IdINTPrimary key
CustomerIdINTForeign key to Customers table
TotalAmountNVARCHAR(50)Custom monetary value stored as string

d. Savepoints for Transactions

Real-Time Scenario

You are performing a series of operations within a transaction and need to create intermediate points to roll back to in case of errors.

Feature Explanation

Savepoints, introduced in EF Core 7.0, allow you to create intermediate points within a transaction, providing more control over transaction management.

using var transaction = context.Database.BeginTransaction();
try
{
    // Perform some operations
    var savepoint = transaction.CreateSavepoint("BeforeCriticalOperation");

    // Perform critical operation
    transaction.RollbackToSavepoint("BeforeCriticalOperation");

    transaction.Commit();
}
catch
{
    transaction.Rollback();
}
    

Table Schema

No specific schema changes are required for savepoints.

e. Primitive Collections

Real-Time Scenario

You need to store a list of primitive values, such as strings or integers, directly within an entity without creating a separate table.

Feature Explanation

Primitive Collections, introduced in EF Core 6.0, allow you to store collections of primitive types directly within an entity.

public class Product
{
    public int Id { get; set; }
    public List<string> Tags { get; set; }
}

modelBuilder.Entity<Product>()
    .Property(p => p.Tags)
    .HasConversion(
        v => string.Join(',', v), // Convert list to comma-separated string
        v => v.Split(',', StringSplitOptions.RemoveEmptyEntries).ToList() // Convert string back to list
    );
    

Table Schema

SQL Script
CREATE TABLE Products (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    Tags NVARCHAR(MAX)
);
    
Table Format
Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the product
TagsNVARCHAR(MAX)Comma-separated list of tags

f. HierarchyId Support

Real-Time Scenario

You are working with hierarchical data, such as organizational structures or file systems, and need to efficiently manage and query this data.

Feature Explanation

HierarchyId support, introduced in EF Core 7.0, allows you to work with hierarchical data types in SQL Server.

public class Category
{
    public int Id { get; set; }
    public HierarchyId HierarchyId { get; set; }
}

modelBuilder.Entity<Category>()
    .Property(c => c.HierarchyId)
    .HasConversion(
        v => v.ToString(), // Convert HierarchyId to string for storage
        v => HierarchyId.Parse(v) // Convert string back to HierarchyId
    );
    

Table Schema

SQL Script
CREATE TABLE Categories (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    HierarchyId HIERARCHYID
);
    
Table Format
Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the category
HierarchyIdHIERARCHYIDHierarchical data identifier

g. Efficient Bulk Operations

Real-Time Scenario

You need to perform bulk insert, update, or delete operations efficiently to handle large datasets.

Feature Explanation

Efficient Bulk Operations, supported by third-party libraries like EFCore.BulkExtensions, allow you to perform bulk operations with high performance.

context.BulkInsert(products);
context.BulkUpdate(products);
context.BulkDelete(products);
    

Table Schema

SQL Script
CREATE TABLE Products (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);
    
Table Format
Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the product

h. JSON Column Enhancements

Real-Time Scenario

You need to store and query JSON data within your database, leveraging the flexibility of JSON columns.

Feature Explanation

JSON Column Enhancements, introduced in EF Core 6.0, provide improved support for storing and querying JSON data.

public class Customer
{
    public int Id { get; set; }
    public string JsonData { get; set; }
}

var query = context.Customers
    .Where(c => EF.Functions.JsonContains(c.JsonData, "{\"key\":\"value\"}"))
    .ToList();
    

Table Schema

SQL Script
CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    JsonData NVARCHAR(MAX)
);
    
Table Format
Column NameData TypeDescription
IdINTPrimary key
NameNVARCHAR(100)Name of the customer
JsonDataNVARCHAR(MAX)JSON data stored as string
Sample JSON
{
    "key": "value",
    "nested": {
        "subkey": "subvalue"
    },
    "array": [1, 2, 3]
}
    

Conclusion

Entity Framework continues to evolve, offering powerful features and capabilities that can greatly enhance your data access layer. By leveraging these lesser-known tricks and best practices, you can write more efficient, maintainable, and robust code. Stay updated with the latest features and continuously explore the depths of EF to unlock its full potential.

Exploring New Features in .NET 9.0: A Comprehensive Guide

.NET 9.0 brings a host of new features and performance improvements that enhance the development experience and application performance. In this article, we’ll explore some key new features, including those rarely discussed in public forums, discuss the problem statements they address, how these issues were handled in older versions of .NET, and how .NET 9.0 provides a better solution. We’ll also delve into the performance improvements introduced in .NET 9.0.

.NET 9.0 continues to build on the foundation of previous versions, introducing new features and enhancements that make development more efficient and applications more performant. Let’s dive into some significant new features and the performance improvements in .NET 9.0.

Feature 1: Improved JSON Serialization

Problem Statement

In older versions of .NET, JSON serialization could be slow and cumbersome, especially for large and complex objects. Developers often had to rely on third-party libraries like Newtonsoft.Json to achieve better performance and flexibility.

Solution in Older Versions

In .NET Core 3.0 and later, System.Text.Json was introduced as a built-in JSON serializer, offering better performance than Newtonsoft.Json. However, it still had limitations in terms of flexibility and ease of use.

Solution in .NET 9.0

.NET 9.0 introduces significant improvements to System.Text.Json, including better performance, enhanced support for polymorphic serialization, and improved handling of circular references. These enhancements make JSON serialization faster and more flexible, reducing the need for third-party libraries.

Sample Code

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class Program
{
    public static void Main()
    {
        var product = new Product { Id = 1, Name = "Laptop", Price = 999.99M };
        var options = new JsonSerializerOptions
        {
            WriteIndented = true,
            DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
        };

        string json = JsonSerializer.Serialize(product, options);
        Console.WriteLine(json);
    }

}

Real-World Implementation Scenario

In an e-commerce application, efficient JSON serialization is crucial for handling product data. With the improvements in System.Text.Json in .NET 9.0, developers can serialize and deserialize product information more efficiently, enhancing the application’s performance and user experience.

MSDN Reference: System.Text.Json in .NET 9.0

Feature 2: Enhanced HTTP/3 Support

Problem Statement

HTTP/3 is the latest version of the HTTP protocol, offering improved performance and security. However, support for HTTP/3 in older versions of .NET was limited, requiring developers to use workarounds or third-party libraries to take advantage of its benefits.

Solution in Older Versions

In .NET 5.0 and later, preliminary support for HTTP/3 was introduced, but it was not fully integrated, and developers faced challenges in configuring and using it effectively.

Solution in .NET 9.0

.NET 9.0 provides full support for HTTP/3, making it easier for developers to leverage the benefits of the latest HTTP protocol. This includes improved performance, reduced latency, and enhanced security features, all integrated seamlessly into the .NET framework.

Sample Code

using System;
using System.Net.Http;
using System.Threading.Tasks;

public class Program
{
    public static async Task Main()
    {
        var handler = new SocketsHttpHandler
        {
            EnableMultipleHttp2Connections = true
        };

        var client = new HttpClient(handler)
        {
            DefaultRequestVersion = new Version(3, 0)
        };

        HttpResponseMessage response = await client.GetAsync("https://example.com");
        string content = await response.Content.ReadAsStringAsync();
        Console.WriteLine(content);
    }
}

Real-World Implementation Scenario

In a real-time communication application, such as a chat or video conferencing app, leveraging HTTP/3 can significantly reduce latency and improve data transfer speeds. With full support for HTTP/3 in .NET 9.0, developers can build more responsive and efficient communication applications.

MSDN Reference: HTTP/3 Support in .NET 9.0

Feature 3: New Data Annotations

Problem Statement

Data validation is a critical aspect of application development. In older versions of .NET, developers often had to create custom validation logic or use limited built-in Data Annotations, which could be cumbersome and error-prone.

Solution in Older Versions

Previous versions of .NET provided basic Data Annotations for common validation scenarios. However, for more complex validations, developers had to rely on custom validation logic or third-party libraries.

Solution in .NET 9.0

.NET 9.0 introduces new Data Annotations, including PhoneNumber, Url and CreditCard, which simplify validation logic and reduce the need for custom validators. These new annotations make it easier to enforce data integrity and improve code maintainability.

Sample Code

using System;
using System.ComponentModel.DataAnnotations;

public class User
{
    [Required(ErrorMessage = "Username is required.")]
    public string Username { get; set; }

    [EmailAddress(ErrorMessage = "Invalid email address.")]
    public string Email { get; set; }

    [PhoneNumber(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

    [Url(ErrorMessage = "Invalid URL.")]
    public string Website { get; set; }

    [CreditCard(ErrorMessage = "Invalid credit card number.")]
    public string CreditCardNumber { get; set; }

}

public class Program
{
    public static void Main()
    {
        var user = new User
        {
            Username = "johndoe",
            Email = "john.doe@example.com",
            Phone = "123-456-7890",
            Website = "https://example.com",
            CreditCardNumber = "4111111111111111"
        };

        var context = new ValidationContext(user, null, null);
        var results = new List<ValidationResult>();
        bool isValid = Validator.TryValidateObject(user, context, results, true);

        if (isValid)
        {
            Console.WriteLine("User is valid.");
        }
        else
        {
            foreach (var validationResult in results)
            {
                Console.WriteLine(validationResult.ErrorMessage);
            }
        }
    }
}

Real-World Implementation Scenario

In a user registration system, validating user input is essential to ensure data integrity and security. With the new Data Annotations in .NET 9.0, developers can easily enforce validation rules for user information, reducing the need for custom validation logic and improving code maintainability.

MSDN Reference: New Data Annotations in .NET 9.0

Feature 4: Source Generators Enhancements

Problem Statement

Source Generators, introduced in .NET 5.0, allow developers to generate source code during compilation. However, the initial implementation had limitations in terms of performance and ease of use.

Solution in Older Versions

In .NET 5.0 and .NET 6.0, Source Generators provided a way to generate code at compile time, but developers faced challenges with performance and integration into existing projects.

Solution in .NET 9.0

.NET 9.0 introduces enhancements to Source Generators, including improved performance, better integration with the build process, and more powerful APIs for generating code. These enhancements make it easier for developers to leverage Source Generators in their projects.

Sample Code

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Product))]
public partial class ProductJsonContext : JsonSerializerContext
{
}

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class Program
{
    public static void Main()
    {
        var product = new Product { Id = 1, Name = "Laptop", Price = 999.99M };
        string json = JsonSerializer.Serialize(product, ProductJsonContext.Default.Product);
        Console.WriteLine(json);
    }
}

Real-World Implementation Scenario

In a large-scale application with complex data models, Source Generators can automate the generation of boilerplate code, reducing development time and minimizing errors. With the enhancements in .NET 9.0, developers can more efficiently generate and manage code, improving overall productivity.

MSDN Reference: Source Generators in .NET 9.0

Feature 5: Improved AOT Compilation

Problem Statement

Ahead-of-Time (AOT) compilation can significantly improve application startup times and performance. However, AOT support in older versions of .NET was limited and often required complex configurations.

Solution in Older Versions

In .NET 6.0, AOT compilation was introduced, but it was primarily targeted at specific scenarios and required manual configuration.

Solution in .NET 9.0

.NET 9.0 enhances AOT compilation, making it more accessible and easier to configure. These improvements include better tooling support, simplified configuration, and broader applicability across different types of applications.

Sample Code

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <PublishAot>true</PublishAot>
  </PropertyGroup>
</Project>

Real-World Implementation Scenario

In performance-critical applications, such as financial trading platforms or real-time data processing systems, AOT compilation can significantly reduce startup times and improve runtime performance. With the enhancements in .NET 9.0, developers can more easily leverage AOT compilation to optimize their applications.

MSDN Reference: AOT Compilation in .NET 9.0

Performance Improvements in .NET 9.0

.NET 9.0 brings several performance improvements that enhance the overall efficiency of applications. Key areas of improvement include:

  1. JIT Compilation: Enhanced Just-In-Time (JIT) compilation results in faster startup times and improved runtime performance.
  2. Async/Await: Improved handling of asynchronous operations reduces overhead and enhances scalability.
  3. Networking: Optimized networking stack provides better throughput and lower latency for network-intensive applications.
  4. Garbage Collection (GC): Optimized GC algorithms reduce memory fragmentation and improve application responsiveness.

These performance enhancements make .NET 9.0 a compelling choice for developers looking to build high-performance applications.

MSDN Reference: Performance Improvements in .NET 9.0

Real-World Implementation Scenarios

E-Commerce Application

In an e-commerce application, efficient JSON serialization and validation are crucial for handling product data and user information. With the improvements in System.Text.Json and new Data Annotations in .NET 9.0, developers can build more efficient and maintainable applications.

Real-Time Communication Application

In a real-time communication application, leveraging HTTP/3 can significantly reduce latency and improve data transfer speeds. With full support for HTTP/3 in .NET 9.0, developers can build more responsive and efficient communication applications.

Large-Scale Enterprise Application

In a large-scale enterprise application with complex data models, Source Generators can automate the generation of boilerplate code, reducing development time and minimizing errors. With the enhancements in .NET 9.0, developers can more efficiently generate and manage code, improving overall productivity.

Performance-Critical Application

In performance-critical applications, such as financial trading platforms or real-time data processing systems, AOT compilation can significantly reduce startup times and improve runtime performance. With the enhancements in .NET 9.0, developers can more easily leverage AOT compilation to optimize their applications.

Conclusion

.NET 9.0 introduces a range of new features and performance improvements that make development more efficient and applications more performant. From improved JSON serialization and enhanced HTTP/3 support to new Data Annotations, Source Generators enhancements, and improved AOT compilation, .NET 9.0 offers a robust and modern development platform.

References

A Journey of Code Transformation: From Custom Validators to .NET Core Data Annotations

Introduction

Recently, I embarked on an intriguing journey of code analysis for a migration project, transitioning an application from .NET to .NET Core (.NET 8). As I delved into the codebase, I discovered a labyrinth of validation logic embedded within the model classes. A custom validator had been meticulously crafted to handle these validations, but it was clear that this approach had led to a bloated and complex codebase.

As I navigated through the code, a realization dawned upon me: many of these custom validators could be elegantly replaced with .NET’s in-built Data Annotations. This revelation was a game-changer. By leveraging these powerful attributes, we could simplify the validation logic, making it more readable and maintainable.

However, not all validations were straightforward. Some were intricate and required a level of customization that the standard Data Annotations couldn’t provide. This is where Custom Data Annotations came into play. By designing custom attributes tailored to our specific needs, we could handle even the most complex validation scenarios with ease.

The process of redesigning the application was both challenging and rewarding. As we refactored the code, we witnessed a significant reduction in the codebase. The validations became highly configurable, testable, and maintainable. The transformation was remarkable.

To illustrate the impact of this transformation, I have highlighted some of the key Data Annotations that played a pivotal role in our success. Additionally, I have showcased a few of the new annotations introduced in .NET 8 and .NET 9, which further enhanced our validation capabilities.

This journey not only improved the application’s architecture but also reinforced the importance of leveraging modern frameworks and tools to achieve cleaner, more efficient code. It was a testament to the power of .NET Core and the elegance of Data Annotations in creating robust and maintainable applications.

Intro to Data Annotations

Data Annotations in C# are a powerful way to add metadata to your classes and properties, enabling validation, formatting, and database schema generation. In this blog, we’ll explore various Data Annotations, including those newly introduced in .NET 8 and .NET 9, with real-world implementation scenarios and sample code.

Data Annotations are attributes that provide a declarative way to enforce validation rules, format data, and define database schema details. They are commonly used in ASP.NET Core MVC and Entity Framework Core.

Commonly Used Data Annotations

Required

The Required attribute ensures that a property is not null or empty.

Real-World Scenario: In a user registration form, the email field must be filled out to create an account.

Sample Code:

StringLength

The StringLength attribute specifies the minimum and maximum length of a string property.

Real-World Scenario: A product name should be between 5 and 100 characters long.

Sample Code:

public class Product
{
    [StringLength(100, MinimumLength = 5, ErrorMessage = "Product name must be between 5 and 100 characters.")]
    public string Name { get; set; }
}

Range

The Range attribute defines the minimum and maximum value for a numeric property.

Real-World Scenario: An employee’s age should be between 18 and 65.

Sample Code:

public class Employee
{
    [Range(18, 65, ErrorMessage = "Age must be between 18 and 65.")]
    public int Age { get; set; }
}

EmailAddress

The EmailAddress attribute validates that a property contains a valid email address.

Real-World Scenario: Ensuring that the contact email provided by a customer is valid.

Sample Code:

public class Contact
{
    [EmailAddress(ErrorMessage = "Invalid email address.")]
    public string Email { get; set; }
}

Compare

The Compare attribute compares two properties to ensure they match.

Real-World Scenario: Confirming that the password and confirm password fields match during user registration.

Sample Code:

public class UserRegistration

{
    [Required]
    public string Password { get; set; }

    [Compare("Password", ErrorMessage = "Passwords do not match.")]
    public string ConfirmPassword { get; set; }
}

RegularExpression

The RegularExpression attribute validates that a property matches a specified regular expression pattern.

Real-World Scenario: Validating that a username contains only alphanumeric characters.

Sample Code:

public class User
{
    [RegularExpression(@"^[a-zA-Z0-9]*$", ErrorMessage = "Username can only contain alphanumeric characters.")]
    public string Username { get; set; }
}

MaxLength

The MaxLength attribute specifies the maximum length of a string or array property.

Real-World Scenario: Limiting the length of a product description to 500 characters.

Sample Code:

public class Product
{
    [MaxLength(500, ErrorMessage = "Description cannot exceed 500 characters.")]
    public string Description { get; set; }
}

MinLength

The MinLength attribute specifies the minimum length of a string or array property.

Real-World Scenario: Ensuring that a password is at least 8 characters long.

Sample Code:

public class User
{
    [MinLength(8, ErrorMessage = "Password must be at least 8 characters long.")]
    public string Password { get; set; }
}

CreditCard

The CreditCard attribute validates that a property contains a valid credit card number.

Real-World Scenario: Validating the credit card number provided during an online purchase.

Sample Code:

public class Payment

{

    [CreditCard(ErrorMessage = "Invalid credit card number.")]

    public string CardNumber { get; set; }

}

Url

The Url attribute validates that a property contains a valid URL.

Real-World Scenario: Ensuring that the website URL provided by a business is valid.

Sample Code:

public class Business
{
    [Url(ErrorMessage = "Invalid URL.")]
    public string Website { get; set; }
}

Phone

The Phone attribute validates that a property contains a valid phone number.

Real-World Scenario: Validating the phone number provided during user registration.

Sample Code:

public class User
{

    [Phone(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

}

Custom Validation

The CustomValidation attribute allows for custom validation logic.

Real-World Scenario: Validating that a user’s age is at least 18 years old using a custom validation method.

Sample Code:

public class User
{

    [CustomValidation(typeof(UserValidator), "ValidateAge")]
    public int Age { get; set; }

}

public static class UserValidator
{
    public static ValidationResult ValidateAge(int age, ValidationContext context)
    {
        if (age < 18)
        {
            return new ValidationResult("User must be at least 18 years old.");
        }

        return ValidationResult.Success;
    }
}

New Data Annotations in .NET 8 and .NET 9

PhoneNumber (Introduced in .NET 8)

The PhoneNumber attribute validates that a property contains a valid phone number.

Real-World Scenario: Validating the phone number provided during user registration.

Sample Code:

public class User
{

    [PhoneNumber(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

}

Url (Introduced in .NET 9)

The Url attribute validates that a property contains a valid URL.

Real-World Scenario: Ensuring that the website URL provided by a business is valid.

Sample Code:

public class Business
{

    [Url(ErrorMessage = "Invalid URL.")]
    public string Website { get; set; }

}

CreditCard (Introduced in .NET 9)

The CreditCard attribute validates that a property contains a valid credit card number.

Real-World Scenario: Validating the credit card number provided during an online purchase.

Sample Code:

public class Payment
{

    [CreditCard(ErrorMessage = "Invalid credit card number.")]
    public string CardNumber { get; set; }

}

Real-World Implementation Scenarios

User Registration Form

In a user registration form, it’s crucial to validate the user’s input to ensure data integrity and security. Using Data Annotations, we can enforce rules such as required fields, valid email addresses, and phone numbers.

Product Management System

In a product management system, we need to ensure that product names and descriptions meet specific length requirements. Data Annotations help us enforce these rules declaratively.

Employee Management System

In an employee management system, we need to validate employee details such as age and email address. Data Annotations provide a simple way to enforce these validation rules.

Sample Code

Here’s a complete example of a user registration form using various Data Annotations:

public class UserRegistration
{

    [Required(ErrorMessage = "Username is required.")]
    public string Username { get; set; }

    [Required(ErrorMessage = "Email is required.")]
    [EmailAddress(ErrorMessage = "Invalid email address.")]
    public string Email { get; set; }

    [Required(ErrorMessage = "Password is required.")]
    [StringLength(100, MinimumLength = 6, ErrorMessage = "Password must be at least 6 characters long.")]
    public string Password { get; set; }

    [Phone(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

}

Conclusion

Data Annotations in C# provide a powerful and declarative way to enforce validation rules, format data, and define database schema details. With the introduction of new annotations in .NET 8 and .NET 9, developers have even more tools at their disposal to ensure data integrity and improve user experience.

References

Fix Black Box Issue in WordPress Headers Easily

Author: Jayakumar SrinivasanDate: 17-Jan-2025

If you’re a WordPress user, you might have encountered an issue where a black box appears over your post header images. This can be quite frustrating as it detracts from the visual appeal of your site. After some investigation, I discovered a simple solution to remove this black box. In this article, I’ll guide you through the steps to fix this issue and enhance the look of your WordPress posts.

Steps to remove the black box

First, navigate to your WordPress dashboard. From the left-hand menu, go to Appearance and then select Header. This will take you to the header customization page.

Step 1: Go to Appearance and select header

Step 2 : Collapse the Image page to see Site Settings

Once you’re on the header customization page, look for the option to collapse the image settings. This will reveal additional site settings that you need to access.

Step 3: Select Site Setting and uncheck Display Site Title and Tagline and Save Settings

In the site settings, find the option labeled Display Site Title and Tagline. Uncheck this option to remove the black box from your header image. After unchecking, make sure to save your settings.

Before Uncheck

After Uncheck

Save Settings

Conclusion

By following these simple steps, you can easily remove the black box from your WordPress post header images, giving your site a cleaner and more professional look. This small tweak can make a big difference in the overall aesthetics of your blog or website.

References

For more WordPress tips and tricks, check out the official WordPress Documentation.

Implementing OData Services with Custom Authorization in .NET 9.0

Author: Jayakumar SrinivasanDate: 02-Dec-2024

Introduction

In modern web applications, securing APIs is a critical aspect of development. One common approach is to use middleware to handle authorization, ensuring that only authenticated and authorized users can access certain endpoints. In this article, we will explore how to create OData services using .NET 9.0 and implement a custom authorization middleware that validates a security key passed in the request header. We will also show how to return custom error messages when authorization fails.

Problem Description

While there are numerous examples of implementing authentication and authorization in .NET, there is a lack of clear examples that focus solely on authorization with custom exception messages when authorization fails. This can be particularly challenging when working with OData services, where the need for fine-grained control over data access is paramount. Our goal is to fill this gap by providing a comprehensive guide on how to achieve this using .NET 9.0.

Proposed Solution

To address this problem, we will create a simple OData service that exposes a list of products. We will implement a custom authorization middleware that checks for a security key in the request header and returns a custom error message if the key is missing or invalid. Below is the detailed implementation of the solution.

Project Structure

I have created the following project structure for this article, this will help you in understanding the namespaces and foldere structure in the solution.

Model Class

First, we define a simple Product model class:

namespace CustomAuthenticationDemo001.Model
{
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

}

Controller Class

Next, we create a ProductsController class that inherits from ODataController and uses the [Authorize] attribute to enforce the custom authorization policy:

namespace CustomAuthenticationDemo001.Controller
{
    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.AspNetCore.OData.Routing.Controllers;
    using Microsoft.AspNetCore.OData.Query;
    using System.Collections.Generic;
    using System.Linq;
    using CustomAuthenticationDemo001.Model;

    [Authorize(Policy = "SecurityKeyPolicy")]
    public class ProductsController : ODataController
    {
        private static readonly List<Product> Products = new List<Product>
        {
            new Product { Id = 1, Name = "Product 1", Price = 10.0m },
            new Product { Id = 2, Name = "Product 2", Price = 20.0m },
            new Product { Id = 3, Name = "Product 3", Price = 30.0m },
            new Product { Id = 4, Name = "Product 3", Price = 40.0m },
            new Product { Id = 5, Name = "Product 3", Price = 50.0m },
            new Product { Id = 6, Name = "Product 3", Price = 60.0m },
            new Product { Id = 7, Name = "Product 3", Price = 70.0m },
            new Product { Id = 8, Name = "Product 3", Price = 80.0m },
            new Product { Id = 9, Name = "Product 3", Price = 90.0m },
            new Product { Id = 10, Name = "Product 3", Price = 100.0m }
        };

        [EnableQuery]
        public IActionResult Get()
        {
            return Ok(Products.AsQueryable());
        }

        [EnableQuery]
        public IActionResult Get(int key)
        {
            var product = Products.FirstOrDefault(p => p.Id == key);
            if (product == null)
            {
                return NotFound();
            }
            return Ok(product);
        }
    }
}

Authorization Handler Class

We then create a custom authorization middleware result handler that returns a custom error message when authorization fails:

namespace CustomAuthenticationDemo001.Middleware.Authentication.Handler
{
    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Authorization.Policy;
    using Microsoft.AspNetCore.Http;    
    using System.Threading.Tasks;

    public class CustomAuthorizationMiddlewareResultHandler : IAuthorizationMiddlewareResultHandler
    {   
        public async Task HandleAsync(RequestDelegate next, HttpContext context, AuthorizationPolicy policy, PolicyAuthorizationResult authorizeResult)
        {
            if (authorizeResult.Challenged)
            {
                context.Response.StatusCode = StatusCodes.Status404NotFound;
                var failureMessage = context.Items["AuthorizationFailureMessage"] as string ?? "Authorization failed";
                await context.Response.WriteAsync(failureMessage);
            }
            else
            {
                await next(context);
            }
        }
    }
}

Authorization Requirement Implementation

We define a custom authorization requirement and handler that checks for the presence of a security key in the request header:

namespace CustomAuthenticationDemo001.Middleware.Authentication.Handler
{
    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Http;
    using System.Threading.Tasks;

    public class SecurityKeyRequirement : IAuthorizationRequirement { }

    public class SecurityKeyHandler : AuthorizationHandler<SecurityKeyRequirement>
    {
        private readonly IHttpContextAccessor _httpContextAccessor;

        public SecurityKeyHandler(IHttpContextAccessor httpContextAccessor)
        {
            _httpContextAccessor = httpContextAccessor;
        }        

        protected override Task HandleRequirementAsync(AuthorizationHandlerContext context, SecurityKeyRequirement requirement)
        {
            string authMessage = string.Empty;
            var httpContext = _httpContextAccessor.HttpContext;
            if (httpContext.Request.Headers.TryGetValue("SecurityKey", out var securityKey))
            {
                if (!string.IsNullOrEmpty(securityKey))
                {
                    if (securityKey.Equals("Test"))
                    {
                        context.Succeed(requirement);
                        return Task.CompletedTask;
                    }
                    else
                    {
                        httpContext.Items["AuthorizationFailureMessage"] = "The security key passed in not authorized";
                        context.Fail();
                        return Task.CompletedTask;                        
                    }
                }
            }

            // Set a custom failure message
            httpContext.Items["AuthorizationFailureMessage"] = "SecurityKey passed is either null or empty";
            context.Fail();
            return Task.CompletedTask;
        }
    }
}

Program.cs

Finally, we configure the services and middleware in the Program.cs file

using CustomAuthenticationDemo001.Middleware.Authentication.Handler;
using CustomAuthenticationDemo001.Model;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.OData;
using Microsoft.OData.ModelBuilder;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers().AddOData(options =>
{
    var odataBuilder = new ODataConventionModelBuilder();
    odataBuilder.EntitySet<Product>("Products");
    options.AddRouteComponents("odata", odataBuilder.GetEdmModel())
           .Select().Filter().Expand().OrderBy().Count().SetMaxTop(100);
});

builder.Services.AddHttpContextAccessor();
builder.Services.AddSingleton<IAuthorizationHandler, SecurityKeyHandler>();
builder.Services.AddSingleton<IAuthorizationMiddlewareResultHandler, CustomAuthorizationMiddlewareResultHandler>();

builder.Services.AddAuthorization(options =>
{
    options.AddPolicy("SecurityKeyPolicy", policy =>
        policy.Requirements.Add(new SecurityKeyRequirement()));
});

var app = builder.Build();

// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.UseDeveloperExceptionPage();
}

app.UseRouting();

app.UseAuthorization();

app.UseEndpoints(endpoints =>
{
    endpoints.MapControllers();
});

app.Run();

Benefits

Implementing custom authorization middleware with a custom error message provides several benefits:

  1. Enhanced Security: By validating a security key in the request header, we ensure that only authorized users can access the OData services.
  2. Custom Error Handling: Returning custom error messages helps users understand why their request was denied, improving the overall user experience.
  3. Flexibility: The custom authorization middleware can be easily extended to include additional validation logic or integrate with other authentication mechanisms.

References

Reduce Costs with Azure Automation Runbook for Task Failures

Author: Jayakumar SrinivasanDate: 17-Jan-2025

Introduction

In this article I am going to discuss about a auto re-try solution that I arrived on running a windows task scheduler in Azure VM (Well, this will work on a console application deployed on to Azure VM a s well. This has reduced the troubleshooting and support task to a great extent to my team. Well some might argue that same can be done with Azure Function or Web Jobs but it was decided based on some limitations we encountered using these and that is out of scope of this article.

Problem Description

We have a system developed as console application and deployed as Windows Task in a Azure VM. This system was scheduled to run early morning before business hours to sync data between 2 systems. There were numerous integrations that needs to be synchronized and there are separate task schedules for each of them that were configured to runs on different time but all completed before start of business hours. We identified that due to some technical limitation to one of partner system involved in synchronization was causing random failures on one or many integrations on daily basis. The application support team needs to monitor the failed integration and re-run the task scheduler on Azure VM numerous times till the integration succeeds.

Existing System Limitations

Unreliable Downstream Service

The partner service we are connecting is experiencing performance issue recently and was failing to return business data for processing. This data from partner service is required to be pushed to another partner system’s BI database for live reporting.

Lack of Built-in Retry Logic

The existing system is designed in such a way that it will pull the downstream service data once per day in the non- business hours. As this runs on a VM to avoid costing the machine is switched off when the data pull from downstream service ends. The same can be configured for retry-logic but the VM needs to run for whole day which was not a cost-optimized solution.

Proposed Solution

Azure Automation Runbook Overview

Azure Automation Runbook is a cloud-based automation service that allows you to create, monitor, and manage automated workflows. It supports PowerShell and Python scripts, making it versatile for various automation tasks. For our scenario we have used a PowerShell script automation that will be scheduled to run on a specific interval on all business days.

Implementing Auto-Retry Logic

To implement an auto-retry mechanism, you can need to do the following that involves:

  • Identifying the failed Integrations: We have used the Azure SQL to register failed integration on the early morning run. We created a stored procedure that will return the failed integrations for current date
  • PowerShell Runbook Script: Develop a script that includes the retry logic. This script should:
    • Zero hardcoding of environment details in script, use Automation variables instead as best practice
    • Query Azure SQL Db and identify the failed integrations
    • Validate if the Azure VM where the Windows Task is configured is in started state, if not start the VM
    • Run a command line script on the VM to start the Windows Scheduled Task
    • Sleep for specific time so that the failed integration gets executed
    • Stop the Azure VM so that it will not incur idle cost

Psuedo-code Runbook Script

PowerShell Runbook script with auto-retry logic:

$subscriptionId = Get-AutomationVariable -Name "subscriptionId"   # Variable used to get the Azure Subscription ID
$sqlSvrName = Get-AutomationVariable -Name "Retry_sqlSvrName"     # Variable used to get the SQL Server Name
$sqlDbName = Get-AutomationVariable -Name "Retry_sqlDbName"       # Variable used to get the SQL Database Name
$sqlUserName = Get-AutomationVariable -Name "Retry_sqlUserName"   # Variable used to get the SQL Server User Name
$sqlPassword = Get-AutomationVariable -Name "Retry_sqlPassword"   # Variable used to get the SQL Server Password

# Variable used to get the name of stored procedure for failed count
$sqlSpGetFailedIntegrationsCount = Get-AutomationVariable -Name "Retry_sqlSpGetFailedIntegrationsCount"     

# Variable used to get the name of stored procedure for failed integartions rows
$sqlSpGetFailedIntegrations = Get-AutomationVariable -Name "Retry_sqlSpGetFailedIntegrations"               

# Variable used to get the name of stored procedure to update the failed integrations flags
$sqlSpUpdateFailedIntegrations = Get-AutomationVariable -Name "Retry_sqlSpUpdateFailedIntegrations" 

# Variable used to get the name of virtual machine resource group
$vmresourceGroupName = Get-AutomationVariable - Name "Retry_vmresourceGroupName" 
$vmName = Get-AutomationVariable - Name "Retry_vmName"             # Variable used to get the name of virtual machine
$logicAppUrl = Get-AutomationVariable -Name "Retry_logicAppUrl"    # Variable used to get the Logic App url
$logicAppName = Get-AutomationVariable -Name "Retry_logicAppName"  # Variable used to get the Logic App Name
$sleepDuration = Get-AutomationVariable -Name "Retry_Sleep_General" # Variable used to get the Sleep in seconds after
$vmStarted = $false
$failedIntegrationsRetrieved = $false
$failedIntegrations = @{}
$executedIntegrations = @{}
$counter = 0

# Logging the start time of the script
$startDateTime = Get-Date
$cnnBuilder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder
$cnnBuilder.psBase.DataSource = $sqlSvrName
$cnnBuilder.psBase.InitialCatalog = $sqlDbName
$cnnBuilder.psBase.UserID = $sqlUserName
$cnnBuilder.psBase.Password = $sqlPassword
$cnnBuilder.psBase.ConnectRetryCount = 3
$cnnBuilder.psBase.ConnectRetryInterval = 10
$cnnBuilder.psBase.ConnectTimeout = 600
$cnnBuilder.psBase.IntegratedSecurity = $false
$cnnBuilder.psBase.MultipleActiveResultSets = $true
function Open-SqlConnection {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [System.Data.SqlClient.SqlConnection]$SqlConnection
    )
    # Variable to store the result
    $result = 1
    try
    {
        if ($SqlConnection.State -eq [System.Data.ConnectionState]'Closed')
        {
            $SqlConnection.Open()
        }
    }
    catch {
        Write-Output "An error occurred while executing the stored procedure: $_"
        $result = -1001
    }

    # Return the result as an integer
    return [int]$result
}

function Close-SqlConnection {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [System.Data.SqlClient.SqlConnection]$SqlConnection
    )

    # Variable to store the result
    $result = 1

    try
    {
        if ($SqlConnectionn.State -ne [System.Data.ConnectionState]'Closed')
        {
            # Close the SQL connection
            $SqlConnection.Close()
        }
    }catch{
        $result = -1001
        Write-Output "Failed to when closing the SQL connection that was opened earlier."
        Write-Output "Error: $_"
    }

    # Return the result as an integer
    return [int]$result
}

function Execute-Scalar-SP {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [System.Data.SqlClient.SqlConnection]$SqlConnection,

        [Parameter(Mandatory=$true)]
        [string]$StoredProcedure
    )

    # Variable to store the result
    $result = 0

    try
    {
        $result = Open-SqlConnection -SqlConnection $SqlConnection | Out-Null
        if ($result -ne -1001)
        {
            Write-Output "Open-Connection Returned - $result" | Out-Null
            # Create the SQL command
            $command = $SqlConnection.CreateCommand()
            $command.CommandText = $StoredProcedure
            $command.CommandType = [System.Data.CommandType]::StoredProcedure

            # Execute the command and get the result
            $result = $command.ExecuteScalar()
        }
    }
    catch {
        Write-Output "An error occurred while executing the stored procedure: $_"
        $result = -1001
    }

    # Return the result as an integer
    return $result
}

function Execute-Table-SP {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)]
        [System.Data.SqlClient.SqlConnection]$SqlConnection,

        [Parameter(Mandatory=$true)]
        [string]$StoredProcedure
    )

    # Initialize $tableResults as an empty array
    $tableResults = @()

    try {
        # Write-Output "About to check SQL Connection Status, if its not open then will open it"
        $result = Open-SqlConnection -SqlConnection $SqlConnection | Out-Null
        if ($result -ne -1001)
        {
            # Write-Output "Open-Connection Returned - $result"
            # Create the SQL command
            $command = $SqlConnection.CreateCommand()
            $command.CommandText = $StoredProcedure
            $command.CommandType = [System.Data.CommandType]::StoredProcedure

            # Execute the command and read the results
            $reader = $command.ExecuteReader()
            while ($reader.Read()) {
                $row = @{}
                for ($i = 0; $i -lt $reader.FieldCount; $i++) {
                    $row[$reader.GetName($i)] = $reader.GetValue($i)
                }
                $tableResults += [PSCustomObject]$row
            }
            $reader.Close()
        }
    }
    catch {
        Write-Output "An error occurred while executing the stored procedure: $_"
        throw
    }

    # Return the result
    return $tableResults
}

<#
    Step 1
    1.  Initializing the SQL Connection and Local varaibles with default values
    2.  Log the starting time of the Runbook
#>
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection
$dictionary = New-Object System.Collections.Generic.Dictionary"[String,String]"
$sqlConnection.ConnectionString = $cnnBuilder.psBase.ConnectionString
Write-Output "Starting Time $startDateTime"

<#
    Step 2: Calling the Stored Procedure to see if there are any failed integrations today
#>
$result = 0
Write-Output "About to execute the storedProcedure: $sqlSpGetFailedIntegrationsCount"
$result = Execute-Scalar-SP -SqlConnection $sqlConnection -StoredProcedure $sqlSpGetFailedIntegrationsCount        
Write-Output "The result from the stored procedure is: $result"

<#
Step 3
1. Validate if Step 2 is Passed
2. Calling the Stored Procedure to get details of all the failed integration names for today
#>
if (($result -ne -1001) -and ($result -gt 0))
{
    try
    {
        $failedIntegrations = Execute-Table-SP -SqlConnection $sqlConnection -StoredProcedure $sqlSpGetFailedIntegrations
        $failedIntegrationsRetrieved = $true

        <#
        Step 4
        1.  Validate if Step 3 is Passed and the failed integartion is true flag.
        2.  Setting the Azure Contexts
        3.  Start the Azure VM where Schedule tasks are hosted
        #>
        if ($failedIntegrationsRetrieved -eq $true )
        {
            try
            {
                $vmStarted = $false
                Write-Output "Seems few of the integrations failed."
                Write-Output "This will now start the VM and will run the failed integrations."
                Write-Output "Setting the Azure contexts start ..."

                Write-Output "Calling the Disable-AzContextAutoSave..."
                # Ensures you do not inherit an AzContext in your runbook
                Disable-AzContextAutosave -Scope Process

                Write-Output "Calling the Connect-AzAccount with managed identity..."
                # Connect to Azure with system-assigned managed identity
                Connect-AzAccount -Identity
                # Connect-AzAccount -Identity -AccountId "ac31ba68-be92-4173-9e94-8ca9b1e0cee7"

                Write-Output "Calling the Set-AzContext to set the current execution context"
                # set and store context
                $AzureContext = Set-AzContext –SubscriptionId "$subscriptionId"

                Write-Output "Setting the Azure contexts end ..."

                # Check the status of the VM before starting it.
                # Check if Azure VM status, if the status is not running then start the VM
                # Get the VM status
                Write-Output "Calling Get-AzVM to get the status of the VM"
                $vmStatus = Get-AzVM -ResourceGroupName $vmresourceGroupName -Name $vmName -Status

                # Extract the power state
                $powerState = $vmStatus.Statuses | Where-Object { $_.Code -like "PowerState/*" }

                # Display the VM status
                Write-Output "The current status of the VM '$vmName' in resource group '$vmresourceGroupName' is: $($powerState.DisplayStatus)"

                # Step X : Start the VM as its stopped.
                if ($powerState.DisplayStatus -ne "VM running" )
                {
                    Write-Output "The Azure VM: $vmName is not in running state, starting the VM using Start-AzVM"
                    Write-Output "Executing the Start-AzVM to start the VM"

                    # Start the VM
                    Start-AzVM -Name $vmName -ResourceGroupName $vmresourceGroupName -DefaultProfile $AzureContext
                    Write-Output "The Azure VM: $vmName started successfully"
                    $vmStarted = $true

                    # # Wait for VM to start
                    # Start-Sleep -Seconds 30

                    Write-Output "Azure VM started successfully...."
                }
                else
                {
                    Write-Output "VM is already running..."
                    $vmStarted = $true
                }
            }catch {
                Write-Output "Error occurred when starting the VM: $vmName"
                Write-Output $_
            }
        }

        <#
        Step 5
        1.  Validate if Step 4 is Passed and the Virtual Started flag is true.
        2.  Execute the Stored Procedure in DB to update the flag of the failed integrations
        #>
        if (($failedIntegrationsRetrieved -eq $true) -and ($vmStarted -eq $true))
        {
            $spUpdate = $false
            try
            {
                Write-Output "Updating the status in the BatchReconcile table before rerunning the integrations in VM."
                Write-Output "About to execute the storedProcedure: $sqlSpUpdateFailedIntegrations"
                $failedIntegResult = Execute-Scalar-SP -SqlConnection $sqlConnection -StoredProcedure $sqlSpUpdateFailedIntegrations
                Write-Output "Stored procedure executed successfully..."
                $spUpdate = $true
            }
            catch
            {
                Write-Output "Error occurred when executing the SP: $sqlSpUpdateFailedIntegrations"
                Write-Output $_
            }
        }

        <#
        Step 6
        1.  Check if the Stored Procedure Update flag is true from previous step 5.
        2.  Execute the Powershell script remotely to the Azure VM and run the failed Integrations one by one
        3.  As the Task Schedulers run Asynchronously provide buffer time complete the asychronous task. The
            sleep time is configured in the sleepDuration variable, check the values in the Automation Services
            properties for the value.
        4.  Store the success or failure of the individual integrations in a HashTable
        #>
        $hasBatchExecuted = $false
        if ($spUpdate = $true)
        {
            try
            {
                $commandId = "RunPowerShellScript"
                $taskPath = "\Schedules-DEV\Delta"

                Write-Output "failedIntegrations: $failedIntegrations"
                Write-Output $failedIntegrations.GetType().Name
                foreach ($row in $failedIntegrations)                
                {
                    $row.PSObject.Properties | ForEach-Object {                    
                        try
                        {
                            if ($_.Name -eq "MethodToProcess")
                            {
                                $moduleName = "Sync_$($_.Value)_Delta"
                                $scriptStringVal = "Start-ScheduledTask -TaskName $moduleName -TaskPath $taskPath"
                                Write-Output "About to execute module $moduleName with Script $scriptStringVal"
                                # Execute the command on the VM
                                Invoke-AzVMRunCommand -ResourceGroupName $vmresourceGroupName -Name $vmName -CommandId $commandId -ScriptPath $null -ScriptString $scriptStringVal
                                Write-Output "Module $moduleName ran successfully..."

                                Write-Output "About to sleep for $sleepDuration"
                                # Wait for VM to start
                                Start-Sleep -Seconds $sleepDuration

                                $hasBatchExecuted = $true
                                $executedIntegrations[$_.value] = "Passed"                                
                            }
                        }
                        catch
                        {
                            $executedIntegrations[$_.value] = "Failed"
                            continue
                        }
                    }
                }
            }
            catch
            {
                Write-Output "An error occurred:"
                Write-Output $_
            }
        }

        <#
        Step 7
        1.  Check if the Task Scheduler run has atleast triggerred one integration by checking the hasBatchExecuted flag to true from previous step.
        2.  Call the Logic App configured to trigger the AxBus Datafactory Pipelines to sync AxBus DB Tables to BI DB Tables
        3.  This call to Logic App will be execute for the Integrations that has successfully exected in previous step.
            All Failed integrations in previous step will be skipped.
        #>
        if ($hasBatchExecuted -eq $true)
        {
            Write-Output "Batchs has been executed, will call the Logic App to run the DataFactory Pipeline for passed integrations."
            foreach($key in $executedIntegrations.keys)
            {
                if ($executedIntegrations[$key] -eq "Passed")
                {
                    Write-Output "About to execute the Logic App: $logicAppName with Process Method: $key"
                    # Define the payload to send with the POST request
                    # Here, we're sending the Sync JSON payload. Adjust as needed.
                    $payload = @{
                        "ProcessMethod" = "$key"
                        "SyncType" = "Delta"
                    } | ConvertTo-Json

                    # Define headers if required
                    $headers = @{
                        "Content-Type" = "application/json"
                    }

                    # Send the HTTP POST request to the Logic App URL
                    try {
                        $response = Invoke-RestMethod -Uri $logicAppUrl -Method Post -Body $payload -Headers $headers
                        Write-Output "Logic App invoked successfully. Response: $($response | ConvertTo-Json)"
                    } catch {
                        Write-Error "Failed to invoke Logic App. Error: $_"
                    }
                }
            }
        }

        <#
        Step 8
        1.  Gracefully close the SQL Connection opened for releasing memeory.
        2.  Check the Virtual Machine started flag to true, if so then stop the Azure VM gracefully.
        3.  Log the ending time of the Runbook
        #>
        $closeConnectionResult = Close-SqlConnection -SqlConnection $sqlConnection
        if ($vmStarted)
        {
            try
            {
                Write-Output "The Azure VM: $vmName will be stopped using Stop-AzVM"
                
                # Stop the VM
                Stop-AzVM -ResourceGroupName $vmresourceGroupName -Name $vmName -Force -ErrorAction Stop
                Write-Output "The Azure VM: $vmName stopped successfully"
                $vmStarted = $true

                # # Wait for VM to start
                # Start-Sleep -Seconds 30
            } catch{
                Write-Error "Failed to when stop the VM, the error as below."
                Write-Error "Error: $_"
            }
        }

    }catch {
        $failedIntegrationsRetrieved = $false
        Write-Output "Error occurred when executing the SP: $sqlSpGetFailedIntegrations"
        Write-Output $_
    }
}

# Write-Output "All Tasks completed..."
# Logging the end time of the script
Write-Output "Completed Time $(Get-Date)"

Benefits of the Proposed Solution

Increased Reliability

By implementing an auto-retry mechanism, the reliability of automated workflows is significantly improved. Transient errors are handled gracefully, reducing the likelihood of complete workflow failures.

Reduced Manual Intervention

The auto-retry logic minimizes the need for manual intervention, allowing automated processes to recover from transient failures autonomously. This leads to increased efficiency and reduced operational overhead.

Enhanced Efficiency

Automated retry mechanisms ensure that tasks are completed successfully without unnecessary delays. This enhances the overall efficiency of the system, as tasks do not remain in a failed state for extended periods. This ensured zero manual intervention on any failure in downstream service.

Cost Savings

By reducing downtime and the need for manual intervention, organizations can achieve cost savings. Automated processes can run more smoothly, leading to better resource utilization and lower operational costs. We saw a 60% reduction in VM costing after this implementation.

Conclusion

With this approach we have seen a 60% reduction in VM costing, 0% manual intervention on any failures, reports generated on auto-retry enhanced the reliability of system among the various stack holders and business owners.

References

What is Azure Automation

Tutorial: Create a PowerShell Workflow runbook in Automation

Tutorial: Create Automation PowerShell runbook using managed identity