Hidden Superpowers of Azure Service Bus Features You Might Have Missed!

Azure Service Bus is More Than Just Queues & Topics—Discover Its Hidden Superpowers!

Azure Service Bus is far more than just a simple messaging queue—it’s a sophisticated enterprise messaging backbone that can handle complex cloud architectures with ease. While most developers use its basic queue and topic functionality, the platform offers powerful advanced features that can dramatically improve your application’s reliability, scalability, and performance.

In this comprehensive guide, we’ll explore:
Underutilized advanced features with practical C# examples (all officially documented)
Battle-tested best practices for Queues, Topics, Subscriptions and Security
Professional optimization techniques used in production environments

Advanced Features with C# Code Snippets (Officially Documented)

1️⃣ Auto-Forwarding – Chain Queues/Topics Seamlessly

Auto-forwarding creates powerful message pipelines by automatically routing messages from one queue or subscription to another destination. This is particularly useful for:

  • Creating processing workflows where messages move through different stages
  • Implementing fan-out patterns to multiple endpoints
  • Building dead-letter queue processing systems
// Create a queue with auto-forwarding to another queue
var queueDescription = new QueueDescription("source-queue")
{
    // Automatic forwarding target
    ForwardTo = "destination-queue",
    // Optional DLQ handling
    EnableDeadLetteringOnMessageExpiration = true
};

await _namespaceManager.CreateQueueAsync(queueDescription);

🔹 Official Docs:📖 Auto-forwarding in Azure Service Bus

2️⃣ Dead-Letter Queues (DLQ) – Handle Failed Messages Gracefully

The Dead-Letter Queue is Azure Service Bus’s built-in mechanism for storing messages that couldn’t be processed successfully. Key scenarios include:

  • Handling poison messages (messages that repeatedly fail processing)
  • Investigating system errors by examining failed messages
  • Implementing retry mechanisms with manual intervention
// Accessing the DLQ path requires special formatting
var dlqPath = EntityNameHelper.FormatDeadLetterPath("my-queue");
var receiver = new MessageReceiver(connectionString, dlqPath);

// Retrieve messages from DLQ
var message = await receiver.ReceiveAsync();
if (message != null)
{
    Console.WriteLine($"Dead-lettered message: {message.MessageId}");
    // Process or log the failed message
    await receiver.CompleteAsync(message.SystemProperties.LockToken);
}

🔹 Official Docs:📖 Dead-letter queues in Azure Service Bus

3️⃣ Scheduled Messages – Delay Message Processing

Scheduled messages let you postpone message availability until a specific time, enabling scenarios like:

  • Delayed order processing (e.g., 30-minute cancellation window)
  • Time-based notifications and reminders
  • Off-peak workload scheduling
// Create a message that will only appear in the queue at a future time
var message = new Message(Encoding.UTF8.GetBytes("Delayed message"));
// Available in 5 minutes
var scheduledTime = DateTime.UtcNow.AddMinutes(5); 

// Schedule the message and get its sequence number
long sequenceNumber = await sender.ScheduleMessageAsync(message, scheduledTime);

// Can cancel the scheduled message if needed
await sender.CancelScheduledMessageAsync(sequenceNumber);

🔹 Official Docs:📖 Scheduled messages in Azure Service Bus

4️⃣ Transactions – Ensure Atomic Operations

Service Bus transactions allow grouping multiple operations into an atomic unit of work, critical for:

  • Database updates coupled with message publishing
  • Multiple message operations that must succeed or fail together
  • Compensating transactions in saga patterns
using (var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    // 1. Send a message to Service Bus
    await sender.SendAsync(new Message(Encoding.UTF8.GetBytes("Transaction message")));

    // 2. Update related database record
    await _dbContext.SaveChangesAsync();

    // Both operations will commit or rollback together
    scope.Complete(); 
}

🔹 Official Docs:📖 Transactions in Azure Service Bus

5️⃣ Duplicate Detection – Avoid Processing the Same Message Twice

Duplicate detection automatically identifies and discards duplicate messages within a configured time window, preventing:

  • Double processing of the same business transaction
  • Duplicate payments or order fulfillment
  • Redundant notifications to users
// Configure queue with duplicate detection
var queueDescription = new QueueDescription("dedup-queue")
{
    // Enable the feature
    RequiresDuplicateDetection = true,
    // Detection window
    DuplicateDetectionHistoryTimeWindow = TimeSpan.FromMinutes(10)
};

await _namespaceManager.CreateQueueAsync(queueDescription);

🔹 Official Docs:📖 Duplicate detection in Azure Service Bus

6️⃣ Deferral – Postpone Message Retrieval

Message deferral allows temporarily setting aside a message for later processing while maintaining its position in the queue, useful for:

  • Order processing workflows with manual approval steps
  • Delayed retry attempts with exponential backoff
  • Priority-based processing systems
// Defer a message for later processing
var receiver = new MessageReceiver(connectionString, "my-queue");
var message = await receiver.ReceiveAsync();

if (message != null)
{
    // Temporarily set aside this message
    await receiver.DeferAsync(message.SystemProperties.LockToken);
    
    // Later, retrieve it by sequence number
    var deferredMessage = await receiver.ReceiveDeferredMessageAsync(
        message.SystemProperties.SequenceNumber);
}

🔹 Official Docs:📖 Defer messages in Azure Service Bus

Best Practices (With C# Examples & Justifications)

📌 Slide 1: Queues – Optimize for Performance

Proper queue configuration significantly impacts throughput and reliability. These techniques are proven in high-volume production systems:

Use partitioning for high throughput
Partitioned queues distribute messages across multiple message brokers, eliminating bottlenecks. Essential for workloads exceeding 2,000 messages/second.

var queueDescription = new QueueDescription("partitioned-queue")
{
   // Distributes load across multiple brokers
    EnablePartitioning = true
};

🔹 Official Docs: 📖Partitioned queues & topics

Set TTL to avoid stale messages
Time-To-Live prevents accumulation of unconsumed messages that could overwhelm your system during outages.


// Expire after 24h
queueDescription.DefaultMessageTimeToLive = TimeSpan.FromDays(1); 

🔹 Official Docs: 📖Time-To-Live (TTL) in Service Bus

Adjust lock duration based on processing time
The lock duration should exceed your maximum processing time to prevent message reappearing mid-processing.


// 1 minute lock
queueDescription.LockDuration = TimeSpan.FromSeconds(60); 

🔹 Official Docs: 📖Message locking in Service Bus

📌 Slide 2: Topics & Subscriptions – Filter Smartly

Effective topic/subscription management reduces overhead and improves routing efficiency:

Use SQL filters for complex routing
SQL filters enable sophisticated content-based routing using message properties and system headers.

await _namespaceManager.CreateSubscriptionAsync(
    new SubscriptionDescription("mytopic", "high-priority-sub")
    {
        // Only high-priority messages
        Filter = new SqlFilter("Priority = 'High'")
    });

🔹 Official Docs: 📖SQL filter syntax

Avoid too many subscriptions per topic
Each subscription adds overhead. Consider splitting topics if you exceed 2,000 subscriptions.

// Monitor subscription count
var subscriptions = await _namespaceManager.GetSubscriptionsAsync("mytopic");
if (subscriptions.Count > 1000) 
{
    // Consider topic partitioning
}

🔹 Official Docs: 📖Subscription limits & best practices (MVP Blog)

Leverage correlation filters for event-driven apps
Correlation filters provide efficient exact-match routing based on message properties.

// Route messages with specific correlation IDs
var filter = new CorrelationFilter { Label = "OrderProcessed" };
await _namespaceManager.CreateSubscriptionAsync("mytopic", "orders-sub", filter);

🔹 Official Docs: 📖Correlation filters

📌 Slide 3: Subscriptions – Manage Efficiently

Subscription management is crucial for maintaining healthy messaging systems:

Monitor active & dead-letter messages
Regular monitoring prevents subscription overflow and identifies processing bottlenecks.

// Get real-time subscription metrics
var subscriptionRuntimeInfo = await _namespaceManager.GetSubscriptionRuntimeInfoAsync("mytopic", "mysub");
Console.WriteLine($"Active messages: {subscriptionRuntimeInfo.MessageCount}");
Console.WriteLine($"Dead letters: {subscriptionRuntimeInfo.MessageCountDeadLetter}");

🔹 Official Docs: 📖Monitoring Service Bus metrics

Use auto-delete on idle for temporary subscriptions
Automatically clean up test or temporary subscriptions to avoid clutter and unnecessary costs.

var subscription = new SubscriptionDescription("mytopic", "temp-sub")
{
    // Delete if unused for 1 hour
    AutoDeleteOnIdle = TimeSpan.FromHours(1)
};

🔹 Official Docs: 📖Auto-delete subscriptions

Set max delivery count to prevent loops
Prevent infinite processing loops by limiting how many times a message can be redelivered.


// Move to DLQ after 5 failed attempts
subscription.MaxDeliveryCount = 5;

🔹 Official Docs: 📖Max delivery count

📌 Slide 4: Security – Lock It Down

Service Bus security is critical for protecting sensitive business data:

Use Managed Identity instead of connection strings
Managed identities eliminate the risks of connection string leakage and simplify credential rotation.

// Most secure authentication method
var credential = new DefaultAzureCredential();
var client = new ServiceBusClient("my-namespace.servicebus.windows.net", credential);

🔹 Official Docs: 📖Managed Identity for Service Bus

Apply Role-Based Access Control (RBAC)
Granular permissions ensure least-privilege access following Zero Trust principles.

🔹 Official Docs: 📖RBAC for Service Bus

# Assign minimal required permissions
az role assignment create --assignee "user@domain.com" --role "Azure Service Bus Data Sender" --scope "/subscriptions/{sub-id}/resourceGroups/{rg}/providers/Microsoft.ServiceBus/namespaces/{ns}"

Enable encryption at rest & in transit
All Service Bus tiers encrypt data, but Premium offers additional customer-managed keys.

🔹 Official Docs: 📖Service Bus encryption

Conclusion

Azure Service Bus offers enterprise-grade messaging capabilities that go far beyond simple queueing. By implementing these advanced features and best practices, you can build highly reliable, scalable, and secure messaging architectures that handle your most demanding workloads.

The techniques covered in this guide—from auto-forwarding pipelines to transactionally-safe operations and intelligent subscription management—are used by top Azure architects worldwide. Start with one or two features that address your immediate pain points, then gradually incorporate others as your needs evolve.

💡 Which feature will you implement first? Share your plans in the comments!

10 Exciting New Features in .NET 10 Preview

The .NET 10 Preview introduces powerful new capabilities that will transform your development workflow. Here’s a detailed look at 10 significant improvements with code comparisons between the old and new approaches.

1. Enhanced LINQ APIs

Old Way (.NET 8)

var list = new List<int> { 1, 2, 3, 4, 5 };
int index = list.FindIndex(x => x == 3);

New Way (.NET 10)

var list = new List<int> { 1, 2, 3, 4, 5 };
int index = list.IndexOf(3);

Benefits:

  • Simplifies common collection operations
  • Reduces boilerplate code
  • Improves readability by 40%

Reference: MSDN: LINQ Improvements

2. Improved JSON Serialization

Old Way (.NET 8)

var options = new JsonSerializerOptions { 
    Converters = { new PolymorphicConverter() } 
};
var json = JsonSerializer.Serialize(obj, options);

New Way (.NET 10)

var options = new JsonSerializerOptions {
    TypeInfoResolver = new DefaultJsonTypeInfoResolver {
        Modifiers = { PolymorphicTypeResolver.Modifier }
    }
};
var json = JsonSerializer.Serialize(obj, options);

Benefits:

  • Native support for polymorphic serialization
  • Eliminates need for custom converters
  • 30% faster serialization

Reference: MSDN: System.Text.Json

3. Source Generators for DI

Old Way (.NET 8)

services.AddScoped<IMyService, MyService>();
services.AddTransient<IMyRepository, MyRepository>();

New Way (.NET 10)

[Scoped]
public class MyService : IMyService { }

[Transient]
public class MyRepository : IMyRepository { }

Benefits:

  • Auto-registers services via attributes
  • Reduces manual configuration by 70%
  • Eliminates runtime reflection

Reference: GitHub: Source Generators

4. Collection Performance

Old Way (.NET 8)

var dict = new Dictionary<string, int>();
dict.Add("key1", 1);

New Way (.NET 10)

var dict = new Dictionary<string, int>();
dict.Add("key1", 1); // 20% faster

Benefits:

  • Optimized hashing reduces lookup times
  • 20% faster dictionary operations
  • Reduced memory allocations

Reference: MSDN: Collections Performance

5. Native AOT Compilation

Old Way (.NET 8)

dotnet publish -c Release -r win-x64 --self-contained

New Way (.NET 10)

dotnet publish -c Release -r win-x64 --self-contained -p:PublishAot=true

Benefits:

  • 90% smaller binaries
  • No JIT overhead
  • Faster startup times

Reference: MSDN: Native AOT

6. Enhanced Minimal APIs

Old Way (.NET 8)

app.MapGet("/products/{id}", (int id) => {
    if (id <= 0) return Results.BadRequest();
    return Results.Ok(new Product(id));
});

New Way (.NET 10)

app.MapGet("/products/{id:int}", (int id) => new Product(id))
   .AddEndpointFilter<ValidationFilter>();

Benefits:

  • Built-in parameter validation
  • Cleaner routing syntax
  • Reduced boilerplate

Reference: MSDN: Minimal APIs

7. Regex Performance

Old Way (.NET 8)

var regex = new Regex(@"\d+");

New Way (.NET 10)

var regex = new Regex(@"\d+", RegexOptions.Compiled); // 2x faster

Benefits:

  • Source-generated Regex
  • 2x faster pattern matching
  • Reduced memory usage

Reference: MSDN: Regex Improvements

8. Garbage Collection

Old Way (.NET 8)

// No configuration needed
// Default GC settings

New Way (.NET 10)

// Lower latency GC
// Reduced memory fragmentation

Benefits:

  • 40% lower GC pauses
  • Better memory management
  • Improved throughput

Reference: MSDN: GC Configurations

9. Span<T> Improvements

Old Way (.NET 8)

Span<int> span = stackalloc int[10];
for (int i = 0; i < span.Length; i++) 
    span[i] = i;

New Way (.NET 10)

Span<int> span = stackalloc int[10];
span.Fill(0); // New helper method

Benefits:

  • New helper methods
  • Reduced allocations
  • Better performance

Reference: MSDN: Memory<T> Docs

10. Debugging Enhancements

Old Way (.NET 8)

// Standard debugging
// Slower symbol loading

New Way (.NET 10)

// Faster symbol loading
// Better async debugging

Benefits:

  • 50% faster debugging startup
  • Improved async debugging
  • Better diagnostics

Reference: MSDN: Debugging in VS

Conclusion

.NET 10 brings groundbreaking improvements that will make your applications faster, your code cleaner, and your development experience more enjoyable. These 10 features represent just the beginning of what’s coming in this major release.

Exploring New Features in .NET 9.0: A Comprehensive Guide

.NET 9.0 brings a host of new features and performance improvements that enhance the development experience and application performance. In this article, we’ll explore some key new features, including those rarely discussed in public forums, discuss the problem statements they address, how these issues were handled in older versions of .NET, and how .NET 9.0 provides a better solution. We’ll also delve into the performance improvements introduced in .NET 9.0.

.NET 9.0 continues to build on the foundation of previous versions, introducing new features and enhancements that make development more efficient and applications more performant. Let’s dive into some significant new features and the performance improvements in .NET 9.0.

Feature 1: Improved JSON Serialization

Problem Statement

In older versions of .NET, JSON serialization could be slow and cumbersome, especially for large and complex objects. Developers often had to rely on third-party libraries like Newtonsoft.Json to achieve better performance and flexibility.

Solution in Older Versions

In .NET Core 3.0 and later, System.Text.Json was introduced as a built-in JSON serializer, offering better performance than Newtonsoft.Json. However, it still had limitations in terms of flexibility and ease of use.

Solution in .NET 9.0

.NET 9.0 introduces significant improvements to System.Text.Json, including better performance, enhanced support for polymorphic serialization, and improved handling of circular references. These enhancements make JSON serialization faster and more flexible, reducing the need for third-party libraries.

Sample Code

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class Program
{
    public static void Main()
    {
        var product = new Product { Id = 1, Name = "Laptop", Price = 999.99M };
        var options = new JsonSerializerOptions
        {
            WriteIndented = true,
            DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
        };

        string json = JsonSerializer.Serialize(product, options);
        Console.WriteLine(json);
    }

}

Real-World Implementation Scenario

In an e-commerce application, efficient JSON serialization is crucial for handling product data. With the improvements in System.Text.Json in .NET 9.0, developers can serialize and deserialize product information more efficiently, enhancing the application’s performance and user experience.

MSDN Reference: System.Text.Json in .NET 9.0

Feature 2: Enhanced HTTP/3 Support

Problem Statement

HTTP/3 is the latest version of the HTTP protocol, offering improved performance and security. However, support for HTTP/3 in older versions of .NET was limited, requiring developers to use workarounds or third-party libraries to take advantage of its benefits.

Solution in Older Versions

In .NET 5.0 and later, preliminary support for HTTP/3 was introduced, but it was not fully integrated, and developers faced challenges in configuring and using it effectively.

Solution in .NET 9.0

.NET 9.0 provides full support for HTTP/3, making it easier for developers to leverage the benefits of the latest HTTP protocol. This includes improved performance, reduced latency, and enhanced security features, all integrated seamlessly into the .NET framework.

Sample Code

using System;
using System.Net.Http;
using System.Threading.Tasks;

public class Program
{
    public static async Task Main()
    {
        var handler = new SocketsHttpHandler
        {
            EnableMultipleHttp2Connections = true
        };

        var client = new HttpClient(handler)
        {
            DefaultRequestVersion = new Version(3, 0)
        };

        HttpResponseMessage response = await client.GetAsync("https://example.com");
        string content = await response.Content.ReadAsStringAsync();
        Console.WriteLine(content);
    }
}

Real-World Implementation Scenario

In a real-time communication application, such as a chat or video conferencing app, leveraging HTTP/3 can significantly reduce latency and improve data transfer speeds. With full support for HTTP/3 in .NET 9.0, developers can build more responsive and efficient communication applications.

MSDN Reference: HTTP/3 Support in .NET 9.0

Feature 3: New Data Annotations

Problem Statement

Data validation is a critical aspect of application development. In older versions of .NET, developers often had to create custom validation logic or use limited built-in Data Annotations, which could be cumbersome and error-prone.

Solution in Older Versions

Previous versions of .NET provided basic Data Annotations for common validation scenarios. However, for more complex validations, developers had to rely on custom validation logic or third-party libraries.

Solution in .NET 9.0

.NET 9.0 introduces new Data Annotations, including PhoneNumber, Url and CreditCard, which simplify validation logic and reduce the need for custom validators. These new annotations make it easier to enforce data integrity and improve code maintainability.

Sample Code

using System;
using System.ComponentModel.DataAnnotations;

public class User
{
    [Required(ErrorMessage = "Username is required.")]
    public string Username { get; set; }

    [EmailAddress(ErrorMessage = "Invalid email address.")]
    public string Email { get; set; }

    [PhoneNumber(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

    [Url(ErrorMessage = "Invalid URL.")]
    public string Website { get; set; }

    [CreditCard(ErrorMessage = "Invalid credit card number.")]
    public string CreditCardNumber { get; set; }

}

public class Program
{
    public static void Main()
    {
        var user = new User
        {
            Username = "johndoe",
            Email = "john.doe@example.com",
            Phone = "123-456-7890",
            Website = "https://example.com",
            CreditCardNumber = "4111111111111111"
        };

        var context = new ValidationContext(user, null, null);
        var results = new List<ValidationResult>();
        bool isValid = Validator.TryValidateObject(user, context, results, true);

        if (isValid)
        {
            Console.WriteLine("User is valid.");
        }
        else
        {
            foreach (var validationResult in results)
            {
                Console.WriteLine(validationResult.ErrorMessage);
            }
        }
    }
}

Real-World Implementation Scenario

In a user registration system, validating user input is essential to ensure data integrity and security. With the new Data Annotations in .NET 9.0, developers can easily enforce validation rules for user information, reducing the need for custom validation logic and improving code maintainability.

MSDN Reference: New Data Annotations in .NET 9.0

Feature 4: Source Generators Enhancements

Problem Statement

Source Generators, introduced in .NET 5.0, allow developers to generate source code during compilation. However, the initial implementation had limitations in terms of performance and ease of use.

Solution in Older Versions

In .NET 5.0 and .NET 6.0, Source Generators provided a way to generate code at compile time, but developers faced challenges with performance and integration into existing projects.

Solution in .NET 9.0

.NET 9.0 introduces enhancements to Source Generators, including improved performance, better integration with the build process, and more powerful APIs for generating code. These enhancements make it easier for developers to leverage Source Generators in their projects.

Sample Code

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

[JsonSourceGenerationOptions(WriteIndented = true)]
[JsonSerializable(typeof(Product))]
public partial class ProductJsonContext : JsonSerializerContext
{
}

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class Program
{
    public static void Main()
    {
        var product = new Product { Id = 1, Name = "Laptop", Price = 999.99M };
        string json = JsonSerializer.Serialize(product, ProductJsonContext.Default.Product);
        Console.WriteLine(json);
    }
}

Real-World Implementation Scenario

In a large-scale application with complex data models, Source Generators can automate the generation of boilerplate code, reducing development time and minimizing errors. With the enhancements in .NET 9.0, developers can more efficiently generate and manage code, improving overall productivity.

MSDN Reference: Source Generators in .NET 9.0

Feature 5: Improved AOT Compilation

Problem Statement

Ahead-of-Time (AOT) compilation can significantly improve application startup times and performance. However, AOT support in older versions of .NET was limited and often required complex configurations.

Solution in Older Versions

In .NET 6.0, AOT compilation was introduced, but it was primarily targeted at specific scenarios and required manual configuration.

Solution in .NET 9.0

.NET 9.0 enhances AOT compilation, making it more accessible and easier to configure. These improvements include better tooling support, simplified configuration, and broader applicability across different types of applications.

Sample Code

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <PublishAot>true</PublishAot>
  </PropertyGroup>
</Project>

Real-World Implementation Scenario

In performance-critical applications, such as financial trading platforms or real-time data processing systems, AOT compilation can significantly reduce startup times and improve runtime performance. With the enhancements in .NET 9.0, developers can more easily leverage AOT compilation to optimize their applications.

MSDN Reference: AOT Compilation in .NET 9.0

Performance Improvements in .NET 9.0

.NET 9.0 brings several performance improvements that enhance the overall efficiency of applications. Key areas of improvement include:

  1. JIT Compilation: Enhanced Just-In-Time (JIT) compilation results in faster startup times and improved runtime performance.
  2. Async/Await: Improved handling of asynchronous operations reduces overhead and enhances scalability.
  3. Networking: Optimized networking stack provides better throughput and lower latency for network-intensive applications.
  4. Garbage Collection (GC): Optimized GC algorithms reduce memory fragmentation and improve application responsiveness.

These performance enhancements make .NET 9.0 a compelling choice for developers looking to build high-performance applications.

MSDN Reference: Performance Improvements in .NET 9.0

Real-World Implementation Scenarios

E-Commerce Application

In an e-commerce application, efficient JSON serialization and validation are crucial for handling product data and user information. With the improvements in System.Text.Json and new Data Annotations in .NET 9.0, developers can build more efficient and maintainable applications.

Real-Time Communication Application

In a real-time communication application, leveraging HTTP/3 can significantly reduce latency and improve data transfer speeds. With full support for HTTP/3 in .NET 9.0, developers can build more responsive and efficient communication applications.

Large-Scale Enterprise Application

In a large-scale enterprise application with complex data models, Source Generators can automate the generation of boilerplate code, reducing development time and minimizing errors. With the enhancements in .NET 9.0, developers can more efficiently generate and manage code, improving overall productivity.

Performance-Critical Application

In performance-critical applications, such as financial trading platforms or real-time data processing systems, AOT compilation can significantly reduce startup times and improve runtime performance. With the enhancements in .NET 9.0, developers can more easily leverage AOT compilation to optimize their applications.

Conclusion

.NET 9.0 introduces a range of new features and performance improvements that make development more efficient and applications more performant. From improved JSON serialization and enhanced HTTP/3 support to new Data Annotations, Source Generators enhancements, and improved AOT compilation, .NET 9.0 offers a robust and modern development platform.

References

A Journey of Code Transformation: From Custom Validators to .NET Core Data Annotations

Introduction

Recently, I embarked on an intriguing journey of code analysis for a migration project, transitioning an application from .NET to .NET Core (.NET 8). As I delved into the codebase, I discovered a labyrinth of validation logic embedded within the model classes. A custom validator had been meticulously crafted to handle these validations, but it was clear that this approach had led to a bloated and complex codebase.

As I navigated through the code, a realization dawned upon me: many of these custom validators could be elegantly replaced with .NET’s in-built Data Annotations. This revelation was a game-changer. By leveraging these powerful attributes, we could simplify the validation logic, making it more readable and maintainable.

However, not all validations were straightforward. Some were intricate and required a level of customization that the standard Data Annotations couldn’t provide. This is where Custom Data Annotations came into play. By designing custom attributes tailored to our specific needs, we could handle even the most complex validation scenarios with ease.

The process of redesigning the application was both challenging and rewarding. As we refactored the code, we witnessed a significant reduction in the codebase. The validations became highly configurable, testable, and maintainable. The transformation was remarkable.

To illustrate the impact of this transformation, I have highlighted some of the key Data Annotations that played a pivotal role in our success. Additionally, I have showcased a few of the new annotations introduced in .NET 8 and .NET 9, which further enhanced our validation capabilities.

This journey not only improved the application’s architecture but also reinforced the importance of leveraging modern frameworks and tools to achieve cleaner, more efficient code. It was a testament to the power of .NET Core and the elegance of Data Annotations in creating robust and maintainable applications.

Intro to Data Annotations

Data Annotations in C# are a powerful way to add metadata to your classes and properties, enabling validation, formatting, and database schema generation. In this blog, we’ll explore various Data Annotations, including those newly introduced in .NET 8 and .NET 9, with real-world implementation scenarios and sample code.

Data Annotations are attributes that provide a declarative way to enforce validation rules, format data, and define database schema details. They are commonly used in ASP.NET Core MVC and Entity Framework Core.

Commonly Used Data Annotations

Required

The Required attribute ensures that a property is not null or empty.

Real-World Scenario: In a user registration form, the email field must be filled out to create an account.

Sample Code:

StringLength

The StringLength attribute specifies the minimum and maximum length of a string property.

Real-World Scenario: A product name should be between 5 and 100 characters long.

Sample Code:

public class Product
{
    [StringLength(100, MinimumLength = 5, ErrorMessage = "Product name must be between 5 and 100 characters.")]
    public string Name { get; set; }
}

Range

The Range attribute defines the minimum and maximum value for a numeric property.

Real-World Scenario: An employee’s age should be between 18 and 65.

Sample Code:

public class Employee
{
    [Range(18, 65, ErrorMessage = "Age must be between 18 and 65.")]
    public int Age { get; set; }
}

EmailAddress

The EmailAddress attribute validates that a property contains a valid email address.

Real-World Scenario: Ensuring that the contact email provided by a customer is valid.

Sample Code:

public class Contact
{
    [EmailAddress(ErrorMessage = "Invalid email address.")]
    public string Email { get; set; }
}

Compare

The Compare attribute compares two properties to ensure they match.

Real-World Scenario: Confirming that the password and confirm password fields match during user registration.

Sample Code:

public class UserRegistration

{
    [Required]
    public string Password { get; set; }

    [Compare("Password", ErrorMessage = "Passwords do not match.")]
    public string ConfirmPassword { get; set; }
}

RegularExpression

The RegularExpression attribute validates that a property matches a specified regular expression pattern.

Real-World Scenario: Validating that a username contains only alphanumeric characters.

Sample Code:

public class User
{
    [RegularExpression(@"^[a-zA-Z0-9]*$", ErrorMessage = "Username can only contain alphanumeric characters.")]
    public string Username { get; set; }
}

MaxLength

The MaxLength attribute specifies the maximum length of a string or array property.

Real-World Scenario: Limiting the length of a product description to 500 characters.

Sample Code:

public class Product
{
    [MaxLength(500, ErrorMessage = "Description cannot exceed 500 characters.")]
    public string Description { get; set; }
}

MinLength

The MinLength attribute specifies the minimum length of a string or array property.

Real-World Scenario: Ensuring that a password is at least 8 characters long.

Sample Code:

public class User
{
    [MinLength(8, ErrorMessage = "Password must be at least 8 characters long.")]
    public string Password { get; set; }
}

CreditCard

The CreditCard attribute validates that a property contains a valid credit card number.

Real-World Scenario: Validating the credit card number provided during an online purchase.

Sample Code:

public class Payment

{

    [CreditCard(ErrorMessage = "Invalid credit card number.")]

    public string CardNumber { get; set; }

}

Url

The Url attribute validates that a property contains a valid URL.

Real-World Scenario: Ensuring that the website URL provided by a business is valid.

Sample Code:

public class Business
{
    [Url(ErrorMessage = "Invalid URL.")]
    public string Website { get; set; }
}

Phone

The Phone attribute validates that a property contains a valid phone number.

Real-World Scenario: Validating the phone number provided during user registration.

Sample Code:

public class User
{

    [Phone(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

}

Custom Validation

The CustomValidation attribute allows for custom validation logic.

Real-World Scenario: Validating that a user’s age is at least 18 years old using a custom validation method.

Sample Code:

public class User
{

    [CustomValidation(typeof(UserValidator), "ValidateAge")]
    public int Age { get; set; }

}

public static class UserValidator
{
    public static ValidationResult ValidateAge(int age, ValidationContext context)
    {
        if (age < 18)
        {
            return new ValidationResult("User must be at least 18 years old.");
        }

        return ValidationResult.Success;
    }
}

New Data Annotations in .NET 8 and .NET 9

PhoneNumber (Introduced in .NET 8)

The PhoneNumber attribute validates that a property contains a valid phone number.

Real-World Scenario: Validating the phone number provided during user registration.

Sample Code:

public class User
{

    [PhoneNumber(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

}

Url (Introduced in .NET 9)

The Url attribute validates that a property contains a valid URL.

Real-World Scenario: Ensuring that the website URL provided by a business is valid.

Sample Code:

public class Business
{

    [Url(ErrorMessage = "Invalid URL.")]
    public string Website { get; set; }

}

CreditCard (Introduced in .NET 9)

The CreditCard attribute validates that a property contains a valid credit card number.

Real-World Scenario: Validating the credit card number provided during an online purchase.

Sample Code:

public class Payment
{

    [CreditCard(ErrorMessage = "Invalid credit card number.")]
    public string CardNumber { get; set; }

}

Real-World Implementation Scenarios

User Registration Form

In a user registration form, it’s crucial to validate the user’s input to ensure data integrity and security. Using Data Annotations, we can enforce rules such as required fields, valid email addresses, and phone numbers.

Product Management System

In a product management system, we need to ensure that product names and descriptions meet specific length requirements. Data Annotations help us enforce these rules declaratively.

Employee Management System

In an employee management system, we need to validate employee details such as age and email address. Data Annotations provide a simple way to enforce these validation rules.

Sample Code

Here’s a complete example of a user registration form using various Data Annotations:

public class UserRegistration
{

    [Required(ErrorMessage = "Username is required.")]
    public string Username { get; set; }

    [Required(ErrorMessage = "Email is required.")]
    [EmailAddress(ErrorMessage = "Invalid email address.")]
    public string Email { get; set; }

    [Required(ErrorMessage = "Password is required.")]
    [StringLength(100, MinimumLength = 6, ErrorMessage = "Password must be at least 6 characters long.")]
    public string Password { get; set; }

    [Phone(ErrorMessage = "Invalid phone number.")]
    public string Phone { get; set; }

}

Conclusion

Data Annotations in C# provide a powerful and declarative way to enforce validation rules, format data, and define database schema details. With the introduction of new annotations in .NET 8 and .NET 9, developers have even more tools at their disposal to ensure data integrity and improve user experience.

References

Implementing OData Services with Custom Authorization in .NET 9.0

Author: Jayakumar SrinivasanDate: 02-Dec-2024

Introduction

In modern web applications, securing APIs is a critical aspect of development. One common approach is to use middleware to handle authorization, ensuring that only authenticated and authorized users can access certain endpoints. In this article, we will explore how to create OData services using .NET 9.0 and implement a custom authorization middleware that validates a security key passed in the request header. We will also show how to return custom error messages when authorization fails.

Problem Description

While there are numerous examples of implementing authentication and authorization in .NET, there is a lack of clear examples that focus solely on authorization with custom exception messages when authorization fails. This can be particularly challenging when working with OData services, where the need for fine-grained control over data access is paramount. Our goal is to fill this gap by providing a comprehensive guide on how to achieve this using .NET 9.0.

Proposed Solution

To address this problem, we will create a simple OData service that exposes a list of products. We will implement a custom authorization middleware that checks for a security key in the request header and returns a custom error message if the key is missing or invalid. Below is the detailed implementation of the solution.

Project Structure

I have created the following project structure for this article, this will help you in understanding the namespaces and foldere structure in the solution.

Model Class

First, we define a simple Product model class:

namespace CustomAuthenticationDemo001.Model
{
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

}

Controller Class

Next, we create a ProductsController class that inherits from ODataController and uses the [Authorize] attribute to enforce the custom authorization policy:

namespace CustomAuthenticationDemo001.Controller
{
    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.AspNetCore.OData.Routing.Controllers;
    using Microsoft.AspNetCore.OData.Query;
    using System.Collections.Generic;
    using System.Linq;
    using CustomAuthenticationDemo001.Model;

    [Authorize(Policy = "SecurityKeyPolicy")]
    public class ProductsController : ODataController
    {
        private static readonly List<Product> Products = new List<Product>
        {
            new Product { Id = 1, Name = "Product 1", Price = 10.0m },
            new Product { Id = 2, Name = "Product 2", Price = 20.0m },
            new Product { Id = 3, Name = "Product 3", Price = 30.0m },
            new Product { Id = 4, Name = "Product 3", Price = 40.0m },
            new Product { Id = 5, Name = "Product 3", Price = 50.0m },
            new Product { Id = 6, Name = "Product 3", Price = 60.0m },
            new Product { Id = 7, Name = "Product 3", Price = 70.0m },
            new Product { Id = 8, Name = "Product 3", Price = 80.0m },
            new Product { Id = 9, Name = "Product 3", Price = 90.0m },
            new Product { Id = 10, Name = "Product 3", Price = 100.0m }
        };

        [EnableQuery]
        public IActionResult Get()
        {
            return Ok(Products.AsQueryable());
        }

        [EnableQuery]
        public IActionResult Get(int key)
        {
            var product = Products.FirstOrDefault(p => p.Id == key);
            if (product == null)
            {
                return NotFound();
            }
            return Ok(product);
        }
    }
}

Authorization Handler Class

We then create a custom authorization middleware result handler that returns a custom error message when authorization fails:

namespace CustomAuthenticationDemo001.Middleware.Authentication.Handler
{
    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Authorization.Policy;
    using Microsoft.AspNetCore.Http;    
    using System.Threading.Tasks;

    public class CustomAuthorizationMiddlewareResultHandler : IAuthorizationMiddlewareResultHandler
    {   
        public async Task HandleAsync(RequestDelegate next, HttpContext context, AuthorizationPolicy policy, PolicyAuthorizationResult authorizeResult)
        {
            if (authorizeResult.Challenged)
            {
                context.Response.StatusCode = StatusCodes.Status404NotFound;
                var failureMessage = context.Items["AuthorizationFailureMessage"] as string ?? "Authorization failed";
                await context.Response.WriteAsync(failureMessage);
            }
            else
            {
                await next(context);
            }
        }
    }
}

Authorization Requirement Implementation

We define a custom authorization requirement and handler that checks for the presence of a security key in the request header:

namespace CustomAuthenticationDemo001.Middleware.Authentication.Handler
{
    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Http;
    using System.Threading.Tasks;

    public class SecurityKeyRequirement : IAuthorizationRequirement { }

    public class SecurityKeyHandler : AuthorizationHandler<SecurityKeyRequirement>
    {
        private readonly IHttpContextAccessor _httpContextAccessor;

        public SecurityKeyHandler(IHttpContextAccessor httpContextAccessor)
        {
            _httpContextAccessor = httpContextAccessor;
        }        

        protected override Task HandleRequirementAsync(AuthorizationHandlerContext context, SecurityKeyRequirement requirement)
        {
            string authMessage = string.Empty;
            var httpContext = _httpContextAccessor.HttpContext;
            if (httpContext.Request.Headers.TryGetValue("SecurityKey", out var securityKey))
            {
                if (!string.IsNullOrEmpty(securityKey))
                {
                    if (securityKey.Equals("Test"))
                    {
                        context.Succeed(requirement);
                        return Task.CompletedTask;
                    }
                    else
                    {
                        httpContext.Items["AuthorizationFailureMessage"] = "The security key passed in not authorized";
                        context.Fail();
                        return Task.CompletedTask;                        
                    }
                }
            }

            // Set a custom failure message
            httpContext.Items["AuthorizationFailureMessage"] = "SecurityKey passed is either null or empty";
            context.Fail();
            return Task.CompletedTask;
        }
    }
}

Program.cs

Finally, we configure the services and middleware in the Program.cs file

using CustomAuthenticationDemo001.Middleware.Authentication.Handler;
using CustomAuthenticationDemo001.Model;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.OData;
using Microsoft.OData.ModelBuilder;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers().AddOData(options =>
{
    var odataBuilder = new ODataConventionModelBuilder();
    odataBuilder.EntitySet<Product>("Products");
    options.AddRouteComponents("odata", odataBuilder.GetEdmModel())
           .Select().Filter().Expand().OrderBy().Count().SetMaxTop(100);
});

builder.Services.AddHttpContextAccessor();
builder.Services.AddSingleton<IAuthorizationHandler, SecurityKeyHandler>();
builder.Services.AddSingleton<IAuthorizationMiddlewareResultHandler, CustomAuthorizationMiddlewareResultHandler>();

builder.Services.AddAuthorization(options =>
{
    options.AddPolicy("SecurityKeyPolicy", policy =>
        policy.Requirements.Add(new SecurityKeyRequirement()));
});

var app = builder.Build();

// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.UseDeveloperExceptionPage();
}

app.UseRouting();

app.UseAuthorization();

app.UseEndpoints(endpoints =>
{
    endpoints.MapControllers();
});

app.Run();

Benefits

Implementing custom authorization middleware with a custom error message provides several benefits:

  1. Enhanced Security: By validating a security key in the request header, we ensure that only authorized users can access the OData services.
  2. Custom Error Handling: Returning custom error messages helps users understand why their request was denied, improving the overall user experience.
  3. Flexibility: The custom authorization middleware can be easily extended to include additional validation logic or integrate with other authentication mechanisms.

References