Hidden Superpowers of Azure Service Bus: Features You Might Have Missed!

Azure Service Bus is More Than Just Queues & Topics—Discover Its Hidden Superpowers!

Azure Service Bus is far more than just a simple messaging queue—it’s a sophisticated enterprise messaging backbone that can handle complex cloud architectures with ease. While most developers use its basic queue and topic functionality, the platform offers powerful advanced features that can dramatically improve your application’s reliability, scalability, and performance.

In this comprehensive guide, we’ll explore:

  • Underutilized advanced features with practical C# examples (all officially documented)
  • Battle-tested best practices for Queues, Topics, Subscriptions, and Security
  • Professional optimization techniques used in production environments

Advanced Features with C# Code Snippets (Officially Documented)

1️⃣ Auto-Forwarding – Chain Queues/Topics Seamlessly

Auto-forwarding creates powerful message pipelines by automatically routing messages from one queue or subscription to another destination. This is particularly useful for:

  • Creating processing workflows where messages move through different stages
  • Implementing fan-out patterns to multiple endpoints
  • Building dead-letter queue processing systems
// Create a queue with auto-forwarding to another queue
var queueDescription = new QueueDescription("source-queue")
{
    // Automatic forwarding target
    ForwardTo = "destination-queue",
    // Optional DLQ handling
    EnableDeadLetteringOnMessageExpiration = true
};

await _namespaceManager.CreateQueueAsync(queueDescription);

🔹 Official Docs: 📖 Auto-forwarding in Azure Service Bus
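
Auto-forwarding can also be set on a subscription, which makes fan-out chains easy to build. A minimal sketch under the same setup as above (the entity names are illustrative):

// Forward everything this subscription matches on to a downstream queue
var subscriptionDescription = new SubscriptionDescription("orders-topic", "audit-sub")
{
    ForwardTo = "audit-queue"
};

await _namespaceManager.CreateSubscriptionAsync(subscriptionDescription);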

2️⃣ Dead-Letter Queues (DLQ) – Handle Failed Messages Gracefully

The Dead-Letter Queue is Azure Service Bus’s built-in mechanism for storing messages that couldn’t be processed successfully. Key scenarios include:

  • Handling poison messages (messages that repeatedly fail processing)
  • Investigating system errors by examining failed messages
  • Implementing retry mechanisms with manual intervention
// Accessing the DLQ path requires special formatting
var dlqPath = EntityNameHelper.FormatDeadLetterPath("my-queue");
var receiver = new MessageReceiver(connectionString, dlqPath);

// Retrieve messages from DLQ
var message = await receiver.ReceiveAsync();
if (message != null)
{
    Console.WriteLine($"Dead-lettered message: {message.MessageId}");
    // Process or log the failed message
    await receiver.CompleteAsync(message.SystemProperties.LockToken);
}

🔹 Official Docs: 📖 Dead-letter queues in Azure Service Bus
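
Messages usually land in the DLQ either automatically (max delivery count exceeded, TTL expiry) or explicitly from your code. A sketch of the explicit route, assuming the same SDK as above:

var receiver = new MessageReceiver(connectionString, "my-queue");
var message = await receiver.ReceiveAsync();

try
{
    // Attempt normal processing...
}
catch (Exception ex)
{
    // Explicitly dead-letter the poison message with a diagnostic reason
    await receiver.DeadLetterAsync(
        message.SystemProperties.LockToken,
        "ProcessingFailed",
        ex.Message);
}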

3️⃣ Scheduled Messages – Delay Message Processing

Scheduled messages let you postpone message availability until a specific time, enabling scenarios like:

  • Delayed order processing (e.g., 30-minute cancellation window)
  • Time-based notifications and reminders
  • Off-peak workload scheduling
// Create a message that will only appear in the queue at a future time
var message = new Message(Encoding.UTF8.GetBytes("Delayed message"));
// Available in 5 minutes
var scheduledTime = DateTime.UtcNow.AddMinutes(5); 

// Schedule the message and get its sequence number
long sequenceNumber = await sender.ScheduleMessageAsync(message, scheduledTime);

// Can cancel the scheduled message if needed
await sender.CancelScheduledMessageAsync(sequenceNumber);

🔹 Official Docs: 📖 Scheduled messages in Azure Service Bus

4️⃣ Transactions – Ensure Atomic Operations

Service Bus transactions allow grouping multiple Service Bus operations into an atomic unit of work, critical for:

  • Publishing several related messages that must succeed or fail together
  • Completing a received message and sending its follow-up message atomically
  • Compensating transactions in saga patterns

Note that a Service Bus transaction spans Service Bus operations only; it does not enlist external resources such as a SQL database. For database-plus-messaging atomicity, use a pattern like the transactional outbox.

using (var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    // 1. Send the first message
    await sender.SendAsync(new Message(Encoding.UTF8.GetBytes("Order created")));

    // 2. Send a second, related message atomically with the first
    await sender.SendAsync(new Message(Encoding.UTF8.GetBytes("Audit entry")));

    // Both sends commit or roll back together
    scope.Complete();
}

🔹 Official Docs: 📖 Transactions in Azure Service Bus

5️⃣ Duplicate Detection – Avoid Processing the Same Message Twice

Duplicate detection automatically identifies and discards duplicate messages within a configured time window, preventing:

  • Double processing of the same business transaction
  • Duplicate payments or order fulfillment
  • Redundant notifications to users
// Configure queue with duplicate detection
var queueDescription = new QueueDescription("dedup-queue")
{
    // Enable the feature
    RequiresDuplicateDetection = true,
    // Detection window
    DuplicateDetectionHistoryTimeWindow = TimeSpan.FromMinutes(10)
};

await _namespaceManager.CreateQueueAsync(queueDescription);

🔹 Official Docs: 📖 Duplicate detection in Azure Service Bus
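
Duplicate detection keys off the MessageId, so the sender must assign a stable, business-meaningful ID. A minimal sketch (the ID scheme is illustrative):

var message = new Message(Encoding.UTF8.GetBytes("Order 42 created"))
{
    // Re-sends with the same MessageId inside the detection window are dropped
    MessageId = "order-42-created"
};

await sender.SendAsync(message);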

6️⃣ Deferral – Postpone Message Retrieval

Message deferral lets you set a message aside without removing it from the queue; the deferred message can later be retrieved only by its sequence number. This is useful for:

  • Order processing workflows with manual approval steps
  • Delayed retry attempts with exponential backoff
  • Priority-based processing systems
// Defer a message for later processing
var receiver = new MessageReceiver(connectionString, "my-queue");
var message = await receiver.ReceiveAsync();

if (message != null)
{
    // Temporarily set aside this message; persist its sequence number,
    // as that is the only way to retrieve a deferred message
    await receiver.DeferAsync(message.SystemProperties.LockToken);
    
    // Later, retrieve it by sequence number
    var deferredMessage = await receiver.ReceiveDeferredMessageAsync(
        message.SystemProperties.SequenceNumber);
}

🔹 Official Docs: 📖 Defer messages in Azure Service Bus

Best Practices (With C# Examples & Justifications)

📌 Slide 1: Queues – Optimize for Performance

Proper queue configuration significantly impacts throughput and reliability. These techniques are proven in high-volume production systems:

Use partitioning for high throughput
Partitioned queues distribute messages across multiple message brokers, eliminating bottlenecks. Essential for workloads exceeding 2,000 messages/second.

var queueDescription = new QueueDescription("partitioned-queue")
{
    // Distributes load across multiple brokers
    EnablePartitioning = true
};

🔹 Official Docs: 📖 Partitioned queues & topics

Set TTL to avoid stale messages
Time-To-Live prevents accumulation of unconsumed messages that could overwhelm your system during outages.


// Expire after 24h
queueDescription.DefaultMessageTimeToLive = TimeSpan.FromDays(1); 

🔹 Official Docs: 📖 Time-To-Live (TTL) in Service Bus

Adjust lock duration based on processing time
The lock duration should exceed your maximum processing time; otherwise the lock expires mid-processing and the message reappears for another consumer.


// 1 minute lock
queueDescription.LockDuration = TimeSpan.FromSeconds(60); 

🔹 Official Docs: 📖 Message locking in Service Bus
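
When processing time is unpredictable, the client can also renew locks automatically instead of relying on one long fixed lock duration. A sketch using the message handler in Microsoft.Azure.ServiceBus (durations and concurrency values are illustrative):

var queueClient = new QueueClient(connectionString, "my-queue");

var options = new MessageHandlerOptions(args =>
{
    Console.WriteLine($"Handler error: {args.Exception.Message}");
    return Task.CompletedTask;
})
{
    // Keep renewing the lock while a handler runs, up to this limit
    MaxAutoRenewDuration = TimeSpan.FromMinutes(5),
    MaxConcurrentCalls = 4,
    AutoComplete = false
};

queueClient.RegisterMessageHandler(async (message, token) =>
{
    // Long-running processing goes here
    await queueClient.CompleteAsync(message.SystemProperties.LockToken);
}, options);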

📌 Slide 2: Topics & Subscriptions – Filter Smartly

Effective topic/subscription management reduces overhead and improves routing efficiency:

Use SQL filters for complex routing
SQL filters enable sophisticated content-based routing using message properties and system headers.

await _namespaceManager.CreateSubscriptionAsync(
    new SubscriptionDescription("mytopic", "high-priority-sub"),
    // Only high-priority messages reach this subscription
    new SqlFilter("Priority = 'High'"));

🔹 Official Docs: 📖 SQL filter syntax
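
The filter above evaluates user properties on the message, so the sender has to set them. A minimal sketch:

var message = new Message(Encoding.UTF8.GetBytes("VIP order"));
// "Priority" is the property the SQL filter inspects
message.UserProperties["Priority"] = "High";

await sender.SendAsync(message);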

Avoid too many subscriptions per topic
Each subscription adds overhead, and a topic supports at most 2,000 subscriptions. Consider splitting topics well before you approach that limit.

// Monitor subscription count (Count() requires System.Linq)
var subscriptions = await _namespaceManager.GetSubscriptionsAsync("mytopic");
if (subscriptions.Count() > 1000)
{
    // Approaching the 2,000-subscription limit: consider splitting the topic
}

🔹 Official Docs: 📖 Subscription limits & best practices (MVP Blog)

Leverage correlation filters for event-driven apps
Correlation filters provide efficient exact-match routing based on message properties.

// Route messages with specific correlation IDs
var filter = new CorrelationFilter { Label = "OrderProcessed" };
await _namespaceManager.CreateSubscriptionAsync("mytopic", "orders-sub", filter);

🔹 Official Docs: 📖 Correlation filters

📌 Slide 3: Subscriptions – Manage Efficiently

Subscription management is crucial for maintaining healthy messaging systems:

Monitor active & dead-letter messages
Regular monitoring prevents subscription overflow and identifies processing bottlenecks.

// Get real-time subscription metrics
var subscriptionRuntimeInfo = await _namespaceManager.GetSubscriptionRuntimeInfoAsync("mytopic", "mysub");
Console.WriteLine($"Active messages: {subscriptionRuntimeInfo.MessageCount}");
Console.WriteLine($"Dead letters: {subscriptionRuntimeInfo.MessageCountDeadLetter}");

🔹 Official Docs: 📖 Monitoring Service Bus metrics

Use auto-delete on idle for temporary subscriptions
Automatically clean up test or temporary subscriptions to avoid clutter and unnecessary costs.

var subscription = new SubscriptionDescription("mytopic", "temp-sub")
{
    // Delete if unused for 1 hour
    AutoDeleteOnIdle = TimeSpan.FromHours(1)
};

🔹 Official Docs: 📖 Auto-delete subscriptions

Set max delivery count to prevent loops
Prevent infinite processing loops by limiting how many times a message can be redelivered.


// Move to DLQ after 5 failed attempts
subscription.MaxDeliveryCount = 5;

🔹 Official Docs: 📖 Max delivery count

📌 Slide 4: Security – Lock It Down

Service Bus security is critical for protecting sensitive business data:

Use Managed Identity instead of connection strings
Managed identities eliminate the risks of connection string leakage and simplify credential rotation.

// Most secure authentication method (Azure.Messaging.ServiceBus + Azure.Identity)
var credential = new DefaultAzureCredential();
var client = new ServiceBusClient("my-namespace.servicebus.windows.net", credential);

🔹 Official Docs: 📖 Managed Identity for Service Bus

Apply Role-Based Access Control (RBAC)
Granular permissions ensure least-privilege access following Zero Trust principles.

# Assign minimal required permissions
az role assignment create --assignee "user@domain.com" --role "Azure Service Bus Data Sender" --scope "/subscriptions/{sub-id}/resourceGroups/{rg}/providers/Microsoft.ServiceBus/namespaces/{ns}"

🔹 Official Docs: 📖 RBAC for Service Bus

Enable encryption at rest & in transit
All Service Bus tiers encrypt data at rest and in transit; the Premium tier additionally supports customer-managed keys.

🔹 Official Docs: 📖 Service Bus encryption

Conclusion

Azure Service Bus offers enterprise-grade messaging capabilities that go far beyond simple queueing. By implementing these advanced features and best practices, you can build highly reliable, scalable, and secure messaging architectures that handle your most demanding workloads.

The techniques covered in this guide, from auto-forwarding pipelines to atomic multi-message transactions and intelligent subscription management, are used by top Azure architects worldwide. Start with one or two features that address your immediate pain points, then gradually incorporate others as your needs evolve.

💡 Which feature will you implement first? Share your plans in the comments!

Avoid redeploying Logic Apps by using Key Vault-backed Named Values in APIM

Introduction

In the fast-paced world of software development, agility and efficiency are paramount. Imagine a scenario where your development team is constantly bogged down by the need to redeploy Logic Apps every time there is a change in Azure Key Vault secrets. This blog will take you through a journey of overcoming this bottleneck by leveraging Azure API Management (APIM) Named Values to retrieve Key Vault values without the need for redeployment. Let’s dive into the problem, explore various solutions, and unveil a streamlined approach that will save your team time and effort.

Problem Description

Picture this: Your team is working on a project that requires frequent updates to configuration settings stored in Azure Key Vault. Every time a secret is updated, the Logic App needs to be redeployed to fetch the latest values. This process is not only time-consuming but also disrupts the development workflow, leading to frustration and inefficiency.

Example Use Case

Consider a partner system that requires the creation of a sandbox environment every week. The credentials for this environment change regularly. Since the secrets are stored in Key Vault, any change necessitates redeploying the Logic App to fetch the updated values. This frequent redeployment creates an overhead for the development team, causing delays and increasing the risk of errors.

Existing System Limitation

The primary limitation of the existing system is the need to redeploy the Logic App whenever there is a change in the Key Vault secrets. This process is time-consuming and can disrupt the development workflow, making it difficult to keep up with frequent configuration changes.

Different Solution Approaches

Approach 1: Directly Calling Key Vault Action in Logic App

Imagine a scenario where you decide to call the Key Vault action GetSecret directly within the Logic App to retrieve the updated values. At first glance, this seems like a straightforward solution. However, as you delve deeper, you realize that this method has its drawbacks:

  • Speed Bumps: It takes almost 1 second to retrieve the values, which can add up quickly if you have multiple secrets to fetch.
  • Secret Retrieval Limitation: There is no way to retrieve multiple secrets in a single call; each secret needs its own action, so fetching five secrets means five separate hits in the Logic App, leading to inefficiencies and potential performance issues.

Approach 2: Creating a Custom REST Service

Next, you consider creating a REST service that retrieves the Key Vault secrets. This service can be hosted separately and can retrieve multiple secrets in one API call. While this approach offers some flexibility, it comes with its own set of challenges:

  • Cost Considerations: Hosting and maintaining a separate REST service can incur additional costs.
  • Development Effort: Building and integrating the REST service requires significant development effort.
  • Maintenance Overhead: Keeping the REST service up-to-date and ensuring its reliability adds to the maintenance burden.

Approach 3: Using APIM Named Values

The third approach uses APIM Named Values, which can be configured to fetch their values directly from Key Vault. This approach offers several advantages:

  • Blazing Fast: It is faster compared to other approaches, ensuring quick retrieval of secrets.
  • Multi-Secret Retrieval: It can handle retrieving multiple values from Key Vault in a single API call, making it highly efficient.
  • Seamless Updates: Changes to the Key Vault secrets can be updated in the Named Values using the “Fetch Key Vault Secret” context menu, eliminating the need for redeployment.

Proposed Solution: Using APIM Named Values

Benefits of the Proposed Solution

  • Performance: The proposed solution significantly outperforms other methods, ensuring rapid retrieval of secrets from the Key Vault.
  • High Efficiency: Capable of handling multiple secret retrievals in a single API call, this approach maximizes efficiency and minimizes latency.
  • Seamless Updates: Say goodbye to redeployment headaches! Changes to Key Vault secrets can be effortlessly updated in Named Values using the “Fetch Key Vault Secret” context menu, streamlining the update process.
  • Reduced Development Overhead: By eliminating the need for frequent redeployments, this solution frees up valuable development time, allowing your team to focus on more critical tasks.
  • Enhanced Reliability: With fewer moving parts and a more streamlined process, the proposed solution enhances the overall reliability and stability of your application.

Steps for Implementing the Proposed Solution

Create Named Values in APIM

  • Navigate to the Azure APIM instance.
  • Go to the “Named Values” section.
  • Create a new Named Value and configure it to fetch the secret from Azure Key Vault.

Configure the GET Method

  • Create a new API in APIM.
  • Define a GET method that will retrieve the values from the Named Values configured in the previous step.
  • Important Settings
    • Set the <set-header> policy inside <return-response>. If this policy is not set, then when you call the method from the Logic App you will get the response content type as “application/octet-stream”.
    • <set-header> is order-sensitive: declare it before <set-body>. The expected sequence is as below (returning JSON, for example):

      <return-response>
          <set-status ... />
          <set-header name="Content-Type" exists-action="override">
              <value>application/json</value>
          </set-header>
          <set-body>...</set-body>
      </return-response>

Update Named Values

  • Use the “Fetch Key Vault Secret” context menu to update the Named Values whenever there is a change in the Key Vault secrets.

Test the API

Calling from Logic App

There are two ways of calling an APIM endpoint from a Logic App: using the APIM action, or using the HTTP action. I will use the HTTP action here. After configuring it as shown below, you will get the response from APIM (<0.25 ms).

Calling APIM Named Values Get method from Logic App with Http Trigger
Calling APIM Named Values Get method from Logic App with Scheduled Trigger
Sending the Values in the Request Body from APIM

This approach can be used when you have an HTTP trigger and want to send some values from Key Vault, such as Client ID, Client Secret, and Token Scope. You can add a <set-body> policy to achieve this, either on a specific API method or on All operations in APIM. The code below shows adding the <set-body> policy at the All operations scope. Note that this approach adds the named values to the body, so the rest of the request remains intact. In the example, <set-body> uses JProperty to set the elements of the JSON, drawing on the named values TokenUrl, TokenScope, TenantId, ClientId, ClientSecret, and GrantType-ClientCredentials.
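
A sketch of what that All operations inbound policy could look like, assuming the named values listed above already exist (the merge logic is illustrative):

<inbound>
    <base />
    <set-body>@{
        // Merge the Key Vault-backed named values into the incoming request body
        var body = context.Request.Body.As<JObject>(preserveContent: true);
        body.Add(new JProperty("TokenUrl", "{{TokenUrl}}"));
        body.Add(new JProperty("TokenScope", "{{TokenScope}}"));
        body.Add(new JProperty("TenantId", "{{TenantId}}"));
        body.Add(new JProperty("ClientId", "{{ClientId}}"));
        body.Add(new JProperty("ClientSecret", "{{ClientSecret}}"));
        body.Add(new JProperty("GrantType", "{{GrantType-ClientCredentials}}"));
        return body.ToString();
    }</set-body>
</inbound>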

By following these steps, you can use APIM Named Values to retrieve Key Vault values in Logic Apps without redeploying the Logic App whenever a secret changes, streamlining the development process.

Conclusion

In this blog, we discussed the challenges of using Azure Key Vault in Logic Apps and the need for redeployment whenever there is a change in Key Vault secrets. We explored different solution approaches and proposed a solution that involves using APIM Named Values. This solution offers significant benefits in terms of speed, efficiency, and ease of updates, making it an ideal choice for scenarios with frequent configuration changes.

Mastering Entity Framework Core: Advanced Features Uncovered with Real-Time Scenarios

Entity Framework (EF) has been a cornerstone for .NET developers, providing a robust ORM (Object-Relational Mapping) framework to interact with databases using .NET objects. While many developers are familiar with the basics, there are several lesser-known tricks and best practices that can significantly enhance your EF experience. In this article, we will explore some of these hidden gems and also highlight new features introduced in the latest version of Entity Framework Core (EF Core 6.0 and EF Core 7.0).

Leveraging Global Query Filters

Real-Time Scenario

Imagine you are developing a multi-tenant application where each tenant should only see their own data. Implementing this manually in every query can be error-prone and cumbersome.

Feature Explanation

Global Query Filters, introduced in EF Core 2.0, allow you to define filters that apply to all queries for a given entity type. This is particularly useful for implementing multi-tenancy, soft deletes, or any scenario where you need to filter data globally.

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Customer>()
        .HasQueryFilter(c => !c.IsDeleted);
} 
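
For the multi-tenant scenario above, the same mechanism can scope every query to the current tenant. A minimal sketch, assuming a hypothetical _tenantId field on the DbContext and a TenantId column on the entity:

modelBuilder.Entity<Order>()
    .HasQueryFilter(o => o.TenantId == _tenantId);

Because the filter captures the context field, each DbContext instance applies its own tenant value at query time.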

Table Schema

SQL Script

CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    IsDeleted BIT
);

Table Format

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the customer
IsDeleted     BIT             Indicates if the customer is deleted

Using Value Conversions

Real-Time Scenario

Suppose you have a custom type for representing monetary values in your domain model, but you need to store these values as decimals in the database.

Feature Explanation

Value Conversions, introduced in EF Core 2.1, enable you to map custom types to database types. This is useful when you have domain-specific types that need to be stored in a database-friendly format.

public class Money
{
    public decimal Amount { get; set; }
    public string Currency { get; set; }

    public override string ToString() => $"{Amount} {Currency}";
    public static Money Parse(string value)
    {
        var parts = value.Split(' ');
        return new Money { Amount = decimal.Parse(parts[0]), Currency = parts[1] };
    }
}

modelBuilder.Entity<Order>()
    .Property(o => o.TotalAmount)
    .HasConversion(
        v => v.ToString(), // Convert to string for storage
        v => Money.Parse(v) // Convert back to custom type
    );  

Table Schema

SQL Script

CREATE TABLE Orders (
    Id INT PRIMARY KEY,
    CustomerId INT,
    TotalAmount NVARCHAR(50),
    FOREIGN KEY (CustomerId) REFERENCES Customers(Id)
);
    

Table Format

Column Name   Data Type       Description
Id            INT             Primary key
CustomerId    INT             Foreign key to Customers table
TotalAmount   NVARCHAR(50)    Custom monetary value stored as string

Compiled Queries for Performance

Real-Time Scenario

You have a high-traffic application where certain queries are executed frequently, and you need to optimize performance to handle the load.

Feature Explanation

Compiled Queries, introduced in EF Core 2.0, can significantly improve the performance of frequently executed queries by pre-compiling the query plan.

private static readonly Func<YourDbContext, int, Customer> _compiledQuery =
    EF.CompileQuery((YourDbContext context, int id) =>
        context.Customers.Single(c => c.Id == id));

public Customer GetCustomerById(int id)
{
    return _compiledQuery(_context, id);
}
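
An async variant exists as well; a sketch mirroring the query above with EF.CompileAsyncQuery:

private static readonly Func<YourDbContext, int, Task<Customer>> _compiledAsyncQuery =
    EF.CompileAsyncQuery((YourDbContext context, int id) =>
        context.Customers.Single(c => c.Id == id));

public Task<Customer> GetCustomerByIdAsync(int id)
{
    return _compiledAsyncQuery(_context, id);
}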
    

Table Schema

SQL Script

CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);   

Table Format

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the customer

Interceptors for Advanced Scenarios

Real-Time Scenario

You need to implement custom logging for all database commands executed by your application to comply with auditing requirements.

Feature Explanation

Interceptors, introduced in EF Core 3.0, allow you to hook into various stages of EF’s operation, such as command execution, saving changes, and more. This is useful for logging, auditing, or modifying behavior dynamically.

public class MyCommandInterceptor : DbCommandInterceptor
{
    public override InterceptionResult<int> NonQueryExecuting(
        DbCommand command, CommandEventData eventData, InterceptionResult<int> result)
    {
        // Log (or modify) the command before it executes
        Console.WriteLine($"Executing: {command.CommandText}");
        return base.NonQueryExecuting(command, eventData, result);
    }

    // Override ReaderExecuting and ScalarExecuting as well to capture
    // queries, not just INSERT/UPDATE/DELETE commands.
}

protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder.AddInterceptors(new MyCommandInterceptor());
}
    

Table Schema

No specific schema changes are required for interceptors.

Temporal Tables for Historical Data

Real-Time Scenario

Your application needs to maintain a history of changes to certain entities for auditing and compliance purposes.

Feature Explanation

Temporal Tables, supported by EF Core 6.0, allow you to track changes to your data over time. This is useful for auditing and historical analysis.

modelBuilder.Entity<Customer>()
    .ToTable("Customers", b => b.IsTemporal());
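
Once the table is temporal, EF Core 6.0 can also query historical state directly. A short sketch using the TemporalAsOf operator (the timestamp is illustrative):

// Read the Customers rows exactly as they looked one week ago (UTC)
var lastWeek = DateTime.UtcNow.AddDays(-7);
var customersThen = context.Customers
    .TemporalAsOf(lastWeek)
    .ToList();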
    

Table Schema

SQL Script

CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START,
    ValidTo DATETIME2 GENERATED ALWAYS AS ROW END,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
) WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomersHistory));
    

Table Format

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the customer
ValidFrom     DATETIME2       Start of the validity period
ValidTo       DATETIME2       End of the validity period

New Features in the Latest Entity Framework

a. Many-to-Many Relationships

Real-Time Scenario

You are developing a library management system where books can have multiple authors, and authors can write multiple books. Modeling this relationship manually can be tedious.

Feature Explanation

EF Core 5.0 introduced native support for many-to-many relationships without needing an explicit join entity.

modelBuilder.Entity<Author>()
    .HasMany(a => a.Books)
    .WithMany(b => b.Authors);
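
For reference, the entity classes behind that mapping might look like the sketch below; EF Core 5.0 creates the AuthorBook join table by convention from the two skip navigations:

public class Author
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<Book> Books { get; set; }     // Skip navigation
}

public class Book
{
    public int Id { get; set; }
    public string Title { get; set; }
    public ICollection<Author> Authors { get; set; } // Skip navigation
}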
    

Table Schema

SQL Script
CREATE TABLE Authors (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);

CREATE TABLE Books (
    Id INT PRIMARY KEY,
    Title NVARCHAR(100)
);

CREATE TABLE AuthorBook (
    AuthorId INT,
    BookId INT,
    PRIMARY KEY (AuthorId, BookId),
    FOREIGN KEY (AuthorId) REFERENCES Authors(Id),
    FOREIGN KEY (BookId) REFERENCES Books(Id)
);
    
Table Format

Authors Table

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the author

Books Table

Column Name   Data Type       Description
Id            INT             Primary key
Title         NVARCHAR(100)   Title of the book

AuthorBook Table

Column Name   Data Type       Description
AuthorId      INT             Foreign key to Authors table
BookId        INT             Foreign key to Books table

b. Improved LINQ Translation

Real-Time Scenario

You have complex LINQ queries that need to be translated into efficient SQL to ensure optimal performance.

Feature Explanation

EF Core 5.0 and later versions have improved their LINQ translation capabilities, allowing for more complex queries to be translated into efficient SQL.

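As an illustration, a grouped projection like the sketch below (reusing the Customers/Orders model from earlier) now translates into a single GROUP BY statement instead of failing or evaluating on the client:

var orderSummaries = context.Orders
    .GroupBy(o => o.CustomerId)
    .Select(g => new
    {
        CustomerId = g.Key,
        OrderCount = g.Count()
    })
    .ToList();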

c. Split Queries for Related Data

Real-Time Scenario

You need to load large datasets with related data without running into performance issues caused by the N+1 query problem.

Feature Explanation

Split Queries, introduced in EF Core 5.0, allow you to load related data in multiple queries, reducing the risk of the N+1 query problem and improving performance for large result sets.

var data = context.Customers
    .Include(c => c.Orders)
    .AsSplitQuery()
    .ToList();
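
If most of your queries benefit from splitting, you can opt in globally instead of per query; a sketch for the SQL Server provider (AsSingleQuery() can then override the default on individual queries):

optionsBuilder.UseSqlServer(
    connectionString,
    o => o.UseQuerySplittingBehavior(QuerySplittingBehavior.SplitQuery));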
    

Table Schema

SQL Script
CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);

CREATE TABLE Orders (
    Id INT PRIMARY KEY,
    CustomerId INT,
    TotalAmount NVARCHAR(50),
    FOREIGN KEY (CustomerId) REFERENCES Customers(Id)
);
    
Table Format

Customers Table

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the customer

Orders Table

Column Name   Data Type       Description
Id            INT             Primary key
CustomerId    INT             Foreign key to Customers table
TotalAmount   NVARCHAR(50)    Custom monetary value stored as string

d. Savepoints for Transactions

Real-Time Scenario

You are performing a series of operations within a transaction and need to create intermediate points to roll back to in case of errors.

Feature Explanation

Savepoints, introduced in EF Core 5.0, allow you to create intermediate points within a transaction, providing more control over transaction management.

using var transaction = context.Database.BeginTransaction();

// Perform and persist some initial operations
context.SaveChanges();

// Mark a point to roll back to without losing the work above
transaction.CreateSavepoint("BeforeCriticalOperation");

try
{
    // Perform the critical operation
    context.SaveChanges();
}
catch
{
    // Undo only the critical operation; earlier work in the transaction survives
    transaction.RollbackToSavepoint("BeforeCriticalOperation");
}

transaction.Commit();
    

Table Schema

No specific schema changes are required for savepoints.

e. Primitive Collections

Real-Time Scenario

You need to store a list of primitive values, such as strings or integers, directly within an entity without creating a separate table.

Feature Explanation

Native primitive-collection mapping (storing a list of primitives in a single column) arrived in EF Core 8.0. On earlier versions, you can approximate it with a value conversion that serializes the list, as shown below.

public class Product
{
    public int Id { get; set; }
    public List<string> Tags { get; set; }
}

modelBuilder.Entity<Product>()
    .Property(p => p.Tags)
    .HasConversion(
        v => string.Join(',', v), // Convert list to comma-separated string
        v => v.Split(',', StringSplitOptions.RemoveEmptyEntries).ToList() // Convert string back to list
    );
    

Table Schema

SQL Script
CREATE TABLE Products (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    Tags NVARCHAR(MAX)
);
    
Table Format

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the product
Tags          NVARCHAR(MAX)   Comma-separated list of tags

f. HierarchyId Support

Real-Time Scenario

You are working with hierarchical data, such as organizational structures or file systems, and need to efficiently manage and query this data.

Feature Explanation

First-class HierarchyId support ships with EF Core 8.0's SQL Server provider; on earlier versions, the community EntityFrameworkCore.SqlServer.HierarchyId package provides it. The mapping below instead stores the value as a string via a conversion, which works anywhere but gives up server-side hierarchy operations.

public class Category
{
    public int Id { get; set; }
    public HierarchyId HierarchyId { get; set; }
}

modelBuilder.Entity<Category>()
    .Property(c => c.HierarchyId)
    .HasConversion(
        v => v.ToString(), // Convert HierarchyId to string for storage
        v => HierarchyId.Parse(v) // Convert string back to HierarchyId
    );
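
When the column is mapped as a native hierarchyid (that is, without the string conversion above), hierarchy predicates translate to SQL on the server. A sketch under that assumption:

// Find every category underneath a given parent node
var parent = HierarchyId.Parse("/1/");
var descendants = context.Categories
    .Where(c => c.HierarchyId.IsDescendantOf(parent))
    .ToList();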
    

Table Schema

SQL Script
CREATE TABLE Categories (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    HierarchyId HIERARCHYID
);
    
Table Format

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the category
HierarchyId   HIERARCHYID     Hierarchical data identifier

g. Efficient Bulk Operations

Real-Time Scenario

You need to perform bulk insert, update, or delete operations efficiently to handle large datasets.

Feature Explanation

Efficient Bulk Operations, supported by third-party libraries like EFCore.BulkExtensions, allow you to perform bulk operations with high performance.

context.BulkInsert(products);
context.BulkUpdate(products);
context.BulkDelete(products);
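
Also worth knowing: since EF Core 7.0, set-based updates and deletes are built into EF itself via ExecuteUpdate and ExecuteDelete, with no third-party library required. A sketch with illustrative filter values:

// Delete matching rows in one SQL statement, without loading entities
context.Products
    .Where(p => p.Name == "Legacy")
    .ExecuteDelete();

// Update matching rows in one SQL statement
context.Products
    .Where(p => p.Name == "Draft")
    .ExecuteUpdate(s => s.SetProperty(p => p.Name, "Published"));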
    

Table Schema

SQL Script
CREATE TABLE Products (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100)
);
    
Table Format

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the product

h. JSON Column Enhancements

Real-Time Scenario

You need to store and query JSON data within your database, leveraging the flexibility of JSON columns.

Feature Explanation

EF Core 7.0 introduced first-class JSON column mapping, and provider-specific functions (such as EF.Functions.JsonContains, exposed by the PostgreSQL provider) allow querying into JSON payloads. The example below assumes a provider that exposes JsonContains.

public class Customer
{
    public int Id { get; set; }
    public string JsonData { get; set; }
}

// JsonContains is provider-specific (e.g., the Npgsql/PostgreSQL provider)
var query = context.Customers
    .Where(c => EF.Functions.JsonContains(c.JsonData, "{\"key\":\"value\"}"))
    .ToList();
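
On SQL Server, the first-class route since EF Core 7.0 is to map an owned type to a JSON column with ToJson(). A minimal sketch (the Profile type is hypothetical):

public class Profile
{
    public string Theme { get; set; }
    public string Locale { get; set; }
}

// Customer.Profile is then stored and queried as a JSON column
modelBuilder.Entity<Customer>()
    .OwnsOne(c => c.Profile, b => b.ToJson());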
    

Table Schema

SQL Script
CREATE TABLE Customers (
    Id INT PRIMARY KEY,
    Name NVARCHAR(100),
    JsonData NVARCHAR(MAX)
);
    
Table Format

Column Name   Data Type       Description
Id            INT             Primary key
Name          NVARCHAR(100)   Name of the customer
JsonData      NVARCHAR(MAX)   JSON data stored as string
Sample JSON
{
    "key": "value",
    "nested": {
        "subkey": "subvalue"
    },
    "array": [1, 2, 3]
}
    

Conclusion

Entity Framework continues to evolve, offering powerful features and capabilities that can greatly enhance your data access layer. By leveraging these lesser-known tricks and best practices, you can write more efficient, maintainable, and robust code. Stay updated with the latest features and continuously explore the depths of EF to unlock its full potential.