C# Threading & Asynchronous Programming - Principal Engineer Deep Dive

Table of Contents

  1. Thread Creation & Management
  2. Synchronization Primitives
  3. System.Threading.Channels
  4. Async Streams
  5. Parallel Task Execution
  6. LINQ Advanced Topics
  7. Language Internals
  8. DateTime & Timezone Handling
  9. Interview Questions & Answers

1. Thread Creation & Management

1.1 Ways to Create Threads

C# provides multiple ways to execute code concurrently:

Method 1: Thread Class (Manual Management)

// Direct Thread creation - low-level, manual management
var thread = new Thread(() =>
{
    Console.WriteLine($"Thread {Thread.CurrentThread.ManagedThreadId} executing");
    Thread.Sleep(1000);
    Console.WriteLine("Thread completed");
});

thread.IsBackground = true; // Won't prevent app from exiting
thread.Priority = ThreadPriority.Normal;
thread.Start();

// Wait for completion
thread.Join();

// With parameters
var paramThread = new Thread((object? state) =>
{
    var data = (string)state!;
    Console.WriteLine($"Processing: {data}");
});
paramThread.Start("Hello World");

[INTERNALS] Thread Creation Under the Hood:

  • Each Thread object creates an OS thread (1:1 mapping)
  • Default stack size: 1MB on Windows, varies on Linux
  • Thread creation cost: ~200ΞΌs + memory allocation
  • Kernel mode transition required for creation

Method 2: ThreadPool.QueueUserWorkItem

// ThreadPool - reuses existing threads
ThreadPool.QueueUserWorkItem(state =>
{
    Console.WriteLine($"ThreadPool thread {Thread.CurrentThread.ManagedThreadId}");
    // Short-running work only!
});

// With state
ThreadPool.QueueUserWorkItem(state =>
{
    var message = (string)state!;
    Console.WriteLine(message);
}, "Hello from ThreadPool");

// Check pool status
ThreadPool.GetMinThreads(out int workerMin, out int ioMin);
ThreadPool.GetMaxThreads(out int workerMax, out int ioMax);
ThreadPool.GetAvailableThreads(out int workerAvail, out int ioAvail);

Console.WriteLine($"Worker threads: {workerAvail}/{workerMax}");
Console.WriteLine($"IO threads: {ioAvail}/{ioMax}");

[INTERNALS] ThreadPool Architecture:

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                    ThreadPool                            β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Global Queue (FIFO)                                     β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”                        β”‚
β”‚  β”‚ W1  β”‚ W2  β”‚ W3  β”‚ W4  β”‚ ... β”‚                        β”‚
β”‚  β””β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”˜                        β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Worker Threads                                          β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”     β”‚
β”‚  β”‚  Thread 1    β”‚ β”‚  Thread 2    β”‚ β”‚  Thread N    β”‚     β”‚
β”‚  β”‚ Local Queue  β”‚ β”‚ Local Queue  β”‚ β”‚ Local Queue  β”‚     β”‚
β”‚  β”‚ (LIFO/steal) β”‚ β”‚ (LIFO/steal) β”‚ β”‚ (LIFO/steal) β”‚     β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜     β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Hill Climbing Algorithm                                 β”‚
β”‚  - Adjusts thread count dynamically                      β”‚
β”‚  - Monitors throughput                                   β”‚
β”‚  - Injects/retires threads as needed                     β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
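
A small sketch tying the diagram to code (the "order-42" payload is illustrative): the generic QueueUserWorkItem overload exposes a preferLocal flag that decides whether the item lands in the current worker's local queue or the global queue when called from a pool thread.

// preferLocal: true  -> current worker's local queue (LIFO, better cache locality)
// preferLocal: false -> global queue (FIFO, fair ordering across workers)
ThreadPool.QueueUserWorkItem(
    static state => Console.WriteLine($"Processing {state}"),
    "order-42",
    preferLocal: true);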

Method 3: Task.Run

// Task.Run - modern, preferred approach
Task task1 = Task.Run(() =>
{
    Console.WriteLine("Task executing on ThreadPool");
});

// With return value
Task<int> task2 = Task.Run(async () =>
{
    await Task.Delay(100);
    return 42;
});
int result = await task2;

// With cancellation
var cts = new CancellationTokenSource();
Task task3 = Task.Run(() =>
{
    while (!cts.Token.IsCancellationRequested)
    {
        // Do work
        cts.Token.ThrowIfCancellationRequested();
    }
}, cts.Token);

// Cancel after 5 seconds
cts.CancelAfter(5000);

Method 4: TaskFactory.StartNew

// TaskFactory.StartNew - more options, more complexity
var taskFactory = new TaskFactory();

// Basic usage
Task task1 = taskFactory.StartNew(() =>
{
    Console.WriteLine("TaskFactory task");
});

// With options
Task task2 = Task.Factory.StartNew(
    () => Console.WriteLine("With options"),
    CancellationToken.None,
    TaskCreationOptions.LongRunning, // Creates new thread, bypasses ThreadPool
    TaskScheduler.Default
);

// With state (avoids closure allocation)
Task task3 = Task.Factory.StartNew(
    state =>
    {
        var data = (string)state!;
        Console.WriteLine(data);
    },
    "Hello",
    CancellationToken.None,
    TaskCreationOptions.None,
    TaskScheduler.Default
);

Method 5: Parallel.For/ForEach

// Parallel.For - data parallelism
var numbers = Enumerable.Range(0, 1000).ToArray();
var results = new int[1000];

Parallel.For(0, 1000, i =>
{
    results[i] = numbers[i] * 2;
});

// Parallel.ForEach
var items = new List<string> { "a", "b", "c", "d", "e" };
var processedItems = new ConcurrentBag<string>();

Parallel.ForEach(items, item =>
{
    processedItems.Add(item.ToUpper());
});

// With options
var options = new ParallelOptions
{
    MaxDegreeOfParallelism = Environment.ProcessorCount,
    CancellationToken = CancellationToken.None
};

Parallel.ForEach(items, options, item =>
{
    Console.WriteLine($"Processing {item} on thread {Thread.CurrentThread.ManagedThreadId}");
});

// Parallel.ForEachAsync (.NET 6+)
await Parallel.ForEachAsync(items, options, async (item, ct) =>
{
    await ProcessItemAsync(item, ct);
});

1.2 Task.Run vs TaskFactory.StartNew

[BENCHMARK] Performance and Behavior Comparison:

| Aspect          | Task.Run            | TaskFactory.StartNew      |
|-----------------|---------------------|---------------------------|
| Scheduler       | Always ThreadPool   | Configurable              |
| Unwrapping      | Automatic for async | Manual (Unwrap())         |
| Default options | DenyChildAttach     | None                      |
| Async lambdas   | Handles correctly   | Returns Task<Task<T>>     |
| Long-running    | Not ideal           | LongRunning option        |
| Simplicity      | Simple API          | Complex, more control     |

// GOTCHA: Async lambdas with TaskFactory.StartNew
// Wrong - returns Task<Task<int>>
Task<Task<int>> wrong = Task.Factory.StartNew(async () =>
{
    await Task.Delay(100);
    return 42;
});

// Must unwrap
Task<int> unwrapped = Task.Factory.StartNew(async () =>
{
    await Task.Delay(100);
    return 42;
}).Unwrap();

// Task.Run handles this automatically
Task<int> correct = Task.Run(async () =>
{
    await Task.Delay(100);
    return 42;
});

LongRunning Tasks:

// When to use LongRunning
// DO use for: truly long-running operations that would block a ThreadPool thread
Task longTask = Task.Factory.StartNew(() =>
{
    // Blocking operation that runs for minutes/hours
    while (true)
    {
        ProcessBatchFromQueue(); // Blocking call
        Thread.Sleep(1000);
    }
}, TaskCreationOptions.LongRunning);

// DON'T use for: async operations or short work
// This is wasteful - creates a new thread that immediately awaits
Task badLongTask = Task.Factory.StartNew(async () =>
{
    await SomeAsyncOperation(); // Thread sits idle during await!
}, TaskCreationOptions.LongRunning);

[INTERNALS] What LongRunning Does:

  • Bypasses the ThreadPool entirely
  • Creates a dedicated OS thread
  • Thread is not returned to pool when work completes
  • Use only for blocking operations that would starve the ThreadPool

1.3 Thread Pool Internals Deep Dive

[INTERNALS] Hill Climbing Algorithm:

// The ThreadPool uses hill climbing to optimize thread count
// It periodically measures throughput and adjusts threads

// You can configure min/max threads
ThreadPool.SetMinThreads(
    workerThreads: Environment.ProcessorCount * 2,
    completionPortThreads: Environment.ProcessorCount
);

ThreadPool.SetMaxThreads(
    workerThreads: Environment.ProcessorCount * 100,
    completionPortThreads: Environment.ProcessorCount * 10
);

// Monitor thread pool starvation
public class ThreadPoolMonitor
{
    private static readonly Timer _timer = new Timer(CheckThreadPool, null,
        TimeSpan.Zero, TimeSpan.FromSeconds(5));

    private static void CheckThreadPool(object? state)
    {
        ThreadPool.GetAvailableThreads(out int workerAvailable, out int ioAvailable);
        ThreadPool.GetMaxThreads(out int workerMax, out int ioMax);

        int workerInUse = workerMax - workerAvailable;
        int ioInUse = ioMax - ioAvailable;

        // Heuristic: many busy workers plus a backlog of queued items usually
        // means pool threads are blocked (classic ThreadPool starvation)
        if (workerInUse > Environment.ProcessorCount && ThreadPool.PendingWorkItemCount > 0)
        {
            Console.WriteLine($"WARNING: ThreadPool may be starving! " +
                $"Workers in use: {workerInUse}, Queued items: {ThreadPool.PendingWorkItemCount}");
        }
    }
}

Work Stealing:

// ThreadPool uses work-stealing for load balancing
// Each thread has a local queue (LIFO for cache locality)
// Threads steal from other threads' queues (FIFO) when idle

// This is why Task.Run is preferred for CPU-bound work:
// - Work is queued to the local queue of the current thread
// - Promotes cache locality (LIFO)
// - Allows work stealing for load balancing

// Example: Recursive task parallelism benefits from work stealing
public static long ParallelFibonacci(int n)
{
    if (n <= 1) return n;

    long result1 = 0, result2 = 0;

    Parallel.Invoke(
        () => result1 = ParallelFibonacci(n - 1),
        () => result2 = ParallelFibonacci(n - 2)
    );

    return result1 + result2;
}

2. Synchronization Primitives

2.1 Lock Alternatives for Async Code

[GOTCHAS] Why lock Doesn’t Work with Async:

// WRONG - lock cannot be used with async/await
private readonly object _lock = new object();

public async Task WrongAsync()
{
    lock (_lock) // Compiler error!
    {
        await SomeAsyncOperation();
    }
}

// Why?
// 1. lock is thread-affine - must be released on same thread
// 2. await can resume on different thread
// 3. This could cause deadlocks or undefined behavior

SemaphoreSlim for Async Locking:

public class AsyncLockExample
{
    private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);

    public async Task SafeAsyncOperation()
    {
        await _semaphore.WaitAsync();
        try
        {
            // Critical section - only one task at a time
            await SomeAsyncOperation();
        }
        finally
        {
            _semaphore.Release();
        }
    }

    // With timeout
    public async Task<bool> TryOperationWithTimeout(TimeSpan timeout)
    {
        if (await _semaphore.WaitAsync(timeout))
        {
            try
            {
                await SomeAsyncOperation();
                return true;
            }
            finally
            {
                _semaphore.Release();
            }
        }
        return false;
    }

    // With cancellation
    public async Task OperationWithCancellation(CancellationToken ct)
    {
        await _semaphore.WaitAsync(ct);
        try
        {
            await SomeAsyncOperation();
        }
        finally
        {
            _semaphore.Release();
        }
    }
}

[CODE] Complete AsyncLock Implementation:

/// <summary>
/// Async-compatible mutual exclusion lock.
/// Use this when you need lock semantics with async code.
/// </summary>
public sealed class AsyncLock : IDisposable
{
    private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);
    private readonly Task<Releaser> _releaser;

    public AsyncLock()
    {
        _releaser = Task.FromResult(new Releaser(this));
    }

    public Task<Releaser> LockAsync()
    {
        var wait = _semaphore.WaitAsync();
        return wait.IsCompleted
            ? _releaser
            : wait.ContinueWith(
                (_, state) => new Releaser((AsyncLock)state!),
                this,
                CancellationToken.None,
                TaskContinuationOptions.ExecuteSynchronously,
                TaskScheduler.Default);
    }

    public Task<Releaser> LockAsync(CancellationToken ct)
    {
        var wait = _semaphore.WaitAsync(ct);
        return wait.IsCompletedSuccessfully
            ? _releaser
            : wait.ContinueWith(
                (t, state) =>
                {
                    // Propagate cancellation; only hand out a Releaser once the semaphore is held
                    t.GetAwaiter().GetResult();
                    return new Releaser((AsyncLock)state!);
                },
                this,
                CancellationToken.None, // cancellation is already observed by WaitAsync(ct)
                TaskContinuationOptions.ExecuteSynchronously,
                TaskScheduler.Default);
    }

    public void Dispose()
    {
        _semaphore.Dispose();
    }

    public readonly struct Releaser : IDisposable
    {
        private readonly AsyncLock _lock;

        internal Releaser(AsyncLock @lock) => _lock = @lock;

        public void Dispose()
        {
            _lock?._semaphore.Release();
        }
    }
}

// Usage
public class Service
{
    private readonly AsyncLock _lock = new AsyncLock();

    public async Task DoWorkAsync()
    {
        using (await _lock.LockAsync())
        {
            // Protected async operation
            await SomeAsyncOperation();
        }
    }
}

[PRODUCTION] Deadlock Prevention Patterns:

public class DeadlockPrevention
{
    // Pattern 1: Lock ordering - always acquire locks in same order
    private readonly AsyncLock _lockA = new AsyncLock();
    private readonly AsyncLock _lockB = new AsyncLock();

    public async Task SafeMultiLockAsync()
    {
        // Always acquire A before B
        using (await _lockA.LockAsync())
        using (await _lockB.LockAsync())
        {
            await DoWork();
        }
    }

    // Pattern 2: Timeout-based detection
    // The semaphore must be shared (a field); a per-call local would always be free
    private readonly SemaphoreSlim _timeoutLock = new SemaphoreSlim(1, 1);

    public async Task<bool> TryAcquireWithTimeoutAsync()
    {
        if (!await _timeoutLock.WaitAsync(TimeSpan.FromSeconds(30)))
        {
            // Log potential deadlock
            Console.WriteLine("WARNING: Possible deadlock detected");
            return false;
        }

        try
        {
            await DoWork();
            return true;
        }
        finally
        {
            _timeoutLock.Release();
        }
    }

    // Pattern 3: Reader-Writer lock for async
    private readonly SemaphoreSlim _writerLock = new SemaphoreSlim(1, 1);
    private readonly SemaphoreSlim _readerCountLock = new SemaphoreSlim(1, 1);
    private int _readerCount = 0;

    public async Task<IDisposable> AcquireReaderLockAsync()
    {
        await _readerCountLock.WaitAsync();
        try
        {
            if (++_readerCount == 1)
            {
                await _writerLock.WaitAsync();
            }
        }
        finally
        {
            _readerCountLock.Release();
        }

        return new ReaderLockReleaser(this);
    }

    private async Task ReleaseReaderLockAsync()
    {
        await _readerCountLock.WaitAsync();
        try
        {
            if (--_readerCount == 0)
            {
                _writerLock.Release();
            }
        }
        finally
        {
            _readerCountLock.Release();
        }
    }
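
    // Minimal releaser referenced by AcquireReaderLockAsync above (a sketch):
    // Dispose blocks briefly on the async release, which is acceptable here
    // because the release path only touches in-process semaphores.
    private sealed class ReaderLockReleaser : IDisposable
    {
        private readonly DeadlockPrevention _owner;

        internal ReaderLockReleaser(DeadlockPrevention owner) => _owner = owner;

        public void Dispose() =>
            _owner.ReleaseReaderLockAsync().GetAwaiter().GetResult();
    }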
}

2.2 Thread-Safe Collections

[INTERNALS] Collection Comparison:

| Collection                | Thread-Safe | Lock-Free  | Best For                   |
|---------------------------|-------------|------------|----------------------------|
| ConcurrentDictionary<K,V> | Yes         | Partial*   | Key-value lookups          |
| ConcurrentQueue<T>        | Yes         | Yes        | Producer-consumer          |
| ConcurrentStack<T>        | Yes         | Yes        | LIFO scenarios             |
| ConcurrentBag<T>          | Yes         | Per-thread | Unordered add/take         |
| BlockingCollection<T>     | Yes         | No         | Bounded producer-consumer  |
| Channel<T>                | Yes         | Partial*   | Async producer-consumer    |
| ImmutableList<T>          | Yes**       | N/A        | Functional patterns        |

*Uses fine-grained locking for some operations.
**Immutable collections are inherently thread-safe for reads.
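
A minimal sketch of the ImmutableList<T> row above: every mutation returns a new instance, so readers can keep enumerating their snapshot while writers swap in updated versions.

using System.Collections.Immutable;

var prices = ImmutableList<decimal>.Empty;

// Add() never mutates; it returns a new list that shares structure with the old one
var updated = prices.Add(9.99m).Add(19.99m);

Console.WriteLine(prices.Count);  // 0 - the original snapshot is untouched
Console.WriteLine(updated.Count); // 2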

ConcurrentDictionary Deep Dive:

public class ConcurrentDictionaryPatterns
{
    private readonly ConcurrentDictionary<string, int> _cache = new();

    // Pattern 1: GetOrAdd - atomic get or add
    public int GetOrCompute(string key)
    {
        // Value factory may be called multiple times for same key!
        // But only one value will be stored
        return _cache.GetOrAdd(key, k =>
        {
            Console.WriteLine($"Computing value for {k}");
            return ExpensiveComputation(k);
        });
    }

    // Pattern 2: GetOrAdd with Lazy for expensive computation
    private readonly ConcurrentDictionary<string, Lazy<int>> _lazyCache = new();

    public int GetOrComputeOnce(string key)
    {
        // Lazy ensures computation happens exactly once
        var lazy = _lazyCache.GetOrAdd(key,
            k => new Lazy<int>(() => ExpensiveComputation(k)));
        return lazy.Value;
    }

    // Pattern 3: AddOrUpdate
    public void IncrementCounter(string key)
    {
        _cache.AddOrUpdate(
            key,
            addValue: 1,                    // Initial value if not exists
            updateValueFactory: (k, old) => old + 1  // Update existing
        );
    }

    // Pattern 4: Thread-safe conditional update
    public bool TryUpdate(string key, int newValue, int expectedOldValue)
    {
        return _cache.TryUpdate(key, newValue, expectedOldValue);
    }

    // Pattern 5: Atomic snapshot operations
    public Dictionary<string, int> GetSnapshot()
    {
        // ToArray/ToDictionary are atomic snapshots
        return _cache.ToDictionary(kvp => kvp.Key, kvp => kvp.Value);
    }
}

[INTERNALS] ConcurrentDictionary Internal Structure:

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚              ConcurrentDictionary<K,V>                   β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Bucket Array (Array of Nodes)                          β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”     β”‚
β”‚  β”‚  0  β”‚  1  β”‚  2  β”‚  3  β”‚  4  β”‚  5  β”‚  6  β”‚  7  β”‚     β”‚
β”‚  β””β”€β”€β”¬β”€β”€β”΄β”€β”€β”¬β”€β”€β”΄β”€β”€β”€β”€β”€β”΄β”€β”€β”¬β”€β”€β”΄β”€β”€β”€β”€β”€β”΄β”€β”€β”¬β”€β”€β”΄β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”˜     β”‚
β”‚     β”‚     β”‚           β”‚           β”‚                      β”‚
β”‚     β–Ό     β–Ό           β–Ό           β–Ό                      β”‚
β”‚   Node   Node       Node        null                     β”‚
β”‚    β”‚      β”‚           β”‚                                  β”‚
β”‚    β–Ό      β–Ό           β–Ό                                  β”‚
β”‚   Node   null        Node                                β”‚
β”‚    β”‚                  β”‚                                  β”‚
β”‚    β–Ό                  β–Ό                                  β”‚
β”‚   null               null                                β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Lock Array (Stripe Locks)                              β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”                          β”‚
β”‚  β”‚Lock 0β”‚Lock 1β”‚Lock 2β”‚Lock 3β”‚  (default: CPU count)    β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”˜                          β”‚
β”‚  Each lock protects a subset of buckets                  β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Concurrency Level: Number of stripe locks               β”‚
β”‚  Default: Environment.ProcessorCount                     β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
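
Tying the diagram to code (a small sketch): the constructor lets you choose the stripe-lock count (concurrencyLevel) and the initial bucket capacity up front, avoiding lock-array and bucket resizing under load.

var counters = new ConcurrentDictionary<string, int>(
    concurrencyLevel: Environment.ProcessorCount, // number of stripe locks
    capacity: 1024);                              // initial bucket count

counters.TryAdd("requests", 0);
counters.AddOrUpdate("requests", 1, (_, current) => current + 1);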

Channel vs BlockingCollection:

public class ChannelVsBlockingCollection
{
    // BlockingCollection - synchronous API
    public void BlockingCollectionExample()
    {
        var collection = new BlockingCollection<int>(boundedCapacity: 100);

        // Producer (blocks if full)
        Task.Run(() =>
        {
            for (int i = 0; i < 1000; i++)
            {
                collection.Add(i); // Blocks if at capacity
            }
            collection.CompleteAdding();
        });

        // Consumer (blocks if empty)
        Task.Run(() =>
        {
            foreach (var item in collection.GetConsumingEnumerable())
            {
                Process(item);
            }
        });
    }

    // Channel - async API (preferred for async code)
    public async Task ChannelExample()
    {
        var channel = Channel.CreateBounded<int>(100);

        // Producer (async)
        var producer = Task.Run(async () =>
        {
            for (int i = 0; i < 1000; i++)
            {
                await channel.Writer.WriteAsync(i); // Async wait if full
            }
            channel.Writer.Complete();
        });

        // Consumer (async)
        var consumer = Task.Run(async () =>
        {
            await foreach (var item in channel.Reader.ReadAllAsync())
            {
                await ProcessAsync(item);
            }
        });

        await Task.WhenAll(producer, consumer);
    }
}

[BENCHMARK] Collection Performance:

// Benchmark results (operations per second, single thread):
//
// Operation: Add 1M items
// List<T>:                    50,000,000 ops/s
// ConcurrentBag<T>:            5,000,000 ops/s
// ConcurrentQueue<T>:         15,000,000 ops/s
// BlockingCollection<T>:       2,000,000 ops/s
// Channel<T> (Unbounded):     12,000,000 ops/s
//
// Multi-threaded (4 threads):
// ConcurrentBag<T>:           18,000,000 ops/s (scales well)
// ConcurrentQueue<T>:         25,000,000 ops/s (good scaling)
// Channel<T>:                 35,000,000 ops/s (excellent scaling)

2.3 Signaling Mechanisms

AutoResetEvent vs ManualResetEvent:

public class SignalingComparison
{
    // AutoResetEvent - automatically resets after ONE waiter is released
    private readonly AutoResetEvent _autoEvent = new AutoResetEvent(false);

    // ManualResetEvent - stays signaled until manually reset
    private readonly ManualResetEvent _manualEvent = new ManualResetEvent(false);

    // ManualResetEventSlim - lightweight, user-mode spinning
    private readonly ManualResetEventSlim _slimEvent = new ManualResetEventSlim(false);

    public void AutoResetExample()
    {
        // Worker 1
        Task.Run(() =>
        {
            Console.WriteLine("Worker 1 waiting...");
            _autoEvent.WaitOne(); // Blocks
            Console.WriteLine("Worker 1 released!");
            // Event automatically resets - next waiter will block
        });

        // Worker 2
        Task.Run(() =>
        {
            Console.WriteLine("Worker 2 waiting...");
            _autoEvent.WaitOne(); // Also blocks
            Console.WriteLine("Worker 2 released!");
        });

        Thread.Sleep(100);

        // Signal - releases ONE waiter
        _autoEvent.Set(); // Worker 1 OR Worker 2 released, not both

        Thread.Sleep(100);

        // Signal again for the other waiter
        _autoEvent.Set();
    }

    public void ManualResetExample()
    {
        // Worker 1
        Task.Run(() =>
        {
            Console.WriteLine("Worker 1 waiting...");
            _manualEvent.WaitOne();
            Console.WriteLine("Worker 1 released!");
        });

        // Worker 2
        Task.Run(() =>
        {
            Console.WriteLine("Worker 2 waiting...");
            _manualEvent.WaitOne();
            Console.WriteLine("Worker 2 released!");
        });

        Thread.Sleep(100);

        // Signal - releases ALL waiters
        _manualEvent.Set(); // Both workers released!

        // Event stays signaled - new waiters pass immediately
        _manualEvent.WaitOne(); // Returns immediately

        // Must manually reset
        _manualEvent.Reset();
    }
}

[INTERNALS] Kernel vs User-Mode Primitives:

| Primitive            | Mode                 | Wait Cost               | Best For                 |
|----------------------|----------------------|-------------------------|--------------------------|
| Monitor (lock)       | User (spin + kernel) | ~50ns spin, ~1ΞΌs kernel | Short locks              |
| SpinLock             | User                 | ~10-50ns                | Very short locks         |
| SpinWait             | User                 | ~10ns                   | Custom spin patterns     |
| SemaphoreSlim        | User (spin + kernel) | ~50ns spin              | Async, counting          |
| Semaphore            | Kernel               | ~1ΞΌs                    | Cross-process            |
| Mutex                | Kernel               | ~1ΞΌs                    | Cross-process, ownership |
| AutoResetEvent       | Kernel               | ~1ΞΌs                    | Signal once              |
| ManualResetEvent     | Kernel               | ~1ΞΌs                    | Broadcast signal         |
| ManualResetEventSlim | User (spin + kernel) | ~50ns spin              | In-process broadcast     |
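
A small sketch of the user-mode end of this table: SpinLock spins briefly in user mode instead of taking a kernel wait, which only pays off when the protected region is tiny.

public class SpinLockCounter
{
    // SpinLock is a struct: keep it in a field and never copy it
    private SpinLock _spinLock = new SpinLock(enableThreadOwnerTracking: false);
    private long _total;

    public void AddSample(long value)
    {
        bool lockTaken = false;
        try
        {
            _spinLock.Enter(ref lockTaken);
            _total += value; // keep the critical section as short as possible
        }
        finally
        {
            if (lockTaken) _spinLock.Exit();
        }
    }
}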

CountdownEvent Pattern:

public class CountdownEventExample
{
    // Wait for N operations to complete
    public void WaitForMultipleOperations()
    {
        const int operationCount = 5;
        using var countdown = new CountdownEvent(operationCount);

        for (int i = 0; i < operationCount; i++)
        {
            int index = i;
            Task.Run(() =>
            {
                try
                {
                    Console.WriteLine($"Operation {index} starting");
                    Thread.Sleep(Random.Shared.Next(100, 500));
                    Console.WriteLine($"Operation {index} completed");
                }
                finally
                {
                    countdown.Signal(); // Decrement count
                }
            });
        }

        // Wait for all operations
        countdown.Wait();
        Console.WriteLine("All operations completed!");
    }
}

Barrier Synchronization:

public class BarrierExample
{
    // Barrier - synchronize multiple phases of parallel work
    public void ParallelPhases()
    {
        const int participantCount = 3;
        int phase = 0;

        using var barrier = new Barrier(participantCount, b =>
        {
            // Post-phase action (runs after all participants reach barrier)
            Console.WriteLine($"=== Phase {phase++} completed ===");
        });

        for (int i = 0; i < participantCount; i++)
        {
            int participantId = i;
            Task.Run(() =>
            {
                for (int p = 0; p < 3; p++) // 3 phases
                {
                    Console.WriteLine($"Participant {participantId} doing phase {p} work");
                    Thread.Sleep(Random.Shared.Next(50, 200));

                    Console.WriteLine($"Participant {participantId} reached barrier");
                    barrier.SignalAndWait(); // Wait for all participants
                }
            });
        }
    }
}

[CODE] Producer-Consumer with AutoResetEvent:

public class ProducerConsumerWithAutoResetEvent
{
    private readonly Queue<WorkItem> _workQueue = new();
    private readonly AutoResetEvent _workAvailable = new(false);
    private readonly object _queueLock = new();
    private volatile bool _shutdownRequested;

    public void StartConsumer()
    {
        Task.Run(() =>
        {
            while (true)
            {
                // Wait for work to be available
                _workAvailable.WaitOne();

                if (_shutdownRequested)
                    break;

                WorkItem? item;
                lock (_queueLock)
                {
                    if (_workQueue.Count > 0)
                        item = _workQueue.Dequeue();
                    else
                        continue;
                }

                ProcessWorkItem(item);
            }
        });
    }

    public void EnqueueWork(WorkItem item)
    {
        lock (_queueLock)
        {
            _workQueue.Enqueue(item);
        }
        // Signal ONE consumer that work is available
        _workAvailable.Set();
    }

    public void Shutdown()
    {
        _shutdownRequested = true;
        _workAvailable.Set(); // Wake up consumer to exit
    }

    private void ProcessWorkItem(WorkItem item)
    {
        Console.WriteLine($"Processing: {item.Id}");
    }
}

[CODE] Batch Processing with ManualResetEventSlim:

public class BatchProcessor
{
    private readonly List<DataItem> _batch = new();
    private readonly ManualResetEventSlim _batchReady = new(false);
    private readonly object _batchLock = new();
    private const int BatchSize = 100;

    public void StartProcessors(int processorCount)
    {
        for (int i = 0; i < processorCount; i++)
        {
            int id = i;
            Task.Run(() =>
            {
                while (true)
                {
                    // All processors wait for the batch to be ready
                    _batchReady.Wait();

                    List<DataItem> batchToProcess;
                    lock (_batchLock)
                    {
                        if (_batch.Count == 0)
                            continue;

                        // Each processor gets a portion
                        int startIdx = id * (BatchSize / processorCount);
                        int count = BatchSize / processorCount;
                        batchToProcess = _batch.GetRange(startIdx, count);
                    }

                    // Process in parallel
                    foreach (var item in batchToProcess)
                    {
                        ProcessItem(item);
                    }

                    Console.WriteLine($"Processor {id} finished batch");
                }
            });
        }
    }

    public void AddItem(DataItem item)
    {
        lock (_batchLock)
        {
            _batch.Add(item);

            if (_batch.Count >= BatchSize)
            {
                // Signal ALL processors that the batch is ready
                _batchReady.Set();

                // Simplified for illustration: sleeping while holding _batchLock keeps
                // the processors (which also take _batchLock) from reading the batch.
                // Real code should hand each processor a snapshot of the batch and
                // coordinate the reset with a second event or a Barrier instead of a delay.
                Thread.Sleep(100);
                _batchReady.Reset();
                _batch.Clear();
            }
        }
    }

    private void ProcessItem(DataItem item)
    {
        // Processing logic
    }
}

[BENCHMARK] Signaling Mechanism Performance:

// Benchmark: 1 million signal/wait operations

| Primitive                | Avg Latency | Throughput    | Notes                    |
|-------------------------|-------------|---------------|--------------------------|
| ManualResetEventSlim    | ~50ns       | 20M ops/s     | Best for in-process      |
| AutoResetEvent          | ~800ns      | 1.2M ops/s    | Kernel transition        |
| ManualResetEvent        | ~800ns      | 1.2M ops/s    | Kernel transition        |
| SemaphoreSlim           | ~60ns       | 16M ops/s     | User-mode + async        |
| Semaphore               | ~900ns      | 1.1M ops/s    | Kernel, cross-process    |
| Monitor.Pulse/Wait      | ~100ns      | 10M ops/s     | Requires lock            |

// Recommendation:
// - In-process: Use ManualResetEventSlim or SemaphoreSlim
// - Cross-process: Use named Semaphore or Mutex
// - Need async: Use SemaphoreSlim.WaitAsync()
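
A minimal sketch of the cross-process recommendation above (the name "Global\MyAppLock" is purely illustrative): a named Mutex is visible machine-wide, so only one process at a time enters the guarded block.

using var mutex = new Mutex(initiallyOwned: false, name: @"Global\MyAppLock");

if (mutex.WaitOne(TimeSpan.FromSeconds(5)))
{
    try
    {
        Console.WriteLine("Acquired cross-process lock");
        // ... do the machine-wide exclusive work here ...
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}
else
{
    Console.WriteLine("Another process holds the lock; giving up");
}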

[GOTCHAS] Common Signaling Mistakes:

public class SignalingGotchas
{
    // GOTCHA 1: Forgetting to reset ManualResetEvent
    private readonly ManualResetEvent _event = new(false);

    public void Gotcha1_ForgetReset()
    {
        _event.Set();    // Signal
        // ... later ...
        _event.WaitOne(); // Returns immediately! Event still signaled!
        // Fix: _event.Reset() or use ManualResetEventSlim with Reset()
    }

    // GOTCHA 2: Race condition between Set and WaitOne
    public void Gotcha2_RaceCondition()
    {
        // Thread 1:
        // _event.Set();
        // _event.Reset();

        // Thread 2:
        // _event.WaitOne(); // May miss the signal!

        // Fix: Use AutoResetEvent or proper synchronization
    }

    // GOTCHA 3: Disposing while threads are waiting
    public void Gotcha3_DisposeDuringWait()
    {
        var evt = new ManualResetEventSlim(false);

        Task.Run(() => evt.Wait()); // Waiting...

        // evt.Dispose(); // WRONG! ObjectDisposedException

        // Fix: Signal completion, wait for threads, then dispose
        evt.Set();
        Thread.Sleep(100); // Wait for thread to wake
        evt.Dispose();
    }

    // GOTCHA 4: Using kernel-mode events for high-frequency operations
    public void Gotcha4_HighFrequencyKernelEvents()
    {
        var autoEvent = new AutoResetEvent(false);

        // BAD: Kernel transition on every signal
        for (int i = 0; i < 1_000_000; i++)
        {
            autoEvent.Set();
            autoEvent.WaitOne();
        }

        // BETTER: Use SemaphoreSlim for user-mode operations
        var semaphore = new SemaphoreSlim(0, 1);
        for (int i = 0; i < 1_000_000; i++)
        {
            semaphore.Release();
            semaphore.Wait();
        }
    }
}

2.4 volatile vs Interlocked

[INTERNALS] Memory Ordering and Barriers:

public class VolatileVsInterlocked
{
    // volatile - ensures reads/writes are not reordered by compiler/CPU
    private volatile bool _stopRequested;
    private volatile int _counter; // volatile int has limited use!

    // Why volatile int is problematic:
    // volatile ensures visibility but NOT atomicity of compound operations
    public void VolatileProblem()
    {
        // This is NOT thread-safe even with volatile!
        _counter++; // Read + Increment + Write - not atomic!

        // This is equivalent to:
        // int temp = _counter;  // Read
        // temp = temp + 1;       // Increment
        // _counter = temp;       // Write
        // Another thread could read/write between these operations
    }

    // Interlocked - atomic operations with memory barriers
    private int _atomicCounter;

    public void InterlockedExample()
    {
        // Atomic increment
        Interlocked.Increment(ref _atomicCounter);

        // Atomic decrement
        Interlocked.Decrement(ref _atomicCounter);

        // Atomic add
        Interlocked.Add(ref _atomicCounter, 5);

        // Atomic exchange
        int oldValue = Interlocked.Exchange(ref _atomicCounter, 100);

        // Atomic compare-and-swap (CAS)
        int expected = 100;
        int newValue = 200;
        int original = Interlocked.CompareExchange(
            ref _atomicCounter,
            newValue,    // Value to set if comparison succeeds
            expected     // Value to compare against
        );

        if (original == expected)
        {
            Console.WriteLine("CAS succeeded");
        }
    }
}

When to Use Each:

public class WhenToUseWhat
{
    // Use volatile for: simple flags
    private volatile bool _isRunning = true;

    public void Stop()
    {
        _isRunning = false; // Simple write, visibility guaranteed
    }

    public void WorkerLoop()
    {
        while (_isRunning) // Simple read, sees latest value
        {
            DoWork();
        }
    }

    // Use Interlocked for: counters, complex updates
    private long _operationCount;

    public void RecordOperation()
    {
        Interlocked.Increment(ref _operationCount);
    }

    public long GetOperationCount()
    {
        return Interlocked.Read(ref _operationCount); // Atomic 64-bit read
    }

    // Use lock for: multiple related updates
    private readonly object _lock = new object();
    private int _balance;
    private DateTime _lastTransaction;

    public void UpdateBalance(int amount)
    {
        lock (_lock)
        {
            _balance += amount;
            _lastTransaction = DateTime.UtcNow;
            // Both updates are atomic together
        }
    }
}

[GOTCHAS] Volatile Pitfalls:

public class VolatilePitfalls
{
    private volatile int _value;

    // WRONG: volatile doesn't make compound operations atomic
    public void IncrementWrong()
    {
        _value++; // NOT THREAD-SAFE!
    }

    // WRONG: volatile reference doesn't make object members volatile
    private volatile List<int> _list = new List<int>();

    public void ModifyListWrong()
    {
        // The reference read is volatile, but Add() is not thread-safe!
        _list.Add(42); // NOT THREAD-SAFE!
    }

    // CORRECT patterns:
    private int _atomicValue;
    private List<int>? _immutableList;

    public void IncrementCorrect()
    {
        Interlocked.Increment(ref _atomicValue);
    }

    public void UpdateListCorrect()
    {
        // Immutable update pattern: copy, modify the copy, then CAS the new list in
        List<int>? snapshot;
        List<int> newList;
        do
        {
            snapshot = Volatile.Read(ref _immutableList);
            newList = snapshot is null
                ? new List<int> { 42 }
                : new List<int>(snapshot) { 42 };
        }
        while (Interlocked.CompareExchange(ref _immutableList, newList, snapshot) != snapshot);
    }
}

3. System.Threading.Channels

3.1 Channel Types

[INTERNALS] Channel Architecture:

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                    Channel<T>                            β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  ChannelWriter<T>           ChannelReader<T>            β”‚
β”‚  β”œβ”€ WriteAsync()            β”œβ”€ ReadAsync()              β”‚
β”‚  β”œβ”€ TryWrite()              β”œβ”€ TryRead()                β”‚
β”‚  β”œβ”€ Complete()              β”œβ”€ ReadAllAsync()           β”‚
β”‚  └─ WaitToWriteAsync()      β”œβ”€ WaitToReadAsync()        β”‚
β”‚                             └─ Completion (Task)         β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Unbounded Channel                                       β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”‚
β”‚  β”‚ No capacity limit, WriteAsync never waits        β”‚    β”‚
β”‚  β”‚ Memory can grow unbounded (careful!)             β”‚    β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Bounded Channel                                         β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”‚
β”‚  β”‚ Fixed capacity, WriteAsync waits when full       β”‚    β”‚
β”‚  β”‚ Full mode: Wait, DropOldest, DropNewest, DropWriteβ”‚   β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
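
A small sketch of the reader API shown in the diagram: combining WaitToReadAsync with TryRead drains every buffered item per wakeup, avoiding one await per element.

public static async Task DrainAsync(ChannelReader<int> reader)
{
    // WaitToReadAsync returns false once the writer calls Complete() and the buffer is empty
    while (await reader.WaitToReadAsync())
    {
        while (reader.TryRead(out var item))
        {
            Console.WriteLine($"Read {item}");
        }
    }
}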

Channel Creation Options:

public class ChannelCreation
{
    // Unbounded channel - unlimited capacity
    public Channel<int> CreateUnbounded()
    {
        return Channel.CreateUnbounded<int>(new UnboundedChannelOptions
        {
            SingleWriter = false,  // Multiple producers
            SingleReader = false,  // Multiple consumers
            AllowSynchronousContinuations = false // Safer default
        });
    }

    // Bounded channel - fixed capacity
    public Channel<int> CreateBounded()
    {
        return Channel.CreateBounded<int>(new BoundedChannelOptions(capacity: 100)
        {
            FullMode = BoundedChannelFullMode.Wait, // Default: wait when full
            SingleWriter = true,   // Optimized for single producer
            SingleReader = true,   // Optimized for single consumer
            AllowSynchronousContinuations = false
        });
    }

    // Different full modes:
    public void FullModeExamples()
    {
        // Wait - WriteAsync blocks until space available (default)
        var waitChannel = Channel.CreateBounded<int>(
            new BoundedChannelOptions(10) { FullMode = BoundedChannelFullMode.Wait });

        // DropOldest - oldest item removed to make room
        var dropOldestChannel = Channel.CreateBounded<int>(
            new BoundedChannelOptions(10) { FullMode = BoundedChannelFullMode.DropOldest });

        // DropNewest - removes the newest item already in the channel to make room
        var dropNewestChannel = Channel.CreateBounded<int>(
            new BoundedChannelOptions(10) { FullMode = BoundedChannelFullMode.DropNewest });

        // DropWrite - drops the item being written; TryWrite returns false when full
        var dropWriteChannel = Channel.CreateBounded<int>(
            new BoundedChannelOptions(10) { FullMode = BoundedChannelFullMode.DropWrite });
    }
}

3.2 Producer-Consumer Patterns

Basic Producer-Consumer:

public class BasicProducerConsumer
{
    public async Task RunAsync()
    {
        var channel = Channel.CreateBounded<WorkItem>(100);

        // Start producer
        var producer = ProduceAsync(channel.Writer);

        // Start multiple consumers
        var consumers = Enumerable.Range(0, 4)
            .Select(i => ConsumeAsync(channel.Reader, i))
            .ToArray();

        await producer;
        await Task.WhenAll(consumers);
    }

    private async Task ProduceAsync(ChannelWriter<WorkItem> writer)
    {
        try
        {
            for (int i = 0; i < 1000; i++)
            {
                var item = new WorkItem { Id = i, Data = $"Item {i}" };
                await writer.WriteAsync(item);

                // Simulate varying production rate
                if (i % 100 == 0)
                {
                    await Task.Delay(10);
                }
            }
        }
        finally
        {
            writer.Complete(); // Signal no more items
        }
    }

    private async Task ConsumeAsync(ChannelReader<WorkItem> reader, int consumerId)
    {
        await foreach (var item in reader.ReadAllAsync())
        {
            Console.WriteLine($"Consumer {consumerId} processing item {item.Id}");
            await ProcessItemAsync(item);
        }
        Console.WriteLine($"Consumer {consumerId} finished");
    }
}

public record WorkItem
{
    public int Id { get; init; }
    public string Data { get; init; } = "";
}

[CODE] Complete Pipeline Implementation:

/// <summary>
/// Multi-stage async pipeline using Channels.
/// Each stage processes items concurrently and passes to next stage.
/// </summary>
public class AsyncPipeline<TInput, TOutput>
{
    private readonly Func<TInput, CancellationToken, Task<TOutput>> _processor;
    private readonly int _maxConcurrency;
    private readonly int _bufferSize;

    public AsyncPipeline(
        Func<TInput, CancellationToken, Task<TOutput>> processor,
        int maxConcurrency = 4,
        int bufferSize = 100)
    {
        _processor = processor;
        _maxConcurrency = maxConcurrency;
        _bufferSize = bufferSize;
    }

    public ChannelReader<TOutput> Process(
        ChannelReader<TInput> input,
        CancellationToken ct = default)
    {
        var output = Channel.CreateBounded<TOutput>(_bufferSize);

        // Start concurrent processors
        var tasks = Enumerable.Range(0, _maxConcurrency)
            .Select(_ => ProcessAsync(input, output.Writer, ct))
            .ToArray();

        // Complete output when all processors finish (propagate any failure to readers)
        Task.WhenAll(tasks).ContinueWith(t => output.Writer.Complete(t.Exception));

        return output.Reader;
    }

    private async Task ProcessAsync(
        ChannelReader<TInput> input,
        ChannelWriter<TOutput> output,
        CancellationToken ct)
    {
        await foreach (var item in input.ReadAllAsync(ct))
        {
            var result = await _processor(item, ct);
            await output.WriteAsync(result, ct);
        }
    }
}

// Usage example - image processing pipeline
public class ImagePipeline
{
    public async Task ProcessImagesAsync(IEnumerable<string> imagePaths)
    {
        // Create pipeline stages
        var downloadStage = new AsyncPipeline<string, byte[]>(
            async (path, ct) => await DownloadImageAsync(path, ct),
            maxConcurrency: 8);

        var resizeStage = new AsyncPipeline<byte[], byte[]>(
            async (data, ct) => await ResizeImageAsync(data, ct),
            maxConcurrency: 4);

        var uploadStage = new AsyncPipeline<byte[], string>(
            async (data, ct) => await UploadImageAsync(data, ct),
            maxConcurrency: 8);

        // Create input channel
        var inputChannel = Channel.CreateUnbounded<string>();

        // Connect pipeline stages
        var downloadedImages = downloadStage.Process(inputChannel.Reader);
        var resizedImages = resizeStage.Process(downloadedImages);
        var uploadedUrls = uploadStage.Process(resizedImages);

        // Feed input
        var feedTask = Task.Run(async () =>
        {
            foreach (var path in imagePaths)
            {
                await inputChannel.Writer.WriteAsync(path);
            }
            inputChannel.Writer.Complete();
        });

        // Collect output
        var results = new List<string>();
        await foreach (var url in uploadedUrls.ReadAllAsync())
        {
            results.Add(url);
            Console.WriteLine($"Processed: {url}");
        }

        await feedTask;
    }
}

Backpressure Handling:

public class BackpressureExample
{
    // Bounded channel automatically provides backpressure
    public async Task WithBackpressure()
    {
        // Only 10 items can be buffered
        var channel = Channel.CreateBounded<int>(new BoundedChannelOptions(10)
        {
            FullMode = BoundedChannelFullMode.Wait
        });

        // Fast producer
        var producer = Task.Run(async () =>
        {
            for (int i = 0; i < 1000; i++)
            {
                // Will wait when buffer is full (backpressure!)
                await channel.Writer.WriteAsync(i);
                Console.WriteLine($"Produced {i}");
            }
            channel.Writer.Complete();
        });

        // Slow consumer
        var consumer = Task.Run(async () =>
        {
            await foreach (var item in channel.Reader.ReadAllAsync())
            {
                await Task.Delay(100); // Slow processing
                Console.WriteLine($"Consumed {item}");
            }
        });

        await Task.WhenAll(producer, consumer);
    }

    // Manual backpressure with WaitToWriteAsync
    public async Task ManualBackpressure()
    {
        var channel = Channel.CreateBounded<int>(10);

        var producer = Task.Run(async () =>
        {
            for (int i = 0; i < 1000; i++)
            {
                // Try a non-blocking write; retry once space frees up
                while (!channel.Writer.TryWrite(i))
                {
                    Console.WriteLine("Buffer full, waiting...");
                    if (!await channel.Writer.WaitToWriteAsync())
                        return; // Channel was completed; stop producing
                }
            }
            channel.Writer.Complete();
        });

        await producer;
    }
}

Graceful Shutdown:

public class GracefulShutdownExample
{
    private readonly Channel<WorkItem> _workChannel;
    private readonly CancellationTokenSource _cts;
    private Task? _processingTask;

    public GracefulShutdownExample()
    {
        _workChannel = Channel.CreateBounded<WorkItem>(100);
        _cts = new CancellationTokenSource();
    }

    public void Start()
    {
        _processingTask = ProcessWorkAsync(_cts.Token);
    }

    public async Task StopAsync(TimeSpan timeout)
    {
        // Signal no more work
        _workChannel.Writer.Complete();

        // Wait for processing to complete with timeout
        using var timeoutCts = new CancellationTokenSource(timeout);
        using var linkedCts = CancellationTokenSource.CreateLinkedTokenSource(
            _cts.Token, timeoutCts.Token);

        try
        {
            if (_processingTask != null)
            {
                await _processingTask.WaitAsync(linkedCts.Token);
            }
        }
        catch (OperationCanceledException)
        {
            Console.WriteLine("Shutdown timed out, forcing cancellation");
            await _cts.CancelAsync();
        }
    }

    public async Task QueueWorkAsync(WorkItem item)
    {
        await _workChannel.Writer.WriteAsync(item);
    }

    private async Task ProcessWorkAsync(CancellationToken ct)
    {
        try
        {
            await foreach (var item in _workChannel.Reader.ReadAllAsync(ct))
            {
                try
                {
                    await ProcessItemAsync(item, ct);
                }
                catch (Exception ex) when (ex is not OperationCanceledException)
                {
                    Console.WriteLine($"Error processing item: {ex.Message}");
                    // Continue processing other items
                }
            }
        }
        catch (OperationCanceledException)
        {
            Console.WriteLine("Processing cancelled");
        }

        Console.WriteLine("Processing completed");
    }
}

[BENCHMARK] Channels vs BlockingCollection:

// Benchmark: 1M items, 4 producers, 4 consumers
//
// BlockingCollection<T>:
//   Throughput: 2.5M items/sec
//   Memory: ~50MB
//   CPU: High (blocking threads)
//
// Channel<T> (Bounded):
//   Throughput: 8M items/sec
//   Memory: ~20MB
//   CPU: Low (async, no blocking)
//
// Channel<T> (Unbounded):
//   Throughput: 12M items/sec
//   Memory: Variable (can grow)
//   CPU: Low
//
// Key advantages of Channel<T>:
// 1. Native async/await support
// 2. No thread blocking
// 3. Better scalability
// 4. Lower memory overhead
// 5. Built-in backpressure

4. Async Streams

4.1 IAsyncEnumerable

[INTERNALS] How Async Streams Work:

// Traditional enumerable - synchronous
public IEnumerable<int> GetNumbers()
{
    for (int i = 0; i < 10; i++)
    {
        Thread.Sleep(100); // Blocks thread!
        yield return i;
    }
}

// Async enumerable - non-blocking
public async IAsyncEnumerable<int> GetNumbersAsync(
    [EnumeratorCancellation] CancellationToken ct = default)
{
    for (int i = 0; i < 10; i++)
    {
        await Task.Delay(100, ct); // Non-blocking!
        yield return i;
    }
}

// Consuming async streams
public async Task ConsumeAsync()
{
    await foreach (var number in GetNumbersAsync())
    {
        Console.WriteLine(number);
    }

    // With cancellation
    using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));

    await foreach (var number in GetNumbersAsync(cts.Token))
    {
        Console.WriteLine(number);
    }

    // Manual enumeration
    await using var enumerator = GetNumbersAsync().GetAsyncEnumerator();
    while (await enumerator.MoveNextAsync())
    {
        Console.WriteLine(enumerator.Current);
    }
}

[INTERNALS] State Machine Generation:

// The compiler generates a state machine for async iterators
// Similar to regular async methods but with additional complexity

// This async iterator:
public async IAsyncEnumerable<int> SimpleAsyncIterator()
{
    await Task.Delay(100);
    yield return 1;
    await Task.Delay(100);
    yield return 2;
}

// Compiles to something like:
public sealed class SimpleAsyncIteratorStateMachine : IAsyncEnumerable<int>,
    IAsyncEnumerator<int>, IAsyncStateMachine
{
    private int _state;
    private int _current;
    private AsyncIteratorMethodBuilder _builder;

    public int Current => _current;

    public async ValueTask<bool> MoveNextAsync()
    {
        switch (_state)
        {
            case 0:
                _state = -1;
                await Task.Delay(100);
                _current = 1;
                _state = 1;
                return true;

            case 1:
                _state = -1;
                await Task.Delay(100);
                _current = 2;
                _state = 2;
                return true;

            default:
                return false;
        }
    }

    public ValueTask DisposeAsync() => default;
    public IAsyncEnumerator<int> GetAsyncEnumerator(CancellationToken ct = default) => this;
}

ConfigureAwait in Async Streams:

public class AsyncStreamConfigureAwait
{
    // ConfigureAwait(false) in async streams
    public async IAsyncEnumerable<int> GetDataAsync(
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        // Each await can have ConfigureAwait
        var data = await FetchDataAsync().ConfigureAwait(false);

        foreach (var item in data)
        {
            ct.ThrowIfCancellationRequested();
            yield return item;
        }
    }

    // GOTCHA: ConfigureAwait on the consumer side
    public async Task ConsumeWithConfigureAwait()
    {
        // Note: ConfigureAwait(false) here applies to the enumerator's MoveNextAsync()
        // and DisposeAsync() awaits, so the loop body may resume off the original context
        await foreach (var item in GetDataAsync().ConfigureAwait(false))
        {
            // This code may run on different thread
            Console.WriteLine(item);
        }
    }

    // Note: the ConfigureAwait extension for IAsyncEnumerable<T> already ships in the BCL
    // (TaskAsyncEnumerableExtensions.ConfigureAwait) and returns a
    // ConfiguredCancelableAsyncEnumerable<T>, so there is no need to define your own.
}

[PRODUCTION] Streaming from Database:

public class DatabaseStreaming
{
    private readonly DbContext _context;

    // Stream large result sets without loading all into memory
    public async IAsyncEnumerable<Customer> GetAllCustomersAsync(
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        // AsAsyncEnumerable() enables streaming
        await foreach (var customer in _context.Customers
            .AsNoTracking()
            .AsAsyncEnumerable()
            .WithCancellation(ct))
        {
            yield return customer;
        }
    }

    // Batch processing with async streams
    public async IAsyncEnumerable<IReadOnlyList<Customer>> GetCustomerBatchesAsync(
        int batchSize = 100,
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        var batch = new List<Customer>(batchSize);

        await foreach (var customer in GetAllCustomersAsync(ct))
        {
            batch.Add(customer);

            if (batch.Count >= batchSize)
            {
                yield return batch.ToArray();
                batch.Clear();
            }
        }

        if (batch.Count > 0)
        {
            yield return batch.ToArray();
        }
    }

    // API endpoint returning async stream
    // [HttpGet("customers/stream")]
    public async IAsyncEnumerable<Customer> StreamCustomers(
        [EnumeratorCancellation] CancellationToken ct)
    {
        await foreach (var customer in GetAllCustomersAsync(ct))
        {
            yield return customer;
        }
    }
}

4.2 Advanced Async Patterns

ValueTask vs Task:

public class ValueTaskExample
{
    private int _cachedValue;
    private bool _hasCache;

    // ValueTask is ideal when result is often synchronous
    public ValueTask<int> GetValueAsync()
    {
        if (_hasCache)
        {
            // Synchronous path - no allocation!
            return new ValueTask<int>(_cachedValue);
        }

        // Async path - allocates Task
        return new ValueTask<int>(FetchAndCacheAsync());
    }

    private async Task<int> FetchAndCacheAsync()
    {
        _cachedValue = await FetchFromDatabaseAsync();
        _hasCache = true;
        return _cachedValue;
    }

    // GOTCHA: ValueTask can only be awaited ONCE!
    public async Task ValueTaskGotcha()
    {
        ValueTask<int> valueTask = GetValueAsync();

        // WRONG - awaiting twice!
        int result1 = await valueTask;
        int result2 = await valueTask; // UNDEFINED BEHAVIOR!

        // If you need to await multiple times, convert to Task:
        Task<int> task = GetValueAsync().AsTask();
        int r1 = await task;
        int r2 = await task; // OK
    }

    // GOTCHA: Don't store ValueTask for later
    public async Task ValueTaskStorageGotcha()
    {
        // WRONG - storing ValueTask
        ValueTask<int> stored = GetValueAsync();
        await Task.Delay(1000);
        int result = await stored; // May be invalid!

        // If you need to store, convert to Task
        Task<int> storable = GetValueAsync().AsTask();
        await Task.Delay(1000);
        result = await storable; // OK
    }
}

[BENCHMARK] ValueTask Performance:

// Benchmark: 1M calls, 90% cache hit rate
//
// Task<int> (always async):
//   Allocations: 1M Task objects
//   Time: 150ms
//   Memory: ~80MB
//
// ValueTask<int> (cached path):
//   Allocations: 100K Task objects (10% async)
//   Time: 50ms
//   Memory: ~8MB
//
// Use ValueTask when:
// 1. Result is often synchronous (cache hit)
// 2. Hot path with many calls
// 3. Pool-backed operations (Socket.ReceiveAsync)
//
// Use Task when:
// 1. Result is always async
// 2. Need to await multiple times
// 3. Need to store for later
// 4. Not performance critical
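
A related trick (a sketch, not part of the benchmark above): when the synchronous result is one of a few fixed values, caching completed Task<T> instances also makes the hot path allocation-free while keeping the plain Task contract (safe to await multiple times, safe to store).

public class CachedTaskExample
{
    // Task.FromResult allocates once per field; afterwards every call returns
    // the same completed instance, so the hot path allocates nothing.
    private static readonly Task<bool> TrueTask = Task.FromResult(true);
    private static readonly Task<bool> FalseTask = Task.FromResult(false);

    public Task<bool> IsEnabledAsync(bool cachedFlag)
        => cachedFlag ? TrueTask : FalseTask;
}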

IAsyncDisposable:

public class AsyncDisposableExample : IAsyncDisposable
{
    private readonly FileStream _fileStream;
    private readonly HttpClient _httpClient;
    private bool _disposed;

    public AsyncDisposableExample(string filePath)
    {
        _fileStream = File.OpenRead(filePath);
        _httpClient = new HttpClient();
    }

    public async ValueTask DisposeAsync()
    {
        if (_disposed)
            return;

        _disposed = true;

        // Dispose async resources
        await _fileStream.DisposeAsync();
        _httpClient.Dispose(); // HttpClient.Dispose is sync

        // Suppress finalizer
        GC.SuppressFinalize(this);
    }

    // Usage
    public static async Task UseAsync()
    {
        await using var resource = new AsyncDisposableExample("file.txt");
        // Use resource...
    } // DisposeAsync called automatically
}

// Pattern: Both IDisposable and IAsyncDisposable
public class DualDisposable : IDisposable, IAsyncDisposable
{
    private readonly FileStream _stream;
    private bool _disposed;

    public void Dispose()
    {
        Dispose(disposing: true);
        GC.SuppressFinalize(this);
    }

    public async ValueTask DisposeAsync()
    {
        await DisposeAsyncCore();

        Dispose(disposing: false);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;

        if (disposing)
        {
            _stream.Dispose();
        }

        _disposed = true;
    }

    protected virtual async ValueTask DisposeAsyncCore()
    {
        if (!_disposed)
        {
            await _stream.DisposeAsync();
        }
    }
}

ConfigureAwait Deep Dive:

public class ConfigureAwaitDeepDive
{
    // ConfigureAwait(false) - don't capture sync context
    // Use in library code, not UI code
    public async Task LibraryMethodAsync()
    {
        // Before await: running on caller's context
        await Task.Delay(100).ConfigureAwait(false);
        // After await: may run on any thread pool thread

        // This is fine for CPU-bound or context-independent work
        ProcessData();

        // But this would crash in UI app if run on background thread:
        // UpdateUI(); // WRONG!
    }

    // ConfigureAwait(true) - capture sync context (default)
    // Use in UI code or when context matters
    public async Task UIMethodAsync()
    {
        await Task.Delay(100); // ConfigureAwait(true) is default

        // After await: back on UI thread
        UpdateUI(); // Safe!
    }

    // [INTERNALS] What ConfigureAwait does:
    // 1. ConfigureAwait(true): Captures SynchronizationContext.Current
    //    After await, posts continuation to captured context
    // 2. ConfigureAwait(false): Doesn't capture context
    //    After await, continues on any available thread

    // When to use ConfigureAwait(false):
    // 1. Library code (NuGet packages)
    // 2. Performance-critical code
    // 3. When you don't need the original context

    // When NOT to bother with ConfigureAwait(false):
    // 1. UI event handlers (you need to resume on the UI thread)
    // 2. Code that relies on the captured context, e.g. classic ASP.NET's
    //    HttpContext.Current or other thread-affine state
    // Note: ASP.NET Core has no SynchronizationContext, so ConfigureAwait(false)
    // is a no-op there - harmless in app code, still good practice in libraries.
}

5. Parallel Task Execution

5.1 Multiple Task Execution

Task.WhenAll Patterns:

public class WhenAllPatterns
{
    // Basic WhenAll - wait for all
    public async Task BasicWhenAll()
    {
        var tasks = new[]
        {
            FetchDataAsync("source1"),
            FetchDataAsync("source2"),
            FetchDataAsync("source3")
        };

        // Wait for all, results in same order as tasks
        string[] results = await Task.WhenAll(tasks);

        foreach (var result in results)
        {
            Console.WriteLine(result);
        }
    }

    // WhenAll with exception handling
    public async Task WhenAllWithExceptions()
    {
        var tasks = new[]
        {
            Task.Run(() => { throw new Exception("Error 1"); return 1; }),
            Task.Run(() => { throw new Exception("Error 2"); return 2; }),
            Task.Run(() => 3)
        };

        try
        {
            await Task.WhenAll(tasks);
        }
        catch (Exception ex)
        {
            // Only first exception is thrown!
            Console.WriteLine($"First exception: {ex.Message}");

            // To get all exceptions:
            var allExceptions = tasks
                .Where(t => t.IsFaulted)
                .SelectMany(t => t.Exception!.InnerExceptions)
                .ToList();

            foreach (var e in allExceptions)
            {
                Console.WriteLine($"Exception: {e.Message}");
            }
        }
    }

    // Get results even with some failures
    public async Task<List<T>> WhenAllWithPartialResults<T>(
        IEnumerable<Task<T>> tasks,
        T defaultValue = default!)
    {
        var taskList = tasks.ToList();
        await Task.WhenAll(taskList.Select(t =>
            t.ContinueWith(_ => { }, TaskContinuationOptions.ExecuteSynchronously)));

        return taskList.Select(t =>
            t.IsCompletedSuccessfully ? t.Result : defaultValue
        ).ToList();
    }
}
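
An alternative to scanning the individual tasks (a sketch over the same kind of task list): keep the Task returned by Task.WhenAll in a variable; its Exception property is an AggregateException containing every fault, even though await rethrows only the first one.

public class WhenAllAggregateExceptions
{
    public async Task ShowAllFaultsAsync(IEnumerable<Task<string>> tasks)
    {
        Task<string[]> whenAll = Task.WhenAll(tasks);

        try
        {
            await whenAll;
        }
        catch
        {
            // whenAll.Exception aggregates the exceptions of ALL faulted tasks
            foreach (var ex in whenAll.Exception!.InnerExceptions)
            {
                Console.WriteLine($"Exception: {ex.Message}");
            }
        }
    }
}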

Task.WhenAny Patterns:

public class WhenAnyPatterns
{
    // First to complete wins
    public async Task<string> FirstSuccessfulResult()
    {
        var tasks = new[]
        {
            FetchFromServerAsync("primary"),
            FetchFromServerAsync("backup1"),
            FetchFromServerAsync("backup2")
        };

        // First to complete
        var completedTask = await Task.WhenAny(tasks);
        return await completedTask;
    }

    // Timeout pattern
    public async Task<string?> WithTimeout(TimeSpan timeout)
    {
        var workTask = DoLongWorkAsync();
        var timeoutTask = Task.Delay(timeout);

        var completedTask = await Task.WhenAny(workTask, timeoutTask);

        if (completedTask == timeoutTask)
        {
            Console.WriteLine("Operation timed out");
            return null;
        }

        return await workTask;
    }

    // Process as completed
    public async Task ProcessAsCompleted()
    {
        var tasks = Enumerable.Range(0, 10)
            .Select(i => DelayedResultAsync(i, Random.Shared.Next(100, 1000)))
            .ToList();

        while (tasks.Count > 0)
        {
            var completedTask = await Task.WhenAny(tasks);
            tasks.Remove(completedTask);

            var result = await completedTask;
            Console.WriteLine($"Got result: {result}");
        }
    }
}
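
On .NET 6+ the timeout pattern above can also be written with Task.WaitAsync, which throws TimeoutException when the timeout elapses (a minimal sketch; DoLongWorkAsync is a placeholder like the one above, and the underlying work keeps running in the background after a timeout):

public class WaitAsyncTimeout
{
    public async Task<string?> WithTimeoutNet6(TimeSpan timeout)
    {
        try
        {
            // .NET 6+: awaits the work, but gives up after 'timeout'
            return await DoLongWorkAsync().WaitAsync(timeout);
        }
        catch (TimeoutException)
        {
            Console.WriteLine("Operation timed out");
            return null;
        }
    }

    private async Task<string> DoLongWorkAsync()
    {
        await Task.Delay(5000);
        return "done";
    }
}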

[CODE] Throttled Parallel Execution:

public class ThrottledExecution
{
    // Using SemaphoreSlim for throttling
    public async Task<List<TResult>> ExecuteWithThrottle<TInput, TResult>(
        IEnumerable<TInput> items,
        Func<TInput, CancellationToken, Task<TResult>> processor,
        int maxConcurrency,
        CancellationToken ct = default)
    {
        using var semaphore = new SemaphoreSlim(maxConcurrency);

        var tasks = items.Select(async item =>
        {
            await semaphore.WaitAsync(ct);
            try
            {
                return await processor(item, ct);
            }
            finally
            {
                semaphore.Release();
            }
        });

        return (await Task.WhenAll(tasks)).ToList();
    }

    // Using Parallel.ForEachAsync (.NET 6+)
    public async Task ExecuteWithParallelForEach<T>(
        IEnumerable<T> items,
        Func<T, CancellationToken, ValueTask> processor,
        int maxConcurrency,
        CancellationToken ct = default)
    {
        var options = new ParallelOptions
        {
            MaxDegreeOfParallelism = maxConcurrency,
            CancellationToken = ct
        };

        await Parallel.ForEachAsync(items, options, processor);
    }

    // Rate-limited execution (X per second)
    public async Task ExecuteRateLimited<T>(
        IEnumerable<T> items,
        Func<T, Task> processor,
        int requestsPerSecond)
    {
        var delay = TimeSpan.FromSeconds(1.0 / requestsPerSecond);

        foreach (var item in items)
        {
            var startTime = DateTime.UtcNow;

            await processor(item);

            var elapsed = DateTime.UtcNow - startTime;
            if (elapsed < delay)
            {
                await Task.Delay(delay - elapsed);
            }
        }
    }

    // Sliding window rate limiter
    public class SlidingWindowRateLimiter
    {
        private readonly Queue<DateTime> _requestTimes = new();
        private readonly int _maxRequests;
        private readonly TimeSpan _window;
        private readonly SemaphoreSlim _lock = new(1, 1);

        public SlidingWindowRateLimiter(int maxRequests, TimeSpan window)
        {
            _maxRequests = maxRequests;
            _window = window;
        }

        public async Task<bool> TryAcquireAsync(CancellationToken ct = default)
        {
            await _lock.WaitAsync(ct);
            try
            {
                var now = DateTime.UtcNow;
                var windowStart = now - _window;

                // Remove old entries
                while (_requestTimes.Count > 0 && _requestTimes.Peek() < windowStart)
                {
                    _requestTimes.Dequeue();
                }

                if (_requestTimes.Count >= _maxRequests)
                {
                    return false;
                }

                _requestTimes.Enqueue(now);
                return true;
            }
            finally
            {
                _lock.Release();
            }
        }

        public async Task WaitForPermitAsync(CancellationToken ct = default)
        {
            while (!await TryAcquireAsync(ct))
            {
                await Task.Delay(10, ct);
            }
        }
    }
}
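
For new code, .NET 7+ also ships ready-made limiters in the System.Threading.RateLimiting package; a minimal sketch using its built-in sliding-window limiter (option values are illustrative, and the type is fully qualified to avoid clashing with the hand-rolled class above):

public class BuiltInRateLimiterSketch
{
    public async Task UseBuiltInLimiterAsync()
    {
        using var limiter = new System.Threading.RateLimiting.SlidingWindowRateLimiter(
            new System.Threading.RateLimiting.SlidingWindowRateLimiterOptions
            {
                PermitLimit = 10,                  // max permits per window
                Window = TimeSpan.FromSeconds(1),
                SegmentsPerWindow = 4,             // sliding-window granularity
                QueueLimit = 100                   // callers allowed to wait for a permit
            });

        using var lease = await limiter.AcquireAsync(permitCount: 1);
        if (lease.IsAcquired)
        {
            Console.WriteLine("Permit acquired - do the work");
        }
    }
}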

6. LINQ Advanced Topics

6.1 Deferred Execution

[INTERNALS] How Deferred Execution Works:

public class DeferredExecutionDeepDive
{
    // Deferred execution - query not executed until enumerated
    public void DeferredExample()
    {
        var numbers = new List<int> { 1, 2, 3, 4, 5 };

        // Query is defined, NOT executed
        var query = numbers.Where(n =>
        {
            Console.WriteLine($"Filtering {n}");
            return n > 2;
        });

        Console.WriteLine("Query defined");

        // Modify source before enumeration
        numbers.Add(6);
        numbers.Add(7);

        Console.WriteLine("Starting enumeration");

        // NOW the query executes
        foreach (var n in query)
        {
            Console.WriteLine($"Result: {n}");
        }

        // Output:
        // Query defined
        // Starting enumeration
        // Filtering 1
        // Filtering 2
        // Filtering 3
        // Result: 3
        // Filtering 4
        // Result: 4
        // Filtering 5
        // Result: 5
        // Filtering 6
        // Result: 6
        // Filtering 7
        // Result: 7
    }

    // Immediate execution - query executed immediately
    public void ImmediateExample()
    {
        var numbers = new List<int> { 1, 2, 3, 4, 5 };

        // ToList() forces immediate execution
        var results = numbers.Where(n => n > 2).ToList();

        // Modifications don't affect results
        numbers.Add(6);
        numbers.Add(7);

        // results still contains only 3, 4, 5
    }
}

[GOTCHAS] Multiple Enumeration:

public class MultipleEnumerationGotcha
{
    // WRONG - query executed multiple times
    public void MultipleEnumerationBad()
    {
        IEnumerable<int> query = GetExpensiveData()
            .Where(x => x > 10);

        // Enumeration 1
        var count = query.Count(); // Executes query!

        // Enumeration 2
        var first = query.First(); // Executes query again!

        // Enumeration 3
        var list = query.ToList(); // Executes query a third time!
    }

    // CORRECT - materialize once
    public void MultipleEnumerationGood()
    {
        var results = GetExpensiveData()
            .Where(x => x > 10)
            .ToList(); // Materialize once

        var count = results.Count; // Just property access
        var first = results[0]; // Just indexer access
    }

    // Detecting multiple enumeration
    public IEnumerable<int> GetDataWithWarning()
    {
        Console.WriteLine("WARNING: Enumeration started!");
        for (int i = 0; i < 10; i++)
        {
            yield return i;
        }
    }
}

Streaming vs Buffering Operations:

public class StreamingVsBuffering
{
    // Streaming operations - process one at a time, O(1) memory
    // Where, Select, Take, Skip, OfType, Cast
    public void StreamingOps()
    {
        var query = Enumerable.Range(0, 1_000_000)
            .Where(x => x % 2 == 0)   // Streaming
            .Select(x => x * 2)        // Streaming
            .Take(10);                 // Streaming

        // Only 10 items ever in memory
        foreach (var item in query)
        {
            Console.WriteLine(item);
        }
    }

    // Buffering operations - must read all, O(n) memory
    // OrderBy, GroupBy, Distinct, Reverse, ToList, ToArray
    public void BufferingOps()
    {
        var query = Enumerable.Range(0, 1_000_000)
            .OrderBy(x => x)           // BUFFERS all items!
            .Take(10);

        // All 1M items must be buffered (and ordered) before Take can yield the first 10
    }
}

6.2 ToLookup Method

ILookup<TKey, TElement> Explained:

public class LookupDeepDive
{
    // ILookup is like a read-only dictionary that allows duplicate keys
    // Key difference from Dictionary<K, List<V>>:
    // - Lookup[missingKey] returns empty sequence, not exception
    // - Lookup is immutable after creation
    // - More memory efficient for read-only scenarios

    public void LookupBasics()
    {
        var orders = new[]
        {
            new Order { CustomerId = 1, Amount = 100 },
            new Order { CustomerId = 1, Amount = 200 },
            new Order { CustomerId = 2, Amount = 150 },
            new Order { CustomerId = 1, Amount = 50 }
        };

        // Create lookup - immediate execution
        ILookup<int, Order> ordersByCustomer = orders.ToLookup(o => o.CustomerId);

        // Access by key - never throws!
        var customer1Orders = ordersByCustomer[1]; // 3 orders
        var customer3Orders = ordersByCustomer[3]; // Empty sequence, NOT exception!

        // Check if key exists
        bool hasCustomer2 = ordersByCustomer.Contains(2); // true

        // Iterate all groups
        foreach (var group in ordersByCustomer)
        {
            Console.WriteLine($"Customer {group.Key}:");
            foreach (var order in group)
            {
                Console.WriteLine($"  Order: {order.Amount}");
            }
        }
    }

    // ToLookup vs GroupBy vs ToDictionary
    public void ComparisonExample()
    {
        var items = GetItems().ToList();

        // GroupBy - deferred execution, re-evaluated each enumeration
        IEnumerable<IGrouping<int, Item>> grouped = items.GroupBy(i => i.Category);

        // ToLookup - immediate execution, cached result
        ILookup<int, Item> lookup = items.ToLookup(i => i.Category);

        // ToDictionary with list values - manual grouping
        Dictionary<int, List<Item>> dict = items
            .GroupBy(i => i.Category)
            .ToDictionary(g => g.Key, g => g.ToList());

        // Performance comparison:
        // - GroupBy: O(1) to create, O(n) per enumeration
        // - ToLookup: O(n) to create, O(1) per access
        // - ToDictionary: O(n) to create, O(1) per access, but mutable
    }
}

[PRODUCTION] Real-World Use Cases:

public class LookupUseCases
{
    // Use case 1: One-to-many joins in memory
    public IEnumerable<CustomerOrdersSummary> GetCustomerSummaries(
        IEnumerable<Customer> customers,
        IEnumerable<Order> orders)
    {
        // Create lookup once - O(n)
        var orderLookup = orders.ToLookup(o => o.CustomerId);

        // Join efficiently - O(1) per customer
        return customers.Select(c => new CustomerOrdersSummary
        {
            Customer = c,
            TotalOrders = orderLookup[c.Id].Count(),
            TotalAmount = orderLookup[c.Id].Sum(o => o.Amount)
        });
    }

    // Use case 2: Batch processing by category
    public async Task ProcessByCategoryAsync(IEnumerable<WorkItem> items)
    {
        var byCategory = items.ToLookup(i => i.Category);

        // Process each category with different handler
        foreach (var category in byCategory)
        {
            var handler = GetHandler(category.Key);
            await handler.ProcessBatchAsync(category.ToList());
        }
    }

    // Use case 3: Fast parent-child resolution
    public List<TreeNode> BuildTree(IEnumerable<FlatNode> flatNodes)
    {
        var nodeList = flatNodes.ToList();
        var childLookup = nodeList.ToLookup(n => n.ParentId);

        void PopulateChildren(TreeNode node)
        {
            node.Children = childLookup[node.Id]
                .Select(flat => new TreeNode { Id = flat.Id, Name = flat.Name })
                .ToList();

            foreach (var child in node.Children)
            {
                PopulateChildren(child);
            }
        }

        // Root nodes have null parent
        var roots = childLookup[null]
            .Select(flat => new TreeNode { Id = flat.Id, Name = flat.Name })
            .ToList();

        foreach (var root in roots)
        {
            PopulateChildren(root);
        }

        return roots;
    }
}

7. Language Internals

7.1 Static Constructors

[INTERNALS] Type Initialization Rules:

public class StaticConstructorDeepDive
{
    // Static constructor - called automatically before first access
    // Guaranteed to run exactly once, thread-safe

    public class TypeWithStaticCtor
    {
        // Static field initializers run before static constructor
        private static readonly int _field1 = InitField1();
        private static readonly int _field2 = InitField2();

        // Static constructor
        static TypeWithStaticCtor()
        {
            Console.WriteLine("Static constructor running");
            // Both _field1 and _field2 are already initialized here
        }

        private static int InitField1()
        {
            Console.WriteLine("Initializing _field1");
            return 1;
        }

        private static int InitField2()
        {
            Console.WriteLine("Initializing _field2");
            return 2;
        }

        public static void DoSomething()
        {
            Console.WriteLine("DoSomething");
        }
    }

    // Output when calling TypeWithStaticCtor.DoSomething():
    // Initializing _field1
    // Initializing _field2
    // Static constructor running
    // DoSomething
}

BeforeFieldInit Optimization:

public class BeforeFieldInitExample
{
    // WITHOUT explicit static constructor
    // CLR may run type initialization at any time before first field access
    // This is "relaxed" semantics - beforefieldinit flag is set
    public class RelaxedInit
    {
        public static readonly int Value = Initialize();

        private static int Initialize()
        {
            Console.WriteLine("RelaxedInit.Initialize");
            return 42;
        }
    }

    // WITH explicit static constructor
    // CLR guarantees initialization happens immediately before first access
    // This is "precise" semantics - no beforefieldinit flag
    public class PreciseInit
    {
        public static readonly int Value = Initialize();

        // Adding empty static constructor changes initialization timing!
        static PreciseInit() { }

        private static int Initialize()
        {
            Console.WriteLine("PreciseInit.Initialize");
            return 42;
        }
    }

    // Performance implication:
    // - BeforeFieldInit allows JIT to optimize field access checks
    // - Precise init requires checking if type is initialized on EVERY access
    // - ~5-10% performance difference in tight loops
}
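
A quick way to see the difference (a small sketch, assuming the two nested types above): the compiler's beforefieldinit decision is visible through reflection as a TypeAttributes flag.

using System.Reflection;

public static class BeforeFieldInitCheck
{
    public static void Run()
    {
        // True - no explicit static constructor, so beforefieldinit is emitted
        Console.WriteLine(typeof(BeforeFieldInitExample.RelaxedInit).Attributes
            .HasFlag(TypeAttributes.BeforeFieldInit));

        // False - the empty static constructor forces precise initialization
        Console.WriteLine(typeof(BeforeFieldInitExample.PreciseInit).Attributes
            .HasFlag(TypeAttributes.BeforeFieldInit));
    }
}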

[GOTCHAS] Static Constructor Deadlocks:

public class StaticDeadlock
{
    // DANGER: This can deadlock!
    public class TypeA
    {
        public static int Value = TypeB.Value + 1;

        static TypeA()
        {
            Console.WriteLine("TypeA static ctor");
        }
    }

    public class TypeB
    {
        public static int Value = TypeA.Value + 1;

        static TypeB()
        {
            Console.WriteLine("TypeB static ctor");
        }
    }

    // If Thread1 accesses TypeA first and Thread2 accesses TypeB first:
    // Thread1: Holds TypeA lock, waiting for TypeB lock
    // Thread2: Holds TypeB lock, waiting for TypeA lock
    // = DEADLOCK

    // Also dangerous:
    public class ThreadInStaticCtor
    {
        private static int _value;

        static ThreadInStaticCtor()
        {
            // DANGER: Starting thread that accesses this type
            var thread = new Thread(() =>
            {
                _value = 42; // Deadlock! Thread waits for static ctor to complete
            });
            thread.Start();
            thread.Join(); // Deadlock!
        }
    }
}
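
A common mitigation (a sketch): keep static constructors trivial and push expensive or cross-type initialization into Lazy<T>. The factory delegate runs on first access to .Value under Lazy's own thread-safe initialization, not inside the type-initialization lock, which sidesteps the deadlock shapes above.

public class LazyInsteadOfStaticCtor
{
    // The field initializer only creates the Lazy wrapper (cheap);
    // the expensive factory delegate runs lazily on first use.
    private static readonly Lazy<HttpClient> _client =
        new Lazy<HttpClient>(() => new HttpClient(),
            LazyThreadSafetyMode.ExecutionAndPublication);

    public static HttpClient Client => _client.Value;
}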

Static Constructor Exception Handling:

public class StaticCtorException
{
    public class FailingType
    {
        static FailingType()
        {
            throw new InvalidOperationException("Static ctor failed!");
        }

        public static void Method() { }
    }

    public void TryAccess()
    {
        try
        {
            FailingType.Method();
        }
        catch (TypeInitializationException ex)
        {
            // Original exception is in InnerException
            Console.WriteLine(ex.InnerException?.Message);
        }

        // Second attempt also throws TypeInitializationException
        // CLR remembers that type initialization failed
        try
        {
            FailingType.Method();
        }
        catch (TypeInitializationException)
        {
            Console.WriteLine("Type initialization already failed!");
        }
    }
}

7.2 Parameter Passing (ref, in, out)

[INTERNALS] How Parameters Work:

public class ParameterPassingDeepDive
{
    // By value (default) - copy of value or reference is passed
    public void ByValue(int x, string s, MyClass obj)
    {
        x = 100;        // Changes local copy, caller's value unchanged
        s = "new";      // Changes local reference, caller's reference unchanged
        obj.Value = 42; // Changes object content - caller sees this!
        obj = new MyClass(); // Changes local reference, caller's reference unchanged
    }

    // ref - pass by reference, must be initialized before call
    public void ByRef(ref int x, ref MyClass obj)
    {
        x = 100;        // Caller's variable is changed!
        obj = new MyClass(); // Caller's reference is changed!
    }

    // out - pass by reference, must be assigned inside method
    public void ByOut(out int x, out MyClass obj)
    {
        // Must assign before method returns
        x = 100;
        obj = new MyClass();
    }

    // in - pass by reference, readonly (cannot modify)
    public void ByIn(in int x, in MyClass obj)
    {
        // x = 100;  // Compiler error!
        // obj = new MyClass(); // Compiler error!
        obj.Value = 42; // Allowed! Reference is readonly, not object
    }

    public class MyClass { public int Value; }
}

[BENCHMARK] Performance with Large Structs:

public class LargeStructPerformance
{
    // Large struct (64 bytes)
    public struct LargeStruct
    {
        public long A, B, C, D, E, F, G, H;
    }

    // By value - copies 64 bytes every call
    public long ProcessByValue(LargeStruct s)
    {
        return s.A + s.B + s.C + s.D;
    }

    // By ref - passes 8-byte pointer, can modify
    public long ProcessByRef(ref LargeStruct s)
    {
        return s.A + s.B + s.C + s.D;
    }

    // By in - passes 8-byte pointer, readonly
    public long ProcessByIn(in LargeStruct s)
    {
        return s.A + s.B + s.C + s.D;
    }

    // Benchmark results (1M calls):
    // By value:  ~50ms (copying 64 bytes each time)
    // By ref:    ~15ms (just pointer)
    // By in:     ~15ms (just pointer, readonly)

    // GOTCHA: For small structs (< 16 bytes), by value can be faster
    // due to better cache locality and no indirection
}
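
One related gotcha worth a sketch (not part of the benchmark above): calling a mutating member on an 'in' parameter of a non-readonly struct operates on a hidden defensive copy, so the mutation is silently lost and you pay for the copy anyway. Marking the struct (or the member) readonly avoids the copy.

public class DefensiveCopyGotcha
{
    public struct Counter
    {
        public int Value;
        public void Increment() => Value++; // mutating, non-readonly member
    }

    // 'in' passes by readonly reference, so the compiler copies the struct
    // before calling Increment() to protect the caller's value.
    public static void Bump(in Counter c) => c.Increment();

    public static void Demo()
    {
        var counter = new Counter();
        Bump(in counter);
        Console.WriteLine(counter.Value); // 0 - the increment hit the defensive copy
    }
}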

ref struct and Span:

public class RefStructExample
{
    // ref struct - must live on stack, cannot be boxed
    public ref struct StackOnlyStruct
    {
        public Span<int> Data;

        public StackOnlyStruct(Span<int> data)
        {
            Data = data;
        }

        // Can have methods
        public int Sum()
        {
            int sum = 0;
            foreach (var item in Data)
            {
                sum += item;
            }
            return sum;
        }
    }

    // Restrictions on ref struct:
    // - Cannot be boxed (no object or interface conversions)
    // - Cannot be field in class or regular struct
    // - Cannot be used in async methods
    // - Cannot be used in lambdas that capture
    // - Cannot be used as a generic type argument (until C# 13's 'allows ref struct' anti-constraint)

    public void UseStackOnly()
    {
        Span<int> span = stackalloc int[100];
        var stackStruct = new StackOnlyStruct(span);
        int sum = stackStruct.Sum();
    }

    // Span<T> usage
    public void SpanExample()
    {
        // Stack allocation
        Span<int> stackSpan = stackalloc int[100];

        // Slice of array (no copy!)
        int[] array = new int[100];
        Span<int> arraySpan = array.AsSpan(10, 20);

        // String parsing without allocation
        ReadOnlySpan<char> text = "Hello, World!".AsSpan();
        ReadOnlySpan<char> hello = text.Slice(0, 5);

        // Safe stack memory for parsing
        Span<byte> buffer = stackalloc byte[256];
        int bytesWritten = ProcessData(buffer);
    }
}

ref Returns and ref Locals:

public class RefReturnsExample
{
    private int[] _data = new int[100];

    // ref return - returns reference to array element
    public ref int GetElementRef(int index)
    {
        return ref _data[index];
    }

    // ref readonly return - readonly reference
    public ref readonly int GetElementReadOnly(int index)
    {
        return ref _data[index];
    }

    public void UseRefReturns()
    {
        // ref local - stores a reference
        ref int element = ref GetElementRef(10);
        element = 42; // Modifies _data[10] directly!

        // ref readonly local
        ref readonly int readOnlyElement = ref GetElementReadOnly(20);
        // readOnlyElement = 100; // Compiler error!

        Console.WriteLine(_data[10]); // 42
    }

    // Useful for avoiding copies in collections
    public struct LargeValue
    {
        public long A, B, C, D, E, F, G, H;
    }

    private LargeValue[] _largeArray = new LargeValue[100];

    public ref LargeValue GetLargeValueRef(int index)
    {
        return ref _largeArray[index];
    }

    public void ModifyInPlace()
    {
        // No copy - direct modification
        ref LargeValue item = ref GetLargeValueRef(5);
        item.A = 100;
    }
}
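
The BCL uses ref returns itself; a sketch (assuming .NET 6+) that updates a dictionary value in place, without a second lookup, via CollectionsMarshal:

using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;

public static class RefReturnInTheBcl
{
    public static void Demo()
    {
        var counts = new Dictionary<string, int> { ["a"] = 1 };

        // Returns a ref to the stored value, or a null ref if the key is missing
        ref int count = ref CollectionsMarshal.GetValueRefOrNullRef(counts, "a");

        if (!Unsafe.IsNullRef(ref count))
        {
            count++; // modifies the dictionary's entry directly
        }

        Console.WriteLine(counts["a"]); // 2
    }
}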

7.3 Method Overloading vs Overriding

[INTERNALS] Compile-Time vs Runtime Binding:

public class OverloadingVsOverriding
{
    // OVERLOADING - resolved at compile time based on STATIC type
    public class Overloading
    {
        public void Process(object obj) => Console.WriteLine("object");
        public void Process(string str) => Console.WriteLine("string");
        public void Process(int num) => Console.WriteLine("int");

        public void Demo()
        {
            object obj = "hello";

            Process("hello");  // Output: string (compile-time: string)
            Process(obj);      // Output: object (compile-time: object!)
            Process(42);       // Output: int
        }
    }

    // OVERRIDING - resolved at runtime based on ACTUAL type
    public class Animal
    {
        public virtual void Speak() => Console.WriteLine("Animal speaks");
    }

    public class Dog : Animal
    {
        public override void Speak() => Console.WriteLine("Woof!");
    }

    public class Overriding
    {
        public void Demo()
        {
            Animal animal = new Dog();
            animal.Speak(); // Output: Woof! (runtime: Dog)
        }
    }
}

[INTERNALS] Virtual Table (VTable) Mechanics:

public class VTableExplanation
{
    // CLR uses virtual method tables for polymorphism

    public class Base
    {
        // VTable slot 0 (simplified - slots inherited from Object are omitted)
        public virtual void Method1() => Console.WriteLine("Base.Method1");

        // VTable slot 1
        public virtual void Method2() => Console.WriteLine("Base.Method2");
    }

    public class Derived : Base
    {
        // Replaces slot 1 in VTable
        public override void Method2() => Console.WriteLine("Derived.Method2");

        // VTable slot 2 (new slot)
        public virtual void Method3() => Console.WriteLine("Derived.Method3");
    }

    // VTable layout:
    // Base VTable:    [Method1 -> Base.Method1, Method2 -> Base.Method2]
    // Derived VTable: [Method1 -> Base.Method1, Method2 -> Derived.Method2, Method3 -> Derived.Method3]

    // Method call:
    // 1. Get object's type pointer (first 8 bytes of object)
    // 2. Get VTable from type
    // 3. Call method at appropriate slot
}

new Keyword (Hiding vs Overriding):

public class HidingVsOverriding
{
    public class Base
    {
        public virtual void VirtualMethod() => Console.WriteLine("Base.Virtual");
        public void NonVirtualMethod() => Console.WriteLine("Base.NonVirtual");
    }

    public class Derived : Base
    {
        // Override - replaces in VTable
        public override void VirtualMethod() => Console.WriteLine("Derived.Virtual");

        // new - hides, creates separate method
        public new void NonVirtualMethod() => Console.WriteLine("Derived.NonVirtual");
    }

    public void Demo()
    {
        Derived d = new Derived();
        Base b = d;

        // Virtual method - always calls Derived version
        d.VirtualMethod(); // Derived.Virtual
        b.VirtualMethod(); // Derived.Virtual (runtime dispatch)

        // Hidden method - depends on reference type
        d.NonVirtualMethod(); // Derived.NonVirtual
        b.NonVirtualMethod(); // Base.NonVirtual (compile-time binding!)
    }

    // Warning: Hiding can be confusing and is usually a design smell
    // If you see "new" keyword without intentional hiding, refactor
}

Covariant Return Types (C# 9+):

public class CovariantReturns
{
    public class Animal { }
    public class Dog : Animal { }

    public class AnimalShelter
    {
        public virtual Animal GetAnimal() => new Animal();
    }

    public class DogShelter : AnimalShelter
    {
        // C# 9+ - can return more derived type
        public override Dog GetAnimal() => new Dog();
    }

    public void Demo()
    {
        DogShelter dogShelter = new DogShelter();

        // No cast needed!
        Dog dog = dogShelter.GetAnimal();

        // Still works polymorphically
        AnimalShelter shelter = dogShelter;
        Animal animal = shelter.GetAnimal(); // Returns Dog
    }
}

Generic Method Overloading Resolution:

public class GenericOverloadingRules
{
    // The compiler prefers more specific overloads
    public void Process<T>(T item) => Console.WriteLine($"Generic: {typeof(T)}");
    public void Process(string item) => Console.WriteLine("String specific");
    public void Process(int item) => Console.WriteLine("Int specific");

    public void Demo()
    {
        Process("hello");           // String specific (exact match)
        Process(42);                // Int specific (exact match)
        Process<string>("hello");   // Generic: String (forced generic)
        Process(new List<int>());   // Generic: List`1 (no specific match)

        // GOTCHA: constraints are NOT part of the method signature, so you
        // cannot overload purely by constraint - a second ProcessConstrained<T>
        // differing only in its where-clause would be a duplicate (CS0111).
        ProcessConstrained(42);     // OK: int satisfies IComparable<int>
    }

    public void ProcessConstrained<T>(T item) where T : IComparable<T>
        => Console.WriteLine($"IComparable: {item}");
}

Operator Overloading Best Practices:

public class OperatorOverloading
{
    public readonly struct Money : IEquatable<Money>
    {
        public decimal Amount { get; }
        public string Currency { get; }

        public Money(decimal amount, string currency)
        {
            Amount = amount;
            Currency = currency;
        }

        // Arithmetic operators
        public static Money operator +(Money a, Money b)
        {
            if (a.Currency != b.Currency)
                throw new InvalidOperationException("Cannot add different currencies");
            return new Money(a.Amount + b.Amount, a.Currency);
        }

        public static Money operator -(Money a, Money b)
        {
            if (a.Currency != b.Currency)
                throw new InvalidOperationException("Cannot subtract different currencies");
            return new Money(a.Amount - b.Amount, a.Currency);
        }

        public static Money operator *(Money m, decimal multiplier)
            => new Money(m.Amount * multiplier, m.Currency);

        public static Money operator *(decimal multiplier, Money m)
            => m * multiplier; // Commutative

        // Comparison operators (must be in pairs)
        public static bool operator ==(Money a, Money b)
            => a.Currency == b.Currency && a.Amount == b.Amount;

        public static bool operator !=(Money a, Money b) => !(a == b);

        public static bool operator <(Money a, Money b)
        {
            if (a.Currency != b.Currency)
                throw new InvalidOperationException("Cannot compare different currencies");
            return a.Amount < b.Amount;
        }

        public static bool operator >(Money a, Money b) => b < a;
        public static bool operator <=(Money a, Money b) => !(b < a);
        public static bool operator >=(Money a, Money b) => !(a < b);

        // Conversion operators
        public static explicit operator decimal(Money m) => m.Amount;
        public static implicit operator Money(decimal amount) => new Money(amount, "USD");

        // Required for equality
        public bool Equals(Money other) => this == other;
        public override bool Equals(object? obj) => obj is Money m && Equals(m);
        public override int GetHashCode() => HashCode.Combine(Amount, Currency);
        public override string ToString() => $"{Amount:F2} {Currency}";
    }

    public void Demo()
    {
        var price1 = new Money(100, "USD");
        var price2 = new Money(50, "USD");

        var total = price1 + price2;           // 150 USD
        var discounted = total * 0.9m;         // 135 USD
        var tax = 0.08m * total;               // 12 USD

        bool isExpensive = total > new Money(200, "USD");

        decimal amount = (decimal)price1;      // Explicit conversion
        Money m = 99.99m;                      // Implicit conversion
    }
}

Interface Explicit vs Implicit Implementation:

public class InterfaceImplementationPatterns
{
    public interface IReadable
    {
        string Read();
    }

    public interface IWritable
    {
        void Write(string data);
    }

    // Implicit implementation - accessible via class and interface
    public class ImplicitImplementation : IReadable
    {
        public string Read() => "Data"; // Accessible directly
    }

    // Explicit implementation - only accessible via interface
    public class ExplicitImplementation : IReadable, IWritable
    {
        // Only via IReadable reference
        string IReadable.Read() => "Read data";

        // Only via IWritable reference
        void IWritable.Write(string data) { }

        // Public method with different signature
        public string Read(int count) => "Different read";
    }

    // Handling conflicting interface methods
    public interface IDisplay1
    {
        void Show();
    }

    public interface IDisplay2
    {
        void Show();
    }

    public class MultiInterface : IDisplay1, IDisplay2
    {
        // Explicit implementations for each
        void IDisplay1.Show() => Console.WriteLine("Display1");
        void IDisplay2.Show() => Console.WriteLine("Display2");

        // Optional: public method
        public void Show() => Console.WriteLine("Default");
    }

    public void Demo()
    {
        var obj = new ExplicitImplementation();
        // obj.Read(); // Compile error!
        string data = ((IReadable)obj).Read(); // Works

        var multi = new MultiInterface();
        multi.Show();                    // "Default"
        ((IDisplay1)multi).Show();       // "Display1"
        ((IDisplay2)multi).Show();       // "Display2"
    }
}

[GOTCHAS] Common Interview Trick Questions:

public class InterviewTrickQuestions
{
    // TRICK 1: Overloading with params
    public void Method(params int[] args) => Console.WriteLine("params");
    public void Method(int a, int b) => Console.WriteLine("two ints");

    public void Trick1()
    {
        Method(1, 2);      // "two ints" - more specific wins!
        Method(1, 2, 3);   // "params"
        Method(new[] { 1, 2 }); // "params"
    }

    // TRICK 2: Reference type overloading
    public void Process(object obj) => Console.WriteLine("object");
    public void Process(string str) => Console.WriteLine("string");

    public void Trick2()
    {
        string? nullString = null;
        Process(nullString);  // "string" - compile-time type is string

        object? nullObject = null;
        Process(nullObject);  // "object" - compile-time type is object
    }

    // TRICK 3: Virtual vs new
    public class Base
    {
        public virtual void Virtual() => Console.WriteLine("Base.Virtual");
        public void NonVirtual() => Console.WriteLine("Base.NonVirtual");
    }

    public class Middle : Base
    {
        public override void Virtual() => Console.WriteLine("Middle.Virtual");
        public new void NonVirtual() => Console.WriteLine("Middle.NonVirtual");
    }

    public class Derived : Middle
    {
        public override void Virtual() => Console.WriteLine("Derived.Virtual");
        public new void NonVirtual() => Console.WriteLine("Derived.NonVirtual");
    }

    public void Trick3()
    {
        Base b = new Derived();
        b.Virtual();      // "Derived.Virtual" - runtime dispatch
        b.NonVirtual();   // "Base.NonVirtual" - compile-time binding

        Middle m = new Derived();
        m.Virtual();      // "Derived.Virtual"
        m.NonVirtual();   // "Middle.NonVirtual" - new hides Base version
    }
}

8. DateTime & Timezone Handling

8.1 Working with Timezones

DateTime vs DateTimeOffset:

public class DateTimeDeepDive
{
    public void DateTimeBasics()
    {
        // DateTime - ambiguous timezone handling
        var localNow = DateTime.Now;        // Kind = Local
        var utcNow = DateTime.UtcNow;       // Kind = Utc
        var unspecified = new DateTime(2024, 1, 1); // Kind = Unspecified

        Console.WriteLine($"Local: {localNow}, Kind: {localNow.Kind}");
        Console.WriteLine($"UTC: {utcNow}, Kind: {utcNow.Kind}");
        Console.WriteLine($"Unspecified: {unspecified}, Kind: {unspecified.Kind}");

        // GOTCHA: Unspecified kind can cause bugs!
        var fromDatabase = new DateTime(2024, 6, 15, 12, 0, 0);
        // Is this local? UTC? We don't know!
    }

    public void DateTimeOffsetBasics()
    {
        // DateTimeOffset - unambiguous, includes offset from UTC
        var now = DateTimeOffset.Now;
        var utcNow = DateTimeOffset.UtcNow;

        // Explicit offset
        var pacific = new DateTimeOffset(2024, 6, 15, 12, 0, 0,
            TimeSpan.FromHours(-7)); // PDT
        var eastern = new DateTimeOffset(2024, 6, 15, 15, 0, 0,
            TimeSpan.FromHours(-4)); // EDT

        // Same instant in time!
        Console.WriteLine(pacific == eastern); // True!
        Console.WriteLine(pacific.UtcDateTime == eastern.UtcDateTime); // True

        // Different local representation
        Console.WriteLine(pacific); // 2024-06-15 12:00:00 -07:00
        Console.WriteLine(eastern); // 2024-06-15 15:00:00 -04:00
    }
}

TimeZoneInfo Usage:

public class TimeZoneHandling
{
    public void ConvertBetweenZones()
    {
        // Get timezone info
        var pacificZone = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");
        var tokyoZone = TimeZoneInfo.FindSystemTimeZoneById("Tokyo Standard Time");
        var londonZone = TimeZoneInfo.FindSystemTimeZoneById("GMT Standard Time");

        // List all timezones
        foreach (var tz in TimeZoneInfo.GetSystemTimeZones())
        {
            Console.WriteLine($"{tz.Id}: {tz.DisplayName}");
        }

        // Convert between timezones
        var utcTime = DateTime.UtcNow;
        var pacificTime = TimeZoneInfo.ConvertTimeFromUtc(utcTime, pacificZone);
        var tokyoTime = TimeZoneInfo.ConvertTimeFromUtc(utcTime, tokyoZone);

        Console.WriteLine($"UTC: {utcTime}");
        Console.WriteLine($"Pacific: {pacificTime}");
        Console.WriteLine($"Tokyo: {tokyoTime}");

        // Convert between two timezones
        var converted = TimeZoneInfo.ConvertTime(
            pacificTime,
            pacificZone,
            tokyoZone);
    }

    // With DateTimeOffset
    public void ConvertDateTimeOffset()
    {
        var pacificZone = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

        var now = DateTimeOffset.UtcNow;
        var pacificNow = TimeZoneInfo.ConvertTime(now, pacificZone);

        Console.WriteLine($"UTC: {now}");
        Console.WriteLine($"Pacific: {pacificNow}");
    }
}
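
Portability note (a sketch, assuming .NET 6+): the IDs above are Windows timezone IDs; Linux and macOS use IANA IDs such as "America/Los_Angeles". On .NET 6+ FindSystemTimeZoneById accepts either form when ICU data is available, and explicit conversion helpers exist.

public class TimeZoneIdPortability
{
    public void Demo()
    {
        // IANA id - works cross-platform on .NET 6+ (with ICU)
        var la = TimeZoneInfo.FindSystemTimeZoneById("America/Los_Angeles");
        Console.WriteLine(la.DisplayName);

        // Translate a Windows id to its IANA equivalent (.NET 6+)
        if (TimeZoneInfo.TryConvertWindowsIdToIanaId("Pacific Standard Time", out string? ianaId))
        {
            Console.WriteLine(ianaId); // "America/Los_Angeles"
        }
    }
}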

[GOTCHAS] DST Edge Cases:

public class DSTGotchas
{
    public void DaylightSavingProblems()
    {
        var pacificZone = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

        // Spring forward: 2am becomes 3am
        // March 10, 2024 at 2:30 AM doesn't exist!
        var nonExistent = new DateTime(2024, 3, 10, 2, 30, 0);

        if (pacificZone.IsInvalidTime(nonExistent))
        {
            Console.WriteLine("This time doesn't exist (DST gap)!");
        }

        // Fall back: 1am happens twice
        // November 3, 2024 at 1:30 AM is ambiguous!
        var ambiguous = new DateTime(2024, 11, 3, 1, 30, 0);

        if (pacificZone.IsAmbiguousTime(ambiguous))
        {
            Console.WriteLine("This time is ambiguous (DST overlap)!");

            // Get both possible offsets
            var offsets = pacificZone.GetAmbiguousTimeOffsets(ambiguous);
            foreach (var offset in offsets)
            {
                Console.WriteLine($"Possible offset: {offset}");
            }
        }
    }

    // Safe conversion that handles DST
    public DateTimeOffset SafeConvert(
        DateTime localDateTime,
        TimeZoneInfo sourceZone)
    {
        if (sourceZone.IsInvalidTime(localDateTime))
        {
            // Move forward past the gap
            var adjustment = sourceZone.GetAdjustmentRules()
                .First(r => r.DateStart <= localDateTime && r.DateEnd >= localDateTime);
            localDateTime = localDateTime.Add(adjustment.DaylightDelta);
        }

        if (sourceZone.IsAmbiguousTime(localDateTime))
        {
            // Assume standard time (later offset)
            var offsets = sourceZone.GetAmbiguousTimeOffsets(localDateTime);
            var standardOffset = offsets.Max();
            return new DateTimeOffset(localDateTime, standardOffset);
        }

        var offset2 = sourceZone.GetUtcOffset(localDateTime);
        return new DateTimeOffset(localDateTime, offset2);
    }
}

[PRODUCTION] Global Application Patterns:

public class GlobalAppPatterns
{
    // Best practice: Store UTC, convert for display
    public class Event
    {
        public Guid Id { get; set; }
        public string Name { get; set; } = "";

        // Always store in UTC
        public DateTimeOffset StartsAtUtc { get; set; }

        // User's timezone preference
        public string UserTimezoneId { get; set; } = "UTC";

        // Display in user's timezone
        public DateTimeOffset StartsAtLocal
        {
            get
            {
                var tz = TimeZoneInfo.FindSystemTimeZoneById(UserTimezoneId);
                return TimeZoneInfo.ConvertTime(StartsAtUtc, tz);
            }
        }
    }

    // Service for timezone handling
    public class TimezoneService
    {
        public DateTimeOffset ToUserTimezone(
            DateTimeOffset utcTime,
            string timezoneId)
        {
            var tz = TimeZoneInfo.FindSystemTimeZoneById(timezoneId);
            return TimeZoneInfo.ConvertTime(utcTime, tz);
        }

        public DateTimeOffset ToUtc(
            DateTime localTime,
            string timezoneId)
        {
            var tz = TimeZoneInfo.FindSystemTimeZoneById(timezoneId);
            var offset = tz.GetUtcOffset(localTime);
            var dto = new DateTimeOffset(localTime, offset);
            return dto.ToUniversalTime();
        }

        // For scheduling recurring events
        public IEnumerable<DateTimeOffset> GetOccurrences(
            DateTime localStartTime,
            string timezoneId,
            TimeSpan interval,
            int count)
        {
            var tz = TimeZoneInfo.FindSystemTimeZoneById(timezoneId);
            var current = localStartTime;

            for (int i = 0; i < count; i++)
            {
                // Calculate in local time to respect DST
                var offset = tz.GetUtcOffset(current);
                yield return new DateTimeOffset(current, offset);

                current = current.Add(interval);
            }
        }
    }

    // NodaTime recommendation for complex scenarios
    // using NodaTime;
    //
    // public class NodaTimeExample
    // {
    //     public void BetterTimezoneHandling()
    //     {
    //         var clock = SystemClock.Instance;
    //         var now = clock.GetCurrentInstant();
    //
    //         var pacific = DateTimeZoneProviders.Tzdb["America/Los_Angeles"];
    //         var zonedDateTime = now.InZone(pacific);
    //
    //         // Much better DST handling
    //         // IANA timezone database (cross-platform)
    //         // Explicit handling of ambiguous/invalid times
    //     }
    // }
}

9. Interview Questions & Answers

Threading & Async Questions

Q1: What's the difference between Task.Run and TaskFactory.StartNew?

// A1: Key differences:

// 1. Scheduler - Task.Run always uses ThreadPool
Task.Run(() => { }); // Always ThreadPool

Task.Factory.StartNew(() => { },
    CancellationToken.None,
    TaskCreationOptions.None,
    TaskScheduler.Current); // Uses current scheduler (might not be ThreadPool!)

// 2. Async lambda handling
// Task.Run properly unwraps:
Task<int> good = Task.Run(async () => {
    await Task.Delay(100);
    return 42;
}); // Returns Task<int>

// TaskFactory.StartNew wraps:
Task<Task<int>> bad = Task.Factory.StartNew(async () => {
    await Task.Delay(100);
    return 42;
}); // Returns Task<Task<int>> - must call .Unwrap()

// 3. Default options
// Task.Run uses DenyChildAttach
// TaskFactory.StartNew uses None

// Best practice: Use Task.Run for simple ThreadPool work
// Use TaskFactory.StartNew only when you need:
// - Different TaskScheduler
// - TaskCreationOptions.LongRunning
// - Attached child tasks

Q2: When should you use SemaphoreSlim vs lock?

// A2:

// Use lock for:
// - Short synchronous critical sections
// - Simple mutual exclusion
// - When you don't need async

private readonly object _lock = new object();
public void SyncMethod()
{
    lock (_lock)
    {
        // Short, synchronous work
        DoQuickWork();
    }
}

// Use SemaphoreSlim for:
// - Async code
// - Throttling (allowing N concurrent operations)
// - Timeout support
// - Cancellation support

private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);
public async Task AsyncMethod()
{
    await _semaphore.WaitAsync();
    try
    {
        await DoAsyncWork();
    }
    finally
    {
        _semaphore.Release();
    }
}

// Key rule: You CANNOT use lock with await inside
// This is a compiler error:
// lock (_lock) { await SomeAsync(); } // ERROR!

Q3: Explain volatile vs Interlocked. When to use each?

// A3:

// volatile: Memory visibility guarantee
// - Reads always get latest value
// - Writes immediately visible to other threads
// - Does NOT make compound operations atomic

private volatile bool _flag;

public void SetFlag()
{
    _flag = true; // Other threads see this immediately
}

// WRONG with volatile:
private volatile int _counter;
public void Increment()
{
    _counter++; // NOT ATOMIC! Race condition!
}

// Interlocked: Atomic operations with memory barriers
// - Thread-safe compound operations
// - Lock-free synchronization

private int _atomicCounter;
public void SafeIncrement()
{
    Interlocked.Increment(ref _atomicCounter); // Atomic
}

// Use volatile for:
// - Simple flags (bool)
// - Status indicators read by multiple threads

// Use Interlocked for:
// - Counters
// - Any read-modify-write operation
// - Compare-and-swap patterns

Q4: What are Channels and when would you use them over BlockingCollection?

// A4: Channels are async-first producer-consumer collections

// BlockingCollection - synchronous API, blocks threads
var bc = new BlockingCollection<int>(100);
bc.Add(42);           // Blocks if full
var item = bc.Take(); // Blocks if empty

// Channel - async API, doesn't block threads
var ch = Channel.CreateBounded<int>(100);
await ch.Writer.WriteAsync(42);    // Awaits if full
var item2 = await ch.Reader.ReadAsync(); // Awaits if empty

// Use Channels when:
// 1. Building async pipelines
// 2. High-throughput scenarios (3-4x faster)
// 3. Modern async code
// 4. Need backpressure handling

// Use BlockingCollection when:
// 1. Legacy synchronous code
// 2. Simple producer-consumer with blocking semantics
// 3. GetConsumingEnumerable() pattern is convenient

Q5: How does deferred execution work in LINQ and what are the gotchas?

// A5: Deferred execution means the query isn't executed until enumerated

var numbers = new List<int> { 1, 2, 3 };

// Query defined, NOT executed
var query = numbers.Where(n => n > 1);

// Source modified
numbers.Add(4);

// NOW executed - includes 4!
var results = query.ToList(); // [2, 3, 4]

// GOTCHA 1: Multiple enumeration
IEnumerable<int> query2 = GetData().Where(x => x > 0);
var count = query2.Count();  // Executes query
var first = query2.First();  // Executes query AGAIN!
// Fix: Materialize with ToList()

// GOTCHA 2: Captured variables
var filters = new List<Func<int, bool>>();
for (int i = 0; i < 3; i++)
{
    filters.Add(x => x == i); // Captures variable i
}
// All filters check x == 3 (final value of i)!

// Fix: Capture copy
for (int i = 0; i < 3; i++)
{
    int captured = i;
    filters.Add(x => x == captured);
}

// GOTCHA 3: Disposing during deferred execution
IEnumerable<string> QueryDatabase()
{
    using var connection = new SqlConnection();
    // WRONG: Connection disposed before enumeration!
    return connection.Query<string>("SELECT Name FROM Users");
}
// Fix: Materialize inside using, or use yield return carefully

C# Language Questions

Q6: What happens when a static constructor throws an exception?

// A6: The type becomes permanently unusable

public class FailingType
{
    static FailingType()
    {
        throw new Exception("Init failed!");
    }
}

// First access
try
{
    var x = new FailingType();
}
catch (TypeInitializationException ex)
{
    // InnerException contains the original exception
    Console.WriteLine(ex.InnerException!.Message); // "Init failed!"
}

// Second access - SAME exception, type is "poisoned"
try
{
    var y = new FailingType();
}
catch (TypeInitializationException)
{
    // Still throws, even though constructor won't run again
}

// Key points:
// 1. Static constructor runs exactly once
// 2. If it fails, type is permanently broken
// 3. TypeInitializationException wraps the original
// 4. Make static constructors simple and failure-proof

Q7: Explain ref, in, and out parameters.

// A7:

// ref - pass by reference, must be initialized before call
public void RefExample(ref int value)
{
    value = 42; // Modifies caller's variable
}
int x = 10;
RefExample(ref x); // x is now 42

// out - pass by reference, must be assigned inside method
public bool TryParse(string s, out int result)
{
    // Must assign result before returning
    return int.TryParse(s, out result);
}
// int y; // Doesn't need initialization
// TryParse("42", out y);

// in - pass by reference, readonly
public double Calculate(in LargeStruct data)
{
    // data = new LargeStruct(); // Compile error!
    return data.A + data.B; // Can read
}

// Performance consideration:
// - Small types (< 16 bytes): pass by value
// - Large types: use 'in' for readonly, 'ref' for modification

Q8: What's the difference between ToLookup and GroupBy?

// A8:

var items = GetItems(); // IEnumerable<Item>

// GroupBy - deferred execution
IEnumerable<IGrouping<int, Item>> grouped = items.GroupBy(i => i.Category);
// Query not executed yet
// Re-executed every time you enumerate

// ToLookup - immediate execution, cached result
ILookup<int, Item> lookup = items.ToLookup(i => i.Category);
// Query executed immediately
// Results cached, O(1) access

// Key difference: Access to missing key
// grouped[999] - Not directly possible with GroupBy
// lookup[999] - Returns empty sequence (never throws!)

// Use GroupBy for:
// - Single enumeration
// - Deferred execution needed
// - Part of larger query

// Use ToLookup for:
// - Multiple accesses by key
// - Cached grouping result
// - Join-like operations

DateTime Questions

Q9: Why is DateTimeOffset preferred over DateTime?

// A9: DateTimeOffset is unambiguous

// DateTime problem: Kind is often Unspecified
var dt = new DateTime(2024, 6, 15, 12, 0, 0); // What timezone?
// dt.Kind == Unspecified - could be any timezone!

// DateTimeOffset always includes offset
var dto = new DateTimeOffset(2024, 6, 15, 12, 0, 0,
    TimeSpan.FromHours(-7)); // Clearly PDT

// Same instant, different representations
var pacific = new DateTimeOffset(2024, 6, 15, 12, 0, 0, TimeSpan.FromHours(-7));
var eastern = new DateTimeOffset(2024, 6, 15, 15, 0, 0, TimeSpan.FromHours(-4));
Console.WriteLine(pacific == eastern); // True!

// Best practice:
// - Store DateTimeOffset in databases
// - Use DateTimeOffset.UtcNow instead of DateTime.UtcNow
// - Convert to user's timezone only for display

Q10: How do you handle DST (Daylight Saving Time) transitions?

// A10: Be aware of invalid and ambiguous times

var pacific = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

// Spring forward gap (2am → 3am)
var gapTime = new DateTime(2024, 3, 10, 2, 30, 0);
if (pacific.IsInvalidTime(gapTime))
{
    // This time doesn't exist!
    // Option 1: Move forward
    // Option 2: Use UTC
    // Option 3: Reject input
}

// Fall back overlap (1am happens twice)
var ambiguousTime = new DateTime(2024, 11, 3, 1, 30, 0);
if (pacific.IsAmbiguousTime(ambiguousTime))
{
    // This time exists twice!
    var offsets = pacific.GetAmbiguousTimeOffsets(ambiguousTime);
    // Choose standard time or daylight time
}

// Best practice: Use DateTimeOffset to avoid ambiguity
// Or store UTC and convert for display

Quick Reference

Thread Creation Methods

| Method | Use When | Thread Source |
|---|---|---|
| Thread | Need full control, long-running blocking | New OS thread |
| ThreadPool.QueueUserWorkItem | Simple fire-and-forget | ThreadPool |
| Task.Run | Modern async/await code | ThreadPool |
| TaskFactory.StartNew | Need options, different scheduler | Configurable |
| Parallel.For/ForEach | Data parallelism | ThreadPool |

Synchronization Primitives

| Primitive | Async Support | Best For |
|---|---|---|
| lock / Monitor | No | Short sync sections |
| SemaphoreSlim | Yes | Async, throttling |
| SpinLock | No | Very short locks |
| ReaderWriterLockSlim | No | Read-heavy workloads |
| AsyncLock (custom) | Yes | Async mutual exclusion |

Collections

| Collection | Thread-Safe | Lock-Free | Best For |
|---|---|---|---|
| ConcurrentDictionary | Yes | Partial | Key-value |
| ConcurrentQueue | Yes | Yes | FIFO |
| ConcurrentStack | Yes | Yes | LIFO |
| ConcurrentBag | Yes | Per-thread | Unordered |
| Channel<T> | Yes | Partial | Async producer-consumer |

Parameter Keywords

| Keyword | Must Initialize | Can Modify | Use For |
|---|---|---|---|
| (none) | Yes | Local copy | Default |
| ref | Yes | Yes | Modify caller's variable |
| out | No | Must assign | Return multiple values |
| in | Yes | No | Large struct optimization |

Summary

This guide covered Principal Engineer-level knowledge of C# threading and async programming:

  1. Thread Creation: Multiple methods from Thread to Task.Run, with performance implications
  2. Synchronization: From lock to SemaphoreSlim, including async-compatible patterns
  3. Channels: Modern producer-consumer with backpressure and async support
  4. Async Streams: IAsyncEnumerable and ValueTask for streaming scenarios
  5. Parallel Execution: WhenAll, WhenAny, and throttling patterns
  6. LINQ: Deferred execution gotchas and ToLookup usage
  7. Language Internals: Static constructors, parameter passing, virtual dispatch
  8. DateTime: Timezone handling and DST edge cases

Master these concepts to build high-performance, thread-safe applications.
