<?xml version="1.0" encoding="utf-8"?>
<feed xml:lang="en" xmlns="http://www.w3.org/2005/Atom">
  <title type="text"><![CDATA[Max Hamulyák · Kaylumah]]></title>
  <subtitle type="text"><![CDATA[A blog and portfolio website for Kaylumah, a company founded by Max Hamulyák]]></subtitle>
  <id>https://kaylumah.nl/feed.xml</id>
  <rights type="text"><![CDATA[© Kaylumah. All rights reserved.]]></rights>
  <updated>2026-03-14T10:02:52+01:00</updated>
  <logo>https://kaylumah.nl/assets/logo_alt.svg</logo>
  <generator>Kaylumah Site Generator</generator>
  <link rel="self" type="application/atom+xml" href="https://kaylumah.nl/feed.xml" />
  <link rel="alternate" type="text/html" href="https://kaylumah.nl/blog.html" />
  <entry>
    <id>https://kaylumah.nl/2025/08/09/automating-user-secrets.html</id>
    <title type="text"><![CDATA[Automating .NET User Secrets with PowerShell]]></title>
    <summary type="text"><![CDATA[Manage secret configuration for dotnet projects using PowerShell]]></summary>
    <published>2025-08-09T12:30:00+02:00</published>
    <updated>2025-08-09T12:30:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2025/08/09/automating-user-secrets.html" />
    <category term="PowerShell" />
<content type="html"><![CDATA[<p>For .NET developers, Microsoft created a development-time convenience for handling secret values.
There is no need for a shared infrastructure dependency, and no need to store secrets in the repository.
Add a helper script on top, and your dev shop has a convenient way to get up and running.</p>
<p>UserSecrets are stored in an unencrypted JSON file. Depending on the platform, they live in either <code>%APPDATA%\Microsoft\UserSecrets</code> (Windows) or <code>~/.microsoft/usersecrets</code> (macOS/Linux).</p>
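<p>Since the file is plain JSON, you can inspect stored values directly. A minimal PowerShell sketch, assuming a hypothetical id of <code>App1-dev-secrets</code>:</p>
<pre><code class="language-powershell"># Build the platform-specific path to the secrets file for a given id
$Id = &quot;App1-dev-secrets&quot;
$Root = if ($IsWindows) { &quot;$env:APPDATA\Microsoft\UserSecrets&quot; } else { &quot;$HOME/.microsoft/usersecrets&quot; }
$SecretsFile = Join-Path -Path $Root -ChildPath $Id -AdditionalChildPath &quot;secrets.json&quot;
Get-Content $SecretsFile
</code></pre>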
<h2 id="simple-variant"><a href="#simple-variant">Simple Variant</a></h2>
<p>The simplest variant is demonstrated by the following PowerShell script.
Note that in a real-world scenario you would parameterize the script so the secrets can be supplied;
for simplicity we use random GUIDs here.</p>
<pre><code class="language-powershell">#Requires -Version 7.4

$ErrorActionPreference = &quot;Stop&quot;
$RepoRoot = Split-Path $PSScriptRoot -Parent

$Secret1 = [System.Guid]::NewGuid().ToString()
$Secret2 = [System.Guid]::NewGuid().ToString()

$APP1_FOLDER = Join-Path -Path $RepoRoot -ChildPath &quot;src/App1&quot;
Push-Location $APP1_FOLDER
Write-Host &quot;Setting secrets for $APP1_FOLDER&quot;
dotnet user-secrets clear
dotnet user-secrets set &quot;App1:ConnectionStrings:Secret1&quot; $Secret1
dotnet user-secrets set &quot;App1:ConnectionStrings:Secret2&quot; $Secret2
Pop-Location
</code></pre>
<p>This produces one of two possible outputs.</p>
<p>Failure output:</p>
<pre><code class="language-output">Could not find the global property 'UserSecretsId' in MSBuild project '/Secrets/src/App1/App1.csproj'. Ensure this property is set in the project or use the '--id' command line option.
</code></pre>
<p>Success output:</p>
<pre><code class="language-output">Setting secrets for /Users/maxhamulyak/Dev/BlogTopics/_posts/Secrets/src/App1
Successfully saved App1:ConnectionStrings:Secret1 to the secret store.
Successfully saved App1:ConnectionStrings:Secret2 to the secret store.
</code></pre>
<p>To set secrets at the project level, the <code>UserSecretsId</code> property must be defined,
for example <code>&lt;UserSecretsId&gt;[ANY-STRING-VALUE]&lt;/UserSecretsId&gt;</code>.
Doing this project by project in a large solution is a hassle, so I prefer creating a <code>Directory.Build.targets</code> file.
That way every project has either an explicit or an implicit secrets id.</p>
<pre><code class="language-xml">&lt;Project&gt;
  &lt;PropertyGroup&gt;
      &lt;UserSecretsId Condition=&quot;'$(UserSecretsId)' == ''&quot;&gt;$(MSBuildProjectName)-dev-secrets&lt;/UserSecretsId&gt;
  &lt;/PropertyGroup&gt;
&lt;/Project&gt;
</code></pre>
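<p>To verify the fallback takes effect, you can query MSBuild for the evaluated property. A sketch; note that <code>-getProperty</code> requires a recent SDK (.NET 8 or later):</p>
<pre><code class="language-powershell"># Run next to the .csproj; prints the effective UserSecretsId
dotnet msbuild -getProperty:UserSecretsId
</code></pre>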
<h2 id="multiple-secrets-at-once"><a href="#multiple-secrets-at-once">Multiple secrets at once</a></h2>
<p>The first version of the script works, but invoking the command line once per secret feels inefficient.
Luckily, we can also bulk import secrets from a JSON file.
The trick is to build the object in PowerShell, convert it to JSON, and pipe it into the <code>dotnet user-secrets</code> command.</p>
<pre><code class="language-powershell">#Requires -Version 7.4

$ErrorActionPreference = &quot;Stop&quot;
$RepoRoot = Split-Path $PSScriptRoot -Parent

$Secret1 = [System.Guid]::NewGuid().ToString()
$Secret2 = [System.Guid]::NewGuid().ToString()

$APP1_FOLDER = Join-Path -Path $RepoRoot -ChildPath &quot;src/App1&quot;
Push-Location $APP1_FOLDER
Write-Host &quot;Setting secrets for $APP1_FOLDER&quot;
dotnet user-secrets clear
$App1Config = @{
    App1 = @{
        ConnectionStrings = @{
            Secret1 = $Secret1
            Secret2 = $Secret2
        }
    }
}
$App1Config | ConvertTo-Json -Depth 5 | dotnet user-secrets set
Pop-Location
</code></pre>
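<p>To confirm the import, list the stored secrets; the keys are printed in their flattened form:</p>
<pre><code class="language-powershell">dotnet user-secrets list
# App1:ConnectionStrings:Secret1 = &lt;guid&gt;
# App1:ConnectionStrings:Secret2 = &lt;guid&gt;
</code></pre>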
<h2 id="using-the-same-secret-across-multiple-projects"><a href="#using-the-same-secret-across-multiple-projects">Using the Same Secret Across Multiple Projects</a></h2>
<p>The previous iteration was already an improvement over our first script.
But for me it does not quite match the real world. In Azure, for instance, I would create a single KeyVault per resource group rather than multiple key vaults. Because of that, I picked up the habit of prefixing secrets per executable. So far in this post that prefix has been <code>App1</code>.</p>
<p>If we now set the MSBuild property <code>&lt;UserSecretsId&gt;Project-5ea2d981-14f7-4487-93c0-d4b7e3dbebf1&lt;/UserSecretsId&gt;</code> for all projects at once, they share a single secret store.</p>
<pre><code class="language-powershell">#Requires -Version 7.4

$ErrorActionPreference = &quot;Stop&quot;

$Secret1 = [System.Guid]::NewGuid().ToString()
$Secret2 = [System.Guid]::NewGuid().ToString()


$App1Config = @{
    ConnectionStrings = @{
            Secret1 = $Secret1
            Secret2 = $Secret2
    }
}

$Config = @{
    App1 = $App1Config
}

$SecretId = &quot;Project-5ea2d981-14f7-4487-93c0-d4b7e3dbebf1&quot;
dotnet user-secrets clear --id $SecretId
$Config | ConvertTo-Json -Depth 10 | dotnet user-secrets set --id $SecretId
</code></pre>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing thoughts</a></h2>
<p>User Secrets are a nice addition to the tool belt. Remembering the correct incantation for clearing and updating secrets is not something you should burden your team with. Wrapping it inside a script for convenience is my recommended approach.
Depending on your deployment model I would go with either option 2 or option 3, keeping the setup as close to production as possible.</p>
<p>Remember: User Secrets are intended for local development only and should never be used to store production secrets.</p>
<h2 id="references"><a href="#references">References</a></h2>
<ul>
<li><a href="https://learn.microsoft.com/en-us/aspnet/core/security/app-secrets" class="external">UserSecrets Documentation</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2025/04/05/tracking-nuget-updates-with-powershell.html</id>
    <title type="text"><![CDATA[Tracking NuGet Updates with PowerShell: Handling Pinned Versions & Constraints]]></title>
    <summary type="text"><![CDATA[Extend the .NET SDK to check outdated NuGet packages using PowerShell, with special handling for version ranges and pinned dependencies.]]></summary>
    <published>2025-04-05T17:30:00+02:00</published>
    <updated>2025-04-05T17:30:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2025/04/05/tracking-nuget-updates-with-powershell.html" />
    <category term="PowerShell" />
    <category term="NuGet" />
    <content type="html"><![CDATA[<p>Most of the time, managing NuGet dependencies in .NET projects is straightforward.
Whether you believe in &quot;don't fix what's not broken&quot; or &quot;always update&quot;, there is always value in knowing about outdated packages.
You need to be able to make an informed decision either way.
While tools like Dependabot can automate this process, I sometimes prefer more control.
In this post I will share a script I wrote that extends the dotnet SDK to provide this information.</p>
<h2 id="create-a-helper-script"><a href="#create-a-helper-script">Create a helper script</a></h2>
<p>The dotnet SDK comes with a built-in command to <a href="https://learn.microsoft.com/en-us/dotnet/core/tools/dotnet-package-list" class="external">list the packages for a project/solution</a>.
Even if you execute the command for a <code>.sln</code> file, you get the outdated packages per project.
The package version shown will always be the latest available.
However, ever since central package management was introduced, most projects in a solution share the same version of a package.
For this purpose we can create a very simple helper script using PowerShell.</p>
<ol>
<li>List packages for solution in JSON format</li>
<li>Process every project with a valid TargetFramework (assumes a single target framework)</li>
<li>Capture outdated packages (unique by PackageId)</li>
<li>Print result</li>
</ol>
<pre><code class="language-powershell">param (
    [Parameter(Mandatory=$true, HelpMessage = &quot;The path to the project file&quot;)]
    [string] $ProjectPath
)

$OutdatedOutput = dotnet list $ProjectPath package --outdated --format json
$OutdatedOutputAsJson = $OutdatedOutput | ConvertFrom-Json
$Projects = $OutdatedOutputAsJson.Projects

$Result = @{}
foreach ($Project in $Projects)
{
    $Frameworks = $Project.Frameworks
    if ($Frameworks -ne $null)
    {
        $Framework = $Frameworks[0]
        $TopLevelPackages = $Framework.TopLevelPackages
        foreach ($Package in $TopLevelPackages)
        {
            $PackageId = $Package.Id

            if ($Result.ContainsKey($PackageId))
            {
                Write-Verbose &quot;Skipping '$PackageId' already processed&quot;
                continue
            }

            $Result[$PackageId] = [pscustomobject]@{
                Id               = $PackageId
                From             = $Package.ResolvedVersion
                To               = $Package.LatestVersion
            }
        }
    }
}

$Outdated = $Result.Values
if ($Outdated.Count -gt 0) {
    $sb = [System.Text.StringBuilder]::new()
    [void]$sb.AppendLine(&quot;The following dependencies have newer versions available:&quot;)
    foreach ($entry in $Outdated) {
        [void]$sb.AppendLine(&quot; - $($entry.Id): $($entry.From) → $($entry.To)&quot;)
    }
    $sb | Write-Warning
}
</code></pre>
<p>Where example output looks like this</p>
<pre><code class="language-shell">WARNING: The following dependencies have newer versions available:
 - FluentAssertions: 7.2.0 → 8.2.0
</code></pre>
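<p>I keep the script at <code>tools/Outdated.ps1</code> (the path used in the MSBuild hook later in this post) and invoke it against a solution like this:</p>
<pre><code class="language-powershell">pwsh ./tools/Outdated.ps1 -ProjectPath ./SSG.sln
</code></pre>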
<h2 id="lock-versions"><a href="#lock-versions">Lock versions</a></h2>
<p>The script shared above has one big shortcoming: it does not handle pinned versions.
Sometimes, for whatever reason, you want to prevent a package from being bumped.
The example I prefer to give is locking a <code>Microsoft.Extensions.*</code> package to its corresponding <code>.NET</code> framework version.
More recently, in the .NET open source community, there have been other cases:
my first thought was <code>Moq</code> with SponsorLink (2023), then <code>FluentAssertions</code> with the new paid license model (January 2025).
This week, AutoMapper, MediatR, and MassTransit joined the club, nudging me to finally finish this article.
Even though the announcement coincided with April Fools' Day, it did not appear to be a joke.
While I fully support and understand the need for these maintainers to earn money for their hard work, depending on how such an update
is handled it opens you up to liabilities.</p>
<p>Luckily we can pin a package version using <a href="https://learn.microsoft.com/en-us/nuget/concepts/package-versioning?tabs=semver20sort#version-ranges" class="external">version ranges</a>.
We can set an inclusive boundary by using <code>[</code> or <code>]</code> and an exclusive boundary by using <code>(</code> or <code>)</code>.
Following this logic</p>
<ul>
<li><code>Moq</code> can be pinned with <code>[4.18.2]</code> or the equivalent <code>[4.18.2, 4.18.2]</code></li>
<li><code>FluentAssertions</code> can receive updates until the next major version with <code>[7.0.0, 8.0.0)</code></li>
</ul>
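<p>If you use central package management, these ranges live in <code>Directory.Packages.props</code>. A sketch using the examples above:</p>
<pre><code class="language-xml">&lt;Project&gt;
  &lt;PropertyGroup&gt;
    &lt;ManagePackageVersionsCentrally&gt;true&lt;/ManagePackageVersionsCentrally&gt;
  &lt;/PropertyGroup&gt;
  &lt;ItemGroup&gt;
    &lt;PackageVersion Include=&quot;Moq&quot; Version=&quot;[4.18.2]&quot; /&gt;
    &lt;PackageVersion Include=&quot;FluentAssertions&quot; Version=&quot;[7.0.0, 8.0.0)&quot; /&gt;
  &lt;/ItemGroup&gt;
&lt;/Project&gt;
</code></pre>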
<p>We now need to update the script to parse these version ranges. If the latest version is higher than the resolved one but still within range, we should update.
Also keep in mind that NuGet always resolves to the lowest version that satisfies the range.
In this case that means we get version <code>7.0.0</code>.</p>
<p>The challenge is that, at the time of writing, FluentAssertions has the following versions available: <code>(7.0.0, ..., 7.2.0, 8.0.0, ..., 8.2.0)</code>.
This means NuGet resolves to <code>7.0.0</code>, while <code>dotnet list package</code> reports <code>8.2.0</code>, which violates the version range, and we never learn about the <code>7.2.0</code> package that would be a valid upgrade.</p>
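<p>You can inspect the available versions yourself via NuGet's flat-container endpoint, the same API the updated script uses:</p>
<pre><code class="language-powershell"># List the stable FluentAssertions versions known to nuget.org
$Url = &quot;https://api.nuget.org/v3-flatcontainer/fluentassertions/index.json&quot;
$Response = Invoke-RestMethod -Uri $Url
$Response.versions | Where-Object { $_ -notmatch &quot;-&quot; }
</code></pre>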
<p>We make the following changes to the script:
check via regex whether the requested version is a version range (with minimum and maximum package versions), and handle the following cases.</p>
<ol>
<li><code>Min == Max</code> =&gt; no update, version pinned.</li>
<li><code>Latest &lt; Max</code> =&gt; update, latest version within range.</li>
<li><code>Latest &gt; Max</code> =&gt; check NuGet, there might be a newer version within the range.</li>
</ol>
<pre><code class="language-powershell">param (
    [Parameter(Mandatory=$true, HelpMessage = &quot;The path to the project file&quot;)]
    [string] $ProjectPath
)

$OutdatedOutput = dotnet list $ProjectPath package --outdated --format json
$OutdatedOutputAsJson = $OutdatedOutput | ConvertFrom-Json
$Projects = $OutdatedOutputAsJson.Projects

$Result = @{}
foreach ($Project in $Projects)
{
    $Frameworks = $Project.Frameworks
    if ($Frameworks -ne $null)
    {
        $Framework = $Frameworks[0]
        $TopLevelPackages = $Framework.TopLevelPackages
        foreach ($Package in $TopLevelPackages)
        {
            $PackageId = $Package.Id

            if ($Result.ContainsKey($PackageId))
            {
                continue
            }

            $ResolvedVersion = [version]$Package.ResolvedVersion # cast so range comparisons are numeric, not lexical
            $LatestVersion = [version]$Package.LatestVersion
            $RequestedVersion = $Package.RequestedVersion
            $NewVersion = $null
            $Description = &quot;N/A&quot;

            $SpecialVersionRegexMatch = $RequestedVersion -match &quot;^(?:(?&lt;Open&gt;[\[\(])(?&lt;Min&gt;[^,\)\]]*)?,?(?&lt;Max&gt;[^,\)\]]*)(?&lt;Close&gt;[\]\)])?)$&quot;

            if (-not $SpecialVersionRegexMatch)
            {
                $NewVersion = $LatestVersion
                $Description = &quot;Regular&quot;
            }
            else
            {
                $min = $null
                $max = $null
                $minInclusive = $Matches.Open -eq &quot;[&quot;
                $maxInclusive = $Matches.Close -eq &quot;]&quot;

                $minText = $Matches.Min
                $maxText = $Matches.Max

                if ($minText -match &quot;-&quot; -or $maxText -match &quot;-&quot;)
                {
                    $NewVersion = $LatestVersion
                    $Description = &quot;Preview version check manually&quot;
                } 
                else 
                {
                    if ($minText) { 
                        $min = [version]$minText
                    } 
                    else {
                        # Fallback to ResolvedVersion 
                        $min = $ResolvedVersion 
                    }

                    if ($maxText) { 
                        $max = [version]$maxText 
                    }
                    elseif (-not $maxInclusive)
                    {
                        # No upper version, resolve to latest
                        $max = $LatestVersion
                    }
                    elseif ($min -ne $null -and $minInclusive -and $maxInclusive)
                    {
                        # Fixed version [1.0.0]
                        $max = $min
                    }
                    else
                    {
                        throw &quot;Unreachable code: unexpected version constraint state for '$PackageId'&quot;
                    }
                    
                    if ($min -eq $max) {
                        $NewVersion = $min
                        $Description = &quot;Pinned&quot;
                    }
                    elseif ($LatestVersion -le $max)
                    {
                        $NewVersion = $LatestVersion
                        $Description = &quot;Below upper-constraint&quot;
                    }
                    elseif ($LatestVersion -gt $max -and $ResolvedVersion -lt $max)
                    {
                        $url = &quot;https://api.nuget.org/v3-flatcontainer/$packageId/index.json&quot;.ToLower()
                        
                        $response = Invoke-RestMethod -Uri $url -ErrorAction Stop
                        $allVersions = $response.versions | Where-Object { $_ -notmatch &quot;-&quot; } | ForEach-Object { [version]$_ }

                        if ($minInclusive) {
                            $allVersions = $allVersions | Where-Object { $_ -ge $min }
                        } else {
                            $allVersions = $allVersions | Where-Object { $_ -gt $min }
                        }

                        if ($maxInclusive) {
                            $allVersions = $allVersions | Where-Object { $_ -le $max }
                        } else {
                            $allVersions = $allVersions | Where-Object { $_ -lt $max }
                        }
                        
                        $NewVersion = $allVersions | Sort-Object -Descending | Select-Object -First 1
                        if ($ResolvedVersion -eq $NewVersion) {
                            $Description = &quot;No new allowed version on NuGet&quot;
                        } else {
                            $Description = &quot;Found version in range on NuGet&quot;
                        }
                    }
                }
            }

            $Result[$PackageId] = [pscustomobject]@{
                Id               = $PackageId
                From             = $ResolvedVersion
                To               = $NewVersion
                Description      = $Description
            }
        }
    }
}

$Outdated = $Result.Values
if ($Outdated.Count -gt 0) {
    $sb = [System.Text.StringBuilder]::new()
    [void]$sb.AppendLine(&quot;The following dependencies have newer versions available:&quot;)
    foreach ($entry in $Outdated) {
        [void]$sb.AppendLine(&quot; - $($entry.Id): $($entry.From) → $($entry.To) ($($entry.Description))&quot;)
    }
    $sb | Write-Warning
}
</code></pre>
<p>If we have the version set to <code>[7.0.0, 8.0.0)</code> we get the following output:</p>
<pre><code class="language-shell">WARNING: The following dependencies have newer versions available:
 - FluentAssertions: 7.0.0 → 7.2.0 (Found version in range on NuGet)
</code></pre>
<p>After upgrading the range to <code>[7.2.0, 8.0.0)</code> we get the following output:</p>
<pre><code class="language-shell">WARNING: The following dependencies have newer versions available:
 - FluentAssertions: 7.2.0 → 7.2.0 (No new allowed version on NuGet)
</code></pre>
<h2 id="next-steps-ensure-the-script-always-runs"><a href="#next-steps-ensure-the-script-always-runs">Next steps: ensure the script always runs</a></h2>
<p>For my blog's repo I took it one step further.
I included a <code>Directory.Solution.props</code>, which triggers a custom target after every build.
In my case I have PowerShell installed as a dotnet tool, hence the <code>dotnet pwsh</code> command.</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;Project xmlns=&quot;http://schemas.microsoft.com/developer/msbuild/2003&quot;&gt;
  &lt;Target Name=&quot;CheckDependencies&quot; AfterTargets=&quot;Build&quot;&gt;
    &lt;PropertyGroup&gt;
      &lt;PowerShellCommand&gt;dotnet pwsh&lt;/PowerShellCommand&gt;
      &lt;PowerShellExecutionPolicy&gt;Bypass&lt;/PowerShellExecutionPolicy&gt;
      &lt;OutdatedScript&gt;$(MSBuildProjectDirectory)/tools/Outdated.ps1&lt;/OutdatedScript&gt;
      &lt;TargetProject&gt;$(MSBuildProjectDirectory)/SSG.sln&lt;/TargetProject&gt;
    &lt;/PropertyGroup&gt;
    &lt;Exec Command=&quot;$(PowerShellCommand) -ExecutionPolicy $(PowerShellExecutionPolicy) -NoProfile -File $(OutdatedScript) -ProjectPath $(TargetProject)&quot; /&gt;
  &lt;/Target&gt;
&lt;/Project&gt;
</code></pre>
<p>If you are on <code>net9.0</code> you will probably not see any output.
This is due to changes in MSBuild's terminal output, which suppress the output from the script.
Running <code>dotnet build --verbosity detailed</code> will reveal it.</p>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2024/08/06/fix-vscode-markdown-preview.html</id>
    <title type="text"><![CDATA[Fixing VSCode Markdown preview with symbolic links!]]></title>
    <summary type="text"><![CDATA[Render images from everywhere inside VSCode's markdown preview]]></summary>
    <published>2024-08-06T17:30:00+02:00</published>
    <updated>2024-08-06T17:30:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2024/08/06/fix-vscode-markdown-preview.html" />
    <category term="VS Code" />
    <category term="Markdown" />
    <content type="html"><![CDATA[<p>Many static website generators support writing blogs in Markdown. Jekyll and several other generators organize content, such as blog posts, and assets, like images and CSS files, into separate directories.</p>
<p>My favorite editor for writing Markdown is VSCode. While this separation is useful for organization, it can be somewhat cumbersome when editing and previewing Markdown.</p>
<p>To illustrate this point, let’s look at an example. A typical directory structure looks like this:</p>
<pre><code class="language-sh"># generated with the command tree -L 2
.
├── _posts
│   └── hello-world.md
├── assets
│   └── logo.svg
└── index.html
</code></pre>
<p>Using the Markdown preview feature of VSCode, that would look like this:</p>
<!-- code --profile "Blog" . -->
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20240806/markdown-preview/001_RootPreview.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20240806/markdown-preview/001_RootPreview.png" width="2272" height="1760" alt="Markdown preview from project root" /></picture></p>
<h2 id="the-issue"><a href="#the-issue">The issue</a></h2>
<p>If you’re like me, your project contains many more files than the few shown in the example. In such cases, I prefer working inside the _posts folder. Unfortunately, as the screenshot below shows, this breaks the image preview functionality.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20240806/markdown-preview/002_FolderPreview.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20240806/markdown-preview/002_FolderPreview.png" width="2272" height="1760" alt="Markdown preview from subfolder" /></picture></p>
<p>Instead of displaying my logo, the preview now shows a broken image icon. Technically, this behavior is correct because, relative to our “hello-world.md” post, there is no “assets” directory. You might think that changing the path to “../assets/” would solve the issue, since that’s where the folder exists on disk. However, VSCode does not allow this due to <a href="https://github.com/Microsoft/vscode/issues/64685#issuecomment-446414622" class="external">security concerns</a>. Even if it did work, it would create the issue that the preview would no longer function correctly when opened from the root directory.</p>
<h2 id="the-solution"><a href="#the-solution">The solution</a></h2>
<p>To my knowledge, there is no built-in function in VSCode to address this issue. However, there is an operating system-level solution: using symbolic links.</p>
<p>We can create a symlink by running the following command inside the &quot;_posts&quot; directory</p>
<pre><code class="language-sh">ln -s ../assets assets
</code></pre>
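<p>To double-check the link, we can recreate the example layout and list the folder; a self-contained sketch:</p>
<pre><code class="language-sh"># Recreate the example layout and confirm where the link points
mkdir -p demo/_posts demo/assets
ln -s ../assets demo/_posts/assets
ls -l demo/_posts/
</code></pre>
<p>The listing shows <code>assets -&gt; ../assets</code>, confirming the relative target.</p>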
<p>From the filesystem perspective the &quot;_posts&quot; folder now has a subfolder called &quot;assets&quot; that points at the real assets directory. If we now open it inside VSCode, it renders the image correctly.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20240806/markdown-preview/003_SymlinkPreview.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20240806/markdown-preview/003_SymlinkPreview.png" width="2272" height="1760" alt="Markdown preview with symlink" /></picture></p>
<h2 id="to-consider"><a href="#to-consider">To consider</a></h2>
<p>Personally, I believe it's a nice workaround for an issue that irritated me.
Before you go, I'd like to leave you with some final thoughts.</p>
<ul>
<li>This behavior is disabled by default to prevent opening untrusted content, so don't blindly apply this solution everywhere</li>
<li>If your blog is under source control with Git you can <a href="https://stackoverflow.com/questions/954560/how-does-git-handle-symbolic-links/18791647#18791647" class="external">include those symlinks</a> in the repository so your teammates get the same benefits. If you are using Git for Windows you may need <a href="https://stackoverflow.com/questions/5917249/git-symbolic-links-in-windows/59761201#59761201" class="external">additional steps to support symlinks</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2024/02/09/long-live-reqnroll.html</id>
    <title type="text"><![CDATA[Specflow has died; long live Reqnroll!]]></title>
    <summary type="text"><![CDATA[SpecFlow rebooted; please welcome Reqnroll!]]></summary>
    <published>2024-02-09T21:00:00+01:00</published>
    <updated>2024-02-09T21:00:00+01:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2024/02/09/long-live-reqnroll.html" />
    <category term="Testing" />
    <category term="Reqnroll" />
    <content type="html"><![CDATA[<p>Today's post is a short one. My goal is to spread awareness inside the .NET community that on 2024-02-08 at 14:56:14Z, Specflow was <a href="https://github.com/SpecFlowOSS/SpecFlow/issues/2719#issuecomment-1934292742" class="external">pronounced dead</a>. The statement was made by none other than Gáspár Nagy, the original creator of Specflow.</p>
<p>For those who don't know, Specflow is (or I suppose was) the official Cucumber implementation for .NET, which has been open source ever since 2009. With Cucumber, you write tests using the Gherkin language, which allows for human-readable tests among many other advantages. One of the last things I did at my previous company was give a dev talk on all things Specflow and my experiences using it.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20240209/reqnroll/ilionx_devdays_2023_specflow.jpeg.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20240209/reqnroll/ilionx_devdays_2023_specflow.jpeg" width="2560" height="1707" alt="Dev Days 2023 - ilionx - Specflow Presentation" /></picture></p>
<p>At some point in the past, Specflow was sold, and unfortunately, for the past two years the project has seen no activity. Gáspár has made the decision <a href="https://reqnroll.net/news/2024/02/from-specflow-to-reqnroll-why-and-how/" class="external">to fork and revive</a> it under the name Reqnroll (pronounced [reknroʊl]). He has done a lot of the grunt work, and now it is up to him and the community to keep it alive.</p>
<p>A <a href="https://docs.reqnroll.net/latest/guides/migrating-from-specflow.html" class="external">migration guide</a> was published, and it is as simple as changing a package and changing a few namespaces. I think the migration took me about 15 minutes, and it was done. There is even a package published for backwards compatibility, so even fewer changes are required.</p>
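<p>In practice the migration boils down to edits like the following sketch; the package name and version shown are illustrative and depend on your test runner:</p>
<pre><code class="language-xml">&lt;!-- Before: &lt;PackageReference Include=&quot;SpecFlow.xUnit&quot; Version=&quot;3.9.74&quot; /&gt; --&gt;
&lt;PackageReference Include=&quot;Reqnroll.xUnit&quot; Version=&quot;1.0.0&quot; /&gt;
</code></pre>
<p>The remaining work is replacing the <code>TechTalk.SpecFlow</code> namespace with <code>Reqnroll</code> in your step definitions.</p>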
<p>Most of the tests for the blog and static website generator you are reading this article on were written using Specflow (and have now been rewritten!). I was planning a bunch of articles on this topic, but with a new job and everything, time got away from me.
I'd like to personally thank Gáspár, first for the effort to keep Specflow alive and now for starting the fork and a brand new community. I hope the efforts will not be in vain and that we can make the project successful. I will make it a priority that upcoming content makes use of Reqnroll! Not only because I love this way of writing tests but also to show that Gáspár has my full support moving forward!</p>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2023/04/14/csharp-client-for-openapi-revisited.html</id>
    <title type="text"><![CDATA[Generate C# client for OpenAPI - Revisited]]></title>
    <summary type="text"><![CDATA[A comparison of NSwag.MSBuild and OpenApiReference]]></summary>
    <published>2023-04-14T00:00:00+02:00</published>
    <updated>2023-04-14T00:00:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2023/04/14/csharp-client-for-openapi-revisited.html" />
    <category term="C#" />
    <category term="NSwag" />
    <category term="OpenAPI" />
    <category term="Swashbuckle" />
    <content type="html"><![CDATA[<p>I am working on an article for the blog that relies on a C# generated Open API client. I wrote an article on that a few years ago called <a href="https://kaylumah.nl/2021/05/23/generate-csharp-client-for-openapi.html">&quot;Generate C# client for OpenAPI&quot;</a>. So I decided to check if the advice from that post would still be valid today. Combined with the fact that, according to analytics, it is one of my most popular articles to date, this post was born.</p>
<p>The solution provided relied on using an MSBuild task to generate the API on build using a tool called NSwag. However, even back then, in 2021, an alternative was already available. Steve Collins, another dotnet content creator, published an article called <a href="https://stevetalkscode.co.uk/openapireference-commands" class="external">&quot;Using OpenApiReference To Generate Open API Client Code&quot;</a>. The alternative directly adds OpenAPI support to the project while still using NSWag under the hood. Back then, Steve mentioned that there was little documentation, and I was already familiar with doing it manually, so I decided to stick with that. Today I wanted to compare doing it manually or via the built-in mechanism.</p>
<h2 id="safe-openapi-specification-on-build"><a href="#safe-openapi-specification-on-build">Save OpenAPI specification on build</a></h2>
<p>The purpose of this post is not to detail how to configure an OpenAPI spec for your project, since the standard template already supports Swashbuckle. You can find more information on that over <a href="https://learn.microsoft.com/en-us/aspnet/core/tutorials/getting-started-with-swashbuckle?view=aspnetcore-7.0&amp;tabs=visual-studio" class="external">at Microsoft Learn</a>. One thing I like to add to the standard template is that I want the specification to be part of the project output. We can achieve that with the Swashbuckle CLI, which you can install with the command <code>dotnet tool install --local Swashbuckle.AspNetCore.Cli --version 6.4.0</code>. Note that the version of the CLI must match the version of Swashbuckle used in the API project. After you install the tool, you can modify the csproj to look like this.</p>
<pre><code class="language-xml">&lt;Target Name=&quot;Generate OpenAPI Specification Document&quot; AfterTargets=&quot;Build&quot;&gt;
  &lt;PropertyGroup&gt;
    &lt;OpenApiDocumentName&gt;v1&lt;/OpenApiDocumentName&gt;
    &lt;ApiDll&gt;$(OutputPath)$(AssemblyName).dll&lt;/ApiDll&gt;
    &lt;OutputApiDocument&gt;$(OutputPath)$(AssemblyName).json&lt;/OutputApiDocument&gt;
  &lt;/PropertyGroup&gt;
  &lt;Exec Command=&quot;dotnet swagger tofile --output $(OutputApiDocument) $(ApiDll) $(OpenApiDocumentName)&quot; ContinueOnError=&quot;true&quot; /&gt;
&lt;/Target&gt;
</code></pre>
<p>The <code>swagger</code> command takes the output location (OutputApiDocument), the DLL containing the specification (ApiDll) and the document name (OpenApiDocumentName) as input parameters. The default name of the API document is <code>v1</code>. We use some existing MSBuild properties to populate these parameters; in our case, <code>OutputPath</code> looks like <code>bin/Debug/net7.0/</code> and <code>AssemblyName</code> is <code>Demo</code>. That means that after the project builds, the file <code>bin/Debug/net7.0/Demo.json</code> will contain our OpenAPI specification.</p>
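<p>With the property values above filled in, the command the target executes after a Debug build would look roughly like this (paths are the defaults from this example):</p>
<pre><code class="language-shell">dotnet swagger tofile --output bin/Debug/net7.0/Demo.json bin/Debug/net7.0/Demo.dll v1
</code></pre>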
<p>Note that, being part of the <code>bin</code> folder, the specification is not under source control. Sometimes I place it in the project root to track any changes made to the specification. Doing so is especially useful for spotting unexpected or unintended changes to the specification.</p>
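<p>As a sketch of that root-folder variant, only the output property in the target needs to change (pointing it at the project directory is my choice, not part of the original setup):</p>
<pre><code class="language-xml">&lt;OutputApiDocument&gt;$(MSBuildProjectDirectory)/$(AssemblyName).json&lt;/OutputApiDocument&gt;
</code></pre>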
<h2 id="use-nswag.msbuild-to-generate-a-csharp-client"><a href="#use-nswag.msbuild-to-generate-a-csharp-client">Use NSwag.MSBuild to generate a C# client</a></h2>
<p>To add NSwag manually to our project, we need the <code>NSwag.MSBuild</code> NuGet package, which we can install via <code>dotnet add package NSwag.MSBuild --version 13.18.2</code>. The process is mostly the same as I detailed in 2021; one of the few changes is the target framework to use. Modify the csproj as follows:</p>
<pre><code class="language-xml">&lt;Target Name=&quot;NSwag&quot; AfterTargets=&quot;PostBuildEvent&quot; Condition=&quot; '$(Configuration)' == 'Debug' &quot;&gt;
    &lt;!--https://github.com/RicoSuter/NSwag/wiki/NSwag.MSBuild--&gt;
    &lt;!-- &lt;Exec Command=&quot;$(NSwagExe_Net70) new&quot; /&gt; --&gt;
    &lt;PropertyGroup&gt;
      &lt;OpenApiDocument&gt;../../Api/Demo/bin/Debug/net7.0/Demo.json&lt;/OpenApiDocument&gt;
      &lt;NSwagConfiguration&gt;nswag.json&lt;/NSwagConfiguration&gt;
      &lt;GeneratedOutput&gt;Client.g.cs&lt;/GeneratedOutput&gt;
    &lt;/PropertyGroup&gt;
    &lt;Exec Command=&quot;$(NSwagExe_Net70) run $(NSwagConfiguration) /variables:OpenApiDocument=$(OpenApiDocument),GeneratedOutput=$(GeneratedOutput)&quot; /&gt;
&lt;/Target&gt;
</code></pre>
<p>You can uncomment <code>$(NSwagExe_Net70) new</code> to generate a fresh nswag.json, the configuration file used by NSwag. After you have the config file, you still need to specify the runtime, the document, and the output location. Abbreviated, the change to the file looks like this:</p>
<pre><code class="language-json">{
  &quot;runtime&quot;: &quot;Net70&quot;,
  &quot;defaultVariables&quot;: null,
  &quot;documentGenerator&quot;: {
    &quot;fromDocument&quot;: {
      &quot;json&quot;: &quot;$(OpenApiDocument)&quot;
    }
  },
  &quot;codeGenerators&quot;: {
     &quot;openApiToCSharpClient&quot;: { 
      // ...
      &quot;output&quot;: &quot;$(GeneratedOutput)&quot;
      // ...
     }
  }
}
</code></pre>
<p>I don't remember it being possible back in 2021, but you can now change the JSON serializer used in the generated client. You can do so by setting <code>jsonLibrary</code> under <code>codeGenerators</code> &gt; <code>openApiToCSharpClient</code> to <code>SystemTextJson</code>. If you do not, you must install the <code>Newtonsoft.Json</code> package, or the generated code will not compile.</p>
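<p>For reference, a minimal nswag.json fragment with that setting would look something like this (other generator settings omitted):</p>
<pre><code class="language-json">&quot;codeGenerators&quot;: {
  &quot;openApiToCSharpClient&quot;: {
    &quot;jsonLibrary&quot;: &quot;SystemTextJson&quot;,
    &quot;output&quot;: &quot;$(GeneratedOutput)&quot;
  }
}
</code></pre>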
<h2 id="using-openapi-reference"><a href="#using-openapi-reference">Using OpenAPI Reference</a></h2>
<h3 id="using-openapi-reference-from-visual-studio"><a href="#using-openapi-reference-from-visual-studio">Using OpenAPI reference from Visual Studio</a></h3>
<p>I can imagine that people do not like the manual way; especially if you don't know the inner workings of MSBuild, it can feel a bit like magic. Adding an OpenAPI reference via Visual Studio is as simple as right-clicking any project and choosing &quot;Add Connected Service&quot;.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20230414/openapi/01_add_service_reference.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20230414/openapi/01_add_service_reference.png" width="1546" height="1000" alt="Microsoft Visual Studio - Add Service reference" /></picture></p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20230414/openapi/02_select_service_type.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20230414/openapi/02_select_service_type.png" width="1443" height="925" alt="Microsoft Visual Studio - Select service reference type" /></picture></p>
<p>By choosing the option &quot;Service reference...&quot; instead of &quot;Connected Service&quot; you get the second prompt immediately. By selecting &quot;Connected service&quot; you get the overview of all connected services for the project and then need an extra click to add the service reference.</p>
<p>We can customize the input for the MSBuild task on the third screen. We only need to specify the file location of the OpenAPI JSON.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20230414/openapi/03_add_openapi.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20230414/openapi/03_add_openapi.png" width="1547" height="923" alt="Microsoft Visual Studio - Add service reference OpenAPI" /></picture></p>
<p>By selecting &quot;Finish&quot;, Visual Studio will make all necessary modifications. Easy, right? The project file should now look like this:</p>
<pre><code class="language-xml">&lt;ItemGroup&gt;
  &lt;OpenApiReference Include=&quot;..\..\Api\Demo\bin\Debug\net7.0\Demo.json&quot; 
                    CodeGenerator=&quot;NSwagCSharp&quot;
                    Link=&quot;OpenAPIs\Demo.json&quot; /&gt;
&lt;/ItemGroup&gt;

&lt;ItemGroup&gt;
  &lt;PackageReference Include=&quot;Microsoft.Extensions.ApiDescription.Client&quot; Version=&quot;3.0.0&quot;&gt;
    &lt;PrivateAssets&gt;all&lt;/PrivateAssets&gt;
    &lt;IncludeAssets&gt;runtime; build; native; contentfiles; analyzers; buildtransitive&lt;/IncludeAssets&gt;
  &lt;/PackageReference&gt;
  &lt;PackageReference Include=&quot;Newtonsoft.Json&quot; Version=&quot;13.0.1&quot; /&gt;
  &lt;PackageReference Include=&quot;NSwag.ApiDescription.Client&quot; Version=&quot;13.0.5&quot;&gt;
    &lt;PrivateAssets&gt;all&lt;/PrivateAssets&gt;
    &lt;IncludeAssets&gt;runtime; build; native; contentfiles; analyzers; buildtransitive&lt;/IncludeAssets&gt;
  &lt;/PackageReference&gt;
&lt;/ItemGroup&gt;
</code></pre>
<p>After you build the project, the generated code will be in the <code>obj</code> folder. As mentioned above, the API specification is in the <code>Debug/net7.0</code> folder, so this setup will break if I retarget the solution to a newer framework. That is another reason to just put the specification at the root of the API project.</p>
<h3 id="using-openapi-reference-from-command-line"><a href="#using-openapi-reference-from-command-line">Using OpenAPI reference from command line</a></h3>
<p>You may wonder if it is as simple if you do not have Visual Studio as your IDE. It is; Microsoft published a dotnet tool for this exact reason. You can install it by running <code>dotnet tool install --local Microsoft.dotnet-openapi --version 7.0.4</code>. You can add the API specification by using a terminal from your project's directory and running the following command.</p>
<pre><code class="language-shell">dotnet dotnet-openapi add file ..\..\Api\Demo\bin\Debug\net7.0\Demo.json
</code></pre>
<p>The result looks like this:</p>
<pre><code class="language-xml">&lt;ItemGroup&gt;
  &lt;PackageReference Include=&quot;Newtonsoft.Json&quot; Version=&quot;12.0.2&quot; /&gt;
  &lt;PackageReference Include=&quot;NSwag.ApiDescription.Client&quot; Version=&quot;13.0.5&quot; /&gt;
&lt;/ItemGroup&gt;
&lt;ItemGroup&gt;
  &lt;OpenApiReference Include=&quot;..\..\Api\Demo\bin\Debug\net7.0\Demo.json&quot; /&gt;
&lt;/ItemGroup&gt;
</code></pre>
<p>Yeah, that is right: the result is similar to, but not the same as, what Visual Studio produces.</p>
<ul>
<li>The package <code>Microsoft.Extensions.ApiDescription.Client</code> is missing.</li>
<li>The version of <code>Newtonsoft.Json</code> is different.</li>
<li>The CodeGenerator is not specified; it defaults to <code>NSwagCSharp</code>.</li>
</ul>
<p>I expected the tool to use the same templates as Visual Studio, but this is not the case. The missing package is still used, but as a transitive dependency of <code>NSwag.ApiDescription.Client</code>; with the installed version, it is just a preview build.</p>
<p>Do note that it is not required to use the dotnet tool for this; you can manually add the same lines as provided above. The tool is just there for convenience. Documentation for the tool is still somewhat limited but can be found in the <a href="https://learn.microsoft.com/en-us/aspnet/core/web-api/microsoft.dotnet-openapi?view=aspnetcore-7.0" class="external">docs</a>.</p>
<h3 id="influence-created-output"><a href="#influence-created-output">Influence created output</a></h3>
<p>We have already seen that for the manual approach, changes to the configuration can be made by modifying the nswag.json, a file we do not have when using OpenApiReference. This section covers making the same modifications for this variant.</p>
<p>Before I go into it, we must fix one issue with the template used so far. It is a glaring issue that only appears if you have built the project in different ways. For example, here is the output when building from Visual Studio:</p>
<pre><code class="language-shell">1&gt;GenerateNSwagCSharp:
1&gt;  &quot;C:\Users\hamulyak\.nuget\packages\nswag.msbuild\13.0.5\build\../tools/Win/NSwag.exe&quot; openapi2csclient /className:DemoClient /namespace:ConnectedService /input:C:\projects\BlogTopics\MyBlog\src\Api\Demo\bin\Debug\net7.0\Demo.json /output:obj\DemoClient.cs
1&gt;NSwag command line tool for .NET 4.6.1+ WinX64, toolchain v13.0.5.0 (NJsonSchema v10.0.22.0 (Newtonsoft.Json v11.0.0.0))
1&gt;Visit http://NSwag.org for more information.
1&gt;NSwag bin directory: C:\Users\hamulyak\.nuget\packages\nswag.msbuild\13.0.5\tools\Win
1&gt;Code has been successfully written to file.
</code></pre>
<p>Compare that with the output from the dotnet CLI:</p>
<pre><code class="language-shell">  GenerateNSwagCSharp:
    dotnet --roll-forward-on-no-candidate-fx 2 C:\Users\hamulyak\.nuget\packages\nswag.msbuild\13.0.5\build\../tools/NetCore21//dotnet-nswag.dll openapi2csclient /className:DemoClient /na
  mespace:ConnectedService /input:C:\projects\BlogTopics\MyBlog\src\Api\Demo\bin\Debug\net7.0\Demo.json /output:obj\DemoClient.cs
  NSwag command line tool for .NET Core NetCore21, toolchain v13.0.5.0 (NJsonSchema v10.0.22.0 (Newtonsoft.Json v11.0.0.0))
  Visit http://NSwag.org for more information.
  NSwag bin directory: C:\Users\hamulyak\.nuget\packages\nswag.msbuild\13.0.5\tools\NetCore21
  Code has been successfully written to file.
</code></pre>
<p>Do you see the issue? The CLI variant uses a different NSwag runtime; it runs a <code>NetCore21</code> DLL. We get this behaviour because the templates reference an outdated package version. According to NuGet, the old version (13.0.5) has been downloaded over 2 million times, whereas no other version exceeds half a million. After updating, the NSwag runtime will match your project's target framework.</p>
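<p>A sketch of that update, pinning the package to the NSwag version used elsewhere in this post (13.18.2; check NuGet for the latest version yourself):</p>
<pre><code class="language-xml">&lt;PackageReference Include=&quot;NSwag.ApiDescription.Client&quot; Version=&quot;13.18.2&quot;&gt;
  &lt;PrivateAssets&gt;all&lt;/PrivateAssets&gt;
  &lt;IncludeAssets&gt;runtime; build; native; contentfiles; analyzers; buildtransitive&lt;/IncludeAssets&gt;
&lt;/PackageReference&gt;
</code></pre>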
<p>Back to the issue at hand: how do we customize the output? It is a mix-and-match situation. You can modify the namespace and client name directly by specifying them as attributes on the <code>OpenApiReference</code> element like this:</p>
<pre><code class="language-xml">&lt;OpenApiReference Include=&quot;..\..\Api\Demo\bin\Debug\net7.0\Demo.json&quot; 
                  CodeGenerator=&quot;NSwagCSharp&quot; 
                  Namespace=&quot;MyNamespace&quot;
                  ClassName=&quot;MyClient&quot; 
                  Link=&quot;OpenAPIs\Demo.json&quot; /&gt;
</code></pre>
<p>Other options, like the JSON library, need to be passed differently: alongside attributes like <code>Namespace</code>, there is an <code>Options</code> attribute. For example, the configuration below uses SystemTextJson and provides a custom name for the exception class in the generated code base.</p>
<pre><code class="language-xml">&lt;OpenApiReference Include=&quot;..\..\Api\Demo\bin\Debug\net7.0\Demo.json&quot; 
                  CodeGenerator=&quot;NSwagCSharp&quot; 
                  Options=&quot;/JsonLibrary:SystemTextJson /ExceptionClass:DemoApiException&quot; 
                  ClassName=&quot;MyClient&quot; 
                  Link=&quot;OpenAPIs\Demo.json&quot; /&gt;
</code></pre>
<p>Any value settable via nswag.json can also be provided here in the format <code>/propertyName:value</code>. I would like to point out that properties like the namespace cannot be set here, so the following snippet will not work.</p>
<pre><code class="language-xml">&lt;OpenApiReference 
  Include=&quot;..\..\Api\Demo\bin\Debug\net7.0\Demo.json&quot; 
  CodeGenerator=&quot;NSwagCSharp&quot; 
  Options=&quot;/Namespace:MyNamspace /JsonLibrary:SystemTextJson /ExceptionClass:DemoApiException&quot; 
  ClassName=&quot;MyClient&quot; 
  Link=&quot;OpenAPIs\Demo.json&quot; /&gt;
</code></pre>
<p>The reason is that the task creates the following NSwag command (displayed in the output window):</p>
<pre><code class="language-shell">dotnet --roll-forward-on-no-candidate-fx 2 &quot;C:\Users\hamulyak\.nuget\packages\nswag.msbuild\13.18.2\build\../tools/Net70//dotnet-nswag.dll&quot; openapi2csclient /className:MyClient /namespace:Override /input:&quot;C:\projects\BlogTopics\MyBlog\src\Api\Demo\bin\Debug\net7.0\Demo.json&quot; /output:&quot;obj\DemoClient.cs&quot; /Namespace:MyNamspace /JsonLibrary:SystemTextJson /ExceptionClass:DemoApiException
</code></pre>
<p>It has a duplicate <code>/Namespace</code>, and the first wins. The only way to customize the namespace is by providing it as an attribute. Otherwise, the default value, which is the assembly name, will be used.</p>
<h2 id="conclusion"><a href="#conclusion">Conclusion</a></h2>
<p>In conclusion, I cannot deny that the OpenApiReference approach feels a lot easier than the manual one. It has its issues, though. For example, the lack of documentation on configuring the generated code is disappointing. I also find it odd that the Visual Studio and command-line approaches are out of sync and that both templates are outdated. It makes me question whether people are actually using it. Neither issue is a hard no-go for me. The new approach does feel less like magic, but the trade-off is that the settings used are less transparent than an nswag.json file. As I showed in the previous version of this article, NSwag can also function without specifying all options and will apply the defaults itself. I like things to be explicit, so the manual approach has a point in its favour: it gives me the choice to specify everything or omit the things I don't need. The abstraction might have been better if we could still provide a config file, or if there were properties that Microsoft would map to NSwag settings internally, decoupling my code generation from the implementation. What do you think about this? Which version do you prefer?</p>
<p>As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/GenerateCSharpClientForOpenAPIRevisited" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="additional-resources"><a href="#additional-resources">Additional Resources</a></h2>
<ul>
<li><a href="https://devblogs.microsoft.com/dotnet/generating-http-api-clients-using-visual-studio-connected-services/" class="external">Visual Studio Connected Services</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2022/09/17/how-to-use-azurite-for-testing-azure-storage-in-dotnet.html</id>
    <title type="text"><![CDATA[How to use Azurite for testing Azure Storage in dotnet?]]></title>
    <summary type="text"><![CDATA[Learn how to setup the Azure SDK for dotnet with Azurite]]></summary>
    <published>2022-09-17T18:00:00+02:00</published>
    <updated>2022-09-17T18:00:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2022/09/17/how-to-use-azurite-for-testing-azure-storage-in-dotnet.html" />
    <category term="C#" />
    <category term="Azure" />
    <category term="Testing" />
    <content type="html"><![CDATA[<h2 id="how-to-test-a-dependency-on-azure"><a href="#how-to-test-a-dependency-on-azure">How to test a dependency on Azure?</a></h2>
<p>A recent project tasked me with integrating an application with Azure Blob Storage. Thanks to my post <a href="https://kaylumah.nl/2022/02/21/working-with-azure-sdk-for-dotnet.html">&quot;Working with Azure SDK for .NET&quot;</a>, I knew all about the current SDK to interface with Azure. The team in charge of the dotnet SDK has done a great job of providing accessible samples. My previous post did not focus on the testability aspects of the system, mainly because it was a simple demo; for production code, that is, of course, a no-go.</p>
<p>There are a few ways we can go about testing this:</p>
<ol>
<li>Create a mock or fake implementation of every Storage API required.</li>
<li>Hide the blob implementation behind an internal interface and mock that in your tests.</li>
<li>Create a real storage account (per developer) in Azure.</li>
<li>Emulate a storage account.</li>
</ol>
<p>The argument for options 1 and 2 is that you, the developer, are not responsible for testing Azure's internal components. Option 3 has the challenges of cost and test repeatability: because it hosts the dependency externally, you need setup and teardown for anything done in your tests. For example, you cannot create a file with the same name twice. Option 4 has the problem that no emulator is guaranteed to be 100% equal to the real deal.</p>
<p>While I agree with the argument for the first two options, the point here is to test if we can successfully integrate with Azure (as opposed to asserting their SDK works as expected). You can debate if testing with emulators or Azure is still a unit test. Using EntityFramework's DbContext in a test would warrant the same definition question.</p>
<blockquote>
<p><strong>Important</strong>: if you only remember one thing from this post, let it be that every option except the third still requires you to test against real Azure at some point. All other options are not the actual integration, and your application can behave differently once deployed.</p>
</blockquote>
<h2 id="how-can-azurite-help-by-emulating-azure-storage"><a href="#how-can-azurite-help-by-emulating-azure-storage">How can Azurite help by emulating Azure Storage?</a></h2>
<p>The test solution I picked uses the popular open-source emulator called <a href="https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azurite" class="external">Azurite</a>. The Azurite tool offers a local environment for the Azure Blob, Azure Queue and Azure Table services. In the past, we also had Microsoft's own storage account emulator, but it appears that development on it has stopped, and the focus has shifted to Azurite.</p>
<p>There are several ways to run Azurite (e.g. Docker or NPM).</p>
<pre><code class="language-shell"># install Azurite
npm install -g azurite

# run Azurite
azurite --silent --location c:\azurite --debug c:\azurite\debug.log
</code></pre>
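<p>For completeness, a sketch of the Docker variant using the official image (the ports are the Azurite defaults for blob, queue and table):</p>
<pre><code class="language-shell">docker run -p 10000:10000 -p 10001:10001 -p 10002:10002 mcr.microsoft.com/azure-storage/azurite
</code></pre>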
<p>If we create a demo app and install Blob Storage via <code>dotnet add package Azure.Storage.Blobs</code>, we can connect with the following snippet:</p>
<pre><code class="language-csharp">using Azure.Storage.Blobs;

var connectionString = &quot;DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;&quot;;
var blobServiceClient = new BlobServiceClient(connectionString);
var properties = await blobServiceClient.GetPropertiesAsync().ConfigureAwait(false);
</code></pre>
<p>The snippet works because the connection string we provided is the default connection string for Azurite. It contains the default account known as <code>devstoreaccount1</code> and connects over HTTP. The default connection string also assumes you are running the blob, queue and table services, which need not be the case. For example, with the NPM install you could run only one of them:</p>
<pre><code class="language-shell"># Run only Blob
azurite-blob --silent --location c:\azurite --debug c:\azurite\debug.log
# Run only Queue
azurite-queue --silent --location c:\azurite --debug c:\azurite\debug.log
# Run only Table
azurite-table --silent --location c:\azurite --debug c:\azurite\debug.log
</code></pre>
<p>Most would stop here because what I have shown so far is more than enough to use BlobServices from test code. It even works in <a href="https://docs.microsoft.com/en-us/azure/storage/blobs/use-azurite-to-run-automated-tests#run-tests-on-azure-pipelines" class="external">Azure Pipelines</a>. I, however, am still not entirely happy with it.</p>
<h2 id="how-to-use-azurite-without-a-connectionstring"><a href="#how-to-use-azurite-without-a-connectionstring">How to use Azurite without a ConnectionString?</a></h2>
<p>In the post <a href="https://kaylumah.nl/2022/02/21/working-with-azure-sdk-for-dotnet.html">&quot;Working with Azure SDK for .NET&quot;</a> I made a point that connection strings should be a thing of the past. The <code>TokenCredential</code> should be the way forward (<code>dotnet add package Azure.Identity</code>).</p>
<pre><code class="language-csharp">using Azure.Identity;
using Azure.Storage.Blobs;

var endpoint = new Uri(&quot;http://127.0.0.1:10000/devstoreaccount1&quot;);
var credential = new DefaultAzureCredential();
var blobServiceClient = new BlobServiceClient(endpoint, credential, new BlobClientOptions());
var properties = await blobServiceClient.GetPropertiesAsync().ConfigureAwait(false);
</code></pre>
<p>Based on the default configuration, the above snippet should have worked. However, you get the following error <code>System.ArgumentException: Cannot use TokenCredential without HTTPS.</code></p>
<p>Azurite has options to provide HTTPS support. You can <a href="https://github.com/Azure/Azurite#https-setup" class="external">use a tool</a> called mkcert to generate the required files.</p>
<pre><code class="language-shell"># Run once
mkcert 127.0.0.1

# Run over HTTPs
azurite --silent --location c:\azurite --debug c:\azurite\debug.log --cert 127.0.0.1.pem --key 127.0.0.1-key.pem
</code></pre>
<p>Update the endpoint Uri to HTTPS:</p>
<pre><code class="language-csharp">using Azure.Identity;
using Azure.Storage.Blobs;

var endpoint = new Uri(&quot;https://127.0.0.1:10000/devstoreaccount1&quot;);
var credential = new DefaultAzureCredential();
var blobServiceClient = new BlobServiceClient(endpoint, credential, new BlobClientOptions());
var properties = await blobServiceClient.GetPropertiesAsync().ConfigureAwait(false);
</code></pre>
<p>If we run our test example now, it will fail with the warning that an SSL connection cannot be established. We can solve this by installing a CA certificate from mkcert with <code>mkcert --install</code>. However, even with a valid SSL certificate, TokenCredential will still fail. For TokenCredential to work, we need to pass <code>--oauth basic</code> to Azurite.</p>
<pre><code class="language-shell">azurite --silent --location c:\azurite --debug c:\azurite\debug.log --cert 127.0.0.1.pem --key 127.0.0.1-key.pem --oauth basic
</code></pre>
<h2 id="can-i-use-azurite-https-connection-string-in-cicd-pipelines"><a href="#can-i-use-azurite-https-connection-string-in-cicd-pipelines">Can I use Azurite HTTPS connection string in CICD pipelines?</a></h2>
<p>Now that we can use TokenCredential, I am happy. The test instance of our BlobServiceClient is almost identical to the production configuration. We have established that it works locally, but how about a CICD environment? I modified the example pipeline to add the mkcert bits.</p>
<pre><code class="language-yaml">steps:
  - bash: |
      choco install mkcert
      npm install -g azurite
      mkdir azurite
      cd azurite
      mkcert --install
      mkcert 127.0.0.1
      azurite --oauth basic --cert 127.0.0.1.pem --key 127.0.0.1-key.pem --silent --location data --debug data\debug.log &amp;
    displayName: &quot;Install and Run Azurite&quot;
</code></pre>
<p>Unfortunately, adding a certificate to the trust store requires a password prompt. On an Azure-hosted agent, this does not work and causes the agent <a href="https://github.com/FiloSottile/mkcert/issues/286" class="external">to be stuck</a>. To me, this could mean one of two things. Either the Azure team does not test over HTTPS, or they have a different set of test tooling. As it turns out, they have a set of helpers to construct the service clients and disable SSL verification. Like this:</p>
<pre><code class="language-csharp">using Azure.Core.Pipeline;
using Azure.Identity;
using Azure.Storage.Blobs;

var endpoint = new Uri(&quot;https://127.0.0.1:10000/devstoreaccount1&quot;);
var credential = new DefaultAzureCredential();
var blobServiceClient = new BlobServiceClient(endpoint, credential, new BlobClientOptions()
{
    Transport = new HttpClientTransport(new HttpClient(new HttpClientHandler
    {
        ServerCertificateCustomValidationCallback =
            HttpClientHandler.DangerousAcceptAnyServerCertificateValidator
    }))
});
var properties = await blobServiceClient.GetPropertiesAsync().ConfigureAwait(false);
</code></pre>
<p>It can now be used in Azure Pipelines like this (note the absence of <code>mkcert --install</code>):</p>
<pre><code class="language-yaml">steps:
  - bash: |
      choco install mkcert
      npm install -g azurite
      mkdir azurite
      cd azurite
      mkcert 127.0.0.1
      azurite --oauth basic --cert 127.0.0.1.pem --key 127.0.0.1-key.pem --silent --location data --debug data\debug.log &amp;
    displayName: &quot;Install and Run Azurite&quot;
</code></pre>
<h2 id="how-to-use-azurite-in-my-project"><a href="#how-to-use-azurite-in-my-project">How to use Azurite in my project?</a></h2>
<p>The testing helpers do more than disable SSL, but they are not available on NuGet. So naturally, I raised <a href="https://github.com/Azure/azure-sdk-for-net/issues/30751" class="external">an issue</a> with the SDK team asking if they have any plans in that direction. Unfortunately, at this point, they have no interest in releasing their internal test tooling. The techniques I mentioned thus far can be used standalone. I, however, felt this was an excellent opportunity to create my first NuGet package. The package cannot assume how anybody runs Azurite, so I introduced two classes. You can use <code>AzuriteAccountBuilder</code> to configure how things are run, like the account or the ports being used. The <code>AzuriteAccount</code> class provides access to things like the connection string. For convenience, the package also provides helper methods to create a <code>BlobServiceClient</code>, <code>TableServiceClient</code> or <code>QueueServiceClient</code> from an <code>AzuriteAccount</code>.</p>
<p>My package is designed for use in a test project, so let us create a new test project (e.g. <code>dotnet new xunit</code>) and add my package to it via <code>dotnet add package Kaylumah.Testing.Azurite --version 1.0.0</code>.
The most explicit way to create an AzuriteAccount is with the following code:</p>
<pre><code class="language-csharp">[Fact]
public async Task Test1()
{
    var account = new AzuriteAccountBuilder()
        .WithProtocol(secure: false)
        .WithDefaultAccount()
        .WithDefaultBlobEndpoint()
        .WithDefaultQueueEndpoint()
        .WithDefaultTableEndpoint()
        .Build();

    var blobServiceClient = account.CreateBlobServiceViaConnectionString();
    await blobServiceClient.GetPropertiesAsync();
}
</code></pre>
<p>The snippet above creates a connection string based on the default settings. That means it should match the connection string when someone runs Azurite without parameters. For convenience, I have also added a helper class that creates this default account for you.</p>
<pre><code class="language-csharp">[Fact]
public async Task Test2()
{
    var account = AzuriteHelper.CreateDefaultAzuriteAccountBuilder().Build();
    var blobServiceClient = account.CreateBlobServiceViaSharedKeyCredential();
    await blobServiceClient.GetPropertiesAsync();
}
</code></pre>
<p>The package offers the same convenience helpers for Queue and Table storage. You can use a connection string, shared key, Azure SAS key or token credential with the helpers.</p>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>I started this journey with knowledge about Azurite and the dotnet SDK for Azure. I knew from experience that I wanted to work with managed identity instead of connection strings. I needed a way to have repeatable tests in local and CI/CD environments. The funny thing is that after I had everything working the way I wanted, I could not use the API I actually needed. For a moment, I had forgotten Azurite is an emulator, and not all features are supported. So I had to fall back on shared key credentials, which work fine over HTTPS and could already be used in pipelines. Luckily, I designed the package to work with a variety of configurations.</p>
<p>Usually, this is where I post a link to the post's GitHub repo. This time, the source code is the NuGet package in this <a href="https://github.com/kaylumah/Kaylumah.Testing.Azurite" class="external">GitHub repo</a>. In the post itself, I focussed on NPM; if you are looking for how to integrate with Docker containers, you can find an example test <a href="https://github.com/kaylumah/Kaylumah.Testing.Azurite/blob/main/test/Unit/AzuriteDockerRunner.cs" class="external">in the GitHub repo</a>. As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them. Especially since this is my first NuGet package, let me know if it helped you out.</p>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2022/06/07/share-debug-configuration-with-launch-profiles.html</id>
    <title type="text"><![CDATA[Share debug configuration between .NET IDEs with launch profiles]]></title>
    <summary type="text"><![CDATA[Pick your own .NET IDE and benefit from shared developer configuration with launchSettings.json]]></summary>
    <published>2022-06-07T23:30:00+02:00</published>
    <updated>2022-06-07T23:30:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2022/06/07/share-debug-configuration-with-launch-profiles.html" />
    <category term="Rider" />
    <category term="Visual Studio" />
    <category term="VS Code" />
    <content type="html"><![CDATA[<p>Anno 2022, as <code>.NET</code> developers, we are spoilt with multiple options for our development environment. Of course, having a choice sparks the debate that my IDE is better than your IDE. I feel that after <code>bring your own device</code>, we are moving to a <code>bring your own IDE</code> workspace. Given the rise of tooling like <code>VS Code DevContainer</code> and <code>GitHub Codespaces</code>, I think more developers will likely opt for such tooling.</p>
<blockquote>
<p>Did you know that most of my blogs are written for use in dev containers and are available in GitHub Codespaces?</p>
</blockquote>
<p>Each IDE has its perks but also its quirks. Who am I to tell you that tool X is better than tool Y? If you can work with the tool you prefer, you can be much more productive than with a tool the company picked for you. It does bring its challenges, though. For example, if you change a file in your IDE, I don't want it reformatted when I open it in mine. Otherwise, my version control system will show more changes to a project than actually happened. Lucky for us, tools like <code>.editorConfig</code> help a lot to streamline this process. I switch back and forth a lot between VS Code and Visual Studio. My team was working with <code>Rider</code> for a recent customer project. Keeping settings in sync between two IDEs was hard enough. So it made me wonder: is there an equivalent of <code>.editorConfig</code>, but for debug configuration? I knew that <code>Visual Studio</code> has the concept of a <code>launchSettings.json</code> file. As I discovered, it is possible to make both <code>Rider</code> and <code>VS Code</code> play nice with <code>launchSettings.json</code>. It is by no means perfect, but at least for me, it solves some of the caveats in a <code>bring your own IDE</code> world.</p>
<p>If you were wondering, &quot;Max launchSettings.json has been around for years; why are you writing this article?&quot; The answer to that is straightforward. It bugged me a lot that I had to repeat myself. When searching for how to configure my IDE, I came across the <a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/environments?view=aspnetcore-6.0#development-and-launchsettingsjson" class="external">ASP.NET Fundamentals - Environment</a>. It suggests using the <code>VS Code</code> variant of configuration but does not mention that you can reuse your <code>Visual Studio</code> one. Reading that article prompted me to write down what I learned so that someday someone might benefit from it.</p>
<h2 id="what-is-launchsettings"><a href="#what-is-launchsettings">What is LaunchSettings?</a></h2>
<p>Let me begin with a quick recap about <code>launchSettings.json</code>. A launch settings file contains <a href="https://github.com/dotnet/project-system/blob/main/docs/launch-profiles.md" class="external">Launch Profiles</a>. A <code>Launch Profile</code> is a kind of configuration that specifies how to run your project. Having these launch profiles allows you to switch between configurations easily. Typical scenarios are switching between Development and Production environments or enabling feature flags. Launch profiles are in the optional <code>Properties\launchSettings.json</code> file. For example, a freshly created console project will not have one, whereas a web API project will define one.</p>
<p>A launch profile has a couple of properties depending on the project type. I will highlight the ones that are relevant to this post.</p>
<ul>
<li><code>commandName</code>: the only required setting, which determines how a project is launched. To work in every IDE, this setting needs to be <code>Project</code>.</li>
<li><code>commandLineArgs</code>: a string containing arguments to supply to the application.</li>
<li><code>environmentVariables</code>: a collection of name/value pairs, each specifying an environment variable and a value to set.</li>
</ul>
<p>A few important notes:</p>
<ul>
<li>Environment values set in launchSettings.json override values set in the system environment.</li>
<li>The launchSettings.json file is only used on the local development machine.</li>
<li>The launchSettings.json file shouldn't store secrets.</li>
</ul>
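<p>The first of those notes is easy to gloss over. Conceptually (this is my own illustration, not what the IDE literally executes), the profile's <code>environmentVariables</code> are exported on top of the system environment just before the project process starts, so the profile's value is the one the application observes:</p>
<pre><code class="language-sh"># Pretend the system environment already defines a value
KAYLUMAH_ENVIRONMENT=&quot;FromSystemEnvironment&quot;
export KAYLUMAH_ENVIRONMENT

# The launch profile re-exports its own value before launch,
# so the profile wins over the system environment
KAYLUMAH_ENVIRONMENT=&quot;Development&quot;
export KAYLUMAH_ENVIRONMENT

# The launched project now observes the profile's value
echo &quot;KAYLUMAH_ENVIRONMENT=$KAYLUMAH_ENVIRONMENT&quot;
</code></pre>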
<h2 id="project-setup"><a href="#project-setup">Project Setup</a></h2>
<p>For our sample application, we will create a new project using the &quot;Console Template&quot; with <code>dotnet new console</code>. Since it is a console project, we must create a <code>Properties\launchSettings.json</code> by hand. At a minimum, the file would look like this.</p>
<pre><code class="language-json">{
    &quot;$schema&quot;: &quot;https://json.schemastore.org/launchsettings.json&quot;,
    &quot;profiles&quot;: {
        &quot;DemoConsole.V0&quot;: {
            &quot;commandName&quot;: &quot;Project&quot;
        }
    }
}
</code></pre>
<p>Since we are demoing features of <code>launchSettings.json</code>, it would not be much of a demo if we didn't populate it.</p>
<pre><code class="language-json">{
    &quot;$schema&quot;: &quot;https://json.schemastore.org/launchsettings.json&quot;,
    &quot;profiles&quot;: {
        &quot;DemoConsole.V0&quot;: {
            &quot;commandName&quot;: &quot;Project&quot;,
            &quot;commandLineArgs&quot;: &quot;&quot;,
            &quot;environmentVariables&quot;: {
                &quot;KAYLUMAH_ENVIRONMENT&quot;: &quot;Development&quot;
            }
        },
        &quot;DemoConsole.V1&quot;: {
            &quot;commandName&quot;: &quot;Project&quot;,
            &quot;environmentVariables&quot;: {
                &quot;KAYLUMAH_ENVIRONMENT&quot;: &quot;Production&quot;
            }
        },
        &quot;DemoConsole.V2&quot;: {
            &quot;commandName&quot;: &quot;Project&quot;,
            &quot;commandLineArgs&quot;: &quot;--mysetting myvalue&quot;,
            &quot;environmentVariables&quot;: {
                &quot;KAYLUMAH_ENVIRONMENT&quot;: &quot;Production&quot;
            }
        },
        &quot;DemoConsole.V3&quot;: {
            &quot;commandName&quot;: &quot;Project&quot;,
            &quot;commandLineArgs&quot;: &quot;--mysetting myvalue&quot;,
            &quot;environmentVariables&quot;: {
                &quot;KAYLUMAH_ENVIRONMENT&quot;: &quot;Production&quot;,
                &quot;KAYLUMAH_FROMVARIABLE1&quot;: &quot;$(TargetFramework)&quot;,
                &quot;KAYLUMAH_FROMVARIABLE2&quot;: &quot;$(MyCustomProp)&quot;
            }
        }
    }
}
</code></pre>
<p>The console app will build an <code>IConfiguration</code> and print it to the console. Since I don't feel like adding all my environment variables, I add only the ones prefixed with <code>KAYLUMAH_</code>, kinda like how .NET automatically includes variables prefixed with <code>DOTNET_</code>.</p>
<pre><code class="language-csharp">using Microsoft.Extensions.Configuration;

IConfigurationBuilder configurationBuilder = new ConfigurationBuilder();
configurationBuilder.AddEnvironmentVariables(&quot;KAYLUMAH_&quot;);
if (args is { Length: &gt; 0 })
{
    configurationBuilder.AddCommandLine(args);
}
var configuration = configurationBuilder.Build();

if (configuration is IConfigurationRoot configurationRoot)
{
    Console.WriteLine(configurationRoot.GetDebugView());
}

Console.WriteLine(&quot;Done...&quot;);
Console.ReadLine();
</code></pre>
<p>If we run the project now, the output should be:</p>
<pre><code class="language-output">ENVIRONMENT=Development (EnvironmentVariablesConfigurationProvider Prefix: 'KAYLUMAH_')

Done...
</code></pre>
<p>We also generate a project from the <code>webapi</code> template and modify it slightly to contain a second profile, so it looks like this.</p>
<pre><code class="language-json">{
  &quot;$schema&quot;: &quot;https://json.schemastore.org/launchsettings.json&quot;,
  &quot;iisSettings&quot;: {
    &quot;windowsAuthentication&quot;: false,
    &quot;anonymousAuthentication&quot;: true,
    &quot;iisExpress&quot;: {
      &quot;applicationUrl&quot;: &quot;http://localhost:33652&quot;,
      &quot;sslPort&quot;: 44325
    }
  },
  &quot;profiles&quot;: {
    &quot;DemoApi&quot;: {
      &quot;commandName&quot;: &quot;Project&quot;,
      &quot;dotnetRunMessages&quot;: true,
      &quot;launchBrowser&quot;: true,
      &quot;launchUrl&quot;: &quot;swagger&quot;,
      &quot;applicationUrl&quot;: &quot;https://localhost:7238;http://localhost:5200&quot;,
      &quot;environmentVariables&quot;: {
        &quot;ASPNETCORE_ENVIRONMENT&quot;: &quot;Development&quot;
      }
    },
    &quot;DemoApi.Production&quot;: {
      &quot;commandName&quot;: &quot;Project&quot;,
      &quot;dotnetRunMessages&quot;: true,
      &quot;launchBrowser&quot;: true,
      &quot;launchUrl&quot;: &quot;swagger&quot;,
      &quot;applicationUrl&quot;: &quot;https://localhost:7238;http://localhost:5200&quot;,
      &quot;environmentVariables&quot;: {
        &quot;ASPNETCORE_ENVIRONMENT&quot;: &quot;Production&quot;
      }
    },
    &quot;IIS Express&quot;: {
      &quot;commandName&quot;: &quot;IISExpress&quot;,
      &quot;launchBrowser&quot;: true,
      &quot;launchUrl&quot;: &quot;swagger&quot;,
      &quot;environmentVariables&quot;: {
        &quot;ASPNETCORE_ENVIRONMENT&quot;: &quot;Development&quot;
      }
    }
  }
}
</code></pre>
<p>Depending on your chosen profile, you see a <code>Swagger UI</code> dashboard.</p>
<h2 id="share-debug-configuration-from-microsoft-visual-studio"><a href="#share-debug-configuration-from-microsoft-visual-studio">Share debug configuration from Microsoft Visual Studio</a></h2>
<p>I could not verify it online, but I think Visual Studio introduced launch settings as part of the first <code>ASP.NET Core</code> release. Since launch profiles are a <code>Visual Studio</code> feature, I don't have much to add beyond the definition I've already given. One cool thing I'd like to mention is that, when running from <code>Visual Studio</code>, <code>launchSettings</code> can reference <code>MSBuild</code> variables. That is a pretty handy way to provide something dynamic.</p>
<p>For our console, we see the following selection in Visual Studio:</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/visualstudio_console_launchprofile.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/visualstudio_console_launchprofile.png" width="2640" height="796" alt="Microsoft Visual Studio - Console Launch Profile" /></picture></p>
<p>For our API, we see the following selection in Visual Studio:</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/visualstudio_api_launchprofile.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/visualstudio_api_launchprofile.png" width="2604" height="1080" alt="Microsoft Visual Studio - API Launch Profile" /></picture></p>
<p>As you can see, the WebAPI variant shows more than just our launch profiles.</p>
<p>Another aspect of development configuration is the ability to run multiple projects simultaneously. We can achieve this in <code>Visual Studio</code> by selecting multiple startup projects. As far as I know, this setting is user-specific, which would result in every developer repeating the same information. Luckily, there is a handy plugin called <a href="https://marketplace.visualstudio.com/items?itemName=vs-publisher-141975.SwitchStartupProjectForVS2022" class="external">SwitchStartupProject</a>.</p>
<p>The plugin makes it quick to define multiple configurations: for each project, we provide a <code>ProfileName</code> that matches one in our launch settings. It is that simple.</p>
<pre><code class="language-json">{
    &quot;Version&quot;: 3,
    &quot;ListAllProjects&quot;: false,
    &quot;MultiProjectConfigurations&quot;: {
        &quot;Demo&quot;: {
            &quot;Projects&quot;: {
                &quot;DemoConsole&quot;: {
                    &quot;ProfileName&quot;: &quot;DemoConsole.V1&quot;
                },
                &quot;DemoApi&quot;: {
                    &quot;ProfileName&quot;: &quot;DemoApi&quot;
                }
            },
            &quot;SolutionConfiguration&quot;: &quot;Release&quot;,
            &quot;SolutionPlatform&quot;: &quot;x64&quot;
        }
    }
}
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/visualstudio_compound_configuration.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/visualstudio_compound_configuration.png" width="3848" height="348" alt="Microsoft Visual Studio - Compound Settings" /></picture></p>
<h2 id="share-debug-configuration-from-jetbrains-rider"><a href="#share-debug-configuration-from-jetbrains-rider">Share debug configuration from JetBrains Rider</a></h2>
<p>As it turns out, <code>launchSettings</code> has been supported in <code>Rider</code> for a long time. It was first introduced in <a href="https://blog.jetbrains.com/dotnet/2018/11/08/using-net-core-launchsettings-json-rundebug-apps-rider/" class="external">November 2018</a>. As a matter of fact, to use <code>launchSettings</code> inside <code>Rider</code>, you don't need to do a thing. <code>Rider</code> <a href="https://www.jetbrains.com/help/rider/Run_Debug_Configuration_dotNet_Launch_Settings_Profile.html#creating-run-debug-configurations-based-on-launch-profiles" class="external">automatically detects</a> if your projects are using <code>launchSettings</code>. Not all features are supported, but profiles with <code>&quot;commandName&quot;: &quot;Project&quot;</code> are. If you provided MSBuild variables in <code>launchSettings</code>, <code>Rider</code> would correctly pass them along.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/rider_launchprofiles.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/rider_launchprofiles.png" width="964" height="904" alt="JetBrains Rider - launch profiles" /></picture></p>
<p>A thing I like about <code>Rider</code> is that I don't need an additional plugin to support multiple startup projects.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/rider_compound_configuration.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/rider_compound_configuration.png" width="3456" height="2144" alt="JetBrains Rider - Compound Settings" /></picture></p>
<p>It's important to check <code>Store as project file</code>; otherwise, you won't share it with your team. In this particular example, it will look like this:</p>
<pre><code class="language-xml">&lt;component name=&quot;ProjectRunConfigurationManager&quot;&gt;
  &lt;configuration default=&quot;false&quot; name=&quot;Console and API&quot; type=&quot;CompoundRunConfigurationType&quot;&gt;
    &lt;toRun name=&quot;DemoApi: DemoApi.Production&quot; type=&quot;LaunchSettings&quot; /&gt;
    &lt;toRun name=&quot;DemoConsole: DemoConsole.V3&quot; type=&quot;LaunchSettings&quot; /&gt;
    &lt;method v=&quot;2&quot; /&gt;
  &lt;/configuration&gt;
&lt;/component&gt;
</code></pre>
<h2 id="share-debug-configuration-from-microsoft-vs-code"><a href="#share-debug-configuration-from-microsoft-vs-code">Share debug configuration from Microsoft VS Code</a></h2>
<p>Last but not least is <code>VS Code</code>, the reason I started this article. When you open a .NET project in <code>VS Code</code>, you get prompted to create a <code>launch.json</code> file. A <code>launch.json</code> file is very similar to a <code>launchSettings.json</code>. Both options provide the means to choose a project, set command-line arguments and override environment variables. The default <code>launch.json</code> does not pass any additional configuration to the project. So what would be the logical output of our command?
The answer might surprise you.</p>
<p>Given the following configuration in <code>launch.json</code></p>
<pre><code class="language-json">{
    &quot;name&quot;: &quot;.NET Core Launch (console)&quot;,
    &quot;type&quot;: &quot;coreclr&quot;,
    &quot;request&quot;: &quot;launch&quot;,
    &quot;preLaunchTask&quot;: &quot;build&quot;,
    &quot;program&quot;: &quot;${workspaceFolder}/bin/Debug/net6.0/DemoConsole.dll&quot;,
    &quot;args&quot;: [],
    &quot;cwd&quot;: &quot;${workspaceFolder}&quot;,
    &quot;console&quot;: &quot;internalConsole&quot;,
    &quot;stopAtEntry&quot;: false
}
</code></pre>
<p>The output will be:</p>
<pre><code class="language-output">ENVIRONMENT=Development (EnvironmentVariablesConfigurationProvider Prefix: 'KAYLUMAH_')

Done...
</code></pre>
<p>That is because you have secretly been using <code>launchSettings.json</code> the whole time. In May 2018, release <a href="https://github.com/OmniSharp/omnisharp-vscode/blob/master/CHANGELOG.md#1150-may-10-2018" class="external">1.15.0</a> of the extension shipped <code>launchSettings.json</code> support. If you don't add <code>launchSettingsProfile</code> to your <code>launch.json</code>, it will use the project's first profile of type <code>&quot;commandName&quot;: &quot;Project&quot;</code>. Ever had unexplained variables in your project? This is likely the reason why. Remember, our default profile sets an environment variable, and variables from <code>launchSettings.json</code> win over system environment variables. I recommend explicitly specifying <code>launchSettingsProfile</code> to make it clear that a) you are using it and b) reordering profiles does not create unexpected changes for other developers.</p>
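<p>A crude way to convince yourself how fragile the &quot;first profile&quot; default is (a plain text scan for illustration only; the extension of course uses a real JSON parser): reordering the profiles silently changes which one gets picked.</p>
<pre><code class="language-sh"># Trimmed copy of our launchSettings.json (illustration only)
cat &gt; /tmp/launchSettings.json &lt;&lt;'EOF'
{
  &quot;profiles&quot;: {
    &quot;DemoConsole.V0&quot;: { &quot;commandName&quot;: &quot;Project&quot; },
    &quot;DemoConsole.V1&quot;: { &quot;commandName&quot;: &quot;Project&quot; }
  }
}
EOF

# Crude scan for the first profile name; swapping V0 and V1 above
# changes the outcome without touching any IDE configuration
first=$(grep -o '&quot;DemoConsole[^&quot;]*&quot;' /tmp/launchSettings.json | head -n 1 | tr -d '&quot;')
echo &quot;selected profile: $first&quot;
</code></pre>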
<p>Like <code>Rider</code>, the support for this feature comes with a few <a href="https://github.com/OmniSharp/omnisharp-vscode/blob/master/debugger-launchjson.md#launchsettingsjson-support" class="external">restrictions</a>:</p>
<ol>
<li>Only profiles with <code>&quot;commandName&quot;: &quot;Project&quot;</code> are supported.</li>
<li>Only the <code>environmentVariables</code>, <code>applicationUrl</code> and <code>commandLineArgs</code> properties are supported.</li>
<li>Settings in <code>launch.json</code> take precedence over settings in <code>launchSettings.json</code>; for example, if <code>args</code> is already set to something other than an empty string/array in <code>launch.json</code>, the <code>launchSettings.json</code> content will be ignored.</li>
</ol>
<p>Since you can provide arguments and environment variables in both <code>launch.json</code> and <code>launchSettings.json</code>, let's look at an example.</p>
<pre><code class="language-json">{
    &quot;name&quot;: &quot;.NET Core Launch (console)&quot;,
    &quot;type&quot;: &quot;coreclr&quot;,
    &quot;request&quot;: &quot;launch&quot;,
    &quot;preLaunchTask&quot;: &quot;build&quot;,
    &quot;program&quot;: &quot;${workspaceFolder}/bin/Debug/net6.0/DemoConsole.dll&quot;,
    &quot;cwd&quot;: &quot;${workspaceFolder}&quot;,
    &quot;console&quot;: &quot;internalConsole&quot;,
    &quot;stopAtEntry&quot;: false,
    &quot;launchSettingsProfile&quot;: &quot;DemoConsole.V2&quot;,
    &quot;args&quot;: [
        &quot;--othersetting&quot;,
        &quot;vscode&quot;
    ],
    &quot;env&quot;: {
        &quot;KAYLUMAH_ENVIRONMENT&quot;: &quot;Development&quot;,
        &quot;KAYLUMAH_OTHER&quot;: &quot;From target&quot;
    }
}
</code></pre>
<pre><code class="language-output">ENVIRONMENT=Development (EnvironmentVariablesConfigurationProvider Prefix: 'KAYLUMAH_')
OTHER=From target (EnvironmentVariablesConfigurationProvider Prefix: 'KAYLUMAH_')
othersetting=vscode (CommandLineConfigurationProvider)

Done...
</code></pre>
<p>There are a few things that happen:</p>
<ol>
<li>Since <code>launch.json</code> specified <code>args</code>, the <code>commandLineArgs</code> from <code>launchSettings.json</code> are ignored.</li>
<li>Since <code>launch.json</code> specified <code>env</code> and <code>launchSettings.json</code> specified <code>environmentVariables</code>, both sets get merged.</li>
<li>Since <code>launch.json</code> wins, the value for <code>KAYLUMAH_ENVIRONMENT</code> is <code>Development</code>.</li>
</ol>
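<p>Those three rules can be approximated with a few lines of shell (my own sketch of the merge semantics, not VS Code's actual implementation):</p>
<pre><code class="language-sh"># Values from the launchSettings.json profile DemoConsole.V2
profile_args=&quot;--mysetting myvalue&quot;
profile_environment=&quot;Production&quot;

# Values from launch.json
launch_args=&quot;--othersetting vscode&quot;
launch_environment=&quot;Development&quot;
launch_other=&quot;From target&quot;

# Rule 1: if launch.json sets args, commandLineArgs is ignored entirely
effective_args=&quot;${launch_args:-$profile_args}&quot;

# Rules 2 and 3: both environment sets are merged;
# on a conflict the launch.json value wins
effective_environment=&quot;${launch_environment:-$profile_environment}&quot;
effective_other=&quot;$launch_other&quot;

echo &quot;args: $effective_args&quot;
echo &quot;KAYLUMAH_ENVIRONMENT=$effective_environment&quot;
echo &quot;KAYLUMAH_OTHER=$effective_other&quot;
</code></pre>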
<p>The default configuration for our web API looks slightly different because it adds support for opening the browser after the project starts.
Our base URL comes from the <code>launchSettings.json</code>, but the <code>launchUrl</code> gets ignored. You can achieve the same behaviour by updating the generated <code>serverReadyAction</code> with a <code>uriFormat</code>.</p>
<pre><code class="language-json">{
    &quot;version&quot;: &quot;0.2.0&quot;,
    &quot;configurations&quot;: [
        {
            &quot;name&quot;: &quot;.NET Core Launch (web)&quot;,
            &quot;type&quot;: &quot;coreclr&quot;,
            &quot;request&quot;: &quot;launch&quot;,
            &quot;preLaunchTask&quot;: &quot;build&quot;,
            &quot;program&quot;: &quot;${workspaceFolder}/bin/Debug/net6.0/DemoApi.dll&quot;,
            &quot;args&quot;: [],
            &quot;cwd&quot;: &quot;${workspaceFolder}&quot;,
            &quot;stopAtEntry&quot;: false,
            &quot;serverReadyAction&quot;: {
                &quot;action&quot;: &quot;openExternally&quot;,
                &quot;pattern&quot;: &quot;\\bNow listening on:\\s+(https?://\\S+)&quot;,
                &quot;uriFormat&quot;: &quot;%s/swagger&quot;
            },
            &quot;env&quot;: {
                &quot;ASPNETCORE_ENVIRONMENT&quot;: &quot;Development&quot;
            },
            &quot;sourceFileMap&quot;: {
                &quot;/Views&quot;: &quot;${workspaceFolder}/Views&quot;
            }
        },
        {
            &quot;name&quot;: &quot;.NET Core Attach&quot;,
            &quot;type&quot;: &quot;coreclr&quot;,
            &quot;request&quot;: &quot;attach&quot;
        }
    ]
}
</code></pre>
<p>Of the three IDEs, <code>VS Code</code> has the easiest way to share compound configurations. Just add the following to your <code>launch.json</code>:</p>
<pre><code class="language-json">&quot;compounds&quot;: [
    {
        &quot;name&quot;: &quot;Console + API&quot;,
        &quot;configurations&quot;: [
            &quot;Launch WebAPI&quot;,
            &quot;Launch Console&quot;
        ]
    }
]
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/vscode_launchprofiles.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20220607/launch-settings/vscode_launchprofiles.png" width="680" height="620" alt="Microsoft VS Code - launch profiles" /></picture></p>
<h2 id="bonus-use-launch-settings-from-dotnet-cli"><a href="#bonus-use-launch-settings-from-dotnet-cli">Bonus use Launch Settings from Dotnet CLI</a></h2>
<p>Technically, the Dotnet CLI is not an IDE, so consider this a small bonus chapter. I am including the CLI since it also uses launch profiles when running locally.</p>
<p>As it turns out, the CLI also defaults to the first profile in <code>Properties\launchSettings.json</code>, so in our case <code>DemoConsole.V0</code>, just like VS Code did. The following example uses a bit of <code>PowerShell</code> to run the CLI.</p>
<pre><code class="language-pwsh"># prints the default
dotnet run

# Sets env var for current session
$env:KAYLUMAH_COMMANDLINE=&quot;Session ENV var&quot;
# prints COMMANDLINE + the default
dotnet run
</code></pre>
<p>If we don't want any launch profile, we run <code>dotnet run --no-launch-profile</code>; to specify a profile, we run <code>dotnet run --launch-profile &quot;DemoConsole.V2&quot;</code>.</p>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>As we discovered, it's more than likely that you were using launch profiles outside Visual Studio without realising it. I am not sure that taking the first profile when none is specified is a sensible default, particularly since someone can change the order of the profiles. I do like that, in at least three popular IDEs, we have an easy mechanism for sharing settings.</p>
<p>In the future, I am also hoping for a shared tool for compound configurations. At the very least, managing compound configurations using existing launch profiles is much easier than duplicating arguments and environment variables for each IDE. One last note: I discovered that MSBuild variable replacement does not appear to work from <code>VS Code</code> or the <code>CLI</code>. I am not sure if that's a bug or by design, but it's important to realise that not every IDE supports all features of <code>launchSettings.json</code>. As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/WorkingWithLaunchSettings" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="sources-used"><a href="#sources-used">Sources Used</a></h2>
<ul>
<li><a href="https://github.com/dotnet/project-system/blob/main/docs/launch-profiles.md" class="external">Project system - Launch Profiles</a></li>
<li><a href="https://www.jetbrains.com/help/rider/Run_Debug_Configuration_dotNet_Launch_Settings_Profile.html" class="external">JetBrains - Launch Profiles</a></li>
<li><a href="https://github.com/OmniSharp/omnisharp-vscode/blob/master/debugger-launchjson.md#launchsettingsjson-support" class="external">OmniSharp - LaunchProfiles</a></li>
<li><a href="https://docs.microsoft.com/en-us/dotnet/core/tools/dotnet-run" class="external">Dotnet CLI - Run command</a></li>
<li><a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-6.0#environment-variables-set-in-generated-launchsettingsjson" class="external">Launchsettings - environment variables</a></li>
<li><a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/environments?view=aspnetcore-6.0#development-and-launchsettingsjson" class="external">Launchsettings.json file</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2022/02/21/working-with-azure-sdk-for-dotnet.html</id>
    <title type="text"><![CDATA[Working with Azure SDK for .NET]]></title>
    <summary type="text"><![CDATA[The latest iteration of the Azure SDK for dotnet has several cool features baked into its design. We take a look at some common scenarios]]></summary>
    <published>2022-02-21T21:30:00+01:00</published>
    <updated>2022-02-21T21:30:00+01:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2022/02/21/working-with-azure-sdk-for-dotnet.html" />
    <category term="C#" />
    <category term="Azure" />
    <content type="html"><![CDATA[<p>February 2022 marks the 20th anniversary of the dotnet platform, which is quite a milestone. I found it the perfect time to reflect; I have been working professionally for almost six years and using .NET during the four years before that in my studies. For a dotnet blogger like myself, I could not stand idly by and let this pass without a post. February 2022 also marks another milestone for me. My first ever open-source contribution has been released into the wild. I made a <a href="https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/servicebus/Azure.Messaging.ServiceBus/CHANGELOG.md#760-2022-02-08" class="external">small contribution</a> to the <code>Azure SDK for .NET</code>. So in honour of both, I wrote this article with small tips and tricks I picked up when working with the SDK.</p>
<h2 id="which-azure-sdk-should-i-use"><a href="#which-azure-sdk-should-i-use">Which Azure SDK should I use?</a></h2>
<p>Since <a href="https://devblogs.microsoft.com/azure-sdk/state-of-the-azure-sdk-2021/" class="external">July 2019</a>, Microsoft has made a design effort to unify the SDKs for the different services. There are shared concepts between the libraries, like authentication and diagnostics. The libraries follow the pattern <code>Azure.{service}.{library}</code>.
My contribution was to the Service Bus SDK, so today's article focuses on the service bus. Almost everything described is transferable to the other SDKs; only a few bits are ServiceBus specific. The NuGet package we need is <code>Azure.Messaging.ServiceBus</code>.</p>
<h2 id="how-to-set-up-azure-service-bus-with-azure-cli"><a href="#how-to-set-up-azure-service-bus-with-azure-cli">How to set up Azure Service Bus with Azure CLI?</a></h2>
<p>I think the local development aspect of any service is as important as ease of use in production. Unfortunately, there is no way to emulate the service bus locally; Jimmy Bogard wrote about that in <a href="https://jimmybogard.com/local-development-with-azure-service-bus/" class="external">this article</a>. Without emulating, we need to set up our resources in Azure, even for our development environment. There are a few possible options to create resources in Azure:</p>
<ul>
<li>Manually via the Azure Portal</li>
<li>Infrastructure as Code (ARM, Bicep, etc.)</li>
<li>Scripting (Azure CLI, Azure PowerShell module)</li>
</ul>
<p>For prototypes such as this article, I prefer Azure CLI since the commands are repeatable and, more importantly, easy to understand.</p>
<blockquote>
<p><strong>NOTE</strong>:</p>
<p>When I work with the Azure CLI, I use the <a href="https://marketplace.visualstudio.com/items?itemName=ms-vscode.azurecli" class="external">Azure CLI Tools</a> extension for VS Code. It provides Intellisense and snippets to work with the CLI.</p>
</blockquote>
<pre><code class="language-sh">AzureSubscriptionId=&quot;&lt;subscription-id&gt;&quot;
AzureTenantId=&quot;&lt;tenant-id&gt;&quot;
AzureResourceGroup=&quot;demorg001&quot;
AzureLocation=&quot;westeurope&quot;

# Sign in to Azure using device code - After login session is scoped to Subscription in Tenant
az login --use-device-code --tenant $AzureTenantId
az account set --subscription $AzureSubscriptionId

# Set default values for location and resource group
az config set defaults.location=$AzureLocation defaults.group=$AzureResourceGroup

# Create resource group and capture resource group identifier
ResourceGroupId=$(az group create --name $AzureResourceGroup --query &quot;id&quot; --output tsv)

# Generate Unique ID based on ResourceGroupId
UniqueId=$(echo -n $ResourceGroupId | md5sum | cut -c-13)

# Create ServiceBus and Queue
ServiceBusNamespace=&quot;sbdemo0001$UniqueId&quot;
QueueName=&quot;demoqueue&quot;
echo &quot;Going to create ServiceBus $ServiceBusNamespace and Queue $QueueName&quot;
AzureServiceBusId=$(az servicebus namespace create --name $ServiceBusNamespace --sku Basic --query id -o tsv)
AzureServiceBusQueueId=$(az servicebus queue create --name $QueueName --namespace-name $ServiceBusNamespace --default-message-time-to-live P0Y0M0DT0H0M30S --query id -o tsv)

# Fetch ServiceBus Connectionstring
PrimaryConnectionString=$(az servicebus namespace authorization-rule keys list \
    --namespace-name $ServiceBusNamespace \
    --name &quot;RootManageSharedAccessKey&quot; \
    --query &quot;primaryConnectionString&quot; \
    --output tsv)

echo &quot;$PrimaryConnectionString&quot;
</code></pre>
<blockquote>
<p><strong>Note</strong></p>
<p>The above snippet uses the default generated RootManageSharedAccessKey, which provides full access to your servicebus so use with caution!</p>
</blockquote>
<h2 id="how-does-the-azure-service-bus-sdk-work"><a href="#how-does-the-azure-service-bus-sdk-work">How does the Azure Service Bus SDK work?</a></h2>
<p>A message bus depends on both a sender and a receiver for communication. There are many examples in the <a href="https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/servicebus/Azure.Messaging.ServiceBus" class="external">official GitHub repo</a>, so I won't go into much more detail regarding the bus itself.</p>
<p>This demo will focus on SDK features, so I created an xUnit project that runs multiple scenarios. Since all scenarios require some logic to communicate with the bus, I made the following extension method to avoid unnecessary boilerplate. In a real-world application, sending and receiving messages using the <code>ServiceBusClient</code> would not be hidden behind a single extension method.</p>
<pre><code class="language-csharp">using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using FluentAssertions;

namespace Test.Integration;

public static partial class ServiceBusClientTestExtensions
{
    public static async Task RunScenario(this ServiceBusClient client, string queueName, string scenarioName)
    {
        var sender = client.CreateSender(queueName);
        var receiver = client.CreateReceiver(queueName);

        var message = $&quot;{scenarioName}-{DateTimeOffset.Now:s}&quot;;
        await sender.SendMessageAsync(new ServiceBusMessage(message));
        var receivedMessage = await receiver.ReceiveMessageAsync();

        receivedMessage.Body.ToString().Should().Be(message);
        await Task.Delay(TimeSpan.FromSeconds(35));
    }
}
</code></pre>
<p>The default method described by the docs is to pass the ServiceBus connection string to the ServiceBusClient and create it as needed.</p>
<pre><code class="language-csharp">public class UnitTest1
{
    private const string ConnectionString = &quot;&lt;your-connectionstring&gt;&quot;;
    private const string QueueName = &quot;demoqueue&quot;;

    [Fact]
    public async Task Test_Scenario01_UsePrimaryConnectionString()
    {
        await using var client = new ServiceBusClient(ConnectionString);
        var scenario = async () =&gt; await client.RunScenario(QueueName, nameof(Test_Scenario01_UsePrimaryConnectionString));
        await scenario();
    }
}
</code></pre>
<blockquote>
<p><strong>Warning</strong></p>
<p>Never store credentials in source control!</p>
</blockquote>
<h2 id="how-to-use-azure-sdk-without-connection-strings"><a href="#how-to-use-azure-sdk-without-connection-strings">How to use Azure SDK without connection strings?</a></h2>
<p>Working with secrets like our connection string adds extra overhead. Luckily, this incarnation of the Azure SDK embraces token authentication via TokenCredential. For this, we need to install the package <code>Azure.Identity</code>. Token authentication is the preferred way of authenticating with the Azure SDK.
The easiest way to use this SDK is by creating a <code>DefaultAzureCredential</code>, which attempts a couple of common authentication mechanisms in order:</p>
<ol>
<li>Environment</li>
<li>Managed Identity</li>
<li>Visual Studio</li>
<li>Azure CLI</li>
<li>Azure PowerShell</li>
</ol>
<pre><code class="language-csharp">public class UnitTest1
{
    private const string FullyQualifiedNamespace = &quot;&lt;your-namespace&gt;.servicebus.windows.net&quot;;
    private const string QueueName = &quot;demoqueue&quot;;

    [Fact]
    public async Task Test_Scenario02_UseFullyQualifiedNamespace()
    {
        await using var client = new ServiceBusClient(FullyQualifiedNamespace, new DefaultAzureCredential());
        var scenario = async () =&gt; await client.RunScenario(QueueName, nameof(Test_Scenario02_UseFullyQualifiedNamespace));
        await scenario();
    }
}
</code></pre>
<p>Seeing the snippet, you might wonder: how is providing <code>your-namespace.servicebus.windows.net</code> any better than a connection string? It's a good question; you still should not store something like that as plain text in source control. For one thing, it will probably be environment-specific. We still need it because our application needs an address to communicate with Azure. The big difference is that the address does not contain the key; the address alone is not enough to gain access to our resources.</p>
<p>Depending on how your organization handles roles and access management in Azure, you can now run this test and achieve the same result as before, without those pesky connection strings.
For example, since I created a service bus, my user is the owner of that bus. Being the service bus instance owner, however, is not enough to authenticate and successfully run our scenario; I also require one of the Service Bus-specific data roles. You can find a list of the supported roles under <code>Access Control (IAM)</code> in the portal. I opted for the <code>&quot;Azure Service Bus Data Owner&quot;</code> role for this tutorial.
The tricky bit is that role management in Azure is very granular. When I assign a role, I need to select a scope:</p>
<ul>
<li>subscription</li>
<li>resourceGroup</li>
<li>resource (i.e. ServiceBusNamespace)</li>
<li>child resource (i.e. queue)</li>
</ul>
<p>Scopes are inherited, so if I assign my user a role on a resource group, all applicable resources in that resource group grant me the same access.</p>
<p>We can update our Azure CLI script to provide the logged-in user access to the resource.</p>
<pre><code class="language-sh"># Assign Role &quot;Azure Service Bus Data Owner&quot; for the current user
UserIdentity=$(az ad signed-in-user show --query objectId -o tsv)
az role assignment create --assignee $UserIdentity --role &quot;Azure Service Bus Data Owner&quot; --scope $AzureServiceBusId
</code></pre>
<p>Now you know why the previous script captured the AzureServiceBusId ;-)</p>
<p>One thing to note is that DefaultAzureCredential's intended use is to simplify getting started with development. In a real-world application, you would more likely need a custom ChainedTokenCredential that uses ManagedIdentityCredential in production and AzureCliCredential during development.</p>
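<p>Sketched as code, such a credential chain might look like this (the <code>isDevelopment</code> flag and class name are illustrative, not part of the article's sample project):</p>
<pre><code class="language-csharp">using Azure.Core;
using Azure.Identity;

public static class CredentialFactory
{
    // ChainedTokenCredential tries each credential in order until one succeeds:
    // Managed Identity in production, the Azure CLI login during development.
    public static TokenCredential Create(bool isDevelopment)
    {
        if (isDevelopment)
        {
            return new AzureCliCredential();
        }

        return new ChainedTokenCredential(
            new ManagedIdentityCredential(),
            new AzureCliCredential());
    }
}
</code></pre>
<p>You could then pass the result to the client constructor, for example <code>new ServiceBusClient(FullyQualifiedNamespace, CredentialFactory.Create(isDevelopment))</code>.</p>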
<h2 id="how-can-i-use-the-azure-sdk-with-dependency-injection"><a href="#how-can-i-use-the-azure-sdk-with-dependency-injection">How can I use the Azure SDK with Dependency Injection?</a></h2>
<p>One thing that always bothered me with the code I have shown so far is creating clients on the fly. I prefer to receive my service bus client from the dependency injection container. Discovering that this was a viable solution caused me to submit that PR to the Azure SDK repo. The team had already provided the normal ServiceBusClient, so I recreated the extension method to make ServiceBusAdministrationClient available via DI. It's time to install our third NuGet package, <code>Microsoft.Extensions.Azure</code> which provides the necessary bits.</p>
<p>After installing the package, we get the <code>AddAzureClients</code> extension method on IServiceCollection. It provides access to the <code>AzureClientFactoryBuilder</code> on which we can register everything Azure SDK related. In the case of ServiceBus we get <code>AddServiceBusClient</code> and <code>AddServiceBusClientWithNamespace</code>. I like that these methods are much more explicit than the constructor.</p>
<pre><code class="language-csharp">public class UnitTest1
{
    private const string FullyQualifiedNamespace = &quot;&lt;your-namespace&gt;.servicebus.windows.net&quot;;
    private const string ConnectionString = &quot;&lt;your-connectionstring&gt;&quot;;
    private const string QueueName = &quot;demoqueue&quot;;

    [Fact]
    public async Task Test_Scenario03_UseDependencyInjectionWithPrimaryConnectionString()
    {
        var services = new ServiceCollection();
        services.AddAzureClients(builder =&gt; {
            builder.AddServiceBusClient(ConnectionString);
        });
        var serviceProvider = services.BuildServiceProvider();
        var client = serviceProvider.GetRequiredService&lt;ServiceBusClient&gt;();
        var scenario = async () =&gt; await client.RunScenario(QueueName, nameof(Test_Scenario03_UseDependencyInjectionWithPrimaryConnectionString));
        await scenario();
    }

    [Fact]
    public async Task Test_Scenario04_UseDependencyInjectionWithFullyQualifiedNamespace()
    {
        var services = new ServiceCollection();
        services.AddAzureClients(builder =&gt; {
            builder.AddServiceBusClientWithNamespace(FullyQualifiedNamespace);
        });
        var serviceProvider = services.BuildServiceProvider();
        var client = serviceProvider.GetRequiredService&lt;ServiceBusClient&gt;();
        var scenario = async () =&gt; await client.RunScenario(QueueName, nameof(Test_Scenario04_UseDependencyInjectionWithFullyQualifiedNamespace));
        await scenario();
    }
}
</code></pre>
<p>You might wonder why the <code>FullyQualifiedNamespace</code> variant does not need credentials this time around. That's because the Azure SDK can take care of this by default. As mentioned in the previous section, <code>DefaultAzureCredential</code> is the easiest way to hit the ground running. There are two ways to customize this behaviour: we can either provide a default credential for all Azure clients or set one on a per-client basis.</p>
<pre><code class="language-csharp">public class UnitTest1
{
    private const string FullyQualifiedNamespace = &quot;&lt;your-namespace&gt;.servicebus.windows.net&quot;;
    private const string ConnectionString = &quot;&lt;your-connectionstring&gt;&quot;;
    private const string QueueName = &quot;demoqueue&quot;;

    [Fact]
    public async Task Test_Scenario05_DependencyInjectionChangeDefaultToken()
    {
        var services = new ServiceCollection();
        services.AddAzureClients(builder =&gt; {
            builder.AddServiceBusClientWithNamespace(FullyQualifiedNamespace);
            
            builder.UseCredential(new ManagedIdentityCredential());
        });
        var serviceProvider = services.BuildServiceProvider();
        var client = serviceProvider.GetRequiredService&lt;ServiceBusClient&gt;();
        var scenario = async () =&gt; await client.RunScenario(QueueName, nameof(Test_Scenario05_DependencyInjectionChangeDefaultToken));
        await scenario.Should().ThrowAsync&lt;CredentialUnavailableException&gt;();
    }

    [Fact]
    public async Task Test_Scenario06_DependencyInjectionChangeDefaultTokenOnClientLevel()
    {
        var services = new ServiceCollection();
        services.AddAzureClients(builder =&gt; {
            builder.AddServiceBusClientWithNamespace(FullyQualifiedNamespace)
                .WithCredential(new AzureCliCredential());
            
            builder.UseCredential(new ManagedIdentityCredential());
        });
        var serviceProvider = services.BuildServiceProvider();
        var client = serviceProvider.GetRequiredService&lt;ServiceBusClient&gt;();
        var scenario = async () =&gt; await client.RunScenario(QueueName, nameof(Test_Scenario06_DependencyInjectionChangeDefaultTokenOnClientLevel));
        await scenario();
    }
}
</code></pre>
<p>The first sample will not work since I have not set up ManagedIdentity in my environment. The second one also sets ManagedIdentityCredential as the default credential. However, since I set up AzureCliCredential on the client registration, it trumps the global one.</p>
<h2 id="can-we-have-different-client-config-when-using-the-azure-sdk"><a href="#can-we-have-different-client-config-when-using-the-azure-sdk">Can we have different client config when using the Azure SDK?</a></h2>
<p>Here is where things get cool. When you register a client with the SDK, a client named <code>Default</code> gets registered. If, for example, you retrieve <code>ServiceBusClient</code> from dependency injection, the AzureClientFactory creates this client for you.</p>
<p>In the case of servicebus, you might have multiple different namespaces registered. Every registration provides access to a method <code>WithName</code>. To use named clients in your code, replace <code>ServiceBusClient</code> with <code>IAzureClientFactory&lt;ServiceBusClient&gt;</code>.</p>
<pre><code class="language-csharp">public class UnitTest1
{
    private const string FullyQualifiedNamespace = &quot;&lt;your-namespace&gt;.servicebus.windows.net&quot;;
    private const string ConnectionString = &quot;&lt;your-connectionstring&gt;&quot;;
    private const string QueueName = &quot;demoqueue&quot;;

    [Fact]
    public async Task Test_Scenario07_MultipleClients()
    {
        var services = new ServiceCollection();
        services.AddAzureClients(builder =&gt;
        {
            builder.AddServiceBusClient(ConnectionString);

            builder.AddServiceBusClientWithNamespace(FullyQualifiedNamespace)
                .WithName(&quot;OtherClient&quot;);
        });
        var serviceProvider = services.BuildServiceProvider();
        var clientFactory = serviceProvider.GetRequiredService&lt;IAzureClientFactory&lt;ServiceBusClient&gt;&gt;();
        
        var clientDefault = clientFactory.CreateClient(&quot;Default&quot;);
        var scenarioDefaultClient = async () =&gt; await clientDefault.RunScenario(QueueName, nameof(Test_Scenario07_MultipleClients) + &quot;A&quot;);
        await scenarioDefaultClient();
        
        var otherClient = clientFactory.CreateClient(&quot;OtherClient&quot;);
        var scenarioOtherClient = async () =&gt; await otherClient.RunScenario(QueueName, nameof(Test_Scenario07_MultipleClients) + &quot;B&quot;);
        await scenarioOtherClient();
    }
}
</code></pre>
<h2 id="can-i-use-configuration-to-create-azure-sdk-clients"><a href="#can-i-use-configuration-to-create-azure-sdk-clients">Can I use configuration to create Azure SDK clients?</a></h2>
<p>If I had one criticism of the SDK, it would be that the extension methods require the address right there in the method call. To be fair, there is an overload that accepts IConfiguration, but that leaves all validation up to the SDK.</p>
<p>In my <a href="https://kaylumah.nl/2021/11/29/validated-strongly-typed-ioptions.html">previous article on validating IOptions</a>, I wrote about a way to make sure all configuration for my app is valid.</p>
<p>That approach, of course, requires access to the dependency injection container. Luckily there is an additional method available.</p>
<pre><code class="language-csharp">public class UnitTest1
{
    private const string FullyQualifiedNamespace = &quot;&lt;your-namespace&gt;.servicebus.windows.net&quot;;
    private const string QueueName = &quot;demoqueue&quot;;

    [Fact]
    public async Task Test_Scenario08_StronglyTypedOptions()
    {
        var services = new ServiceCollection();
        services.Configure&lt;DemoOptions&gt;(options =&gt;
        {
            options.ServiceBusNamespace = FullyQualifiedNamespace;
        });
        services.AddAzureClients(builder =&gt;
        {
            builder.AddClient&lt;ServiceBusClient, ServiceBusClientOptions&gt;((options, credential, provider) =&gt;
            {
                var demoOptions = provider.GetRequiredService&lt;IOptions&lt;DemoOptions&gt;&gt;();
                return new ServiceBusClient(demoOptions.Value.ServiceBusNamespace, credential, options);
            });
        });
        var serviceProvider = services.BuildServiceProvider();
        var client = serviceProvider.GetRequiredService&lt;ServiceBusClient&gt;();
        var scenario = async () =&gt; await client.RunScenario(QueueName, nameof(Test_Scenario08_StronglyTypedOptions));
        await scenario();
    }
}
</code></pre>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>A single blog post is too short to provide an overview of everything the Azure SDK offers. I like that authentication and interoperability with the dependency injection container are baked into the SDK. I have not even touched on diagnostics and testability, which are both great topics built into the entire SDK. Who knows, perhaps that is a topic for another time.</p>
<p>As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/WorkingWithAzureSdkForDotnet" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="additional-resources"><a href="#additional-resources">Additional Resources</a></h2>
<ul>
<li><a href="https://docs.microsoft.com/en-us/dotnet/azure/sdk/azure-sdk-for-dotnet" class="external">Azure SDK for Dotnet on Microsoft Docs</a></li>
<li><a href="https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/servicebus/Azure.Messaging.ServiceBus/README.md" class="external">Azure.Messaging.ServiceBus on GitHub</a></li>
<li><a href="https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/core/Azure.Core/README.md" class="external">Azure.Core on GitHub</a></li>
<li><a href="https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/identity/Azure.Identity/README.md" class="external">Azure.Identity on GitHub</a></li>
<li><a href="https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/extensions/Microsoft.Extensions.Azure/README.md" class="external">Microsoft.Extensions.Azure on GitHub</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/service-bus-messaging" class="external">Service bus on Microsoft Docs</a></li>
<li><a href="https://devblogs.microsoft.com/azure-sdk/best-practices-for-using-azure-sdk-with-asp-net-core" class="external">Best practices Azure SDK</a></li>
<li><a href="https://docs.microsoft.com/en-gb/cli/azure/use-cli-effectively" class="external">Best practices Azure CLI</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2022/01/31/improve-code-quality-with-bannedsymbolanalyzers.html</id>
    <title type="text"><![CDATA[Improve Code Quality with Banned Symbol Analyzers]]></title>
    <summary type="text"><![CDATA[Learn how a simple Roslyn Analyzer can improve code consistency]]></summary>
    <published>2022-01-31T21:45:00+01:00</published>
    <updated>2022-01-31T21:45:00+01:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2022/01/31/improve-code-quality-with-bannedsymbolanalyzers.html" />
    <category term="C#" />
    <category term="Code Quality" />
<content type="html"><![CDATA[<p>There are many aspects to code quality; testability and consistency come to mind. I don't find it hard to imagine that the longer a project runs and the more engineers work on it, the more inconsistencies creep into your codebase.</p>
<p>Thanks to the combination of .editorconfig and Roslyn Analyzers, managing this for your team has become easier. Recently, however, I needed to create a custom check for my code. A quick Google search pointed me to <a href="https://www.meziantou.net/the-roslyn-analyzers-i-use.htm" class="external">an article from Meziantou</a>, which mentioned the &quot;Banned Symbol&quot; Roslyn Analyzer.</p>
<h2 id="scenario"><a href="#scenario">Scenario</a></h2>
<p>Before diving into this Analyzer, let's take a few steps back and do a quick scenario sketch. Imagine you work for a company that allows funds transfer between two parties. Since your company needs to make money, you charge a small fee.</p>
<pre><code class="language-cs">public class FeeCalculator : IFeeCalculator
{
    private const decimal FeePercentage = 0.12M;
    private const decimal MinimumCharge = 0.50M;
    private const decimal PriorityFeePercentage = 0.25M;
    private const decimal PriorityMinimumCharge = 7.50M;

    public decimal Calculate(decimal baseAmount, bool isPriority = false)
    {
        if (isPriority)
        {
            return InternalCalculate(baseAmount, PriorityFeePercentage, PriorityMinimumCharge);
        }
        return InternalCalculate(baseAmount, FeePercentage, MinimumCharge);
    }

    private static decimal InternalCalculate(decimal amount, decimal percentage, decimal minimumFee)
    {
        var calculatedFee = amount * (percentage / 100);
        if (calculatedFee &lt; minimumFee)
        {
            return minimumFee;
        }
        return calculatedFee;
    }
}
</code></pre>
<p>The company decided to offer &quot;Monday Madness&quot; at a heavily discounted fee as a special offer.
The implementation would look similar to the snippet below.</p>
<pre><code class="language-cs">public class DatedFeeCalculator : IFeeCalculator
{
    private const decimal DiscountedFeePercentage = 0.07M;
    private const decimal FeePercentage = 0.12M;
    private const decimal MinimumCharge = 0.50M;
    private const decimal PriorityFeePercentage = 0.25M;
    private const decimal PriorityMinimumCharge = 7.50M;

    public decimal Calculate(decimal baseAmount, bool isPriority = false)
    {
        if (isPriority)
        {
            return InternalCalculate(baseAmount, PriorityFeePercentage, PriorityMinimumCharge);
        }

        if (DateTime.Now.DayOfWeek == DayOfWeek.Monday)
        {
            return InternalCalculate(baseAmount, DiscountedFeePercentage, MinimumCharge);
        }
        return InternalCalculate(baseAmount, FeePercentage, MinimumCharge);
    }

    private static decimal InternalCalculate(decimal amount, decimal percentage, decimal minimumFee)
    {
        var calculatedFee = amount * (percentage / 100);
        if (calculatedFee &lt; minimumFee)
        {
            return minimumFee;
        }
        return calculatedFee;
    }
}
</code></pre>
<p>Take notice of line <code>16</code>, which now uses <code>DateTime.Now</code>; the problem that arises is: how do we test this code?</p>
<h2 id="why-testing-datetime-is-hard"><a href="#why-testing-datetime-is-hard">Why testing DateTime is hard</a></h2>
<p>The following test only results in a green build on a Monday, which is excellent if we release every week just before &quot;Monday Madness&quot; but not so great on every other day.</p>
<pre><code class="language-cs">[Fact]
public void Test2_Discounted()
{
    IFeeCalculator calculator = new DatedFeeCalculator();
    var fee = calculator.Calculate(10_000M, false);
    fee.Should().Be(7.00M); // note: only on Mondays it's 7.00; every other day it's 12.00
}
</code></pre>
<p>We could mitigate this by skipping the offending test on every day that is not Monday, like this; problem solved, right?</p>
<pre><code class="language-cs">[SkippableFact]
public void Test2_Discounted_Alternative()
{
    Skip.If(DateTimeOffset.Now.DayOfWeek != DayOfWeek.Monday);
    IFeeCalculator calculator = new DatedFeeCalculator();
    var fee = calculator.Calculate(10_000M, false);
    fee.Should().Be(7.00M);
}
</code></pre>
<p>Well, no, not really. What we want is to decouple our code from statics like DateTime.Now by putting them behind an interface. By providing an interface implementation, we can <a href="https://docs.microsoft.com/en-us/dotnet/core/testing/unit-testing-best-practices#stub-static-references" class="external">stub a static reference</a>.
In an ideal world, this interface would already exist, similar to ILogger in Microsoft.Extensions. For some background on why it does not yet exist, see this <a href="https://github.com/dotnet/runtime/issues/36617" class="external">GitHub Issue</a>.</p>
<h2 id="updated-scenario"><a href="#updated-scenario">Updated Scenario</a></h2>
<p>In its simplest form, the SystemClock can look like the snippet below.</p>
<pre><code class="language-cs">public interface ISystemClock
{
    DateTimeOffset Now { get; }
}

public class SystemClock : ISystemClock
{
    public DateTimeOffset Now =&gt; DateTimeOffset.Now;
}
</code></pre>
<p>Our updated scenario looks like this:</p>
<pre><code class="language-cs">public class SystemClockFeeCalculator : IFeeCalculator
{
    private const decimal DiscountedFeePercentage = 0.07M;
    private const decimal FeePercentage = 0.12M;
    private const decimal MinimumCharge = 0.50M;
    private const decimal PriorityFeePercentage = 0.25M;
    private const decimal PriorityMinimumCharge = 7.50M;

    private readonly ISystemClock systemClock;

    public SystemClockFeeCalculator(ISystemClock systemClock)
    {
        this.systemClock = systemClock;
    }


    public decimal Calculate(decimal baseAmount, bool isPriority = false)
    {
        if (isPriority)
        {
            return InternalCalculate(baseAmount, PriorityFeePercentage, PriorityMinimumCharge);
        }

        if (systemClock.Now.DayOfWeek == DayOfWeek.Monday)
        {
            return InternalCalculate(baseAmount, DiscountedFeePercentage, MinimumCharge);
        }
        return InternalCalculate(baseAmount, FeePercentage, MinimumCharge);
    }

    private static decimal InternalCalculate(decimal amount, decimal percentage, decimal minimumFee)
    {
        var calculatedFee = amount * (percentage / 100);
        if (calculatedFee &lt; minimumFee)
        {
            return minimumFee;
        }
        return calculatedFee;
    }
}
</code></pre>
<p>With the use of a TestSystemClock or a Moq-based mock, we can test our behaviour on every day of the week. See, we are improving quality already. In a previous article, <a href="https://kaylumah.nl/2021/04/11/an-approach-to-writing-mocks.html">&quot;Adventures with Mock&quot;</a>, you can read more about my preferred way of creating mocks.</p>
<pre><code class="language-cs">public sealed class SystemClockMock : Mock&lt;ISystemClock&gt;
{
    public SystemClockMock SetupSystemTime(DateTimeOffset systemTime)
    {
        Setup(x =&gt; x.Now).Returns(systemTime);
        return this;
    }
}
</code></pre>
<p>Thanks to <code>SystemClockMock</code> I can now change the current date for the test.</p>
<pre><code class="language-cs">[Fact]
public void Test3_FakeClock_Monday()
{
    var clock = new SystemClockMock()
        .SetupSystemTime(new DateTimeOffset(new DateTime(2022, 1, 31)));
    IFeeCalculator calculator = new SystemClockFeeCalculator(clock.Object);
    var fee = calculator.Calculate(10_000M, false);
    fee.Should().Be(7.00M);
}

[Fact]
public void Test3_FakeClock_Tuesday()
{
    var clock = new SystemClockMock()
        .SetupSystemTime(new DateTimeOffset(new DateTime(2022, 2, 1)));
    IFeeCalculator calculator = new SystemClockFeeCalculator(clock.Object);
    var fee = calculator.Calculate(10_000M, false);
    fee.Should().Be(12.00M);
}
</code></pre>
<h2 id="force-wrapper-over-static-reference"><a href="#force-wrapper-over-static-reference">Force Wrapper over Static Reference</a></h2>
<p>Now that we have our SystemClock, how do we make sure every dev in our team uses it over just calling <code>DateTimeOffset.Now</code>?</p>
<p>Finally, our Roslyn Analyzer comes into play. We can use <a href="https://github.com/dotnet/roslyn-analyzers/blob/main/src/Microsoft.CodeAnalysis.BannedApiAnalyzers/BannedApiAnalyzers.Help.md" class="external">Microsoft.CodeAnalysis.BannedApiAnalyzers</a>, which triggers the build warning <code>RS0030</code>. I prefer to enable these warnings on every project under src, so I use a Directory.Build.props file to install the analyzer via NuGet.</p>
<pre><code class="language-xml">&lt;Project&gt;
  &lt;Import Project=&quot;../Directory.Build.props&quot; /&gt;
  &lt;ItemGroup&gt;
    &lt;PackageReference Include=&quot;Microsoft.CodeAnalysis.BannedApiAnalyzers&quot; Version=&quot;3.3.2&quot;&gt;
      &lt;PrivateAssets&gt;all&lt;/PrivateAssets&gt;
      &lt;IncludeAssets&gt;runtime; build; native; contentfiles; analyzers&lt;/IncludeAssets&gt;
    &lt;/PackageReference&gt;
  &lt;/ItemGroup&gt;
  &lt;ItemGroup&gt;
    &lt;AdditionalFiles Include=&quot;$(MSBuildThisFileDirectory)/BannedSymbols.txt&quot; /&gt;
  &lt;/ItemGroup&gt;
&lt;/Project&gt;
</code></pre>
<p>All that remains is to create a file called <code>BannedSymbols.txt</code> with the following content:</p>
<blockquote>
<p><strong>note</strong>: I also blocked the use of DateTime in favour of DateTimeOffset.</p>
</blockquote>
<pre><code class="language-txt">T:System.DateTime;Always use System.DateTimeOffset over System.DateTime
P:System.DateTimeOffset.Now;Use ISystemClock.Now instead
</code></pre>
<p>From this point on every use of <code>DateTimeOffset.Now</code> results in the following error: <code>error RS0030: The symbol 'DateTimeOffset.Now' is banned in this project: Use ISystemClock.Now instead</code>. Which in my opinion is pretty cool :)</p>
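<p>One escape hatch worth knowing: the single call site that legitimately needs the real clock, the <code>SystemClock</code> implementation itself, can suppress the diagnostic locally with a pragma. A minimal sketch:</p>
<pre><code class="language-cs">public class SystemClock : ISystemClock
{
    // The wrapper is the one allowed call site, so we disable the
    // banned-symbol warning for this line only.
#pragma warning disable RS0030
    public DateTimeOffset Now =&gt; DateTimeOffset.Now;
#pragma warning restore RS0030
}
</code></pre>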
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>Even if the system used in today's example is fictional, I think BannedApiAnalyzers is a compelling package to include in your toolbelt. At the very least, I will use it to enforce DateTimeOffset over DateTime. Situation allowing, I will also push my wrappers over static references to improve testability.</p>
<p>As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/ImproveCodeQualityWithBannedSymbolAnalyzers" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="additional-resources"><a href="#additional-resources">Additional Resources</a></h2>
<ul>
<li><a href="https://github.com/dotnet/roslyn-analyzers" class="external">Roslyn Analyzers</a></li>
<li><a href="https://docs.microsoft.com/en-us/visualstudio/code-quality/roslyn-analyzers-overview" class="external">Visual Studio Code Quality</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2021/11/29/validated-strongly-typed-ioptions.html</id>
    <title type="text"><![CDATA[Validated Strongly Typed IOptions]]></title>
    <summary type="text"><![CDATA[Find configuration errors early with data annotations validation for IOptions in .NET]]></summary>
    <published>2021-11-29T19:00:00+01:00</published>
    <updated>2021-11-29T19:00:00+01:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2021/11/29/validated-strongly-typed-ioptions.html" />
    <category term="C#" />
    <category term="Configuration" />
<content type="html"><![CDATA[<p>Almost every project will have some settings that are configured differently per environment. Chapter three of &quot;The Twelve-Factor App&quot; <a href="https://12factor.net/config" class="external">explains</a> why separating configuration from code is a good idea. In <code>.NET</code>, we use the <code>IConfigurationBuilder</code> to manage our configuration. An <code>IOptions&lt;&gt;</code> is then used to make that configuration available as a strongly typed object in our applications.</p>
<p>As I understand it, the <code>configuration</code> concept in <code>.NET</code> is the combination of different configuration sources, called configuration providers, resulting in a single combined configuration. In contrast, the <code>options</code> concept provides access to <code>configuration</code> from our application code. I've attempted to illustrate it with the image below.</p>
<p><img src="https://kaylumah.nl/assets/images/posts/20211129/validated-strongly-typed-ioptions/001_configuration_sources.svg" width="1040" height="920" alt="Microsoft Extensions - IConfiguration - Different configuration sources" /></p>
<h2 id="configuration-in.net"><a href="#configuration-in.net">Configuration in .NET</a></h2>
<p>Technically the image above is an over-simplification. In reality, you use an <code>IConfigurationBuilder</code> to which different providers are added, and the configuration block in the middle is the merged build result of the configuration builder. In fact, you get a preconfigured configuration builder every time you use the <code>ASP.NET</code> Web templates. You get a <a href="https://github.com/dotnet/runtime/blob/12a8819eee9865eb38bca6c05fdece1053102854/src/libraries/Microsoft.Extensions.Hosting/src/Host.cs#L53" class="external">default HostBuilder</a> that sets up an <a href="https://github.com/dotnet/runtime/blob/12a8819eee9865eb38bca6c05fdece1053102854/src/libraries/Microsoft.Extensions.Hosting/src/HostBuilder.cs#L124" class="external">IHost</a>. This default builder also takes care of the <a href="https://github.com/dotnet/runtime/blob/12a8819eee9865eb38bca6c05fdece1053102854/src/libraries/Microsoft.Extensions.Hosting/src/HostingHostBuilderExtensions.cs#L188" class="external">default configuration</a>.</p>
<p>The default configuration adds, in order:</p>
<ul>
<li>appsettings.json</li>
<li>appsettings.Environment.json</li>
<li>user secrets (if the environment is development)</li>
<li>environment variables</li>
<li>command-line arguments</li>
</ul>
<p>The priority of settings is the reverse of the order in which they are added to the builder. Passing a setting via the <code>command line</code> will always win over a setting in the <code>appsettings.json</code> file. Fun fact: there are two configurations in <code>ASP.NET</code>. You have the <code>AppConfiguration</code> we just discussed, and you have the <code>HostConfiguration</code>. The <code>HostConfiguration</code> is used to set variables like <code>DOTNET_ENVIRONMENT</code>, which in turn determines which <code>appsettings.json</code> and user secrets get loaded. Through a <code>ChainedConfiguration</code>, the entire <code>HostConfiguration</code> is also available as part of the <code>AppConfiguration</code>.</p>
<p>Let's look at an example. Take the following JSON configuration:</p>
<pre><code class="language-json">{
    &quot;MySample&quot;: {
        &quot;MyText&quot;: &quot;Hello World!&quot;,
        &quot;MyCollection&quot;: [
            {
                &quot;MyOtherText&quot;: &quot;Goodbye Cruel World!&quot;
            }
        ]
    }
}
</code></pre>
<p>That would result in the following two settings being present in our IConfiguration.</p>
<ul>
<li><code>MySample:MyText</code></li>
<li><code>MySample:MyCollection:0:MyOtherText</code></li>
</ul>
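<p>A quick way to convince yourself is to read the flattened keys back through the configuration indexer; a minimal sketch assuming the JSON above is loaded from a file:</p>
<pre><code class="language-csharp">var configuration = new ConfigurationBuilder()
    .AddJsonFile(&quot;appsettings.json&quot;) // contains the JSON shown above
    .Build();

// Hierarchy is flattened into colon-separated keys; array entries get an index segment
var myText = configuration[&quot;MySample:MyText&quot;]; // &quot;Hello World!&quot;
var myOtherText = configuration[&quot;MySample:MyCollection:0:MyOtherText&quot;]; // &quot;Goodbye Cruel World!&quot;
</code></pre>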
<p>With this bit of knowledge, you can override any setting in any provider you can imagine. Visually it would look something like the image below. You can provide sensible defaults in appsettings.json and overwrite values as needed.</p>
<p><img src="https://kaylumah.nl/assets/images/posts/20211129/validated-strongly-typed-ioptions/002_configuration_dotnet.svg" width="551" height="490" alt="Microsoft Extensions - IConfiguration - configuration builder resolve order" /></p>
<blockquote>
<p>As pointed out in the &quot;The Twelve-Factor App&quot; article linked previously, adding configuration files per environment does not scale. I typically end up with one appsettings.json for the defaults and an appsettings.Production.json that gets transformed in my CICD pipeline.</p>
</blockquote>
<p>You can <a href="https://andrewlock.net/exploring-dotnet-6-part-1-looking-inside-configurationmanager-in-dotnet-6/" class="external">read about changes to IConfiguration</a> in <code>.NET6</code> in a post from Andrew Lock. It also contains a different visual representation of configuration, which neatly displays the merging of the different levels.</p>
<h2 id="options-in.net"><a href="#options-in.net">Options in .NET</a></h2>
<p>According to the <a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/options?view=aspnetcore-6.0#bind-hierarchical-configuration" class="external">Microsoft Docs</a>, the options pattern is the preferred way to read related configuration values. It comes in three different flavours: <code>IOptions&lt;&gt;</code>, <code>IOptionsSnapshot&lt;&gt;</code> and <code>IOptionsMonitor&lt;&gt;</code>. Probably the most used is the default <code>IOptions</code>, with the drawback that it does not pick up configuration changes after your app starts. Others have taken it upon themselves to explain the differences between the interfaces, for example <a href="https://andrewlock.net/creating-singleton-named-options-with-ioptionsmonitor" class="external">Andrew Lock</a> and <a href="https://khalidabuhakmeh.com/aspnet-core-ioptions-configuration" class="external">Khalid Abuhakmeh</a>. For this post, I will keep it simple with the regular <code>IOptions</code>.</p>
<p>A typical registration of configuration would look like this:</p>
<pre><code class="language-csharp">public static partial class ServiceCollectionExtensions
{
    public static IServiceCollection AddDemo(this IServiceCollection services, IConfiguration configuration)
    {
        services.Configure&lt;DemoOptions&gt;(configuration.GetSection(DemoOptions.DefaultConfigurationSectionName));
        return services;
    }
}
</code></pre>
<blockquote>
<p>This snippet requires the <code>Microsoft.Extensions.Options.ConfigurationExtensions</code> NuGet package to work</p>
</blockquote>
<p>Looking at our dependency injection container right after this registration, we see more than just <code>IOptions</code>. We have a total of seven registrations at this point.</p>
<pre><code class="language-output">ServiceType = 'Microsoft.Extensions.Options.IOptions`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.UnnamedOptionsManager`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsSnapshot`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsManager`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsMonitor`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsMonitor`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsFactory`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsFactory`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsMonitorCache`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsCache`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsChangeTokenSource`1[Test.Unit.DemoOptions]' ImplementationType = ''
ServiceType = 'Microsoft.Extensions.Options.IConfigureOptions`1[Test.Unit.DemoOptions]' ImplementationType = ''
</code></pre>
<p>The problem with the above approach is that it assumes the configuration exists at a predefined section, which is not very flexible. An alternative approach to register <code>IOptions</code> is the use of an <code>Action&lt;&gt;</code>.</p>
<pre><code class="language-csharp">public static partial class ServiceCollectionExtensions
{
    public static IServiceCollection AddExample(this IServiceCollection services, Action&lt;ExampleOptions&gt; configureDelegate)
    {
        services.Configure(configureDelegate);
        return services;
    }
}
</code></pre>
<p>With this approach, we get a total of six DI registrations.</p>
<pre><code class="language-output">ServiceType = 'Microsoft.Extensions.Options.IOptions`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.UnnamedOptionsManager`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsSnapshot`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsManager`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsMonitor`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsMonitor`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsFactory`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsFactory`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IOptionsMonitorCache`1[TOptions]' ImplementationType = 'Microsoft.Extensions.Options.OptionsCache`1[TOptions]'
ServiceType = 'Microsoft.Extensions.Options.IConfigureOptions`1[Test.Unit.ExampleOptions]' ImplementationType = ''
</code></pre>
<p>The only difference is that we do not get the <code>IOptionsChangeTokenSource</code>. To be most flexible, you can combine both techniques like this.</p>
<pre><code class="language-csharp">public static partial class ServiceCollectionExtensions
{
    public static IServiceCollection AddExample(this IServiceCollection services, IConfiguration config)
    {
        services.AddExample(options =&gt; config.GetSection(ExampleOptions.DefaultConfigurationSectionName).Bind(options));
        return services;
    }

    public static IServiceCollection AddExample(this IServiceCollection services, Action&lt;ExampleOptions&gt; configureDelegate)
    {
        services.Configure(configureDelegate);
        return services;
    }
}
</code></pre>
<h2 id="validated-options"><a href="#validated-options">Validated Options</a></h2>
<p>Now that we have covered the basics, I can move on to the focal point of this blog post. As you can imagine, overlaying the different configuration sources does not guarantee a valid result from the point of view of your application. Worse, since the number of configuration sources can differ between environments, configuration issues can surface very late in your CICD pipeline. For example, if you use Azure Key Vault as a configuration provider, settings might be changed by anyone with access to the vault.</p>
<p>In my article <a href="https://kaylumah.nl/2021/05/23/generate-csharp-client-for-openapi.html">Generate C# client for OpenAPI</a>, I used HttpClient to call a generated OpenAPI service. HTTP is the perfect example for validating configuration. In our API example, we will likely have different base URLs per environment. If we represent a URL as a string in configuration, it is perfectly possible to enter &quot;not-an-url&quot; as its value, which causes your application to crash and burn.</p>
<p>As I see it, there are two distinct ways configuration can fail.</p>
<h3 id="missing-configuration-sections"><a href="#missing-configuration-sections">Missing Configuration Sections</a></h3>
<p>The first variant is binding configuration at a section that does not exist. That is because <code>configuration.GetSection</code> does not throw for a section that does not exist; it simply <a href="https://docs.microsoft.com/en-us/dotnet/api/system.configuration.configuration.getsection?view=dotnet-plat-ext-6.0" class="external">returns an empty section</a>. Oddly enough, when configuration fails to bind, you still get an <code>IOptions&lt;TOptions&gt;</code>, just with null values.</p>
<p>When specifying a section by name, I expect that section to exist. Therefore I want my application to not boot with missing configuration sections. The following extension method takes care of that.</p>
<pre><code class="language-cs">public static IConfigurationSection GetExistingSectionOrThrow(this IConfiguration configuration, string key)
{
    var configurationSection = configuration.GetSection(key);

    if (!configurationSection.Exists())
    {
        throw configuration switch
        {
            IConfigurationRoot configurationIsRoot =&gt; new ArgumentException($&quot;Section with key '{key}' does not exist. Existing values are: {configurationIsRoot.GetDebugView()}&quot;, nameof(key)),
            IConfigurationSection configurationIsSection =&gt; new ArgumentException($&quot;Section with key '{key}' does not exist at '{configurationIsSection.Path}'. Expected configuration path is '{configurationSection.Path}'&quot;, nameof(key)),
            _ =&gt; new ArgumentException($&quot;Failed to find configuration at '{configurationSection.Path}'&quot;, nameof(key))
        };
    }

    return configurationSection;
}
</code></pre>
<blockquote>
<p><strong>caution</strong>: <code>configurationIsRoot.GetDebugView()</code> prints all configuration settings and their values; if you have secrets, you should add log masking to prevent them from being logged.</p>
</blockquote>
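<p>On recent .NET versions (.NET 8 and later), <code>GetDebugView</code> has an overload that lets you transform each value before it is printed. A sketch; the rule for detecting sensitive keys is an assumption for illustration:</p>
<pre><code class="language-csharp">// Requires the GetDebugView overload introduced in .NET 8
string debugView = configurationRoot.GetDebugView(context =&gt;
{
    // Assumption for illustration: keys containing &quot;Secret&quot; or &quot;Password&quot; are sensitive
    bool isSensitive = context.Key.Contains(&quot;Secret&quot;, StringComparison.OrdinalIgnoreCase)
        || context.Key.Contains(&quot;Password&quot;, StringComparison.OrdinalIgnoreCase);
    return isSensitive ? &quot;***&quot; : context.Value;
});
</code></pre>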
<h3 id="dataannotations-validation"><a href="#dataannotations-validation">DataAnnotations Validation</a></h3>
<p>The second variant is the most likely to occur: settings are present but not valid in the context of the application. I recently browsed the Microsoft Docs after (again) losing time chasing configuration issues when I came across <code>IValidateOptions</code>. I also rediscovered <code>ValidateDataAnnotations</code> on the <code>OptionsBuilder</code>, which I had previously dismissed since it belongs to a different API (<code>AddOptions&lt;&gt;</code>) than the <code>Configure&lt;&gt;</code> APIs. With ReSharper by my side, I checked the implementation and discovered that it uses <code>DataAnnotationValidateOptions</code>, a class that implements <code>IValidateOptions</code>.</p>
<p>When consuming an <code>IOptions</code>, there are three hooks we can use: <code>IConfigureOptions</code>, <code>IPostConfigureOptions</code> and <code>IValidateOptions</code>. If you head back up to where I printed the dependency injection container, you see that every time you use <code>Configure&lt;&gt;</code>, you get an <code>IConfigureOptions</code>. I illustrated this process below: <code>IOptions</code> makes use of an <code>OptionsFactory</code>, and this factory goes through all registered &quot;option services&quot;.</p>
<p><img src="https://kaylumah.nl/assets/images/posts/20211129/validated-strongly-typed-ioptions/003_ioptions.svg" width="837" height="1674" alt="Microsoft Extensions - IConfiguration - options factory" /></p>
<p>You can add any number of implementations of these three interfaces. Implementations of the same interface execute in the order in which you register them. Registering an <code>IPostConfigureOptions</code> or <code>IValidateOptions</code> before the normal <code>IConfigureOptions</code> does not change the overall order: the factory runs through zero or more <code>IConfigureOptions</code>, then zero or more <code>IPostConfigureOptions</code>, and finally zero or more <code>IValidateOptions</code>, always in that order.</p>
<p>To demonstrate how this works, consider the following example:</p>
<pre><code class="language-csharp">public class ConfigureLibraryExampleServiceOptions : IConfigureOptions&lt;LibraryExampleServiceOptions&gt;, IPostConfigureOptions&lt;LibraryExampleServiceOptions&gt;, IValidateOptions&lt;LibraryExampleServiceOptions&gt;
{
    private readonly ILogger _logger;

    public ConfigureLibraryExampleServiceOptions(ILogger&lt;ConfigureLibraryExampleServiceOptions&gt; logger)
    {
        _logger = logger;
    }
    
    public void Configure(LibraryExampleServiceOptions options)
    {
        _logger.LogInformation(&quot;ConfigureExampleServiceOptions Configure&quot;);
    }

    public void PostConfigure(string name, LibraryExampleServiceOptions options)
    {
        _logger.LogInformation(&quot;ConfigureExampleServiceOptions PostConfigure&quot;);
    }

    public ValidateOptionsResult Validate(string name, LibraryExampleServiceOptions options)
    {
        _logger.LogInformation(&quot;ConfigureExampleServiceOptions ValidateOptionsResult&quot;);
        return ValidateOptionsResult.Skip;
    }
}
</code></pre>
<p>You might assume that this validation triggers the moment we resolve an <code>IOptions</code> from the DI container. Unfortunately, this is not the case; it only triggers when you access the <code>.Value</code> property.</p>
<pre><code class="language-csharp">var configuration = new ConfigurationBuilder()
    .AddInMemoryCollection(new Dictionary&lt;string, string&gt;() {
        [string.Join(&quot;:&quot;, LibraryExampleServiceOptions.DefaultConfigurationSectionName, nameof(LibraryExampleServiceOptions.BaseUrl))] = &quot;http://example.com&quot;
    })
    .Build();
var serviceProvider = new ServiceCollection()
    .AddLogging(builder =&gt; builder.AddConsole())
    .AddExampleLibrary(configuration)
    .BuildServiceProvider();

var logger = serviceProvider.GetRequiredService&lt;ILogger&lt;Program&gt;&gt;();
logger.LogInformation(&quot;Before retrieving IOptions&quot;);
var options = serviceProvider.GetRequiredService&lt;IOptions&lt;LibraryExampleServiceOptions&gt;&gt;();
logger.LogInformation(&quot;After retrieving IOptions; before IOptions.Value&quot;);
var optionsValue = options.Value;
logger.LogInformation(&quot;After IOptions.Value&quot;);

Console.ReadLine();
</code></pre>
<p>Which outputs:</p>
<pre><code class="language-output">info: Program[0]
      Before retrieving IOptions
info: Program[0]
      After retrieving IOptions; before IOptions.Value
info: Kaylumah.ValidatedStronglyTypedIOptions.Library.ConfigureLibraryExampleServiceOptions[0]
      ConfigureExampleServiceOptions Configure
info: Kaylumah.ValidatedStronglyTypedIOptions.Library.ConfigureLibraryExampleServiceOptions[0]
      ConfigureExampleServiceOptions PostConfigure
info: Kaylumah.ValidatedStronglyTypedIOptions.Library.ConfigureLibraryExampleServiceOptions[0]
      ConfigureExampleServiceOptions ValidateOptionsResult
info: Program[0]
      After IOptions.Value
</code></pre>
<p>Circling back to validation, I've created an extension method that registers <code>DataAnnotationValidateOptions</code> for us. One thing to note is that <code>IValidateOptions</code> works with named options, whereas the normal <code>IOptions</code> is unnamed. Microsoft solved this by providing a &quot;DefaultName&quot; for an options object, which is the empty string.</p>
<pre><code class="language-csharp">public static partial class ServiceCollectionExtensions
{
    public static IServiceCollection ConfigureWithValidation&lt;TOptions&gt;(this IServiceCollection services, IConfiguration config) where TOptions : class
        =&gt; services.ConfigureWithValidation&lt;TOptions&gt;(Options.Options.DefaultName, config);
    
    public static IServiceCollection ConfigureWithValidation&lt;TOptions&gt;(this IServiceCollection services, string name, IConfiguration config) where TOptions : class
    {
        _ = config ?? throw new ArgumentNullException(nameof(config));
        services.Configure&lt;TOptions&gt;(name, config);
        services.AddDataAnnotationValidatedOptions&lt;TOptions&gt;(name);
        return services;
    }

    public static IServiceCollection ConfigureWithValidation&lt;TOptions&gt;(this IServiceCollection services, Action&lt;TOptions&gt; configureOptions) where TOptions : class
        =&gt; services.ConfigureWithValidation&lt;TOptions&gt;(Options.Options.DefaultName, configureOptions);

    public static IServiceCollection ConfigureWithValidation&lt;TOptions&gt;(this IServiceCollection services, string name, Action&lt;TOptions&gt; configureOptions) where TOptions : class
    {
        services.Configure(name, configureOptions);
        services.AddDataAnnotationValidatedOptions&lt;TOptions&gt;(name);
        return services;
    }

    private static IServiceCollection AddDataAnnotationValidatedOptions&lt;TOptions&gt;(this IServiceCollection services, string name) where TOptions : class
    {
        services.TryAddEnumerable(ServiceDescriptor.Singleton&lt;IValidateOptions&lt;TOptions&gt;&gt;(new DataAnnotationValidateOptions&lt;TOptions&gt;(name)));
        return services;
    }
}
</code></pre>
<p>If we put it to the test, our settings object could look like this. In this case, we use the <code>Required</code> and <code>Url</code> attributes. You can use <a href="https://docs.microsoft.com/en-us/dotnet/api/system.componentmodel.dataannotations?view=net-6.0" class="external">any of the attributes</a> provided by default, or create your own custom attributes.</p>
<pre><code class="language-csharp">public class LibraryExampleServiceOptions
{
    public const string DefaultConfigurationSectionName = nameof(LibraryExampleServiceOptions);

    [Required, Url]
    public string? BaseUrl { get; set; }
}
</code></pre>
<blockquote>
<p>Consider nullability and default values of properties when defining them. In the spirit of the example: if a retry count has the value 0, is that because you specified it or because you forgot to define it? That's why I always define properties as <code>[Required]</code> and nullable.</p>
</blockquote>
<pre><code class="language-output">info: Program[0]
      Before retrieving IOptions
info: Program[0]
      After retrieving IOptions; before IOptions.Value
info: Kaylumah.ValidatedStronglyTypedIOptions.Library.ConfigureLibraryExampleServiceOptions[0]
      ConfigureExampleServiceOptions Configure
info: Kaylumah.ValidatedStronglyTypedIOptions.Library.ConfigureLibraryExampleServiceOptions[0]
      ConfigureExampleServiceOptions PostConfigure
info: Kaylumah.ValidatedStronglyTypedIOptions.Library.ConfigureLibraryExampleServiceOptions[0]
      ConfigureExampleServiceOptions ValidateOptionsResult
Unhandled exception. Microsoft.Extensions.Options.OptionsValidationException: DataAnnotation validation failed for 'LibraryExampleServiceOptions' members: 'BaseUrl' with the error: 'The BaseUrl field is not a valid fully-qualified http, https, or ftp URL.'.
   at Microsoft.Extensions.Options.OptionsFactory`1.Create(String name)
   at Microsoft.Extensions.Options.UnnamedOptionsManager`1.get_Value()
   at Program.&lt;Main&gt;$(String[] args) 
</code></pre>
<p>I think that is pretty neat, but I am not a big fan of the formatting. I remember the last time I used the <code>Web API</code> template, a validation error resulted in a nicely formatted response. I had to dig into the ASP.NET code: it is the <a href="https://github.com/dotnet/aspnetcore/blob/a450cb69b5e4549f5515cdb057a68771f56cefd7/src/Mvc/Mvc.Core/src/Infrastructure/ModelStateInvalidFilter.cs#L80" class="external">ModelStateInvalidFilter</a> that transforms the <a href="https://github.com/dotnet/aspnetcore/blob/d9660d157627af710b71c636fa8cb139616cadba/src/Mvc/Mvc.Abstractions/src/ModelBinding/ModelStateDictionary.cs#L147" class="external">ModelStateDictionary</a> into a <code>ValidationProblemDetails</code>. I've added an example of this to the source repo, with the output shown below.</p>
<pre><code class="language-json">{
    &quot;type&quot;: &quot;https://tools.ietf.org/html/rfc7231#section-6.5.1&quot;,
    &quot;title&quot;: &quot;One or more validation errors occurred.&quot;,
    &quot;status&quot;: 400,
    &quot;traceId&quot;: &quot;00-50f5816f844377e66f37688f297dfd29-ab771434a82ee290-00&quot;,
    &quot;errors&quot;: {
        &quot;Name&quot;: [&quot;The Name field is required.&quot;],
        &quot;EmailAddresses[0].Label&quot;: [&quot;The Label field is required.&quot;],
        &quot;EmailAddresses[0].Address&quot;: [&quot;The Address field is required.&quot;]
    }
}
</code></pre>
<p>In the example above, I added validation on both the parent and the child DTO. Doing the same with DataAnnotations, however, does not work out of the box: validation does not recurse into nested objects. To enable the same behaviour for DataAnnotations, we can create custom <code>ValidationAttribute</code>s. We begin by defining a special <code>ValidationResult</code> that is a composite of multiple ValidationResults.</p>
<pre><code class="language-csharp">public class CompositeValidationResult : System.ComponentModel.DataAnnotations.ValidationResult
{
    private readonly List&lt;System.ComponentModel.DataAnnotations.ValidationResult&gt; results = new();

    public IEnumerable&lt;System.ComponentModel.DataAnnotations.ValidationResult&gt; Results =&gt; results;

    public CompositeValidationResult(string? errorMessage) : base(errorMessage)
    {
    }

    public CompositeValidationResult(string errorMessage, IEnumerable&lt;string&gt;? memberNames) : base(errorMessage, memberNames)
    {
    }

    protected CompositeValidationResult(System.ComponentModel.DataAnnotations.ValidationResult validationResult) : base(validationResult)
    {
    }

    public void AddResult(System.ComponentModel.DataAnnotations.ValidationResult validationResult)
    {
        results.Add(validationResult);
    }
}
</code></pre>
<p>Next we create a custom <code>ValidationAttribute</code> for objects.</p>
<pre><code class="language-csharp">[AttributeUsage(AttributeTargets.Property | AttributeTargets.Parameter)]
public sealed class ValidateObjectAttribute : ValidationAttribute
{
    protected override System.ComponentModel.DataAnnotations.ValidationResult IsValid(object? value, ValidationContext validationContext)
    {
        if (value != null &amp;&amp; validationContext != null)
        {
            var results = new List&lt;System.ComponentModel.DataAnnotations.ValidationResult&gt;();
            var context = new ValidationContext(value, null, null);

            System.ComponentModel.DataAnnotations.Validator.TryValidateObject(value, context, results, true);

            if (results.Count != 0)
            {
                var compositeValidationResult = new CompositeValidationResult($&quot;Validation for {validationContext.DisplayName} failed.&quot;, new[] { validationContext.MemberName });
                results.ForEach(compositeValidationResult.AddResult);

                return compositeValidationResult;
            }
        }

        return System.ComponentModel.DataAnnotations.ValidationResult.Success;
    }
}
</code></pre>
<p>And finally, we need a <code>ValidationAttribute</code> for collections.</p>
<pre><code class="language-csharp">[AttributeUsage(AttributeTargets.Property | AttributeTargets.Parameter)]
public sealed class ValidateCollectionAttribute : ValidationAttribute
{
    protected override System.ComponentModel.DataAnnotations.ValidationResult IsValid(object? value, ValidationContext validationContext)
    {
        CompositeValidationResult? collectionCompositeValidationResult = null;

        if (value is IEnumerable collection &amp;&amp; validationContext != null)
        {
            var index = 0;
            foreach (var obj in collection)
            {
                var results = new List&lt;System.ComponentModel.DataAnnotations.ValidationResult&gt;();
                var context = new ValidationContext(obj, null, null);

                System.ComponentModel.DataAnnotations.Validator.TryValidateObject(obj, context, results, true);

                if (results.Count != 0)
                {
                    var compositeValidationResult = new CompositeValidationResult($&quot;Validation for {validationContext.MemberName}[{index}] failed.&quot;, new[] { $&quot;{validationContext.MemberName}[{index}]&quot; });
                    results.ForEach(compositeValidationResult.AddResult);

                    if (collectionCompositeValidationResult == null)
                    {
                        collectionCompositeValidationResult = new CompositeValidationResult($&quot;Validation for {validationContext.MemberName} failed.&quot;, new[] { validationContext.MemberName });
                    }

                    collectionCompositeValidationResult.AddResult(compositeValidationResult);
                }

                index++;
            }

            if (collectionCompositeValidationResult != null)
            {
                return collectionCompositeValidationResult;
            }
        }

        return System.ComponentModel.DataAnnotations.ValidationResult.Success;
    }
}
</code></pre>
<p>Our validation would already trigger with just these attributes. But we are also interested in handling our CompositeValidationResult and pretty-printing it.</p>
<pre><code class="language-csharp">public static class Validator
{
    public static ValidationResult[] ValidateReturnValue(object objectToValidate)
    {
        var validationResults = new List&lt;System.ComponentModel.DataAnnotations.ValidationResult&gt;();

        if (objectToValidate == null)
        {
            validationResults.Add(new System.ComponentModel.DataAnnotations.ValidationResult(&quot;Return value is required.&quot;));
        }
        else
        {
            var validationContext = new ValidationContext(objectToValidate);

            System.ComponentModel.DataAnnotations.Validator.TryValidateObject(objectToValidate, validationContext, validationResults, true);

            if (validationResults.Count != 0)
            {
                var compositeValidationResult = new CompositeValidationResult($&quot;Validation for {validationContext.DisplayName} failed.&quot;, new[] { validationContext.MemberName });
                validationResults.ForEach(compositeValidationResult.AddResult);
            }
        }

        var structuredValidationResults = StructureValidationResults(validationResults);
        return structuredValidationResults;
    }

    private static ValidationResult[] StructureValidationResults(IEnumerable&lt;System.ComponentModel.DataAnnotations.ValidationResult&gt; validationResults)
    {
        var structuredValidationResults = new List&lt;ValidationResult&gt;();
        foreach (var validationResult in validationResults)
        {
            var structuredValidationResult = new ValidationResult
            {
                ErrorMessage = validationResult.ErrorMessage,
                MemberNames = validationResult.MemberNames.ToArray()
            };

            if (validationResult is CompositeValidationResult compositeValidationResult)
            {
                structuredValidationResult.ValidationResults = StructureValidationResults(compositeValidationResult.Results);
            }

            structuredValidationResults.Add(structuredValidationResult);
        }

        return structuredValidationResults.ToArray();
    }
}
</code></pre>
<p>You can then use it in an <code>IValidateOptions</code> like this:</p>
<pre><code class="language-csharp">internal class CustomValidate : IValidateOptions&lt;NestedParent&gt;
{
    public ValidateOptionsResult Validate(string name, NestedParent options)
    {
        var validationResults = Kaylumah.ValidatedStronglyTypedIOptions.Utilities.Validation.Validator.ValidateReturnValue(options);
        if (validationResults.Any())
        {
            var builder = new StringBuilder();
            foreach (var result in validationResults)
            {
                var pretty = PrettyPrint(result, string.Empty, true);
                builder.Append(pretty);
            }
            return ValidateOptionsResult.Fail(builder.ToString());
        }

        return ValidateOptionsResult.Success;
    }

    private string PrettyPrint(Kaylumah.ValidatedStronglyTypedIOptions.Utilities.Validation.ValidationResult root, string indent, bool last)
    {
        // Based on https://stackoverflow.com/a/1649223
        var sb = new StringBuilder();
        sb.Append(indent);
        if (last)
        {
            sb.Append(&quot;|-&quot;);
            indent += &quot;  &quot;;
        }
        else
        {
            sb.Append(&quot;|-&quot;);
            indent += &quot;| &quot;;
        }

        sb.AppendLine(root.ToString());

        if (root.ValidationResults != null)
        {
            for (var i = 0; i &lt; root.ValidationResults.Length; i++)
            {
                var child = root.ValidationResults[i];
                var pretty = PrettyPrint(child, indent, i == root.ValidationResults.Length - 1);
                sb.Append(pretty);
            }
        }

        return sb.ToString();
    }
}
</code></pre>
<p>Which prints</p>
<pre><code class="language-output">Microsoft.Extensions.Options.OptionsValidationException : |-Children =&gt; Validation for Children failed.
  |-Children[0] =&gt; Validation for Children[0] failed.
    |-Name =&gt; The Name field is required.

    Stack Trace:
       at Microsoft.Extensions.Options.OptionsFactory`1.Create(String name)
   at Microsoft.Extensions.Options.UnnamedOptionsManager`1.get_Value()
</code></pre>
<p>That looks more like it. One thing this approach, unfortunately, cannot solve is that the errors still occur at runtime. Whereas with <code>IConfiguration</code> we could get the error at startup, we don't have the same luxury with <code>IOptions</code> since, as demonstrated, validation only triggers when <code>Value</code> is accessed. It is, however, a step in the right direction.</p>
<blockquote>
<p><strong>Note</strong>: since <code>IOptions&lt;&gt;</code> is an unbound generic, you cannot retrieve all instances of it from the DI container to trigger this behaviour at startup</p>
</blockquote>
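<p>One workaround is to force evaluation yourself during startup, for example with an <code>IHostedService</code> that touches <code>Value</code>; a sketch reusing the <code>LibraryExampleServiceOptions</code> from earlier:</p>
<pre><code class="language-csharp">public class EagerOptionsValidation : IHostedService
{
    private readonly IOptions&lt;LibraryExampleServiceOptions&gt; _options;

    public EagerOptionsValidation(IOptions&lt;LibraryExampleServiceOptions&gt; options)
    {
        _options = options;
    }

    public Task StartAsync(CancellationToken cancellationToken)
    {
        // Throws OptionsValidationException at startup if validation fails
        _ = _options.Value;
        return Task.CompletedTask;
    }

    public Task StopAsync(CancellationToken cancellationToken) =&gt; Task.CompletedTask;
}
</code></pre>
<p>Since <code>.NET6</code> the framework also ships <code>OptionsBuilder.ValidateOnStart()</code>, which achieves the same eager validation without a custom hosted service.</p>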
<h2 id="bonus-strongly-typed-options"><a href="#bonus-strongly-typed-options">Bonus: Strongly typed options</a></h2>
<p>I never liked using <code>IOptions&lt;&gt;</code> all over the place. I've found it especially bothersome in unit tests, where I would either need <code>Options.Create</code> or a Moq mock of <code>IOptions</code>. If you don't rely on reloading configuration (remember <code>IOptions</code> is a singleton), you can register the typed instance directly, which I find pretty neat.</p>
<pre><code class="language-csharp">var serviceProvider = new ServiceCollection()
            .Configure&lt;StronglyTypedOptions&gt;(builder =&gt; {
                builder.Name = &quot;TestStronglyTypedOptions&quot;;
            })
            .AddSingleton(sp =&gt; sp.GetRequiredService&lt;IOptions&lt;StronglyTypedOptions&gt;&gt;().Value)
            .BuildServiceProvider();
var options = serviceProvider.GetRequiredService&lt;IOptions&lt;StronglyTypedOptions&gt;&gt;().Value;
var typedOptions = serviceProvider.GetRequiredService&lt;StronglyTypedOptions&gt;();
typedOptions.Name.Should().Be(options.Name);
</code></pre>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>Using the options and configuration patterns described in this article makes it a lot less likely that you run into configuration errors, or at the very least, it makes configuration mistakes easier to troubleshoot.</p>
<p>As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/ValidatedStronglyTypedIOptions" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="resources"><a href="#resources">Resources</a></h2>
<ul>
<li><a href="https://docs.microsoft.com/en-us/dotnet/core/extensions/configuration" class="external">Configuration in .NET</a></li>
<li><a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-6.0" class="external">Configuration in ASP.NET Core</a></li>
<li><a href="https://docs.microsoft.com/en-us/dotnet/core/extensions/options" class="external">Options pattern in .NET</a></li>
<li><a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/options?view=aspnetcore-6.0" class="external">Options pattern in ASP.NET Core</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2021/11/14/capture-logs-in-unit-tests.html</id>
    <title type="text"><![CDATA[Capture Logs in Unit Tests]]></title>
    <summary type="text"><![CDATA[A guide to capturing logs in Xunit]]></summary>
    <published>2021-11-14T20:30:00+01:00</published>
    <updated>2021-11-14T20:30:00+01:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2021/11/14/capture-logs-in-unit-tests.html" />
    <category term="C#" />
    <category term="Testing" />
    <category term="Xunit" />
<content type="html"><![CDATA[<p>In application code, we are used to writing log statements primarily for diagnostic purposes. For instance, we use logs to capture unexpected error flows. Therefore it is not uncommon to want to capture the log output in our unit tests. As far as I can tell, you have three distinct options to handle log output in unit tests.</p>
<h2 id="scenario"><a href="#scenario">Scenario</a></h2>
<p>Our test scenario is a service or system under test (SUT) that takes a string input and returns it without modification. We rely on <code>Microsoft Extensions</code> for our logging purposes. As the test framework, we will be using <code>Xunit</code>.</p>
<pre><code class="language-cs">public interface IEchoService
{
    Task&lt;string&gt; Echo(string input);
}
</code></pre>
<p>The initial implementation of our SUT could look like this:</p>
<pre><code class="language-cs">public class EchoService : IEchoService
{
    private readonly ILogger&lt;EchoService&gt; _logger;

    public EchoService(ILogger&lt;EchoService&gt; logger)
    {
        _logger = logger;
    }

    public Task&lt;string&gt; Echo(string input)
    {
        _logger.LogInformation(&quot;echo was invoked&quot;);
        return Task.FromResult(input);
    }
}
</code></pre>
<p>For this article, the snippet above would be more than sufficient. But in a real-life application, I prefer to log the input as well. If, however, we used simple string interpolation, we would immediately get a Code-Analysis warning about it. The recommendation here is to use LoggerMessage, which enables <a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/logging/loggermessage?view=aspnetcore-6.0" class="external">high-performance logging</a>. I've always found that implementing the LoggerMessage pattern required quite a bit of boilerplate. Luckily, in .NET 6 this is a lot easier: we can <a href="https://docs.microsoft.com/en-us/dotnet/core/extensions/logger-message-generator" class="external">generate</a> all the boilerplate we need. As per usual, Andrew Lock <a href="https://andrewlock.net/exploring-dotnet-6-part-8-improving-logging-performance-with-source-generators/" class="external">wrote a piece</a> about this new feature already.</p>
<p>After applying the <code>LoggerMessage</code> changes, our SUT looks like the snippet below. Please note that for this to work, the <code>EchoService</code> class itself is now marked as <code>partial</code>.</p>
<pre><code class="language-cs">public partial class EchoService : IEchoService
{
    private readonly ILogger&lt;EchoService&gt; _logger;

    public EchoService(ILogger&lt;EchoService&gt; logger)
    {
        _logger = logger;
    }

    public Task&lt;string&gt; Echo(string input)
    {
        //_logger.LogInformation(&quot;echo was invoked&quot;);

        // The logging message template should not vary between calls to ... csharp(CA2254)
        // _logger.LogInformation($&quot;echo was invoked with {input}&quot;);

        LogEchoCall(input);

        return Task.FromResult(input);
    }

    [LoggerMessage(1000, LogLevel.Information, &quot;echo was invoked '{EchoInput}'&quot;)]
    partial void LogEchoCall(string echoInput);
}
</code></pre>
<h2 id="option-1"><a href="#option-1">Option 1</a></h2>
<p>First up is doing absolutely nothing. Yeah, you read that correctly. You might find it silly to start this piece with the first option being nothing, but doing nothing with log statements in your test code is perfectly fine. Heck, even doing nothing comes in two flavours.</p>
<p>If we use Dependency Injection in our test, we have access to <code>AddLogging()</code>. If we don't provide a logging provider, our code will run just fine. Otherwise, if you have already set up a logging provider or provided one explicitly, it will log to zero or more providers depending on your current configuration. For instance, you could use the ConsoleLoggerProvider to log to the console during the test. I often use the DI variant in my tests since I am writing extension methods on IServiceCollection to wire up my code anyway, so using the same extension method in test code simplifies matters.</p>
<pre><code class="language-cs">[Fact]
public async Task Test_DependencyInjection_EmptyLoggingBuilder()
{
    var configuration = new ConfigurationBuilder().Build();
    var serviceProvider = new ServiceCollection()
        .AddLogging() // could also be part of AddEcho to make sure ILogger is available outside ASP.NET runtime
        .AddEcho(configuration)
        .BuildServiceProvider();
    var sut = serviceProvider.GetRequiredService&lt;IEchoService&gt;();
    var testInput = &quot;Scenario: empty logging builder&quot;;
    var testResult = await sut.Echo(testInput).ConfigureAwait(false);
    testResult.Should().Be(testInput, &quot;the input should have been returned&quot;);
}
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/001_NoLogger.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/001_NoLogger.png" width="1564" height="814" alt="VS Code - Dotnet Debugger - No ILogger Registered" /></picture></p>
<pre><code class="language-cs">[Fact]
public async Task Test_DependencyInjection_ConsoleLoggingBuilder()
{
    var configuration = new ConfigurationBuilder().Build();
    var serviceProvider = new ServiceCollection()
        .AddLogging(loggingBuilder =&gt; {
            loggingBuilder.AddConsole();
        })
        .AddEcho(configuration)
        .BuildServiceProvider();
    var sut = serviceProvider.GetRequiredService&lt;IEchoService&gt;();
    var testInput = &quot;Scenario: console logging builder&quot;;
    var testResult = await sut.Echo(testInput).ConfigureAwait(false);
    testResult.Should().Be(testInput, &quot;the input should have been returned&quot;);
}
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/002_ConsoleLogger.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/002_ConsoleLogger.png" width="1564" height="814" alt="VS Code - Dotnet Debugger - console ILogger registered" /></picture></p>
<p>If, however, you cannot rely on dependency injection in your tests, you have the alternative of manually creating your SUT and its relevant dependencies. The only dependency of our EchoService is an instance of ILogger. For testing purposes, you can use the NullLogger, which logs into the void.</p>
<pre><code class="language-cs">[Fact]
public async Task Test_Manual_NullLoggingFactory()
{
    var sut = new EchoService(NullLogger&lt;EchoService&gt;.Instance);
    var testInput = &quot;Scenario: null logger factory&quot;;
    var testResult = await sut.Echo(testInput).ConfigureAwait(false);
    testResult.Should().Be(testInput, &quot;the input should have been returned&quot;);
}
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/003_NullLogger.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/003_NullLogger.png" width="1564" height="814" alt="VS Code - Dotnet Debugger - null ILogger registered" /></picture></p>
<blockquote>
<p>As you can see in the screenshot above, an empty logger and a NullLogger are not the same thing.</p>
</blockquote>
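<p>A minimal sketch that contrasts the two (assuming the <code>EchoService</code> from before); both end up discarding output here, but the debugger shows different types:</p>
<pre><code class="language-cs">var serviceProvider = new ServiceCollection()
    .AddLogging() // no providers registered
    .BuildServiceProvider();

// A real Logger&lt;EchoService&gt; wrapping zero providers
var emptyLogger = serviceProvider.GetRequiredService&lt;ILogger&lt;EchoService&gt;&gt;();

// The NullLogger&lt;EchoService&gt; singleton, a dedicated no-op implementation
var nullLogger = NullLogger&lt;EchoService&gt;.Instance;
</code></pre>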
<h2 id="option-2"><a href="#option-2">Option 2</a></h2>
<p>The second method uses the Moq framework, which makes it possible to hide the logger behind a <code>Mock</code>, a fake version of <code>ILogger</code>. In my previous article, <a href="https://kaylumah.nl/2021/04/11/an-approach-to-writing-mocks.html">&quot;Adventures with Mock&quot;</a>, I touched upon my preferred method of writing mocks. I even included an initial version of the LoggerMock. Since then, I have fleshed out the concept more, so here is an updated version of the Logger Mock.</p>
<pre><code class="language-cs">public class LoggerMock&lt;TCategoryName&gt; : Mock&lt;ILogger&lt;TCategoryName&gt;&gt;
{
    private readonly List&lt;LogMessage&gt; logMessages = new();

    public ReadOnlyCollection&lt;LogMessage&gt; LogMessages =&gt; new(logMessages);

    protected LoggerMock()
    {
    }

    public static LoggerMock&lt;TCategoryName&gt; CreateDefault()
    {
        return new LoggerMock&lt;TCategoryName&gt;()
            .SetupLog()
            .SetupIsEnabled(LogLevel.Information);
    }

    public LoggerMock&lt;TCategoryName&gt; SetupIsEnabled(LogLevel logLevel, bool enabled = true)
    {
        Setup(x =&gt; x.IsEnabled(It.Is&lt;LogLevel&gt;(p =&gt; p.Equals(logLevel))))
            .Returns(enabled);
        return this;
    }

    public LoggerMock&lt;TCategoryName&gt; SetupLog()
    {
        Setup(logger =&gt; logger.Log(
            It.IsAny&lt;LogLevel&gt;(),
            It.IsAny&lt;EventId&gt;(),
            It.Is&lt;It.IsAnyType&gt;((v, t) =&gt; true),
            It.IsAny&lt;Exception&gt;(),
            It.Is&lt;Func&lt;It.IsAnyType, Exception?, string&gt;&gt;((v, t) =&gt; true)
        ))
        .Callback(new InvocationAction(invocation =&gt; {
            var logLevel = (LogLevel)invocation.Arguments[0];
            var eventId = (EventId)invocation.Arguments[1];
            var state = invocation.Arguments[2];
            var exception = (Exception?)invocation.Arguments[3];
            var formatter = invocation.Arguments[4];

            var invokeMethod = formatter.GetType().GetMethod(&quot;Invoke&quot;);
            var actualMessage = (string?)invokeMethod?.Invoke(formatter, new[] { state, exception });

            logMessages.Add(new LogMessage {
                EventId = eventId,
                LogLevel = logLevel,
                Message = actualMessage,
                Exception = exception,
                State = state
            });
        }));
        return this;
    }
}
</code></pre>
<p>Any <code>Mock</code> created with Moq will provide you with the ability to assert invocations made to the mocked class. Since my approach makes the mock stateful, I can capture any request made against it. We can make concrete assertions because we can access information like <code>EventId</code> and <code>LogLevel</code>. If, for instance, you have alerts written against business events, you want to validate that the correct information passes into your logging system.</p>
<pre><code class="language-cs">[Fact]
public async Task Test_Moq_DefaultMockedLogger()
{
    var loggerMock = LoggerMock&lt;EchoService&gt;.CreateDefault();
    var sut = new EchoService(loggerMock.Object);
    var testInput = &quot;Scenario: mocked logger&quot;;
    var testResult = await sut.Echo(testInput).ConfigureAwait(false);
    testResult.Should().Be(testInput, &quot;the input should have been returned&quot;);

    loggerMock.LogMessages.Should().NotBeEmpty().And.HaveCount(1);
    loggerMock.VerifyEventWasLogged(new EventId(1000));
}

[Fact]
public async Task Test_Moq_LogLevelDisabledMockedLogger()
{
    var loggerMock = LoggerMock&lt;EchoService&gt;.CreateDefault().SetupIsEnabled(LogLevel.Information, enabled: false);
    var sut = new EchoService(loggerMock.Object);
    var testInput = &quot;Scenario: log level disabled mocked logger&quot;;
    var testResult = await sut.Echo(testInput).ConfigureAwait(false);
    testResult.Should().Be(testInput, &quot;the input should have been returned&quot;);

    loggerMock.LogMessages.Should().BeEmpty();
}
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/004_MockLogger.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/004_MockLogger.png" width="1564" height="814" alt="VS Code - Dotnet Debugger - mock ILogger registered" /></picture></p>
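<p>The first test above also calls <code>VerifyEventWasLogged</code>, which is not part of the <code>LoggerMock</code> snippet shown earlier. A minimal sketch of such a helper, built on the captured <code>LogMessages</code> collection (the extension-method shape is my assumption), could look like this:</p>
<pre><code class="language-cs">public static class LoggerMockExtensions
{
    public static void VerifyEventWasLogged&lt;TCategoryName&gt;(this LoggerMock&lt;TCategoryName&gt; loggerMock, EventId eventId)
    {
        // Assert that at least one captured message carries the expected EventId
        loggerMock.LogMessages.Should().Contain(
            message =&gt; eventId.Equals(message.EventId),
            $&quot;event '{eventId}' should have been logged&quot;);
    }
}
</code></pre>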
<h2 id="options-3"><a href="#options-3">Option 3</a></h2>
<p>Thus far, we have discussed options that would work outside <code>Xunit</code>. The third technique is not conceptually limited to <code>Xunit</code>, but this implementation is restricted to <code>Xunit</code> projects because we will now rely on Xunit's <code>ITestOutputHelper</code> mechanism. In most cases, we would use <code>ITestOutputHelper</code> to log lines inside the test case itself; it is, however, possible to create an <code>ILogger</code> that writes to <code>ITestOutputHelper</code> so we can also capture the logs our SUT produces.</p>
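<p>For context, direct use of <code>ITestOutputHelper</code> means injecting it into the test class constructor and writing lines from the test body (a minimal sketch; the class name is hypothetical):</p>
<pre><code class="language-cs">public class EchoServiceTests
{
    private readonly ITestOutputHelper _testOutputHelper;

    // Xunit injects ITestOutputHelper into the test class constructor
    public EchoServiceTests(ITestOutputHelper testOutputHelper)
    {
        _testOutputHelper = testOutputHelper;
    }

    [Fact]
    public void Test_WriteLine()
    {
        _testOutputHelper.WriteLine(&quot;written from inside the test case&quot;);
    }
}
</code></pre>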
<p>Microsoft has <a href="https://docs.microsoft.com/en-us/dotnet/core/extensions/custom-logging-provider" class="external">well-written documentation</a> on how to create a custom logger provider. We start with a configuration class for our <code>XunitLogger</code>. We will have no custom settings in this demo, but putting the configuration in place makes it easier to add settings later. The <code>ConsoleLogger</code>, for example, uses configuration to control LogScope inclusion and timestamp formats.</p>
<pre><code class="language-cs">public class XunitLoggerConfiguration
{
}
</code></pre>
<p>Next up is our Xunit logger itself. The ColoredConsole sample from the docs does nothing with scope, but to avoid limiting ourselves later, we changed the implementation of <code>BeginScope</code> to use <code>IExternalScopeProvider</code>. To print the log line, we need the last argument of <code>Log&lt;TState&gt;</code>, which is the formatter. We then pass the formatted message to Xunit's <code>ITestOutputHelper</code> to <a href="https://xunit.net/docs/capturing-output" class="external">capture output</a>. Depending on your specific needs, you can also log the logger's category (name), event, log level, scope or even exception. For now, let's keep it simple.</p>
<pre><code class="language-cs">public class XunitLogger : ILogger
{
    private readonly string _loggerName;
    private readonly Func&lt;XunitLoggerConfiguration&gt; _getCurrentConfig;
    private readonly IExternalScopeProvider _externalScopeProvider;
    private readonly ITestOutputHelper _testOutputHelper;

    public XunitLogger(string loggerName, Func&lt;XunitLoggerConfiguration&gt; getCurrentConfig, IExternalScopeProvider externalScopeProvider, ITestOutputHelper testOutputHelper)
    {
        _loggerName = loggerName;
        _getCurrentConfig = getCurrentConfig;
        _externalScopeProvider = externalScopeProvider;
        _testOutputHelper = testOutputHelper;
    }

    public IDisposable BeginScope&lt;TState&gt;(TState state) =&gt; _externalScopeProvider.Push(state);

    public bool IsEnabled(LogLevel logLevel) =&gt; LogLevel.None != logLevel;

    public void Log&lt;TState&gt;(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func&lt;TState, Exception?, string&gt; formatter)
    {
        if (!IsEnabled(logLevel))
        {
            return;
        }

        var message = formatter(state, exception);
        _testOutputHelper.WriteLine(message);
    }
}
</code></pre>
<p>An <code>ILoggerProvider</code> is responsible for creating <code>ILogger</code> instances; this means we also need the custom <code>XunitLoggerProvider</code> to take care of making our <code>XunitLogger</code>.</p>
<pre><code class="language-cs">public sealed class XunitLoggerProvider : ILoggerProvider
{
    private readonly IDisposable _configurationOnChangeToken;
    private XunitLoggerConfiguration _currentConfiguration;
    private readonly ConcurrentDictionary&lt;string, XunitLogger&gt; _loggers = new();
    private readonly IExternalScopeProvider _externalScopeProvider = new LoggerExternalScopeProvider();
    private readonly ITestOutputHelper _testOutputHelper;

    public XunitLoggerProvider(IOptionsMonitor&lt;XunitLoggerConfiguration&gt; optionsMonitor, ITestOutputHelper testOutputHelper)
    {
        _currentConfiguration = optionsMonitor.CurrentValue;
        _configurationOnChangeToken = optionsMonitor.OnChange(updatedConfiguration =&gt; _currentConfiguration = updatedConfiguration);
        _testOutputHelper = testOutputHelper;
    }

    public ILogger CreateLogger(string categoryName)
    {
        var logger = _loggers.GetOrAdd(categoryName, name =&gt; new XunitLogger(name, GetCurrentConfiguration, _externalScopeProvider, _testOutputHelper));
        return logger;
    }

    public void Dispose()
    {
        _loggers.Clear();
        _configurationOnChangeToken.Dispose();
    }

    private XunitLoggerConfiguration GetCurrentConfiguration() =&gt; _currentConfiguration;
}
</code></pre>
<p>The final puzzle piece is an extension method that allows us to register the new logger type. Note that we also add <code>ITestOutputHelper</code> to the DI container of the LoggingBuilder; that is why the <code>XunitLoggerProvider</code> in the previous snippet can retrieve it from the dependency injection container.</p>
<pre><code class="language-cs">public static class XunitLoggingBuilderExtensions
{
    public static ILoggingBuilder AddXunit(this ILoggingBuilder builder, ITestOutputHelper testOutputHelper)
    {
        builder.AddConfiguration();

        builder.Services.TryAddSingleton(testOutputHelper);

        builder.Services.TryAddEnumerable(
            ServiceDescriptor.Singleton&lt;ILoggerProvider, XunitLoggerProvider&gt;());

        LoggerProviderOptions.RegisterProviderOptions
            &lt;XunitLoggerConfiguration, XunitLoggerProvider&gt;(builder.Services);

        return builder;
    }

    public static ILoggingBuilder AddXunit(this ILoggingBuilder builder, ITestOutputHelper testOutputHelper, Action&lt;XunitLoggerConfiguration&gt; configure)
    {
        builder.AddXunit(testOutputHelper);
        builder.Services.Configure(configure);

        return builder;
    }
}
</code></pre>
<p>The usage is the same as the ConsoleLogger example we did previously.</p>
<pre><code class="language-cs">[Fact]
public async Task Test_Custom_XunitLoggingBuilder()
{
    var configuration = new ConfigurationBuilder().Build();
    var serviceProvider = new ServiceCollection()
        .AddLogging(loggingBuilder =&gt; {
            loggingBuilder.AddXunit(_testOutputHelper);
        })
        .AddEcho(configuration)
        .BuildServiceProvider();
    var sut = serviceProvider.GetRequiredService&lt;IEchoService&gt;();
    var testInput = &quot;Scenario: custom logging builder&quot;;
    var testResult = await sut.Echo(testInput).ConfigureAwait(false);
    testResult.Should().Be(testInput, &quot;the input should have been returned&quot;);
}
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/005_XunitLogger.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20211114/capture-logs-in-unit-tests/005_XunitLogger.png" width="1564" height="814" alt="VS Code - Dotnet Debugger - Xunit ILogger registered" /></picture></p>
<p>The first time I ran this test, I was baffled. I could only see the console output from the ConsoleLogger test we did previously. A quick Google search brought me to the <a href="https://github.com/xunit/xunit/issues/1141#issuecomment-555717377" class="external">solution</a>. We need to tell the dotnet test runner to display it with <code>dotnet test --logger:&quot;console;verbosity=detailed&quot;</code>. Telling an entire team they can no longer simply run <code>dotnet test</code> was not a real solution; luckily, we can simplify things with <code>dotnet test --settings runsettings.xml</code>.</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot; ?&gt;
&lt;RunSettings&gt;
    &lt;LoggerRunSettings&gt;
        &lt;Loggers&gt;
            &lt;Logger friendlyName=&quot;console&quot; enabled=&quot;True&quot;&gt;
                &lt;Configuration&gt;
                    &lt;Verbosity&gt;detailed&lt;/Verbosity&gt;
                &lt;/Configuration&gt;
            &lt;/Logger&gt;
        &lt;/Loggers&gt;
    &lt;/LoggerRunSettings&gt;
&lt;/RunSettings&gt;
</code></pre>
<p>However, explicitly passing <code>--settings</code> every time is hardly an improvement. On the <a href="https://docs.microsoft.com/en-us/visualstudio/test/configure-unit-tests-by-using-a-dot-runsettings-file?view=vs-2022" class="external">Microsoft Docs</a> I found the solution: we can tell MSBuild to use <code>RunSettingsFilePath</code>, which takes care of it for us. If we now run <code>dotnet test</code>, we get proper output. For example, you can add a <code>Directory.Build.props</code> file to the root of your project.</p>
<pre><code class="language-xml">&lt;Project&gt;
  &lt;PropertyGroup&gt;
    &lt;RunSettingsFilePath&gt;$(MSBuildThisFileDirectory)runsettings.xml&lt;/RunSettingsFilePath&gt;
  &lt;/PropertyGroup&gt;
&lt;/Project&gt;
</code></pre>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>I know I am not the first to write about this topic, but I hope to provide fresh insight into the subject matter. The different techniques all have their merit. I have used all three on different occasions, and I remind you that the NullLogger is a viable option in many cases. Nine times out of ten, you probably only care about testing the business logic. For the remaining one, I can only offer the well-known programming wisdom: &quot;It depends&quot;.</p>
<p>As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/CaptureLogsInUnitTests" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2021/07/17/decreasing-solution-build-time-with-filters.html</id>
    <title type="text"><![CDATA[Decreasing Solution Build time with Filters]]></title>
    <summary type="text"><![CDATA[How to use solution filters to increase focus and decrease build time]]></summary>
    <published>2021-07-17T00:00:00+02:00</published>
    <updated>2021-07-17T00:00:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2021/07/17/decreasing-solution-build-time-with-filters.html" />
    <category term="MSBuild" />
    <category term="Visual Studio 2019" />
<content type="html"><![CDATA[<p>There are many ways to structure your project's source code. My preference is a style called the single-solution model. Amongst other things, I like that it provides a single entry point to my project. If, however, your project grows, it can become slow to build. I am sure some of you will be familiar with the following <a href="https://imgs.xkcd.com/comics/compiling.png" class="external">xkcd joke</a> or some variant of it:</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/xkcd_joke_compiling.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/xkcd_joke_compiling.png" width="413" height="360" alt="xkcd_joke code is compiling" /></picture></p>
<p>The <a href="https://devblogs.microsoft.com/visualstudio/visual-studio-2022-preview-1-now-available" class="external">next version</a> of Visual Studio promises a lot of performance improvements. Visual Studio 2022 is the first version that takes advantage of the 64-bit processor architecture. I have not yet tested it, but I am hopeful for a more performant development experience when it ships.</p>
<blockquote>
<p>While I think the 1600+ projects-in-a-solution demo is cool, I would not see myself using the single-solution model at that scale.</p>
</blockquote>
<p>That brings me to the topic of today's post. I recently discovered a VS2019 feature I did not know about that can improve my experience: <a href="https://docs.microsoft.com/en-us/visualstudio/ide/filtered-solutions?view=vs-2019" class="external">solution filters</a>. I searched around a bit and did not find much about it, except for the Microsoft Docs themselves. So I wrote this post to help raise awareness for something I found very useful.</p>
<h2 id="project-setup"><a href="#project-setup">Project Setup</a></h2>
<p>I think over my past couple of posts, it's become clear that I am a fan of the <code>Microsoft.Extensions</code> repository. While Microsoft uses multiple solution files throughout the repository, I would opt for the single solution model.</p>
<p>Many of the projects in the repo follow this pattern:</p>
<ul>
<li><code>Concept.Abstractions</code> provides interfaces</li>
<li><code>Concept</code> provides the default implementation for <code>Concept.Abstractions</code></li>
<li><code>Concept.Concrete</code> provides a technology-specific implementation for <code>Concept.Abstractions</code></li>
</ul>
<pre><code class="language-shell">dotnet new sln --name &quot;SlnFilter&quot;

dotnet new classlib --framework netstandard2.1 --name Kaylumah.SlnFilter.Extensions.Concept.Abstractions --output src/Kaylumah.SlnFilter.Extensions.Concept.Abstractions
dotnet new classlib --framework netstandard2.1 --name Kaylumah.SlnFilter.Extensions.Concept --output src/Kaylumah.SlnFilter.Extensions.Concept
dotnet new classlib --framework netstandard2.1 --name Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha --output src/Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha
dotnet new classlib --framework netstandard2.1 --name Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo --output src/Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo

dotnet new xunit --framework netcoreapp3.1 --name Kaylumah.SlnFilter.Extensions.Concept.Tests --output test/Kaylumah.SlnFilter.Extensions.Concept.Tests
dotnet new xunit --framework netcoreapp3.1 --name Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.Tests --output test/Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.Tests
dotnet new xunit --framework netcoreapp3.1 --name Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo.Tests --output test/Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo.Tests

dotnet sln add src/Kaylumah.SlnFilter.Extensions.Concept.Abstractions/Kaylumah.SlnFilter.Extensions.Concept.Abstractions.csproj
dotnet sln add src/Kaylumah.SlnFilter.Extensions.Concept/Kaylumah.SlnFilter.Extensions.Concept.csproj
dotnet sln add src/Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha/Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.csproj
dotnet sln add src/Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo/Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo.csproj
dotnet sln add test/Kaylumah.SlnFilter.Extensions.Concept.Tests/Kaylumah.SlnFilter.Extensions.Concept.Tests.csproj
dotnet sln add test/Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.Tests/Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.Tests.csproj
dotnet sln add test/Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo.Tests/Kaylumah.SlnFilter.Extensions.Concept.ConcreteBravo.Tests.csproj

dotnet new classlib --framework netstandard2.1 --name Kaylumah.SlnFilter.Test.Utilities --output test/Kaylumah.SlnFilter.Test.Utilities
</code></pre>
<blockquote>
<p>Note <code>Kaylumah.SlnFilter.Test.Utilities</code> should not yet be added to the solution.</p>
</blockquote>
<h2 id="setting-up-our-filters"><a href="#setting-up-our-filters">Setting up our filters</a></h2>
<p>After following these steps, our project should look like the picture below in Visual Studio.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/001_vs2019_sln_all_projects.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/001_vs2019_sln_all_projects.png" width="1428" height="1040" alt="Visual Studio 2019 - Solution all projects loaded" /></picture></p>
<p>We can select one or more projects at a time and unload them from the solution.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/002_vs2019_sln_unload_projects.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/002_vs2019_sln_unload_projects.png" width="2360" height="1864" alt="Visual Studio 2019 - Unload project menu" /></picture></p>
<p>Up until now, this is how I would have done things: just unload the projects I won't need and not worry about them anymore. What I did not know is that we can save the current state of the solution.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/003_vs2019_sln_save_filter_001.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/003_vs2019_sln_save_filter_001.png" width="2464" height="224" alt="Visual Studio 2019 - Save as Solution Filter" /></picture></p>
<p>Unloading projects manually to create filters can be error-prone. Since a solution filter only builds the projects selected by the filter, missing a project causes the build to fail.</p>
<p>An alternative can be to unload all projects, select the project you want, and use the &quot;reload with dependencies&quot; option.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/004_vs2019_sln_reload_project_dependencies.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/004_vs2019_sln_reload_project_dependencies.png" width="2316" height="1528" alt="Visual Studio 2019 - Reload Project with Dependencies" /></picture></p>
<p>Like before, we can save the solution filter with the <code>Save As Solution Filter</code> option. The only difference is that we now get 4/7 projects as opposed to 5/7 projects. That's because we loaded the <code>ConcreteBravo.Tests</code> project and its dependencies. Even though that loads <code>Extensions.Concept</code>, it does not load <code>Extensions.Concept.Tests</code>, since it is not a dependency of <code>ConcreteBravo.Tests</code>.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/005_vs2019_sln_save_filter_002.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/005_vs2019_sln_save_filter_002.png" width="2464" height="2212" alt="Visual Studio 2019 - Save as Solution Filter - Scenario A" /></picture></p>
<p>While researching something unrelated to this post, I noticed that the <a href="https://github.com/dotnet/efcore" class="external">EF Core team</a> used this feature, which I did not know existed. The cool thing was that they also had a filter for all projects. So I had to try that out, and as it turns out, you can create a filter without unloading projects.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/006_vs2019_sln_save_filter_003.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/006_vs2019_sln_save_filter_003.png" width="2464" height="2160" alt="Visual Studio 2019 - Save as Solution Filter - Scenario B" /></picture></p>
<p>The image below shows the difference between the three filters we created. It looks exactly like a traditional Solution Explorer, with the addition that the name of the applied filter is displayed.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/007_vs2019_slnf_project_overview.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/007_vs2019_slnf_project_overview.png" width="4284" height="1040" alt="Visual Studio 2019 - Solution Filter Scenarios Compared" /></picture></p>
<p>For example, the <code>SlnFilter.Alpha.slnf</code> I created for <code>Concept.ConcreteAlpha</code> implementation looks like this:</p>
<pre><code class="language-json">{
  &quot;solution&quot;: {
    &quot;path&quot;: &quot;SlnFilter.sln&quot;,
    &quot;projects&quot;: [
      &quot;src\\Kaylumah.SlnFilter.Extensions.Concept.Abstractions\\Kaylumah.SlnFilter.Extensions.Concept.Abstractions.csproj&quot;,
      &quot;src\\Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha\\Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.csproj&quot;,
      &quot;src\\Kaylumah.SlnFilter.Extensions.Concept\\Kaylumah.SlnFilter.Extensions.Concept.csproj&quot;,
      &quot;test\\Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.Tests\\Kaylumah.SlnFilter.Extensions.Concept.ConcreteAlpha.Tests.csproj&quot;,
      &quot;test\\Kaylumah.SlnFilter.Extensions.Concept.Tests\\Kaylumah.SlnFilter.Extensions.Concept.Tests.csproj&quot;
    ]
  }
}
</code></pre>
<p>It contains a reference to the <code>.sln</code> file and the relative paths of all the <code>*.csproj</code> files I included in the <code>.slnf</code> file.</p>
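<p>Because a filter lists its projects explicitly, it silently goes stale when projects are moved, renamed or deleted, and as noted earlier a missing project fails the build. The snippet below is a small sketch of a sanity check for that situation; the file and project names are fabricated for the demo, so with a real <code>.slnf</code> you would skip the setup block and point the check at your own filter.</p>

```shell
# Demo setup: fabricate a tiny filter plus one of its two projects,
# so the check below reports exactly one missing project.
mkdir -p src/Alpha
touch src/Alpha/Alpha.csproj
cat > Demo.slnf <<'EOF'
{
  "solution": {
    "path": "Demo.sln",
    "projects": [
      "src\\Alpha\\Alpha.csproj",
      "src\\Bravo\\Bravo.csproj"
    ]
  }
}
EOF

# Pull the quoted *.csproj paths out of the JSON and normalize the
# escaped backslashes to forward slashes (note: word-splits on spaces).
missing=0
for project in $(grep -o '"[^"]*\.csproj"' Demo.slnf | tr -d '"' | tr -s '\\' '/'); do
  if [ ! -f "$project" ]; then
    echo "Missing project: $project"
    missing=$((missing + 1))
  fi
done
echo "$missing missing project(s)"
```

<p>Running a check like this before a build (locally or in CI) turns a confusing mid-build failure into an explicit list of broken references.</p>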
<h2 id="manage-solution-changes"><a href="#manage-solution-changes">Manage solution changes</a></h2>
<p>You might be wondering: what happens when I need to add new projects to my solution?</p>
<p>To demonstrate, let us assume our test projects share a helper project. This time, I want to update our &quot;Concept.Bravo&quot; solution filter, and instead of the dotnet CLI I will use the <code>Add existing project</code> option.</p>
<blockquote>
<p>You cannot use <code>dotnet sln add</code> on slnf files, but you can use them with <code>dotnet build</code></p>
</blockquote>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/008_vs2019_slnf_add_existing_project.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/008_vs2019_slnf_add_existing_project.png" width="3088" height="1644" alt="Visual Studio 2019 - Add Existing Project" /></picture></p>
<p>As soon as you do this, you get a pop-up stating there is a mismatch between the loaded projects and the projects specified in the filter.</p>
<p>If you followed the steps in a Git environment, you would see that even before pressing <code>Update Solution Filter</code>, the underlying solution has already been updated.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/009_vs2019_slnf_update_solution_filter.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210717/decreasing-solution-build-time-with-filters/009_vs2019_slnf_update_solution_filter.png" width="1428" height="1040" alt="Visual Studio 2019 - Regenerate Solution Filter" /></picture></p>
<h2 id="the-missing-bit"><a href="#the-missing-bit">The missing bit</a></h2>
<p>I discussed this feature at work as a potential workaround for an issue we had in structuring our projects. One of my colleagues remembered looking at it about a year ago and finding it lacking. A few minutes later, he found a <a href="https://developercommunity.visualstudio.com/t/Solution-Filter-should-allow-for-Include/1090914?space=8&amp;q=solution+filter" class="external">post</a> on the Visual Studio developer community. Funnily enough, it's a small world: the post links to a GitHub issue he created on this very matter.</p>
<p>The problem is managing multiple solution filters, because filters are inclusive and use relative paths resolved from the filter's location. A proposed improvement would be to support glob patterns to include/exclude projects. For teams following naming conventions, that would make it much easier to keep filters up to date.</p>
<p>A customer I work for uses PowerShell as their scripting platform of choice, so I have been building a deeper understanding of PowerShell. With PowerShell, it's reasonably easy to work with the file system and to convert from and to JSON. So I thought, how hard can it be to script this?</p>
<p>The following script collects the paths of all <code>*.csproj</code> files in the solution directory, excludes those matching a regular expression, and writes the result to disk in the <code>.slnf</code> format.</p>
<pre><code class="language-ps">$inputSln = &quot;SlnFilter.sln&quot;
$outputSlnFilter = &quot;SlnFilter.Generated.slnf&quot;

# Collect the relative paths of every *.csproj below the current directory
$projectFiles = Get-ChildItem -Recurse -Filter &quot;*.csproj&quot; -Name

# Regular expressions; any project matching one of these is excluded
# $excludeFilters = @()
$excludeFilters = @('.ConcreteBravo')

$targetProjects = New-Object Collections.Generic.List[String]

foreach ($project in $projectFiles)
{
    $shouldInclude = $true

    foreach ($filter in $excludeFilters)
    {
        $shouldInclude = $project -notmatch $filter
        if (!$shouldInclude)
        {
            break
        }
    }

    if ($shouldInclude)
    {
        $targetProjects.Add($project)
    }
}

# Shape the objects to match the .slnf schema: { &quot;solution&quot;: { &quot;path&quot;, &quot;projects&quot; } }
$sln = New-Object -TypeName psobject
$sln | Add-Member -MemberType NoteProperty -Name &quot;path&quot; -Value $inputSln
$sln | Add-Member -MemberType NoteProperty -Name &quot;projects&quot; -Value $targetProjects

$root = New-Object -TypeName psobject
$root | Add-Member -MemberType NoteProperty -Name &quot;solution&quot; -Value $sln

$root | ConvertTo-Json | Out-File $outputSlnFilter
</code></pre>
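<p>Since the exclude logic is just pattern matching over file paths, the same idea ports to other environments. Below is a rough POSIX shell sketch using the same assumed file names as the PowerShell version. One caveat: Visual Studio writes backslash-separated paths into <code>.slnf</code> files, while this sketch keeps the platform's forward slashes.</p>

```shell
#!/bin/sh
# Collect *.csproj paths, drop those matching the exclude pattern,
# and emit the result in the .slnf JSON shape.
INPUT_SLN="SlnFilter.sln"
OUTPUT_SLNF="SlnFilter.Generated.slnf"
EXCLUDE_PATTERN=".ConcreteBravo"

{
  printf '{\n  "solution": {\n    "path": "%s",\n    "projects": [\n' "$INPUT_SLN"
  # Strip the leading "./", filter, wrap each path in quotes with a trailing
  # comma, then remove the comma on the last entry to keep the JSON valid.
  find . -name '*.csproj' | sed 's|^\./||' | grep -v "$EXCLUDE_PATTERN" \
    | sed 's|.*|      "&",|' | sed '$s/,$//'
  printf '    ]\n  }\n}\n'
} > "$OUTPUT_SLNF"
```

<p>The trade-off is the same as with the PowerShell version: the script is trivial, but someone has to remember to re-run it whenever projects change.</p>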
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>I like this new feature as a way to manage my larger solutions. Of course, it's not practical to maintain my (very basic) script for this. If you think this would be a valuable feature, it would be a huge help to upvote the Visual Studio Developer Community issue.</p>
<p>As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/SolutionFilter" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="sources"><a href="#sources">Sources</a></h2>
<ul>
<li><a href="https://docs.microsoft.com/en-us/visualstudio/ide/filtered-solutions?view=vs-2019" class="external">slnf in VisualStudio</a></li>
<li><a href="https://docs.microsoft.com/en-us/visualstudio/msbuild/solution-filters?view=vs-2019" class="external">slnf in MSBuild</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2021/05/23/generate-csharp-client-for-openapi.html</id>
    <title type="text"><![CDATA[Generate C# client for OpenAPI]]></title>
    <summary type="text"><![CDATA[A look at using OpenAPI clients in C#]]></summary>
    <published>2021-05-23T00:00:00+02:00</published>
    <updated>2021-05-23T00:00:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2021/05/23/generate-csharp-client-for-openapi.html" />
    <category term="C#" />
    <category term="NSwag" />
    <category term="OpenAPI" />
    <category term="Swashbuckle" />
    <content type="html"><![CDATA[<blockquote>
<p><strong>note</strong>: on 2023-04-14 I published a revised version of this article, which you can find here: <a href="https://kaylumah.nl/2023/04/14/csharp-client-for-openapi-revistted.html">&quot;Generate C# client for OpenAPI - Revisited&quot;</a>. It compares an alternative to the approach described in this article and includes an update to <code>net7.0</code>. If you are looking specifically for configuring NSwag, this article still gives you the general idea :)</p>
</blockquote>
<p>I've recently worked on a project where I was the consumer of a third-party API. Luckily for me, we decided on an OpenAPI specification, which made integrating services a breeze. If you have been following my content, you know I often use C# in my projects. So I needed a type-safe client for use in my C# code base.</p>
<p>To accomplish my goals, I used the <a href="https://github.com/RicoSuter/NSwag/wiki/NSwag.MSBuild" class="external">NSwag library</a> created by Rico Suter. This project provides me with an MSBuild task for generating clients. In my case, I used a JSON file version to generate my client. NSwag is not limited to just one way of working.</p>
<h2 id="what-is-openapi"><a href="#what-is-openapi">What is OpenAPI</a></h2>
<p>First, a quick recap of what OpenAPI is. According to the <a href="https://swagger.io/specification/" class="external">official definition</a>:</p>
<blockquote>
<p>The OpenAPI Specification (OAS) defines a standard, language-agnostic interface to RESTful APIs which allows both humans and computers to discover and understand the capabilities of the service without access to source code, documentation, or through network traffic inspection. When properly defined, a consumer can understand and interact with the remote service with a minimal amount of implementation logic.</p>
<p>An OpenAPI definition can then be used by documentation generation tools to display the API, code generation tools to generate servers and clients in various programming languages, testing tools, and many other use cases.</p>
</blockquote>
<p>That's pretty cool. Also, if you are wondering about the difference between OpenAPI and Swagger: Swagger has been part of the OpenAPI Initiative since 2015, but in short, OpenAPI = specification, Swagger = tooling. In this article, I am not going into much detail on setting up your API, but Microsoft <a href="https://docs.microsoft.com/en-us/aspnet/core/tutorials/web-api-help-pages-using-swagger?view=aspnetcore-5.0#openapi-vs-swagger" class="external">described</a> three approaches to combining it with .NET Core.</p>
<h2 id="generate-client-from-file"><a href="#generate-client-from-file">Generate client from file</a></h2>
<p>The first version uses a file to generate our code. In our case, we will use a <a href="https://petstore.swagger.io/v2/swagger.json" class="external">JSON file</a> from the <a href="https://petstore.swagger.io/" class="external">PetStore</a> example project as provided by the swagger team.</p>
<pre><code class="language-shell">dotnet new classlib --framework netstandard2.0 --output src/Sdks/PetStore --name Kaylumah.GenerateCSharpClientForOpenAPI.Sdks.PetStore
dotnet add package NSwag.MSBuild
dotnet add package System.ComponentModel.Annotations
dotnet add package Newtonsoft.Json
</code></pre>
<p>Save the PetStore OpenAPI JSON in the project we just created under the name <code>swagger.json</code>. We also need a <code>nswag.json</code> file with the following contents:</p>
<pre><code class="language-json">{
    &quot;runtime&quot;: &quot;NetCore31&quot;,
    &quot;documentGenerator&quot;: {
        &quot;fromDocument&quot;: {
            &quot;json&quot;: &quot;swagger.json&quot;
        }
    },
    &quot;codeGenerators&quot;: {
        &quot;openApiToCSharpClient&quot;: {
            &quot;output&quot;: &quot;Client.g.cs&quot;
        }
    }
}
</code></pre>
<p>We use an MSBuild task that calls NSwag. Update the <code>...Sdks.Petstore.csproj</code> project file to look like this.</p>
<pre><code class="language-xml">&lt;Project Sdk=&quot;Microsoft.NET.Sdk&quot;&gt;

  &lt;PropertyGroup&gt;
    &lt;TargetFramework&gt;netstandard2.0&lt;/TargetFramework&gt;
  &lt;/PropertyGroup&gt;

  &lt;ItemGroup&gt;
    &lt;PackageReference Include=&quot;Newtonsoft.Json&quot; Version=&quot;13.0.1&quot; /&gt;
    &lt;PackageReference Include=&quot;NSwag.MSBuild&quot; Version=&quot;13.11.1&quot;&gt;
      &lt;IncludeAssets&gt;runtime; build; native; contentfiles; analyzers; buildtransitive&lt;/IncludeAssets&gt;
      &lt;PrivateAssets&gt;all&lt;/PrivateAssets&gt;
    &lt;/PackageReference&gt;
    &lt;PackageReference Include=&quot;System.ComponentModel.Annotations&quot; Version=&quot;5.0.0&quot; /&gt;
  &lt;/ItemGroup&gt;

  &lt;Target Name=&quot;GenerateSdk&quot; BeforeTargets=&quot;Build&quot;&gt;
    &lt;Exec Command=&quot;$(NSwagExe_Core31) run nswag.json &quot; /&gt;
  &lt;/Target&gt;

&lt;/Project&gt;
</code></pre>
<p>After building the project, we have a file named <code>Client.g.cs</code> containing everything we need to consume the PetStore API. We can use a console application to verify that we can make API calls.</p>
<pre><code class="language-shell">dotnet new console --framework netcoreapp3.1 --output src/Client/ApiClient --name Kaylumah.GenerateCSharpClientForOpenAPI.Client.ApiClient
</code></pre>
<p>An example call we can make with our API looks like this:</p>
<pre><code class="language-cs">using System;
using System.Net.Http;
using System.Threading.Tasks;

namespace Kaylumah.GenerateCSharpClientForOpenAPI.Client.ApiClient
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var httpClient = new HttpClient();
            var apiClient = new MyNamespace.Client(httpClient);
            var result = await apiClient.GetInventoryAsync();
            Console.WriteLine(string.Join(&quot;|&quot;, result.Keys));
        }
    }
}

</code></pre>
<h2 id="influence-created-output"><a href="#influence-created-output">Influence created output</a></h2>
<p>We have established that we have a working C# client for the PetStore API. Let us look at the generated result. We got DTOs for every definition in the definitions part of the specification. We also got a class named <code>Client</code> with methods such as <code>GetInventoryAsync</code>. All the generated code in <code>Client.g.cs</code> is part of the namespace <code>MyNamespace</code>; this is not helpful if I wanted to create a project with many API clients.</p>
<p>Two things influence the generated code. First, how you specify your fields has the most impact: for example, whether your fields are required, whether they are nullable, and which kinds of values are allowed. You cannot always influence this, as sometimes you consume an external API; such is the case with our PetStore implementation. Luckily, we can control the output by tuning values in our NSwag configuration. An eagle-eyed reader will have noticed that we are already doing this: our nswag.json is responsible for the result. In this case, we are using the <code>output</code> variable to control the generated file's name.</p>
<p>We control the output by using an NSwag configuration document usually called <code>*.nswag</code> or <code>nswag.json</code>. It can be generated via NSwagStudio or manually. Over at the <a href="https://github.com/RicoSuter/NSwag/wiki/NSwag-Configuration-Document" class="external">NSwag Wiki</a> you can read all about it. It's outside of the scope of this article to go into all options, so I will demonstrate a couple of changes I like to make in my projects.</p>
<blockquote>
<p><strong>Note</strong>: You can generate a nswag configuration file by running <code>&lt;Exec Command=&quot;$(NSwagExe_Core31) new&quot; /&gt;</code>.</p>
</blockquote>
<p>I encourage you to take a look at the documentation to see all configuration options. Some options apply to every generator, and some only to C# clients. See the table below for links to every section. Every section describes the options and default values if applicable.</p>
<table>
<thead>
<tr>
<th>Settings</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="https://github.com/RicoSuter/NSwag/wiki/ClientGeneratorBaseSettings" class="external">ClientGeneratorBaseSettings</a></td>
<td>Common settings for all client code generators.</td>
</tr>
<tr>
<td><a href="https://github.com/RicoSuter/NSwag/wiki/CSharpGeneratorBaseSettings" class="external">CSharpGeneratorBaseSettings</a></td>
<td>Base settings for all C# code generators.</td>
</tr>
<tr>
<td><a href="https://github.com/RicoSuter/NSwag/wiki/CSharpClientGeneratorSettings" class="external">CSharpClientGeneratorSettings</a></td>
<td>Settings for C# clients.</td>
</tr>
</tbody>
</table>
<p>If you look closely at your build log, you see the following line: <code>Executing file 'nswag.json' with variables ''...</code>. So how do we pass variables to NSwag? Update the command to <code>$(NSwagExe_Core31) run nswag.json /variables:Configuration=$(Configuration)</code>. Here we define a variable named Configuration and assign it the MSBuild value for $(Configuration). If we build our project, the log line reads <code>Executing file 'nswag.json' with variables 'Configuration=Debug'...</code>. You also have the option to supply default values in your NSwag configuration. This way, you don't see it as part of your build log, but it helps omit parts from the command.</p>
<table>
<thead>
<tr>
<th>Property</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>namespace</code> and <code>contractsNamespace</code></td>
<td>Control the namespace of the generated code</td>
</tr>
<tr>
<td><code>generateContractsOutput</code> and <code>contractsOutputFilePath</code></td>
<td>Control separation of contract and implementation</td>
</tr>
<tr>
<td><code>generateClientInterfaces</code></td>
<td>Create a client interface</td>
</tr>
<tr>
<td><code>exceptionClass</code> and <code>className</code></td>
<td>Control class names</td>
</tr>
<tr>
<td><code>operationGenerationMode</code></td>
<td>How to create clients for multiple endpoints</td>
</tr>
</tbody>
</table>
<p>After our modifications, our NSwag file looks like this.</p>
<pre><code class="language-json">{
    &quot;runtime&quot;: &quot;NetCore31&quot;,
    &quot;defaultVariables&quot;: &quot;Configuration=Debug&quot;,
    &quot;documentGenerator&quot;: {
        &quot;fromDocument&quot;: {
            &quot;json&quot;: &quot;$(InputDocument)&quot;
        }
    },
    &quot;codeGenerators&quot;: {
        &quot;openApiToCSharpClient&quot;: {
            &quot;generateClientInterfaces&quot;: true,
            &quot;exceptionClass&quot;: &quot;$(SdkName)ApiException&quot;,
            &quot;useBaseUrl&quot;: true,
            &quot;generateBaseUrlProperty&quot;: true,
            &quot;generateContractsOutput&quot;: true,
            &quot;contractsNamespace&quot;: &quot;$(SdkNamespace).Interface&quot;,
            &quot;contractsOutputFilePath&quot;: &quot;$(GeneratedContractFile)&quot;,
            &quot;className&quot;: &quot;$(SdkName)Client&quot;,
            &quot;operationGenerationMode&quot;: &quot;SingleClientFromOperationId&quot;,
            &quot;namespace&quot;: &quot;$(SdkNamespace).Service&quot;,
            &quot;output&quot;: &quot;$(GeneratedClientFile)&quot;
        }
    }
}
</code></pre>
<p>To pass all the values to NSwag, we update our csproj file to look like this. For demonstration purposes, I show that the name of the MSBuild variable does not need to match the NSwag variable. Do take care that the variable names passed to NSwag match the names in nswag.json.</p>
<pre><code class="language-xml">&lt;Project Sdk=&quot;Microsoft.NET.Sdk&quot;&gt;

  &lt;PropertyGroup&gt;
    &lt;TargetFramework&gt;netstandard2.0&lt;/TargetFramework&gt;
  &lt;/PropertyGroup&gt;

  &lt;ItemGroup&gt;
    &lt;PackageReference Include=&quot;Newtonsoft.Json&quot; Version=&quot;13.0.1&quot; /&gt;
    &lt;PackageReference Include=&quot;NSwag.MSBuild&quot; Version=&quot;13.11.1&quot;&gt;
      &lt;IncludeAssets&gt;runtime; build; native; contentfiles; analyzers; buildtransitive&lt;/IncludeAssets&gt;
      &lt;PrivateAssets&gt;all&lt;/PrivateAssets&gt;
    &lt;/PackageReference&gt;
    &lt;PackageReference Include=&quot;System.ComponentModel.Annotations&quot; Version=&quot;5.0.0&quot; /&gt;
  &lt;/ItemGroup&gt;

  &lt;Target Name=&quot;GenerateSdk&quot; BeforeTargets=&quot;Build&quot;&gt;
    &lt;PropertyGroup&gt;
        &lt;OpenAPIDocument&gt;swagger.json&lt;/OpenAPIDocument&gt;
        &lt;NSwagConfiguration&gt;nswag.json&lt;/NSwagConfiguration&gt;

        &lt;SdkNamespace&gt;$(RootNamespace)&lt;/SdkNamespace&gt;
        &lt;SdkName&gt;PetStore&lt;/SdkName&gt;
        &lt;GeneratedInterfaceFile&gt;$(SdkName).Interface.g.cs&lt;/GeneratedInterfaceFile&gt;
        &lt;GeneratedServiceFile&gt;$(SdkName).Service.g.cs&lt;/GeneratedServiceFile&gt;

    &lt;/PropertyGroup&gt;
    &lt;Error Text=&quot;The OpenAPI document '$(OpenAPIDocument)' does not exist!&quot; Condition=&quot;!Exists('$(OpenAPIDocument)')&quot; /&gt;
    &lt;Error Text=&quot;The NSwag configuration '$(NSwagConfiguration)' does not exist!&quot; Condition=&quot;!Exists('$(NSwagConfiguration)')&quot; /&gt;
    &lt;Exec Command=&quot;$(NSwagExe_Core31) run $(NSwagConfiguration) /variables:Configuration=$(Configuration),InputDocument=$(OpenAPIDocument),SdkName=$(SdkName),SdkNamespace=$(SdkNamespace),GeneratedClientFile=$(GeneratedServiceFile),GeneratedContractFile=$(GeneratedInterfaceFile)&quot; /&gt;
  &lt;/Target&gt;

&lt;/Project&gt;
</code></pre>
<h2 id="generate-client-from-api-in-your-project"><a href="#generate-client-from-api-in-your-project">Generate client from API in your project</a></h2>
<p>Our second version generates the SDK based on a .NET Core API project in our solution, which can be very useful if you want to provide the client in a NuGet package to other projects/teams in your organization. The project setup will be almost identical to our file-based setup.</p>
<pre><code class="language-shell">dotnet new classlib --framework netstandard2.0 --output src/Sdks/FromNswagApi --name Kaylumah.GenerateCSharpClientForOpenAPI.Sdks.FromNswagApi
dotnet add package NSwag.MSBuild
dotnet add package System.ComponentModel.Annotations
dotnet add package Newtonsoft.Json
</code></pre>
<p>If we are going to create an SDK, we first need an API project. We generate a webapi with the following command:</p>
<pre><code class="language-sh">dotnet new webapi --framework netcoreapp3.1 --output src/Apis/Nswag/WeatherForecastApi --name Kaylumah.GenerateCSharpClientForOpenAPI.Apis.Nswag.WeatherForecastApi
</code></pre>
<p>Note that I am specifying the optional --framework option when creating the projects, for two reasons. First, I prefer to use LTS versions of the Microsoft SDK. Second, Microsoft made <a href="https://docs.microsoft.com/en-us/aspnet/core/release-notes/aspnetcore-5.0?view=aspnetcore-5.0#openapi-specification-on-by-default" class="external">changes</a> to the webapi template in the NET5 SDK that enable OpenAPI on an opt-out basis using Swashbuckle, which I don't want in this case.</p>
<pre><code class="language-json">{
    &quot;runtime&quot;: &quot;NetCore31&quot;,
    &quot;documentGenerator&quot;: {
        &quot;aspNetCoreToOpenApi&quot;: {
            &quot;project&quot;: &quot;../../Apis/Nswag/WeatherForecastApi/Kaylumah.GenerateCSharpClientForOpenAPI.Apis.Nswag.WeatherForecastApi.csproj&quot;
        }
    },
    &quot;codeGenerators&quot;: {
        &quot;openApiToCSharpClient&quot;: {
            &quot;generateClientInterfaces&quot;: true,
            &quot;exceptionClass&quot;: &quot;$(SdkName)ApiException&quot;,
            &quot;useBaseUrl&quot;: true,
            &quot;generateBaseUrlProperty&quot;: true,
            &quot;generateContractsOutput&quot;: true,
            &quot;contractsNamespace&quot;: &quot;$(SdkNamespace).Interface&quot;,
            &quot;contractsOutputFilePath&quot;: &quot;$(GeneratedContractFile)&quot;,
            &quot;className&quot;: &quot;$(SdkName)Client&quot;,
            &quot;operationGenerationMode&quot;: &quot;SingleClientFromOperationId&quot;,
            &quot;namespace&quot;: &quot;$(SdkNamespace).Service&quot;,
            &quot;output&quot;: &quot;$(GeneratedClientFile)&quot;
        }
    }
}
</code></pre>
<p>Like before, we need a <code>GenerateSdk</code> target; the difference is that we don't have a <code>swagger.json</code>.</p>
<pre><code class="language-xml">&lt;Target Name=&quot;GenerateSdk&quot; BeforeTargets=&quot;Build&quot;&gt;
&lt;PropertyGroup&gt;
    &lt;NSwagConfiguration&gt;nswag.json&lt;/NSwagConfiguration&gt;

    &lt;SdkNamespace&gt;$(RootNamespace)&lt;/SdkNamespace&gt;
    &lt;SdkName&gt;Weather&lt;/SdkName&gt;
    &lt;GeneratedInterfaceFile&gt;$(SdkName).Interface.g.cs&lt;/GeneratedInterfaceFile&gt;
    &lt;GeneratedServiceFile&gt;$(SdkName).Service.g.cs&lt;/GeneratedServiceFile&gt;

&lt;/PropertyGroup&gt;
&lt;Error Text=&quot;The NSwag configuration '$(NSwagConfiguration)' does not exist!&quot; Condition=&quot;!Exists('$(NSwagConfiguration)')&quot; /&gt;
&lt;Exec Command=&quot;$(NSwagExe_Core31) run $(NSwagConfiguration) /variables:Configuration=$(Configuration),SdkName=$(SdkName),SdkNamespace=$(SdkNamespace),GeneratedClientFile=$(GeneratedServiceFile),GeneratedContractFile=$(GeneratedInterfaceFile)&quot; /&gt;
&lt;/Target&gt;
</code></pre>
<p>If we try to build our project now, we get an error.</p>
<pre><code class="language-output">Microsoft (R) Build Engine version 16.9.0+57a23d249 for .NET
Copyright (C) Microsoft Corporation. All rights reserved.

  Determining projects to restore...
  All projects are up-to-date for restore.
  NSwag command line tool for .NET Core NetCore31, toolchain v13.11.1.0 (NJsonSchema v10.4.3.0 (Newtonsoft.Json v12.0.0.0))
  Visit http://NSwag.org for more information.
  NSwag bin directory: /Users/maxhamulyak/.nuget/packages/nswag.msbuild/13.11.1/tools/NetCore31
  
  Executing file 'nswag.json' with variables 'Configuration=Debug'...
  Launcher directory: /Users/maxhamulyak/.nuget/packages/nswag.msbuild/13.11.1/tools/NetCore31
  System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation.
   ---&gt; System.InvalidOperationException: No service for type 'NSwag.Generation.IOpenApiDocumentGenerator' has been registered.
</code></pre>
<p>The reason behind this error is that the tool requires NSwag in the API project. To fix this, we install the NSwag.AspNetCore package with <code>dotnet add package NSwag.AspNetCore</code>. Setting up an API project with NSwag is outside the scope of this tutorial; luckily, the <a href="https://github.com/RicoSuter/NSwag#usage-in-c" class="external">guide</a> is straightforward. We modify the <code>ConfigureServices</code> method in Startup.cs with <code>services.AddOpenApiDocument();</code>, and we add <code>app.UseOpenApi();</code> and <code>app.UseSwaggerUi3();</code> to the <code>Configure</code> method. With these changes, we have an OpenAPI specification for our WeatherForecast controller and can easily view and test it with Swagger UI.</p>
<p>Now we can successfully generate a client for the WeatherForecastAPI!</p>
<h2 id="generate-client-from-swashbuckle-project"><a href="#generate-client-from-swashbuckle-project">Generate client from Swashbuckle project</a></h2>
<p>The third and final version I will look at is a combination of both previous versions. I already hinted at it in the last section, but Microsoft made some <a href="https://docs.microsoft.com/en-us/aspnet/core/release-notes/aspnetcore-5.0?view=aspnetcore-5.0#openapi-specification-on-by-default" class="external">changes</a> to the template to generate them by default using Swashbuckle.</p>
<pre><code class="language-shell">dotnet new classlib --framework netstandard2.0 --output src/Sdks/FromSwashbuckleApi --name Kaylumah.GenerateCSharpClientForOpenAPI.Sdks.FromSwashbuckleApi
dotnet add package NSwag.MSBuild
dotnet add package System.ComponentModel.Annotations
dotnet add package Newtonsoft.Json
</code></pre>
<p>Like before, we also need a webapi project.</p>
<pre><code class="language-sh">dotnet new webapi --framework netcoreapp3.1 --output src/Apis/Swashbuckle/WeatherForecastApi --name Kaylumah.GenerateCSharpClientForOpenAPI.Apis.Swashbuckle.WeatherForecastApi
</code></pre>
<p>Of course, we could launch the API project and browse to <code>https://localhost:5001/swagger</code> and download the specification from there. But I will opt for automating the process with a <a href="https://github.com/domaindrivendev/Swashbuckle.AspNetCore#swashbuckleaspnetcorecli" class="external">CLI</a> provided as a dotnet tool by Swashbuckle.</p>
<p>Since we are using netcoreapp3.1, we can make use of a local tool manifest.</p>
<pre><code class="language-sh">dotnet new tool-manifest
dotnet tool install --version 6.1.4 Swashbuckle.AspNetCore.Cli
</code></pre>
<p>This allows us to run <code>swagger tofile --output [output] [startupassembly] [swaggerdoc]</code>. For example, in the FromSwashbuckleApi folder we would run:</p>
<pre><code class="language-sh">dotnet swagger tofile --output swagger.json ../../Apis/Swashbuckle/WeatherForecastApi/bin/Debug/netcoreapp3.1/Kaylumah.GenerateCSharpClientForOpenAPI.Apis.Swashbuckle.WeatherForecastApi.dll v1
</code></pre>
<p>At the moment, this returns an error if you target a netcoreapp3.1 project when using a net5 SDK. This <a href="https://github.com/domaindrivendev/Swashbuckle.AspNetCore/issues/2006" class="external">issue</a> describes a change in 6.x of the tool. A workaround for this is using a global.json file.</p>
<pre><code class="language-json">{
    &quot;sdk&quot;: {
        &quot;version&quot;: &quot;3.1.406&quot;,
        &quot;rollForward&quot;: &quot;latestPatch&quot;
    }
}
</code></pre>
<p>Similar to the NSwag version, we still need to add Swashbuckle to the webapi. Luckily, just as with NSwag, the <a href="https://github.com/domaindrivendev/Swashbuckle.AspNetCore#getting-started" class="external">guide</a> is straightforward.</p>
<pre><code class="language-xml">&lt;Target Name=&quot;GenerateOpenAPI&quot; BeforeTargets=&quot;GenerateSdk&quot;&gt;
  &lt;Exec Command=&quot;dotnet swagger tofile --output swagger.json ../../Apis/Swashbuckle/WeatherForecastApi/bin/Debug/netcoreapp3.1/Kaylumah.GenerateCSharpClientForOpenAPI.Apis.Swashbuckle.WeatherForecastApi.dll v1&quot; /&gt;
&lt;/Target&gt;

&lt;Target Name=&quot;GenerateSdk&quot; BeforeTargets=&quot;Build&quot;&gt;
  &lt;PropertyGroup&gt;
    &lt;OpenAPIDocument&gt;swagger.json&lt;/OpenAPIDocument&gt;
    &lt;NSwagConfiguration&gt;nswag.json&lt;/NSwagConfiguration&gt;

    &lt;SdkNamespace&gt;$(RootNamespace)&lt;/SdkNamespace&gt;
    &lt;SdkName&gt;Weather&lt;/SdkName&gt;
    &lt;GeneratedInterfaceFile&gt;$(SdkName).Interface.g.cs&lt;/GeneratedInterfaceFile&gt;
    &lt;GeneratedServiceFile&gt;$(SdkName).Service.g.cs&lt;/GeneratedServiceFile&gt;
  &lt;/PropertyGroup&gt;
  &lt;Error Text=&quot;The OpenAPI document '$(OpenAPIDocument)' does not exist!&quot; Condition=&quot;!Exists('$(OpenAPIDocument)')&quot; /&gt;
  &lt;Error Text=&quot;The NSwag configuration '$(NSwagConfiguration)' does not exist!&quot; Condition=&quot;!Exists('$(NSwagConfiguration)')&quot; /&gt;
  &lt;Exec Command=&quot;$(NSwagExe_Core31) run $(NSwagConfiguration) /variables:Configuration=$(Configuration),InputDocument=$(OpenAPIDocument),SdkName=$(SdkName),SdkNamespace=$(SdkNamespace),GeneratedClientFile=$(GeneratedServiceFile),GeneratedContractFile=$(GeneratedInterfaceFile)&quot; /&gt;
&lt;/Target&gt;
</code></pre>
<p>Now that we have generated a second version of our Weather API client, let's quickly compare the two.</p>
<pre><code class="language-cs">// Swashbuckle
[System.CodeDom.Compiler.GeneratedCode(&quot;NSwag&quot;, &quot;13.11.1.0 (NJsonSchema v10.4.3.0 (Newtonsoft.Json v12.0.0.0))&quot;)]
public partial interface IWeatherClient
{
    /// &lt;returns&gt;Success&lt;/returns&gt;
    /// &lt;exception cref=&quot;WeatherApiException&quot;&gt;A server side error occurred.&lt;/exception&gt;
    System.Threading.Tasks.Task&lt;System.Collections.Generic.ICollection&lt;WeatherForecast&gt;&gt; WeatherForecastAsync();

    /// &lt;param name=&quot;cancellationToken&quot;&gt;A cancellation token that can be used by other objects or threads to receive notice of cancellation.&lt;/param&gt;
    /// &lt;returns&gt;Success&lt;/returns&gt;
    /// &lt;exception cref=&quot;WeatherApiException&quot;&gt;A server side error occurred.&lt;/exception&gt;
    System.Threading.Tasks.Task&lt;System.Collections.Generic.ICollection&lt;WeatherForecast&gt;&gt; WeatherForecastAsync(System.Threading.CancellationToken cancellationToken);

}

// NSwag
[System.CodeDom.Compiler.GeneratedCode(&quot;NSwag&quot;, &quot;13.11.1.0 (NJsonSchema v10.4.3.0 (Newtonsoft.Json v12.0.0.0))&quot;)]
public partial interface IWeatherClient
{
    /// &lt;exception cref=&quot;WeatherApiException&quot;&gt;A server side error occurred.&lt;/exception&gt;
    System.Threading.Tasks.Task&lt;System.Collections.Generic.ICollection&lt;WeatherForecast&gt;&gt; WeatherForecast_GetAsync();

    /// &lt;param name=&quot;cancellationToken&quot;&gt;A cancellation token that can be used by other objects or threads to receive notice of cancellation.&lt;/param&gt;
    /// &lt;exception cref=&quot;WeatherApiException&quot;&gt;A server side error occurred.&lt;/exception&gt;
    System.Threading.Tasks.Task&lt;System.Collections.Generic.ICollection&lt;WeatherForecast&gt;&gt; WeatherForecast_GetAsync(System.Threading.CancellationToken cancellationToken);

}
</code></pre>
<p>Funnily enough, even in specifications as small as these, there can already be differences!</p>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>As we have seen, there are multiple ways to generate a client using <code>NSwag.MSBuild</code>.
If I am writing an OpenAPI specification, I prefer the syntax of Swashbuckle for several features, such as API versioning. That is, of course, a personal preference, but since Microsoft now also ships Swashbuckle as a default, it is nice to know we can make Swashbuckle and NSwag play nice together. How I configure my APIs with OpenAPI, API versioning, and ProblemDetails will be part of a future blog post.</p>
<p>So, where do we go from here? I did not mention it in the article, but every generated client needs a <code>System.Net.Http.HttpClient</code> injected, which means we can combine it with <a href="https://docs.microsoft.com/en-us/dotnet/architecture/microservices/implement-resilient-applications/use-httpclientfactory-to-implement-resilient-http-requests" class="external">HttpClientFactory</a> and all the options it provides. Alas, that is also a topic for another day.</p>
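<p>To give a quick impression of that combination: because the generated client receives its <code>HttpClient</code> through the constructor, a typed-client registration is all it takes. The snippet below is a hedged sketch; the exact client class and base address depend on your NSwag configuration and hosting setup.</p>
<pre><code class="language-cs">public void ConfigureServices(IServiceCollection services)
{
    // Register the generated client as a typed client; HttpClientFactory
    // then manages the underlying HttpMessageHandler lifetime for us.
    services.AddHttpClient&lt;IWeatherClient, WeatherClient&gt;(client =&gt;
    {
        client.BaseAddress = new Uri(&quot;https://localhost:5001&quot;);
    });
}
</code></pre>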
<p>As always, if you have any questions, feel free to reach out. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/GenerateCSharpClientForOpenAPI" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="sources"><a href="#sources">Sources</a></h2>
<ul>
<li><a href="https://github.com/RicoSuter/NSwag/wiki/" class="external">NSwag GitHub</a></li>
<li><a href="https://github.com/domaindrivendev/Swashbuckle.AspNetCore" class="external">Swashbuckle GitHub</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2021/04/11/an-approach-to-writing-mocks.html</id>
    <title type="text"><![CDATA[Experiment with Moq, an approach to writing mocks]]></title>
    <summary type="text"><![CDATA[An experiment to create reusable mocks in my testing code]]></summary>
    <published>2021-04-11T00:00:00+02:00</published>
    <updated>2021-04-11T00:00:00+02:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2021/04/11/an-approach-to-writing-mocks.html" />
    <category term="C#" />
    <category term="Moq" />
    <category term="Testing" />
    <category term="Xunit" />
    <content type="html"><![CDATA[<p>Recently I was looking into a new way to use mocks in my unit tests. My framework of choice to write unit tests is XUnit, whereas I use Moq to create Mocks. The theory behind Moq will still apply if you use a different testing framework, and perhaps some of the things I will demonstrate will be possible in other mocking frameworks.</p>
<p>In many projects, I find that we look at essential things like:</p>
<ul>
<li>How should the architecture look?</li>
<li>Which design patterns should we use?</li>
<li>Making sure we follow the SOLID principles.</li>
<li>How should we structure our code base?</li>
</ul>
<p>At the same time, I find that we do not give our tests the same amount of love.</p>
<p>Wouter Roos, a colleague of mine over at ilionx, gave me this idea, and after experimenting with it, I liked it so much that I decided to blog about it. I tried hard to find other articles on the subject but did not find a post doing something similar. I wanted to make sure that the idea would also transfer to other aspects, such as how to mock <code>ILogger&lt;T&gt;</code>, and that is how I stumbled upon <a href="https://adamstorr.azurewebsites.net/blog/mocking-ilogger-with-moq" class="external">an excellent article</a> by Adam Storr. Coincidentally, Adam <a href="https://exceptionnotfound.net/using-moq-to-create-fluent-test-classes-in-asp-net-core/" class="external">linked</a> to a part in a series by Matthew Jones about Fluent Mocks. I have been reading articles written by Matthew for some time now but missed this one. Matthew's approach and, for that matter, Adam's proposal on testing ILogger are not quite the same as what I will propose, but I think these ideas complement each other nicely. Funnily enough, I had already used Adam's idea of creating extension methods on <code>Mock&lt;T&gt;</code> before, when setting up a mock filesystem for use in unit tests. However, I can build on that premise with what I learned from Wouter and make it even better.</p>
<h2 id="system-setup"><a href="#system-setup">System Setup</a></h2>
<p>Bear with me for a little while whilst we set up our demo scenario. In our architecture, we have defined three components. We have two resource access components and one manager. The manager is used to orchestrate our business code, and the resource access components interact with a resource, for example, a database.</p>
<!-- 
@startuml
title Architecture Component Diagram

component [Site\nManager] as Site
component [Article\nAccess] as Article
component [Author\nAccess] as Author


Site - -> Article
Site - -> Author
@enduml
 -->
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210411/approach-to-writing-mocks/architecture.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210411/approach-to-writing-mocks/architecture.png" width="323" height="226" alt="Architecture Diagram for Blog Platform Scenario" /></picture></p>
<p>Since I am writing this blog post, what better example than a use case for a blogging platform. Imagine a platform where users can create and share their content, but you can only publish posts after you have verified your account. In a sequence diagram, it might look something like this.</p>
<!-- 
@startuml
title UC: Create Article
autonumber "<b>[000]"

SiteManager -> AuthorAccess: RetrieveAuthors
AuthorAccess - -> SiteManager: RetrieveAuthorsResponse
SiteManager -> SiteManager: is valid author?

SiteManager -> ArticleAccess: CreateArticle
ArticleAccess - -> SiteManager: CreateArticleResponse
@enduml
 -->
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210411/approach-to-writing-mocks/sequence.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210411/approach-to-writing-mocks/sequence.png" width="487" height="297" alt="Sequence Diagram for Blog Platform Scenario" /></picture></p>
<p>I am going to use the dotnet CLI to create my project structure.</p>
<pre><code class="language-shell">dotnet new sln
dotnet new classlib --name Kaylumah.AdventuresWithMock.Access.Article.Interface --output src/Components/Access/Article/Interface --framework netstandard2.1
dotnet new classlib --name Kaylumah.AdventuresWithMock.Access.Article.Service --output src/Components/Access/Article/Service --framework netstandard2.1
dotnet new classlib --name Kaylumah.AdventuresWithMock.Access.Author.Interface --output src/Components/Access/Author/Interface --framework netstandard2.1
dotnet new classlib --name Kaylumah.AdventuresWithMock.Access.Author.Service --output src/Components/Access/Author/Service --framework netstandard2.1
dotnet new classlib --name Kaylumah.AdventuresWithMock.Manager.Site.Interface --output src/Components/Manager/Site/Interface --framework netstandard2.1
dotnet new classlib --name Kaylumah.AdventuresWithMock.Manager.Site.Service --output src/Components/Manager/Site/Service --framework netstandard2.1
dotnet new xunit --name Test.Unit --output test/Unit --framework netcoreapp3.1
</code></pre>
<!-- Command to print file tree -->
<!-- ls -aR | grep ":$" | perl -pe 's/:$//;s/[^-][^\/]*\//    /g;s/^    (\S)/└── \1/;s/(^    |    (?= ))/│   /g;s/    (\S)/└── \1/' -->
<pre><code class="language-output">└── src
│   └── Components
│   │   └── Access
│   │   │   └── Article
│   │   │   │   └── Interface
│   │   │   │   └── Service
│   │   │   └── Author
│   │   │   │   └── Interface
│   │   │   │   └── Service
│   │   └── Manager
│   │   │   └── Site
│   │   │   │   └── Interface
│   │   │   │   └── Service
└── test
│   └── Unit
</code></pre>
<p>If everything went fine, you should now have the directory structure shown above on disk. I like to split my components into an interface definition project and an actual implementation project. This split, of course, means that every <code>.Service</code> project needs to reference the corresponding <code>.Interface</code> project via a <code>ProjectReference</code>. Because of our architecture, the SiteManager service needs to reference the interface projects of both access services. Finally, our unit test project needs to reference the service projects so we can test them.</p>
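<p>Wiring those references up can also be done with the dotnet CLI. The commands below sketch the idea for the Article component, the SiteManager, and the test project; the remaining projects follow the same pattern.</p>
<pre><code class="language-shell">dotnet add src/Components/Access/Article/Service reference src/Components/Access/Article/Interface
dotnet add src/Components/Access/Author/Service reference src/Components/Access/Author/Interface
dotnet add src/Components/Manager/Site/Service reference src/Components/Manager/Site/Interface
dotnet add src/Components/Manager/Site/Service reference src/Components/Access/Article/Interface
dotnet add src/Components/Manager/Site/Service reference src/Components/Access/Author/Interface
dotnet add test/Unit reference src/Components/Manager/Site/Service
</code></pre>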
<blockquote>
<p>You may be wondering why I specified <code>--framework</code> after each dotnet new command; this is because it would otherwise default to <code>net5.0</code>, which would be fine for a blog post like this, but since .NET 5 is not an LTS release, I mostly abstain from using it in my projects.</p>
</blockquote>
<p>I will not include every little DTO as part of this article since those classes will be available as part of <a href="https://github.com/kaylumah/AdventuresWithMock" class="external">the source code</a> in the end. For now, assume we have created our implementation to look like this.</p>
<p>Our Article Access</p>
<pre><code class="language-cs">using System;
using System.Threading.Tasks;
using Kaylumah.AdventuresWithMock.Access.Article.Interface;

namespace Kaylumah.AdventuresWithMock.Access.Article.Service
{
    public class ArticleAccess : IArticleAccess
    {
        public Task&lt;CreateArticlesResponse&gt; CreateArticles(CreateArticlesRequest createArticlesRequest)
        {
            throw new NotImplementedException();
        }

        public Task DeleteArticles(DeleteArticlesRequest deleteArticlesRequest)
        {
            throw new NotImplementedException();
        }

        public Task&lt;FilterArticleResponse&gt; FilterArticles(FilterArticleCriteria filterArticleCriteria = null)
        {
            throw new NotImplementedException();
        }
    }
}
</code></pre>
<p>Our Author Access</p>
<pre><code class="language-cs">using System;
using System.Threading.Tasks;
using Kaylumah.AdventuresWithMock.Access.Author.Interface;

namespace Kaylumah.AdventuresWithMock.Access.Author.Service
{
    public class AuthorAccess : IAuthorAccess
    {
        public Task&lt;FilterAuthorResponse&gt; FilterAuthors(FilterAuthorCriteria filterAuthorCriteria = null)
        {
            throw new NotImplementedException();
        }
    }
}
</code></pre>
<p>And finally, our Site Manager, which should match our sequence diagram, looks like this.</p>
<pre><code class="language-cs">using System.Linq;
using System.Threading.Tasks;
using Kaylumah.AdventuresWithMock.Access.Article.Interface;
using Kaylumah.AdventuresWithMock.Access.Author.Interface;
using Kaylumah.AdventuresWithMock.Manager.Site.Interface;

namespace Kaylumah.AdventuresWithMock.Manager.Site.Service
{
    public class SiteManager : ISiteManager
    {

        private readonly IArticleAccess _articleAccess;
        private readonly IAuthorAccess _authorAccess;

        public SiteManager(IArticleAccess articleAccess, IAuthorAccess authorAccess)
        {
            _articleAccess = articleAccess;
            _authorAccess = authorAccess;
        }

        public async Task CreateArticle(Interface.CreateArticleRequest createArticleRequest)
        {
            // Hardcoded for now, would probably come from JWT user claim.
            var authorId = 666;

            var authorsResponse = await _authorAccess.FilterAuthors(new FilterAuthorCriteria {
                AuthorIds = new int[] { authorId }
            });

            var author = authorsResponse.Authors.SingleOrDefault(x =&gt; x.Id.Equals(authorId));

            if (author == null)
            {
                return;
            }

            if (!author.Verfied)
            {
                return;
            }

            var article = new Access.Article.Interface.CreateArticleRequest
            { 
                AuthorId = authorId,
                Title = createArticleRequest.Title,
                Description = createArticleRequest.Content
            };

            var response = await _articleAccess.CreateArticles(new CreateArticlesRequest {
                CreateArticleRequests = new Access.Article.Interface.CreateArticleRequest[] {
                    article
                }
            });
        }
    }
}
</code></pre>
<p>Wait just a minute! You forgot to implement the access components and only gave us the manager. I did not forget ;-) It is to prove a point: since we are going to mock our dependencies, we never use the actual implementations.</p>
<p>Thank you for bearing with me; now that we have all that in place, we can finally get to the heart of the matter and start our adventure with Mock.</p>
<h2 id="the-problem"><a href="#the-problem">The Problem</a></h2>
<p>I have yet to explain the reason behind this article. Let us look at how we might test this code traditionally with the following snippet.</p>
<pre><code class="language-cs">[Fact]
public async Task Test_SiteManager_CreateArticle_Traditionally()
{
    // Arrange
    var authorAccessMock = new Mock&lt;IAuthorAccess&gt;();
    authorAccessMock.Setup(x =&gt; x.FilterAuthors(It.Is&lt;FilterAuthorCriteria&gt;(p =&gt; p.AuthorIds.Contains(666)))).ReturnsAsync(new FilterAuthorResponse {
        Authors = new Author[] {
            new Author {
                Id = 666,
                DisplayName = &quot;Max&quot;,
                Verfied = true
            }
        }
    });
    var articleAccessMock = new Mock&lt;IArticleAccess&gt;();
    articleAccessMock.Setup(x =&gt; x.CreateArticles(It.IsAny&lt;CreateArticlesRequest&gt;())).ReturnsAsync(new CreateArticlesResponse {
        Articles = new Article[] {
            new Article {
                Id = 1,
                AuthorId = 666,
                Title = &quot;...&quot;,
                Description = &quot;...&quot;
            }
        }
    });
    ISiteManager sut = new SiteManager(articleAccessMock.Object, authorAccessMock.Object);

    // Act
    var request = new Kaylumah.AdventuresWithMock.Manager.Site.Interface.CreateArticleRequest { 
        Title = &quot;Pretty Title&quot;,
        Content = &quot;# AdventuresWithMock ...&quot;
    };
    await sut.CreateArticle(request);

    // Assert
    authorAccessMock.Verify(x =&gt; x.FilterAuthors(It.IsAny&lt;FilterAuthorCriteria&gt;()), Times.Once);
    articleAccessMock.Verify(x =&gt; x.CreateArticles(It.IsAny&lt;CreateArticlesRequest&gt;()), Times.Once);
}
</code></pre>
<p>That is a lot of code to test a simple scenario. In its current form, it is even four lines longer than the code under test. Even worse, it is primarily boilerplate to set up the test. I often find myself repeating similar code for every test, which violates the &quot;Don't Repeat Yourself&quot; principle. So I am going to propose an alternative setup for my mock code. All you need to do is create a subclass of <code>Mock&lt;T&gt;</code> for the system you want to stub, and you are good to go.</p>
<h2 id="mocking-data-access"><a href="#mocking-data-access">Mocking Data Access</a></h2>
<p>We start with the AuthorAccessMock. We will use the constructor to pass in a <code>List&lt;Author&gt;</code> and use Moq's <code>Setup</code> method to return the internal state. Yes, that's right: because our mock is now a class, it is stateful, which means we can track state and changes on our mocks without relying solely on the <code>Verify</code> method.</p>
<pre><code class="language-cs">using System.Collections.Generic;
using System.Linq;
using Kaylumah.AdventuresWithMock.Access.Author.Interface;
using Moq;

namespace Test.Unit.Mocks
{
    public class AuthorAccessMock : Mock&lt;IAuthorAccess&gt;
    {
        public List&lt;Author&gt; Authors { get; }
        public AuthorAccessMock(List&lt;Author&gt; authors)
        {
            Authors = authors;

            Setup(x =&gt; x.FilterAuthors(It.IsAny&lt;FilterAuthorCriteria&gt;()))
                .ReturnsAsync((FilterAuthorCriteria criteria) =&gt; {

                    IQueryable&lt;Author&gt; result = Authors.AsQueryable();
                    if (criteria != null)
                    {
                        result = result.Where(x =&gt; criteria.AuthorIds.Contains(x.Id));
                    }

                    return new FilterAuthorResponse {
                        Authors = result.ToArray()
                    };
                });
        }
    }
}
</code></pre>
<p>So how does this impact our test? We create a new AuthorAccessMock and pass it to our system under test. Keep in mind this is still a <code>Mock&lt;T&gt;</code>, so to pass it along we use <code>authorAccessMock.Object</code>. Our new setup drastically decreases the setup code in my test, and at the same time, it increases the reusability of my mocks.</p>
<pre><code class="language-cs">[Fact]
public async Task Test_SiteManager_CreateArticle_RepoMocksDemo1()
{
    // Arrange
    var authorAccessMock = new AuthorAccessMock(new List&lt;Author&gt; {
        new Author { Id = 666, DisplayName = &quot;Max&quot;, Verfied = false }
    });
    var articleAccessMock = new ArticleAccessMock();
    ISiteManager sut = new SiteManager(articleAccessMock.Object, authorAccessMock.Object);

    // Act
    var request = new Kaylumah.AdventuresWithMock.Manager.Site.Interface.CreateArticleRequest
    {
        Title = &quot;Pretty Title&quot;,
        Content = &quot;# AdventuresWithMock ...&quot;
    };
    await sut.CreateArticle(request);

    // Assert
    authorAccessMock.Verify(x =&gt; x.FilterAuthors(It.IsAny&lt;FilterAuthorCriteria&gt;()), Times.Once);
    articleAccessMock.Verify(x =&gt; x.CreateArticles(It.IsAny&lt;CreateArticlesRequest&gt;()), Times.Never);
}
</code></pre>
<p>Our AuthorAccess was a bit boring. Let's extend the stateful premise by building our ArticleAccessMock, which looks a lot like a CRUD repository. There are a couple of things in the following snippet I would like to point out.</p>
<ol>
<li>I created another representation of our Article class so that our mock implementation can do a soft delete. Since we are stateful, we can then base tests on that premise.</li>
<li>I also track the request DTOs sent to my service using Moq's Callback mechanism. This way, I can make assertions about the actual input request.</li>
<li>I partially moved away from constructor setup to demonstrate that this pattern nicely complements Matthew's FluentMocks pattern.</li>
<li>Lastly, I also added a custom verify method, which takes a func as an argument; this makes it possible to write any validation I can imagine against my internal state.</li>
</ol>
<pre><code class="language-cs">using System;
using System.Collections.Generic;
using System.Linq;
using Kaylumah.AdventuresWithMock.Access.Article.Interface;
using Moq;

namespace Test.Unit.Mocks
{
    public class ArticleAccessMock : Mock&lt;IArticleAccess&gt;
    {
        public class ArticleMock
        {
            public int Id { get;set; }
            public int AuthorId { get;set; }
            public string Title { get;set; }
            public string Content { get;set; }
            public bool Removed { get;set; }
        }

        public List&lt;CreateArticlesRequest&gt; CreateArticlesRequests { get; } = new List&lt;CreateArticlesRequest&gt;();
        public List&lt;DeleteArticlesRequest&gt; DeleteArticlesRequests { get; } = new List&lt;DeleteArticlesRequest&gt;();

        private List&lt;ArticleMock&gt; _articleState = new List&lt;ArticleMock&gt;();
        private int _numberOfArticlesBeforeCreate = 0;

        public ArticleAccessMock()
        {
            Setup(access =&gt; access.CreateArticles(It.IsAny&lt;CreateArticlesRequest&gt;()))
                .Callback&lt;CreateArticlesRequest&gt;(request =&gt; {
                    CreateArticlesRequests.Add(request);
                    _numberOfArticlesBeforeCreate = _articleState.Count;
                    var nextId = _numberOfArticlesBeforeCreate + 1;
                    foreach(var createArticleRequest in request.CreateArticleRequests)
                    {
                        _articleState.Add(new ArticleMock {
                            Id = nextId,
                            AuthorId = createArticleRequest.AuthorId,
                            Content = createArticleRequest.Description,
                            Title = createArticleRequest.Title,
                            Removed = false
                        });
                        nextId++;
                    }
                })
                .ReturnsAsync(() =&gt; new CreateArticlesResponse {
                    Articles = _articleState
                    .Skip(_numberOfArticlesBeforeCreate)
                    .Select(x =&gt; new Article
                    {
                        Id = x.Id,
                        AuthorId = x.AuthorId,
                        Description = x.Content,
                        Title = x.Title
                    })
                    .ToArray()
                });
            
            Setup(access =&gt; access.DeleteArticles(It.IsAny&lt;DeleteArticlesRequest&gt;()))
                .Callback&lt;DeleteArticlesRequest&gt;(deleteArticlesRequest =&gt; {
                    DeleteArticlesRequests.Add(deleteArticlesRequest);
                    foreach(var deleteArticleRequests in deleteArticlesRequest.DeleteArticleRequests)
                    {
                        var existing = _articleState.SingleOrDefault(article =&gt; deleteArticleRequests.ArticleId == article.Id);
                        if (existing != null)
                        {
                            existing.Removed = true;
                        }
                    }
                });
        }

        public ArticleAccessMock SetupFilterArticles(List&lt;Article&gt; articles)
        {
            _articleState = articles.Select(x =&gt; new ArticleMock {
                Id = x.Id,
                AuthorId = x.AuthorId,
                Content = x.Description,
                Title = x.Title,
                Removed = false
            }).ToList();

            Setup(x =&gt; x.FilterArticles(It.IsAny&lt;FilterArticleCriteria&gt;()))
                .ReturnsAsync((FilterArticleCriteria criteria) =&gt; {
                    IQueryable&lt;ArticleMock&gt; result = _articleState.AsQueryable();
                    if (criteria != null)
                    {
                        result = result.Where(x =&gt; criteria.ArticleIds.Contains(x.Id));
                    }
                    return new FilterArticleResponse {
                        Articles = result
                            .Where(x =&gt; !x.Removed)
                            .Select(x =&gt; new Article {
                                Id = x.Id,
                                AuthorId = x.AuthorId,
                                Description = x.Content,
                                Title = x.Title
                            })
                            .ToArray()
                    };
                });

            return this;
        }

        public bool VerifyArticles(Func&lt;List&lt;ArticleMock&gt;, bool&gt; predicate)
        {
           return predicate(_articleState);
        }
    }
}
</code></pre>
<p>I usually would not write a test for my Moq code. The following snippet's purpose is to demonstrate the statefulness of our mocks. On the other hand, our mocks are now lightweight implementations of a service, so why not test them!</p>
<pre><code class="language-cs">[Fact]
public async Task Test_ArticleAccessMock_StatefullDemo1()
{
    // Arrange
    var articleAccessMock = new ArticleAccessMock()
        .SetupFilterArticles(new List&lt;Article&gt; {});
    var sut = articleAccessMock.Object;

    // Act
    var initialResponse = await sut.FilterArticles();
    var createResponse = await sut.CreateArticles(new CreateArticlesRequest {
        CreateArticleRequests = new CreateArticleRequest[] {
            new CreateArticleRequest {
                AuthorId = 666,
                Description = &quot;1&quot;,
                Title = &quot;1&quot;
            },
            new CreateArticleRequest {
                AuthorId = 666,
                Description = &quot;2&quot;,
                Title = &quot;2&quot;
            }
        }
    });

    var afterAddResponse = await sut.FilterArticles();

    await sut.DeleteArticles(new DeleteArticlesRequest {
        DeleteArticleRequests = new DeleteArticleRequest[] {
            new DeleteArticleRequest {
                ArticleId = createResponse.Articles.First().Id
            }
        }
    });

    var afterRemoveResponse = await sut.FilterArticles();


    // Assert
    initialResponse.Should().NotBeNull();
    initialResponse.Articles.Count().Should().Be(0, &quot;No articles initially&quot;);

    afterAddResponse.Should().NotBeNull();
    afterAddResponse.Articles.Count().Should().Be(2, &quot;We created two articles&quot;);

    afterRemoveResponse.Should().NotBeNull();
    afterRemoveResponse.Articles.Count().Should().Be(1, &quot;There is only one article left&quot;);

    // Verify result with predicate logic instead of Mock.Verify()
    articleAccessMock.VerifyArticles(articles =&gt; articles.Count(x =&gt; x.Removed) == 1).Should().BeTrue();
}
</code></pre>
<p>You might ask yourself: Max, if you use a constructor to set up the mock, how would I deviate in my tests when I want to test error scenarios, for example? In that case, we might as well go full circle with the Fluent Mock approach. You could do it like the following snippet. You can then choose to use the 'default' stateful mock or call only the Setup methods you want to use.</p>
<pre><code class="language-cs">public ArticleAccessMock MakeStateful(List&lt;Article&gt; articles)
{
    return this
        .SetupFilterArticles(articles)
        .SetupDeleteArticles()
        .SetupCreateArticles();
}

public ArticleAccessMock SetupDeleteArticles() { /* ... */ }
public ArticleAccessMock SetupCreateArticles() { /* ... */ }
</code></pre>
<h2 id="mocking-ilogger"><a href="#mocking-ilogger">Mocking ILogger</a></h2>
<p>I did say that Adam's article also inspired me. So let us see how stateful mocks can work for ILogger. First, a quick reminder of what we are going to mock. The <a href="https://github.com/dotnet/runtime/blob/3cbbadee12cc95bd62c70786d5408a2277a21e0a/src/libraries/Microsoft.Extensions.Logging.Abstractions/src/ILogger.cs#L23" class="external">ILogger interface</a> looks like this.</p>
<pre><code class="language-cs">/// &lt;summary&gt;
/// Writes a log entry.
/// &lt;/summary&gt;
/// &lt;param name=&quot;logLevel&quot;&gt;Entry will be written on this level.&lt;/param&gt;
/// &lt;param name=&quot;eventId&quot;&gt;Id of the event.&lt;/param&gt;
/// &lt;param name=&quot;state&quot;&gt;The entry to be written. Can be also an object.&lt;/param&gt;
/// &lt;param name=&quot;exception&quot;&gt;The exception related to this entry.&lt;/param&gt;
/// &lt;param name=&quot;formatter&quot;&gt;Function to create a &lt;see cref=&quot;string&quot;/&gt; message of the &lt;paramref name=&quot;state&quot;/&gt; and &lt;paramref name=&quot;exception&quot;/&gt;.&lt;/param&gt;
/// &lt;typeparam name=&quot;TState&quot;&gt;The type of the object to be written.&lt;/typeparam&gt;
void Log&lt;TState&gt;(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func&lt;TState, Exception?, string&gt; formatter);
</code></pre>
<p>I cannot express how happy I am that I don't need to call the Logger like that. Luckily, Microsoft offers a different extension method for every occasion. Unfortunately, Moq cannot mock extension methods. Fortunately, Adam figured out how to verify them anyway.</p>
<p>Create a <code>LoggerMock&lt;T&gt;</code> class that extends <code>Mock&lt;ILogger&lt;T&gt;&gt;</code>; we are not going to add anything custom to it just yet.</p>
<pre><code class="language-cs">using Microsoft.Extensions.Logging;
using Moq;

namespace Test.Unit.Mocks
{
    public class LoggerMock&lt;T&gt; : Mock&lt;ILogger&lt;T&gt;&gt;
    {
    }
}
</code></pre>
<p>At the same time, we will use the final result from Adam's post as a helper method to test our logging.</p>
<pre><code class="language-cs">public static Mock&lt;ILogger&lt;T&gt;&gt; VerifyLogging&lt;T&gt;(this Mock&lt;ILogger&lt;T&gt;&gt; logger, string expectedMessage, LogLevel expectedLogLevel = LogLevel.Debug, Times? times = null)
{
    times ??= Times.Once();

    Func&lt;object, Type, bool&gt; state = (v, t) =&gt; v.ToString().CompareTo(expectedMessage) == 0;

    logger.Verify(
        x =&gt; x.Log(
            It.Is&lt;LogLevel&gt;(l =&gt; l == expectedLogLevel),
            It.IsAny&lt;EventId&gt;(),
            It.Is&lt;It.IsAnyType&gt;((v, t) =&gt; state(v, t)),
            It.IsAny&lt;Exception&gt;(),
            It.Is&lt;Func&lt;It.IsAnyType, Exception, string&gt;&gt;((v, t) =&gt; true)), (Times)times);

    return logger;
}
</code></pre>
<p>With that in place, let's update the manager to log.</p>
<pre><code class="language-cs">public class SiteManager : ISiteManager
{

    // ...
    
    private readonly ILogger _logger;

    public SiteManager(IArticleAccess articleAccess, IAuthorAccess authorAccess, ILogger&lt;SiteManager&gt; logger)
    {
        // ...
        _logger = logger;
    }

    public async Task CreateArticle(Interface.CreateArticleRequest createArticleRequest)
    {
        // Hardcoded for now, would probably come from JWT user claim.
        var authorId = 666;

        /// ...

        if (author == null)
        {
            _logger.LogWarning($&quot;No author found for {authorId}&quot;);
            return;
        }

        // ...
    }
}
</code></pre>
<p>To put it to the test:</p>
<pre><code class="language-cs">[Fact]
public async Task Test_SiteManager_CreateArticle_TestLogging()
{
    // Arrange
    var loggerMock = new LoggerMock&lt;SiteManager&gt;();
    var authorAccessMock = new AuthorAccessMock(new List&lt;Author&gt; {});
    var articleAccessMock = new ArticleAccessMock();
    ISiteManager sut = new SiteManager(articleAccessMock.Object, authorAccessMock.Object, loggerMock.Object);

    // Act
    var request = new Kaylumah.AdventuresWithMock.Manager.Site.Interface.CreateArticleRequest
    {
        Title = &quot;Pretty Title&quot;,
        Content = &quot;# AdventuresWithMock ...&quot;
    };
    await sut.CreateArticle(request);

    // Assert
    authorAccessMock.Verify(x =&gt; x.FilterAuthors(It.IsAny&lt;FilterAuthorCriteria&gt;()), Times.Once);
    articleAccessMock.Verify(x =&gt; x.CreateArticles(It.IsAny&lt;CreateArticlesRequest&gt;()), Times.Never);
    loggerMock.VerifyLogging(&quot;No author found for 666&quot;, Microsoft.Extensions.Logging.LogLevel.Warning);
}
</code></pre>
<p>Wait, did that just work on the first try? Adam's extension method targets <code>Mock&lt;ILogger&lt;T&gt;&gt;</code>, so how did it work on our mock? Remember, subclassing is an <code>is a</code> relationship, which means that our LoggerMock qualifies for this extension method.</p>
<p>What would happen if we have a lot of traffic and log thousands upon thousands of requests? In that case, we can move to an alternative for methods such as <code>LogInformation</code>. For these scenarios, you can use <a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/logging/loggermessage?view=aspnetcore-5.0" class="external">LoggerMessage for high-performance logging</a>.</p>
<pre><code class="language-cs">using System;
using Microsoft.Extensions.Logging;

namespace Kaylumah.AdventuresWithMock.Manager.Site.Service
{
    public static class LoggerExtensions
    {
        private static readonly Action&lt;ILogger, int, Exception&gt; _authorNotVerfied =
            LoggerMessage.Define&lt;int&gt;(
                LogLevel.Information,
                EventIds.AuthorNotVerfied,
                &quot;Author with Id {AuthorId} is not verfied!&quot;
            );

        public static void LogAuthorNotVerfied(this ILogger logger, int authorId)
        {
            _authorNotVerfied(logger, authorId, null);
        }

        private static class EventIds
        {
            public static readonly EventId AuthorNotVerfied = new(100, nameof(AuthorNotVerfied));
        }
    }
}
</code></pre>
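<p>In the manager, the hardcoded log call for the unverified-author case would then go through the new extension method. As a sketch (the surrounding guard clause is assumed; the blog's actual code may differ slightly, and <code>Verfied</code> matches the sample model's property name):</p>
<pre><code class="language-cs">// Hypothetical branch inside SiteManager.CreateArticle
if (!author.Verfied)
{
    _logger.LogAuthorNotVerfied(author.Id);
    return;
}
</code></pre>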
<pre><code class="language-cs">[Fact]
public async Task Test_SiteManager_CreateArticle_TestLoggingExtensionMethod()
{
    // Arrange
    var loggerMock = new LoggerMock&lt;SiteManager&gt;();
    var authorAccessMock = new AuthorAccessMock(new List&lt;Author&gt; {
        new Author { Id = 666, DisplayName = &quot;Max&quot;, Verfied = false }
    });
    var articleAccessMock = new ArticleAccessMock();
    ISiteManager sut = new SiteManager(articleAccessMock.Object, authorAccessMock.Object, loggerMock.Object);

    // Act
    var request = new Kaylumah.AdventuresWithMock.Manager.Site.Interface.CreateArticleRequest
    {
        Title = &quot;Pretty Title&quot;,
        Content = &quot;# AdventuresWithMock ...&quot;
    };
    await sut.CreateArticle(request);

    // Assert
    authorAccessMock.Verify(x =&gt; x.FilterAuthors(It.IsAny&lt;FilterAuthorCriteria&gt;()), Times.Once);
    articleAccessMock.Verify(x =&gt; x.CreateArticles(It.IsAny&lt;CreateArticlesRequest&gt;()), Times.Never);
    loggerMock.VerifyLogging(&quot;Author with Id 666 is not verfied!&quot;, Microsoft.Extensions.Logging.LogLevel.Information);
    loggerMock.VerifyEventIdWasCalled(new Microsoft.Extensions.Logging.EventId(100, &quot;AuthorNotVerfied&quot;));

}
</code></pre>
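<p>The test above also calls a <code>VerifyEventIdWasCalled</code> helper that is not shown elsewhere in this post. As a sketch (a hypothetical implementation mirroring the <code>VerifyLogging</code> helper), it could live on the LoggerMock like this:</p>
<pre><code class="language-cs">// Hypothetical helper: verifies a log call was made with the expected EventId.
public void VerifyEventIdWasCalled(EventId expected, Times? times = null)
{
    times ??= Times.Once();

    Verify(
        x =&gt; x.Log(
            It.IsAny&lt;LogLevel&gt;(),
            It.Is&lt;EventId&gt;(e =&gt; e.Id == expected.Id &amp;&amp; e.Name == expected.Name),
            It.Is&lt;It.IsAnyType&gt;((v, t) =&gt; true),
            It.IsAny&lt;Exception&gt;(),
            It.Is&lt;Func&lt;It.IsAnyType, Exception, string&gt;&gt;((v, t) =&gt; true)), (Times)times);
}
</code></pre>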
<p>You are probably as surprised as I was that it did not work. As it turns out, LoggerMessage actually checks whether the LogLevel is enabled. So we add the following to our LoggerMock.</p>
<pre><code class="language-cs">public LoggerMock&lt;T&gt; SetupLogLevel(LogLevel logLevel, bool enabled = true)
{
    Setup(x =&gt; x.IsEnabled(It.Is&lt;LogLevel&gt;(p =&gt; p.Equals(logLevel))))
        .Returns(enabled);
    return this;
}
</code></pre>
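<p>With this helper in place, the arrange step of the failing test only needs one extra call. A sketch:</p>
<pre><code class="language-cs">// Enable the Information level so the LoggerMessage delegate actually logs.
var loggerMock = new LoggerMock&lt;SiteManager&gt;()
    .SetupLogLevel(LogLevel.Information);
</code></pre>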
<p>There is one last improvement I wish to make to our LoggerMock. Like our stateful repository mocks, I feel it would be beneficial to capture everything that goes into our mock. In my opinion, using predicates and LINQ gives me more control over my assertions than using the mock's internals.</p>
<p>Our final implementation looks like this:</p>
<pre><code class="language-cs">using System;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;
using Moq;

namespace Test.Unit.Mocks
{
    public class LoggerMock&lt;T&gt; : Mock&lt;ILogger&lt;T&gt;&gt;
    {
        public class LogMessageMock
        {
            public LogLevel LogLevel { get;set; }
            public EventId Event { get;set; }
            public string Message { get;set; }
        }

        public List&lt;LogMessageMock&gt; Messages { get; } = new List&lt;LogMessageMock&gt;();

        public LoggerMock()
        {
            Setup(x =&gt; x.Log(
                    It.IsAny&lt;LogLevel&gt;(),
                    It.IsAny&lt;EventId&gt;(),
                    It.Is&lt;It.IsAnyType&gt;((v, t) =&gt; true),
                    It.IsAny&lt;Exception&gt;(),
                    It.Is&lt;Func&lt;It.IsAnyType, Exception, string&gt;&gt;((v, t) =&gt; true)
                )
            )
            .Callback(new InvocationAction(invocation =&gt;
            {
                // https://stackoverflow.com/questions/52707702/how-do-you-mock-ilogger-loginformation
                // https://github.com/moq/moq4/issues/918
                var logLevel = (LogLevel)invocation.Arguments[0];
                var eventId = (EventId)invocation.Arguments[1];
                var state = invocation.Arguments[2];
                var exception = (Exception?)invocation.Arguments[3];
                var formatter = invocation.Arguments[4];

                var invokeMethod = formatter
                    .GetType()
                    .GetMethod(&quot;Invoke&quot;);

                var logMessage = (string?)invokeMethod?.Invoke(formatter, new[] { state, exception });
                Messages.Add(new LogMessageMock {
                    Event = eventId,
                    LogLevel = logLevel,
                    Message = logMessage
                });
            }));
        }

        public LoggerMock&lt;T&gt; SetupLogLevel(LogLevel logLevel, bool enabled = true)
        {
            Setup(x =&gt; x.IsEnabled(It.Is&lt;LogLevel&gt;(p =&gt; p.Equals(logLevel))))
                .Returns(enabled);
            return this;
        }
    }
}
</code></pre>
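<p>With the captured list in place, assertions can be written as plain LINQ over the messages instead of Moq verifications. A minimal sketch, assuming xUnit's <code>Assert</code> and the captured-message list shown above:</p>
<pre><code class="language-cs">// Query the captured log messages directly instead of using Verify.
var warnings = loggerMock.Messages
    .Where(m =&gt; m.LogLevel == LogLevel.Warning)
    .ToList();

Assert.Single(warnings);
Assert.Equal(&quot;No author found for 666&quot;, warnings[0].Message);
</code></pre>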
<h2 id="mocking-httpclient"><a href="#mocking-httpclient">Mocking HttpClient</a></h2>
<p>Even though our article is getting to be on the lengthy side, I found it helpful to include at least one more example. I could rewrite the filesystem sample I mentioned to match this pattern, but I decided to save that for later. I thought it would be more useful to look into mocking an HttpClient. One option would be to hide HttpClient behind an interface, but since our ArticleAccess is already the lowest point in our architecture, I see no need to hide that we use an HttpClient.</p>
<p>Since this is purely a demonstration, I am not going to set up an HTTP server. Luckily, we can use https://jsonplaceholder.typicode.com/posts for our needs. Suppose our CreateArticles method looked like this:</p>
<pre><code class="language-cs">public async Task&lt;CreateArticlesResponse&gt; CreateArticles(CreateArticlesRequest createArticlesRequest)
{
    // NOTE: not going to call them in a loop, just for demo purposes.
    var json = JsonSerializer.Serialize(createArticlesRequest.CreateArticleRequests.First());
    var response = await _httpClient.PostAsync(&quot;https://jsonplaceholder.typicode.com/posts&quot;, new StringContent(json));
    if (!response.IsSuccessStatusCode)
    {
        throw new Exception(&quot;Something went horribly wrong!&quot;);
    }
    var responseText = await response.Content.ReadAsStringAsync();
    // Map it to response
    return new CreateArticlesResponse {};
}
</code></pre>
<p>Unfortunately, you cannot achieve this by mocking HttpClient itself; you need to mock HttpMessageHandler. Depending on your needs, it might look something like the following snippet (based on <a href="https://stackoverflow.com/a/57199040/1936600" class="external">this Stack Overflow answer</a>).</p>
<pre><code class="language-csharp">using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using Moq;
using Moq.Language;
using Moq.Protected;

namespace Test.Unit.Mocks
{
    public class HttpClientMock : Mock&lt;HttpMessageHandler&gt;
    {
        private readonly List&lt;Tuple&lt;HttpStatusCode, HttpContent&gt;&gt; _responses;
        public HttpClientMock(List&lt;Tuple&lt;HttpStatusCode, HttpContent&gt;&gt; responses) : base(MockBehavior.Strict)
        {
            _responses = responses;
            SetupResponses();
        }

        private void SetupResponses()
        {
            var handlerPart = this.Protected().SetupSequence&lt;Task&lt;HttpResponseMessage&gt;&gt;(
              &quot;SendAsync&quot;,
              ItExpr.IsAny&lt;HttpRequestMessage&gt;(),
              ItExpr.IsAny&lt;CancellationToken&gt;()
           );

            foreach (var item in _responses)
            {
                handlerPart = AddReturnPart(handlerPart, item.Item1, item.Item2);
            }
        }

        private ISetupSequentialResult&lt;Task&lt;HttpResponseMessage&gt;&gt; AddReturnPart(
            ISetupSequentialResult&lt;Task&lt;HttpResponseMessage&gt;&gt; handlerPart,
            HttpStatusCode statusCode, HttpContent content)
        {
            return handlerPart.ReturnsAsync(new HttpResponseMessage()
            {
                StatusCode = statusCode,
                Content = content
            });
        }

        public static implicit operator HttpClient(HttpClientMock mock)
        {
            // Since neither HttpClient nor HttpClientMock is an interface, we can use an implicit operator to convert.
            // Saves us a call to mock.Object in the test code.
            return new HttpClient(mock.Object);
        }
        }
    }
}
</code></pre>
<p>The corresponding test would look like this:</p>
<pre><code class="language-csharp">[Fact]
public async Task Test_ArticleAccess_Returns200OK()
{
    var createArticleResponse = new StringContent(&quot;{ 'id':'anId' }&quot;, Encoding.UTF8, &quot;application/json&quot;);
    var httpClient = new HttpClientMock(new List&lt;Tuple&lt;HttpStatusCode, HttpContent&gt;&gt; {
        new Tuple&lt;HttpStatusCode, HttpContent&gt;(HttpStatusCode.OK, createArticleResponse),
    });
    var articleAccess = new ArticleAccess(httpClient);
    await articleAccess.CreateArticles(new CreateArticlesRequest{
        CreateArticleRequests = new CreateArticleRequest[] {
            new CreateArticleRequest {
                AuthorId = 666,
                Description = &quot;...&quot;,
                Title = &quot;Demo&quot;
            }
        }
    });
}
</code></pre>
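<p>For completeness, the failure path can be covered with the same mock by queueing a non-success response. A sketch (the test name and the empty body are hypothetical):</p>
<pre><code class="language-csharp">[Fact]
public async Task Test_ArticleAccess_ThrowsOnInternalServerError()
{
    // Queue a single 500 response; CreateArticles should then throw.
    var httpClient = new HttpClientMock(new List&lt;Tuple&lt;HttpStatusCode, HttpContent&gt;&gt; {
        new Tuple&lt;HttpStatusCode, HttpContent&gt;(HttpStatusCode.InternalServerError, new StringContent(&quot;&quot;)),
    });
    var articleAccess = new ArticleAccess(httpClient);

    await Assert.ThrowsAsync&lt;Exception&gt;(() =&gt; articleAccess.CreateArticles(new CreateArticlesRequest {
        CreateArticleRequests = new CreateArticleRequest[] {
            new CreateArticleRequest { AuthorId = 666, Description = &quot;...&quot;, Title = &quot;Demo&quot; }
        }
    }));
}
</code></pre>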
<h2 id="summary"><a href="#summary">Summary</a></h2>
<p>That concludes my experiment for the day. I have shown three instances where you can apply your custom subclasses of <code>Mock&lt;T&gt;</code>. The way I see it, it offers three distinct advantages:</p>
<ol>
<li>Test code and mock code is separated.</li>
<li>Mock code is reusable across tests.</li>
<li>Stateful mocking allows for more readable verification in tests.</li>
</ol>
<p>Of course, creating a mock library will take some time. You could argue whether it's worth the time to make a duplicate, albeit simplified, version of your data access. My personal opinion is that it makes debugging and reasoning about my tests easier than taking a deep dive into the Invocations and Verify that Mock provides. As I have hopefully demonstrated, one does not exclude the other; I think they can complement one another.</p>
<p>I am glad about the early results of my experiment, hence this blog post. Over time you can evolve these mocks to be even better. For example, change tracking of entities could potentially be shared across mocks. The HttpClientMock could use some more love as well; imagine hiding every detail, like StatusCode and HttpResponseMessage, from the tester. I could have saved that for another blog, but I shared this abstraction to start a dialogue with my team about testing and test setup.</p>
<p>As always, if you have any questions, feel free to reach out. I am curious to hear what you all think about this approach. Do you have suggestions or alternatives? I would love to hear about them.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/kaylumah/AdventuresWithMock" class="external">GitHub</a>.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<h2 id="sources"><a href="#sources">Sources</a></h2>
<ul>
<li><a href="https://exceptionnotfound.net/using-moq-to-create-fluent-test-classes-in-asp-net-core/" class="external">Fluent Mocks</a></li>
<li><a href="https://adamstorr.azurewebsites.net/blog/mocking-ilogger-with-moq" class="external">Testing ILogger</a></li>
<li><a href="https://github.com/Moq/moq4/wiki/Quickstart" class="external">Moq Quickstart</a></li>
<li><a href="https://medium.com/webcom-engineering-and-product/a-cleaner-way-to-create-mocks-in-net-6e039c3d1db0" class="external">Cleaner way to create mocks</a></li>
<li><a href="https://stackoverflow.com/a/57199040/1936600" class="external">Testing HttpClient</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2021/03/27/set-nuget-metadata-via-msbuild.html</id>
    <title type="text"><![CDATA[Set NuGet metadata via MSBuild]]></title>
    <summary type="text"><![CDATA[Discover how to use MSBuild to set your NuGet package's metadata]]></summary>
    <published>2021-03-27T00:00:00+01:00</published>
    <updated>2021-03-27T00:00:00+01:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2021/03/27/set-nuget-metadata-via-msbuild.html" />
    <category term="MSBuild" />
    <category term="NuGet" />
    <content type="html"><![CDATA[<p>For .NET, the standard mechanism for sharing packages is NuGet. A <code>.nupkg</code> file is an archive that contains your compiled code (DLLs), other files related to your code, and a manifest containing metadata (<a href="https://docs.microsoft.com/en-us/nuget/what-is-nuget" class="external">source</a>). This blog post will show you how data in this manifest can be controlled by using MSBuild.</p>
<p>For simplification purposes, my sample project will consist of only a single class library project. I'd like you to keep in mind that this would scale to many projects, as Microsoft did with the <a href="https://github.com/dotnet/runtime" class="external">&quot;Microsoft.Extensions packages&quot;</a>. The sky is the limit.</p>
<h2 id="setup"><a href="#setup">Setup</a></h2>
<p>There are bits of this demo that work cross-platform and bits that require you to run on Windows. For example, I like the control the <a href="https://docs.microsoft.com/en-us/dotnet/core/tools/" class="external">.NET CLI</a> gives me when creating a new project. If you prefer to use <a href="https://visualstudio.microsoft.com/vs/" class="external">Visual Studio</a>, the result will remain the same.</p>
<pre><code class="language-shell">$ dotnet new sln

The template &quot;Solution File&quot; was created successfully.

$ dotnet new classlib --framework netstandard2.0 --output src/Kaylumah.Logging.Extensions.Abstractions

The template &quot;Class library&quot; was created successfully.

Processing post-creation actions...
Running 'dotnet restore' on src/Kaylumah.Logging.Extensions.Abstractions\Kaylumah.Logging.Extensions.Abstractions.csproj...
  Determining projects to restore...
  Restored C:\Projects\NugetMetadata\src\Kaylumah.Logging.Extensions.Abstractions\Kaylumah.Logging.Extensions.Abstractions.csproj (in 84 ms).
Restore succeeded.

$ dotnet sln add src/Kaylumah.Logging.Extensions.Abstractions/Kaylumah.Logging.Extensions.Abstractions.csproj

Project `src\Kaylumah.Logging.Extensions.Abstractions\Kaylumah.Logging.Extensions.Abstractions.csproj` added to the solution.
</code></pre>
<p>I chose &quot;Kaylumah.Logging.Extensions.Abstractions&quot; to stay in line and in style with the extension packages Microsoft provides. By default, the assembly's namespace sets the unique package identifier. Of course, this only matters when publishing the package to a NuGet source like <code>https://nuget.org</code>. That is not this article's scope, as publishing the default template with only the empty <code>Class1.cs</code> file would not benefit anyone.</p>
<h2 id="why-do-we-even-need-metadata-in-our-packages"><a href="#why-do-we-even-need-metadata-in-our-packages">Why do we even need metadata in our packages?</a></h2>
<p>Before showing you how I set metadata, I'd like to show you what happens without specifying any metadata. You can run the command <a href="https://docs.microsoft.com/en-us/dotnet/core/tools/dotnet-pack#description" class="external"><code>dotnet pack</code></a> for a single project or an entire solution. If you do it for the solution, only projects marked <code>&lt;IsPackable&gt;true&lt;/IsPackable&gt;</code> generate a package. The class library we created uses the <code>Microsoft.NET.Sdk</code> and is packable by default.</p>
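<p>If your solution also contains projects that should never be packaged, such as test projects, you can opt them out explicitly. A minimal sketch:</p>
<pre><code class="language-xml">&lt;PropertyGroup&gt;
  &lt;!-- Excludes this project when running 'dotnet pack' on the solution --&gt;
  &lt;IsPackable&gt;false&lt;/IsPackable&gt;
&lt;/PropertyGroup&gt;
</code></pre>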
<pre><code class="language-shell">$ dotnet pack

Microsoft (R) Build Engine version 16.8.3+39993bd9d for .NET
Copyright (C) Microsoft Corporation. All rights reserved.

  Determining projects to restore...
  All projects are up-to-date for restore.
  Kaylumah.Logging.Extensions.Abstractions -&gt; C:\Projects\NugetMetadata\src\Kaylumah.Logging.Extensions.Abstractions\bin\Debug\netstandard2.0\Kaylumah.Logging.Extensions.Abstractions.dll
  Successfully created package 'C:\Projects\NugetMetadata\src\Kaylumah.Logging.Extensions.Abstractions\bin\Debug\Kaylumah.Logging.Extensions.Abstractions.1.0.0.nupkg'.
</code></pre>
<p>This command generated the package in my bin folder. Since I did not specify a configuration, it chose the default configuration, which is Debug. So how do we inspect &quot;Kaylumah.Logging.Extensions.Abstractions.1.0.0.nupkg&quot;? My preferred way is the <a href="https://github.com/NuGetPackageExplorer/NuGetPackageExplorer" class="external">NuGet Package Explorer</a>, which is unfortunately only available on Windows.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/001_npe_initial_metadata.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/001_npe_initial_metadata.png" width="4500" height="4000" alt="Without Metadata in NuGet Package Explorer" /></picture></p>
<p>There seems to be no metadata set by default. Let's, for a quick moment, compare it to what Microsoft adds to its packages. We can do this by downloading <a href="https://www.nuget.org/api/v2/package/Microsoft.Extensions.Logging.Console/3.1.13" class="external">the package</a> from nuget.org and view it like we just did for &quot;Kaylumah.Logging.*.nupkg&quot;. Alternatively, the NuGet Package Explorer also supports viewing metadata from remote sources such as nuget.org.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/002_console_logger_info.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/002_console_logger_info.png" width="4500" height="6000" alt="Microsoft Extensions Logging Metadata in NuGet Package Explorer" /></picture></p>
<p>Now that is what I call metadata. Remember that <code>.nupkg</code> files are archives; this means we can easily verify what the explorer was telling us about our package. You can do this by changing the extension from <code>.nupkg</code> to <code>.zip</code> and then extracting it. It contains &quot;Kaylumah.Logging.Extensions.Abstractions.nuspec&quot;, which is the manifest I was talking about in the introduction. At the moment, it looks like this:</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;package xmlns=&quot;http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd&quot;&gt;
  &lt;metadata&gt;
    &lt;id&gt;Kaylumah.Logging.Extensions.Abstractions&lt;/id&gt;
    &lt;version&gt;1.0.0&lt;/version&gt;
    &lt;authors&gt;Kaylumah.Logging.Extensions.Abstractions&lt;/authors&gt;
    &lt;requireLicenseAcceptance&gt;false&lt;/requireLicenseAcceptance&gt;
    &lt;description&gt;Package Description&lt;/description&gt;
    &lt;dependencies&gt;
      &lt;group targetFramework=&quot;.NETStandard2.0&quot; /&gt;
    &lt;/dependencies&gt;
  &lt;/metadata&gt;
&lt;/package&gt;
</code></pre>
<p>So as expected, it matches what NuGet Package Explorer shows us. The default for both id and authors is the assembly namespace, whereas description defaults to &quot;Package Description&quot;, which tells our users nothing about what the package does.</p>
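<p>For what it's worth, the package identifier does not have to follow the assembly name; it can be set explicitly with <code>PackageId</code>. A sketch (the alternative id below is made up for illustration):</p>
<pre><code class="language-xml">&lt;PropertyGroup&gt;
  &lt;!-- Hypothetical: overrides the id derived from the assembly name --&gt;
  &lt;PackageId&gt;Kaylumah.Logging.Abstractions&lt;/PackageId&gt;
&lt;/PropertyGroup&gt;
</code></pre>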
<h2 id="how-do-we-set-metadata"><a href="#how-do-we-set-metadata">How do we set metadata?</a></h2>
<p>Now that we have covered our bases, we can finally explain how to set metadata via MSBuild.</p>
<h3 id="set-metadata-from-csproj"><a href="#set-metadata-from-csproj">Set metadata from csproj</a></h3>
<p>Since we are working on a single project, the logical place to set metadata is by editing our .csproj file. I will not cover every property today, so I refer you to the <a href="https://docs.microsoft.com/en-us/nuget/reference/msbuild-targets#pack-target" class="external">pack target docs</a>. I will, however, cover the properties I often use in my projects.</p>
<p>Behind the scenes, specific MSBuild properties map to properties in the .nuspec file. To set them, we either edit the existing <code>PropertyGroup</code> in our file or add a new one. In my opinion, every package should contain branding (like author, company, and copyright information), a helpful description, and a series of tags for categorization. So in the example below, I have set these values.</p>
<pre><code class="language-xml">&lt;Project Sdk=&quot;Microsoft.NET.Sdk&quot;&gt;
  &lt;PropertyGroup&gt;
    &lt;TargetFramework&gt;netstandard2.0&lt;/TargetFramework&gt;
    &lt;Authors&gt;Max Hamulyák&lt;/Authors&gt;
    &lt;!-- Note: Company does not get added to the .nuspec but it is part of the Assembly...Attribute so I often set them all --&gt;
    &lt;Company&gt;Kaylumah&lt;/Company&gt;
    &lt;Description&gt;Logging abstractions for Kaylumah.&lt;/Description&gt;
    &lt;PackageTags&gt;logging;abstractions&lt;/PackageTags&gt;
    &lt;Copyright&gt;Copyright (c) 2021 Kaylumah&lt;/Copyright&gt; 
  &lt;/PropertyGroup&gt;
&lt;/Project&gt;
</code></pre>
<p>If we run <code>dotnet pack</code> now, we can immediately see that our package no longer has empty metadata.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/003_npe_author_metadata.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/003_npe_author_metadata.png" width="4500" height="4000" alt="With Author Metadata in NuGet Package Explorer" /></picture></p>
<p>You can also verify this in Visual Studio by checking your projects properties and clicking on the <code>Package</code> tab.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/004_vs2019_author_metadata.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/004_vs2019_author_metadata.png" width="4500" height="3000" alt="With Author Metadata in VS2019" /></picture></p>
<p>In the introduction, I talked about what exactly a NuGet package is. We are now at the part regarding other files. Since we already took care of branding, let us also add an icon. Our code is under license; how do we include that in the package?</p>
<p>Add files named <code>Logo.png</code> and <code>LICENSE</code> to the folder containing our project. We can then use the tags <code>PackageIcon</code> and <code>PackageLicenseFile</code>, respectively. We also need to tell MSBuild that these files should be part of the package. The updated project file looks like this:</p>
<pre><code class="language-xml">&lt;Project Sdk=&quot;Microsoft.NET.Sdk&quot;&gt;

  &lt;PropertyGroup&gt;
    &lt;TargetFramework&gt;netstandard2.0&lt;/TargetFramework&gt;
    &lt;Authors&gt;Max Hamulyák&lt;/Authors&gt;
    &lt;Company&gt;Kaylumah&lt;/Company&gt;
    &lt;Description&gt;Logging abstractions for Kaylumah.&lt;/Description&gt;
    &lt;PackageTags&gt;logging;abstractions&lt;/PackageTags&gt;
    &lt;Copyright&gt;Copyright (c) 2021 Kaylumah&lt;/Copyright&gt;
    &lt;PackageIcon&gt;Logo.png&lt;/PackageIcon&gt;
    &lt;PackageLicenseFile&gt;LICENSE&lt;/PackageLicenseFile&gt;
  &lt;/PropertyGroup&gt;

  &lt;ItemGroup&gt;
    &lt;None Include=&quot;Logo.png&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot; /&gt;
    &lt;None Include=&quot;LICENSE&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot;/&gt;
  &lt;/ItemGroup&gt;

&lt;/Project&gt;
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/005_npe_includingfiles_metadata.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/005_npe_includingfiles_metadata.png" width="4500" height="4000" alt="Including NuGet Package Explorer FileMetadata" /></picture></p>
<p>Regarding these files, I would like to say a couple of things before moving on to more advanced use cases.
For starters, there is more than one way to set both the Icon and the License files, which the Microsoft Docs <a href="https://docs.microsoft.com/en-us/nuget/reference/msbuild-targets#pack-target" class="external">describe</a>. Both used to have a <code>Url</code> variant that would link to the Icon or License in question. Both of those options are now deprecated, and in the case of <code>PackageLicenseFile</code>, the alternative is <code>PackageLicenseExpression</code>, which uses <code>SPDX</code> license identifiers.</p>
<blockquote>
<p><strong>note</strong>: For backwards compatibility, <code>PackageLicenseUrl</code> gets populated with <code>https://docs.microsoft.com/en-us/nuget/consume-packages/finding-and-choosing-packages#license-url-deprecation</code> if you choose to use <code>PackageLicenseFile</code> and with <code>https://licenses.nuget.org/MIT</code> for example, if your SPDX would be MIT.</p>
</blockquote>
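<p>For reference, switching from a license file to a license expression is a one-line change. A sketch, assuming an MIT-licensed package:</p>
<pre><code class="language-xml">&lt;PropertyGroup&gt;
  &lt;!-- SPDX license identifier; replaces PackageLicenseFile --&gt;
  &lt;PackageLicenseExpression&gt;MIT&lt;/PackageLicenseExpression&gt;
&lt;/PropertyGroup&gt;
</code></pre>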
<p>The second point I would like to raise regards the file names.
In my example, the value for <code>PackageIcon</code> and the name of my icon file match precisely; this is not necessary. What does matter is that <code>PackageIcon</code> matches the name we specify in the package path. Failing to do so would trigger an error such as &quot;NU5046: The icon file 'NotAnIcon.png' does not exist in the package&quot;. See a couple of samples below:</p>
<pre><code class="language-xml">&lt;!-- Visible 'False' hides the file in the Visual Studio explorer but still packages it under Logo.png --&gt;
&lt;None Include=&quot;Logo.png&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot; Visible=&quot;false&quot; /&gt;

&lt;!-- Link changes the name Visual Studio displays in the explorer but still packages it under Logo.png --&gt;
&lt;None Include=&quot;Logo.png&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot; Link=&quot;NotAnIcon.png&quot; /&gt;

&lt;!-- PackagePath rewrites the filename to Icon.png so PackageIcon remains unchanged --&gt;
&lt;None Include=&quot;KaylumahLogo.png&quot; Pack=&quot;true&quot; PackagePath=&quot;Icon.png&quot; /&gt;

&lt;!-- PackagePath rewrites the filename to KaylumahLogo.png so set PackageIcon to &quot;KaylumahLogo.png&quot; --&gt;
&lt;None Include=&quot;Icon.png&quot; Pack=&quot;true&quot; PackagePath=&quot;KaylumahLogo.png&quot; /&gt;
</code></pre>
<p>Rewriting via the package path only works for files with an extension. For historical reasons, both NuGet and MSBuild treat extensionless paths as directories. If we had used <code>LICENSE.txt</code> instead of <code>LICENSE</code>, we would have been able to modify the name in the package. Our <code>LICENSE</code> file can, however, still use both the <code>Visible</code> and the <code>Link</code> approach. For more information regarding package icons, see <a href="https://docs.microsoft.com/en-us/nuget/reference/msbuild-targets#packing-an-icon-image-file" class="external">package-icon</a>. For packing licenses without an extension, see <a href="https://docs.microsoft.com/en-us/nuget/reference/msbuild-targets#packing-a-file-without-an-extension" class="external">package-license-1</a>; for licenses with an extension, see <a href="https://docs.microsoft.com/en-us/nuget/reference/msbuild-targets#packing-a-license-expression-or-a-license-file" class="external">package-license-2</a>.</p>
<blockquote>
<p>Keep in mind that adding both Icon and License files to the package slightly increases the overall package size, which can cause slower restore times on the initial package download. This performance penalty is a trade-off you have to decide for yourself. Given today's network speeds, I think the impact isn't noticeable.</p>
</blockquote>
<h3 id="set-metadata-for-multiple-projects"><a href="#set-metadata-for-multiple-projects">Set metadata for multiple projects</a></h3>
<p>So let's, for a moment, assume our project is a huge success. We are creating more and more extension libraries. Think about the vast number of packages in <code>dotnet/runtime</code>. Even if we only added an implementation package for every <code>.Abstractions</code> package, it would be very time-consuming to set this metadata for every project. It would also violate the <a href="https://en.wikipedia.org/wiki/Don%27t_repeat_yourself" class="external">DRY principle</a>.</p>
<p>To get started, create a file called <code>Directory.Build.props</code> at the root of your solution. MSBuild looks for this file, in precisely that casing, starting from your project folder and going up until it finds a match or reaches the root of your drive. This <code>Directory.Build.props</code> file follows the same syntax we use in our <code>.csproj</code> files. To demonstrate, remove only the <code>Copyright</code> tag from the project and recreate it in the <code>Directory.Build.props</code> file. Now is the perfect moment to also demonstrate something I have not yet told you. We are using MSBuild to populate our metadata, and thus we can use the full force of MSBuild. For example, we can reference other variables and even use built-in functions. The thing about our current Copyright implementation is that if I want to release the next version after <code>31/12/2021</code>, I have to remember to update my copyright notice. We can avoid this by setting the copyright tag like below.</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;Project&gt;
    &lt;PropertyGroup&gt;
        &lt;Copyright&gt;Copyright © $(Company) $([System.DateTime]::Now.Year)&lt;/Copyright&gt;
    &lt;/PropertyGroup&gt;
&lt;/Project&gt;
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/006_npe_buildpropsv1.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/006_npe_buildpropsv1.png" width="4500" height="4000" alt="Using BuildProps NuGet Package Explorer V1" /></picture></p>
<p>What happened? Something is wrong; why do I see the copyright year 2021, but not my company name? Before explaining it, let me prove it by adding a company tag to the <code>Directory.Build.props</code> with a different value. For example:</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;Project&gt;
    &lt;PropertyGroup&gt;
        &lt;Company&gt;NotKaylumah&lt;/Company&gt;
        &lt;Copyright&gt;Copyright © $(Company) $([System.DateTime]::Now.Year)&lt;/Copyright&gt;
    &lt;/PropertyGroup&gt;
&lt;/Project&gt;
</code></pre>
<p>Unlike the <code>Copyright</code> tag, do not remove the <code>Company</code> tag from the <code>.csproj</code> file. The result, this time, is a little different.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/007_npe_buildpropsv2.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/007_npe_buildpropsv2.png" width="4500" height="4000" alt="Using BuildProps NuGet Package Explorer V2" /></picture></p>
<p>It appears that I have two different values for <code>Company</code>; this happens because <code>Directory.Build.props</code> gets imported before your project file, and <code>Directory.Build.targets</code> gets imported after it. The last registration wins. That is why, if we read the <code>System.Reflection</code> <code>AssemblyCompanyAttribute</code>, the value for <code>Company</code> is &quot;Kaylumah&quot;, but the <code>Copyright</code> value, which was expanded earlier, still contains &quot;NotKaylumah&quot;. You can verify this behaviour by running the preprocess command (<code>dotnet build -pp:fullproject.xml</code>). See the <a href="https://docs.microsoft.com/en-us/visualstudio/msbuild/msbuild-command-line-reference?view=vs-2019" class="external">MSBuild command line reference</a> for an explanation.</p>
<blockquote>
<p>A word of caution: you should not set every property this way, only the values that are shared across projects. For example, <code>Company</code> and <code>Copyright</code> are likely to be the same for every project. <code>Authors</code> and <code>PackageTags</code> could be project-specific; even <code>Description</code> could be reused if so desired. One thing is for sure: the package <code>Id</code> cannot be shared, since every package requires a unique Id.</p>
</blockquote>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;Project&gt;
    &lt;PropertyGroup&gt;
        &lt;Authors&gt;Max Hamulyák&lt;/Authors&gt;
        &lt;Company&gt;Kaylumah&lt;/Company&gt;
        &lt;Description&gt;Logging abstractions for Kaylumah.&lt;/Description&gt;
        &lt;Copyright&gt;Copyright © $(Company) $([System.DateTime]::Now.Year)&lt;/Copyright&gt;
        &lt;PackageTags&gt;logging;abstractions&lt;/PackageTags&gt;
        &lt;PackageIcon&gt;Logo.png&lt;/PackageIcon&gt;
        &lt;PackageLicenseFile&gt;LICENSE&lt;/PackageLicenseFile&gt;
    &lt;/PropertyGroup&gt;

    &lt;ItemGroup&gt;
        &lt;None Include=&quot;$(MSBuildThisFileDirectory)Logo.png&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot; /&gt;
        &lt;None Include=&quot;$(MSBuildThisFileDirectory)LICENSE&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot; /&gt;
    &lt;/ItemGroup&gt;

&lt;/Project&gt;
</code></pre>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/008_npe_buildpropsv3.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/008_npe_buildpropsv3.png" width="4500" height="4000" alt="Using BuildProps NuGet Package Explorer V3" /></picture></p>
<p>In case you are wondering where <code>$(MSBuildThisFileDirectory)</code> came from: it is one of the predefined MSBuild properties you can use. It lets us set the path without thinking about relative file paths; for the other properties, see the <a href="https://docs.microsoft.com/en-us/visualstudio/msbuild/msbuild-reserved-and-well-known-properties?view=vs-2019" class="external">Microsoft Docs</a> on the topic.</p>
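<p>To give a feel for them, here is a small sketch using two more of those predefined properties. The property names are part of MSBuild's reserved set, but this exact usage is my own illustration, not part of the demo project:</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;Project&gt;
    &lt;PropertyGroup&gt;
        &lt;!-- $(MSBuildProjectName): the file name of the project being built --&gt;
        &lt;Product&gt;$(MSBuildProjectName)&lt;/Product&gt;
        &lt;!-- $(MSBuildThisFileDirectory) already ends in a directory separator --&gt;
        &lt;PackageOutputPath&gt;$(MSBuildThisFileDirectory)artifacts&lt;/PackageOutputPath&gt;
    &lt;/PropertyGroup&gt;
&lt;/Project&gt;
</code></pre>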
<h3 id="bonus-chapter"><a href="#bonus-chapter">Bonus Chapter</a></h3>
<p>I have referred to the list of properties before. There are a couple of handy ones we have not yet discussed: the repository fields, which make sure that an artefact can always be traced back to a specific revision of your source code (under <code>repository</code> in the nuspec).</p>
<table>
<thead>
<tr>
<th>NuSpec</th>
<th>MSBuild</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>Url</td>
<td>RepositoryUrl</td>
<td>URL where the source code is located</td>
</tr>
<tr>
<td>Type</td>
<td>RepositoryType</td>
<td>The repository type, e.g. <code>git</code></td>
</tr>
<tr>
<td>Branch</td>
<td>RepositoryBranch</td>
<td>Optional repository branch</td>
</tr>
<tr>
<td>Commit</td>
<td>RepositoryCommit</td>
<td>Optional commit information</td>
</tr>
</tbody>
</table>
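<p>These map to plain MSBuild properties, so you could also hard-code them in a <code>PropertyGroup</code>. A minimal sketch (the URL is this post's demo repository; the branch value is purely illustrative):</p>
<pre><code class="language-xml">&lt;PropertyGroup&gt;
    &lt;RepositoryUrl&gt;https://github.com/Kaylumah/NugetMetadataDemo.git&lt;/RepositoryUrl&gt;
    &lt;RepositoryType&gt;git&lt;/RepositoryType&gt;
    &lt;RepositoryBranch&gt;master&lt;/RepositoryBranch&gt;
&lt;/PropertyGroup&gt;
</code></pre>
<p>Of course, a hard-coded branch or commit goes stale the moment you commit again, which is why we will pass these values in from the outside instead.</p>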
<p>Before I explain this, I am getting a bit tired of running <code>dotnet pack</code> every time. Lucky for me, there is a way to generate a package on build. Update the <code>.csproj</code> file to look like this:</p>
<pre><code class="language-xml">&lt;Project Sdk=&quot;Microsoft.NET.Sdk&quot;&gt;

  &lt;PropertyGroup&gt;
    &lt;TargetFramework&gt;netstandard2.0&lt;/TargetFramework&gt;
    &lt;GeneratePackageOnBuild&gt;true&lt;/GeneratePackageOnBuild&gt;
  &lt;/PropertyGroup&gt;

&lt;/Project&gt;
</code></pre>
<p>So back to repository info. MSBuild itself is not aware of things like source control. Fortunately, we can pass parameters from the outside to use inside MSBuild. For this, we have the <code>-p</code> or <code>-property</code> switch. The following script retrieves the URL, branch name and SHA-1 hash of the current commit.</p>
<pre><code class="language-shell">#!/bin/sh -x

REPO_URL=$(git config --get remote.origin.url)
REPO_BRANCH=$(git branch --show-current)
REPO_COMMIT=$(git rev-parse HEAD)
dotnet build -p:RepositoryUrl=&quot;$REPO_URL&quot; -p:RepositoryBranch=&quot;$REPO_BRANCH&quot; -p:RepositoryCommit=&quot;$REPO_COMMIT&quot; -p:RepositoryType=&quot;git&quot;
</code></pre>
<p>Remember, we now generate a package on build. Let us verify we see repo info by opening the created package in NuGet Package Explorer.</p>
<p><picture><source type="image/webp" srcset="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/009_npe_repoinfo.png.webp" /><img loading="lazy" src="https://kaylumah.nl/assets/images/posts/20210327/nuget-metadata/009_npe_repoinfo.png" width="4500" height="4000" alt="Repo Info in NuGet Package Explorer" /></picture></p>
<p>Even though it is OK to add repo metadata this way, there is a better alternative. This alternative does more than add metadata; it also enables source code debugging from NuGet packages. How cool is that? This technology is called <a href="https://github.com/dotnet/sourcelink" class="external">Source Link</a>.</p>
<p>Like before with the properties, I have no wish to add Source Link to every package separately. For this, create a <code>Directory.Build.targets</code> that looks like this:</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;Project&gt;
    &lt;ItemGroup&gt;
        &lt;PackageReference Include=&quot;Microsoft.SourceLink.GitHub&quot; Version=&quot;1.0.0&quot; PrivateAssets=&quot;all&quot; IsImplicitlyDefined=&quot;true&quot; /&gt;
    &lt;/ItemGroup&gt;
&lt;/Project&gt;
</code></pre>
<p>To configure source link, we need to update <code>Directory.Build.props</code> as well.</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;Project&gt;
    &lt;PropertyGroup&gt;
        &lt;Authors&gt;Max Hamulyák&lt;/Authors&gt;
        &lt;Company&gt;Kaylumah&lt;/Company&gt;
        &lt;Description&gt;Logging abstractions for Kaylumah.&lt;/Description&gt;
        &lt;Copyright&gt;Copyright © $(Company) $([System.DateTime]::Now.Year)&lt;/Copyright&gt;
        &lt;PackageTags&gt;logging;abstractions&lt;/PackageTags&gt;
        &lt;PackageIcon&gt;Logo.png&lt;/PackageIcon&gt;
        &lt;PackageLicenseFile&gt;LICENSE&lt;/PackageLicenseFile&gt;
    &lt;/PropertyGroup&gt;

    &lt;ItemGroup&gt;
        &lt;None Include=&quot;$(MSBuildThisFileDirectory)Logo.png&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot; /&gt;
        &lt;None Include=&quot;$(MSBuildThisFileDirectory)LICENSE&quot; Pack=&quot;true&quot; PackagePath=&quot;&quot; /&gt;
    &lt;/ItemGroup&gt;

    &lt;PropertyGroup&gt;
        &lt;PublishRepositoryUrl&gt;true&lt;/PublishRepositoryUrl&gt;
        &lt;EmbedUntrackedSources&gt;true&lt;/EmbedUntrackedSources&gt;
        &lt;IncludeSymbols&gt;true&lt;/IncludeSymbols&gt;
        &lt;SymbolPackageFormat&gt;snupkg&lt;/SymbolPackageFormat&gt;
    &lt;/PropertyGroup&gt;

&lt;/Project&gt;
</code></pre>
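<p>One optional refinement, not part of the setup above: when building on a CI server, Source Link recommends also enabling deterministic, normalized source paths via <code>ContinuousIntegrationBuild</code>. A sketch, assuming your CI system sets a <code>CI</code> environment variable:</p>
<pre><code class="language-xml">&lt;PropertyGroup Condition=&quot;'$(CI)' == 'true'&quot;&gt;
    &lt;ContinuousIntegrationBuild&gt;true&lt;/ContinuousIntegrationBuild&gt;
&lt;/PropertyGroup&gt;
</code></pre>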
<p>To prove that it is still working, here is the entire <code>.nuspec</code> file after adding Source Link.</p>
<pre><code class="language-xml">&lt;?xml version=&quot;1.0&quot; encoding=&quot;utf-8&quot;?&gt;
&lt;package xmlns=&quot;http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd&quot;&gt;
  &lt;metadata&gt;
    &lt;id&gt;Kaylumah.Logging.Extensions.Abstractions&lt;/id&gt;
    &lt;version&gt;1.0.0&lt;/version&gt;
    &lt;authors&gt;Max Hamulyák&lt;/authors&gt;
    &lt;requireLicenseAcceptance&gt;false&lt;/requireLicenseAcceptance&gt;
    &lt;license type=&quot;file&quot;&gt;LICENSE&lt;/license&gt;
    &lt;licenseUrl&gt;https://aka.ms/deprecateLicenseUrl&lt;/licenseUrl&gt;
    &lt;icon&gt;Logo.png&lt;/icon&gt;
    &lt;description&gt;Logging abstractions for Kaylumah.&lt;/description&gt;
    &lt;copyright&gt;Copyright © Kaylumah 2021&lt;/copyright&gt;
    &lt;tags&gt;logging abstractions&lt;/tags&gt;
    &lt;repository type=&quot;git&quot; url=&quot;https://github.com/Kaylumah/NugetMetadataDemo.git&quot; commit=&quot;3378cf33e0061b234c1f58e060489efd81e08586&quot; /&gt;
    &lt;dependencies&gt;
      &lt;group targetFramework=&quot;.NETStandard2.0&quot; /&gt;
    &lt;/dependencies&gt;
  &lt;/metadata&gt;
&lt;/package&gt;
</code></pre>
<h2 id="closing-thoughts"><a href="#closing-thoughts">Closing Thoughts</a></h2>
<p>We looked at setting metadata via MSBuild and sharing metadata between projects. You can take this even further by using MSBuild tasks to verify that packages must have a description like <a href="https://github.com/dotnet/arcade/blob/9a72efb067b74bb9147f9413ade6173b568ea1af/src/Microsoft.DotNet.Arcade.Sdk/tools/Workarounds.targets#L79" class="external">shown here</a>. It is also possible to create an entire SDK as Microsoft did with <a href="https://github.com/dotnet/arcade" class="external">Arcade</a>. Of course, Arcade goes much further than just specifying some metadata. You can read about how / why Microsoft did that <a href="https://devblogs.microsoft.com/dotnet/the-evolving-infrastructure-of-net-core/" class="external">on the devblogs</a>. I experimented with a custom SDK heavily inspired by Arcade, but that is a blog post for another day.</p>
<p>For now, I hope I was able to teach you something about the power of MSBuild and how we can use it to manipulate our NuGet packages. If you have any questions, feel free to reach out.</p>
<p>The corresponding source code for this article is on <a href="https://github.com/Kaylumah/NugetMetadataDemo" class="external">GitHub</a>, where you can see all the changes I made in sequence.</p>
<p>See you next time, stay healthy and happy coding to all 🧸!</p>
<hr />
<h2 id="sources"><a href="#sources">Sources</a></h2>
<p>This blog was written based on personal experience when creating packages. If not already explicitly linked in the text, here are some of the primary sources used in the article.</p>
<ul>
<li><a href="https://docs.microsoft.com/en-us/visualstudio/msbuild/customize-your-build?view=vs-2019" class="external">Customize your build</a></li>
<li><a href="https://docs.microsoft.com/en-us/nuget/reference/msbuild-targets" class="external">MSBuild targets</a></li>
<li><a href="https://docs.microsoft.com/en-us/nuget/create-packages/creating-a-package-dotnet-cli" class="external">Create a package dotnet cli</a></li>
<li><a href="https://docs.microsoft.com/en-us/nuget/quickstart/create-and-publish-a-package-using-the-dotnet-cli" class="external">Create and publish a package using dotnet cli</a></li>
<li><a href="https://docs.microsoft.com/en-us/visualstudio/msbuild/msbuild-reserved-and-well-known-properties?view=vs-2019" class="external">MSBuild reserved and well-known properties</a></li>
<li><a href="https://cezarypiatek.github.io/post/setting-assembly-and-package-metadata/" class="external">Setting assembly and nuget package metadata in .NET Core</a></li>
</ul>]]></content>
  </entry>
  <entry>
    <id>https://kaylumah.nl/2019/09/07/using-csharp-code-your-git-hooks.html</id>
    <title type="text"><![CDATA[Using C# code in your git hooks]]></title>
    <summary type="text"><![CDATA[Getting started with C# script in your client-side git hooks]]></summary>
    <published>2019-09-07T00:00:00+02:00</published>
    <updated>2021-03-21T00:00:00+01:00</updated>
    <author>
      <name>Max Hamulyák</name>
      <email>max@kaylumah.nl</email>
    </author>
    <link href="https://kaylumah.nl/2019/09/07/using-csharp-code-your-git-hooks.html" />
    <category term="C#" />
    <category term="Git" />
    <content type="html"><![CDATA[<h2 id="why-use-hooks"><a href="#why-use-hooks">Why use hooks?</a></h2>
<p>We, as developers, love platforms like GitHub, GitLab, Atlassian and Azure DevOps as our managed git system and collaboration platform. We also love clean code and keep inventing new linters and rules to enforce it. In my opinion, every commit should leave the codebase deployable to production. There is nothing worse than commits like “fixed style errors” or “fixed build”. These are often small mistakes you want to catch as early as possible in your development cycle. You don’t want to break the build for the next developer who pulls your ‘mistake’, or waste precious build minutes of your CI server. Say you have asked a teammate to review your code; in the meantime, the build server rejects it. You have to go back and fix this, and your teammate has to come back and possibly review again after the changes (i.e., approvals reset on new commits). That wastes a lot of time and effort.</p>
<blockquote>
<p><strong>note</strong>: I favour server-side hooks, but when using a SaaS solution this is not always a possibility; I know I would not want someone to run arbitrary code on my servers. Unfortunately, a developer can bypass client-side hooks. Until we can run, possibly sandboxed, server-side hooks on our preferred platform, we have to make the best of it by using client-side hooks.</p>
</blockquote>
<p>Git hooks are scripts that can execute at certain points of the git lifecycle. Hooks must be executable, but other than that, their power is limited only by the developer's imagination. I have seen many samples of hooks written in JavaScript (node) using tools like <a href="https://github.com/typicode/husky" class="external">husky</a> and <a href="https://github.com/conventional-changelog/commitlint" class="external">commitlint</a> to enforce a certain way of working. When I was browsing the changes in the upcoming .NET Core 3.0 release, the concept of <a href="https://docs.microsoft.com/en-us/dotnet/core/whats-new/dotnet-core-3-0#local-dotnet-tools" class="external">local tools</a> got me thinking. I knew of the existence of <a href="https://www.hanselman.com/blog/CAndNETCoreScriptingWithTheDotnetscriptGlobalTool.aspx" class="external">dotnet-script</a>; would that make it possible to use C# in my git hooks?</p>
<blockquote>
<p><strong>note</strong>: in the past I have used a set-up with node, since I occasionally work with front-end frameworks like Angular. Since I had node installed, I could use it even in my pure backend projects to enforce commit messages and such. To me it felt dirty, since it would require team members to have node installed. Using the dotnet cli feels less like a forced decision, since members are likely to have it installed already.</p>
</blockquote>
<h2 id="lets-get-started"><a href="#lets-get-started">Let’s get started!</a></h2>
<p>When you create a git repository, there is a folder called <code>hooks</code> where all the git hooks are placed. For every event there is a sample, suffixed with <code>.sample</code>, that shows the possibilities of each hook. This directory is not under source control, so we are going to create our own directory to be able to share the hooks with the team.</p>
<pre><code class="language-bash">mkdir git-hooks-example  
cd git-hooks-example  
git init  
dotnet new gitignore  
dotnet new tool-manifest  
dotnet tool install dotnet-script  
dotnet tool install dotnet-format  
mkdir .githooks
</code></pre>
<h3 id="pre-commit-hook"><a href="#pre-commit-hook">Pre-Commit Hook</a></h3>
<p>To demonstrate, we are going to create a plain hook. To check that it is working, run <strong>git commit -m “”</strong> (using an empty commit message will abort the commit). You should see the line “pre-commit hook” printed.</p>
<pre><code class="language-csharp">#!/usr/bin/env dotnet dotnet-script
Console.WriteLine(&quot;pre-commit hook&quot;);
</code></pre>
<p>To make the hooks executable and link them into <code>.git/hooks</code>, run:</p>
<pre><code class="language-bash">find .git/hooks -type f -exec rm {} \;
find .githooks -type f -exec chmod +x {} \;
find .githooks -type f -exec ln -sf ../../{} .git/hooks/ \;
</code></pre>
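<p>As an aside: newer versions of git (2.9+) can be pointed at a shared hooks directory directly, which would avoid the symlinking above. An alternative I did not use in this post:</p>
<pre><code class="language-bash">git config core.hooksPath .githooks
</code></pre>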
<p>Since we can reference other files (and even load NuGet packages) in our <code>.csx</code> scripts, we will first create a couple of files so we can have code reuse between the hooks.</p>
<p>Create a file called <strong>logger.csx</strong></p>
<pre><code class="language-csharp">public class Logger
{
    public static void LogInfo(string message)
    {
        Console.ForegroundColor = ConsoleColor.White;
        Console.Error.WriteLine(message);
    }
    public static void LogError(string message)
    {
        Console.ForegroundColor = ConsoleColor.Red;
        Console.Error.WriteLine(message);
    }
}
</code></pre>
<p>Create a file called <strong>command-line.csx</strong></p>
<pre><code class="language-csharp">#load &quot;logger.csx&quot;
public class CommandLine
{
    public static string Execute(string command)
    {
        // according to: https://stackoverflow.com/a/15262019/637142
        // thanks to this we will pass everything as one command
        command = command.Replace(&quot;\&quot;&quot;, &quot;\&quot;\&quot;&quot;);
        var proc = new Process
        {
            StartInfo = new ProcessStartInfo
            {
                FileName = &quot;/bin/bash&quot;,
                Arguments = &quot;-c \&quot;&quot; + command + &quot;\&quot;&quot;,
                UseShellExecute = false,
                RedirectStandardOutput = true,
                CreateNoWindow = true
            }
        };
        proc.Start();
        proc.WaitForExit();
        if (proc.ExitCode != 0)
        {
            Logger.LogError(proc.StandardOutput.ReadToEnd());
            return proc.ExitCode.ToString();
        }
        return proc.StandardOutput.ReadToEnd();
    }
}
</code></pre>
<p>Create a file called <strong>dotnet-commands.csx</strong></p>
<pre><code class="language-csharp">#load &quot;logger.csx&quot;
#load &quot;command-line.csx&quot;
public class DotnetCommands
{
    public static int FormatCode() =&gt; ExecuteCommand(&quot;dotnet format&quot;);
    public static int BuildCode() =&gt; ExecuteCommand(&quot;dotnet build&quot;);

    public static int TestCode() =&gt; ExecuteCommand(&quot;dotnet test&quot;);

    private static int ExecuteCommand(string command)
    {
        string response = CommandLine.Execute(command);
        Int32.TryParse(response, out int exitCode);
        return exitCode;
    }

}
</code></pre>
<p>Create a file called <strong>git-commands.csx</strong></p>
<pre><code class="language-csharp">#load &quot;logger.csx&quot;
#load &quot;command-line.csx&quot;
public class GitCommands
{
    public static void StashChanges()
    {
        CommandLine.Execute(&quot;git stash -q --keep-index&quot;);
    }
    public static void UnstashChanges()
    {
        CommandLine.Execute(&quot;git stash pop -q&quot;);
    }
}
</code></pre>
<p>We now have utilities in place for logging and for running git and dotnet commands. Next, we are going to start with our pre-commit hook. Create a file called <strong>pre-commit</strong>. The difference between this file and the others we just made is that we don’t specify the extension, and that we explicitly load dotnet-script using a shebang. For an explanation of each hook, see the article linked below.</p>
<p><a href="https://www.atlassian.com/git/tutorials/git-hooks" class="external">Git Hooks | Atlassian Git Tutorial</a></p>
<pre><code class="language-csharp">#!/usr/bin/env dotnet dotnet-script
#load &quot;logger.csx&quot;
#load &quot;git-commands.csx&quot;
#load &quot;dotnet-commands.csx&quot;

// We'll only run checks on changes that are part of this commit, so let's stash others
GitCommands.StashChanges();

int buildCode = DotnetCommands.BuildCode();

// We're done with checks, we can unstash changes
GitCommands.UnstashChanges();
if (buildCode != 0) {
    Logger.LogError(&quot;Failed to pass the checks&quot;);
    Environment.Exit(-1);
}
// All checks have passed
</code></pre>
<p>If we run <strong>git commit -m “”</strong> again, this time we get an error saying “Failed to pass the checks”, which makes sense since we don’t have a project yet. We are going to create a simple solution consisting of a class library and a test library.</p>
<pre><code class="language-bash">dotnet new sln  
dotnet new classlib --framework netstandard2.1 --langVersion 8 --name SomeLib --output src/SomeLib  
dotnet new xunit --output tests/SomeLibTests  
dotnet sln add **/*.csproj 
cd tests/SomeLibTests/  
dotnet add reference ../../src/SomeLib/SomeLib.csproj  
cd ../../  
dotnet build
</code></pre>
<p>If we use git commit -m “” one more time, we get the message about aborting the commit again. We now know that every commit will at least compile :-) If, for example, we remove the namespace’s closing curly brace from Class1, we get the error <strong>Class1.cs(7,6): error CS1513: }</strong>. If we extend our pre-commit hook even further, we can have <a href="https://www.hanselman.com/blog/EditorConfigCodeFormattingFromTheCommandLineWithNETCoresDotnetFormatGlobalTool.aspx" class="external">dotnet-format</a> and dotnet-test running on every commit. If we purposely write a failing test (1 equals 0 or something like that), the build won’t pass.</p>
<pre><code class="language-csharp">#!/usr/bin/env dotnet dotnet-script
#load &quot;logger.csx&quot;
#load &quot;git-commands.csx&quot;
#load &quot;dotnet-commands.csx&quot;

Logger.LogInfo(&quot;pre-commit hook&quot;);

// We'll only run checks on changes that are part of this commit, so let's stash others
GitCommands.StashChanges();

int formatCode = DotnetCommands.FormatCode();
int buildCode = DotnetCommands.BuildCode();
int testCode = DotnetCommands.TestCode();

// We're done with checks, we can unstash changes
GitCommands.UnstashChanges();
int exitCode = formatCode + buildCode + testCode;
if (exitCode != 0) {
    Logger.LogError(&quot;Failed to pass the checks&quot;);
    Environment.Exit(-1);
}
// All checks have passed
</code></pre>
<h3 id="prepare-commit-message-hook"><a href="#prepare-commit-message-hook">Prepare-commit-message hook</a></h3>
<p>Thus far, we have not really used anything we need C# for; admittedly, we are using C# to execute shell commands. For our next hook we are going to use <code>System.IO</code>. Imagine as a team you have a commit-message convention. Let's say you want each commit message to include a reference to your issue tracker.</p>
<pre><code class="language-text">type(scope?): subject  #scope is optional
</code></pre>
<p>Create a file <strong>prepare-commit-msg</strong>. In this hook, we can provide a convenient commit-message placeholder if the user did not supply a message. To actually enforce the message format, you need the <strong>commit-msg</strong> hook. In this example, we only create a message for feature branches.</p>
<pre><code class="language-csharp">#!/usr/bin/env dotnet dotnet-script
#load &quot;logger.csx&quot;
#load &quot;util.csx&quot;
#load &quot;git-commands.csx&quot;

Logger.LogInfo(&quot;prepare-commit-msg hook&quot;);

string commitMessageFilePath = Util.CommandLineArgument(Args, 0);
string commitType = Util.CommandLineArgument(Args, 1);
string commitHash = Util.CommandLineArgument(Args, 2);

if (commitType.Equals(&quot;message&quot;)) {
    // user supplied a commit message, no need to prefill.
    Logger.LogInfo(&quot;commitType message&quot;);
    Environment.Exit(0);
}

string[] files = GitCommands.ChangedFiles();
for(int i = 0; i &lt; files.Length; i++) {
    // perhaps determine scope based on what was changed.
    Logger.LogInfo(files[i]);
}

string branch = GitCommands.CurrentBranch();
if (branch.StartsWith(&quot;feature&quot;)) {
    string messageToBe = &quot;feat: ISS-XXX&quot;;
    PrepareCommitMessage(commitMessageFilePath, messageToBe);
}

public static void PrepareCommitMessage(string messageFile, string message)
{
    string tempfile = Path.GetTempFileName();
    using (var writer = new StreamWriter(tempfile))
    using (var reader = new StreamReader(messageFile))
    {
        writer.WriteLine(message);
        while (!reader.EndOfStream)
            writer.WriteLine(reader.ReadLine());
    }
    File.Copy(tempfile, messageFile, true);
}
</code></pre>
<p>Create a new helper called <strong>util.csx</strong></p>
<pre><code class="language-csharp">public class Util
{
    public static string CommandLineArgument(IList&lt;string&gt; Args, int position)
    {
        if (Args.Count() &gt;= position + 1)
        {
            return Args[position];
        }
        return string.Empty;
    }

}
</code></pre>
<h3 id="commit-msg-hook"><a href="#commit-msg-hook">Commit-msg Hook</a></h3>
<p>The final local git hook I took for a spin is the commit-msg hook. It uses a regex to make sure the commit message matches the specified format.</p>
<pre><code class="language-csharp">#!/usr/bin/env dotnet dotnet-script
#load &quot;logger.csx&quot;
#load &quot;util.csx&quot;
#load &quot;git-commands.csx&quot;
using System.Text.RegularExpressions;

Logger.LogInfo(&quot;commit-msg hook&quot;);

string commitMessageFilePath = Util.CommandLineArgument(Args, 0);
string branch = GitCommands.CurrentBranch();
Logger.LogInfo(commitMessageFilePath);
Logger.LogInfo(branch);
string message = GetCommitedMessage(commitMessageFilePath);
Logger.LogInfo(message);

const string regex = @&quot;\b(feat|bug)\b(\({1}\b(core)\b\){1})?(:){1}(\s){1}(ISS-[0-9]{0,3}){1}&quot;;
var match = Regex.Match(message, regex);

if (!match.Success) {
    Logger.LogError(&quot;Message does not match commit format&quot;);
    Environment.Exit(1);
}

public static string GetCommitedMessage(string filePath) {
    return File.ReadAllLines(filePath)[0];
}
</code></pre>
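<p>To get a feel for what the pattern accepts, here is an approximate shell translation (my own addition: it anchors with <code>^</code> instead of <code>\b</code>, and tightens the original <code>ISS-[0-9]{0,3}</code>, which would also accept a bare <code>ISS-</code>, to <code>{1,3}</code>):</p>
<pre><code class="language-bash">regex='^(feat|bug)(\(core\))?: ISS-[0-9]{1,3}'
echo &quot;feat(core): ISS-123&quot; | grep -qE &quot;$regex&quot; &amp;&amp; echo &quot;match&quot;
echo &quot;chore: ISS-123&quot; | grep -qE &quot;$regex&quot; || echo &quot;no match&quot;
</code></pre>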
<h3 id="pre-push-hook"><a href="#pre-push-hook">Pre-Push Hook</a></h3>
<p>It is even possible to use NuGet packages in our hooks. Let's say we want to prevent pushes to master (perhaps not even allow commits?). We can read a config file using Newtonsoft.Json, look for a protected branch, and abort.</p>
<pre><code class="language-csharp">#!/usr/bin/env dotnet dotnet-script
#r &quot;nuget: Newtonsoft.Json, 12.0.2&quot;
#load &quot;logger.csx&quot;
#load &quot;config.csx&quot;
#load &quot;git-commands.csx&quot;
using Newtonsoft.Json;

string currentBranch = GitCommands.CurrentBranch().Trim();
Config currentConfig = GetConfig();
bool lockedBranch = currentConfig.ProtectedBranches.Contains(currentBranch);

if (lockedBranch) {
    Logger.LogError($&quot;Trying to push to protected branch '{currentBranch}'&quot;);
    Environment.Exit(1);
}

public static Config GetConfig()
{
    return JsonConvert.DeserializeObject&lt;Config&gt;(File.ReadAllText(&quot;.githooks/config.json&quot;));
}
</code></pre>
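<p>The <code>config.csx</code> file is not shown above; the real one lives in the linked repository. A minimal sketch of what the pre-push hook needs from it, assuming a <code>.githooks/config.json</code> shaped like <code>{ &quot;ProtectedBranches&quot;: [ &quot;master&quot; ] }</code>:</p>
<pre><code class="language-csharp">// Sketch of config.csx: only the property the pre-push hook reads.
public class Config
{
    public List&lt;string&gt; ProtectedBranches { get; set; } = new List&lt;string&gt;();
}
</code></pre>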
<h2 id="conclusion"><a href="#conclusion">Conclusion</a></h2>
<p>My current hooks are far from the best, and perhaps C# is not the fastest language to use in a git hook. I do, however, consider the experiment a success. I would much rather code in C# than in shell script. Ideas for further improvement include:</p>
<ul>
<li>based on the list of changes, determine the scope of the change (i.e. if only one directory changed, we might know the scope)</li>
<li>configure the regex, allowed scopes, allowed types</li>
<li>improve prepare-commit-msg for more scenarios</li>
<li>enforce users to use the hooks</li>
<li>managing versions of the hooks: on checkout of an older or different version, or on a pull that updates the hooks, sync the directory <a href="https://www.viget.com/articles/two-ways-to-share-git-hooks-with-your-team/" class="external">(perhaps via a custom githook location)</a></li>
</ul>
<p>Let me know what you think :-)</p>
<p><a href="https://github.com/maxhamulyak/git-hooks-example" class="external">maxhamulyak/git-hooks-example</a></p>
<p>Happy Coding 🍻</p>]]></content>
  </entry>
</feed>