Office 365/Outlook API Example: Remove Duplicate Calendar Events

I recently had trouble with “a fruit device” creating a lot of duplicate Outlook calendar events, so I developed a console application utility to remove all those duplicates, trying out the Microsoft Graph .NET Client Library at the same time. This turned out to be a little more involved than I thought at first.

First, the constructor of GraphServiceClient takes an IAuthenticationProvider, which should add an authorization header containing a bearer token to each request. I copied AuthenticationHelper from the console connect sample on GitHub (there are samples on GitHub for a lot of languages and frameworks). You can use the Outlook API directly with base URL https://outlook.office.com/api/v2.0/, in which case the required scope is https://outlook.office.com/calendars.readwrite. But Microsoft recommends using the Graph API with base URL https://graph.microsoft.com/v1.0, in which case the required scope is calendars.readwrite, unless you need features available in the former but not the latter (e.g. the Outlook tasks API).

The authentication helper uses PublicClientApplication from Microsoft.Identity.Client. My first attempt was to target .NET Core for the console application, but it turned out that PublicClientApplication doesn’t support showing a user interface dialog to input credentials on .NET Core, so I had to revert to .NET Framework 4.6.

With the sample authentication helper, you are forced to enter credentials and consent every time the program is run, which is cumbersome, especially when testing. So I found TokenCacheHelper here, which I modified slightly. One thing I changed was where tokens are stored – it felt better to store them in the user’s profile (local app data) than in the executable’s folder.
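
For illustration, storing the cache under the user’s local application data can look like the sketch below. Note that the folder and file names here are made up for this example, not the actual ones used in the utility:

```csharp
using System;
using System.IO;

class TokenCachePath
{
    static void Main()
    {
        // Hypothetical cache location under the user's profile (local app data)
        // instead of the executable's folder. Folder/file names are examples.
        var cacheFile = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "RemoveDuplicateEvents",
            "TokenCache.dat");
        Directory.CreateDirectory(Path.GetDirectoryName(cacheFile));
        Console.WriteLine(cacheFile);
    }
}
```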

The algorithm for detecting and deleting duplicates is at a high level the following:

  1. Get calendars, using Me.Calendars.
  2. For each calendar:
    1. Get all events, one page at a time, using Me.Calendars[calendar.Id].Events.
    2. Group events by subject, start time and end time.
    3. For each group:
      1. Group by ID. (This seems weird, but I discovered that the events call can respond with two or more items with the same ID. We don’t want to try to keep one of them and delete the rest, because that is not possible – one delete call will delete them all since it is in reality the same event.)
      2. Call delete with each unique event ID except the first one.
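
In LINQ terms, the grouping logic above can be sketched roughly like this. This is a simplified stand-alone illustration – the Event record and its properties are stand-ins for the Microsoft Graph model, not the actual types:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class DedupSketch
{
    // Simplified stand-in for Microsoft.Graph.Event.
    public record Event(string Id, string Subject, DateTime Start, DateTime End);

    // Returns the IDs to delete: within each (subject, start, end) group,
    // first collapse repeated items with the same ID (the API can return the
    // same event more than once), then keep one unique ID and delete the rest.
    public static List<string> FindDuplicateIds(IEnumerable<Event> events) =>
        events
            .GroupBy(e => (e.Subject, e.Start, e.End))
            .SelectMany(g => g
                .Select(e => e.Id)
                .Distinct()   // the same ID twice is really one event
                .Skip(1))     // keep the first unique ID
            .ToList();

    static void Main()
    {
        var t = new DateTime(2018, 6, 1, 9, 0, 0);
        var events = new[]
        {
            new Event("a", "Standup", t, t.AddHours(1)),
            new Event("a", "Standup", t, t.AddHours(1)), // same event returned twice
            new Event("b", "Standup", t, t.AddHours(1)), // a true duplicate
            new Event("c", "Lunch", t.AddHours(3), t.AddHours(4)),
        };
        Console.WriteLine(string.Join(",", FindDuplicateIds(events))); // prints "b"
    }
}
```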

Another thing I discovered is that some calendars are read-only, notably holiday calendars, so it is not possible to remove duplicates in those. My code checks for this as well.

The complete source is available on GitHub: henriko2018/RemoveDuplicateEvents.


Integration Testing ASP.NET Core 2.1

Integration testing with ASP.NET Core has been further improved and simplified with 2.1. A very good start is Microsoft’s own documentation. Don’t miss Test app prerequisites. In summary, you do the following:

  1. Edit the test project file and set the root element to
    <Project Sdk="Microsoft.NET.Sdk.Web">
  2. Add the following packages:
    Install-Package Microsoft.AspNetCore.App -version 2.1.0
    Install-Package Microsoft.AspNetCore.Mvc.Testing -version 2.1.0

But the example is a bit simplistic. Normally you have several test classes, and personally I think you can create the host and client once without risk, because the production code also creates the web host only once. Not creating the host and client for each test class clearly improves performance.

This can be accomplished using an xUnit test collection. We create a collection fixture:

using System.Net.Http;
using Microsoft.AspNetCore.Mvc.Testing; 
using Xunit;
...
/// <summary>
/// One instance of this will be created per test collection.
/// </summary>
public class TestHostFixture : IDisposable
{
    public readonly HttpClient Client;

    private readonly WebApplicationFactory<Startup> _factory;

    public TestHostFixture()
    {
        _factory = new WebApplicationFactory<Startup>();
        Client = _factory.CreateClient();
    }

    public void Dispose()
    {
        Client.Dispose();
        _factory.Dispose();
    }
}

And then the collection definition:

[CollectionDefinition("Integration tests collection")]
public class IntegrationTestsCollection : ICollectionFixture<TestHostFixture>
{
    // This class has no code, and is never created. Its purpose is simply
    // to be the place to apply [CollectionDefinition] and all the
    // ICollectionFixture<> interfaces.
}

The fixture can then be injected into your tests like this:

[Collection("Integration tests collection")]
public class YourTests
{
    private readonly TestHostFixture _testHostFixture;

    public YourTests(TestHostFixture testHostFixture)
    {
        _testHostFixture = testHostFixture;
    }
    
    [Fact]
    public async Task Test1()
    {
        // When
        var response = await _testHostFixture.Client.GetStringAsync(...);
        ...
    }
    ...
}

A couple of notes:

  • The type parameter for WebApplicationFactory is Startup, which in this case refers to the web application’s Startup class.
  • Because of the above, the configuration that the production code (web app or web API) uses during the tests is the same as when it is started the normal way (hosted by IIS or Kestrel). In other words, you don’t need to add appsettings.json files to your integration test project but can rely on the ones in the web application.

If you for some reason have to use a custom WebApplicationFactory, read the docs. In that case, you typically override ConfigureWebHost on the standard WebApplicationFactory. You can even override CreateWebHostBuilder like this:

public class CustomWebApplicationFactory : WebApplicationFactory<IntegrationTestStartup>
{
    protected override IWebHostBuilder CreateWebHostBuilder()
    {
        return WebHost.CreateDefaultBuilder(new string[0])
            .UseStartup<IntegrationTestStartup>();
    }
}

If you do this, any configuration read from appsettings.json will be from your integration test project, not from the web application.

Delete Local Merged Branches in Git

This is a simple script to delete already merged branches from your local Git repository. Use at your own risk!

#!/bin/bash

dryrun=1
while getopts ":d" opt; do
    case $opt in
    d)
        dryrun=0
        ;;
    \?)
        echo "Invalid option: -$OPTARG"
        ;;
    esac
done

if [[ $dryrun == 0 ]]; then
    echo "Deleting..."
    git branch --no-color --merged | egrep -v "(^\*|master|develop)" | xargs git branch -d
else
    echo "The following branches would be deleted:"
    git branch --merged | egrep -v "(^\*|master|develop)" 
    echo "Use -d option to really delete."
fi
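
To sanity-check the egrep filter before running the script for real, you can pipe a sample branch listing through the same expression; only branches that are neither the current one (marked with *), master nor develop survive:

```shell
printf '* master\n  develop\n  feature/foo\n  release/1.0\n' \
  | egrep -v "(^\*|master|develop)"
# prints:
#   feature/foo
#   release/1.0
```

Note that the filter is a plain substring match, so a branch named e.g. feature/master would also be excluded.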


Testing .NET Core with Team Foundation Server

Testing .NET Core projects as part of a build in Team Foundation Server is relatively easy – just add a Test .NET Core task – but collecting and publishing the results so that they can be viewed in the UI is a bit trickier. Since I’m using xUnit as the testing framework, I tried using their dotnet xunit command, but it didn’t work. After some searching and experimenting, I came up with the following solution.

  1. Add the following arguments to the Test .NET Core task:
    --no-restore --no-build --configuration $(BuildConfiguration) --logger trx
    The important part here is the logger parameter.
  2. Add a Publish Test Results task with the following options:
    Test Result Format: VSTest
    Test Result Files: **/*.trx

That’s it!

Tips and Tricks for NancyFx and .NET Core

In my current assignment, we’re using Nancy on ASP.NET Core and .NET Core 2.0. Nancy for .NET Core is currently in alpha, and since documentation is lacking and this is not exactly the “super-duper-happy-path”, I thought I would write down some discoveries.

Major Differences from v. 1

You still define routes in module constructors, but the syntax has changed. Instead of e.g.

Get["/products/{id}"] = args => SomeMethod(args.id);

you write e.g.

Get("/products/{id}", args => SomeMethod(args.id));

I usually use async and name my routes, then this becomes:

Get("/products/{id}", async args => await SomeMethodAsync(args.id), name: "GetProduct");

private async Task<dynamic> SomeMethodAsync(string productId) { ... }

Application Settings

With WebAPI, you define an options class, in my case ConnectionStrings, and add this to the Startup class:

public void ConfigureServices(IServiceCollection services)
{
    // ...
    services.Configure<ConnectionStrings>(Configuration.GetSection("ConnectionStrings"));
    // ...
}

But that is an extension method in OptionsConfigurationServiceCollectionExtensions which only works with the built-in DI container. And the Configuration property in the Startup class is set by the runtime, but Nancy uses another bootstrapper. So the question was how to (1) pass the configuration to the Nancy bootstrapper and (2) adapt this for TinyIoC, which Nancy uses by default.

The solution for (1) is to have a custom bootstrapper and pass the configuration to its constructor. To solve (2), I wrote my own extension method for TinyIoC.

public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    // ...
    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        if (env.IsDevelopment())
            app.UseDeveloperExceptionPage();
        app.UseOwin(x => x.UseNancy(options => options.Bootstrapper = new CustomBootstrapper(_configuration)));
    }
}

public class CustomBootstrapper : DefaultNancyBootstrapper
{
    private readonly IConfiguration _configuration;

    public CustomBootstrapper(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    protected override void ConfigureApplicationContainer(TinyIoCContainer container)
    {
        base.ConfigureApplicationContainer(container);
        container.Configure<MyOptions>(_configuration.GetSection("NameOfAppsettingsSection"));
    }
}

internal static class ContainerExtensions
{
    public static void Configure<TOptions>(this TinyIoCContainer container, IConfiguration config) where TOptions : class, new()
    {
        var options = new TOptions();
        config.Bind(options);
        container.Register(options);
    }
}

Then, just add your options class as a parameter to classes where it is needed:

public class MyClass
{
    public MyClass(MyOptions options)
    {
        // ...
    }
    // ...
}

Swagger Generation

Swagger generation with Nancy is not as convenient as I would wish, partly because of the dynamic nature of Nancy, and partly because the Nancy.Swagger package is not very well implemented. In my experience, the builder methods work best, but the result is still not perfect. I use Postman rather than Swagger to test my API.

Testing

Marcus Hammarberg has a series of blog posts that are very useful as an introduction to testing with Nancy. Here is a summary and some additional notes.

A nice feature of Nancy is its built-in unit test support, where you can create a “browser” that takes two parameters: a bootstrapper (use ConfigurableBootstrapper to specify the module under test, including mocked dependencies) and an action that specifies the browser context to use for all requests. Example:

var browser = new Browser(
    new ConfigurableBootstrapper(configuration =>
    {
        configuration.Module<MyModule>();
        configuration.Dependency(myMock.Object);
    }),
    context => context.Accept("application/json"));

For integration tests, where you want to test all layers, you use your ordinary bootstrapper instead. If it takes an IConfiguration parameter, you have to write code for getting configuration from appsettings.json in the test project. Here is an example of a static helper class:

using Microsoft.Extensions.Configuration;

internal static class ConfigurationHelper
{
    public static IConfiguration Configuration;

    static ConfigurationHelper()
    {
        var configBuilder = new ConfigurationBuilder()
            .AddJsonFile("appsettings.json")
            //.AddJsonFile($"appsettings.{environment}.json", true)
            .AddEnvironmentVariables();
        Configuration = configBuilder.Build();
    }
}

Make sure that appsettings.json has build action Content and Copy if newer. Having an override for the current environment (the commented line above) is not trivial, since the test project is not an ASP.NET Core project. You could use the same environment variable with the following line:

var environment = System.Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");

Error Handling

Global error handling can be set up in the bootstrapper by overriding ApplicationStartup. Example for an API that returns JSON:

protected override void ApplicationStartup(TinyIoCContainer container, IPipelines pipelines)
{
    base.ApplicationStartup(container, pipelines);
    pipelines.OnError += OnError;
}

private Nancy.Response OnError(NancyContext ctx, Exception ex)
{
    // _logger is assumed to have been resolved elsewhere in the bootstrapper.
    _logger.Error(ex, "An unhandled error occurred.");
    var negotiator = ApplicationContainer.Resolve<IResponseNegotiator>();
    return negotiator.NegotiateResponse(new {Error = "Internal error. Please see server log for details."}, ctx);
}


TeamCity and Octopus Deploy Tips and Tricks

Setting Version

It is nice to have TeamCity set the build number. I tend to use major.minor.build.revision for AssemblyVersion and major.minor.revision.build for AssemblyInformationalVersion (product version). So in AssemblyInfo.cs we have for example:

[assembly: AssemblyVersion("2.2.*")] // AssemblyFileVersionAttribute is not supplied, so the AssemblyVersionAttribute is used for the Win32 file version that is displayed on the Version tab of the Windows file properties dialog.
[assembly: AssemblyInformationalVersion("2.2.1")] // A.k.a. product version

In order to have the build number inserted, create two file content replacer build features with the following configurations:

Path pattern: "**/AssemblyInfoGlobal*.cs"
File encoding: <Auto-detect>
Search for: (^\s*\[\s*assembly\s*:\s*((System\s*\.)?\s*Reflection\s*\.)?\s*AssemblyVersion(Attribute)?\s*\(\s*@?\")(([0-9\*]+\.)+)[0-9\*]+(\"\s*\)\s*\])
Match case: true
Replace with: $1$5\%build.number%.*$7

Path pattern: "**/AssemblyInfoGlobal*.cs"
File encoding: <Auto-detect>
Search for: (^\s*\[\s*assembly\s*:\s*((System\s*\.)?\s*Reflection\s*\.)?\s*AssemblyInformationalVersion(Attribute)?\s*\(\s*@?\")(([0-9\*]+\.?)+)(\"\s*\)\s*\])
Match case: true
Replace with: $1$5.\%build.number%$7
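
As a sanity check, the first search pattern and replacement can be tried with .NET’s own Regex class (using straight quotes). A literal 123 stands in for %build.number%, which TeamCity substitutes before the replacement is applied:

```csharp
using System;
using System.Text.RegularExpressions;

class VersionReplaceDemo
{
    static void Main()
    {
        const string pattern =
            @"(^\s*\[\s*assembly\s*:\s*((System\s*\.)?\s*Reflection\s*\.)?\s*AssemblyVersion(Attribute)?\s*\(\s*@?"")(([0-9\*]+\.)+)[0-9\*]+(""\s*\)\s*\])";
        var input = "[assembly: AssemblyVersion(\"2.2.*\")]";
        // 123 stands in for %build.number%. Group 1 is everything up to and
        // including the opening quote, group 5 is the leading "2.2." part,
        // and group 7 is the closing quote and bracket.
        var result = Regex.Replace(input, pattern, "${1}${5}123.*${7}");
        Console.WriteLine(result); // prints [assembly: AssemblyVersion("2.2.123.*")]
    }
}
```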

Build and Publish Multiple Branches

I usually want some flexibility in what branch to build, and I have found that the following settings work quite well.

To make it possible to build and publish multiple branches, you can have wildcards in the VCS Root Branch specification, e.g.:

+:refs/heads/release/*
+:refs/heads/develop
+:refs/heads/feature/*

The default branch in this case would probably be set to

refs/heads/develop

Now, you probably only want automatic build and publish for certain branches, not all. To have automatic build for just develop, go to the build configuration Triggers setting and set e.g. the following branch filter:

+:<default>

Now, you can click the ellipsis next to Run, go to the Changes tab and select the desired branch to build.

Build Pull Requests

You probably also want to have automatic build (but not publish) of pull requests. This is accomplished by having a separate build configuration for that, with the following VCS Root Branch specification:

+:refs/pull/(*/merge)
-:develop
-:master

This means pull requests are built, but not the develop and master branches. (For example, pull request #1 has branch name refs/pull/1/merge.)

Deploy a Specific Version

The default behavior of the OctopusDeploy: Create release step is to create a release of the latest version. If you want to build and deploy another version, probably from a release branch, you can do it like this:

  1. Create a new VCS Root with default branch set to e.g. refs/heads/release/2.2 and use this in your build configuration.
  2. In General Settings, set Build Number Format to e.g. 2.2.1.%build.counter%.
  3. In your Deploy/create release step, set Release number to %build.number%, and Additional command line arguments to --packageversion=%build.number%. This will make octo.exe use this version as default for every package. You can override that with the package parameter, e.g. --packageversion=%build.number% --package=EntityFramework:1.6.2.

Note: It would be better to read the version from AssemblyInfo.cs rather than hard-coding it in the build configuration, but I haven’t tried that out yet. It would require some scripting.

NuGet Publish

In a TeamCity NuGet Publish step, instead of specifying packages one by one, you can use:

**\obj\octopacked\*.nupkg

Integration Tests

I prefer to run integration tests as part of deployment rather than build, for two reasons:

  • It takes quite some time to run them, and I don’t like really long builds.
  • To a large extent, integration tests test configuration, so it makes sense to run them in the target environment rather than on the build server.

Here are the steps I have used to facilitate integration testing as part of deployment. In the integration test project:

  • Add the OctoPack NuGet package.
  • Add app.config transforms. They must be named <project>.IntegrationTests.dll.<environment>.config, the build action should be None, and copy to output directory should be Copy if newer.
  • Add a PostDeploy.ps1 script. This could look like the example below. Make sure it has the same properties as the files in the previous step.

In TeamCity:

  • Add the integration project to NuGet publish step, or use wildcard as described above.

In Octopus Deploy:

  • Add a new Deploy a NuGet package step and choose the integration test package.

Example PostDeploy.ps1

# Clean-up
Remove-Item Project.IntegrationTests.dll.*.config

# Run integration tests
choco upgrade visualstudio2015testagents -y

$exePath = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe"
$testBinariesFolder = "."
$testBinariesFilter = "*.IntegrationTests.dll"
$scheme = $OctopusParameters['scheme']
$hostname = $OctopusParameters['hostname']
$webAppPath = $scheme + "://" + $hostname 
$webAppPhysicalPath = $OctopusParameters['Octopus.Action[Deploy Web].Output.Package.InstallationDirectoryPath']

# Search for integration test DLLs
$testDlls = ""
(Get-ChildItem -Path $testBinariesFolder -Filter $testBinariesFilter).FullName | ForEach-Object { $testDlls += "/testcontainer:""$_"" " }

# Exclude some categories.
$environment = $OctopusParameters['Octopus.Environment.Name']
if ($environment -match "xxx") { $categories = '/category:"CategoryX"' }
if ($environment -match "yyy") { $categories = '/category:"CategoryY"' }
if ($environment -match "zzz") { $categories = '/category:"CategoryZ"' }
if ($categories -eq $null -or $categories -eq "") {
    Write-Error "Unknown environment ""$environment"". Integration tests will not be run." -ErrorAction Continue
    return
}

# Start the test
Write-Output "& ""$exePath"" $testDlls $categories"
Invoke-Expression "& ""$exePath"" $testDlls $categories"

# Check results
if ($LASTEXITCODE -eq 0) {
    Write-Output "All integration tests passed."
} else {
    Write-Error "One or more integration tests failed." -ErrorAction Continue
}

# Upload result file
$testResultFiles = Get-ChildItem -Path .\TestResults -Filter *.trx -ErrorAction SilentlyContinue
if ($testResultFiles -ne $null)
{
	$resultMessage = "Test results available at "
	$testResultFiles | ForEach-Object {
		Move-Item $_.FullName $webAppPhysicalPath
		$resultMessage += "$webAppPath/$($_.Name) "
	}
	Write-Output $resultMessage

	# Allow download of the result
	if ((Get-WebConfiguration -Filter "//staticContent/mimeMap[@fileExtension='.trx']" -PSPath IIS:\) -eq $null)
	{
		Add-WebConfiguration -Filter "//staticContent" -PSPath IIS:\ -Value @{fileExtension=".trx";mimeType="application/x-test"}
	}
} else {
	Write-Error "Found no test results." -ErrorAction Continue
}

# Always return 0 because we don't want to fail the deployment until integration tests are stable.
exit 0

There are several things to note here.

  • To do…

Fixing Missing NuGet Packages

I have encountered this problem more than once. Somehow, maybe when merging changes between branches, .NET projects end up with references to assemblies from NuGet packages that are not present in packages.config, which in turn can cause issues when updating packages to later versions. I put together a PowerShell script to detect these issues. It searches all .csproj files in subfolders of the current folder.

# Get reference hintpath to packages:
$pattern = "<HintPath>.*Packages\\(?<id>(?:[a-zA-Z]+\.?)+)\.(?<version>(?:\d+\.)+\d+)\\.*</HintPath>"

Get-ChildItem -Path . -Filter "*.csproj" -File -Recurse | foreach {
    $projFilename = $_.FullName
    Write-Output "`r`n*** $projFilename ***"
    $packageFilename = "$($_.DirectoryName)\packages.config"
    if ([System.IO.File]::Exists($packageFilename))
    {
        $projtext = Get-Content -Path $projFilename
        $projtext | foreach {
            if ($_ -imatch $pattern)
            {
                $id = $Matches.Item("id")
                $version = $Matches.Item("version")
                Write-Output "$($Matches[0])"
                $foundInPackages = Select-String -Pattern "id=""$id"" version=""$version""" -Path $packageFilename -Quiet
                if ($foundInPackages) {
                    Write-Output (Select-String -Pattern "id=""$id"" version=""$version""" -Path $packageFilename).Line
                } else {
                    Write-Warning "Not found in package file!"
                }
            }
        }
    } else {
        Write-Warning "There is no package file at $packageFilename!"
    }
}