Using dependency injection in a .NET Core console application

One of the key features of ASP.NET Core is its baked-in dependency injection.

Whether you choose to use the built-in container or a third-party container will likely come down to whether the built-in container is powerful enough for your given project. For small projects it may be fine, but if you need convention-based registration, logging/debugging tools, or more esoteric approaches like property injection, then you'll need to look elsewhere.

Why use the built-in container?

One question that's come up a few times is whether you can use the built-in provider in a .NET Core console application. The short answer is not out of the box, but adding it in is pretty simple. Having said that, whether it is worth using in this case is another question.

One of the advantages of the built-in container in ASP.NET Core is that the framework libraries themselves register their dependencies with it. When you call the AddMvc() extension method in your Startup.ConfigureServices method, the framework registers a whole plethora of services with the container. If you later add a third-party container, those dependencies are passed across to be re-registered, so they are available when resolved via the third-party container.

If you are writing a console app, then you likely don't need MVC or other ASP.NET Core specific services. In that case, it may be just as easy to start right off the bat with StructureMap or Autofac instead of the limited built-in provider.

Having said that, most common services designed for use with ASP.NET Core will have extensions for registering with the built-in container via IServiceCollection, so if you are using services such as logging or the Options pattern, then it is certainly easier to use the provided extensions, and plug a third-party container on top of that if required.
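For example, here is a minimal sketch of the Options pattern being wired up through its IServiceCollection extension methods (AppSettings is a hypothetical class used purely for illustration):

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;

public class AppSettings
{
    public string Greeting { get; set; }
}

public class Program
{
    public static void Main(string[] args)
    {
        var services = new ServiceCollection();
        services.AddOptions(); // framework-provided extension method
        services.Configure<AppSettings>(s => s.Greeting = "Hello from options");

        var provider = services.BuildServiceProvider();
        var settings = provider.GetService<IOptions<AppSettings>>().Value;
        System.Console.WriteLine(settings.Greeting); // "Hello from options"
    }
}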

Adding DI to a console app

If you decide the built-in container is the right approach, then adding it to your application is very simple using the Microsoft.Extensions.DependencyInjection package.


To demonstrate the approach, I'm going to create a simple application that has two services:

public interface IFooService
{
    void DoThing(int number);
}

public interface IBarService
{
    void DoSomeRealWork();
}

Each of these services will have a single implementation. The BarService depends on an IFooService, and the FooService uses an ILoggerFactory to log some work:

public class BarService : IBarService
{
    private readonly IFooService _fooService;
    public BarService(IFooService fooService)
    {
        _fooService = fooService;
    }

    public void DoSomeRealWork()
    {
        for (int i = 0; i < 10; i++)
        {
            _fooService.DoThing(i);
        }
    }
}

public class FooService : IFooService
{
    private readonly ILogger<FooService> _logger;
    public FooService(ILoggerFactory loggerFactory)
    {
        _logger = loggerFactory.CreateLogger<FooService>();
    }

    public void DoThing(int number)
    {
        _logger.LogInformation($"Doing the thing {number}");
    }
}

As you can see above, I'm using the new logging infrastructure in my app, so I will need to add the appropriate package to my project.json. I'll also add the DependencyInjection package and the Microsoft.Extensions.Logging.Console package so I can see the results of my logging:

{
  "dependencies": {
    "Microsoft.Extensions.Logging": "1.0.0",
    "Microsoft.Extensions.Logging.Console": "1.0.0",
    "Microsoft.Extensions.DependencyInjection": "1.0.0"
  }
}

Finally, I'll update my static void Main to put all the pieces together. We'll walk through it in a second.

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

public class Program
{
    public static void Main(string[] args)
    {
        //setup our DI
        var serviceProvider = new ServiceCollection()
            .AddLogging()
            .AddSingleton<IFooService, FooService>()
            .AddSingleton<IBarService, BarService>()
            .BuildServiceProvider();

        //configure console logging
        serviceProvider
            .GetService<ILoggerFactory>()
            .AddConsole(LogLevel.Debug);

        var logger = serviceProvider.GetService<ILoggerFactory>()
            .CreateLogger<Program>();
        logger.LogDebug("Starting application");

        //do the actual work here
        var bar = serviceProvider.GetService<IBarService>();
        bar.DoSomeRealWork();

        logger.LogDebug("All done!");
    }
}

The first thing we do is configure the dependency injection container by creating a ServiceCollection, adding our dependencies, and finally building an IServiceProvider. This process is equivalent to the ConfigureServices method in an ASP.NET Core project, and is pretty much what happens behind the scenes there. You can see we are using the IServiceCollection extension method to add the logging services to our application, and then registering our own services. The serviceProvider is the container we can use to resolve services in our application.

In the next step, we need to configure the logging infrastructure with a provider, so the results are output somewhere. We first fetch an instance of ILoggerFactory from our newly constructed serviceProvider, and add a console logger.

The remainder of the program shows more dependency injection in action. We first create an ILogger<Program> via the ILoggerFactory from the container, and then fetch an instance of IBarService. As per our registrations, the IBarService is an instance of BarService, which itself has an instance of FooService injected into it.

We can then run our application and see all our beautifully resolved dependencies!


Adding StructureMap to your console app

As described previously, the built-in container is useful for wiring up framework libraries via their extension methods, as we saw with AddLogging above. However, it is much less fully featured than many third-party containers.

For completeness, I'll show how easy it is to update the application to use a hybrid approach, using the built-in container to easily add any framework dependencies, and using StructureMap for your own code.

First you need to add StructureMap to your project.json dependencies:

{
  "dependencies": {
    "StructureMap.Microsoft.DependencyInjection": "1.2.0"
  }
}

Now we'll update our static void Main to use StructureMap for registering our custom dependencies:

public static void Main(string[] args)
{
    // add the framework services
    var services = new ServiceCollection()
        .AddLogging();

    // add StructureMap
    var container = new Container();
    container.Configure(config =>
    {
        // Register stuff in container, using the StructureMap APIs...
        config.Scan(_ =>
                    {
                        _.AssemblyContainingType(typeof(Program));
                        _.WithDefaultConventions();
                    });
        // Populate the container using the service collection
        config.Populate(services);
    });

    var serviceProvider = container.GetInstance<IServiceProvider>();

    // rest of method as before
}

At first glance this may seem more complicated than the previous version, and it is, but it is also far more powerful. In the StructureMap example, we didn't have to explicitly register our IFooService or IBarService services - they were automatically registered by convention. When your apps start to grow, this sort of convention-based registration becomes enormously powerful, especially when coupled with the error handling and debugging capabilities available to you.

In this example I showed how to use StructureMap with the adapter to work with the IServiceCollection extension methods, but there's obviously no requirement to do that. Using StructureMap as your only registration source is perfectly valid; you'll just have to manually register any services that would otherwise be added by the framework's Add* extension methods.
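As a rough sketch, registering the logging services directly in StructureMap might look something like the following; this approximates what AddLogging() does for you, so verify it against the package versions you are using:

using Microsoft.Extensions.Logging;
using StructureMap;

var container = new Container(config =>
{
    // Approximately the registrations AddLogging() performs on IServiceCollection
    config.For<ILoggerFactory>().Use<LoggerFactory>().Singleton();
    config.For(typeof(ILogger<>)).Use(typeof(Logger<>));

    // Convention-based registration of our own services, as before
    config.Scan(_ =>
    {
        _.AssemblyContainingType(typeof(Program));
        _.WithDefaultConventions();
    });
});

var fooService = container.GetInstance<IFooService>();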

Dependency Injection with constructor parameters

The following code shows you how to configure DI for objects that have parameters in the constructor.

using Microsoft.Extensions.DependencyInjection;
using System;

namespace NetCoreDIDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            var services = new ServiceCollection();
            services.AddTransient<IMyService>(s => new MyService("MyConnectionString"));
            var provider = services.BuildServiceProvider();
            var myService = provider.GetService<IMyService>();
            
            Console.WriteLine($"The constructor parameter is: {myService.GetConstructorParameter()}");
            Console.ReadKey();
        }
    }

    public interface IMyService
    {
        string GetConstructorParameter();
    }

    public class MyService : IMyService
    {
        private string connectionString;
        public MyService(string connString)
        {
            this.connectionString = connString;
        }

        public string GetConstructorParameter()
        {
            return connectionString;
        }
    }
}

Happy coding!

The Windows 3.0 File Manager is now available in the Microsoft Store


Microsoft open sourced the original File Manager that shipped with Windows 3.0, allowing users to make changes and if they want, compile it for use on Windows 10. Now, the firm is making it easier to run the legacy app, as it's offering the Windows 3.0 File Manager through the Microsoft Store (via Aggiornamenti Lumia) as a UWP app.

The app is listed as being available on PC, mobile, Surface Hub, and HoloLens; however, it also requires the device to be running Windows 10 build 16299 or newer. Obviously, there are no mobile devices that qualify. If you do have a Windows phone that you want to try it out on, you could always download the code and try to compile it for an earlier build.

Obviously, there were some modifications that had to be made to the original app to get it to run on Windows 10. After all, Windows 3.0 was a 16-bit operating system. Microsoft also had to add header files that were previously stored in the NT source tree, and it had to make some APIs public.

This isn't the first time that Microsoft has dabbled in software nostalgia. As an April Fool's joke in 2015, it released an MS-DOS Mobile app for Windows phones. If you want to download the Windows File Manager app, you can find it here.

Latest Windows 10 build puts desktop apps in a 3D world


Microsoft has released a new Insider preview build of Windows 10. Build 18329 should be available now to most people who have opted into the fast preview ring, though it's not available to everyone because, for some reason, the new build isn't shipping in all the languages it would normally be available in.


The strangest new feature is that you can now launch and run regular Win32 apps (2D apps built for the desktop) in the Windows Mixed Reality environment that's used for both virtual reality headsets and the HoloLens augmented reality headset. Previously, it was only possible to run apps built using the modern UWP API. Now, it seems that any Windows application will work. If you want to use Photoshop or Visual Studio with a headset on, you can.

The new build also adds a couple of new scripts to support the writing of languages that until recently had no adequate written form. There's the Osage language spoken by the Osage Nation in Oklahoma (which prior to 2006 used the Latin alphabet with various diacritics) and the ADLaM script used to write Pular, the language of the Fulani people in West Africa (which, similarly, used the Roman alphabet with diacritics prior to the development of the new alphabet in the 1980s). ADLaM and Osage were both added to Unicode in 2016.

Lifetime Managers in Unity Container

The Unity container manages the lifetime of all the objects it resolves using lifetime managers.

Unity includes different lifetime managers for different purposes. You specify a lifetime manager in the RegisterType() method at the time of registering a type-mapping. For example, the following code snippet shows registering a type-mapping with TransientLifetimeManager.

var container = new UnityContainer()
                   .RegisterType<ICar, BMW>(new TransientLifetimeManager());

The following list describes all the lifetime managers:

  • TransientLifetimeManager – Creates a new object of the requested type every time you call the Resolve or ResolveAll method.
  • ContainerControlledLifetimeManager – Creates a singleton object the first time you call the Resolve or ResolveAll method, and then returns the same object on subsequent calls.
  • HierarchicalLifetimeManager – Same as ContainerControlledLifetimeManager, except that a child container creates its own singleton object; parent and child containers do not share one.
  • PerResolveLifetimeManager – Similar to TransientLifetimeManager, but it reuses the same object of the registered type across the recursive object graph of a single resolve.
  • PerThreadLifetimeManager – Creates a singleton object per thread; different threads get different objects from the container.
  • ExternallyControlledLifetimeManager – Maintains only a weak reference to the objects it creates when you call the Resolve or ResolveAll method; it does not keep them alive, leaving their lifetime to you or the garbage collector.

You can also create your own custom lifetime manager.

Let's understand each lifetime manager using the following example classes.

public interface ICar
{
    int Run();
}

public class BMW : ICar
{
    private int _miles = 0;

    public int Run()
    {
        return ++_miles;
    }
}

public class Ford : ICar
{
    private int _miles = 0;
    public int Run()
    {
        return ++_miles;
    }
}

public class Audi : ICar
{
    private int _miles = 0;

    public int Run()
    {
        return ++_miles;
    }
}

public class Driver
{
    private ICar _car = null;

    public Driver(ICar car)
    {
        _car = car;
    }

    public void RunCar()
    {
        Console.WriteLine("Running {0} - {1} mile ", 
                                      _car.GetType().Name, _car.Run());
    }
}

TransientLifetimeManager

TransientLifetimeManager is the default lifetime manager. It creates a new object of the requested type every time you call the Resolve() or ResolveAll() method.

var container = new UnityContainer()
                   .RegisterType<ICar, BMW>();

var driver1 = container.Resolve<Driver>();
driver1.RunCar();

var driver2 = container.Resolve<Driver>();
driver2.RunCar();

Output:

Running BMW - 1 mile
Running BMW - 1 mile

In the above example, the Unity container creates two new instances of the BMW class and injects them into the driver1 and driver2 objects. This is because the default lifetime manager is TransientLifetimeManager, which creates a new dependent object every time you call the Resolve or ResolveAll method. You can specify the lifetime manager at the time of registering a type using the RegisterType() method.

The following example displays the same output as the above example, because TransientLifetimeManager is the default manager when none is specified.

var container = new UnityContainer()
                   .RegisterType<ICar, BMW>(
                             new TransientLifetimeManager());

var driver1 = container.Resolve<Driver>();
driver1.RunCar();

var driver2 = container.Resolve<Driver>();
driver2.RunCar();

Output:

Running BMW - 1 mile
Running BMW - 1 mile

ContainerControlledLifetimeManager

Use ContainerControlledLifetimeManager when you want to create a singleton instance.

var container = new UnityContainer()
                   .RegisterType<ICar, BMW>(new 
                            ContainerControlledLifetimeManager());

var driver1 = container.Resolve<Driver>();
driver1.RunCar();

var driver2 = container.Resolve<Driver>();
driver2.RunCar();

Output:

Running BMW - 1 mile
Running BMW - 2 mile

In the above example, we specified ContainerControlledLifetimeManager in the RegisterType() method, so the Unity container creates a single instance of the BMW class and injects it into all instances of Driver.

HierarchicalLifetimeManager

The HierarchicalLifetimeManager is the same as ContainerControlledLifetimeManager, except that if you create a child container, the child will create its own singleton instance of the registered type and will not share the instance with the parent container.

var container = new UnityContainer()
                   .RegisterType<ICar, BMW>(
                            new HierarchicalLifetimeManager());

var childContainer = container.CreateChildContainer();
            
var driver1 = container.Resolve<Driver>();
driver1.RunCar();

var driver2 = container.Resolve<Driver>();
driver2.RunCar();

var driver3 = childContainer.Resolve<Driver>();
driver3.RunCar();

var driver4 = childContainer.Resolve<Driver>();
driver4.RunCar();

Output:

Running BMW - 1 mile
Running BMW - 2 mile
Running BMW - 1 mile
Running BMW - 2 mile

As you can see, container and childContainer each have their own singleton instance of BMW.
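The remaining lifetime managers follow the same pattern. As a final sketch, here is how PerThreadLifetimeManager behaves, reusing the same ICar and Driver classes; each thread gets its own singleton instance:

// requires: using System.Threading;

var container = new UnityContainer()
                   .RegisterType<ICar, BMW>(new PerThreadLifetimeManager());

ThreadStart drive = () =>
{
    var driver1 = container.Resolve<Driver>();
    driver1.RunCar();

    var driver2 = container.Resolve<Driver>();
    driver2.RunCar(); // same BMW as driver1 on this thread, so the miles add up
};

var thread1 = new Thread(drive);
var thread2 = new Thread(drive);
thread1.Start();
thread2.Start();
thread1.Join();
thread2.Join();

Output (the exact interleaving depends on thread scheduling):

Running BMW - 1 mile
Running BMW - 2 mile
Running BMW - 1 mile
Running BMW - 2 mile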

Visit Understand Lifetime Managers to learn more about it.

ARM \ Logic App Deployment with Azure DevOps


Microsoft’s documentation refers to Logic Apps as being iPaaS or integration Platform-as-a-Service. The “i” in iPaaS indicates the strength of Logic Apps; not only are Azure systems integrated but external and third-party systems can be included in your Logic Apps, including Twitter, Slack, Office 365, and many others. This integration is done using a set of Microsoft-provided connectors. However, if a connector does not exist, then you can still integrate your logic app to external systems via their APIs.

Go to the Azure portal https://portal.azure.com and create the logic app.


Virtually every resource in Azure can be extracted into an ARM template (Azure Resource Manager template), allowing you to spin up an environment using the JSON-based template.

Configure parameters

Open your favourite code editor (my personal favourite is VS Code or Visual Studio) and examine the template you just downloaded. You will notice a number of parameters in the template.
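For illustration, the parameters section of a Logic App template looks something like this (the parameter names here are hypothetical); these are the values you will vary per environment:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string",
      "metadata": { "description": "Name of the Logic App resource" }
    },
    "location": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "metadata": { "description": "Azure region to deploy to" }
    }
  }
}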

Update the parameters for your environment, deploy, and grab that much-earned beer.

For my deployments, I use Azure DevOps. There is a great task Microsoft has added called Azure Resource Manager Deployment, which allows you to automate your deployments across multiple environments.


Azure WebJobs API

This API is accessed the same way as the git endpoint. e.g. if your git URL is https://yoursite.scm.azurewebsites.net/yoursite.git, then the API to get the list of deployments will be https://yoursite.scm.azurewebsites.net/deployments.

The credentials you use are the same as when you git push. See Deployment-credentials for more details.

List all web jobs
GET /api/webjobs
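
These endpoints can also be called programmatically. Here is a minimal C# sketch using HttpClient with basic authentication; the site name and credentials are placeholders:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

public class WebJobsClient
{
    public static void Main(string[] args)
    {
        var client = new HttpClient();

        // Basic auth using your deployment credentials
        var credentials = Convert.ToBase64String(
            Encoding.ASCII.GetBytes("deployUser:deployPassword"));
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Basic", credentials);

        // List all web jobs as JSON
        var json = client
            .GetStringAsync("https://yoursite.scm.azurewebsites.net/api/webjobs")
            .Result; // blocking for brevity in a console app

        Console.WriteLine(json);
    }
}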

Triggered Jobs

List all triggered jobs
GET /api/triggeredwebjobs

Response

[
  {
    name: "jobName",
    runCommand: "...\run.cmd",
    type: "triggered",
    url: "http://.../triggeredwebjobs/jobName",
    history_url: "http://.../triggeredwebjobs/jobName/history",
    extra_info_url: "http://.../",
    scheduler_logs_url: "https://.../vfs/data/jobs/triggered/jobName/job_scheduler.log",
    settings: { },
    using_sdk: false,
    latest_run:
      {
        id: "20131103120400",
        status: "Success",
        start_time: "2013-11-08T02:56:00.000000Z",
        end_time: "2013-11-08T02:57:00.000000Z",
        duration: "00:01:00",
        output_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/output_20131103120400.log",
        error_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/error_20131103120400.log",
        url: "http://.../triggeredwebjobs/jobName/history/20131103120400",
        trigger: "Schedule - 0 0 0 * * *"
      }
  }
]
List all triggered jobs in swagger format
GET /api/triggeredwebjobsswagger

Response

{
  "swagger": "2.0",
  "info": {
    "version": "v1",
    "title": "WebJobs"
  },
  "host": "placeHolder",
  "schemes": [
    "https"
  ],
  "paths": {
    "/api/triggeredjobs/jobName/run": {
      "post": {
        "deprecated": false,
        "operationId": "jobName",
        "consumes": [],
        "produces": [],
        "responses": {
          "200": {
            "description": "Success"
          },
          "default": {
            "description": "Success"
          }
        },
        "parameters": [
          {
            "name": "arguments",
            "in": "query",
            "description": "Web Job Arguments",
            "required": false,
            "type": "string"
          }
        ]
      }
    }
  }
}
Get a specific triggered job by name
GET /api/triggeredwebjobs/{job name}

Response

{
  name: "jobName",
  runCommand: "...\run.cmd",
  type: "triggered",
  url: "http://.../triggeredwebjobs/jobName",
  history_url: "http://.../triggeredwebjobs/jobName/history",
  extra_info_url: "http://.../",
  latest_run:
    {
      id: "20131103120400",
      status: "Success",
      start_time: "2013-11-08T02:56:00.000000Z",
      end_time: "2013-11-08T02:57:00.000000Z",
      duration: "00:01:00",
      output_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/output_20131103120400.log",
      error_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/error_20131103120400.log",
      url: "http://.../triggeredwebjobs/jobName/history/20131103120400"
    }
}
Upload a triggered job as zip

Using a zip file containing the files for it, or just a single file (e.g. foo.exe).

PUT /api/zip/site/wwwroot/App_Data/jobs/triggered/{job name}/

or

PUT /api/triggeredwebjobs/{job name}

Use Content-Type: application/zip for zip otherwise it's treated as a regular script file.

The file name should be in the Content-Disposition header, for example:

Content-Disposition: attachment; filename=run.cmd

Note: the difference between the two techniques is that the first just adds files into the folder, while the second first deletes any existing content before adding new files.

Delete a triggered job
DELETE /api/vfs/site/wwwroot/App_Data/jobs/triggered/{job name}?recursive=true

or

DELETE /api/triggeredwebjobs/{job name}
Invoke a triggered job
POST /api/triggeredwebjobs/{job name}/run

To run with arguments, use the arguments parameter; its value is added to the script when it is invoked, and it is also passed to the WebJob as the WEBJOBS_COMMAND_ARGUMENTS environment variable.

POST /api/triggeredwebjobs/{job name}/run?arguments={arguments}
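
Inside the WebJob itself, a console app can pick the arguments up either from its args array or from that environment variable; a minimal sketch:

using System;

public class Program
{
    public static void Main(string[] args)
    {
        // Arguments supplied via ?arguments={arguments} show up in both places
        var fromEnv = Environment.GetEnvironmentVariable("WEBJOBS_COMMAND_ARGUMENTS");

        Console.WriteLine("args: " + string.Join(" ", args));
        Console.WriteLine("WEBJOBS_COMMAND_ARGUMENTS: " + fromEnv);
    }
}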

Note: if the site has multiple instances, the job will run on one of them arbitrarily. This is the same behavior as regular requests sent to the site.

In the HTTP response, you get back a Location header with a URL to the details of the run that was started, e.g.

Location: https://mysite.scm.azurewebsites.net/api/triggeredwebjobs/SomeJob/history/201605192149381933
List all triggered job runs history
GET /api/triggeredwebjobs/{job name}/history

Response

{
  runs:
    [
      {
        id: "20131103120400",
        status: "Success",
        start_time: "2013-11-08T02:56:00.000000Z",
        end_time: "2013-11-08T02:57:00.000000Z",
        duration: "00:01:00",
        output_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/output_20131103120400.log",
        error_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/error_20131103120400.log",
        url: "http://.../triggeredwebjobs/jobName/history/20131103120400",
        trigger: "Schedule - 0 0 0 * * *"
      },
      ...
    ]
}

Note: The job history is kept in the D:\home\data\jobs\triggered\jobName folder. Each run is kept under a different folder named by the datetime of the execution. The API returns all job history in descending datetime order (meaning latest on top). Only the most recent 50 runs are kept (configurable by the WEBJOBS_HISTORY_SIZE app setting).

Get a specific run for a specific triggered job
GET /api/triggeredwebjobs/{job name}/history/{id}

Response

{
  id: "20131103120400",
  status: "Success",
  start_time: "2013-11-08T02:56:00.000000Z",
  end_time: "2013-11-08T02:57:00.000000Z",
  duration: "00:01:00",
  output_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/output_20131103120400.log",
  error_url: "http://.../vfs/data/jobs/triggered/jobName/20131103120400/error_20131103120400.log",
  url: "http://.../triggeredwebjobs/jobName/history/20131103120400",
  trigger: "Schedule - 0 0 0 * * *"
}

Continuous Jobs

List all continuous jobs
GET /api/continuouswebjobs

Response

[
  {
    name: "jobName",
    status: "Running",
    runCommand: "...\run.cmd",
    log_url: "http://.../vfs/data/jobs/continuous/jobName/job.log",
    extra_info_url: "http://.../",
    url: "http://.../continuouswebjobs/jobName",
    type: "continuous"
  }
]
Get a specific continuous job by name
GET /api/continuouswebjobs/{job name}

Response

{
  name: "jobName",
  status: "Running",
  runCommand: "...\run.cmd",
  log_url: "http://.../vfs/data/jobs/continuous/jobName/job.log",
  extra_info_url: "http://.../",
  url: "http://.../continuouswebjobs/jobName",
  type: "continuous"
}

The status can take the following values:

  • Initializing
  • Starting
  • Running
  • PendingRestart
  • Stopped
  • Aborted
  • Abandoned
  • Success
  • Failure
Upload a continuous job as zip

Using a zip file containing the files for it.

PUT /api/zip/site/wwwroot/App_Data/jobs/continuous/{job name}/

or

PUT /api/continuouswebjobs/{job name}

Use Content-Type: application/zip for zip otherwise it's treated as a regular script file.

The file name should be in the Content-Disposition header, for example:

Content-Disposition: attachment; filename=run.cmd

Note: the difference between the two techniques is that the first just adds files into the folder, while the second first deletes any existing content before adding new files.

Delete a continuous job
DELETE /api/vfs/site/wwwroot/App_Data/jobs/continuous/{job name}?recursive=true

or

DELETE /api/continuouswebjobs/{job name}
Start a continuous job
POST /api/continuouswebjobs/{job name}/start
Stop a continuous job
POST /api/continuouswebjobs/{job name}/stop
Get continuous job settings
GET /api/continuouswebjobs/{job name}/settings

Response

{
  "is_singleton": true
}
Set a continuous job as singleton

If a continuous job is set as singleton, it'll run on only a single instance, as opposed to running on all instances. By default, it runs on all instances.

PUT /api/continuouswebjobs/{job name}/settings

Body

{
  "is_singleton": true
}

To set a continuous job as singleton during deployment (without the need for the REST API) you can simply create a file called settings.job with the content: { "is_singleton": true } and put it at the root of the (specific) WebJob directory.

Set the schedule for a triggered job

You can set the schedule for invoking a triggered job by providing a cron expression made of 6 fields (second, minute, hour, day, month, day of the week).

PUT /api/triggeredwebjobs/{job name}/settings

Body

{
  "schedule": "0 */2 * * * *"
}

To set the schedule for a triggered job during deployment (without the need for the REST API) you can simply create a file called settings.job with the content: { "schedule": "0 */2 * * * *" } and put it at the root of the (specific) WebJob directory.

Adding an external Microsoft login to IdentityServer4

This article shows how to implement a Microsoft Account as an external provider in an IdentityServer4 project using ASP.NET Core Identity with a SQLite database.

Setting up the App Platform for the Microsoft Account

To set up the app, log in with your Microsoft account and open the My Applications link:

https://apps.dev.microsoft.com/?mkt=en-gb#/appList


Click the Add an app button. Give the application a name and add your email. This app is called microsoft_id4_enrico.


After you click the Create button, you need to generate a new password. Save this somewhere for the application configuration; it will be the client secret when configuring the application.


Now add a new platform. Choose the Web type.


Now add the redirect URL for your application. This will be https://YOUR_URL/signin-microsoft


Add the Permissions as required.


Application configuration

Note: The samples are at present not updated to ASP.NET Core 2.0

Clone the IdentityServer4 samples and use the 6_AspNetIdentity project from the quickstarts.
Add the Microsoft.AspNetCore.Authentication.MicrosoftAccount package using NuGet, as well as the required ASP.NET Core Identity and EF Core packages, to the IdentityServer4 server project.

The application uses SQLite with Identity. This is configured in the Startup class in the ConfigureServices method.

services.AddDbContext<ApplicationDbContext>(options =>
       options.UseSqlite(Configuration.GetConnectionString("DefaultConnection")));
 
services.AddIdentity<ApplicationUser, IdentityRole>()
    .AddEntityFrameworkStores<ApplicationDbContext>()
    .AddDefaultTokenProviders()
    .AddIdentityServer();

Now the AddMicrosoftAccount extension method can be used to add the Microsoft Account external provider in the ConfigureServices method of the Startup class. The SignInScheme is set to “Identity.External” because the application is using ASP.NET Core Identity. The ClientId is the Id from the app ‘microsoft_id4_enrico’ which was configured on the My Applications website. The ClientSecret is the generated password.

services.AddAuthentication()
     .AddMicrosoftAccount(options => {
          options.ClientId = _clientId;
          options.SignInScheme = "Identity.External";
          options.ClientSecret = _clientSecret;
      });
 
services.AddMvc();
 
...
 
services.AddIdentityServer()
     .AddSigningCredential(cert)
     .AddInMemoryIdentityResources(Config.GetIdentityResources())
     .AddInMemoryApiResources(Config.GetApiResources())
     .AddInMemoryClients(Config.GetClients())
     .AddAspNetIdentity<ApplicationUser>()
     .AddProfileService<IdentityWithAdditionalClaimsProfileService>();

The Configure method in the Startup class also needs to add IdentityServer to the request pipeline.
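
A minimal sketch of what that pipeline might look like in an ASP.NET Core 2.0 IdentityServer4 project (your middleware order and extras may differ):

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseStaticFiles();

    // Adds IdentityServer, including the configured external authentication handlers
    app.UseIdentityServer();

    app.UseMvcWithDefaultRoute();
}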

If you receive an error like "unauthorize_access", remember that the RedirectUri is required in the IdentityServer configuration and clients.

GitHub now gives free users unlimited private repositories


GitHub is by far the most popular way to build and share software. That said, one weakness of the platform is that it limits who can create private repositories – that is, software projects that aren’t visible to the broader public, and are shared only with a handful of pre-defined collaborators – to paying users.

Fortunately, that’s no longer the case, as GitHub today announced it was giving users of its free plan access to unlimited private repositories. This is great news for GitHub’s users, but there is a caveat, of course.

Private repositories on free accounts are limited to three collaborators apiece. So, while this might work for a small project (like, for example, a team competing in a hackathon), it isn’t particularly well-suited for actual commercial usage.

That was probably a deliberate move from GitHub. There’s little risk of the company cannibalizing its existing paid users with this new free offering.

Until recently, developers who wanted to create private git repositories without opening their wallets were forced to use a rival service – most frequently BitBucket. Today's news, obviously, isn't great for Atlassian's flagship code sharing platform, but it does mean that coders aren't forced to use two disparate code management services for their private and public projects.

I also wonder what ubiquitous private repositories will mean for GitHub's culture of self-exhibition and sharing.

Adding Swagger to Web API project


Adding Swagger to your Web API does not replace ASP.NET Web API help pages (see the Microsoft ASP.NET Web API Help Page NuGet package). You can have both running side by side, if desired.

To add Swagger to an ASP.NET Web API project, we will install an open source package called Swashbuckle via NuGet.

Install-Package Swashbuckle -Version 5.2.1

After the package is installed, navigate to App_Start in the Solution Explorer. You'll notice a new file called SwaggerConfig.cs. This file is where Swagger is enabled and where any configuration options should be set.


Configuring Swagger

At minimum you’ll need this line to enable Swagger and Swagger UI.

GlobalConfiguration.Configuration
  .EnableSwagger(c => c.SingleApiVersion("v1", "A title for your API"))
  .EnableSwaggerUi();

Start a new debugging session (F5) and navigate to http://localhost:[PORT_NUM]/swagger. You should see Swagger UI help pages for your APIs.


Expanding an API and clicking the “Try it out!” button will make a call to that specific API and return results.


And then you see the response:


Enable Swagger to use XML comments

The minimum configuration is nice to get started but let’s add some more customization. We can tell Swashbuckle to use XML comments to add more details to the Swagger metadata. These are the same XML comments that ASP.NET Help Pages uses.

First, enable XML documentation file creation during build. In Solution Explorer right-click on the Web API project and click Properties. Click the Build tab and navigate to Output. Make sure XML documentation file is checked. You can leave the default file path. In my case it's bin\SwaggerDemoApi.XML


Next, we need to tell Swashbuckle to include our XML comments in the Swagger metadata. Add the following line to SwaggerConfig.cs. Make sure to change the file path to the path of your XML documentation file.

GlobalConfiguration.Configuration
  .EnableSwagger(c =>
    {
      c.SingleApiVersion("v1", "SwaggerDemoApi");
      c.IncludeXmlComments(string.Format(@"{0}\bin\SwaggerDemoApi.XML",           
                           System.AppDomain.CurrentDomain.BaseDirectory));
    })
  .EnableSwaggerUi();

Finally, if you haven’t already, add XML comments to your Models and API methods.
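For example, here is a hypothetical action method with the kind of XML comments Swashbuckle picks up (the method and data source are illustrative, not from the demo project):

/// <summary>
/// Returns a single superhero by id.
/// </summary>
/// <param name="id">The id of the superhero to look up</param>
/// <returns>The matching superhero, or 404 if none exists</returns>
public IHttpActionResult GetSuperhero(int id)
{
    var hero = _superheroes.FirstOrDefault(h => h.Id == id); // hypothetical data source
    return hero == null ? (IHttpActionResult)NotFound() : Ok(hero);
}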


Run the project and navigate back to /swagger. You should see more details added to your API documentation. I’ve highlighted a few below with their corresponding XML comment.


Under Response Class, click Model. You should see any XML comments added to your models.


Describing Enums As Strings

My Superhero class contains an enum property called Universe, which represents the comic universe the hero belongs to.

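Reconstructed for illustration (the real class in the demo may differ slightly), it looks something like this:

public enum Universe
{
    Marvel,
    DC
}

public class Superhero
{
    public int Id { get; set; }
    public string Name { get; set; }

    /// <summary>The comic universe the superhero belongs to</summary>
    public Universe Universe { get; set; }
}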

By default, Swagger displays these Enum values as their integer value. This is not very descriptive. Let’s change it to display the string representation.

GlobalConfiguration.Configuration
  .EnableSwagger(c =>
  {
    c.SingleApiVersion("v1", "SwaggerDemoApi");
    c.IncludeXmlComments(string.Format(@"{0}\bin\SwaggerDemoApi.XML", 
                         System.AppDomain.CurrentDomain.BaseDirectory));
    c.DescribeAllEnumsAsStrings();
  })
  .EnableSwaggerUi();

If I look at Swagger now, the Universe Enum values are displayed as strings.


These are just a few of the many configuration options you can specify in Swashbuckle to create your Swagger metadata. I encourage you to review the other options on Swashbuckle’s GitHub.

Happy coding!

Importing a BACPAC to SQL Server

We previously looked at Create a backup for Azure SQL Server, and in today's post we are going to address how to look at that data by restoring or importing it to a local SQL Server.

To start, open SQL Server Management Studio (SSMS) and connect to a local instance of SQL Server. Right-click on the instance name and select Import Data-tier Application.


Simply click Next to get past the welcome screen of the import wizard.


Click Browse and locate the BACPAC file on your local computer. Click Next.


Alternatively, change the radio button to Import from Windows Azure and click Connect. You will be prompted to enter your storage account name and access key, and then locate the BACPAC in your storage account. This will be downloaded as part of the import process to a temporary directory that can also be specified in the wizard.

On the database settings page of the wizard the database name, data file storage path and log file storage paths can be modified. The default locations for the data and log files will be pulled from the model database. Click Next.


Click Finish on the Summary page to begin the import.


Each step and the status of the operation will be displayed. Assuming all green check marks, click Close on the wizard. If there are any errors, click the link in the Result column to see the details behind the failure. There should also be a new database in the SQL Server Object Explorer carrying the same name specified on the Database Settings page of the import wizard.
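
If you would rather script the import than click through the wizard, the same result can be achieved with the SqlPackage utility; the paths and names below are placeholders:

SqlPackage.exe /Action:Import ^
  /SourceFile:"C:\backups\MyDatabase.bacpac" ^
  /TargetServerName:localhost ^
  /TargetDatabaseName:MyDatabaseRestored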


This satisfies the full set of requirements given by the customer:

  • Full backup of the data, archived monthly for 10 years – this can be stored in Azure blob storage and/or downloaded and stored locally
  • Ability to restore the archive at any time – a BACPAC can be imported to Azure SQL Database or to a local SQL Server
  • Maintain data access should the customer decide to no longer leverage Azure SQL Database – BACPAC files can be imported to a local SQL Server instance
