Xamarin build error: defining a default interface method requires --min-sdk-version >= 24

I added Xam.Plugins.Android.ExoPlayer to my project and then received this error:

java/lang/Object;I)V: defining a default interface method requires --min-sdk-version >= 24 (currently 13) for interface methods: com.google.android.exoplayer2.Player$EventListener.onTimelineChanged : (Lcom/google/android/exoplayer2/Timeline;Ljava/lang/Object;I)V

Looking around, I discovered that other people had the same issue, and the problem sits in the Android Options of the Project Properties: no Dex compiler was specified. To fix it, select the D8 Dex compiler in the Android project properties:

[Image: Android project options]

In code:

<AndroidDexTool>d8</AndroidDexTool>
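If you prefer to edit the project file directly, the property goes in your Android project's .csproj. The PropertyGroup shown here is only illustrative; the condition and surrounding contents will vary by project:

```xml
<!-- Android .csproj (illustrative PropertyGroup; yours will differ) -->
<PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
  <!-- Use the newer D8 dex compiler instead of the legacy dx compiler -->
  <AndroidDexTool>d8</AndroidDexTool>
</PropertyGroup>
```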

Happy coding!

Using an in-memory repository. Keys will not be persisted to storage. – ASP.NET Core under IIS

.NET Core Data Protection

One of the main benefits of building a new .NET project on .NET Core is cross-platform deployment; however, IIS will still be a common home for ASP.NET Core web applications.

In ASP.NET Core 2.0 MVC applications, a feature that is transparently configured during app Startup is Data Protection. Data Protection provides a cryptographic foundation for features like ASP.NET Identity, among many others.

When the Data Protection system is initialized, it applies default settings based on the operational environment. These settings are generally appropriate for apps running on a single machine.  – Rick Anderson

The app attempts to detect its operational environment and handle key configuration on its own.

Default Configuration Logic

  1. If the app is hosted in Azure Apps, keys are persisted to the %HOME%\ASP.NET\DataProtection-Keys folder. This folder is backed by network storage and is synchronized across all machines hosting the app.
     • Keys aren’t protected at rest.
     • The DataProtection-Keys folder supplies the key ring to all instances of an app in a single deployment slot.
     • Separate deployment slots, such as Staging and Production, don’t share a key ring. When you swap between deployment slots, for example swapping Staging to Production or using A/B testing, any app using Data Protection won’t be able to decrypt stored data using the key ring inside the previous slot. This leads to users being logged out of an app that uses the standard ASP.NET Core cookie authentication, as it uses Data Protection to protect its cookies. If you desire slot-independent key rings, use an external key ring provider, such as Azure Blob Storage, Azure Key Vault, a SQL store, or Redis cache.
  2. If the user profile is available, keys are persisted to the %LOCALAPPDATA%\ASP.NET\DataProtection-Keys folder. If the operating system is Windows, the keys are encrypted at rest using DPAPI.
  3. If the app is hosted in IIS, keys are persisted to the HKLM registry in a special registry key that is ACLed only to the worker process account. Keys are encrypted at rest using DPAPI.
  4. If none of these conditions match, keys aren’t persisted outside of the current process. When the process shuts down, all generated keys are lost.

For IIS, the item we’re interested in here is #3. The default configuration will store the keys in the system registry, that way the keys persist between AppPool restarts and machine restarts. It also lets you share the same key between applications if necessary (via a configuration addition to Startup.cs).
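As a sketch of that key-sharing scenario (the key path and application name below are hypothetical), each app opts into the same key location and application name in its ConfigureServices:

```csharp
using System.IO;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddDataProtection()
        // Both apps must point at the same key store...
        .PersistKeysToFileSystem(new DirectoryInfo(@"C:\shared-keys"))
        // ...and use the same application name to read each other's payloads.
        .SetApplicationName("my_shared_app");
}
```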

The Problem

Once you deploy your app and run it under an IIS App Pool, you may find that the Data Protection keys are not being persisted. If you have error logging you’ll see entries like this:

  • No XML encryptor configured. Key may be persisted to storage in unencrypted form.
  • Neither user profile nor HKLM registry available. Using an ephemeral key repository. Protected data will be unavailable when application exits.
  • Using an in-memory repository. Keys will not be persisted to storage.

[Image: Data Protection warnings in the application log]

This means that each time your app pool restarts, new keys are generated, and any encrypted codes or values which have been stored or transmitted will no longer be usable. A basic example is a Forgotten Password request using ASP.NET Core Identity: if you request a password reset email, an encrypted URL is sent in the email for you to click on. If the app pool restarts before you get around to clicking that link, the token can no longer be decrypted and the reset will fail. This scenario becomes much worse if you’re storing long-term encrypted data for later decryption.

The Solution

This issue stems from a bug in IIS itself which may or may not ever be corrected. In order to work around the issue, it’s necessary for you to edit your App Pool to enable User Profile Loading. Once you set your App Pool to load the user profile for the application pool identity, the application will have permission to read and write to the system registry as intended.

[Image: IIS Advanced Settings – Load User Profile]

Alternatively, you can configure Data Protection to use a different method of key storage, like a UNC share.

PersistKeysToFileSystem

To store keys on a UNC share instead of at the %LOCALAPPDATA% default location, configure the system with PersistKeysToFileSystem:


using System.IO;
using Microsoft.AspNetCore.DataProtection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddDataProtection()
        .PersistKeysToFileSystem(new DirectoryInfo(@"\\server\share\directory\"));
}
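Note that keys persisted to the file system are not encrypted at rest by default, which is what the "No XML encryptor configured" warning refers to. On Windows you can layer DPAPI protection on top; a sketch:

```csharp
using System.IO;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddDataProtection()
        .PersistKeysToFileSystem(new DirectoryInfo(@"\\server\share\directory\"))
        // Encrypt keys at rest using Windows DPAPI (Windows only)
        .ProtectKeysWithDpapi();
}
```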

Accessing the OIDC tokens in ASP.NET Core 2.0

In ASP.NET Core 1.x

In ASP.NET Core 1.x, if you wanted to access the tokens (id_token, access_token, and refresh_token) from your application, you could set the SaveTokens property when registering the OIDC middleware:

// Inside your Configure method
app.UseOpenIdConnectAuthentication(new OpenIdConnectOptions("Auth0")
{
    // Set all your OIDC options...

    // and then set SaveTokens to save tokens to the AuthenticationProperties
    SaveTokens = true
});
You would then subsequently be able to retrieve those tokens by calling GetAuthenticateInfoAsync inside your controllers, and use the result to retrieve the tokens, for example:

// Inside one of your controllers
if (User.Identity.IsAuthenticated)
{
    var authenticateInfo = await HttpContext.Authentication.GetAuthenticateInfoAsync("Auth0");
    string accessToken = authenticateInfo.Properties.Items[".Token.access_token"];
    string idToken = authenticateInfo.Properties.Items[".Token.id_token"];
}


In ASP.NET Core 2.0

In ASP.NET Core 2.0 this has changed. Firstly, you now register your OIDC middleware inside ConfigureServices as follows (making sure to set SaveTokens to true):

// Inside your ConfigureServices method
services.AddAuthentication(options => {
    options.DefaultAuthenticateScheme = CookieAuthenticationDefaults.AuthenticationScheme;
    options.DefaultSignInScheme = CookieAuthenticationDefaults.AuthenticationScheme;
    options.DefaultChallengeScheme = OpenIdConnectDefaults.AuthenticationScheme;
})
.AddCookie()
.AddOpenIdConnect(options => {
    // Set all your OIDC options...

    // and then set SaveTokens to save tokens to the AuthenticationProperties
    options.SaveTokens = true;
});

You would then subsequently be able to retrieve those tokens by calling GetTokenAsync for each of the tokens you want to access. The code sample below shows how to access the access_token and the id_token:

// Inside one of your controllers
if (User.Identity.IsAuthenticated)
{
    string accessToken = await HttpContext.GetTokenAsync("access_token");
    string idToken = await HttpContext.GetTokenAsync("id_token");

    // Now you can use them. For more info on when and how to use the 
    // access_token and id_token, see https://auth0.com/docs/tokens
}
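Once retrieved, the access_token is typically attached as a bearer token when calling your API. A minimal sketch (the API URL here is hypothetical):

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using Microsoft.AspNetCore.Authentication;

// Inside one of your controllers
string accessToken = await HttpContext.GetTokenAsync("access_token");

var client = new HttpClient();
// Attach the token as a Bearer credential
client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", accessToken);

// Call a protected API with the token attached
var response = await client.GetAsync("https://api.example.com/resource");
```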

Happy coding!

First example with ReactJs

ReactJs time!

To build my first example in ReactJs, I'm using CodePen. Create a new pen. In Settings, under JavaScript, select Babel as the JavaScript Preprocessor. Then, under Add External Scripts/Pens, search for React and add react and react-dom.

[Image: CodePen settings for a React pen]

See the Pen React Starter by Enrico (@erossini) on CodePen.

Creating pop-up footnotes in EPUB 3

[Image: pop-up footnote in a reading system]

In EPUB 3 Flowing and Fixed Layout books, you can create pop-up footnotes by labeling footnotes with the appropriate epub:type values. You use two elements to create a pop-up footnote: an anchor (<a>) element that triggers the popup and the <aside> element that contains the footnote text. Both elements have an epub:type attribute to identify their purpose: epub:type="noteref" to trigger the popup and epub:type="footnote" to indicate the footnote’s text.

In the example below, the anchor element (<a>) has two attributes: epub:type="noteref" and a link that references the location of the element that contains the popup's text.

The <aside> element that contains the popup's text also has two attributes:

  • id="myNote" that matches the value of the href attribute in the link that references it

  • epub:type="footnote"

Because the <aside> element has an epub:type of footnote, the text is hidden in the main body of the book. The text will only be seen by the reader in the context of the popup.

<html xmlns="http://www.w3.org/1999/xhtml" xmlns:epub="http://www.idpf.org/2007/ops">
<body>
  <p>
    <a href="chapter.xhtml#myNote" epub:type="noteref">1</a>
  </p>
  <aside id="myNote" epub:type="footnote">Text in popup</aside>
</body>
</html>

Example

Footnote and reference

<p>lorem ipsum.<a epub:type="noteref" href="#fn01">1</a></p>
<aside id="fn01" epub:type="footnote">
   My footnote
</aside>

Endnote and reference

<p>lorem ipsum.<a epub:type="noteref" href="#en01">1</a></p>
<aside id="en01" epub:type="endnote">
   My endnote
</aside>

Endnotes section

<section epub:type="endnotes">
   <h1>Endnotes</h1>
   
   <section>
     <h2>Chapter 1</h2>
     <aside epub:type="endnote">
       My endnote
     </aside>
   </section>
</section>

Google has been tracking nearly everything you buy online

[Image: Google purchase history page]

Google has been quietly keeping track of nearly every single online purchase you’ve ever made, thanks to purchase receipts sent to your personal Gmail account, according to a new report today from CNBC. Even stranger: this information is made available to you via a private web tool that’s been active for an indeterminate amount of time. You can go view it here.

According to CNBC, the company says it does not use this information for personalized ad tracking; Google said back in 2017 that it would stop using data collected from Gmail messages to personalize ads. You can also delete the information from the Purchases webpage, but you must do so individually for each recorded transaction.

Google, like Facebook, knows an immense amount of information about you, your personal habits, and, yes, what you buy on the internet. And like the social network it dominates the online advertising industry alongside, Google gets this information mostly through background data collection using methods and tools its users may not be fully aware of, like Gmail purchase receipts. This is true of web tools like Gmail and smart assistants, which are increasingly coming under scrutiny for the ways the data that software collects is observed by human employees during the artificial intelligence training process.

Microsoft brings PowerToys back to let anyone improve Windows 10 for power users

[Image: Windows 10 PowerToys]

Microsoft first introduced the concept of “PowerToys” in Windows 95. It was originally a way for Windows engineers to test a prototype feature, and Microsoft packaged some of the best ones into a PowerToys bundle. These PowerToys included popular utilities like Tweak UI to customize the Windows user interface, Quick Res to quickly change screen resolutions, and Send To X that let you send files and folders to the command line, clipboard, or desktop.

PowerToys disappeared after Windows XP, during a time when co-founder Bill Gates ordered a security review of everything that was going into Windows. These useful utilities are now being revived by Microsoft in a new effort to focus on what power users need in Windows 10. The software giant is open-sourcing PowerToys on GitHub, so anyone can contribute and create power user tools for Windows 10.

[Image: maximize to desktop widget]

The first two utilities that Microsoft is working on for Windows 10 are a new maximize to desktop widget and a Windows key shortcut guide. The maximize to desktop widget places a pop-up button over the maximize button when you hover over it. It’s designed to let you quickly send an app to another desktop, utilizing Windows 10’s multi-desktop view. The Windows shortcut guide utility simply shows a keyboard shortcut guide when you hold down the Windows key.

Microsoft is also considering 10 other utilities for these new PowerToys for Windows 10:

  1. Full window manager, including specific layouts for docking and undocking laptops
  2. Keyboard shortcut manager
  3. Win+R replacement
  4. Better alt+tab including browser tab integration and search for running apps
  5. Battery tracker
  6. Batch file re-namer
  7. Quick resolution swaps in task bar
  8. Mouse events without focus
  9. Cmd (or PS or Bash) from here
  10. Contents menu file browsing

Microsoft is looking for feedback and contributions over on GitHub, much like how the company recently open-sourced its Windows calculator for additional input and ideas. That effort resulted in a graphing mode being added to the Windows calculator. Microsoft is now planning to preview these PowerToys utilities in the summer, alongside the corresponding source code being published on GitHub.

GitHub Package Registry

[Image: GitHub Package Registry]

From today, GitHub Package Registry, a package management service that makes it easy to publish public or private packages next to your source code, is fully integrated with GitHub, so you can use the same search, browsing, and management tools to find and publish packages as you do for your repositories. You can also use the same user and team permissions to manage code and packages together. GitHub Package Registry provides fast, reliable downloads backed by GitHub’s global CDN. And it supports familiar package management tools: JavaScript (npm), Java (Maven), Ruby (RubyGems), .NET (NuGet), and Docker images, with more to come.

You can try GitHub Package Registry today in limited beta. It will always be free to use for open source—more pricing details will be announced soon.

Sign up for the beta

Microsoft’s Chromium Edge browser is now officially available to test

Microsoft is making its Chromium-powered Edge browser available to developers today. The software giant is releasing its Canary and Developer builds, offering daily or weekly updates to the changes that are coming to Edge. Both downloads are available on Microsoft’s new Edge insider site, and they are designed for developers to get an early look at how Edge is changing.

Microsoft has focused on the fundamentals of browsing, reliability, and extension support for this early version of Edge built on Chromium, and the company is looking for feedback about the basics to start. Encouragingly, this new Edge browser runs surprisingly well, with full support for existing Chrome extensions. Microsoft is even building in sync support for things like favorites, browsing history, and extensions to sync across Edge. Favorites is only supported in this early version today, but sync support will be gradually improved before this new version of Edge is more broadly available in a beta version.

Both Microsoft and Google engineers have been working together to improve the underlying Chromium project so that Chrome and Edge run better on Windows. Microsoft has had around 150 commits accepted into Chromium, paving the way for improvements to Edge and Chromium on Windows 10. That includes improving accessibility, smooth scrolling support, Windows Hello integration, and things like ensuring the touch keyboard shows up reliably.


Using dependency injection in a .Net Core console application

One of the key features of ASP.NET Core is baked in dependency injection.

Whether you choose to use the built in container or a third party container will likely come down to whether the built in container is powerful enough for your given project. For small projects it may be fine, but if you need convention based registration, logging/debugging tools, or more esoteric approaches like property injection, then you'll need to look elsewhere.

Why use the built-in container?

One question that's come up a few times is whether you can use the built-in provider in a .NET Core console application. The short answer is: not out-of-the-box, but adding it in is pretty simple. Having said that, whether it is worth using in this case is another question.

One of the advantages of the built-in container in ASP.NET Core is that the framework libraries themselves register their dependencies with it. When you call the AddMvc() extension method in your Startup.ConfigureServices method, the framework registers a whole plethora of services with the container. If you later add a third-party container, those dependencies are passed across to be re-registered, so they are available when resolved via the third party.

If you are writing a console app, then you likely don't need MVC or other ASP.NET Core specific services. In that case, it may be just as easy to start right off the bat using StructureMap or AutoFac instead of the limited built-in provider.

Having said that, most common services designed for use with ASP.NET Core will have extensions for registering with the built in container via IServiceCollection, so if you are using services such as logging, or the Options pattern, then it is certainly easier to use the provided extensions, and plug a third party on top of that if required.
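For instance, the Options pattern extensions target IServiceCollection directly, so a console app can use them unchanged. A sketch, where MySettings is a hypothetical class:

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;

public class MySettings
{
    public string Greeting { get; set; }
}

// In the console app's composition root
var serviceProvider = new ServiceCollection()
    .AddOptions()
    .Configure<MySettings>(s => s.Greeting = "Hello from options")
    .BuildServiceProvider();

// Resolve the configured options, just as ASP.NET Core would
var settings = serviceProvider.GetService<IOptions<MySettings>>().Value;
Console.WriteLine(settings.Greeting);
```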

Adding DI to a console app

If you decide the built-in container is the right approach, then adding it to your application is very simple using the Microsoft.Extensions.DependencyInjection package.

[Image: ASP.NET Core dependency injection]

To demonstrate the approach, I'm going to create a simple application that has two services:

public interface IFooService
{
    void DoThing(int number);
}

public interface IBarService
{
    void DoSomeRealWork();
}

Each of these services will have a single implementation. The BarService depends on an IFooService, and the FooService uses an ILoggerFactory to log some work:

public class BarService : IBarService
{
    private readonly IFooService _fooService;
    public BarService(IFooService fooService)
    {
        _fooService = fooService;
    }

    public void DoSomeRealWork()
    {
        for (int i = 0; i < 10; i++)
        {
            _fooService.DoThing(i);
        }
    }
}

public class FooService : IFooService
{
    private readonly ILogger<FooService> _logger;
    public FooService(ILoggerFactory loggerFactory)
    {
        _logger = loggerFactory.CreateLogger<FooService>();
    }

    public void DoThing(int number)
    {
        _logger.LogInformation($"Doing the thing {number}");
    }
}

As you can see above, I'm using the new logging infrastructure in my app, so I need to add the appropriate package to my project.json. I'll also add the DependencyInjection package and the Microsoft.Extensions.Logging.Console package so I can see the results of my logging:

{
  "dependencies": {
    "Microsoft.Extensions.Logging": "1.0.0",
    "Microsoft.Extensions.Logging.Console": "1.0.0",
    "Microsoft.Extensions.DependencyInjection": "1.0.0"
  }
}

Finally, I'll update my static void Main to put all the pieces together. We'll walk through it in a second.

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

public class Program
{
    public static void Main(string[] args)
    {
        //setup our DI
        var serviceProvider = new ServiceCollection()
            .AddLogging()
            .AddSingleton<IFooService, FooService>()
            .AddSingleton<IBarService, BarService>()
            .BuildServiceProvider();

        //configure console logging
        serviceProvider
            .GetService<ILoggerFactory>()
            .AddConsole(LogLevel.Debug);

        var logger = serviceProvider.GetService<ILoggerFactory>()
            .CreateLogger<Program>();
        logger.LogDebug("Starting application");

        //do the actual work here
        var bar = serviceProvider.GetService<IBarService>();
        bar.DoSomeRealWork();

        logger.LogDebug("All done!");
    }
}

The first thing we do is configure the dependency injection container by creating a ServiceCollection, adding our dependencies, and finally building an IServiceProvider. This process is equivalent to the ConfigureServices method in an ASP.NET Core project, and is pretty much what happens behind the scenes. You can see we are using the IServiceCollection extension method to add the logging services to our application, and then registering our own services. The serviceProvider is our container we can use to resolve services in our application.

In the next step, we need to configure the logging infrastructure with a provider, so the results are output somewhere. We first fetch an instance of ILoggerFactory from our newly constructed serviceProvider, and add a console logger.

The remainder of the program shows more dependency injection in progress. We first create an ILogger<T> via the ILoggerFactory from the container, and then fetch an instance of IBarService. As per our registrations, the IBarService is an instance of BarService, which will have an instance of FooService injected into it.

We can then run our application and see all our beautifully resolved dependencies!

[Image: console output]

Adding StructureMap to your console app

As described previously, the built-in container is useful for adding framework libraries using the extension methods, like we saw with AddLogging above. However it is much less fully featured than many third-party containers.

For completeness, I'll show how easy it is to update the application to use a hybrid approach, using the built in container to easily add any framework dependencies, and using StructureMap for your own code.

First you need to add StructureMap to your project.json dependencies:

{
  "dependencies": {
    "StructureMap.Microsoft.DependencyInjection": "1.2.0"
  }
}

Now we'll update our static void Main to use StructureMap for registering our custom dependencies:

public static void Main(string[] args)
{
    // add the framework services
    var services = new ServiceCollection()
        .AddLogging();

    // add StructureMap
    var container = new Container();
    container.Configure(config =>
    {
        // Register stuff in container, using the StructureMap APIs...
        config.Scan(_ =>
                    {
                        _.AssemblyContainingType(typeof(Program));
                        _.WithDefaultConventions();
                    });
        // Populate the container using the service collection
        config.Populate(services);
    });

    var serviceProvider = container.GetInstance<IServiceProvider>();

    // rest of method as before
}

At first glance this may seem more complicated than the previous version, and it is, but it is also far more powerful. In the StructureMap example, we didn't have to explicitly register our IFooService or IBarService services; they were automatically registered by convention. When your apps start to grow, this sort of convention-based registration becomes enormously powerful, especially when coupled with the error handling and debugging capabilities available to you.

In this example I showed how to use StructureMap with the adapter to work with the IServiceCollection extension methods, but there's obviously no requirement to do that. Using StructureMap as your only registration source is perfectly valid; you'll just have to manually register any services added as part of the AddPLUGIN extension methods directly.

Dependency Injection with constructor parameters

The following code shows you how to configure DI for objects that have parameters in the constructor.

using Microsoft.Extensions.DependencyInjection;
using System;

namespace NetCoreDIDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            var services = new ServiceCollection();
            services.AddTransient<IMyService>(s => new MyService("MyConnectionString"));
            var provider = services.BuildServiceProvider();
            var myService = provider.GetService<IMyService>();
            
            Console.WriteLine($"The constructor parameter is: {myService.GetConstructorParameter()}");
            Console.ReadKey();
        }
    }

    public interface IMyService
    {
        string GetConstructorParameter();
    }

    public class MyService : IMyService
    {
        private string connectionString;
        public MyService(string connString)
        {
            this.connectionString = connString;
        }

        public string GetConstructorParameter()
        {
            return connectionString;
        }
    }
}

Happy coding!
