Channel: ASP.NET Core – Software Engineering

Creating PDF files in ASP.NET Core


This article shows how to create PDF files in ASP.NET Core. I decided to use PDFSharp, because I like this library, but no NuGet packages exist for .NET Standard 2.0. YetaWF created a port for this, which is used 1:1 in this example, without changes. It would be great to see PDFSharp released as a .NET Standard 2.0 NuGet package.

Code: https://github.com/damienbod/AspNetCorePdf

Part 2: Creating a PDF in ASP.NET Core using MigraDoc PDFSharp

Setting up the projects

To get the PDFSharp code working in ASP.NET Core, the best way is to clone the PDFsharp-.netcoreapp2.0 repository from YetaWF, and add this to your solution as a project. Then create an ASP.NET Core application, MVC or Razor Pages as preferred, and add a reference to the project.

Using the PDFSharp project

The example adds an HTTP GET action, which creates a PdfData model that is used as the input for the PDF document. The PdfService was added as a scoped service to the IoC container, and it creates the PDF document. This PDF document is then saved to the file system. The document is also returned in the HTTP response.
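The scoped registration for the service could look like this in the Startup class (a sketch; the IPdfService interface comes from the sample repository, the registration line itself is an assumption):

```csharp
// Startup.ConfigureServices: register the PDF service with a scoped lifetime,
// so a new PdfService instance is created per request
services.AddScoped<IPdfService, PdfService>();
```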

[HttpGet]
public FileStreamResult CreatePdf()
{
	var data = new PdfData
	{
		DocumentTitle = "This is my demo document Title",
		DocumentName = "myFirst",
		CreatedBy = "Damien",
		Description = "some data description which I have, and want to display in the PDF file..., This is another text, what is happening here, why is this text display...",
		DisplayListItems = new List<ItemsToDisplay>
		{
			new ItemsToDisplay{ Id = "Print Servers", Data1= "some data", Data2 = "more data to display"},
			new ItemsToDisplay{ Id = "Network Stuff", Data1= "IP4", Data2 = "any left"},
			new ItemsToDisplay{ Id = "Job details", Data1= "too many", Data2 = "say no"},
			new ItemsToDisplay{ Id = "Firewall", Data1= "what", Data2 = "Let's burn it"}

		}
	};
	var path = _pdfService.CreatePdf(data);

	var stream = new FileStream(path, FileMode.Open);
	return File(stream, "application/pdf");
}

The PdfService implements one public method, CreatePdf, which takes the model as a parameter. The path configurations are defined as private fields in the class. In a real application, these settings would be read from the appsettings.json configuration. The method sets up the PDF document and pages using the PDFSharp project. Each part of the document is then created in separate private methods.

using AspNetCorePdf.PdfProvider.DataModel;
using PdfSharp.Drawing;
using PdfSharp.Drawing.Layout;
using PdfSharp.Fonts;
using PdfSharp.Pdf;
using System;
using System.IO;

namespace AspNetCorePdf.PdfProvider
{
    public class PdfService : IPdfService
    {
        private string _createdDocsPath = ".\\PdfProvider\\Created";
        private string _imagesPath = ".\\PdfProvider\\Images";
        private string _resourcesPath = ".\\PdfProvider\\Resources";

        public string CreatePdf(PdfData pdfData)
        {
            if (GlobalFontSettings.FontResolver == null)
            {
                GlobalFontSettings.FontResolver = new FontResolver(_resourcesPath);
            }

            var document = new PdfDocument();
            var page = document.AddPage();
            var gfx = XGraphics.FromPdfPage(page);
    
            AddTitleLogo(gfx, page, $"{_imagesPath}\\logo.jpg", 0, 0);
            AddTitleAndFooter(page, gfx, pdfData.DocumentTitle, document, pdfData);

            AddDescription(gfx, pdfData);

            AddList(gfx, pdfData);

            string docName = $"{_createdDocsPath}/{pdfData.DocumentName}-{DateTime.UtcNow.ToOADate()}.pdf";
            document.Save(docName);
            return docName;
        }
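As mentioned above, in a real application the three path fields would be read from configuration instead of being hard-coded. A minimal sketch using the options pattern (the PdfPaths section name and the PdfPathOptions class are assumptions, not part of the sample):

```csharp
// An options class bound to a "PdfPaths" section in appsettings.json (assumed names)
public class PdfPathOptions
{
    public string CreatedDocsPath { get; set; }
    public string ImagesPath { get; set; }
    public string ResourcesPath { get; set; }
}

// Startup.ConfigureServices: bind the options from configuration
services.Configure<PdfPathOptions>(Configuration.GetSection("PdfPaths"));

// PdfService could then take IOptions<PdfPathOptions> in its constructor
public PdfService(IOptions<PdfPathOptions> pdfPathOptions)
{
    _createdDocsPath = pdfPathOptions.Value.CreatedDocsPath;
    _imagesPath = pdfPathOptions.Value.ImagesPath;
    _resourcesPath = pdfPathOptions.Value.ResourcesPath;
}
```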

XGraphics is then used to create the document as required. Refer to the samples for reference:

http://www.pdfsharp.net/wiki/PDFsharpSamples.ashx

void AddTitleLogo(XGraphics gfx, PdfPage page, string imagePath, int xPosition, int yPosition)
{
	if (!File.Exists(imagePath))
	{
		throw new FileNotFoundException(String.Format("Could not find image {0}.", imagePath));
	}

	XImage xImage = XImage.FromFile(imagePath);
	// scale the logo down, preserving its aspect ratio
	gfx.DrawImage(xImage, xPosition, yPosition, xImage.PixelWidth / 8, xImage.PixelHeight / 8);
}

void AddTitleAndFooter(PdfPage page, XGraphics gfx, string title, PdfDocument document, PdfData pdfData)
{
	XRect rect = new XRect(new XPoint(), gfx.PageSize);
	rect.Inflate(-10, -15);
	XFont font = new XFont("OpenSans", 14, XFontStyle.Bold);
	gfx.DrawString(title, font, XBrushes.MidnightBlue, rect, XStringFormats.TopCenter);

	rect.Offset(0, 5);
	font = new XFont("OpenSans", 8, XFontStyle.Italic);
	XStringFormat format = new XStringFormat();
	format.Alignment = XStringAlignment.Near;
	format.LineAlignment = XLineAlignment.Far;
	gfx.DrawString("Created by " + pdfData.CreatedBy, font, XBrushes.DarkOrchid, rect, format);

	font = new XFont("OpenSans", 8);
	format.Alignment = XStringAlignment.Center;
	gfx.DrawString(document.PageCount.ToString(), font, XBrushes.DarkOrchid, rect, format);

	document.Outlines.Add(title, page, true);
}

void AddDescription(XGraphics gfx, PdfData pdfData)
{
	var font = new XFont("OpenSans", 14, XFontStyle.Regular);
	XTextFormatter tf = new XTextFormatter(gfx);
	XRect rect = new XRect(40, 100, 520, 100);
	gfx.DrawRectangle(XBrushes.White, rect);
	tf.DrawString(pdfData.Description, font, XBrushes.Black, rect, XStringFormats.TopLeft);
}

void AddList(XGraphics gfx, PdfData pdfData)
{
	int startingHeight = 200;
	int listItemHeight = 30;

	for (int i = 0; i < pdfData.DisplayListItems.Count; i++)
	{
		var font = new XFont("OpenSans", 14, XFontStyle.Regular);
		XTextFormatter tf = new XTextFormatter(gfx);
		XRect rect = new XRect(60, startingHeight, 500, listItemHeight);
		gfx.DrawRectangle(XBrushes.White, rect);
		var data = $"{i}. {pdfData.DisplayListItems[i].Id} | {pdfData.DisplayListItems[i].Data1} | {pdfData.DisplayListItems[i].Data2}";
		tf.DrawString(data, font, XBrushes.Black, rect, XStringFormats.TopLeft);

		startingHeight = startingHeight + listItemHeight;
	}
}

When the application is run, the create PDF link can be clicked, and the PDF is created.

The PDF is returned in the browser. Each PDF can also be viewed in the Created directory.

This works really well, with little effort to set up. The PDFSharp code which is used is included in the repository. MigraDoc is not part of this; it would be nice to use it as well, but no solution exists which works for ASP.NET Core.

I really hope that the PDFSharp NuGet package as well as MigraDoc get ported to .NET Core.

Links:

https://damienbod.com/2018/10/03/creating-a-pdf-in-asp-net-core-using-migradoc-pdfsharp/

https://github.com/YetaWF/PDFsharp-.netcoreapp2.0

http://www.pdfsharp.net/wiki/Graphics-sample.ashx

http://www.pdfsharp.net/wiki/PDFsharpSamples.ashx

http://www.pdfsharp.net/

https://odetocode.com/blogs/scott/archive/2018/02/14/pdf-generation-in-azure-functions-v2.aspx

http://fizzylogic.nl/2017/08/03/how-to-generate-pdf-documents-in-asp-net-core/

https://github.com/rdvojmoc/DinkToPdf

https://photosauce.net/blog/post/5-reasons-you-should-stop-using-systemdrawing-from-aspnet


Implementing User Management with ASP.NET Core Identity and custom claims


The article shows how to implement user management for an ASP.NET Core application using ASP.NET Core Identity. The application uses custom claims, which need to be added to the user identity after a successful login, and then an ASP.NET Core policy is used to authorize the identity.

Code: https://github.com/damienbod/AspNetCoreAngularSignalRSecurity

Setting up the Project

The demo application is implemented using ASP.NET Core MVC and uses the IdentityServer and IdentityServer4.AspNetIdentity NuGet packages.

ASP.NET Core Identity is then added in the Startup class ConfigureServices method. SQLite is used as the database. A scoped service for the IUserClaimsPrincipalFactory is added, so that the additional claims can be added to the user identity in the HTTP context.

An IAuthorizationHandler service is added, so that the IsAdminHandler can be used for the IsAdmin policy. This policy can then be used to check if the identity has the custom claims which were added to the identity in the AdditionalUserClaimsPrincipalFactory implementation.

public void ConfigureServices(IServiceCollection services)
{
	...
	
	services.AddDbContext<ApplicationDbContext>(options =>
	 options.UseSqlite(Configuration.GetConnectionString("DefaultConnection")));

	services.AddIdentity<ApplicationUser, IdentityRole>()
	 .AddEntityFrameworkStores<ApplicationDbContext>()
	 .AddDefaultTokenProviders();

	services.AddScoped<IUserClaimsPrincipalFactory<ApplicationUser>, 
	 AdditionalUserClaimsPrincipalFactory>();

	services.AddSingleton<IAuthorizationHandler, IsAdminHandler>();
	services.AddAuthorization(options =>
	{
		options.AddPolicy("IsAdmin", policyIsAdminRequirement =>
		{
			policyIsAdminRequirement.Requirements.Add(new IsAdminRequirement());
		});
	});

	...
}

The application uses IdentityServer4. The UseIdentityServer extension method is used instead of UseAuthentication to add the authentication middleware.

public void Configure(IApplicationBuilder app, 
  IHostingEnvironment env, 
  ILoggerFactory loggerFactory)
{
	...
	
	app.UseStaticFiles();

	app.UseIdentityServer();

	app.UseMvc(routes =>
	{
		routes.MapRoute(
			name: "default",
			template: "{controller=Home}/{action=Index}/{id?}");
	});
}

The ApplicationUser class extends the IdentityUser class. Additional database fields can be added here, which are then used to create the claims for the logged-in user.

using Microsoft.AspNetCore.Identity;

namespace StsServer.Models
{
    public class ApplicationUser : IdentityUser
    {
        public bool IsAdmin { get; set; }
        public string DataEventRecordsRole { get; set; }
        public string SecuredFilesRole { get; set; }
    }
}

The AdditionalUserClaimsPrincipalFactory class extends the UserClaimsPrincipalFactory class, and is used to add the additional claims to the user object in the HTTP context. It was added as a scoped service in the Startup class. The ApplicationUser is then used, so that the custom claims can be added to the identity.

using IdentityModel;
using Microsoft.AspNetCore.Identity;
using Microsoft.Extensions.Options;
using StsServer.Models;
using System.Collections.Generic;
using System.Security.Claims;
using System.Threading.Tasks;

namespace StsServer
{
    public class AdditionalUserClaimsPrincipalFactory 
          : UserClaimsPrincipalFactory<ApplicationUser, IdentityRole>
    {
        public AdditionalUserClaimsPrincipalFactory( 
            UserManager<ApplicationUser> userManager,
            RoleManager<IdentityRole> roleManager, 
            IOptions<IdentityOptions> optionsAccessor) 
            : base(userManager, roleManager, optionsAccessor)
        {
        }

        public async override Task<ClaimsPrincipal> CreateAsync(ApplicationUser user)
        {
            var principal = await base.CreateAsync(user);
            var identity = (ClaimsIdentity)principal.Identity;

            var claims = new List<Claim>
            {
                new Claim(JwtClaimTypes.Role, "dataEventRecords"),
                new Claim(JwtClaimTypes.Role, "dataEventRecords.user")
            };

            if (user.DataEventRecordsRole == "dataEventRecords.admin")
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords.admin"));
            }

            if (user.IsAdmin)
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "admin"));
            }
            else
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "user"));
            }

            identity.AddClaims(claims);
            return principal;
        }
    }
}

Now the IsAdmin policy can check for this. First a requirement is defined, by implementing the IAuthorizationRequirement interface.

using Microsoft.AspNetCore.Authorization;
 
namespace StsServer
{
    public class IsAdminRequirement : IAuthorizationRequirement{}
}

The IsAdminHandler AuthorizationHandler uses the IsAdminRequirement requirement. If the user has a role claim with the value admin, the handler succeeds.

using Microsoft.AspNetCore.Authorization;
using System;
using System.Linq;
using System.Threading.Tasks;

namespace StsServer
{
    public class IsAdminHandler : AuthorizationHandler<IsAdminRequirement>
    {
        protected override Task HandleRequirementAsync(
          AuthorizationHandlerContext context, IsAdminRequirement requirement)
        {
            if (context == null)
                throw new ArgumentNullException(nameof(context));
            if (requirement == null)
                throw new ArgumentNullException(nameof(requirement));

            var adminClaim = context.User.Claims.FirstOrDefault(t => t.Value == "admin" && t.Type == "role"); 
            if (adminClaim != null)
            {
                context.Succeed(requirement);
            }

            return Task.CompletedTask;
        }
    }
}

The AdminController provides the CRUD operations for the Identity users. The AdminController uses the Authorize attribute with the IsAdmin policy to authorize. The AuthenticationSchemes needs to be set to “Identity.Application”, because ASP.NET Core Identity is being used. Now admins can create or edit Identity users.

using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using StsServer.Data;
using StsServer.Models;

namespace StsServer.Controllers
{
    [Authorize(AuthenticationSchemes = "Identity.Application", Policy = "IsAdmin")]
    public class AdminController : Controller
    {
        private readonly ApplicationDbContext _context;
        private readonly UserManager<ApplicationUser> _userManager;

        public AdminController(ApplicationDbContext context, UserManager<ApplicationUser> userManager)
        {
            _context = context;
            _userManager = userManager;
        }

        public async Task<IActionResult> Index()
        {
            return View(await _context.Users.Select(user => 
                new AdminViewModel {
                    Email = user.Email,
                    IsAdmin = user.IsAdmin,
                    DataEventRecordsRole = user.DataEventRecordsRole,
                    SecuredFilesRole = user.SecuredFilesRole
                }).ToListAsync());
        }

        public async Task<IActionResult> Details(string id)
        {
            if (id == null)
            {
                return NotFound();
            }

            var user = await _context.Users
                .FirstOrDefaultAsync(m => m.Email == id);
            if (user == null)
            {
                return NotFound();
            }

            return View(new AdminViewModel
            {
                Email = user.Email,
                IsAdmin = user.IsAdmin,
                DataEventRecordsRole = user.DataEventRecordsRole,
                SecuredFilesRole = user.SecuredFilesRole
            });
        }

        public IActionResult Create()
        {
            return View();
        }

        [HttpPost]
        [ValidateAntiForgeryToken]
        public async Task<IActionResult> Create(
         [Bind("Email,IsAdmin,DataEventRecordsRole,SecuredFilesRole")] AdminViewModel adminViewModel)
        {
            if (ModelState.IsValid)
            {
                await _userManager.CreateAsync(new ApplicationUser
                {
                    Email = adminViewModel.Email,
                    IsAdmin = adminViewModel.IsAdmin,
                    DataEventRecordsRole = adminViewModel.DataEventRecordsRole,
                    SecuredFilesRole = adminViewModel.SecuredFilesRole,
                    UserName = adminViewModel.Email
                });
                return RedirectToAction(nameof(Index));
            }
            return View(adminViewModel);
        }

        public async Task<IActionResult> Edit(string id)
        {
            if (id == null)
            {
                return NotFound();
            }

            var user = await _userManager.FindByEmailAsync(id);
            if (user == null)
            {
                return NotFound();
            }

            return View(new AdminViewModel
            {
                Email = user.Email,
                IsAdmin = user.IsAdmin,
                DataEventRecordsRole = user.DataEventRecordsRole,
                SecuredFilesRole = user.SecuredFilesRole
            });
        }

        [HttpPost]
        [ValidateAntiForgeryToken]
        public async Task<IActionResult> Edit(string id, [Bind("Email,IsAdmin,DataEventRecordsRole,SecuredFilesRole")] AdminViewModel adminViewModel)
        {
            if (id != adminViewModel.Email)
            {
                return NotFound();
            }

            if (ModelState.IsValid)
            {
                try
                {
                    var user = await _userManager.FindByEmailAsync(id);
                    user.IsAdmin = adminViewModel.IsAdmin;
                    user.DataEventRecordsRole = adminViewModel.DataEventRecordsRole;
                    user.SecuredFilesRole = adminViewModel.SecuredFilesRole;

                    await _userManager.UpdateAsync(user);
                }
                catch (DbUpdateConcurrencyException)
                {
                    if (!AdminViewModelExists(adminViewModel.Email))
                    {
                        return NotFound();
                    }
                    else
                    {
                        throw;
                    }
                }
                return RedirectToAction(nameof(Index));
            }
            return View(adminViewModel);
        }

        public async Task<IActionResult> Delete(string id)
        {
            if (id == null)
            {
                return NotFound();
            }

            var user = await _userManager.FindByEmailAsync(id);
            if (user == null)
            {
                return NotFound();
            }

            return View(new AdminViewModel
            {
                Email = user.Email,
                IsAdmin = user.IsAdmin,
                DataEventRecordsRole = user.DataEventRecordsRole,
                SecuredFilesRole = user.SecuredFilesRole
            });
        }

        [HttpPost, ActionName("Delete")]
        [ValidateAntiForgeryToken]
        public async Task<IActionResult> DeleteConfirmed(string id)
        {
            var user = await _userManager.FindByEmailAsync(id);
            await _userManager.DeleteAsync(user);
            return RedirectToAction(nameof(Index));
        }

        private bool AdminViewModelExists(string id)
        {
            return _context.Users.Any(e => e.Email == id);
        }
    }
}

Running the application

When the application is started, the ADMIN menu can be clicked, and the users can be managed by administrators.
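The ADMIN menu item itself can be hidden from non-admin users by evaluating the IsAdmin policy in the layout with the IAuthorizationService (a sketch; the menu markup is an assumption, not taken from the sample):

```cshtml
@using Microsoft.AspNetCore.Authorization
@inject IAuthorizationService AuthorizationService

@* Only render the ADMIN menu entry when the IsAdmin policy succeeds *@
@if ((await AuthorizationService.AuthorizeAsync(User, "IsAdmin")).Succeeded)
{
    <li><a asp-controller="Admin" asp-action="Index">ADMIN</a></li>
}
```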

Links

http://benfoster.io/blog/customising-claims-transformation-in-aspnet-core-identity

https://adrientorris.github.io/aspnet-core/identity/extend-user-model.html

https://docs.microsoft.com/en-us/aspnet/core/security/authentication/identity?view=aspnetcore-2.1&tabs=visual-studio

ASP.NET Core MVC Ajax Form requests using jquery-unobtrusive


This article shows how to send Ajax requests in an ASP.NET Core MVC application using jquery-unobtrusive. This can be tricky to set up, for example when using a list of data items with forms that use the onchange or oninput JavaScript events.

Code: https://github.com/damienbod/AspNetCoreBootstrap4Validation

Setting up the Project

The project uses the npm package.json file to add the required front-end packages to the project. jquery-ajax-unobtrusive is added, as well as the other required dependencies.

{
  "version": "1.0.0",
  "name": "asp.net",
  "private": true,
  "devDependencies": {
    "bootstrap": "4.1.3",
    "jquery": "3.3.1",
    "jquery-validation": "1.17.0",
    "jquery-validation-unobtrusive": "3.2.10",
    "jquery-ajax-unobtrusive": "3.2.4"
  }
}

bundleconfig.json is used to package and build the JavaScript and CSS files into bundles. The BuildBundlerMinifier NuGet package needs to be added to the project for this to work.

The JavaScript libraries are packaged into two different bundles, vendor.min.js and vendor-validation.min.js.

// Vendor JS
{
    "outputFileName": "wwwroot/js/vendor.min.js",
    "inputFiles": [
      "node_modules/jquery/dist/jquery.min.js",
      "node_modules/bootstrap/dist/js/bootstrap.bundle.min.js"
    ],
    "minify": {
      "enabled": true,
      "renameLocals": true
    },
    "sourceMap": false
},
// Vendor Validation JS
{
    "outputFileName": "wwwroot/js/vendor-validation.min.js",
    "inputFiles": [
      "node_modules/jquery-validation/dist/jquery.validate.min.js",
      "node_modules/jquery-validation/dist/additional-methods.js",
      "node_modules/jquery-validation-unobtrusive/dist/jquery.validate.unobtrusive.min.js",
      "node_modules//jquery-ajax-unobtrusive/jquery.unobtrusive-ajax.min.js"
    ],
    "minify": {
      "enabled": true,
      "renameLocals": true
    },
    "sourceMap": false
}

The global bundles can be added at the end of the _Layout.cshtml file in the ASP.NET Core MVC project.


... 
    <script src="~/js/vendor.min.js" asp-append-version="true"></script>
    <script src="~/js/site.min.js" asp-append-version="true"></script>
    @RenderSection("scripts", required: false)
</body>
</html>

And the validation bundle is added to the _ValidationScriptsPartial.cshtml.

<script src="~/js/vendor-validation.min.js" asp-append-version="true"></script>

This is then added in the views as required.

@section Scripts  {
    @await Html.PartialAsync("_ValidationScriptsPartial")
}

Simple AJAX Form request

A form request can be sent as an Ajax request by adding the data-ajax attributes to the form element. When the request has finished, the div element with the id defined in the data-ajax-update parameter will be replaced with the partial result response. The Html.PartialAsync method renders the initial view.

@{
    ViewData["Title"] = "Ajax Test Page";
}

<h4>Ajax Test</h4>

<form asp-action="Index" asp-controller="AjaxTest" 
      data-ajax="true" 
      data-ajax-method="POST"
      data-ajax-mode="replace" 
      data-ajax-update="#ajaxresult" >

    <div id="ajaxresult">
        @await Html.PartialAsync("_partialAjaxForm")
    </div>
</form>

@section Scripts  {
    @await Html.PartialAsync("_ValidationScriptsPartial")
}

The _partialAjaxForm.cshtml view implements the form contents. The submit button is required to send the request as an Ajax request.

@model AspNetCoreBootstrap4Validation.ViewModels.AjaxValidationModel 

<div asp-validation-summary="All" class="text-danger"></div>

<div class="form-group">
  <label for="name">Name</label>
  <input type="text" class="form-control" asp-for="Name" 
     id="AjaxValidationModelName" aria-describedby="nameHelp" 
     placeholder="Enter name">
  <small id="nameHelp" class="form-text text-muted">
    We'll never share your name ...
  </small>
  <span asp-validation-for="Name" class="text-danger"></span>
</div>

<div class="form-group">
  <label for="age">Age</label>
  <input type="number" class="form-control" 
    id="AjaxValidationModelAge" asp-for="Age" placeholder="0">
  <span asp-validation-for="Age" class="text-danger"></span>
</div>

<div class="form-check ten_px_bottom">
  <input type="checkbox" class="form-check-input big_checkbox"
      asp-for="IsCool" id="AjaxValidationModelIsCool">
  <label class="form-check-label ten_px_left" for="IsCool">IsCool</label>
  <span asp-validation-for="IsCool" class="text-danger"></span>
</div>

<button type="submit" class="btn btn-primary">Submit</button>

The ASP.NET Core MVC controller handles the requests from the view. The first Index method in the example below, just responds to a plain HTTP GET.

The second Index method accepts a POST request with the Anti-Forgery token which is sent with each request. When the result is successful, a partial view is returned. The model state must also be cleared, otherwise the validation messages will not be reset.

If the page returns the incorrect result, i.e. just the content of the partial view as a full page, then the request was not sent asynchronously, but as a full page request. In this case, check that the front-end packages are included correctly.

public class AjaxTestController : Controller
{
  public IActionResult Index()
  {
    return View(new AjaxValidationModel());
  }

  [HttpPost]
  [ValidateAntiForgeryToken]
  public IActionResult Index(AjaxValidationModel model)
  {
    if (!ModelState.IsValid)
    {
      return PartialView("_partialAjaxForm", model);
    }

    // the client could validate this, but allowed for testing server errors
    if(model.Name.Length < 3)
    {
      ModelState.AddModelError("name", "Name should be longer than 2 chars");
      return PartialView("_partialAjaxForm", model);
    }

    ModelState.Clear();
    return PartialView("_partialAjaxForm");
  }
}

Complex AJAX Form request

In this example, a list of data items is returned to the view. Each item in the list has a form to update its data, and the data is updated using the checkbox onchange event or the input text oninput event, not the submit button.

Because a list is used, each div element to be updated must have a unique id. This can be implemented by creating a new GUID for each item, which is then used in the id of the div to be updated and in the data-ajax-update parameter.

@using AspNetCoreBootstrap4Validation.ViewModels
@model AjaxValidationListModel
@{
    ViewData["Title"] = "Ajax Test Page";
}

<h4>Ajax Test</h4>

@foreach (var item in Model.Items)
{
    string guid = Guid.NewGuid().ToString();

    <form asp-action="Index" asp-controller="AjaxComplexList" 
          data-ajax="true" 
          data-ajax-method="POST"
          data-ajax-mode="replace" 
          data-ajax-update="#complex-ajax-@guid">

        <div id="complex-ajax-@guid">
            @await Html.PartialAsync("_partialComplexAjaxForm", item)
        </div>
    </form>
}


@section Scripts  {
    @await Html.PartialAsync("_ValidationScriptsPartial")
}

The form data is sent with an onchange JavaScript event from the checkbox. This could be required, for example, when the UX designer wants instant updates instead of an extra button click. To achieve this, the submit button is hidden. A unique id is used to identify each button, and the onchange event from the checkbox triggers the submit event using this id. The form request is then sent using Ajax like before.

@model AspNetCoreBootstrap4Validation.ViewModels.AjaxValidationModel
@{
    string guid = Guid.NewGuid().ToString();
}

<div asp-validation-summary="All" class="text-danger"></div>

<div class="form-group">
    <label for="name">Name</label>

    <input type="text" class="form-control" asp-for="Name" 
      id="AjaxValidationModelName" aria-describedby="nameHelp" placeholder="Enter name"
      oninput="$('#submit-@guid').trigger('submit');">

    <small id="nameHelp" class="form-text text-muted">We'll never share your name ...</small>
    <span asp-validation-for="Name" class="text-danger"></span>
</div>
<div class="form-group">
    <label for="age">Age</label>

    <input type="number" asp-for="Age" 
      class="form-control" id="AjaxValidationModelAge" placeholder="0"
      oninput="$('#submit-@guid').trigger('submit');">

    <span asp-validation-for="Age" class="text-danger"></span>
</div>
<div class="form-check ten_px_bottom">

    @Html.CheckBox("IsCool", Model.IsCool,
        new { onchange = "$('#submit-" + @guid + "').trigger('submit');", @class = "big_checkbox" })

    <label class="form-check-label ten_px_left" >Check the checkbox to send a request</label>
</div>

<button style="display: none" id="submit-@guid" type="submit">Submit</button>

The ASP.NET Core controller returns the HTTP GET and POST like before.

using AspNetCoreBootstrap4Validation.ViewModels;
using Microsoft.AspNetCore.Mvc;
using System.Collections.Generic;

namespace AspNetCoreBootstrap4Validation.Controllers
{
    public class AjaxComplexListController : Controller
    {
        public IActionResult Index()
        {
            return View(new AjaxValidationListModel {
                Items = new List<AjaxValidationModel> {
                    new AjaxValidationModel(),
                    new AjaxValidationModel()
                }
            });
        }

        [HttpPost]
        [ValidateAntiForgeryToken]
        public IActionResult Index(AjaxValidationModel model)
        {
            if (!ModelState.IsValid)
            {
                return PartialView("_partialComplexAjaxForm", model);
            }

            // the client could validate this, but allowed for testing server errors
            if(model.Name.Length < 3)
            {
                ModelState.AddModelError("name", "Name should be longer than 2 chars");
                return PartialView("_partialComplexAjaxForm", model);
            }

            ModelState.Clear();
            return PartialView("_partialComplexAjaxForm", model);
        }
    }
}

When the requests are sent, you can verify this in the network tab of the F12 developer tools in the browser. The request type should be xhr.
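jquery-unobtrusive-ajax sends its requests through jQuery's $.ajax, which adds the X-Requested-With: XMLHttpRequest header to same-origin requests. If an action should only ever answer Ajax calls, this header can be checked on the server (a defensive sketch, not part of the sample):

```csharp
// Inside a controller action: reject requests which were not sent via Ajax
if (!string.Equals(Request.Headers["X-Requested-With"], "XMLHttpRequest",
        StringComparison.OrdinalIgnoreCase))
{
    return BadRequest("This endpoint only accepts Ajax requests.");
}
```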

Links

https://dotnetthoughts.net/jquery-unobtrusive-ajax-helpers-in-aspnet-core/

https://www.mikesdotnetting.com/article/326/using-unobtrusive-ajax-in-razor-pages

https://www.learnrazorpages.com/razor-pages/ajax/unobtrusive-ajax

https://damienbod.com/2018/07/08/updating-part-of-an-asp-net-core-mvc-view-which-uses-forms/

https://ml-software.ch/blog/extending-client-side-validation-with-fluentvalidation-and-jquery-unobtrusive-in-an-asp-net-core-application

https://ml-software.ch/blog/extending-client-side-validation-with-dataannotations-and-jquery-unobtrusive-in-an-asp-net-core-application

OpenID Connect back-channel logout using Azure Redis Cache and IdentityServer4


This article shows how to implement an OpenID Connect back-channel logout, which uses the Azure Redis Cache so that the session logout works with multi-instance deployments.

Code: https://github.com/damienbod/AspNetCoreBackChannelLogout

Posts in this series:

Setting up the Azure Redis Cache

Before the Azure Redis Cache can be used in the application, it needs to be set up in Azure. Joonas Westlin has a nice blog post about this. The Azure Redis FAQ link is also very good, and should help you decide which configuration is correct for you.

Click “Create a Resource” and enter Redis Cache in the search input.

Then create the Redis Cache as required:

Creating the cache takes some time. Once finished, the connection string can be copied from the Access keys menu.

Now that the Azure Redis Cache is set up, you can add the cache to the ASP.NET Core application. In this example, the Microsoft.Extensions.Caching.Redis NuGet package is used to access and use the Azure Redis Cache. Add this to your project.

In the Startup class, add the distributed Redis cache using the AddDistributedRedisCache extension method from the NuGet package.

services.AddDistributedRedisCache(options =>
{
	options.Configuration = 
	  Configuration.GetConnectionString("RedisCacheConnection");
	options.InstanceName = "MvcHybridBackChannelInstance";
});

Add the Redis connection string to the appsettings.json. This example uses the RedisCacheConnection connection string. The value for this can be copied from the Access keys tab of the Redis cache which was created above.

The connection string should be added as a secret to the application, and not committed in the code.

"ConnectionStrings": {
    "RedisCacheConnection": "redis-connection-string"
},

Using the Cache for the Back-Channel logout

The LogoutSessionManager class uses the Azure Redis cache to add or get the different logouts. The OpenID Connect back-channel specification defines how this logout works. The Secure Token Server, implemented using IdentityServer4, sends a request to a back-channel logout URL which is handled in the client application.

The LogoutController class is used for this. If all the validation and the checks are ok, the class uses a singleton instance of LogoutSessionManager to manage the logouts for the client. The code used in this example was based on the IdentityServer4.Samples.
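According to the back-channel logout specification, the request from the STS is a form POST containing a logout_token JWT. A sketch of its payload is shown below; all values are illustrative except the events claim URI, which is fixed by the specification, and the sub and sid claims, which are the values used by the LogoutSessionManager:

```
{
  "iss": "https://localhost:44318",
  "sub": "248289761001",
  "aud": "mvc.hybrid.backchannel",
  "iat": 1546880026,
  "jti": "bWJq",
  "sid": "08a5019c-17e1-4977-8f42-65a12843ea02",
  "events": {
    "http://schemas.openid.net/event/backchannel-logout": {}
  }
}
```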

The IDistributedCache is added in the constructor and saved as a read only field in the class.

private static readonly Object _lock = new Object();
private readonly ILogger<LogoutSessionManager> _logger;
private readonly IDistributedCache _cache;

// Amount of time to keep logout entries in the cache. If this is too long,
// or if there are many user sessions, the cache will grow too much; if it
// is too short, a client opened later will miss the logout.
private const int cacheExpirationInDays = 8;

public LogoutSessionManager(ILoggerFactory loggerFactory, IDistributedCache cache)
{
	_cache = cache;
	_logger = loggerFactory.CreateLogger<LogoutSessionManager>();
}

When a logout is initialized by a user, from an application, this request is sent to the OpenID Connect server. The server does the logout logic, and sends requests back to all applications that have the back-channel configured.

The LogoutController handles this request from the Secure Token Server, and adds a key pair to the Redis cache using the sid and the sub.

The Redis cache is shared between all instances of the client application and needs to be thread safe. All client instances can then check if the user/application needs to be logged out.

public void Add(string sub, string sid)
{
	_logger.LogWarning($"Add a logout to the session: sub: {sub}, sid: {sid}");
	var options = new DistributedCacheEntryOptions()
          .SetSlidingExpiration(TimeSpan.FromDays(cacheExpirationInDays));

	lock (_lock)
	{
		var key = sub + sid;
		var logoutSession = _cache.GetString(key);
		if (logoutSession != null)
		{
			// An entry already exists for this sub/sid, nothing to add
			var session = JsonConvert.DeserializeObject<Session>(logoutSession);
		}
		else
		{
			var newSession = new Session { Sub = sub, Sid = sid };
			_cache.SetString(key, JsonConvert.SerializeObject(newSession), options);
		}
	}
}

The IsLoggedOutAsync method is used to check if a logout request exists for the application/user. This method uses the sub and sid values to request the Redis entry, if it exists.

public async Task<bool> IsLoggedOutAsync(string sub, string sid)
{
	var key = sub + sid;
	var matches = false;
	var logoutSession = await _cache.GetStringAsync(key);
	if (logoutSession != null)
	{
		var session = JsonConvert.DeserializeObject<Session>(logoutSession);
		matches = session.IsMatch(sub, sid);
		_logger.LogInformation($"Logout session exists T/F {matches} : {sub}, sid: {sid}");
	}

	return matches;
}

The method is used in the CookieEventHandler class in the ValidatePrincipal method to end the session if a logout request was found.

public override async Task ValidatePrincipal(CookieValidatePrincipalContext context)
{
	if (context.Principal.Identity.IsAuthenticated)
	{
		var sub = context.Principal.FindFirst("sub")?.Value;
		var sid = context.Principal.FindFirst("sid")?.Value;

		if (await LogoutSessions.IsLoggedOutAsync(sub, sid))
		{
			context.RejectPrincipal();
			await context.HttpContext.SignOutAsync(
                          CookieAuthenticationDefaults.AuthenticationScheme);
		}
	}
}

The CookieEventHandler was added in the Startup to the cookie configuration.

.AddCookie(options =>
{
	options.ExpireTimeSpan = TimeSpan.FromMinutes(60);
	options.Cookie.Name = "mvchybridbc";

	options.EventsType = typeof(CookieEventHandler);
})

Now Azure Redis cache is used to handle the back-channel logouts from the Secure Token Server.

Configure IdentityServer4 for custom end session Logic

If you want more control over which back-channel clients receive a logout request, you can implement the IEndSessionRequestValidator interface when using IdentityServer4. The GetClientEndSessionUrlsAsync method could be edited to change the clients which will be called after a logout event.

protected virtual async Task<(IEnumerable<string> frontChannel, 
     IEnumerable<BackChannelLogoutModel> backChannel)> 
  GetClientEndSessionUrlsAsync(EndSession endSession)
{
	var frontChannelUrls = new List<string>();
	var backChannelLogouts = new List<BackChannelLogoutModel>();

	List<string> backchannelLogouts = new List<string>
	{
		"mvc.hybrid.backchannel",
		"mvc.hybrid.backchanneltwo"
	};

	foreach (var clientId in backchannelLogouts)
	{
		// build the BackChannelLogoutModel for each configured client
		// ...
	}

	// ...
}

If the IEndSessionRequestValidator is implemented, this needs to be added to the ASP.NET Core IoC.

services.AddTransient<IEndSessionRequestValidator, 
   MyEndSessionRequestValidator>();

Notes, Problems

One problem with this approach is that all logouts are saved to the cache for n days. If the entries are removed too early, the logout will not work for a client application which is opened afterwards; if they are kept too long, the Redis cache becomes large, and costly.

Links:

https://joonasw.net/view/redis-cache-session-store

https://docs.microsoft.com/en-us/aspnet/core/performance/caching/distributed?view=aspnetcore-2.2#distributed-redis-cache

https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/

https://blogs.msdn.microsoft.com/luisdem/2016/09/06/azure-redis-cache-on-asp-net-core/

https://openid.net/specs/openid-connect-backchannel-1_0.html

http://docs.identityserver.io/en/release/topics/signout.html

https://ldapwiki.com/wiki/OpenID%20Connect%20Back-Channel%20Logout

https://datatracker.ietf.org/meeting/97/materials/slides-97-secevent-oidc-logout-01

https://docs.microsoft.com/en-us/aspnet/core/fundamentals/app-state?view=aspnetcore-2.2

https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/cache-dotnet-core-quickstart

Using Azure Key Vault with ASP.NET Core and Azure App Services

This article shows how to use an Azure Key Vault with an ASP.NET Core application deployed as an Azure App Service. The Azure App Service can use the system assigned identity to access the Key Vault. This needs to be configured in the Key Vault access policies using the service principal.

Code: https://github.com/damienbod/AspNetCoreBackChannelLogout

Create an Azure Key Vault

This is really easy, and does not require much effort. You can create an Azure Key Vault by following the Microsoft documentation here:

https://docs.microsoft.com/en-us/azure/key-vault/key-vault-get-started

Or using the Azure UI, you can create a Key Vault by clicking the “+ Create a Resource” blade and typing Key Vault in the search text input.

Fill out the inputs as required.

Now the Key Vault should be ready.

Create and Deploy the Azure App service

The second step is to deploy the ASP.NET Core application to Azure as an Azure App Service. You can do this in Visual Studio, or with templates from a build. Once the application is deployed, check that the Identity blade is configured correctly.

In the “App Services” blade, click the application which was deployed, and then the Identity blade. The status must be “On” in the “System assigned” tab.

See the Microsoft docs for Azure App Services deployments.

Add the Access Policy in the Key Vault for the App Service

Now that the Azure App Service is ready, the Key Vault must be configured to permit the App Service application access. In the Key Vault, click the “Access Policies” blade, and then “Add new”.

Then click “Select principal” and search for the Azure App Service which was created above. Make sure that the required permissions are activated when configuring; normally only the Get and List secret permissions are required. Click Save, and verify afterwards that the permissions were actually persisted.

Now the Azure App Service can access the Key Vault.

Add some secrets to the Key Vault:

The secrets can be added in different formats. See the Microsoft docs for secret text formats. Any app.settings.json format can be matched.
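Nested sections from app.settings.json are matched by using `--` as the section separator in the Key Vault secret name, since Key Vault secret names cannot contain the `:` character. For example, using the AuthConfiguration section from this application's settings:

```
Key Vault secret name             ASP.NET Core configuration key
SecretMvcHybridBackChannel   ->   SecretMvcHybridBackChannel
AuthConfiguration--Audience  ->   AuthConfiguration:Audience
```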

Configure the application to use the Key Vault for configuration values

The application now requires code to use the Azure Key Vault. Add the Microsoft.Extensions.Configuration.AzureKeyVault NuGet package to the project.

Add the Azure Key Vault configuration to the Program.cs file in the ASP.NET Core application. Add this in the BuildWebHost method using the ConfigureAppConfiguration method. The app.settings configuration value AzureKeyVaultEndpoint should have the DNS Name value of the Key Vault. This can be found in the overview of the Key Vault which was created. Then add the Key Vault to the application as follows:

public static IWebHost BuildWebHost(string[] args) =>
 WebHost.CreateDefaultBuilder(args)
 .ConfigureAppConfiguration((context, config) =>
 {
    var builder = config.Build();

    var keyVaultEndpoint = builder["AzureKeyVaultEndpoint"];

    var azureServiceTokenProvider = new AzureServiceTokenProvider();

    var keyVaultClient = new KeyVaultClient(
      new KeyVaultClient.AuthenticationCallback(
        azureServiceTokenProvider.KeyVaultTokenCallback)
      );

    config.AddAzureKeyVault(
      keyVaultEndpoint,
      keyVaultClient,
      new DefaultKeyVaultSecretManager());
 })
 .UseStartup<Startup>()
 .Build();

Remove any Configuration builders from the Startup constructor. The IConfiguration should be used, not created here.

public IConfiguration Configuration { get; }

public Startup(IConfiguration configuration, IHostingEnvironment env)
{
	JwtSecurityTokenHandler.DefaultInboundClaimTypeMap.Clear();
	Configuration = configuration;
	_environment = env;
}

Add the Key Vault developer values to the app.settings as required, or add the secret values for development to the secrets.json file.

{
  "ConnectionStrings": {
    "RedisCacheConnection": "redis-connection-string"
  },
  "Logging": {
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information"
    }
  },
  "AuthConfiguration": {
    "StsServerIdentityUrl": "https://localhost:44318",
    "Audience": "mvc.hybrid.backchannel"
  },
  "SecretMvcHybridBackChannel": "secret"
}

The configuration values will be set from the Key Vault first. If no Key Vault item exists, then the secrets.json file will be used, and after this, the app.settings.json file.

The Key Vault values can be used anywhere in the ASP.NET Core application by using the standard configuration interfaces.

The following demo uses the Test configuration value which is read from the Key Vault.

private AuthConfiguration _optionsAuthConfiguration;

private IConfiguration _configuration;

public HomeController(IOptions<AuthConfiguration> optionsAuthConfiguration, IConfiguration configuration)
{
	_configuration = configuration;
	_optionsAuthConfiguration = optionsAuthConfiguration.Value;
}

public IActionResult Index()
{
	var cs = _configuration["Test"];
	return View("Index",  cs);
}

Links

https://social.technet.microsoft.com/wiki/contents/articles/51871.net-core-2-managing-secrets-in-web-apps.aspx#AzureKeyVault_Secrets

https://docs.microsoft.com/en-us/azure/key-vault/key-vault-developers-guide

https://jeremylindsayni.wordpress.com/2018/03/15/using-the-azure-key-vault-to-keep-secrets-out-of-your-web-apps-source-code/

https://stackoverflow.com/questions/40025598/azure-key-vault-access-denied

https://cmatskas.com/securing-asp-net-core-application-settings-using-azure-key-vault/

https://github.com/jayendranarumugam/DemoSecrets/tree/master/DemoSecrets

https://docs.microsoft.com/en-us/cli/azure/install-azure-cli-windows?view=azure-cli-latest

Deploying ASP.NET Core App Services using Azure Key Vault and Azure Resource Manager templates

This article shows how to create an Azure Resource Manager (ARM) template which uses an Azure Key Vault. The ARM template is used to deploy an ASP.NET Core application as an Azure App Service. By using an Azure Resource Group project, the secret app settings can be fetched from the Azure Key Vault during deployment, and deployed to the Azure App Service. This makes it easy to automate the whole deployment process, and no secrets are added to the source.

Different services can then use the same secrets from the Azure Key Vault, so it is easy to change the secrets regularly. The Key Vault is only used during deployment.

A problem with this approach is that if secrets are shared across services, then all services need to be updated at the same time when the secret is changed. If the services were using the secrets directly, then the secret could be updated directly, although the services would have to use the new value, which usually means an application restart.

Code: https://github.com/damienbod/AspNetCoreBackChannelLogout

Create an Azure Resource Group project

In Visual Studio, click the Cloud menu, and select an “Azure Resource Group” project type.

Then choose Web App, as the target project will be deployed as an Azure App Service.

Add some Application settings to the WebSite. You can right-click the WebSite blade in the Json Outline window.

You can validate the template using the Azure CLI. You can also deploy this to Azure using Visual Studio (Right click the project). Deploy this to the same Resource Group as the Key Vault which you have already created, or need to create.

Microsoft Documentation for Visual Studio

Configure the Key Vault for the template

Before the Key Vault can be used in an Azure ARM template, this needs to be activated in the Key Vault. Open the Key Vault in Azure, select the Access Policies blade, then click “Show advanced access policies” and enable “Azure Resource Manager for template deployment”.

Using a Key Vault secret in the ARM template

The ARM template can now use the Azure Key Vault to set application settings. A parameter will be used for this. In the properties where the application settings are defined, add a new parameter which will be used for the Key Vault value. The name of the parameter is internal to the ARM template. In the following example, the app setting ClientSecret uses the ARM template parameter ‘name_of_parameter_in_template’.

"resources": [
{
  "name": "appsettings",
  "type": "config",
  "apiVersion": "2015-08-01",
  "dependsOn": [
	"[resourceId('Microsoft.Web/sites', variables('webSiteName'))]"
  ],
  "tags": {
	"displayName": "app"
  },
  "properties": {
	"ClientSecret": "[parameters('name_of_parameter_in_template')]",
	"ConnectionStrings:RedisCacheConnection": "[parameters('redisCacheConnection')]",
	"AuthConfiguration:StsServerIdentityUrl": "https://localhost:44318"
  }
}]

This is the code which matters:

[parameters('name_of_parameter_in_template')]

Add the parameter as a securestring in the template. You can navigate to this by using the Json Outline window in Visual Studio. The parameter used above, ‘name_of_parameter_in_template’, needs to be defined here.

 "parameters": {
    "name_of_parameter_in_template": {
      "type": "securestring"
    },

In the WebSite.parameters.json file, add the Key Vault configuration. Use the parameter defined above, ‘name_of_parameter_in_template’, and add the Azure Key Vault using the reference json object. This object has two properties: a keyVault object which requires the Key Vault resource id, and the secretName of the secret to use.

Open the Azure Key Vault and click the Properties blade. The RESOURCE ID is the id which is required here. The secretName is the name of the secret in the secrets blade, which will be used.

"name_of_parameter_in_template": {
  "reference": {
     "keyVault": {
          "id": "/subscriptions/..."
     },
     "secretName": "SecretMvcHybridBackChannel2"
  }
},

When the ARM template is deployed, the application setting will use the Key Vault secret to get the value, and deploy this as an application setting in the Azure App Service. The application can then use the application setting. You need to deploy the ASP.NET Core application to the newly created Azure App Service.

Links

https://docs.microsoft.com/en-us/azure/azure-resource-manager/vs-azure-tools-resource-groups-deployment-projects-create-deploy#deploy-code-with-your-infrastructure

https://docs.microsoft.com/en-us/azure/azure-resource-manager/

https://social.technet.microsoft.com/wiki/contents/articles/51871.net-core-2-managing-secrets-in-web-apps.aspx#AzureKeyVault_Secrets

https://docs.microsoft.com/en-us/azure/virtual-machines/azure-cli-arm-commands

https://docs.microsoft.com/en-us/azure/key-vault/key-vault-developers-guide

https://jeremylindsayni.wordpress.com/2018/03/15/using-the-azure-key-vault-to-keep-secrets-out-of-your-web-apps-source-code/

https://stackoverflow.com/questions/40025598/azure-key-vault-access-denied

https://cmatskas.com/securing-asp-net-core-application-settings-using-azure-key-vault/

https://github.com/jayendranarumugam/DemoSecrets/tree/master/DemoSecrets

https://docs.microsoft.com/en-us/cli/azure/install-azure-cli-windows?view=azure-cli-latest

Securing Angular applications using the OpenID Connect Code Flow with PKCE

In this post, I show how an Angular application could be secured using the OpenID Connect Code Flow with Proof Key for Code Exchange (PKCE).

The Angular application uses the OIDC lib angular-auth-oidc-client. In this example, the src code is used directly, but you could also use the npm package. Here’s an example which uses the npm package.

Code https://github.com/damienbod/AspNet5IdentityServerAngularImplicitFlow

lib src: https://github.com/damienbod/angular-auth-oidc-client

npm package: https://www.npmjs.com/package/angular-auth-oidc-client

Configuring the Angular client

The Angular application loads the configurations from a configuration json file. The response_type is set to “code”. This defines the OpenID Connect (OIDC) flow. PKCE is always used, as this is a public client which cannot keep a secret.

The other configurations must match the OpenID Connect client configurations on the server.

"ClientAppSettings": {
    "stsServer": "https://localhost:44318",
    "redirect_url": "https://localhost:44352",
    "client_id": "angular_code_client",
    "response_type": "code",
    "scope": "dataEventRecords securedFiles openid profile",
    "post_logout_redirect_uri": "https://localhost:44352",
    "start_checksession": true,
    "silent_renew": true,
    "startup_route": "/dataeventrecords",
    "forbidden_route": "/forbidden",
    "unauthorized_route": "/unauthorized",
    "log_console_warning_active": true,
    "log_console_debug_active": true,
    "max_id_token_iat_offset_allowed_in_seconds": 10
}

The Angular application reads the configuration in the app.module and initializes the security lib.

import { NgModule, APP_INITIALIZER } from '@angular/core';
import { HttpClientModule } from '@angular/common/http';

import { AuthModule } from './auth/modules/auth.module';
import { OidcSecurityService } from './auth/services/oidc.security.service';
import { OpenIDImplicitFlowConfiguration } from './auth/modules/auth.configuration';
import { OidcConfigService } from './auth/services/oidc.security.config.service';
import { AuthWellKnownEndpoints } from './auth/models/auth.well-known-endpoints';

// Add then other imports, config you need

export function loadConfig(oidcConfigService: OidcConfigService) {
    console.log('APP_INITIALIZER STARTING');
    return () => oidcConfigService.load(`${window.location.origin}/api/ClientAppSettings`);
}

@NgModule({
    imports: [
        BrowserModule,
        FormsModule,
        routing,
        HttpClientModule,
        AuthModule.forRoot(),
    ],
    declarations: [
        AppComponent,
    ],
    providers: [
        OidcConfigService,
        OidcSecurityService,
        {
            provide: APP_INITIALIZER,
            useFactory: loadConfig,
            deps: [OidcConfigService],
            multi: true
        },
        Configuration
    ],
    bootstrap: [AppComponent],
})

export class AppModule {

    constructor(
        private oidcSecurityService: OidcSecurityService,
        private oidcConfigService: OidcConfigService,
        configuration: Configuration
    ) {
        this.oidcConfigService.onConfigurationLoaded.subscribe(() => {

            const openIDImplicitFlowConfiguration = new OpenIDImplicitFlowConfiguration();
            openIDImplicitFlowConfiguration.stsServer = this.oidcConfigService.clientConfiguration.stsServer;
            openIDImplicitFlowConfiguration.redirect_url = this.oidcConfigService.clientConfiguration.redirect_url;
            // The Client MUST validate that the aud (audience) Claim contains its client_id value registered at the Issuer
            // identified by the iss (issuer) Claim as an audience.
            // The ID Token MUST be rejected if the ID Token does not list the Client as a valid audience,
            // or if it contains additional audiences not trusted by the Client.
            openIDImplicitFlowConfiguration.client_id = this.oidcConfigService.clientConfiguration.client_id;
            openIDImplicitFlowConfiguration.response_type = this.oidcConfigService.clientConfiguration.response_type;
            openIDImplicitFlowConfiguration.scope = this.oidcConfigService.clientConfiguration.scope;
            openIDImplicitFlowConfiguration.post_logout_redirect_uri = this.oidcConfigService.clientConfiguration.post_logout_redirect_uri;
            openIDImplicitFlowConfiguration.start_checksession = this.oidcConfigService.clientConfiguration.start_checksession;

            openIDImplicitFlowConfiguration.silent_renew = this.oidcConfigService.clientConfiguration.silent_renew;
            openIDImplicitFlowConfiguration.silent_renew_url = this.oidcConfigService.clientConfiguration.redirect_url + '/silent-renew.html';

            openIDImplicitFlowConfiguration.post_login_route = this.oidcConfigService.clientConfiguration.startup_route;
            // HTTP 403
            openIDImplicitFlowConfiguration.forbidden_route = this.oidcConfigService.clientConfiguration.forbidden_route;
            // HTTP 401
            openIDImplicitFlowConfiguration.unauthorized_route = this.oidcConfigService.clientConfiguration.unauthorized_route;
            openIDImplicitFlowConfiguration.log_console_warning_active = this.oidcConfigService.clientConfiguration.log_console_warning_active;
            openIDImplicitFlowConfiguration.log_console_debug_active = this.oidcConfigService.clientConfiguration.log_console_debug_active;
            // id_token C8: The iat Claim can be used to reject tokens that were issued too far away from the current time,
            // limiting the amount of time that nonces need to be stored to prevent attacks.The acceptable range is Client specific.
            openIDImplicitFlowConfiguration.max_id_token_iat_offset_allowed_in_seconds =
                this.oidcConfigService.clientConfiguration.max_id_token_iat_offset_allowed_in_seconds;

            // openIDImplicitFlowConfiguration.iss_validation_off = false;
            configuration.FileServer = this.oidcConfigService.clientConfiguration.apiFileServer;
            configuration.Server = this.oidcConfigService.clientConfiguration.apiServer;

            const authWellKnownEndpoints = new AuthWellKnownEndpoints();
            authWellKnownEndpoints.setWellKnownEndpoints(this.oidcConfigService.wellKnownEndpoints);

            this.oidcSecurityService.setupModule(openIDImplicitFlowConfiguration, authWellKnownEndpoints);

        });

        console.log('APP STARTING');
    }
}

The redirect request with the code from the secure token server (STS) needs to be handled inside the Angular application. This is done in the app.component.

If the redirected URL from the server has the code and state parameters, and the state is valid, the tokens are requested from the STS server. The tokens in the response are validated as defined in the OIDC specification.

private doCallbackLogicIfRequired() {
   console.log(window.location);
   // Will do a callback, if the url has a code and state parameter.
   this.oidcSecurityService.authorizedCallbackWithCode(window.location.toString());
}

or if you want more control, or have specific logic:

private doCallbackLogicIfRequired() {
        console.log(window.location);
   
        const urlParts = window.location.toString().split('?');
        const params = new HttpParams({
            fromString: urlParts[1]
        });
        const code = params.get('code');
        const state = params.get('state');
        const session_state = params.get('session_state');

        if (code && state && session_state) {
            this.oidcSecurityService.requestTokensWithCode(code, state, session_state);
        }
}
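The same parsing can also be sketched without Angular's HttpParams, using the standard URL API (the function name here is illustrative, not part of the lib):

```typescript
// Extracts the OIDC code-flow callback parameters from a redirect URL.
// Returns null when the URL is not a code-flow callback.
export function parseCallbackUrl(
    url: string
): { code: string; state: string; sessionState: string } | null {
    const params = new URL(url).searchParams;
    const code = params.get('code');
    const state = params.get('state');
    const sessionState = params.get('session_state');

    if (!code || !state || !sessionState) {
        return null;
    }
    return { code, state, sessionState };
}
```

The extracted values would then be passed to requestTokensWithCode as above.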

IdentityServer4 is used to configure and implement the secure token server. The client is configured to use PKCE and no secret. The client ID must match the Angular application configuration.

new Client
{
	ClientName = "angular_code_client",
	ClientId = "angular_code_client",
	AccessTokenType = AccessTokenType.Reference,
	// RequireConsent = false,
	AccessTokenLifetime = 330,// 330 seconds, default 60 minutes
	IdentityTokenLifetime = 30,

	RequireClientSecret = false,
	AllowedGrantTypes = GrantTypes.Code,
	RequirePkce = true,

	AllowAccessTokensViaBrowser = true,
	RedirectUris = new List<string>
	{
		"https://localhost:44352",
		"https://localhost:44352/silent-renew.html"

	},
	PostLogoutRedirectUris = new List<string>
	{
		"https://localhost:44352/unauthorized",
		"https://localhost:44352"
	},
	AllowedCorsOrigins = new List<string>
	{
		"https://localhost:44352"
	},
	AllowedScopes = new List<string>
	{
		"openid",
		"dataEventRecords",
		"dataeventrecordsscope",
		"securedFiles",
		"securedfilesscope",
		"role",
		"profile",
		"email"
	}
},

Silent Renew

The tokens are refreshed using an iframe, as with the OpenID Connect Implicit Flow. The HTML has one small difference: the detail value of the event returned to the Angular application contains the full URL and not just the hash.

<!doctype html>
<html>
<head>
    <base href="./">
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>silent-renew</title>
    <meta http-equiv="content-type" content="text/html; charset=utf-8" />
</head>
<body>

    <script>
        window.onload = function () {
            /* The parent window hosts the Angular application */
            var parent = window.parent;
            /* Send the redirect URL information to the oidc message handler */
            var event = new CustomEvent('oidc-silent-renew-message', { detail: window.location });
            parent.dispatchEvent(event);
        };
    </script>
</body>
</html>

Now the OIDC Flow can be used in the Angular client application. When the application is started, the configurations are loaded.

The authorize request is sent to the STS with the code_challenge and the code_challenge_method.

https://localhost:44318/connect/authorize?
client_id=angular_code_client
&redirect_uri=https%3A%2F%2Flocalhost%3A44352
&response_type=code
&scope=dataEventRecords%20securedFiles%20openid%20profile
&nonce=N0.55639781033142241546880026878
&state=15468798618260.8857500931703779
&code_challenge=vBcZBGqBEcQAA3HYf_nSWy6jViRjtGQyiqrrZYUdHHU
&code_challenge_method=S256
&ui_locales=de-CH

The STS redirects back to the Angular application with the code and state.

https://localhost:44352/?
code=2ee056b556db7dcd5c936686c4b30056e7efd78046eb4e8d4f57c3f6cc638449
&scope=openid%20profile%20dataEventRecords%20securedFiles
&state=15468798618260.8857500931703779
&session_state=xQzQduNOGHP7Qh8l5Pjs02piChWuSawPBpDhb2vCmqo.8cccc6873860ec345ea65ead4233c4ee

The client application then requests the tokens using the code:

HTTP POST
https://localhost:44318/connect/token

BODY
grant_type=authorization_code
&client_id=angular_code_client
&code_verifier=C0.8490756539574429154688002688015468800268800.23727863075955402
&code=2ee056b556db7dcd5c936686c4b30056e7efd78046eb4e8d4f57c3f6cc638449
&redirect_uri=https://localhost:44352

The tokens are then returned and validated. The silent renew works in the same way.

Notes

The Angular application now uses the OIDC Code Flow with PKCE to authenticate and authorize, but still requires other security protections such as CSP, HSTS, XSS protection, and so on. This is a good solution for Angular applications which use APIs from any domain.

Links

https://tools.ietf.org/html/rfc7636

https://openid.net/specs/openid-connect-core-1_0.html#AuthRequest

https://www.npmjs.com/package/angular-auth-oidc-client

Securing a Vue.js app using OpenID Connect Code Flow with PKCE and IdentityServer4

This article shows how to set up a Vue.js SPA application to authenticate and authorize using the OpenID Connect Code flow with PKCE. This is a good solution when implementing SPA apps which request data from APIs on separate domains. The oidc-client-js npm package is used to implement the client side authentication and validation logic. IdentityServer4 and ASP.NET Core Identity are used to implement the secure token server.

Code https://github.com/damienbod/IdentityServer4VueJs

IdentityServer4 Client configuration

The secure token server was implemented using IdentityServer4 with ASP.NET Core Identity and an Entity Framework Core database. A client configuration was added for the Vue.js application. This configures the code flow with PKCE and supports the callback and the silent-renew redirects.

new Client
{
	ClientName = "vuejs_code_client",
	ClientId = "vuejs_code_client",
	AccessTokenType = AccessTokenType.Reference,
	// RequireConsent = false,
	AccessTokenLifetime = 330,// 330 seconds, default 60 minutes
	IdentityTokenLifetime = 300,

	RequireClientSecret = false,
	AllowedGrantTypes = GrantTypes.Code,
	RequirePkce = true,

	AllowAccessTokensViaBrowser = true,
	RedirectUris = new List<string>
	{
		"https://localhost:44357",
		"https://localhost:44357/callback.html",
		"https://localhost:44357/silent-renew.html"
	},
	PostLogoutRedirectUris = new List<string>
	{
		"https://localhost:44357/",
		"https://localhost:44357"
	},
	AllowedCorsOrigins = new List<string>
	{
		"https://localhost:44357"
	},
	AllowedScopes = new List<string>
	{
		"openid",
		"dataEventRecords",
		"dataeventrecordsscope",
		"role",
		"profile",
		"email"
	}
},

Vue.js Client setup

The Vue.js client is implemented using the Vue CLI to create a new TypeScript application. The STS is configured to run with HTTPS. This needs to be configured in the Vue.js app.

This can be done in the package.json file. The serve script was changed to support HTTPS. The oidc-client and the axios npm packages were added to the solution as well as the standard Vue.js npm packages.

axios is used for the API requests, which use the access token after a successful login.

{
  "name": "vue-js-oidc-client",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "serve": "vue-cli-service serve --https --port 44357",
    "build": "vue-cli-service build",
    "lint": "vue-cli-service lint"
  },
  "dependencies": {
    "axios": "^0.18.0",
    "oidc-client": "^1.6.1",
    "vue": "^2.5.21",
    "vue-class-component": "^6.0.0",
    "vue-property-decorator": "^7.0.0",
    "vue-router": "^3.0.1"
  },
  "devDependencies": {
    "@vue/cli-plugin-babel": "^3.3.0",
    "@vue/cli-plugin-typescript": "^3.3.0",
    "@vue/cli-service": "^3.3.0",
    "typescript": "^3.0.0",
    "vue-template-compiler": "^2.5.21"
  }
}

The authentication was implemented following the blog from Jerrie Pelser and the samples from the oidc-client GitHub repo.

An AuthService typescript class was implemented with the oidc-client settings. The code flow is configured here, as well as the silent renew. Then some functions were implemented for login, logout and getting an access token.

import { UserManager, WebStorageStateStore, User } from 'oidc-client';

export default class AuthService {
    private userManager: UserManager;

    constructor() {
        const STS_DOMAIN: string = 'https://localhost:44356';

        const settings: any = {
            userStore: new WebStorageStateStore({ store: window.localStorage }),
            authority: STS_DOMAIN,
            client_id: 'vuejs_code_client',
            redirect_uri: 'https://localhost:44357/callback.html',
            automaticSilentRenew: true,
            silent_redirect_uri: 'https://localhost:44357/silent-renew.html',
            response_type: 'code',
            scope: 'openid profile dataEventRecords',
            post_logout_redirect_uri: 'https://localhost:44357/',
            filterProtocolClaims: true,
        };

        this.userManager = new UserManager(settings);
    }

    public getUser(): Promise<User> {
        return this.userManager.getUser();
    }

    public login(): Promise<void> {
        return this.userManager.signinRedirect();
    }

    public logout(): Promise<void> {
        return this.userManager.signoutRedirect();
    }

    public getAccessToken(): Promise<string> {
        return this.userManager.getUser().then((data: any) => {
            return data.access_token;
        });
    }
}

The AuthService was then used in the Home.vue file, just as in Jerrie Pelser's example. Thanks for this blog. An API call was added to the component, which gets the access token from the oidc lib and requests data from a secure API. The data is then displayed in the UI.

<template>
    <div class="home">
        <img alt="Vue logo" src="../assets/logo.png">
        <div class="home">
            <p v-if="isLoggedIn">User: {{ username }}</p>
            <button class="btn" @click="login" v-if="!isLoggedIn">Login</button>
            <button class="btn" @click="logout" v-if="isLoggedIn">Logout</button>
            <button class="btn" @click="getProtectedApiData" v-if="isLoggedIn">Get API data</button>
        </div>

        <div v-if="dataEventRecordsItems && dataEventRecordsItems.length">
            <div v-for="dataEventRecordsItem of dataEventRecordsItems" :key="dataEventRecordsItem.Id">
                <p><em>Id:</em> {{dataEventRecordsItem.Id}} <em>Details:</em> {{dataEventRecordsItem.Name}}  - {{dataEventRecordsItem.Description}} - {{dataEventRecordsItem.Timestamp}}</p>
            </div>
            <br />
        </div>

    </div>
</template>
<script lang="ts">
    import { Component, Vue } from 'vue-property-decorator';
    import AuthService from '@/services/auth.service';

    import axios from 'axios';

    const auth = new AuthService();

    @Component({
        components: {
        },
    })

    export default class Home extends Vue {
        public currentUser: string = '';
        public accessTokenExpired: boolean | undefined = false;
        public isLoggedIn: boolean = false;

        public dataEventRecordsItems: [] = [];

        get username(): string {
            return this.currentUser;
        }

        public login() {
            auth.login();
        }

        public logout() {
            auth.logout();
        }

        public mounted() {
            auth.getUser().then((user) => {
                // user is null when nobody is logged in; guard before reading profile
                if (user) {
                    this.currentUser = user.profile.name;
                    this.accessTokenExpired = user.expired;
                }
                this.isLoggedIn = (user !== null && !user.expired);
            });
        }

        public getProtectedApiData() {

            auth.getAccessToken().then((userToken: string) => {
                axios.defaults.headers.common['Authorization'] = `Bearer ${userToken}`;

                axios.get('https://localhost:44355/api/DataEventRecords/')
                    .then((response: any) => {
                        this.dataEventRecordsItems = response.data;
                    })
                    .catch((error: any) => {
                        alert(error);
                    });
            });
        }
    }
</script>
<style>

    .btn {
        color: #42b983;
        font-weight: 400;
        display: inline-block;
        text-align: center;
        vertical-align: middle;
        -webkit-user-select: none;
        -moz-user-select: none;
        -ms-user-select: none;
        user-select: none;
        background-color: transparent;
        border: 1px solid #42b983;
        padding: .375rem .75rem;
        margin: 10px;
        font-size: 1rem;
        line-height: 1.5;
        border-radius: .25rem;
        transition: color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out;
    }

</style>
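Instead of setting axios.defaults.headers before each request as in getProtectedApiData above, a request interceptor could attach the token automatically. The sketch below is illustrative, not the article's code: RequestConfig is a simplified stand-in for axios's AxiosRequestConfig, and the access-token callback stands in for the AuthService.getAccessToken shown earlier.

```typescript
// Simplified stand-in for axios's AxiosRequestConfig.
type RequestConfig = { headers: Record<string, string> };

// Builds an interceptor that fetches the current access token and adds
// it as a bearer token to every outgoing request.
function makeAuthInterceptor(getAccessToken: () => Promise<string>) {
    return async (config: RequestConfig): Promise<RequestConfig> => {
        const token = await getAccessToken();
        config.headers['Authorization'] = `Bearer ${token}`;
        return config;
    };
}

// With the real library this would be registered once, e.g.:
// axios.interceptors.request.use(makeAuthInterceptor(() => auth.getAccessToken()));
```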

Two html files are required in the root web folder. In Vue.js, this is the public folder. A callback.html file is required to handle the code redirect.

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>Waiting...</title>
</head>
<body>
    <script src="js/oidc-client.min.js"></script>
    <script>

        var mgr = new Oidc.UserManager({ response_mode: 'query', userStore: new Oidc.WebStorageStateStore() });
        mgr.signinRedirectCallback().then(function (user) {
            console.log("signin response success", user);
            window.location.href = '../';
        }).catch(function (err) {
            console.log(err);
        });

    </script>
</body>
</html>

The silent-renew.html is used for the silent renew in the iframe.

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>Waiting...</title>
</head>
<body>
    <script src="js/oidc-client.min.js"></script>
    <script>

        var mgr = new Oidc.UserManager().signinSilentCallback();
    </script>
</body>
</html>

Both of these html files require a reference to the oidc-client.min.js which is copied to the js folder.

When the application is run, the user can log in and get the secure data. Start the two .NET Core applications from Visual Studio, and start the Vue.js application from the command line using ‘npm run serve’.

When the application starts login:

Give your consent:

request the API data:

And view the data:

Silent renew will also work using the iframe which has to be allowed on the server. This can be viewed in the F12 network tab in Chrome.

It is pretty easy to setup a secure Vue.js application using OIDC code Flow with PKCE.
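The PKCE part is handled internally by oidc-client, but the mechanics defined in RFC 7636 are simple: the client creates a random code_verifier, sends its S256 hash (the code_challenge) with the authorize request, and later proves possession by sending the plain verifier with the token request. A sketch of that derivation using Node's crypto module (illustrative only):

```typescript
import * as crypto from 'crypto';

// base64url encoding as required by RFC 7636: '+' -> '-', '/' -> '_', no padding.
function base64UrlEncode(buffer: Buffer): string {
    return buffer.toString('base64')
        .replace(/\+/g, '-')
        .replace(/\//g, '_')
        .replace(/=+$/, '');
}

// 32 random bytes give a 43-character verifier, within the 43..128 range
// the spec allows. The challenge is the base64url-encoded SHA-256 hash.
const codeVerifier = base64UrlEncode(crypto.randomBytes(32));
const codeChallenge = base64UrlEncode(
    crypto.createHash('sha256').update(codeVerifier).digest());
```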

Links

https://cli.vuejs.org

https://www.jerriepelser.com/blog/using-auth0-with-vue-oidc-client-js/

https://github.com/joaojosefilho/vuejsOidcClient

https://www.scottbrady91.com/Angular/Migrating-oidc-client-js-to-use-the-OpenID-Connect-Authorization-Code-Flow-and-PKCE

https://github.com/IdentityModel/oidc-client-js/

https://tools.ietf.org/html/rfc7636

https://openid.net/specs/openid-connect-core-1_0.html#AuthRequest


Using Azure Key Vault from a non-Azure App


In this article, I show how Azure Key Vault can be used with a non-Azure application. An example of this is a console application used for data migrations, or data seeding during release pipelines. This app could then read the secret connection strings from the Key Vault, and then run the app logic as required.

Code: https://github.com/damienbod/AspNetCoreBackChannelLogout

Posts in this series:

Create a Key Vault

You can create an Azure Key Vault by following the Microsoft documentation here:

https://docs.microsoft.com/en-us/azure/key-vault/key-vault-get-started

Or using the Azure UI, you can create a Key Vault by clicking the “+ Create a Resource” blade and typing Key Vault in the search text input.

Fill out the inputs as required.

Now the Key Vault should be ready.

Create an Azure AD Application

To connect from a non-Azure application, an Azure AD Application Registration needs to be added. Click Azure Active Directory, and then in the new blade, App registrations (Preview). This will probably be renamed soon. Click New registration.

In the new blade, Register a new application. Give it a name and save.

Wait a bit, and the AAD Application registration will be created. Save the Application (client) ID somewhere as this is required in the code.

Now a secret for the AAD Application registration needs to be created. Click the Certificates & secrets button, and then New client secret.

Configure the secret, give it a description and define how long it should remain active.

Save the secret somewhere, as this is required in the code, to access the Key Vault.

Configure the Azure Key Vault to allow the Azure AD Application

In the Azure Key Vault, the AAD Application registration needs to be given access rights.
Open the Key Vault, and click the Access policies. Then click the Add new button.

Select the AAD Application registration principal which was created before. You can find this by entering the name. In this example, it was called standalone. Then give it the required permissions and save.

Remember to click save again after returning to the Key Vault blade.

Create your Standalone Application and use the Azure Key Vault

Now the application, which can be run anywhere and use the Key Vault secrets, can be configured and created. In this example, a console application is created, which uses the Microsoft.AspNetCore.App and Microsoft.Extensions.Configuration.AzureKeyVault NuGet packages.

You could also create a web application, or any other type of application. A console application could be used, for example, to do migrations or data seeding in a build pipeline.

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.2</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.App" Version="2.2.1" />
    <PackageReference Include="Microsoft.Extensions.Configuration.AzureKeyVault" Version="2.2.0" />
  </ItemGroup>

  <ItemGroup>
    <None Update="appsettings.json">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </None>
  </ItemGroup>

</Project>

Now configure the application to use the Key Vault. This is done using the AddAzureKeyVault extension method, with the 3 parameters using the data from above: the DNS name of the Key Vault, the AAD Application Registration Application ID, and the secret.

var dnsNameKeyVault = _config["DNSNameKeyVault"];

if (!string.IsNullOrWhiteSpace(dnsNameKeyVault))
{
  configBuilder.AddAzureKeyVault($"{dnsNameKeyVault}",
   _config["AADAppRegistrationAppId"], 
   _config["AADAppRegistrationAppSecret"]);

  _config = configBuilder.Build();
}
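This works because ASP.NET Core configuration providers are layered: a provider added later overrides earlier ones for the same key, so the Key Vault values (added after the JSON files) win over appsettings.json. A minimal sketch of that lookup order (illustrative only, not the framework code):

```typescript
// Each layer is one configuration provider's key/value pairs, in the
// order the providers were added. Later layers take precedence, which
// is why the Key Vault provider overrides appsettings.json.
function lookup(key: string, layers: Array<Record<string, string>>): string | undefined {
    // walk from the last-added layer back to the first
    for (let i = layers.length - 1; i >= 0; i--) {
        if (key in layers[i]) { return layers[i][key]; }
    }
    return undefined;
}
```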

The program reads the configuration from the app settings, adds the services, and displays the Key Vault secret.

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using System;
using System.IO;
using System.Reflection;

namespace ConsoleStandaloneUsingAzureSecrets
{
    class Program
    {
        private static IConfigurationRoot _config;
        private static IServiceProvider _services;

        static void Main(string[] args)
        {
            Console.WriteLine("Console APP using Azure Key Vault");

            GetConfigurationsForEnvironment();

            SetupServices();

            // read config value
            var someSecret = _config["SomeSecret"];

            Console.WriteLine($"Read from key vault: {someSecret}");
            Console.ReadLine();
        }

        private static void SetupServices()
        {
            var serviceCollection = new ServiceCollection();

            // Do migration, seeding logic or whatever

            _services = serviceCollection.BuildServiceProvider();
        }

        private static void GetConfigurationsForEnvironment()
        {
            var environmentName = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
            var location = Assembly.GetEntryAssembly().Location;
            var directory = Path.GetDirectoryName(location);

            Console.WriteLine($"{directory}{Path.DirectorySeparatorChar}appsettings.json");
            Console.WriteLine($"{environmentName}");

            var configBuilder = new ConfigurationBuilder()
           .AddJsonFile($"{directory}{Path.DirectorySeparatorChar}appsettings.json", false, true)
           .AddJsonFile($"{directory}{Path.DirectorySeparatorChar}appsettings.{environmentName}.json", true, true)
           .AddEnvironmentVariables();
            _config = configBuilder.Build();

            var dnsNameKeyVault = _config["DNSNameKeyVault"];

            if (!string.IsNullOrWhiteSpace(dnsNameKeyVault))
            {
                configBuilder.AddAzureKeyVault($"{dnsNameKeyVault}",
                        _config["AADAppRegistrationAppId"], 
                        _config["AADAppRegistrationAppSecret"]);

                _config = configBuilder.Build();
            }
        }
    }
}

The appsettings.json file contains the values used above. If this were a real application, you should not save the secret to the app settings. These values could be left empty, and set during a deployment, for example using Azure DevOps. Or in a web application, you could use user secrets.

The values here are no longer valid. If you want to run the code locally, these need to be set to correct values.

{
  "DNSNameKeyVault": "https://standalone-kv.vault.azure.net/",
  "AADAppRegistrationAppId": "7faea48d-141e-41f9-9d9e-4ec9fd93ead0",
  "AADAppRegistrationAppSecret": "OWs6u2hY{F$UjXB5j7l&&DeNk9+$at{y/!pg!1Xh8MB@L",

  "SomeSecret": "DEV_VALUE"
}

You must also configure a secret in the Key Vault, which will be read by the standalone application.
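One detail to watch when naming that secret: Key Vault secret names allow only alphanumerics and '-', not the ':' used by ASP.NET Core configuration sections. The configuration provider's default secret manager maps '--' in a secret name to ':' in the configuration key. The helper below only illustrates that mapping; its name is not part of the library.

```typescript
// A secret stored as 'AuthConfigurations--StsServer' in Key Vault is
// surfaced to the app as the configuration key 'AuthConfigurations:StsServer'.
function secretNameToConfigKey(secretName: string): string {
    return secretName.split('--').join(':');
}
```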

Running the Application

Using the Key vault values:

When the application is started, with correct Key Vault and AAD application registration values, the configuration is read from the Key Vault.

If the DNSNameKeyVault property is not set, the development settings in the appsettings.json are used.

Links

https://social.technet.microsoft.com/wiki/contents/articles/51871.net-core-2-managing-secrets-in-web-apps.aspx#AzureKeyVault_Secrets

https://docs.microsoft.com/en-us/azure/key-vault/key-vault-developers-guide

https://jeremylindsayni.wordpress.com/2018/03/15/using-the-azure-key-vault-to-keep-secrets-out-of-your-web-apps-source-code/

https://stackoverflow.com/questions/40025598/azure-key-vault-access-denied

https://cmatskas.com/securing-asp-net-core-application-settings-using-azure-key-vault/

https://github.com/jayendranarumugam/DemoSecrets/tree/master/DemoSecrets

https://docs.microsoft.com/en-us/cli/azure/install-azure-cli-windows?view=azure-cli-latest


ASP.NET Core OAuth Device Flow Client with IdentityServer4


This article shows how to implement the OAuth 2.0 Device Flow for Browserless and Input Constrained Devices in an ASP.NET Core application. The tokens are then saved to a cookie for later usage. IdentityServer4 is used to implement the secure token server.

Code: https://github.com/damienbod/AspNetCoreHybridFlowWithApi

History

2019-02-24 Updated packages, API calls

Note: The code in this blog was built using the example from leastprivilege’s github repo AspNetCoreSecuritySamples. This was then adapted for an ASP.NET Core Razor Pages application.

Creating the Client Login

The ASP.NET Core application is set up to log in using the OAuth Device flow. When the user clicks login, four things happen: the device code and user code are requested from the server, the device code is saved to an ASP.NET Core session, the login page starts to poll the STS for a successful login, and the QR code is displayed so that the user can log in with a mobile device, or just enter the login URL directly.

The Login OnGetAsync method resets the user session, and signs out if already signed in. Cookie authentication is used to save the session once logged in. The device flow is started by calling the BeginLogin method. When the method completes, the session data is set, and the page view is returned.

public async Task OnGetAsync()
{
	HttpContext.Session.SetString("DeviceCode", string.Empty);

	await HttpContext.SignOutAsync(CookieAuthenticationDefaults.AuthenticationScheme);

	var deviceAuthorizationResponse = await _deviceFlowService.BeginLogin();
	AuthenticatorUri = deviceAuthorizationResponse.VerificationUri;
	UserCode = deviceAuthorizationResponse.UserCode;

	if (string.IsNullOrEmpty(HttpContext.Session.GetString("DeviceCode")))
	{
		HttpContext.Session.SetString("DeviceCode", deviceAuthorizationResponse.DeviceCode);
		HttpContext.Session.SetInt32("Interval", deviceAuthorizationResponse.Interval);
	}
}

The BeginLogin method sends a code request using the RequestDeviceAuthorizationAsync method from the IdentityModel NuGet package. The required scopes are added to the request, and the ClientId is set to match the server configuration for this client.

internal async Task<DeviceAuthorizationResponse> BeginLogin()
{
	var client = _clientFactory.CreateClient();

	var disco = await HttpClientDiscoveryExtensions.GetDiscoveryDocumentAsync(client, _authConfigurations.Value.StsServer);

	if (disco.IsError)
	{
		throw new ApplicationException($"Status code: {disco.IsError}, Error: {disco.Error}");
	}

	var deviceAuthorizationRequest = new DeviceAuthorizationRequest
	{
		Address = disco.DeviceAuthorizationEndpoint,
		ClientId = "deviceFlowWebClient"
	};
	deviceAuthorizationRequest.Scope = "email profile openid";
	var response = await client.RequestDeviceAuthorizationAsync(deviceAuthorizationRequest);

	if (response.IsError)
	{
		throw new Exception(response.Error);
	}

	return response;
}

The ASP.NET Core session and the cookie authentication are set up in the Startup class. The session is added using the AddSession extension method, and then enabled using UseSession in the Configure method.

Cookie authentication is added to save the logged-in user. The UseAuthentication method is added to the Configure method. The IHttpContextAccessor is added to the IoC so that we can show the user name in the Razor Page views.

using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Authentication.OpenIdConnect;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using System;

namespace DeviceFlowWeb
{
    public class Startup
    {
        private string stsServer = "";
        public IConfiguration Configuration { get; }
        public Startup(IConfiguration configuration)
        {
            Configuration = configuration;
        }

        // This method gets called by the runtime. Use this method to add services to the container.
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddScoped<DeviceFlowService>();
            services.AddHttpClient();
            services.Configure<AuthConfigurations>(Configuration.GetSection("AuthConfigurations"));

            services.AddDistributedMemoryCache();

            services.AddSession(options =>
            {
                // Set a short timeout for easy testing.
                options.IdleTimeout = TimeSpan.FromSeconds(60);
                options.Cookie.HttpOnly = true;
            });

            services.Configure<CookiePolicyOptions>(options =>
            {
                // This lambda determines whether user consent for non-essential cookies is needed for a given request.
                options.CheckConsentNeeded = context => true;
                options.MinimumSameSitePolicy = SameSiteMode.None;
            });

            var authConfigurations = Configuration.GetSection("AuthConfigurations");
            stsServer = authConfigurations["StsServer"];

            services.AddAuthentication(options =>
            {
                options.DefaultScheme = CookieAuthenticationDefaults.AuthenticationScheme;
            })
            .AddCookie();

            services.AddAuthorization();
            services.AddSingleton<IHttpContextAccessor, HttpContextAccessor>();

            services.AddMvc(options =>
            {
                options.Filters.Add(new MissingSecurityHeaders());
            }).SetCompatibilityVersion(CompatibilityVersion.Version_2_2);
        }

        public void Configure(IApplicationBuilder app, IHostingEnvironment env)
        {
            ...

            app.UseAuthentication();

            app.UseSession();

            app.UseMvc();
        }
    }
}

The Login Razor Page implements an OnPost method, which polls the server for a successful login. This is called using JavaScript as soon as the page opens. The results from the OnGet are displayed in this view. The login link is displayed as a QR code so that a mobile device can scan it and log in. The user code is also displayed, which needs to be entered when logging in. The button to get the tokens is not required; it is just displayed for the demo.

@page
@model DeviceFlowWeb.Pages.LoginModel
@{
    ViewData["Title"] = "Login";
    Layout = "~/Pages/Shared/_Layout.cshtml";
}


Login: <p>@Model.AuthenticatorUri</p>

<br />

User Code: <p>@Model.UserCode</p>
<br />
<br />

<div id="qrCode"></div>
<div id="qrCodeData" data-url="@Html.Raw(Model.AuthenticatorUri)"></div>

<br />
<br />

<form data-ajax="true"  method="post" data-ajax-method="POST">
    <button class="btn btn-secondary" name="begin_token_check" id="begin_token_check" type="submit">Get tokens</button>
</form>

@section scripts {
<script src="~/js/qrcode.min.js"></script>
<script type="text/javascript">
        new QRCode(document.getElementById("qrCode"),
            {
                text: "@Html.Raw(Model.AuthenticatorUri)",
                width: 150,
                height: 150
            });

    $(document).ready(() => {
        document.getElementById('begin_token_check').click();
    });

</script>
}

The OnPostAsync method uses the RequestTokenAsync method from the service to get the tokens. If a valid device code exists, this polls the server and tries to get the tokens. If the user has logged in, the tokens will be returned. This code could be optimized to remove the thread sleep calls and use a background service.

internal async Task<TokenResponse> RequestTokenAsync(string deviceCode, int interval)
{
	var client = _clientFactory.CreateClient();

	var disco = await HttpClientDiscoveryExtensions.GetDiscoveryDocumentAsync(client, _authConfigurations.Value.StsServer);

	if (disco.IsError)
	{
		throw new ApplicationException($"Status code: {disco.IsError}, Error: {disco.Error}");
	}

	while (true)
	{
		if(!string.IsNullOrWhiteSpace(deviceCode))
		{
			var response = await client.RequestDeviceTokenAsync(new DeviceTokenRequest
			{
				Address = disco.TokenEndpoint,
				ClientId = "deviceFlowWebClient",
				DeviceCode = deviceCode
			});

			if (response.IsError)
			{
				if (response.Error == "authorization_pending" || response.Error == "slow_down")
				{
					Console.WriteLine($"{response.Error}...waiting.");
					await Task.Delay(interval * 1000);
				}
				else
				{
					throw new Exception(response.Error);
				}
			}
			else
			{
				return response;
			}
		}
		else
		{
			await Task.Delay(interval * 1000);
		}
	}
}
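The polling behaviour above follows the device flow specification: keep polling on authorization_pending, and back off when the server answers slow_down (RFC 8628 §3.5 says to add 5 seconds to the interval). The same loop can be sketched in isolation; requestToken and sleep below are hypothetical stand-ins for the HTTP call and the delay:

```typescript
type TokenResult = { error?: string; accessToken?: string };

// Polls until the user has logged in. On 'slow_down' the interval is
// increased by 5 seconds; on 'authorization_pending' the client simply
// waits and retries; any other error is fatal.
async function pollForToken(
    requestToken: () => Promise<TokenResult>,
    intervalSeconds: number,
    sleep: (ms: number) => Promise<void>): Promise<TokenResult> {

    let interval = intervalSeconds;
    while (true) {
        const result = await requestToken();
        if (!result.error) {
            return result;
        }
        if (result.error === 'slow_down') {
            interval += 5; // RFC 8628 §3.5
        } else if (result.error !== 'authorization_pending') {
            throw new Error(result.error);
        }
        await sleep(interval * 1000);
    }
}
```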

Adding the token claims to the Cookie

The OnPostAsync method calls the RequestTokenAsync method, using the session data. Once the tokens are returned, the claims are read from the identity token and used to sign the user in; the tokens themselves are stored in the auth cookie. The HttpContext.SignInAsync method is used for this with the claims from the tokens.

public async Task<IActionResult> OnPostAsync()
{
	var deviceCode = HttpContext.Session.GetString("DeviceCode");
	var interval = HttpContext.Session.GetInt32("Interval");

	if(interval.GetValueOrDefault() <= 0)
	{
		interval = 5;
	}

	var tokenresponse = await _deviceFlowService.RequestTokenAsync(deviceCode, interval.Value);

	if (tokenresponse.IsError)
	{
		ModelState.AddModelError(string.Empty, "Invalid login attempt.");
		return Page();
	}

	var claims = GetClaims(tokenresponse.IdentityToken);

	var claimsIdentity = new ClaimsIdentity(
		claims, 
		CookieAuthenticationDefaults.AuthenticationScheme, 
		"name", 
		"user");

	var authProperties = new AuthenticationProperties();

	// save the tokens in the cookie
	authProperties.StoreTokens(new List<AuthenticationToken>
	{
		new AuthenticationToken
		{
			Name = "access_token",
			Value = tokenresponse.AccessToken
		},
		new AuthenticationToken
		{
			Name = "id_token",
			Value = tokenresponse.IdentityToken
		}
	});

	await HttpContext.SignInAsync(
		CookieAuthenticationDefaults.AuthenticationScheme,
		new ClaimsPrincipal(claimsIdentity),
		authProperties);

	return Redirect("/Index");
}

private IEnumerable<Claim> GetClaims(string token)
{
	var validJwt = new JwtSecurityToken(token);
	return validJwt.Claims;
}
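GetClaims above uses JwtSecurityToken to read the claims out of the id_token. Underneath, that is just base64url-decoding the middle segment of the JWT, with no signature validation, which is acceptable here only because the token was received directly from the token endpoint over TLS. A sketch of that decoding (decodeJwtClaims is an illustrative helper, not a library function):

```typescript
// Decodes the payload segment of a JWT. No signature validation is done;
// never use this for tokens received from untrusted sources.
function decodeJwtClaims(token: string): Record<string, unknown> {
    const payload = token.split('.')[1];
    // convert base64url back to standard base64 before decoding
    const base64 = payload.replace(/-/g, '+').replace(/_/g, '/');
    return JSON.parse(Buffer.from(base64, 'base64').toString('utf8'));
}
```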

Logout

Logout is implemented using a Razor Page, and this just cleans up the auth cookies using the HttpContext.SignOutAsync method.

using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;
using System.Threading.Tasks;

namespace DeviceFlowWeb.Pages
{
    public class LogoutModel : PageModel
    {
        public async Task<IActionResult> OnGetAsync()
        {
            await HttpContext.SignOutAsync(CookieAuthenticationDefaults.AuthenticationScheme);

            return Redirect("/SignedOut");
        }
    }
}

IdentityServer4 client configuration

The Device Flow client is configured using the grant type DeviceFlow. The profile claims are added to the id_token, and no secret is required, as the web application client runs on a device in an untrusted zone and cannot be trusted to keep a secret. The ClientId value must match the configuration on the client.

new Client
{
	ClientId = "deviceFlowWebClient",
	ClientName = "Device Flow Client",

	AllowedGrantTypes = GrantTypes.DeviceFlow,
	RequireClientSecret = false,

	AlwaysIncludeUserClaimsInIdToken = true,
	AllowOfflineAccess = true,

	AllowedScopes =
	{
		IdentityServerConstants.StandardScopes.OpenId,
		IdentityServerConstants.StandardScopes.Profile,
		IdentityServerConstants.StandardScopes.Email
	}
}

Running the APP

Open the Device App and click the login:

Scan the QRCode and open in a browser, use the link:

Login with user email, or Microsoft account:

Enter the user code displayed on the Device Login page:

Give your consent:

And the device is now logged in, has received the tokens, and added them to the auth cookie.

You could now use the tokens in the standard way, to call APIs etc.

Links

https://github.com/aspnet/Docs/tree/master/aspnetcore/security/authentication/cookie/samples/2.x/CookieSample

https://docs.microsoft.com/en-us/aspnet/core/security/authentication/cookie?view=aspnetcore-2.2

Try Device Flow with IdentityServer4

https://tools.ietf.org/wg/oauth/draft-ietf-oauth-device-flow/

https://github.com/leastprivilege/AspNetCoreSecuritySamples/tree/aspnetcore21/DeviceFlow

https://www.red-gate.com/simple-talk/dotnet/net-development/using-auth-cookies-in-asp-net-core/

Security Experiments with gRPC and ASP.NET Core 3.0


This article shows how a gRPC service could implement OAuth2 security using IdentityServer4 as the token service.

Code: https://github.com/damienbod/Secure_gRpc

Posts in this series

History

2019-03-08: Removing the IHttpContextAccessor, no longer required
2019-03-07: Updated the auth security to configure this on the route, attributes are not supported in the current preview.

Setup

The application is implemented using 3 applications. A console application is used as the gRPC client. This application requests an access token for the gRPC server using the IdentityServer4 token service. The client application then sends the access token in the header of the HTTP2 request. The gRPC server then validates the token using introspection, and if the token is valid, the data is returned. If the token is not valid, an RPC exception is created and sent back to the client.

At present, as this code is still in preview, securing the API using the Authorization attributes with policies does not seem to work, so as a quick fix, the policy is added to the routing configuration.

The gRPC client and server were setup using the Visual Studio template for gRPC.

gRPC Server

The GreeterService class is the generated class from the Visual Studio template. The security bits were then added to this class. The Authorize attribute is added to the class, which is how the security should work.

using System.Threading.Tasks;
using Greet;
using Grpc.Core;
using Microsoft.AspNetCore.Authorization;

namespace Secure_gRpc
{
    [Authorize(Policy = "protectedScope")]
    public class GreeterService : Greeter.GreeterBase
    {
        public override Task<HelloReply> SayHello(HelloRequest request, ServerCallContext context)
        {
            return Task.FromResult(new HelloReply
            {
                Message = "Hello " + request.Name
            });
        }
    }
}

The startup class configures the gRPC service and the required security to use this service. IdentityServer4.AccessTokenValidation is used to validate the access token using introspection. The gRPC service is added along with the authorization and the authentication.

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using IdentityServer4.AccessTokenValidation;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using System.Security.Claims;

namespace Secure_gRpc
{
    public class Startup
    {
        private string stsServer = "https://localhost:44352";

        // This method gets called by the runtime. Use this method to add services to the container.
        // For more information on how to configure your application, visit https://go.microsoft.com/fwlink/?LinkID=398940
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddHttpContextAccessor();

            services.AddAuthorization(options =>
            {
                options.AddPolicy("protectedScope", policy =>
                {
                    policy.RequireClaim("scope", "grpc_protected_scope");
                });
            });

            services.AddAuthorizationPolicyEvaluator();

            services.AddAuthentication(IdentityServerAuthenticationDefaults.AuthenticationScheme)
                .AddIdentityServerAuthentication(options =>
                {
                    options.Authority = stsServer;
                    options.ApiName = "ProtectedGrpc";
                    options.ApiSecret = "grpc_protected_secret";
                    options.RequireHttpsMetadata = false;
                });

            services.AddGrpc(options =>
            {
                options.EnableDetailedErrors = true;
            });
        }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }

            app.UseRouting(routes =>
            {
                routes.MapGrpcService<GreeterService>().RequireAuthorization("protectedScope");
            });

            app.UseAuthentication();
            app.UseAuthorization();
        }
    }
}

The gRPC service is then set up to run using HTTPS and HTTP2.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Microsoft.Extensions.Hosting;

namespace Secure_gRpc
{
    public class Program
    {
        public static void Main(string[] args)
        {
            CreateHostBuilder(args).Build().Run();
        }

        public static IHostBuilder CreateHostBuilder(string[] args) =>
            Host.CreateDefaultBuilder(args)
                .ConfigureWebHostDefaults(webBuilder =>
                {
                    webBuilder.UseStartup<Startup>()
                    .ConfigureKestrel(options =>
                    {
                        options.Limits.MinRequestBodyDataRate = null;
                        options.ListenLocalhost(50051, listenOptions =>
                        {
                            listenOptions.UseHttps("server.pfx", "1111");
                            listenOptions.Protocols = HttpProtocols.Http2;
                        });
                    });
                });
    }
}

RPC interface definition

The RPC API is defined using proto3 and referenced in both projects. When the applications are built, the C# classes are created.

syntax = "proto3";

package Greet;

// The greeting service definition.
service Greeter {
  // Sends a greeting
  rpc SayHello (HelloRequest) returns (HelloReply) {}
}

// The request message containing the user's name.
message HelloRequest {
  string name = 1;
}

// The response message containing the greetings.
message HelloReply {
  string message = 1;
}
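
When the projects build, the Grpc.Tools package generates a Greeter.GreeterBase class for the server and a Greeter.GreeterClient class for the client. The GreeterService mapped in the routing configuration above is not listed in this post; a minimal sketch of such a service could look like the following (only the class name comes from the routing code, the method body is an assumption):

```csharp
using System.Threading.Tasks;
using Greet;
using Grpc.Core;

namespace Secure_gRpc
{
    // Sketch: implements the service defined in the proto file.
    // The generated base class Greeter.GreeterBase provides the
    // virtual SayHello method which is overridden here.
    public class GreeterService : Greeter.GreeterBase
    {
        public override Task<HelloReply> SayHello(HelloRequest request, ServerCallContext context)
        {
            return Task.FromResult(new HelloReply
            {
                Message = "Hello " + request.Name
            });
        }
    }
}
```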

gRPC client

The client is implemented in a simple console application. The client gets an access token from the IdentityServer4 token service and adds it to the Authorization header as a bearer token. It then uses a certificate to connect over HTTPS. This code will probably change before the release. The API is then called, and the data is returned, or an exception. If you comment in the invalid token, an authorization exception is returned.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
using System.Threading.Tasks;
using Greet;
using Grpc.Core;

namespace Secure_gRpc
{
    public class Program
    {
        static async Task Main(string[] args)
        {
            ///
            /// Token init
            /// 
            HttpClient httpClient = new HttpClient();
            ApiService apiService = new ApiService(httpClient);

            // switch the token here to use an invalid token,
            var token = await apiService.GetAccessTokenAsync();
            //var token = "This is invalid, I hope it fails";

            var tokenValue = "Bearer " + token;
            var metadata = new Metadata
            {
                { "Authorization", tokenValue }
            };

            ///
            /// Call gRPC HTTPS
            ///
            var channelCredentials =  new SslCredentials(
                File.ReadAllText("Certs\\ca.crt"),
                    new KeyCertificatePair(
                        File.ReadAllText("Certs\\client.crt"),
                        File.ReadAllText("Certs\\client.key")
                    )
                );

            CallOptions callOptions = new CallOptions(metadata);
            // Include port of the gRPC server as an application argument
            var port = args.Length > 0 ? args[0] : "50051";
            var channel = new Channel("localhost:" + port, channelCredentials);
            var client = new Greeter.GreeterClient(channel);

            var reply = await client.SayHelloAsync(
                new HelloRequest { Name = "GreeterClient" }, callOptions);

            Console.WriteLine("Greeting: " + reply.Message);

            await channel.ShutdownAsync();

            Console.WriteLine("Press any key to exit...");
            Console.ReadKey();
        }
   }
}

Sending a valid token

Sending an invalid token

This code is still in development, and a lot will change before the first release. The demo shows some of the new gRPC, HTTP2, hosting features which will be released as part of ASP.NET Core 3.0.

Links:

https://github.com/grpc/grpc-dotnet/

https://grpc.io/

https://www.stevejgordon.co.uk/early-look-at-grpc-using-aspnet-core-3

https://www.zoeys.blog/first-impressions-of-grpc-integration-in-asp-net-core-3-preview/

gRPC Bi-directional streaming with Razor Pages and a Hosted Service gRPC client

This article shows how a bi-directional streaming gRPC service can be implemented using a .NET Core Hosted Service as a gRPC client, and a Razor Page to send bi-directional streaming messages to the server's connected clients.

Code: https://github.com/damienbod/Secure_gRpc

Posts in this series

History

2019-03-26 Added Code improvements from feedback

Setting up the Bi-directional streaming gRPC Server

The gRPC client and server code is defined using a proto3 file. This defines a single method, SendData, which takes and returns a MyMessage stream.

syntax = "proto3";

package Duplex;

service Messaging {

  rpc SendData (stream MyMessage) returns (stream MyMessage) {}
}

message MyMessage {
  string name = 1;
  string message = 2;
}

The DuplexService class implements the gRPC service. It overrides the SendData method, which was defined in the proto3 file. The service uses the ServerGrpcSubscribers singleton service, which implements the broadcast. When a gRPC client sends a request, the client is added to the list of subscribers and the message is broadcast to all the other clients.

If a gRPC client closes gracefully, it is removed here as well. The Authorize attribute requires that the client sends a valid bearer token.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Duplex;
using Grpc.Core;
using Microsoft.AspNetCore.Authorization;
using Microsoft.Extensions.Logging;

namespace SecureGrpc.Server
{
    [Authorize(Policy = "protectedScope")]
    public class DuplexService : Messaging.MessagingBase, IDisposable
    {
        private readonly ILogger _logger;
        private readonly ServerGrpcSubscribers _serverGrpcSubscribers;

        public DuplexService(ILoggerFactory loggerFactory, ServerGrpcSubscribers serverGrpcSubscribers)
        {
            _logger = loggerFactory.CreateLogger<DuplexService>();
            _serverGrpcSubscribers = serverGrpcSubscribers;
        }

        public override async Task SendData(IAsyncStreamReader<MyMessage> requestStream, IServerStreamWriter<MyMessage> responseStream, ServerCallContext context)
        {
            var httpContext = context.GetHttpContext();
            _logger.LogInformation($"Connection id: {httpContext.Connection.Id}");

            if (!await requestStream.MoveNext())
            {
                return;
            }

            var user = requestStream.Current.Name;
            _logger.LogInformation($"{user} connected");
            var subscriber = new SubscribersModel
            {
                Subscriber = responseStream,
                Name = user
            };

            _serverGrpcSubscribers.AddSubscriber(subscriber);

            do
            {
                await _serverGrpcSubscribers.BroadcastMessageAsync(requestStream.Current);
            } while (await requestStream.MoveNext());

            _serverGrpcSubscribers.RemoveSubscriber(subscriber);
            _logger.LogInformation($"{user} disconnected");
        }

        public void Dispose()
        {
            _logger.LogInformation("Cleaning up");
        }
    }
}

The ServerGrpcSubscribers class implements the BroadcastMessageAsync method, and the ConcurrentDictionary of clients is managed here. This service can be used to send server messages to the connected clients.

If sending a message to a client fails, for example because the client application was killed, the broadcast catches the exception and removes the subscription.

using Duplex;
using Microsoft.Extensions.Logging;
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

namespace SecureGrpc.Server
{
    public class ServerGrpcSubscribers
    {
        private readonly ILogger _logger;
        private readonly ConcurrentDictionary<string, SubscribersModel> Subscribers = new ConcurrentDictionary<string,SubscribersModel>();
        
        public ServerGrpcSubscribers(ILoggerFactory loggerFactory)
        {
            _logger = loggerFactory.CreateLogger<ServerGrpcSubscribers>();
        }

        public async Task BroadcastMessageAsync(MyMessage message)
        {
            await BroadcastMessages(message);
        }


        public void AddSubscriber(SubscribersModel subscriber)
        {
            bool added = Subscribers.TryAdd(subscriber.Name, subscriber);
            _logger.LogInformation($"New subscriber added: {subscriber.Name}");
            if (!added)
            {
                _logger.LogInformation($"could not add subscriber: {subscriber.Name}");
            }
        }

        public void RemoveSubscriber(SubscribersModel subscriber)
        {
            // TryRemove does not throw for a missing key; check the result
            // instead, so that item is never null when it is logged.
            if (Subscribers.TryRemove(subscriber.Name, out SubscribersModel item))
            {
                _logger.LogInformation($"Removed subscriber: {item.Name}");
            }
            else
            {
                _logger.LogError($"Could not remove {subscriber.Name}");
            }
        }

        private async Task BroadcastMessages(MyMessage message)
        {
            foreach (var subscriber in Subscribers.Values)
            {
                var item = await SendMessageToSubscriber(subscriber, message);
                if (item != null)
                {
                    RemoveSubscriber(item);
                }
            }
        }

        private async Task<SubscribersModel> SendMessageToSubscriber(SubscribersModel subscriber, MyMessage message)
        {
            try
            {
                _logger.LogInformation($"Broadcasting: {message.Name} - {message.Message}");
                await subscriber.Subscriber.WriteAsync(message);
                return null;
            }
            catch(Exception ex)
            {
                _logger.LogError(ex, "Could not send");
                return subscriber;
            }
        }

    }
}

The SubscribersModel class is used for the clients which are connected to the service. In the server startup, the RequireAuthorization method is used to define the authorization in the routing configuration.

public class SubscribersModel
{
	public IServerStreamWriter<MyMessage> Subscriber { get; set; }

	public string Name { get; set; }
}

The server startup configures the gRPC service.

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using IdentityServer4.AccessTokenValidation;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using System.Security.Claims;

namespace SecureGrpc.Server
{
    public class Startup
    {
        private string stsServer = "https://localhost:44352";

        public void ConfigureServices(IServiceCollection services)
        {
            services.AddHttpContextAccessor();

            services.AddSingleton<ServerGrpcSubscribers>();

            services.AddAuthorization(options =>
            {
                options.AddPolicy("protectedScope", policy =>
                {
                    policy.RequireClaim("scope", "grpc_protected_scope");
                });
            });

            services.AddAuthorizationPolicyEvaluator();

            services.AddAuthentication(IdentityServerAuthenticationDefaults.AuthenticationScheme)
                .AddIdentityServerAuthentication(options =>
                {
                    options.Authority = stsServer;
                    options.ApiName = "ProtectedGrpc";
                    options.ApiSecret = "grpc_protected_secret";
                    options.RequireHttpsMetadata = false;
                });

            services.AddGrpc(options =>
            {
                options.EnableDetailedErrors = true;
            });

            services.AddMvc()
               .AddNewtonsoftJson();
        }

        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            ...

            app.UseRouting(routes =>
            {
                routes.MapGrpcService<GreeterService>().RequireAuthorization("protectedScope");
                routes.MapGrpcService<DuplexService>().RequireAuthorization("protectedScope");
                routes.MapRazorPages();
            });

            app.UseAuthentication();
            app.UseAuthorization();
        }
    }
}

The csproj requires the GrpcServices and the proto configuration to create the stubs.

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp3.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <Protobuf Include="..\Protos\*.proto" GrpcServices="Server" />
    <Content Include="@(Protobuf)" LinkBase="" />
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="Grpc.AspNetCore.Server" Version="0.1.19-pre1" />
    <PackageReference Include="Google.Protobuf" Version="3.6.1" />

    <PackageReference Include="Grpc.Tools" Version="1.19.0-pre1" PrivateAssets="All" />

    <PackageReference Include="Microsoft.AspNetCore.Mvc.NewtonsoftJson" Version="3.0.0-preview-19075-0444" />
    <PackageReference Include="IdentityServer4.AccessTokenValidation" Version="2.7.0" />
  </ItemGroup>

  <ItemGroup>
    <None Update="server.pfx">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
  </ItemGroup>

</Project>

Hosted Worker service gRPC client

The gRPC client is implemented in a worker class run as a Hosted Service. The csproj file also requires the gRPC configuration and the proto settings, otherwise the stubs will not be built from the proto file.
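
The worker csproj itself is not listed in this post; a sketch of the required items could look like the following, using GrpcServices="Client" so that only the client stubs are generated. The SDK and package versions are assumptions based on the server project:

```xml
<Project Sdk="Microsoft.NET.Sdk.Worker">

  <PropertyGroup>
    <TargetFramework>netcoreapp3.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <Protobuf Include="..\Protos\*.proto" GrpcServices="Client" />
    <Content Include="@(Protobuf)" LinkBase="" />
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="Google.Protobuf" Version="3.6.1" />
    <PackageReference Include="Grpc.Core" Version="1.19.0" />
    <PackageReference Include="Grpc.Tools" Version="1.19.0-pre1" PrivateAssets="All" />
  </ItemGroup>

</Project>
```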

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

namespace BiDirectionalStreamingWorker
{
    public class Program
    {
        public static void Main(string[] args)
        {
            CreateHostBuilder(args).Build().Run();
        }

        public static IHostBuilder CreateHostBuilder(string[] args) =>
            Host.CreateDefaultBuilder(args)
                .ConfigureServices(services =>
                {
                    services.AddHostedService<Worker>();
                    services.AddSingleton<ApiService>();
                });
    }
}

The worker service implements the gRPC client. This is based on the example from the C# gRPC GitHub repository.

The application gets a bearer token from the secure token service, and uses the Metadata class to add it as a header to the stream.

Data is then sent to and received from the server. If the application is closed properly, it closes its connection. If the application is killed, the gRPC server needs to handle this.

using System;
using System.Threading;
using System.Threading.Tasks;
using Grpc.Core;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using System.Net.Http;
using System.IO;

namespace BiDirectionalStreamingWorker
{
    public class Worker : BackgroundService
    {
        private readonly ILogger<Worker> _logger;

        public Worker(ILogger<Worker> logger)
        {
            _logger = logger;
        }

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            ///
            /// Token init
            /// 
            HttpClient httpClient = new HttpClient();
            ApiService apiService = new ApiService(httpClient);
            var token = await apiService.GetAccessTokenAsync();
            //var token = "This is invalid, I hope it fails";

            var tokenValue = "Bearer " + token;
            var metadata = new Metadata
            {
                { "Authorization", tokenValue }
            };

            ///
            /// Call gRPC HTTPS
            ///
            var channelCredentials = new SslCredentials(
                File.ReadAllText("Certs\\ca.crt"),
                    new KeyCertificatePair(
                        File.ReadAllText("Certs\\client.crt"),
                        File.ReadAllText("Certs\\client.key")
                    )
                );

            var port = "50051";

            var name = "worker_client";
            while (!stoppingToken.IsCancellationRequested)
            {
                _logger.LogInformation($"Worker running at: {DateTime.Now}");

                var channel = new Channel("localhost:" + port, channelCredentials);
                var client = new Duplex.Messaging.MessagingClient(channel);

                using (var sendData = client.SendData(metadata))
                {
                    Console.WriteLine($"Connected as {name}. Send empty message to quit.");

                    var responseTask = Task.Run(async () =>
                    {
                        while (await sendData.ResponseStream.MoveNext(stoppingToken))
                        {
                            Console.WriteLine($"{sendData.ResponseStream.Current.Name}: {sendData.ResponseStream.Current.Message}");
                        }
                    });

                    var line = Console.ReadLine();
                    while (!string.IsNullOrEmpty(line))
                    {
                        await sendData.RequestStream.WriteAsync(new Duplex.MyMessage { Name = name, Message = line });
                        line = Console.ReadLine();
                    }
                    await sendData.RequestStream.CompleteAsync();
                }

                await channel.ShutdownAsync();
                await Task.Delay(1000, stoppingToken);
            }
        }
    }
}

Sending server messages from the Server Razor Pages

On the gRPC server, a Razor Page can be used to send server messages to all the connected clients. For Razor Pages and gRPC to work on the same Kestrel server, both HTTP/1.1 and HTTP/2 need to be allowed.
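
One way to allow both protocols is to set Http1AndHttp2 on the HTTPS endpoint. This is a sketch of how the ConfigureKestrel block from the earlier Program class could be adapted:

```csharp
webBuilder.UseStartup<Startup>()
    .ConfigureKestrel(options =>
    {
        options.ListenLocalhost(50051, listenOptions =>
        {
            listenOptions.UseHttps("server.pfx", "1111");
            // HTTP/1.1 for the Razor Pages, HTTP/2 for gRPC
            listenOptions.Protocols = HttpProtocols.Http1AndHttp2;
        });
    });
```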

A Razor page implements a form to send a broadcast to all the connected gRPC clients, using the ServerGrpcSubscribers defined above.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;

namespace SecureGrpc.Server.Pages
{
    public class IndexModel : PageModel
    {
        private readonly ServerGrpcSubscribers _serverGrpcSubscribers;

        public IndexModel(ServerGrpcSubscribers serverGrpcSubscribers)
        {
            _serverGrpcSubscribers = serverGrpcSubscribers;
        }

        public void OnGet()
        {
        }

        public async Task OnPostAsync(string message)
        {
            await _serverGrpcSubscribers.BroadcastMessageAsync(
              new Duplex.MyMessage { Message = message, Name = "Server" });
        }
    }
}

Running the code

In Visual Studio, build and run all the projects; multiple projects start. The clients get an access token from the secure token service, and then send messages from the console to the gRPC server.

Using a browser at https://localhost:50051, a Razor Page can be opened to send a server message to the connected clients.

If a connected client is killed and a message is sent, the server catches the exception and removes the client without crashing.

When the client connects again, the server can send messages to the same client.

Links:

https://github.com/grpc/grpc-dotnet/

https://grpc.io/

https://www.stevejgordon.co.uk/early-look-at-grpc-using-aspnet-core-3

https://www.zoeys.blog/first-impressions-of-grpc-integration-in-asp-net-core-3-preview/

Using Azure Service Bus Queues with ASP.NET Core Services

This article shows how to implement two ASP.NET Core API applications which communicate with each other using Azure Service Bus. The ASP.NET Core APIs are implemented with Swagger support and use an Azure Service Bus queue to send data from one service to the other ASP.NET Core application.

Code: https://github.com/damienbod/AspNetCoreServiceBus

Setting up the Azure Service Bus Queue

Azure Service Bus is set up as described here:

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-create-namespace-portal

A queue or a topic can be used to implement the messaging; a queue is used as the messaging type in this example. Once a message has been received and completed, it is removed from the queue.
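
As an alternative to the portal, the namespace and the queue could be created with the Azure CLI, roughly as follows (resource group, namespace name and location are placeholders):

```shell
# Create the Service Bus namespace and the queue used in this example
az servicebus namespace create --resource-group myResourceGroup \
  --name myServiceBusNamespace --location westeurope
az servicebus queue create --resource-group myResourceGroup \
  --namespace-name myServiceBusNamespace --name simplequeue
```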

Applications Overview

The applications are implemented as follows:

Implementing a Service Bus Queue

The Microsoft.Azure.ServiceBus NuGet package is used to implement the Azure Service Bus clients. The connection string for the service bus is saved in the user secrets of the projects. To run the example yourself, create your own Azure Service Bus and set the secret for each project. This can easily be done in Visual Studio by right-clicking the project in the solution explorer. When deploying the application, Azure Key Vault could be used to store the secret; this would need to be implemented in the applications.
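
The secret can also be set from the command line in the project directory. A sketch with a placeholder connection string (the key matches the GetConnectionString call used in the code below):

```shell
dotnet user-secrets init
dotnet user-secrets set "ConnectionStrings:ServiceBusConnectionString" \
  "Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>"
```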

The SendMessage method takes a MyPayload type as a parameter and adds it to the message as a JSON payload.

using Microsoft.Azure.ServiceBus;
using Microsoft.Extensions.Configuration;
using Newtonsoft.Json;
using System.Text;
using System.Threading.Tasks;

namespace ServiceBusMessaging
{
    public class ServiceBusSender
    {
        private readonly QueueClient _queueClient;
        private readonly IConfiguration _configuration;
        private const string QUEUE_NAME = "simplequeue";

        public ServiceBusSender(IConfiguration configuration)
        {
            _configuration = configuration;
            _queueClient = new QueueClient(
              _configuration.GetConnectionString("ServiceBusConnectionString"), 
              QUEUE_NAME);
        }
        
        public async Task SendMessage(MyPayload payload)
        {
            string data = JsonConvert.SerializeObject(payload);
            Message message = new Message(Encoding.UTF8.GetBytes(data));

            await _queueClient.SendAsync(message);
        }
    }
}

The ServiceBusSender is registered with the IoC container in the ConfigureServices method of the Startup class. Swagger is also added here.

public void ConfigureServices(IServiceCollection services)
{
	services.AddMvc()
		.AddNewtonsoftJson();

	services.AddScoped<ServiceBusSender>();

	services.AddSwaggerGen(c =>
	{
		c.SwaggerDoc("v1", new OpenApiInfo
		{
			Version = "v1",
			Title = "Payload View API",
		});
	});
}

This service can then be used in the Controller which provides the API.

[HttpPost]
[ProducesResponseType(typeof(Payload), StatusCodes.Status200OK)]
[ProducesResponseType(typeof(Payload), StatusCodes.Status409Conflict)]
public async Task<IActionResult> Create([FromBody][Required] Payload request)
{
	if (data.Any(d => d.Id == request.Id))
	{
		return Conflict($"data with id {request.Id} already exists");
	}

	data.Add(request);

	// Send this to the bus for the other services
	await _serviceBusSender.SendMessage(new MyPayload
	{
		Goals = request.Goals,
		Name = request.Name,
		Delete = false
	});

	return Ok(request);
}

Consuming messaging from the Queue

The ServiceBusConsumer class implements the IServiceBusConsumer interface. This is used to receive the messages from the Azure Service Bus. The connection string for the queue is read using the application's IConfiguration interface. The RegisterOnMessageHandlerAndReceiveMessages method adds the event handler for the messages and uses the ProcessMessagesAsync method to process them. The ProcessMessagesAsync method converts the message to an object and calls the IProcessData interface to complete the processing of the message.

using Microsoft.Azure.ServiceBus;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace ServiceBusMessaging
{
    public interface IServiceBusConsumer
    {
        void RegisterOnMessageHandlerAndReceiveMessages();
        Task CloseQueueAsync();
    }

    public class ServiceBusConsumer : IServiceBusConsumer
    {
        private readonly IProcessData _processData;
        private readonly IConfiguration _configuration;
        private readonly QueueClient _queueClient;
        private const string QUEUE_NAME = "simplequeue";
        private readonly ILogger _logger;

        public ServiceBusConsumer(IProcessData processData, 
            IConfiguration configuration, 
            ILogger<ServiceBusConsumer> logger)
        {
            _processData = processData;
            _configuration = configuration;
            _logger = logger;
            _queueClient = new QueueClient(
              _configuration.GetConnectionString("ServiceBusConnectionString"), QUEUE_NAME);
        }

        public void RegisterOnMessageHandlerAndReceiveMessages()
        {
            var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
            {
                MaxConcurrentCalls = 1,
                AutoComplete = false
            };

            _queueClient.RegisterMessageHandler(ProcessMessagesAsync, messageHandlerOptions);
        }

        private async Task ProcessMessagesAsync(Message message, CancellationToken token)
        {
            var myPayload = JsonConvert.DeserializeObject<MyPayload>(Encoding.UTF8.GetString(message.Body));
            _processData.Process(myPayload);
            await _queueClient.CompleteAsync(message.SystemProperties.LockToken);
        }

        private Task ExceptionReceivedHandler(ExceptionReceivedEventArgs exceptionReceivedEventArgs)
        {
            _logger.LogError(exceptionReceivedEventArgs.Exception, "Message handler encountered an exception");
            var context = exceptionReceivedEventArgs.ExceptionReceivedContext;

            _logger.LogDebug($"- Endpoint: {context.Endpoint}");
            _logger.LogDebug($"- Entity Path: {context.EntityPath}");
            _logger.LogDebug($"- Executing Action: {context.Action}");

            return Task.CompletedTask;
        }

        public async Task CloseQueueAsync()
        {
            await _queueClient.CloseAsync();
        }
    }
}

The Startup class configures the application and adds the support for Azure Service Bus, Swagger and the ASP.NET Core application.

public void ConfigureServices(IServiceCollection services)
{
	services.AddMvc()
		.AddNewtonsoftJson();

	services.AddSingleton<IServiceBusConsumer, ServiceBusConsumer>();
	services.AddTransient<IProcessData, ProcessData>();

	services.AddSwaggerGen(c =>
	{
		c.SwaggerDoc("v1", new OpenApiInfo
		{
			Version = "v1",
			Title = "Payload API",
		});
	});
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{

	app.UseStaticFiles();
	app.UseHttpsRedirection();

	app.UseRouting();

	app.UseAuthorization();
	app.UseCors();

	app.UseEndpoints(endpoints =>
	{
	    endpoints.MapControllers();
	});

	// Enable middleware to serve generated Swagger as a JSON endpoint.
	app.UseSwagger();
	app.UseSwaggerUI(c =>
	{
	    c.SwaggerEndpoint("/swagger/v1/swagger.json", 
	       "Payload Management API V1");
	});

	var bus = app.ApplicationServices.GetService<IServiceBusConsumer>();
	bus.RegisterOnMessageHandlerAndReceiveMessages();
}

The IProcessData interface is added to the shared library. This is used to process the incoming messages in the ServiceBusConsumer service.

public interface IProcessData
{
	void Process(MyPayload myPayload);
}

The ProcessData class implements IProcessData and is added to the application which receives the messages. The hosting application can then do whatever is required with the messages.

using AspNetCoreServiceBusApi2.Model;
using ServiceBusMessaging;

namespace AspNetCoreServiceBusApi2
{
    public class ProcessData : IProcessData
    {
        public void Process(MyPayload myPayload)
        {
            DataServiceSimi.Data.Add(new Payload
            {
                Name = myPayload.Name,
                Goals = myPayload.Goals
            });
        }
    }
}

When the applications are started, a POST request can be sent using the swagger UI from the first App.
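
The same POST request could also be sent without the Swagger UI, for example with curl. The port and route below are placeholders, not taken from the original post; the payload fields match the Payload type used in the controller:

```shell
# -k skips certificate validation for the local development certificate.
# Port and route are placeholders - check launchSettings.json and the
# controller's route attribute for the real values.
curl -k -X POST "https://localhost:5001/api/payloads" \
  -H "Content-Type: application/json" \
  -d '{ "id": 1, "name": "test", "goals": 3 }'
```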

And the message is then processed in the API 2.

Links:

https://docs.microsoft.com/en-us/azure/service-bus-messaging/

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dotnet-get-started-with-queues

https://docs.microsoft.com/en-us/dotnet/standard/microservices-architecture/multi-container-microservice-net-applications/integration-event-based-microservice-communications

https://www.nuget.org/packages/Microsoft.Azure.ServiceBus

Using Azure Service Bus Topics in ASP.NET Core

This article shows how to implement two ASP.NET Core API applications which communicate with each other using Azure Service Bus Topics. This post continues on from the last article, this time using topics and subscriptions instead of a queue. By using a topic with subscriptions, a message can be sent to n receivers.

Code: https://github.com/damienbod/AspNetCoreServiceBus

Posts in this series:

Setting up the Azure Service Bus Topics

The Azure Service Bus topic and the topic subscription need to be set up in Azure, either using the portal or scripts.

You need to create a Topic in the Azure Service Bus:

In the new Topic, add a Topic Subscription:
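
The topic and the subscription could also be created with the Azure CLI instead of the portal. A sketch with placeholder resource group and namespace names (the topic and subscription names match the constants used in the code below):

```shell
az servicebus topic create --resource-group myResourceGroup \
  --namespace-name myServiceBusNamespace --name mytopic
az servicebus topic subscription create --resource-group myResourceGroup \
  --namespace-name myServiceBusNamespace --topic-name mytopic \
  --name mytopicsubscription
```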

ASP.NET Core applications

The applications are set up as in the first post in this series. This time the message bus uses a topic and a subscription to send the messages.

Implementing the Azure Service Bus Topic sender

The messages are sent using the ServiceBusTopicSender class. This class uses the Azure Service Bus connection string and a topic path which matches what was configured in the Azure portal. A new TopicClient is created, and this can then be used to send messages to the topic.

using Microsoft.Azure.ServiceBus;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System;
using System.Text;
using System.Threading.Tasks;

namespace ServiceBusMessaging
{
    public class ServiceBusTopicSender
    {
        private readonly TopicClient _topicClient;
        private readonly IConfiguration _configuration;
        private const string TOPIC_PATH = "mytopic";
        private readonly ILogger _logger;

        public ServiceBusTopicSender(IConfiguration configuration, 
            ILogger<ServiceBusTopicSender> logger)
        {
            _configuration = configuration;
            _logger = logger;
            _topicClient = new TopicClient(
                _configuration.GetConnectionString("ServiceBusConnectionString"),
                TOPIC_PATH
            );
        }
        
        public async Task SendMessage(MyPayload payload)
        {
            string data = JsonConvert.SerializeObject(payload);
            Message message = new Message(Encoding.UTF8.GetBytes(data));

            try
            {
                await _topicClient.SendAsync(message);
            }
            catch (Exception e)
            {
                _logger.LogError(e.Message);
            }
        }
    }
}

The ServiceBusTopicSender class is added as a service in the Startup class.

services.AddScoped<ServiceBusTopicSender>();

This service can then be used in the API to send messages to the bus, when other services need the data from the API call.

[HttpPost]
[ProducesResponseType(typeof(Payload), StatusCodes.Status200OK)]
[ProducesResponseType(typeof(Payload), StatusCodes.Status409Conflict)]
public async Task<IActionResult> Create([FromBody][Required] Payload request)
{
	if (data.Any(d => d.Id == request.Id))
	{
		return Conflict($"data with id {request.Id} already exists");
	}

	data.Add(request);

	// Send this to the bus for the other services
	await _serviceBusTopicSender.SendMessage(new MyPayload
	{
		Goals = request.Goals,
		Name = request.Name,
		Delete = false
	});

	return Ok(request);
}

Implementing an Azure Service Bus Topic Subscription

The ServiceBusTopicSubscription class implements the topic subscription. The SubscriptionClient is created using the Azure Service Bus connection string, the topic path and the subscription name. These values are the values which have been configured in Azure. The RegisterOnMessageHandlerAndReceiveMessages method is used to receive the events and send the messages on for processing in the IProcessData implementation.

using Microsoft.Azure.ServiceBus;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace ServiceBusMessaging
{
    public interface IServiceBusTopicSubscription
    {
        void RegisterOnMessageHandlerAndReceiveMessages();
        Task CloseSubscriptionClientAsync();
    }

    public class ServiceBusTopicSubscription : IServiceBusTopicSubscription
    {
        private readonly IProcessData _processData;
        private readonly IConfiguration _configuration;
        private readonly SubscriptionClient _subscriptionClient;
        private const string TOPIC_PATH = "mytopic";
        private const string SUBSCRIPTION_NAME = "mytopicsubscription";
        private readonly ILogger _logger;

        public ServiceBusTopicSubscription(IProcessData processData, 
            IConfiguration configuration, 
            ILogger<ServiceBusTopicSubscription> logger)
        {
            _processData = processData;
            _configuration = configuration;
            _logger = logger;

            _subscriptionClient = new SubscriptionClient(
                _configuration.GetConnectionString("ServiceBusConnectionString"), 
                TOPIC_PATH, 
                SUBSCRIPTION_NAME);
        }

        public void RegisterOnMessageHandlerAndReceiveMessages()
        {
            var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
            {
                MaxConcurrentCalls = 1,
                AutoComplete = false
            };

            _subscriptionClient.RegisterMessageHandler(ProcessMessagesAsync, messageHandlerOptions);
        }

        private async Task ProcessMessagesAsync(Message message, CancellationToken token)
        {
            var myPayload = JsonConvert.DeserializeObject<MyPayload>(Encoding.UTF8.GetString(message.Body));
            _processData.Process(myPayload);
            await _subscriptionClient.CompleteAsync(message.SystemProperties.LockToken);
        }

        private Task ExceptionReceivedHandler(ExceptionReceivedEventArgs exceptionReceivedEventArgs)
        {
            _logger.LogError(exceptionReceivedEventArgs.Exception, "Message handler encountered an exception");
            var context = exceptionReceivedEventArgs.ExceptionReceivedContext;

            _logger.LogDebug($"- Endpoint: {context.Endpoint}");
            _logger.LogDebug($"- Entity Path: {context.EntityPath}");
            _logger.LogDebug($"- Executing Action: {context.Action}");

            return Task.CompletedTask;
        }

        public async Task CloseSubscriptionClientAsync()
        {
            await _subscriptionClient.CloseAsync();
        }
    }
}

The IServiceBusTopicSubscription and IProcessData interfaces, together with their implementations, are registered with the IoC of the ASP.NET Core application.

services.AddSingleton<IServiceBusTopicSubscription, ServiceBusTopicSubscription>();
services.AddTransient<IProcessData, ProcessData>();

The RegisterOnMessageHandlerAndReceiveMessages method is called in the Startup Configure method, so that the application starts listening for messages.

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
	...

	var busSubscription = 
		app.ApplicationServices.GetService<IServiceBusTopicSubscription>();
	busSubscription.RegisterOnMessageHandlerAndReceiveMessages();
}
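The CloseSubscriptionClientAsync method from the subscription service is never called in the snippet above. One possible way to wire it up is sketched here using the IHostApplicationLifetime API; this wiring is an assumption, not part of the original sample:

```csharp
public void Configure(IApplicationBuilder app, IWebHostEnvironment env,
	IHostApplicationLifetime lifetime)
{
	...

	var busSubscription =
		app.ApplicationServices.GetService<IServiceBusTopicSubscription>();
	busSubscription.RegisterOnMessageHandlerAndReceiveMessages();

	// Close the subscription client cleanly when the host shuts down.
	lifetime.ApplicationStopping.Register(() =>
		busSubscription.CloseSubscriptionClientAsync().GetAwaiter().GetResult());
}
```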

The ProcessData service processes the incoming topic messages for the defined subscription, and adds them to an in-memory list in this demo, which can be viewed using the Swagger API.

using AspNetCoreServiceBusApi2.Model;
using ServiceBusMessaging;

namespace AspNetCoreServiceBusApi2
{
    public class ProcessData : IProcessData
    {
        public void Process(MyPayload myPayload)
        {
            DataServiceSimi.Data.Add(new Payload
            {
                Name = myPayload.Name,
                Goals = myPayload.Goals
            });
        }
    }
}

If only the ASP.NET Core application which sends messages is started, and a POST request is made to the topic API, a message will be sent to the Azure Service Bus topic. This can then be viewed in the portal.

If the API of the application which receives the topic subscription is then started, the message is received and removed from the topic subscription.

Links:

https://docs.microsoft.com/en-us/azure/service-bus-messaging/

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dotnet-get-started-with-queues

https://www.nuget.org/packages/Microsoft.Azure.ServiceBus

Azure Service Bus Topologies

https://docs.microsoft.com/en-us/dotnet/standard/microservices-architecture/multi-container-microservice-net-applications/integration-event-based-microservice-communications

Always subscribe to Dead-lettered messages in an Azure Service Bus

Using Azure Service Bus Topics Subscription Filters in ASP.NET Core


This article shows how to implement Azure Service Bus filters for topic subscriptions used in an ASP.NET Core API application. The application uses the Microsoft.Azure.ServiceBus NuGet package for all the Azure Service Bus client logic.

Code: https://github.com/damienbod/AspNetCoreServiceBus

Posts in this series:

Azure Service Bus Topic Sender

The topic sender from the previous post was changed to add a UserProperties entry named goals to the message, which the subscription filter will evaluate. Otherwise the sender is unchanged and sends the messages to the topic.

using Microsoft.Azure.ServiceBus;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System;
using System.Text;
using System.Threading.Tasks;

namespace ServiceBusMessaging
{
    public class ServiceBusTopicSender
    {
        private readonly TopicClient _topicClient;
        private readonly IConfiguration _configuration;
        private const string TOPIC_PATH = "mytopic";
        private readonly ILogger _logger;

        public ServiceBusTopicSender(IConfiguration configuration, 
            ILogger<ServiceBusTopicSender> logger)
        {
            _configuration = configuration;
            _logger = logger;
            _topicClient = new TopicClient(
                _configuration.GetConnectionString("ServiceBusConnectionString"),
                TOPIC_PATH
            );
        }
        
        public async Task SendMessage(MyPayload payload)
        {
            string data = JsonConvert.SerializeObject(payload);
            Message message = new Message(Encoding.UTF8.GetBytes(data));
            message.UserProperties.Add("goals", payload.Goals);

            try
            {
                await _topicClient.SendAsync(message);
            }
            catch (Exception e)
            {
                _logger.LogError(e.Message);
            }
        }
        
    }
}

It is not possible to add a subscription filter to the topic using the Azure portal. To do this you need to implement it in code, use scripts, or use the Azure CLI.
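For example, an equivalent SQL filter rule can be created on an existing subscription with the Azure CLI. The resource group and namespace names below are placeholders, and the flag names should be checked against your installed CLI version:

```shell
az servicebus topic subscription rule create \
  --resource-group myResourceGroup \
  --namespace-name myNamespace \
  --topic-name mytopic \
  --subscription-name mytopicsubscription \
  --name GoalsGreaterThanSeven \
  --filter-sql-expression "goals > 7"
```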

The RemoveDefaultFilters method checks whether the default rule exists and, if so, removes it. It does not remove any other filters.

private async Task RemoveDefaultFilters()
{
	try
	{
		var rules = await _subscriptionClient.GetRulesAsync();
		foreach(var rule in rules)
		{
			if(rule.Name == RuleDescription.DefaultRuleName)
			{
				await _subscriptionClient.RemoveRuleAsync(RuleDescription.DefaultRuleName);
			}
		}
		
	}
	catch (Exception ex)
	{
		_logger.LogWarning(ex.ToString());
	}
}

The AddFilters method adds the new filter, if it is not already added. The filter in this demo will use the goals user property from the message and only subscribe to messages with a value greater than 7.

private async Task AddFilters()
{
	try
	{
		var rules = await _subscriptionClient.GetRulesAsync();
		if(!rules.Any(r => r.Name == "GoalsGreaterThanSeven"))
		{
			var filter = new SqlFilter("goals > 7");
			await _subscriptionClient.AddRuleAsync("GoalsGreaterThanSeven", filter);
		}
	}
	catch (Exception ex)
	{
		_logger.LogWarning(ex.ToString());
	}
}

The filter methods are called in the PrepareFiltersAndHandleMessages method. This sets up the filters, or makes sure the filters are correct on the Azure Service Bus, and then registers the message handler for the topic subscription to receive the messages from its subscription.

public async Task PrepareFiltersAndHandleMessages()
{
	await RemoveDefaultFilters();
	await AddFilters();

	var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
	{
		MaxConcurrentCalls = 1,
		AutoComplete = false,
	};

	_subscriptionClient.RegisterMessageHandler(ProcessMessagesAsync, messageHandlerOptions);
}

The Azure Service Bus classes are added to the ASP.NET Core application in the Startup class. This adds the services to the IoC and initializes the message listener.

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.OpenApi.Models;
using ServiceBusMessaging;
using System.Threading.Tasks;

namespace AspNetCoreServiceBusApi2
{
    public class Startup
    {
        public Startup(IConfiguration configuration)
        {
            Configuration = configuration;
        }

        public IConfiguration Configuration { get; }

        public void ConfigureServices(IServiceCollection services)
        {
            ...
			
            services.AddSingleton<IServiceBusTopicSubscription, ServiceBusTopicSubscription>();
            services.AddTransient<IProcessData, ProcessData>();

        }

        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            ...
			
            var busSubscription = app.ApplicationServices.GetService<IServiceBusTopicSubscription>();
            busSubscription.PrepareFiltersAndHandleMessages().GetAwaiter().GetResult();
        }
    }
}

When the applications are started, API2 only receives messages which have a goals value greater than seven.

Links:

https://docs.microsoft.com/en-us/azure/service-bus-messaging/

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dotnet-get-started-with-queues

https://www.nuget.org/packages/Microsoft.Azure.ServiceBus

Azure Service Bus Topologies

https://docs.microsoft.com/en-us/dotnet/standard/microservices-architecture/multi-container-microservice-net-applications/integration-event-based-microservice-communications

Always subscribe to Dead-lettered messages when using an Azure Service Bus


Running Razor Pages and a gRPC service in a single ASP.NET Core application


This article shows how ASP.NET Core Razor Pages can be run in the same application as a gRPC service.

Code: https://github.com/damienbod/Secure_gRpc

Posts in this series

Adding Razor Pages to an existing gRPC service

This demo is built using the code created in the previous post, which is pretty much the gRPC code created from the ASP.NET Core 3.0 templates.

To host both Razor Pages and gRPC services on a single Kestrel server, both HTTP/1.1 and HTTP/2 need to be supported. This is done in the Program class of the application using the ConfigureKestrel method, setting the Protocols property to HttpProtocols.Http1AndHttp2.

public static IHostBuilder CreateHostBuilder(string[] args) =>
 Host.CreateDefaultBuilder(args)
   .ConfigureWebHostDefaults(webBuilder =>
   {
	webBuilder.UseStartup<Startup>()
	.ConfigureKestrel(options =>
	{
		options.Limits.MinRequestBodyDataRate = null;
		options.ListenLocalhost(50051, listenOptions =>
		{
			listenOptions.UseHttps("server.pfx", "1111");
			listenOptions.Protocols = HttpProtocols.Http1AndHttp2;
		});
	});
   });

Now we need to add the MVC services and the Newtonsoft JSON NuGet package (Microsoft.AspNetCore.Mvc.NewtonsoftJson), as this is no longer included in the default packages in ASP.NET Core 3.0.
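The package reference in the csproj file looks something like this (the exact version is an assumption; use the one matching your ASP.NET Core 3.0 build):

```xml
<PackageReference Include="Microsoft.AspNetCore.Mvc.NewtonsoftJson" Version="3.0.0" />
```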

Then add the MVC middleware.

public void ConfigureServices(IServiceCollection services)
{
	...

	services.AddGrpc(options =>
	{
		options.EnableDetailedErrors = true;
	});

	services.AddMvc()
	   .AddNewtonsoftJson();
}

In the Configure method, the static files middleware needs to be added so that the CSS and JavaScript files are served. The Razor Pages are then added to the routing.

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
	...

	app.UseStaticFiles();

	app.UseRouting(routes =>
	{
		routes.MapGrpcService<GreeterService>();
		   
		routes.MapRazorPages();
	});

}

Add the Razor Pages to the application, along with the CSS and JavaScript files inside wwwroot.

And now both application types are running, hosted on Kestrel in the same app, which is really cool.

Links:

https://github.com/grpc/grpc-dotnet/

https://grpc.io/

An Early Look at gRPC and ASP.NET Core 3.0

Using Entity Framework Core to process Azure Service Messages in ASP.NET Core


This article shows how to use Entity Framework Core together with an Azure Service Bus receiver in ASP.NET Core. The message handler is a singleton, so the Entity Framework Core context used inside it cannot be registered as a scoped service; instead, a context is created and disposed for each message event.

Code: https://github.com/damienbod/AspNetCoreServiceBus

Posts in this series:

Processing the Azure Service Bus Messages

The ProcessData class is used to handle the messages in the ASP.NET Core application. The service uses an Entity Framework Core context to save the required data to a database.

using AspNetCoreServiceBusApi2.Model;
using Microsoft.Extensions.Configuration;
using ServiceBusMessaging;
using System;
using System.Threading.Tasks;

namespace AspNetCoreServiceBusApi2
{
    public class ProcessData : IProcessData
    {
        private IConfiguration _configuration;

        public ProcessData(IConfiguration configuration)
        {
            _configuration = configuration;
        }
        public async Task Process(MyPayload myPayload)
        {
            using (var payloadMessageContext = 
                new PayloadMessageContext(
                    _configuration.GetConnectionString("DefaultConnection")))
            {
                await payloadMessageContext.AddAsync(new Payload
                {
                    Name = myPayload.Name,
                    Goals = myPayload.Goals,
                    Created = DateTime.UtcNow
                });

                await payloadMessageContext.SaveChangesAsync();
            }
        }
    }
}

The services used to consume the Azure Service Bus are registered with the IoC (Inversion of Control) container as singletons. Due to this, only singleton or transient services can be used inside them. If we used the context as a singleton, we would end up with connection and pooling problems with the database.

services.AddSingleton<IServiceBusConsumer, ServiceBusConsumer>();
services.AddSingleton<IServiceBusTopicSubscription, ServiceBusTopicSubscription>();
services.AddSingleton<IProcessData, ProcessData>();

A PayloadMessageContext Entity Framework Core context was created for the Azure Service Bus message handling.

using Microsoft.EntityFrameworkCore;

namespace AspNetCoreServiceBusApi2.Model
{
    public class PayloadMessageContext : DbContext
    {
        private string _connectionString;

        public DbSet<Payload> Payloads { get; set; }
      
        public PayloadMessageContext(string connectionString)
        {
            _connectionString = connectionString;
        }

        protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        {
            optionsBuilder.UseSqlite(_connectionString);
        }

        protected override void OnModelCreating(ModelBuilder builder)
        {
            builder.Entity<Payload>().Property(n => n.Id).ValueGeneratedOnAdd();
            builder.Entity<Payload>().HasKey(m => m.Id); 
            base.OnModelCreating(builder); 
        } 
    }
}

The required NuGet packages were added to the project. This demo uses SQLite.

<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="3.0.0-preview4.19216.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Tools" Version="3.0.0-preview4.19216.3">
  <PrivateAssets>all</PrivateAssets>
  <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>

The service is then added in the Startup class as a singleton. The Entity Framework Core context used for the messaging is not registered here, because it is used inside the singleton instance, and a singleton context would have problems with database connections and pooling. Instead, a new context is created inside the service for each message event and disposed afterwards. If you have a lot of messages, this would need to be optimized.

Now when the ASP.NET Core application receives messages, the singleton service handles these messages and saves the data to a database.

Links:

https://docs.microsoft.com/en-us/azure/service-bus-messaging/

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dotnet-get-started-with-queues

https://www.nuget.org/packages/Microsoft.Azure.ServiceBus

Azure Service Bus Topologies

https://docs.microsoft.com/en-us/dotnet/standard/microservices-architecture/multi-container-microservice-net-applications/integration-event-based-microservice-communications

Always subscribe to Dead-lettered messages when using an Azure Service Bus

https://ml-software.ch/posts/stripe-api-with-asp-net-core-part-3

Handling Access Tokens for private APIs in ASP.NET Core


This article shows how to persist access tokens for a trusted ASP.NET Core application which needs to access secure APIs. The persisted tokens are not meant for public clients; they are used for service-to-service communication.

Code: https://github.com/damienbod/AspNetCoreHybridFlowWithApi

Posts in this series:

Setup

The software system consists of 3 applications, a web client with a UI and user, an API which is used by the web client and a secure token service, implemented using IdentityServer4.

The tokens persisted in this example are used for the communication between the web application and the trusted API in the service. The application gets the access tokens for the service-to-service communication; the tokens for the identities (users + application) are not used here. In the previous post, each time the user requested a view, the API service requested the discovery data (the OpenID Connect well-known endpoints), then requested an access token from the secure token service token endpoint, and only then requested the API resource. We want to reuse the access tokens instead of always making these two extra HTTP requests for the web UI requests.

The ApiService is used to access the API for the identity. This is a scoped or transient instance in the IoC, so each identity gets a different instance.

The ApiService uses the API token client service, which is a singleton. That service gets the access tokens and persists them for as long as they are valid. The ApiService then uses the access token to get the data from the API resource.

using Microsoft.Extensions.Options;
using Newtonsoft.Json.Linq;
using System;
using System.Net.Http;
using System.Threading.Tasks;

namespace WebHybridClient
{
    public class ApiService
    {
        private readonly IOptions<AuthConfigurations> _authConfigurations;
        private readonly IHttpClientFactory _clientFactory;
        private readonly ApiTokenCacheClient _apiTokenClient;

        public ApiService(
            IOptions<AuthConfigurations> authConfigurations, 
            IHttpClientFactory clientFactory,
            ApiTokenCacheClient apiTokenClient)
        {
            _authConfigurations = authConfigurations;
            _clientFactory = clientFactory;
            _apiTokenClient = apiTokenClient;
        }

        public async Task<JArray> GetApiDataAsync()
        {
            try
            {
                var client = _clientFactory.CreateClient();

                client.BaseAddress = new Uri(_authConfigurations.Value.ProtectedApiUrl);

                var access_token = await _apiTokenClient.GetApiToken(
                    "ProtectedApi",
                    "scope_used_for_api_in_protected_zone",
                    "api_in_protected_zone_secret"
                );

                client.SetBearerToken(access_token);

                var response = await client.GetAsync("api/values");
                if (response.IsSuccessStatusCode)
                {
                    var responseContent = await response.Content.ReadAsStringAsync();
                    var data = JArray.Parse(responseContent);

                    return data;
                }

                throw new ApplicationException($"Status code: {response.StatusCode}, Error: {response.ReasonPhrase}");
            }
            catch (Exception e)
            {
                throw new ApplicationException($"Exception {e}");
            }
        }
    }
}

The API token client service exposes the GetApiToken method to get the access token. It requires an API name, a scope and a secret.

var access_token = await _apiTokenClient.GetApiToken(
                    "ProtectedApi",
                    "scope_used_for_api_in_protected_zone",
                    "api_in_protected_zone_secret"
                );

The first time the ASP.NET Core instance requests an access token, it gets the well-known endpoint data from the Auth server, and then gets the access token for the parameters provided. The token response is saved to a concurrent dictionary, so that it can be reused.

private async Task<AccessTokenItem> getApiToken(string api_name, string api_scope, string secret)
{
	try
	{
		var disco = await HttpClientDiscoveryExtensions.GetDiscoveryDocumentAsync(
			_httpClient, 
			_authConfigurations.Value.StsServer);

		if (disco.IsError)
		{
			_logger.LogError($"disco error Status code: {disco.IsError}, Error: {disco.Error}");
			throw new ApplicationException($"Status code: {disco.IsError}, Error: {disco.Error}");
		}

		var tokenResponse = await HttpClientTokenRequestExtensions.RequestClientCredentialsTokenAsync(_httpClient, new ClientCredentialsTokenRequest
		{
			Scope = api_scope,
			ClientSecret = secret,
			Address = disco.TokenEndpoint,
			ClientId = api_name
		});

		if (tokenResponse.IsError)
		{
			_logger.LogError($"tokenResponse.IsError Status code: {tokenResponse.IsError}, Error: {tokenResponse.Error}");
			throw new ApplicationException($"Status code: {tokenResponse.IsError}, Error: {tokenResponse.Error}");
		}

		return new AccessTokenItem
		{
			ExpiresIn = DateTime.UtcNow.AddSeconds(tokenResponse.ExpiresIn),
			AccessToken = tokenResponse.AccessToken
		};
		
	}
	catch (Exception e)
	{
		_logger.LogError($"Exception {e}");
		throw new ApplicationException($"Exception {e}");
	}
}

GetApiToken is the public method of this service. It checks if a valid access token exists for this API, and returns it from memory if it does. Otherwise, it gets a new token from the secure token service, with the two extra HTTP calls.

public async Task<string> GetApiToken(string api_name, string api_scope, string secret)
{
	if (_accessTokens.ContainsKey(api_name))
	{
		var accessToken = _accessTokens.GetValueOrDefault(api_name);
		if (accessToken.ExpiresIn > DateTime.UtcNow)
		{
			return accessToken.AccessToken;
		}
		else
		{
			// remove
			_accessTokens.TryRemove(api_name, out AccessTokenItem accessTokenItem);
		}
	}

	_logger.LogDebug($"GetApiToken new from STS for {api_name}");

	// add
	var newAccessToken = await getApiToken( api_name,  api_scope,  secret);
	_accessTokens.TryAdd(api_name, newAccessToken);

	return newAccessToken.AccessToken;
}
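One detail worth noting: the expiry check compares ExpiresIn against DateTime.UtcNow exactly, so a token can pass the check and still expire while the API request is in flight. A small safety buffer avoids this. The helper below is a sketch with an assumed 30 second buffer; it is not part of the original code:

```csharp
using System;

public class AccessTokenItem
{
    public string AccessToken { get; set; } = string.Empty;
    public DateTime ExpiresIn { get; set; }
}

public static class TokenValidity
{
    // Treat the token as expired slightly early, so a request started just
    // before expiry does not fail mid-flight (the 30s buffer is an assumption).
    public static bool IsStillValid(AccessTokenItem item) =>
        item.ExpiresIn > DateTime.UtcNow.AddSeconds(30);
}
```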

What’s wrong with this?

The above service works well, but what if the ASP.NET Core application is deployed with multiple instances? Each instance of the application would have its own in-memory access tokens, which are updated each time the tokens expire. What if we want to share tokens between instances, or even between services? Then the software system would be making extra requests which could be optimized away.

Using a cache to solve this and improve performance with multiple instances

A distributed cache could be used to solve this problem. For example, a Redis cache could persist the access tokens for the services, shared by all trusted services which request secure API data. These are not the tokens used for the identities, but the tokens for the service-to-service communication. This should run in a protected zone, and if you save access tokens to a shared cache, care has to be taken that this cannot be abused!
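To use Redis as the IDistributedCache implementation, the Microsoft.Extensions.Caching.StackExchangeRedis package provides a registration extension. A minimal sketch, where the connection string and instance name are placeholder assumptions:

```csharp
public void ConfigureServices(IServiceCollection services)
{
	...

	// Requires the Microsoft.Extensions.Caching.StackExchangeRedis NuGet package.
	services.AddStackExchangeRedisCache(options =>
	{
		options.Configuration = "localhost:6379";
		options.InstanceName = "ApiTokenCache";
	});
}
```

For local development, services.AddDistributedMemoryCache() can be used instead, which keeps the same IDistributedCache interface without a Redis dependency.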

The service works just like the service above except a cache is used instead of a concurrent dictionary.

using IdentityModel.Client;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using Newtonsoft.Json;
using System;
using System.Net.Http;
using System.Threading.Tasks;

namespace WebHybridClient
{
    public class ApiTokenCacheClient
    {
        private readonly ILogger<ApiTokenCacheClient> _logger;
        private readonly HttpClient _httpClient;
        private readonly IOptions<AuthConfigurations> _authConfigurations;

        private static readonly Object _lock = new Object();
        private IDistributedCache _cache;

        private const int cacheExpirationInDays = 1;

        private class AccessTokenItem
        {
            public string AccessToken { get; set; } = string.Empty;
            public DateTime ExpiresIn { get; set; }
        }

        public ApiTokenCacheClient(
            IOptions<AuthConfigurations> authConfigurations,
            IHttpClientFactory httpClientFactory,
            ILoggerFactory loggerFactory,
            IDistributedCache cache)
        {
            _authConfigurations = authConfigurations;
            _httpClient = httpClientFactory.CreateClient();
            _logger = loggerFactory.CreateLogger<ApiTokenCacheClient>();
            _cache = cache;
        }

        public async Task<string> GetApiToken(string api_name, string api_scope, string secret)
        {
            var accessToken = GetFromCache(api_name);

            if (accessToken != null)
            {
                if (accessToken.ExpiresIn > DateTime.UtcNow)
                {
                    return accessToken.AccessToken;
                }
                else 
                { 
                    // remove  => NOT Needed for this cache type
                }
            }

            _logger.LogDebug($"GetApiToken new from STS for {api_name}");

            // add
            var newAccessToken = await getApiToken( api_name,  api_scope,  secret);
            AddToCache(api_name, newAccessToken);

            return newAccessToken.AccessToken;
        }

        private async Task<AccessTokenItem> getApiToken(string api_name, string api_scope, string secret)
        {
            try
            {
                var disco = await HttpClientDiscoveryExtensions.GetDiscoveryDocumentAsync(
                    _httpClient, 
                    _authConfigurations.Value.StsServer);

                if (disco.IsError)
                {
                    _logger.LogError($"disco error Status code: {disco.IsError}, Error: {disco.Error}");
                    throw new ApplicationException($"Status code: {disco.IsError}, Error: {disco.Error}");
                }

                var tokenResponse = await HttpClientTokenRequestExtensions.RequestClientCredentialsTokenAsync(_httpClient, new ClientCredentialsTokenRequest
                {
                    Scope = api_scope,
                    ClientSecret = secret,
                    Address = disco.TokenEndpoint,
                    ClientId = api_name
                });

                if (tokenResponse.IsError)
                {
                    _logger.LogError($"tokenResponse.IsError Status code: {tokenResponse.IsError}, Error: {tokenResponse.Error}");
                    throw new ApplicationException($"Status code: {tokenResponse.IsError}, Error: {tokenResponse.Error}");
                }

                return new AccessTokenItem
                {
                    ExpiresIn = DateTime.UtcNow.AddSeconds(tokenResponse.ExpiresIn),
                    AccessToken = tokenResponse.AccessToken
                };
                
            }
            catch (Exception e)
            {
                _logger.LogError($"Exception {e}");
                throw new ApplicationException($"Exception {e}");
            }
        }

        private void AddToCache(string key, AccessTokenItem accessTokenItem)
        {
            var options = new DistributedCacheEntryOptions().SetSlidingExpiration(TimeSpan.FromDays(cacheExpirationInDays));

            lock (_lock)
            {
                _cache.SetString(key, JsonConvert.SerializeObject(accessTokenItem), options);
            }
        }

        private AccessTokenItem GetFromCache(string key)
        {
            var item = _cache.GetString(key);
            if (item != null)
            {
                return JsonConvert.DeserializeObject<AccessTokenItem>(item);
            }

            return null;
        }
    }
}

This improves the performance and reduces the number of HTTP calls for each request. The tokens for the API services are only updated when they expire, which saves many HTTP calls.

Links

https://docs.microsoft.com/en-gb/aspnet/core/mvc/overview

https://docs.microsoft.com/en-gb/aspnet/core/security/anti-request-forgery

https://docs.microsoft.com/en-gb/aspnet/core/security/

http://openid.net/

https://www.owasp.org/images/b/b0/Best_Practices_WAF_v105.en.pdf

https://tools.ietf.org/html/rfc7662

http://docs.identityserver.io/en/release/quickstarts/5_hybrid_and_api_access.html

https://github.com/aspnet/Security

Identity Server: From Implicit to Hybrid Flow

http://openid.net/specs/openid-connect-core-1_0.html#HybridFlowAuth

https://docs.microsoft.com/en-us/aspnet/core/performance/caching/distributed?view=aspnetcore-2.2

Updating Microsoft Account Logins in ASP.NET Core with OpenID Connect and Azure Active Directory


This article shows how to implement an Azure Active Directory login for an ASP.NET Core application. The Microsoft identity platform (v2.0) is now OpenID Connect certified, so Microsoft Account logins can be replaced with it. By using OpenID Connect instead of Microsoft Accounts, it is easy to force a login or a consent screen, while following a standard. A full signout can also be supported if required. The AddOpenIdConnect OIDC extension method should now be used instead of the AddMicrosoftAccount method. This replaces the existing post: Adding an external Microsoft login to IdentityServer4. It is still possible to use https://apps.dev.microsoft.com if only Microsoft accounts are required.

Code https://github.com/damienbod/AspNetCoreID4External

Updating Microsoft Account Logins in ASP.NET Core with OpenID Connect and Azure Active Directory

If you open an existing Microsoft Account App configuration on https://apps.dev.microsoft.com , it will offer you the possibility to configure this on the Azure portal as an Azure Active Directory App. You can also create a new one using the Azure Active Directory/App Registrations/New Registration button.

We want to create the app with the Accounts in any organizational directory and personal Microsoft accounts (e.g. Skype, Xbox, Outlook.com) supported account type, because we would like that our own AAD, any other AAD, or live accounts can log in to our software. We also want to define the required return URLs. Live SDK support is also required.

Then we need to create a new secret which is required to access the login from our OIDC Authorization Code Flow client in the ASP.NET Core application. The Implicit Flow is not required. We could also define the logout URL in the Authentication blade, so that when the user logs out from his/her account, the application will also do a logout.

The application ID is required to configure the OIDC client in the ASP.NET Core application.

In the startup class of the ASP.NET Core application, the AddOpenIdConnect extension method is used to implement the OpenID Connect code flow client to access the Azure AD App. The common V2.0 endpoint is used. The SignInScheme is defined as “Identity.External”. This is because ASP.NET Core Identity is used in this application, and the external identity is then stored in the Identity database with the defined login. Cookies could also be used here if you use only Azure AD and live accounts with the V2.0 common endpoint. The RemoteAuthenticationTimeout property is set so that the user has enough time to do the login. The response type is code, as per the OpenID Connect specification. Issuer validation is disabled (ValidateIssuer is set to false), because any AAD or live account can be used here, so the issuer will always be different. If you only want to allow specific AAD tenants, you should validate the issuer. The email scope is requested and is mapped to the name claim, which can then be accessed easily through the HttpContext object.

The Prompt property can be used to force a login, or the consent screen. Per specification, “none”, “login”, “consent”, “select_account” values can be used here. If the login is not forced, the user will automatically be logged in, if only one account is active.

The CallbackPath path is set to match the App configuration in the Azure AD app registration.

services.AddAuthentication()
.AddOpenIdConnect("Azure AD / Microsoft", "Azure AD / Microsoft", options => 
{
	//  https://login.microsoftonline.com/common/v2.0/.well-known/openid-configuration
	options.ClientId = _clientId;
	options.ClientSecret = _clientSecret;
	options.SignInScheme = "Identity.External";
	options.RemoteAuthenticationTimeout = TimeSpan.FromSeconds(30);
	options.Authority = "https://login.microsoftonline.com/common/v2.0/";
	options.ResponseType = "code";
	options.Scope.Add("profile");
	options.Scope.Add("email");
	options.TokenValidationParameters = new TokenValidationParameters
	{
	   ValidateIssuer = false,
	   NameClaimType = "email",
	};
	options.CallbackPath = "/signin-microsoft";
	options.Prompt = "login"; // login, consent
});

Troubleshooting Correlation Exceptions

When the application is deployed, you will sometimes start receiving correlation exceptions. These are caused by a number of different reasons, and it can be difficult to figure out why the login worked in development but not in the deployment.

RemoteAuthenticationTimeout

Sometimes not enough time is allowed for the user to log in, for example when a 2FA login or something else which takes time is required. Try increasing the RemoteAuthenticationTimeout value to allow more time if this is causing the correlation exceptions.
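A minimal sketch of such a change, using the same client registration as above (the five minute value is an arbitrary example, tune it for your users):

```csharp
services.AddAuthentication()
    .AddOpenIdConnect("Azure AD / Microsoft", "Azure AD / Microsoft", options =>
    {
        // ... ClientId, ClientSecret etc. as configured above ...

        // Allow more time than the 30 seconds used earlier,
        // e.g. for slow 2FA logins.
        options.RemoteAuthenticationTimeout = TimeSpan.FromMinutes(5);
    });
```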

No Cookie exception

This is caused when the client PC blocks cookies, or a firewall or anti-virus setup blocks them. Get the IT admin to open up the security settings for your application.

Multiple instance deployments

This can happen when multiple instances are used for the deployment and data protection does not store its keys in a common store. This is not required for an Azure App Services deployment, but if you deploy with Service Fabric or Docker, a common key store should be configured for data protection.
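A minimal sketch of such a configuration, assuming a file share reachable from all instances (the application name and path are placeholders):

```csharp
using System.IO;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    // Persist the data protection key ring to a store which all
    // instances can reach, so a correlation cookie created on one
    // instance can be unprotected on another.
    services.AddDataProtection()
        .SetApplicationName("my-shared-app") // placeholder name
        .PersistKeysToFileSystem(new DirectoryInfo(@"\\server\share\keys")); // placeholder path
}
```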

Cookie Policy overwritten

If the cookie policy is configured to allow only essential cookies, the correlation cookie builder has to be configured so that these cookies are SameSite=None and marked as essential.
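A sketch of such a configuration, added to the AddOpenIdConnect options shown earlier (the scheme names are reused from above):

```csharp
services.AddAuthentication()
    .AddOpenIdConnect("Azure AD / Microsoft", "Azure AD / Microsoft", options =>
    {
        // ... existing configuration ...

        // Mark the correlation and nonce cookies as essential so a
        // consent-based cookie policy does not suppress them.
        options.CorrelationCookie.SameSite = SameSiteMode.None;
        options.CorrelationCookie.IsEssential = true;
        options.NonceCookie.SameSite = SameSiteMode.None;
        options.NonceCookie.IsEssential = true;
    });
```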

Summary

The AddMicrosoftAccount extension method can now be replaced with the AddOpenIdConnect method, and AddMicrosoftAccount should no longer be used. The Azure V2.0 common endpoint is now certified, so this follows a specification and best practice.

Links

https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-aspnet-core-webapp

https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-overview

https://openid.net/specs/openid-connect-core-1_0.html

https://docs.microsoft.com/en-us/azure/app-service-mobile/app-service-mobile-how-to-configure-microsoft-authentication

http://docs.identityserver.io/en/release/topics/signin_external_providers.html

https://developer.microsoft.com/en-us/identity/blogs/new-app-registrations-experience-is-now-generally-available/

https://docs.microsoft.com/en-us/azure/active-directory/develop/app-registrations-training-guide

https://docs.microsoft.com/en-us/azure/active-directory/develop/v1-protocols-openid-connect-code

Certificate Authentication in ASP.NET Core 3.0


This article shows how Certificate Authentication can be implemented in ASP.NET Core 3.0. In this example, a shared self signed certificate is used to authenticate one application calling an API on a second ASP.NET Core application.

Code https://github.com/damienbod/AspNetCoreCertificateAuth

Posts in this series

Setting up the Server

Add the Certificate Authentication using the Microsoft.AspNetCore.Authentication.Certificate NuGet package to the server ASP.NET Core application.

This can also be added directly in the csproj file.

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp3.0</TargetFramework>
    <AspNetCoreHostingModel>OutOfProcess</AspNetCoreHostingModel>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.Authentication.Certificate" 
      Version="3.0.0-preview6.19307.2" />
  </ItemGroup>

  <ItemGroup>
    <None Update="sts_dev_cert.pfx">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
  </ItemGroup>

</Project>
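The example loads a shared self signed certificate, sts_dev_cert.pfx. If you need to create such a certificate yourself, one way is the CertificateRequest API (a sketch; the subject name and password are examples, matching the values used in this post):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

class CreateDevCert
{
    static void Main()
    {
        // Create a self signed certificate and export it as a
        // password protected pfx file.
        using (var rsa = RSA.Create(2048))
        {
            var request = new CertificateRequest(
                "CN=sts_dev_cert",
                rsa,
                HashAlgorithmName.SHA256,
                RSASignaturePadding.Pkcs1);

            using (var cert = request.CreateSelfSigned(
                DateTimeOffset.UtcNow.AddDays(-1),
                DateTimeOffset.UtcNow.AddYears(1)))
            {
                File.WriteAllBytes("sts_dev_cert.pfx",
                    cert.Export(X509ContentType.Pfx, "1234"));
            }
        }
    }
}
```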

The authentication can be added in the ConfigureServices method in the Startup class. This example was built using the ASP.NET Core documentation. The AddAuthentication extension method is used to define the default scheme as “Certificate” using the CertificateAuthenticationDefaults.AuthenticationScheme string. The AddCertificate method then adds the configuration for the certificate authentication. At present, all certificates are accepted, which is not good, so the MyCertificateValidationService class is used to do extra validation of the client certificate. If this validation fails, the request for the resource is rejected.

public void ConfigureServices(IServiceCollection services)
{
	services.AddSingleton<MyCertificateValidationService>();

	services.AddAuthentication(CertificateAuthenticationDefaults.AuthenticationScheme)
		.AddCertificate(options => // code from ASP.NET Core sample
		{
			options.AllowedCertificateTypes = CertificateTypes.All;
			options.Events = new CertificateAuthenticationEvents
			{
				OnCertificateValidated = context =>
				{
					var validationService =
						context.HttpContext.RequestServices.GetService<MyCertificateValidationService>();

					if (validationService.ValidateCertificate(context.ClientCertificate))
					{
						var claims = new[]
						{
							new Claim(ClaimTypes.NameIdentifier, context.ClientCertificate.Subject, ClaimValueTypes.String, context.Options.ClaimsIssuer),
							new Claim(ClaimTypes.Name, context.ClientCertificate.Subject, ClaimValueTypes.String, context.Options.ClaimsIssuer)
						};

						context.Principal = new ClaimsPrincipal(new ClaimsIdentity(claims, context.Scheme.Name));
						context.Success();
					}
					else
					{
						context.Fail("invalid cert");
					}

					return Task.CompletedTask;
				}
			};
		});

	services.AddAuthorization();

	services.AddControllers();
}

The AddCertificateForwarding method is used to specify which client header carries the certificate, and how the certificate is to be loaded, using the HeaderConverter option. When sending the certificate with the HttpClient using the default settings, the ClientCertificate was always null. To work around this, the X-ARR-ClientCert header is used to pass the client certificate as a string.

services.AddCertificateForwarding(options =>
{
	options.CertificateHeader = "X-ARR-ClientCert";
	options.HeaderConverter = (headerValue) =>
	{
		X509Certificate2 clientCertificate = null;
		if(!string.IsNullOrWhiteSpace(headerValue))
		{
			byte[] bytes = StringToByteArray(headerValue);
			clientCertificate = new X509Certificate2(bytes);
		}

		return clientCertificate;
	};
});
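The StringToByteArray helper used above is not shown in the snippet. A possible implementation, assuming the header carries the hex string produced by GetRawCertDataString on the client, could look like this:

```csharp
private static byte[] StringToByteArray(string hex)
{
    // GetRawCertDataString returns the raw certificate bytes as a
    // hex string, so convert each pair of hex characters back to a byte.
    int numberChars = hex.Length;
    byte[] bytes = new byte[numberChars / 2];
    for (int i = 0; i < numberChars; i += 2)
    {
        bytes[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
    }

    return bytes;
}
```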

The Configure method then adds the middleware. UseCertificateForwarding is added before UseAuthentication and UseAuthorization.

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
	...
	
	app.UseRouting();

	app.UseCertificateForwarding();
	app.UseAuthentication();
	app.UseAuthorization();

	app.UseEndpoints(endpoints =>
	{
		endpoints.MapControllers();
	});
}

The MyCertificateValidationService is used to implement the validation logic. Because self signed certificates are used, we need to ensure that only our certificate can be used. We validate that the thumbprint of the client certificate matches the thumbprint of the server certificate; otherwise any certificate would be enough to authenticate.

using System.IO;
using System.Security.Cryptography.X509Certificates;

namespace AspNetCoreCertificateAuthApi
{
    public class MyCertificateValidationService
    {
        public bool ValidateCertificate(X509Certificate2 clientCertificate)
        {
            var cert = new X509Certificate2(Path.Combine("sts_dev_cert.pfx"), "1234");
            if (clientCertificate.Thumbprint == cert.Thumbprint)
            {
                return true;
            }

            return false;
        }
    }
}

The API ValuesController is then secured using the Authorize attribute.

[Route("api/[controller]")]
[ApiController]
[Authorize]
public class ValuesController : ControllerBase
{

...

The ASP.NET Core server project is deployed in this example as an out-of-process application using Kestrel. A client certificate is required to use the service. This is defined using the ClientCertificateMode.RequireCertificate option.

public static IWebHost BuildWebHost(string[] args)
  => WebHost.CreateDefaultBuilder(args)
  .UseStartup<Startup>()
  .ConfigureKestrel(options =>
  {
	var cert = new X509Certificate2(Path.Combine("sts_dev_cert.pfx"), "1234");
	options.ConfigureHttpsDefaults(o =>
	{
		o.ServerCertificate = cert;
		o.ClientCertificateMode = ClientCertificateMode.RequireCertificate;
	});
  })
  .Build();

Implementing the HttpClient

The client of the API uses a HttpClient which was created using an instance of the IHttpClientFactory. This does not provide a way to define a handler for the HttpClient, so a HttpRequestMessage is used to add the certificate to the X-ARR-ClientCert request header. The certificate is added as a string using the GetRawCertDataString method.

private async Task<JArray> GetApiDataAsync()
{
	try
	{
		var cert = new X509Certificate2(Path.Combine(_environment.ContentRootPath, "sts_dev_cert.pfx"), "1234");

		var client = _clientFactory.CreateClient();

		var request = new HttpRequestMessage()
		{
			RequestUri = new Uri("https://localhost:44379/api/values"),
			Method = HttpMethod.Get,
		};

		request.Headers.Add("X-ARR-ClientCert", cert.GetRawCertDataString());
		var response = await client.SendAsync(request);

		if (response.IsSuccessStatusCode)
		{
			var responseContent = await response.Content.ReadAsStringAsync();
			var data = JArray.Parse(responseContent);

			return data;
		}

		throw new ApplicationException($"Status code: {response.StatusCode}, Error: {response.ReasonPhrase}");
	}
	catch (Exception e)
	{
		throw new ApplicationException($"Exception {e}");
	}
}

If the correct certificate is sent to the server, the data will be returned. If no certificate, or the wrong certificate, is sent, a 403 is returned. It would be nice if the IHttpClientFactory had a way of defining a handler for the HttpClient. I also believe invalid certificates should fail by default and not require extra validation. The AddCertificateForwarding call should also not be required for a default HttpClient calling the service.
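As an alternative to the header work-around, a named client registered with IHttpClientFactory can configure its primary message handler so that the certificate is sent in the TLS handshake itself. This is only a sketch, assuming the same pfx file; the client name is arbitrary, and whether the server sees the certificate this way depends on the hosting setup:

```csharp
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;
using Microsoft.Extensions.DependencyInjection;

// In ConfigureServices: register a named client whose handler
// attaches the client certificate at the TLS level.
services.AddHttpClient("certClient")
    .ConfigurePrimaryHttpMessageHandler(() =>
    {
        var handler = new HttpClientHandler();
        handler.ClientCertificates.Add(
            new X509Certificate2("sts_dev_cert.pfx", "1234"));
        return handler;
    });

// Usage: _clientFactory.CreateClient("certClient")
```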

Certificate authentication is great, and adds another security layer which can be used together with other solutions. See the code and the ASP.NET Core source code for further documentation and examples. Links are underneath.

Links

https://docs.microsoft.com/en-us/aspnet/core/security/authentication/certauth?view=aspnetcore-3.0

https://github.com/aspnet/AspNetCore/tree/master/src/Security/Authentication/Certificate/src

https://tools.ietf.org/html/rfc5246#section-7.4.4
