Bootstrapping ASP.NET Core - Week 3

In Part 2 of our series, we used Swagger to define a simple REST API for our dog walking application. In this installment, we are going to use Entity Framework Core to work with a database from our ASP.NET Core application.

What is Entity Framework Core?

"Entity Framework Core is an object-relational mapper (O/RM) that enables .NET developers to work with a database using .NET objects. It eliminates the need for most of the data-access code that developers usually need to write." - MSFT documentation

Like ASP.NET Core, EF Core is a rewrite of the previous version of Entity Framework. Conceptually it is similar to its predecessor. If you are familiar with Entity Framework 6 then Entity Framework Core will feel very similar.

If you are not familiar with Entity Framework or other ORMs, an ORM is simply a means of mapping the classes in our code to tables in our database. An ORM lets our application code access a database without requiring us to worry about the specifics of generating SQL.

In our simple application our classes and tables map one-to-one, but this is not always the case.

Entity Framework also lets us change our database provider without changing our application code.

[Image: Entity Framework overview diagram]

There are two main parts to Entity Framework.

  1. The Entity Framework data context
  2. The database provider

Our application interacts with the data context, and Entity Framework takes care of generating the necessary SQL using the configured database provider. In our case, we will be using SQL Server.

Entity Framework Core Packages

The first step, as usual, is to install the Entity Framework Core packages into our project. We will be using the NuGet Package Manager Console to work with our Entity Framework tools when we look at creating data migrations so let's install these packages from the console as well.

First, install the SqlServer database provider for Entity Framework. This will also install the core Entity Framework libraries.

PM> Install-Package Microsoft.EntityFrameworkCore.SqlServer

Second, we will install a tools package that will let us work with our database and Entity Framework during our development process.

PM> Install-Package Microsoft.EntityFrameworkCore.Tools -Version 1.1.0-preview4-final
*** We are using the latest preview of the tools package, so be sure to check for and install a newer version if one is available.

Application Data Context

We created all of our model classes when defining the REST API. Our next step is to create an Entity Framework data context class. We will create this class in a new top-level folder called Data and we will name the class ApplicationDataContext.


using Microsoft.EntityFrameworkCore;

public class ApplicationDataContext : DbContext
{
    // The options, including the configured database provider, are supplied via dependency injection
    public ApplicationDataContext(DbContextOptions<ApplicationDataContext> options) : base(options)
    { }

    public DbSet<Attachment> Attachments { get; set; }

    public DbSet<Customer> Customers { get; set; }

    public DbSet<Payment> Payments { get; set; }

    public DbSet<Pet> Pets { get; set; }

    public DbSet<Walk> Walks { get; set; }
}

DbSet is a custom Entity Framework collection that represents a table in our database. We will use LINQ to query from these collections in our API Controllers.
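For example, with an ApplicationDataContext injected into a controller (named _dataContext here, matching the field we add later in this post), a simple query might look something like this. The city value is purely illustrative.

var seattleCustomers = await _dataContext.Customers
    .Where(c => c.City == "Seattle")   // translated into a SQL WHERE clause by the provider
    .OrderBy(c => c.Name)
    .ToListAsync();                    // async execution, from Microsoft.EntityFrameworkCore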

Wiring Up the Data Context

We want other parts of our application to be able to access the data context so we need to register it as a service. Entity Framework Core comes with an extension method that takes care of registering all the required services, services.AddDbContext(…).

We will add this call to the ConfigureServices method of our Startup class but before we do that we need to define our connection string to tell Entity Framework how to connect to our database.

Connection String Configuration

Our first inclination might be to add our connection string to our appsettings.json file or to the environment-specific appsettings.Development.json.

{
  "appSettings": {
    "title": "[DEVELOPMENT] Walk With Fido"
  },
  "connectionStrings": {
    "fido.data": "Data Source=(local);Initial Catalog=fido.data;Integrated Security=true;"
  }
}

This approach would work, but we should be in the habit of treating any potentially sensitive information with more care. Currently our connection string does not contain a username or password, but it is not uncommon for connection strings to contain such information. Instead of adding our connection string to our settings file, we should add it as a user secret.

User Secrets can be managed at the command line, but Visual Studio 2017 provides a context menu item that opens the secrets.json file for the current project. In this situation, editing the file directly is easier than jumping out to the command line.

[Image: Manage User Secrets context menu in Visual Studio]
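If you would rather stay at the command line and have the Secret Manager tool available in your project, setting the same secret would look something like this. Note the colon syntax for addressing the nested connectionStrings section.

dotnet user-secrets set "connectionStrings:fido.data" "Data Source=(local);Initial Catalog=fido.data;Integrated Security=true;"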

My secrets.json file is located at 'C:\Users\andre\AppData\Roaming\Microsoft\UserSecrets\aspnet-FidoWeb-1e7acb77-4a4d-4630-968a-b93bf7969600'. This file isn't encrypted, but because it is physically located outside of my project folder I will not be able to accidentally commit this information to source control, which can lead to all kinds of issues.

{
  "connectionStrings": {
    "fido.data": "Data Source=(local);Initial Catalog=fido.data;Integrated Security=true;"
  }
}

When using secrets locally this week, I was given an error message about being unable to locate the UserSecretsIdAttribute. If you see this error, you need to update the code in Startup.cs that registers user secrets from .AddUserSecrets() to .AddUserSecrets<Startup>().
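With that change in place, the configuration setup in the Startup constructor should look something like the following sketch, which assumes the default project template of the time.

public Startup(IHostingEnvironment env)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(env.ContentRootPath)
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);

    if (env.IsDevelopment())
    {
        // Loads secrets.json for the project identified by its UserSecretsId
        builder.AddUserSecrets<Startup>();
    }

    builder.AddEnvironmentVariables();
    Configuration = builder.Build();
}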

Register the Service

Now that we have our connection string defined we can return to the Startup class and register the Entity Framework Core services by adding the following line to the ConfigureServices method.

public void ConfigureServices(IServiceCollection services)
{
    ...
    var connection = Configuration.GetConnectionString("fido.data");
    services.AddDbContext<ApplicationDataContext>(options => options.UseSqlServer(connection));
}

Verify

To verify that we have our data context wired up successfully, we will inject it into our HomeController by adding an additional constructor parameter and a backing field to the controller.

private ApplicationDataContext _dataContext;

public HomeController(IOptions<AppSettings> appSettingsAccessor, ApplicationDataContext dataContext)
{
    _appSettings = appSettingsAccessor.Value;
    _dataContext = dataContext;
}

If the index page reloads without any new errors, we can be confident that the data context is reading the connection string and is wired up correctly in the services collection.

But we still don’t have a database for our application to access.

Entity Framework Migrations

We could manually create the database or write SQL scripts to define our schema but Entity Framework comes with a database migrations tool that can inspect our data context and model classes and create the necessary database. The concept and nuances of using database migrations, especially on large teams, probably deserve their own post. Since Fido is such a simple application we will do fine to simply use the happy path provided by Entity Framework Core Migrations.

Adding the First Migration

To add our first migration we need to open the NuGet Package Manager Console.

When using project.json-based ASP.NET Core projects, there was a dotnet CLI tool that could be used for running Entity Framework migrations. As of right now, that CLI is not available when working with msbuild-style projects, so we have to use the NuGet console to create and run our migrations.

I hope the CLI returns in the future.

In the NuGet package manager console we can create our first migration using the Add-Migration command and then supplying a name. We will call our first migration Initial.

PM> Add-Migration Initial

If all goes well we will see a message saying,

To undo this action, use Remove-Migration.

Inside a Migration

Before we apply the migration and create the database we should take a look at what the Add-Migration command actually did.

A new Migrations folder was added to our project with 3 initial files.

[Image: Migrations folder contents]

The file whose name is a number followed by the name we provided, Initial, is our actual migration file. The number at the beginning of the file name is a timestamp in the format yyyyMMddHHmmss, and it is used to sequence the order in which migrations are applied. A smaller number is always an earlier date and will be applied prior to larger numbers.

Application Snapshot

The ModelSnapshot file is used by the migrations tool to represent the current state of the application model, or rather the state of the model the last time we ran the Add-Migration command. By comparing the current model against this snapshot, the migrations tool can automatically detect any changes in our application model so that new migrations can be created.

It’s important to know that this file is here and what it is for, but for the most part we will leave it alone.

The Migration File

There are two parts to every migration: the Up and the Down. The Up logic is executed when the migration is applied and the Down logic is executed when the migration is removed or rolled back. One of the key principles of database migrations is that every migration can clean up after itself when it is removed.

In the Up method we see a series of migrationBuilder.CreateTable(…) method calls that are used to build our tables. The table and column creation is pretty straightforward, but it is always a good idea to look at the migration file after it is generated to make sure it is doing what we actually expect it to do. Looking at our migration file, we will notice that we are not providing any lengths for our string values. Take the Customers table for example.

migrationBuilder.CreateTable(
    name: "Customers",
    columns: table => new
    {
        Id = table.Column<Guid>(nullable: false),
        Address = table.Column<string>(nullable: true),
        City = table.Column<string>(nullable: true),
        Email = table.Column<string>(nullable: true),
        Name = table.Column<string>(nullable: true),
        Notes = table.Column<string>(nullable: true),
        NotifyAdHoc = table.Column<bool>(nullable: false),
        Phone = table.Column<string>(nullable: true),
        State = table.Column<string>(nullable: true),
        Timestamp = table.Column<DateTime>(nullable: false),
        Zip = table.Column<string>(nullable: true)
    },
    constraints: table =>
    {
        table.PrimaryKey("PK_Customers", x => x.Id);
    });
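The generated Down method mirrors the Up method with a series of migrationBuilder.DropTable(…) calls, along these lines (only the Customers table shown here):

protected override void Down(MigrationBuilder migrationBuilder)
{
    // Rolls back the Up logic by dropping the tables it created
    migrationBuilder.DropTable(name: "Customers");
    // ...one DropTable call for each table created in Up
}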

Also, all of the string values are nullable, which is probably not what we really want.

We could manually modify the migration class to set the field lengths and nullability, but Entity Framework Core still would not know about the constraints nor be able to enforce them. A better option is to use Data Annotations attributes to decorate our model classes. This way Entity Framework Core, the migrations tool, and the database all have the same understanding of our data model.

Before we do that, let’s go back to the package manager console and remove the migration that we just created. We don't really want it the way it is currently.

PM> Remove-Migration

Adding Data Annotations

At this point, there are two things we want to accomplish. First is to define the length of our string properties. Second is to define which properties are required and which can be null. Once again we will use the Customer class.

We start by adding a using statement for DataAnnotations.

using System.ComponentModel.DataAnnotations;

Now we can add attributes to our properties to indicate the max length of our properties and if the properties are required. We can do this using the Required and MaxLength attributes.

[Required]
[MaxLength(125)]
public String Name { get; set; }

If we want to generate an nvarchar(max) column we can pass int.MaxValue as the max length to tell Entity Framework to create a large text field.

[MaxLength(int.MaxValue)]
public String Notes { get; set; }

The finished Customer class should look something like this.

public class Customer : BaseModel
{
    [Required]
    [MaxLength(125)]
    public String Name { get; set; }

    [MaxLength(254)]
    public String Email { get; set; }

    [MaxLength(20)]
    public String Phone { get; set; }

    [MaxLength(200)]
    public String Address { get; set; }

    [MaxLength(200)]
    public String City { get; set; }

    [MaxLength(55)]
    public String State { get; set; }

    [MaxLength(15)]
    public String Zip { get; set; }

    [MaxLength(int.MaxValue)]
    public String Notes { get; set; }

    public bool NotifyAdHoc { get; set; }
}
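The Id and Timestamp columns in the generated migration come from the BaseModel class that Customer inherits. Judging by those column types, BaseModel is presumably something like this:

public class BaseModel
{
    // Maps to the uniqueidentifier primary key column
    public Guid Id { get; set; }

    // Maps to the datetime2 column
    public DateTime Timestamp { get; set; }
}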

Once we recreate the migration using the Add-Migration Initial command, we end up with a migration class for the Customers table that looks more like what we are expecting.

migrationBuilder.CreateTable(
    name: "Customers",
    columns: table => new
    {
        Id = table.Column<Guid>(nullable: false),
        Address = table.Column<string>(maxLength: 200, nullable: true),
        City = table.Column<string>(maxLength: 200, nullable: true),
        Email = table.Column<string>(maxLength: 254, nullable: true),
        Name = table.Column<string>(maxLength: 125, nullable: false),
        Notes = table.Column<string>(maxLength: 2147483647, nullable: true),
        NotifyAdHoc = table.Column<bool>(nullable: false),
        Phone = table.Column<string>(maxLength: 20, nullable: true),
        State = table.Column<string>(maxLength: 55, nullable: true),
        Timestamp = table.Column<DateTime>(nullable: false),
        Zip = table.Column<string>(maxLength: 15, nullable: true)
    },
    constraints: table =>
    {
        table.PrimaryKey("PK_Customers", x => x.Id);
    });

Apply the Migration

Once we have updated all the model classes to generate the migration file we need, we can return to the Package Manager Console to apply the migration to our local database using the Update-Database command.

PM> Update-Database

This command will create the database and add the tables in our initial migration. The command also creates a table named __EFMigrationsHistory for tracking which migrations have been applied to this database.

[Image: Database tables after applying the migration]

The ability to use the migrations history table to know which version of the application a particular database instance is on is one of the many reasons I love database migrations. Without such a table, I would have to compare schemas or look for key tables to figure out which version of our schema a particular database instance is running.

Scripting the Migrations

When we don’t have access to our production database and need to provide a SQL script to a DBA or release person, we can generate a SQL script directly from our migrations using the Script-Migration command. This command takes optional From and To arguments that can be used to script only a portion of our migrations. I also like to add the -Idempotent flag so that the script can run multiple times on the same database without issue.

PM> Script-Migration -Idempotent
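For example, to script everything after the Initial migration up to some later migration (the AddWalkRating name here is purely hypothetical):

PM> Script-Migration -From Initial -To AddWalkRating -Idempotent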

The SQL for our Customers table looks like this.

CREATE TABLE [Customers] (
    [Id] uniqueidentifier NOT NULL,
    [Address] nvarchar(200),
    [City] nvarchar(200),
    [Email] nvarchar(254),
    [Name] nvarchar(125) NOT NULL,
    [Notes] nvarchar(max),
    [NotifyAdHoc] bit NOT NULL,
    [Phone] nvarchar(20),
    [State] nvarchar(55),
    [Timestamp] datetime2 NOT NULL,
    [Zip] nvarchar(15),
    CONSTRAINT [PK_Customers] PRIMARY KEY ([Id])
);

Nice! Now we are getting somewhere.

When we execute the Script-Migration command from the package manager console, the script opens in a temporary file that we have to save manually. In our day-to-day work, we would likely want to automate the script generation as part of our build process.

Manipulating Data

Once we have injected our data context into our controllers, querying and updating data becomes very easy. We can update the actions in our home controller to interact with the database instead of returning static example data.

Get Customer

public async Task<IActionResult> GetCustomer(Guid id)
{
    var customer = await _dataContext.Customers.SingleOrDefaultAsync(x => x.Id == id);

    if (customer == null)
    {
        return NotFound();
    }

    return Ok(customer);
}

Add Customer

public async Task<IActionResult> Post([FromBody, Required] Customer model)
{
    if (!ModelState.IsValid)
    {
        return BadRequest(ModelState);
    }

    model.Timestamp = DateTime.UtcNow;

    _dataContext.Customers.Add(model);
    await _dataContext.SaveChangesAsync();

    return Created($"/api/customers/{model.Id}", model);
}

Delete a Customer

public async Task<IActionResult> Delete([FromRoute] Guid id)
{
    var customer = await _dataContext.Customers.SingleOrDefaultAsync(x => x.Id == id);
    if (customer == null)
    {
        return NotFound();
    }

    _dataContext.Customers.Remove(customer);
    await _dataContext.SaveChangesAsync();

    return Ok();
}
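
Get All Customers

The Swagger test in the next section also exercises a get-all endpoint. Following the same pattern as the actions above, a minimal sketch of that action:

public async Task<IActionResult> Get()
{
    // ToListAsync executes the query and materializes the full customer list
    var customers = await _dataContext.Customers.ToListAsync();
    return Ok(customers);
}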

Testing With Swagger

Our controllers are now talking to the database, so let's give everything a little test run using the Swagger UI we created for our REST API back in Part 2.

[Image: Swagger UI POST request creating a new customer]

And here is the response from getting all customers after adding the new customer.

[Image: Swagger UI response listing all customers]

This isn't 'real' testing, but it shows we are communicating from our REST API all the way down to the database and back.

We can update all the other controllers to work similarly to what we have done for the customers controller and be well on our way.

Next Steps

We now have data access, something almost every web application needs, but it is very bad form to leave your data open to the world. So, our next step is to add some security to our application using ASP.NET Core Identity.

[Image: Boise Code Camp]

Also, don't forget to register for Boise Code Camp or come and present a session yourself.