Applying Entity Framework Migrations to a Docker Container

I’m going to run through how to deploy an API and a database into two separate Docker containers and then apply Entity Framework migrations, which will create and populate the database with the correct schema and reference data. My assumption was that EF migrations would be a straightforward way to initialise a database. It wasn’t that easy. I’m going to go through the failed attempts as I think they are instructive. I know that most people just want the answer – so if that’s you then jump to the end and it’s there.

Environment

I’m using a .NET Core 3.1 API with Entity Framework Core and the database is MySQL. I’ll also touch on how you would do it with Entity Framework 6. The Docker containers are Windows, though as it’s .NET Core and MySQL you could use Linux as well if needed.

The demo project is called Learning Analytics and it’s a simple student management application. It’s just what I’m tinkering around with at the moment.

Deploying into Docker without migrations

The DockerFile is

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-nanoserver-1903 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

and there is a docker-compose.yml file to bring up the API container above and the database …

services:
  db:
    image: dockersamples/tidb:nanoserver-sac2016
    ports:
      - "49301:4000"

  app:
    image: learninganalyticsapi:dev
    build:
      context: .
      dockerfile: LearningAnalytics.API\Dockerfile
    ports:
      - "49501:80"
    environment:
      - "ConnectionStrings:LearningAnalyticsAPIContext=Server=db;Port=4000;Database=LearningAnalytics;User=root;SslMode=None;ConnectionReset=false;connect timeout=3600"     
    depends_on:
      - db

networks:
  default:
    external:
      name: nat

If I go to the directory containing the docker-compose.yml file and run

docker-compose up -d

I’ll get the database and the API up. I can browse to the API at a test endpoint (the API is bound to port 49501 in the docker-compose file)

http://localhost:49501/test

but if I try to access the API and get a list of students at

http://localhost:49501/api/student

then the application will crash because the database is blank. I haven’t done anything to populate it. I’m going to use migrations to do that.

Deploying into Docker with migrations – what doesn’t work

I thought it would be easy but it proved not to be.

Attempt 1 – via docker-compose

My initial thought was run the migrations as part of the docker-compose file using the command directive. So in the docker-compose file

  app:
    image: learninganalyticsapi:dev
    build:
      context: .
      dockerfile: LearningAnalytics.API\Dockerfile
    ports:
      - "49501:80"
    environment:
      - "ConnectionStrings:LearningAnalyticsAPIContext=Server=db;Port=4000;Database=LearningAnalytics;User=root;SslMode=None;ConnectionReset=false;connect timeout=3600"     
    depends_on:
      - db
    command: ["dotnet", "ef", "database", "update"]

The app container depends on the database (depends_on) so Docker Compose will bring them up in dependency order. However, even though the app container comes up after the db container, it isn’t necessarily ‘ready’. The official documentation says

However, for startup Compose does not wait until a container is “ready” (whatever that means for your particular application) – only until it’s running.

So when I try to run Entity Framework migrations against the db container from the app container, it fails. The db container isn’t ready, and isn’t guaranteed ever to be.
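As an aside, Compose does have a mechanism for this: give the db container a healthcheck and make the app container depend on it being healthy via the long form of depends_on. A sketch of what that could look like – the healthcheck command here is a placeholder assumption and would need replacing with a real readiness probe for the database image, and support for the condition form varies between Compose file format versions:

```yaml
services:
  db:
    image: dockersamples/tidb:nanoserver-sac2016
    healthcheck:
      # assumption: replace with a real probe for your db image,
      # e.g. a client ping against the database port
      test: ["CMD", "cmd", "/c", "echo ok"]
      interval: 10s
      retries: 10

  app:
    # long-form depends_on: wait until the db healthcheck passes
    depends_on:
      db:
        condition: service_healthy
```

Even with this, the migration still has to run from somewhere with the EF tooling and source available, which is the next problem.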

Attempt 2 – via interactive shell

I therefore thought I could do the same but run it afterwards via an interactive shell (details of using an interactive shell are here). The idea was that I could wrap all this up in a PowerShell script looking like this

docker-compose up -d
docker exec learninganalytics_app_1 c:\migration\LearningAnalytics.Migration.exe

but this doesn’t work because

  1. the container doesn’t have the SDK installed as part of the base image, so the dotnet command isn’t available. This is resolvable
  2. EF Core migrations need the source code to run. We only have the built application in the container, as it should be. This sucks and isn’t resolvable

Attempt 3 – via the Startup class

I’m coming round to the idea that there is going to have to be some kind of code change in the application. I can apply migrations easily via C#, so in the Startup class I could do

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
	using (var serviceScope = app.ApplicationServices.GetService<IServiceScopeFactory>().CreateScope())
	{
		var context = serviceScope.ServiceProvider.GetRequiredService<MyDatabaseContext>();
		context.Database.Migrate();
	}
	
	//.. other code
}	

This does work, but it isn’t great. My application is going to apply migrations every time it starts – not very performant. I don’t like it.
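If you did want to stick with the Startup approach, a halfway house is to gate the migration behind a configuration setting so it only runs when explicitly requested. A sketch – ApplyMigrations is a made-up setting name, and this assumes the Startup class exposes the usual Configuration property:

```csharp
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
	// ApplyMigrations is a hypothetical setting, e.g. set as an
	// environment variable in docker-compose.yml
	if (Configuration.GetValue<bool>("ApplyMigrations"))
	{
		using (var serviceScope = app.ApplicationServices.GetService<IServiceScopeFactory>().CreateScope())
		{
			var context = serviceScope.ServiceProvider.GetRequiredService<MyDatabaseContext>();
			context.Database.Migrate();
		}
	}

	//.. other code
}
```

It still couples a deployment concern to application startup though, so I didn’t pursue it.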

Deploying into Docker with migrations – what does work

The resolution is a combination of the failed attempts. The principle is

  1. Provide a separate utility that can run migrations
  2. Deploy this into the Docker application container, into its own folder
  3. Run it after docker-compose
  4. Wrap it all up in a PowerShell script

EF Migration Utility

This is a simple console app that references the API. The app is

class Program
{
	static void Main(string[] args)
	{
		Console.WriteLine("Applying migrations");
		var webHost = new WebHostBuilder()
			.UseContentRoot(Directory.GetCurrentDirectory())
			.UseStartup<ConsoleStartup>()
			.Build();

		using (var context = (DatabaseContext) webHost.Services.GetService(typeof(DatabaseContext)))
		{
			context.Database.Migrate();
		}
		Console.WriteLine("Done");
	}
}
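One thing worth noting: run immediately after docker-compose up, this console app can hit the same readiness problem as attempt 1, so in practice it pays to wrap the Migrate() call in a simple retry. A sketch of the body of the using block above – the attempt count and delay are arbitrary:

```csharp
// Retry Migrate() because the db container may be running but not yet ready
var attempts = 0;
while (true)
{
	try
	{
		context.Database.Migrate();
		break;
	}
	catch (Exception ex) when (++attempts < 10)
	{
		Console.WriteLine($"Database not ready ({ex.Message}), retrying in 5 seconds");
		Thread.Sleep(TimeSpan.FromSeconds(5));
	}
}
```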

and the Startup class is a stripped-down version of the API’s Startup

public class ConsoleStartup
{
	public ConsoleStartup()
	{
		var builder = new ConfigurationBuilder()
			.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
			.AddEnvironmentVariables();
		Configuration = builder.Build();
   }

	public IConfiguration Configuration { get; }

	public void ConfigureServices(IServiceCollection services)
	{
		services.AddDbContext<DatabaseContext>(options =>
		{
			options.UseMySql(Configuration.GetConnectionString("LearningAnalyticsAPIContext"));

		});
	}

	public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
	{
   
	}
}

I just need the Startup to read appsettings.json and get the database context up, which this does. The console app references the API so it can use the API’s config files, meaning I don’t have to key the configuration into the console app twice.
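For reference, one way to get the API’s appsettings.json into the console app’s output folder is to link it in the migration project file – an illustration only, with the relative path depending on your solution layout:

```xml
<ItemGroup>
  <!-- link the API's config file so it is copied alongside the console app -->
  <None Include="..\LearningAnalytics.API\appsettings.json" Link="appsettings.json">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```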

DockerFile amends

The Dockerfile needs to be amended to deploy the migration application into a separate folder on the app container’s file system. It becomes

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-nanoserver-1903 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS migration
WORKDIR /src
COPY . .
RUN dotnet restore "LearningAnalytics.Migration/LearningAnalytics.Migration.csproj"
WORKDIR "/src/LearningAnalytics.Migration"
RUN dotnet build "LearningAnalytics.Migration.csproj" -c Release -o /app/migration

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /migration
COPY --from=migration /app/migration .

WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

the relevant part is

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS migration
WORKDIR /src
COPY . .
RUN dotnet restore "LearningAnalytics.Migration/LearningAnalytics.Migration.csproj"
WORKDIR "/src/LearningAnalytics.Migration"
RUN dotnet build "LearningAnalytics.Migration.csproj" -c Release -o /app/migration

which builds out the migration application and …

FROM base AS final
WORKDIR /migration
COPY --from=migration /app/migration .

which copies it into a folder on the published container called migration.

Glue it together with PowerShell

Once the containers are brought up with docker-compose, it’s straightforward to use an interactive shell to navigate to LearningAnalytics.Migration.exe and run it. That will initialise the database. A better solution is to wrap it all up in a simple PowerShell script, e.g.

docker-compose up -d
docker exec learninganalytics_app_1 c:\migration\LearningAnalytics.Migration.exe

and run that. The containers come up and the database is populated with the correct schema and reference data via EF migrations. The API now works correctly.

Entity Framework 6

The above is all for Entity Framework Core. Entity Framework 6 introduced the migrate.exe tool, which can apply EF migrations without the source code – the major stumbling block for EF Core. Armed with this, you could copy the tool up to the container and perform the migrations with something like

docker exec learninganalytics_app_1 migrate.exe

Do Migrations suck though?

This person thinks so. Certainly the inability to run them against compiled code is a huge drag. Whenever I write a production application I prefer to just write out the SQL for the schema and apply it with some PowerShell. It’s not that hard. I like to use migrations for personal projects, but there must be a reason I’m not using them when I get paid to write code. Do I secretly think they suck just a little?

Demo code

As ever, demo code is on my GitHub site

https://github.com/timbrownls20/Learning-Analytics/tree/master/LearningAnalytics/LearningAnalytics.Migration
is the migration app

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/LearningAnalytics.API/DockerfileMigrations
the DockerFile

https://github.com/timbrownls20/Learning-Analytics/tree/master/LearningAnalytics
for the docker-compose.yml file and the simple PowerShell that glues it together

Useful links


This Stack Overflow question was the starting point for a lot of this, and this answer in particular has a good discussion and some other options on how to achieve it – none of them massively satisfactory. I felt something like what I’ve done was about the best.

https://docs.docker.com/compose/startup-order/
discusses why you can’t rely on the depends_on directive to make the database available to the application when you are bringing up the containers. It has more possibilities to circumvent this, such as wait-for-it. I’m certainly going to look at these, but they do seem scoped to Linux rather than Windows, so I’d have to change the Docker files around for that. They also wouldn’t help with Entity Framework 6 or earlier.

Browsing the File System in Windows and Linux Docker Containers

I’ve written a few posts about Docker now, so I thought I would step back and write a set of instructions on how to browse the file system via an interactive shell on a running container. Although it’s basic, I’d like to reference these kinds of instructions in other posts so I can avoid repeating myself. Also, people need simple guides to basic processes anyway – just watch me with a power drill and you’ll see someone in dire need of a basic guide.

Environment

I’m running Docker on a Windows machine but I’ll be bringing up both Windows and Linux containers.

The test project is a simple .Net Core project for managing student tests which I’ve ambitiously called Learning Analytics.

Windows Container

Using this simple DockerFile

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-nanoserver-1903 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

Build it into an image

docker build . -f "LearningAnalytics.API\DockerFile" -t learninganalyticsapi:dev

It will be named learninganalyticsapi and tagged dev.

Now run the image as a container called learninganalyticsapi_app_1 in detached mode.

docker run -d -p 80:80 --name learninganalyticsapi_app_1 learninganalyticsapi:dev

It’s going to bind the API to port 80 of the host. Assuming there is nothing already bound to port 80, I can navigate to a test page here

http://localhost/test

And I will get a test message which confirms the container is up and running.

Now run the cmd shell in interactive mode

docker exec -it learninganalyticsapi_app_1 cmd

Now we are on the running container itself, so running these commands

cd ..
dir

will navigate up to the root of the container and list the top-level directories.

Obviously now I’ve got an interactive shell I can do anything that shell supports. Browsing files is just an easy example.

Once I’m done then type exit to end the interactive session and I’m back to the host.

Linux Container

So same again for a Linux container. It’s going to be pretty similar

Using this simple Docker file

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-buster-slim AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-buster AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

Build and run the container

docker build . -f "LearningAnalytics.API\DockerFile" -t learninganalyticsapi:dev

docker run -d -p 49501:80 --name learninganalyticsapi_app_2 learninganalyticsapi:dev

The only difference here is that I’ve bound it to a different port on the host – I’m working against port 49501. That’s just because port 80 is already in use from the first example; if I use port 80 again then I get binding errors. So the test page for the Linux box is at

http://localhost:49501/test

Also the name of the container is learninganalyticsapi_app_2 to differentiate it from the Windows one which is already there from the first example.

Now bring up the shell, which is bash for Linux

docker exec -it learninganalyticsapi_app_2 bash

Now go to the root and list files. Slightly different commands than before

cd ..
ls

and we get a listing of the folders at the root of the Linux container.

As before type exit to end the interactive shell and return to the host.

Demo Code

As ever, the source code is on my GitHub site

https://github.com/timbrownls20/Learning-Analytics/tree/master/LearningAnalytics

It’s just an API with a MySQL database; I’m only bringing up the API container for this demo. The Windows Dockerfile is

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/LearningAnalytics.API/DockerfileWindows

and the Linux one is

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/LearningAnalytics.API/DockerfileLinux

You could do similar to the above but replace the build and run steps with a docker-compose.yml file. An example is here

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/docker-compose.yml

which brings up the API container and one for the database. The principle is the same though.

Artificial Intelligence A-Z

I’m just getting back into the blog after a break. In that spirit, I’m backfilling posts about some of the stuff I’ve read, watched and listened to during that break.

My boss told me that anyone who doesn’t know about AI will be left behind in 5 years’ time. I’m hoping I’ll still be a software developer by then, so to make sure I bought and watched artificial-intelligence-az from Udemy. It was a good choice. The course is simple enough for a noob to follow and in-depth enough for an experienced software developer to feel the benefit of. The applications work up to an AI capable of playing Doom. It is fascinating to see the AI bot make decisions that seem counter-intuitive but actually work and are the best choice. The course is 16.5 hours of videos, but if you install all the demos and get them working it takes way longer – the longer the better as far as I’m concerned. It’s all Python, but any software developer can follow the code – it’s all explained, and it’s a good opportunity to brush up on Python skills anyway.

I think it cost me about $15, so good value. Udemy has a weird thing where some courses are very cheap, but then if you check back (or are an existing user) they’ve rocketed in price to $100 plus. I guess it’s just their business model. It’s said that no-one pays full price at Pizza Express, so in the same vein I think no-one should pay full price at Udemy. Always go armed with a voucher or a first-time user reduction. With a suitable price reduction, this course is really worth the investment.

In Praise of the Marquee Tag

I’ve created a few internal tools for various tasks over the years. I tend to pop a web front end on them. I also like to pop on a marquee tag somewhere nice and visible so I can marvel at its scrolling majesty. I do it just to amuse myself then I sit back and wait for someone to notice then shout at me to not be so ridiculous and take off the ludicrous, retro, badly supported tag.

Oddly that doesn’t happen. I’ve checked back on the tools after several years of use and the text is still there, loyally scrolling away. I think the tag has a surprising number of supporters and actually there is a deep human need for easy to implement scrolling text. I imagine that other people in the company enjoy gazing hypnotically at the jerky scroll, just as I do.

I reckon I’m going for the blink tag next. An even more popular choice.

NuGet restore failing in Docker Container

I was tempted to write about this before, but I didn’t as there is already a very good, highly rated Stack Overflow answer with the solution. However, I’m just reinstalling Docker Desktop and getting things working again, and I wish I had written this stuff down because I’ve forgotten it. One of the many reasons to write blog posts is to fix stuff in my memory and to serve as my own personal development notes. So in that spirit…

The Problem

We have a very simple .Net Core MVC solution.

It has the following NuGet packages

Install-Package NugetSample.NugetDemo.Demo -Version 1.0.0
Install-Package bootstrap -Version 4.5.0

With this DockerFile to containerise it

FROM mcr.microsoft.com/dotnet/core/aspnet AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk AS build
WORKDIR /src
COPY ["Template.Web.csproj", "Template.Web/"]
RUN dotnet restore "Template.Web/Template.Web.csproj"
COPY . .
WORKDIR "/src/Template.Web"
RUN dotnet build "Template.Web.csproj" -c Release -o /app

FROM build AS publish
RUN dotnet publish "Template.Web.csproj" -c Release -o /app

FROM base AS final
WORKDIR /app
COPY --from=publish /app .
ENTRYPOINT ["dotnet", "Template.Web.dll"]

We go to the directory with the Dockerfile and try to build it into an image with

docker build .

It fails on the dotnet restore step with this error

C:\Program Files\dotnet\sdk\3.1.302\NuGet.targets(128,5): error : Unable to load the service index for source https://api.nuget.org/v3/index.json. [C:\src\Template.Web\Template.Web.csproj]
C:\Program Files\dotnet\sdk\3.1.302\NuGet.targets(128,5): error :   No such host is known. [C:\src\Template.Web\Template.Web.csproj]
The command 'cmd /S /C dotnet restore "Template.Web/Template.Web.csproj"' returned a non-zero code: 1

NuGet is failing us

The Cause

The container doesn’t have connectivity to the internet so it can’t bring down the packages. We can see this clearly by building this very simple Dockerfile

FROM mcr.microsoft.com/dotnet/core/sdk
RUN ping google.com

The ping fails. The host (my development machine) does have internet access – I would have noticed if that had gone down and I would be hysterically ringing Telstra (again). So it’s something specific to the container.

The Resolution

The DNS server configured in the container is wrong. To fix it, hardcode the DNS servers into Docker, i.e. put this JSON

"dns": ["10.1.2.3", "8.8.8.8"]

into the Docker daemon settings (daemon.json). In Docker Desktop this is under Settings > Docker Engine.

Then restart the Docker service. The container now has internet access, NuGet restore will work, and we can containerise our very simple web application.
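For clarity, the dns key sits at the top level of the daemon configuration file, so the complete setting looks something like this – 10.1.2.3 is a stand-in for your local DNS server, and 8.8.8.8 is Google’s public resolver as a fallback:

```json
{
  "dns": ["10.1.2.3", "8.8.8.8"]
}
```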

Demo Code

As ever, the demo code is on my GitHub site

The very simple application
https://github.com/timbrownls20/Demo/tree/master/ASP.NET%20Core/Template

and its docker file
https://github.com/timbrownls20/Demo/blob/master/ASP.NET%20Core/Template/Template.Web/Dockerfile

Docker file for the internet test
https://github.com/timbrownls20/Demo/blob/master/Docker/InternetTest/DockerFile

Useful Links

This Stack Overflow answer has the resolution with a very good explanation. It also has other (probably better) ways to fix this, as well as resolutions to other Docker network issues you may face.