Cellular Automata with React

Conway’s Game of Life

Here is a fun thing – above is a cellular automaton coded (by me) in React. Specifically, it is Conway’s Game of Life.

Rules

The rules are

  1. Any live cell with two or three live neighbours survives.
  2. Any dead cell with three live neighbours becomes a live cell.
  3. All other live cells die in the next generation. Similarly, all other dead cells stay dead.
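The three rules reduce to a single pure function of a cell’s current state and its live-neighbour count. A minimal sketch (the function name is mine, not from the repo):

```javascript
// Returns whether the cell is alive in the next generation.
// alive: boolean, liveNeighbours: integer 0..8
function nextCellState(alive, liveNeighbours) {
  if (alive) {
    // Rule 1: survive with two or three live neighbours; Rule 3: otherwise die.
    return liveNeighbours === 2 || liveNeighbours === 3;
  }
  // Rule 2: a dead cell with exactly three live neighbours comes to life.
  return liveNeighbours === 3;
}
```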

With these rules you get an impressive array of emergent behaviour. The example above is the evolution of the R-pentomino. The starting point is this simple five-square pattern

Which develops into the above pattern that lasts for well over two thousand generations. If you look carefully, you’ll see simpler patterns within it, such as the glider

which propagates across the screen until it is stopped by another object. If it isn’t stopped it will go on forever.

Live Application

three gliders gliding

I’ve published the game of life application here for anyone to gaze at

http://gameoflife.codebuckets.com.au/

You can set the size of the grid, interval between generations and the zoom. The grid wraps so it’s interesting to try the same pattern on different sizes and watch the effect of the available space on how the game develops.
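The wrap-around itself is just modular arithmetic over the 1-based row and column indices. A sketch of the idea (not the repo’s exact code):

```javascript
// Wraps a 1-based index onto an axis of size max:
// 0 wraps to max, and max + 1 wraps back to 1.
const wrap = (i, max) => ((i - 1 + max) % max) + 1;

wrap(0, 10);  // 10 – off one edge, re-enter at the other
wrap(11, 10); // 1
```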

Code

It’s coded up in React and is published here

https://github.com/timbrownls20/Game-Of-Life

and is freely available to download and tinker around with.

The Game of Life rules are held within one file

https://github.com/timbrownls20/Game-Of-Life/blob/master/src/hooks/transformers/gameOfLife.js

import cellUtil from "../../utils/cellUtil";

const gameOfLife = (gameState, gridSettingState, gameStateDispatch) => {
  let arrTransformed = [];

  gameState.grid.forEach((row) => {
    row.forEach((item) => {
      if (item) {
        let rowUp = item.row <= 1 ? gridSettingState.rows : item.row - 1;
        let rowDown = item.row >= gridSettingState.rows ? 1 : item.row + 1;
        let columnUp =
          item.column <= 1 ? gridSettingState.columns : item.column - 1;
        let columnDown =
          item.column >= gridSettingState.columns ? 1 : item.column + 1;

        let neighbours = [
          { row: rowUp, column: columnUp },
          { row: rowUp, column: item.column },
          { row: rowUp, column: columnDown },
          { row: item.row, column: columnUp },
          { row: item.row, column: columnDown },
          { row: rowDown, column: columnUp },
          { row: rowDown, column: item.column },
          { row: rowDown, column: columnDown },
        ];

        let activeNeighbours = neighbours.reduce((acc, searchItem) => {
          return gameState.grid[searchItem.row][searchItem.column].selected
            ? acc + 1
            : acc;
        }, 0);

        if (item.selected && (activeNeighbours < 2 || activeNeighbours > 3)) {
          addCellToTransform(cellUtil.deselectCell(item));
        } else if (!item.selected && activeNeighbours == 3) {
          addCellToTransform(cellUtil.selectCell(item));
        }
      }
    });
  });

  arrTransformed.forEach((element) => {
    gameState.grid[element.row][element.column] = { ...element };
  });

  gameState.generation = gameState.generation + 1;
  gameStateDispatch({ type: "set-state", value: gameState });

  function addCellToTransform(cellToAdd) {
    let existingCell = arrTransformed.find(
      (e) => e.column == cellToAdd.column && e.row == cellToAdd.row
    );
    if (existingCell) {
      // if two transforms disagree, the selected cell wins
      cellToAdd =
        existingCell.selected && !cellToAdd.selected ? existingCell : cellToAdd;
      arrTransformed = arrTransformed.filter((e) => e !== existingCell);
    }
    arrTransformed.push(cellToAdd);
  }
};

export default gameOfLife;
It would be easy to swap these out and try other cellular automaton rules. You can see two other, simpler transformations in the same folder that I wrote along the way.

So enjoy it if you choose to. In the age of Zoom it’s good to start a pattern then gaze at it hypnotically during online meetings. It makes you look very focused to your fellow meeting participants.

26 Tips for Localising your Application

Your clients

I’ve worked with applications with different language variants for years. Here are some things I’ve learnt along the way. This is written from the point of view of an English speaker, working with an English application. In honour of that, there are 26 tips – one for each letter of the alphabet in the en-GB culture.

Terms

Globalisation is the process of ensuring that your application can support multiple cultures and languages.

Localisation is the process of adapting your application to one target language. It includes but is not limited to translating the application.

Globalisation

1. Build globalisation into your application from the start

Modern programming frameworks come with support for globalisation and localisation but make sure you use it from the earliest opportunity. It’s way harder to retrofit this capability and frankly it’s a huge slog. If you must do it – get someone else in to do the work. I’m only half joking.

2. Extended Character Sets

You’ll have to use extended character sets to support more than the 26 letters and 10 numbers you are used to. Accents and umlauts and so forth or perhaps an entire Cyrillic alphabet. UTF-8 is the correct choice on the web and make sure that your database can store it correctly – nvarchar for SQL Server for example.

Weirdly, people can rapidly occupy the moral high ground when talking about text encoding and you wonder if you are mad or stupid for even mentioning it. But unless you actively monitor it – there will be somewhere in your application that doesn’t properly support it and you will find that out at a very inopportune moment.
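One cheap way to monitor it is to round-trip text with extended characters through the encoding your storage layer claims to use. In Node, for instance:

```javascript
// If any layer silently re-encodes as Latin-1 or similar, a round-trip
// like this is where the corruption first shows up.
const original = "Grüße – Привет – naïve";
const roundTripped = Buffer.from(original, "utf8").toString("utf8");
console.log(roundTripped === original); // true when the pipeline is UTF-8 clean
```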

3. String length and truncated text

Translating text often results in longer or shorter strings. German is notorious for super long words for instance. Short strings might make your application look a bit weird. Long strings might truncate unhelpfully in your UI or might break your UI layout completely. Long strings can also break your database if the field length isn’t long enough.

Design your database and UI to deal with a range of string sizes as soon as you can.
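A simple guard is to check provisional translations against your field and UI limits before they reach the database. A hypothetical shape:

```javascript
// Flag any translation that would overflow a column or truncate in the UI.
const maxLength = 30;
const candidates = [
  { key: "save.button", text: "Speichern" },
  { key: "law.title", text: "Rindfleischetikettierungsüberwachungsaufgabenübertragungsgesetz" },
];
const tooLong = candidates.filter((c) => c.text.length > maxLength);
// tooLong contains only the famously long German compound noun
```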

4. UTC Dates

If possible, store your dates as UTC format – Coordinated Universal Time or equivalent. Store it with the user’s time zone and you’ll be able to reconstruct the actual time and compare it to other time zones. Without it, you’ll just know something important happened at 18.00. Which depending on the time zone you are in could be any time at all.

This might not seem important when shipping to the UK. It’s a tiny, tiny, crowded island where the natives only have one time zone to share between them. Then you sell to Australia with five time zones and UTC dates suddenly seem much more important.

Generally, dates are complex – the Babylonians didn’t invent our calendar to make it easy for software developers. But make it easier for yourself wherever you can.
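In JavaScript terms, that means persisting the UTC instant alongside the user’s IANA time zone and letting the platform reconstruct the local wall-clock time. A sketch with made-up data:

```javascript
// Persist the instant in UTC plus the zone it happened in.
const event = {
  occurredAtUtc: "2021-06-01T08:00:00Z",
  timeZone: "Australia/Sydney", // AEST, UTC+10 in June
};

// Reconstruct the local wall-clock time for display.
const local = new Intl.DateTimeFormat("en-GB", {
  timeZone: event.timeZone,
  hour: "2-digit",
  minute: "2-digit",
  hour12: false,
}).format(new Date(event.occurredAtUtc));
// local === "18:00"
```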

5. Database text

This is another obvious one, but any text in your database will also need translating. It’s easy to get so enamoured with all the amazing support that your UI technology has for globalisation that you forget that 75% of the text in your application comes out of a database. Likely you’ll need a separate process to deal with it.

6. External APIs

If you use any external APIs for content, then you’ll have to make sure that they support all the target languages. If they don’t then find an alternative, translate the input yourself on the fly (not easy) or learn to live without them.

7. You might need a different UI entirely for non-roman languages

I’ve only ever worked with European languages – the standard ones. We once floated the idea of translating into Mandarin Chinese and everyone got nervous. I don’t think our UI would have withstood it.

If you are going to access those big, big markets then it’s a whole new level of globalisation – are you ready for it?

8. Do you need to do it?

It’s worth taking a step back and asking if it does need to be translated at all. If your application is in English and your target market is professional, then they might take it as is – particularly if they can get it quicker and cheaper. We sold into the Gulf states, and they were happy to take it in English – which is as well really. The option for them was English or nothing. They took English.

Localisation

9. Managing provisional translations

Translators are expensive and you probably only want to employ them at the end. You’ll likely end up putting provisional translations in during development and initial testing, then bundling your text off to translators later, when the application is more stable.

Anticipate that and devise a way to put in provisional translations, know that they are provisional and swap them out for confirmed translations when needed.
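One lightweight way to do that is to carry a status on every resource entry, so provisional machine translations can be listed and swapped out later. A hypothetical shape, not from any particular tool:

```javascript
const resources = [
  { key: "save.button", lang: "de-DE", text: "Speichern", status: "confirmed" },
  { key: "exit.button", lang: "de-DE", text: "Ausgang", status: "provisional" },
];

// Everything still awaiting a professional translator:
const forReview = resources.filter((r) => r.status === "provisional");
// forReview holds just the exit.button entry
```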

10. Language variants

You’ll want to make sure that you can properly distinguish language variants – Austrian and Swiss German, US and UK English – and know which words and phrases need a custom translation for that variant. As a minimum, it’s annoying for people to read phrases that aren’t from their culture e.g., color vs colour for the UK market. At worst, your application will seem unprofessional and even borderline illiterate.
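A common approach is to fall back from the requested variant to its base language rather than duplicating every string per variant. A minimal sketch (the function name is mine):

```javascript
// Pick the best match for a requested locale from the locales you actually ship.
function resolveLocale(requested, available) {
  if (available.includes(requested)) return requested;
  const base = requested.split("-")[0];
  return (
    available.find((l) => l === base || l.startsWith(base + "-")) || available[0]
  );
}

resolveLocale("en-US", ["en-GB", "en-US", "de"]); // exact match: "en-US"
resolveLocale("de-AT", ["en-GB", "en-US", "de"]); // falls back to "de"
```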

11. Don’t translate single words

Obvious maybe, but it just doesn’t work to translate single words and stitch them together. Slightly less obviously, your units of translation may have to be larger than you initially think – sentences rather than phrases, or even whole passages of text.

12. Don’t duplicate translations

This conflicts with a lot of other tips but if possible don’t duplicate translations. You’ll have a list (xml, json, csv, excel etc…) somewhere with all the terms you are translating. If you have 400 line items all of which are ‘Save’ you will delight your external translator. They will translate it once, copy it down 400 lines then charge you for 400 translations. Try to avoid delighting your external translator in this way.
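Deduplicating before export is only a few lines. A sketch, assuming a flat list of terms:

```javascript
// 400 copies of "Save" collapse to one entry; keys are mapped
// back to the shared translation afterwards.
const terms = ["Save", "Cancel", "Save", "Delete", "Save"];
const uniqueTerms = [...new Set(terms)];
// uniqueTerms → ["Save", "Cancel", "Delete"]
```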

13. Invest in a translation tool

Or perhaps don’t. It’s a big undertaking to localise an application and keep all the translations up to date, so take all the help you can get. Consider buying in any third-party product you think can help you.

However, when I looked there wasn’t anything amazing. We bought one and it did help but it wasn’t a panacea. I would look though and see if there is anything that fits your organisation’s requirements. You might have to code up a translation tool yourself.

14. Translate installation files

If you’ve got installation files, msi or other utilities you’ll want to translate those. At the very least be aware that you haven’t translated them, so you aren’t shocked when the first thing the client sees is an installer rattling away in English.

15. Date format

Date formats change depending on locality e.g. dd-mm-YYYY for the UK, mm-dd-YYYY for the US and YYYY-mm-dd for China. Make sure yours change too.
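Rather than hand-rolling a format string per locale, let the platform pick the order. In JavaScript, for example:

```javascript
const d = new Date(Date.UTC(2021, 3, 25)); // 25 April 2021 (months are 0-based)

new Intl.DateTimeFormat("en-GB", { timeZone: "UTC" }).format(d); // "25/04/2021"
new Intl.DateTimeFormat("en-US", { timeZone: "UTC" }).format(d); // "4/25/2021"
```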

More subtly, make sure you don’t accidentally translate formatting strings. For instance, the formatting string in

MyDate.ToString("dd/MM/yyyy");

If translated to German in Google Translate it would become

MyDate.ToString("TT/MM/JJJJ");

Which is nonsense as far as string formatting is concerned and will break your application.

16. Text on images

This is an amusing surprise – even when your UI, every database and every external data source have been translated and you are unveiling the application, there will still be an image somewhere with English text on it. Maybe it’s a shop sign, or a road sign, or maybe someone has photoshopped a great swathe of text over a large image.

Your images need to be culture neutral or be able to be swapped out for culture appropriate ones.

17. Job Titles

An interesting case is job titles. A job title in your home language might be entirely non-existent in your target language. You can’t translate something if it doesn’t exist.

As a broader point – concepts may differ or change in different cultures in non-obvious ways. Official documents will be different, how people identify themselves might be different even things like date of birth might have to become optional or be able to support ranges. There are places in the world where this information might not be available.

Testing

18. Your automated tests might break

So, you’ve spent a heap of time creating an automated suite with a truly astonishing test coverage. Well, they might break in your localised application. Behaviour driven testing and UI testing are particularly vulnerable to this but even the humble unit test could start to blink red at you. Factor this into your planning.

19. Don’t translate system error information and logging

Keep your logs and error messages in the language you actually speak. Your support staff will not thank you if faced with a 30 MB error log file in Hungarian.

20. Consider using test environments with language specific OS/Database etc…

Not mandatory by any means but you might want to consider setting up your test environment to have the target language for its operating system, databases and all other software. You might detect some hard-to-find bugs before it hits production.

Balanced against that, it might be a pain to maintain your own on-premise test servers in a radically different language. But with infrastructure on demand, it should be less of an issue and worth thinking about.

21. Account for increase in test time

Each language you add, even language variant, will add to your test effort. Budget for it and don’t over promise on your project estimations.

22. Employ testers with good language skills

Employ anyone with good language skills – they’re useful everywhere, but really useful in testing. Also, if your testers are good then you might be able to use fewer external translators which, as I’ve said before, can be expensive.

Broadly it’s just generally useful to speak the target language to some extent but, scarily, not actually necessary. I’m pretty poor at languages (high school Spanish and that’s it) but I’ve worked with French and German drug databases for years. Other than the odd embarrassing bug (an entire section of data in the wrong language that I didn’t notice), it hasn’t made much of a difference. But then I’m not a tester.

Final Words

23. Other things to watch for

There are many, many other things to bear in mind. Here are a few more

  1. Number separators – different in different cultures. Can your application support a comma for a decimal point?
  2. Currency symbols – need to be swappable and correct
  3. Post (zip) codes – even within very similar cultures they are radically different. The Australian postcode is 4 digits and the UK postcode is alphanumeric and up to 8 characters long with a space. Make sure your form validation can cope
  4. International telephone numbers – might be formatted differently
  5. Sorting might break – it did for me
  6. Gender in languages – get this wrong and your application looks like it’s been translated by a 5-year-old
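Number separators in particular have good platform support – the decimal point and the grouping separator swap over between cultures:

```javascript
const n = 1234.5;
new Intl.NumberFormat("en-GB").format(n); // "1,234.5"
new Intl.NumberFormat("de-DE").format(n); // "1.234,5"
```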

24. Google Translate is your friend

Google Translate is just an amazing help during initial development. Pump in all your English text and get provisional translations out.

This is really handy for spotting areas in your application that you have forgotten to translate, and it gives a good indication of what the difference in text length is doing to your UI. Also, if you have problems with your encoding then you’ll see it straightaway when faced with weird characters in your text. You just get an early view of what your application looks like in the target language, which is invaluable.

25. Google Translate is not your friend

But do not rely on Google Translate for your final translations and be very careful not to inadvertently leave it in the final release.

We had much hilarity from a client when skin peel (medical) was translated as lemon peel (cooking). They thought it was funny. Your client might think it’s woefully unprofessional. Don’t take the risk.

26. It’s a lifelong task

Translating an application is a pretty sizable task. Once done, there will be further releases and each one will have more stuff to translate. You need a good method to work the translations into your processes. The task will never end. Embrace it.

Good Luck

Not a tip – just good luck, buona fortuna, bonne chance and buena suerte* with your localisation endeavours. You are doing a good thing.

*All translations were provided by Google Translate and are provisional

Moving hosting to Hostinger

At least these guys don’t have to move their web hosting

I’ve been meaning to move the hosting for codebuckets for months now. My level of enthusiasm for moving hosting comes somewhere between going to the dentist and getting my annual taxes sorted. Actually, it’s less appealing than both those unappealing options. I just don’t want to do it.

But I have bitten the bullet, grabbed the greasy pig, put on my big boy panties and moved my hosting to Hostinger over the weekend. While there were a few wrinkles (my stylesheets completely disappeared at one point, leaving codebuckets looking like a website from the 90s), it all happened pretty smoothly – largely due to the excellent support over at Hostinger. So, it’s a big shout out to the support guy, Ignas from Lithuania, for spending a good chunk of time helping me out and getting the new configuration just as I wanted. Thank you.

The good thing is that now it’s over with some other sites I run, I can legitimately claim the hosting costs back against my business expenses – marvellous. And what’s more, I’ve hooked up an SSL certificate, which came free when I bought the hosting, so now codebuckets is finally https as it should have been all along.

So, job done. All I need to do now is sort out my 2020/21 taxes. Sadly, I can feel my enthusiasm draining away again!

Bootstrap Modal Dialog in React without JQuery

The Task

I want the modal dialog component working in a React project. My React project uses function components and hooks but it would be the same with class components.

What I want – a lovely modal dialog component

The Problem

To show the modal I need to call it thus

$('#myModal').modal('show')

It’s not going to work as is in React. There is no jQuery and I don’t want to be referencing DOM elements directly if I can avoid it.

The Solution

I’m going to roll my own and use the useState React hook to swap classes in and out to show and hide it.

Justification

I could import jQuery and use a useRef hook to get at the DOM. But I don’t want to install jQuery into my project just so I can have a modal dialog behaving appropriately. It should be possible without it.

Alternatively I could use React-Bootstrap components. But it feels a bit abstracted to me. In any case I just want the modal working not a heap of other bootstrap components. Also, I want to use the latest bootstrap and not be forced to use the earlier one in React-Bootstrap.

So I’m going to roll my own. No true programmer would save time by using third party components or by taking shortcuts. They do it themselves and take much longer about it. I shall do it this way **

** I use third party components and shortcuts all the time. I’m just not going to for this one.

Implementation

The full implementation is on my GitHub site

https://github.com/timbrownls20/Demo/tree/master/React/bootstrap-modal

so I recommend that you go straight there and copy the code down. It’s stripped down to basics and it’s pretty straightforward.

Modal Component

https://github.com/timbrownls20/Demo/blob/master/React/bootstrap-modal/src/components/Modal.jsx

import React from 'react';

const Modal = ({ children, show, hideModal }) => (
  <div
    className={`modal ${show ? ' modal-show' : ''}`}
    tabIndex="-1"
    role="dialog"
  >
    <div className="modal-dialog" role="document">
      <div className="modal-content">
        <div className="modal-header">
          <h5 className="modal-title">Modal Title</h5>
          <button
            type="button"
            className="close"
            data-dismiss="modal"
            aria-label="Close"
            onClick={hideModal}
          >
            <span aria-hidden="true">&times;</span>
          </button>
        </div>
        <div className="modal-body">
          {children}
        </div>
        <div className="modal-footer">
          <button type="button" className="btn btn-primary" onClick={hideModal}>
            Save
          </button>
          <button
            type="button"
            className="btn btn-secondary"
            data-dismiss="modal"
            onClick={hideModal}
          >
            Close
          </button>
        </div>
      </div>
    </div>
  </div>
);

export default Modal;

It’s a straight copy from the Bootstrap documentation with a prop to toggle the visibility and a function to respond to close and save.

State

https://github.com/timbrownls20/Demo/blob/master/React/bootstrap-modal/src/components/App.jsx

import React, { useState } from "react";
import Modal from "./Modal";
import ModalLauncher from "./ModalLauncher";
import '../css/modal.css';

function App() {
  const [show, setShow] = useState(false);

  const showModal = () => {
    setShow(true);
  };

  const hideModal = () => {
    setShow(false);
  };

  return (
    <div>
      <div className="container-fluid">
      <div className="d-flex justify-content-center align-content-center m-5">
            <div className="p-5 demo-text">front page text</div>
            <ModalLauncher showModal={showModal} />
            <div className="p-5 demo-text">More front page text</div>
        </div>
        <Modal show={show} hideModal={hideModal}>Modal content</Modal>
      </div>
    </div>
  );
}

export default App;

The show and hide state is set at the parent component level and is persisted through a state hook taking a boolean i.e.

const [show, setShow] = useState(false);

CSS

The detail of it is really in the CSS, and it’s the CSS that I had to spend time fiddling around with

https://github.com/timbrownls20/Demo/blob/master/React/bootstrap-modal/src/css/modal.css

.modal{
    display: block;
    visibility: hidden;
}

.modal-show
{
    visibility:visible;
    background-color: rgba(169, 169, 169, 0.8);
    transition: opacity 0.2s linear; 
} 

.modal-content 
{
    opacity: 0; 
}

.modal-show .modal-content 
{
    opacity: 1; 
    transition: 0.2s linear; 
} 

Getting the modal to appear and disappear is straightforward, but I want it to show with a transition to fade it in and out, otherwise it looks terrible. That’s a bit more awkward.

To do this, the modal styles are overridden. Out of the box, Bootstrap controls the modal’s visibility with the CSS display property. I’ve changed it to use the visibility property instead: if showing and hiding is managed by flipping display between none and block, then the transitions don’t work. Visibility does work with transitions, hence the change.

.modal-show is the class that we add and remove to show and hide the modal dialog. We use opacity to bring in the dialog. Interestingly if I use

.modal-show
{
    visibility:visible;
    background-color: rgba(169, 169, 169);
    opacity: 0.5;
    transition: opacity 0.2s linear; 
} 

Then the dialog and the background both have an opacity of 0.5 and I’ve got a see-through modal, which I definitely don’t want.

See through modal dialog. Nasty

Setting the opacity via the background-color alpha channel applies it only to the background overlay; the dialog itself isn’t affected and is not see-through at all.

The final wrinkle is that now the dialog doesn’t transition in, although the gray background overlay does. It actually looks OK to me (but then I’m not a designer so I don’t care hugely about this stuff). To get the dialog fading in and out it needs its own CSS transition, this time on opacity i.e.

.modal-content 
{
    opacity: 0; 
}

.modal-show .modal-content 
{
    opacity: 1; 
    transition: 0.2s linear; 
}

Now the background overlay fades in to 0.8 opacity and the modal dialog fades in to opacity 1 i.e. fully visible.

I could spend more time playing around with the CSS to improve it further and believe me I’m tempted to do just that. However, I’ve got a paid job to do and a daughter who is insisting that I play Minecraft with her (which is what passes for parenting these days). So, I am declaring it job done!

Useful Links

This Stack Overflow question gives a whole bunch of implementations for this, but they do use jQuery, which I wanted to avoid. It might well be the definition of madness to implement something that has already got a bunch of alternative implementations. I just wanted to do it without jQuery.

Bootstrap modal dialog documentation. The jQuery does more than open and close the dialog but that’s all I need it to do right now.

React-Bootstrap components. I’ve tried these but I just don’t like them. I’ve watched Pluralsight videos where people are very keen on them so it’s probably just me.

Applying Entity Framework Migrations to a Docker Container

I’m going to run through how to deploy an API and a database into two separate Docker containers then apply Entity Framework migrations. This will create and populate the database with the correct schema and reference data. My idea was that EF migrations should be a straightforward way to initialise a database. It wasn’t that easy. I’m going to go through the failed attempts as I think they are instructive. I know that most people just want the answer – so if that’s you then just jump to the end and it’s there.

Environment

I’m using a .NET Core 3 API with Entity Framework Core and the database is MySQL. I’ll also touch on how you would do it with Entity Framework 6. The Docker containers are Windows, though as it’s .NET Core and MySQL you could use Linux as well if needed.

The demo project is called Learning Analytics and it’s a simple student management application. It’s just what I’m tinkering around with at the moment.

Deploying into Docker without migrations.

The Dockerfile is

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-nanoserver-1903 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

and there is a docker-compose.yml file to bring up the API container above and the database ….

services:
  db:
    image: dockersamples/tidb:nanoserver-sac2016
    ports:
      - "49301:4000"

  app:
    image: learninganalyticsapi:dev
    build:
      context: .
      dockerfile: LearningAnalytics.API\Dockerfile
    ports:
      - "49501:80"
    environment:
      - "ConnectionStrings:LearningAnalyticsAPIContext=Server=db;Port=4000;Database=LearningAnalytics;User=root;SslMode=None;ConnectionReset=false;connect timeout=3600"     
    depends_on:
      - db

networks:
  default:
    external:
      name: nat

if I go to the directory containing docker-compose.yml file and run

docker-compose up -d

I’ll get the database and the api up. I can browse to the API at a test endpoint (the API is bound to port 49501 in the docker compose file)

http://localhost:49501/test

but if I try to access the API and get a list of students at

http://localhost:49501/api/student

then the application will crash because the database is blank. I haven’t done anything to populate it. I’m going to use migrations to do that.

Deploying into Docker with migrations – what doesn’t work

I thought it would be easy but it proved not to be.

Attempt 1 – via docker-compose

My initial thought was run the migrations as part of the docker-compose file using the command directive. So in the docker-compose file

  app:
    image: learninganalyticsapi:dev
    build:
      context: .
      dockerfile: LearningAnalytics.API\Dockerfile
    ports:
      - "49501:80"
    environment:
      - "ConnectionStrings:LearningAnalyticsAPIContext=Server=db;Port=4000;Database=LearningAnalytics;User=root;SslMode=None;ConnectionReset=false;connect timeout=3600"     
    depends_on:
      - db
    command: ["dotnet", "ef", "database", "update"]

The app server depends on the database (depends_on) so docker compose will bring them up in dependency order. However, even though the app container comes up after the db container, it isn’t necessarily ‘ready’. The official documentation says

However, for startup Compose does not wait until a container is “ready” (whatever that means for your particular application) – only until it’s running.

So when I try to run entity framework migrations against the db container from the app container it fails. The db container isn’t ready and isn’t guaranteed to be either.

Attempt 2 – via interactive shell

I therefore thought I could do the same but run it afterwards via an interactive shell (details of interactive shells are here). The idea being that I could wrap all this up in a PowerShell script looking like this

docker-compose up -d
docker exec learninganalytics_app_1 c:\migration\LearningAnalytics.Migration.exe

but this doesn’t work because

  1. the container doesn’t have the SDK installed as part of the base image, so the dotnet command isn’t available. This is resolvable
  2. EF Core migrations need the source code to run. We only have the built application in the container, as it should be. This sucks and isn’t resolvable

Attempt 3 – via the Startup class

I’m coming round to the idea that there is going to have to be some kind of code change in the application. I can apply migrations easily via C#. So in the startup class I could do

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
	using (var serviceScope = app.ApplicationServices.GetService<IServiceScopeFactory>().CreateScope())
	{
		var context = serviceScope.ServiceProvider.GetRequiredService<MyDatabaseContext>();
		context.Database.Migrate();
	}
	
	//.. other code
}	

Which does work but isn’t great. My application is going to apply migrations every time it starts – not very performant. I don’t like it.

Deploying into Docker with migrations – what does work

The resolution is a combination of the failed attempts. The principle is

  1. Provide a separate utility that can run migrations
  2. Deploy this into the Docker application container in its own folder
  3. Run it after docker-compose
  4. Wrap it all up in a PowerShell script

EF Migration Utility

This is a simple console app that references the API. The app is

class Program
{
	static void Main(string[] args)
	{
		Console.WriteLine("Applying migrations");
		var webHost = new WebHostBuilder()
			.UseContentRoot(Directory.GetCurrentDirectory())
			.UseStartup<ConsoleStartup>()
			.Build();

		using (var context = (DatabaseContext) webHost.Services.GetService(typeof(DatabaseContext)))
		{
			context.Database.Migrate();
		}
		Console.WriteLine("Done");
	}
}

and the Startup class is a stripped down version of the API start up

public class ConsoleStartup
{
	public ConsoleStartup()
	{
		var builder = new ConfigurationBuilder()
			.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
			.AddEnvironmentVariables();
		Configuration = builder.Build();
   }

	public IConfiguration Configuration { get; }

	public void ConfigureServices(IServiceCollection services)
	{
		services.AddDbContext<DatabaseContext>(options =>
		{
			options.UseMySql(Configuration.GetConnectionString("LearningAnalyticsAPIContext"));

		});
	}

	public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
	{
   
	}
}

I just need the Startup to read appsettings.json and get the database context up, which this does. The console app references the API so it can use the API’s config files, and I don’t have to double-key the config into the console app.

DockerFile amends

The DockerFile file needs to be amended to deploy the migrations application into a separate folder on the app container file system. It becomes

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-nanoserver-1903 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS migration
WORKDIR /src
COPY . .
RUN dotnet restore "LearningAnalytics.Migration/LearningAnalytics.Migration.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.Migration"
RUN dotnet build "LearningAnalytics.Migration.csproj" -c Release -o /app/migration

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /migration
COPY --from=migration /app/migration .

WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

the relevant part is

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS migration
WORKDIR /src
COPY . .
RUN dotnet restore "LearningAnalytics.Migration/LearningAnalytics.Migration.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.Migration"
RUN dotnet build "LearningAnalytics.Migration.csproj" -c Release -o /app/migration

which builds out the migration application and …

FROM base AS final
WORKDIR /migration
COPY --from=migration /app/migration .

which copies it into a folder called migration on the published container

Glue it together with PowerShell

Once the containers are brought up with docker-compose, it’s straightforward to use an interactive shell to navigate to the LearningAnalytics.Migration.exe application and run it. That will initialise the database. A better solution is to wrap it all up in a simple PowerShell script e.g.

docker-compose up -d
docker exec learninganalytics_app_1 c:\migration\LearningAnalytics.Migration.exe

and run that. The container comes up and the database is populated with the correct schema and reference data via EF migrations. The API now works correctly.
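For something slightly more robust, the script can fail fast if the migration doesn’t apply cleanly. This is a sketch rather than a definitive version – docker exec propagates the exit code of the process run inside the container, so $LASTEXITCODE is enough to tell whether the utility succeeded:

```powershell
docker-compose up -d

# run the migration utility inside the app container
docker exec learninganalytics_app_1 c:\migration\LearningAnalytics.Migration.exe

# docker exec surfaces the container process's exit code
if ($LASTEXITCODE -ne 0) {
    Write-Error "EF migrations failed - check the container logs"
    exit 1
}

Write-Host "Database migrated"
```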

Entity Framework 6

The above is all for Entity Framework Core. Entity Framework 6 introduced the Migrate.exe tool. This can apply EF migrations without the source code, which was the major stumbling block for EF Core. Armed with this, you could copy it up to the container and perform the migrations via something like

docker exec learninganalytics_app_1 Migration.exe
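From memory the Migrate.exe invocation looks something like the following – the assembly and config file names here are hypothetical, and you should check the EF6 documentation for the exact switches before relying on this:

```
Migrate.exe LearningAnalytics.dll /startupConfigurationFile="LearningAnalytics.exe.config"
```

Migrate.exe ships in the tools folder of the EntityFramework NuGet package, so it would need to be copied up to the container alongside the deployed assemblies.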

Do Migrations suck though?

This person thinks so. Certainly the inability to run them against compiled code is a huge drag. Whenever I write a production application I prefer to just write out the SQL for the schema and apply it with some PowerShell. It’s not that hard. I like to use migrations for personal projects, but there must be a reason I don’t use them when I get paid to write code. Do I secretly think they suck just a little?

Demo code

As ever, demo code is on my GitHub site

https://github.com/timbrownls20/Learning-Analytics/tree/master/LearningAnalytics/LearningAnalytics.Migration
is the migration app

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/LearningAnalytics.API/DockerfileMigrations
the DockerFile

https://github.com/timbrownls20/Learning-Analytics/tree/master/LearningAnalytics
for the docker-compose.yml file and the simple PowerShell that glues it together

Useful links


This Stack Overflow question was the starting point for a lot of this, and this answer in particular has a good discussion and some other options on how to achieve it – none of them massively satisfactory. I felt something like what I’ve done was about the best.

https://docs.docker.com/compose/startup-order/
discusses why you can’t rely on the depends_on directive to make the database available to the application when bringing up the containers. It has more possibilities for circumventing this, such as wait-for-it. I’m certainly going to look at those, but they seem scoped to Linux rather than Windows so I’d have to change the Docker files around. Also, they wouldn’t help with Entity Framework 6 or earlier.

Browsing the File System in Windows and Linux Docker Containers

I’ve written a few posts about Docker now, so I thought I would step back and write a set of instructions on how to browse the file system via an interactive shell on a running container. Although it’s basic, I’d like to reference these kinds of instructions in other posts so I can avoid repeating myself. Besides, people need simple guides to basic processes anyway – just watch me with a power drill and you’ll see someone in dire need of a basic guide.

Environment

I’m running Docker on a Windows machine but I’ll be bringing up both Windows and Linux containers.

The test project is a simple .Net Core project for managing student tests which I’ve ambitiously called Learning Analytics.

Windows Container

Using this simple DockerFile

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-nanoserver-1903 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-nanoserver-1903 AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

Build it into an image

docker build . -f "LearningAnalytics.API\DockerFile" -t learninganalyticsapi:dev

It will be named learninganalyticsapi and tagged dev.

Now run the image as a container called learninganalyticsapi_app_1 in detached mode.

docker run -d -p 80:80 --name learninganalyticsapi_app_1 learninganalyticsapi:dev

The ENTRYPOINT in the image starts the API, and this binds it to port 80 of the host. Assuming there is nothing already bound to port 80, I can navigate to a test page here

http://localhost/test

And I will get a test message which confirms the container is up and running.

Now run the cmd shell in interactive mode

docker exec -it learninganalyticsapi_app_1 cmd

Now we are on the running container itself so running these commands

cd ..
dir

will navigate up to the root of the container and show the top-level directories, like so …

Obviously now I’ve got an interactive shell I can do anything that shell supports. Browsing files is just an easy example.

Once I’m done then type exit to end the interactive session and I’m back to the host.

Linux Container

So same again for a Linux container. It’s going to be pretty similar

Using this simple Docker file

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-buster-slim AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk:3.1-buster AS build
WORKDIR /src
COPY ["LearningAnalytics.API/LearningAnalytics.API.csproj", "LearningAnalytics.API/"]
RUN dotnet restore "LearningAnalytics.API/LearningAnalytics.API.csproj"
COPY . .
WORKDIR "/src/LearningAnalytics.API"
RUN dotnet build "LearningAnalytics.API.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "LearningAnalytics.API.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "LearningAnalytics.API.dll"]

Build and run the container

docker build . -f "LearningAnalytics.API\DockerFile" -t learninganalyticsapi:dev

docker run -d -p 49501:80 --name learninganalyticsapi_app_2 learninganalyticsapi:dev

The only difference here is that I’ve bound it to a different port on the host – I’m working against port 49501. That’s because port 80 is already in use from the first example. If I use port 80 again I get these kinds of errors. So the test page for the Linux box is at

http://localhost:49501/test

Also the name of the container is learninganalyticsapi_app_2 to differentiate it from the Windows one which is already there from the first example.

Now bring up the shell, which is bash for Linux

docker exec -it learninganalyticsapi_app_2 bash

Now go to the root and list files. Slightly different commands than before

cd ..
ls

and we get this

which are the folders at the root of the Linux container.

As before type exit to end the interactive shell and return to the host.

Demo Code

As ever, the source code is on my GitHub site

https://github.com/timbrownls20/Learning-Analytics/tree/master/LearningAnalytics

It’s just an API with a MySQL database, and I’m only bringing up the Docker container for this demo. The Windows Dockerfile is

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/LearningAnalytics.API/DockerfileWindows

and the Linux one is

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/LearningAnalytics.API/DockerfileLinux

You could do similar to the above but replace the build and run steps with a docker-compose.yml file. An example is here

https://github.com/timbrownls20/Learning-Analytics/blob/master/LearningAnalytics/docker-compose.yml

which brings up the API container and one for the database. The principle is the same though.

Artificial Intelligence A-Z

I’m just getting back into the blog after a break. In that spirit, I’m backfilling posts about some of the stuff I’ve read, watched and listened to during that break.

My boss told me that anyone who doesn’t know about AI will be left behind in five years’ time. I’m hoping I’ll still be a software developer by then, so to make sure I bought and watched artificial-intelligence-az from Udemy. It was a good choice. The course is simple enough for a noob to follow and in-depth enough for an experienced software developer to benefit from. The applications work up to an AI capable of playing Doom. It is fascinating to see the decisions the AI bot makes that seem counter-intuitive but actually work and are the best choice. The course is 16.5 hours of videos, but if you install all the demos and get them working it takes way longer – the longer the better as far as I’m concerned. It’s all Python, but any software developer can follow the code – everything is explained, and it’s a good opportunity to brush up on Python skills anyway.

I think it cost me about $15, so good value. Udemy has a weird thing where some courses are very cheap, but then if you check back (or are an existing user) they’ve rocketed in price to $100 plus. I guess it’s just their business model. It’s said that no-one pays full price at Pizza Express, so in the same vein I think no-one should pay full price at Udemy. Always go armed with a voucher or a first-time user reduction. With a suitable price reduction, this course is really worth the investment.

In Praise of the Marquee Tag

I’ve created a few internal tools for various tasks over the years. I tend to pop a web front end on them. I also like to pop on a marquee tag somewhere nice and visible so I can marvel at its scrolling majesty. I do it just to amuse myself then I sit back and wait for someone to notice then shout at me to not be so ridiculous and take off the ludicrous, retro, badly supported tag.

Oddly that doesn’t happen. I’ve checked back on the tools after several years of use and the text is still there, loyally scrolling away. I think the tag has a surprising number of supporters and actually there is a deep human need for easy to implement scrolling text. I imagine that other people in the company enjoy gazing hypnotically at the jerky scroll, just as I do.

I reckon I’m going for the blink tag next. An even more popular choice.

NuGet restore failing in Docker Container

I was tempted to write about this before, but I didn’t as there is already a very good, highly rated stack overflow answer with the solution. However, I’m just reinstalling Docker desktop and getting things working again and I wish I had written this stuff down as I’ve forgotten it. One of the many reasons to write blog posts is to fix stuff in my memory and as my own personal development notes. So in that spirit…

The Problem

We have a very simple .Net Core MVC solution.

It has the following NuGet packages

Install-Package NugetSample.NugetDemo.Demo -Version 1.0.0
Install-Package bootstrap -Version 4.5.0
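The same two packages expressed as PackageReference entries in the csproj, if you prefer editing the project file to running Package Manager Console commands:

```xml
<ItemGroup>
  <PackageReference Include="NugetSample.NugetDemo.Demo" Version="1.0.0" />
  <PackageReference Include="bootstrap" Version="4.5.0" />
</ItemGroup>
```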

With this DockerFile to containerise it

FROM mcr.microsoft.com/dotnet/core/aspnet AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

FROM mcr.microsoft.com/dotnet/core/sdk AS build
WORKDIR /src
COPY ["Template.Web.csproj", "Template.Web/"]
RUN dotnet restore "Template.Web/Template.Web.csproj"
COPY . .
WORKDIR "/src/Template.Web"
RUN dotnet build "Template.Web.csproj" -c Release -o /app

FROM build AS publish
RUN dotnet publish "Template.Web.csproj" -c Release -o /app

FROM base AS final
WORKDIR /app
COPY --from=publish /app .
ENTRYPOINT ["dotnet", "Template.Web.dll"]

We go to the directory with the DockerFile and try to build it into a container with

docker build .

It fails on the dotnet restore step like so …

i.e. with this error

C:\Program Files\dotnet\sdk\3.1.302\NuGet.targets(128,5): error : Unable to load the service index for source https://api.nuget.org/v3/index.json. [C:\src\Template.Web\Template.Web.csproj]
C:\Program Files\dotnet\sdk\3.1.302\NuGet.targets(128,5): error :   No such host is known. [C:\src\Template.Web\Template.Web.csproj]
The command 'cmd /S /C dotnet restore "Template.Web/Template.Web.csproj"' returned a non-zero code: 1

NuGet is failing us

The Cause

The container doesn’t have connectivity to the internet so it can’t bring down the packages. We can see this clearly by building this very, very simple Docker file

FROM mcr.microsoft.com/dotnet/core/sdk
RUN ping google.com

The ping fails. The host (my development machine) does have internet access – I would have noticed if that had gone down and I would be hysterically ringing Telstra (again). So it’s something specific to the container.

The Resolution

The DNS server configured in the container is wrong. To fix it, hardcode the DNS servers into Docker, i.e. put this JSON

"dns": ["10.1.2.3", "8.8.8.8"]

into the Docker daemon settings. In Docker Desktop it’s here

and restart the Docker service. The container now has internet access, NuGet restore works and we can containerise our very simple web application.
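For context, the daemon.json file ends up looking something like this (any keys already present in yours stay alongside the new dns entry):

```json
{
  "dns": ["10.1.2.3", "8.8.8.8"]
}
```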

Demo Code

As ever, the demo code is on my GitHub site

The very simple application
https://github.com/timbrownls20/Demo/tree/master/ASP.NET%20Core/Template

and its docker file
https://github.com/timbrownls20/Demo/blob/master/ASP.NET%20Core/Template/Template.Web/Dockerfile

Docker file for the internet test
https://github.com/timbrownls20/Demo/blob/master/Docker/InternetTest/DockerFile

Useful Links

This Stack Overflow answer has the resolution to this with a very good explanation. Also it has other (probably better) ways to fix this and resolutions to other Docker network issues that you may face.

Blocked by CORS policy? Unblocking in .Net Core 3

A while ago I wrote a post about hosting Angular under an existing IIS website. Quite a few people seem to have found it useful, which is good. My motivation was to avoid CORS policy errors, i.e. blocked JavaScript requests to a different domain. I bypassed them completely by hosting the Angular client and the API under the same domain – a bit of a cheat really. At the time I wrote

This is to avoid Cross Origin Scripting Issues in my own environment. [.. ]other ways to do this

Never worked
Were too invasive to implement on the API side
Did work then frustratingly stopped working

I don’t know why I was struggling so much. It turns out to be pretty straightforward to have a CORS policy that lets anything through. I suspect it’s got a lot easier in .Net Core 3. Perhaps I just missed it before.

Cross-origin resource sharing (CORS)

Just to define terms – CORS is a way to enable one website to access resources on another domain. Requests are often blocked if they come from a different host (the same-origin policy). This typically happens when JavaScript clients (Angular, React etc.) make a request to an API on a different host using XMLHttpRequest. When this happens, we see something like

blocked by CORS policy

In this case we need a suitable CORS Policy.

Enabling CORS for all hosts in .Net Core

Here I’m going to create a very relaxed CORS policy; it’s going to let anything through. In Startup.cs file ConfigureServices add

public void ConfigureServices(IServiceCollection services)
{
    //.. configure other services
    
    services.AddCors(options =>
    {
        options.AddPolicy("AllOrigins",
            builder =>
            {
                builder.AllowAnyHeader()
                       .AllowAnyOrigin()
                       .AllowAnyMethod();
            });
    });
}

and wire it up in the Configure method

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{	
    //.. more code

    app.UseCors("AllOrigins");

}

and that’s it. The error goes away and any JavaScript client can make requests against this API. This is fine for my development projects, but if this were to go into production you’d want to consider a finer-tuned CORS policy. The same-origin policy is implemented in browsers for a reason, not just to frustrate my demo projects.
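One thing worth knowing: in the .Net Core 3 middleware pipeline the order of calls matters – UseCors needs to sit after UseRouting and before UseEndpoints. A sketch of a fuller Configure method (the endpoint mapping here is an assumption about the rest of the app):

```csharp
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseRouting();

    // CORS must come after UseRouting and before UseEndpoints
    app.UseCors("AllOrigins");

    app.UseAuthorization();

    app.UseEndpoints(endpoints => endpoints.MapControllers());
}
```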

Demo code

As ever, the full code is on my GitHub site here.

Useful Links

https://en.wikipedia.org/wiki/Cross-origin_resource_sharing#Headers
Wikipedia has a good page on Cross-origin resource sharing

https://code.google.com/archive/p/browsersec/wikis/Part2.wiki#Same-origin_policy.
I found the same-origin policy browser security notes from Google interesting. Same-origin policy is a bit of an umbrella term for some related security concerns. The one causing the problem here is the same-origin policy for XMLHttpRequest, but there are same-origin policies for DOM access, cookies and Java (as well as Silverlight and Flash – remember those, anyone?)