Part 5. Clock-in/out System: Seed Database and migration data
This post is part of a series in which I describe a clock-in/out system. If you want to read more, you can read the following posts:
- Part 1. Clock-in/out System: Diagram.
- Part 2. Clock-in/out System: Basic backend - AuthModule.
- Part 3. Clock-in/out System: Basic backend - UsersModule.
- Part 4. Clock-in/out System: Basic backend - AppModule.
- Part 5. Clock-in/out System: Seed Database and migration data
- Part 6. Clock-in/out System: Basic frontend.
- Part 7. Clock-in/out System: Deploy backend (nestJS) using docker/docker-compose
- Part 8. Clock-in/out System: Deploy frontend (Angular 2+) using environments
- Part 9. Testing: Backend Testing - Unit Testing
- Part 10. Testing: Backend Testing - Integration Testing
- Part 11. Testing: Backend Testing - E2E Testing
- Part 12. Testing: Frontend Testing - Unit Testing
- Part 13. Testing: Frontend Testing - Integration Testing
In the previous posts the development of the backend of the clock-in system was described, but there is no data yet with which to verify that our backend works correctly.
So, in this post I'm going to describe how I created a series of TypeScript scripts that fill the database from an Excel sheet provided by the client from their old software (MS Excel).
The first step is to add an npm script to the package.json that allows us to execute our TypeScript script using ts-node. Our script can then be run with the command `npm run init:db`.
```json
{
  "name": "clock-backend",
  "version": "0.0.1",
  "description": "clock-backend",
  "author": "Carlos Caballero",
  "license": "MIT",
  "scripts": {
    "format": "prettier --write \"**/*.ts\"",
    "start": "ts-node -r tsconfig-paths/register src/main.ts",
    "start:dev": "nodemon",
    "prestart:prod": "rm -rf dist && tsc",
    "start:prod": "node dist/main.js",
    "start:hmr": "node dist/server",
    "test": "jest",
    "test:cov": "jest --coverage",
    "test:e2e": "jest --config ./test/jest-e2e.json",
    "webpack": "webpack --config webpack.config.js",
    "init:db": "ts-node e2e/create-data-fixture.ts"
  },
  "dependencies": {
    "@nestjs/common": "^5.0.0",
    "@nestjs/core": "^5.0.0",
    "body-parser": "^1.18.3",
    "class-validator": "^0.9.1",
    "cors": "^2.8.4",
    "dotenv": "^6.0.0",
    "fastify-formbody": "^2.0.0",
    "moment": "^2.22.2",
    "pg": "^7.4.3",
    "reflect-metadata": "^0.1.12",
    "rxjs": "^6.0.0",
    "typeorm": "^0.2.7",
    "typescript": "^2.6.2"
  },
  "devDependencies": {
    "@nestjs/testing": "^5.0.0",
    "@types/express": "^4.0.39",
    "@types/jest": "^21.1.8",
    "@types/node": "^9.3.0",
    "@types/supertest": "^2.0.4",
    "jest": "^21.2.1",
    "nodemon": "^1.14.1",
    "prettier": "^1.11.1",
    "supertest": "^3.0.0",
    "ts-jest": "^21.2.4",
    "ts-loader": "^4.1.0",
    "ts-node": "^4.1.0",
    "tsconfig-paths": "^3.1.1",
    "tslint": "5.3.2",
    "webpack": "^4.2.0",
    "webpack-cli": "^2.0.13",
    "webpack-node-externals": "^1.6.0"
  },
  "jest": {
    "moduleFileExtensions": [
      "js",
      "json",
      "ts"
    ],
    "rootDir": "src",
    "testRegex": ".spec.ts$",
    "transform": {
      "^.+\\.(t|j)s$": "ts-jest"
    },
    "coverageDirectory": "../coverage"
  }
}
```
The create-data-fixture script is a self-executing function that creates a new object based on DatabaseTest and invokes its db.reload() method. The DatabaseTest instance is created from two parameters:
- A database connection (the database in which the data will be stored), which can be different in each execution if we want to use different databases. This is interesting when we want to run e2e (End to End) tests with mock data against several databases in parallel.
- A DataFixture object, which contains the information that will be inserted into the database (users and user schedules).
```typescript
import { DatabaseTest } from './database.test';
import { TestFixture } from './utils/fixtures/test.fixture';

(async () => {
  try {
    const db = new DatabaseTest(
      await DatabaseTest.createConnectionDB({ logging: 'all' }),
      {
        ...TestFixture,
      },
    );
    await db.reload();
  } catch (error) {
    // tslint:disable-next-line:no-console
    console.error(error);
  }
})();
```
The simplest file is TestFixture, which exports the data once they have been imported and loaded from the XLS file by a data-mapping script (which we will also develop ourselves).
```typescript
import { XSLToJson } from '../import';

export const TestFixture = {
  User: [],
  AuthEntity: [],
  UserSchedule: [],
};

[TestFixture.UserSchedule, TestFixture.User] = XSLToJson();
```
On the other hand, the database.test.ts file is responsible for loading the fake data into the database. That is, the Users and UserSchedules are taken from the test.fixture file. Let's walk through this file step by step, since it is quite important:
- The DatabaseTest class receives, using DI (Dependency Injection), the database connection and the mock data from the test.fixture file.
- The modelsCharged attribute is initialized; it stores the models that have already been loaded into the database, so that they are not loaded again.
- The createConnectionDB method is static so that the connection to the database can be established from outside the class. This way, when the class is injected, the connection has already been established successfully.
```typescript
@Injectable()
export class DatabaseTest {
  private modelsCharged = [];
  constructor(
    @Inject(DB_CONNECTION_TOKEN) private readonly DBConnection: Connection,
    @Inject('DataFixture') private readonly dataFixture: DataFixture,
  ) {}

  public static createConnectionDB(
    options?: Partial<ConnectionOptions>,
  ): Promise<Connection> {
    return createConnection({
      type: 'postgres',
      host: 'localhost',
      port: 5531,
      username: 'root',
      password: 'toor',
      database: 'clock',
      entities: [User, AuthEntity, UserSchedule],
      synchronize: true,
      ...options,
    } as ConnectionOptions);
  }
  // ...
}
```
The next step is to build the basic methods to initialize, reload and synchronize the database.
```typescript
public sync() {
  return this.DBConnection.synchronize(true);
}

public async reload() {
  await this.sync();
  await this.init();
}

public async init() {
  for (const model of Object.values(localModels)) {
    await this.loadModel(model);
  }
}
```
Next, the models are loaded. This is done recursively, one by one, taking into account the dependencies between the different models.
```typescript
private async createData(modelName: string, className: any): Promise<void> {
  if (!this.dataFixture[modelName]) {
    return;
  }
  const repository = await this.DBConnection.getRepository(
    this.DBConnection.entityMetadatas.find(
      entity => entity.name === modelName,
    ).target,
  );
  const elements = this.dataFixture[modelName].map(newElement =>
    Object.assign(new className(), newElement),
  );
  await repository.save(elements);
}

private async loadModel(model: Model) {
  if (this.modelsCharged.indexOf(model.name) !== -1) {
    return;
  }
  if (model.depends && model.depends.length !== 0) {
    for (const dependencyName of model.depends) {
      const modelDependency = localModels[dependencyName];
      await this.loadModel(modelDependency);
    }
  }
  this.modelsCharged.push(model.name);
  return await this.createData(model.name, model.className);
}
```
To recapitulate, the complete database.test.ts file is shown below.
database.test.ts
```typescript
import { Injectable, Inject } from '@nestjs/common';
import { Connection } from 'typeorm/connection/Connection';
import { createConnection, ConnectionOptions } from 'typeorm';
import { User } from '../src/modules/users/entities/user.entity';
import { DB_CONNECTION_TOKEN } from '../src/common/config/database.tokens.constants';
import * as cp from 'child_process';
import { promisify } from 'util';
import { AuthEntity } from '../src/modules/auth/entities/auth.entity';
import { UserSchedule } from '../src/modules/users/entities/user-schedule.entity';

interface Model {
  name: string;
  className: any;
  depends?: string[];
}

interface DataFixture {
  [key: string]: any[];
}

const localModels: { [modelName: string]: Model } = {
  User: {
    name: 'User',
    className: User,
    depends: [],
  },
  AuthEntity: {
    name: 'AuthEntity',
    className: AuthEntity,
    depends: ['User'],
  },
  UserSchedule: {
    name: 'UserSchedule',
    className: UserSchedule,
    depends: ['User'],
  },
};

@Injectable()
export class DatabaseTest {
  private modelsCharged = [];
  constructor(
    @Inject(DB_CONNECTION_TOKEN) private readonly DBConnection: Connection,
    @Inject('DataFixture') private readonly dataFixture: DataFixture,
  ) {}

  public sync() {
    return this.DBConnection.synchronize(true);
  }

  public async reload() {
    await this.sync();
    await this.init();
  }

  public async init() {
    for (const model of Object.values(localModels)) {
      await this.loadModel(model);
    }
  }

  public static createConnectionDB(
    options?: Partial<ConnectionOptions>,
  ): Promise<Connection> {
    return createConnection({
      type: 'postgres',
      host: 'localhost',
      port: 5531,
      username: 'root',
      password: 'toor',
      database: 'clock',
      entities: [User, AuthEntity, UserSchedule],
      synchronize: true,
      ...options,
    } as ConnectionOptions);
  }

  private async createData(modelName: string, className: any): Promise<void> {
    if (!this.dataFixture[modelName]) {
      return;
    }
    const repository = await this.DBConnection.getRepository(
      this.DBConnection.entityMetadatas.find(
        entity => entity.name === modelName,
      ).target,
    );
    const elements = this.dataFixture[modelName].map(newElement =>
      Object.assign(new className(), newElement),
    );
    await repository.save(elements);
  }

  private async loadModel(model: Model) {
    if (this.modelsCharged.indexOf(model.name) !== -1) {
      return;
    }
    if (model.depends && model.depends.length !== 0) {
      for (const dependencyName of model.depends) {
        const modelDependency = localModels[dependencyName];
        await this.loadModel(modelDependency);
      }
    }
    this.modelsCharged.push(model.name);
    return await this.createData(model.name, model.className);
  }
}
```
BONUS TRACK
Next we will show the XSLToJson method, which simply transforms each row of the Excel sheet into the JSON objects that TypeORM needs for the import. It is important to note that each developer will have to adapt the XSLToJson function to their own XLS file.
First of all, we need to install the node-xlsx package, which allows us to work with XLS files. Below is a first version that uses several forEach loops to walk the sheet and build a data structure similar to the following:
- [schedulers, users], where
- scheduler is an object of the following type:

```typescript
{
  user: {
    uid: string,
  },
  room: string,
  day: number,  // 0-4
  hour: number, // 1-12
}
```

- user is an object of the following type:

```typescript
{
  uid: string,
  name: string,
}
```
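In TypeScript terms, these two shapes could be written as interfaces (the names Scheduler and ImportedUser are my own for illustration, not identifiers from the project):

```typescript
// Hypothetical typings for the structures produced by XSLToJson.
interface ImportedUser {
  uid: string;
  name: string;
}

interface Scheduler {
  user: { uid: string }; // references the user only by uid
  room: string;
  day: number;  // 0-4: Monday to Friday
  hour: number; // index of the class period
}

const user: ImportedUser = { uid: 'Doe, J.', name: 'Doe, J.' };
const slot: Scheduler = {
  user: { uid: user.uid },
  room: 'Lab 1',
  day: 0,
  hour: 3,
};
```

Note that a scheduler entry carries only the user's uid, which is enough for TypeORM to resolve the relation when the fixture is saved.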
The following code is admittedly not in its cleanest form; a proper refactoring of it, once the system is finished, would make a nice future post. If someone wants to get a head start on that task, I propose the following points:
- Extract functions according to the tasks performed, instead of keeping one large, single function.
- Replace the forEach loops, which hide classic imperative iteration behind a functional-looking facade, with more suitable methods such as reduce.
migration.ts
```typescript
const xls = require('node-xlsx').parse(__dirname + '/schedule.xls');

export function XSLToJson() {
  const schedulers = [];
  const users = [];
  xls[0].data.slice(2).forEach(teacher => {
    let schedule = {};
    let user = {};
    let name = teacher[0].split(',');
    if (name.length > 1) {
      name = name[0] + ', ' + name[1].trim()[0] + '.';
    }
    user = {
      uid: name,
      name,
    };
    users.push(user);
    const monday = [...teacher.slice(1, 7), ...teacher.slice(8, 14)];
    const tuesday = [...teacher.slice(14, 20), ...teacher.slice(21, 27)];
    const wednesday = [...teacher.slice(27, 33), ...teacher.slice(34, 40)];
    const thursday = [...teacher.slice(40, 46), ...teacher.slice(47, 53)];
    const friday = [...teacher.slice(53, 59), ...teacher.slice(60, 66)];
    [monday, tuesday, wednesday, thursday, friday].forEach((day, numberOfDay) =>
      day.forEach((room, hour) => {
        if (room) {
          schedule = {
            user: {
              uid: name,
            },
            room: room.replace(/\n/g, ' - '),
            day: numberOfDay,
            hour,
          };
          schedulers.push(schedule);
        }
      }),
    );
  });
  return [schedulers, users];
}
```
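As a starting point for the refactoring suggested above, the nested forEach loops that accumulate schedule slots could be expressed with reduce. This is only a sketch under the same data layout; the helper name weekToSlots is my own, not part of the project:

```typescript
// Hypothetical refactor: build the schedule entries for one teacher's week
// with reduce instead of nested forEach calls mutating outer variables.
type Slot = { user: { uid: string }; room: string; day: number; hour: number };

function weekToSlots(uid: string, week: string[][]): Slot[] {
  return week.reduce<Slot[]>(
    (slots, day, numberOfDay) =>
      slots.concat(
        day.reduce<Slot[]>((daySlots, room, hour) => {
          if (room) {
            // only occupied cells become schedule entries
            daySlots.push({
              user: { uid },
              room: room.replace(/\n/g, ' - '),
              day: numberOfDay,
              hour,
            });
          }
          return daySlots;
        }, []),
      ),
    [],
  );
}

// Example: two days, with one empty cell on the first day.
const slots = weekToSlots('Doe, J.', [['A1', '', 'A2'], ['B1']]);
```

Extracting a pure function like this also makes the mapping trivially unit-testable, since it no longer mutates variables in an enclosing scope.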
SUMMARY
In this post I've explained my seed-database setup, which is composed of a series of scripts that allow me to load any data structure into the database. To import from Excel, a specific script was created that converts the XLS file into JSON.
The GitHub project is https://github.com/Caballerog/clock-in-out.
The GitHub branch of this post is https://github.com/Caballerog/clock-in-out/tree/part5-seed-database.