Crafting Node APIs, Part 3: Under the Covers

Previously: Crafting Node APIs part 1 and part 2


Scott Wojan, projekt202 Principal Architect

Reggie Samuel, projekt202 Managing Architect

In this post, we will dive into how the API is organized.

Getting Oriented

Project Structure

config All of the configuration concerns for the API
controllers The controllers for each endpoint. The role of the controller is to handle requests for a specific route, usually by calling a service, piping the result to the response stream and catching errors.
database Sets up our persistence layer with database specifics from the configuration
logging Provides the ability to log messages
models The data models/entities
routes Routing definitions exposed by our API
security Authentication/authorization for our API
services Classes responsible for processing requests in our API. Services are the last layer, which retrieves/updates/transforms data from our data store.
.eslintrc.json The configuration settings for ESLint
app.js The main starting point of the API
db_setup.bat The create-database script for Windows
package.json The Node packaging file

API Flow

The general flow and structure of the API conforms to this pattern:

Sequence Diagram for the API's flow

Endpoint development consists of three components:

  1. The route handler denotes which URLs and HTTP verbs the API supports and which controller is responsible for handling the request.
  2. The controller validates the request, prepares the data required by a service and interprets the service's response (including exceptions) into an HTTP-friendly response.
  3. The service is responsible for interacting with the backing data store and returning a response to the controller. Services could very well interact with other systems, such as third-party APIs.

This answers our goal of "How are you going to logically separate your code?"

app.js - Where it all begins

[code lang="js"]'use strict';

let restify = require('restify');
let logging = require('./logging');
let config = require('./config');
let authentication = require('./security/authentication');
let authorization = require('./security/authorization');

process.on('uncaughtException', (err) => {
  var logId = logging.processError(err, process, config);

  console.error(`Process exception. Check log id: ${logId}`);
  console.error(err);

  err = err || {};

  if (!(err.status >= 400 && err.status <= 499)) {
    process.nextTick(() => process.exit(1));
  }
});

/* Create the restify api server */
let server = restify.createServer({name:, version: config.server.version});

/* Configure the server middleware */
server.use(restify.CORS()) /* allows cross domain resource requests */
  .use(restify.fullResponse()) /* allows the use of POST requests */
  .use(restify.acceptParser(server.acceptable)) /* parses out the accept header and ensures the server can respond to the client's request */
  .use(restify.queryParser()) /* parses non-route values from the query string */
  .use(restify.bodyParser()) /* parses the body based on the content-type header */
  .use((req, res, next) => {
    req.server = server;
    next();
  });

/* Enable security middleware if set in the config */
if (config.server.enableSecurity) {
  server.use(authentication(server))
    .use(authorization(server));
}

/* Create all the routes for the server */
require('./routes')(server);

/* Handle all uncaught exceptions. These will be turned into 500 Internal Server Error
   responses and the details not leaked to the client */
server.on('uncaughtException', (request, response, route, error) => {
  response.statusCode = 500;
  var logId = logging.webError(request, response, route, error);
  return response.send(500, {code: 'InternalServerError', message: 'An internal server error occurred', supportId: logId});
});

/* Start listening */
server.listen(config.server.port, () => {
  console.log(`${} is listening at ${server.url}`);
});

module.exports.server = server;
module.exports.config = config;[/code]

A very important part of app.js is the uncaughtException routine. This will take server exceptions, log them to the logging mechanism, and return the unique ID of the item that was logged as part of the response. This answers our goal of "How are you going to keep details of exceptions/internal server errors from getting back to the client, yet still be able to see what happened? How are you going to field or diagnose support calls for those errors?"

As an example, if your database isn't up and running, you would receive the following:

[code lang="js"]{
  "code": "InternalServerError",
  "message": "An internal server error occurred",
  "supportId": "44abdd51-3834-4678-97f8-5ad27352c469"
}[/code]

And the log message would provide much more detail:

[code lang="js"]{
  "request": {
    "path": "/authenticate",
    "url": "/authenticate",
    "method": "POST",
    "params": {
      "emailAddress": "",
      "password": "ABC123"
    },
    "body": {
      "emailAddress": "",
      "password": "ABC123"
    },
    "headers": {
      "host": "",
      "connection": "keep-alive",
      "content-length": "62",
      "accept": "application/json",
      "cache-control": "no-cache",
      "origin": "chrome-extension://aicmkgpgakddgnaphhhpliifpcfhicfo",
      "content-type": "application/json",
      "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.110 Safari/537.36",
      "postman-token": "3e4185f6-b55f-7796-3b1d-6af06098f1d3",
      "accept-encoding": "gzip, deflate",
      "accept-language": "en-US,en;q=0.8"
    },
    "query": "",
    "httpVersion": "1.1"
  },
  "response": {
    "statusCode": 500
  },
  "error": {
    "message": "connect ECONNREFUSED",
    "stack": "SequelizeConnectionRefusedError: connect ECONNREFUSED\n at Handshake.<anonymous> (/Users/scottwojan/Documents/blog-template/content/node_modules/sequelize/lib/dialects/mysql/connection-manager.js:79:20)\n at bound (domain.js:287:14)\n at Handshake.runBound [as _callback] (domain.js:300:12)\n at Handshake.Sequence.end (/Users/scottwojan/Documents/blog-template/content/node_modules/mysql/lib/protocol/sequences/Sequence.js:85:24)\n at Protocol.handleNetworkError (/Users/scottwojan/Documents/blog-template/content/node_modules/mysql/lib/protocol/Protocol.js:364:14)\n at Connection._handleNetworkError (/Users/scottwojan/Documents/blog-template/content/node_modules/mysql/lib/Connection.js:434:18)\n at emitOne (events.js:77:13)\n at Socket.emit (events.js:169:7)\n at emitErrorNT (net.js:1269:8)\n at nextTickCallbackWith2Args (node.js:442:9)\n at process._tickDomainCallback (node.js:397:17)",
    "sql": null,
    "code": "ECONNREFUSED"
  },
  "id": "44abdd51-3834-4678-97f8-5ad27352c469",
  "level": "error",
  "message": "SequelizeConnectionRefusedError",
  "timestamp": "2016-09-29T23:30:24.879Z"
}[/code]

Application Flow

  1. The restify framework is loaded. A note on using Restify: two of the most popular Node-based API frameworks are Express and Restify. There is a wealth of information out there comparing the two frameworks, so we won't do a deep dive here. We went with Restify for a couple of simple reasons. The first is that it is purpose-built for constructing APIs; Express is a much more in-depth framework that lets you build not only APIs but also web applications. The second is performance: in 2015, the performance gap between Express and Restify was much greater, but, in 2016, Restify maintains only a slight edge.
  2. The API's configuration is loaded.
  3. Unhandled exceptions at the process level are set to be logged.
  4. A Restify server is created with the settings from the configuration.
  5. The routes are configured for the server.
  6. Unhandled exceptions within the Restify server are set to be logged. The log will generate a uuid for the log entry, and the outgoing response is made to be generic and contain a property in the response body called "supportId" with the uuid of the log entry. This helps tremendously with debugging and production support.
  7. The Restify server is set to start listening on the port specified in the config.


The configuration module puts all of the configuration settings into a nicely packaged solution for us so we can reference things like config.server.port. There is a hierarchy established here that allows environment variables to override the environment-specific settings file (config/development.json, config/production.json, etc.).
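The override hierarchy boils down to a chain of fallbacks. A minimal sketch, with an in-memory object standing in for the JSON file:

```javascript
// fileConfig stands in for the environment-specific JSON settings file.
const fileConfig = { server: { port: '3000' } };

// An environment variable, when present, beats the file's value.
function resolvePort(env, file) {
  return env.SERVER_PORT || file.server.port;
}

console.log(resolvePort({}, fileConfig));                      // '3000' from the file
console.log(resolvePort({ SERVER_PORT: '8080' }, fileConfig)); // env var wins
```

This is the same `process.env.X || nconf.get('...')` pattern used throughout the configuration module below.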


This is the main module that exposes configuration settings.

[code lang="js"]'use strict';

let nconf = require('nconf');

/* The environment, if none is set assume development */
exports.environment = process.env.NODE_ENV || 'development';

nconf.argv().env().file({file: __dirname + '/' + exports.environment + '.json'});

/* Restify server settings */
exports.server = {
  name: process.env.SERVER_NAME || nconf.get('server:name'),
  version: process.env.SERVER_VERSION || nconf.get('server:version'),
  port: process.env.SERVER_PORT || nconf.get('server:port'),
  enableSecurity: process.env.SERVER_ENABLE_SECURITY || nconf.get('server:enableSecurity')
};

/* Database settings */
exports.database = {
  name: process.env.DATABASE_NAME || nconf.get('database:name'),
  host: process.env.DATABASE_HOST || nconf.get('database:host'),
  username: process.env.DATABASE_USERNAME || nconf.get('database:username'),
  password: process.env.DATABASE_PASSWORD || nconf.get('database:password'),
  port: process.env.DATABASE_PORT || nconf.get('database:port'),
  dialect: process.env.DATABASE_DIALECT || nconf.get('database:dialect')
};[/code]


This is just a simple, environment-specific JSON file. You can create a new one of these for each environment (config/test.json, config/production.json, etc.) with specific options for each. You would then need to set the NODE_ENV as mentioned above to use that environment's configuration.

[code lang="js"]{
  "server": {
    "name": "Todo Services DEVELOPMENT",
    "version": "1.0.0",
    "port": "3000",
    "enableSecurity": false
  },
  "database": {
    "username": "todo",
    "password": "todo",
    "name": "todo",
    "host": "",
    "port": "3306",
    "dialect": "mysql"
  }
}[/code]


A route is responsible for mapping URIs plus HTTP verbs to controller methods, and for defining where the data comes from (the query string and/or the body of the call).


This module will load all of the routes defined in the routes folder automatically for you. It leverages the require-directory package to load the routes and uses a visitor pattern to inject the Restify server into each route.

[code lang="js"]'use strict';

/* Load all routes from the route directory */
function routeInitializer(server) {
  let requireDirectory = require('require-directory'),
    renamer = function (name) {
      return name.toLowerCase();
    },
    visitor = function (route) {
      route(server); // inject the server into the route
    };

  return requireDirectory(module, {visit: visitor, rename: renamer});
}

module.exports = routeInitializer;[/code]
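Stripped of the require-directory machinery, the visitor pattern here is just "iterate over route modules and hand each one the server so it can register itself." A self-contained sketch with hypothetical route modules:

```javascript
// Stand-ins for the files in the routes folder: each module is a function
// that receives the server and registers its endpoints on it.
const routeModules = {
  users(server) { server.registered.push('GET /users/:userId'); },
  todos(server) { server.registered.push('GET /todos'); },
};

// The "visit": call every route module with the server injected.
function initializeRoutes(server) {
  Object.values(routeModules).forEach((route) => route(server));
  return server;
}

const server = initializeRoutes({ registered: [] });
console.log(server.registered.length); // 2
```

Adding a new endpoint then means dropping a new file in the routes folder; nothing else has to change.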


This is the user route; todoRoute.js is more or less the same thing.

[code lang="js"]'use strict';

module.exports = function (server) {
  let userController = new (require('../controllers/userController'))();

  server.get({path: '/users/:userId'}, userController.get);
  server.put({path: '/users/:userId'}, userController.update);{path: '/users'}, userController.create);
};[/code]


A controller is responsible for digesting the request, validating the required parameters and the body that has been sent, interacting with a service to handle the request, and converting the service's response into an appropriate HTTP response.


This is the user controller; TodoController.js is more or less the same thing.

[code lang="js"]'use strict';

let userService = new (require('../services/userService'))();
let serviceErrors = require('../services/serviceErrors');
let controllerErrors = require('./controllerErrors');

class UserController {
  get(req, res, next) {
    /* Verify a userId was passed in, otherwise return a 400 */
    if (!req.params.userId) {
      return next(new controllerErrors.BadRequestError('The user id is required.'));
    }

    /* Get the user */
    userService.getById(req.params.userId)
      .then((user) => {
        /* If there is no user, return a 404 */
        if (!user) {
          return next(new controllerErrors.ResourceNotFoundError());
        }
        /* Return the user */
        res.send(user);
        return next();
      })
      .catch((e) => {
        req.server.emit('uncaughtException', req, res, req.route, e);
        next(false);
      });
  }

  update(req, res, next) {
    /* Ensure a userId was passed in */
    if (!req.params.userId) {
      return next(new controllerErrors.BadRequestError('The user id is required.'));
    }

    /* Ensure there is data that was passed */
    if (!req.body) {
      return next(new controllerErrors.BadRequestError('Missing user information.'));
    }

    /* Update the user */
    userService.update(req.params.userId, req.body)
      .then((user) => {
        /* If there is no user, return a 404 */
        if (!user) {
          return next(new controllerErrors.ResourceNotFoundError());
        }
        /* Return the updated user */
        res.send(user);
        return next();
      })
      .catch(serviceErrors.ValidationError, (e) => {
        return next(new controllerErrors.ValidationError(e));
      })
      .catch((e) => {
        req.server.emit('uncaughtException', req, res, req.route, e);
        next(false);
      });
  }

  create(req, res, next) {
    userService.create(req.body)
      .then((user) => {
        res.send(user);
        return next();
      })
      .catch(serviceErrors.ValidationError, (e) => {
        return next(new controllerErrors.ValidationError(e));
      })
      .catch((e) => {
        req.server.emit('uncaughtException', req, res, req.route, e);
        next(false);
      });
  }
}

module.exports = UserController;[/code]


A service is responsible for taking action on incoming data. This could be reformatting the data and calling a third party, interacting with a file system, or, in our case, interacting with our data layer to create, update and delete persisted objects. You might notice the use of the Bluebird promise library. We decided to include it because, unlike other promise libraries out there, it is not only compatible with the native Promise implementation but also up to 6x faster in some benchmarks. Take note of the use of custom error classes. We find this pattern lets consumers know more about what is occurring.


This is the user service; TodoService.js is more or less the same thing.

[code lang="js"]'use strict';

let Sequelize = require('sequelize');
let Promise = require('bluebird');
let ModelManager = require('../models');
let modelManager = new ModelManager();
let serviceErrors = require('./serviceErrors');
let passwordHash = require('password-hash');

/* Strip the password hash so it is never sent back to callers */
function cleanOutgoingUser(user) {
  if (!user) return user;
  delete user.dataValues.password;
  return user;
}

class UserService {
  getById(userId) {
    return new Promise((resolve, reject) => {
      modelManager.models.user.findById(userId)
        .then((user) => {
          resolve(cleanOutgoingUser(user));
        })
        .catch(reject);
    });
  }

  validatePassword(emailAddress, password) {
    return new Promise((resolve, reject) => {
      this.getPasswordForEmail(emailAddress)
        .then((user) => {
          if (!passwordHash.verify(password, user.password)) {
            reject(new serviceErrors.InvalidUserPassword());
          } else {
            resolve(user);
          }
        })
        .catch(reject);
    });
  }

  getPasswordForEmail(emailAddress) {
    return new Promise((resolve, reject) => {
      modelManager.models.user.findOne({attributes: ['id', 'password'], where: {emailAddress: emailAddress}})
        .then(resolve)
        .catch(reject);
    });
  }

  update(userId, updatedUser) {
    return new Promise((resolve, reject) => {
      this.getById(userId)
        .then((user) => {
          if (!user) {
            resolve(null);
          } else {
            user.updateAttributes(updatedUser)
              .then((user) => {
                resolve(cleanOutgoingUser(user));
              })
              .catch(Sequelize.ValidationError, (validationError) => {
                reject(new serviceErrors.ValidationError(validationError));
              })
              .catch((error) => {
                throw error;
              });
          }
        })
        .catch(reject);
    });
  }

  create(user) {
    return new Promise((resolve, reject) => {
      modelManager.models.user.create(user)
        .then((createdUser) => {
          resolve(cleanOutgoingUser(createdUser));
        })
        .catch(Sequelize.ValidationError, (validationError) => {
          reject(new serviceErrors.ValidationError(validationError));
        })
        .catch(reject);
    });
  }
}

module.exports = UserService;[/code]


Models in this context represent the entities found in ORM packages like Hibernate, nHibernate or Entity Framework. For our ORM, we are using Sequelize. If you have used an ORM before, the concepts will be familiar; if not, the Sequelize site has fantastic documentation and examples.


This is the main module that registers all of the model classes in the folder with Sequelize. Please note the code starting with _.chain(models). This is something we crafted to enable association definitions to exist within the model.

[code lang="js"]'use strict';

let fs = require('fs');
let path = require('path');
let database = require('../database');
let _ = require('lodash');
let models = {};
let instance = null;

/* Loop through all of the model files in this directory, import them into
   sequelize and add them to the models object */
fs.readdirSync(__dirname).filter((file) => {
  return (file.indexOf('.') !== 0) && (file !== 'index.js');
}).forEach((file) => {
  let modelName = file.replace(path.extname(file), '').replace('Model', '');

  models[modelName] = database.import(path.join(__dirname, file));
});

/* Loop through the models object, obtain the property key names and wire up any associations */
_.chain(models)
  .keys()
  .forEach((modelName) => {
    if ('instanceMethods' in models[modelName].options &&
        !_.isUndefined(models[modelName].options.instanceMethods.associate)) {
      models[modelName].options.instanceMethods.associate(models);
    }
  }).value();

class ModelManager {
  constructor() {
    if (instance) {
      return instance;
    }

    instance = this;
  }

  get models() {
    return models;
  }

  get db() {
    return database;
  }
}

module.exports = ModelManager;[/code]


This is the model definition for the User object. Note that you are specifying property names, their data type, if nulls are allowed, validation rules, etc. You can find out more about models on the Sequelize Docs Site.

[code lang="js"]'use strict';

let passwordHash = require('password-hash');

module.exports = function (sequelize, DataTypes) {
  var user = sequelize.define('user', {
    id: {
      type: DataTypes.INTEGER,
      primaryKey: true,
      autoIncrement: true,
      allowNull: false
    },
    emailAddress: {
      type: DataTypes.STRING(255),
      allowNull: false,
      unique: true,
      validate: {
        isEmail: true,
        max: 255
      }
    },
    firstName: {
      type: DataTypes.STRING(100),
      allowNull: false,
      validate: {
        notEmpty: true,
        max: 100
      }
    },
    lastName: {
      type: DataTypes.STRING(100),
      allowNull: false,
      validate: {
        notEmpty: true,
        max: 100
      }
    },
    password: {
      type: DataTypes.STRING(1024),
      allowNull: false,
      /* Hash the password whenever it is set so clear text is never stored */
      set: function (val) {
        this.setDataValue('password', passwordHash.generate(val));
      },
      validate: {
        notEmpty: true,
        max: 1024
      }
    }
  }, {
    timestamps: true,
    freezeTableName: true,
    instanceMethods: {
      /* This method is called by the models/index.js code */
      associate: function (models) {
        /* Associate the todo models to the user object so you can navigate user.toDos as an array */
        user.hasMany(models.todo, {
          as: 'toDos',
          onDelete: 'CASCADE',
          foreignKey: {
            name: 'userId',
            allowNull: false
          }
        });

        /* Associate the api claims to the user object so you can navigate as an array */
        user.hasMany(models.claim, {
          as: 'claims',
          onDelete: 'CASCADE',
          foreignKey: {
            name: 'userId',
            allowNull: false
          }
        });
      }
    }
  });

  return user;
};[/code]


In this post, we have walked through the major aspects of the project and given insight into what each area of the project does, and why certain decisions or frameworks were chosen. In Part 4, we will add in security via authentication and authorization.

Stay up-to-date on projekt202 news by following us on LinkedIn, Twitter and Facebook.