Bookstore Guide Part 2

TODO: this guide was originally written around the Sales service, but since I've removed the public keyword for now, it needs to be adapted to be about adding a database to the Bookstore. Most of it will persist in its new form, but a couple of things are still keeping it incomplete.

Begin outdated Sales guide

In this guide we'll create our own Sales service and use that instead of the public one we included in the Bookstore Guide. If you haven't completed the Bookstore Guide, do that first. The topics covered here should be the last bits needed to create an application with Strat. We'll learn how to run multiple services side by side, give a service a persistence layer, bundle JavaScript into single-file artifacts, and grant deployed functions the permissions they need.

Let's start with a bare-bones Sales API:

Sales.st:

service Sales {
  public getSales ():any -> "./getSales.js"
  public setSales (any):any -> "./setSales.js"
}

getSales.js:

module.exports = () => [ 'John Steinbeck' ];

setSales.js:

module.exports = e => e;

Notice how we're not including Http--the public keyword sets up Http for us. Let's run this and poke around some endpoints:

stratc Sales.st && stratc Sales.sa

localhost:3000 gives a not found, as expected--we didn't dispatch any Http events. However, the public keyword generates a few endpoints under the path '/strat/Sales/' that other APIs can use to connect to our Sales service.
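For example, one of those generated endpoints should already answer (at this point the service is still on the default port, 3000):

curl localhost:3000/strat/Sales/getSales

which ought to return the hard-coded [ "John Steinbeck" ] from getSales.js.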

Let's run this and use it in our Bookstore API. But first, we need to sort out how to run the two APIs on different ports, as they can't both use 3000.

svs.json

Create a file svs.json:

{
  "substrate": "local",
  "local": {
    "Http": {
      "port": 3001
    }
  }
}

svs.json is the vehicle for supplying behavior overrides to whichever SVS is running our .sa file. It's not part of the Strat language, but SVS implementations sometimes need extra information to run .sa files, and users may want to supply explicit infrastructure details like permissions, which we'll see later. Run our Sales service again; it will now listen on port 3001 and serve its definition at localhost:3001/strat/Sales/Sales.st. Now, back in our Bookstore.st file, change the Sales include from:

include "https://s0tjdzrsha.execute-api.us-west-2.amazonaws.com/Sales/strat/Sales/Sales.st"

to

include "http://localhost:3001/strat/Sales/Sales.st"

and re-run your Bookstore

stratc Bookstore.st && stratc Bookstore.sa

and check out your new service-oriented architecture at localhost:3000.

If you get an 'ECONNREFUSED' error, it's because you don't have your Sales service running--the Sales service needs to be running so stratc can reach it while the Bookstore is building. Just to prove to yourself that you do indeed have two independent services running on your machine, change the sales response in getSales.js from 'John Steinbeck' to 'Leo Tolstoy', rebuild the Sales service, then rebuild the Bookstore. You should see a little on-sale indicator next to War and Peace.
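For reference, the tweaked getSales.js is one line:

module.exports = () => [ 'Leo Tolstoy' ];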

Database Access

Stateless compute is all fun and games, but almost all real web software has a persistence layer. For this tutorial we'll be using a DynamoDB database. If you're not familiar, DynamoDB is a managed NoSQL database available on AWS. DDB backs most AWS services and therefore a substantial portion of the internet--it's one of the rare technologies made for massive scale that's still user friendly and practical at smaller workloads.

We'll need to load in the AWS SDK. Create a package.json file and paste this content inside:

{
  "name": "sales-demo",
  "main": "getSales.js",
  "devDependencies": {
    "webpack": "^4.29.6",
    "webpack-cli": "^3.3.0"
  },
  "dependencies": {
    "aws-sdk": "^2.434.0"
  }
}

Then run:

npm install

Now, let's create the Sales table we'll use in the API. Create a JavaScript file 'createSalesTable.js' with this content:

var AWS = require("aws-sdk");

AWS.config.update({
    region: "us-west-2",
});

const ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

const params = {
  AttributeDefinitions: [
    {
      AttributeName: "iterationScope", 
      AttributeType: "N"
    },
    {
      AttributeName: "saleIteration", 
      AttributeType: "N"
    }
  ],
  // iterationScope is the partition (HASH) key; saleIteration is the sort (RANGE) key
  KeySchema: [
    {
      AttributeName: "iterationScope",
      KeyType: "HASH"
    },
    {
      AttributeName: "saleIteration",
      KeyType: "RANGE"
    }
  ],
  ProvisionedThroughput: {
    ReadCapacityUnits: 5, 
    WriteCapacityUnits: 5
  },
  TableName: "Sales"
};

ddb.createTable(params, (e, r) => {
  if (e) {
    console.log('error')
    console.log(e)
  } else {
    console.log(r);
  }
});

Then execute that file to create the table:

node createSalesTable.js
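Table creation is asynchronous on AWS's side, so if you'd like to block until the table is actually ACTIVE before moving on, an optional script along these lines should work (the file name is just a suggestion; waitFor is part of the same SDK):

waitForSalesTable.js:

var AWS = require("aws-sdk");

AWS.config.update({ region: "us-west-2" });

const ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

// Polls DynamoDB until the Sales table exists and reports ACTIVE
ddb.waitFor('tableExists', { TableName: "Sales" }, (e, r) => {
  if (e) console.log(e);
  else console.log('Sales table is ready');
});

node waitForSalesTable.js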

You should see some output in your terminal from createSalesTable.js saying the table is being created. Now, make a new file we'll actually call in our API, salesDb.js:

var AWS = require("aws-sdk");

AWS.config.update({
    region: "us-west-2",
});

module.exports = {
  getSales: getSales,
  setSales: setSales
};

const ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

async function getSales () {
  return (await getRecentSales()).authors;
}

async function setSales (sales) {
  const recentIteration = (await getRecentSales()).saleIteration;
  await addSalesIteration(sales, recentIteration + 1);
  return sales;
}

// Writes the sale authors as a brand-new item at the given iteration number.
// The ConditionExpression stops us from clobbering an existing iteration; if
// someone else wrote it first, retry once at the next iteration number.
async function addSalesIteration (sales, iteration, dontRetry) {
  var params = {
    Item: {
      authors: {
        SS: sales
      },
      date: {
        S: (new Date()).toISOString()
      },
      iterationScope: {
        N: '1'
      },
      saleIteration: {
        N: "" + iteration
      }
    },
    ConditionExpression: "attribute_not_exists(saleIteration)"
  };

  try {
    return await runDdb('putItem', params);
  } catch (e) {
    if (dontRetry) throw e;
    return addSalesIteration(sales, iteration + 1, true);
  }
}

// Fetches the most recent sales record: query the constant partition key
// (iterationScope = 1) in descending saleIteration order and take the first hit.
async function getRecentSales () {
  const queryResult = await runDdb('query', {
    ExpressionAttributeValues: {
      ':s': {N: '1'}
    },
    Limit: 1,
    KeyConditionExpression: 'iterationScope=:s',
    ScanIndexForward: false
  });
  const recentIteration = queryResult.Items
    .map(item => {
      return {
        saleIteration: parseInt(item.saleIteration.N),
        authors: item.authors.SS
      };
    })
    [0] || { saleIteration: 0, authors: [] };
  return recentIteration;
}


// Runs a single DynamoDB operation against the Sales table, wrapped in a Promise.
async function runDdb (operation, parameters) {
  return new Promise(function (resolve, reject) {
    const params = Object.assign(parameters, { TableName: "Sales" });
    ddb[operation](params, (e, r) => {
      if (e) reject(e);
      else resolve(r);
    });
  });
}

This file handles all the query logic we need for our two API operations, getSales and setSales.
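If you'd like to sanity check the database logic before wiring it into the API, a throwaway script like the one below should do it (the file name is just a suggestion, and it assumes your shell has AWS credentials that can read and write the Sales table):

testSalesDb.js:

const salesDb = require('./salesDb');

// Write a new sales iteration, then read the most recent one back
salesDb.setSales([ 'Leo Tolstoy' ])
  .then(() => salesDb.getSales())
  .then(authors => console.log(authors)) // should print [ 'Leo Tolstoy' ]
  .catch(console.error);

node testSalesDb.js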

You may notice there's a lot of in-the-weeds DynamoDB stuff around the range key saleIteration--you can ignore this (unless you want to code/design review this guide).
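For the curious: each setSales call writes a brand-new item rather than updating one in place, so after a couple of updates the table holds items shaped roughly like this (the shape mirrors the putItem call above; the values are made up):

{
  "iterationScope": { "N": "1" },
  "saleIteration": { "N": "2" },
  "date": { "S": "2019-04-01T00:00:00.000Z" },
  "authors": { "SS": [ "Leo Tolstoy" ] }
}

getRecentSales then just asks for the single item with the highest saleIteration.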

Building A JavaScript Bundle

We are at an unfortunate point in this guide. We have a nice little database access file, but it's not totally clear how we'll access it in our API. We could make it its own function in Sales.st and call it from getSales using Strat, like we do in the Books service, but instead we'll bundle it into the getSales function. Strat is rigid about what constitutes an artifact--a single file that exposes a single function. For most languages this is pretty easy--the Rust compiler creates a single binary, for example. However, JavaScript is lacking in this regard, and we have to bring in some extra tools to create that nice single-file bundle. If you've already lived through the 7th circle of hell that is building JavaScript, go ahead and use whatever you're comfortable with. We'll be using Webpack here because it's the preeminent cause of mental breakdowns in the JavaScript community--we want only the best.

Change your getSales and setSales files to use the new database access file:

getSales.js:

const getSales = require('./salesDb').getSales;

module.exports = getSales;

setSales.js:

const setSales = require('./salesDb').setSales;

module.exports = event => setSales(event.body);

Create a webpack.config.js file:

const webpack = require('webpack');
const path = require('path');

module.exports = {
  // bundle for a Node runtime rather than a browser
  target: 'node',
  // one bundle per Strat function
  entry: {
    getSales: './getSales.js',
    setSales: './setSales.js'
  },
  output: {
    path: path.resolve('./'),
    filename: './[name].bundle.js',
    library: 'strat-library',
    libraryTarget: 'umd'
  },
  plugins: [
    // leave any require matching 'strat' out of the bundle
    new webpack.IgnorePlugin(/strat/gi)
  ],
};

This configuration soup is why Webpack has such a bad reputation. It's decent technology as long as somebody else hands you a working config file--you're welcome.

Install Webpack:

npm install

Run Webpack (note: npx ships with npm 5.2+; use n or nvm to upgrade your npm if npx doesn't work):

npx webpack

And change Sales.st to use the bundle:

service Sales {
  public getSales ():any -> "./getSales.bundle.js"
  public setSales (any):any -> "./setSales.bundle.js"
}

From here on out, we need to rebuild our javascript bundle whenever we change things within our javascript files, so our new build command looks like:

npx webpack && stratc Sales.st && stratc Sales.sa

Try out your public endpoints (run these individually and look at the results):

curl localhost:3001/strat/Sales/getSales

curl -X "POST" localhost:3001/strat/Sales/setSales -d "[ \"Brian Kernighan and Dennis Ritchie\"]"

curl localhost:3001/strat/Sales/getSales

The second curl changes our database state to have a new sale, which we can see in the last curl.
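Assuming a fresh Sales table, the three responses should look roughly like this (the exact formatting depends on how the Http layer serializes results):

[]
["Brian Kernighan and Dennis Ritchie"]
["Brian Kernighan and Dennis Ritchie"]

The first getSales comes back empty because getRecentSales falls back to { saleIteration: 0, authors: [] } when the table has no items yet.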

Our Sales API is good to go; there's just one last thing we need to do to deploy it to AWS. Our functions will be deployed onto real infrastructure (probably Lambdas), and those infrastructure components will need custom permissions to access DynamoDB. We specify these permissions by adding a roles property to our svs.json:

{
  "substrate": "aws",
  "aws": {
    "config": {
      "region": "us-west-2"
    },
    "roles": {
      "Sales": [
        {
          "action": [ "dynamodb:Query", "dynamodb:PutItem" ],
          "arn": "arn:aws:dynamodb:*:*:table/Sales"
        }
      ]
    }
  },
  "local": {
    "Http": {
      "port": 3001
    }
  }
}

There are a couple things to notice in this svs.json: the substrate is now aws rather than local (the local Http override sticks around for when we run locally), and the roles block gives the Sales service's functions only the DynamoDB permissions they need--Query and PutItem on the Sales table.

The next time you run the build command you'll deploy to AWS. This concludes your Sales service, and you should now be equipped to build real stuff with Strat.