# aftr

Azure Functions & Table Storage REST.

Simple CRUD REST APIs hosted on Azure Functions v2 and backed by Azure Table Storage.
```js
import aftr from 'aftr'

module.exports = aftr({
  table: 'TodosApp',
  name: 'todos',
  schema: {
    description: 'string',
    done: 'boolean',
    category: { type: 'string', optional: true },
    added: {
      type: 'string',
      default: () => (new Date()).toISOString(),
    },
  },
})
```
## Quickstart
If you're just getting started with Azure Functions, you might want to read the not-so-quickstart.
- Add the `aftr` package:

  ```shell
  yarn add aftr # or: npm install --save aftr
  ```

- Add an environment variable `AZURE_STORAGE_CONNECTION_STRING` to your app, set to your connection string.
  - You can find this under your App's Application Settings on the Azure Portal. Storage Accounts are automatically created by default with new Apps; you can probably copy the value of `AzureWebJobsStorage`. (That value is set for Azure Web Jobs; your App probably shouldn't use it directly.)

- Set the App's Node.js version to at least `8.9.0`.

- Create a table (or use an existing one) in the App's Storage Account.
- Configure `function.json` as follows. Change `api-name-here` as desired, and note that `req` and `res` should not be renamed. Disabling any of the HTTP methods is fine.

  ```json
  {
    "disabled": false,
    "bindings": [
      {
        "direction": "in",
        "type": "httpTrigger",
        "name": "req",
        "authLevel": "function",
        "methods": ["get", "post", "put", "patch", "delete"],
        "route": "api-name-here/{id?}"
      },
      {
        "direction": "out",
        "type": "http",
        "name": "res"
      }
    ]
  }
  ```
- In `index.js`, import `aftr`, configure it, and export what it churns out.

  ```js
  import aftr from 'aftr' // or: const aftr = require('aftr').default

  module.exports = aftr({
    table: 'TodosApp',
    name: 'todos',
    schema: {
      description: 'string',
      done: 'boolean',
      category: { type: 'string', optional: true },
      added: {
        type: 'string',
        default: () => (new Date()).toISOString(),
      },
    },
  })
  ```

  - Don't `export default`; Function Apps don't appear to look for a `.default` yet.

- Deploy your changes, and that should be it.
In this example, data is stored in the `TodosApp` table under the `todos` partition.

You can list the stored todos with a `GET`, create with `POST`, update with `PUT`/`PATCH`, and `DELETE` as expected. The latter methods will, of course, need an ID specified in the URI.
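Concretely, the requests might look like this. The host and function key below are placeholders, assuming the `function.json` above (with `authLevel: "function"`, so a `code` query parameter is needed):

```shell
# List the stored todos
curl "https://your-app.azurewebsites.net/api/todos?code=FUNCTION_KEY"

# Create a todo
curl -X POST -H "Content-Type: application/json" \
  -d '{"description": "buy milk", "done": false}' \
  "https://your-app.azurewebsites.net/api/todos?code=FUNCTION_KEY"

# Update and delete need the entity's ID in the URI
curl -X PATCH -H "Content-Type: application/json" -d '{"done": true}' \
  "https://your-app.azurewebsites.net/api/todos/SOME_ID?code=FUNCTION_KEY"
curl -X DELETE \
  "https://your-app.azurewebsites.net/api/todos/SOME_ID?code=FUNCTION_KEY"
```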
You can extend aftr by adding `pre`/`post` hooks to your config (see the reference), or even by creating a wrapper for the exported function.
## Reference
The `aftr` module (by default) exports a function that takes a single options object. It returns a function that accepts a `context` parameter (as expected by Functions).

```js
import aftr from 'aftr'

aftr({ ...options })
```
### Options

All options are optional unless marked as required.
#### `storageConnectionString`

Explicitly specifies the Azure Storage connection string to use. This overrides the environment variable `AZURE_STORAGE_CONNECTION_STRING`.

```js
storageConnectionString: "DefaultEndpointsProtocol=https;AccountName=aftr;AccountKey=01234567890abcdefghijklmnopqrstuvwxyz==;EndpointSuffix=core.windows.net"
```
#### `table` (required)

Name of the table to use. You must create this table yourself, either from the portal or programmatically; table creation/deletion can take a couple of minutes, so you'd be blocked from serving requests for that time anyway.

aftr works on the assumption that your App's Functions will share a table but use their own partitions; that is, one partition per entity type, or in Rails terms, one partition per resource.

```js
table: 'TodosApp'
```
Note: a common pattern is to share this option between functions, a la:

```js
// config.js
export default {
  table: 'TodosApp',
  cors: 'todosapp.com',
}
```

```js
// function/index.js
import aftr from 'aftr'
import config from '../config'

module.exports = aftr({
  ...config,
  schema: { ... },
})
```
#### `name` (required)

Alias: `plural`

Name of this entity type; also the name of the partition.

```js
name: 'todos'
```
#### `singular`

Singular form of `name`. Only used for logging and error strings.

Defaults to `name.slice(0, -1)`.

```js
singular: 'sheep'
```
#### `blobContainer`

Blob storage fields will store blobs in the container specified here. This option is required if you have any such fields.
#### `limit`

Maximum number of results to return at once for `GET` requests.

Defaults to infinity.

```js
limit: 10
```
#### `idDelimiter`

Delimiter for references-many fields. Should be a string; ideally one character.

Obviously, this must not be any of the characters used to make up IDs. aftr currently generates all IDs using `uuid/v4`, so that bars you from using anything in `[0-9a-z-]`.

Azure Table Storage doesn't have any relational integrity checking built in, so references are stored as strings. Where references-many fields exist, the many IDs must be delimited, hence this option.

aftr will eventually try to handle referential integrity, albeit in a passive way (#12).

Defaults to `','`.

```js
idDelimiter: ','
```
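To illustrate why the delimiter matters, a references-many field round-trips through a single string column. The helper names here are hypothetical, not part of aftr's API:

```js
const idDelimiter = ','

// Serialise an array of referenced IDs into one string field.
const packIds = (ids) => ids.join(idDelimiter)

// Parse the stored string back into an array of IDs.
const unpackIds = (field) => field === '' ? [] : field.split(idDelimiter)

const stored = packIds(['aaa-111', 'bbb-222'])
// stored === 'aaa-111,bbb-222'
const ids = unpackIds(stored)
// ids → ['aaa-111', 'bbb-222']
```

Any delimiter character occurring inside an ID would corrupt the split, hence the `[0-9a-z-]` restriction above.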
#### `cors`

Sends an `Access-Control-Allow-Origin: *` header for referers matching these domain(s).

Can be a string, an array, or `true` to always send the header.

```js
cors: ['abc.com', 'xyz.co.uk']
```
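The matching could look roughly like this sketch (not aftr's actual implementation; the hostname extraction is deliberately simplified):

```js
// Hypothetical sketch: decide whether to send the CORS header
// for a given Referer, based on the cors option.
const shouldSendCors = (cors, referer) => {
  if (cors === true) return true
  if (!cors || !referer) return false
  const domains = Array.isArray(cors) ? cors : [cors]
  // Crude hostname extraction, for illustration only.
  const host = referer.replace(/^https?:\/\//, '').split('/')[0]
  return domains.includes(host)
}

shouldSendCors(['abc.com', 'xyz.co.uk'], 'https://abc.com/page') // → true
shouldSendCors('todosapp.com', 'https://evil.example/') // → false
```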
#### `pre`

Hook function called before any processing of the request (on aftr's part) has taken place. The function is passed `ctx` as an argument.

If the function returns a promise, aftr will wait for it to resolve before doing anything else. Rejections will cause 500s.

If the returned promise resolves truthy, or if anything else truthy is returned from the function, execution on aftr's part is halted. In this case, calling `ctx.done()` is up to you. Don't forget this, or requests will time out.

```js
pre: (ctx) => {
  if(ctx.headers['call-it-quits']) {
    ctx.done()
    return true
  }
}
```
Note: an alternative way of adding a pre-hook is to create a wrapper function, which is perfectly valid. The following is mostly equivalent to the above:

```js
module.exports = (ctx) => {
  if(ctx.headers['call-it-quits'])
    ctx.done()
  else
    aftr({ ...options })(ctx)
}
```
#### `post`

Hook function called after all processing of the request (on aftr's part) has completed; it's effectively called in place of `ctx.done()`. `ctx` is passed as a parameter.

If this is set, calling `ctx.done()` is up to you. Don't forget this, or requests will time out.

```js
post: (ctx) => {
  // append spam to get-multiple requests
  if((ctx.req.method === 'GET') && ! ctx.req.params.id) {
    ctx.res.body.push('spam')
    ctx.done()
  }
}
```
#### `override`

Can be set to an object to override the handling of specific HTTP methods. It's a good idea to have a look at `src/index.js` to see what it is you're replacing! (See also the schema and storage reference below.)

`ctx` is passed to the handler functions. If a promise is returned, aftr will wait for it to resolve.

The return value (if not a promise), or the value the promise resolves with, will be further processed by aftr before ending the request. It should have the format:

```js
{
  result: [], // json to send to client
  errors: null, // if set to an array, sends { errors: [ ... ] }
  status: 200, // set a custom status code
}
```

If you'd like to prevent this and handle `ctx.res` yourself, return `null`. In this case, aftr will still call `ctx.done()` once the handler has completed/resolved (unless you've set a post-hook).

```js
override: {
  GET: (ctx) => {
    ctx.res.body = 'whatever'
    ctx.done()
    return null
  }
}
```
#### `schema` (required)

This must be an object mapping fields' names to their types (and other metadata).

All entities have an implicit `id` string (actually stored as the `RowKey`) which is generated upon the entity's creation (currently via `uuid/v4`).

```js
schema: {
  firstName: 'string',
  middleName: { type: 'string', optional: true },
  lastName: 'string',
}
```
#### `type`

Must be either `'string'`, `'number'` or `'boolean'`.

`fieldName: 'type'` implies `fieldName: { type: 'type' }`.

Neither Table Storage nor JSON supports many types; aftr takes after them. So, if you're at a loss as to how to store your data, the safe option is to serialise it into a `string`.

This is required unless `references` or `blobStorage` is specified.

```js
{ type: 'string' }
```
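For example, a value with no native type can be round-tripped through a `string` field with `JSON.stringify`/`JSON.parse`. This is a general technique on the caller's side, not something aftr does for you:

```js
// A structured value with no Table Storage equivalent.
const tags = ['urgent', 'home']

// Write: serialise before handing the value to the API.
const stored = JSON.stringify(tags) // '["urgent","home"]'

// Read: parse the string back out after fetching the entity.
const roundTripped = JSON.parse(stored)
```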
#### `references`

If set, should be a string equalling the name (plural) of another entity type to reference. This creates a field which will reference ID(s) of that entity type (see `many` below).

Currently, no referential integrity is enforced or checked (#12).

Implies `type: 'string'` if set.

```js
bestFriend: { references: 'people' }
```
#### `many`

If `references` is set, this allows the field to reference multiple entities. Referenced IDs are separated within the field by `idDelimiter`.

Set to an integer to specify the maximum number of referenced entities, or `true` for infinity.

```js
friends: { references: 'people', many: true }
```
#### `blobStorage`

If set, indicates that this field should upload its values to Azure Blob Storage. Practically, this is used to store files, though you should keep the payload limits for Function calls in mind.

When writing to these fields, provide a Base64-encoded string. When reading, a URL to the blob will be returned.

`Buffer.from()` is used to decode the string when uploading; this means that invalid Base64 will be ignored. If you want some protection against corrupt file uploads, set the `mimeTypes` option.

Implies `type: 'string'` if set.
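To see why invalid Base64 slips through, note that Node's `Buffer.from(str, 'base64')` silently skips characters outside the Base64 alphabet rather than throwing:

```js
// Valid Base64 decodes as expected.
const ok = Buffer.from('aGVsbG8=', 'base64').toString('utf8') // 'hello'

// Invalid characters are silently dropped, not rejected -- so a
// corrupt upload still "decodes" to something, with no error raised.
const junk = Buffer.from('!!not base64!!', 'base64')
```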
#### `mimeTypes`

For `blobStorage` fields, whitelists the MIME types allowed to be uploaded. Note that if the MIME type can't be detected, `application/octet-stream` will be assumed.

Can be a string or an array.
#### `optional`

If true, allows the field to be absent.

This means that, when reading/writing an entity, the field's key won't be present at all. Unless you interfere (eg. in the post-hook), the APIs produced by aftr won't ever return any `null`s.

When writing optional fields which have `default` defined (see below), `null` or `''` must be specified in the API call to prevent the default value being set.

```js
middleName: { type: 'string', optional: true }
```
#### `default`

Sets the field to the given default value when left unspecified.

This can be a function; if so, it is passed two parameters:

- The entity as specified by the API's user;
- A boolean which is true if the entity is being created (rather than updated).

It must, of course, return the value to be set.

Fields which have both `default` and `optional` set must be explicitly set to `null` or `''` in API calls in order to prevent setting the default.

```js
added: {
  type: 'string',
  default: () => (new Date()).toISOString(),
}
```
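Putting `default` and `optional` together, the behaviour described above can be sketched as follows (a hypothetical helper, not aftr's internals):

```js
// Hypothetical sketch of resolving a field's value on write.
// `spec` is the field's schema entry; `entity` is the user's payload.
const resolveField = (spec, entity, key, isCreate) => {
  const given = entity[key]
  // An explicit null or '' suppresses the default.
  if (given === null || given === '') return undefined
  if (given !== undefined) return given
  // Function defaults receive the entity and a "creating?" flag.
  if ('default' in spec)
    return typeof spec.default === 'function'
      ? spec.default(entity, isCreate)
      : spec.default
  return undefined
}

const spec = { type: 'string', optional: true, default: () => 'n/a' }
resolveField(spec, {}, 'note', true) // → 'n/a' (default applied)
resolveField(spec, { note: null }, 'note', true) // → undefined (suppressed)
resolveField(spec, { note: 'hi' }, 'note', true) // → 'hi'
```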
#### `min` / `max`

If set, specifies a minimum/maximum (inclusive) value for numbers, or length for strings.

```js
volume: { type: 'number', min: 0, max: 100 }
```
#### `step`

Sets a step to which to round numbers; set to `1` to ensure integer values.

```js
numberOfGoats: { type: 'number', step: 1, min: 0 }
```
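As a sketch of what these constraints mean for a number field (a hypothetical helper; aftr's actual validation lives in `src/index.js`):

```js
// Hypothetical sketch: round to the step, then check min/max (inclusive).
const checkNumber = (spec, value) => {
  const rounded = spec.step ? Math.round(value / spec.step) * spec.step : value
  if (spec.min !== undefined && rounded < spec.min) return { error: 'too small' }
  if (spec.max !== undefined && rounded > spec.max) return { error: 'too large' }
  return { value: rounded }
}

checkNumber({ step: 1, min: 0 }, 2.6) // → { value: 3 }
checkNumber({ min: 0, max: 100 }, 101) // → { error: 'too large' }
```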
### Schema functions

TODO

### Storage functions

TODO
## Development

You'll need Yarn and Node.js >= v8 installed.

The `Dockerfile` and `docker-compose.yml` are provided for convenience; the repo will be mounted writable at `/srv`. .NET Core, `func` and friends will be usable.
```shell
# Get a shell
[doug@workstation]$ docker-compose run aftr bash

# Install dependencies
[root@container]# yarn

# Test
[root@container]# AZURE_STORAGE_CONNECTION_STRING='...' yarn test

# Build
[root@container]# yarn build
```
`docker-compose up` on its own will `yarn`, build and test, but in order for the integration tests to pass you'll either need to append `-e AZURE_STORAGE_CONNECTION_STRING='...'` or put it in a file `.storage.env` in the root of the repo; this will be sourced in `test/run`. (This is because we currently can't run the storage emulator on Linux.)
### Code style

aftr doesn't use a linter. Just keep things tidy, ta.

Generally, follow the rules below, but as ever, break them if it helps readability:

- Indent with 2 spaces.
- Curly braces aren't compulsory; only add a block if it markedly increases readability in context.
- Don't be afraid to use ternary `?:` for conditional assignments.
- Code in paragraphs and chapters (one-line / two-line blanks).
- Abbreviations are sometimes good, especially if the longer equivalent might decrease readability.
- This said, avoid single-letter variables in large scopes.
### Release procedure

```shell
[doug@workstation]$ docker-compose run aftr bash
[root@container]# yarn
[root@container]# yarn build
[root@container]# yarn test
```

If all of the above went well:

- Update `CHANGELOG.md`
- Bump the version in `package.json`
- Commit the changes

Then:

```shell
[doug@workstation]$ git tag -a vX.X.X
[doug@workstation]$ git push --tags
[doug@workstation]$ npm login # if needed
[doug@workstation]$ npm publish
```
## License

MIT

Copyright 2018 Douglas Thompson
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.