# @sakuli/plugin-validator
The Sakuli plugin validator is used to validate Sakuli plugins and (container) environments.
Verification of tokens etc. is done using a secret contained in the plugin binary.
## Token verification
Sakuli license tokens are JSON Web Tokens.
For token handling, the project includes a header-only JWT library in `./src/include/jwt`.
It is one of the official recommendations from jwt.io and can be found on GitHub.
The main class for verification is `./src/pluginvalidator.cpp`, which provides a C++ N-API native module.
## Token audience
A `PluginValidator` is instantiated with a package name, which is either the actual package name of a Sakuli enterprise package as published on npmjs.com, like `@sakuli/forwarder-checkmk`, or the name of a Sakuli Docker image, e.g. `taconsol/sakuli`.
Within Sakuli's `packages/sakuli-cli/src/load-presets.function.ts#loadPresets` function, a new validator instance is created for each potential plugin.
Each enterprise plugin comes with a unique token which contains the package name as audience claim.
If the package name does not match the token audience, validation fails and the plugin will not be activated.
For Docker images, the image name is set via an environment variable in a Dockerfile:

```
...
ENV IMG=taconsol/sakuli
...
```
This environment variable is read by an additional validation script at container startup, which aborts the container start on an audience mismatch.
## nbf, exp, iat
Sakuli license tokens are valid for a limited time. For verification, each token contains the following timestamps:

- `iat` (Issued At): the UTC timestamp at token creation time
- `nbf` (Not Before): the UTC timestamp from which the token is valid
- `exp` (Expiration): the UTC timestamp at which the token becomes invalid
When validating a license token, the `nbf` and `exp` timestamps are taken into account.
The `iat` timestamp could be used to validate whether a license file has been created by us or not.
## Token categories
Sakuli plugins and environments are divided into (currently) four categories, which have values in increasing powers of 2:
```cpp
enum TokenCategory {
  S = 1,
  M = 2,
  L = 4,
  XL = 8
};
```
This allows us to enable a plugin in multiple categories by summing up all available categories.
Example: for a plugin enabled in categories `M` and `L`, the resulting category would be `6` (`M` + `L` => `2` + `4` = `6`).
A user token, on the other hand, can (and should) only exist in a single category, depending on the user's license.
Example: a Sakuli user has a license in category `M`, so the resulting category would be `2`.
The token category is set in a private claim called `category`.
Given both the category of a plugin or container and the category encoded in a user's license token, category validation is done by bitwise AND:

```cpp
bool matchingCategory = pluginToken.getCategory() & userToken.getCategory();
if (!matchingCategory) {
  ...
}
```
If the user's license were, for example, in category `S`, the result of `pluginToken.getCategory() & userToken.getCategory()` would be `0` and validation would fail.
## Issuer
Each token has an issuer claim, which in our case is always set to `sakuli.io`.
## Token subject
There are three different types of token:

- `sakuli_user`: a Sakuli license token which is given out to customers
- `sakuli_plugin`: a Sakuli plugin token which is shipped with an enterprise plugin
- `sakuli_container`: a Sakuli container token which is shipped with a Sakuli container

These types are set in the subject claim and validated when instantiating a new `Token`.
## Building and releasing
The packages are built on GitHub Actions using a matrix build for Windows, Linux and macOS.
The build reads the masterkey from a `PAROLE_VALUE` environment variable, which is configured as a GitHub Actions secret.
You can find the masterkey on pw.consol.de.
It is passed to the build in the `./CMakeLists.txt` file:

```cmake
add_compile_definitions(PAROLE="$ENV{PAROLE_VALUE}")
add_compile_definitions(OFFSET=0x15)
```
Both `PAROLE_VALUE` and `OFFSET` are used for obfuscation.
Tests contained in the project are built using the current masterkey.
Note that each test token has been generated with category `S`.
Since each enterprise package is at least in category `M`, these tokens cannot be misused.
Releases are done automatically. Whenever a new tag is pushed, GitHub Actions will publish the packages to npmjs.com.
Note: since we're dealing with platform-dependent packages, the project comes with a `./patch-packagename.js` file, which is used to update the package name.
It appends the platform name the build is running on to the package name, so at the end of a successful release, the following three packages will be published:

- `@sakuli/plugin-validator-darwin`
- `@sakuli/plugin-validator-linux`
- `@sakuli/plugin-validator-win32`
For usage in a project, there is an additional meta package called `plugin-validator-install`, which publishes the `@sakuli/plugin-validator` package.
This package includes all platform-dependent packages and only exports the one suitable for the current platform.
## Building and testing locally
Building and testing the package is a two-step process:
- Build native binding
- Run tests
### Building the native bindings
When building the native bindings, you have to configure the `PAROLE_VALUE` environment variable used to verify token data.
All test data has been generated using this value, so tests will (and should) fail when using a different value.
The required key can be found on pw.consol.de.
Two npm scripts exist for building the binding:

```shell
npm run build:debug
npm run build:release
```
`npm run build:debug` produces a build with debug symbols, while a `release` build does not.
Windows requires additional flags to use the appropriate generator. When building on Windows, please use one of the respective Windows-specific scripts:

```shell
npm run build:debug:win
npm run build:release:win
```
### Testing
All contained tests can be run via `npm t`.