journeys

1.0.2 • Public • Published

Simple API Testing with User Journeys

This small command-line utility provides a means to easily run "User Journeys" over a REST API, either as part of a development workflow or, by importing its sub-modules, as part of a continuous integration or deployment pipeline. The input object should provide key/value pairs, one per user journey, where each value is an array of actions defined in their simplest terms as follows:

{ 
    /*
        Fields marked * are optional.
        Functions are called in the scope of the "request".
        The request scope may or may not include the whole "test" scope.
    */
    
    userOne: [
        {
            description: 'Insert first test description',
            headers: {
                send: {
                    // Insert headers to send (key/value pairs)
                    // **OR** provide keys and functions that supply values
                    // Send empty object if none
                },
                receive: {
                    // Insert headers to expect in response (key/value pairs)
                    // **OR** provide keys and functions that supply values
                    // Send empty object if none
                }        
            },
            method: '[DELETE,GET,POST,PUT]',
            *send: 'INSERT JSON DATA' **OR** function() {
                return { ...MY JSON DATA... };
            },
            status: 'INSERT EXPECTED STATUS',
            url: 'INSERT ENDPOINT URL (FROM BASE (SEE BELOW))' **OR** function() {
                // Use this to add path variables or query params
                return 'ENDPOINT URL (FROM BASE (SEE BELOW))';
            },
            *validate: function(response) {
                // Insert response tests here
            },
            *then: function(response) {
                // Insert post actions here
            }
        },
{
            description: 'Insert second test description',
            
            // ... same fields as the first action
        },
        
        // ... Insert more actions ad infinitum
    ],
    userTwo: [ 
        ... 
    ],
    
    // ... Insert more users ad infinitum
}
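As a concrete (and purely hypothetical) illustration of the schema above, a journey file might look like the sketch below. The /users endpoint, field values and numeric status are assumptions for illustration only; the asterisks in the schema simply mark fields as optional, so unused optional fields are omitted here. Check /examples/simple.js and /examples/computed.js for the canonical form.

```javascript
// Hypothetical journey against an imaginary /users endpoint.
// Optional fields ("send", "validate", "then") are omitted when unused.
const journeys = {
    userOne: [
        {
            description: 'UserOne creates an account',
            headers: {
                send: { 'Content-Type': 'application/json' },
                receive: {}
            },
            method: 'POST',
            send: function() {
                return { name: 'Test User' };
            },
            status: 201, // numeric status assumed; see the bundled examples
            url: '/users',
            validate: function(response) {
                // e.g. assert that the response body echoes the posted name
            }
        },
        {
            description: 'UserOne lists accounts',
            headers: { send: {}, receive: {} },
            method: 'GET',
            status: 200,
            url: function() {
                // Computed URL: add path variables or query params here
                return '/users?limit=10';
            }
        }
    ]
};

module.exports = journeys;
```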

Each action within a particular user journey is performed in sequence, but each user journey executes in parallel with all other user journeys. With enough tests, or by adding your own means to auto-generate test data, this pattern could also be used for (but is not limited to) stress-testing APIs or testing the contracts between microservices.
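That execution model (sequential actions, parallel journeys) can be sketched with plain promises. This is purely illustrative and not the library's actual implementation; runAction is a stand-in for a real HTTP request plus its checks.

```javascript
// Stand-in for issuing one HTTP request and checking its response.
function runAction(action) {
    return Promise.resolve(action);
}

// Actions within one journey run strictly in sequence,
// each waiting for the previous one to settle.
function runJourney(actions) {
    return actions.reduce(
        (chain, action) => chain.then(() => runAction(action)),
        Promise.resolve()
    );
}

// ...while the journeys themselves run in parallel with each other.
function runAll(journeys) {
    return Promise.all(
        Object.keys(journeys).map(user => runJourney(journeys[user]))
    );
}
```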

It is expected that user journeys will be stored in their own file and imported/required. Since the testing functions can be required in the same manner and are likely to take a finite number of forms, the tests themselves could be maintained by a team with more automated-testing experience, while the declarative JSON syntax used to create user journeys could be written by those with more knowledge of customer requirements and usage patterns.

The utility provides console output in the following (success) form:

User Journey of UserOne:
    ✓ UserOne performs ActionA
    ✓ UserOne performs ActionB
    ✓ UserOne performs ActionC
    
User Journey of UserTwo:
    ✓ UserTwo performs ActionA
    ✓ UserTwo performs ActionB
    ✓ UserTwo performs ActionC

And in the following (fail) form:

User Journey of UserOne:
    ✓ UserOne performs ActionA
    ✗ UserOne performs ActionB 

Error is:
    expect(received).toBeFalsy()

Received: "5eb7a7c49443093064a8c50e"
Error: expect(received).toBeFalsy()

Received: "5eb7a7c49443093064a8c50e"
    at Object.test ({ Stack follows ... }) 

Once installed globally (see below), you can run the utility using:

journeys /path/to/user/journeys --url base_url_of_api_under_test

Two examples are provided: /examples/computed.js and /examples/simple.js. The same folder contains a very basic Express server that reflects its input, provided for demonstration. Once you have installed express (npm install express), uuid (npm install uuid) and jest (npm install jest), run node server.js from the /examples folder to start the server before running the examples.

Whilst the command line utility provides clear human-readable information as to the results of your tests, it is also possible to import sub-modules (most likely the file at src/journeys) and bind your results into a promise chain for the purposes of Continuous Integration or Continuous Deployment pipelines. This also allows for convenient control over pre- and post-test setup/teardown.
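A pipeline integration might look like the sketch below. The exact export and signature of src/journeys are not documented here, so runJourneys is a local stand-in that merely shows the shape of the chain; verify the real module's interface before use. setup and teardown are placeholders for your own fixtures.

```javascript
// Stand-in for the real export of src/journeys; assumed to accept the
// journeys object and a base URL, and to resolve with a result summary.
function runJourneys(journeys, baseUrl) {
    return Promise.resolve({ passed: true, baseUrl: baseUrl });
}

// Placeholder fixtures: e.g. seed a database / drop test data.
function setup() {
    return Promise.resolve();
}
function teardown() {
    return Promise.resolve();
}

setup()
    .then(() => runJourneys({ userOne: [] }, 'http://localhost:3000'))
    .then(results => {
        if (!results.passed) {
            process.exitCode = 1; // fail the pipeline stage
        }
        return teardown();
    })
    .catch(err => {
        console.error(err);
        process.exitCode = 1;
    });
```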

Installation

(1) Install globally for use on the command line with:

npm install -g journeys

(2) Install on a project basis, for use of sub-modules in a CI/CD pipeline, by omitting the global flag:

npm install journeys


License

ISC

Collaborators

  • mikey_pooh