@superfaceai/one-service

3.0.3 • Public • Published

Website | Get Started | Documentation | Discord | Twitter | Support


OneService


OneService lets you run OneSDK as a service with configured use cases, and use it as a backend for frontend.

For more details about Superface, visit how it works and get started.

Install

You can use this package as a globally installed CLI program:

npm install --global @superfaceai/one-service

Usage

To run OneService you need a Superface configuration.

  1. Create a new folder where the configuration will be created:

    mkdir myapp
    cd myapp
  2. Install use cases and configure providers:

    npx @superfaceai/cli install weather/current-city --providers wttr-in

    (Repeat for any use case you find in the Catalog.)

  3. Start OneService with GraphiQL:

    oneservice --graphiql
  4. Visit http://localhost:8000/ to open the GraphQL interactive IDE.
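Once the service is running, any HTTP client can POST queries to it. A minimal Node.js sketch (18+, for built-in fetch), assuming the default port 8000 and that the GraphQL endpoint is served at the root path; buildWeatherRequest is a hypothetical helper, not part of the package:

```javascript
// Hypothetical helper: builds a fetch request for the
// weather/current-city use case installed in step 2.
function buildWeatherRequest(city) {
  const query = `
    query WeatherInCity {
      WeatherCurrentCity {
        GetCurrentWeatherInCity(input: { city: ${JSON.stringify(city)} }) {
          result { temperature feelsLike description }
        }
      }
    }
  `;
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query }),
  };
}

// With the server from step 3 running:
// const res = await fetch('http://localhost:8000/', buildWeatherRequest('Prague'));
// const { data } = await res.json();
```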

Use as an HTTP server middleware

The OneService package provides a createGraphQLMiddleware function for mounting the GraphQL server as middleware in any HTTP web framework that supports Connect-style middleware. This includes Connect itself, Express, Polka, Restify, and others.

The createGraphQLMiddleware function returns a promise, so you need to await its resolution before mounting the middleware.

If you can use ES modules in your project, you can resolve the promise with top-level await:

// server.mjs
import express from 'express';
import { createGraphQLMiddleware } from '@superfaceai/one-service';

const app = express();
const graphqlMiddleware = await createGraphQLMiddleware({
  graphiql: true,
});

app.use('/graphql', graphqlMiddleware);

app.listen(3000);

Alternatively, you can set up the server inside an async function:

const express = require('express');
const { createGraphQLMiddleware } = require('@superfaceai/one-service');

async function startServer() {
  const app = express();
  const graphqlMiddleware = await createGraphQLMiddleware({
    graphiql: true,
  });
  app.use('/graphql', graphqlMiddleware);

  app.listen(3000);
}

startServer();

The createGraphQLMiddleware function throws an error if it cannot generate a GraphQL schema. Possible causes include:

  • a missing or invalid super.json
  • a name collision between use cases of installed profiles
  • a profile with features unsupported by GraphQL
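Because the promise rejects on these errors, a guard can report them before the server starts. This mountGraphQL helper is a hypothetical sketch, not part of the package; it works with any middleware factory that may reject:

```javascript
// Hypothetical guard around a rejecting middleware factory, such as
// createGraphQLMiddleware from '@superfaceai/one-service'.
async function mountGraphQL(app, createMiddleware) {
  try {
    const middleware = await createMiddleware({ graphiql: true });
    app.use('/graphql', middleware);
    return true;
  } catch (err) {
    // e.g. a missing super.json or a use-case name collision
    console.error('Could not generate GraphQL schema:', err.message);
    return false;
  }
}

// Usage with the real factory:
// const ok = await mountGraphQL(app, createGraphQLMiddleware);
// if (!ok) process.exit(1);
```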

Example Queries

query WeatherInPrague {
  WeatherCurrentCity {
    GetCurrentWeatherInCity(input: { city: "Prague" }) {
      result {
        temperature
        feelsLike
        description
      }
    }
  }
}

query SelectProvider {
  WeatherCurrentCity {
    GetCurrentWeatherInCity(
      input: { city: "Prague" }
      provider: { mock: { active: true } } # active flag is optional if only one provider is configured
    ) {
      result {
        temperature
        feelsLike
        description
      }
    }
  }
}

query InstalledProfilesAndProviders {
  _superJson {
    profiles {
      name
      version
      providers
    }
    providers
  }
}

Use with .env file

OneService doesn't load a .env file automatically. You can use the dotenv package for this functionality. To use the OneService CLI with dotenv, follow these steps:

  1. Install dotenv and OneService locally in your project:

    npm i dotenv @superfaceai/one-service
    
  2. Run the CLI via node with dotenv preload:

    node -r dotenv/config node_modules/.bin/oneservice
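For convenience, the preload command can be saved as an npm script in package.json (a sketch; the script name and extra flags are arbitrary):

```json
{
  "scripts": {
    "start": "node -r dotenv/config node_modules/.bin/oneservice --graphiql"
  }
}
```

Then run it with npm start.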
    

Deployment

Considerations

OneService doesn't provide authentication or CORS support. Handle these in an API gateway (e.g. Express Gateway or Kong), or attach the middleware to your own Express instance.
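If you attach the middleware to your own Express instance, CORS can be handled by another middleware mounted in front of it. A minimal hand-rolled Connect-style sketch (in practice the cors npm package is the usual choice; the allowed origin below is a placeholder):

```javascript
// Minimal Connect-style CORS middleware for a single fixed origin (sketch).
function allowOrigin(origin) {
  return (req, res, next) => {
    res.setHeader('Access-Control-Allow-Origin', origin);
    res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
    res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
    if (req.method === 'OPTIONS') {
      res.statusCode = 204; // answer the preflight here
      return res.end();
    }
    next();
  };
}

// app.use(allowOrigin('https://myapp.example'));
// app.use('/graphql', graphqlMiddleware);
```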

Heroku

OneService can be deployed to Heroku with Git.

Prerequisites

  • a Heroku account and the Heroku CLI installed and authenticated (heroku login)
  • Git

Steps

  1. Create a folder for the application with a local Git repository:

    mkdir myapp
    cd myapp
    git init
  2. Install OneService:

    npm init -y
    npm install --save @superfaceai/one-service
  3. Install profiles as explained in Usage step 2.

  4. Create a Heroku Procfile:

    echo 'web: oneservice --port $PORT --host 0.0.0.0 --graphiql --logs' > Procfile
  5. Commit changes to Git repository

    git add --all
    git commit -m 'OneService configuration'
  6. Create Heroku remote

    heroku create
  7. Deploy app

    git push heroku main

Logs

OneService can print structured logs. To enable them, use the --logs=<level> flag:

oneservice --logs
oneservice --logs=trace

The level specifier is optional. Available levels, ordered by priority:

  • fatal
  • error
  • warn
  • info
  • debug
  • trace
  • silent

The logging level is a minimum threshold: when set, every level of higher priority is logged as well. For instance, if the level is info, then all fatal, error, warn, and info logs are enabled.

You can pass silent to disable logging.

Pretty print

To print structured logs in a readable way during testing or development, you can use pino-pretty.

Install pino-pretty:

npm install -g pino-pretty

Run OneService with pretty print:

oneservice --logs=trace | pino-pretty

Contributing

We welcome all kinds of contributions! Please see the Contribution Guide to learn how to participate.

License

OneService is licensed under the MIT License.

© 2023 Superface s.r.o.
