@axelspringer/graphql-google-pubsub



    This package implements the PubSubEngine interface from the graphql-subscriptions package and also the new AsyncIterator interface. It allows you to connect your subscription manager to a Google PubSub mechanism to support multiple subscription manager instances.


    Installation

    npm install @axelspringer/graphql-google-pubsub

    or

    yarn add @axelspringer/graphql-google-pubsub

    Using as AsyncIterator

    Define your GraphQL schema with a Subscription type:

    schema {
      query: Query
      mutation: Mutation
      subscription: Subscription
    }

    type Subscription {
      somethingChanged: Result
    }

    type Result {
      id: String
    }

    Now, let's create a simple GooglePubSub instance:

    import { GooglePubSub } from '@axelspringer/graphql-google-pubsub';
    const pubsub = new GooglePubSub();

    Now, implement your Subscription type resolvers, using pubsub.asyncIterator to map to the event you need:

    const SOMETHING_CHANGED_TOPIC = 'something_changed';

    export const resolvers = {
      Subscription: {
        somethingChanged: {
          subscribe: () => pubsub.asyncIterator(SOMETHING_CHANGED_TOPIC),
        },
      },
    };

    Subscription resolvers are not functions, but objects with a subscribe method that returns an AsyncIterable.
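The contract the engine relies on can be sketched with a hand-rolled AsyncIterable standing in for pubsub.asyncIterator (the generator name here is illustrative only):

```typescript
// Minimal sketch of the AsyncIterable a subscription's subscribe method returns.
// fakeSomethingChanged stands in for pubsub.asyncIterator(SOMETHING_CHANGED_TOPIC).
async function* fakeSomethingChanged() {
  yield { somethingChanged: { id: '123' } };
}

// The GraphQL engine pulls events from the iterable roughly like this:
for await (const event of fakeSomethingChanged()) {
  console.log(event.somethingChanged.id); // each published payload arrives here
}
```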

    Calling the asyncIterator method of the GooglePubSub instance subscribes to the provided topic and returns an AsyncIterator bound to the GooglePubSub instance, listening to any event published on that topic. The GraphQL engine now knows that somethingChanged is a subscription. Every time we call pubsub.publish over this topic, GooglePubSub will publish the event to all other subscribed instances, which in turn will emit the event to GraphQL using the next callback given by the GraphQL engine.

    pubsub.publish(SOMETHING_CHANGED_TOPIC, { somethingChanged: { id: "123" }});

    The topic doesn't get created automatically; it has to be created beforehand.
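One way to create the topic ahead of time is with the official @google-cloud/pubsub client; a minimal setup sketch, assuming that library is installed and credentials are configured (the project ID and topic name are placeholders):

```typescript
import { PubSub } from '@google-cloud/pubsub';

// One-off setup: create the topic before any publish/subscribe happens.
// 'project-abc' and 'something_changed' are placeholder values.
const client = new PubSub({ projectId: 'project-abc' });
await client.createTopic('something_changed');
```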

    If you publish non-string data, it gets stringified, and you have to parse the received message data.
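For example, if an object was published, the message data arrives as a Buffer holding its JSON string form and has to be parsed back on the receiving side; a minimal sketch:

```typescript
// What gets published (an object, i.e. non-string data) ...
const published = { somethingChanged: { id: '123' } };

// ... is stringified on the way in; the subscriber receives it as Buffer data.
const received = Buffer.from(JSON.stringify(published));

// So the receiving side parses it back before use.
const payload = JSON.parse(received.toString());
console.log(payload.somethingChanged.id); // '123'
```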

    Receive Messages

    The received message from Google PubSub gets directly passed as payload to the resolve/filter function.

    You might extract the data (a Buffer) there or use a common message handler to transform the received message.

    function commonMessageHandler({ attributes = {}, data = '' }) {
      return {
        text: data.toString(),
      };
    }

    The "can use custom message handler" test illustrates the flexibility of the common message handler.

    Dynamically use a topic based on subscription args passed on the query:

    export const resolvers = {
      Subscription: {
        somethingChanged: {
          subscribe: (_, args) => pubsub.asyncIterator(`${SOMETHING_CHANGED_TOPIC}.${args.relevantId}`),
        },
      },
    };

    Using both arguments and payload to filter events

    import { withFilter } from 'graphql-subscriptions';

    export const resolvers = {
      Subscription: {
        somethingChanged: {
          subscribe: withFilter(
            (_, args) => pubsub.asyncIterator(`${SOMETHING_CHANGED_TOPIC}.${args.relevantId}`),
            (payload, variables) => payload.somethingChanged.id === variables.relevantId,
          ),
        },
      },
    };

    Creating the Google PubSub Client

    import { GooglePubSub } from '@axelspringer/graphql-google-pubsub';
    const pubSub = new GooglePubSub(options, topic2SubName, commonMessageHandler)


    Options

    These are the options which are passed to the internal or passed Google PubSub client. The client will extract credentials, the project name etc. from environment variables if provided. Have a look at the authentication guide for more information. Otherwise you can provide these details in the options.

    const options = {
      projectId: 'project-abc',
      credentials: {
        client_email: '',
        private_key: '-BEGIN PRIVATE KEY-\nsample\n-END PRIVATE KEY-\n'
      }
    };

    Subscription Options

    Subscription options can be passed into subscribe or asyncIterator.

    Note: google.protobuf.Duration types must be passed in as an object with a seconds property ({ seconds: 123 }).

    const dayInSeconds = 60 * 60 * 24;

    const subscriptionOptions = {
      messageRetentionDuration: { seconds: dayInSeconds },
      expirationPolicy: {
        ttl: { seconds: dayInSeconds * 2 }, // 2 days
      },
    };

    await pubsub.asyncIterator("abc123", subscriptionOptions);


    topic2SubName

    This allows building different workflows. If multiple server instances listen to the same subscription, the messages get distributed between them. Most of the time you want a different subscription per server; that way every server instance can inform its clients about a new message.

    const topic2SubName = topicName => `${topicName}-${serverName}-subscription`
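For instance, with a per-server name (serverName here is an assumed value; in practice it might come from an environment variable), each instance gets its own subscription per topic. The resulting function is passed as the second constructor argument shown above:

```typescript
// Assumed per-server identifier; in practice e.g. process.env.SERVER_NAME.
const serverName = 'server-1';

// Maps a topic name to a server-specific subscription name.
const topic2SubName = (topicName: string) => `${topicName}-${serverName}-subscription`;

console.log(topic2SubName('something_changed')); // 'something_changed-server-1-subscription'
```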


    commonMessageHandler

    The common message handler gets called with the message received from Google PubSub. You can transform the message before it is passed to the individual filter/resolver methods of the subscribers. This way it is, for example, possible to inject one instance of a DataLoader which can be used in all filter/resolver methods.

    const getDataLoader = () => new DataLoader(...);
    const commonMessageHandler = ({ attributes: { id }, data }) => ({ id, dataLoader: getDataLoader() });

    export const resolvers = {
      Subscription: {
        somethingChanged: {
          resolve: ({ id, dataLoader }) => dataLoader.load(id),
        },
      },
    };


    Jonas Hackenberg - jonas-arkulpa


    This project is mostly inspired by graphql-redis-subscriptions. Many thanks to its authors for their work and inspiration. Thanks to the Lean Team (Daniel Vogel, Martin Thomas, Marcel Dohnal, Florian Tatzky, Sebastian Herrlinger, Mircea Craculeac and Tim Susa).

