overlooker


This package is a set of utilities for setting up frontend performance profiling in your CI/CD pipeline. Alongside the measurements, you get comprehensive information about what your pages contain.

Overlooker lets you batch-profile a set of pages for all of the metrics relevant to the frontend. You can set the number of measurements per page, write scripts whose execution speed should be measured, analyze the content of the resources on your pages, and much more.

In addition, you can compare the collected data with previous profiles and get a report on what has changed on your pages.

As a performance-testing tool, it lets you set thresholds that limit performance degradation in your project and help you track down exactly what caused it.

Installation

npm i overlooker

Usage

Configuration (types)

First, let's figure out how to start profiling.

const config = {
  host: 'https://example.com', // profiling host - it will be prepended to the url of each page
  throttling: { // like a throttling in Chrome DevTools Performance
    cpu: 1,
    network: 'WiFi'
  },
  cookies: [{ // an array of cookies to be added for all pages when profiling
      name: 'cookie_name',
      value: 'cookie_value',
      domain: 'example.com'
    }],
  cache: { // cache to use - either the built-in wpr binary or your own proxy
    type: 'wpr'
    // to use your own proxy:
    // {
    //   type: 'proxy',
    //   host: string,
    //   restart?: () => Promise<any>
    // }
  },
  count: 10, // number of profiles for each page
  platform: 'desktop', // platform which will be used for profiling (desktop|mobile)
  pages: [{ // array of profiling pages
    name: 'main',
    url: '/',
    layers: {
      meaningfulLayer: '.selector'
    },
    actions: [{ // each page can include several scripts that will be executed after the page is loaded
      name: 'test-action',
      action: async (page) => {
        await page.click('button');
        await page.waitForSelector('#loaded-image');
      }
    }]
  }, {
    name: 'category',
    url: '/'
  }],
  logger: (msg) => console.log(msg), // logger for profiling - it will receive messages during the profiling process
  buildData: { // URL or getter to get build data (generated by Bundle Internals Plugin) to assemble complete profiling data
    url: '/build.json',
  },
  requests: { // some functions for filtering requests
    ignore: (url) => url.includes('ad'), // ignoring some urls by pattern
    merge: (url) => url.includes('stats'), // merge duplicate requests
    internalTest: (url) => url.startsWith('https://example.com') || url.startsWith('https://example.io'), // pattern for detecting internal resources
  }
};

Most of the parameters are optional, and a minimal config can look like this:

const config = {
  host: 'https://example.com',
  pages: [{
    name: 'main',
    url: '/',
  }],
  count: 10
};

To collect some product-centric metrics, you can use standard web APIs on your pages:

Element Timing API

<span elementtiming="some-element-paint"></span>

User Timing API

// to detect User Timing API calls, first set up a pattern for mark names in the profiling configuration
const config = {
  // ...
  customMetrics: {
    timing: /^product-timing\.(.*?)$/i // use string 'all' to collect all timings
  }
}

Then, on your page, execute:

performance.mark('product-timing.some-metric');

These metrics will be collected during profiling and included in the resulting JSON.
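For example, to time a specific piece of your application's work, you can place a pair of marks whose names match the configured pattern. A sketch; renderWidget here is a hypothetical function from your own code:

// both mark names match the /^product-timing\..*/ pattern configured above
performance.mark('product-timing.widget-render-start');

renderWidget(); // hypothetical application code

performance.mark('product-timing.widget-render-end');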

Profiling (types)

Profiling itself is simple to run:

const { profile } = require('overlooker');

const profileResult = await profile({
  ...config,
  host: 'https://master.example.com'
});

await db.saveProfile(revision, profileResult);

As a result, you will receive performance data for your pages in JSON format. I recommend saving this data to your favorite database, or directly to the file system (do not forget about rotation), keyed by the identifier of the measured revision.
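If you do not have a database at hand, a minimal file-system sketch could look like this (the profiles directory and the revision-based file names are assumptions, not part of the overlooker API):

const fs = require('fs/promises');
const path = require('path');

// store each profile as <revision>.json inside a local profiles directory
const saveProfile = async (revision, data) => {
  const dir = path.join(__dirname, 'profiles');

  await fs.mkdir(dir, { recursive: true });
  await fs.writeFile(path.join(dir, `${revision}.json`), JSON.stringify(data));
};

await saveProfile(revision, profileResult);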

Impact Analysis (types)

To reduce the cost of profiling and speed it up, I recommend using the page impact analyzer. To use it, you need the data from a previous impact analysis.

const { impactAnalysis, affectConfigByImpact } = require('overlooker');

const impactData = await impactAnalysis(
  masterDescription, // impact data about the pages from the previous impact analysis
  config, // the same configuration as for profiling
  (element) => element.request.url.includes('ad') // element filter for collecting stable impact data (for example, you can filter out dynamic ad urls)
);

await db.saveImpactData(revision, impactData);

const impactedConfig = affectConfigByImpact(config, impactData);

As a result of executing this code, you will get a configuration containing only the pages that changed compared to the other revision. Impact analysis data is also best saved in a database.
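The impacted configuration can then be passed straight back to profiling, so only the changed pages are measured. A minimal sketch, reusing the db helper and revision identifier from the profiling example above:

const { profile } = require('overlooker');

// profile only the pages that the impact analysis marked as changed
const impactedProfile = await profile(impactedConfig);

await db.saveProfile(revision, impactedProfile);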

Comparison (types)

After profiling is done, you can compare the result against a profile from an earlier revision, for example the revision from which the branch you are testing was forked.

const { comparePages } = require('overlooker');

const profileDataFeature = await db.getProfileByRevision(featureRevision);
const profileDataMaster = await db.getProfileByRevision(masterRevision);

const comparison = comparePages(profileDataMaster, profileDataFeature);

As a result of the comparison, you will get the full difference for all metrics, requests, and modules in chunks (if you used the bundle-internals-plugin), and this data can be used to analyse the performance impact of your code changes.
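A minimal sketch of reading a single value out of the comparison, assuming the object is nested along the same dotted paths used by the thresholds below and keyed by page name (both the getByPath helper and the 'main' key are illustrative):

// resolve a dotted path such as 'percent.stats.userCentric.timeToInteractive.median'
const getByPath = (obj, dottedPath) =>
  dottedPath.split('.').reduce((node, key) => (node ? node[key] : undefined), obj);

// assuming 'percent' values are fractions (0.05 == 5%), as in the thresholds below
const ttiDelta = getByPath(comparison.main, 'percent.stats.userCentric.timeToInteractive.median');

console.log(`TTI median changed by ${(ttiDelta * 100).toFixed(1)}%`);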

Analyzing the comparison results (types)

There is also a separate method to check the comparison results against custom thresholds. First, let's see how to define thresholds for your performance metrics; they let you set the deviation limits you need.

Example of thresholds:

const thresholds = {
  default: { // default thresholds (used for all pages)
    'percent.stats.userCentric.timeToInteractive.median': 0.05, // path to a value in the comparison object and the deviation limit for it
    'percent.stats.elementsTimings.**.median': 0.05 // a path with ** covers all nested paths down to the keys after this point
  },
  main: { // thresholds for a single page whose name is 'main' in the profiling configuration
    'percent.stats.custom.timings.*.median': 0.1, // a path with * covers all keys on this level
    'percent.stats.custom.userCentric.{timeToInteractive, speedIndex}.median': 0.1 // a path with {} covers all the listed keys on this level
  }
}

And you can use these thresholds to check your comparison:

const { check } = require('overlooker');

const result = check(comparison, thresholds);

if (!result.success) {
  process.exit(1); // for example, fail the build when any threshold is exceeded
}

You can also use this approach as a performance budget by running the check method on a single profile.
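A hypothetical budget, assuming the same path/limit format as the thresholds above but with paths into the profile data and absolute limits (the exact paths depend on your profile structure):

const budget = {
  default: {
    // hypothetical path and limit - adjust to your real profile structure and targets
    'stats.userCentric.timeToInteractive.median': 3500
  }
};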

const { check } = require('overlooker');

const result = check(profileDataFeature, budget);

Tool for getting build data

Bundle Internals Plugin

Tool for viewing trace events

flame-chart-js
