    ccht

    Command-line Crawling HTTP Testing tool

    ccht is a simple command-line tool to crawl your website and test the HTTP status codes of its resources, like a broken link checker.

    Installation

    You can skip installation if you use npx for a one-time invocation.

    $ npm i -D ccht
    
    # or
    $ yarn add -D ccht

    Usage

    ccht [options] <url>
    
    # to crawl and test "https://example.com"
    $ npx ccht 'https://example.com'
    
    # to show help
    $ npx ccht --help

    ccht will crawl the site starting from the given URL.

    Options

    To see more options, run npx ccht --help.

    Global Options

    --crawler <name>

    Choose the crawler. Available crawlers:

    node-http

    Default. Crawls pages by using Node.js' HTTP module and cheerio.

    puppeteer

    Crawls pages using a real browser through Puppeteer. You need to install puppeteer (npm i -D puppeteer) or configure your environment (a browser DevTools protocol connection, or a browser executable).
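
    For example, to crawl with the Puppeteer crawler (assuming puppeteer is already installed):

    $ npx ccht 'https://example.com' --crawler=puppeteer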

    --reporter <name>

    Specify reporter, which formats and outputs the test result.

    code-frame

    Default. Outputs a human-friendly, visualized result.

    json

    Prints a JSON string. Useful for programmatic access to results.

    $ npx ccht 'https://example.com' --reporter=json | jq

    --include <urls>

    A comma-separated list of URLs to include in the result. Any URL that forward-matches one of them will be crawled and reported.

    Defaults to the given URL. For example, with npx ccht 'https://example.com', --include defaults to https://example.com.
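
    For example, to also crawl and report pages under a second prefix (the extra URL here is only an illustration):

    $ npx ccht 'https://example.com' --include='https://example.com,https://www.example.com'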

    --exclude <urls>

    A comma-separated list of URLs to exclude from the result. Any URL that forward-matches one of them will be skipped and removed from the result.
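
    For example, to skip everything under a hypothetical /archive path:

    $ npx ccht 'https://example.com' --exclude='https://example.com/archive'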

    --expected-status <HTTP status codes>

    A comma-separated list of expected HTTP status codes for pages. Any page that responds with another status code results in an error (unexpected_status).

    Defaults to 200.
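
    For example, to also treat redirect responses as expected:

    $ npx ccht 'https://example.com' --expected-status=200,301,302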

    --exit-error-severity <list of severity>

    Change which severities cause exit status 1. The available severities are:

    • danger
    • warning
    • info
    • debug

    Defaults to danger.
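
    For example, to make warnings fail the run as well (assuming the option accepts a comma-separated list, as <list of severity> suggests):

    $ npx ccht 'https://example.com' --exit-error-severity=danger,warning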

    Crawler Options

    --timeout <ms>

    Timeout for each page to load/respond during the crawling phase. The value is passed directly to node-fetch or Puppeteer.

    Defaults to 3000 (3s).

    --concurrency <uint>

    How many connections can exist at the same time, i.e. the size of the connection pool.

    Defaults to 1.
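
    For example, to allow slower pages more time to respond and crawl with more parallel connections:

    $ npx ccht 'https://example.com' --timeout=10000 --concurrency=4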
