


    A contact-information scraping tool for programmatic and command-line use. Hoo will scrape webpages looking for personal websites, email addresses, Twitter handles, and Github usernames, and return completed user profiles as JSON or CSV.

    npm install -g hoo

    Command-line Usage

    This is a tool for quickly gathering contact information. Just provide a Twitter handle @compooter, a Github username ^andrejewski, or even just a plain website url, and Hoo will try to figure out the remaining details.

    # these all do the same thing
    hoo @compooter
    hoo ^andrejewski
    { fullname: 'Chris Andrejewski',
      github: [ 'andrejewski' ],
      url: [ '' ],
      email: [ '' ],
      twitter: [ 'compooter' ] }

    Hoo works fine with multiple names, although passing many at once will take longer.

    hoo @compooter ^tj @iamdevloper

    Output as JSON or CSV

    By default, all output is in JSON. Passing the --csv flag will change all output to CSV.

    hoo @compooter --csv
    hoo @compooter -c
    Chris Andrejewski,compooter,,,andrejewski

    Writing to a file

    Pass --output <filename> and Hoo will write its output to a file instead of printing it. This works as you would expect in combination with the CSV flag as well.

    hoo @compooter ^tj --output output.json
    hoo @compooter ^tj -o output.json
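
    Combining the two flags writes CSV to a file, for example (the filename here is just an illustration):

    hoo @compooter ^tj --csv --output output.csv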

    For JSON, the results array is grouped into the "people" key.

      "people": [
          "fullname": "Chris Andrejewski",
          "twitter": [
          "url": [
          "email": [
          "github": [
          "fullname": "TJ Holowaychuk",
          "github": [
          "url": [
          "email": [

    More options

    See hoo --help for more options including colored output, debugging activity, and selecting only certain fields.

    Programmatic Usage

    Hoo is designed to be entirely configurable. The command-line interface uses some default scrapers, but an instance of the Hoo class initially has none. Scrapers are added just as you would add Express/Connect middleware.

    var Hoo = require('hoo');
    var hoo = new Hoo();
    // scrapers are added to `hoo` like Express/Connect middleware;
    // the lookup call below is an assumption (the original snippet was truncated)
    var names = ['@compooter', '^tj'];
    hoo.lookup(names, function(error, records) {
        // do something awesome
    });

    Hoo includes Email (default), Twitter, and Github web scrapers, but that doesn't mean new ones cannot be made. In fact, that is why they all extend the same base Scrapper class. Building a new scraper is easy.

    var Scrapper = require('hoo').Scrapper;

    class MyScrapper extends Scrapper {
        constructor(options) {
            /* options passed to new Hoo() are passed to each Scrapper added to it */
            super(options);
        }
        expandArg(arg) {
            /* this allows the twitter/github scrapers to expand usernames to urls */
            return arg;
        }
        processWebpage(webpage, record, next) {
            /*
               take any webpage and extract contact information to put on the record,
               find new webpage urls to call,
               calling next when done.
               Process `webpage` like it's jQuery, e.g.:
                   var $ = webpage; $('#myElement').text();
            */
            next(null /* or an error */, [/* optional urls */]);
        }
    }

    Note that while ES6 classes are used here, you do not need to extend the Scrapper class for your own scraper. Just be sure to implement the same methods on your prototyped class.
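
    For example, a minimal prototype-based scraper might look like the sketch below. It mirrors the method names from the class example above and is only an illustration, not the package's own code; what the constructor does with `options` is an assumption.

    function MyScrapper(options) {
        // options passed to new Hoo() arrive here, as in the class version
        // (storing them is just for illustration)
        this.options = options;
    }

    MyScrapper.prototype.expandArg = function (arg) {
        // expand a shorthand argument (e.g. a username) into a url if needed
        return arg;
    };

    MyScrapper.prototype.processWebpage = function (webpage, record, next) {
        // `webpage` behaves like jQuery: var $ = webpage; $('#myElement').text();
        // put extracted contact details on `record`, then call next with any
        // additional urls to visit
        next(null, []);
    };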


    If you like Hoo enough to contribute, sweet. As the markup of scraped webpages changes, Hoo will need to be updated to match, so open an issue or pull request if a scraper is broken. If you have a scraper you would like to add to Hoo, send a pull request. Any other issues are welcome too.

    npm install # dependencies
    npm run build # to build
    npm run pre-publish # to pre-publish for pull requests

    Follow me on Twitter for updates or just for the lolz and please check out my other repositories if I have earned it. I thank you for reading.

