
    build-wikipedia-feed

    Build a hyperdb feed of Wikipedia articles, including historical revisions.

    Rationale

    Problem

    Wikipedia is an incredibly important collection of knowledge on the internet. It is free for everyone to read and edit. However, it is stored on only a handful of servers in a handful of countries, controlled by a single organisation. This causes two main problems:

    • Currently, it is too easy to censor Wikipedia. We need a system that provides redundancy without any additional effort.
    • Wikipedia does not work offline. Making an offline copy is complicated, and you usually have to download all articles for a language.

    Solution

    Let's store Wikipedia's content in a peer-to-peer (P2P) system. By leveraging software from the Dat project, we don't have to reinvent the wheel: the Dat protocol efficiently syncs only the changes between two versions of the data, allows sparse & live replication, and is completely distributed.

    This tool can extract articles from a Wikipedia dump or download them directly, and store them in a Dat archive. See below for more details.

    Installing

    npm install -g build-wikipedia-feed
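
    The global install puts the command line tools on your PATH. Assuming a Unix-like shell, you can verify that they are available with which:

    which wiki-revisions-list wiki-live-revisions wiki-store-revisions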

    Usage

    This module exposes several command line building blocks.

    read all revisions of every article

    Pipe a stub-meta-history* XML file into wiki-revisions-list. You will get an ndjson list of page revisions.

    curl -s 'https://dumps.wikimedia.org/enwiki/20181001/enwiki-20181001-stub-meta-history1.xml.gz' | gunzip | wiki-revisions-list >revisions.ndjson
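
    ndjson means one JSON record per line. To peek at the first revision record, assuming you have jq installed:

    head -n 1 revisions.ndjson | jq .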

    read the most recent revision of every article

    Pipe a stub-meta-current* XML file into wiki-revisions-list. You will get an ndjson list of page revisions.

    curl -s 'https://dumps.wikimedia.org/enwiki/20181001/enwiki-20181001-stub-meta-current1.xml.gz' | gunzip | wiki-revisions-list >revisions.ndjson
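
    Because ndjson stores one revision per line, you can count how many revisions were extracted with wc:

    wc -l revisions.ndjson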

    read articles being edited right now

    Use wiki-live-revisions. You will get an ndjson list of page revisions.

    wiki-live-revisions >revisions.ndjson
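
    Note that wiki-live-revisions follows edits as they happen, so it will keep running until you stop it. Assuming you only need a sample, you can stop after a fixed number of revisions with head:

    wiki-live-revisions | head -n 100 >revisions.ndjson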

    fetch & store revisions in a hyperdb

    Use wiki-store-revisions to write the HTML content of all revisions in revisions.ndjson into a hyperdb. The archive will be created under p2p-wiki in your system's data directory.

    cat revisions.ndjson | wiki-store-revisions
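
    Because wiki-revisions-list writes ndjson to stdout and wiki-store-revisions reads it from stdin, the building blocks compose into a single pipeline, without the intermediate file:

    curl -s 'https://dumps.wikimedia.org/enwiki/20181001/enwiki-20181001-stub-meta-current1.xml.gz' | gunzip | wiki-revisions-list | wiki-store-revisions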

    Contributing

    If you have a question or have difficulties using build-wikipedia-feed, please double-check your code and setup first. If you think you have found a bug or want to propose a feature, refer to the issues page.
