
# NodeJS Gnip module

Connect to the Gnip streaming API and manage rules. You must have a Gnip account with a data source available, such as Twitter PowerTrack.

Currently, this module only supports the JSON activity stream format, so you must enable data normalization in your admin panel.


## Gnip.Stream

This class is an EventEmitter and allows you to connect to the stream and start receiving data.

##### Constructor options


###### timeout

As requested in the Gnip docs, this option allows us to set a read timeout in the client. The recommended value is >= 30 seconds, so the constructor will throw an error if a smaller timeout is provided. The default value for this option is 35 seconds.
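The timeout rule described above can be sketched as follows (a hypothetical illustration, not the module's actual code; the function name `resolveTimeout` is made up, and values are in milliseconds):

```javascript
// Hypothetical sketch of the timeout rule: values below 30 seconds
// are rejected, and 35 seconds is used when no value is given.
function resolveTimeout(timeoutMs) {
    if (timeoutMs === undefined) return 35000; // default: 35 seconds
    if (timeoutMs < 30000) {
        throw new Error('timeout must be at least 30000 ms');
    }
    return timeoutMs;
}
```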


###### backfillMinutes

Number of minutes to backfill after connecting to the stream. Optional. The value should be between 0 and 5.


###### partition

Partition of the Firehose stream you want to connect to. Only required for Firehose streams.


###### parser

Parser library for incoming JSON data. Optional; defaults to the native JSON parser.
Matching tag IDs are sent to us as big integers, which can't be reliably parsed by the native JSON library in Node.js. If you rely on tag IDs, you can use the excellent json-bigint library:

```js
var JSONbig = require('json-bigint');

var stream = new Gnip.Stream({
	parser : JSONbig
});
```
More info on this issue can be found on StackOverflow.
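The precision loss is easy to reproduce with the native parser alone (a standalone illustration, independent of Gnip; the ID value is made up):

```javascript
// IDs above Number.MAX_SAFE_INTEGER (2^53 - 1) silently lose precision
// when parsed with the native JSON parser.
var parsed = JSON.parse('{"tag": 9007199254740993}');
console.log(parsed.tag);               // 9007199254740992 -- off by one!
console.log(Number.MAX_SAFE_INTEGER);  // 9007199254740991
```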

##### API methods


###### stream.start()

Connect to the stream and start receiving data. At this point you should have registered at least one event listener for any of these events: 'data', 'object', or 'tweet'.


###### stream.end()

Terminates the connection.



##### Events

###### ready

Emitted when the connection has been successfully established.

###### data

Emitted for each data chunk (decompressed).

###### error

Emitted when any type of error occurs. An error is raised if the response status code is not 20x. `{error: String}` objects are also checked here.

###### object

Emitted for each JSON object.

###### tweet

Emitted for each tweet.

###### delete

Emitted for each deleted tweet.

###### end

Emitted when the connection is terminated. This event is always emitted when an error occurs and the connection is closed.


## Gnip.Rules

This class allows you to manage an unlimited number of tracking rules.

##### API methods

###### rules.getAll(Function callback)

Get cached rules.

###### rules.update(Array rules, Function callback)

Creates or replaces the live tracking rules.
Rules are sent in batches of `options.batchSize`, so you can pass an unlimited number of rules.
The current tracking rules are stored in a local JSON file, so the existing rules can be updated efficiently without having to remove them all first. The callback receives an object as its 2nd argument containing the number of added and deleted rules.
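The cache-based diff that this update performs can be sketched roughly as follows (a simplified illustration, not the module's actual code; the function name `diffRules` and the key scheme are made up):

```javascript
// Simplified sketch: given the cached rules and the incoming rule set,
// compute which rules must be added and which must be deleted.
function diffRules(cached, incoming) {
    var key = function(rule) {
        // rules may be plain strings or {value, tag} objects
        return typeof rule === 'string' ? rule : rule.value + '\u0000' + (rule.tag || '');
    };
    var cachedKeys = {};
    cached.forEach(function(r) { cachedKeys[key(r)] = true; });
    var incomingKeys = {};
    incoming.forEach(function(r) { incomingKeys[key(r)] = true; });

    return {
        added: incoming.filter(function(r) { return !cachedKeys[key(r)]; }),
        deleted: cached.filter(function(r) { return !incomingKeys[key(r)]; })
    };
}

var delta = diffRules(
    [{value: 'old keyword'}, {value: '@demianr85', tag: 'rule tag'}],
    [{value: 'new keyword'}, {value: '@demianr85', tag: 'rule tag'}]
);
console.log(delta.added.length, delta.deleted.length); // 1 1
```

Only the difference is sent to Gnip, which is what makes large rule sets cheap to maintain.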

###### rules.clearCache(Function callback)

Clears cached rules.

The following methods use the Gnip API directly and ignore the local cache. Avoid them if you are working with a very large number of rules!

###### rules.live.update(Array rules, Function callback)
###### rules.live.add(Array rules, Function callback)
###### rules.live.remove(Array rules, Function callback)
###### rules.live.getAll(Function callback)
###### rules.live.removeAll(Function callback)


## Installation

```
npm install gnip
```

## Example Usage

```js
var Gnip = require('gnip');

var stream = new Gnip.Stream({
	url : '',
	user : 'xxx',
	password : 'xxx',
	backfillMinutes: 5 // optional
});
stream.on('ready', function() {
	console.log('Stream ready!');
});
stream.on('tweet', function(tweet) {
	console.log(tweet);
});
stream.on('error', function(err) {
	console.error(err);
});

var rules = new Gnip.Rules({
	url : '',
	user : 'xxx',
	password : 'xxx',
	batchSize: 1234 // not required, defaults to 5000
});

var newRules = [
	{value: 'keyword as object'},
	{value: '@demianr85', tag: 'rule tag'}
];

rules.update(newRules, function(err) {
	if (err) throw err;
	stream.start();
});
```

More details and tests soon...