Basic cookie-handling for outgoing requests
This library emulates client-side cookie handling, so that outgoing requests made from Node can store and send cookies appropriately.
This library is not particularly efficient for large numbers of cookies. The cookies are stored in a big list, and this entire list is searched for every request. For small numbers of cookies, or cookies on a single domain, this should not be an issue - so if you're just automatically logging into and exploring a single API, this should be fine.
If you're looking to create a full-blown multi-domain scraper or something, then raise a GitHub issue or email the author about making it more efficient.
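To make the storage model concrete, here is a minimal sketch of a linear-scan cookie store of the kind described above. This is an illustration only, not the library's actual code; the names `CookieStore`, `add` and `cookieStringFor` are invented for this example:

```js
// Minimal linear-scan cookie store: every lookup walks the whole list,
// which is fine for a handful of cookies but O(n) per request.
function CookieStore() {
	this.cookies = []; // flat list of {name, value, domain, path, secure}
}

CookieStore.prototype.add = function (cookie) {
	// Replace any existing cookie with the same name/domain/path, then append
	this.cookies = this.cookies.filter(function (c) {
		return !(c.name === cookie.name && c.domain === cookie.domain && c.path === cookie.path);
	});
	this.cookies.push(cookie);
};

CookieStore.prototype.cookieStringFor = function (domain, path, isSecure) {
	return this.cookies.filter(function (c) {
		// domain-match: exact host, or a sub-domain of the cookie's domain
		var suffix = '.' + c.domain;
		var domainMatch = domain === c.domain || domain.slice(-suffix.length) === suffix;
		// path-match: request path starts with the cookie's path
		var pathMatch = path.indexOf(c.path) === 0;
		// secure cookies are only sent over secure connections
		var secureOk = isSecure || !c.secure;
		return domainMatch && pathMatch && secureOk;
	}).map(function (c) {
		return c.name + '=' + c.value;
	}).join('; ');
};
```

The matching rules above are a simplification of the domain/path/secure checks a real cookie store performs, but they show why the whole list must be scanned on every request.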
```js
var cookieClient = require('cookie-client');
var cookieStore = cookieClient(); // use of "new" is optional

cookieStore.addFromHeaders(response.headers); // full headers object
cookieStore.addFromHeaders(response.headers['set-cookie']); // just the cookie headers

request.headers['cookie'] = cookieStore.cookieStringForRequest(domain, path, isSecure);
// or fetch the matching cookies as objects:
var cookies = cookieStore.cookiesForRequest(domain, path, isSecure);
```
To prevent "super-cookies" from assigning themselves domains like .com (which would be dangerous), this module attempts to download the Public Suffix List (see publicsuffix.org). Any response other than a 200 will log an error to the console.
The downloaded list is stored in the module directory, so the download happens once per installation.
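The check the list enables can be sketched as follows. The function name is hypothetical, and the tiny hard-coded suffix set stands in for the real downloaded list:

```js
// A public-suffix check: a cookie may not claim a public suffix itself
// (e.g. "com" or "co.uk") as its domain, only a registrable domain at
// least one label below it. A real implementation consults the full
// publicsuffix.org list; these few entries are just for illustration.
var publicSuffixes = ['com', 'org', 'co.uk'];

function isAllowedCookieDomain(cookieDomain) {
	var domain = cookieDomain.replace(/^\./, ''); // ".example.com" -> "example.com"
	return publicSuffixes.indexOf(domain) === -1; // reject bare public suffixes
}
```

With this in place, a response from evil.example.com can set a cookie for .example.com, but not for .com, which would otherwise be sent to every .com site.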
Even if it is cached locally, it is loaded asynchronously. You can query whether the PSL has loaded yet, or request a callback:
```js
cookieClient.pslLoaded; // boolean flag
cookieClient.whenPslLoaded(callback);
```
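The flag-plus-callback pattern described above can be sketched like this (an illustration of the pattern, not the module's internals):

```js
// Async-load pattern: expose a boolean flag plus a callback registration.
// Callbacks registered before the load completes are queued; callbacks
// registered afterwards fire immediately.
function makeLoader() {
	var loaded = false;
	var pending = [];
	return {
		isLoaded: function () { return loaded; },
		whenLoaded: function (callback) {
			if (loaded) {
				callback(); // already loaded: call straight away
			} else {
				pending.push(callback); // queue until the load finishes
			}
		},
		markLoaded: function () {
			loaded = true;
			// drain the queue, calling each queued callback once
			pending.splice(0).forEach(function (cb) { cb(); });
		}
	};
}
```

This avoids the race between "is it loaded yet?" polling and the load completing: either path ends with the callback being invoked exactly once.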
However, the file public-suffix-list.txt is taken from publicsuffix.org and has separate licensing terms (the Mozilla Public License). This package used to fetch the file from the web on first run, but that caused Node to crash if the connection was dropped by the remote server. The file is therefore included in the package, but is licensed separately from the rest of the code.