9gscraper


A Node.js package for scraping 9gag posts and comments.

Documentation

getFeed([options])

  • options {Object} an object containing: section, batchSize and limit.

    • section {String} the section to scrape, default: trending.
    • batchSize {Number} how many posts the scraper keeps buffered at a time, default: 10.
    • limit {Number} total number of posts to yield; -1 means no limit, default: -1.
  • Returns an asynchronous iterator of post objects.
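
The limit option caps how many posts the iterator yields. A minimal sketch of that behavior, using a stand-in generator instead of a live feed (the real getFeed performs network requests; mockFeed here is purely illustrative):

```javascript
// Stand-in async generator mimicking a feed capped by `limit`.
// A negative limit means the feed never stops on its own.
async function* mockFeed(limit) {
  let count = 0;
  while (limit < 0 || count < limit) {
    yield { id: `post-${count}`, title: `Title ${count}` };
    count++;
  }
}

(async () => {
  const titles = [];
  // for await...of consumes the async iterator one post at a time.
  for await (const post of mockFeed(3)) {
    titles.push(post.title);
  }
  console.log(titles); // [ 'Title 0', 'Title 1', 'Title 2' ]
})();
```

The same for await loop works unchanged on the iterator returned by getFeed.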

post object example:

{
  "id": "aQ1edyK",
  "url": "https://9gag.com/gag/aQ1edyK",
  "title": "Keep the culture alive",
  "content": {
    "type": "video",
    "src": [
      {
        "src": "https://img-9gag-fun.9cache.com/photo/aQ1edyK_460svvp9.webm",
        "format": "video/webm"
      },
      {
        "src": "https://img-9gag-fun.9cache.com/photo/aQ1edyK_460sv.mp4",
        "format": "video/mp4"
      },
      {
        "src": "https://img-9gag-fun.9cache.com/photo/aQ1edyK_460svwm.webm",
        "format": "video/webm"
      }
    ]
  },
  "category": "GIF",
  "comments": "Object [AsyncGenerator]"
}
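
Since content.src can list several encodings of the same media, a consumer typically picks one by MIME type. A sketch using the sample post above (pickSource is an illustrative helper, not part of this package):

```javascript
// Illustrative helper (not part of 9gscraper): return the first source
// matching the preferred format, falling back to the first entry.
function pickSource(content, preferredFormat) {
  const match = content.src.find(s => s.format === preferredFormat);
  return (match || content.src[0]).src;
}

// Trimmed copy of the sample post's content field.
const content = {
  type: "video",
  src: [
    { src: "https://img-9gag-fun.9cache.com/photo/aQ1edyK_460svvp9.webm", format: "video/webm" },
    { src: "https://img-9gag-fun.9cache.com/photo/aQ1edyK_460sv.mp4", format: "video/mp4" }
  ]
};

console.log(pickSource(content, "video/mp4"));
// https://img-9gag-fun.9cache.com/photo/aQ1edyK_460sv.mp4
```

If no source matches the requested format, the helper falls back to the first listed source rather than returning undefined.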

Examples

Get the first 1,000 posts using Promises and forEach

const scraper = require("9gscraper");
scraper.getFeed({ limit: 1000 }).then(async feed => {
  await feed.forEach(post => console.log(post.title));
});

Get comments from the first 5 posts in the "hot" section using async/await and a for await loop

const scraper = require("9gscraper");
(async () => {
  const feed = await scraper.getFeed({ section: "hot", limit: 5 });
  for await (const post of feed) {
    for await (const comment of post.comments) {
      console.log(`${comment.user.displayName}: ${comment.text}`);
    }
  }
})();

Install

npm i 9gscraper

Version

1.1.5

License

MIT

Collaborators

  • robert_berglund