logging-system

This package helps you log zillions of events {cheaply,efficiently,reliably}.

EventBucket

An EventBucket is a set of events.

Event

    Key            JSONValue *, **
    Type           uint16 or unicode string
    Value          JSONObject            (e.g. serialized as JSON or protobuf; protobuf needs a .proto file to view/analyze)
    Client Time    ms since 1970         client-specified timestamp
    Server Time    ms since 1970         when the event was received by the logging server

* no non-integer numbers, no nested structures, no \u0000
** Keys must be distinct so the server can be idempotent.
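
For illustration, these hypothetical keys satisfy the Key constraints:

  "signup-7f3k"
  42

...and these do not:

  3.5          (non-integer number)
  {"a": 1}     (nested structure)
  "a\u0000b"   (contains \u0000)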

Stuff

{LoggingServer, EventServer, EventClient} = require 'logging-system'

APIs

{LoggingServer,EventServer} API

POST /api/post-events?bucket=TOKEN
    Content-Type: "application/json" or "application/eventbuf-v2"
    Request body: See "Formats" section
    
    Response status:
      2xx or 5xx
      TODO: 2xx only after all data has been fully persisted
      (TEMP: 2xx immediately)
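
    For example, a JSON-format request might look like this (the bucket
    token and the key/type/value shown are hypothetical, not required names):

      POST /api/post-events?bucket=TOKEN HTTP/1.1
      Content-Type: application/json

      {"events": [{"key": "signup-7f3k", "type": "signup",
                   "value": {"plan": "free"}, "clientTime": 1317062859638}]}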

EventServer API

GET /events.json?bucket=TOKEN
  {"events": [{...see JSON Event Format...}...]}

GET /reset
  Delete all events.
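
A minimal sketch of consuming this API with Node's http module (PORT and
TOKEN are placeholders):

http = require 'http'

http.get "http://localhost:#{PORT}/events.json?bucket=TOKEN", (res) ->
  body = ''
  res.on 'data', (chunk) -> body += chunk
  res.on 'end', -> console.log JSON.parse(body).events

# ...and wiping everything, e.g. between test runs:
http.get "http://localhost:#{PORT}/reset", (res) -> res.resume()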

Servers

LoggingServer

  • Logs all data of all incoming HTTP requests in batches
  • HTTP requests are not parsed. Not even a little. // TEMP: requiring UTF-8 for now
  • Batches are POSTed to S3
  • One batch every 10 sec implies 2.6 USD/month for POST requests

server = new LoggingServer {
  s3: {
    AWSAccessKeyId: "..."
    policy64:       "..."
    signature64:    "..."
    bucket:         "..."
    customUrl:      "https://my-s3-clone:12345"    # OPTIONAL
  }
  batchSeconds: 10                                 # OPTIONAL
}
server.listen PORT, () -> console.log "Listening on #{PORT}..."
# AND/OR:
...(req, res) ->
  if ...
    server.handleRequest req, res
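
For instance, a minimal sketch of mounting it inside an existing Node http
server (the routing rule and 404 fallback are assumptions, not part of this
package):

http = require 'http'

app = http.createServer (req, res) ->
  if req.url.indexOf('/api/post-events') is 0
    server.handleRequest req, res
  else
    res.writeHead 404
    res.end()

app.listen PORT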

EventServer

server = new EventServer()
server.listen PORT, () -> console.log "Listening on #{PORT}..."
server.reset() # delete all events

Datastore usage

[EVENTS, bucket, k_string]                      -> eventJson // later protobuf
[INDEXED_EVENTS, bucket, indexId, ...index...]  -> eventJson // later protobuf
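
For illustration, with the hypothetical bucket token and event key used
elsewhere in this README, the first mapping would look roughly like:

[EVENTS, "TOKEN", "signup-7f3k"]  ->  {"key": "signup-7f3k", "type": "signup", ...}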

Formats

JSON event

{
  "key": ...,
  "type": ...,
  "value": ...,
  "clientTime": 1317062859638
}
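
A hypothetical filled-in event (the particular key, type, and value are
made up for illustration):

{
  "key": "signup-7f3k",
  "type": "signup",
  "value": {"plan": "free", "referrer": "example.com"},
  "clientTime": 1317062859638
}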

JSON events

{
  "events": [...]
}

eventbuf-v3 event

TODO

eventbuf-v3 events

Just concatenate 'em.

LoggingServer S3 Objects

Key

"v1/%Y-%m-%d/%H-%M-%S-%L-Z-" + serverToken + "-" + randomToken(8, BASE58_ALPHABET) + "-" + batchNumber + "-v2"
  ...where, when the server starts, serverToken := randomToken(8, BASE58_ALPHABET)

  If the body is gzipped, append ".gz".
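
  For example, a batch written at 2011-09-26 18:47:39.638 UTC (the instant
  corresponding to the clientTime 1317062859638 used elsewhere in this README)
  by a server with a made-up serverToken of "3hQp7XkZ", a made-up random
  token of "9fKw2mRt", and batchNumber 0 would be keyed:

    v1/2011-09-26/18-47-39-638-Z-3hQp7XkZ-9fKw2mRt-0-v2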

Batch format, V2

Concatenated HTTP events, each of which is:

  msgpack(length_of_the_following)  # because msgpack libraries don't support streaming
  msgpack(http_event)
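
A minimal sketch of emitting one entry, assuming an npm msgpack module that
exposes pack(obj) -> Buffer (the module name and API are assumptions, not
something this package ships):

msgpack = require 'msgpack'

appendHttpEvent = (stream, httpEvent) ->
  body = msgpack.pack httpEvent
  stream.write msgpack.pack(body.length)   # length prefix, itself msgpack-encoded
  stream.write body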

HTTP events

REQ_DATA_EVENT = 1
REQ_END_EVENT = 2
{
  1: REQ_DATA_EVENT
  2: reqId
  3: ms_timestamp_of_first_fragment
  8: fragment_id
  
  4: data
}
{
  1: REQ_END_EVENT
  2: reqId
  3: ms_timestamp_of_first_fragment
  8: fragment_id
  
  5: remote IP      UTF-8
  6: content-type   UTF-8
  7: path           UTF-8
}
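
For illustration, a request that arrived in a single data fragment might be
recorded as a pair like this (the reqId, timestamp, IP, and fragment_id
numbering are all guesses made up for this example):

{1: REQ_DATA_EVENT, 2: 17, 3: 1317062859638, 8: 0, 4: <raw request bytes>}
{1: REQ_END_EVENT,  2: 17, 3: 1317062859638, 8: 1,
 5: "203.0.113.7", 6: "application/json", 7: "/api/post-events?bucket=TOKEN"}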

Notes

S3 Pricing

{PUT,LIST}s          100,000 for 1 USD    => (every 10 sec => 2.6 USD/month)
{GET}s             1,000,000 for 1 USD
storage     >= 7.14 GB-month for 1 USD
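
Worked out (assuming a 30-day month):

  1 batch / 10 s  =  8,640 PUTs/day  ≈  259,200 PUTs/month
  259,200 PUTs  /  100,000 PUTs per USD  ≈  2.6 USD/month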
