# deep-restful

Homogeneous Restful collection manipulation for deepjs.
This library comes with:

- a memory Collection manager that defines exactly the same API as any other deepjs client.
- a memory Object manager that defines the same API.
- an HTTP/Restful client that defines the same API and is used as a base class for other deepjs HTTP clients (jQuery ajax, Node.js http, etc.).
- other store-related sheets (transformations applied on stores to give them homogeneity).
- a chained, Promise-based, restful handler that uses this API and allows dependency injection.
The API defines:

- `get("id" || "?myQuery=true" || "")`
- `range(start, end, query)`
- `post(obj || null)`
- `put(obj || null)`
- `patch(obj || null)`
- `del(obj || null)`
- `rpc(method, args, id)`
- `bulk(actions)`

and additionally, for collections only (not for deep.Object):

- `flush()`
- `count()`
It can be combined with protocols and OCM (see the deepjs core docs for more info).
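To make the contract concrete, here is a hypothetical plain-JavaScript sketch of the surface every manager shares. This is not the deepjs implementation: `memoryCollection` is an invented name, and `get`/`del` are simplified to take an id instead of the full object/query forms listed above.

```javascript
// Hypothetical sketch of the homogeneous API (NOT the deepjs implementation):
// every client (memory, HTTP, mongo, ...) exposes the same method names.
function memoryCollection(items) {
  var byId = {};
  items.forEach(function (it) { byId[it.id] = it; });
  return {
    get: function (id) { return byId[id] || null; },
    range: function (start, end) {
      var all = Object.keys(byId).map(function (k) { return byId[k]; });
      return all.slice(start, end + 1);
    },
    post: function (obj) { byId[obj.id] = obj; return obj; },
    put: function (obj) { byId[obj.id] = obj; return obj; },
    patch: function (obj) { // shallow merge into the stored item
      var current = byId[obj.id];
      Object.keys(obj).forEach(function (k) { current[k] = obj[k]; });
      return current;
    },
    del: function (id) { var existed = id in byId; delete byId[id]; return existed; },
    flush: function () { byId = {}; },
    count: function () { return Object.keys(byId).length; }
  };
}
```

A persistent or HTTP-backed manager keeps this exact surface and swaps only the storage behind it, which is what makes the managers interchangeable.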
## Install

```
npm install deep-restful
```

or

```
bower install deep-restful
```

This will also install deepjs if it is not already there.
## Simple usage

```javascript
var deep = require("deepjs"); // load core
require("deep-restful"); // load chain
require("deep-restful/lib/collection"); // load collection manager

// declare a memory collection bound to the "myProtocol" protocol
// (module paths and constructor reconstructed: adapt to your deepjs version)
new deep.Collection("myProtocol", [{ id: "e1", title: "hello world" }]);

deep.restful("myProtocol")
.put({ id: "e1", title: "hello deep" })
.log(); // you'll see the putted object
```
## JSON-Schema validation

You can associate a schema with a collection:

```javascript
var deep = require("deepjs"); // load core
require("deep-restful"); // load chain
require("deep-restful/lib/collection"); // load collection manager

new deep.Collection({
    protocol: "myProtocol",
    collection: [{ id: "e1", title: "hello world" }],
    schema: {
        properties: {
            title: { type: "string", required: true, minLength: 6 }
        }
    }
});

deep.restful("myProtocol")
.put({ id: "e1", title: "hi" })
.log(); // you'll see an error 412 PreconditionFail with a report describing the validation error.
```
## Constraints

Within the JSON-Schema, you can define constraints on any property:

- `private`: the property is removed on "get" (on query and range as well)
- `readOnly`: the property cannot be modified

There is a more intrusive constraint that lets you restrict collection access to each item's owner (it works with the session defined in deep.context: see autobahnjs):

- `ownRestriction`: either a string naming the item property that holds the owner (such as "userId"), or false
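As an illustration only (assumed helper names, not deepjs internals), the effect of `private` and `readOnly` can be sketched like this:

```javascript
// Illustrative sketch of constraint handling (NOT the deepjs implementation).
// applyGetConstraints strips "private" properties from an item before returning it;
// checkReadOnly collects violations that would yield a 412 PreconditionFail.
function applyGetConstraints(item, properties) {
  var out = {};
  Object.keys(item).forEach(function (key) {
    var def = properties[key];
    if (def && def["private"]) return; // removed on get/query/range
    out[key] = item[key];
  });
  return out;
}

function checkReadOnly(oldItem, newItem, properties) {
  var errors = [];
  Object.keys(properties).forEach(function (key) {
    var def = properties[key];
    if (def.readOnly && key in newItem && newItem[key] !== oldItem[key])
      errors.push(key + " is readOnly");
  });
  return errors;
}
```

The real managers apply these checks inside `get`/`range`/`put`/`patch` so callers never see private fields and cannot silently overwrite protected ones.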
Finally, collection managers can apply transformations automatically on post/put/patch, which is useful, for example, to hash a password on update.

```javascript
var deep = require("deepjs"); // load core
require("deep-restful"); // load chain
require("deep-restful/lib/collection"); // load collection manager

var myCollection = new deep.Collection({
    protocol: "myProtocol",
    collection: [{ id: "e1", title: "hello world" }],
    schema: {
        properties: {
            password: {
                type: "string",
                required: true,
                minLength: 6,
                "private": true,
                transform: function (value) {
                    return deep.utils.hash(value); // hash utility name assumed: adapt to your deepjs version
                }
            },
            email: { type: "string", format: "email", readOnly: true }
        }
    }
});

deep.restful("myProtocol")
.post({ id: "e2", password: "my secret", email: "me@mail.com" })
.log() // you'll see the posted result: without password
.put({ id: "e1", email: "other@mail.com" })
.log(); // you'll see an error 412 PreconditionFail: email is readOnly
```
## JSON-RPC

You can include RPC methods in your store definition.

```javascript
var deep = require("deepjs"); // load core
require("deep-restful"); // load chain
require("deep-restful/lib/collection"); // load collection manager

new deep.Collection({
    protocol: "myProtocol",
    collection: [{ id: "e1", title: "hello world" }],
    methods: {
        // 'this' refers to the object retrieved with the id provided to the rpc call
        // (method name and handler usage reconstructed: adapt to your deepjs version)
        describe: function (handler, description) {
            this.description = description;
            this.lastUpdate = +new Date();
            return handler.save();
        }
    }
});

deep.restful("myProtocol")
.rpc("describe", ["object's description"], "e1")
.log() // you'll see the rpc result
.get("") // get all on current store
.log(); // you'll see an array: [{ id: "e1", title: "hello world", description: "object's description", ... }]
```
## Other collection clients

### Persistent collection

You can use deep-mongo or deep-local-storage (among others) to get a persistent collection that implements the same API as above.

Example with deep-mongo:

```javascript
var deep = require("deepjs"); // load core
require("deep-restful"); // load chain
require("deep-mongo"); // load driver

// bind a mongo-backed store to "myProtocol"
// (store declaration reconstructed: adapt to your deepjs version)
deep.store.Mongo.create("myProtocol", "mongodb://127.0.0.1:27017/test", "items");

deep.restful("myProtocol")
.post({ title: "hello persistent world" })
.log();
```
## HTTP/Restful client

When you want to interact with remote restful services (from the browser, or from Node.js to a remote server), you can use HTTP clients such as deep-jquery/ajax/xxx or deep-node/rest/http/xxx (where xxx = json|html|xml), which implement the same API as above but act remotely.

```javascript
var deep = require("deepjs"); // load core
require("deep-restful"); // load restful chain
require("deep-jquery/ajax/json"); // load json restful client

// define a json client to your remote json service
// (client declaration reconstructed: adapt to your deepjs version)
new deep.jquery.ajax.json("myProtocol", "/my/service/base/uri/");

deep.restful("myProtocol")
.get("e1") // get something from remote service, e.g. from: /my/service/base/uri/....
.done(function (obj) {
    obj.title = "new title";
    return deep.restful("myProtocol").put(obj);
})
.log(); // you'll see the put result on the remote service, or any error from the chain.
```
When you use your services through the deep.restful chain and protocols, you don't even need to know whether you're working remotely or locally: the code stays the same.

It acts as a dependency injection mechanism and allows great modularity and isomorphism.

You can then develop all your service and client logic blindly, browser side or server side (or from deep-shell), with dummy collections or not, and then choose to run it remotely or not, with different store implementations depending, for example, on production flags.
## Remote and local validation

When you want to pre-validate data before sending it to remote services (for example on form submission from the browser), you can specify in your remote client definition where to get the schema before sending data to the remote service (or you can directly provide your own schema).

```javascript
var deep = require("deepjs"); // load core
require("deep-restful"); // load restful chain
require("deep-jquery/ajax/json"); // load json restful client

// define a generic json client
new deep.jquery.ajax.json("json");
// define a json client to your remote json service, telling it where to fetch the schema
// (declarations reconstructed: adapt to your deepjs version)
new deep.jquery.ajax.json("myProtocol", "/my/service/base/uri/", "json::/my/service/schema/uri");

deep.restful("myProtocol")
.get("e1") // get something from remote service, e.g. from: /my/service/base/uri/....
.done(function (obj) {
    obj.title = "hello world";
    return deep.restful("myProtocol").put(obj); // will pre-validate with the local schema before sending
})
.log(); // you'll see the put result on the remote service, or any error from the chain (maybe a PreconditionFail from local validation).
```
Remark: if the remote service is defined with a deep-restful compliant store (with autobahnjs for example), and if it has a schema defined, you can retrieve that schema by passing "schema" as the id.

e.g. `deep.get("myService::schema").log();`

or `deep.restful("myService").get("schema").log();`
## Node.js simple example

```javascript
var http = require("http");
var deep = require("deepjs"); // the core
require("deep-restful"); // homogeneous restful API
require("deep-restful/lib/collection"); // simple memory collection

new deep.Collection("myobjects", []);

var titles = ["hello", "deepjs", "world"];
var count = 0;

// each request posts a new item, then answers with whatever the request uri selects
// (server wiring reconstructed: adapt to your deepjs version)
http.createServer(function (req, res) {
    deep.restful("myobjects")
    .post({ title: titles[count % 3], count: count++ })
    .done(function () {
        return deep.restful("myobjects").get(req.url.substring(1));
    })
    .done(function (result) {
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify(result));
    })
    .fail(function (error) {
        res.writeHead((error && error.status) || 500);
        res.end(String(error));
    });
}).listen(1337, "127.0.0.1");

console.log("Server running at http://127.0.0.1:1337/");
```
Then open your browser, go to http://127.0.0.1:1337/, refresh a few times, and try:

- http://127.0.0.1:1337/an_id_of_an_item_in_collection
- http://127.0.0.1:1337/?title=deepjs
- http://127.0.0.1:1337/?count=lt=2
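The two query forms above, plain equality (`?title=deepjs`) and the operator form (`?count=lt=2`), can be sketched with a small matcher. This is an illustration only, not the deepjs query engine (which supports far more); `matchQuery` is an invented name.

```javascript
// Hedged sketch of uri query filtering: "title=deepjs&count=lt=2" style clauses.
// A clause is either field=value (equality) or field=op=value (comparison).
function matchQuery(item, query) {
  return query.split("&").every(function (clause) {
    var parts = clause.split("=");
    var field = parts[0];
    if (parts.length === 3) { // operator form: field=op=value
      var op = parts[1], value = Number(parts[2]);
      if (op === "lt") return item[field] < value;
      if (op === "gt") return item[field] > value;
      return false; // unknown operator
    }
    return String(item[field]) === parts[1]; // equality form
  });
}
```

A store's `get("?...")` and `range(start, end, query)` would run such a predicate over the collection and return only the matching items.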
## Licence

LGPL 3.0