Resource cover image URLs cache server
A cover image URLs cache exposing its content as a REST web service.
In various applications (an ILS, for example), a book's cover image (or that of another kind of resource) is displayed alongside the resource. Those images are fetched automatically from providers, such as Google or Amazon, which offer web services to retrieve information on books from an ID (an ISBN, for example).
With Coce, cover image URLs from various providers are cached in a Redis server. Clients send REST requests to Coce, which replies with the cached URLs or, when they are not available in its cache, retrieves them from the providers. In its request, the client specifies a provider order (for example `aws,gb,ol` for AWS, Google Books, and then Open Library): Coce sends the first available URL.
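As a sketch (the hostname and port are illustrative; the `/cover` endpoint and the `id`/`provider` query parameters are the ones used in the examples later in this document), a client request URL can be built like this:

```javascript
// Build a Coce request URL: a comma-separated list of ISBNs and a
// comma-separated provider order (aws, gb, ol).
// The base URL is illustrative; adjust it to where Coce is running.
function coceUrl(base, isbns, providers) {
  return base + '/cover?id=' + isbns.join(',') + '&provider=' + providers.join(',');
}

console.log(coceUrl('http://localhost:8080', ['9780415480635', '2847342257'], ['aws', 'gb', 'ol']));
// → http://localhost:8080/cover?id=9780415480635,2847342257&provider=aws,gb,ol
```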
Install and start a Redis server
Install the Node.js libraries. In Coce's home directory, enter `npm install`.
Configure Coce by editing config.json. Start from the provided sample file.
* `port`: port on which the server responds
* `providers`: array of available providers: `gb`, `aws`, `ol`
* `timeout`: timeout in milliseconds for the service; above this value, Coce stops waiting for responses from providers
* `redis`: Redis server parameters
* `gb`: Google Books parameters:
  * `timeout`: timeout of the URLs cached from Google Books
* `ol`: Open Library parameters:
  * `timeout`: timeout of the URLs cached from Open Library; after this delay, a URL is automatically removed from the cache, and so has to be fetched again if requested
  * `imageSize`: size of images: `small`, `medium`, `large`
* `aws`: Amazon parameters; in order to use AWS, you need credentials: create a user and grant it access to the Amazon Product Advertising API
  * `imageSize`: size of images: `SmallImage`, `MediumImage`, `LargeImage`
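A config.json along these lines would match the parameters above (all values are illustrative, the `host`/`port` keys under `redis` are assumptions, and the AWS credential parameters are omitted):

```json
{
  "port": 8080,
  "providers": ["gb", "aws", "ol"],
  "timeout": 8000,
  "redis": { "host": "localhost", "port": 6379 },
  "gb": { "timeout": 86400 },
  "ol": { "timeout": 86400, "imageSize": "medium" },
  "aws": { "imageSize": "MediumImage" }
}
```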
cd _Coce HOME_
node app.js
By default, running Coce directly, there is no supervision mechanism, and Coce runs as a single-threaded single process (as any node.js application). In production, it is necessary to turn Coce into a Linux service, with automatic start/stop and supervision. A traditional Unix process supervision architecture could be used: Unix System V init, runit, or [daemon](http://man7.org/linux/man-pages/man3/daemon.3.html).
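As one possible sketch of such a supervised setup, a systemd unit could look like this (the paths, user, and Redis service name are assumptions, not part of Coce itself):

```ini
[Unit]
Description=Coce cover image URL cache server
After=network.target redis-server.service

[Service]
; Assumed install location and dedicated user
WorkingDirectory=/opt/coce
ExecStart=/usr/bin/node app.js
User=coce
Restart=always

[Install]
WantedBy=multi-user.target
```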
A more sophisticated approach is to use Phusion Passenger. This way, it is possible to make Coce respond to requests on the HTTP port (80), even with other web apps running on the same server, and to run a Coce process on each core of a multi-core server.
For example, on Debian, follow these [instructions](https://www.phusionpassenger.com/documentation/Users%20guide%20Standalone.html#install_on_debian_ubuntu). Then start Coce from the Coce directory:
passenger start --port 8080

Or, to run it as a daemon:

passenger start --port 8080 --daemonize
To get all cover image URLs from Open Library (ol), Google Books (gb), and Amazon (aws) for several ISBNs:
This request returns:
Without the `&all` parameter, the same request returns just the first available URL per ISBN, following the requested provider order.
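The response bodies are not reproduced here; as a sketch only (the URLs are placeholders, and the exact JSON shape is an assumption inferred from the description above), a `&all` response for one ISBN might look like:

```json
{
  "9780563533191": {
    "ol": "http://example.org/ol-cover.jpg",
    "gb": "http://example.org/gb-cover.jpg",
    "aws": "http://example.org/aws-cover.jpg"
  }
}
```

while without `&all`, each ISBN would map to a single URL.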
A node.js client can use the coceclient.js module, along these lines (the server URL is a placeholder, and the method name is a reconstruction; see coceclient.js for the exact API):

```javascript
// isbns is an array of ISBNs
var coceClient = new CoceClient('http://coce.server.url', 'ol,aws,gb');
coceClient.fetch(isbns /* , callback receiving each (isbn, url) pair */);
```
Coce is highly scalable. With all requested URLs in cache, an `ab` test of 10,000 requests, with 50 concurrent requests:
ab -n 10000 -c 50 'http://localhost:8080/cover?id=9780415480635,97808?1417492,2847342257,9780563533191&provider=gb,aws'
gives this result:
```
Document Path:          /cover?id=9780415480635,97808?1417492,2847342257,9780563533191
Document Length:        295 bytes
Concurrency Level:      50
Time taken for tests:   7.089 seconds
Complete requests:      10000
Failed requests:        0
Write errors:           0
Total transferred:      4610000 bytes
HTML transferred:       2950000 bytes
Requests per second:    1410.70 [#/sec] (mean)
Time per request:       35.443 [ms] (mean)
Time per request:       0.709 [ms] (mean, across all concurrent requests)
Transfer rate:          635.09 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.5      1       3
Processing:     9   34  16.5     32     288
Waiting:        7   29  16.4     27     278
Total:         12   35  16.5     34     290

Percentage of the requests served within a certain time (ms)
  50%     34
  66%     34
  75%     37
  80%     39
  90%     44
  95%     50
  98%     54
  99%     58
 100%    290 (longest request)
```