react-isomorphic-render

Server Side Rendering for the React + React-router v3 + Redux stack.
- Asynchronously preloads pages before performing client-side navigation
- Provides an isomorphic HTTP client for calling REST API in Redux "action creators"
- Supports Webpack "hot reload" (aka "Hot Module Replacement")
- Provides supplementary utilities: locale detection for internationalization, easy setting of page `<title/>` and `<meta/>`, programmatic redirects, 100% correct handling of HTTP cookies, etc
Why Server Side Rendering
World-Wide Web Concepts
The original concept of the web was one of a network of "resources" interconnected with "hyperlinks": a user could query a "resource" by a "Uniform Resource Locator" (URL) and then travel to any of the connected "resources" just by navigating the corresponding hyperlinks, and then it would all repeat recursively, therefore interconnecting each and every "resource" into a giant interconnected web (hence the name). The "resources" were meant to be "documents": reports, articles, papers, news, books, etc.

The web wasn't meant for "applications" at all. At first javascript was only used to bring some naive interactivity to static "documents", like following the cursor with a sprinkle, or adding Christmas snow to a page, or applying some effect to a picture upon mouseover. Initially javascript was never meant to be a means of operating on a page's "content". It was just for "presentation" ("view"), not for the "content".

Ajax wasn't originally meant for "content" either: it was just for tiny utility things, like hitting a "Like" button without needlessly refreshing the whole freaking page, but then it too was turned into machinery for fetching a page's "content".

And so the Web became broken. And to completely fix that and make the Web 100% pure again, total Server Side Rendering for each dynamic website is the only way to go. This is still a purely aesthetic argument, and nobody would really care (except purists and perfectionists) if it didn't come down to being able to be indexed by Google...
Search engines
Search engine crawlers like the Google bot won't wait for a page to make its Ajax calls to an API server for data: they simply abort all asynchronous javascript and index the page as is. Don't mistake this for web crawlers not being able to execute javascript — they're perfectly fine with doing that (watch out though when using the latest and greatest language features, and always use polyfills for older browsers, since web crawlers may be using those under the hood).

So the only thing preventing a dynamic website from being indexed by a crawler is Ajax, not javascript. This suggests two solutions: one is to perform everything (routing, data fetching, rendering) on the server side, and the other is to perform routing and data fetching on the server side while leaving rendering to the client's web browser. Both these approaches work with web crawlers, and this is what this library provides.

While the first approach is more elegant and pure, rendering a moderately complex React page using `ReactDOM.renderToString()` is currently a very CPU-intensive task (it takes about 100 milliseconds of blocking single-core CPU time for complex pages having more than 1000 components, as of 2016). Facebook doesn't use Server Side Rendering itself, so optimizing this part of the React library is not a priority for them. So until this Server Side Rendering performance issue is fixed (if that's ever possible) I prefer the second approach: performing routing and page preloading on the server side while leaving page rendering to the client. This is achieved by using the `render: false` flag (described much further in this document).
Page loading time
The final argument in favour of Server Side Rendering is that even if a website doesn't need search engine indexing, it would still benefit from employing Server Side Rendering, because that saves an additional HTTP roundtrip from the web browser to the API server for fetching the page's data. And no matter how fast the API server is, that latency is unbeatable, at about 100ms. So, by performing routing and page preloading on the server side, one can speed up website loading by about 100ms. Not that it matters much to non-perfectionists, but still, why not do it when it's so simple to implement.
Usage
(see webpack-react-redux-isomorphic-render-example or webapp as references)
```sh
$ npm install react-isomorphic-render --save
```

Start by creating a settings file (it configures both the client side and the server side):

react-isomorphic-render.js

```js
export default {
  // React-router v3 routes
  routes: ...,

  // Redux reducers
  // (they will be combined into the
  //  root reducer via `combineReducers()`)
  reducer: require('./src/client/redux/reducers/index.js')
}
```
Then call `render()` in the main client-side javascript file.

./src/client/application.js

```js
import { render } from 'react-isomorphic-render'
import settings from './react-isomorphic-render'

// Render the page in web browser
render(settings)
```
And the `index.html` would look like this:

```html
<html>
  <head>
    <title>react-isomorphic-render</title>
  </head>
  <body>
    <script src="/bundle.js"></script>
  </body>
</html>
```
Where `bundle.js` is the `./src/client/application.js` file built with Webpack (or you could use any other javascript bundler).

Now, the `index.html` and `bundle.js` files must be served over HTTP. If you're using Webpack, then place `index.html` in Webpack's `configuration.output.path` folder and run `webpack-dev-server`: it will serve `index.html` from disk and `bundle.js` from memory.

Now go to `localhost:8080`. It should respond with the contents of the `index.html` file. Client-side rendering should work now. The whole setup can be deployed as-is by uploading it to a cloud and serving it statically (which is very cheap).
Server side
Adding Server Side Rendering to the setup is quite simple, though it requires a running Node.js process. Therefore the website is no longer just statics served from the cloud: it's now both statics and a Node.js rendering process running somewhere (say, in a Docker container).

`index.html` will be generated on-the-fly by the page rendering server for each incoming HTTP request, so the `index.html` file may be deleted as it's of no use now.

```js
import webpageServer from 'react-isomorphic-render/server'
import settings from './react-isomorphic-render'

// Create webpage rendering server
const server = webpageServer(settings)

// Start webpage rendering server on port 3000
// (`server.listen(port, [host], [callback])`)
server.listen(3000)
```
Now disable javascript in Chrome DevTools, go to `localhost:3000` and the server should respond with a server-side-rendered page.

The `server` variable in the example above is just a Koa application, so alternatively it could be started like this:

```js
import http from 'http'
// import https from 'https'

http.createServer(server.callback()).listen(3000)
// https.createServer(options, server.callback()).listen(443, (error) => ...)
```
Serving assets and API
In the examples above "static" files (assets) are served by webpack-dev-server
on localhost:8080
. But it's for local development only. For production these "static" files must be served by someone else, be it a dedicated proxy server like NginX, a simple homemade Node.js application or (recommended) a cloud-based solution like Amazon S3.
Also, a real-world website most likely has some kind of an API, which, again, could be either a dedicated API server (e.g. written in Golang), a simple Node.js application or a modern "serverless" API like Amazon Lambda hosted in the cloud.
The following 3 sections illustrate each one of these 3 approaches.
The simplest approach
This section illustrates the "simple homemade Node.js application" approach. It's not the approach I'd use for a real-world website, but it's the simplest one, so it's here for illustration purposes only.

A Node.js process is already running for page rendering, so it could also be employed to perform other tasks, like serving "static" files (`webpack-dev-server` is not running in production) or hosting a REST API.

```js
// `npm install koa-static koa-mount --save`
import statics from 'koa-static'
import mount from 'koa-mount'

// Create webpage rendering server
// (it's a Koa application, so "static" file serving and API middleware
//  can be mounted on it)
const server = webpageServer(settings, ...)

// Start webpage rendering server on port 3000
server.listen(3000)
```
The old-school way
The old-school way is to set up a "proxy server" like NginX dispatching all incoming HTTP requests: serving "static" files, redirecting to the API server for `/api` calls, etc.

```nginx
server {
  # Web server listens on port 80
  listen 80;

  # Serving "static" files (assets)
  location /assets/ {
    root "/filesystem/path/to/static/files";
  }

  # By default everything goes to the page rendering service
  location / {
    proxy_pass http://localhost:3001;
  }

  # Redirect "/api" requests to API service
  location /api {
    rewrite ^/api/?(.*) /$1 break;
    proxy_pass http://localhost:3000;
  }
}
```
Or, alternatively, a quick Node.js proxy server could be set up for development purposes using the http-proxy library:

```js
const path = require('path')
const express = require('express')
const httpProxy = require('http-proxy')

// Use Express or Koa, for example
const app = express()
const proxy = httpProxy.createProxyServer({})

// Serve static files
app.use('/assets', express.static(path.join(__dirname, '../build/assets')))

// Proxy `/api` calls to the API service
app.use('/api', (request, response) => proxy.web(request, response, { target: 'http://localhost:3000' }))

// Proxy all other HTTP requests to webpage rendering service
app.use((request, response) => proxy.web(request, response, { target: 'http://localhost:3001' }))

// Web server listens on port `80`
app.listen(80)
```
The modern way
Finally, the modern way is not using any "proxy servers" at all. Instead everything is distributed and decentralized. Webpack-built assets are uploaded to the cloud (e.g. Amazon S3) and the webpack configuration option `.output.publicPath` is set to something like `https://s3-ap-southeast-1.amazonaws.com/my-bucket/folder-1/` (your CDN URL), so now serving "static" files is not your job – your only job is to upload them to the cloud after the Webpack build finishes. The API is dealt with in a similar way: CORS headers are set up to allow querying directly from a web browser by an absolute URL, and the API is either hosted as a standalone API server or run "serverless"ly, say, on Amazon Lambda, and is queried by an absolute URL, like `https://at9y1jpex0.execute-api.us-east-1.amazonaws.com/develop/users/list`.

This concludes the introductory part of the README; the rest is a description of the various (useful) tools which come prepackaged with this library.
Tools
Making HTTP Requests
If a Redux action creator returns an object with `promise` (function) and `events` (array) properties, then this action is assumed asynchronous.

- An event of `type = events[0]` is dispatched, then the `promise` function gets called and returns a `Promise`
- If the `Promise` succeeds, then an event of `type = events[1]` is dispatched having the `result` property set to the `Promise` result
- If the `Promise` fails, then an event of `type = events[2]` is dispatched having the `error` property set to the `Promise` error

Example:

```js
function asynchronousAction() {
  return {
    promise: () => Promise.resolve({ success: true }),
    events: ['PROMISE_PENDING', 'PROMISE_SUCCESS', 'PROMISE_ERROR']
  }
}
```
This is a handy way of dealing with "asynchronous actions" in Redux, e.g. HTTP requests for a server-side HTTP REST API (see the "HTTP utility" section below).
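The dispatch lifecycle described above can be sketched as a plain function (a simplified illustration, not the library's actual implementation, which also passes the built-in `http` utility into `promise()`):

```javascript
// Dispatch `events[0]`, call `promise()`, then dispatch
// `events[1]` with `result` or `events[2]` with `error`.
function dispatchAsynchronousAction(action, dispatch, http) {
  const [PENDING, SUCCESS, ERROR] = action.events

  dispatch({ type: PENDING })

  return action.promise(http).then(
    result => dispatch({ type: SUCCESS, result }),
    error => dispatch({ type: ERROR, error })
  )
}
```

A reducer then only has to handle the three plain event types; nothing asynchronous ever reaches it.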
Autogenerate event names
When you find yourself copy-pasting those `_PENDING`, `_SUCCESS` and `_ERROR` event names from one action creator to another, take a look at the `asynchronousActionEventNaming` setting described in the "All `react-isomorphic-render.js` settings" section of the "advanced" readme: it lets a developer supply just a "base" `event` name and then generates the three lifecycle event names from that "base" `event`, significantly reducing boilerplate.
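For illustration, such a naming function could look like this (a sketch of the idea, not the library's exact built-in code):

```javascript
// Derives the three lifecycle event names from a single "base" event name.
function generateAsynchronousActionEvents(event) {
  return [`${event}_PENDING`, `${event}_SUCCESS`, `${event}_ERROR`]
}

generateAsynchronousActionEvents('GET_USERS')
// ['GET_USERS_PENDING', 'GET_USERS_SUCCESS', 'GET_USERS_ERROR']
```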
HTTP utility
For convenience, the argument of the `promise` function parameter of the "asynchronous actions" described above is always the built-in `http` utility, which has the methods `get`, `head`, `post`, `put`, `patch` and `delete`, each returning a `Promise` and taking three arguments: the `url` of the HTTP request, a `parameters` object, and an `options` object. It can be used to easily query HTTP REST API endpoints in Redux action creators.

Using ES6 `async/await`:

```js
// (the endpoint and parameters here are illustrative)
function fetchFriends(personId, gender) {
  return {
    promise: async (http) => await http.get(`/api/person/${personId}/friends`, { gender }),
    events: ['GET_FRIENDS_PENDING', 'GET_FRIENDS_SUCCESS', 'GET_FRIENDS_FAILURE']
  }
}
```
Or using plain `Promise`s (for those who prefer them):

```js
function fetchFriends(personId, gender) {
  return {
    promise: (http) => http.get(`/api/person/${personId}/friends`, { gender }),
    events: ['GET_FRIENDS_PENDING', 'GET_FRIENDS_SUCCESS', 'GET_FRIENDS_FAILURE']
  }
}
```
The possible `options` (the third argument of all `http` methods) are:

- `headers` — HTTP headers JSON object
- `authentication` — set to `false` to disable sending the authentication token as part of the HTTP request; set to a String to pass it as an `Authorization: Bearer ${token}` token (no need to set the token explicitly for every `http` method call, it is supposed to be set globally, see below)
- `progress(percent, event)` — is used for tracking HTTP request progress (e.g. file upload)
- `onResponseHeaders(headers)` – for examining HTTP response headers (e.g. Amazon S3 file upload)
HTTP utility authentication token
In order for `http` utility calls to send an authentication token as part of an HTTP request (the `Authorization: Bearer ${token}` HTTP header), the `authentication.accessToken()` function must be specified in `react-isomorphic-render.js`.

```js
{
  authentication: {
    // (make sure the access token is not leaked to a third party)
    accessToken(getCookie, { store }) {
      // return getCookie('token')
      // return localStorage.getItem('token')
      return store.getState().authentication.accessToken
    }
  }
}
```
HTTP utility and URLs
All URLs queried via the `http` utility must be relative ones (e.g. `/api/users/list`). In order to transform these relative URLs into absolute ones there are two approaches.

The first approach is for people using a proxy server (a minority). In this case all client-side HTTP requests will still query relative URLs, which are gonna hit the proxy server, and the proxy server will route them to their proper destination. The server side is gonna query the proxy server directly (there is no notion of "relative URLs" on the server side), therefore the proxy `host` and `port` need to be configured in the webpage rendering service options.

```js
const server = webpageServer(settings, {
  proxy: {
    host: 'localhost',
    port: 3000
  }
})
```
The second approach is for everyone else (the majority). In this case all URLs are transformed from relative ones into absolute ones by the `http.url(path)` function parameter configured in `react-isomorphic-render.js`.

```js
{
  http: {
    url: path => `https://api.server.com${path}`
  }
}
```
File upload
The `http` utility will also upload files if they're passed as part of `parameters` (example below). Any of these types of file `parameters` are accepted:

- In case of a `File` parameter it will be a single file upload.
- In case of a `FileList` parameter with a single `File` inside it will be treated as a single `File`.
- In case of a `FileList` parameter with multiple `File`s inside a multiple file upload will be performed.
- In case of an `<input type="file"/>` DOM element parameter its `.files` will be taken as a `FileList` parameter.

Progress can be metered by passing the `progress` option as part of the `options` argument.

```js
// React component
class ItemPage extends Component {
  render() {
    return (
      <div>
        ...
        <input type="file" onChange={ this.onFileSelected }/>
      </div>
    )
  }

  // Make sure to `.bind()` this handler
  onFileSelected(event) {
    const file = event.target.files[0]

    // Could also pass just `event.target.files` as `file`
    this.props.uploadItemPhoto(file)

    // Reset the selected file
    // so that onChange would trigger again
    // even with the same file.
    event.target.value = null
  }
}

// Redux action creator
// (the endpoint here is illustrative)
function uploadItemPhoto(file) {
  return {
    promise: http => http.post('/item/photo', { file }),
    events: ['UPLOAD_ITEM_PHOTO_PENDING', 'UPLOAD_ITEM_PHOTO_SUCCESS', 'UPLOAD_ITEM_PHOTO_FAILURE']
  }
}
```
JSON Date parsing
By default, when using the `http` utility, all JSON responses get parsed for javascript `Date`s, which are then automatically converted from `String`s to `Date`s. This is convenient, and also safe, because such date `String`s have to be in a very specific ISO format in order to get parsed (`year-month-dayThours:minutes:seconds[timezone]`). But if someone still prefers to disable this feature and get their `String`ified `Date`s back, then there's the `parseDates: false` flag in the configuration to opt out of this feature.
Page preloading
For page preloading consider using the `@preload()` helper to load the necessary data before the page is rendered.

```js
import React, { Component } from 'react'
import { connect } from 'react-redux'
import { Title, preload } from 'react-isomorphic-render'

// Fetches the list of users from the server
function fetchUsers() {
  return {
    promise: http => http.get('/api/users'),
    events: ['GET_USERS_PENDING', 'GET_USERS_SUCCESS', 'GET_USERS_FAILURE']
  }
}

@preload(({ dispatch }) => dispatch(fetchUsers()))
@connect(
  state => ({ users: state.users.users }),
  { fetchUsers }
)
export default class UsersPage extends Component {
  render() {
    const { users, fetchUsers } = this.props

    return (
      <div>
        <Title>Users</Title>
        <ul>
          { users.map(user => <li key={ user.id }>{ user.name }</li>) }
        </ul>
        <button onClick={ fetchUsers }>Refresh</button>
      </div>
    )
  }
}
```
In the example above the `@preload()` helper is called to preload a web page before it is displayed, i.e. before the page is rendered (both on the server side and on the client side).

The `@preload()` decorator takes a function which must return a `Promise`:

```js
@preload(function({ dispatch }) {
  return dispatch(fetchUsers())
})
```
Alternatively, `async/await` syntax may be used:

```js
@preload(async ({ dispatch }) => {
  await dispatch(fetchUsers())
})
```
When `dispatch` is called with a special "asynchronous" action (having `promise` and `events` properties, as discussed above), such a `dispatch()` call returns a `Promise`, and that's why in the example above it's written simply as:

```js
@preload(({ dispatch }) => dispatch(fetchUsers()))
```
Note: the `transform-decorators-legacy` Babel plugin is needed at the moment to make decorators work in Babel:

```sh
npm install babel-plugin-transform-decorators-legacy --save
```

.babelrc

```js
{
  "presets": [ ... ],
  "plugins": [
    "transform-decorators-legacy"
  ]
}
```
On the client side, in order for `@preload` to work, all `<Link/>`s imported from `react-router` must instead be imported from `react-isomorphic-render`. Upon a click on a `<Link/>` it first waits for the next page to preload, and then, when the next page is fully loaded, it is displayed to the user and the URL in the address bar is updated.

`@preload()` also works for Back/Forward web browser button navigation. If one `@preload()` is in progress and another `@preload()` starts (e.g. via the Back/Forward browser buttons), the first `@preload()` will be cancelled, provided `bluebird` `Promise`s are used in the project and `bluebird` is configured for `Promise` cancellation (this is an advanced feature and is not required for operation). `@preload()` can be disabled for certain "Back" navigation cases by passing the `instantBack` property to a `<Link/>` (e.g. for links on search results pages).
To run `@preload()` only on the client side, pass the second `{ client: true }` options argument to it:

```js
@preload(({ dispatch }) => dispatch(fetchUsers()), { client: true })
```

For example, a web application could be hosted entirely statically in a cloud like Amazon S3 and fetch data using a separately hosted API like Amazon Lambda. This kind of setup is quite popular due to being simple and cheap. Yes, it's not a true isomorphic approach, because the user is given a blank page first and then some `main.js` script fetches the page data in the browser. But, as said earlier, this kind of setup is ridiculously simple to build and cheap to maintain, so why not. Yes, Google won't index such websites, but if searchability is not a requirement (yet) then it's the way to go (e.g. for "MVP"s).

Specifying the `{ client: true }` option for each `@preload()` would result in a lot of copy-pasta, so there's a special configuration option for that: `{ preload: { client: true } }`.
@preload() indicator

Sometimes preloading a page can take some time, so one may want to (and actually should) add some kind of a "spinner" to inform the user that the application isn't frozen and that the navigation process just needs some more time to finish. This can be achieved by adding a Redux reducer listening to the three preload lifecycle Redux events:

```js
// (assuming these exported event name constants;
//  the exact names are listed in the "advanced" readme)
import { PRELOAD_STARTED, PRELOAD_FINISHED, PRELOAD_FAILED } from 'react-isomorphic-render'

export default function(state = {}, action = {}) {
  switch (action.type) {
    case PRELOAD_STARTED  : return { ...state, pending: true,  error: false }
    case PRELOAD_FINISHED : return { ...state, pending: false }
    case PRELOAD_FAILED   : return { ...state, pending: false, error: action.error }
    default               : return state
  }
}
```

And a "spinner" component would look like this:

```js
// (`ActivityIndicator` is any spinner component of your choice)
@connect(state => ({ pending: state.preload.pending }))
export default class Preload extends Component {
  render() {
    const { pending } = this.props

    return (
      <div className={ `preloading ${ pending ? 'preloading--shown' : '' }` }>
        <ActivityIndicator className="preloading__spinner"/>
      </div>
    )
  }
}
```
Page HTTP response status code
To set a custom HTTP response status code for a specific route, set the `status` property of that `<Route/>`.

```js
<Route path="/" component={ Layout }>
  <IndexRoute component={ Home }/>
  <Route path="blog"  component={ Blog }/>
  <Route path="about" component={ About }/>
  <Route path="*"     component={ PageNotFound } status={ 404 }/>
</Route>
```
Utilities
Setting `<title/>` and `<meta/>` tags

This package uses react-helmet under the hood.

```js
import { Title, Meta } from 'react-isomorphic-render'

// Webpage title will be replaced with this one
<Title>Home</Title>

// Adds additional <meta/> tags to the webpage <head/>
<Meta>
  <meta charset="utf-8"/>
  <meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=no"/>
  <meta property="og:title" content="International Bodybuilders Club"/>
  <meta property="og:description" content="This page explains how to do some simple push ups"/>
  <meta property="og:image" content="https://www.google.ru/images/branding/googlelogo/2x/googlelogo_color_120x44dp.png"/>
  <meta property="og:locale" content="ru_RU"/>
</Meta>
```
Handling asynchronous actions
Once one starts writing a lot of `http` calls in Redux actions, it becomes obvious that there's a lot of copy-pasting involved. To reduce those tremendous amounts of copy-pasta, an "asynchronous action handler" may be used:

redux/blogPost.js

```js
// (`./react-isomorphic-render-async.js` settings file is described below)

const handler = ...

// Post comment Redux "action creator"
const postComment = ...

// Get comments Redux "action creator"
const getComments = ...

// A developer can additionally handle any other custom events
handler.handle(...)

// This is for the Redux `@connect()` helper below.
// Each property name specified here or
// as a `result` parameter of an `action()` definition
// will be made available inside Redux'es
// `@connect(state => ({ ...connector(state.reducerName) }))`.
// This is just to reduce boilerplate when `@connect()`ing
// React Components to Redux state.
// Alternatively, each required property from Redux state
// can be specified manually inside the `@connect()` mapper.
...

// A little helper for Redux `@connect()`
// which reduces boilerplate when `@connect()`ing
// React Components to Redux state:
// `@connect(state => ({ ...connector(state.reducerName) }))`
// will add all (known) state properties from
// Redux state to the React Component `props`.
const connector = ...

// This is the Redux reducer which now
// handles the asynchronous actions defined above
// (and also the `handler.handle()` events).
// Export it as part of the "root" reducer.
export default ...
```
redux/reducer.js

```js
// The "root" reducer composed of various reducers.
export default {
  ...
}
```

The React Component would look like this:

```js
// Preload comments before showing the page
// (see the "Page preloading" section of this document)
@preload(({ dispatch, parameters }) => dispatch(getComments(parameters.blogPostId)))
// See `react-redux` documentation on the `@connect()` decorator
@connect(
  state => ({ ...connector(state.blogPost), userId: state.user.id }),
  { postComment }
)
export default class BlogPostPage extends Component {
  render() {
    const {
      getCommentsError,
      comments
    } = this.props

    if (getCommentsError) {
      return <div>Error while loading comments</div>
    }

    return (
      <div>
        <ul>
          { comments.map(comment => <li key={ comment.id }>{ comment.text }</li>) }
        </ul>
        { this.renderPostCommentButton() }
      </div>
    )
  }

  renderPostCommentButton() {
    // `params` are the URL parameters populated by `react-router`:
    // `<Route path="/blog/:blogPostId"/>`.
    const {
      userId,
      params,
      postComment,
      postCommentPending,
      postCommentError
    } = this.props

    if (postCommentPending) {
      return <div>Posting comment...</div>
    }

    if (postCommentError) {
      return <div>Error while posting comment</div>
    }

    return (
      <button onClick={ () => postComment(userId, params.blogPostId, '...') }>
        Post comment
      </button>
    )
  }
}
```
And the additional configuration would be:

react-isomorphic-render.js

```js
{
  // All the settings as before
  ...,

  ...asyncSettings
}
```

react-isomorphic-render-async.js

```js
import { underscoredToCamelCase } from 'react-isomorphic-render'

export default {
  // When supplying `event` instead of `events`
  // as part of an asynchronous Redux action
  // this will generate `events` from `event`
  // using this function.
  asynchronousActionEventNaming: event => ([
    `${event}_PENDING`,
    `${event}_SUCCESS`,
    `${event}_ERROR`
  ]),

  // When using the "asynchronous action handlers" feature
  // this function will generate a Redux state property name from an event name.
  // E.g. event `GET_USERS_ERROR` => state.`getUsersError`.
  asynchronousActionHandlerStatePropertyNaming: underscoredToCamelCase
}
```
Notice the extraction of these two configuration parameters (`asynchronousActionEventNaming` and `asynchronousActionHandlerStatePropertyNaming`) into a separate file, `react-isomorphic-render-async.js`: this is done to break the circular dependency on the `./react-isomorphic-render.js` file, because the `routes` parameter inside `./react-isomorphic-render.js` is the `react-router` `./routes.js` file, which `import`s React page components, which in turn `import` action creators, which in turn would import `./react-isomorphic-render.js`, hence the circular (recursive) dependency (same goes for the `reducer` parameter inside `./react-isomorphic-render.js`).
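For reference, a naming function doing what `underscoredToCamelCase` is described to do could be sketched as follows (an illustration of the transformation, not the library's exact code):

```javascript
// `GET_USERS_ERROR` -> `getUsersError`
function underscoredToCamelCase(event) {
  return event
    .toLowerCase()
    .split('_')
    .map((word, i) => i === 0 ? word : word[0].toUpperCase() + word.slice(1))
    .join('')
}
```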
Handling synchronous actions
For synchronous actions it's the same as for asynchronous ones (as described above):

```js
// (`./react-isomorphic-render-async.js` settings file is described above)

const handler = ...

// Displays a notification.
//
// The Redux "action" creator is gonna be:
//
//   function(message) {
//     return {
//       type: 'NOTIFICATIONS:NOTIFY',
//       message
//     }
//   }
//
// And the corresponding reducer is gonna be:
//
//   case 'NOTIFICATIONS:NOTIFY':
//     return {
//       ...state,
//       message: action.message
//     }
//
const notify = ...

// Or, it could be simplified even further:
//
//   export const notify = action({
//     namespace : 'NOTIFICATIONS',
//     event     : 'NOTIFY',
//     result    : 'message'
//   },
//   handler)
//
// Much cleaner.

// A little helper for Redux `@connect()`
const connector = ...

// This is the Redux reducer which now
// handles the actions defined above.
export default ...
```
Locale detection
This library performs the following locale detection steps for each webpage rendering HTTP request:

- Checks the `locale` query parameter (if it's an HTTP GET request)
- Checks the `locale` cookie
- Checks the `Accept-Language` HTTP header
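A simplified sketch of these three steps (illustrative only; the actual implementation also accounts for `Accept-Language` quality values and ordering):

```javascript
// Collects preferred locales from the query, the cookie
// and the `Accept-Language` HTTP header, in that order.
function detectLocales(query, cookies, acceptLanguage) {
  const locales = []

  if (query.locale) {
    locales.push(query.locale)
  }

  if (cookies.locale) {
    locales.push(cookies.locale)
  }

  if (acceptLanguage) {
    for (const part of acceptLanguage.split(',')) {
      // Strip the ";q=..." quality suffix
      locales.push(part.split(';')[0].trim())
    }
  }

  return locales
}

detectLocales({ locale: 'ru' }, {}, 'en-US,en;q=0.9')
// ['ru', 'en-US', 'en']
```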
The resulting locales array is passed as the `preferredLocales` argument into the `localize()` function parameter of the webpage rendering server, which should then return a `{ locale, messages }` object in order for `locale` and `messages` to be available as part of the `props` passed to the `wrapper` component, which can then pass those to `<IntlProvider/>` in case of using `react-intl` for internationalization.

```js
export default function Wrapper(props) {
  const { store, locale, messages, children } = props

  return (
    <AppContainer>
      <Provider store={ store }>
        <IntlProvider locale={ locale || 'en' } messages={ messages }>
          { children }
        </IntlProvider>
      </Provider>
    </AppContainer>
  )
}
```
Get current location
```js
// `withRouter` is available in `react-router@3.0`.
//
// For `2.x` versions just use the `this.context.router` property:
// static contextTypes = { router: PropTypes.func.isRequired }

// Using `babel-plugin-transform-decorators-legacy`
// https://babeljs.io/docs/plugins/transform-decorators/
@withRouter
export default class Location extends Component {
  render() {
    const { router } = this.props
    return <div>{ JSON.stringify(router.location) }</div>
  }
}
```
Changing current location
These two helper Redux actions change the current location (both on client and server).
```js
import { goto, redirect } from 'react-isomorphic-render'

// Usage example
// (`goto` navigates to a URL while adding a new entry in browsing history,
//  `redirect` does the same replacing the current entry in browsing history)
@connect(state => ({}), { goto, redirect })
class Page extends Component {
  handleClick(event) {
    const { goto, redirect } = this.props
    goto('/somewhere')
    // redirect('/somewhere')
  }
}
```
A sidenote: these two functions aren't supposed to be used inside `onEnter` and `onChange` `react-router` hooks. Instead use the `replace` argument supplied to these functions by `react-router` when they are called (`replace` works the same way as `redirect`).
Alternatively, if the current location needs to be changed while still staying at the same page (e.g. a checkbox has been ticked and the corresponding URL query parameter must be added), use `pushLocation(location, history)` or `replaceLocation(location, history)`.
```js
import { pushLocation } from 'react-isomorphic-render'

@withRouter
class Page extends Component {
  onSearch(query) {
    const { router } = this.props

    pushLocation({
      pathname: router.location.pathname,
      query: { query }
    }, router)
  }
}
```
Performance and Caching
React Server Side Rendering is quite slow, so I prefer setting the `render: false` flag to move all React rendering to the web browser. This approach has virtually no complications. There are still numerous (effective) approaches to speeding up React Server Side Rendering, like leveraging component markup caching or swapping the default React renderer for a much faster stripped-down custom one. Read more.
Monitoring
For each page being rendered, stats are reported if the `stats()` parameter function is passed as part of the rendering service settings.

```js
{
  ...

  stats({ url, route, time: { initialize, preload, render, total } }) {
    if (total > 1000) { // in milliseconds
      // (`db` here is a hypothetical logging utility)
      db.query('insert into slow_page_renders ...')
    }
  }
}
```
The arguments for the `stats()` function are:

- `url` — the requested URL (without the `protocol://host:port` part)
- `route` — the `react-router` route string (e.g. `/user/:userId/post/:postId`)
- `time.initialize` — server side `initialize()` function execution time (if defined)
- `time.preload` — page preload time
- `time.render` — page React rendering time
- `time.total` — total time spent preloading and rendering the page
Rendering a complex React page (having more than 1000 components) takes about 100ms (`time.render`). This is quite slow, but that's how React Server Side Rendering currently is.

Besides simply logging individual long-taking page renders, one could also set up overall Server Side Rendering performance monitoring using, for example, StatsD:

```js
{
  ...

  stats({ url, route, time: { initialize, preload, render, total } }) {
    statsd.increment('count')
    statsd.timing('initialize', initialize)
    statsd.timing('preload', preload)
    statsd.timing('render', render)
    statsd.timing('time', total)

    if (total > 1000) { // in milliseconds
      // (`db` here is a hypothetical logging utility)
      db.query('insert into slow_page_renders ...')
    }
  }
}
```
Where the metrics collected are:

- `count` — rendered pages count
- `initialize` — server side `initialize()` function execution time (if defined)
- `preload` — page preload time
- `render` — page React rendering time
- `time` — total time spent preloading and rendering the page
Speaking of StatsD itself, one could either install the conventional StatsD + Graphite bundle or, for example, use something like Telegraf + InfluxDB + Grafana.
Telegraf starter example:
```sh
# Install Telegraf (macOS).
brew install telegraf

# Generate Telegraf config.
telegraf -input-filter statsd -output-filter file config > telegraf.conf

# Run Telegraf.
telegraf -config telegraf.conf

# Request a webpage and see rendering stats being output to the terminal.
```
Webpack HMR
Webpack's Hot Module Replacement (aka Hot Reload) works for React components, Redux reducers and Redux action creators (it just doesn't work for page `@preload()`s).

HMR setup for Redux reducers is as simple as adding `store.hotReload()` (as shown below). For enabling HMR on React Components (and Redux action creators) I would suggest the new react-hot-loader 3 (which is still in beta, so install it like `npm install react-hot-loader@3.0.0-beta.6 --save`):
application.js
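A sketch of what the client-side entry point's HMR wiring could look like (assuming the `render()` promise resolves with the `store` and a `rerender()` function, which is how `store.hotReload()` would be wired in; check the example projects for the exact code):

```javascript
import { render } from 'react-isomorphic-render'
import settings from './react-isomorphic-render'

render(settings).then(({ store, rerender }) => {
  if (module.hot) {
    module.hot.accept('./react-isomorphic-render', () => {
      // Rerender the page with the updated React components
      rerender()
      // Hot-reload the updated Redux reducer
      store.hotReload(settings.reducer)
    })
  }
})
```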
wrapper.js
```js
// `react-hot-loader@3`'s `<AppContainer/>`
import { AppContainer } from 'react-hot-loader'

export default function Wrapper({ store, children }) {
  return (
    <AppContainer>
      <Provider store={ store }>
        { children }
      </Provider>
    </AppContainer>
  )
}
```
.babelrc
```js
{
  "presets":
  [
    "react",
    // For Webpack 2 ES6:
    ["es2015", { "modules": false }]
    // For Webpack 1:
    // "es2015", "stage-2"
  ],

  "plugins":
  [
    // `react-hot-loader@3` Babel plugin
    "react-hot-loader/babel"
  ]
}
```
webpack.config.js
```js
entry: {
  main: [
    // This line is required for `react-hot-loader@3`
    'react-hot-loader/patch',

    'webpack-hot-middleware/client?http://localhost:8080',
    'webpack/hot/only-dev-server',

    './src/application.js'
  ]
},
plugins: [
  ...
]
```
P.S.: Currently it says `Warning: [react-router] You cannot change <Router routes>; it will be ignored` in the browser console. I'm just ignoring this for now; maybe I'll find a proper fix later. Currently I'm using this hacky workaround in `./src/client/application.js`:

```js
/**
 * Warning from React Router, caused by react-hot-loader.
 * The warning can be safely ignored, so filter it from the console.
 * Otherwise you'll see it every time something changes.
 * See https://github.com/gaearon/react-hot-loader/issues/298
 */
if (module.hot) {
  const isString = a => typeof a === 'string';
  const orgError = console.error; // eslint-disable-line no-console
  console.error = (...args) => { // eslint-disable-line no-console
    if (args && args.length === 1 && isString(args[0]) && args[0].indexOf('You cannot change <Router routes>;') > -1) {
      // React route changed
    } else {
      // Log the error as normally
      orgError.apply(console, args);
    }
  };
}
```
WebSocket
The `websocket()` helper sets up a WebSocket connection.

If the `token` parameter is specified, then it will be sent as part of every message (providing support for user authentication).

The WebSocket will autoreconnect (with "exponential backoff"), emitting an `open` event every time it does.
After the `websocket()` call a global `websocket` variable is created, exposing the following methods:
- `listen(eventName, function(event, store))`
- `onOpen(function(event, store))` – is called on the `open` event
- `onClose(function(event, store))` – is called on the `close` event
- `onError(function(event, store))` – is called on the `error` event (a `close` event always follows the corresponding `error` event)
- `onMessage(function(message, store))`
- `send(message)`
- `close()`

The `store` argument can be used to `dispatch()` Redux "actions".
```js
websocket.onMessage((message, store) => {
  ...
})

websocket.onOpen((event, store) => {
  websocket.send({ ... })
})

websocket.onClose((event, store) => {
  ...
})
```
The global `websocket` object also exposes the `socket` property, which is the underlying `robust-websocket` object (for advanced use cases).

As for the server-side counterpart, I can recommend using uWebSockets:

```js
const server = new WebSocketServer({ port: 8888 })

const userConnections = {}

server.on('connection', (socket) => {
  ...
})

// Also an HTTP server is started and a REST API endpoint is exposed
// which can be used for pushing notifications to clients via WebSocket.
// The HTTP server must only be accessible from the inside
// (i.e. not listening on an external IP address, not proxied to),
// otherwise an attacker could push any notifications to all users.
// Therefore, only WebSocket connections should be proxied (e.g. using NginX).
```
Feature: upon receiving a `message` (on the client side) having a `type` property defined, such a `message` is `dispatch()`ed as a Redux "action" (this can be disabled via the `autoDispatch` option). For example, if `{ type: 'PRIVATE_MESSAGE', content: 'Testing', from: 123 }` is received on a websocket connection, then it is automatically `dispatch()`ed as a Redux "action". Therefore, the above example could be rewritten as:

```js
// Server side (REST API endpoint)
socket.send(JSON.stringify({
  type: 'PRIVATE_MESSAGE',
  content: message.content,
  from: message.from
}))

// Client side (Redux reducer)
function reducer(state = {}, action = {}) {
  switch (action.type) {
    case 'PRIVATE_MESSAGE':
      return { ...state, message: action }
    default:
      return state
  }
}
```
Static site generation
In those rare cases when a website's content doesn't change at all (or changes very rarely, e.g. a blog), it may be beneficial to host a statically generated version of such a website on a CDN, as opposed to hosting a full-blown Node.js application just for the purpose of real-time webpage rendering. In such cases one may choose to generate a static version of the website by snapshotting it on a local machine and then host it in a cloud at virtually zero cost.

First run the website in production (it can be run locally, for example).

Then run the following Node.js script, which is gonna snapshot the currently running website and put it in a folder which can then be hosted anywhere.

```sh
# If the website will be hosted on Amazon S3
npm install s3 --save
```

```js
// The following code hasn't been tested so create an issue in case of a bug

import path from 'path'

// Index page is added by default
let pages =
[
  '/about',
  { url: '/unauthenticated', status: 401 },
  { url: '/unauthorized',    status: 403 },
  { url: '/not-found',       status: 404 },
  { url: '/error',           status: 500 }
]

async function run() {
  // Fetch the list of items to generate pages for
  // (`download` here is a hypothetical helper)
  const { status, content } = await download(...)

  if (status !== 200) {
    throw new Error('Couldn\'t load items')
  }

  const items = JSON.parse(content)
  pages = pages.concat(items.map(item => `/items/${item.id}`))

  const output = path.resolve(...)

  // Snapshot the website
  await snapshot({
    host: configuration.host,
    port: configuration.port,
    pages,
    output
  })

  // Copy assets (built by Webpack)
  await copy(...)

  // Upload the website to Amazon S3
  await upload(...)
}
```
Bundlers
If you're using Webpack, then make sure you either build your server-side code with Webpack too (so that asset `require()` calls (images, styles, fonts, etc.) inside React components work; see universal-webpack), or use something like webpack-isomorphic-tools.
Advanced
At some point in time this README became huge so I extracted some less relevant parts of it into README-ADVANCED (including the list of all possible settings and options). If you're a first timer then just skip that one.
Contributing
After cloning this repo, ensure dependencies are installed by running:
npm install
This module is written in ES6 and uses Babel for ES5 transpilation. Widely consumable JavaScript can be produced by running:
npm run build
Once `npm run build` has run, you may `import` or `require()` directly from node.
After developing, the full test suite can be evaluated by running:
npm test
When you're ready to test your new functionality on a real project, you can run
npm pack
It will `build`, `test` and then create a `.tgz` archive, which you can then install in your project folder:
npm install [module name with version].tar.gz