I am currently working on Practo's Healthfeed, a place where doctors and health specialists write health articles and share their expertise. If nothing else, any blogging platform needs to be great at ONE thing above all: SEO. A blog with good SEO is destined to get more readers than anyone else. And more readers means better SEO!
To get better SEO you have to do a lot of things right. But here we'll only talk about the two most important, most fundamental ones:
Making your site crawlable for bots.
Making it FAST. Like Millennium Falcon fast!
Enter React!
When React came out, one of its selling points was that it supported Server Side Rendering (SSR). To make your app support SSR, all you need is a Node server and an API. The modern architecture looks something like this:
Notice! The Node server acts as a middleman between the user and the API server. So the flow goes like this:
The user hits the URL; the request goes to the Node server.
The Node server makes a request to the API server and gets the data back.
It pushes the data to the app, which in return creates the final HTML for the user.
It returns the HTML string back to the user.
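The four steps above can be sketched as a single request handler. This is a minimal sketch, not Practo's actual code: `fetchFromApi` and `renderApp` are hypothetical stand-ins for the real API call and the React app (in practice `renderApp` would be `ReactDOMServer.renderToString`).

```javascript
// Sketch of the SSR flow: the Node server sits between the user and the API.

// Steps 1–2: the Node server fetches data from the API server.
// (Stubbed here; a real server would make a network call.)
async function fetchFromApi(url) {
  return { title: "Healthy Living", body: "Drink water." };
}

// Step 3: the "app" turns data into the final HTML.
// (Stands in for React's renderToString.)
function renderApp(data) {
  return `<html><body><h1>${data.title}</h1><p>${data.body}</p></body></html>`;
}

// Step 4: the Node server handler ties it together and
// returns the HTML string back to the user.
async function handleRequest(url) {
  const data = await fetchFromApi(url); // Node → API server
  const html = renderApp(data);         // data → app → HTML
  return html;
}

handleRequest("/article/42").then(html => console.log(html));
```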
Now the whole setup is done! The server is taking requests, the API is giving responses, and finally users/bots are getting a fully rendered HTML page. But this can turn out to be a user's nightmare.
What if the API server is slow? Like a 500 ms response time.
One problem with server side rendering is that its response time relies heavily on the API server's response time. That means no matter how efficient and fast an app you've built, the user will see a white screen for at least 500 ms, and that's assuming your Node server has a 0 ms response time, which is practically impossible (for now).
So let’s see the breakdown here:
500 ms response time from API
150 ms for server side rendering (yes, it takes that much)
10 ms for node server
150 ms network latency
So a user will get a response after almost 810 ms! Of course, these are just average numbers; in the real world it can be a lot worse. Since we don't have much control over the network latency, we'll keep that out. That puts the server response time at 660 ms.
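The arithmetic behind those two totals, using the breakdown above (all values in ms):

```javascript
// Rough latency budget from the breakdown above (all in milliseconds).
const apiResponse    = 500; // API server response time
const ssrRender      = 150; // server side rendering
const nodeServer     = 10;  // Node server overhead
const networkLatency = 150; // network round trip

// Total time until the user sees a response.
const total = apiResponse + ssrRender + nodeServer + networkLatency;
console.log(total); // 810

// Network latency is out of our control, so the server-side share is:
const serverTime = total - networkLatency;
console.log(serverTime); // 660
```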
To improve the situation, we’ll first catch the biggest fish: API response time.
Enter Redis!
Redis is one of the most powerful data-structure stores: super fast and efficient. You can store anything in there as key-value pairs, and integrating Redis into a Node environment is super easy. If we store the API result in Redis, we can save the network trip to the API server.
So now, whenever the user makes a request, the Node server will ask Redis whether it already has the response. If it does, it'll pass it directly to the app for rendering and finally return the HTML string. If it doesn't, we'll go ahead and call the API, and store the result in Redis before passing it to the app.
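The logic above is the classic cache-aside pattern. A minimal sketch, with a plain `Map` standing in for Redis (a real setup would use a client such as node-redis or ioredis; `fetchFromApi` is a hypothetical stand-in for the slow API call):

```javascript
// In-memory stand-in for Redis; real code would use a Redis client.
const cache = new Map();

// Stand-in for the slow (~500 ms) API call.
async function fetchFromApi(url) {
  return { title: "Healthy Living" };
}

async function getData(url) {
  const cached = cache.get(url);
  if (cached) return cached;            // cache hit: skip the API trip entirely

  const data = await fetchFromApi(url); // cache miss: go to the API server
  cache.set(url, data);                 // store it for the next request
  return data;
}
```

With Redis you'd also set a TTL on each key so cached articles eventually expire instead of going stale forever.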
The new breakdown will be:
150 ms for server side rendering (NOTE: the rendering time might be lower or much higher, depending on your application size)
5 ms for the Redis server
10 ms for the Node server
Just by caching our API responses, we have dropped from 660 ms to roughly 160 ms.
Squeeze More!
Although 160 ms is good, we can squeeze out more with just a small trick.
Instead of storing the API response in Redis, why not store the whole HTML string itself?