Server-Side Rendering an SPA on a Shared Host

Information technology has helped us understand and improve many things. The web has changed in a similar way: the latest front-end frameworks and libraries like Angular, React, and Vue were built to improve the user experience.

Before single page applications, every request went to the server, which returned a full page with the data already populated. We kept re-downloading the same small pieces of assets every time we hit a URL. Now we load specific components and request only the data we need. The UX of the website improves and the speed increases, which makes the site more user friendly.

On one side, we are improving the user experience with fast-loading, quick interfaces, but on the other side we are losing our website's reach: with so much focus on humans, the bots and crawlers of search engines have a hard time reading our applications.

No matter how fast your website loads, if it can't reach its proper audience, it has little value.

The Problem

A single page application, or SPA, updates the current page with new data, so we don't have to download the whole page again and again. This behavior makes the website faster, but search engine bots are not able to crawl this dynamic content. They do not wait for a page to fetch and render its data.

A bot ends up crawling the static template file with no content, the one holding the root element that these frameworks and libraries use to render our components. Your website will not rank for any keyword or content, because the search engine bots never see it.

A good way to see this is to view your SPA's source code in the browser and compare it with the inspect element view after the components have loaded.

Angular application – view source

In the image above, you can see that the browser renders only the static template, where we have a root element called app-root that is used to render our components. This is exactly how web crawlers see our web application.

Now check the screenshot below. The inspect element view shows the components rendered on the client side when we hit the root URL of our application. This is what a user sees, but it is not accessible to bots because they do not wait for such dynamic content.

Angular application – inspect element

If you can run Node JS, there are many packages that help with server side rendering; the Angular Universal toolkit is one of them. But here we are talking about a shared server.

We cannot run Node JS on a shared server; that requires a dedicated or VPS server. So we must use a server side language that does run on shared hosting. PHP can be helpful here to build a back-end for our Angular, React, or Vue application. You can create a simple fake back-end with API endpoints that our application sends requests to. Or maybe using WordPress can save us time? WordPress has its own API and is easy to use.
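As a rough illustration, here is a minimal sketch of such a fake back-end: a single PHP file (the name api/posts.php and the post data are made up) that returns hard-coded posts as JSON, optionally filtered by a ?slug= query parameter.

<?php
// api/posts.php - a hypothetical, minimal "fake" API endpoint.
// It returns a hard-coded list of posts as JSON.
header('Content-Type: application/json');

$posts = [
    ['slug' => 'hello-world', 'title' => 'Hello World', 'content' => '<p>First post.</p>'],
    ['slug' => 'second-post', 'title' => 'Second Post', 'content' => '<p>More content.</p>'],
];

// If a slug is given, keep only the matching post.
if (isset($_GET['slug'])) {
    $posts = array_values(array_filter($posts, function ($post) {
        return $post['slug'] === $_GET['slug'];
    }));
}

echo json_encode($posts);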

Solution

To enable server side rendering on our shared server, we need to make some changes to our Apache configuration using a .htaccess file.

I have already explained how web crawlers crawl our website. We can actually detect them using our server configuration. The basic idea is to tell apart requests made by bots and requests made by users: our configuration will spot requests coming from web crawlers and redirect them to a different page that serves the content just for them. That, in essence, is our server side rendering.

In your root directory, create a .htaccess file.

First, we need to enable the rewrite engine (this requires Apache's mod_rewrite module).


RewriteEngine On

Now we will add a condition for our rewrite. We want to redirect all crawlers, but not users, to a specific URL.

RewriteCond %{HTTP_USER_AGENT} (facebookexternalhit/[0-9]|Twitterbot|Googlebot) [NC]

Here we set a condition on HTTP_USER_AGENT: we check whether the user agent is one of the bots listed in our pattern, i.e. the Facebook crawler, Twitterbot, or Googlebot. The [NC] flag makes the match case-insensitive.

If the condition matches, the rewrite rule below it is applied. This is just an example with the Facebook, Twitter, and Google bots; there are many more web crawlers, and you can add them according to your needs.

RewriteRule ^blog/(.*)$ https://www.ashiish.me/ssr/post.php?slug=$1 [R=301,L]

Above is our rewrite rule for redirecting bots. A rule consists of a pattern, a target, and flags.

^blog/(.*)$ – captures everything after blog/ up to the end of the path. We will use this captured string to make a request to the WordPress API.

https://www.ashiish.me/ssr/post.php?slug=$1 is our target for the bot, where the captured string is passed as the value of the slug parameter.

[R=301,L] are the two flags: R=301 issues a 301 redirect, and L stops any further rewriting.

Full .htaccess code

RewriteEngine On

RewriteCond %{HTTP_USER_AGENT} (facebookexternalhit/[0-9]|Twitterbot|Googlebot) [NC]

RewriteRule ^blog/(.*)$ https://www.ashiish.me/ssr/post.php?slug=$1 [R=301,L]

Now our configuration is all set. Next, we need to create a post.php file inside an ssr directory.

This is the file our bots will visit; it builds a snapshot, a server-rendered version of the post. You can optimize this page with on-site SEO techniques.

In post.php we will use the slug value we receive when a bot hits our URL, and pass that slug to fetch data from the WordPress API. You need to install WordPress inside your root directory in a separate folder to keep things organized. In my case, I will install WordPress under the api folder inside the root directory.

My site root: https://www.ashiish.me/ssr/

My WP API: https://www.ashiish.me/ssr/api/

API By WP: https://www.ashiish.me/ssr/api/wp-json/wp/v2/posts

Once WordPress is installed, you can make a request to your site's API URL as above and it will return nice JSON data for the blog posts.

You can get posts by ID, slug, etc. We will use the slug to fetch our page.
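As a quick illustration (the post ID 42 and the slug my-first-post are made-up examples), fetching by ID returns a single post object, while fetching by slug returns an array of matching posts:

<?php
// Hypothetical examples against the WordPress API install above.
$base = 'https://www.ashiish.me/ssr/api/wp-json/wp/v2/posts';

// By ID: the API returns a single post object.
$post = json_decode(file_get_contents($base . '/42'));
echo $post->title->rendered;

// By slug: the API returns an array of matching posts.
$posts = json_decode(file_get_contents($base . '?slug=my-first-post'));
echo $posts[0]->title->rendered;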

Note: by default, the WordPress API doesn't provide a featured image URL when you make a request to the /posts endpoint.
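If you don't want to depend on a custom field, one alternative worth knowing is the API's _embed parameter, which embeds linked resources such as the featured media in the response. A minimal sketch (the slug is again made up):

<?php
// Ask the API to embed linked resources with ?_embed, then read the
// featured image URL from _embedded -> wp:featuredmedia.
$raw = file_get_contents('https://www.ashiish.me/ssr/api/wp-json/wp/v2/posts?slug=my-first-post&_embed');
$data = json_decode($raw);
$image_url = $data[0]->_embedded->{'wp:featuredmedia'}[0]->source_url ?? '';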

I have created a simple but useful Headless CMS theme for WordPress.

You can download it from my GitHub Repository.

WordPress headless: https://github.com/ashiishme/wordpress-headless-cms

You can now install WordPress along with the theme above, which takes care of the rest. You need to update the redirect location in the project's index.php to your own domain, so that anyone who tries to access the WordPress front end directly is sent to the root of your domain.

The headless CMS theme above adds a custom field, featured_image_src, which exposes the featured image URL. You will find this key used in our post.php file.

Now, in your post.php file, let's set a few variables and fetch data from the WordPress API.

<?php
// Your API URL with slug as a param.
$api = 'https://www.ashiish.me/ssr/api/wp-json/wp/v2/posts?slug=';
// Your site root
$siteRoot = 'https://www.ashiish.me/ssr/';

$title = $excerpt = $content = $image_url = $page_url = '';

// Check whether the slug param is set and read its value from $_GET.
$slug = isset($_GET['slug']) ? $_GET['slug'] : '';
// Fetch the raw JSON data from the API.
$raw_data = file_get_contents($api . urlencode($slug));
$data = json_decode($raw_data);

// The slug query returns an array of posts; stop if nothing matched.
if (empty($data)) {
    http_response_code(404);
    exit('Post not found.');
}

$title = $data[0]->title->rendered;
// Strip HTML tags so the excerpt is safe inside meta attributes.
$excerpt = strip_tags($data[0]->excerpt->rendered);
$content = $data[0]->content->rendered;

// featured_image_src is the custom field provided by our headless CMS theme.
$image_url = $data[0]->featured_image_src;
$page_url = $siteRoot . "blog/" . $data[0]->slug;
?>

Then add your HTML right below the PHP block:

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  
  <title><?php echo $title; ?></title>

  <!--SEO-->
  <meta name="description" content="<?php echo $excerpt; ?>">
  <meta name="author" content="Ashish Yadav">

  <!-- Open Graph For Facebook -->
  <meta property="og:site_name" content="Ashiish Me">
  <meta property="og:type" content="website">
  <meta property="og:title" content="<?php echo $title; ?>" />
  <meta property="og:url" content="<?php echo $page_url; ?>">
  <meta property="og:description" content="<?php echo $excerpt; ?>">
  <meta property="og:image" content="<?php echo $image_url; ?>">
  <!--/ Open Graph -->

</head>
<body>
  <img src="<?php echo $image_url; ?>" alt="<?php echo $title; ?>">
  <h1><?php echo $title; ?></h1>
  <div><?php echo $content; ?></div>
</body>
</html>

That's it. Now, every time a bot hits the /blog path of your Angular, React, or Vue application, it is redirected to post.php, where we render the content on the server and return it to the bot. Bots will be able to crawl, read, and understand the page. You can test it with the Facebook Debugger, since we are using Open Graph meta tags.
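You can also test the redirect yourself from the command line by spoofing a crawler's user agent (the URL here is just an example):

curl -I -A "Twitterbot" https://www.ashiish.me/blog/my-first-post

The -I flag shows only the response headers, so you should see a 301 status with a Location header pointing at post.php.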

This is how we can achieve SSR on a shared server if you don't have a VPS or dedicated server, or simply want to use PHP as a back-end. You can adapt the /blog path and cover multiple pages with additional RewriteRules, as in the example below.
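For example, a second rule for static pages might look like this, where page.php is a hypothetical script you would write just like post.php:

RewriteRule ^page/(.*)$ https://www.ashiish.me/ssr/page.php?slug=$1 [R=301,L]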

If you have any questions, please feel free to ask in comments.