
Gridsome – Implement Additional SEO

Gridsome is a modern website development framework for creating fast and secure websites that can be deployed anywhere. Static HTML files are generated to create SEO-friendly markup that hydrates into a Vue.js-powered SPA once loaded in the browser.

One of the main goals of Gridsome is to make a framework that lets developers build websites that are optimized out of the box.

Static website generators like Gridsome provide a number of benefits to web developers, primarily improved performance, version control, security and ease of deployment.

Performance

Speed is a huge advantage of static sites. Since there are no database queries to run and no processing on every request, web servers can serve static files almost instantly to visitors.

Version Control

Changes to your static site can be tracked through source control management programs like Git, which allows more people to work collaboratively on a project and to undo changes when something goes wrong.

Security

Content Management Systems (CMS) like WordPress are a common target for hackers and malicious users, and can be exploited through anything from simple reflected XSS (Cross-Site Scripting) and SQL Injection all the way to Remote Code Execution.

Statically generated sites, however, are a tougher nut to crack since they have little to no server-side functionality. There is a reduced code and application surface area where something can go wrong and allow unauthorized access to the website content. Because static sites are typically generated on one machine and then served from a different one, there is not much an attacker can do to the web server serving the static files.

All these features and benefits also make statically generated websites great for SEO, most notably because they are blazingly fast. If you've worked on any major e-commerce or marketing-focused website, you'll know that digital marketers are always concerned about the speed of the website.

This is for two primary reasons. The first is that most customers expect an almost instantaneous response to their actions on a website; more users than ever are accessing websites via mobile devices and want near real-time responses.

The second reason is that Google rewards fast sites. Over the years there has been quite a lot of debate about this in the SEO community, with many differing opinions, and like most things consultants will often answer with "it depends". But the general consensus is that having a fast site will definitely not hurt or damage your SEO reputation, whereas having a slow site will.

So, erring on the side of caution, it's best to always ensure that your site speed is optimal. If you're planning a new website, it's probably best to go for a statically generated one.

In a previous article we looked at using Gridsome to implement some basic On-Page SEO. We will now expand on that article by implementing other essential features when it comes to SEO for your website. When optimising your On-Page SEO, it's vitally important to remember that, for the most part, SEO is just common sense.

We will take a look at implementing features that all contribute to promoting your site and make it easier for search engines to crawl and index your site.

https://github.com/garywoodfine/geekiam

Implement RSS Feeds

As developers we spend a lot of time online browsing and reading websites, and we will invariably subscribe to some of our favourite websites using RSS feeds to keep up to date with any new content.

Personally, I use the Thunderbird Email Client, which has an RSS feed feature, and I subscribe to several of my favourite news sources to keep up to date.

What is RSS ?

RSS is an acronym that stands for Really Simple Syndication. It is a technology which dates back to the late 1990s and was developed as a way to allow website content to be syndicated for use on other sites.

RSS has evolved to become a popular standardised format used to publish frequently changing content, most commonly news headlines but also blog posts, video content, podcasts and calendar events.

The main benefit of RSS is that instead of having to go out to each individual website and see if there is any new content, the content comes to you in one centralized location. The predominant method for using RSS feeds is through an application known as a feed reader or aggregator, such as the Thunderbird Email Client.

If you're a webmaster or business owner, RSS feeds have some important benefits for you. By creating an RSS feed for your content and encouraging visitors to subscribe, you allow those who stumble upon your website to become regular readers who will continue to engage with your content and your brand.

SEO For Dummies also explains the important role RSS feeds play in helping to promote your site and content across many syndication services.

SEO For Dummies

Up relevance scores, improve page speed, optimize voice search questions, and more! The book shows website owners, developers, and search engine optimizers (SEOs) how to create a website that ranks at the top of search engines and has high-volume traffic, while answering the essential question of "how do I get people to visit my site?"

How to Implement RSS feeds with Gridsome

In Gridsome - Explore plugins making life easy we took a look at a few plugins which really help to set up Gridsome for use with the Netlify CMS and Tailwind CSS. The Gridsome plugin directory has a rich set of plugins available to help you out with some of the most common tasks you will need to do when setting up and configuring new websites.

We are going to take a look at another plugin that is available to configure your Gridsome statically generated website to generate RSS feeds: gridsome-plugin-feed. As a bonus, this plugin also provides the ability to generate an Atom and/or JSON feed for your Gridsome site.

What is ATOM ?

Atom is a name that applies to a pair of related web standards. The Atom Syndication Format is an XML language used for web feeds, while the Atom Publishing Protocol (AtomPub or APP) is a simple HTTP-based protocol for creating and updating web resources.

The Atom format was developed as an alternative to RSS. However, in my experience at least, it is not as popular in the web syndication community, but it is still used.

What is JSON Feed ?

JSON Feed is a format similar to RSS and Atom, but in JSON. It is a new-ish project aiming to put together a formal spec for JSON-based feeds.

Fortunately, by making use of the gridsome-plugin-feed plugin, you don't need to concern yourself with the complexities and intricacies of building these feeds; it literally becomes a plug-and-go process.

How to Install Gridsome Plugin Feed

Installing the Gridsome feed plugin is super easy; it's just a case of using npm to install it.

Shell
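Assuming npm is being used as the package manager (yarn add works just as well), the install is a single command:

npm install gridsome-plugin-feed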

Once installed, we can configure it by editing our gridsome.config.js and adding the following to our plugins array. We are going to implement just enough code to get the most basic feeds functional, and we are going to activate all three of the available feed types.

JS
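A sketch of what this can look like, based on the options documented in the plugin's README; the feedOptions title and description values are placeholders to replace with your own:

module.exports = {
  // siteUrl was set in Gridsome – Explore plugins to make life easy and is
  // used by the plugin to build absolute links to posts in the feed
  siteUrl: 'https://geekiam.io',
  plugins: [
    {
      use: 'gridsome-plugin-feed',
      options: {
        // Content types to include in the feed
        contentTypes: ['Post'],
        // Top-level metadata for the generated feeds (placeholder values)
        feedOptions: {
          title: 'Geek.I.Am',
          description: 'Latest articles from Geek.I.Am'
        },
        // Activate all three available feed types
        rss: { enabled: true, output: '/feed.xml' },
        atom: { enabled: true, output: '/feed.atom' },
        json: { enabled: true, output: '/feed.json' }
      }
    }
  ]
}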

In Gridsome – Explore plugins to make life easy we set a few properties in the general section of gridsome.config.js, the most important of which, for this plugin, is the siteUrl, which the plugin uses to generate links to posts in the feed.

The contentTypes option is an array of the content types you want to include in the feed. In our case, at the current stage of this tutorial, that is Post, which we created in Gridsome - Using Markdown Files.

We have also added an additional property to our Post content type: a Boolean flag to indicate whether the post is ready to be published or not. This will be important later, because we want to avoid confusing feed readers if we accidentally publish an article before it is finished, or later attempt to change its slug.
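As a rough sketch of how that flag can eventually be used, the plugin exposes a filterNodes option, so only posts marked as ready would be included in the feeds; the published field name here is an assumed name for the frontmatter flag described above:

// Inside the gridsome-plugin-feed options block in gridsome.config.js
filterNodes: node => node.published === true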

Once we deploy our site to our server, we can then navigate to https://geekiam.io/feed.xml and view our feed.

Add XML Site Map

A sitemap is pretty much essential for any website, and having a sitemap that is constructed with clear goals in mind could be a driving factor in a website's success. Sitemaps provide a vital link between a website and search engines.

A well-structured sitemap helps make a website discoverable by all search engines, and helps provide users with more accurate search results when they are looking for keywords or key terms associated with a website. Search engine crawlers rely on sitemaps to discover and index a site's pages.

How to Install XML SiteMap for Gridsome

Check out the Gridsome plugin directory and you'll find there is already the handy @gridsome/plugin-sitemap, and all that is required to install it is the regular npm command.

Shell
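Again, assuming npm:

npm install @gridsome/plugin-sitemap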

Once installed, we edit gridsome.config.js again to add our configuration details. The plugin's example shows a number of different configuration options, depending on your pages and URL structure. For the time being we are going to keep it simple, because we only have the one Post type in the root directory.

We add the following to our plugins array in gridsome.config.js.

JS
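For this simple case the entry can be as minimal as the sketch below; the plugin picks up siteUrl from the general config, and options such as exclude or per-path changefreq and priority settings can be layered on later if the URL structure grows:

plugins: [
  {
    // The defaults are enough while all posts live in the root directory;
    // siteUrl from the general config is used to build the sitemap entries
    use: '@gridsome/plugin-sitemap'
  }
]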

Information

Using a sitemap doesn't guarantee that all the items in your sitemap will be crawled and indexed, as Google's processes rely on complex algorithms to schedule crawling. However, in most cases your site will benefit from having a sitemap, and you'll never be penalized for having one.

Once we deploy our new settings to the server and navigate to https://geekiam.io/sitemap.xml, we can see our generated sitemap.

Once you have your sitemap working, it's a good idea to add an additional tag to your site's head, just to let search engines and other web crawlers know that your site has a sitemap available. To do this, we can use the technique we defined in Gridsome – Configure Basic On-Page SEO to add additional head information in our main.js.

JS
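A sketch of what that addition to src/main.js can look like, using a link element with rel="sitemap", which is one common convention for advertising a sitemap location:

export default function (Vue, { router, head, isClient }) {
  // Let search engines and other crawlers know where the sitemap lives
  head.link.push({
    rel: 'sitemap',
    type: 'application/xml',
    href: 'https://geekiam.io/sitemap.xml'
  })
}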

Add Robots.txt

A robots.txt file is a set of instructions for bots. This file is included in the source files of most websites. Robots.txt files are mostly intended for managing the activities of good bots like web crawlers, since bad bots aren't likely to follow the instructions.

A robots.txt file is just a text file with no HTML markup code. The robots.txt file is hosted on the web server just like any other file on the website.

The robots.txt file for any given website can typically be viewed by typing the full URL for the homepage and then adding /robots.txt.

In Gridsome we can simply add a plain text file called robots.txt to our static directory, because as part of the build process any files placed in the static directory are deployed along with our site.

Once complete, we can add an additional line to inform good web crawlers that we have a sitemap available.

Plain Text
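The static/robots.txt file can be as simple as the following, with the Sitemap line being the addition that points well-behaved crawlers at the sitemap generated earlier:

User-agent: *
Allow: /
Sitemap: https://geekiam.io/sitemap.xml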

Conclusion

Implementing these features will increase the SEO effectiveness of your Gridsome website. I am off to write more articles on Geek.I.Am, because the most important aspect of really improving your search engine rankings is content; without any content, nothing else you do will have an effect.

Over the next couple of weeks I will be generating a lot of content on Geek.I.Am, which will obviously require some additional development, design and styling changes and enhancements which will undoubtedly spur additional articles!

Gary Woodfine