MylifeforAiur

React SEO: manage sitemaps


Background

This is another step in the React SEO improvement series with Gatsby. A sitemap serves as a report of your site's content and helps crawlers grasp the nuts and bolts of the site.

According to Google:

A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them.

Sitemaps are especially helpful if:

  • Your site is large and has many pages with low traffic
  • Your site is new and eager to get noticed
  • Your site is rich in media content like videos and images

Problem

  1. The site needs a sitemap automation tool that generates the sitemap with your production build, so it stays maintenance free.

  2. Some URLs, such as those generated by client-side rendering, are not recognised by the sitemap tool. You need to attach them manually while keeping the automated part running.

Solution

gatsby sitemap plugin
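Enabling it is a one-line change in gatsby-config.js once the plugin is installed (npm install gatsby-plugin-sitemap). A minimal sketch, assuming gatsby-plugin-sitemap with its default options (output paths and defaults can differ between plugin versions, and siteUrl should be your own domain):

// gatsby-config.js
module.exports = {
  siteMetadata: {
    // the plugin needs siteUrl to build absolute <loc> entries
    siteUrl: `https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog`,
  },
  plugins: [`gatsby-plugin-sitemap`],
}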

It will check all pages and generate a sitemap like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
  xmlns:news="http://www.google.com/schemas/sitemap-news/0.9"
  xmlns:xhtml="http://www.w3.org/1999/xhtml"
  xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
  xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/hello-world/</loc>
    <changefreq>daily</changefreq>
    <priority>0.7</priority>
  </url>
  <url>
    <loc>https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/my-second-post/</loc>
    <changefreq>daily</changefreq>
    <priority>0.7</priority>
  </url>
</urlset>

⚠️ changefreq and priority will be ignored by Google bots

  • That covers the moving parts, but how do you append fixed URLs that are not auto-generated? The answer is a multi-location sitemap (a sitemap index). This index is what you submit to Google Search Console and reference in robots.txt. A sample looks like:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/sitemap-0.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/sitemap-manual.xml</loc>
  </sitemap>
</sitemapindex>
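The second entry points to a file you maintain by hand. sitemap-manual.xml is just another urlset; a minimal sketch, where /app/dashboard/ stands in for a hypothetical client-side-rendered route the plugin cannot discover:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/app/dashboard/</loc>
  </url>
</urlset>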

With the above, sitemap-0.xml is generated by the sitemap plugin and keeps track of every Gatsby page URL, while sitemap-manual.xml is the extra file you declare yourself.
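To make the index discoverable, reference it from robots.txt as well. A minimal sketch, assuming the combined index is published as sitemap-index.xml at the site root (adjust the path to wherever you actually serve it):

User-agent: *
Allow: /
Sitemap: https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/sitemap-index.xml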

Last but not least, don't forget to submit the sitemap URL in Google Search Console:

Sitemap in Google Search Console

Here is the teleport to the [commit](https://github.com/gatsbyjs/gatsby-starter-blog/commit/b0d1f7c7dd0b0362925c3998090f77889ca63eca) for devs too busy to read:

Next

Champagne🍾 time now: with the sitemap and robots.txt improvements, your site has a good start for SEO, and you should be able to see impressions and clicks improve in Google Search Console.

Next topic will be tips on site metadata. SEO on!

ref links:

  1. Sitemaps Overview
  2. Gatsby sitemap plugin
  3. Google search console
