Technical SEO Checklist: 9 Steps to a Technically Perfect Site in 2018


We talk about off-page SEO a lot. Building, managing, and auditing your backlinks is a critical aspect of SEO, and it keeps getting trickier. On-page SEO is a hot topic too, particularly now that Google is increasingly shifting to semantic search and old-school tactics don’t seem to work as well as they used to.

No doubt those are very important aspects of SEO, but there’s one thing we tend to forget about: SEO isn’t just off-page or on-page. The technical part of the process is just as important; in fact, if you don’t get the technical foundation right, your other SEO efforts might bring no results at all.


In this article, I’ll focus on the main aspects of technical SEO that will help you maximize usability, search engine crawling, indexing, and ultimately rankings. Let’s roll!

1. Review your sitemap.

You already know how important your sitemap is. It tells search engines about your site structure and lets them discover fresh content. (If you don’t have a sitemap, you should really, really go and create one right now. You can do it in WebSite Auditor by simply starting a project for your site, jumping to the Pages dashboard, and hitting the Sitemap button.)

As you check your sitemap, make sure it is:

  • Clean. Keep your sitemap free from errors, redirects, and URLs blocked from indexing; otherwise, you’re at risk of search engines ignoring the sitemap like it isn’t there.
  • Up-to-date. Make sure your sitemap is updated every time content is added to your site (or removed from it) — this will help search engines discover new content fast.
  • Concise. Google won’t process sitemaps with over 50,000 URLs. Ideally, you should keep yours much shorter than that to ensure that your most important pages are crawled more often: experiments show that shorter sitemaps result in more effective crawls.
  • Registered in Search Console. Let Google know about your sitemap. You can either submit it manually to Google Search Console or specify its location anywhere in your robots.txt file in the following way:
    Sitemap: http://yourdomain.com/sitemaplocation.xml

For more on sitemaps, check Google’s guide.
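
If you’re building or sanity-checking a sitemap by hand, it helps to know what a minimal one looks like. Here’s a sketch following the sitemaps.org protocol; the URL and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://yourdomain.com/some-page/</loc>
        <!-- lastmod is optional, but it helps crawlers spot fresh content -->
        <lastmod>2018-06-01</lastmod>
      </url>
    </urlset>

One <url> entry per page, up to the 50,000-URL limit mentioned above.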

 

2. Check indexing.

Next, let’s check how many of your site’s pages are indexed by search engines. To do this quickly, fire up WebSite Auditor, create a project for your site (or rebuild an existing one), and jump to Domain Strength.


Ideally, this number should be close to the total number of your site’s pages (which you can see under Site Structure > Pages in your WebSite Auditor project) minus the ones you deliberately restricted from indexing. If the gap is bigger than you expected, you’ll need to review your disallowed pages.
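
For a quick cross-check outside the tool, you can also use Google’s site: search operator, which returns a rough count of a domain’s indexed pages. Just search for:

    site:yourdomain.com

The number it shows is only an estimate, but a dramatic mismatch with your page total is a red flag. Which brings us to…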

 

3. Make sure all important resources are crawlable.

You might be tempted to simply look through robots.txt to make sure your important pages are crawlable. But in reality, robots.txt is only one of the ways to restrict pages from indexing. What about the noindex meta tag, the X-Robots-Tag header, or orphan pages that aren’t linked to internally? What about the JavaScript and CSS files that could be critical to your pages’ rendering? To run a comprehensive crawlability check, you’ll need an SEO crawler.
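
For reference, here’s what those other indexing controls look like. The noindex meta tag goes in a page’s <head>:

    <meta name="robots" content="noindex">

The same directive can be sent as an HTTP response header instead, which is handy for non-HTML files like PDFs:

    X-Robots-Tag: noindex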

  • Find pages and resources restricted from indexing. With WebSite Auditor, you can quickly get a full list of all blocked pages and resources. To do that, open your WebSite Auditor project, jump to Site Structure > Site Audit and click on Resources restricted from indexing.


If any of the resources on the list aren’t supposed to be blocked, check the Robots instructions column to see where the disallow instruction was found, so you can fix it quickly.

  • Check for orphan pages. Orphan pages are pages that exist on your site but aren’t linked to internally. This means that if search engines discover them at all, they’ll likely crawl them very infrequently. To check if there are any orphan pages on your site, rebuild your WebSite Auditor project by going to Site Structure > Pages and hitting the Rebuild Project button. At Step 2 of the rebuild, check the Search for orphan pages box and proceed with the rebuild.


Once the rebuild is complete, you’ll be able to easily spot the orphan pages by the Orphan page tag.


Note: If your site is built using AJAX or relies on JavaScript to generate its content, you’ll need to enable rendered crawling in WebSite Auditor as you create or rebuild a project. To do this, at Step 2 of project creation/rebuild, switch to Advanced options and check the Execute JavaScript box.

 

4. Amplify crawl budget.

Crawl budget is the number of a site’s pages that search engines crawl during a given period of time. It isn’t a ranking factor per se, but it determines how frequently the important pages of your site are crawled (and whether some of them are crawled at all). You can get an idea of your daily crawl budget in Google Search Console by going to Crawl > Crawl Stats.


From the report above, I can see that on average, Google crawls 32 pages of my site per day. That works out to a monthly crawl budget of roughly 960 pages (32 pages × 30 days).

Once you know what your crawl budget is, you’re probably wondering how to make the most of it.

  • Clean up duplicate content. Duplicate pages are one of the most common reasons crawl budget goes to waste. For a hint at duplicate pages on your site, check the On-page section in WebSite Auditor’s Site Audit dashboard: pages with duplicate title and meta description tags likely have duplicate content (and if they don’t, you should really rewrite those titles).


For every duplicate page you can get rid of — do it. If you have to keep the page, at least make sure to block it from search engine bots. In terms of crawl budget, canonical URLs aren’t of much help: search engines will still hit the duplicate pages and waste a unit of your crawl budget every time.
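
If you block duplicates at the robots.txt level, the rule is a one-liner (the same approach works for the low-value pages in the next tip; the paths here are placeholders):

    User-agent: *
    Disallow: /print-versions/
    Disallow: /old-promotions/

And if you have to keep a duplicate page, a canonical tag in its <head> at least tells search engines which version to rank, even though, as noted, it won’t save crawl budget:

    <link rel="canonical" href="http://yourdomain.com/preferred-page/">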

  • Restrict indexation of pages with no SEO value. Think about which pages of your site will make no sense in the search results: privacy policy, terms and conditions, old promotions, etc. These are all good candidates for a disallow rule.
  • Add URL parameters in Google Search Console. By default, Google might crawl the same page with certain URL parameters and without them separately, as if they were two different pages. That’s why it’s useful to add the URL parameters you’re using to Google Search Console — this will let Google know that it is in fact the same page and make crawling more efficient.


  • Take care of broken links. When a search bot hits a 4XX/5XX page, a unit of your crawl budget goes to waste. That’s why it’s important to find and fix all broken links on your site. You can get a full list of those in WebSite Auditor’s Site Audit dashboard under the Links section, by clicking on Broken links.


  • Fix redirect chains. Every redirect a search engine bot follows wastes a unit of your crawl budget. Moreover, if there is an unreasonable number of 301 and 302 redirects in a row on your site, at some point the search spiders will stop following the chain, and the destination page may not get crawled at all.

You can get a full list of pages with redirects in WebSite Auditor, along with a list of redirect chains found on your site. Just jump to the Site Audit dashboard and look for Pages with 302 redirect, Pages with 301 redirect, and Pages with long redirect chains.
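
If you’d rather spot-check a single URL from the command line, here’s a minimal Python sketch (it assumes the third-party requests library is installed, and the URL is a placeholder) that walks a redirect chain and prints each hop:

    import requests

    def show_redirect_chain(url):
        # allow_redirects=True (the default for GET) follows the whole
        # chain; the intermediate hops end up in response.history
        response = requests.get(url, allow_redirects=True, timeout=10)
        for hop in response.history:
            print(hop.status_code, hop.url)
        print(response.status_code, response.url, "(final destination)")

    show_redirect_chain("http://yourdomain.com/old-page/")

A chain longer than one or two hops is usually worth collapsing into a single redirect.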


 

5. Audit internal links.

A shallow, logical site structure is important for users and search engine bots; additionally, internal linking helps spread ranking power (or link juice) among your pages more efficiently.


As you audit your internal links, here are the things to check.

  • Click depth. Make sure your site’s important pages are no more than 3 clicks away from the homepage. To check your pages’ click depth, fire up WebSite Auditor once more and jump to Site Structure > Pages, then sort the URLs by Click depth in descending order by clicking on the column’s header twice.


  • Broken links. As I’ve already mentioned, broken links put your crawl budget to waste. On top of that, they confuse visitors and eat up your pages’ link juice. Remember that apart from <a> tags, broken links may hide in <link> tags, HTTP headers, and sitemaps. For a comprehensive list of all resources with a 4xx/5xx response code, check your site’s resources in WebSite Auditor’s All Resources dashboard: click on Internal resources and sort the list by HTTP Status Code (by clicking on the column’s header), then click on any of the broken resources to see where the links to it hide. A quick scripted spot-check is also sketched after this list.


  • Orphan pages. These pages aren’t linked to from other pages of your site — and thus are hard to find for visitors and search engines. To uncover them, rebuild your WebSite Auditor project with the Search for orphan pages box checked, exactly as described in step 3; once the rebuild is complete, orphan pages will be marked with the Orphan page tag.
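
For the broken-link check mentioned above, here’s a minimal Python spot-check (again assuming the requests library; the URLs are placeholders) that flags anything responding with a 4xx or 5xx code:

    import requests

    # Placeholder URLs: swap in the links you want to verify
    urls_to_check = [
        "http://yourdomain.com/page-one/",
        "http://yourdomain.com/page-two/",
    ]

    for url in urls_to_check:
        try:
            # HEAD is cheaper than GET; some servers reject it, so retry with GET
            response = requests.head(url, allow_redirects=True, timeout=10)
            if response.status_code >= 400:
                response = requests.get(url, allow_redirects=True, timeout=10)
            if response.status_code >= 400:
                print("BROKEN:", response.status_code, url)
        except requests.RequestException as error:
            print("FAILED:", url, error)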

 

6. Check on your HTTPS content.

Google started using HTTPS as a ranking signal in 2014; since then, HTTPS migrations have become increasingly common. Today, over 70% of page 1 Google search results use HTTPS.

If your site is already using HTTPS (either partially or entirely), it’s important to check for common HTTPS issues as part of your site audits. In particular, remember to check for:

  • Mixed content. Mixed content issues arise when an otherwise secure page loads some of its content (images, videos, scripts, CSS files) over a non-secure HTTP connection. This weakens the security of the page and might prevent browsers from loading the non-secure content or even the entire page. To check your site for mixed content issues, open your WebSite Auditor project and jump to Site Audit. Locate the HTTPS pages with mixed content issues factor (under Encoding and technical factors). Click on it to see the list of pages with mixed content, if any.


  • Canonicals, links, and redirects. Ideally, all links on your HTTPS site, as well as redirects and canonicals, should point to HTTPS pages straight away. Even if you have the HTTP to HTTPS redirects implemented properly on the entire site, you still don’t want to take users through unnecessary redirects — this will make your site appear much slower than it is. Such redirects may be a problem for crawling, too, as you will waste a little of your crawl budget every time a search engine bot hits a redirect.

For a comprehensive list of all non-HTTPS resources on your site, jump to WebSite Auditor’s All Resources dashboard. Click on HTML under Internal resources and sort the list by URL (by clicking on the column’s header); this way, you’ll see the HTTP pages first. For every HTTP page you find, check the Found on pages list at the bottom of the screen for a full list of pages that link to it. There, you’ll also see where each link was found, so you can fix things quickly. A quick scripted check for leftover http:// references is sketched below.
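
As a supplementary check, this Python sketch fetches a page and lists any http:// references found in src or href attributes. It’s deliberately crude (a regex rather than a full HTML parser, and the URL is a placeholder), but it’s a fast way to spot leftover HTTP resources:

    import re
    import requests

    # Placeholder URL: point this at one of your HTTPS pages
    page = requests.get("https://yourdomain.com/some-page/", timeout=10)

    # Crude pattern: any http:// URL inside a src or href attribute
    for match in re.findall(r'(?:src|href)=["\'](http://[^"\']+)', page.text):
        print("Non-HTTPS reference:", match)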


If your site hasn’t yet gone HTTPS, you may want to consider an HTTPS migration. If you do decide to go secure, feel free to use the framework from the case study of our own migration to HTTPS at link-assistant.com.

 

7. Test and improve page speed.

Google expects pages to load in two seconds or less, and they’ve officially confirmed that speed is a ranking signal. Speed also has a massive impact on UX: slower pages have higher bounce rates and lower conversion rates.

Page speed isn’t just one of Google’s top priorities for 2018; it’s also a ranking signal for both desktop and mobile results. To check if your pages pass Google’s speed test, open your WebSite Auditor project and go to Content Analysis. Click Add page, specify the URL you’d like to test, and enter your target keywords. In a moment, your page will be analyzed in terms of on-page optimization and technical SEO. Switch to Technical factors and scroll to the Page Speed (Desktop) section of on-page factors to see if any problems have been found.


If your page doesn’t pass some of the aspects of the test, you’ll see the details and how-to-fix recommendations in the right-hand view.
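
You can also query Google’s PageSpeed Insights API directly, which is handy for checking many pages in a loop. Here’s a minimal Python sketch; the URL is a placeholder, and the response-parsing field names are an assumption based on the v5 API, so verify them against the API documentation:

    import requests

    # v5 endpoint of the PageSpeed Insights API; no key needed for light use
    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": "https://yourdomain.com/", "strategy": "mobile"}
    data = requests.get(API, params=params, timeout=60).json()

    # Assumed v5 response shape: lighthouseResult > categories > performance
    score = (data.get("lighthouseResult", {})
                 .get("categories", {})
                 .get("performance", {})
                 .get("score"))
    print("Performance score (0 to 1):", score)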

 

8. Get mobile friendlier.

After a year and a half of careful experimentation and testing, Google started migrating sites to mobile-first indexing this spring. Mobile-first indexing means that Google indexes the mobile versions of websites instead of their desktop versions. In other words, the mobile version of your pages is what determines how they rank in both mobile and desktop search results.

Here are the most important things to take care of when auditing your mobile site.

  • Test your pages for mobile friendliness. Google’s mobile-friendly test includes a selection of usability criteria, such as viewport configuration, use of plugins, and the size of text and clickable elements. Remember that mobile friendliness is assessed on a per-page basis, so you’d need to check each of your landing pages separately, one at a time. You can quickly run the check in WebSite Auditor — Google’s mobile-friendly test is incorporated right into the tool. In your project, go to the Content Analysis module, select a page you’d like to analyze, and enter your target keywords. When the analysis is complete, look at the Page Usability (Mobile) section to see if any errors or warnings have been found.


  • Run comprehensive audits of your mobile site. Having all your important pages pass Google’s mobile test is a good start — but there’s a lot more analysis to do. A full audit of your mobile site is a great way to make sure that all your important pages and resources are accessible to Googlebot and free from errors.

To do an in-depth mobile website audit, you’ll need to run a site crawl with custom user agent and robots.txt settings. In your WebSite Auditor project, jump to the Pages dashboard and click the Rebuild Project button. At Step 2, make sure the Follow robots.txt instructions box is checked; in the drop-down menu next to it, choose Googlebot-Mobile. Right below, check the Crawl as a specific user agent box. In the drop-down menu to the right, pick the second user agent on the list: the one Google uses when crawling mobile versions of pages.

With those settings, the tool will conduct a full audit of your mobile website. Remember that any SEO issues you find can equally affect your desktop and mobile rankings, so do look through the traditional SEO factors like redirect chains, broken links, heavy pages, and duplicate or empty titles and meta descriptions.
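
To spot-check how your server responds to Google’s mobile crawler outside the tool, you can replay a request with a smartphone Googlebot user agent. The string below is the one Google documented at the time, and the URL is a placeholder:

    import requests

    # The smartphone Googlebot user-agent string Google documented at the time
    GOOGLEBOT_MOBILE = (
        "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/41.0.2272.96 Mobile Safari/537.36 "
        "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    )

    response = requests.get("https://yourdomain.com/",
                            headers={"User-Agent": GOOGLEBOT_MOBILE}, timeout=10)
    # Compare status code, final URL, and size against a plain desktop fetch
    print(response.status_code, response.url, len(response.text))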

 

9. Ask search engines to re-crawl your site.

With the 8 steps above, I’m sure you’ve identified a few issues on your site that need fixing. Once you have those fixed, you can explicitly ask Google to re-crawl your pages to make sure the changes are taken into account immediately.

All you need to do is log in to Google Search Console and go to Crawl > Fetch as Google. Enter the URL of the page you want to be re-crawled (or leave the field blank if you’d like Google to crawl the homepage) and click Fetch.


Note that your fetch must have a complete, partial, or redirected status for you to be able to submit the page to Google’s index (otherwise, you’ll see a list of problems Google found on your site and will need to fix those and use the Fetch as Google tool again). If Googlebot can successfully fetch your page, just click the Submit to index button to encourage Google to re-crawl it.


You can submit either the exact URL to be re-crawled (up to 500 URLs per week), or the URL and all pages linked from it (up to 10 per month). If you choose the latter, Google will use this URL as a starting point in indexing your site content and will follow internal links to crawl the rest of the pages. Google doesn’t guarantee to index all of your site’s pages, but if the site is fairly small, it most probably will.


(There’s a similar option in Bing Webmaster Tools, too. Just locate the Configure My Site section in your dashboard and click on Submit URLs. Fill in the URL you need re-indexed, and Bing will typically crawl it within minutes.)

 

 

Source: SEO Power Suite

