
Technical SEO Checklist: 9 Steps to a Technically Perfect Site in 2018

We talk about off-page SEO a lot. Building, managing, and auditing your backlinks is a critical aspect of SEO, and it keeps getting trickier. On-page SEO is a hot topic too, particularly now that Google is increasingly shifting to semantic search and old-school tactics don't work as well as they used to.

No doubt those are very important aspects of SEO, but there's one thing we tend to forget: SEO isn't just off-page or on-page. The technical part of the process is just as important; in fact, if you don't get the technical foundation right, your other SEO efforts may bring no results at all.


In this article, I’ll focus on the main aspects of technical SEO that will help you maximize usability, search engine crawling, indexing, and ultimately rankings. Let’s roll!

1. Review your sitemap.

You surely know how important your sitemap is. It tells search engines about your site structure and lets them discover fresh content. (If you don't have a sitemap, you should really, really go and create one right now. You can do it in WebSite Auditor by simply starting a project for your site, jumping to the Pages dashboard, and hitting the Sitemap button.)

As you check your sitemap, make sure it is:

- Clean. Keep it free from errors, redirects, and URLs that are blocked from indexing; otherwise, search engines may start ignoring the sitemap altogether.
- Up to date. Regenerate the sitemap whenever content is added to (or removed from) your site.
- Concise. Google won't process sitemaps with over 50,000 URLs; if yours is bigger, split it into several files and list them in a sitemap index.
- Registered in Google Search Console, either by submitting it manually or by specifying its location in your robots.txt file.

For more on sitemaps, check Google's guide.
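If you'd rather script this check than run it by hand, here's a minimal Python sketch that covers the basics from the list above: it fetches your sitemap, counts the URLs, and flags entries that don't return a 200 status. The sitemap URL is a placeholder; substitute your own.

    import urllib.error
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: your sitemap
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(SITEMAP_URL) as resp:
        tree = ET.parse(resp)

    urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]
    print(f"{len(urls)} URLs listed in the sitemap")
    if len(urls) > 50000:
        print("Over the 50,000-URL limit; split it into several sitemap files.")

    for url in urls:
        # Note: urllib follows redirects, so a 301 shows up as the final status;
        # redirected sitemap entries still deserve a manual look.
        req = urllib.request.Request(url, method="HEAD")
        try:
            status = urllib.request.urlopen(req).status
        except urllib.error.HTTPError as err:
            status = err.code
        if status != 200:
            print(f"HTTP {status}: {url} - fix it or drop it from the sitemap")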


2. Check indexing.

Let's move on to the number of your site's pages that are indexed by search engines. To check this quickly, fire up WebSite Auditor, create a project for your site (or rebuild an existing one), and jump to Domain Strength.

Ideally, this number should be close to the total number of your site’s pages (which you can see under Site Structure > Pages in your WebSite Auditor project) minus the ones you deliberately restricted from indexing. If there’s a bigger gap than you expected, you’ll need to review your disallowed pages. Which brings us to…
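The comparison itself is simple arithmetic, and scripting it keeps the check repeatable. A sketch below, assuming a hypothetical urls.txt export from your crawler and your own rules for which pages count as deliberately restricted:

    # indexed_count: the indexed-pages number you read off Domain Strength
    # or Google Search Console. The value and file name here are hypothetical.
    indexed_count = 1180

    with open("urls.txt") as f:  # one URL per line, exported from your crawler
        all_urls = [line.strip() for line in f if line.strip()]

    # Your own "deliberately restricted" rules go here.
    restricted = [u for u in all_urls if "/admin/" in u or "/cart/" in u]

    expected = len(all_urls) - len(restricted)
    gap = expected - indexed_count
    print(f"Expected ~{expected} indexed pages, got {indexed_count} (gap: {gap})")
    if gap > 0.1 * expected:
        print("The gap is over 10% - review your disallowed pages.")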


3. Make sure all important resources are crawlable.

You might be tempted to simply look through robots.txt to make sure your important pages are crawlable. But in reality, robots.txt is only one of the ways to restrict pages from indexing. What about the noindex meta tag, the X-Robots-Tag header, or orphan pages that aren't linked to internally? What about the JavaScript and CSS files that could be critical to your pages' rendering? To run a comprehensive crawlability check, you'll need to use an SEO crawler.

If any of the resources on the list aren't supposed to be blocked, check the Robots instructions column to see where the disallow instruction was found, so you can fix it quickly.

Once the project rebuild is complete, you'll be able to spot orphan pages easily by the Orphan page tag.

Note: If your site is built using AJAX or relies on JavaScript to generate its content, you’ll need to enable rendered crawling in WebSite Auditor as you create or rebuild a project. To do this, at Step 2 of project creation/rebuild, switch to Advanced options and check the Execute JavaScript box.
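For a quick scripted spot-check of the first two blocking mechanisms mentioned above (robots.txt and noindex, via meta tag or HTTP header), here's a Python sketch; it can't find orphan pages, so it doesn't replace a full crawl. The site and resource URLs are placeholders.

    import urllib.request
    import urllib.robotparser

    SITE = "https://www.example.com"                   # placeholder
    RESOURCES = [f"{SITE}/", f"{SITE}/css/main.css"]   # pages and critical resources

    rp = urllib.robotparser.RobotFileParser(f"{SITE}/robots.txt")
    rp.read()

    for url in RESOURCES:
        # Check 1: robots.txt - may Googlebot crawl this URL at all?
        if not rp.can_fetch("Googlebot", url):
            print(f"Blocked by robots.txt: {url}")
            continue
        resp = urllib.request.urlopen(url)
        # Check 2: X-Robots-Tag header (works for any file type, not just HTML).
        if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            print(f"noindex via X-Robots-Tag: {url}")
        # Check 3: robots meta tag (HTML pages only; a crude string match).
        head = resp.read(65536).decode("utf-8", errors="ignore").lower()
        if 'name="robots"' in head and "noindex" in head:
            print(f"noindex via meta tag: {url}")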


4. Amplify crawl budget.

Crawl budget is the number of a site's pages that search engines crawl during a given period of time. Crawl budget isn't a ranking factor per se, but it determines how frequently the important pages of your site are crawled (and whether some of them are being crawled at all). You can get an idea of what your daily crawl budget is in Google Search Console by going to Crawl > Crawl Stats.

From the report above, I can see that on average, Google crawls 32 pages of my site per day. From that, I can figure out that my monthly crawl budget is roughly 960 pages (32 pages × 30 days).
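If you want a number that doesn't depend on Search Console's averages, you can count Googlebot hits in your server's access logs yourself. A rough sketch, assuming a common Apache/nginx log format and a hypothetical log path:

    import re
    from collections import Counter

    hits_per_day = Counter()
    with open("/var/log/nginx/access.log") as log:   # hypothetical path
        for line in log:
            # Caveat: the user agent can be spoofed; verify real Googlebot
            # hits with a reverse DNS lookup if precision matters.
            if "Googlebot" not in line:
                continue
            stamp = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)  # e.g. [21/Mar/2018
            if stamp:
                hits_per_day[stamp.group(1)] += 1

    days = max(len(hits_per_day), 1)
    avg = sum(hits_per_day.values()) / days
    print(f"Average Googlebot requests per day: {avg:.0f}")
    print(f"Estimated monthly crawl budget: {avg * 30:.0f} pages")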

Once you know what your crawl budget is, you're probably wondering how to make the most of it. Duplicate pages are the first thing to tackle.

For every duplicate page you can get rid of, do it. If you have to keep a page, at least block it from search engine bots. In terms of crawl budget, canonical URLs aren't of much help: search engines will still hit the duplicate pages, wasting a unit of your crawl budget every time.

You can get a full list of pages with redirects in WebSite Auditor, along with a list of redirect chains found on your site. Just jump to the Site Audit dashboard and look for Pages with 302 redirect, Pages with 301 redirect, and Pages with long redirect chains.
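To hunt down redirect chains without a desktop tool, you can walk each URL's redirect hops yourself. A sketch using the third-party requests library (assumed installed); the URL list is a placeholder:

    import requests

    URLS = ["http://www.example.com/old-page"]   # placeholder: URLs to test

    for url in URLS:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        hops = resp.history                      # intermediate 3xx responses
        if len(hops) >= 2:
            chain = " -> ".join([h.url for h in hops] + [resp.url])
            print(f"Chain of {len(hops)} redirects: {chain}")
        elif len(hops) == 1 and hops[0].status_code == 302:
            print(f"302 (temporary) redirect - consider a 301: {url}")

Every hop in a chain costs a unit of crawl budget and leaks a little link juice, so point links straight at the final destination wherever you can.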


5. Audit internal links.

A shallow, logical site structure is important for users and search engine bots; additionally, internal linking helps spread ranking power (or link juice) among your pages more efficiently.

As you audit your internal links, here are the things to check (there's a quick crawl sketch below the list):

- Click depth. Keep your important pages no more than three clicks away from the homepage.
- Broken links. They mislead visitors and leak the ranking power of your pages; find and fix them.
- Redirected links. Links pointing to pages with redirects lose a little link juice at every hop; update them to point straight to the destination page.
- Orphan pages. Pages with no internal links pointing to them are hard for both visitors and search engines to find.
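Here's the promised sketch: a breadth-first crawl from the homepage that reports pages sitting deeper than three clicks. It uses the third-party requests and beautifulsoup4 packages (assumed installed); the start URL is a placeholder.

    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START = "https://www.example.com/"    # placeholder: your homepage
    host = urlparse(START).netloc

    depth = {START: 0}
    queue = deque([START])
    while queue and len(depth) < 1000:    # page cap, to keep the crawl polite
        url = queue.popleft()
        if depth[url] >= 5:               # no need to expand very deep pages
            continue
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in depth:
                depth[link] = depth[url] + 1
                queue.append(link)

    for url, d in sorted(depth.items(), key=lambda kv: -kv[1]):
        if d > 3:
            print(f"{d} clicks deep: {url}")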


6. Check on your HTTPS content.

Google started using HTTPS as a ranking signal in 2014; since then, HTTPS migrations have become increasingly common. Today, over 70% of page 1 Google search results use HTTPS.

If your site is already using HTTPS (either partially or entirely), it is important to check on the common HTTPS issues as part of your site audits. In particular, remember to check for:

- Mixed content: HTTPS pages that load some of their resources (images, scripts, stylesheets) over plain HTTP.
- Internal links, canonical tags, and redirects that still point to HTTP URLs.
- An SSL certificate that is valid, up to date, and registered to the correct domain name.

For a comprehensive list of all non-HTTPS resources on your site, jump to WebSite Auditor's All Resources dashboard. Click on HTML under Internal resources and sort the list by URL (by clicking on the column's header); this way, you'll see the HTTP pages first. For every HTTP page you find, check the Found on pages list at the bottom of the screen for a full list of pages that link to it. There, you'll also see where each link was found so you can fix things quickly.
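Mixed content is also easy to spot with a small script: fetch each HTTPS page and list any resources it loads over plain HTTP. A sketch with requests and beautifulsoup4 (assumed installed); the page list is a placeholder.

    import requests
    from bs4 import BeautifulSoup

    PAGES = ["https://www.example.com/"]   # placeholder: your HTTPS pages

    for page in PAGES:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
        # The common resource-loading tags and the attribute holding the URL.
        for tag, attr in (("img", "src"), ("script", "src"), ("link", "href")):
            for el in soup.find_all(tag):
                ref = el.get(attr, "")
                if ref.startswith("http://"):
                    print(f"Mixed content on {page}: <{tag}> loads {ref}")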

If your site hasn’t yet gone HTTPS, you may want to consider an HTTPS migration. If you do decide to go secure, feel free to use the framework from the case study of our own migration to HTTPS at link-assistant.com.


7. Test and improve page speed.

Google expects pages to load in two seconds or less, and they’ve officially confirmed that speed is a ranking signal. Speed also has a massive impact on UX: slower pages have higher bounce rates and lower conversion rates.

Page speed isn't just one of Google's top priorities for 2018; it's also a ranking signal for both desktop and mobile results. To check if your pages pass Google's speed test, open your WebSite Auditor project and go to Content Analysis. Click Add page, specify the URL you'd like to test, and enter your target keywords. In a moment, your page will be analyzed in terms of on-page optimization and technical SEO. Switch to Technical factors and scroll to the Page Speed (Desktop) section of on-page factors to see if any problems have been found.

If your page doesn’t pass some of the aspects of the test, you’ll see the details and how-to-fix recommendations in the right-hand view.
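You can also pull Google's speed assessment programmatically through the PageSpeed Insights API, which is handy for monitoring many pages. A minimal sketch using the v5 endpoint (check Google's documentation for the current version and response format); the test URL is a placeholder.

    import json
    import urllib.parse
    import urllib.request

    # PageSpeed Insights API, v5; no API key is needed for light use.
    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    page = "https://www.example.com/"     # placeholder: page to test
    params = urllib.parse.urlencode({"url": page, "strategy": "mobile"})

    with urllib.request.urlopen(f"{API}?{params}") as resp:
        data = json.load(resp)

    # The performance score comes back as a 0-1 float.
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Performance score for {page}: {score * 100:.0f}/100")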


8. Get mobile friendlier.

After a year and a half of careful experimentation and testing, Google started migrating sites to mobile-first indexing this spring. 'Mobile-first indexing' means that Google indexes the mobile versions of websites instead of their desktop versions. In other words, the mobile version of your pages will determine how they rank in both mobile and desktop search results.

Here are the most important things to take care of when auditing your mobile site (see the parity-check sketch at the end of this step):

- Mobile-friendliness. Test your key pages with Google's Mobile-Friendly Test, and make sure the viewport is configured and tap targets aren't too small.
- Content parity. The mobile version should carry the same valuable content, structured data, and meta tags as the desktop version; anything missing from mobile may be ignored under mobile-first indexing.
- Mobile page speed. Mobile connections are slower, so the speed work from the previous step matters doubly here.

To do an in-depth mobile website audit, you'll need to run a site crawl with custom user agent and robots.txt settings. In your WebSite Auditor project, jump to the Pages dashboard and click the Rebuild Project button. At Step 2, make sure the Follow robots.txt instructions box is checked; in the drop-down menu next to it, choose Googlebot-Mobile. Right below, check the Crawl as a specific user agent box. In the drop-down menu to the right, pick the second user agent on the list:

That’s the user agent Google uses when crawling mobile versions of pages. In a moment, the tool will conduct a full audit of your mobile website. Remember that any SEO issues you find can equally affect your desktop and mobile rankings, so do look through the traditional SEO factors like redirect chains, broken links, heavy pages, duplicate or empty titles and meta descriptions, etc.
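And here's the promised parity check: fetch the same page with a desktop and a smartphone user agent and compare what comes back. The smartphone UA string below matches the one Google has documented for its mobile crawler (verify it against the current documentation); the test URL is a placeholder, and it uses the third-party requests library.

    import requests

    URL = "https://www.example.com/"   # placeholder: page to compare

    DESKTOP_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)")
    MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
                 "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
                 "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)")

    desktop = requests.get(URL, headers={"User-Agent": DESKTOP_UA}, timeout=10).text
    mobile = requests.get(URL, headers={"User-Agent": MOBILE_UA}, timeout=10).text

    # A crude parity signal: a much smaller mobile response can mean
    # content is being served to desktop visitors but hidden from mobile.
    ratio = len(mobile) / max(len(desktop), 1)
    print(f"Mobile response is {ratio:.0%} the size of the desktop response")
    if ratio < 0.8:
        print("Large gap - compare the two versions' content manually.")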


9. Ask search engines to re-crawl your site.

With the 8 steps above, I’m sure you’ve identified a few issues on your site that need fixing. Once you have those fixed, you can explicitly ask Google to re-crawl your pages to make sure the changes are taken into account immediately.

All you need to do is log in to Google Search Console and go to Crawl > Fetch as Google. Enter the URL of the page you want to be re-crawled (or leave the field blank if you’d like Google to crawl the homepage) and click Fetch.

Note that your fetch must have a complete, partial, or redirected status for you to be able to submit the page to Google’s index (otherwise, you’ll see a list of problems Google found on your site and will need to fix those and use the Fetch as Google tool again). If Googlebot can successfully fetch your page, just click the Submit to index button to encourage Google to re-crawl it.

You can submit either the exact URL to be re-crawled (up to 500 URLs per week), or the URL and all pages linked from it (up to 10 per month). If you choose the latter, Google will use this URL as a starting point in indexing your site content and will follow internal links to crawl the rest of the pages. Google doesn’t guarantee to index all of your site’s pages, but if the site is fairly small, it most probably will.

(There's a similar option in Bing Webmaster Tools, too. Just locate the Configure My Site section in your dashboard and click on Submit URLs. Fill in the URL you need re-indexed, and Bing will typically crawl it within minutes.)
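Besides the webmaster-tools route, both engines accept a simple sitemap "ping" over HTTP, which nudges them to re-crawl everything the sitemap lists. A sketch, using the ping endpoints as documented at the time of writing; the sitemap URL is a placeholder.

    import urllib.parse
    import urllib.request

    sitemap = "https://www.example.com/sitemap.xml"   # placeholder: your sitemap
    encoded = urllib.parse.quote(sitemap, safe="")

    # Google's and Bing's sitemap ping endpoints.
    for ping in (f"https://www.google.com/ping?sitemap={encoded}",
                 f"https://www.bing.com/ping?sitemap={encoded}"):
        status = urllib.request.urlopen(ping).status
        print(f"{ping} -> HTTP {status}")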


Source: SEO PowerSuite
