
Your Comprehensive Guide to Technical SEO

August 20, 2021

Technical SEO. A short phrase that has been known to strike fear into the hearts of SEOs and non-SEO-focused marketers alike. 

It makes sense. Between subdomains, robots.txt files, crawl budgets, schema.org markup, and every other factor usually handled by developers, technical SEO can seem daunting. 

However, once you dive into the basics, and understand what Google and other search engines are trying to accomplish (crawling and indexing web pages), you can begin to develop a checklist approach to optimizing your website. 

We’re here to discuss what technical SEO is, why it’s crucial to ranking well, important considerations to make, and how to optimize your site for future success. 

What is technical SEO? 

Technical SEO describes any technical or code-level implementation that helps Google (or any other search engine crawl bot) efficiently and accurately crawl and index your website. 

Example optimizations for technical SEO include, but aren’t limited to:

  • Creating an XML sitemap to help search engines more easily find pages you want to be indexed 
  • Inserting <meta> tags that tell crawl bots which pages you want included in Google's index and which should be left alone 
  • Redirecting a newly deleted page with a 301 (permanent) redirect  

Technical SEO optimizations can improve user experience, but primarily these factors are aimed at helping search engine crawl bots do their jobs more effectively. 

Learn: Before we get further into understanding technical SEO, make sure you familiarize yourself with regular SEO and how it works.

Why is technical SEO important? 

Although it's less intuitive than link building or other on-page optimizations, technical SEO is critical to building a strong foundation for your SEO campaign. Without these fundamentals properly implemented, Google will have a hard time knowing who you are, what you provide, and how to rank your website appropriately.

Creating outstanding content and building a bunch of links to your site without your technical foundations in place is like having 50 holes in the bottom of your boat. You can bail water as fast as you want, but the leaks will eventually sink you. 

Crawling and indexing – what are they and how do they work? 

In order to understand why these optimizations are crucial, it’s important to know just how search engines crawl and index content on the web. The more you understand, the better insight you’ll have into optimizing your site. 

Crawling

The web, crawling, spiders...this giant metaphor has gotten out of hand. But it is accurate. Search engines essentially send out these “crawlers” (software programs) that use existing web pages and the links within those web pages to find new content. Once they’ve “crawled” (found all of the content and links) a website, they move on to the next. 

Depending on how large, popular, and trustworthy your website is, the crawlers will regularly come back and recrawl your content to see what’s changed and what’s new. 

Indexing 

After your website has been crawled, search engines have to make it searchable. A search engine's index is its database of crawled pages – the pool of results it pulls from whenever someone searches for a given term. 

A search engine will regularly update its index based on the directives that you give it in the code of your website – whether pages are deleted, how accessible the content is, and when new content is posted. 

There can also be large changes to the underlying software of the search engine, like Google’s mysterious and impactful algorithm updates.

Search engines are powerful software tools that do many complex things, but once you understand their goals, you can start to put together the pieces for your own strategy. A big part of this is knowing the difference between technical SEO and other factor categories. 

How does technical SEO differ from on- and off-page SEO factors?

Although each of these ranking factor categories shares the same goal – improving your search visibility for target keywords – each has a slightly different purpose. 

On-page SEO focuses on the factors that your users are most likely to interact with. These include: 

  • Internal links
  • H1-H6 tags
  • Keyword placement  
  • Content URL slugs 
  • Image alt-tags 

Off-page SEO includes all of the ranking factors that are outside of your website. The primary factor that you can control is backlink building and acquisition. 

A backlink is created any time another website links to yours. These links are the thumbs up and thumbs down system of the web. Search engines evaluate your site and its potential to rank based on the quality, quantity, and relevance of the links coming from other websites back to yours.

Other off-page SEO factors include:

  • Including your company information on business directories 
  • Social media mentions
  • Unlinked brand mentions on other websites and publications 
  • Reviews on popular platforms  

Knowing the key differences between these factors and their intended purpose can help you better inform your implementation strategy. Now that you’ve got the basics down, here are the concrete steps you can take to improve your own website’s technical SEO. 

11 tips for improving your site’s technical SEO

Understanding each technical SEO-related ranking factor is important, but correctly implementing each fix and keeping your site healthy long term is the real goal. Here are the 11 most important areas of focus when it comes to fully optimizing your website on an ongoing basis. Use this information as a checklist while you go through your own web presence. 

1. Make site structure and navigation user-friendly 

One way that you can help search engines rank you higher and more consistently is by planning a website structure that is user-friendly and has clear navigation. Your site navigation is more than just the primary menu at the top of your website. An ideal website structure helps both users and search engines alike quickly and easily find the pages that matter most to them. 

Related factors are:

  • Click depth. Click depth is how many clicks it takes to get to any given web page from the home page. This matters because the home page is often one of, if not the, most visited landing pages on any given website. As a good rule of thumb, limit click depth to three clicks. 
  • No orphaned pages. An orphaned page is any page that has no internal links pointing to it. This not only removes the possibility of a user finding the page while navigating the website, but it communicates to search engines that the page isn’t important. Use the tools below to identify these pages, and link to them from another relevant page on the site. 
  • Primary navigation. Your primary navigation menu, usually at the top of every website, is crucial in communicating your website's most important pages. The pages that you include, and their related anchor text, tell Google what to rank you for (see the sketch after this list). Here are a few best practices to remember:
    • Include your service or solutions focused pages in the navigation.
    • Make anchor text keyword-focused, but keep it broad enough to describe the whole page.
    • Don’t include more than 30 links here as the value of each individual link starts to become diluted. 
  • Secondary navigation. These elements, like a blog sidebar or the footer, should help users easily find what they're looking for when they aren't at the top of the website or are on non-core pages. For a blog, this could be categories, and for the footer, it might be privacy policy information or a link to a partner website. 
  • Breadcrumbs. Breadcrumbs are internal links not found in the primary navigation menu that show a visual of the URL folder structure of the page you’re on. They allow a user to see where they are within the site and use them to easily return to where they came from, hence “breadcrumbs” (think Hansel and Gretel).
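
To make the primary navigation point concrete, here's a minimal sketch of a keyword-focused menu in plain HTML. The link labels and URLs are placeholder examples (they echo the marketing-services example used later in this guide), not a prescription for your site:

<!-- descriptive, keyword-focused anchor text tells users and crawlers what each page is about -->
<nav>
  <a href="/marketing-services/">Marketing services</a>
  <a href="/marketing-services/social-media-management/">Social media management</a>
  <a href="/pricing/">Pricing</a>
  <a href="/about/">About</a>
</nav>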

2. Create a strategic, scalable URL structure 

A consistent URL structure helps users understand where they are as they navigate your website, and it also informs search engines about exactly what you do.

Some URL best practices include:

  • Create logical parent–child folder structure relationships. Instead of having every page live one level down from the root domain, consider adding parent–child URL relationships whenever possible. Let's say that you offer marketing services. Your parent URL might look like this: https://yourdomain.com/marketing-services/ and contain a list of every service you offer. In this case, it's a good idea to have separate pages that describe each service. The child URL might look like this: https://yourdomain.com/marketing-services/social-media-management/. 
  • Keep them concise. Conjunctions and articles like "and," "the," or "or" won't improve a user's understanding of your content in the SERPs or improve your rankings in most cases. Trim the filler and only include the most relevant terms in your URLs. 
  • Remember to target broad keywords. These are relevant keywords closely related to your primary target keyword. 
  • Create a folder structure that scales. Think through what content or offers you are likely to create in the future and organize your URL structure with that in mind. 
  • Avoid weird characters. Anything that would be confusing to a user at first glance or might trip up a search engine should be left out of your URL. The more straightforward, the better. 
  • Use hyphens. Google recommends that you keep things simple and separated in your URLs with the use of hyphens rather than cramming all of your words together or utilizing underscores. 

3. Make sure your site speed isn’t lagging 

Website performance and page load times have always been a core consideration for performing well in search, but as of June 2021, with Google's Page Experience Update, it's absolutely critical to get right.

Google has explicitly stated and quantified its expectations through Core Web Vitals, a set of metrics that aim to set the standard for page load performance quality. The most important of these are Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). On top of pleasing Google, users expect your website to load in under three seconds. The longer it takes to load, the less likely site visitors are to stick around. 

Here is a high-level rundown of optimizations you can make to positively impact load performance:

  • Limit third-party resource loading. Any time you have to load an analytics script, pixel, or a software script, you are adding to the overall total requests that your browser has to process in order to show your website. Keep these resources to a minimum. 
  • Defer or async-load non-critical scripts. Beyond loading only the most important resources, you need to ensure that your resources load in the right order. "async" and "defer" are attributes you can add to a script tag: async lets the script download in parallel with the rest of the page and run as soon as it's ready, while defer also downloads in parallel but waits to run until the page has finished parsing (see the sketch after this list). 
  • Optimize images and videos. A major barrier to good load performance is having large resources like images or videos that aren’t properly optimized. When you upload an image or a video, ensure that it is compressed with any unnecessary metadata stripped out and resized down to only as big as it needs to be on the page. 
  • Use Google's PageSpeed Insights tool. This shows you the performance metrics Google measures for your pages and the optimizations it recommends to remedy the core issues. 
  • Implement a content delivery network (CDN). A content delivery network helps your website get served to users more quickly by utilizing servers  (where your website files are housed) that are closest to your user’s physical location. The CDN keeps a copy of your website in a server near your user’s location that then gets served to them whenever they want to access the site. 
  • Choose a proven hosting company. Your website hosting service might be slowing your website down. In particular, if you are sharing hosting space, then you are sharing the amount of bandwidth that can be accessed at any given time with those other websites. If those other websites grow their user base and start taking up more of that shared space, you lose out. Some hosts are also more optimized for website load performance out of the box. When choosing a host, review comparisons of which have the best average load speeds. 
  • Switch your images to WebP. WebP is an image format developed by Google specifically designed for increased load performance. The easiest way to convert your images to WebP is to use a bulk online converter tool or a plugin if you’re using a CMS. 
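
To illustrate a couple of the points above, here's a minimal HTML sketch of async/deferred script loading and an optimized image. The file names and dimensions are placeholders, not real assets:

<!-- analytics isn't needed to render the page: async downloads it in parallel and runs it as soon as it's ready -->
<script src="analytics.js" async></script>
<!-- main.js needs the fully parsed page: defer downloads it in parallel but waits to run it until parsing is done -->
<script src="main.js" defer></script>
<!-- explicit width/height reserve space (helping cumulative layout shift), WebP keeps the file small, and loading="lazy" delays offscreen images -->
<img src="hero.webp" alt="Product hero" width="800" height="450" loading="lazy">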

4. Check to see if your site is crawlable by search engines 

One of the foundational goals of technical SEO is to ensure that your site is able to be found and inspected by Google. There are three primary methods of achieving this and checking to see if your content is currently being crawled by Google: 

  • Check Google's index directly. The quickest way to see which pages on your site are being indexed by Google is to check Google directly. You can do this with a "site:" search. If you want to see how many pages are indexed on WebMD, your search would be "site:https://www.webmd.com/". If you wanted to verify sleep apnea content indexation, it would be "site:https://www.webmd.com/sleep-disorders/sleep-apnea/". 
  • Check Google Search Console. Google Search Console is a fantastic search discovery and website health tool created by Google. One of its features is checking to see how many pages are currently in Google’s index, which pages are indexed, and which pages are currently not able to be indexed alongside the reason why. 
  • Check Screaming Frog. Screaming Frog is a great tool that mirrors how Googlebot crawls your site and returns every page with a status letting you know whether it's currently indexable, crawlable, or both. 

You should audit your web pages regularly for desired indexation sitewide. Every page should be given a status and a corresponding action: keep it indexed, intentionally index a page that's currently noindexed, noindex a currently indexed page, and so on. 

Tip: Learn the differences between SEO testing and SEO audits and when to use which.

Once you’ve identified these actions, it’s time to make them happen. Here’s how. 

Robots.txt 

The robots.txt file is a small text file placed at the root of your website that gives instructions to search engine crawlers about which pages on your site you want to be crawled and indexed. 

Google gives a great overview of how to implement this document and some specific use cases, but in general, here are the primary instructions you can give:

  • User agent. This specifies which crawlers the rules that follow apply to. You can also target all crawlers at once. 
  • Allow/Disallow. Disallow prevents a crawler from accessing parts of your site that you don't want it to reach; Allow explicitly permits access. 
  • Sitemap location. You can tell search engine crawlers the URL where your sitemap lives to make it easier for them to find and return to. 

A very basic robots.txt file that allows all crawlers to access all content and points them in the direction of your sitemap looks like this:

User-agent: *

Disallow: 

Sitemap: https://yoursite.com/sitemap.xml 

Meta robots tag 

You can also leverage index vs. noindex directives within the code of a web page to instruct a search engine to include your page in its index or not. This can be done by adding a meta tag within the page code written as <meta name="robots" content="noindex"> or <meta name="robots" content="index">.

Similarly, you can instruct a search engine to include a page in their index, but then not to follow the links on that page and pass on their authority to other pages on or off your website. This can be expressed within that same meta robots tag as either <meta name="robots" content="follow"> or <meta name="robots" content="nofollow">.
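
For reference, robots directives can be combined in a single tag. A minimal sketch of a page that should stay out of the index and not pass authority through its links might look like this:

<head>
  <!-- "index, follow" is the default behavior, so it rarely needs to be declared explicitly -->
  <meta name="robots" content="noindex, nofollow">
</head>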

5. Use schema.org structured data markup 

Schema markup is a form of structured data created collaboratively by Google, Bing, Yahoo!, and Yandex. Structured data is a standardized vocabulary added to your page's code that communicates information about the page to search engines. The official Schema.org website provides resources to learn more and a complete library of schema vocabulary. 

Schema markup was created to help businesses communicate more explicitly with search engines about their processes, products, services, and other offerings, as well as key information about the business itself. Without it, search engines have to rely on their complex algorithms to make extremely educated guesses about those aspects. 

Schema.org markup can be broken down into two major components, ItemTypes and ItemProperties, illustrated after this list. 

  • ItemType. This lets the search engine know what type of entity the web page is and what it’s focused on. This could be a movie, local business, award, blog post, or even a business review. 
  • ItemProp (property). These are the specific properties of the above-mentioned ItemType. This could be the name of the author of a book, the date your business was founded, or even the price of your software product. 
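
Here's a minimal sketch of what those two components look like in practice using schema.org's microdata format (JSON-LD is another widely used option). The business details are placeholders borrowed from the fictional shoe company used later in this guide:

<!-- itemtype declares what kind of entity this is; each itemprop describes one of its properties -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <h1 itemprop="name">Dope Shoes</h1>
  <p itemprop="description">Running shoes built for speed.</p>
  <span itemprop="telephone">555-0100</span>
</div>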

Beyond letting search engines know exactly what your content is about, this structured data can improve your chances of earning a rich snippet. Rich snippets are special features in the SERPs that go beyond the standard title, meta description, and URL. 

Some examples of how Schema.org can help your website and search visibility with these rich snippets are: 

  • Product information 
  • Blog information 
  • Event information 
  • Local business information 
  • Knowledge graph of your organization 
  • Business or product reviews 

6. Eliminate dead links on your site 

A broken link isn't only a poor experience for the user; it can also harm your ability to rank. If you have a page that was intentionally or unintentionally deleted, it will show up as a 404 "Not Found" error. This error will take both your users and search engine bots to your "404 page," or to a blank page if you don't have one set up. 

It's crucial that you make a plan of action every time a page is deleted on your site and ensure that the links pointing to those deleted pages don't lead to a dead end. Here's how to find and clean up these broken pages and links:

  • Crawl your site to find all known 404 pages. 
  • Decide whether to implement a redirect to a new page or leave the page alone if it should rightfully be deleted. A redirect can be either a 301 (permanent) or a 302 (temporary) redirect (see the sketch after this list). 
  • Find all of the pages that have linked to the broken page, and replace the links with the updated URL(s) of the forwarded page. 
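
As a sketch of that second step: if your site happens to run on Apache, a one-line 301 redirect in your .htaccess file could look like the example below. The paths are placeholders, and other servers and most CMSs offer equivalent redirect settings or plugins:

# .htaccess on an Apache server (assumes the mod_alias module is enabled)
Redirect 301 /old-page/ https://yourdomain.com/new-page/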

7. Fix duplicate content issues 

Duplicate content is any instance where two or more pages on your website are too similar to one another. This is typically content that is completely copied and pasted, or templated content, also known as syndicated content. 

In Google’s eyes, duplicate content is the worst because it’s low effort. The goal of any search engine worth its salt is to deliver high quality, informative, and relevant content to its users. See the discrepancy?

To fix duplicate content issues, you’ll first need to crawl your website. Website crawling tools have specific features within the software that look for overlap of content and record which pages are overly similar. 

After you've identified these pages, you need to determine which page you want to treat as the "main" page and what you plan to do with the duplicate content. Delete it? Redirect it? Rewrite or refresh it? 

In other situations, like when you have product pages that don’t have any SEO value (e.g. selling the same shoe in red, blue, etc.), you’ll want to utilize canonical tags between the pages. 

What are canonical tags? 

A canonical tag is a snippet of text within the code of a page that instructs a search engine to treat that page as an intentional duplicate of another "main" page and to keep the intentional variations from appearing in the SERPs. 

Say you own a gym shoe company called Dope Shoes. A URL you have on your site might look like: https://dopeshoes.com/shoes/running/dope300/. 

You might also have a CMS that creates a new "page" for each variation or size: https://dopeshoes.com/shoes/running/dope300/red/ or https://dopeshoes.com/shoes/running/dope300/blue/. 

Now, because the content for those color variations is likely to be identical or near identical to the main /dope300/ page, you would want to declare that each of those color variations is an intentional duplicate of the main page. 

This can be done by placing the rel canonical tag within the code of the variation pages like this: 

  • <link rel="canonical" href="https://dopeshoes.com/shoes/running/dope300/" /> 

8. Implement HTTPS for enhanced security 

A secure website has always been important for users and search engines alike, particularly if you run an ecommerce site. 

With that in mind, the secure sockets layer (SSL) was created. An SSL certificate, issued by a certificate authority, adds an extra layer of security by creating a private and public key pair on the server that helps verify the ownership and authenticity of a website. This verification layer prevents a variety of attacks. 

Once you've implemented your SSL certificate, your URLs will use the HTTPS protocol (rather than the standard, less secure HTTP). Browsers will then surface your certificate details and show users a "secure" indicator when they visit your site. HTTPS is also a direct ranking signal. 

9. Create an XML sitemap 

Simply put, a sitemap is a collection of links that you want search engines to crawl and index. Rather than being a simple list of links, an Extensible Markup Language (XML) sitemap lets you provide specific information that a search engine can use to index your pages more efficiently. 

XML sitemaps are great for large websites with lots of content, new websites that don’t yet have many inbound links, and generally any website that regularly makes changes that need to be crawled and indexed. 

How do you create an XML sitemap? 

If you use a CMS, one is usually generated for you automatically, and you can typically find it by adding "/sitemap.xml" to the end of your root domain. Example: https://yourwebsite.com/sitemap.xml. 

Here are some best practices after creating your sitemap:

  • Include a link in the website footer
  • Ensure that each entry has fields for the URL, image, and last modified date and time (see the sample after this list)
  • Submit your sitemaps separately through Google Search Console
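
For reference, here's a minimal sketch of a sitemap with a single entry. The URL and date are placeholders, and image entries would use Google's image sitemap extension, which isn't shown here:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- the page's canonical URL and the date it was last modified -->
    <loc>https://yourwebsite.com/marketing-services/</loc>
    <lastmod>2021-08-20</lastmod>
  </url>
</urlset>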

10. Ensure your site is mobile-friendly 

In case you missed it, Google switched to mobile-first indexing in 2021, which means it evaluates your website's ranking potential based on the mobile version of your site. 

“Mobile friendliness” describes a range of website features such as:

  • Page elements fitting inside your users' mobile viewport (see the sketch after this list)
  • Text and other page elements sized for easy readability
  • Scripts and plugins being able to load on mobile viewports
  • Page elements that don't shift around constantly on the page and aren't hard to tap or swipe
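
A common starting point for several of these items is the responsive viewport meta tag, shown in this minimal sketch:

<!-- tells mobile browsers to match the page width to the device width instead of rendering a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">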

You can use Google’s own Mobile Friendly Test tool to audit your website. 

11. Improve your internal linking

A strong and intentional internal linking strategy can dramatically improve the strength and ranking of the individual pages on your website. Internal links work similarly to backlinks in the sense that they can help inform search engine bots as to what the target page is about. 

The internal links that you place are not only good for helping users navigate through the site; they also communicate hierarchy and importance. If you have the most links going to your core solutions pages, Google is inclined to think those are your most important topics and rank you for related terms accordingly. 

Best practices to follow: 

  • Ensure that you update internal links on your site after target pages are deleted
  • Map out your internal link anchor texts to target keywords that you want the target page to rank higher for
  • Audit how many internal links each page on your site has and make sure that those numbers correlate with the pages that you want to rank most
  • Audit your site for any "orphaned" pages (pages with no incoming internal links) and create a plan to get at least one or two links sent their way

Technical SEO tools

Now that you have a solid grasp of the most important technical SEO factors and some implementation techniques, here are some must-have SEO tools in your toolbox. 

  • Screaming Frog. Screaming Frog is a daily resource for any long-term SEO effort. This software crawls any website on demand, much like Googlebot does, and gives you a wealth of information about each crawled page. 
  • Ahrefs. A staple SEO research, keyword analysis, and competitor intelligence tool. Ahrefs can give you data about the technical status of your site, recommended fixes, and regular alerts when certain issues arise. 
  • Google Search Console. This free tool by Google gives you insight into which keywords users have used to find your website. It also gives warnings and daily status updates about how Google is crawling and indexing your website. 
  • Schema.org. The official website for schema.org structured data. Here you can find information on different item types, their properties, and implementation guidance to use structured data to your advantage. 
  • Google PageSpeed Insights. Another free tool by Google that shows you how quickly your website loads on both desktop and mobile. 
  • Google Mobile-Friendly Test. A free Google tool (mentioned in tip 10) that audits whether your pages are mobile-friendly.
  • Google Analytics. Yet another digital marketing staple and complimentary tool from Google. Although it's primarily a web analytics tool, you can also get valuable insight into the technical performance of your website. 

Final thoughts

Technical SEO can seem daunting at first. There are a lot of moving parts and a bit of a learning curve. However, these checks are pretty binary and once you understand the intent behind them, you’ll be well on your way to keeping an optimized presence. 

When it comes down to it, poorly implemented technical SEO can ruin your other SEO efforts like link building and creating a content strategy. It’s not always the most glamorous, but it is crucial to your website’s success and always will be. 

As time goes on, you might be tempted to set it and forget it or implement these checks once and then never review them, but you need to resist that urge. It’s important that you have a plan to regularly check in on the technical health of your website. 

Here’s a quick technical SEO upkeep roadmap:

  • Regularly crawl your website with a crawler tool. This will ensure that you always have a pulse on what’s going on. 
  • Schedule time for someone on your team to review your website’s health. This should be a combination of your technical or development team and someone from marketing. Larger or more dynamic websites should do this quarterly, if not monthly. Smaller or more static websites can get away with every three to six months. 
  • Stay curious and continuously learn about changes in the industry. There was a time when none of the existing best practices were the standard. The way to get around this is to stay on top of new trends and standards set forth by Google and other search engines. Keeping up with these developments means that you’ll always please the algorithms and stay on top of your competition. 
  • Consult with an SEO expert alongside your dev team for any major site migrations, refreshes, redesigns, or other large-scale changes. It’s helpful to have a specific checklist for all of these scenarios and make sure that you have prepared before these events, and after they are implemented.
