Technical SEO. A short phrase that has been known to strike fear into the hearts of SEOs and non-SEO-focused marketers alike.
It makes sense. Between subdomains, robots.txt files, crawl budgets, schema.org markup, and every other factor usually handled by developers, technical SEO can seem daunting.
However, once you dive into the basics, and understand what Google and other search engines are trying to accomplish (crawling and indexing web pages), you can begin to develop a checklist approach to optimizing your website.
We’re here to discuss what technical SEO is, why it’s crucial to ranking well, important considerations to make, and how to optimize your site for future success.
Technical SEO describes any technical or code-related implementation that helps Google (and any other search engine crawl bot) efficiently and accurately crawl and index your website.
Example optimizations for technical SEO include, but aren’t limited to:
Technical SEO optimizations can improve user experience, but primarily these factors are aimed at helping search engine crawl bots do their jobs more effectively.
Learn: Before we get further into understanding technical SEO, make sure you familiarize yourself with regular SEO and how it works.
Although harder to grasp than link building or other on-page optimizations, technical SEO is critical to building a strong foundation for your SEO campaign. Without these foundations properly in place, Google will have a hard time knowing who you are, what you provide, and how to rank your website appropriately.
Creating outstanding content and building a bunch of links to your site without your technical foundations in place is like having 50 holes in the bottom of your boat: you can bail water as fast as you like, but the leaks will eventually keep the boat from staying afloat.
In order to understand why these optimizations are crucial, it’s important to know just how search engines crawl and index content on the web. The more you understand, the better insight you’ll have into optimizing your site.
The web, crawling, spiders...this giant metaphor has gotten out of hand. But it is accurate. Search engines essentially send out these “crawlers” (software programs) that use existing web pages and the links within those web pages to find new content. Once they’ve “crawled” (found all of the content and links) a website, they move on to the next.
Depending on how large, popular, and trustworthy your website is, the crawlers will regularly come back and recrawl your content to see what’s changed and what’s new.
After your website has been crawled, search engines have to make it searchable. A search engine's index is its stored collection of crawled pages: the pool from which results are pulled when someone searches for a given term.
A search engine will regularly update its index based on the directives that you give it in the code of your website – whether pages are deleted, how accessible the content is, and when new content is posted.
There can also be large changes to the underlying software of the search engine, like Google’s mysterious and impactful algorithm updates.
Search engines are powerful software tools that do many complex things, but once you understand their goals, you can start to put together the pieces for your own strategy. A big part of this is knowing the difference between technical SEO and other factor categories.
Even though all of these ranking factors share the same goal of improving your search visibility for target keywords, each category serves a slightly different purpose.
On-page SEO focuses on the factors that your users are most likely to interact with. These include:
Off-page SEO includes all of the ranking factors that are outside of your website. The primary factor that you can control is backlink building and acquisition.
A backlink is created any time another website links to yours. These links are the thumbs up and thumbs down system of the web. Search engines evaluate your site and its potential to rank based on the quality, quantity, and relevance of the links coming from other websites back to yours.
Other off-page SEO factors include:
Knowing the key differences between these factors and their intended purpose can help you better inform your implementation strategy. Now that you’ve got the basics down, here are the concrete steps you can take to improve your own website’s technical SEO.
Understanding each technical SEO related ranking factor is important, but correctly implementing each fix and keeping your site healthy long term is the real goal. Here are the 10 most important areas of focus when it comes to fully optimizing your website on an ongoing basis. Use this information as a checklist while you go through your own web presence.
One way that you can help search engines rank you higher and more consistently is by planning a website structure that is user-friendly and has clear navigation. Your site navigation is more than just the primary menu at the top of your website. An ideal website structure helps both users and search engines alike quickly and easily find the pages that matter most to them.
Related factors are:
A consistent URL structure helps users understand where they are as they navigate your website, and it also informs search engines about exactly what you do.
Some URL best practices include:
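For illustration (both URLs are hypothetical), a short, descriptive, hyphenated URL such as https://dopeshoes.com/shoes/running/ tells users and search engines exactly what the page is about, while a parameter-heavy URL like https://dopeshoes.com/index.php?p=4782&cat=12 tells them almost nothing.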
Website performance and page load times have always been a core consideration to performing well in search, but as of June 2021, with Google’s Page Experience Update, it’s absolutely critical to get right.
Google has explicitly stated and quantified its expectations around your website’s Core Web Vitals, a set of metrics that aim to set the standard for page load performance. The most important of these are Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift. On top of pleasing Google, users expect your website to load in less than three seconds; the longer it takes, the less likely visitors are to stick around.
Here is a high-level rundown of optimizations you can make to positively impact load performance:
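To give one small, concrete example of this kind of optimization (the image path and alt text below are placeholders), modern browsers support native lazy loading, which defers loading below-the-fold images until the visitor scrolls near them:

<img src="/images/dope300-red.jpg" alt="Dope 300 running shoe in red" loading="lazy" width="800" height="600">

Declaring width and height also lets the browser reserve space for the image before it loads, which helps reduce cumulative layout shift.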
One of the foundational goals of technical SEO is to ensure that your site is able to be found and inspected by Google. There are three primary methods of achieving this and checking to see if your content is currently being crawled by Google:
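One quick, informal spot check (not a replacement for Google Search Console's index coverage and URL Inspection reports) is the site: search operator, which shows the pages Google currently has indexed for your domain:

site:yourwebsite.com

If an important page doesn't appear there, or Search Console reports it as excluded, that's your cue to investigate why.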
You should audit your web pages regularly for desired indexation sitewide. Every page should be given a status and a corresponding action: keep it indexed, intentionally index a page that is currently noindexed, noindex a page that is currently indexed, and so on.
Tip: Learn the differences between SEO testing and SEO audits and when to use which.
Once you’ve identified these actions, it’s time to make them happen. Here’s how.
The robots.txt file is a small text file placed in your website's root directory that gives instructions to search engine crawlers about which pages on your site you want to be crawled and indexed.
Google gives a great overview of how to implement this document and some specific use cases, but in general, here are the primary instructions you can give:
User-agent: *
Disallow:
Sitemap: https://yoursite.com/sitemap.xml
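The example above allows every crawler to access the entire site and points it to your sitemap. If you instead wanted to keep crawlers out of, say, an admin area or internal search results (the directory names here are just placeholders), you could disallow those paths:

User-agent: *
Disallow: /admin/
Disallow: /search/
Sitemap: https://yoursite.com/sitemap.xml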
You can also leverage the “Index vs. no index” directives within the code of a web page to instruct a search engine to include your page in their index or not. This can be done by adding a meta tag within the page code written as <meta name="robots" content="noindex"> or <meta name="robots" content="index">.
Similarly, you can instruct a search engine to include a page in their index, but then not to follow the links on that page and pass on their authority to other pages on or off your website. This can be expressed within that same meta robots tag as either <meta name="robots" content="follow"> or <meta name="robots" content="nofollow">.
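These values can be combined in a single tag. For example, a thin archive page that you want search engines to neither index nor pass authority through might use:

<meta name="robots" content="noindex, nofollow">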
Schema markup is a structured data vocabulary created collaboratively by Google, Bing, Yahoo!, and Yandex. Structured data is code added to your pages that communicates information about them to search engines in a standardized format. The official Schema.org website provides resources to learn more and a complete library of the schema vocabulary.
Schema markup was created to help businesses communicate more explicitly with search engines about the processes, products, services, and other offerings that they might have. It also communicates key information about the business itself. Without it, search engines have to rely on their complex algorithms to make educated guesses about those aspects.
Schema.org markup can be broken down into two major components, ItemTypes and ItemProperties.
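To make that concrete, here is a minimal JSON-LD sketch (borrowing the fictional Dope Shoes brand used later in this article; the contact details are placeholders) where LocalBusiness is the ItemType and name, url, and telephone are ItemProperties:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Dope Shoes",
  "url": "https://dopeshoes.com/",
  "telephone": "+1-555-010-0300"
}
</script>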
Beyond letting search engines know exactly what your content is about, this structured data can improve your chances of earning a rich snippet. Rich snippets are special features in the SERPs that go beyond the standard title, meta description, and URL.
Some examples of how Schema.org can help your website and search visibility with these rich snippets are:
A broken link isn't just a poor experience for the user; it can also harm your ability to rank. If a page has been intentionally or unintentionally deleted, it will return a 404 "Not Found" error. This error sends both your users and search engine bots to your 404 page, or to a blank page if you don't have one set up.
It's crucial to make a plan of action every time a page is deleted from your site and to ensure that links pointing to that page don't leave users at a dead end. Here's how to find and clean up these broken pages and links:
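One of the most common fixes, assuming the deleted page has a close substitute, is a 301 redirect from the old URL to the most relevant live page, so visitors and link equity are passed along instead of hitting a 404. On an Apache server, for instance, that could look like this in your .htaccess file (the paths are hypothetical):

Redirect 301 /old-running-shoes/ https://dopeshoes.com/shoes/running/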
Duplicate content is any situation where you have two or more pages on your website that are too similar to one another. This is typically content that has been copied and pasted wholesale (sometimes called syndicated content) or generated from a template.
In Google’s eyes, duplicate content is the worst because it’s low effort. The goal of any search engine worth its salt is to deliver high quality, informative, and relevant content to its users. See the discrepancy?
To fix duplicate content issues, you'll first need to crawl your website. Most website crawling tools include features that look for overlapping content and flag pages that are overly similar.
After you've identified these pages, you need to determine which page you want to be the "main" page and what you plan to do with the duplicate content. Delete it? Redirect it? Rewrite or refresh it?
In other situations, like when you have product pages that don’t have any SEO value (e.g. selling the same shoe in red, blue, etc.), you’ll want to utilize canonical tags between the pages.
A canonical tag is a snippet of code on a page that instructs a search engine to treat that page as an intentional duplicate of another "main" page, and to keep the intentional variations from appearing in the SERPs.
Say you own a gym shoe company called Dope Shoes. A URL you have on your site might look like: https://dopeshoes.com/shoes/running/dope300/.
You might also have a CMS that is making a new “page” for each variation or size: https://dopeshoes.com/shoes/running/dope300/red/ or https://dopeshoes.com/shoes/running/dope300/blue/
Now, because the content for those color variations is likely to be identical or near identical to the main /dope300/ page, you would want to declare that each of those color variations is an intentional duplicate of the main page.
This can be done by placing the rel canonical tag within the code of the variation pages like this:
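<link rel="canonical" href="https://dopeshoes.com/shoes/running/dope300/" />

Placed in the <head> of the /red/ and /blue/ variation pages, this tag tells search engines to consolidate those URLs into the main /dope300/ page.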
A secure website has always been important for users and search engines alike, particularly if you utilize ecommerce.
With that in mind, the secure sockets layer (SSL) was created. An SSL certificate, issued by a certificate authority, creates a public and private key pair on the server that verifies the ownership and authenticity of a website and encrypts traffic between the server and the browser. This verification layer prevents a variety of attacks.
Once you've implemented your SSL certificate, your URLs will use the HTTPS protocol (rather than the standard, less secure HTTP). Browsers will then show users your certificate details and a "secure" padlock indicator when they visit your site. It's also a direct ranking signal for Google.
Simply put, a sitemap is a collection of links that you want search engines to crawl and index. Rather than a simple list of links, extensible markup language (XML) sitemaps let you provide additional information about each URL that a search engine can use to crawl and index your pages more efficiently.
XML sitemaps are great for large websites with lots of content, new websites that don’t yet have many inbound links, and generally any website that regularly makes changes that need to be crawled and indexed.
If you utilize a CMS, one is usually created for you automatically; you can typically find it by adding "/sitemap.xml" to the end of your root domain, for example: https://yourwebsite.com/sitemap.xml.
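For reference, a bare-bones XML sitemap is simply a urlset element containing one url entry per page; the URL and date below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourwebsite.com/</loc>
    <lastmod>2021-06-15</lastmod>
  </url>
</urlset>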
Here are some best practices after creating your sitemap:
In case you're behind the times: Google switched to mobile-first indexing in 2021, which means it evaluates the mobile version of your site to determine its ranking potential.
“Mobile friendliness” describes a range of website features such as:
You can use Google’s own Mobile Friendly Test tool to audit your website.
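One baseline item you can verify in your own code is the viewport meta tag, which tells browsers to scale the page to the device's width rather than rendering a shrunken desktop layout:

<meta name="viewport" content="width=device-width, initial-scale=1">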
A strong and intentional internal linking strategy can dramatically improve the strength and ranking of the individual pages on your website. Internal links work similarly to backlinks in the sense that they can help inform search engine bots as to what the target page is about.
When thinking through internal links on your website, the links that you place are not only good for helping users navigate through the site, but they also communicate hierarchy and importance. If you have the most links going to your core solutions pages, Google is inclined to think those are the most important topics and rank you for related terms accordingly.
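For example (the URL and anchor text here are hypothetical), an internal link with descriptive anchor text tells both users and crawlers what to expect on the target page, while generic anchor text like "click here" tells them nothing:

<a href="/shoes/running/">men's running shoes</a>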
Best practices to follow:
Now that you have a solid grasp of the most important technical SEO factors and some implementation techniques, here are some must-have SEO tools in your toolbox.
Technical SEO can seem daunting at first. There are a lot of moving parts and a bit of a learning curve. However, these checks are pretty binary and once you understand the intent behind them, you’ll be well on your way to keeping an optimized presence.
When it comes down to it, poorly implemented technical SEO can ruin your other SEO efforts like link building and creating a content strategy. It’s not always the most glamorous, but it is crucial to your website’s success and always will be.
As time goes on, you might be tempted to set it and forget it or implement these checks once and then never review them, but you need to resist that urge. It’s important that you have a plan to regularly check in on the technical health of your website.
Here’s a quick technical SEO upkeep roadmap:
Ken Marshall is the CGO and a Partner at RevenueZen. He’s been doing some version of digital marketing for the past seven years and has shifted his focus to all things SEO and inbound for the last five. Husband, mini Australian shepherd puppy dad, and serial entrepreneur (mostly failures, lots of lessons).