Technical SEO

What is technical SEO? A simplified way to think of it is as any optimization that isn’t related to your content or links – rather, it’s the parts that users don’t see. Most website owners should (and probably do) know that search engine optimization is an integral part of getting organic traffic to their site, but far fewer are aware of the impact that technical SEO can have.

Basic SEO pyramid

Technical SEO is your foundation

It should not come as a surprise to you that this type of optimization can get very, well… Technical. Therefore, we have kept this page legible and relatively brief. Each specific section will be finished off with a quick checklist – and there is some crossover between sections. Don’t feel too bad if you can’t check off every item, since it’s not necessary (or even possible) to have a perfect website.

The why: What can I gain from technical SEO?

While it’s true that the biggest gains from optimizing the technical aspects of your website come from repairing, restoring and correcting past mistakes, errors and hiccups, there are also very positive effects on loading speeds, conversion rates, bounce rates, click-through rates and even traffic to be had with proper technical SEO!

What does technical SEO entail?

Technical SEO encompasses many different aspects of your site, from how fast it loads to how the entire website is structured and coded. Certain parts may seem insignificant, such as reducing the size of your images by a few kilobytes each, but it all really does add up in the end. A single image of 10 kB that gets viewed 3 times per day adds up to 10.95 megabytes in a year and that’s just a single, tiny image.
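That arithmetic is easy to verify. A quick sketch, using the figures from the paragraph above:

```python
# Yearly transfer caused by one small image (figures from the text)
size_kb = 10        # image size in kilobytes
views_per_day = 3   # how often it is viewed each day

yearly_kb = size_kb * views_per_day * 365
print(yearly_kb / 1000, "MB per year")  # → 10.95 MB per year
```

Multiply that by the dozens or hundreds of images on a typical page, and the savings from compression become obvious.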

They add up to a lot

Web property, language and target country

We’ll start here, since you can do this as soon as you have a website online. Your website (the domain + content) can be seen as an internet property, much like owning a piece of land in real life. A problem unique to web properties is that they can have multiple addresses. For example, these are four different addresses to the same property/website:

  • http://example.com
  • http://www.example.com
  • https://example.com
  • https://www.example.com

You need to properly define the correct version in Google Search Console (previously known as Webmaster Tools), so that Google knows which version it should show in search results. Match this with a “rel=canonical” tag in the HTML.
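For example, a canonical tag placed in the page’s head (the domain below is a placeholder) might look like this:

```html
<!-- In the <head> of every alternate version of the page -->
<link rel="canonical" href="https://www.example.com/page/" />
```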

While you’re at it, you should make sure that your website is targeting the correct country/language. This too can (and should) be done in both HTML and the Search Console. Note that while adding alternate-language (hreflang) links in your code is recommended for regional targeting, Google ignores code-level language attributes and instead determines the language from your content.

“We don’t use any code-level language information such as lang attributes.” – Google
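If you do serve multiple language or regional versions, hreflang annotations in the head tell search engines which version targets which audience. A sketch with placeholder URLs:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en/" />
<link rel="alternate" hreflang="fi" href="https://example.com/fi/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```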

SSL Secure Web

Lastly, it is now recommended to use site-wide SSL/HTTPS encryption. It’s definitely not required, but more and more sites are joining the secure web – especially since Google announced it is a (minor) ranking factor. It is also a prerequisite for HTTP/2. Warnings for websites without HTTPS will become more and more prevalent.

Website architecture and structure

You should have thought about your site structure, information architecture and navigation before you even started. If you didn’t, it’s better to fix it sooner rather than later. Having a poorly structured site with improper or confusing navigation makes it more difficult for both human visitors and search engine robots to wander through your website.

Site structure example

A simple example – Just make sure it makes sense

It should be as simple and clear as possible, but there is no one true way. A typical structure would use categories, subcategories, pages/posts and, perhaps, tags and landing pages. Doing this right helps the right people find the right page at the right step of the sales funnel.

URL optimization

Far too many webmasters and e-commerce business owners look past URL optimization for some reason. It’s important, since both search engines and regular users vastly prefer short and clear URLs. You should be able to tell what the page is about just by reading the address, so if your URLs are not optimized, there’s a decent chance you are taking a hit in the rankings.

Pretty site structure example

Web shop “Bubbleroom” has a very clear URL structure

As an example of good structure, the URLs above are easy for both humans and search engines to understand. Use suitable keywords in the URL. You also need to make the navigation (menus, internal links, etc.) just as easy to understand. PS: Your URLs should use UTF-8 encoding.

“Make sure to use UTF-8 encoding in the URL (in fact, we recommend using UTF-8 wherever possible)” – Google

Even if all your URLs are easy to read, short and descriptive, you may have neglected to correctly handle all other URL parameters, such as pagination, filtering, tracking tags, session IDs and queries. Google and other search engines have gotten really good at identifying these over the years, but it’s still important to make sure everything is set up correctly. There are several ways of doing this (robots.txt, Search Console, HTML tags, etc.) and it might be worth using more than one, to ensure that other search engines also understand your website.
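To illustrate the difference (both addresses are hypothetical):

```text
Good: https://example.com/shoes/running-shoes/
Bad:  https://example.com/index.php?cat=12&sessionid=a81b&sort=price
```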

Structure and URL checklist:

  • Your site has a clear, logical structure
  • It is easy to navigate
  • Your URLs are short, clear, relevant and readable
  • Your URLs use UTF-8 encoding

Indexing & Crawling

The first thing is to make sure that the so-called “crawlers” or “spiders” – computerized bots that crawl through the web, scanning pages and indexing them – can actually crawl your site successfully. Google Search Console allows you to request a simulated crawl from Google’s own bot, and upon completion you can manually submit your website to the Google index. Alternatively, you can use other crawlers, such as the excellent Screaming Frog SEO Spider or web-based tools.

Google Web Crawler

You need to ensure that the crawlers are able to visit every page they should be, and unable to access any page you don’t want them to visit. Is your robots.txt file set up correctly? It allows you to give instructions for bots and crawlers about where they may and may not go, how often, etc. The worst case scenario is that you have accidentally blocked your entire website from being crawled.
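A minimal robots.txt (the blocked path and sitemap URL are placeholders) could look like this:

```text
# Allow everything except the admin area
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```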

Create a sitemap for your entire website and submit it to Google Search Console. This ensures that your website gets crawled and indexed by the search engine, and if something goes wrong, you’ll see it in detail.
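An XML sitemap follows the Sitemaps protocol; a minimal one (with placeholder URL and date) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
</urlset>
```

Most content management systems and crawlers can generate one of these for you automatically.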

Indexing and crawling checklist:

  • Your site can be crawled
  • It passes HTML/CSS validation
  • It has little or no duplicate content
  • Your robots.txt is correctly set up
  • You have created and submitted a sitemap
  • You have no 404s (or crawl errors)
  • Your redirects are working as intended


Page load speed

While users have always preferred pages that load quickly, it has become a much more important factor over the past few years. One massive online retailer noted that their conversion rate went up by 1% per 100 ms of reduced page load time.

Google has confirmed that pages that load quickly tend to rank higher. Their human raters also evaluate loading time as part of measuring a site’s user-friendliness. The fastest pages on the web load in hundreds of milliseconds. If you still think your 4-second loading time is OK, you should seriously reconsider.

Page load speed runner

So, what causes slow loading speeds for web pages? How fast your website loads depends on many different variables, including your theme, design, plugins, hosting, CDN, images, code structure, HTML compression, minification, the number of requests, browser/disk caching and so on.

You obviously need to start by hosting your site on a fast web host, since server performance is the one thing you cannot fix on your own end – it will always be the ultimate bottleneck for your technical SEO efforts. The next step is to combine as many static files (CSS, JavaScript, images) into as few files as possible, in order to reduce the number of requests to the server. Note that websites utilizing HTTP/2 most likely do not need to reduce the number of requests.

Static resources that don’t need to be reloaded every time they are used can be loaded from the user’s disk instead of the server. Leverage browser caching (set to at least one week) for any and all cacheable resources. A CDN (Content Delivery Network) will host static files locally across the globe to further reduce the load on your server.
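On an Apache server with mod_expires enabled (an assumption – nginx and other servers use different directives), browser caching can be configured roughly like this:

```apache
# Example .htaccess rules – cache lifetimes are illustrative
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```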

PageSpeed checklist*:

  • All CSS and JavaScript is separate from HTML
  • All code (HTML, JS, CSS, etc.) has been combined and minified
  • All code has been compressed with gzip
  • All render-blocking content is below the fold
  • Scripts are loaded asynchronously
  • All images are compressed
  • All images have predefined sizes
  • Browser caching is utilized
  • Disk caching is utilized

* Note that many of the checklist items are on a “where applicable”-basis. Certain server configurations, themes or website designs are not compatible with all forms of optimization.
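As one example from the checklist, gzip compression on an Apache server (again an assumption – other servers differ) can be enabled with mod_deflate:

```apache
# Compress text-based responses before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```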

User-friendliness, design and markup language

These things are more difficult to define. Google uses human raters to assess whether a site is user-friendly, teaches its algorithm what technically makes a site user-friendly, and then uses a combination of algorithmic and human ratings to rank websites. They take into account things such as:

  • Does it satisfy my needs/answer my question?
  • Does the page load quickly?
  • Does it look professional?
  • Is it easy to navigate?
  • Is the secondary content (such as ads) too distracting?

It’s definitely a mixture of technical + regular SEO, as well as other aspects of web development, so communication is key to optimizing these variables. You need to make sure it is responsive, renders and looks good on any device and at any resolution – especially mobile.

“Structured data markup” is a standardized way to annotate your content so that machines can understand it. When your web pages include structured data markup, Google (and other search engines) can use that data to index your content better, present it more prominently in search results, and surface it in new experiences, like voice answers, maps, and Google Now.

It helps products stand out in search results, it can show pictures, breadcrumbs, phone numbers, etc. and has a great potential for increased CTR. You may have seen star ratings in the Google search engine results pages – this is where those come from.
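For instance, a product with star ratings could be marked up with JSON-LD using the Schema.org vocabulary (all values below are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Running Shoe",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
```

This block goes inside a script tag of type "application/ld+json" on the product page.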

You may have heard of Schema.org markup – this is what we are talking about. If you cannot (or don’t want to) use it, you can also use Google’s own Data Highlighter to achieve similar results.

Mobile optimization is a chapter unto itself, but having a responsive design is highly recommended. Not only does it scale your website to fit any device and screen size, it also lets you maintain a single version of your website while still serving every visitor.

User experience and markup checklist:

  • Your site is responsive and works across devices/resolutions
  • It’s easy to navigate, quick and satisfying
  • It uses Schema.org markup (if applicable)
  • It renders correctly when fetched by Google

Technical SEO tools

These tools will help you along the way.

  • Screaming Frog SEO Spider (the free version crawls up to 500 pages)
  • A crawler that checks a website for all 40x errors
  • A link validator (40x errors)
  • A markup language validator
  • A CSS validator

The best tools for measuring pagespeed

  • Google PageSpeed Insights
  • GTMetrix
  • Pingdom (free report)
  • YSlow Chrome Plugin (Yahoo!)

Technical SEO audit by SEOSEON LTD.

We at SEOSEON would be happy to help you optimize your site by performing a technical SEO audit. We will probe your website thoroughly and scan every single page for potential errors, as well as possible gains. Upon completion, you will receive a detailed report, with a clear course of action and a priority rating for each task. Let’s improve your website together!