What is technical SEO?
Technical SEO is a site optimization process that helps search engines like Google find, understand, and index site pages. Unlike on-page SEO optimization, technical SEO is not focused on optimizing the content on the site but on technical parameters that affect the ability of search engines to analyze the site properly.
Technical SEO alone will not drastically improve a site's position in search engines if the content is not of good quality and on-page optimization has not been done. But if a technical issue prevents search engines from indexing your site, it will not appear in the results regardless of the quality of the content and the other SEO activities you undertake.
Find out below:
- How to make it easier for Google to find your site and start showing it in results as early as possible
- What Google Search Console is, and which of its most important features you should know
- How to speed up your site so visitors don't leave just because a page takes too long to load
What activities does a technical SEO audit cover?
Many factors affect technical SEO, and for this guide, we will group them as follows:
- Factors affecting site indexing
- Duplicate content
- Noindex tag
- Robots.txt file
- Canonical tag
- Load speed optimization
- Site structure
- Responsive web design
- SSL certificate
Indexing means that search engines have found your site while crawling the web and added it to their database. Based on whether the site is properly indexed and what the search engines found on it, they later rank it for relevant terms.
For a site to appear on search engine results, search engines must find that site through crawlers (small bots that go online and find sites). In most cases, this will happen automatically after a while, but there are a few things you can do to make their job easier and faster.
You also need to make sure that there is nothing to prevent your site from being indexed. You can check most of these things, and more, through the Google Search Console platform.
Google Search Console
Google Search Console is a free platform that lets you learn a lot about your website and its visitors. With it, you can find out how many people reach the site through search engines, which terms they use when searching, the organic click-through rate, and much more.
Through the Google Search Console, you can also check that your site's pages are indexed correctly, check that your site is optimized for mobile devices, and see any potential errors. This is the part that interests us most for the needs of this guide.
You must first add the site to the Google Search Console.
Note: This is best done immediately after launching the site because the Google Search Console starts collecting data from the moment you add your site to the platform and cannot display any data from an earlier period.
The process of adding a site to the platform is simple.
First of all, log in to the Google Account where Google Analytics is set up (I will explain why below). Visit this link and click "Start".
The next screen will show you two options for adding the site to the platform. The first option is better if you have multiple subdomains; it requires DNS verification. In most cases, the second option, which has a much easier verification process, will suffice.
Enter your domain and click "Continue".
It's time for verification. The easiest method, which I recommend in all cases, is Google Analytics verification. If you are logged in to a Google Account where Google Analytics is already set up, select this option and then click "Verify".
After a short check, you will be taken to the Google Search Console
platform, where you will be able to see a lot of useful information about
your site and enable faster indexing. If you have just added a website, no
data will be displayed.
The next step is to add an XML Sitemap to the Google Search Console.
A sitemap is a file through which you provide information about the pages, videos, and other content on your website and the relationships between them. Search engines like Google read this file to properly search and index the site.
Adding a Sitemap to the Google Search Console can speed up the indexing of
your site. This is especially important for new websites, but it is also
useful for websites that have been around for some time.
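To make the format concrete, here is what a minimal XML sitemap looks like, following the standard sitemap protocol (the URLs and dates are placeholders; replace them with your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to know about -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

In practice you rarely write this file by hand; the tools described below generate it for you.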
To add a sitemap, you must first create one.
Creating an XML sitemap in WordPress and Blogger
Yoast SEO WordPress plugin is very useful when it comes to optimizing WordPress sites. This plugin also allows you to easily create a sitemap, although the option is a bit hidden.
You can find the XML sitemap under General > Features > XML Sitemaps > See the XML Sitemap in the Yoast SEO plugin.
This will display all sitemaps, which will likely include content types such as posts, pages, projects, and categories. Depending on the structure of the site, in most cases it will be enough to submit the post and page sitemaps to Google Search Console.
Copy the links of these sitemaps and enter them in Google Search Console in the Sitemaps tab.
As for Blogger, the thing is much simpler.
Just type sitemap.xml in the Sitemaps tab and press Submit.
Creating an XML sitemap if the site is not on WordPress or Blogger
If your site was not created with WordPress or Blogger, I assume a developer built it, so ask them to create an XML sitemap.
You can create a sitemap using the free Screaming Frog tool. Instructions for creating a sitemap using the Screaming Frog tool can be found here.
Once you get site map links using the Screaming Frog tool, you can enter them into the Google Search Console in the Sitemaps tab.
Another thing you can do to speed up the indexing of your site's pages is to inspect the URL. You can do this in the URL Inspection tab in the Google Search Console platform.
Enter the URL you want to index at the top of the page, and you'll get the current information Google has about that address.
Here you can see whether Google can find and index the address normally, whether there are any problems, and whether the page is optimized for mobile devices. If the URL hasn't been indexed yet, or you've made changes to the page that you want Google to pick up faster, click the "Request Indexing" button.
I recommend that every time you publish a new page on a site, you request indexing so that Google can find and rank the page sooner.
Finding indexing issues
In the "Coverage" tab of the Google Search Console platform, you can check for errors or warnings on your site when it comes to indexing. Here you can also see how many pages of the site are properly indexed and how many pages are excluded from the ranking.
Duplicate content is content that appears on multiple pages on the Internet.
Duplicate content can be found within a single site, such as multiple pages
with the same text.
Google does not penalize sites directly for duplicate content, but it can cause many other crawling issues.
Duplicate content is always best eliminated; ideally, every page of the website has original content. However, in some cases you may want to keep duplicate content, or you simply can't remove it from the site.
For example, your site may have separate mobile and desktop addresses, or addresses with and without the www prefix. In such cases, you can simply remove those pages from the Google index, and the duplicate content will not negatively affect your SEO.
Remove pages from the Google Index
You're probably wondering why you'd like some pages of your site not to appear on Google results.
If more pages can rank, it is better for SEO, right?
Not always. Some pages are not good for first contact with site visitors and should be excluded from search results. This does not mean that they will be removed from the site (users will be able to access them without problems), but they will not appear on Google.
Here are some examples of pages you might want to remove from the Google index:
- Category and tag pages - if you have a blog and do not want to display pages where all texts by categories and tags are listed
- Comment pages - if you have a lot of comments and they are organized into pages, you do not want those pages to be ranked on the results
- Cart, checkout, thank-you, and similar pages - always good to keep out of search engines
- Duplicate content pages - avoid duplicate content whenever you can, but if you must have it on some pages, remove those pages from the index
One way to remove these pages from the index is to request removal in the Removals section of the Google Search Console platform.
This is a great solution if you only need to remove an address from the index temporarily (up to 6 months). For example, if you have pages for seasonal promotions, you don't want them to appear until next season.
Click the New Request button.
Then enter the URL you want to remove and click the "Next" button.
If you want to permanently remove some pages from search results, the best option is the noindex tag.
Noindex tag
A noindex tag is a piece of HTML code added to a page that signals to search engines not to index that page. Search engine crawlers can still visit the page but will not include it in the results.
WordPress - Noindex tag
If you are optimizing a site built on the WordPress platform, you can add a noindex tag to pages using the Yoast SEO plugin. While editing a page, you will find a Yoast SEO section at the bottom; in its Advanced tab, you can choose not to index that page.
This is a great method for removing individual pages from the index, but what if you want to remove categories and tags?
You can do this in the Taxonomies tab of the Search Appearance section of the Yoast SEO plugin.
Blogger - Noindex tag
Simply edit the post or page and, on the right side under Post Settings, select Custom Robots Tags > default > noindex.
If your website was not created on the WordPress or Blogger platform, you can add a noindex tag inside the <head> section of the page's HTML code. The tag looks like this:
<meta name="robots" content="noindex">
In the URL Inspection tab of the Google Search Console platform, you can check that the noindex tag is working properly.
Enter the URL and click the Test Live URL button.
If you get a message that the URL is not available to Google, the noindex tag is set correctly.
However, if you see a message that the URL is on Google, the noindex tag is not implemented properly.
Once you've placed a noindex tag on a page, it usually takes Google from a few days to a few weeks to pick up the change.
Another way to control the way search engines read and index your content is the robots.txt file.
Robots.txt is a file that tells search engine crawlers which pages or parts of a site they may crawl.
Most sites do not need a robots.txt file because Google and other search engines can, in most cases, index all important pages of the site without any problems.
In cases when you want to remove a page from the index, the noindex tag usually solves it more easily than a robots.txt file.
However, the robots.txt file is very useful if you want to keep certain files (images, PDFs, etc.) out of the search index. This is a rare case and beyond the scope of this text, but if you want to know more about it, you can find more information at this link.
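For illustration, a robots.txt that keeps crawlers away from a downloads folder and from PDF files might look like this (the paths are hypothetical; the * and $ wildcards shown here are supported by Google's crawler):

```text
# Rules apply to all crawlers
User-agent: *
# Block an entire folder
Disallow: /downloads/
# Block every URL ending in .pdf
Disallow: /*.pdf$
```

The file must be placed at the root of the domain, e.g. https://www.example.com/robots.txt.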
You are most likely to encounter a robots.txt file indirectly if you have a site created on the WordPress or Blogger platform and, for some reason, noticed that your site does not appear at all in search results.
When you install WordPress or Blogger and start working on a site, there is no point in indexing the website while it is still under construction. Both platforms therefore have a feature, enabled by default, that asks search engines not to crawl and index the pages.
When your WordPress site is ready for users, go to Settings > Reading in the WordPress dashboard and uncheck the box next to "Discourage search engines from indexing this site". A few days after you make that change, your site will likely be in the search engine index.
When your Blogger site is ready for users, go to Settings > Crawlers and indexing > Custom robots.txt.
Enter the following (the User-agent and Allow lines make this a complete robots.txt; the Disallow rule only excludes Blogger's internal search pages):
User-agent: *
Disallow: /search
Allow: /
A few days after you make this change, your site will appear in the search engine index.
The canonical tag tells search engines which URL is the main one for a particular piece of content. That address is ranked in search results, while other pages with the same or similar content are crawled much less often and are generally not included in the index.
The canonical tag is one of the best ways to prevent problems caused by identical or similar content appearing on multiple pages of a site.
In the WordPress panel, open the page you want to point to another, canonical page. In the Advanced tab of the Yoast SEO section, enter the address of the page that has the same or similar content and that you want to be the only one indexed for that content.
As for Blogger, the platform handles canonical tags automatically, in line with web best practices, so no action is needed.
If your site is not created on WordPress, you can add a canonical tag manually by entering the following inside the <head> section of the page:
<link rel="canonical" href="https://www.example.com/some-page/" />
Site speed optimization
Google wants to provide users with quality content that loads very quickly. Users expect sites to load within seconds, and if that doesn't happen, they quickly return to search results in search of a better, faster solution.
This reflects badly on the average time spent on the site, the bounce rate, and therefore on SEO. Page loading speed is among the most important factors of technical SEO.
Okay, that probably sounds logical to you because no one likes sites that
take a long time to load, but how do you speed up a site?
If you want your site to load quickly, some of the things you need to pay attention to are:
- Web hosting
- Caching
- Minification of elements
- Image compression
- Content Delivery Network (CDN)
We will briefly look at each of these factors.
All content on your site is stored on a server. That server is essentially your web hosting.
There are various hosting providers and packages, but what interests us the most when it comes to technical SEO is the location of the server and the category of hosting.
The server location is essential because the farther the server is
from the location where the request to load content comes from, the longer
it will take to get a response from the server and the site to load.
In practice, this means that it is best to choose servers that are as close as possible to your target audience. If your audience is in the UK, try to find a hosting provider that has servers in the UK, etc.
Most of the time, you won’t be able to find servers in the location you need (or they’ll be too expensive), but you can make up for that to some extent by using a CDN.
You do not need hosting for a Blogger website.
A cache sits between one or more web servers and users, monitoring incoming requests and storing copies of responses, such as HTML pages, images, and files. If a request for the same URL is repeated, those copies are served to users from the cache instead of being fetched again from the original web server.
There are two main reasons why caching is used:
Load speed - because the request is answered from the cache (which is closer to the client) instead of the origin server, it takes less time to load the page.
Reduced data traffic - because responses are served again from the cache, the amount of traffic to the origin server is reduced.
Plugins such as WP Rocket make it easy to implement caching in WordPress. In addition to caching, some of these plugins (WP Rocket, for example) also allow minification of elements.
Minification of the elements
The first step should be to clear out excess code. This redundancy can result from features you no longer use on your website or from poor coding.
The bottom line: the cleaner the code, the faster the site will load.
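As a small illustration of what minification does (the CSS rule below is hypothetical): the two versions are functionally identical, but the second strips comments, whitespace, and line breaks, so fewer bytes travel over the network:

```css
/* Before minification */
.site-header {
  background-color: #ffffff;
  padding: 16px 24px;
}

/* After minification: same rule, fewer bytes */
.site-header{background-color:#fff;padding:16px 24px}
```

Minifiers apply this kind of transformation automatically to all of a site's CSS and JavaScript files.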
As I mentioned before, WordPress plugins like WP Rocket allow you to do this without coding if your site is built on that platform.
After cleaning the code, it is recommended that you compress it with a method such as Gzip.
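As a sketch, on an Apache server with the mod_deflate module enabled, Gzip compression of text-based responses can be turned on with a few lines in the .htaccess file (the directives are standard mod_deflate syntax, but check with your host that the module is available):

```apacheconf
# Compress text-based responses before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```

Images should not be Gzipped this way; formats like JPEG and WebP are already compressed.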
It is important to note that minifying elements using plugins can often cause problems with loading elements on the site. Therefore, before turning on the minification, make sure that everything on the site works properly. If a problem occurs, you can always turn off the minification option inside the plugin, and your site will work properly again.
Large images slow down your site, which is bad for the user experience and SEO.
The two main types of image compression are lossy and lossless. It is important to understand these terms because different tools offer different types of compression.
Lossy - this type of compression can significantly reduce the image size, but there is a risk that images will visibly lose quality. If you opt for lossy compression, be careful not to overdo it.
Lossless - This compression reduces the size to a lesser extent but does not cause image deterioration.
The image formats most used on websites are PNG and JPEG, but the format I recommend you use is WebP.
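Because some older browsers do not understand WebP, a common pattern is to serve it with a fallback using the HTML <picture> element (the file names here are placeholders):

```html
<picture>
  <!-- Browsers that support WebP load this source -->
  <source srcset="photo.webp" type="image/webp">
  <!-- Older browsers fall back to the JPEG -->
  <img src="photo.jpg" alt="Description of the photo">
</picture>
```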
You can download the program for converting images to webp at this link.
Content Delivery Network (CDN) means a network of servers in geographically diverse locations for faster delivery of web content. CDN services store copies of site content on servers around the world and, depending on where the visitor is located, send content from the nearest server.
As we have already mentioned in the section on hosting, this leads to faster loading of the site. CDN is especially useful if your customers are located around the world. If this is not the case and most users are in a narrow geographical area, it is enough to rent hosting near that location, and then you do not need a CDN.
There are many CDN services, and their quality varies considerably. Cloudflare is one of the more popular low-cost services.
BunnyCDN is great because you only pay for the bandwidth you actually use, and you can choose which continents the CDN will be enabled for and which not.
Website structure is the way pages are structured and linked. The ideal site structure helps search engine users and crawlers easily find what they are looking for on a website.
If you have pages on your site that are many clicks away from your homepage, or not linked from any other page at all, Google will have a hard time finding and indexing them.
But if the site has a good structure and the pages link to each other, Google will easily find all the pages and add them to the index.
Also, when you link from pages that already have authority and backlinks to other pages of the site, you transfer part of that authority to those other pages. This is a great way to improve the ranking of new site pages.
A well-organized site is also great for the user experience because it allows users to easily and quickly access the information they need.
Responsive web design
Today, users access sites more often via mobile phones than laptops or desktops. As a result, a greater need for responsive web design has developed.
Responsive web design means that the site must display well and be easy to use on all the devices users have.
If users have to zoom in on elements, or elements run off the edges of the screen, the user experience suffers badly, and search engines penalize it. A few years ago, Google introduced "mobile-first" indexing: it primarily checks how well the mobile version of the site is optimized and decides accordingly where the page should rank in the results.
You can make a site responsive with CSS. If you use WordPress or Blogger, your site will automatically be responsive as long as you use any of the more popular themes (read the theme description just in case).
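For a minimal sketch of what responsive CSS involves: a viewport meta tag in the <head>, plus media queries that adapt the layout to narrow screens (the class name and 600px breakpoint are illustrative):

```html
<!-- In the <head>: tell mobile browsers to use the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .content { width: 960px; margin: 0 auto; }

  /* On screens narrower than 600px, let the content fill the screen */
  @media (max-width: 600px) {
    .content { width: 100%; }
  }
</style>
```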
To verify that a page is responsive, enter the page address in the Google Mobile-Friendly Test and click Test URL.
After a brief analysis, the tool will tell you if the site page is well optimized for mobile devices.
SSL (Secure Sockets Layer) creates an encrypted layer between the web server and the web browser, making the website more secure. When a user submits information on your site, such as payment details or contact information, that information is far less likely to be intercepted if SSL protection is active.
Search engines reward this extra security and position sites that use SSL better in search results.
You can tell that an SSL certificate is active if the site address starts with "https" instead of "http" and a padlock appears to the left of the URL in the browser.
Most of the better web hosting services either offer an SSL certificate for free within cPanel or sell SSL as a paid add-on, which usually costs around $10 per site for a basic certificate.
If no payments go through your site, it is enough to have a basic certificate.
You can also implement SSL for free yourself through cPanel using the Let's Encrypt service, although that requires some technical knowledge.
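Once SSL is active, it is also worth redirecting all HTTP traffic to HTTPS so that users and search engines always land on the secure version. On Apache, a common .htaccess sketch looks like this (it assumes the mod_rewrite module is enabled):

```apacheconf
<IfModule mod_rewrite.c>
  RewriteEngine On
  # If the request did not arrive over HTTPS, redirect it permanently (301)
  RewriteCond %{HTTPS} off
  RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>
```

A permanent (301) redirect also tells search engines to transfer the ranking signals of the HTTP address to the HTTPS version.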
Technical SEO is sometimes very complex, and in this guide I have listed the most important things you need to know for your site to be well optimized technically. If you do technical SEO correctly, research your keywords well, and regularly write quality content, the SEO results will follow.