Technical SEO: The Ultimate Beginner's Guide to Unlocking Organic Traffic

Is your website struggling to climb the search engine ranks despite having great content and a solid backlink strategy? You’re not alone. Many businesses pour countless hours into crafting the perfect blog post or landing page, only to see no rankings and zero organic traffic. The missing piece? Technical SEO.

In this article, we’re getting into the nitty-gritty of technical SEO: the often overlooked foundation of a successful SEO strategy. Think of technical SEO as the foundation of your website. Without a strong foundation, everything you build on top of it is at risk. You could have the most beautiful site or the most engaging content, but if your site is slow, has crawl errors, or isn’t mobile friendly, you’re setting yourself up for failure.

In this guide, we’re going to break down everything you need to know about technical SEO. From understanding the anatomy of a web page to fixing common issues that might be dragging down your rankings, we’ll cover the essentials that can unlock your site’s full potential and significantly boost your organic traffic. Whether you’re a complete newbie or looking to fine-tune what you already know, this guide will equip you with actionable insights to improve your website’s technical performance and search engine visibility.

Ready to transform your site from an invisible corner of the web to a traffic magnet? Let’s get started.

What You’ll Learn in This Article:

  • What technical SEO is all about
  • Why it’s important for your overall SEO strategy
  • The anatomy of a web page
  • Technical SEO hierarchy
  • Site structure and navigation elements
  • Crawling, loading, and indexing elements
  • Duplicate and thin content issues
  • Page loading speed
  • Other critical technical aspects for overall SEO performance

What is Technical SEO?

Before we get started, let's clarify what technical SEO is all about. Technical SEO is the process of optimizing a website’s technical elements to improve its search engine ranking and visibility. This involves optimizing everything from your server and website code to how your site is structured—all to ensure your website provides a great user experience and is easy to crawl for search engines.

Help Search Engine Crawlers Understand Your Site

Technical SEO focuses on optimizing the back-end infrastructure and code of your website to make it easier for search engines to crawl, index, and understand your content. This might sound daunting, but if you're using a content management system (CMS) like WordPress, Squarespace, Wix, Webflow, or Storyblok, a lot of the heavy lifting is already done for you.

A spider web intricately connects multiple pages, symbolizing the interconnectedness of information for search engine crawlers
Image by Moz.com

Why is Technical SEO Important?

Now that you know what technical SEO entails, you might wonder why it’s so crucial, especially as the starting point of any new SEO campaign. At its core, technical SEO improves your website's crawlability. Without proper crawlability, search engines can’t discover or index your pages, which means your site won't appear in search results.

Increases Website Security

One of the often overlooked benefits of technical SEO is enhanced website security. In today’s digital landscape, with stringent regulations like GDPR, protecting your website and user data is more critical than ever. Implementing best practices for technical SEO not only safeguards your site from potential vulnerabilities but also ensures compliance with local and international data protection laws. A secure website builds trust with your users, reducing bounce rates and potentially boosting conversions. So, don’t overlook the role of SSL certificates, secure hosting, and regular security audits as part of your technical SEO toolkit.

Improves Mobile-Friendliness (Responsive Website)

A responsive, mobile-friendly website is non-negotiable in today’s world. With more users browsing on mobile devices than desktops, having a website that adapts seamlessly to any screen size is key. Mobile-friendliness isn't just about having a pretty layout—it’s a critical factor that impacts page loading speed, user experience, and ultimately, your rankings. Tools like Google’s Mobile-Friendly Test can help you ensure that your site meets these mobile standards.
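If your site is custom built, the most basic prerequisite for a responsive layout is the viewport meta tag in your page’s <head>. Most CMS platforms already include it for you; a minimal sketch looks like this:

<meta name="viewport" content="width=device-width, initial-scale=1">

Without it, mobile browsers render the page at desktop width and shrink it down, which hurts readability and mobile usability signals.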

Increases Your Website’s Visibility

Once your technical SEO is solid, search engines can crawl and index your pages, and your pages become visible to the right people. A well-optimized site means higher rankings for relevant queries and more targeted traffic to your business. A big part of technical SEO is the small stuff: fixing crawl errors, improving page speed, and keeping URLs clean. All of it adds up to a more visible and accessible site.

Desktop vs Mobile Market Share Worldwide by StatCounter

Three Primary Website Elements

There are three primary website elements that we’ll dive into. Each one describes a different aspect of what makes up a web page, which will help you understand the technical terminology as we go through the technical SEO optimizations.

HTML: Clean Code Structure

HTML (Hypertext Markup Language) is the foundation of your website. It structures your content so search engines can understand the importance of each piece. HTML elements like headings (H1, H2, H3, etc.), paragraphs, and lists define your content’s hierarchy in a way crawlers can interpret. Here's an example of what HTML code looks like:

<h1>Your Blog Title Goes Here</h1>
<p>You might have an intro paragraph here for your blog post to let your readers know what your article is about.</p>
<h2>Sub Title For Your Article</h2>
<p>Paragraph that belongs to this specific sub title & section.</p>

This is a very basic example of HTML structure. For more details, you can view the source code of any web page by right-clicking and selecting "View Page Source."

CSS: What Your Page Looks Like

CSS is short for Cascading Style Sheets, the technology layer that controls how a website visually looks. When you see amazing fonts, colors, layouts, boxes, grids, and other elements, that is CSS at work.

CSS determines how HTML elements and fonts are rendered (and sometimes even animated) on screen. Say you have a heading on your page: CSS decides how large it is, what color the text is, which font it uses, where it’s placed, and how it moves when you resize the window. There are more than 500 CSS properties, although most browsers only support around 300 of them. Plenty to choose from!

JavaScript: How Your Website Behaves

JavaScript adds interactivity to your site. Whether it’s a simple button click or complex animations, JavaScript powers the dynamic elements of your website, adding to the overall user experience. However, it’s crucial to ensure that your JavaScript is optimized and doesn’t slow down your site, as heavy scripts can negatively impact loading times and crawlability.

Technical SEO Hierarchy

Understanding the hierarchy of technical SEO elements is essential for prioritizing your optimizations. Think of it as a pyramid, where the foundational elements must be strong before you can move up to more advanced tactics.

A pyramid illustrating the hierarchy of needs for technical SEO, emphasizing the importance of crawlability at its base.

Crawlability: Can Search Engines Find And Access Your Site?

Crawlability is how easily search engines can find and navigate your web pages. This is a big deal: if search engines can’t crawl your site, they can’t find, index, or rank your content, and your site will be invisible in search results. So make crawlability a top priority.

To help with crawling, regularly review and optimize your robots.txt file and XML sitemap. The robots.txt file contains directives that tell search engine crawlers which pages to crawl or not to crawl. Make sure you don’t accidentally block important pages from being crawled. The XML sitemap is a roadmap for search engines, listing all the pages on your site you want to be found and indexed. Keep your sitemap up to date so search engines have an accurate view of your site’s structure.

Also consider your site’s crawl budget: the number of pages search engine crawlers will crawl and index on your site within a certain time frame. Using your crawl budget efficiently means optimizing your internal linking, making high-priority pages easily accessible, and avoiding duplicate or low-value pages that waste crawler resources. By managing your crawl budget, you help search engines focus on your most important and valuable content.

Indexability

Once search engines can crawl your site, the next step is to make sure your pages are indexable. This means your site’s configuration allows search engines to not only access your pages but also include them in their search results. To do this, you need to check that your robots directives are set up correctly.

Robots directives are instructions given through your site’s robots.txt file or meta tags that tell search engines how to handle your content. For example, you might use them to stop search engines from indexing pages that aren’t useful or relevant to searchers, such as admin or login pages, using the noindex tag we’ll cover in more detail later.

You also need to decide which pages on your site should be indexed. Not all pages need to be in search results. Focus on indexing pages that add value to your users and are aligned with your SEO goals. This way, search engines will prioritize high-quality, relevant content, and your site will perform better in both search and user engagement.

Accessibility (WCAG/ADA)

Before I explain this one, it’s important to note that people with disabilities have immense spending power, with an estimated $13 trillion in disposable income. Personally, I believe everyone should make their website accessible, but there is a revenue opportunity as well. If you want to check whether your website is accessible, run a free accessibility scan today. For online businesses, I encourage you to make your site welcoming to everyone.

Note: These first three layers (crawlability, indexability, and accessibility) can make or break your search visibility. If they aren’t working properly, forget about the top two layers; you need this core foundation sorted out first.

Another thing to consider is the legal aspect: in the United States, you can get sued under the Americans with Disabilities Act (ADA) if your website is not accessible, and in Europe, legislation taking effect in 2025 will enforce accessibility for new websites.

Graphic displaying essential accessibility metrics and the legal consequences of failing to meet ADA standards for a new website

Rankability: E-E-A-T Content Optimization

Rankability refers to how well your pages are set up to rank in search engine results, which includes content optimization, authority building, and trust signals. One of the key factors here is Google’s E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) guidelines, which heavily influence how your content is scored. Google looks at signals like author credentials, content accuracy, and external mentions to determine where your content sits in the SERPs.

To increase rankability, create content that shows your expertise, gets quality backlinks and builds user trust. Update your content regularly to keep it relevant and align it with the latest search intent. Make sure your pages have proper schema markup to control how your content is displayed in search results.

Clickability: Optimize Your CTR

At the top of the pyramid is clickability, which is all about getting users to click through to your site from search results. This means writing great meta titles and descriptions, optimizing your URLs, and making your content relevant and interesting. A high click-through rate (CTR) not only brings in traffic but also tells search engines your content matches user intent, which can improve your rankings.

Optimize your SERP snippets by adding keywords, creating urgency and using symbols or numbers to stand out. Test different meta descriptions to see what works for your audience and keep refining to improve CTR.
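As an illustration, here’s what an optimized title and meta description might look like in a page’s <head>. The wording is hypothetical; test variations against your own audience:

<title>Technical SEO Checklist: 10 Fixes That Unlock Organic Traffic</title>
<meta name="description" content="Slow pages? Crawl errors? Fix the 10 most common technical SEO issues today and watch your rankings climb.">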

Google Search Console dashboard displaying total clicks, total impressions, average CTR and average position, highlighting the "Search results" tab for CTR data

Site Structure And Navigation

First up we’ll look at site structure and navigation, a technical SEO fundamental. This means looking at how your website pages are organized and how they relate to each other. Navigation design includes menu bars, in-content links and overall site layout.

A well-structured site not only improves user experience but also distributes link equity. When you earn a backlink, the link equity it passes is spread across your internal pages. Simple page layouts and a clean link structure are key to efficient equity distribution. Internal links are crucial in directing equity to the right pages in the right proportions.

If you want a visual representation you can use this free visual sitemap generator to see what pages you have and how they are structured. If you are looking for a visualization for your internal links, I highly recommend you check out Sitebulb instead.

Shallow Structure - But Not Flat!

When designing your site’s architecture, you should aim for a shallow structure. This means search engines can reach any of your pages in 2-3 clicks from the homepage, which improves crawlability and adds to the overall user experience. A shallow structure makes navigation simpler, letting users and search engines find important content quickly. A deep structure can bury important content under multiple layers, making it harder for search engines to index and for users to find, so avoid it unless absolutely necessary.

To test your site’s structure you can use tools like Screaming Frog. This will show you how many clicks it takes to get to each page from the homepage. No page should be more than 3 clicks away. If your analysis shows some pages are too deep you may need to reorganize your content. Consider recategorizing your content into clearer subcategories or refine your internal linking to bring important pages to the surface.

Bonus tip: For very small sites, a flat structure where most content sits on one level might be enough, and it’s easier to maintain. But as your site grows, a shallow structure keeps all content accessible so users and search engines can navigate your site.

Breadcrumbs are navigation tools that help users know where they are in the site’s structure. Commonly found on e-commerce and blog sites, breadcrumbs help with navigation and link equity distribution by linking to parent pages. They improve the user experience and help search engines understand the relationships between pages on your site.

By providing a clear path from the homepage to deeper content, breadcrumbs allow users to see where they’ve been and where to go next. This can reduce bounce rates and encourage longer site visits as users are more likely to engage with other pages if they can easily get back to broader categories.

Example of navigational breadcrumb structures used on the Bakklog store.

And on top of that, breadcrumbs can also help with SEO. They provide a way for search engines to understand the context and relevance of your pages within the overall site structure. Each breadcrumb link is a signal to search engines of the importance and hierarchy of pages. This structured data can get your site to show up in search results as rich snippets or enhanced listings. Breadcrumbs help with user navigation and overall site SEO.
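To make breadcrumbs eligible for rich snippets, you can add BreadcrumbList structured data. Here’s a minimal sketch in JSON-LD, with hypothetical page names and URLs:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO Guide" }
  ]
}
</script>

The last item represents the current page, so its URL can be omitted.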

Pagination

Pagination means splitting lists of articles, products, or other content types across multiple pages. You see this a lot on blogs and ecommerce sites when you click through to the next or previous page.

Ideally you want 10 to 30 items per page; beyond that, pages become heavy from a technical resource point of view, and users have to scroll too far down. So why is pagination important?

  • Improves the user experience as it’s easier to navigate
  • Faster page load times as there is less content for the server/browser to load
  • Improved crawlability and indexability

Consistent URL Structure

When optimizing your URLs (also known as slugs or permalinks in various CMS systems) make sure they include your page’s main keyword. This helps search engines understand what’s on your page before they even visit and also makes it easy for users to know what’s on your page just by reading the URL.

Creating SEO Friendly URLs

When creating or editing URLs, try to avoid filler words or stop words like “for,” “to,” or “the.” These words don’t add much value for search engines and can make the URL longer than it needs to be. Instead, use your main keyword. For readability and SEO benefits, separate words with dashes (e.g. example-page instead of example_page). Also keep your URLs in lowercase to avoid case sensitivity issues; search engines treat URLs as case sensitive, so different capitalizations can create multiple versions of the same content and dilute your SEO efforts.
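To make this concrete, here’s a hypothetical before-and-after for a page targeting the keyword “motorcycle license”:

Poor:   https://www.example.com/Blog/how_to_get_a_license_for_the_motorcycle.php
Better: https://www.example.com/blog/motorcycle-license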

Bonus tip: Avoid unnecessary file extensions like .asp or .php in your URLs, especially if you’re using a more advanced platform. These extensions can complicate manual URL entry for users and clutter your URLs. Keeping your URLs clean and straightforward is beneficial for the user experience and SEO.

Discover how our team can drive more conversions and revenue for your website

  • SEO - not just traffic, but actual revenue.
  • Content Marketing - our team creates epic content that gets shared, attracts links, and drives conversions.
  • Google Ads - effective PPC strategies with clear ROI.

Logically Organized Topical Clusters

We want your content to be organized in a way that makes sense. This helps users find what they’re looking for more easily and builds topical authority through content clusters. To do this you’ll need to group related pages into content clusters (also known as topic clusters).

A good example of this is how WordPress handles category pages. It creates category pages that group similar content together which helps both users and search engines understand the relationships between different pages. By organizing content in this way search engines get the context to index and rank your pages better. Plus it helps with user navigation and internal linking. For example if you have a category page with 3 subcategories and each subcategory has multiple pages, linking between them helps users and search engines find and navigate the content more easily.

Here’s an example:

You have five long-tail keywords related to motorcycle licensing and riding:

  1. Get a motorcycle license
  2. How to ride a motorcycle
  3. Motorcycle license process
  4. How to get a license for motorcycle
  5. Ride a motorcycle

You can cluster them like this:

  • Group 1: Motorcycle License

    • Get a motorcycle license
    • Motorcycle license process
    • How to get a license for motorcycle
  • Group 2: Ride a motorcycle

    • How to ride a motorcycle
    • Ride a motorcycle

By clustering similar topics you can create a hub page or sub-cluster around each main topic. This helps create a category with related content underneath it and makes it easier for users and search engines to find and understand your content.

Easy For Users to Navigate

To make your site easy to navigate, structure your menu to match your content. For example, if your site has a three-level hierarchy (categories, subcategories, and individual pages), make sure your menu mirrors this. Instead of listing the end pages directly in the menu, list the categories first.

When the user hovers over a category, they should see a submenu with the subcategories. When they hover over a subcategory, they should see the pages within it. This way users can find content more intuitively and navigation stays organized.

Crawling, Loading and Indexing

Crawling, loading, and indexing are all about how efficiently a website is designed for search engine bots to crawl and explore, so they can find new pages. Page loading speed is a large factor for crawlers: if a page is too slow, the crawler might time out and load it only partially, or skip it entirely and move on to the next one.

Another important part is understanding which pages are indexable by default and which ones are blocked, for example with a noindex tag. If you block certain pages from being indexed, they won’t show up in the search engines no matter what you try.

Let’s start looking at some of the ways you can find crawling, loading or indexing issues and how you can tackle them.

Identifying Issues in Google Search Console

One of the first things you can do when conducting a technical SEO audit is look at the Page Indexing report in Google Search Console (GSC) to check for crawl errors. This report shows you pages that Google was unable to crawl, which prevents them from being indexed in the search results. Make a habit of checking the “Page Indexing” report regularly; you’ll also get an email when GSC detects a new error on your site, so you can quickly look into it.

Overview of a Google Search Console page indexing report

In the example above, you can see there are 109 indexed and 657 not indexed pages. Sometimes this is intentional, but the report can also reveal pages that aren’t indexed that you would like to get indexed. What you often see is that when you request indexing for a page through GSC, it briefly shows up in the search results and then gets de-indexed if your website has technical issues or isn’t of high enough quality to meet Google’s standards for the specific query.

A quick way to see if there are any obvious technical issues preventing your page from being crawled or indexed is the URL Inspection Tool, which lets you check the status of a specific page: whether it’s currently indexed, indexable, blocked, or has any other status that matters.

Crawl Stats

Whenever I go through Google Search Console with one of our clients, I notice many people have no clue that you can see how often Google is crawling your site. A really easy way to tell is by going into Settings > Crawl Stats > Open Report. This shows you a quick overview of how many crawl requests there have been, how much has been downloaded and the average response time.

It’s important to note that Google will automatically crawl your website if it's not blocked from public access, and if you are a LaunchPad SEO customer at Bakklog our crawler will also crawl your website to flag any SEO issues that need attention and fixing.

You can request a free 7-day trial and get your first SEO issue report with a prioritized to-do list within 24 hours, helping you tackle the most meaningful issues on your site.

XML Sitemap: A Roadmap For Search Engines

An XML sitemap is an essential file if you want search engines to easily crawl & discover new, updated or removed pages on your website. A sitemap is essentially a roadmap that guides search engines to all the important pages on your site. Most popular CMS platforms like Squarespace, WordPress or Wix automatically create an XML sitemap and update it for you. So as you add pages, edit or update them, remove them, your sitemap will be updated to reflect these changes.

There are also higher-level sitemap index files, which bundle multiple sitemaps for pages, blog posts, images, etc. You often see this on websites that use Yoast, as the plugin automatically creates a sitemap index for you.

You can submit these in GSC, which tells search engines right away where your sitemaps are and prompts them to crawl them.

XML sitemap markup inside of a code editor for a WordPress website
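If you’ve never opened one, a sitemap is just an XML file listing URLs. A minimal sketch (with a hypothetical URL and date) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/technical-seo-guide</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>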

Robots.txt File

The robots.txt file contains rules on how search engines should crawl your website. Before we get into this, it’s important to note that crawlers don’t always respect these rules, but they offer guidelines that help crawlers understand which pages you want to keep from being crawled or indexed, such as an admin login page (often at /wp-admin for WordPress sites).

Ideally your robots.txt file should link to your XML sitemap, so that when search crawlers check the rules for crawling your website, they can also locate your XML sitemap, even if you haven’t submitted it through Google Search Console.

There are different rules that you can set using a robots.txt file, but commonly they include the following (a brief sketch follows the list):

  • Which spiders (crawlers) are allowed to crawl your site
  • How quickly they can crawl your website
  • Which directories or pages they can or can't crawl
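For example, here’s what a robots.txt for a hypothetical WordPress site might look like, covering all three kinds of rules (note that Google ignores Crawl-delay, though some other crawlers respect it):

User-agent: *
Crawl-delay: 10
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml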

Internal Linking Between Your Pages

Internal links are links between pages on your own website and a very important aspect of technical SEO. They allow web crawlers to discover new pages and build a map of the relationships between page topics. This is especially valuable if you have categories or subcategories: linking related topics to each other creates more context.

Internal links also help distribute page authority between related pages, which helps the overall ranking of each of those pages.

An often overlooked benefit of internal links is that they also help users discover new, relevant content. When you’re on a page and see a link to another page on the same site with descriptive anchor text, you might click it and go to the next page. From there, you might click another internal link and stay on the website longer, which increases session duration and is a great signal to Google that you offer valuable content.

Visualisation of internal links on a website with a differentiation between useful and not useful linking

So you might be wondering: how many internal links should I have? Aim for at least 2-3 internal links per page. It can be a lot more depending on how long the page is; if you have a few thousand words, you can definitely add more links.

If you have breadcrumbs on your pages, they take care of linking up to parent pages, but make sure to also link down to child pages and cross-link on the same hierarchy level. That makes for a robust internal linking strategy.

You’ve probably seen your fair share of 404 pages on websites you tried to visit, so you know how frustrating a user experience they are. Broken (404) links create a negative experience, so you’ll want to fix them as soon as possible.

Fix them by creating a 301 redirect to the next most relevant page. Don’t just redirect to the homepage; instead, find the most suitable replacement for the removed page so the redirect stays relevant to what was there before. You can also redirect to the category page, or, if there’s a parent page, redirect up one level.
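How you create a 301 redirect depends on your platform, and most CMS plugins handle it for you. On a self-hosted Apache server, a sketch in the .htaccess file (with hypothetical paths) would look like:

# Permanently redirect a removed blog post to its closest replacement
Redirect 301 /blog/old-removed-post /blog/closest-relevant-post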

Google Search Console or LaunchPad will flag 404 errors, so you can easily spot and fix them.

Bonus tip: pay attention to inbound backlinks that point to non-existent pages. You can use Google Analytics to find these. Say you’ve built up lots of backlinks but changed your pages over the years; those backlinks still point to the old (404) pages. Go into Google Analytics, look at which pages you’re getting traffic from and to, spot the non-existent pages, and redirect each old URL to a new one. This lets you reclaim lost traffic and wasted link equity, which is a fantastic quick win to boost your rankings.

Fix Redirect Chains

Now that we’ve mentioned redirects, another thing you’ll want to look into is redirect chains. Redirect chains happen when redirected pages are redirected again (and again, and again). Say you delete a page and redirect it to a new one, but a few months later you also delete the newer page and redirect it again. Yep, you just created a redirect chain without being aware of it.

Example: Page A -> Page B -> Page C

But why is this bad for SEO, and why do you need to fix redirect chains? There are a few reasons:

  • It causes slightly slower page load times
  • Redirect chains can increase your server load
  • Creates a poor user experience as users get redirected all over the place
  • Decreases the crawl performance
  • Decreases the link equity as with each redirect you lose a little bit of link juice

When you remove a page that already has a redirect pointing at it, cut out the redirects in the middle of the chain. In the example above, redirect Page A straight to Page C, without the middle step in between.
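Using the same hypothetical .htaccess style as before, flattening the chain means pointing every old URL directly at the final destination:

# Before: a chain (Page A -> Page B -> Page C)
# Redirect 301 /page-a /page-b
# Redirect 301 /page-b /page-c

# After: both old URLs point straight to the final page
Redirect 301 /page-a /page-c
Redirect 301 /page-b /page-c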

A visualization of a 301 redirect chain that negatively affects SEO

Duplicate Or Thin Content

Now that we’ve covered some of the more technical details, let’s look at what Google is looking for in terms of content and quality. It’s important to remember that you’re not just trying to get a higher ranking: Google’s goal is to provide the highest quality user experience possible. So what should your goal be? The same: provide the highest quality user experience. Duplicate or thin content ruins that.

What you want to do instead is serve high-quality content that is unique, useful, and original. Pages shouldn’t duplicate each other, and content should solve the problem or answer the question in enough detail for the user to get value. Your focus, again, is on providing value to the user.

But what is duplicate content? Duplicate content refers to pages with identical or highly similar content. If you have two pages covering the same topic in a slightly different way without adding any additional value for the user, that creates duplicate content. Thin content, on the other hand, refers to pages with minimal to no content that simply don’t have enough substance. Category pages or products with no descriptions are typical examples of thin content.

Keyword Cannibalization

Instead of having your pages compete for the same keyword, we want one unique topic per page. When you create a new page, ask yourself: “Do I already have a page or article about this topic?” If the answer is no, great: create the page. If the answer is yes, is the new page unique enough to warrant its own page? Is its topic specific enough to differentiate it from the other page?

What Is Keyword Cannibalization?

What we’re trying to avoid is keyword cannibalization. This happens when two pages are identical or very similar and Google has to guess which one it should rank in the search results. The pages can keep swapping places in the rankings, which creates a lot of confusion as two pages compete for the same keyword.

Canonical Tags

When we have duplicate content, or content that is very similar, we can use canonical tags to tell Google which of the duplicate pages to return in the search results while ignoring the rest. You can only have one canonical page: if you have Page A, Page B, and Page C, and you want Page B to rank for this keyword or topic, you make Page B the canonical page. On Pages A and C, you place a little code snippet telling Google that Page B is the canonical page you want to rank. Page B carries a canonical pointing to itself, and that’s how it works.
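Using hypothetical URLs, the snippets would look like this:

<!-- On page-a and page-c, pointing at the preferred page -->
<link rel="canonical" href="https://www.example.com/page-b">

<!-- On page-b itself (a self-referencing canonical) -->
<link rel="canonical" href="https://www.example.com/page-b">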

Another benefit of canonical tags is that they help avoid duplicate-content issues such as www/non-www URLs. A user will see the same content, but a search engine will treat the two as separate pages, so you’ll always want to redirect to either the non-www or www version of your website.

Another common issue is trailing slashes: the same page can load with or without a slash (or even several slashes) at the end of the URL. It doesn’t look any different to the end user, but again, it creates multiple pages in the eyes of a search engine.

<link rel="canonical" href="https://www.bakklog.com">
<link rel="alternate" hreflang="en" href="https://www.bakklog.com">
<link rel="alternate" hreflang="nl" href="https://www.bakklog.com/nl">

As our website is translated into Dutch for our customers in The Netherlands, we use canonical and hreflang tags like these to help Google index the right pages in the Dutch version of its search engine.

HTTPS (SSL Certificate)

Although most CMS systems like Squarespace, Wix, Shopify, and Webflow handle this for you, you still need to make sure that your website has a valid SSL certificate and always redirects to the HTTPS version of your website.

If you have a WordPress website, this is usually managed on the hosting side of things, and most hosting providers have an easy one-click installer which allows you to quickly set up an SSL certificate for your website. If you’re not entirely sure whether your website has an active HTTPS redirect or not, you can use the HTTPS (SSL) Redirect Checker from SEOptimer.
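If you manage your own Apache server, a common sketch for forcing HTTPS in the .htaccess file looks like this (your host’s one-click SSL installer usually adds an equivalent rule for you):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]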

Noindex - Telling Search Engines Not To Index Your Page

You can add a noindex tag to pages that should not show up in the search results via your SEO plugin, like Yoast or RankMath, or through a CMS-specific configuration. If your website is custom coded, you’ll have to ask your developer to do this for you. Only do this for pages that you absolutely do not want to rank on Google or other search engines.

It’s important to note that noindex tags are not a substitute for canonical tags. If you have three similar pages, noindexing two of them is not the right approach; use canonical tags instead. Noindex tags are suited to pages like admin pages, or pages you don’t want the public to find. That’s what noindex is for.
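Under the hood, these plugins simply place a robots meta tag in the page’s <head>. A minimal sketch:

<!-- Keep this page out of search results but still follow its links -->
<meta name="robots" content="noindex, follow">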

Always Avoid Thin Content

I know most of you like to write extremely long texts for the about page (that is that business owner excitement kicking in 😂), which you shouldn’t for so many reasons, but more on that in our article about 6 website elements you should remove.

What you want to avoid on all pages in general is having very few words. Once you start writing content for your website, make sure that every page has some unique, useful content on it, even if it is just 100 to 150 words describing what the page is about. It creates a positive user experience, answers the search query in the right way, or helps direct the visitor to the next steps they need to take to buy your product or use your services.

Empty pages make it difficult for users to navigate through your website, just like meaningless button text such as “click here” or “read more” means nothing and sets no expectation for the next page. Search engines also favor helpful content because it gives crawlers context around your pages.

Sometimes short content (thin content) does make sense.

An example of a page on X (Twitter) Help Center with short content that serves a clear search intent and is therefore not classed as a thin content page.

Page Loading Speed & Performance

Page loading speed is all about user experience. The speed at which a page loads has a significant impact on your Google rankings, but it also affects the user’s experience. Typically, people will go back to the search results if your page does not load within 2 to 3 seconds. Why? Because there are plenty of options out there that do load fast, and people get frustrated when your website is slow or takes too long to load.

It’s important for the user experience, but also extremely important for web crawlers. Crawlers from Google, Bing, or any other search engine may time out if a page takes too long to load, meaning they won’t go through your page at all and won’t index it in the search results.

A quick way to see how your website is doing in terms of loading speed and general performance is Google PageSpeed Insights. You’ll get a few optimization tips right away and can start looking at what’s hurting your page performance.

Google PageSpeed Insights

Optimizing Images: Converting Images To WebP Format

To instantly make your pages load faster, convert your images to a more web-friendly file format such as WebP. WebP images are heavily compressed yet retain their quality, allowing them to load really quickly. That’s why it’s important to use a next-gen file format such as WebP.

Another way to optimize your images is by resizing them. I see it happen all too often: people forget to downscale images they take with their phone or download from stock sites. By resizing images to a smaller format that fits the page, you decrease the file size significantly, which reduces the page’s loading time. Also set width and height attributes on your images to avoid layout shift.
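Putting those tips together, a well-optimized image tag (with a hypothetical file and dimensions) might look like this:

<!-- WebP format, explicit dimensions to prevent layout shift, lazy loading for images below the fold -->
<img src="/images/storefront.webp" alt="Our storefront in Amsterdam" width="800" height="533" loading="lazy">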

Third Party Scripts And Unused Plugins

If you’ve run a Google PageSpeed Insights test, it’s very likely that it gave you a warning along the lines of "Reduce the impact of third-party code". Many WordPress websites have way too many plugins, and Shopify stores can have too many apps, slowing down the general performance of the website and increasing the page load times.

A quick fix is to carefully review which plugins you really need and delete the ones you no longer use or can consolidate into another existing plugin. For WordPress websites you can also use W3 Total Cache or WP Rocket to add a layer of caching that speeds up your website. You can also use a tag manager like Google Tag Manager to manage your scripts from one place and avoid a cluttered mess of scripts, which would otherwise slow down your site as well.

Content Delivery Network (CDN)

On top of this, consider using a Content Delivery Network (CDN). A CDN distributes your website over multiple servers across the world so visitors load your site from the server nearest to them. Most CDN providers can also optimize media assets (such as images and videos) and add security features such as a WAF (web application firewall), HTTPS redirects, newer TLS standards, and more. You can find lots of information about this on the Cloudflare website.

Most managed platforms, like Wix, Squarespace, WordPress.com, or headless CMS systems like Storyblok, already have this built in. If your site is custom built, you may need to add a CDN yourself to speed up loading.

Other Technical SEO Aspects

We've covered most of the Technical SEO aspects, but here's a quick overview of other items you can start looking into once you've worked through the other items in this article.

  • Hreflang For International Websites: If your site targets multiple countries or languages, use hreflang tags to specify the intended audience. This helps search engines serve the correct version of your content to users based on their location and language preferences, avoiding duplicate content issues.
  • 404 Links: Regularly check for broken links and fix them promptly. Redirect these links to the most relevant page rather than the homepage to maintain user experience and SEO value.
  • Structured Data: Structured data enhances how your content appears in search results, enabling rich snippets like star ratings, images, or additional information. Implement Schema.org vocabulary via JSON-LD to provide search engines with detailed information about your pages (see the sketch after this list).
  • Noindex Tag & Category Pages In WordPress: Avoid cluttering search results by noindexing unnecessary tag and category pages using plugins like Yoast SEO or RankMath. This keeps your index clean and focused on valuable content.
  • Page Experience Signals: Optimize for user experience signals such as avoiding intrusive pop-ups, ensuring mobile-friendliness, and maintaining HTTPS. Tools like LaunchPad SEO can automatically flag technical SEO issues and provide actionable insights to improve your site’s performance.
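As an example of the structured data item above, here’s a minimal JSON-LD sketch for a blog article; the headline, author, and date are placeholders you’d swap for your own:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO: The Ultimate Beginner's Guide",
  "author": { "@type": "Person", "name": "Sophie Roos" },
  "datePublished": "2025-01-15"
}
</script>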

Conclusion

Technical SEO is key to your website’s performance and search engine rankings. Ready to take your site to the next level and need a full analysis? Get in touch. Contact us to request a technical SEO audit and we’ll help you find and fix any issues. We’re here to help you succeed online!

Boost Your Revenue and Explode Your Sales Beyond All Belief!

We've got something special for you (if you qualify)...

We know you've been burned by other agencies' empty promises, but we're different. The majority of our clients have been with us for many years, as we continue to grow their business to new heights year over year. If you don't get better results within 90 days, we work for free. Sound fair?

Be quick! FREE spots are almost gone for September


Written by

Sophie Roos


Sophie Roos is an SEO Specialist at Bakklog with a knack for turning search algorithms into opportunities. She excels in optimizing web content, driving organic traffic, and enhancing online visibility. Outside of the digital world, Sophie loves hiking and finds inspiration in nature’s trails, bringing a fresh perspective to every project she tackles. Her analytical mind and creative approach ensure that every endeavor reaches its full potential in the digital landscape.