
Local SEO For Roofing Companies: The Ultimate Guide


Summary

You’re great at what you do, but you’re struggling to get organic visibility because your competitors are dominating the search results. This guide will show you how local SEO can help you get more leads and grow your business. It covers keyword research with Semrush, backlink analysis, and technical SEO audits: the foundational work of local SEO. By implementing these strategies and using tools like Semrush, Screaming Frog, and LocalFalcon, you can improve your website and user experience and earn higher rankings. Don’t miss out on dominating your local market and growing your roofing business.

Picture this: your roofing company is the best-kept secret in your area, delivering amazing results that truly set you apart. You have the expertise, a great team, and a burning desire to outshine the competition. But there’s one missing ingredient: organic visibility. Whenever you search for a roofing company in your area, you’re immediately confronted with competitors outranking you on Google.

In today’s digital age, relying on word of mouth alone isn’t enough. Ads are becoming increasingly expensive and their results are diminishing, but there is a better and more sustainable way to grow. It’s time to tap into the power of local SEO and become the go-to roofer in your area.

This guide will give you the knowledge and strategies to make this a reality. We’ll cover the benefits of local SEO, on-page and off-page tactics, how to measure your results, how to choose an agency that understands local SEO, and answers to your questions.

Get ready to learn all you need to know and use these strategies to boost your online visibility. This will attract more leads and grow your roofing business. Don't miss the chance to dominate in your local market!

Getting Started With Local SEO For Roofing Companies

Before we dive into the groundwork, I want to give you a breakdown of some of the topics that we will be covering. Here's an overview of what we'll tackle in this local SEO guide:

  • Finding Your Current Keywords With Semrush
  • Keyword Research
  • Using the Semrush Backlink Analysis
  • Conducting a Technical SEO Audit

We start with our research process, as we do with any SEO campaign. That means we’re going to find keywords through keyword research, and we’re also going to identify link opportunities right away. While we’re at it, we’ll also conduct an SEO audit. All of this is the foundational work we need to do to flesh out our local SEO strategy.

If you don’t have access to any SEO software yet, we personally like working with Semrush, and you can start a free trial by clicking here.

Keyword Research Using Semrush

Once you’ve set up your trial, go to the Semrush Domain Overview, enter your website, and hit Enter. In our example we’ll be taking a look at a roofing company in Fort Lauderdale, FL called Maze Roofing: mazeroofingfortlauderdale.com

Once the analysis is complete and your data has loaded, go to Organic Research and click on Positions. From here, export these keywords and dump them into a Google Sheet. You should end up with something that looks like this: an overview of your current keywords and their positions.

Semrush keyword data uploaded as a Google Sheet

You can label this tab “Keywords” instead of “Sheet 1” for some clarity in the process, as we’ll start adding more tabs in the upcoming steps and our Google Sheet will become our SEO database.
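If you’d rather slice the export with a script than inside Sheets, here’s a minimal Python sketch of loading and sorting the keyword export. The sample rows and column names are assumptions, so match them to the header of your actual Semrush CSV:

```python
import csv
import io

# A few rows mimicking a Semrush "Organic Research > Positions" export.
# Column names here are assumptions; check the header of your actual file.
sample_csv = """Keyword,Position,Search Volume,URL
roofing company fort lauderdale,12,480,https://example.com/
roof repair fort lauderdale,4,320,https://example.com/repair
metal roofing,38,1900,https://example.com/metal
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))

# Sort by current position so the "almost ranking" keywords surface first
rows.sort(key=lambda r: int(r["Position"]))

for r in rows:
    print(r["Position"], r["Keyword"])
```

In practice you’d point `csv.DictReader` at the downloaded file instead of the embedded sample string.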

Untapped Keywords: Competitor Analysis

Now that we've exported our current keywords, it's time to take a look at some of the untapped keywords. These are keywords that at least one of your competitors ranks for, but you don't.

You can open the Competitors report in one browser tab and the Keyword Gap tool in another. My advice is to add four competitors here; you can either use the suggestions Semrush gives you or enter four businesses you know are your most important competitors. However you go about it, make sure you put in at least four competitors so we have enough data to work with.

Looking At The Untapped Keywords

As described earlier, the Keyword Gap tool reveals important opportunities: it highlights search terms your rivals rank for, but you don't. This insight shows where competitors outrank you and reveals ways to grow. We want to fill that topical authority gap to ultimately drive more traffic through your content.

Once you’ve finished the keyword gap analysis, click the Untapped button in Semrush, export these keywords, and put them in a separate tab inside your Google Sheet. At this point you should have one Keywords tab with your existing keywords and another named Keyword Gap containing the keywords your competitors rank for but you don't.

Semrush untapped keywords exported into our Google Sheet

Data Categorization Using Filters

Now that we have the data, we want to make sure we clean it up a little and make it easier to work with. To do this, we’ll add some very basic categorization. Let's start with our existing keywords.

Highlight column A, go to Format in the toolbar, and click on Conditional formatting. Under Format rules you’ll find plenty of options, but to keep it simple we’ll use “Text contains” and enter keywords that are highly relevant to our core offer. In this case we’ll enter “Fort Lauderdale”, as I’m using a Fort Lauderdale-based business for this example.

We then want to repeat this process for all the relevant secondary locations we want to target, such as Florida, Miami, Fort Myers, Orlando, and any other area relevant to the business. At this point we’ll have our most important commercial keywords highlighted, which will help us prioritize them later in this process. I've added a short video down below to show you how to add filters in your Google Sheet. Make sure you give every filter a different color so the keyword groups stand out from one another.
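The same categorization can also be scripted. Here’s a hedged Python sketch of the idea; the location list and keywords are examples you’d swap for your own data:

```python
# A scripted version of the conditional-formatting step: tag each keyword with
# the first target location it mentions. Locations and keywords are examples.
locations = ["fort lauderdale", "miami", "orlando", "fort myers", "florida"]

def tag_location(keyword):
    kw = keyword.lower()
    for loc in locations:
        if loc in kw:
            return loc
    return "generic"  # no target location found in the keyword

keywords = [
    "roofing company fort lauderdale",
    "roof repair miami",
    "metal roofing",
]

for kw in keywords:
    print(kw, "->", tag_location(kw))
```

You could then write the tag into a helper column and filter or color-code on it, just like the conditional-formatting approach.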

Now that we have our keyword data, it's time to dive into our backlinks. We added some filters and found some untapped keywords, which is a great start, but SEO doesn't end here.

Go back to Semrush and open up the Backlink Gap tool. This works much like the Keyword Gap tool, so you’ll want to enter your top four competitors. What I like to do is go into the search results for my primary keyword, in this instance "roofing company fort lauderdale fl", and find the four competitors ranking at the top of the SERPs. You can then throw them into this tool to see if they have backlinks that we don’t, which gives us the opportunity to acquire those backlinks (or similar ones) as we start working on improving our local SEO performance.

Once the analysis is done, export the data and create another tab in your Google Sheet called Backlink Gap and add in all the backlink data.

Conducting A Technical SEO Audit

Now that we have the data, we’re going to look at your website from a technical perspective and run a technical SEO audit. That means we’re going to use a tool like Screaming Frog, which will crawl your website and give you a bunch of data for every URL it finds. This will help you improve the technical performance of your website.

Once you've downloaded and installed Screaming Frog, enter your domain and then go to the API tab. I don’t have access to Google Analytics or Search Console for this website, as it’s not a client of ours, but we can still use two other tools to gather some of this data. One of them is PageSpeed Insights.

Enabling PageSpeed Insights For Screaming Frog

We can connect the PageSpeed Insights API, and what it does is pretty simple: it analyzes every single page on the site and reports its overall page loading speed, its Core Web Vitals (CWV) scores, and all the other good stuff that correlates strongly with better user experience.

To get some more SEO-related data, we can also connect Majestic, Ahrefs, or Moz. In this case I’m just going to connect the Moz API, but any of these will work to help you see the link equity and backlinks pointing to each individual page of the website. If you decide to use Ahrefs, you can also enable keywords and traffic in the settings to pull that data directly from Ahrefs, which gives us even more data about those URLs. It’s not as accurate as Google Analytics, since Ahrefs provides estimates, but it’s still useful data for making good decisions.

So let's go ahead and connect these APIs. You'll notice that as I go through this process, I change a lot of settings. I won't walk through every single setting here, but at least for the PageSpeed Insights part there are some settings you'll want to pick, because we only need a fraction of all the data it could gather. I pretty much unselect everything except a few helpful metrics, since I'm looking at the high-level stuff. Here's an overview of the settings I enabled (everything else, I disabled):

Pagespeed Insights Settings For Screaming Frog

CrUX Metrics:

  • Core Web Vitals Assessment

Lighthouse Metrics:

  • Performance Score
  • Speed Index Time (sec)

Diagnostics:

  • DOM Elements Count
  • JavaScript Execution Time (sec)

If you have a technical (web development/software) background, you can obviously go ahead and enable more of these features but since we're only doing a surface level audit, I think having just these few settings is more than enough data to work with.

Moz Settings For Screaming Frog

Next you'll want to open the Moz tab under API Access and enable the following settings under the Metrics tab:

URL (Exact URL http + https):

  • Page Authority
  • Total External Links

Domain:

  • Domain Authority

And that's all we need for now. If you have Google Analytics or Google Search Console access for your own website, which you should, then I recommend you integrate those as well. This will give you more accurate traffic data to use, as well as some other performance related metrics from your GSC (Google Search Console) account.

Screaming Frog SEO Spider API integrations enabled for PageSpeed Insights and Moz

We can now start running our Screaming Frog audit, but this will take a while. So in the meantime, let's keep going, look at the next important part, and understand your link profile.

With Screaming Frog doing its thing, we're going to circle back to Semrush. Under the Link Building section on the left-hand side, click Backlink Analytics. This will show you all the exact backlinks for a specific domain, in this case your website. If you enter any other website, you can see the data for that domain as well, so you can also look at competitors or other websites to learn from them.

Semrush backlink analytics overview of domain's backlinks

Let's go ahead and put these in a separate tab in our Google Sheet. That way we keep the data together, and we can add some filters and sort through the backlinks. For this specific website there are only two referring domains, which could be an indicator that it's a brand-new website and domain, or that they simply haven't done any SEO before and haven't added their website to any directories.

To demonstrate how to process your backlinks, however, I'm going to treat this the same way as any local SEO project. What I like to do is filter through this data, starting with the anchor text. We can click the "Create a filter" button inside our Google Sheet to get started.

Semrush backlinks exported into Google Sheet, adding filters

To get started, we sort the anchor text from A to Z; the reason for this is to check for over-optimization issues. When you have a lot of repeated anchor texts, that leaves a massive footprint which signals to Google that you are trying to artificially inflate your rankings. When our team runs a local SEO audit and spots something like this, it goes onto our action list, as we will have to redistribute the anchor text to reduce (or mitigate) the risk. If you have good variation in your anchor texts, you're on the right track.
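If you want to quantify this instead of eyeballing the sorted column, a small Python sketch can flag over-used anchors. The sample anchor texts and the 30% threshold are illustrative assumptions, not a hard rule:

```python
from collections import Counter

# Example anchor texts as they might appear in a backlink export (made up data).
anchors = [
    "roofing company fort lauderdale", "roofing company fort lauderdale",
    "roofing company fort lauderdale", "mazeroofing", "click here",
    "roof repair", "roofing company fort lauderdale",
]

counts = Counter(a.lower() for a in anchors)
total = sum(counts.values())

# Flag any anchor making up more than ~30% of the profile (threshold is a judgment call)
flagged = {a: c for a, c in counts.items() if c / total > 0.30}
print(flagged)
```

Anything flagged here would go on the action list for anchor-text redistribution.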

Local Pack Analysis And Listing Management

Before we dive into this section, it's important to understand what the local pack is. The Google Local Pack is a group of three local business listings. It appears in response to searches for products or services from local businesses. Unlike organic search results, which come from Google's index of websites, Local Pack listings come from the Google My Business directory. The screenshot down below is an example of this for the search query "roofing company in Fort Lauderdale, FL".

Google Local Pack example for roofing companies in Fort Lauderdale, FL

Once again you can go into Semrush, go to the Local section, click on Listing Management, and then enter the business. This will look at all of the main online citations and check how accurate your NAP information is across all of them.

What Are Citations In Local SEO?

Before we dive in, I understand you might have never heard of the term citations before. Citations are mentions of your business's name, address and phone number (usually referred to as NAP) on a website that isn't your own. So some of these websites are places like Yelp, BBB, YellowPages, Tripadvisor, and other directories. Citations are a pretty important starting point for any local SEO campaign as your citations help Google verify that you're actually located where your website says you're located.
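To make the idea of NAP consistency concrete, here’s a small Python sketch that normalizes two NAP records before comparing them. The business name comes from our example, but the address and phone number are made up purely for illustration:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a NAP record so cosmetic differences don't count as mismatches."""
    def clean(s):
        return re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()
    digits = re.sub(r"\D", "", phone)  # keep only the phone digits
    return (clean(name), clean(address), digits)

# Hypothetical listings: the website's NAP vs. a directory's version of it
site = normalize_nap("Maze Roofing", "123 Main St, Fort Lauderdale, FL", "(954) 555-0100")
yelp = normalize_nap("Maze Roofing Inc.", "123 Main St Fort Lauderdale FL", "954-555-0100")

print("match:", site == yelp)  # the name differs ("inc"), so this flags a mismatch
```

Punctuation and formatting differences are harmless, which is why we normalize first; what this catches is a genuinely different business name, address, or number across listings.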

Analyzing The Semrush Listing Management Data

Once Semrush has loaded all the citation and listing data, we can start looking at the listings we need to fix. The beauty of citations is that these are really easy fixes; it doesn't require a whole lot of effort or strategy, as it's just a matter of going through the list and cleaning up.

Semrush Listing Management helps you fix your citations (NAP info)

Since this is relatively low effort, it's not going to have a huge effect on your overall local SEO performance, but as with anything in SEO, every little optimization helps. It's something you can check off your list, knowing you've got it squared away and your NAP is accurate. Just don't expect a huge performance increase from fixing it, as this is a minor factor in your SEO efforts.

Exporting the data

Unfortunately the way Semrush works, we won't be able to just export all of this into a spreadsheet. Instead, what I recommend you do is copy the data from the Semrush page and paste it into ChatGPT, to then ask it to turn the data into a table. This is one of the instances where ChatGPT can help you make your life a little easier. I've recorded a quick video to show you how easy it is to get started:

You can then go ahead, create a new tab in your Google Sheet called Citations and paste the ChatGPT output in there. So at this point we'll have the following tabs in our Google Sheet:

  • Keywords
  • Keyword Gap
  • Backlink Gap
  • Backlinks
  • Citations

This gives us plenty of data to work with right now, but we're going to take it a step further and introduce LocalFalcon.

Discover how our team can drive more conversions and revenue for your website

  • SEO - not just traffic, but actual revenue.
  • Content Marketing - our team creates epic content that gets shared, attracts links, and drives conversions.
  • Google Ads - effective PPC strategies with clear ROI.

LocalFalcon: Our Favorite Local SEO Tool

The reason we use LocalFalcon is that it helps us benchmark our local pack rankings, and it does this (among other things) really well. There are plenty of rank trackers out there, but with most of them the ranking data is too simplistic to work with. A tracker might tell you "you're ranking number 3", but I prefer to see how much real estate we occupy and how much ground we've covered in our local SEO campaigns. The beauty of LocalFalcon is that it shows you how well you're ranking in specific areas of the city you're going after.

Once you've created your LocalFalcon account and selected a subscription, or started with the pay-as-you-go option (which gives you 100 free credits), go to Campaign Scan and create a new campaign. Within this Campaign Scan you'll enter all of your important details, such as your locations and the keywords you want to rank for. You can use some of the keyword data we extracted with Semrush earlier on. I always recommend a 9 x 9 grid with a 5.0 mi radius, but since this is just for demo purposes, we created a 5 x 5 grid with a 3.0 mi radius.
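To picture what such a grid scan covers, here’s an illustrative Python sketch that generates a 5 x 5 grid of coordinates around an approximate Fort Lauderdale center point. LocalFalcon does all of this for you, so treat the math as a mental model only:

```python
import math

# Sketch of a 5 x 5 scan grid: evenly spaced points around a center coordinate.
# The center coordinate is approximate and the conversion factors are rough.
center_lat, center_lng = 26.1224, -80.1373  # Fort Lauderdale (approx.)
radius_mi = 3.0
grid = 5

lat_per_mi = 1 / 69.0  # ~69 miles per degree of latitude
lng_per_mi = 1 / (69.0 * math.cos(math.radians(center_lat)))  # shrinks with latitude

step = (2 * radius_mi) / (grid - 1)
points = [
    (round(center_lat + (r * step - radius_mi) * lat_per_mi, 4),
     round(center_lng + (c * step - radius_mi) * lng_per_mi, 4))
    for r in range(grid)
    for c in range(grid)
]

print(len(points), "scan points; center:", points[12])
```

Each point represents a simulated searcher location, which is why a grid report tells you so much more than a single "you rank #3" number.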

If you do a one-time scan, you'll be able to schedule it and you'll get an email once the results are there. Once that's done running you can open up the Scan Report.

LocalFalcon scan report output for the keyword roofing company

LocalFalcon Scan Report And Recommendations

Looking at this scan for "roofing company", we see that they're not doing super well in the local pack for this really critical keyword, and there are a few reasons for this which I'll show you later.

One of the biggest ranking factors in the local pack is your address. If your address is not in the core location, it will be almost impossible to rank. In this case, the company has an address in Fort Lauderdale, so they're well equipped to start taking those local rankings.

Having the right category is another key ranking factor, so with LocalFalcon you'll be able to see if your category matches your top competitors. When it comes to the keywords, we would normally expand it and add the following keywords if this was one of our clients:

  • commercial roofing services fort lauderdale
  • residential roofing contractor fort lauderdale
  • roof inspection near me fort lauderdale
  • roof leak repair fort lauderdale

With some of these keywords being longer, they're easier to rank for as well. In the screenshot of the AI analysis below, you can see that LocalFalcon has given some of these recommendations too.

Recommendations from the LocalFalcon AI analysis of our scan report

This is a really good way to benchmark your current positions and you can then slowly build upon this over time. We run a weekly LocalFalcon scan for our clients, share the data with them, and keep improving the results as we go.

I understand it's a lot of tools and moving parts, but there's one more I always use: Siteliner. We love Siteliner because it detects duplicate content on your website and finds broken links really easily as well. We always run a premium scan, for which you'll have to load some credits, and once the scan is done we export any instances of duplicate content and broken links and put that data into our Google Sheet.

You can get started by visiting Siteliner: https://www.siteliner.com/

Once you've done that, Screaming Frog should be ready with its scan as well so we can go back to this and we'll begin to do a couple of exports.

Technical SEO: Understanding The Screaming Frog Data

Once you're back in Screaming Frog, click the Internal tab and click Export. Because we've left the filter on the "All" setting, this will export everything in one single file: the pages, images, CSS files, everything. You technically don't need to export everything, but in this case I'll export it all and filter it later. To keep the data clean, let's add another tab to our Google Sheet called "Crawl" or something similar, and paste in the data we just exported. As always, I like to freeze the top row and enable filters so we can easily filter the data when we get to that stage.

Screaming Frog crawl data added to our Google Sheet

Redirect & Canonical Chains

We usually find a lot of duplicate content on websites as well, and Siteliner can confirm this for us. Go back to Screaming Frog, click on Reports, go into Redirects, and export the Redirect & Canonical Chains report. Often the duplicate content is caused by categories, which is why we always do a technical audit first to make sure the website's structure checks out.

WordPress Category Pages

These categories are specific to WordPress, and you should not confuse them with ecommerce categories, as those are VERY different. On an ecommerce website you want your category pages to be highly optimized, indexed, and crawlable; they're critical pages for an ecommerce SEO campaign. When it comes to local business websites, however, the category pages WordPress creates are not valuable at all, and you should not be targeting your specific keyword phrases with them.

The example website we chose has no category pages enabled, so we're all good here. If you do find that you have a lot of URLs containing /category/, you'll want to get this sorted first. Sometimes a category page targets the right keyword, but that keyword should have its own individual page where you can properly build out the content and optimization. Categories are not appropriate for local SEO pages. If we wanted to keep categories, I would make them extremely broad with no location-specific modifiers, as those lead to a lot of over-optimization and keyword cannibalization.

Identifying Keyword Cannibalization

Similar to what we did with the keyword research, we're going to highlight the column and use conditional formatting again to identify instances of keyword cannibalization. For local SEO campaigns it's recommended you look at this on a per-location basis. You'll see that the website I'm auditing has no keyword cannibalization.

To help you understand what to look for when doing a local SEO audit, I'll still give you an example of keyword cannibalization. In the image below you'll see what it looks like in practice. Often you'll find that you shouldn't target the same keyword with a different page; instead, you're better off consolidating the content to improve your rankings and organic performance. Generally speaking, you want to focus on one primary keyword per page. An about page shouldn't be keyword-targeted at all; it should just be about you and your brand, as it doesn't match the intent of what a searcher is looking for.

An example of keyword cannibalization
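The detection logic itself is simple enough to sketch in Python. The URL-to-keyword mapping below is hypothetical; in practice you'd derive it from the titles in your crawl data:

```python
from collections import defaultdict

# Hypothetical crawl rows: (URL, the primary keyword we believe the page targets)
pages = [
    ("/", "roofing company fort lauderdale"),
    ("/services/roofing-fort-lauderdale", "roofing company fort lauderdale"),
    ("/roof-repair", "roof repair fort lauderdale"),
]

by_keyword = defaultdict(list)
for url, kw in pages:
    by_keyword[kw].append(url)

# Any keyword mapped to more than one URL is a cannibalization candidate
cannibalized = {kw: urls for kw, urls in by_keyword.items() if len(urls) > 1}
print(cannibalized)
```

Each flagged keyword is a consolidation candidate: pick one URL to keep targeting it and fold the competing page's content into it.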

Once you've done this for the titles in your crawl data, which we exported earlier using Screaming Frog, add a filter to the Meta Description column and highlight the pages that don't have one. Meta descriptions are not a big ranking factor, but it's certainly not a good idea to leave them completely blank in most cases, especially if you don't have a lot of content for the crawler to work with. To enable this filter, use the "Is empty" condition; I usually give it a red color to make sure these fields stand out.

Highlighted empty meta descriptions in our Google Sheet
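If you exported everything like I did, a short Python sketch can do this filtering too: keep only live HTML pages, then flag the ones with an empty meta description. The sample rows and column names are assumptions based on a typical Screaming Frog export, so verify them against your own file:

```python
import csv
import io

# A few rows mimicking a Screaming Frog "Internal > All" export.
# Column names are assumptions; check your own export header.
sample = """Address,Content Type,Status Code,Meta Description 1,H1-1
https://example.com/,text/html; charset=UTF-8,200,Great roofing.,Roofing Company
https://example.com/logo.png,image/png,200,,
https://example.com/about,text/html; charset=UTF-8,200,,
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Keep only indexable HTML pages, then flag the ones missing a meta description
pages = [r for r in rows if r["Content Type"].startswith("text/html") and r["Status Code"] == "200"]
missing_meta = [r["Address"] for r in pages if not r["Meta Description 1"].strip()]
print(missing_meta)
```

The same pattern works for empty H1s or missing titles: swap the column name and re-run.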

Once you've done this, we move on to the H1 tag, which is a really critical tag for on-page SEO. When I saw that there were a bunch of empty H1 tags, I knew there was going to be a problem, so I immediately highlighted them, as we need to tackle this pretty quickly. The project pages, the about page, the contact page, and others have no H1 tag on this website, so this will definitely hurt its SEO performance.

In our particular case there is a duplicate H1 tag as well, which is something you should always avoid as Google uses the H1 tag to understand what the page is about.

Canonical URL

Now that we've checked the meta title, description, and H1 tags, it's time to move on to the canonical URL. If you use your homepage as the canonical URL for every single page, you're telling Google that the preferred version of every other URL on your website is the homepage, which sends mixed signals and makes it very confusing for the search engine. In most instances you want a self-referencing canonical tag. For a quick way to see if a page has the right canonical tag, you can use the Detailed SEO extension (Chrome or Firefox) and look under the Canonical section.

Check the Canonical URL using the Detailed SEO extension for Chrome or Firefox

The canonical URL should usually be the same URL as the one you're on at the moment, as you can see demonstrated in the screenshot.
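If you'd rather check this programmatically than with a browser extension, here's a small Python sketch using only the standard library. The sample HTML and URL are illustrative:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Capture the href of the first <link rel="canonical"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# Hypothetical page source and the URL it was fetched from
html = '<html><head><link rel="canonical" href="https://example.com/services/"></head></html>'
page_url = "https://example.com/services/"

finder = CanonicalFinder()
finder.feed(html)

print(finder.canonical == page_url)  # True means a self-referencing canonical
```

Run this across every HTML page in your crawl and any `False` (or missing canonical) is worth a closer look.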

Wrapping Up The Audit

At this point we've got a pretty good idea of what we're working with on a broad level, and it's time to get more specific here. I'm actually going to start building the recommendation sheet, and I'll take you through the steps.

Building The Recommendations Sheet

The first thing you'll want to do is run your homepage through Google PageSpeed Insights, just to give you a secondary confirmation of the speed of this very important page on your site. In most cases, if the homepage is slow, you can safely assume that most of the other pages are going to be slow as well.

PageSpeed Insights Website Performance

For the Maze Roofing website, the mobile performance score sits at 65, which we really should get up to 80. Once you get above 80, the law of diminishing returns kicks in and the impact on your overall SEO and user experience is negligible, so if you're already at 80+ I wouldn't worry about this. Going from 80 to 90 won't give you a huge benefit, but you will see a massive benefit going from 30 to 80.

PageSpeed Insights results for mazeroofingfortlauderdale.com

We'll add this to our recommendations sheet, but let's keep moving and start doing a further review. What I like to do at this stage is do a qualitative analysis. Keep in mind that this is my own subjective opinion of things that I believe need to change on this website to:

  • Improve conversion rates
  • Create better search engine performance

If you have a good user experience and good conversion rates on your website, that will indirectly benefit your SEO performance as well. We know that Google uses Chrome data to inform how well a site should do in organic search, based on the signals collected from visitors on your website. What you want is people clicking around, scrolling, submitting forms, clicking your phone number, and interacting with your website. If they don't, this can hurt your SEO. Beyond SEO, it's important to consider the negative impact on your revenue as well if people don't convert or click your phone number.

Website User Experience Review

Navigation and call-to-action (CTA)

Now that we understand the performance of the website, we'll look at its user experience (UX). One thing that immediately stands out to me is that the navigation does not have a very strong call-to-action (CTA). The phone number is there, but it doesn't really stand out from the rest of the links. The contact page is not linked in the navigation, which is a missed opportunity. Another thing I'm missing is a link to some case studies, or in this case "Projects", where visitors should be able to see before-and-after comparisons that showcase the craftsmanship and attention to detail.

Stock photos vs. original photos

Otherwise, the website appears to use a lot of stock photos, which doesn't help the visuals either. If a lot of your competitors use the same stock photos, visitors will start to recognize them, and that won't do their trust in your business any good.

To get started, stock photos can be a good starting point, but generally speaking you'll want to have your own photos on your website instead.

Keyword stuffing

One thing that stands out in this website review as well is that the phrase "Roofing Fort Lauderdale" is being overused. If keyword stuffing were an Olympic sport, they'd probably get gold, haha! But quality over quantity is important. Most of the images are also missing alt text. Let's take a look at the website source code to get an understanding of what's happening under the hood, and see if anything pops out from a code perspective.

Website Code Review (Page Source)

One thing that immediately stands out is that they use the Yoast SEO plugin, so they do want to perform better in SEO, but as with most plugins there's only so much you can optimize until there are no more gains. Yoast helps you get the keyword into the meta title, description, alt tags, headings, and some other places, but relying on it too much can lead to over-optimization. Use it as a helpful guide, but don't consider it a strict rulebook.

Another thing that stands out is that there is no Google Analytics or Google Tag Manager tag on this website, so they won't be measuring any conversions or traffic unless they use another third-party analytics tool. One of the simplest ways to check this is to view the page source and look for the GA4 tracking snippet, which usually starts with <script async src="https://www.googletagmanager.com/gtag/js?id=…", so it's pretty easy to spot. If you instead find the older <script>(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){... snippet, that's the legacy Universal Analytics code, which stopped collecting data in 2023. In this case I would add this to my action list and set the website up on GA4 so we can have proper tracking in place.
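Here's a minimal Python sketch of that check. The page source below is a hypothetical example, though the gtag.js URL pattern it searches for is the real one GA4 uses:

```python
import re

# Hypothetical page source; a real GA4 install embeds this snippet in <head>.
source = """
<head>
<script async src="https://www.googletagmanager.com/gtag/js?id=G-AB12CD34EF"></script>
</head>
"""

# GA4 measurement IDs start with "G-"; this also extracts the ID if present
match = re.search(r"googletagmanager\.com/gtag/js\?id=(G-[A-Z0-9]+)", source)
print("GA4 found:", bool(match), "| ID:", match.group(1) if match else None)
```

You could run the same regex over the fetched source of each page in your crawl to confirm the tag is present site-wide, not just on the homepage.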

Bonus Tip: WordPress Plugin Detector

Next, I look at the plugins when it's a WordPress website and use this WordPress Plugin Detector to check which theme they use and which plugins. Some websites, especially when they have performance issues, simply have too many plugins installed which bloats the database and slows down the website. You can also find out which host they use using WPDetector, which can affect the performance of the website as well.

As this is not an actual client of ours, I'll leave it here, but you can go into much more detail when doing a technical audit and UX review of a website. We have an in-house UI/UX designer who helps us do these types of reviews for every client we work with.

By now I've created a list with recommendations in my Google Doc, which looks like this:

Google Docs document containing our website review bullet points

This gives us a starting point for our SEO strategy, and helps us understand what we need to change about the website to make the most out of our local SEO campaign. You can repeat the same process for some of the other pages, and make notes of the pages you want to improve on as you start working on your SEO campaign.

An important warning: I understand there are lots of cheap SEO agencies out there, and some seem much cheaper at first. The reason is that they don't go into as much detail; they simply optimize your texts and build a few backlinks. More expensive agencies typically strategize a lot more and focus on the broader marketing goals that drive revenue to your business. If you'd like to learn more about our SEO services, book a call. We won't sell you anything; we'll just ask a few questions to see if you're the right fit and give you the chance to ask us about our SEO processes and approach!

Discover how our team can drive more conversions and revenue for your website

  • SEO - not just traffic, but actual revenue.
  • Content Marketing - our team creates epic content that gets shared, attracts links, and drives conversions.
  • Google Ads - effective PPC strategies with clear ROI.

Building Our Dream Sitemap

Alright, so onto the next part! We're going to build a sitemap. Go ahead and create another tab in the Google Sheet, and keep it very simple. I usually create 4 columns:

  • URL
  • Service Focus
  • Location
  • Title

And as usual, I freeze the first row so that the headers stay visible as the list gets longer. What you want to do is build out your dream sitemap from scratch. We do this for every single campaign, even the ones that are doing well. If they've already filled all the content gaps we'll make note of it; if they haven't, that's where we'll be working.

I look at the services and relevant topics, but I also compare against some of the competitors in the top 3 of Google. You can visit their /sitemap.xml and check how they structured their pages and URLs. In this case I've kept the list short, but normally I would brainstorm a bit longer to find any topics we haven't covered yet on the website and build out my dream sitemap.
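If you'd rather not eyeball competitor sitemaps by hand, a standard sitemap.xml is easy to parse programmatically. Here's a small Python sketch using only the standard library; the URLs below are made-up placeholders, not any competitor's actual pages:

```python
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol.
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> entry from a standard sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", SITEMAP_NS)]

# Placeholder sitemap with two made-up service pages.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/roof-repair-fort-lauderdale/</loc></url>
  <url><loc>https://example.com/roof-replacement-fort-lauderdale/</loc></url>
</urlset>"""
for url in sitemap_urls(sample):
    print(url)
```

Paste the resulting URL list into your Google Sheet and you can compare competitors' page structures side by side with your own dream sitemap.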

Create your dream sitemap in a new tab in your Google Sheet to make sure you cover all the relevant topics

Creating Draft Content Using Surfer

So now that the sitemap is done, we're going to create a first draft using Surfer. With their content editor you can write the draft for each page yourself, or buy AI credits and let the AI produce the first draft for you.

I know ChatGPT and other AI tools have become wildly popular over the past two years, but I still recommend giving your content a human touch. Take the draft and go through it so that it matches your style of writing and doesn't have that AI "vibe".

Everyone can tell when a page leans on words such as "delve", "vibrant", "realm", "embark", "excels" or "vital", and reading AI-generated text isn't a great user experience either.

Once you have the content online, you have the linkable assets you need. One thing I recommend these days is to focus on quality and get the highest-quality backlinks possible. You can buy hundreds of cheap backlinks, but they won't move the needle.

Our link building services are not cheap, but if you have a little bit of budget to deploy, we can help you build website authority in a safe and effective way, by getting links from the best websites out there.

Contact us to learn more about our link building services.

If you don't have the budget yet, there are still ways to start getting backlinks for your business. There are a ton of local link building techniques you can use, but one of my favorite go-to's is to look for sponsorship opportunities. Go into Google, enter the city you're after and add something like "our sponsors", which in our case for Fort Lauderdale, FL would look like this: fort lauderdale + "our sponsors". You can also replace "our sponsors" with "our donors" or "donate to".
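These footprint queries are easy to generate for any city. A tiny Python sketch, using the three footprints mentioned above:

```python
def sponsorship_queries(city: str) -> list[str]:
    """Build Google footprint queries for local sponsorship prospecting."""
    footprints = ['"our sponsors"', '"our donors"', '"donate to"']
    return [f"{city} + {footprint}" for footprint in footprints]

# Print one ready-to-paste query per footprint.
for query in sponsorship_queries("fort lauderdale"):
    print(query)
```

Loop over every city you serve and you have a prospecting checklist in seconds.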

Once you find some websites, scan them to see whether they actually link out to their sponsors' websites. If they do, the next question is how much of an investment it's going to take. If you're doing this for your own business, you'll already know which causes you're happy to donate to, so build a list, narrow it down to the organizations you care about, and do some outreach to ask how much of an investment you need to put down to get listed on the website.
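You can speed up that outbound-link scan with Python's built-in HTML parser. A minimal sketch; the page snippet and domain names below are made-up examples:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkParser(HTMLParser):
    """Collects <a href> targets pointing away from the site's own domain."""

    def __init__(self, own_domain: str):
        super().__init__()
        self.own_domain = own_domain
        self.outbound: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            host = urlparse(href).netloc
            # Relative links have no host; same-domain links are not outbound.
            if host and not host.endswith(self.own_domain):
                self.outbound.append(href)

# Made-up sponsor page snippet: one external link, one internal link.
page = '<a href="https://sponsor-roofing.example.com/">Sponsor</a> <a href="/about">About us</a>'
parser = OutboundLinkParser("charity.example.org")
parser.feed(page)
print(parser.outbound)
```

If a sponsor page returns an empty list, it lists sponsors without linking to them, and you can deprioritize it in your outreach.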

There is one nuance: if you're doing this for clients, always make sure the client has authorized this type of outreach. What we do in our agency is build a big list for each client and send it over with the potential investment and the organizations they can donate to. All our clients have to do at that point is approve or reject the organizations that aren't a good fit for their brand. This is one of my favorite ways to get hyper-relevant, high-quality backlinks from trusted entities, which is the key.

Conclusion

Follow these steps and you’ll be on your way to boosting your roofing company’s online presence with local SEO. You’ll earn higher search engine rankings, attract more qualified leads and stand out in a crowded roofing industry.

Remember, local SEO is a journey and requires ongoing attention and effort to maintain and improve your rankings. But you don’t have to go it alone. At Bakklog we specialize in local SEO for businesses like yours. Our team is here to help you implement these strategies, optimize your site and get your business visible in local search.

Use our local SEO services and turn your expertise and great service into online growth. We’ll handle the technical stuff, content creation and link building so you can focus on what you do best – delivering great roofing services.

Ready to take your roofing business to the next level? Get in touch today to book a free strategy session. We’ll chat about your specific needs, answer any questions you have and show you how our tailored local SEO services can bring more traffic, leads and revenue to your business.

Boost Your Revenue and Explode Your Sales Beyond All Belief!

We've got something special for you (if you qualify)...

We know you've been burned by other agencies' empty promises, but we're different. The majority of our clients have been with us for many years, as we continue to grow their business to new heights year over year. If you don't get better results within 90 days, we work for free. Sound fair?

Be quick! FREE spots are almost gone for September


Written by

Chris Cordell


Chris Cordell, Performance Marketing Expert at Bakklog, orchestrates campaigns with a blend of strategy and heart. With a passion for connecting brands with audiences, he navigates the digital landscape with finesse. Outside the office, Chris is a devoted family man, cherishing moments with his wife and kids. When not immersed in marketing strategies, you'll find him revving his motorcycle engine or tearing up nearby racing tracks. Whether in the boardroom or on the road, Chris brings dedication and a love for life to everything he does.