How to audit your
competitors with free
tools and data
Sophie Gibson // Rise at Seven
@sophiegibson
@riseatseven
Competitor
Site Audits
No Enterprise
Solutions
Only Free
Tools
& Free Online
Data
A Case Study
(*not a client)
estimated
revenue loss
£314k
per month
How?
Category Duplication
Let’s dig
deeper
Indexing
https://www.next.co.uk/
sitemap-index.xml
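(A note on the sitemap count: the deck does this with the free version of Screaming Frog below, but a minimal Python sketch can do the same count directly - assuming the index and its child sitemaps are plain, publicly readable XML rather than gzipped files; the helper names here are purely illustrative.)

    # Count the URLs listed in a sitemap index by fetching each child sitemap.
    # A rough stand-in for the Screaming Frog "Download XML Sitemap Index" step;
    # assumes standard sitemap XML and uncompressed child sitemaps.
    import urllib.request
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def fetch_xml(url):
        with urllib.request.urlopen(url) as resp:
            return ET.fromstring(resp.read())

    def count_sitemap_urls(index_url):
        index = fetch_xml(index_url)
        children = [loc.text for loc in index.findall(".//sm:sitemap/sm:loc", NS)]
        return sum(len(fetch_xml(child).findall(".//sm:url/sm:loc", NS)) for child in children)

    print(count_sitemap_urls("https://www.next.co.uk/sitemap-index.xml"))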
FREE TOOL
Compare with
indexed pages
FREE TOOL
search operators:
site:
Limit results to those from a
specific website.
195k Sitemap
vs
569k Indexed
3X
as many URLs
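(The 3X figure is just the ratio of the two rough counts - the sitemap total from the step above and the approximate site: result count read off manually from Google. A quick sketch of that comparison:)

    # Rough bloat check: compare the sitemap URL count with Google's site: estimate.
    # Both numbers are the approximate figures from the deck, not live values.
    sitemap_urls = 195_000   # from the sitemap index count
    indexed_urls = 569_000   # approximate result count for site:next.co.uk

    print(f"Google reports roughly {indexed_urls / sitemap_urls:.1f}x as many URLs as the sitemap lists")
    # -> roughly 2.9x, i.e. about 3X as many URLs indexed as intended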
Why is this
happening?
FREE TOOL
search operators:
inurl:
Looks for a word or phrase in the
URL
Let's dig
deeper
How
BIG
of a problem?
Let’s find
out
WEBSITE LINKS COUNT
CHECKER
https://smallseotools.com/website-links-count-checker/
Is this a
BIG
problem?
search operators:
inurl:
Looks for a word or phrase in the
URL
https://smallseotools.com/website-links-count-checker/
30% of indexed URLs
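(The 30% figure comes from dividing the rough inurl: result count - around 165k according to the speaker notes - by the ~569k site: total:)

    # Share of indexed URLs hit by the duplication pattern, using the deck's rough counts.
    duplicate_urls = 165_000   # approx. results for the inurl: duplication pattern
    indexed_urls = 569_000     # approx. results for site:next.co.uk

    print(f"{duplicate_urls / indexed_urls:.0%} of indexed URLs")  # ~29%, i.e. roughly 30%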
Check
assumptions
Issue
FREE TOOL
How
BIG
of a problem?
Let’s find
out
search operators:
inurl:
Looks for a word or phrase in the
URL
What else?
ISSUE:
Analytics &
Attribution
inurl:promotion
Indexing
Let’s find
out
(a different)
search operator:
intitle:
Search in the page's title for a
word or phrase. Use exact-match
(quotes) for phrases.
20% of all page titles indexed
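(Same arithmetic again: the intitle: search for the boilerplate title returned around 123k results, against the ~569k indexed pages:)

    # Share of indexed pages carrying the boilerplate title, using the deck's rough counts.
    boilerplate_titles = 123_000   # approx. results for intitle:"from the next uk online store"
    indexed_urls = 569_000         # approx. results for site:next.co.uk

    print(f"{boilerplate_titles / indexed_urls:.0%} of indexed page titles")  # ~22%, roughly the 20% rounded on the slide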
Now what?
Use statistics
from case
studies
Finding Case
Studies
Search
Operators
inurl:
Looks for a word or phrase in the
URL
intitle:
Find pages with a certain word (or
words) in the title.
inurl:case-study
or
intitle:"case study"
index bloat "case study"
Fixing index bloat
achieved
● A 22% increase in
organic traffic
● 7% increase in organic
revenue
https://www.goinflow.com/index-bloat/
7% increase =
£124k
pm
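(How the £124k is reached: applying the case study's 7% organic revenue uplift to the ~£1.78m estimated monthly organic value referenced in the speaker notes - an estimate, not reported revenue:)

    # Putting a £ figure on fixing index bloat, using the case study's uplift.
    estimated_monthly_organic_value = 1_780_000   # SpyFu-based estimate from the deck, £ per month
    case_study_uplift = 0.07                      # 7% organic revenue increase

    print(f"£{estimated_monthly_organic_value * case_study_uplift:,.0f} per month")  # ~£124,600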
ISSUE:
404s
Is this a
BIG
problem?
Let’s find
out
Use statistics
from case
studies
Only 23% of visitors that encounter a
404 page make a second attempt to
find the missing page.
https://www.impactbnd.com/blog/intelligent-404-pages
77%
loss
FREE TOOL
https://www.similarweb.com/
If only 1% of
traffic is directed to
a 404 page...
260k
visitors
Losing 77% of
people here….
200,200
visitors
lost
Industry avg 2.35%
conversion rate….
4700
sales
lost
At a £30 average
order value….
£141k
pm
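(The 404 figure chains the deck's assumptions: ~26m monthly visits from SimilarWeb, 1% of them hitting the 404 page, 77% of those abandoning, a 2.35% industry-average conversion rate and a ~£30 average order value:)

    # Back-of-envelope cost of the weak 404 page, using the deck's assumptions.
    monthly_visits = 26_000_000              # SimilarWeb traffic estimate
    hit_404 = monthly_visits * 0.01          # 1% reach a 404 -> 260,000 visitors
    lost_visitors = hit_404 * 0.77           # only 23% retry -> 200,200 lost
    lost_sales = lost_visitors * 0.0235      # 2.35% conversion rate -> ~4,700 sales
    lost_revenue = lost_sales * 30           # £30 average order value

    print(f"{lost_visitors:,.0f} visitors, {lost_sales:,.0f} sales, £{lost_revenue:,.0f} per month lost")  # ~£141k pm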
FREE TOOL
Tag Assistant by
Google
FREE TOOL
Need to create
an account
Ten free checks
Use statistics
from case
studies
13 tags on
the site....
442ms
total
Reduced number
of tags by 50%...
221ms
reduction
2.21%
increase in
revenue
£39k
pm
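(The tag-bloat estimate follows the deck's figures: ~34ms of load time per third-party script, 13 tags on the page, and - implied by the 221ms → 2.21% step - roughly 1% of revenue per 100ms saved, applied to the same ~£1.78m monthly organic value:)

    # Estimated monthly gain from halving the number of tracking tags.
    ms_per_tag = 34
    tags_on_page = 13
    total_tag_cost_ms = ms_per_tag * tags_on_page        # 442ms in total

    ms_saved = total_tag_cost_ms * 0.5                   # 221ms if tags are cut by 50%
    revenue_uplift = (ms_saved / 100) * 0.01             # 2.21%, assuming ~1% revenue per 100ms saved
    monthly_gain = revenue_uplift * 1_780_000            # applied to the ~£1.78m monthly estimate

    print(f"{total_tag_cost_ms}ms total, {ms_saved:.0f}ms saved, £{monthly_gain:,.0f} per month")  # ~£39k pm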
+124k
+141k
+39k
£304k
Figures are
estimated
Not
technically a
loss
BUT
Don’t worry
about getting
it all ‘right’
Enterprise
sites aren’t
scary
Click around
a site
What can I
actually do
with this info?
Bolster
client
pitches
Use this data to
make assumptions
on processes or
timings
Convince your
boss or client
for resources
Thanks!
@sophiegibson
@riseatseven

Competitor Site Audits with Free Tools and Data - Sophie Gibson - BrightonSEO 2020

Editor's Notes

  • #3 I'm here to talk to you about….. using free tools and data to audit a competitor site from a technical POV, put a ££ figure on their issues, and use this to your advantage. Or…. (competitor site audits… for free) - Maybe I could go into some detail on why some people might be limited to free tools - they're in-house, or they're freelance. I want to run through three main issues that I found from my own site audit, using…
  • #4 using… zero enterprise solutions
  • #5 And using only free tools
  • #6 And free online data
  • #7 Just a caveat - this is a case study, but they aren’t a client.
  • #8 This is a case study of the Next.co.uk website - but just to clarify, I don’t work with or for Next; -Add “why next?”
  • #9 Or how I found an estimated loss of revenue of
  • #10 This is a story about a site audit I completed, which found approximately £314,000 of issues
  • #11 How did I find these issues - and how did I estimate the value.
  • #12 Let's investigate: so this was the homepage - I noticed that they had these banners on the homepage linking to specific trends - now this was November - which is great for those more specific searches you get
  • #13 Found a nice category page here… but what is this strange URL? Let's take a closer look
  • #14 So this has /0-homepage at the end of the URL - huh, that is weird, I wonder what happens when you take this part off…..
  • #15 Looks familiar huh? How similar?
  • #16 Almost the same amount of products
  • #17 Almost the same amount of products
  • #18 So if there are two identical pages, how big of an issue is this for them? So let's have a look at……. indexing:
  • #19 Yep, Google is free, and can uncover
  • #20 The first thing to do with any indexing check is to figure out how many pages are in the sitemap: so I visited the robots.txt file and grabbed the URL - it's an index, so let's have a look inside
  • #21 Because when I visited the sitemap - it's such a big site that they have split it into a bunch of separate sitemaps. How do I figure out how many pages are in the sitemap? To do that we can use…
  • #22 …. A free tool. So, I want to
  • #23 Using Screaming Frog - I'm talking about the free version here - which does have a limit, as you can crawl only 500 URLs - but we don't need to actually crawl the site - we just need to see how many pages are in the sitemap… which you can do by...
  • #24 Then upload - and you can either select download sitemap, or download sitemap index - seeing as we have a sitemap index, we're going to select that option
  • #25 A pop up box will appear...
  • #26 And you plug in your sitemap URL - here’s our next sitemap URL
  • #27 We just need to see how many pages are in the sitemap - we don't even need to press 'OK' - because we don't need to actually crawl the site - as I mentioned, with the free version you can crawl only 500 URLs
  • #28 Basic audit checks - does their sitemap roughly reflect the number of URLs indexed in Google? (Yes, this isn't really accurate)
  • #29 Basic audit checks - does their sitemap roughly reflect the number of URLs indexed in Google? (Yes, this isn't really accurate)
  • #30 …. A free tool to check indexed pages in google is….
  • #31 Google search operators
  • #32 So let’s dig deeper - if they’re getting links from the homepage wrong, what else are they getting wrong?
  • #33 Basic audit checks - does their sitemap roughly reflect the number of URLs indexed in Google? We can use the site: search operator here (Yes, this isn't really accurate and can fluctuate, but as a general rule of thumb this is useful)
  • #34 So that's about five hundred and sixty-nine thousand results
  • #35 Basic audit checks - does their sitemap roughly reflect the number of URLs indexed in Google? (Yes, this isn't really accurate)
  • #39 …. A free tool. So, I want to
  • #40 So let’s dig deeper - if they’re getting links from the homepage wrong, what else are they getting wrong?
  • #41 So let’s dig deeper - if they’re getting links from the homepage wrong, what else are they getting wrong?
  • #42 Yep - gross isn’t it - look at that meta. Let’s have a look to see how many pages with this in the title are indexed…. -
  • #43 Let's have a look at how many are indexed - over 4,000 URLs. So, considering how many links are on the homepage (it's not 4,000, for sure), that must mean either they've been there a long while, or the issue is widespread across the site.
  • #44 So let’s dig deeper - if they’re getting links from the homepage duplicated - where else are key contenders for duplication issues?
  • #47 They have a master page with an A-Z list - if this is just A - how many potential duplicates are there going to be?
  • #48 Let's find out, using a...
  • #49 Let's find out - so to do this, we're going to need (next slide)
  • #50 …. A free tool. So, I want to
  • #52 - and if every single category or filter has this tag on…. This could be thousands of additional URLs being indexed. But to be sure…..
  • #53 After we’ve got our number, we now need to compare this with the number of
  • #54 https://ahrefs.com/blog/google-advanced-search-operators/
  • #55 Back to our handy inurl search operator
  • #58 So there are about 165 thousand results -
  • #59 And when you look at how many results there were for the whole site - this is a massive…..
  • #60 30% of indexed URLs
  • #61 So let's dig deeper - let's look at specifics, just to check our assumptions
  • #62 Brand example - from Mela - now there are 496 results, and only the first two have custom metadata - the rest look auto-generated. So, why do we think this is the issue?
  • #63 So let's dig deeper - let's look at specifics, just to check our assumptions
  • #64 Going back to our party wear page. We’ve got some category selections - if they’ve got a lot of index bloat, what are the reasons why we have multiple categories coming up - with ecom, one issue which always pops up is how filters are handled
  • #65 Let's pick jumpsuits, for example
  • #66 Now look what happens to the URL - it changes - now, this doesn't mean anything on its own - so we can use a nice free tool to check this out...
  • #67 …. Now we need some help from… A free tool.
  • #68 …. Now we need some help from… A free tool.
  • #69 SeeRobots will visually show you at a glance what the index status of a particular page is - it will show a coloured square, which means the following
  • #70 So not only does the URL change - the page has an index, follow tag on this, which means this URL is accessible and indexable for Google
  • #71 - and if every single category or filter has this tag on…. This could be thousands of additional URLs being indexed. But to be sure…..
  • #72 Lets find out - so to do this, we’re going to need (next slide)
  • #73 https://ahrefs.com/blog/google-advanced-search-operators/ search operators again!
  • #74 Back to our handy inurl search operator
  • #76 So I used this ‘isort’ term, and used the inurl: operator to find out how many URLs were affected by this issue.
  • #77 So I used this ‘isort’ term, and used the inurl: operator to find out how many URLs were affected by this issue. Now, it’s not as big as it could potentially be
  • #78 Now, seeing that Google states there is no information for this page - this makes me think that these i-sort terms have been excluded in the robots.txt file, which means a large majority of these pages may not be crawled - but as they are linked internally, this does have the potential to become an issue later down the line. But...
  • #79 What else could be causing these issues?
  • #80 Now, going back to our partywear page - there is also the word promotion in this. I clicked around from the same banner area, and found other pages with the same structure… so to check this, we go back to…
  • #82 You guessed it, back to our handy inurl search operator
  • #83 Here are the results for that search,
  • #84 With around 25 thousand results for URLs with this in the URL - a much bigger issue.
  • #85 And this led me to notice this quirky meta title data, which is listed on these top search terms: 'From the Next UK…' - hmm, okay, I think we all know what my next question is..
  • #86 Lets find out - so to do this, we’re going to need (next slide)
  • #87 https://ahrefs.com/blog/google-advanced-search-operators/ search operators again! But different - if I want to
  • #88 Back to our handy inurl search operator
  • #89 Now the page title I found was a little longer “from the next uk online store”
  • #90 And this had 123 thousand results -
  • #91 And when you look at how many results in google there were when doing a site search - this is a massive…..
  • #92 20% of page titles indexed have this super-long title - which could mean they are losing out on traffic or it's affecting click-through rate, seeing as non-optimised page titles are widespread.
  • #93 So now we've got all of this information about issues on the site - what do we need to do next?
  • #94 Yep, we need to show the money - we need to try and put a figure on these issues, so we’re gonna need some stats and values from somewhere. Where….
  • #95 You can use statistics from case studies.
  • #96 But how do I find these case studies?
  • #97 Think with Google have a nice selection of case studies with useful stats in
  • #99 You can do a regular search for this - but sometimes I find the articles brought back aren’t always what I’m looking for - so we can go back to our trusty search operators - we can use….
  • #100 In url… or
  • #101 In title
  • #102 So let’s dig deeper - if they’re getting links from the homepage wrong, what else are they getting wrong?
  • #103 Or for this, I just used exact-match terms
  • #104 An article, which said they had achieved a 22% increase in traffic and a 7% increase in revenue from organic. But what does that mean in terms of solid £££?
  • #105 …. Now we need some help from… SpyFu, a competitor keyword tool...
  • #106 And you click on
  • #107 And you click on the estimated monthly clicks section
  • #108 It will bring you to the SEO overview tool. Here you can see a trend of how many keywords they're ranking for. I noticed this drop-off - so I can see that they went from having 78k keywords ranking in February
  • #109 And in May this dropped to 25k - so, something happened there.
  • #111 Going back to the study we found, that site had a 7% increase in organic revenue - a 7% increase on £1.78m =
  • #112 This would net you roughly an extra £124k - which you are currently just throwing away.
  • #114 When I was looking around I stumbled upon the 404 page - which wasn't great. No menu on the page - three static links to other category pages. Frustratingly, the logo didn't take you to the homepage either.
  • #115 - and if every single category or filter has this tag on…. This could be thousands of additional URLs being indexed. But to be sure…..
  • #116 Lets find out - so to do this, we’re going to need (next slide)
  • #117 You can use statistics from case studies.
  • #118 Found this in a study, which found that only 23%
  • #119 Which is a 77% loss in traffic - but how do we find out how many people might reach a 404 page?
  • #120 …. A free tool. So, I want to
  • #121 Competitor
  • #126 Back to our statistics of - losing
  • #128 Back to our statistics - I found a case study which showed that ecommerce websites in the fashion industry typically have a 2.35% conversion rate
  • #130 And, taking a look at the general price of products, I took a guesstimate of around £30 for the average order value, as we don't have any more specific data
  • #132 …. A free tool. So, I want to
  • #134 All non-standard implementations - site tags firing 4 pageview requests - category page with 13 tags! Multiple Analytics & GTM containers. Too many tracking codes for the same thing cause inaccurate Analytics data. If you don't know which online channels are working, how do you know where to best invest the marketing budget? An increase in third-party tracking scripts also slows down page loading speed.
  • #135 …. A free tool. So, I want to
  • #136 To Semrush
  • #137 You need to create an account
  • #138 And you get 10 free checks
  • #139 According to SEMrush, 70% of traffic has ZERO referral data, going into the direct bucket - this means that potentially they have 11m sessions without attribution - need to check GA for specific figures
  • #141 This study showed that each script added to a page increases load time by 34.1 ms
  • #142 If there are 13 tags on the site, this adds up to…
  • #143 442 ms in total
  • #144 Back to our statistics - I found a case study which showed that ecommerce websites in the fashion industry typically have a 2.35% conversion rate
  • #146 34ms per tag × 13 tags on the site = 442ms. Reducing scripts by just 50% = a 2.21% potential increase in revenue
  • #148 39k pm
  • #149 If you add all this together you get….
  • #150 A net loss of around £304k per month
  • #151 Yes, figures are estimated - these findings are not going to be super accurate without data access or paid tools, and we're making a lot of assumptions.
  • #152 And technically, they’ve not ‘lost’ this revenue, as these figures are actually potential revenue they’re missing out on due to specific issues.
  • #153 But that being said
  • #154 A pitch that is rough and ready with good insights can make a big difference, compared with something super accurate which took 10x longer and has less content - especially if you’re telling a story to get someone else on board, or if you’re super strapped for time.
  • #155 You don't need an arsenal of tools to be able to audit large, enterprise-level sites. Don't be intimidated by their size.
  • #156 You can find a lot of issues just clicking around a website - Once you’ve found one main thread, it’s easier to find related issues from the same angle.
  • #158 You can use this information to bolster client pitches - “If we can figure out what issues you might be having off external data sources, imagine what we can do with access to your own data”
  • #159 We can use this information to add additional insight into pitches - we can infer hidden problems by making some assumptions. For example, in this case we might be able to infer that teams may not be working together as well as they could, as it looks like the Marketing department are potentially creating new banners and 'featured categories' without SEO in mind. Automated meta is still being used in sections, so there's no process when adding. You can use this information to show them other services which you could help with. Create workflows / support the Next.co.uk product team with maintaining site health
  • #160 Nothing gets resources or focus in the areas you want to go for, better than telling your boss you’d be getting one over on the competitors if they can swoop in RIGHT THIS SECOND