Summary

The site audit found several issues that could negatively impact the user experience and search engine optimization of the site. Specifically, it found 169 broken internal links, 128 incorrect pages in the sitemap, and 85 pages with duplicate content. The audit recommends reviewing and fixing all issues to improve the site's crawlability, rankings, and user experience.

Site Audit: Issues

facilio.com

Generated on October 3, 2022


800 BOYLSTON STREET, SUITE 2475, BOSTON, MA 02199

WWW.SEMRUSH.COM

Site Audit: Issues

Subdomain: facilio.com
Last Update: October 3, 2022
Crawled Pages: 638

facilio.com: 414 total issues (no change since the previous crawl)

169 internal links are broken


About this issue: Broken internal links lead users from one website to another and bring them to non-existent webpages. Multiple broken links negatively affect user experience and may worsen your search engine rankings because crawlers may think that your website is poorly maintained or coded. Please note that our crawler may detect a working link as broken. Generally, this happens if the server hosting the website you're referring to blocks our crawler from accessing this website.

How to fix: Please follow all links reported as broken. If a target webpage returns an error, remove the link leading to the error page or replace it with another resource. If the links reported as broken do work when accessed with a browser, you should contact the website's owner and inform them about the issue.
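
For large link lists, this re-check can be scripted. Below is a minimal sketch in Python (standard library only), assuming the flagged URLs have been exported to a plain-text file, one URL per line; the file name broken_links.txt and the User-Agent string are illustrative, not part of the report:

    import urllib.error
    import urllib.request

    def status_of(url):
        """Return the HTTP status code for url, or None if the request fails outright."""
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            return e.code  # 4xx/5xx responses still carry a status code
        except OSError:
            return None  # DNS failure, timeout, refused connection, etc.

    with open("broken_links.txt") as f:
        for url in (line.strip() for line in f if line.strip()):
            # A 200 here suggests the audit crawler was blocked rather than the link being dead.
            print(status_of(url), url)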

128 incorrect pages found in sitemap.xml


About this issue: A sitemap.xml file makes it easier for crawlers to discover the pages on your website. Only good pages intended for your visitors should be included in your sitemap.xml file. This error is triggered if your sitemap.xml contains URLs that: 1. lead to webpages with the same content; 2. redirect to a different webpage; 3. return a non-200 status code. Populating your file with such URLs will confuse search engines, cause unnecessary crawling or may even result in your sitemap being rejected.

How to fix: Review your sitemap.xml for any redirected, non-canonical or non-200 URLs. Provide the final destination URLs that are canonical and return a 200 status code.
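
This review can be partly automated. A minimal sketch (Python standard library) that fetches a sitemap and reports every URL that redirects or returns a non-200 code; the sitemap location is an assumption, not taken from the report:

    import urllib.error
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://facilio.com/sitemap.xml"  # assumed location
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # surface 3xx responses as errors instead of following them

    opener = urllib.request.build_opener(NoRedirect)

    with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
        root = ET.fromstring(resp.read())

    for loc in root.iterfind(".//sm:loc", NS):
        url = loc.text.strip()
        try:
            code = opener.open(url, timeout=10).status
        except urllib.error.HTTPError as e:
            code = e.code  # includes 3xx, since redirects are not followed above
        except OSError:
            code = None
        if code != 200:
            print(code, url)  # candidate for removal or replacement in the sitemap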

85 pages have duplicate content issues


About this issue: Webpages are considered duplicates if their content is 85% identical. Having duplicate content may significantly affect your SEO performance. First of all, Google will typically show only one duplicate page, filtering other instances out of its index and search results, and this page may not be the one you want to rank. In some cases, search engines may consider duplicate pages as an attempt to manipulate search engine rankings and, as a result, your website may be downgraded or even banned from search results. Moreover, duplicate pages may dilute your link profile.

How to fix: Here are a few ways to fix duplicate content issues: 1. Add a rel="canonical" link to one of your duplicate pages to inform search engines which page to show in search results. 2. Use a 301 redirect from a duplicate page to the original one. 3. Use rel="next" and rel="prev" link attributes to fix pagination duplicates. 4. Instruct GoogleBot to handle URL parameters differently using Google Search Console. 5. Provide some unique content on the webpage. For more information, please read these articles: https://support.google.com/webmasters/answer/66359?hl=en and https://support.google.com/webmasters/answer/139066?hl=en.
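
The 85% threshold can be approximated locally before deciding which fix to apply. A rough sketch using difflib from the Python standard library; the report's exact similarity metric is not documented here, and the input file names (visible text extracted from two pages) are illustrative:

    import difflib

    def similarity(a, b):
        """Return a 0..1 ratio of how similar two text extracts are."""
        return difflib.SequenceMatcher(None, a, b).ratio()

    with open("page_a.txt") as fa, open("page_b.txt") as fb:
        ratio = similarity(fa.read(), fb.read())

    print(f"{ratio:.0%} similar")
    if ratio >= 0.85:
        print('likely duplicates: point rel="canonical" or a 301 redirect at the preferred page')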


13 issues with broken internal JavaScript and CSS files


About this issue: A broken JavaScript or CSS file is an issue that should be watched out for on your website. Any script that has stopped running on your website may jeopardize your rankings, since search engines will not be able to properly render and index your webpages. Moreover, broken JS and CSS files may cause website errors, and this will certainly spoil your user experience.

How to fix: Review all broken JavaScript and CSS files hosted on your website and fix any issues.

9 pages returned 4XX status code


About this issue: A 4xx error means that a webpage cannot be accessed. This is usually the result of broken links. These errors prevent users and search engine robots from accessing your webpages, and can negatively affect both user experience and search engine crawlability. This will in turn lead to a drop in traffic driven to your website. Please be aware that our crawler may detect a working link as broken if your website blocks our crawler from accessing it. This usually happens due to the following reasons: 1. a DDoS protection system; 2. an overloaded or misconfigured server.

How to fix: If a webpage returns an error, remove all links leading to the error page or replace it with another resource. To identify all pages on your website that contain links to a 4xx page, click "View broken links" next to the error page. If the links reported as 4xx do work when accessed with a browser, you can try either of the following: 1. Contact your web hosting support team. 2. Instruct search engine robots not to crawl your website too frequently by specifying the "crawl-delay" directive in your robots.txt.

5 pages with a broken canonical link


About this issue: By setting a rel="canonical" element on your page, you can inform search engines of which version of a page you want to show up in search results. When using canonical tags, it is important to make sure that the URL you include in your rel="canonical" element leads to a page that actually exists. Canonical links that lead to non-existent webpages complicate the process of crawling and indexing your content and, as a result, decrease crawling efficiency and lead to unnecessary crawl budget waste.

How to fix: Review all broken canonical links. If a canonical URL applies to a non-existent webpage, remove it or replace it with another resource.
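
A page's canonical target can be spot-checked with the Python standard library. A minimal sketch; the page URL is illustrative:

    import urllib.error
    import urllib.request
    from html.parser import HTMLParser

    PAGE_URL = "https://facilio.com/"  # illustrative

    class CanonicalFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

    with urllib.request.urlopen(PAGE_URL, timeout=10) as resp:
        finder = CanonicalFinder()
        finder.feed(resp.read().decode("utf-8", "replace"))

    if finder.canonical:
        try:
            # Assumes an absolute canonical URL, which is the common case.
            code = urllib.request.urlopen(finder.canonical, timeout=10).status
        except urllib.error.HTTPError as e:
            code = e.code
        print(finder.canonical, "->", code)  # anything other than 200 needs attention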

2 pages returned 5XX status code


About this issue: 5xx errors refer to problems with a server being unable to perform the request from a user or a crawler. They prevent users and search engine robots from accessing your webpages, and can negatively affect user experience and search engines' crawlability. This will in turn lead to a drop in traffic driven to your website.

How to fix: Investigate the causes of these errors and fix them.

2 internal images are broken


About this issue: An internal broken image is an image that can't be displayed because it no longer exists, its URL is misspelled, or because the file path is not valid. Broken images may jeopardize your search rankings because they provide a poor user experience and signal to search engines that your page is low quality.

How to fix: To fix a broken internal image, perform one of the following: 1. If an image is no longer located in the same location, change its URL. 2. If an image was deleted or damaged, replace it with a new one. 3. If an image is no longer needed, simply remove it from your page's code.


1 issue with mixed content


About this issue: If your website contains any elements that are not secured with HTTPS, this may lead to security issues. Moreover, browsers will warn users about loading unsecure content, and this may negatively affect user experience and reduce their confidence in your website.

How to fix: Only embed HTTPS content on HTTPS pages. Replace all HTTP links with the new HTTPS versions. If there are any external links leading to a page that has no HTTPS version, remove those links.
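
A single page can be spot-checked for insecure sub-resources with the Python standard library. A minimal sketch; the page URL is illustrative:

    import urllib.request
    from html.parser import HTMLParser

    PAGE_URL = "https://facilio.com/"  # illustrative

    class MixedContentFinder(HTMLParser):
        RESOURCE_ATTRS = {"src", "href", "data"}  # attributes that can load sub-resources

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in self.RESOURCE_ATTRS and value and value.startswith("http://"):
                    if tag == "a" and name == "href":
                        continue  # a plain http:// hyperlink is not mixed content
                    print(f"insecure resource: <{tag}> {name}={value}")

    with urllib.request.urlopen(PAGE_URL, timeout=10) as resp:
        MixedContentFinder().feed(resp.read().decode("utf-8", "replace"))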

Checks with no issues found:

0 pages don't have title tags
0 issues with duplicate title tags
0 pages couldn't be crawled
0 pages couldn't be crawled (DNS resolution issues)
0 pages couldn't be crawled (incorrect URL formats)
0 pages have duplicate meta descriptions
Robots.txt file has format errors: 0
0 sitemap.xml files have format errors
0 pages have a WWW resolve issue
This page has no viewport tag: 0
0 pages have too large HTML size
0 AMP pages have no canonical tag
0 issues with hreflang values
0 hreflang conflicts within page source code


0 issues with incorrect hreflang links
0 non-secure pages
0 issues with expiring or expired certificate
0 issues with old security protocol
0 issues with incorrect certificate name
No redirect or canonical to HTTPS homepage from HTTP version: 0
0 redirect chains and loops
0 pages have multiple canonical URLs
0 pages have a meta refresh tag
0 subdomains don't support secure encryption algorithms
0 sitemap.xml files are too large
0 links couldn't be crawled (incorrect URL formats)
0 structured data items are invalid
0 pages are missing the viewport width value
0 pages have slow load speed


facilio.com: 11072 total issues (+10 since the previous crawl)

8038 URLs with a temporary redirect (+4 since the previous crawl)


About this issue: Temporary redirects (i.e., a 302 and a 307 redirect) mean that a page has been temporarily moved to a new location. Search engines will continue to index the redirected page, and no link juice or traffic is passed to the new page, which is why temporary redirects can damage your search rankings if used by mistake.

How to fix: Review all URLs to make sure the use of 302 and 307 redirects is justified. If so, don't forget to remove them when they are no longer needed. However, if you permanently move any page, replace a 302/307 redirect with a 301/308 one.
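
To separate justified temporary redirects from ones that should be 301/308, the status codes can be inspected directly. A minimal Python sketch, assuming the redirecting URLs are exported to a file (one per line; the file name is illustrative):

    import urllib.request

    KIND = {301: "permanent", 308: "permanent", 302: "temporary", 307: "temporary"}

    class RedirectLogger(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            print(f"{code} ({KIND.get(code, 'other')}): {req.full_url} -> {newurl}")
            return super().redirect_request(req, fp, code, msg, headers, newurl)

    opener = urllib.request.build_opener(RedirectLogger())

    with open("redirected_urls.txt") as f:  # illustrative input
        for url in (line.strip() for line in f if line.strip()):
            try:
                opener.open(url, timeout=10)
            except OSError:
                pass  # unreachable targets are covered by the broken-link checks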

1256 images don't have alt attributes


About this issue: Alt attributes within <img> tags are used by search engines to understand the contents of your images. If you neglect alt attributes, you may miss the chance to get a better placement in search results because alt attributes allow you to rank in image search results. Not using alt attributes also negatively affects the experience of visually impaired users and those who have disabled images in their browsers. For more information, please see these articles: Using ALT attributes smartly: https://webmasters.googleblog.com/2007/12/using-alt-attributes-smartly.html and Google Image Publishing Guidelines: https://support.google.com/webmasters/answer/114016?hl=en.

How to fix: Specify a relevant alternative attribute inside an <img> tag for each image on your website, e.g., "<img src="mylogo.png" alt="This is my company logo">".
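
Missing alt attributes on a given page can be listed with the standard library's HTML parser. A minimal Python sketch; the page URL is illustrative, and note that an intentionally empty alt="" is the accepted convention for purely decorative images:

    import urllib.request
    from html.parser import HTMLParser

    PAGE_URL = "https://facilio.com/"  # illustrative

    class MissingAltFinder(HTMLParser):
        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                if "alt" not in attrs:  # flag only absent alt; empty alt="" may be deliberate
                    print("missing alt:", attrs.get("src", "<no src>"))

    with urllib.request.urlopen(PAGE_URL, timeout=10) as resp:
        MissingAltFinder().feed(resp.read().decode("utf-8", "replace"))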

1149 issues with unminified JavaScript and CSS files


About this issue: Minification is the process of removing unnecessary lines, white space and comments from the source code. Minifying JavaScript and CSS files makes their size smaller, thereby decreasing your page load time, providing a better user experience and improving your search engine rankings. For more information, please see this Google article: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency.

How to fix: Minify your JavaScript and CSS files. If your webpage uses CSS and JS files that are hosted on an external site, contact the website owner and ask them to minify their files. If this issue doesn't affect your page load time, simply ignore it.

183 pages have low text-HTML ratio (+2 since the previous crawl)


About this issue: Your text to HTML ratio indicates the amount of actual text you have on your webpage compared to the amount of code. This issue is triggered when your text to HTML ratio is 10% or less. Search engines have begun focusing on pages that contain more content. That's why a higher text to HTML ratio means your page has a better chance of getting a good position in search results. Less code increases your page's load speed and also helps your rankings. It also helps search engine robots crawl your website faster.

How to fix: Split your webpage's text content and code into separate files and compare their size. If the size of your code file exceeds the size of the text file, review your page's HTML code and consider optimizing its structure and removing embedded scripts and styles.
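
The ratio can be approximated for any page by comparing extracted visible text against the raw HTML. A rough Python sketch; the audit's exact formula is not documented here, and the page URL is illustrative:

    import urllib.request
    from html.parser import HTMLParser

    PAGE_URL = "https://facilio.com/"  # illustrative

    class TextExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.parts = []
            self.skip_depth = 0  # inside <script>/<style>, character data is not visible text

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self.skip_depth += 1

        def handle_endtag(self, tag):
            if tag in ("script", "style") and self.skip_depth:
                self.skip_depth -= 1

        def handle_data(self, data):
            if not self.skip_depth:
                self.parts.append(data)

    with urllib.request.urlopen(PAGE_URL, timeout=10) as resp:
        html = resp.read().decode("utf-8", "replace")

    extractor = TextExtractor()
    extractor.feed(html)
    visible = " ".join("".join(extractor.parts).split())  # collapse whitespace
    ratio = len(visible) / len(html)
    print(f"text-to-HTML ratio: {ratio:.1%}")  # the check triggers at 10% or less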


168 pages have no hreflang and lang attributes


About this issue: This issue is reported if your page has neither lang nor hreflang attribute. When running a multilingual website, you should make sure that you're doing it correctly. First, you should use a hreflang attribute to indicate to Google which pages should be shown to visitors based on their location. That way, you can rest assured that your users will always land on the correct language version of your website. You should also declare a language for your webpage's content (i.e., lang attribute). Otherwise, your web text might not be recognized by search engines. It also may not appear in search results, or may be displayed incorrectly.

How to fix: Perform the following: 1. Add a lang attribute to the <html> tag, e.g., "<html lang="en">". 2. Add a hreflang attribute to your page's <head> tag, e.g., <link rel="alternate" href="http://example.com/" hreflang="en"/>.

94 pages don't have meta descriptions (+3 since the previous crawl)


About this issue: Though meta descriptions don't have a direct influence on rankings, they are used by search engines to display your page's description in search results. A good description helps users know what your page is about and encourages them to click on it. If your page's meta description tag is missing, search engines will usually display its first sentence, which may be irrelevant and unappealing to users. For more information, please see this article: Create good titles and snippets in Search Results: https://support.google.com/webmasters/answer/35624.

How to fix: In order to gain a higher click-through rate, you should ensure that all of your webpages have meta descriptions that contain relevant keywords.

90 pages don't have an h1 heading (+2 since the previous crawl)


About this issue: While less important than <title> tags, h1 headings still help define your page's topic for search engines and users. If an <h1> tag is empty or missing, search engines may place your page lower than they would otherwise. Besides, a lack of an <h1> tag breaks your page's heading hierarchy, which is not SEO friendly.

How to fix: Provide a concise, relevant h1 heading for each of your pages.

28 pages have too much text within the title tags


About this issue: Most search engines truncate titles containing more than 70 characters. Incomplete and shortened titles look unappealing to users and won't entice them to click on your page. For more information, please see this Google article: https://support.google.com/webmasters/answer/35624.

How to fix: Try to rewrite your page titles to be 70 characters or less.

25 pages have a low word count (+2 since the previous crawl)


About this issue: This issue is triggered if the number of words on your webpage is less than 200. The amount of text placed on your webpage is a quality signal to search engines. Search engines prefer to provide as much information to users as possible, so pages with longer content tend to be placed higher in search results, as opposed to those with lower word counts. For more information, please view this video: https://www.youtube.com/watch?v=w3-obcXkyA4.

How to fix: Improve your on-page content and be sure to include more than 200 meaningful words.


23 pages have duplicate H1 and title tags


About this issue: It is a bad idea to duplicate your title tag content in your first-level header. If your page's <title> and <h1> tags match, the latter may appear over-optimized to search engines. Also, using the same content in titles and headers means a lost opportunity to incorporate other relevant keywords for your page. For more information, please see this Google article: https://support.google.com/webmasters/answer/35624.

How to fix: Try to create different content for your <title> and <h1> tags.

11 external links are broken (-2 since the previous crawl)


About this issue: Broken external links lead users from one website to another and bring them to non-existent webpages. Multiple broken links negatively affect user experience and may worsen your search engine rankings because crawlers may think that your website is poorly maintained or coded. Please note that our crawler may detect a working link as broken. Generally, this happens if the server hosting the website you're referring to blocks our crawler from accessing this website.

How to fix: Please follow all links reported as broken. If a target webpage returns an error, remove the link leading to the error page or replace it with another resource. If the links reported as broken do work when accessed with a browser, you should contact the website's owner and inform them about the issue.

6 links on HTTPS pages lead to HTTP pages

Sitemap.xml not indicated in robots.txt


About this issue: If you have both a sitemap.xml and a robots.txt file on your website, it is a good practice to place a link to your sitemap.xml in your robots.txt, which will allow search engines to better understand what content they should crawl.

How to fix: Specify the location of your sitemap.xml in your robots.txt, e.g., by adding a line such as "Sitemap: https://facilio.com/sitemap.xml" (the exact URL depends on where the file is hosted). To check if Googlebot can index your sitemap.xml file, use the Sitemaps report in Google Search Console: https://search.google.com/search-console/not-verified?original_url=/search-console/sitemaps&original_resource_id.

Checks with no issues found:

0 external images are broken
0 pages don't have enough text within the title tags
0 pages have too many on-page links
0 pages have too many parameters in their URLs
0 pages don't have character encoding declared
0 pages don't have doctype declared


0 pages have incompatible plugin content
0 pages contain frames
0 pages have underscores in the URL
0 outgoing internal links contain nofollow attribute
Sitemap.xml not found: 0
Homepage does not use HTTPS encryption: 0
0 subdomains don't support SNI
0 HTTP URLs in sitemap.xml for HTTPS site
0 uncompressed pages (-1 since the previous crawl)
0 issues with blocked internal resources in robots.txt
0 issues with uncompressed JavaScript and CSS files
0 issues with uncached JavaScript and CSS files
0 pages have a JavaScript and CSS total size that is too large
0 pages use too many JavaScript and CSS files
0 link URLs are too long


facilio.com: 1329 total issues (+124 since the previous crawl)

475 URLs with a permanent redirect (+4 since the previous crawl)


About this issue: Although using permanent redirects (a 301 or 308 redirect) is appropriate in many situations (for example, when you move a website to a new domain, redirect users from a deleted page to a new one, or handle duplicate content issues), we recommend that you keep them to a reasonable minimum. Every time you redirect one of your website's pages, it decreases your crawl budget, which may run out before search engines can crawl the page you want to be indexed. Moreover, too many permanent redirects can be confusing to users.

How to fix: Review all URLs with a permanent redirect. Change permanent redirects to a target page URL where possible.

227 links on this page have no anchor text


About this issue: This issue is triggered if a link (either external or internal) on your website has an empty or naked anchor (i.e., an anchor that uses a raw URL), or its anchor text only contains symbols. Although a missing anchor doesn't prevent users and crawlers from following a link, it makes it difficult to understand what the page you're linking to is about. Also, Google considers anchor text when indexing a page. So, a missing anchor represents a lost opportunity to optimize the performance of the linked-to page in search results.

How to fix: Use anchor text for your links where it is necessary. The link text must give users and search engines at least a basic idea of what the target page is about. Also, use short but descriptive text. For more information, please see the "Use links wisely" section in Google's SEO Starter Guide: https://support.google.com/webmasters/answer/7451184?hl=en&ref_topic=9460495&authuser=0.
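
Empty or raw-URL anchors on a single page can be listed with the standard library. A simplified Python sketch (it ignores image-only links, whose alt text can serve as the anchor); the page URL is illustrative:

    import urllib.request
    from html.parser import HTMLParser

    PAGE_URL = "https://facilio.com/"  # illustrative

    class EmptyAnchorFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.href = None
            self.text = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.href = dict(attrs).get("href")
                self.text = []

        def handle_data(self, data):
            if self.href is not None:
                self.text.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self.href is not None:
                anchor = "".join(self.text).strip()
                if not anchor or anchor == self.href:  # empty or naked (raw URL) anchor
                    print("weak anchor:", self.href)
                self.href = None

    with urllib.request.urlopen(PAGE_URL, timeout=10) as resp:
        EmptyAnchorFinder().feed(resp.read().decode("utf-8", "replace"))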

176 orphaned pages in Google Analytics (+114 since the previous crawl)


About this issue: A webpage that is not linked to internally is called an orphaned page. It is very important to check your website for such pages. If a page has valuable content but is not linked to by another page on your website, it can miss out on the opportunity to receive enough link juice. Orphaned pages that no longer serve their purpose confuse your users and, as a result, negatively affect their experience. We identify orphaned pages on your website by comparing the number of pages we crawled to the number of pages in your Google Analytics account. That's why, to check your website for any orphaned pages, you need to connect your Google Analytics account.

How to fix: Review all orphaned pages on your website and do one of the following: 1. If a page is no longer needed, remove it. 2. If a page has valuable content and brings traffic to your website, link to it from another page on your website. 3. If a page serves a specific need and requires no internal linking, leave it as is.

155 pages need more than 3 clicks to be reached (+1 since the previous crawl)


About this issue: A page's crawl depth is the number of clicks required for users and search engine crawlers to reach it via its corresponding homepage. From an SEO perspective, an excessive crawl depth may pose a great threat to your optimization efforts, as both crawlers and users are less likely to reach deep pages. For this reason, pages that contain important content should be no more than 3 clicks away from your homepage.

How to fix: Make sure that pages with important content can be reached within a few clicks. If any of them are buried too deep in your site, consider changing your internal link architecture.
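
Click depth is simply a breadth-first-search distance from the homepage over the internal link graph. A minimal Python sketch with an illustrative hand-written graph (in practice the mapping would come from a crawl):

    from collections import deque

    # Illustrative internal link graph: page -> pages it links to.
    LINKS = {
        "/": ["/features", "/pricing"],
        "/features": ["/features/maintenance"],
        "/pricing": ["/contact"],
        "/features/maintenance": ["/resources/guide"],
        "/resources/guide": ["/resources/guide/part-2"],
        "/contact": [],
        "/resources/guide/part-2": [],
    }

    def click_depths(start="/"):
        """BFS from the homepage: depth = minimum number of clicks to reach each page."""
        depth = {start: 0}
        queue = deque([start])
        while queue:
            page = queue.popleft()
            for target in LINKS.get(page, []):
                if target not in depth:
                    depth[target] = depth[page] + 1
                    queue.append(target)
        return depth

    for page, d in sorted(click_depths().items(), key=lambda item: item[1]):
        print(d, page, "<- deeper than 3 clicks" if d > 3 else "")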


145 pages have only one incoming internal link (+1 since the previous crawl)


About this issue: Having very few incoming internal links means very few visits, or even none, and fewer chances of placing in search results. It is a good practice to add more incoming internal links to pages with useful content. That way, you can rest assured that users and search engines will never miss them.

How to fix: Add more incoming internal links to pages with important content.

43 orphaned pages in sitemaps


About this issue: An orphaned page is a webpage that is not linked to internally. Including orphaned pages in your sitemap.xml files is considered to be a bad practice, as these pages will be crawled by search engines. Crawling outdated orphaned pages will waste your crawl budget. If an orphaned page in your sitemap.xml file has valuable content, we recommend that you link to it internally.

How to fix: Review all orphaned pages in your sitemap.xml files and do one of the following: if a page is no longer needed, remove it; if a page has valuable content and brings traffic to your website, link to it from another page on your website; if a page serves a specific need and requires no internal linking, leave it as is.

32 links to external pages or resources returned a 403 HTTP status code (+4 since the previous crawl)


About this issue: This issue is triggered if a crawler gets a 403 code when trying to access an external webpage or resource via a link on your site. A 403 HTTP status code is returned if a user is not allowed to access the resource for some reason. In the case of crawlers, this usually means that a crawler is being blocked from accessing content at the server level.

How to fix: Check that the page is available to browsers and search engines. To do this, follow a link in your browser and check the Google Search Console data. 1. If a page or resource is not available, contact the owner of the external website to restore deleted content or change the link on your page. 2. If a page is available but our bot is blocked from accessing it, you can ask the external website owner to unblock the page, so we can check all resources correctly. You can also hide this issue from your list.

31 links on this page have non-descriptive anchor text (+1 since the previous crawl)


About this issue: This issue is triggered if a non-descriptive anchor text is used for a link (either internal or external). An anchor is considered to be non-descriptive if it doesn't give any idea of what the linked-to page is about, for example, "click here", "right here", etc. This type of anchor provides little value to users and search engines as it doesn't provide any information about the target page. Also, such anchors will offer little in terms of the target page's ability to be indexed by search engines, and as a result, rank for relevant search requests. For more information on the criteria used to trigger this check, refer to kb article title.

How to fix: To let users and search engines understand the meaning of the linked-to page, use a succinct anchor text that describes the page's content. For best practices on how to optimize your anchor text, refer to the "Write good link text" section in Google's Search Engine Optimization (SEO) Starter Guide: https://support.google.com/webmasters/answer/7451184?hl=en&ref_topic=9460495&authuser=0.


23 pages take more than 1 second to become interactive (-1 since the previous crawl)


About this issue: We all know that slow page-load speed negatively affects user experience. However, if a user can start interacting with your webpage within 1 second, they are much less likely to click away from this page. That's why it is important to keep a close eye on the time it takes your most important webpages to become usable, known as the Average Document Interactive Time. For more information, please see Why Performance Matters: https://developers.google.com/web/fundamentals/performance/why-performance-matters/. To evaluate your site performance, use the Site Performance report.

How to fix: Make sure that users can start interacting with your most important pages as quickly as possible.

9 pages have more than one H1 tag

4 issues with blocked external resources in robots.txt


About this issue: Blocked external resources are resources (e.g., CSS, JavaScript, image files, etc.) that are hosted on an external website and blocked from crawling by a "Disallow" directive in an external robots.txt file. Disallowing these files may prevent search engines from accessing them and, as a result, properly rendering and indexing your webpages. This, in turn, may lead to lower rankings. For more information, please see this article: https://support.google.com/webmasters/answer/6153277?hl=en.

How to fix: If blocked resources that are hosted on an external website have a strong impact on your website, contact the website owner and ask them to edit their robots.txt file. If blocked resources are not necessary for your site, simply ignore them.

3 pages are blocked from crawling

3 outgoing external links contain nofollow attributes


About this issue: A nofollow attribute is an element in an <a> tag that tells crawlers not to follow the link. "Nofollow" links don't pass any link juice or anchor texts to referred webpages. The unintentional use of nofollow attributes may have a negative impact on the crawling process and your rankings.

How to fix: Make sure you haven't used nofollow attributes by mistake. Remove them from <a> tags, if needed.

3 subdomains don't support HSTS

Checks with no issues found:

0 page URLs are longer than 200 characters
Robots.txt not found: 0
0 pages have hreflang language mismatch issues
0 pages blocked by X-Robots-Tag: noindex HTTP header


0 issues with broken external JavaScript and CSS files
0 resources are formatted as page link
