Site Audit: Issues: Generated On October 3, 2022
facilio.com
Subdomain: facilio.com
Last Update: October 3, 2022
Crawled Pages: 638
Errors: 414 (+0)
This issue is triggered by links on your website that lead to non-existent webpages. Multiple broken links negatively affect user experience and may worsen your search engine rankings because crawlers may think that your website is poorly maintained or coded. Please note that our crawler may detect a working link as broken. Generally, this happens if the server hosting the website you're referring to blocks our crawler from accessing that website.
How to fix: Please follow all links reported as broken. If a target webpage returns an error, remove the link leading to the error page or replace it with another resource. If the links reported as broken do work when accessed with a browser, you should contact the website's owner and inform them about the issue.
A sitemap.xml file helps search engines discover and crawl the pages of your website. Only good pages intended for your visitors should be included in your sitemap.xml file. This error is triggered if your sitemap.xml contains URLs that: 1. lead to webpages with the same content; 2. redirect to a different webpage; 3. return a non-200 status code. Populating your file with such URLs will confuse search engines, cause unnecessary crawling or may even result in your sitemap being rejected.
How to fix: Review your sitemap.xml for any redirected, non-canonical or non-200 URLs. Provide the final destination URLs that are canonical and return a 200 status code.
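For illustration only, a minimal sitemap entry lists just a final, canonical URL that returns a 200 status code (example.com is a placeholder, not a URL from this report):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/page/</loc>
  </url>
</urlset>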
Having duplicate content may significantly affect your SEO performance. First of all, Google will typically show only one duplicate page, filtering other instances out of its index and search results, and this page may not be the one you want to rank. In some cases, search engines may consider duplicate pages as an attempt to manipulate search engine rankings and, as a result, your website may be downgraded or even banned from search results. Moreover, duplicate pages may dilute your link profile.
How to fix: Here are a few ways to fix duplicate content issues: 1. Add a rel="canonical" link to one of your duplicate pages to inform search engines which page to show in search results. 2. Use a 301 redirect from a duplicate page to the original one. 3. Use a rel="next" and a rel="prev" link attribute to fix pagination duplicates. 4. Instruct GoogleBot to handle URL parameters differently using Google Search Console. 5. Provide some unique content on the webpage. For more information, please read these articles: https://support.google.com/webmasters/answer/66359?hl=en and https://support.google.com/webmasters/answer/139066?hl=en.
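As an illustrative sketch of option 1 (example.com and the path are placeholders), a canonical link is placed in the <head> of each duplicate page and points search engines at the preferred URL:
<link rel="canonical" href="https://example.com/preferred-page/">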
This issue is triggered by broken JavaScript and CSS files hosted on your website. Any script that has stopped running on your website may jeopardize your rankings, since search engines will not be able to properly render and index your webpages. Moreover, broken JS and CSS files may cause website errors, and this will certainly spoil your user experience.
How to fix: Review all broken JavaScript and CSS files hosted on your website and fix any issues.
A 4xx error means that a webpage cannot be accessed; this is usually the result of broken links. These errors prevent users and search engine robots from accessing your webpages, and can negatively affect both user experience and search engine crawlability. This will in turn lead to a drop in traffic driven to your website. Please be aware that our crawler may detect a working link as broken if your website blocks our crawler from accessing it. This usually happens due to the following reasons: 1. a DDoS protection system; 2. an overloaded or misconfigured server.
How to fix: If a webpage returns an error, remove all links leading to the error page or replace them with another resource. To identify all pages on your website that contain links to a 4xx page, click "View broken links" next to the error page. If the links reported as 4xx do work when accessed with a browser, you can try either of the following: 1. Contact your web hosting support team. 2. Instruct search engine robots not to crawl your website too frequently by specifying the "crawl-delay" directive in your robots.txt.
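As a minimal sketch of option 2, a robots.txt rule such as the following asks a specific bot to wait between requests (the user-agent name and the 10-second value are illustrative assumptions, not taken from this report):
User-agent: SemrushBot
Crawl-delay: 10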
A canonical tag (rel="canonical") informs search engines of which version of a page you want to show up in search results. When using canonical tags, it is important to make sure that the URL you include in your rel="canonical" element leads to a page that actually exists. Canonical links that lead to non-existent webpages complicate the process of crawling and indexing your content and, as a result, decrease crawling efficiency and lead to unnecessary crawl budget waste.
How to fix: Review all broken canonical links. If a canonical URL points to a non-existent webpage, remove it or replace it with another resource.
5xx errors refer to problems with a server being unable to perform a request from a user or a crawler. They prevent users and search engine robots from accessing your webpages, and can negatively affect user experience and search engines' crawlability. This will in turn lead to a drop in traffic driven to your website.
How to fix: Investigate the causes of these errors and fix them.
An internal broken image is an image that can't be displayed because it no longer exists, its URL is misspelled, or because the file path is not valid. Broken images may jeopardize your search rankings because they provide a poor user experience and signal to search engines that your page is low quality.
How to fix: To fix a broken internal image, perform one of the following: 1. If an image is no longer located in the same location, change its URL. 2. If an image was deleted or damaged, replace it with a new one. 3. If an image is no longer needed, simply remove it from your page's code.
Embedding HTTP content on HTTPS pages may lead to security issues. Moreover, browsers will warn users about loading unsecure content, and this may negatively affect user experience and reduce their confidence in your website.
How to fix: Only embed HTTPS content on HTTPS pages. Replace all HTTP links with the new HTTPS versions. If there are any external links leading to a page that has no HTTPS version, remove those links.
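For instance (example.com and the file name are placeholders), a resource embedded over HTTP such as <script src="http://example.com/app.js"></script> would be changed to <script src="https://example.com/app.js"></script>, assuming the file is actually available over HTTPS.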
0 non-secure pages
Warnings: 11,072 (+10)
Temporary redirects (302 and 307) mean that a page has been temporarily moved to a new location. Search engines will continue to index the redirected page, and no link juice or traffic is passed to the new page, which is why temporary redirects can damage your search rankings if used by mistake.
How to fix: Review all URLs to make sure the use of 302 and 307 redirects is justified. If so, don't forget to remove them when they are no longer needed. However, if you permanently move any page, replace a 302/307 redirect with a 301/308 one.
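As a sketch only (the paths are placeholders and the exact syntax depends on your server), on an Apache server using mod_alias a permanent move would replace
Redirect 302 /old-page/ https://example.com/new-page/
with
Redirect 301 /old-page/ https://example.com/new-page/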
Alt attributes help search engines understand the contents of your images. If you neglect alt attributes, you may miss the chance to get a better placement in search results because alt attributes allow you to rank in image search results. Not using alt attributes also negatively affects the experience of visually impaired users and those who have disabled images in their browsers. For more information, please see these articles: Using ALT attributes smartly: https://webmasters.googleblog.com/2007/12/using-alt-attributes-smartly.html and Google Image Publishing Guidelines: https://support.google.com/webmasters/answer/114016?hl=en.
How to fix: Specify a relevant alt attribute inside the <img> tag for each image on your website, e.g., <img src="mylogo.png" alt="This is my company logo">.
Minification is the process of removing unnecessary lines, white space and comments from the source code. Minifying JavaScript and CSS files makes their size smaller, thereby decreasing your page load time, providing a better user experience and improving your search engine rankings. For more information, please see this Google article: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency.
How to fix: Minify your JavaScript and CSS files. If your webpage uses CSS and JS files that are hosted on an external site, contact the website owner and ask them to minify their files. If this issue doesn't affect your page load time, simply ignore it.
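As a simple illustration (the selector and values are made up), a CSS rule written as
.header { color: #ffffff; margin: 0px; }
minifies to
.header{color:#fff;margin:0}
with no change in how the page renders; build tools or your CMS can usually do this automatically.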
Your text to HTML ratio indicates the amount of actual text on a webpage compared to the amount of code. This issue is triggered when your text to HTML ratio is 10% or less. Search engines have begun focusing on pages that contain more content. That's why a higher text to HTML ratio means your page has a better chance of getting a good position in search results. Less code also increases your page's load speed, which helps your rankings, and it helps search engine robots crawl your website faster.
How to fix: Split your webpage's text content and code into separate files and compare their size. If the size of your code file exceeds the size of the text file, review your page's HTML code and consider optimizing its structure and removing embedded scripts and styles.
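For example (the figures are illustrative, not taken from this report), a page whose HTML source weighs 60 KB but whose visible text amounts to only 5 KB has a text to HTML ratio of roughly 8% (5 / 60 ≈ 0.083), which would trigger this check.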
If you are running a multilingual website, you should make sure that you're doing it correctly. First, you should use a hreflang attribute to indicate to Google which pages should be shown to visitors based on their location. That way, you can rest assured that your users will always land on the correct language version of your website. You should also declare a language for your webpage's content (i.e., a lang attribute). Otherwise, your web text might not be recognized by search engines. It also may not appear in search results, or may be displayed incorrectly.
How to fix: Perform the following: 1. Add a lang attribute to the <html> tag, e.g., <html lang="en">. 2. Add a hreflang attribute to your page's <head> tag, e.g., <link rel="alternate" href="http://example.com/" hreflang="en"/>.
The <meta name="description"> tag is used by search engines to display your page's description in search results. A good description helps users know what your page is about and encourages them to click on it. If your page's meta description tag is missing, search engines will usually display its first sentence, which may be irrelevant and unappealing to users. For more information, please see this article: Create good titles and snippets in Search Results: https://support.google.com/webmasters/answer/35624.
How to fix: In order to gain a higher click-through rate, you should ensure that all of your webpages have meta descriptions that contain relevant keywords.
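For example (the wording is purely illustrative), a description is added inside the page's <head>:
<meta name="description" content="A short, relevant summary of the page that includes its target keywords.">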
An <h1> heading communicates your page's main topic for search engines and users. If an <h1> tag is empty or missing, search engines may place your page lower than they would otherwise. Besides, a lack of an <h1> tag breaks your page's heading hierarchy, which is not SEO friendly.
How to fix: Provide a concise, relevant h1 heading for each of your pages.
Page titles longer than 70 characters may get cut off in search results. Incomplete and shortened titles look unappealing to users and won't entice them to click on your page. For more information, please see this Google article: https://support.google.com/webmasters/answer/35624.
How to fix: Try to rewrite your page titles to be 70 characters or less.
The amount of text placed on your webpage is a quality signal to search engines. Search engines
prefer to provide as much information to users as possible, so pages with longer content tend to
be placed higher in search results, as opposed to those with lower word counts.
For more information, please view this video: https://www.youtube.com/watch?v=w3-obcXkyA4.
How to fix: Improve your on-page content and be sure to include more than 200 meaningful words.
If your page's <title> and <h1> tags match, the latter may appear over-optimized to search engines. Also, using the same content in titles and headers means a lost opportunity to incorporate other relevant keywords for your page. For more information, please see this Google article: https://support.google.com/webmasters/answer/35624.
How to fix: Try to create different content for your <title> and <h1> tags.
This issue is triggered by links on your website that lead to non-existent webpages. Multiple broken links negatively affect user experience and may worsen your search engine rankings because crawlers may think that your website is poorly maintained or coded. Please note that our crawler may detect a working link as broken. Generally, this happens if the server hosting the website you're referring to blocks our crawler from accessing that website.
How to fix: Please follow all links reported as broken. If a target webpage returns an error, remove the link leading to the error page or replace it with another resource. If the links reported as broken do work when accessed with a browser, you should contact the website's owner and inform them about the issue.
It is a good practice to place a link to your sitemap.xml in your robots.txt file, which will allow search engines to better understand what content they should crawl.
How to fix: Specify the location of your sitemap.xml in your robots.txt. To check if Googlebot can index your sitemap.xml file, use the Sitemaps report in Google Search Console: https://search.google.com/search-console/not-verified?original_url=/search-console/sitemaps&original_resource_id.
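For example, a single line in robots.txt tells crawlers where the sitemap lives (the URL below is a placeholder, not this site's actual sitemap location):
Sitemap: https://example.com/sitemap.xml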
0 uncompressed pages (-1)
0 pages have a JavaScript and CSS total size that is too large
Notices: 1,329 (+124)
Although using permanent redirects (301 or 308) is appropriate in many situations (for example, when you move a website to a new domain, redirect users from a deleted page to a new one, or handle duplicate content issues), we recommend that you keep them to a reasonable minimum. Every time you redirect one of your website's pages, it decreases your crawl budget, which may run out before search engines can crawl the page you want to be indexed. Moreover, too many permanent redirects can be confusing to users.
How to fix: Review all URLs with a permanent redirect. Change permanent redirects to a target page URL where possible.
This issue is triggered when a link has an empty or naked anchor (i.e., an anchor that uses a raw URL), or when the anchor text only contains symbols. Although a missing anchor doesn't prevent users and crawlers from following a link, it makes it difficult to understand what the page you're linking to is about. Also, Google considers anchor text when indexing a page. So, a missing anchor represents a lost opportunity to optimize the performance of the linked-to page in search results.
How to fix: Use anchor text for your links where it is necessary. The link text must give users and search engines at least a basic idea of what the target page is about. Also, use short but descriptive text. For more information, please see the "Use links wisely" section in Google's SEO Starter Guide: https://support.google.com/webmasters/answer/7451184?hl=en&ref_topic=9460495&authuser=0.
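For example (example.com and the anchor wording are placeholders), a naked anchor such as <a href="https://example.com/pricing/">https://example.com/pricing/</a> could be rewritten with descriptive text: <a href="https://example.com/pricing/">pricing plans</a>.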
A webpage that is not linked to internally is called an orphaned page, and it is important to check your website for such pages. If a page has valuable content but is not linked to by another page on your website, it can miss out on the opportunity to receive enough link juice. Orphaned pages that no longer serve their purpose confuse your users and, as a result, negatively affect their experience. We identify orphaned pages on your website by comparing the number of pages we crawled to the number of pages in your Google Analytics account. That's why, to check your website for any orphaned pages, you need to connect your Google Analytics account.
How to fix: Review all orphaned pages on your website and do one of the following: 1. If a page is no longer needed, remove it. 2. If a page has valuable content and brings traffic to your website, link to it from another page on your website. 3. If a page serves a specific need and requires no internal linking, leave it as is.
A page's crawl depth is the number of clicks required for users and search engine crawlers to reach it via its corresponding homepage. From an SEO perspective, an excessive crawl depth may pose a great threat to your optimization efforts, as both crawlers and users are less likely to reach deep pages. For this reason, pages that contain important content should be no more than 3 clicks away from your homepage.
How to fix: Make sure that pages with important content can be reached within a few clicks. If any of them are buried too deep in your site, consider changing your internal link architecture.
Pages that have only one incoming internal link have fewer chances of placing in search results. It is a good practice to add more incoming internal links to pages with useful content. That way, you can rest assured that users and search engines will never miss them.
How to fix: Add more incoming internal links to pages with important content.
Including orphaned pages (pages that are not linked to internally) in your sitemap.xml files is considered to be a bad practice, as these pages will be crawled by search engines. Crawling outdated orphaned pages will waste your crawl budget. If an orphaned page in your sitemap.xml file has valuable content, we recommend that you link to it internally.
How to fix: Review all orphaned pages in your sitemap.xml files and do one of the following: if a page is no longer needed, remove it; if a page has valuable content and brings traffic to your website, link to it from another page on your website; if a page serves a specific need and requires no internal linking, leave it as is.
This issue is triggered when our crawler receives a 403 HTTP status code while trying to access an external webpage or resource via a link on your site. A 403 HTTP status code is returned if a user is not allowed to access the resource for some reason. In the case of crawlers, this usually means that a crawler is being blocked from accessing content at the server level.
How to fix: Check that the page is available to browsers and search engines. To do this, follow the link in your browser and check the Google Search Console data. 1. If a page or resource is not available, contact the owner of the external website to restore deleted content or change the link on your page. 2. If a page is available but our bot is blocked from accessing it, you can ask the external website owner to unblock the page, so we can check all resources correctly. You can also hide this issue from your list.
If a user can start interacting with your webpage within 1 second, they are much less likely to click away from the page. That's why it is important to keep a close eye on the time it takes your most important webpages to become usable, known as the Average Document Interactive Time. For more information, please see Why Performance Matters: https://developers.google.com/web/fundamentals/performance/why-performance-matters/. To evaluate your site performance, use the Site Performance report.
How to fix: Make sure that users can start interacting with your most important pages as quickly as possible.
This issue is triggered by resources that are hosted on an external website and blocked from crawling by a "Disallow" directive in an external robots.txt file. Disallowing these files may prevent search engines from accessing them and, as a result, properly rendering and indexing your webpages. This, in turn, may lead to lower rankings. For more information, please see this article: https://support.google.com/webmasters/answer/6153277?hl=en.
How to fix: If blocked resources that are hosted on an external website have a strong impact on your website, contact the website owner and ask them to edit their robots.txt file. If blocked resources are not necessary for your site, simply ignore them.
the link. "Nofollow" links don’t pass any link juice or anchor texts to referred webpages. The
unintentional use of nofollow attributes may have a negative impact on the crawling process and
your rankings.
How to x: Make sure you haven’t used nofollow attributes by mistake. Remove them from <a>
tags, if needed.
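For example (example.com and the link text are placeholders), <a href="https://example.com/" rel="nofollow">partner site</a> will not pass link juice or anchor text to the target page; removing the rel="nofollow" attribute turns it back into a normal, followed link.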