Summary
There are three layers of common issues to understand before performing an SEO audit. The first is crawlability and site structure, which determines how accessible each page of your website is. The second is on-page SEO, which focuses on mistakes made on individual pages of a website.
The last is technical SEO, which is concerned with how fast your website loads on every device: PCs, laptops, smartphones, and tablets. Conducting a technical SEO audit entails investigating every corner of your website to keep its loading time low while also increasing search visibility.
While I was on a roll, constantly posting new blogs on my website, I found it confusing that my visitor numbers weren't increasing. Various websites, from Wish to The Outline, gain a steady number of visitors every day, and they were updating just as often as I was.
I checked my own website's numbers and found that my search ranking hadn't gone up either.
I researched why this was happening to me, and I stumbled upon this video:
After learning more about this and applying what I had learned about SEO auditing, my website became more visible on search engines and my visitor count started climbing by the minute!
Take note of this: if you own a website and you notice that your search rank and visitor count are stagnant or decreasing, you should have a technical SEO audit done every month.
Before learning how to perform an SEO audit, let us look at the most common issues it will surface and how much they can affect your website. They fall into three layers:
- Crawlability and site structure
- On-page SEO
- Technical SEO
Crawlability and site structure
This layer covers how accessible each page within your website is. If Googlebot isn't able to crawl and index your entire site, your search rankings, traffic, and revenue will suffer severely in the long run.
The following are the main errors you should avoid:
- Links and redirects
SEMrush found that every fourth website has link errors and that 30 percent of websites have broken internal links. Also keep in mind that your website may be one of the 26.5 percent that return 4XX errors as well.
- Sitemap and robots.txt
Even though only a small percentage of websites have formatting errors in their sitemap.xml or robots.txt, having either on your website will damage your chances of getting more views and ranking higher in search results.
On-page SEO
While the previous layer looks at your website as a whole, this one focuses on mistakes you might be making on each individual page. Fixing the errors found in this layer will not only improve your website's search rankings but also increase the off-page performance of every single page.
Here’s a list of key errors you should watch out for in this layer:
- Content
Search Engine Journal mentioned in a 2017 article that there are three technical SEO issues most website owners run into.
The first is severe duplicate content issues; owners are advised to add a rel="canonical" tag and to create unique content instead of well-optimized duplicate content.
Following that is webpages having a low text-to-HTML ratio, which is present in 93.72 percent of websites.
Finally, 73.47 percent of pages have at most 250 words per page, which makes them read as thin and unnatural.
- Meta descriptions
If you are one of the 63 percent of owners who do not put in the effort to create a meta description for each and every page, this can result in a lower click-through rate.
An even bigger error is having duplicate meta descriptions across your webpages. 54 percent of owners make this mistake, and it heavily affects the overall click-through rate as well.
- Title tags, H1 tags & images
Titles, headings, and images all factor into website ranking.
Make sure to add missing alt tags to all your images, shorten wordy title tags, and add an H1 tag to each of your pages. Leaving these errors in place will negatively affect your website's overall UX and decrease your rankings at the same time.
Relevant: Learn all about On-page SEO here
Technical SEO
In this section, take a look at how fast your website loads, not only on devices such as PCs and laptops but also on smartphones and tablets. A study by Amazon linked every 100 milliseconds of extra load time to a 1 percent drop in sales, so avoiding errors in this layer will directly benefit your website's growth.
Here’s a list of key errors to watch out for in this layer:
- Page speed
One of the biggest issues your website can have is a slow page load speed on any of its pages. Ignoring this problem will cause your overall Google ranking to drop.
Another problem that should be resolved immediately is large HTML page sizes. Even though less than 1 percent of websites have an oversized HTML page, having even one page built this way will not only make your website slower by comparison but also less user-friendly in the process.
The following elements in a webpage can affect this criterion:
- Website host
- Large images
- Embedded videos
- Plug-ins
- Ads
- Theme
- Widgets
- Repetitive scripts or dense code
- Old technology
This aspect refers to how the website was built. For example, a website built with PHP 4 instead of the current PHP 7 is running on old technology.
Several elements are expected of this generation's websites, including Google Analytics, Google Tag Manager, schema markup, and a robots.txt file. Elements such as Flash and iframes should be removed immediately in order to improve loading time as well.
As for the schema markup and robots.txt, make sure the former is not blocked by the latter. After that, keep it running on the server by checking on it from time to time.
- Mobile-friendliness
As smartphone ownership keeps rising exponentially, your website must be built so it can be accessed on smaller devices such as smartphones, tablets, and smartwatches. This also matters because of Google's mobile-first indexing, announced back in December 2018.
Aside from that trend, smartphone traffic has long since exceeded desktop searches.
This means that a website that fails to adapt to these trends will suffer a major loss in click-through rate, search rankings, and potential traffic all at once.
After learning which SEO errors to fix right away, let’s now dive in on how you can perform your own technical SEO audit.
You might also like to read my guide on the top 20 SEO tools of 2021 here
Step 1: Identify Crawl Errors
The first step is to have your site audited and get a crawl report. The report will tell you which of the errors above you should mitigate right away.
It is highly recommended that you do this monthly in order to keep your site clean of SEO errors and as optimized as possible.
Before making any changes based on the report, make sure to create a backup of the site first.
After the backup, begin fixing crawl errors by turning most 404 errors into 301 redirects.
Then, pass the updated website to a development team to determine the root cause, whether that means adding a new .htaccess file or increasing the server's memory limit.
Also, remove any permanent redirects from the sitemap, internal links, and external links.
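If you want a quick supplement to a crawl report, a short script can surface 4XX errors and stray redirects on its own. Below is a minimal sketch in Python, assuming you keep a plain-text list of your page URLs in a hypothetical urls.txt file; it needs the third-party requests package.

```python
# Minimal crawl-error check. Assumes urls.txt lists one URL per line;
# both the file name and its contents are placeholders for your own pages.
import requests

def check_urls(path="urls.txt"):
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            # allow_redirects=False shows the raw 301/302 instead of
            # silently following the redirect chain to its destination
            response = requests.get(url, allow_redirects=False, timeout=10)
            if response.status_code >= 400:
                print(f"FIX NOW  {url} -> {response.status_code}")
            elif response.status_code in (301, 302):
                print(f"REDIRECT {url} -> {response.headers.get('Location')}")
        except requests.RequestException as exc:
            print(f"UNREACHABLE {url} -> {exc}")

if __name__ == "__main__":
    check_urls()
```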
Step 2: Check HTTPS status codes
Making sure that all your webpages are served over HTTPS and not HTTP is a must. Aside from that, you should also check for other errors using Google Search Console.
The following are some of the response codes that your website may have:
- 301 – the standard permanent redirect
- 302 – a temporary redirect, often seen on e-commerce sites when a product is out of stock
- 400 – users are unable to access the page
- 403 – users are unauthorized to access the page
- 404 – the page is not found, which might be a result of you deleting a page without adding a 301 redirect
- 500 – internal server error, which has to be fixed with a web development team
Doing so will help remove any 4xx and 5xx response codes, thus improving site crawlability and user experience.
Finally, check all your SSL certificates. Doing this will not only avoid crawlability errors but also keep all data between you and your visitors secured at the same time.
Regarding SSL certificates, you should prioritize getting one for your domain and subdomains as soon as possible. Users who visit a website without one will see a "Not Secure" warning in their browsers, which will decrease your visitor count in the process.
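To see when a certificate expires without waiting for a browser warning, you can query it directly. Here is a small sketch using only Python's standard library; example.com is a placeholder for your own domain.

```python
import socket
import ssl
from datetime import datetime

def check_certificate(hostname, port=443):
    """Fetch a host's SSL certificate and report its expiry date."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # The notAfter field looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    days_left = (expires - datetime.utcnow()).days
    print(f"{hostname}: certificate valid until {expires} ({days_left} days left)")

check_certificate("example.com")  # placeholder domain
```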
Step 3: Check XML sitemap status
As mentioned in the previous section, having a sitemap will help you earn a higher search ranking. It also helps search engine crawlers find your webpages. A small self-check sketch follows the checklist below.
What to take note of while editing an XML sitemap
- It is written in XML free of any formatting errors, lists URLs that return proper 200 status codes, and uses canonical URLs.
- The entire file is written using the XML sitemap protocol.
- Includes all updated pages on the website.
- Submit it to Google Search Console or add it to your robots.txt file.
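As a rough self-check between audits, you can parse the sitemap yourself and confirm that every listed URL returns a 200. Here is a sketch in Python, assuming your sitemap sits at the conventional /sitemap.xml path; it needs the requests package.

```python
import xml.etree.ElementTree as ET
import requests

# Standard namespace from the XML sitemap protocol mentioned above
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    xml_text = requests.get(sitemap_url, timeout=10).text
    root = ET.fromstring(xml_text)  # raises ParseError on formatting mistakes
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            print(f"{url} returned {status}, expected 200")

check_sitemap("https://example.com/sitemap.xml")  # placeholder URL
```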
Step 4: Check site load time
Keeping every page of your website quickly accessible to past, present, and future visitors is a must in order to keep traffic high.
For this step, you can use Google's PageSpeed Insights or another site-loading tool. Ideally, load time should be less than 3 seconds.
Resolving this can be done by organizing which elements of the site should be optimized for faster loading times and a better user experience.
Aside from that, check whether the server is down or operating slower than usual. This can occur when multiple users attempt to access the server at once, and one possible response is to upgrade it as soon as possible.
Then, check whether any of your subdomains redirect to the main website. Doing this will help prevent complete page invisibility on any of your sites in the process.
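PageSpeed Insights remains the fuller measurement, but a quick server-response timing can be scripted. The sketch below only measures the time to the server's response, not the full in-browser render, so treat it as a rough first signal; the URL is a placeholder.

```python
import requests

def measure_response_time(url):
    # response.elapsed covers the request/response round trip only;
    # images, scripts, and rendering add to what real visitors feel
    response = requests.get(url, timeout=30)
    seconds = response.elapsed.total_seconds()
    verdict = "OK" if seconds < 3 else "TOO SLOW"
    print(f"{url}: {seconds:.2f}s server response ({verdict})")

measure_response_time("https://example.com/")  # placeholder URL
```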
Cloaking
If your website appears in a search result but does not contain the information the user searched for, the practice behind it is known as cloaking.
An example of this phenomenon is serving HTML text to crawlers while serving visuals to visitors. Another is designing a page whose text color is nearly identical to its background.
If this is present on your website, it is highly suggested that you remove it immediately by reworking how each webpage is built. Doing this will prevent any damage to your overall ranking in the process.
Step 5: Ensure your site is mobile-friendly
For this step, you can use the Google Mobile-Friendly Test. Simply enter your site into the tool and it will generate a report on how your website works on mobile and what needs to be improved. A small code check for one of the basics follows the list below.
Solutions on how to make a website mobile-friendly
- Increase font size
- Embed YouTube videos
- Compress images
- Use Accelerated Mobile Pages (AMP)
- Add structured data
- Update URLs to mobile URLs
- Check and update the hreflang code and links as often as possible
- Add touch elements
- Include localization
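One of these basics is easy to verify in code: whether each page declares a responsive viewport meta tag. A small sketch, assuming the requests and beautifulsoup4 packages and a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

def has_responsive_viewport(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "viewport"})
    # A mobile-friendly page typically declares width=device-width
    return tag is not None and "width=device-width" in tag.get("content", "")

print(has_responsive_viewport("https://example.com/"))  # placeholder URL
```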
Step 6: Audit for keyword cannibalization
This audit checks for articles that share the same keywords. Having these makes Google unsure which page to prioritize when they appear in search results.
Such duplicate pages will negatively affect your click-through rate, authority, and conversion rates.
Use Google Search Console's Performance report to check which pages are competing for the same keywords, then filter for pages that use the same keyword in their URL and see how many pages are ranking for those same keywords.
Duplicate sites include subdomains as well. An example is running blog.yourwebsite.com alongside yourwebsite.com as your main site.
Other ways to optimize your use of keywords across all of your pages include the following:
- Use H1 tags for top-level headings
- Apply H2 tags for main categories
- Include H3 to H6 for subcategories and important links
- Avoid repeating the same keywords across multiple headings
At the same time, watch out for these heading mistakes (a sketch for spotting cannibalized keywords at scale follows this list):
- Writing the entire page content in an H tag
- Not using H tags for headings at all
- Reversing the order of H tags
- Using H1 for all headings on a single page, except when highlighting equally important topics on one page
- Covering similar topics or targeting the same keywords on both your main domain and subdomains
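To spot cannibalized keywords at scale, you can group an exported Performance report by query. A sketch, assuming you have exported query/page rows to a hypothetical gsc_export.csv with "query" and "page" columns (the file and column names are placeholders for your own export):

```python
import csv
from collections import defaultdict

def find_cannibalized_keywords(path="gsc_export.csv"):
    pages_per_query = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pages_per_query[row["query"]].add(row["page"])
    # Any query ranking more than one page is a cannibalization candidate
    for query, pages in sorted(pages_per_query.items()):
        if len(pages) > 1:
            print(f"'{query}' is split across {len(pages)} pages: {sorted(pages)}")

find_cannibalized_keywords()
```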
Step 7: Check your site’s robots.txt file
For this step, open your robots.txt file and look for any “Disallow” lines in it.
Any such line means that one or more of your links are blocked from search engines and cannot be displayed in future search results.
By disallowing the pages that should not be crawled and allowing the ones that should be, you will not only increase your website's visibility but also your search rankings.
Aside from that, make sure to update all disallowed links to lowercase.
Then, if you own several subdomain websites, take the time to make a robots.txt for each and every one.
Following that, remove any parameter URLs and non-indexable pages, and add the rel="alternate" tag for the sitemap.
Finally, always use robots.txt properly while maintaining the website. There are two moments to check it: during development and after launch.
During development, always make it a habit to block crawlers in your robots.txt. Doing this will prevent search engines from indexing incomplete, non-optimized content and from treating previous versions of your website's pages as duplicates.
After launch, check its contents as often as possible. Doing this will prevent your pages from being ranked improperly, and fixing issues is as simple as setting crawl directives for search engines.
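Python ships a robots.txt parser, so the crawl directives can be verified rather than eyeballed. A minimal sketch with placeholder URLs:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")  # placeholder
parser.read()

# Confirm that pages you want indexed are actually crawlable
for url in ["https://example.com/", "https://example.com/blog/sample-post"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'BLOCKED by robots.txt'}")
```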
Step 8: Perform a Google site search
One way to make sure that your site has not been penalized by Google or blocked from being indexed is to simply search for it on Google.
Simply type “site:” into the Google search bar, add your domain after the colon, and check which pages the engine has indexed. The results will tell you whether to go back to the previous steps and perform any error-checking there as an improvement.
If, for example, another brand appears in the results, address it immediately, as it may point to a bigger issue within your website. Solving this problem requires diving into your analytics.
Then, if your homepage does not appear as the first result, check it manually to see what is missing. Find out whether it carries a penalty or sits within a poorly made site architecture.
Finally, cross-check the number of organic landing pages against the search results Google returns. Doing this will give you an idea of what the engine sees as valuable.
Also, read how to get your website to the top of Google here
Step 9: Check for duplicate metadata
In the sixth step, we discussed removing keyword cannibalization by correcting pages that share the same keyword in their URL or rank for the same keywords. In this step, I will introduce another solution to this problem.
It is named using two words: meta description.
According to research, 54 percent of sites have duplicate meta descriptions, while 63 percent are missing them entirely.
To spot them, first get a detailed SEO audit or a crawl report. Then start with your highest-ranking, highest-value pages, keeping in mind to make every meta description unique.
Step 10: Meta description length
Following the previous step, another problem you can encounter with meta descriptions is excessive length.
Recently, the recommended length was changed from 160 to 320 characters. Making use of most or all of the available characters to add key elements, such as product specifications and location, can help increase your click-through rate.
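Both of these meta description checks, duplicates from the previous step and length from this one, fit into a single pass. Here is a sketch assuming the requests and beautifulsoup4 packages and a placeholder URL list:

```python
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

MAX_LENGTH = 320  # the character limit discussed above

def audit_meta_descriptions(urls):
    pages_by_description = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        tag = soup.find("meta", attrs={"name": "description"})
        description = (tag.get("content") or "").strip() if tag else ""
        if not description:
            print(f"MISSING   {url}")
        elif len(description) > MAX_LENGTH:
            print(f"TOO LONG  {url} ({len(description)} characters)")
        pages_by_description[description].append(url)
    for description, pages in pages_by_description.items():
        if description and len(pages) > 1:
            print(f"DUPLICATE '{description[:60]}' shared by {pages}")

audit_meta_descriptions(["https://example.com/", "https://example.com/about"])
```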
Step 11: Check for site-wide duplicate content
Aside from the duplicate meta descriptions mentioned in step 9, you should also take care of other duplicate content across your website.
Other content that can be duplicated across your entire website includes the following:
- Duplicate body content from tag pages
- Two domains
- Subdomains
- Similar content on a different domain
- Improperly implemented pagination pages
A way to spot these errors is to use a tool such as Copyscape, Screaming Frog, Sitebulb, or SEMrush.
After learning the areas of interest, fixing them will involve one or more of these steps (a duplicate-detection sketch follows the list):
- Adding canonical tags to all pages to let Google know the preferred URL for each page.
- Adding nofollow attributes to links to prevent passing a page's SEO value to another website.
- Disallowing incorrect URLs in robots.txt.
- Rewriting all duplicated content, including body copy and metadata.
- Manually reviewing and fixing any pagination errors.
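For exact duplicates, hashing the visible text of each page is enough to flag offenders; near-duplicates still call for one of the tools named above. A sketch with placeholder URLs, assuming the requests and beautifulsoup4 packages:

```python
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def find_duplicate_pages(urls):
    pages_by_digest = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        text = soup.get_text(separator=" ", strip=True)
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        pages_by_digest[digest].append(url)
    for pages in pages_by_digest.values():
        if len(pages) > 1:
            print(f"Identical body content on: {pages}")

find_duplicate_pages(["https://example.com/a", "https://example.com/b"])
```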
Step 12: Check for broken links
The last item to check for in your technical SEO report is bad links. This type of error introduces your website to a host of other problems, such as wasted crawl budget, a poor user experience, and lower search rankings.
Once you know where exactly these errors occur, the first step is to remove all occurrences of pages that redirect users to old 404 pages and update them with the correct internal links. The next step is to remove intermediate redirects between pages.
Doing this reduces wasted crawl budget, which means all website pages render properly while keeping the load on your host efficient.
In short, this will help keep the user experience positive in the long run. You can use a plugin like Broken Link Checker for this.
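If you prefer a script to a plugin, a single-page pass is simple to write; a full audit would crawl the whole site the same way. A sketch assuming the requests and beautifulsoup4 packages, with a placeholder start URL:

```python
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def check_page_links(page_url):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, tel:, and javascript: links
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
            if status >= 400:
                print(f"BROKEN {link} ({status}) linked from {page_url}")
        except requests.RequestException as exc:
            print(f"UNREACHABLE {link} -> {exc}")

check_page_links("https://example.com/")  # placeholder URL
```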
Conclusion
To summarize, performing a technical SEO audit means exploring every nook and cranny of your website in order to keep it loading fast while improving your search rankings at the same time. From content to XML sitemaps, make sure to have a tool ready to check for any errors and resolve them immediately.
Still feel confused or overwhelmed? No problem, I can tackle the audit and give you an actionable plan with prioritized fixes. If you are interested, please check this product:
Are you ready to apply these tips to your own website to help it grow? If you are, you can learn more by checking out these articles too: