
How to do a quick SEO audit of your site
Having a slick, fast, and optimized website is a must if you want a good position in organic search results. Google has been steadily shrinking the number of organic results on the first page of the SERP for years, introducing new answer formats and rich results while increasing the number of paid results.
Knowing all that, you can see just how important it is to optimize every detail on your website if you want your audience to be able to find it online.
In this article, we’ll cover some basic things which anyone can do to make sure their site is optimized. No expensive tools needed, only Google.
Let’s dig in.
Check if your website is indexed
The first thing to do is to see if your site is in Google Search in the first place. To check whether Google has crawled and indexed your website, go to google.com and enter the following search operator:
site:yourdomain.com
If Google has already crawled your website, you will see the number of pages indexed.
Optimize your robots.txt file
Robots.txt is a file that tells search engine crawlers which pages on your site they should crawl and which ones they should not. It’s used to avoid overloading your site with unnecessary crawl requests.
A simple robots.txt file, which allows all crawlers and all pages to be crawled, looks like this:
User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap.xml
If your site is a SaaS site or has a web app, for example, make sure to disallow crawler requests to it in robots.txt. Bots can’t crawl past login pages, so there’s no need to waste your crawl budget on them.
User-agent: *
Allow: /
Disallow: /app
Sitemap: http://www.example.com/sitemap.xml
Upload your sitemap to Google
You have probably noticed that the last line in the robots.txt file is reserved for the sitemap. A sitemap tells crawlers the structure and pages of your site. Technically, you don’t actually need a sitemap for your website. Googlebot is perfectly capable of finding your pages on its own. It won’t if you instruct it that way in robots.txt, or if the page in question is an orphan page – a page not linked to from any other section of your site.
So, why use sitemaps, you ask?
Well, it can take a bit of time for a bot to find your new or updated page. When you update your sitemap, a crawler will follow the reference in your robots.txt file to read the sitemap and the pages included in it. That way, you’ll speed up the process of bots finding your pages.
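If you’re curious what that file actually contains, here is a minimal sitemap.xml sketch for a single hypothetical page (the URL and date are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
</urlset>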
If you want to speed up that process even more, you can upload your sitemap to Google Search Console. From there, go to the Sitemaps section, enter your sitemap URL, and click “Submit”.
Check if your site is secure
If your site is neatly indexed and all important pages are showing, you can now check whether your website is secure and protected with an SSL/TLS certificate. Since mid-2018, Google has been tagging unsecure websites with a “Not secure” label before the site’s URL, fulfilling a promise made several years earlier to name and shame websites with unencrypted connections that could pose a security threat to their visitors. At the same time, sites that do have the certificate are labeled with a green lock icon and the word “Secure” in the URL bar.
If you are not sure whether your website has SSL/TLS, you can check by looking for the little lock icon before the URL. Alternatively, search for all of your indexed https pages and check whether that number matches the total number of pages on your site.
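For example, a couple of site: searches give you a rough comparison (yourdomain.com is a placeholder, and the counts Google shows are only estimates):
site:yourdomain.com
site:yourdomain.com inurl:https
If the second number is noticeably lower than the first, some of your indexed pages are probably still being served over plain http.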
Check if your site is easy to use and navigate
One of the things Google cares about most is your website’s usability. Here are some basic boxes you should make sure are ticked:
- Construct a simple and intuitive site architecture and flow of information.
- Make sure the URLs are clean and don’t contain unnecessary numbers or tags.
- Use appropriate Title and Meta tags (see the example after this list).
- Structure your content so it’s easy to consume. Use headings and other formatting elements that break the monotony of the text.
- Insert appropriate internal links to other pages your readers may find useful. You can check them in GSC under the Links report > Internal links > Top linked pages table.
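For reference, here’s a minimal sketch of what sensible title and meta tags could look like in a page’s head (the wording is just a placeholder):
<head>
  <title>Quick SEO Audit Guide | Example Site</title>
  <meta name="description" content="A short, accurate summary of the page that search engines may show as the snippet.">
</head>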
Audit your site’s backlinks
While you’re in Google Search Console, go to Links > External links > Top linked pages.
Here you can see which websites are linking to your own site. It’s important to check whether some spammy websites are linking to your site. If they are, you may want to disavow those domains with Google’s disavow tool.
Now, Google’s John Mueller has stated several times that you don’t really need to disavow spammy links. So, unless you engaged in black or gray hat SEO practices like buying links or using PBNs, you don’t need to worry too much about it. At the same time, many SEOs have said that they still saw some improvement in ranking positions after disavowing bad backlinks.
To disavow links, go to the disavow links tool page and upload your disavow file in .txt format. You can disavow specific URLs or choose to disavow an entire domain that you think can harm your site’s backlink portfolio. It looks something like this:
# Disavow pages
http://spam.example.com/stuff/comments.html
http://spam.example.com/stuff/paid-links.html
# Disavow domain
domain:spammysite.com
Audit your backlink anchors
After checking your backlink sources, stay in the Links section of GSC and go through your “Top linking text” report. There you can find the anchors other sites use when linking to your website. This is very useful information for several reasons.
Firstly, you can see how other people perceive and understand the theme of your website by the words they use to link to it. This can help you create a better content and branding strategy for your site.
Secondly, examining the link text from other websites is useful for checking whether there was excessive use of exact match anchor text in the past, either by you or by an SEO you hired. Having too many exact match anchors pointing to your business pages can provoke Google to take manual action against your site. Branded anchors are mostly fine unless you have violated Google’s Quality guidelines.
Thirdly, check this section to see if someone is spamming your website on purpose with bad anchors. For example, a competitor may use inappropriate anchors related to porn, gambling, drugs, etc., to try to hurt your website’s ranking. Google says that they are capable of detecting and ignoring these kinds of links, but better to be safe than sorry, if you ask me.
Check the speed of your site
Since Google transitioned to mobile-first indexing in March 2018, website speed has increased in importance. Slower loading pages don’t get penalized or pushed back in results for their lack of snappiness, but competing pages that load faster can be prioritized on the search engine results page.
To check the speed of your home, landing, or any other important page of your site, go to Google’s PageSpeed Insights, insert the desired URL and run the audit. On the results page, you will have tabs for both mobile and desktop results.
This tool will also give you feedback about the things you need to optimize on the given page. The most common issues that impact page speed are large images and non-minified code files.
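If you prefer to check speed from the command line or a script, PageSpeed Insights also exposes a public API. A rough sketch with curl (the url parameter is a placeholder, and the results come back as JSON):
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com&strategy=mobile"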
Do a Lighthouse audit of your website
Lighthouse is another great tool from Google for doing a quick site audit. It checks some of the key technical factors that impact your site and grades them on a scale from 0 to 100.
Those are:
- Performance,
- SEO,
- Accessibility,
- Best Practices,
- Progressive Web App.
To start the audit, go to your desired page and open Chrome DevTools. You can do this by right-clicking anywhere on the page and going to the “Inspect” option.
Alternatively, press Command+Option+C (Mac) or Control+Shift+C (Windows, Linux, Chrome OS).
After you have opened Chrome DevTools, go to the Audits tab (far right) and click “Run audits”.
Note: It is best to run your audits in Incognito mode in Chrome to prevent extensions and the cache from impacting audit results.
After you run the test, which takes about a minute, Lighthouse gives you the list of problematic things that you’ll need to attend to. It’s a very useful tool that gives you pointers and tells you how urgent specific fixes are.
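If you’d rather not click through DevTools every time, Lighthouse can also be run from the command line via npm. A quick sketch, assuming you have Node.js installed (example.com is a placeholder):
npm install -g lighthouse
lighthouse https://www.example.com --view
The --view flag opens the generated HTML report in your browser once the audit finishes.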
These are just some of the basics for making sure your website is nicely tuned. The goal is that both your visitors and crawlers have no problem understanding and consuming your website’s content.
If you have a smaller website, this is pretty much everything you need to take care of to make it optimized for search engines.
If you do not have the skills in-house, have no fear: we have hand-curated a list of freelance job websites that will save you time. Make sure you create a comprehensive list of all the tasks you require so you get an accurate cost estimate, and remember to check the reviews (good and bad).
Mark Maric is a marketing manager specializing in digital marketing and SEO. Currently, he’s at Clockify, trying to make the world a more productive place. Check him out at @mmmaric for more.