Auditing your website: Complete 13-Step Guide To Follow
Sep 23, 2017
Maintaining a well-designed and well-programmed website is a major technical step in your SEO process.
As an SEO specialist, you are not required to be an experienced developer to succeed.
However, you need to understand the major aspects of building a successful website and which factors you should consider.
Google bots can’t see a website the same way human eyes do; they only see the code that your web browser understands and renders into the visuals you see each time you visit a website.
Hence, how your website was built and coded can tremendously affect your SEO score.
The UX (user experience), look, and feel of your website affect the relationship between the site and its visitors as well, which in turn also affects your SEO score.
I always audit a client’s website in depth before I start conducting SEO on it.
So let’s start with the major factors you should consider and the tools that can help you through the process:
1.Choose a proper CMS “Content Management System”
A CMS is a must-have asset for building and managing your website.
It manages the categories, posts, and pages of your website, letting you focus on your content instead of the time consumed by managing the website system.
A CMS can handle a lot of the heavy lifting for you: it can organize your posts, assign them to your desired categories, and manage the hierarchy of your web pages.
Choosing the proper CMS is crucial when planning a successful website, as changing the CMS later can be a huge headache and cause a lot of trouble too.
Pick the CMS that fits your business model. For example, if you are planning to build a blog or a website that only offers information about your business to your clients, then you can consider WordPress.
WordPress is one of the most famous CMSs, with a huge market share and websites already using it for many different purposes.
WordPress can be used for e-commerce as well through its plugins, but there is plenty of more specialized e-commerce software to build your website with.
Also, in some cases, you might need to build your own CMS from scratch and custom-code it.
2.Optimize your website navigation
The navigation of your website starts from the website’s main menu.
The menu is the main way for visitors to navigate a site, as it holds the major web pages, which should lead to all the other pages.
You should add the most important web pages, like the contact page, services, shop, and blog, to your website’s main menu to make it easier for visitors to navigate.
The easier visitors can reach what they are looking for, the better user experience they will have.
Better UX leads to a better SEO score for your website: it results in longer time on page and a lower bounce rate, among many other factors.
Navigation includes more elements than just the menu, from the website homepage to the pagination of your blog categories.
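As a minimal sketch, a main menu is just a list of links to the site’s major pages; the page names and URLs below are examples:

```html
<!-- Main menu: links to the site's major pages (URLs are examples) -->
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services/">Services</a></li>
    <li><a href="/shop/">Shop</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```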
3.Site category structure
Structuring your website categories with the correct hierarchy for your business model is crucial to giving your visitors a good experience.
For a successful blog, you need to have each post under the correct category and each subcategory under its correct parent category.
You can also make good use of tags to sort things out further.
Each category and subcategory should have the correct and most descriptive meta tags, mainly the title and description.
Consider using deep linking to your site’s web pages, which means linking to pages and blog posts directly from other pages without following the hierarchy.
For example, you can link to a certain blog post from your website’s homepage or even from another blog post; these are also called internal links.
4.Use Breadcrumbs for better navigation
Breadcrumbs are the widget area that shows visitors where they are at the moment.
For example, it can look like: Home > Category > Blog Article.
It can be very helpful for visitors to navigate back to the main category of the blog post they are currently reading, and in many other scenarios too.
For WordPress, you can use the Yoast plugin to add breadcrumbs to your theme; it also adds more SEO options to your articles.
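If you are not on WordPress, basic breadcrumb markup can be sketched like this (the class name and URLs are illustrative):

```html
<!-- Breadcrumb trail: Home > Category > current article -->
<nav class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/category/">Category</a> &gt;
  <span>Blog Article</span>
</nav>
```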
5.Duplicated content and meta tags
Duplicated content won’t be counted by Google; it will show only one unique version of the content in the SERP.
If both the original and the duplicated version of a page are on your own website, you can either use a 301 redirect (on Apache, via .htaccess) to send visitors and bots from the duplicate to the original, or add a rel="canonical" link tag on the duplicated versions pointing to the original.
If the original content is not on your website, which means you have copied the content, then you need to delete that duplicated webpage immediately.
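On an Apache server, for example, a 301 redirect from the duplicate URL to the original can be added to .htaccess like this (both URLs are illustrative):

```apache
# .htaccess: permanently redirect the duplicate page to the original
Redirect 301 /duplicate-page/ https://example.com/original-page/
```

Alternatively, keep the duplicate reachable and add `<link rel="canonical" href="https://example.com/original-page/">` inside its `<head>` so Google knows which version to index.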
Duplicated meta tags, such as web pages that share the same title, the same description, or even both, can cause a lot of trouble for the website’s rank too.
Both duplicated content and duplicated meta tags can badly affect the website’s ranking, especially content on your website copied from another website.
Unique content is always valuable for both the website visitors and Google bots.
The bottom line is that your webpage will rank higher in the SERP for the value it adds to visitors, and uniqueness is a big part of that added value.
6.Keywords cannibalization and thin content
Repeated posts focused on the same keyword won’t help your ranking at all.
Instead of repeating yourself, add more value regularly for your website’s visitors.
Instead of making 5 or 6 blog posts on the same exact topic targeting the same keyword, create a single blog post that covers the topic thoroughly.
7.Image alt and links title
All the images on your website should have an alt attribute, and probably a title as well, which can help them rank in the image SERP and also adds value to your blog post’s score.
Image file names can also affect your ranking; however, you should not stuff your images’ alt, title, or file name with the keyword you are targeting. Make them descriptive instead.
All links on the page should have a title; you will notice a popup once you hover over a link that has one, and it also helps the bots understand what you are linking to.
Link titles can also affect your SEO score, but again, don’t stuff them with keywords; keep them descriptive.
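Descriptive alt and title attributes can look like this (the file names and URLs are examples):

```html
<!-- Descriptive, not keyword-stuffed, alt and title attributes -->
<img src="/images/website-audit-checklist.jpg"
     alt="Printed website audit checklist on a desk">
<a href="/blog/seo-audit-guide/"
   title="Step-by-step SEO audit guide">Read the full audit guide</a>
```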
8.Website usability for better user experience
The easier the website is to use, the better experience visitors will have.
You need to keep your website as usable as possible, including the navigation, linking, structure, titles, and all the other elements that allow visitors to reach the information they are looking for with ease.
9.Hosting server status
Page load time
The faster the web page loads, the better experience users will have.
Your website’s code will need to be adjusted to keep the page load time as small as possible; you can use Google’s PageSpeed tool to measure it.
The web host is also responsible for the website’s overall speed, so you need to pick a good host that delivers high performance.
You also need to consider the response time and timeouts of your server, as they can crucially damage your rank.
Mobile friendliness
It’s best to have a mobile-friendly website that’s easy to use on mobile phones and tablets; this is also called responsive design or mobile-first design.
You can use Google’s Mobile-Friendly Test tool to check whether your website is mobile friendly, with recommendations too.
Having a mobile-friendly website is crucial for good usability and user experience, which leads to a better SEO rank.
Shared vs. dedicated IP
Each website needs an IP address to work; when you choose your host, you need to find out whether the offered hosting uses a shared IP or a dedicated one.
The problem with shared IPs is that they can be shared with spammy websites, which can badly affect your website’s rank.
The best solution is to have a dedicated IP for the website.
SSL certificate
To secure the data sent between visitors and your website, you will need an SSL certificate installed on your website.
This is what enables HTTPS; you mostly see it on e-commerce sites or any website that handles online payments. However, Google itself has announced that HTTPS is crucial for better ranking.
The Google Chrome browser is also taking serious steps against unsecured HTTP pages with forms on them: it shows a warning to the visitor that the page is not secure for sending data.
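Once the SSL certificate is installed, you can redirect all plain-HTTP traffic to HTTPS. On Apache with mod_rewrite, an illustrative .htaccess sketch is:

```apache
# Force HTTPS: 301-redirect any plain-HTTP request to the HTTPS version
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```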
Bandwidth and disk space limitation
Each host you will choose for your website will have many packages to choose from.
You need to understand your website’s requirements to choose a hosting package with specs that meet them.
For instance, if you plan to host many large images on your website, then you need a host with more disk space.
Bandwidth is also crucial: if you estimate that your website will get some heavy traffic, then you might need more bandwidth too.
You don’t need to overthink this, as it’s easy to upgrade your hosting package later without affecting your website.
10.Accessible for spiders
You need to keep your website easily accessible to the spiders/bots of the search engines.
To check whether Google can access and index your website’s pages, you can use “Fetch as Google” in Webmaster Tools (Search Console).
To make sure the website is not facing a penalty, you can search Google for site:example.com to verify that Google hasn’t removed your website entirely from the SERP.
Of course, there are a lot of penalties where Google won’t entirely remove the site from the SERP but will still aggressively hurt its rank.
Make sure that none of your webpages have a noindex meta tag on them, as it simply tells Google to stay away from those pages.
There are tools you can use to check whether a webpage has a noindex or nofollow tag on it.
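A simple version of such a check can be sketched in a few lines of Python using only the standard library; the function name and structure here are illustrative, not any specific tool’s API:

```python
# Sketch: scan an HTML document for <meta name="robots"> directives
# such as noindex or nofollow (illustrative, stdlib only).
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            if content:
                self.directives += [d.strip().lower() for d in content.split(",")]

def robots_directives(html):
    """Return the list of robots meta directives found in the HTML."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(robots_directives(page))  # ['noindex', 'nofollow']
```

If the returned list contains "noindex", that page is telling Google to keep it out of the index; a full checker would fetch each URL and run a check like this on the response body.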
Sitemaps are a crucial part of the website and are usually located at: example.com/sitemap.xml
A sitemap guides Google to your website’s pages, so it’s necessary to include all your pages in it.
In Google Search Console, you can open the Sitemaps section to check how many links from your sitemap have been indexed and whether there is any problem with the sitemap itself.
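A minimal sitemap.xml listing two pages can look like this (the URLs and date are examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/my-first-post/</loc>
    <lastmod>2017-09-23</lastmod>
  </url>
  <url>
    <loc>https://example.com/contact/</loc>
  </url>
</urlset>
```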
The robots.txt file is always located at: example.com/robots.txt
It tells the bots whether or not they should crawl specific folders or file types, and it also points to the location of sitemap.xml.
You can also check whether there are any problems with your website’s current robots.txt file from Google Search Console.
You always need to make sure that the resources required to render your web pages are not blocked from Google; the JS and CSS must always be available for Google to crawl.
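An illustrative robots.txt that blocks a private folder while keeping CSS and JS crawlable (the paths are examples) could be:

```txt
User-agent: *
Disallow: /admin/
# Keep rendering resources crawlable so Google can render the pages
Allow: /assets/css/
Allow: /assets/js/

Sitemap: https://example.com/sitemap.xml
```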
13. Sneaky redirects and Cloaking
Be very careful with redirects on your website; any sneaky redirect, even one done by mistake, will get you into trouble with the bots.
Also, if your website shows the bots a different version of a webpage than the one humans see, that can tremendously damage your ranking.
If your website makes a lot of API calls to endpoints that mostly return HTTP statuses rejecting connections from the bots, try blocking those endpoints using the robots.txt file.
Auditing a website takes both the knowledge of SEO and the skills of a web developer.
The SEO specialist is not required to be an outstanding developer to make it work; however, you need a basic understanding of how websites work.
The main meta tags and HTML elements of the website’s pages also need to be clear to you.
You will be responsible for generating a report of the issues on the website, delivering it to the web developer to fix, and then checking it all over again after the developer is done.
Most of the reporting work you do will be conducted via multiple tools that automate the whole process for you; however, that basic understanding is always required to save the day.