4 Ways to Spot SEO Problems Before They Even Happen
It’s not such an unlikely scenario at all…
And then one morning, it’s all gone.
Almost the entire organic traffic wiped out.
It could have been caused by anything: a new algorithm update, a manual action or a technical problem with the site.
It doesn’t matter.
What does matter, however, is that it could have been avoided.
Simply monitoring for problems could have saved your site from plummeting in the rankings.
Let’s face it:
In many instances, SEO tends to be reactive, with action taken only after a problem has occurred.
But what if you knew what to do to spot problems before they escalate?
Well, that’s exactly what I’m going to show you today.
Conduct Regular Site Audits
A site audit is the process of examining a website to establish what is working well and what is not.
Even the most basic site audit could help identify potential issues like duplicate content or crawler access problems that could hinder your SEO efforts.
But the most important thing about site audits is that they’re not a one-off exercise.
Many businesses conduct an audit right before launching an SEO campaign and leave it at that.
A website is a living, changing entity. It evolves and expands with every new page, piece of content or feature added to it.
Say one of your colleagues adds new pages to the site but forgets to optimize them properly. Now, despite the new content being live, there is little chance it will achieve good search engine rankings.
Or a programmer unknowingly changes an important part of a template while pushing other code updates. Now every page built from that template contains the error, potentially preventing those pages from ranking.
Needless to say, such problems could negatively affect your site’s rankings and organic traffic.
And the only way to prevent them is by conducting regular site audits.
Here are some of the most important problems auditing a site could identify:
Duplicate Content. We’ve written about the duplicate content issue at length in this post. But to reiterate, duplicate content is content that appears on the web in more than one location.
This could be the same copy appearing on different pages on the same domain or across multiple domains.
Duplicate content confuses search engines as to which version to display in search results, often forcing them to either display a copy instead of the original or drop the content from the index entirely.
Duplicate meta information. The title and description meta tags are two of the most important on-page ranking factors. They help search engines understand what a page is about, and search engines also use them to display your listing in search results.
But with hundreds of pages to manage, it’s easy to lose track of meta tags and either start reusing duplicates or leave title and description tags missing altogether.
Accessibility Issues. To rank your website, Google first needs to be able to access it, scan its pages and include them in its search index.
But if you accidentally block that access, via an error in your robots.txt file or a faulty plugin, search engines will no longer be able to crawl your site, and may even drop it from the index as a result.
Broken Links. Links pointing to pages or websites that no longer exist provide a poor user experience and might be considered negative ranking factors.
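Several of the checks above are easy to script. As a minimal illustration, here is a Python sketch that flags duplicate title tags in a set of already-scraped pages (the URLs and titles are made-up audit data, not from any real site):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group page URLs by their <title> text and return any
    titles shared by more than one URL.
    `pages` maps URL -> title string collected during an audit."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical audit data
pages = {
    "/shoes/red": "Red Shoes | Example Store",
    "/shoes/red-sale": "Red Shoes | Example Store",   # duplicate title
    "/shoes/blue": "Blue Shoes | Example Store",
}
print(find_duplicate_titles(pages))
# {'red shoes | example store': ['/shoes/red', '/shoes/red-sale']}
```

In practice you would feed this from a crawler export (Screaming Frog, for instance, can dump URL/title pairs to CSV), but the grouping logic stays the same.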
Monitor Your Rankings
Rankings are the best metric of SEO success.
They indicate success or failure, make your strategies accountable and allow you to predict traffic and conversions.
But they also help spot SEO problems.
No other metric will indicate a problem better than a sudden drop in rankings.
Traffic naturally fluctuates depending on trends, seasonality, events and other factors. And thus, it’s easy to overlook small dips in the number of visits to the site.
An unexpected drop in rankings however immediately indicates a problem. It could be a technical fault with the site. Or Google releasing another algorithm update.
Regardless of the reasons, monitoring rankings can help to spot a problem and start taking action to rectify it.
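The monitoring itself can be as simple as comparing two snapshots from your rank tracker. Here is a hedged Python sketch; the data shapes and the 5-position threshold are illustrative assumptions, not a standard:

```python
def rank_drops(previous, current, threshold=5):
    """Compare two keyword -> position snapshots and flag keywords
    that fell by more than `threshold` positions, or that dropped
    out of the tracked range entirely (position is None)."""
    alerts = []
    for keyword, old_pos in previous.items():
        new_pos = current.get(keyword)
        if new_pos is None:
            alerts.append((keyword, old_pos, None))
        elif new_pos - old_pos > threshold:
            alerts.append((keyword, old_pos, new_pos))
    return alerts

# Hypothetical weekly snapshots from a rank tracker export
last_week = {"running shoes": 4, "trail shoes": 12, "shoe repair": 8}
this_week = {"running shoes": 5, "trail shoes": 31}
print(rank_drops(last_week, this_week))
# [('trail shoes', 12, 31), ('shoe repair', 8, None)]
```

A small drop like 4 to 5 is normal fluctuation and stays quiet; a 19-position fall or a vanished keyword is the kind of sudden change worth investigating.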
Audit Your Content Too
Only a few years ago, posting 350-500-word articles was a perfectly fine way to rank.
But today, in the post-Panda world, you probably know that the search engine will only rank content that passes rigorous criteria of length, usefulness and much more.
In this post Kathryn Aragon lists the characteristics of ideal, post-Panda content:
- Provide real answers to people’s problems,
- Help or entertain readers,
- Teach them how to do something,
- Include references and quotes, and back up every claim,
- Provide information people want to talk about on their own sites, and
- Get quoted, shared and linked to.
But how do you know if your content passes these criteria?
One simple way to find out is by auditing it.
A full-blown content audit will help you to:
- Identify all the content you have on the site,
- Find out if it meets the needs of your audience,
- Find out how well it is optimized for search performance,
- Understand what actions you could take to improve it and so on.
A content audit is a complex process that involves listing all the pages on your site and collecting vital data about their performance, traffic, optimization and more.
To learn about each of these steps, check out our detailed guide to conducting a content audit.
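One small, automatable piece of such an audit is flagging thin pages. The sketch below uses a 500-word floor purely as an illustrative assumption (echoing the pre-Panda 350-500-word norm mentioned above), not as any official Google rule:

```python
import re

def flag_thin_content(pages, min_words=500):
    """Return URLs whose body text falls below a word-count floor.
    A crude proxy for the 'thin content' part of a content audit;
    `pages` maps URL -> extracted body text."""
    thin = {}
    for url, text in pages.items():
        words = len(re.findall(r"\w+", text))
        if words < min_words:
            thin[url] = words
    return thin

# Hypothetical extracted page texts
pages = {
    "/guide/seo-audit": "word " * 1200,
    "/tag/misc": "word " * 40,   # likely a thin archive page
}
print(flag_thin_content(pages))
# {'/tag/misc': 40}
```

Word count alone says nothing about usefulness, of course; treat the output as a shortlist of pages to review by hand against the criteria above.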
Monitor Data in the Search Console
Search Console (formerly Google Webmaster Tools) is Google’s platform for providing webmasters with insights into their sites and their performance in search.
And needless to say, it includes a lot of data that could help you spot SEO problems before they happen:
Number of Indexed Pages. If the number of your site’s pages the search engine has indexed drops for no apparent reason (i.e. you didn’t remove those pages from the site), it might be an indication of crawler problems.
(The Index Status report of a relatively new site. Notice the small fluctuations in the number of indexed pages, possibly due to a few of them being removed after launch.)
Crawl Errors. Similarly, if the number of pages with errors goes up, it might be a signal to investigate.
The screenshot below shows what happened after a change of a WordPress plugin that resulted in some pages being deleted from the site.
Sitemaps. A sitemap helps search engines index your pages quickly and more thoroughly. But that’s provided there are no issues with the sitemap itself. If you use a plugin to update your sitemap automatically, you should regularly check that new versions haven’t introduced issues.
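A quick sanity check is to parse the sitemap yourself and look over the URLs it lists. A minimal Python sketch using the standard library (the sample XML is a made-up two-URL sitemap):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_bytes):
    """Parse a sitemap.xml document and return the listed URLs.
    In a fuller check you would then fetch each URL and confirm
    it returns HTTP 200 and isn't blocked by robots.txt."""
    root = ET.fromstring(xml_bytes)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

sample = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
print(sitemap_urls(sample))
# ['https://example.com/', 'https://example.com/about']
```

Running this against your live sitemap after every plugin update makes it obvious when pages silently disappear from, or pile up in, the file.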
Robots. We’ve already talked about how an error in your robots.txt file could prevent search bots from accessing your site. The robots.txt Tester tool in Search Console can help you quickly check your file for potential errors.
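You can also script the same check with Python’s standard library. The robots.txt below is hypothetical, deliberately containing the kind of overly broad Disallow rule that blocks a whole content section:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt with an accidental blanket block on /blog/
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /blog/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for url in ["https://example.com/", "https://example.com/blog/seo-tips"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, "->", verdict)
# https://example.com/ -> allowed
# https://example.com/blog/seo-tips -> BLOCKED
```

Pointing `rp` at your live file (via `RobotFileParser("https://example.com/robots.txt")` plus `rp.read()`) and asserting that a handful of key URLs stay fetchable makes a cheap scheduled safeguard against a bad deploy.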
Creative commons image by Mike Krzeszak / Flickr