Three New Free Onsite SEO Crawlers You Must Try


One of the reasons why SEO will never die is that your friendly optimizer will crawl your site to find out what issues a real search engine crawler might encounter when trying to index it correctly. Indeed, that task is so common that software does it these days.

You don’t need to code a crawling script from scratch like in the early days.

There are many tools that crawl your site and those of your competitors for you. Indeed, many of the scripts people once had to build for themselves are now available to the public for free.
There are also complex paid downloadable programs like the Screaming Frog SEO Spider, but there are new, web-based and (at least to some extent) free tools now.

Rob Hammond’s SEO Crawler

[Screenshot: Rob Hammond’s SEO Crawler – seed URL input]

I’d like to start with Rob Hammond’s SEO Crawler because it’s the easiest tool to use. It doesn’t require a sign-up and log-in. You can enter your (or any other) URL right away and start checking the site. The output is a simple, easy-to-scan table of URLs and potential technical problems from the SEO point of view.

Rob Hammond’s SEO Crawler shows you the basic issues that your site may face.

[Screenshot: multiple h1 tags reported]

For example, I discovered that our new Genesis-based WordPress theme has duplicate h1 tags. That was a surprise to me. In the worst case Google might consider this a trick to fool the search engine, and in the best case it will just discount the ranking value of your headline. The h1 is like the title of a book on the cover. You only have one!
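If you want to double-check a page for duplicate h1 tags yourself, here is a minimal sketch using only Python’s standard library (the sample markup is made up for illustration):

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> opening tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def count_h1(html: str) -> int:
    """Return the number of h1 tags found in the given HTML string."""
    parser = H1Counter()
    parser.feed(html)
    return parser.h1_count

# A page with a duplicate h1, like the one the crawler flagged:
page = "<html><body><h1>Site Name</h1><h1>Post Title</h1></body></html>"
print(count_h1(page))  # → 2
```

Anything above 1 is worth investigating; often it’s the theme wrapping the site logo in an extra h1.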

[Screenshot: 302 and 200 status codes reported]

Then I noticed the homepage redirect, a temporary 302 redirect that is a potential hazard. Google considers such pages not to have really moved, so the new address doesn’t get the full credit, while the old one, which sometimes still gets it, might be considered empty.

As a result, neither of the redirected pages usually gets ranked according to its true value.

In our case it’s the redirect to the https version of the site, which might not be that bad, as ideally only one of the two versions will get indexed anyway.
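If you control the server, the fix is usually to make the redirect permanent. On an Apache server, a sketch of an https redirect rule with a 301 status might look like this (your host’s configuration may differ, so treat this as an illustration rather than a drop-in fix):

```apache
# .htaccess: force https with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

With a 301 instead of a 302, Google treats the move as final and passes the ranking credit on to the new address.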

[Screenshot: page not found (404) reported]

Additionally, a broken link in my latest article appeared on the list. (I simply copied and pasted it from the browser and forgot that these days you don’t see the “http” in the URL bar anymore.) WordPress needs the protocol, though; otherwise it interprets the link as a local one.
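This behavior is easy to reproduce: a URL without a scheme is resolved relative to the current page. A quick illustration with Python’s standard library (the domains are made up):

```python
from urllib.parse import urljoin

# The page the link appears on (hypothetical):
page = "https://example.org/some-article/"

# A link pasted without the "http(s)://" scheme...
pasted = "www.example.com/resource"

# ...is treated as a path relative to the current page:
print(urljoin(page, pasted))
# → https://example.org/some-article/www.example.com/resource

# With the scheme, it resolves as an absolute link:
print(urljoin(page, "https://www.example.com/resource"))
# → https://www.example.com/resource
```

The first result is exactly the kind of accidental local URL that ends up as a 404 in your crawl report.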

I hope I don’t have to explain that broken links are not only bad for user experience but also harm your overall site authority on Google.

SiteCondor

[Screenshot: SiteCondor – create a job]

I was only recently approached by the co-founder of SiteCondor about his new tool. It’s a fairly well-designed and self-explanatory SEO crawler for the intermediate user that offers some helpful visualizations.

Of the three SEO crawlers described here SiteCondor has the best user experience in my opinion.

You get a quick, menu-like overview of your crawling job, for example:

[Screenshot: SiteCondor overview]

Once you click “Site Map” or “URL Structure” you also get neat visualizations of your site:

[Screenshot: SiteCondor site structure visualization]

SiteCondor quickly identifies the pages that could pose a risk, for example blocked pages:

[Screenshot: SiteCondor meta robots report]

In our case all blocked pages are harmless, though, or rather blocked on purpose, either to avoid confusing Google with duplicates or for security reasons.

SEOcrawler.co

[Screenshot: SEOcrawler.co – new project]

I’ve already recommended SEOcrawler.co earlier this year in a guest post I later republished on my own blog to get it back online. That’s why the number of free “credits” on my account is already limited (hence the “max pages limit” of 93). You can usually scan more than 93 pages; you decide yourself how many. There is an overall limit, 2,000 credits as far as I remember, which is sufficient for a few smaller checks or approximately three larger ones.

I discovered missing meta descriptions at once, an issue some considered not so crucial, but one that gets more attention now that people are keen on improving their CTR in the SERPs. Without a description, Google will pick a seemingly random text snippet from the page. Why not craft a tailor-made and enticing meta description instead?
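Crafting one only takes a single tag in the page’s head. A generic example (the title and description text are of course placeholders you’d write yourself):

```html
<head>
  <title>Three New Free Onsite SEO Crawlers You Must Try</title>
  <!-- A hand-written summary shown in the SERPs instead of a random snippet -->
  <meta name="description" content="Three free web-based SEO crawlers compared: what each one finds and where each one shines.">
</head>
```

Most WordPress SEO plugins expose a field for exactly this tag, so you rarely have to edit the template directly.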

[Screenshot: missing meta descriptions reported]

SEOcrawler.co also manages to point out the broken outgoing link I mentioned above. You have to look for it first, though, by choosing the “Pages with error” option in the drop-down menu on top of the results.

[Screenshot: “Pages with error” filter]

The tool also shows several other potential onsite issues, as you can see in the partial screenshot above. You can also specifically search for missing titles, missing headlines or pages blocked with the meta robots tag using SEOcrawler.co.

 

If you are a regular webmaster with a basic onsite SEO understanding, these tools might already suffice to identify your onsite SEO issues.

Of course you might also consider some site speed testing tools, and remember that content and incoming links are crucial for modern SEO, but these three free or affordable SEO crawlers will find the obvious technical SEO issues everybody can fix.

Do you know of any other SEO crawlers?

Let’s hear about them in the comments!

Oh, and if you liked the post, share it on Twitter, Inbound, Facebook and G+.

 

* Creative Commons image by Tambako The Jaguar

  • Gerry White

    http://visual-seo.com/ not used it yet, but it seems to be passionately developed! – I am going to look at the others 😀

  • Paul Rogers

    Some good crawlers there, I’d recommend DeepCrawl for enterprise-level companies / websites.

  • There are several crawler tools that I use consistently. You should always have multiple crawler tools in your toolkit, as each will scan the website through a slightly different process. The most comprehensive free crawler I’ve found so far and use all the time is https://www.crawlmonster.com.