It’s not unusual in today’s SEO landscape to wake up and find a sudden drop in organic search traffic and/or keyword rankings. When such a disaster takes place, most large brands have access to a competent SEO agency or a robust in-house team to identify and rectify any issues that led to the drop. For smaller brands with little budget allocated to SEO, this could mean the end of the organic search channel as a driver of business.
By taking advantage of the intelligence that is available for free, brands with small budgets can recover from SEO issues and return to leveraging the organic search channel as a revenue generator. Below is a review of three such SEO tools.
Google Webmaster Tools
A very useful resource for SEO experts and webmasters alike, the Google Webmaster Tools suite provides insight into the overall health of a website and explicitly points out many issues that can negatively affect a site’s organic search health.
Getting started is as simple as signing up for a Google account and verifying ownership of the website in question by uploading a verification file to the website’s server, including a verification meta tag on the homepage, or signing in to the site’s domain name provider.
This tool is provided by Google at no charge, so there is no excuse for any website not to have an account.
Once a Google Webmaster Tools account is active, Google will, upon login, provide a list of recent alerts for each website being managed under the account. This is the first place to look for any issues that may have caused a loss of performance. More information on the issues identified, as well as a list of recommended actions, can be viewed by clicking on “View details”.
If there are no alerts, or if the suggested actions from the alerts do not rectify the issue, the next place to look is the “Health” section of the Google Webmaster Tools interface, reached via the left sidebar navigation. There are several reports in the Health section, including Crawl Errors, Crawl Stats, Blocked URLs, Fetch as Google, Index Status and Malware.
The Crawl Errors report is very helpful in troubleshooting because it offers both a list of URLs that return errors and a timeline showing the total number of errors found on a day-to-day basis. If there is a sudden increase in errors detected, this timeline can help pinpoint the day things went wrong. For example, if a recent change to the website caused a number of pages to return server errors, that could account for the loss of organic traffic.
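For sites that also track error counts elsewhere, the timeline analysis described above can be sketched in a few lines of Python. Everything here is illustrative – the daily counts and the 2x threshold are hypothetical:

```python
# Hypothetical daily crawl-error counts, as might be read off the
# errors timeline (index 0 = oldest day).
daily_errors = [12, 14, 11, 13, 12, 95, 110, 102]

def first_spike(counts, factor=2.0):
    """Return the index of the first day whose error count exceeds
    `factor` times the average of all preceding days, or None."""
    for day in range(1, len(counts)):
        baseline = sum(counts[:day]) / day
        if counts[day] > factor * baseline:
            return day
    return None

print(first_spike(daily_errors))  # -> 5, the day errors jumped
```

Knowing the exact day the spike started makes it much easier to match the problem to a specific site change or deployment.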
The Index Status report is another helpful tool for diagnosing on-site issues that can cause decreased traffic and keyword rankings. Reviewing the timeline for sudden changes in the number of Web pages Google has indexed can reveal exactly when those changes took place, so they can be attributed to on-site changes or search engine algorithm updates.
Google Webmaster Tools offers a wealth of additional information, reports, alerts, recommendations and analyses that can not only help spot recent issues but also help maintain a healthy website going forward, and the suite is continually being updated with new capabilities.
Screaming Frog SEO Spider Tool
To understand how a website can perform in organic search, it’s important to know exactly what search engines will find when their spiders crawl the site. The best way to do so is to run one’s own crawl of the website. The Screaming Frog SEO Spider Tool is a desktop program that can do just that – for free.
Crawling a website with Screaming Frog returns a huge amount of data and can spot server errors, duplicate content, broken links and other issues that may not have been reported in Google Webmaster Tools.
After downloading the desktop program, simply run it by entering the URL of the homepage and clicking “Start”. The crawl will return various data points, segmented into different tabs. For instance, the “Internal” tab will record all internal pages linked from the URL that was entered, along with each page’s status code. This makes it very simple to find broken internal links so they can be fixed promptly.
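Under the hood, a crawler’s “Internal” report boils down to extracting links and checking their status codes. The following minimal Python sketch uses only the standard library; the page HTML, URLs and status codes are made-up stand-ins for a real crawl:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical homepage HTML and the status codes a crawl might return.
html = ('<a href="/about">About</a> <a href="/old-page">Old</a> '
        '<a href="http://other.example/x">Out</a>')
statuses = {"http://example.com/about": 200,
            "http://example.com/old-page": 404}

parser = LinkExtractor("http://example.com/")
parser.feed(html)
# Internal links share the site's hostname; broken ones return 4xx/5xx.
internal = [u for u in parser.links if urlparse(u).netloc == "example.com"]
broken = [u for u in internal if statuses.get(u, 0) >= 400]
print(broken)  # the internal links that need fixing
```

In a real check, the `statuses` dictionary would be populated by actually requesting each URL rather than hard-coded.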
Similarly, the “External” tab will list all outbound links along with their status codes. This will reveal broken outbound links and, more importantly, links to pages that are redirecting. It’s good practice to see where those URLs redirect to, as there is potential for them to redirect to bad-neighborhood sites, which could reflect negatively on the website in question.
The “Page Titles” and “Meta Descriptions” tabs can help pinpoint duplicate content issues by identifying duplicate title tags and meta description tags, respectively. If duplicate tags are found, there is a good chance that the on-page content of those URLs is also duplicated.
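Spotting duplicate tags is easy to reproduce once crawl data is in hand. This sketch (with hypothetical URLs and titles) groups together any URLs that share an identical title tag; the same logic applies to meta descriptions:

```python
from collections import defaultdict

# Hypothetical page titles as a crawl might report them.
titles = {
    "/shoes?page=1": "Buy Shoes Online",
    "/shoes?page=2": "Buy Shoes Online",
    "/contact": "Contact Us",
}

def duplicate_groups(tag_by_url):
    """Group URLs that share an identical tag value."""
    groups = defaultdict(list)
    for url, tag in tag_by_url.items():
        groups[tag].append(url)
    # Keep only values shared by more than one URL.
    return {tag: urls for tag, urls in groups.items() if len(urls) > 1}

print(duplicate_groups(titles))
# {'Buy Shoes Online': ['/shoes?page=1', '/shoes?page=2']}
```

Each group in the output is a cluster of pages worth reviewing for duplicated on-page content.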
Even if performance issues have not occurred on a website, it’s always a good idea to run regularly scheduled (e.g., monthly) Screaming Frog crawls simply to ensure there are no new problems that could lead to performance issues down the road.
BrowSEO Spider Simulator
While Screaming Frog is helpful for looking at a website as a whole and identifying server errors and duplicate meta tags, it’s also important to know exactly what a search engine spider sees when it crawls each page.
BrowSEO Spider Simulator offers a free look at a page as viewed by search engine spiders. In many cases, a webpage may have great information for users, but that information may not be evident to search engines; therefore, the page does not get any credit for it from the search engines.
BrowSEO is very easy to use. Simply input the URL of a webpage and click “Browse”. The tool will return two columns: a left-hand column containing each page element as it would be seen by a search engine spider (e.g., images, image alt tags, links, body copy); and a right-hand column that offers details about the various page elements (e.g., server response, word counts, character counts, meta tags, heading tags).
Compare the spider simulation of the page to the actual version a user sees. If any important content is visible to users but not to spiders, the page is likely not being credited for that content, and that is a missed opportunity for organic search performance.
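A rough spider’s-eye view can be approximated with Python’s standard-library HTML parser. This illustrative sketch skips script and style blocks (which a spider that does not execute JavaScript never reads) and collects the remaining text plus image alt attributes; the sample page is hypothetical:

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collect the text a non-JavaScript spider can read,
    skipping script/style blocks, plus image alt text."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.text = []
        self.alt_text = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1
        elif tag == "img":
            self.alt_text.extend(v for k, v in attrs if k == "alt" and v)

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.text.append(data.strip())

# Hypothetical page: the sale banner is written by JavaScript,
# so a spider that does not run scripts never sees it.
html = """<h1>Welcome</h1>
<script>document.write("Huge Sale Today!");</script>
<img src="logo.png" alt="Acme logo">
<p>Quality widgets since 1990.</p>"""

view = SpiderView()
view.feed(html)
print(view.text)      # ['Welcome', 'Quality widgets since 1990.']
print(view.alt_text)  # ['Acme logo']
```

Note that “Huge Sale Today!” never appears in the spider view – exactly the kind of user-visible, spider-invisible content worth hunting for.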
When using BrowSEO to troubleshoot something like a traffic drop, a useful feature is the “Check for Cloaking-Attempt” available from the “Check for fraud” link in the right column. Cloaking can exist on a page by mistake and may lead to search engine penalties. This feature compares the version of the page a spider sees to the version a user sees, to make sure no cloaking is being performed on the page.
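The comparison such a cloaking check performs can be approximated by fetching the page twice – once with a spider User-Agent and once with a browser User-Agent – and diffing the visible text. The sketch below skips the network fetch and uses hypothetical responses to illustrate just the comparison step:

```python
import difflib
import re

def visible_text(html):
    """Very rough tag-stripper, good enough for comparing two
    versions of the same page word by word."""
    return re.sub(r"<[^>]+>", " ", html).split()

# Hypothetical responses: what the server returned to a request with
# a spider User-Agent versus a normal browser User-Agent.
spider_html = "<h1>Cheap Flights Cheap Hotels Cheap Cars</h1>"
user_html = "<h1>Welcome to our travel agency</h1>"

spider_words = visible_text(spider_html)
user_words = visible_text(user_html)
similarity = difflib.SequenceMatcher(None, spider_words, user_words).ratio()
print(round(similarity, 2))  # a low ratio means the two audiences see different pages
```

A similarity near 1.0 means both audiences see essentially the same content; a low value is a red flag that the page is serving different content to spiders and users.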