This article outlines a simple yet powerful phrase-targeting methodology designed to detect quick wins for organic search traffic and to support decision making around resources, time, budgets, content and products.
One of the most underutilised tools in the industry must be the Google Webmaster Tools suite. For some reason, webmasters and marketers keep choosing costly, aggressively marketed tools that do the same as, or even less than, some of the free tools out there. Google Webmaster Tools is completely free and offers our only reliable glimpse of Google’s understanding of the domains we manage.
Getting the Data
For those who haven’t got an account yet, stop reading this article and go sign up immediately. For those already familiar with the tool, here’s a bit of a refresher. The first page gives a good overview of top search queries, problems with site structure, keyword prominence within content, sitemap information and external links.
Navigate to your site on the web and then search queries, or simply click on the “more” below the search terms block in the home page dashboard and set the appropriate traffic filters. In this case we’ve removed terms containing the brand “analogik”, selected the widest possible date range and selected to show search queries with more than 10 impressions/clicks.
You will be presented with a cleaner search term list suitable for what we wish to do later on. Note that you can adjust filters at any time if you change your mind.
The table below is simple to read: first comes the search term, followed by how many times the website appeared in the search results for it (impressions), how many clicks it received as a result (with the percentage click-through rate, CTR) and the average position of the ranking page(s) for that term. Google also offers trending (change) figures alongside this information, but we will not use them this time.
After you have it all set to your liking, simply click the download link on the main screen (below the table) and you’re ready to start playing with the numbers.
About the Data
Google rounds all the data available, so use it as an indicator, not an absolute value. If you want absolute figures, even Google Analytics won’t do; turn to your log files instead. Note, though, that as of this year Google’s data is not going to be completely accurate, as some 5-10% of your organic search queries are now encrypted.
Position can be based on multiple ranking URLs. As you can see in the image below, the term “acid trip” brings up two ranking pages, and each can appear at a different position. Google works out a single ranking value based on the frequency of appearance and position of each URL.
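One plausible way such a single figure could be derived from several ranking URLs is an impression-weighted mean. Google does not publish its exact formula, so the sketch below (including the `blended_position` name and its inputs) is purely illustrative:

```python
# Combine several ranking URLs for one query into a single average
# position via an impression-weighted mean. This is an illustration,
# not Google's actual formula.
def blended_position(urls):
    """urls: list of (impressions, position) pairs, one per ranking URL."""
    total_impressions = sum(impressions for impressions, _ in urls)
    weighted = sum(impressions * position for impressions, position in urls)
    return weighted / total_impressions
```

A URL seen 900 times at position 3 and another seen 100 times at position 8 would blend to 3.5 under this scheme, which matches the intuition that the more frequently shown URL dominates the average.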
It’s also worth mentioning that if you don’t specify target countries, an average value will be taken from all ranking phrases / pages. If you wish to see country-specific rankings you can set your filter to a preferred geo-location at the top.
Now that you have the CSV file, you can start predicting how different ranking situations may impact your traffic and sales. This is the time-consuming part (especially if you have many phrases to consider), so I have to admit that we have a tool that does this for us automatically. Any decent spreadsheet program will do the trick though.
Here are our figures:
Search Volume (Impressions)
CTR (Click-through value)
Clicks (Clicks from Google results)
Ranking Position (Average)
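Those four figures can be pulled straight out of the exported CSV. A minimal loader might look like the sketch below; the column headers (`Query`, `Impressions`, `Clicks`, `Avg. position`) are assumptions, so adjust them to match whatever your own export actually contains:

```python
import csv

# Load the exported search query CSV into a list of dicts.
# The header names below are assumptions -- check them against
# the headers in your own Webmaster Tools export.
def load_queries(path):
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows.append({
                "query": row["Query"],
                "impressions": int(row["Impressions"]),
                "clicks": int(row["Clicks"]),
                "position": float(row["Avg. position"]),
            })
    return rows
```

From here on, each phrase is just a small record of impressions, clicks and average position, which is all the projections below need.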
What we can do next is ask ourselves: if the CTR at position #2 is 20%, what will my clicks be if I go up to position #1? The missing link here is that we don’t know the average CTR for each position. Once you work out the averages for the top ten positions on your site, you can generate a standard CTR value for each position, apply it across all phrases and create different ranking scenarios.
Tip: Hide phrases already ranking #1 from your list, as there is no further growth potential for them.
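The averaging and projection steps above can be sketched in a few lines. The field names and the nearest-integer position bracketing are my own assumptions, not something the export dictates:

```python
from collections import defaultdict

# Build a site-wide CTR curve (average CTR per ranking position),
# then use it to project clicks at a better position. Positions are
# bracketed to the nearest whole number -- an assumption, not a rule.
def ctr_by_position(rows):
    totals = defaultdict(lambda: [0, 0])  # position -> [clicks, impressions]
    for row in rows:
        bracket = round(row["position"])
        totals[bracket][0] += row["clicks"]
        totals[bracket][1] += row["impressions"]
    return {pos: clicks / imps for pos, (clicks, imps) in totals.items() if imps}

def projected_clicks(row, target_pos, curve):
    # Assume impressions stay constant and the average CTR for the
    # target position applies to this phrase.
    return row["impressions"] * curve.get(target_pos, 0.0)
```

So a phrase with 1,000 impressions at position #2 would be projected to receive whatever your site-wide position #1 CTR implies, e.g. 300 clicks if that average is 30%.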
Here is what I have done for the purpose of this article:
After using this methodology for a while, we noticed it started to betray us in some cases, while in others it worked really well. We investigated further and found several factors contributing to this:
1. Branded terms tend to skew CTR rates. When people search for “Amazon”, they click on the Amazon result in 99% of cases.
2. Ranking averages can put pages in the wrong CTR bracket. To combat this we implemented more aggressive rounding down (1.9 becomes 1).
Alternative Calculation Variant
Some deviations were too hard to define, so we implemented a variation of our methodology that uses phrase-specific performance. For example, if a phrase ranks #2 but its CTR is only 5%, we factor in that ratio and apply it proportionally instead of using the site-wide average.
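One way to read “apply it proportionally” is to scale the target position’s average CTR by how the phrase currently under- or over-performs the site-wide average at its own position. The sketch below assumes that interpretation, and the CTR curve values in it are made-up examples:

```python
# Phrase-specific variant: scale the target-position CTR by the ratio
# of this phrase's actual CTR to the site-wide average at its current
# position. The curve values here are hypothetical examples.
CTR_CURVE = {1: 0.30, 2: 0.15, 3: 0.08}

def projected_clicks_scaled(impressions, clicks, position, target_pos, curve=CTR_CURVE):
    current = round(position)
    actual_ctr = clicks / impressions
    site_ctr = curve.get(current)
    ratio = actual_ctr / site_ctr if site_ctr else 1.0  # fall back to site average
    return impressions * curve.get(target_pos, 0.0) * ratio
```

A phrase at position #2 with a 5% CTR against a 15% site-wide average would carry its one-third ratio up to position #1, giving a far more conservative projection than the plain site-wide curve.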
Here’s the difference:
Traffic projections are always nice to see; however, to produce tangible, actionable reports you need to add goal value and conversion rate into the mix.
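The final step is simple arithmetic: projected clicks times conversion rate gives conversions, times goal value gives revenue. The conversion rate and goal value below are placeholder figures you would replace with numbers from your own analytics:

```python
# Turn projected clicks into a projected goal value.
# clicks * conversion_rate = conversions; conversions * goal_value = revenue.
def projected_revenue(clicks, conversion_rate, goal_value):
    return clicks * conversion_rate * goal_value

# Example with made-up figures: 300 projected clicks, a 2% conversion
# rate and a $50 goal value.
scenario_value = projected_revenue(300, 0.02, 50.0)
```

Running the same calculation across several ranking scenarios per phrase lets you rank the opportunities by projected value rather than by raw traffic.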
With this information you can visualise financial outcomes of your ranking scenarios and make the best use of your resources and time.