We need to talk about Google - the trouble with ranking signals

Posted by Owen Powis on 9 Aug, 2012

Recent algorithm updates show a flurry of activity over at the Googleplex with some serious repercussions in the search results. Webmasters are now complaining that the results are worse than ever. Things don’t seem to be going smoothly for the golden child of the internet era, so what’s going wrong?


The results are in trouble

Partly due to abuse throughout the search industry and partly due to a lack of foresight, Google appear to be in an awkward position where the current search signals are causing large amounts of disruption. Negative SEO has become more prevalent, people are now demanding money to take down links, and webmasters are threatening to sue. The situation has got so bad that Google are penalizing themselves. It's a mess.

I'm not going to go through a point-by-point analysis of why links don't work, as others have already put it far better than I could.

The key here is that businesses that want to rank cannot be trusted to play nice. Links as votes don't work if the knowledgeable can manipulate the vote.

Google's response is a heavier reliance on social media signals, which are themselves just as unreliable as links, if not more so, as we have all witnessed the rise of social media spam. This leaves the search results even further from the ideal that what's best for the user ranks highest. Google is frantically reaching for ways to disrupt this pattern: Google+ isn't an attempt to outdo Facebook, it's an attempt to sort out the mess the search results are in right now. As the search engine giant struggles, algorithm updates may become even more drastic than those we have seen already.

Google need to go back to looking at content

One of the core problems is that they are relying too heavily on user-generated signals to show them what is the best thing to rank. Links and social are useful, but not when used in the broad way that Google uses them. A search engine should be able to pinpoint the exact information I want, not paint big broad brush strokes based on what's popular.

This is why Google need to step away from their reliance on these signals and go back to the content: looking at what's written on the page and establishing not just what it says but what it means. We all know 'an apple' is an apple, but Google has almost completely forgotten about the fruit.

I'm no language expert, but I know that if someone is searching for 'an apple' rather than 'apple' they are most likely interested in information about the fruit, not a tech firm (I've sketched a toy version of this idea below). Because of the weight placed on link and social signals, pages about boring old fruit get overwhelmed. The exception here is the ubiquitous Wikipedia, the only on-topic page with the power to rank in this search. Exciting things have stronger signals than boring things. You can have the best, most well-researched, informative piece of content out there, yet you will rank below the guy who has run a series of competitions, linkbait and infographics on his site. You're too busy writing and researching great content to market it, but Google would rather take the other guy who has pushed hard and garnered a lot of attention. So now it's not about who's creating the best content but who's marketing their content best.

Google seem to have forgotten what we use search engines for. What we are looking for isn't 'stuff', but information: precise facts, products and details. If we just want to find out what everyone is talking about, that's what we use social media for. My Twitter feed brings me a constant stream of 'stuff' that I might be interested in. With the rise of social media, Google needs to get serious: I want the best answer to my question, not the most popular. For instance, 58% of Brits believe that Mt. Everest is the UK's tallest mountain. It's not; it's in Nepal for a start. I don't want the content to be dictated by the crowd, I want it to be dictated by quality.

Here's another example: in the UK, I barely even know that there is a shop called Target at all. They have no UK online store or presence that I'm aware of. Google is so desperate to show me a brand, though, that rather than showing me the wealth of other things called Target, like a 2004 movie or a UK TV show, they flood the first page of the results with repetitive information about something that is of no use to me.
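To make the 'an apple' point concrete, here is a minimal sketch of the kind of determiner-based heuristic I mean. It is purely illustrative: the term list, the rule and the function name are my own assumptions for this post, not anything Google actually does.

```python
# Toy heuristic (illustrative only, not Google's algorithm): treat a leading
# article ("a", "an", "the") as a hint that the query is about a common noun
# rather than a brand or other named entity.

AMBIGUOUS_TERMS = {
    # hypothetical lookup: term -> (common-noun sense, entity sense)
    "apple": ("the fruit", "Apple Inc."),
    "target": ("a goal or mark", "Target Corporation"),
}

def guess_sense(query: str) -> str:
    """Return a best-guess sense for a short, ambiguous query."""
    tokens = query.lower().split()
    if not tokens:
        return "unknown"
    has_article = tokens[0] in {"a", "an", "the"}
    head = tokens[-1]  # naive assumption: the last token is the head noun
    if head not in AMBIGUOUS_TERMS:
        return "unknown"
    common_sense, entity_sense = AMBIGUOUS_TERMS[head]
    # "an apple" -> the fruit; bare "apple" -> the entity wins on popularity
    return common_sense if has_article else entity_sense

if __name__ == "__main__":
    for q in ("apple", "an apple", "target", "the target"):
        print(f"{q!r} -> {guess_sense(q)}")
```

A real system would obviously need far more than an article check, but even this crude rule captures the difference in intent between the two queries.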

Just fixing spam won't fix this

The job of the search engine is to find the content that will best satisfy my query. As a webmaster I should be able to just create great content and leave Google to do the rest. However, we all know that this doesn't work. If you build it they won't come, but if you market it they will.

So let's say that in an 'ideal' world Google would clear out the spam and count only the sites which have genuine signals. Anyone who works with clients in markets such as finance will know how difficult it is to gain genuine links in these sectors. Many markets aren't exciting, and a specialist site in a 'boring' niche is not a great natural link generator. More general sites which operate across many niches are better link generators; for an example, think about Wikipedia and how they already rank almost everywhere. The sites which can succeed are most likely to be bigger, where they can leverage scale, partnerships and expertise to their advantage. This means that the experts carefully crafting content around a single area will be drowned out by the jack of all trades covering a range of topics.

So what should Google do?

This is an issue of trust. Google can quickly establish which of the sites in their index may have the best content for you, but this isn't very exact. There are lots of sites with very similar content, meaning they need additional metrics to tell them which of the sites they should display first.

Better understanding of user intent is key. By making a better initial relevancy decision, i.e. deciding which of this big list of sites is most relevant to the query, they would be able to start off with a more refined list. This would mean less reliance on those trust metrics.

User behavior also holds some of the answers here; Google does take notice of how we behave in the search results. For instance, click-through rates from the results are part of a signal Google use to detect where low-quality content has crept into the results. More reliance on these behavioral metrics, which can be made hard to manipulate, could help refine results further (there's a rough sketch of the idea at the end of this post).

More openness and honesty with webmasters would also go a long way. Whenever Google makes a change, the whole industry scrambles to react, trying to bring their sites in line with the changes. Information around these changes is usually sketchy at best. This means that the changes made can be unnecessary at best or damaging at worst. This doesn't make the search results better, it creates confusing signals for Google and it costs businesses time and money.

All we want is to get our content found. If the way to do this is to create the best content possible, as Google is so fond of telling us, then that is what we would do. However, we all know that's just not enough. If Google could openly show that this was actually the case, then webmasters would gladly divert the money spent on other dubious SEO tactics into just creating the best content possible, and surely that would create a better web for everyone.
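As a footnote to the point about behavioral metrics above, here is a rough sketch of the kind of click-through-rate check I have in mind: compare each result's observed CTR with a baseline for its position and flag results that underperform badly. The baseline numbers, threshold and names are all my own assumptions for illustration; this is not how Google's actual signal works.

```python
# Rough sketch (illustrative only): flag results whose click-through rate falls
# well below a baseline CTR for their ranking position, which the post suggests
# can indicate low-quality content creeping into the results.

# Hypothetical baseline CTRs by position (made-up numbers for illustration).
BASELINE_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def flag_underperformers(results, threshold=0.5, min_impressions=1000):
    """results: iterable of (url, position, impressions, clicks).

    Returns (url, position, ctr) tuples whose CTR is below `threshold` times
    the baseline for their position, given enough impressions to judge.
    """
    flagged = []
    for url, position, impressions, clicks in results:
        if impressions < min_impressions or position not in BASELINE_CTR:
            continue  # too little data, or position outside the baseline table
        ctr = clicks / impressions
        if ctr < threshold * BASELINE_CTR[position]:
            flagged.append((url, position, round(ctr, 3)))
    return flagged

if __name__ == "__main__":
    sample = [
        ("example.com/guide", 1, 5000, 1400),  # healthy CTR for position 1
        ("thin-content.com", 2, 5000, 150),    # well below the position-2 baseline
        ("niche-expert.org", 3, 800, 90),      # too few impressions to judge
    ]
    print(flag_underperformers(sample))
```

The appeal of a signal like this is exactly what the post argues: clicks are driven by searchers, not by whoever markets their content hardest, so it is harder to game at scale than links or social shares.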

 
