Google Search Algorithm Leak - Takeaways for SEO

If you're not an SEO, you may not have heard that a large amount of documentation around Google's Search algorithm was leaked last week. It had many taking a deep dive to show that reliable, honest SEO practitioners were on the right track all along.

As Mike King and others have noted, the leak “suggests that a holistic approach to SEO, leveraging larger themes of site architecture, topical relevance, user intent, and entity-based optimization, may be more effective than targeting specific keywords alone.” - source

And this is the point.

To be honest, this has been the fundamental approach to SEO for many years. Content is an investment of both money and resources, and it also demands the patience to see a serious SEO strategy through to the end.

The level of minutiae on which many lesser experts are hanging assumptions about what the leak actually implies is wild. When the industry's top experts are digging deep into the weeds for the most minute of signals, it's not for the average business; it's for global brands already winning in the SEO space off the back of the mind-numbing budgets they have.

I should say, there are a few interesting takeaways for smaller clients (which, again, can't be fully verified from the leak alone). Localisation looks important, but that isn't particularly new information; what's interesting is the way it appears to be signalled. Even that remains an assumption: we don't know how, or whether, the specific attributes contribute at all, or how much weight they carry.
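
As an aside on how locality is typically made explicit today: schema.org LocalBusiness structured data is the standard, documented route, though the leak tells us nothing about how, or how much, Google actually weighs it. Here's a minimal sketch in Python of emitting that markup; this is my own illustration, not anything from the leaked documentation, and every business detail below is an invented placeholder.

```python
import json

# Sketch: emit schema.org LocalBusiness JSON-LD for a page's <head>.
# The markup vocabulary is standard schema.org; all the business
# details here are invented placeholders for illustration only.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "Manchester",
        "addressCountry": "GB",
    },
    "geo": {"@type": "GeoCoordinates", "latitude": 53.4808, "longitude": -2.2426},
}

print(f'<script type="application/ld+json">{json.dumps(local_business)}</script>')
```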

Internal linking for high-ranking pages works. Sometimes I feel like I'm losing my mind that this even needs to be stated. Contextually relevant anchor links, less so. But why would an anchor link not be contextually relevant anyway? It helps the user make faster decisions, which likely helps SEO in turn.
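
To make that concrete, here's a minimal sketch of what auditing anchor relevance can look like in practice: a short Python script (my own illustration, nothing from the leak) that fetches a single page and flags internal links carrying generic anchor text. The example URL and the list of "generic" phrases are assumptions you'd tune for your own site.

```python
# Sketch: flag internal links whose anchor text is generic rather than
# contextually relevant. Requires: requests, beautifulsoup4.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Assumed list of low-value anchor phrases; adjust for your own site.
GENERIC_ANCHORS = {"click here", "read more", "learn more", "here", "this page"}

def audit_internal_anchors(page_url: str) -> list[tuple[str, str]]:
    """Return (anchor_text, target_url) pairs for internal links with generic anchors."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site = urlparse(page_url).netloc
    flagged = []
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])
        if urlparse(target).netloc != site:
            continue  # external link; only internal links are of interest here
        text = a.get_text(" ", strip=True).lower()
        if not text or text in GENERIC_ANCHORS:
            flagged.append((text or "<empty>", target))
    return flagged

if __name__ == "__main__":
    for anchor, url in audit_internal_anchors("https://example.com/"):  # placeholder URL
        print(f"{anchor!r} -> {url}")
```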

A little tidbit that shows how deep things may go, and which only some had hypothesised, is the information fed through from Google Maps. Data taken from Google Maps, obviously including Maps search but also actual foot traffic gathered via mobile device location, looks to be a potential signal.

Google search results, most people will agree, are becoming less and less reliable. A big part of this is down to Google's Head of Search, Prabhakar Raghavan, insisting on monetisation over a quality product. (There's a ranty podcast about it here, if you're interested.) Regardless, this focus seems misplaced given that the search product and its reliability are precisely what allowed Google to monetise search in the first place.

Also, it's possible to have too much data: too much information, too many different ways for an individual to search, and too many ways for Google to determine the results of a query. Maybe it's all just a little too complex. If what they had before was working and led to their dominance, why pursue complexity with little proof of concept?