I’m so tired of hearing various pundits say that SEO is dead. Maybe they are merely being provocative. Perhaps they need to fill seats at their events, and so they come up with “bait” session titles like “Why SEO is fundamentally DEAD.” (Yes, that was actually a keynote title at a very popular conference last year.) Or maybe they drank their own Kool-Aid and really believe this nonsense.
While SEO is NOT dead, the way that you’re doing it might be. Does the following describe your approach? You’ve optimized your H1s and meta tags and you’ve built a few (hopefully white hat) links. Now you just sit back and watch your site rise to the top of Google, right?
Wrong. This sort of cookie-cutter approach to SEO — one that equates SEO to tuning a guitar or following the steps of a pumpkin pie recipe — rarely works in today’s search landscape.
Traditional SEO is dead
It’s human to want a repeatable formula to achieve a goal. The bad news is that there is no precise formula to SEO anymore. Sure, there are best practices, and a skilled SEO practitioner can greatly increase the chances of a good outcome. But we live in a world that comes with no guarantees — especially where SEO is concerned.
Of course, there have never really been any absolute guarantees when it comes to SEO. You should run away screaming from any SEO practitioner who promises one.
But for years, many operated under the illusion that if we just tweaked our title tags a little more and got just one more link, we would be rewarded with a higher ranking.
So if we aren’t able to predict an outcome from our optimization efforts, do I agree with those pundits who say that SEO must be dead?
In a way, yes. SEO in the traditional sense is dead. Outsmarting the search engines will no longer be feasible for most. But SEO does still exist, just in an evolved form.
To understand what SEO is today, let’s look at how we got here.
The rise of artificial intelligence and machine learning in search
Remember how Google Panda shook the SEO world? Panda was released on February 23, 2011, impacting up to 12 percent of search results. Some aspects of Panda were easy to understand — the notion of thin content, for example. But other aspects were quite subtle.
Panda was the introduction to machine learning for many in the SEO industry. Google had gathered ratings from humans on the perceived quality of a website based on a set of questions. The engineers at Google then applied machine learning algorithms to extend those subjective human opinions to the rest of the web, and Google Panda was born.
It’s one thing to tweak a title tag to have a better keyword. It’s quite another thing to ask yourself whether the page will be judged as delivering a high-quality experience.
Malcolm Gladwell suggests in his book, “Blink,” that humans judge quality literally in the blink of an eye. These snap judgments, including whether a website looks “shady” or “trustworthy,” come from the gut level. It’s extremely difficult to “game” a judgment that comes from the human subconscious.
Then, on September 26, 2013, Google took artificial intelligence to another level by announcing that Hummingbird, a major rewrite of the core search algorithm, had been released. Not since the Caffeine update had there been such a significant reworking of Google’s machinery.
Most of us SEO practitioners have seen the evidence of the Panda algorithm and its spammy link penalizing counterpart, Penguin, starkly staring back at us in Google Analytics in the form of a major organic traffic drop. But when it came to Hummingbird, for most sites, there was no obvious impact. Yet when Matt Cutts said Hummingbird affected 90 percent of all searches (compare this to Panda’s 12 percent), it was clear something big had happened. But what?
“OK Google” ushers in semantic search
A clue had come in the form of a Google demonstration of hands-free conversational search at Google I/O: the “OK Google” voice command.
It was thrilling to see we were one step closer to realizing a Star Trekkian future where we could speak to our machines using natural, everyday language, and they would not only understand us but also answer back.
But under the covers, to handle conversational queries correctly, search engines like Google needed to understand the intent of the query, not just the words in it.
We had made the leap from “words” to “concepts.” Understanding the meaning behind words, as well as the relationships between the words in a given topic, is known as semantic search.
If this ability to understand meaning and intent behind words is not “artificial intelligence,” I don’t know what is. Google Now is only the beginning. We’ll soon be talking to our computers more than we will be typing at them.
And search continues to evolve. Last year, Google announced it had released RankBrain, which is machine learning that helps Google understand and process search queries. RankBrain has been particularly useful to Google in long-tail queries, which are often conversational and new to Google. Even today, 15 percent of search queries entered into Google are new searches never seen before. RankBrain is being run across 100 percent of all Google search queries; it’s become pervasive.
RankBrain is another step in the evolution of the true realization of semantic search.
With semantic search, Google can understand what an article is about. We see evidence of this when articles rank for keywords that are not found anywhere in the article (or in anchor text pointing to the article). One simple example of this is the search for “internet marketing,” which returns Quick Sprout’s guide to online marketing in the number one position. The word “internet” is not found anywhere in the guide.
So if you can rank for a keyword without having it in your title tag or in any of the usual optimization targets (such as the URL and H1), how much does on-page optimization really matter?
Title tag correlation with higher rankings is smaller than expected
In a recent study that analyzed one million Google search results, Backlinko found that the correlation between a given keyword in the title tag and the ranking for the search with that keyword was much smaller than expected.
It used to be important in SEO to have an exact matching keyword (or at least close to it) in a title tag in order to rank for that particular search query. What the Backlinko study illustrated is that Google is now significantly better at understanding the context of your page, and thus you don’t need to be explicit with the keyword you’re targeting, especially if your content clearly discusses the related entities involved in the topic.
It’s all about “entities”
What do I mean by “entities”? Let’s take an example. If you have an article on list building, it’s likely that the keyword “list building” would appear, but it is also likely that terms related to list building would be present in the article, such as “subscribers” and “email.” These terms are relevant to our topic of list building, so it’s reasonable to expect them to be in our article.
We know that “email” adds specificity to “list building.” For example, it further defines the type of list (it’s not a Facebook audience). So “list building” and “email” have a relationship that creates meaning beyond just the words. In the search industry, we use the term “entities” to describe these “things” that have a meaning, and often a real-life existence, along with relationships to other entities.
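To make the idea concrete, here is a toy sketch of how related-entity terms reveal themselves through co-occurrence: words like “email” and “subscribers” keep showing up in documents that discuss “list building,” while unrelated words don’t. This is only an illustration of the concept — the corpus is invented, and Google’s actual entity understanding is far more sophisticated than simple word counting.

```python
from collections import Counter

# Invented snippets standing in for pages about list building.
docs = [
    "grow your email list building subscribers with a signup form",
    "list building tips send subscribers a welcome email",
    "email marketing grows subscribers through list building",
    "facebook ads audience targeting for social campaigns",
]

target = "list building"

# Count how often each word appears in a document alongside the target phrase.
cooccur = Counter()
for doc in docs:
    if target in doc:
        for word in set(doc.split()):
            cooccur[word] += 1

# Drop the target's own words; what remains are candidate related terms.
for w in target.split():
    del cooccur[w]

# "email" and "subscribers" surface as the strongest co-occurring terms,
# while words from the unrelated Facebook-ads snippet never appear.
print(cooccur.most_common(5))
```

Even this crude counting separates topical companions of “list building” from noise, which hints at why content that naturally covers its related entities signals topical depth.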
Incidentally, this may be why longer-form content is performing better in organic search today: it describes the topic more fully and includes more of the related entities.
My favorite new tool for exploring entities and relationships between topics is Searchmetrics’ new Topic Explorer, which I demonstrated live last week at Pubcon in the Advanced Keyword Research session. Since Google has gone beyond keywords into entities, we too need to go beyond traditional “keyword research” into “entity research.”