Keyword research is often the first thing you do when planning a new SEO campaign (or auditing an older one). It provides the skeletal framework for a campaign, and for years has been a mainstay tool in the SEO expert’s belt. But over the past five years or so, keyword research has undergone some interesting evolutions, becoming less relevant in some ways and fundamentally changing in others.
If this trend continues, or if we see another major leap forward, could keyword research become totally irrelevant for SEO?
Why Keywords Matter in the First Place
Let’s remind ourselves why keywords matter in the first place. The goal of SEO is to get your site ranked higher for various search queries—but how do you know which queries are best to rank for? This is where keyword research comes in. It allows you to find keywords that offer:
- Relevance, so that all incoming queries directly relate to your business and are capable of making your inbound users satisfied with the results.
- High traffic, so you have as many new people as possible seeing your site listed in search results.
- Low competition, so you don’t have to work as hard to rank for your chosen queries.
This information allows you to selectively target valuable keywords and phrases to include in your site’s metadata and content.
Finding and keeping track of keywords also gives you a valuable metric for gauging the effectiveness of your campaign: your keyword rankings over time.
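The three criteria above (relevance aside, which requires human judgment) can be combined into a rough prioritization score. Here's a minimal sketch in Python—the field names, sample numbers, and weighting formula are my own illustrative assumptions, not a standard industry metric:

```python
# Hypothetical keyword records, with estimated monthly search volume
# and a 0-1 competition score, as you might export from a research tool.
keywords = [
    {"phrase": "seo services", "volume": 12000, "competition": 0.9},
    {"phrase": "seo audit checklist", "volume": 1900, "competition": 0.4},
    {"phrase": "local seo for dentists", "volume": 320, "competition": 0.2},
]

def opportunity_score(kw):
    # Reward traffic, penalize competition; this weighting is arbitrary
    # and should be tuned to your own campaign's goals.
    return kw["volume"] * (1 - kw["competition"])

# List the highest-opportunity keywords first.
for kw in sorted(keywords, key=opportunity_score, reverse=True):
    print(f'{kw["phrase"]}: {opportunity_score(kw):.0f}')
```

The point of a score like this isn't precision—it's forcing an explicit trade-off between traffic and competition instead of chasing volume alone.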
The Old Days of Keyword Research
Keyword research used to be pretty simple, back when optimizing for a keyword meant stuffing it into every meta tag and into a page’s content as many times as you could (along with using exact-match keyword anchor text in link building efforts).
Through Google Analytics, Google used to offer tons of data about how people searched and how they found your site through keywords; once you had a list of keywords with high traffic and low competition, you could straightforwardly optimize for those queries.
Most Google queries featured a one-to-one search relationship; Google would take your word or phrase and look for near-exact matches to those words and phrases on the web.
Hummingbird and Semantic Search
Enter Google’s Hummingbird update, which rolled out originally in 2013. This update introduced a concept known as “semantic search,” which drastically changed how Google handled incoming queries. Rather than taking a user’s words and searching for matches on the web, Google now evaluates the intention behind a user’s query, and then finds appropriate results that match it. This may seem like a small difference, but it’s had a major impact on how search optimizers think about keywords.
For starters, including a keyword or phrase verbatim isn’t a surefire way to optimize for it, and it’s possible to gain rankings for semantically linked words and phrases that you didn’t optimize for directly – and sometimes ones that aren’t even present on the page that’s ranking for them! Check out this query I tested out just now, “that movie where the guy takes a pill to feel no emotion”:
Yes, the movie I had in mind was Equilibrium. Bravo, Hummingbird!
Long-tail keyword phrases, which string many words together (usually in some kind of conversational query), have also become more popular—in part due to Hummingbird’s effects, and in part due to increased search competition forcing marketers to find rarer, less competitive phrases.
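One crude way to surface long-tail candidates from an existing keyword list is to filter on word count. A quick sketch—the sample phrases are made up, and the four-word threshold is my own arbitrary cutoff, not an agreed definition of "long-tail":

```python
phrases = [
    "seo",
    "keyword research",
    "how to do keyword research for a small business",
    "best seo tools for beginners",
]

# Treat anything with four or more words as a long-tail candidate.
long_tail = [p for p in phrases if len(p.split()) >= 4]
print(long_tail)
# -> ['how to do keyword research for a small business',
#     'best seo tools for beginners']
```

In practice you'd pair a filter like this with volume and competition data, since length alone doesn't guarantee a phrase is worth targeting.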
This has led to an interesting debate in the search community: is it better to research and optimize for keywords, with the specific intention of building rankings for those keywords, or to research potential content topics—focusing on general topical themes rather than keywords—to better serve your audience?
As semantic search becomes even more sophisticated, the power of individual keywords will continue to diminish, while the power of topical themes will continue to rise. Google is always getting better, and that means simple, formulaic keyword-matching approaches will become less relevant over time.
Google’s Restriction of Keyword Data
Over the years, Google has also become increasingly protective of the keyword data it releases to marketers. It started with the restriction of organic keyword data in Google Analytics, preventing marketers from evaluating keyword-based traffic to their sites. Now, Google is throttling keyword data in AdWords (at least for low-spending accounts), presumably in an effort to blind organic search marketers to this data.
What is Google thinking? First, Google wants to encourage more spending on its paid search advertising. Second, Google has always wanted to crack down on anyone trying to exploit quick wins in organic search rankings, to keep those rankings fair and trusted by users. I imagine this trend will continue, though there are plenty of third-party tools to make up for what Google won’t give us directly, a few of which I covered here.
Looking Into the Future
New technologies are also shaping the way people conduct searches. Digital assistants like Siri and Cortana are encouraging users to search for things conversationally, which fundamentally changes how queries are input, as well as their structure, since spoken queries are often quite different from written ones.
They also introduce new search mechanics into the world of optimization, offering spoken responses rather than search results pages that list relevant results.
New technologies such as virtual reality, augmented reality, and wearables could have a similar and complementary effect, further prompting people to change the way they search, and possibly disrupting the ranking system altogether; small screens and new types of interfaces may someday totally change how we interact with search results.