If you’re a digital marketing professional with any interest in Google search marketing, you should definitely spend some time listening to Google’s podcast Search Off the Record. In the podcast, which releases new episodes a couple of times a month, Google search pros discuss their algorithms and give behind-the-scenes explanations of the search engine.
In a recent episode, the hosts interviewed Duy Nguyen from the Search Quality Team about how Google ranks websites in its search results and how algorithms distinguish quality sites from spam. Below we’ll discuss some of our takeaways from the episode. But first, let’s cover key vocab terms that the hosts and guests use on the show.
Key Google Search Vocab
- Signals – Signals are the variables that Google collects data on and uses to rank websites. These signals may be characteristics (such as what coding language the site is built with) or metrics (such as the bounce rate of a page). They serve as a way to translate human observations about website quality into quantitative inputs for the ranking algorithm to use.
- Manual Action – A manual action is a penalty Google issues to websites that they believe are “tricking” the algorithm. If a site is low quality, it should be automatically ranked low by the algorithm. However, if a low quality site does rise up in the rankings due to a flaw in the algorithm or a workaround by the site owner, Google issues a manual action against it, forcing its ranking to fall.
- Keyword Stuffing – Keyword stuffing is a (generally unsuccessful) SEO technique that involves using a certain keyword excessively in an effort to signal to Google that the page is highly relevant for that keyword and thereby increase the page’s ranking. This is not recommended because Google’s algorithms now classify keyword stuffing as a negative quality signal.
- Spam – Spam is content that is intentionally low quality. It may be nefarious (like trying to trick users into sending money to an unfamiliar source) or just annoying (like trying to pull in users for advertising profit without any high quality content). Google has many spam signals that they use to classify spam and differentiate it from content that just isn’t very good.
- Index – The index is Google’s full register of websites. Any site in the index could be included in a search results list. New sites are indexed – meaning Google crawls them, gathers signal data (which is used in ranking), and adds them to the index. This happens automatically, but if you want to expedite the process, you can submit the URL in Google Search Console.
Three Ways That Google Prioritizes User Experience With Search
Google’s Duy Nguyen emphasized that the main priority when returning web results for a Google search is user experience. Nguyen brought up three ways that user experience guides decisions about the search function.
First, the number of search results returned is limited to a quantity that a human user could feasibly view and digest. Billions of websites may have some tangential relation to the topic searched but, since a typical user would never comb through all of them, Google’s algorithm determines which sites are most likely to be useful and relevant and returns an abbreviated list. Depending on the search query, that shortlist of most-relevant sites may still include tens of thousands of results – quite a lot for someone to comb through, but at least feasibly so.
After limiting the search results to a digestible number, Google ranks them so that users can view the ‘best’ content first and don’t have to manually hunt through all results if they are short on time. We’ll dive deeper into how Google does this below but the two elements that Nguyen emphasized were site quality and relevance to the query.
The third way that Google works to prioritize user search experience is by sorting out spam or spam-like content. According to Nguyen, Google’s goal is for less than one percent of search results to be spam.
How Does Google’s Algorithm Work?
Algorithms take a set of inputs and deliver an output. For Google’s search purposes, the inputs are any signals that Google decides should determine a site’s ranking. The output of the algorithm is the search results list ordered by Google’s best estimate of usefulness. Since an important factor in the ranking is how relevant the page is to the particular search query, there is no absolute ranking of pages. A new ranking is created for each search query. The page zucchini-recipes.com may have high quality content and rank #1 for the search query “zucchini recipe” but it probably wouldn’t nab the #1 spot in search results for “how to make hamburgers.”
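The per-query nature of ranking can be sketched with a toy example. The pages, the query matching, and the naive word-overlap “relevance” score below are all invented for illustration – they bear no resemblance to Google’s actual relevance signals – but they show how the same set of pages can come out in a different order for each query:

```python
# Toy sketch (NOT Google's algorithm): rank a fixed set of pages
# differently for each query, based on a crude word-overlap score.

def relevance(page_text: str, query: str) -> int:
    """Count how many of the query's words appear in the page text."""
    return sum(word in page_text.lower() for word in query.lower().split())

# Hypothetical pages, echoing the example in the text above.
pages = {
    "zucchini-recipes.com": "easy zucchini recipe ideas for dinner",
    "burger-basics.com": "how to make hamburgers at home",
}

def rank(query: str) -> list[str]:
    # A fresh ranking is produced per query: sort pages by how
    # relevant each one is to this particular search.
    return sorted(pages, key=lambda p: relevance(pages[p], query), reverse=True)

print(rank("zucchini recipe"))        # zucchini-recipes.com first
print(rank("how to make hamburgers")) # burger-basics.com first
```

The key point the sketch captures is that there is no single absolute ordering of pages – the ranking is recomputed against each query.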
Members of the Search Quality team like Duy can improve the usefulness of search results by monitoring and tinkering with the algorithm. This “tinkering” may involve adding new signals as additional inputs, removing signals no longer deemed relevant, or changing the degree to which certain signals impact the output.
In order to use algorithms for ranking, Google must find ways to quantify website quality and relevance to a search query. Since an automated algorithm does not have innate human judgment it cannot simply “read” sites and decide if they are high quality or not. Instead, it reads the signals that are identified by crawlers and deemed important by Google’s developers. These might include things like site speed, keyword density, bounce rate, and domain authority.
To make some signals more important in ranking than others, Google can use multipliers in the algorithm. For example, if they deem bounce rate to be five times more important than other signals, they can include a multiplier of five on the bounce rate input in the algorithm.
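A minimal sketch of this weighting idea looks like the following. The signal names, values, and weights are all made up (the real signals and multipliers are Google’s secret); the only thing the sketch demonstrates is how a multiplier lets one signal dominate the final score:

```python
# Hypothetical signal weights (invented for illustration only).
# Bounce rate gets a 5x multiplier, per the example in the text.
SIGNAL_WEIGHTS = {
    "content_quality": 1.0,
    "site_speed": 1.0,
    "bounce_rate": 5.0,
}

def score(signals: dict[str, float]) -> float:
    # Each signal is assumed to be normalized to 0-1, higher = better
    # (so a raw bounce rate would be inverted before it gets here).
    return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())

site_a = {"content_quality": 0.9, "site_speed": 0.8, "bounce_rate": 0.4}
site_b = {"content_quality": 0.6, "site_speed": 0.7, "bounce_rate": 0.9}

# Site B outscores Site A despite weaker content, because the
# heavily weighted bounce-rate signal dominates the total.
print(score(site_a), score(site_b))
```

The design choice to weight inputs rather than treat them equally is what lets the Search Quality team “tinker” – adjusting a multiplier changes a signal’s influence without restructuring the whole algorithm.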
No one outside Google knows exactly what the algorithm is – it is frequently tinkered with by the Search Quality team and remains a closely guarded secret. On the podcast, Gary Illyes explained:
“Every search engine has its own kind of magic that they are using, that they were brewing for years and typically the part that we don’t want to talk about.”
Google’s Advice to Site Owners
While the algorithm needs to quantify signals in order to work, Google wants site owners to prioritize quality over metrics. That’s part of the reason they keep their algorithm a secret – they don’t want spammers to game the ranking system by simply optimizing for the metrics used as inputs.
Duy advises users: “Don’t focus on one single thing because we have hundreds and hundreds of ranking signals. Focusing on one thing doesn’t mean you will improve it across the board and would rank your site better.”
However, they do recognize the frustration of trying to improve your site’s quality without concrete goals or feedback. That’s where Google Search Console comes in handy. Setting up Search Console lets you see your site from the algorithm’s perspective. You can view core web vitals and get recommendations for improving site quality. In Search Console, you can also receive notifications from Google about any content that is interpreted as low quality or spam.
What Happens if the Algorithm Thinks Your Content is Spam?
Just as the algorithm needs signals to rank legitimate content, it also needs signals to identify spam and exclude it entirely from results. Even if you run a legitimate site, your content could mistakenly be flagged as spam if it carries spam-like signals. These could be a high density of certain words, unusual punctuation, or an abundance of random links. The good news is that Google warns site owners about this via Search Console and will advise on how to fix the issue.
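As a rough sketch of what a spam-like signal might look like, the toy check below flags a page when any single word makes up too large a share of its text – a crude stand-in for a keyword-density signal. The threshold and logic are invented for illustration and are far simpler than Google’s actual spam classifiers:

```python
# Toy keyword-density check (hypothetical threshold, not Google's).
from collections import Counter

def looks_stuffed(text: str, max_density: float = 0.25) -> bool:
    """Flag text where one word exceeds max_density of all words."""
    words = text.lower().split()
    if not words:
        return False
    _, top_count = Counter(words).most_common(1)[0]
    return top_count / len(words) > max_density

clean = "our bakery sells fresh bread, cakes, and seasonal pastries"
stuffed = "cheap shoes cheap shoes buy cheap shoes cheap shoes online"

print(looks_stuffed(clean))    # False
print(looks_stuffed(stuffed))  # True
```

A check this naive would misfire on plenty of legitimate pages – which mirrors the point in the text: an honest site can trip a spam signal by accident, and Search Console is how you find out.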
If one page on your site contains spam-like content but the site overall is clean, only that particular page will have its ranking negatively impacted. After you fix an issue, it may take some time for the page to recover its ranking but as long as you avoid repeat violations your site should remain in legitimate standing.
Common Mistakes to Avoid
The Google team discussed other issues they often see site owners run into. Generally, Search Console will flag these but, in case you don’t use Search Console or just want to cover your bases ahead of time, take note of these common mistakes:
1. Outdated CMS Versions
Many website owners run their sites on a CMS, or Content Management System, like WordPress or Squarespace. These platforms are great because they have built-in SEO tools and templates that are optimized for user experience. That’s not to say that every WordPress site will receive a top ranking from the algorithm – especially since so many websites use the same CMS. However, these platforms help site owners start off on the right foot.
The problem with so many people using the same CMSs is that they become tantalizing targets for hackers. To counter this, WordPress and other CMSs frequently release new versions with security updates to block potential attacks. But each new, more secure release also leaves sites running older versions exposed, because the information CMS companies publish about the update can tip hackers off to the security holes that existed in the old version.
If these sites get hacked, they will likely be flagged as spam by Google – and you lose control of your content!
2. Outdated Plug-ins
Similarly, hackers can access sites through outdated and vulnerable plug-ins. Be sure to update these whenever new versions are released.
3. Spammy Comments
Allowing comments on your site is a great way to get user feedback, content ideas, and engagement. However, by letting site visitors comment on a page, you lose some of the control over your page quality. Excessive comments containing spam signals can cause your page to be flagged as spam. Google will notify you of these in Search Console so you can delete comments that contain spam.
4. Focusing Too Much on a Single Metric
From the conversation with Duy Nguyen, it is clear that one of the Search Quality team’s pet peeves is when site owners prioritize a single metric rather than prioritizing content quality. It’s a bit counterintuitive to de-prioritize metrics since the ranking algorithm uses metrics as inputs. However, Google’s request to users is to trust that the algorithm can correctly interpret these signals to identify high quality content on user-friendly sites. Focus on well-written articles, high-quality images, intuitive site navigation, and useful functionality.
When you’ve nailed all of those elements, it is still worth optimizing other metrics, which you can monitor in Google Search Console.
Key Takeaways

- Google’s ranking algorithm is designed to optimize the user experience.
- The algorithm takes in signals (defined by Google) as inputs and uses them to produce ranked search results as outputs.
- The algorithm is constantly being improved upon and has gotten quite good at weeding out spam.
- To avoid giving spammers any tips, Google keeps its algorithm a secret.
- Legitimate sites could be weeded out if they carry spam signals.
- Registering your site on Google Search Console gives Google a way to notify you when there are issues on your site so you can fix them.
The Search Off the Record podcast is a useful resource for SEO pros and site owners who want to learn more about Google’s processes and the technical side of things. The hosts, from Google’s Search Relations Team, can get quite silly, making it a fun listen!