In an average year, Google makes about 500 changes to its search engine algorithms. If it seems like Google has been making more—and bigger—algorithm adjustments lately, you’re not just imagining things.
Google owns about 66 percent of the search engine market share, according to comScore’s September 2012 rankings, and this number has remained steady for several years. Even so, Google is getting “more competitive pressure” from Microsoft Bing and is therefore even more focused on improving its search engine technology, Meyers explains.
In addition, Google’s revenue model is “maturing,” he says. “They are getting to the point where revenues are plateauing. That puts additional pressure on improving their search results.”
The result: Since February 2011, with its widely reported Panda (also known as Farmer) update, Google has made some whopping algorithm updates. Here’s what you need to know to make sure your SEO efforts are still paying off.
Google Panda: Putting Content Farms Out to Pasture
What it is: In February 2011, Google rolled out a major new algorithm. It was called “Farmer” because it was targeted at demoting high-volume content farms in Google search results. The update eventually became known as Panda, a reference to the name of Google engineer Navneet Panda. Since February 2011, Google Panda has been updated 20 times, Meyers says.
The initial Panda update reportedly affected the rankings of nearly 12 percent of all search results, according to the Search Engine Land blog.
Google’s goal: Panda was designed to push down sites that are overly optimized, offer “thin” content and/or operate as content farms, explains Michael Martin, SEO manager at Covario, a global search marketing agency. (A content farm produces large amounts of content specifically to attract traffic from search engines and use those page views to generate easy advertising revenues.)
Meyers gives as an example a pest control service, operating nationwide, which may have created a specific Web page for every U.S. city in which it operates. The content on those pages is nearly identical except for the different geographic locations. With Panda, Google’s search technology is better able to identify nearly duplicate content like that, recognize that those pages offer no real value to its users and push that content way down in search result rankings.
What you should do: Don’t create content simply based on keyword optimization or post thousands of pages with nearly duplicate content. If you do, Google is likely to push down your entire site in its rankings, Meyers advises. Instead, make sure your site’s content is as unique as possible and that it adds reader value. Ask yourself: “What does my content do for people who find it?” Does it help them, educate them or engage them in some way?
Sometimes, duplicate content is part of what a company legitimately offers. A large publishing company, for instance, may publish the same article on multiple sites it owns. In those cases, to avoid a Google penalty, publishers should properly identify the parent content and make sure others use rel=canonical to point back to the original content, Martin says. (You can learn more at the Google Webmaster Tools’ rel=canonical tutorial.)
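As an illustration, a rel=canonical tag is a single line placed in the HTML head of each duplicate page. The URLs below are placeholders, not real sites:

```html
<!-- On the syndicated (duplicate) copy of the article, this tag in the
     <head> tells search engines where the original lives, so ranking
     signals consolidate on the parent content instead of splitting. -->
<head>
  <link rel="canonical" href="http://www.example.com/original-article" />
</head>
```

The original article itself needs no tag; only the duplicates point back to it.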
Google Penguin: Putting Link Abusers On Ice
What it is: First announced on April 24, 2012, the Penguin update was “huge,” Meyers says. Unlike previous algorithm updates, he adds, Penguin was more punitive, as opposed to simply being designed to improve search quality.
Initially, Penguin affected about 3.1 percent of English-language search queries, according to Search Engine Land.
Penguin sought to decrease rankings for websites that engaged in dubious link exchanges, built unnatural links, relied too heavily on identical anchor text links and so on. (Anchor text links are hyperlinks that contain a targeted keyword phrase.)
Google’s goal: With Penguin, Google is cracking down on a common black hat SEO practice: abusing links to gain search engine rankings. If you paid for links from lots of dubious, low-quality link directories, link exchanges and other sites, you may have felt the Penguin slap.
What you should do: Penguin has already been updated twice and is likely to be updated again soon, Meyers says. As a result, link quality and diversity matter more than ever. Earn “natural” links from a variety of other quality sites by posting compelling, useful content.
Don’t focus on getting links from other sites using identical anchor text. “Look at where your links are coming from using Google Webmaster tools and what the anchor text links are,” says Ting-Yu Liu, Covario’s manager of paid media services. “Try to have at least 60 percent keyword diversification. If you have 80 percent of external sites linking to you with the same anchor text, that’s a problem.”
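The 60/80 percent rule of thumb above is easy to check once you have exported your backlink anchor texts from Google Webmaster Tools. A minimal sketch (the function name and sample data are illustrative, not from any real export):

```python
from collections import Counter

def top_anchor_share(anchors):
    """Given a list of anchor-text strings from a backlink export,
    return the most common anchor text and its share of all links."""
    counts = Counter(anchors)
    text, n = counts.most_common(1)[0]
    return text, n / len(anchors)

# Hypothetical profile: 8 of 10 backlinks reuse one exact-match phrase.
anchors = ["best pest control"] * 8 + ["acme pest services", "acme.example.com"]
text, share = top_anchor_share(anchors)

# An 80 percent share of a single anchor text is well past the
# guideline quoted above and signals an over-optimized link profile.
print(f"'{text}' accounts for {share:.0%} of backlinks")
```

If the top share comes back above roughly 40 percent (that is, below 60 percent diversification), that is the signal Liu warns about.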
Google Knowledge Graph: Find Information Without Visiting Any Sites
What it is: In May 2012, Google launched an upgrade to the Knowledge Graph, its database of more than 500 million people, places and things. Search for Tom Hanks, for example, and to the right of the search results you should see information about Hanks: a photo, who he is, when he was born, his spouse and children and so on. This information is pulled from various sources, such as Wikipedia, which Google uses for its Knowledge Graph.
Google’s goal: To immediately serve users the information they might be looking for without requiring them to click to any sites to find it.
What you should do: If you post recipes, movie showtimes, event listings or other structured data on your site, make sure you meet Google’s guidelines for such data. The website Schema.org is a great place to learn more about structured data and where Google may be heading with its Knowledge Graph—which Meyers thinks will only become more important in the future.
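To make structured data like a recipe recognizable to Google, Schema.org defines standard HTML markup. A minimal, invented example using the Schema.org Recipe type (the recipe details are placeholders):

```html
<!-- Microdata markup telling search engines this block is a recipe.
     The itemprop names (name, prepTime, recipeYield, recipeInstructions)
     come from the Schema.org Recipe vocabulary. -->
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Classic Banana Bread</h1>
  <time itemprop="prepTime" datetime="PT15M">15 minutes prep</time>
  <span itemprop="recipeYield">1 loaf</span>
  <div itemprop="recipeInstructions">
    Mash the bananas, mix with the dry ingredients, and bake until golden.
  </div>
</div>
```

Marking up existing pages this way costs little and makes your data eligible for richer treatment in search results.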
The 7 Results Update: Google Decides Less is More
What it is: Beginning in August 2012, for some keyword searches Google began serving seven organic listings on the first page of its search engine results pages (SERPs) instead of the standard 10 listings. For example, if you Google the blog name Search Engine Land, you’ll likely see only seven results on the first page. The same thing happens when you look up CIO.com.
In a random sampling of 26,000 brand keywords across industries, 95 percent of keywords with only seven results had a top result that contained sitelinks, according to enterprise SEO platform BrightEdge. Sitelinks are links that point to internal pages on a website. For instance, the No. 1 result for the query Search Engine Land is that blog’s home page, with six indented sitelinks underneath the top result. The same is true for CIO.com.
As a result, “you can expect that all branded keywords—for you and your competition—will now have seven link results,” according to Jim Yu, BrightEdge CEO and founder.
Google’s goal: Ever the speed demon, Google is likely trying to answer queries even faster by serving up seven results instead of 10, says Meyers. Given that the top result in these instances also has six sitelinks, however, Google is still matching the user’s query with a lot of links.
What you should do: Be aware of the big potential downside to Google’s seven-results-only pages. If your site’s most highly ranked page for an important keyword lands at No. 8 or 9, that page may now end up on the second SERP for that keyword, Meyers says. This could cause your site to lose traffic, as one well-known study found that 93 percent of search engine users don’t bother venturing past Google’s first SERP.
“Pay attention and don’t take anything for granted,” Meyers advises. “Don’t just look at your ranking software tools to see how you’re doing. They might tell you you’re no. 8, but they won’t tell you if no. 8 is on the first or second page of Google results.”
That’s why it’s important to regularly Google your company name, product name or your own name. It’s nearly impossible to get “pure” search engine results anymore, as Google serves up results based on your location, sites you’ve visited and other data it knows about you. However, Meyers says, you can get a less personalized look at results using Google Chrome’s incognito browsing mode and setting your location to “United States” (or another country).
More Ways to Stay in Google’s Good Graces
Here are several additional strategies that will help keep you a step ahead of changing algorithms.
Keep your bounce rates low. If lots of Google users find a page on your site in a keyword search, click through to that page and then quickly return to Google’s search results, your page will have a high bounce rate. That’s a signal that users didn’t find your content helpful—and Google is “starting to consider signals like that from user behavior” when ranking content, Meyers says.
Get visitors engaged in your content. When evaluating links from external pages pointing to your content, Google is increasingly looking for “real social activity and commenting on your content,” Martin says. “Google wants to see a qualifying scale of true human interaction with a page before it counts a link from that page as a true vote for your site.” Links to your content shared across Google+ are one way for Google to validate the user engagement of your content, he adds.
The bottom line: “Create great content that people want, instead of content just designed to get eyeballs to your site,” Martin says. “Great content will get true interaction from actual people. That’s what’s best for your business now, and that’s what will be best for your business in the future.”
James A. Martin is a seasoned tech journalist and blogger based in San Francisco and winner of the 2014 ASBPE National Gold award for his CIO.com blog. He writes CIO.com's Living the Tech Life blog and is also a content marketing consultant.