Funny name. As it turns out (at least according to Kathleen McDivitt, MBA, author of the ebook The New Google SEO), Google recently revised its algorithm with the assistance of a man known simply as Panda. Maybe once you reach a certain level of notoriety in the Googlesphere, you can be referenced by a single name, à la Cher, Prince, Madonna or Beyoncé of the music industry. Meatloaf; does Meatloaf count? I’ll Google it later and let you know. Anyway, this Panda guy says to Google:
Hey, you guys should really revise your algorithm so that it more closely mimics the actions of a user, and I’ve created a way that we can evaluate how users behave.
So, Google conducted a study in which it asked end users to rate sites on three qualities: overall quality, design and trustworthiness.
Through assessing users’ feedback, Google recognized that the most emphasis was placed on trustworthiness. Panda (the person) then created a computer program, known as Google Panda, that would rate sites on these same three qualities. The explanation the book provided was that Google Panda is a:
…filter designed to weed out low quality pages. If a site has too many low quality pages, Google now flags the entire site as being untrustworthy.
Google, like any other for-profit institution, looks to improve its search results: it wants not only to provide the fastest answer, so that users keep returning, but also to make sure those instantaneous results offer an experience that is both relevant and trustworthy. Google also wants to surface the sites that offer the highest levels of engagement, which in turn provide value. Once Google Panda was up, running and flagging sites, it became clear to web developers and SEO professionals that the old best practices were no longer acceptable; in fact, they were now potentially harmful to an organic search ranking. A number of large, highly ranked sites reported to Google that their once-high rankings in search results had either decreased drastically or disappeared altogether. Though there are no absolutes in the world of SEO for Google, practitioners have learned there are a few major changes to best practices that now affect rankings.
1. Content-Specific Keywords

The classic SEO practice held that if sites featured the most commonly searched keywords generally aligned with what an organization did, those keywords would drive more traffic. While content managers continue to preach the value and importance of the most popular, blanket keywords, efforts should now be refocused on content-specific keywords. If you think about it, this makes sense. For example, Briteskies is an IT consulting firm that focuses on eCommerce web design, development and integration between front-end interfaces and ERPs. I could create content featuring predominantly common search terms, such as web design or companies that make websites. While these would undoubtedly drive a lot of traffic, users who land on our site may discover that our content doesn’t quite match what they wanted; perhaps they only really needed a graphic designer, or a hosting company such as MageMojo which “makes” websites appear online. If Briteskies.com continued to drive traffic to content users deemed irrelevant, Google would treat those visits as violations of Panda’s algorithmic rules, which state that a site must be of high quality, feature a clean design and be trustworthy. If Briteskies racked up enough of these negative points, Google would either shove our search results deep into the belly of its page-three limbo, or deeper still, or prevent briteskies.com from appearing in results at all.
Essentially, if you want to ensure that Google doesn’t think you’re trying to fool anyone with deceptive or inaccurate keywords, use ones that are much more specific to your products and/or services and the actual content of your pages. So, my keyword web design could change to eCommerce website development; it better describes what Briteskies does, driving traffic only from users who are really looking for that service while simultaneously telling Google Panda that we’re trustworthy. This practice will also decrease a website’s bounce rate: the rate at which users land on a page and immediately return to Google’s search results. A high bounce rate indicates users are not finding what they searched for on a particular site, which, again, tells Panda something’s amiss.
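If the arithmetic behind "bounce rate" is fuzzy, here is a minimal sketch in Python. The page paths and session data are entirely made up for illustration; a real analytics tool (Google Analytics, etc.) computes this for you.

```python
# Hypothetical illustration of bounce rate: of the sessions that *land* on
# a page, what fraction leave without viewing a second page?
from collections import defaultdict

def bounce_rates(sessions):
    """sessions: list of sessions, each a list of page paths visited in order.
    A 'bounce' is a session that saw exactly one page."""
    entries = defaultdict(int)   # sessions that started on this page
    bounces = defaultdict(int)   # of those, how many left immediately
    for pages in sessions:
        landing = pages[0]
        entries[landing] += 1
        if len(pages) == 1:
            bounces[landing] += 1
    return {page: bounces[page] / entries[page] for page in entries}

# Made-up example data: a generic keyword pulls in visitors who bail.
sessions = [
    ["/web-design"],                        # bounce
    ["/web-design"],                        # bounce
    ["/web-design", "/contact"],            # engaged visit
    ["/ecommerce-development", "/contact"], # engaged visit
]
print(bounce_rates(sessions))  # /web-design bounces on 2 of 3 landings
```

The point of the toy numbers: the vague keyword's landing page bounces two-thirds of the time, while the specific one holds its visitors.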
2. Really Good, Engaging, Relevant Content
The concept of well-written, relevant content hasn’t changed much across Google’s various algorithms; the goal was, and still is, to create copy, images and videos that a user finds useful, engaging or valuable in some way. What organizations now need to address is whether their content, as relevant or useful as it may be, is of the highest quality. If I made absolutely certain that each keyword on briteskies.com was perfectly suited to its page, but failed to check my grammar and spelling, offered only a few poorly written sentences, or had broken links and out-of-date landing pages, I couldn’t consider my content good, nor could I expect anyone else to. Again: good content begets quality, which ultimately begets trustworthiness. If users can’t find what they’re looking for on a website, or they land on a page that has expired, they will be more inclined to leave. A slick design and simple navigation won’t be enough if u kant spell or the event you created a page to promote happened in 2009. When creating content, giving users a reason to stay on a particular page, or to continue on to another, is paramount to a great website; if the party’s lame, people will leave, and Panda will make it so that no one else can find it.
3. Sites as Federations, not Independent Islands
When Google began indexing websites, it evaluated each page individually. This meant a website could have a fabulous, flashy home page and easy navigation alongside broken links or outdated information, and Google might still consider it a reputable site and rank it highly. It used to be that you could feature a couple of really good pages, a few iffy ones, hide the crap ones, and you were golden. Now, Google Panda assesses each page at face value, and if any of them sucks for any reason, your search rankings may be at risk. Pages must work together to collectively tell the same story, promote a brand or sell a product. Linking between pages must be fluid, the order must make logical sense, and every page should be up to date with excellent content and appropriate keywords. This carries especially heavy weight for eCommerce websites: low-quality or out-of-date product pages will deter buyers, making it all the more probable that potential customers will leave. An eCommerce site certainly doesn’t want customers fleeing the store while the Panda is watching.
My Site’s Been Hit; Now What?
Fortunately, Google’s Panda filter runs about every four to seven weeks. If an organization finds that its website has dropped in rank or vanished completely, there are a few things it can do to alleviate the situation. First, a company will want to conduct a thorough examination of its site using the aforementioned guidelines. It may be that a few updates and a bit of clean-up will do the trick, or feed the bear; Panda does its thing and you’re flying high again. Or, after examining a site and finding nothing adverse by the latest of Google’s standards, an organization may appeal to the Great and Powerful Google itself. As with all technologies, machines and good intentions, Panda is not, nor will it ever be, the end-all, be-all of Google’s website filters; it simply happens to be in vogue at the moment.
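One concrete, automatable piece of that site examination is hunting down the broken links Panda punishes. Below is a minimal sketch, assuming you already have a list of your own URLs to audit (the URLs here are placeholders); real audits typically use a crawler or a dedicated tool rather than a hand-rolled script.

```python
# Sketch of a broken-link audit using only the Python standard library.
import urllib.request
import urllib.error

def classify(url, status=None, error=None):
    """Turn one fetch result into an audit finding, or None if the page is fine."""
    if error is not None:
        return (url, error)
    if status is not None and status >= 400:
        return (url, f"HTTP {status}")
    return None

def check_links(urls, timeout=10):
    """HEAD-request each URL; return (url, problem) pairs for failures."""
    findings = []
    for url in urls:
        try:
            req = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                finding = classify(url, status=resp.status)
        except urllib.error.HTTPError as e:
            finding = classify(url, status=e.code)
        except urllib.error.URLError as e:
            finding = classify(url, error=str(e.reason))
        if finding:
            findings.append(finding)
    return findings

# Placeholder usage: check_links(["https://example.com/old-event-2009"])
```

Anything this flags (404s, timeouts, expired promotion pages) is exactly the kind of low-quality signal worth cleaning up before the next Panda refresh.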
However permanent or fleeting this way of viewing websites may be, Google makes the rules. Don’t believe me? Simply search it on the engine of your choice and tell me what you find.
Next: Google Panda: My Beef with The Bear