It’s hard to believe, but Google updates its search algorithm up to 600 times per year.
They’re constantly tweaking what you see in your search results, and their goal is to provide you with the best, most relevant results they can. If they can make a little spare change (a few billion dollars here or there) from ads, they’re even happier.
We don’t see or notice most of those updates. I like to compare it to a duck: when it’s sitting on the water, it looks completely relaxed, but beneath the surface its feet are always in motion. We only see what’s on the surface.
But we do notice some of those major updates and, as much as we’d like to, we just can’t ignore them.
There are three major updates from the past couple of months you should know about.
In this post, I’ll briefly cover each one, explain what it means for you, and show you how to keep up with Google’s expectations and improve your search engine visibility.
Google’s Panda first rolled out in 2011. This year, we saw the arrival of Panda 4.0.
Panda focuses on your website’s content and quality: on-site factors.
If your content is thin, if your website loads slowly, if it’s tough to navigate, or if it’s full of spam, it’s Panda’s job to take notice. Panda tries to keep keyword-stuffed websites from ranking for competitive search terms. Its goal is to match search results with quality content.
Panda is basically always updating. You’ll hear about big relaunches, like Panda 4.0, but Panda is always evolving, and it’s always looking for spam and low-quality websites.
Panda, contrary to popular belief, does not penalize user-generated content, or UGC: reviews, comments, forum posts, testimonials, feedback, or anything else you can think of.
Panda wants to make sure your overall site is high quality, so it probably won’t ding you for having one or two pages with thin content. Your ‘contact us’ page, for example, probably won’t be 1000+ words. And Panda understands that.
So, what did Panda 4.0 bring along?
Well, according to Google, it will demote your site in the search engine results if your content is low quality. But it won’t promote your site if your content is great.
Here’s an excerpt from Search Engine Land’s interview with Google’s Gary Illyes:
“But we don’t think of Panda as a penalty now, but rather as an algorithm applied to sites … or sites as a whole.
It measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages.
So essentially, if you want a blunt answer, it will not devalue, it will actually demote…
… No, I don’t think that [it] has the ability to promote.”
So, how has it affected search results? Here’s an interesting tidbit from Helpdesk Boston:
“Searchmetrics analyzed the effects of the Panda updates and listed the winners and losers, which could almost be grouped together by industry, or category. Sites like globalpost.com, spoonful.com, and songkick.com lost more than 75% of their former Google rank standing; gossip columns like thehollywoodreporter.com and aceshowbiz.com lost more than 50%, as did examiner.com. Conversely, sites that provided salient medical and health content like emedicinehealth.com and medterms.com increased their ranking by more than 500%. Interestingly, Zimbio.com, which is a celebrity gossip and trend-oriented site also went up by 500%. The difference? Original and relevant content combined with popularity, instead of recycling already-available or even questionable content.”
Takeaways: In the past, Panda unfairly punished sites that weren’t spammy. It seems to be stepping away from that now, but it’s going after duplicate, unoriginal, and thin content with renewed vigilance.
If you’re hoping to rank in the local SERPs, make sure all of your local service area pages are fleshed out with original, useful content.
In fact, do your best to make sure all of your pages are useful, even if it’s just providing the correct information on your “contact us” page.
Don’t stuff keywords. Just keep your head down and make the best possible website you can, with an emphasis on good content.
Remember: Panda can demote your website, but it’s not there to promote you. It’s one of Google’s defense mechanisms, not a reward system. Don’t try to skate by with thin, copied, or spammy content, and the constantly updating Panda 4.0 should leave you alone.
Penguin first arrived on the scene in 2012. Whereas Panda sought to evaluate quality content, Penguin sought to evaluate quality links.
Links are a huge, deciding factor in Google’s search algorithm. If your site has quality backlinks from other relevant websites, it’s bound to rank higher than a competitor’s site that has no backlinks.
That’s why link building is still a thriving industry.
The problem is, link building was very spammy and very manipulative for many, many years. In the past, a link building firm could buy 2,000 backlinks from spammy websites, and their own client’s site would push up to the front page of Google.
Penguin fought to change that.
Overnight, spammy links became far less valuable; in some cases, those spammy links even served up search ranking penalties, or got offending websites de-indexed altogether.
Well, Penguin 4.0 is here, and it’s a bit different.
Here’s the short version:
-Like Panda 4.0, Penguin 4.0 is now constantly updating and changing, and it is now a core part of Google’s algorithm: Penguin is always watching
-Penguin 4.0 seems to devalue spammy links instead of outright penalizing them
-Instead of penalizing your entire site, Penguin 4.0 may just penalize a certain page that has spammy links pointing at it
Takeaways: If you have any spammy links pointing at your site, clean them up.
Penguin may not penalize you for some of them, but it’s still taking your links into consideration in real time.
Don’t build any spammy links, because Penguin 4.0 will catch on much more quickly than past Penguins ever could. Conversely, if you’re under penalty and you work hard to disavow those links, you should recover more quickly.
If you’re building links, make sure they come from relevant sites. If a site is related to your niche or industry, it’s relevant. Similarly, if a site comes from your local area or geographical location, it’s also relevant. Local links are great for local SEO, and that has not changed.
And, know that your search rankings may be unpredictable for a while, and you may not know why. That happens with any huge update. As long as you keep your backlinks relevant and stay away from spam, you’ll likely be alright in the end.
Now, here comes the big one for anyone working with a business that hopes to rank in Google’s local three pack, or anywhere in the local search results.
According to a study from Joy Hawkins and BrightLocal, Google’s Possum update changed 64% of local SERPs. That’s a huge deal.
Here’s what it did:
-It pushed many businesses located just outside city limits higher in the adjoining city’s local search results, sometimes even into that city’s three pack
-It filtered listings with duplicate addresses and phone numbers out of the search results
-The physical location of the user performing the search now impacts results much more heavily
-Search results vary more based on search terms and keywords (‘Lithia Ford’ vs. ‘Lithia Ford Boise,’ for example, show different search results)
So, basically, local search results are in flux.
If your business is right outside the city limits, you’re now more likely to show up in the local results for that city. That’s a good thing. Eliminating some needless duplicate listings is also a good thing.
Takeaways: We don’t really know what to make of Possum just yet, but people are studying it.
I’ll let Joy Hawkins give you her advice, because it’s just as good as (or better than) anything I could offer:
“I have spent the last couple of months analyzing dozens of specific scenarios for businesses to try to find patterns in what changed. Some of my findings have been included in recent articles I’ve written (This one shows some patterns, and this one shows some things that impact the filter).
For now, it’s crucial for local SEO practitioners to spend time analyzing changes to help figure out which Local Ranking Factors changed as a result of Possum. So far, I have been realizing that the answers are becoming harder and harder to find as Google’s algorithm becomes more complex — and it’s no longer as easy as “getting the most reviews” or keyword-stuffing categories, which worked phenomenally several years ago.”
Possum seems to be focusing on:
-The age of your listing
-Your organic ranking
-Whether or not you have any duplicate listings
-Whether you’re spammy or not
Double-check your Google My Business listing, make sure everything is in order, and then follow Joy’s advice.
Keep an eye on your local listing, your standings in any Google local packs, and your organic rankings. Make sure your content is high quality. Don’t build any spammy links.
That may seem like common sense advice, but it’s what we have to go on right now. With the advent of Possum, Penguin and Panda seem easy.
With Possum in play, we all need to pay attention to what the brightest minds in SEO are finding from this new algorithm update. They’re already steering us in the right direction, but we still have a lot to learn.
Thanks for reading!