How to Survive Future Algorithm Updates

Sites that were almost solely dependent on search engine traffic were given quite the shock this past February when Google released the Panda / Farmer update. This major algorithm update was intended to hit the content farms hardest, but it also had a devastating effect on other types of sites, including ecommerce (you can see a list of over 300 sites here). Affected sites lost a substantial amount of traffic due to a loss of rankings for a huge percentage of their keywords.

Google’s main goal with this and subsequent Panda updates was to start purging low quality sites from its search results. So the question is, how can you protect your site from future algorithm updates?

Cut Down On The Advertisements, Including Google AdSense.

It’s somewhat amusing that one of the common denominators among the sites hit by Panda was that they were covered in ads, including Google’s very own AdSense. Since Google is looking to raise the quality of its search results, it wanted to take out sites that were seemingly devoted to ads: the ones where it is obvious that the sole purpose is to get a visitor to click on an AdSense or other CPC advertisement.

So if your site is overloaded with ads, whether banners, text links, AdSense, or other networks, you should consider toning it down. Ads might sit in the header and sidebar and be gently mixed into the content, but they should not overwhelm the page to the point that it feels like you are looking at 20% content, 80% advertisements.

Create Awesome, Lengthier, Media-Enriched Content.

Remember the old standard of 300 words equals great SEO content? It’s time to toss that out the window.

One of the sites hit hardest by Panda was EzineArticles, a hub of short, text-based articles that were also a bit heavy on the AdSense. In contrast, a site that was not hit by Panda was Squidoo. Both are article hubs, but with one difference: Squidoo encourages its authors not only to include a good amount of text but also to supplement it with additional media elements such as images and video. If articles do not include a good mixture of text plus other modules, they are not submitted to appear in search results.

My suggestion for articles is to aim for 500 – 600 words, but don’t make every article that length. Throw in some posts that are much longer and some that are maybe a bit shorter. Add images from Flickr Creative Commons and videos from YouTube to these articles when possible.

The key isn’t to do just the bare minimum for content development, but to make it robust, enjoyable and informative. You want everything to be of the highest quality possible so your visitors and search engines will love it!

This also means going beyond just articles to creating even more exciting content such as infographics. Infographics allow you to jam pack tons of useful information into a visually appealing package. Since they are easy to share, they will lead to more exposure for your brand, resulting in links and social engagement for your website. All of these will help boost your profile in search engines while also building your brand’s authority in your industry.

Lower Your Bounce Rate.


A high bounce rate tells search engines that people are not getting what they are looking for when they arrive on your site. If your bounce rate from search engines is high compared to another site on the same topic, it essentially tells Google that your competitor has better content to satisfy visitors. So in the end, if Google wants higher quality search results, it is going to give a better ranking to the site that satisfies visitors over the one that visitors exit as quickly as they entered.

This is where your Google Analytics comes into play. Start by looking at the bounce rate of traffic coming to your site from search engines for particular keywords. To do so, go to Traffic Sources > Sources > Search > Organic. You should see a list of keywords people used when they visited your site. Click on the Landing Page link under the graph and above the keywords, then use the Secondary Dimension dropdown and, under Traffic Sources, select Keyword.

Now, you will have a list of your most visited landing pages and the keywords that brought visitors to those pages. If you look to the right, you’ll see the bounce rate in relation to those pages. From this point, it will be your goal to see which keywords cause the highest bounce rates and what you can do to optimize the page in order to:

  1. Give visitors everything they need to know about the keyword on that page.
  2. Give visitors looking for that subject a reason to continue perusing your site.

It’s not necessarily the easiest task, but once you fulfill both of these items on your top landing pages (one of which is likely to be your homepage), you will start to see a lower overall bounce rate for your site, meaning that visitors are getting what they want and spending more time on your site.
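If you export that landing page / keyword report from Analytics as a CSV, you can also sift through it with a few lines of script instead of scrolling the interface. Here is a minimal sketch; the column names ("Landing Page", "Keyword", "Bounce Rate") and the 70% threshold are assumptions, so adjust them to match your actual export.

```python
import csv

def high_bounce_keywords(csv_path, threshold=0.70):
    """Return (landing_page, keyword, bounce_rate) rows whose bounce
    rate exceeds the threshold, worst offenders first.

    Assumes a CSV export with hypothetical column names:
    "Landing Page", "Keyword", and "Bounce Rate" (e.g. "85%").
    """
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Convert a percentage string like "85%" to 0.85
            rate = float(row["Bounce Rate"].rstrip("%")) / 100
            if rate > threshold:
                flagged.append((row["Landing Page"], row["Keyword"], rate))
    return sorted(flagged, key=lambda r: r[2], reverse=True)
```

Running this against your export gives you a prioritized list of page/keyword pairs to optimize first.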

Increase Your Website’s Social Proof.


Google has entered the arena of creating its own version of the Facebook Like button, namely Google +1, and then continued on to create its own social network, Google+. These two moves show that Google is definitely interested in social engagement, and that interest is likely to spread to its search engine results as well.

Social signals, such as tweets and Facebook likes, are already supposed to be influencing search results. You will want to ensure that your site is getting its share of tweets, Facebook likes, and +1's by the simple addition of social sharing buttons to your home page and your content. These shares and votes will essentially tell search engines that your content is so awesome that people want to share it with their network.

If you don’t have these three major buttons on your website, it’s time to go grab them. You can find them by visiting the official Tweet Button, Facebook Like Button, and Google +1 pages. Alternatively, you can use services such as AddThis and ShareThis (just to name a few) on any website to add the main three networks and more to your site.

Obey The Webmaster Guidelines.


Last, but not least, your best bet is to make sure you are following Google’s own Webmaster Guidelines to ensure that a future algorithm update, or random penalty, does not negatively affect your site. These range from simple things like creating a sitemap to avoiding link schemes designed to boost your rankings.
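Creating a sitemap, for instance, can be as simple as generating an XML file that follows the sitemaps.org protocol and submitting it in Webmaster Tools. Here is a minimal sketch; the URLs are placeholders, and a real sitemap could also carry optional fields like lastmod or changefreq.

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs,
    following the sitemaps.org 0.9 protocol (one <url><loc> per page)."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "{}\n"
        "</urlset>".format(entries)
    )

# Placeholder URLs for illustration:
print(build_sitemap(["http://www.example.com/", "http://www.example.com/about"]))
```

Save the output as sitemap.xml at your site root, then point Google to it from Webmaster Tools.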

Your Strategies To Survive Future Algorithm Updates

Those are my suggestions for protecting your website against future algorithm updates, especially where Google is concerned. But now it’s your turn – what are you doing to ensure your website will maintain its rankings and survive future updates?

About the Author: Kristi Hines is a freelance writer, blogger, and social media enthusiast. Her blog Kikolani focuses on blog marketing, including social networking strategies and blogging tips.

  1. We are focusing on high quality original content to keep the Panda from biting. Outbound links to authoritative sites also seem to have a significant impact on whether or not Google will like a particular blog post.

    I’ve noticed that my articles with authoritative outbound links all seem to pull greater SERP mojo with Google.

    • That’s a great point Leo. Linking to great authoritative sites shows Google that you are trying to provide the best resource for readers, even if it isn’t on your own site.

  2. As always great advice Neil. These are great ideas regardless of the SEO benefit from them.

    I am glad Google is focusing on some of its problems affecting its keyword search results and hope they continue on removing low quality links and content from their search index.

    • I agree Mr. Bryant. There are times when I search something and am frustrated because all of the results are just crap blended with a ton of ads. It would be nice to see those results go away and have sites like WebMD and Mayo rank for anything related to medical, for example.

  3. WOW… that’s scary… but great to learn all these points… maybe it’s time to say goodbye to a lot of old things… I believe the first two points can be controlled rather easily, but getting huge ‘likes’ or ‘tweets’ will be a little tough for those websites that get a small volume of traffic

    • True, but chances are people in the same industry have similarly low traffic volume sites. So if your site only has 10 tweets, but the competitor has 0, your social proof will still win! :)

  4. I cannot over extoll the utility of using a media-rich content approach. It has been my experience that people love media-rich content and with Panda, Google is trying desperately to figure out what users like.

  5. High quality article as always – a simple thanks!

  6. I think the bounce rate thing needs to be put in context, it depends on the type of site.

    I present a lot of information on my blog, and some of my best keywords are where people are looking for an answer to a particular question.

    If they receive that answer, they are likely to leave my site after reading the article.

    But they are also likely not to do another search on the same topic right away. Stepping in to Google’s mindset, this should be much more important than bounce rate.

    What Google wants to do is provide the user with the answer to the question, as soon as possible.

    I think this is a much more important factor than bounce rate, but it is impossible to see this in Analytics or anywhere else.

    When I write an information-type article, getting the user to go to another page is only secondary.

    My main goal is to not give the user a reason to perform another search on the same topic.

  7. Really useful blog! What is still not clear to me is the bounce rate. I understand perfectly that it is necessary to lower it and that you can track it with a web analytics tool. But what I don’t understand is how Google knows our bounce rates: do they get this information from the sites that use Google Analytics?

  8. http://www.youtube.com/watch?v=CgBw9tbAQhU

    Matt Cutts clearly states that Google Analytics data isn’t used in rankings or by the webspam team.

    • Don’t have a reference to hand, but I believe Google measures the bounce rate of a site based on the actions of users that have the Google Toolbar, and probably from Chrome users too.

  9. In the past I worked as an SEO consultant for a site which was hit hard by Panda, though unfairly so in my opinion. However, the Google changes make great sense, as they stop people from trying to game the system and produce worthless content. In the future I see keywords becoming almost an irrelevance, with good rankings awarded solely for original content and usability.

  10. Yeah, I agree that a site with too many ads is very annoying sometimes. There are still plenty of sites out there with irrelevant content that could ruin the search results. That’s why I’m so glad that Google came up with this Panda idea.

  11. Social media channels bring a lot of traffic, but the bounce rate is too high.

  12. That is a very nice article, thank you very much

  13. I really liked this post, and I also agree with the points mentioned. By the way, getting a better bounce rate is not a simple task, but I’m working on that. I hope to have it 10% lower after a few changes.

    Regards.

    Chori

  14. keep it short and simple, with quality over quantity.. i think that’ll make perfect sense

  15. Really a nice article… especially the social proof point… it has become kind of compulsory for sites to perform well.

  16. Thank you Kristi,

    Another great article,

    We can decrease the bounce rate of a website and work on great content. But it is difficult to get Likes, Tweets, or +1's for a business website.

    What do you think about it? How can we get social proof for this type of website?

  17. If web authors would focus on writing for people instead of search engines this would be far less of a problem. There is far too much concern about google rank and far too little concern with developing a relationship with users.

  18. Yeah, I agree with Daniel above. If people just created things for people, and didn’t worry so much about how high the search engines were going to rank them, then those search engines wouldn’t have any reason to come out with new tweaks to the algorithms to get rid of the crap.

  19. I completely agree that the focus is often on robots and not on real people. We are in the process of improving our Google SEO. I can see some dodgy stuff that has been done by lots of websites, especially linking to boost rankings. It irritates me because it’s not real! I am trying to focus more on quality and not quantity with not only the links, but also the content of the site.
