You Never Step Into the Same Google Twice
Last year was a big year for SEO, and much has changed going into 2013. The Google Penguin update decimated techniques that SEO had depended on for years. The update rolled out with no notice, and many top websites suddenly vanished from search results. The good fundamental practices still worked, but the special SEO extras that had boosted a website's ranking by a few notches were suddenly not only useless, but working against those websites.
Why would Google do this? Google's trade is being the best organic search engine. What does the best search engine do? It finds the website you want before you even know you want it. How does it know? To be honest, it doesn't. What Google's search engine can ascertain, though, is whether a website is popular, and whether its text coincides with what the person is searching for. So to answer the question of why Google would do this, you first have to understand something fundamental about human nature and systems.
And that is that some humans will always find a way to game the system to their advantage, no matter what that system is. The Internet is lousy with sneaky "black hat" SEO gurus who exploit the limitations of the system. Ever since search engines were first designed, someone has always found a way to push their website to the top, whether through hidden text, bought links, or links embedded in comments, usually so a website can get more clicks and get paid for each one. Hence the new Google Penguin algorithm update. It separates legitimately popular and useful content (with decent SEO) from less popular, only somewhat useful content (that happens to have great SEO). Google would rather give the number one spot to a page that became popular naturally than to one that is only somewhat useful but has great SEO.
While these new Google algorithms are very complex, there are two big changes to think about in 2013. The first is the way incoming links are evaluated, and the second is the way the text on a website is indexed.
Inbound Links
Inbound links are how Google can tell whether something is popular. If a thousand different websites link to an article, then Google knows that article is very popular, and therefore it's a good candidate for a top spot. And if 100 of those thousand websites are themselves popular, that raises the rating even higher. It's the equivalent of having a lot of friends who vouch for you, where a subset of those friends are influential in your community.
Why was this a problem? Because there were all sorts of workarounds for getting links. There are websites whose sole purpose is to let webmasters link out to their own sites (some paid, some free), and you can link to your own site by leaving a comment on a popular blog (which is why nonsensical comment spam exists). This is the equivalent of getting a friend to impersonate a former boss for a good reference when applying for a job. The Google Penguin update makes it so the only links that count are ones from legitimate websites, ideally from someone you didn't ask: they simply liked your site and wanted to link to you. So if you (or the person in charge of your SEO) got links to your site from a million different websites that are worthless in Google's eyes, those links now actually work AGAINST your site. And they are hard to get rid of. So in 2013, the best practice for earning links from legitimate sites is to create content so interesting that people just have to share it and link to it.
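To make the before-and-after concrete, here is a toy sketch in Python. It is purely illustrative: the link graph, popularity scores, quality threshold, and function names are all invented here, not Google's actual (and secret) algorithm. It only captures the intuition that links are weighted by the linker's own standing, and that after Penguin, links from worthless sites can subtract from your score instead of adding to it.

```python
# Toy sketch of link evaluation before and after a Penguin-style update.
# NOT Google's real algorithm; all sites, scores, and thresholds here are
# invented for illustration. Each link is (linking_site, linker_popularity),
# with popularity as a made-up 0-to-1 score for the linking site itself.

weak_links = [("link-farm-{}.example".format(i), 0.01) for i in range(1000)]
strong_links = [("major-news.example", 0.9), ("popular-forum.example", 0.7)]

def old_score(links):
    """Pre-Penguin intuition: every inbound link adds value,
    weighted by the linking site's own popularity."""
    return sum(pop for _site, pop in links)

def penguin_score(links, threshold=0.2):
    """Post-Penguin intuition: links from sites below a quality
    threshold subtract from your score instead of adding to it."""
    return sum(pop if pop >= threshold else -pop for _site, pop in links)

all_links = weak_links + strong_links
print(old_score(all_links))      # ~11.6 -- a thousand spam links used to help
print(penguin_score(all_links))  # ~-8.4 -- now those same links hurt
```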
Text
The way text was handled on a website was much simpler before Penguin. If you were selling, for example, scented soap, then all you had to do was make sure the exact phrase "scented soap" was in the keywords, the title tags, the meta description, and the body text. Why did Google change this? Because Google cares most about the actual humans using its service, and actual humans want natural-sounding text, not something that sounds written by a machine or that awkwardly crams in phrases. So now the algorithm is more sophisticated: it doesn't just look for the two words side by side, it looks for similar words used in different ways throughout the text, it allows the words in the phrase to appear apart from one another, and it generally checks that the writing sounds natural. So good writing is more essential to SEO in 2013 than ever. The more natural the writing on your website or blog, the better your chance of ranking higher. If your text seems stilted and unnatural, it will count against your ranking.
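Here is another toy sketch of that shift, again in Python and again purely illustrative: the two matching functions below are my own crude stand-ins, not Google's real indexing, which weighs synonyms, grammar, and many other signals.

```python
# Toy sketch of old exact-phrase matching versus a looser check.
# Invented for illustration only; Google's real indexing is far more
# sophisticated (synonyms, grammar, statistical language models).

def exact_phrase_match(text, phrase):
    """Old-style check: the exact phrase must appear verbatim."""
    return phrase.lower() in text.lower()

def loose_term_match(text, phrase):
    """Looser check: every word of the phrase appears somewhere in the
    text, in any order and position (a crude stand-in for recognizing
    related words used naturally throughout the copy)."""
    words = set(text.lower().split())
    return all(term in words for term in phrase.lower().split())

page_text = "Our soap is hand-made and scented with lavender and rose"
print(exact_phrase_match(page_text, "scented soap"))  # False -- words not adjacent
print(loose_term_match(page_text, "scented soap"))    # True  -- both terms present
```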
The Moral of the Story
Though 2012 was a hard year for many websites, the lesson to take into 2013 is that while the fundamental practices of SEO will always hold true, the search engines are constantly evolving. So make sure you (or the person in charge of your SEO) stay on top of the ever-changing SEO world.