As 2010 comes rapidly to a close, I look back on my second year as a search engine marketer and reminisce about the roller coaster ride. From Caffeine & Mayday, to switching marketing agencies, to Google’s new local SERPs, to attending my first industry conference, to the Bing/Yahoo integration, it sure has been a crazy year, and I bet I’m not alone in saying that I can’t wait for 2011! But as an SEO professional, I think it’s imperative to pay close attention to what has happened in the past in order to structure a game plan for the future. With that, here are five search trends that we should all pay close attention to in 2011.
1. Content Super-Optimization – The main factors in on-page optimization have been set in stone for a while now, and despite what John Gruber thinks, optimizing your Title Tag for target keywords IS still a good idea. But what really excited me this year was SEOmoz’s Latent Dirichlet Allocation tool. The LDA tool was built on the assumption that Google uses some form of topic modeling to determine the relevance of a piece of content. This concept takes the art of on-page optimization to a whole other level, and with Virante’s recently released LDA Content Optimizer, there is now a way to make sure a piece of content’s relevance is super-optimized. These tools are still in their infancy, so you can be sure that SEOmoz, Virante, and others will explore the application of topic modeling even further, changing the way SEOs tackle on-page relevance in 2011.
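The internals of those tools aren’t public, but the underlying idea – scoring how well a page’s text matches the topic distribution of a target query – can be sketched with off-the-shelf pieces. Here’s a hypothetical, minimal illustration using scikit-learn’s `LatentDirichletAllocation` (a stand-in, not the actual SEOmoz or Virante implementation, and trained on a toy corpus purely for demonstration):

```python
# Hypothetical sketch of topic-model relevance scoring, in the spirit of
# the SEOmoz/Virante LDA tools. NOT their actual implementation.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus; a real tool would train on a much larger crawl.
corpus = [
    "seo search engine optimization title tag keywords ranking",
    "local search google places reviews ratings maps",
    "mobile smartphone search apps location checkin",
]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)

# Fit a tiny topic model over the corpus.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

def topic_similarity(text_a, text_b):
    """Cosine similarity between the inferred topic distributions of two texts."""
    vecs = lda.transform(vectorizer.transform([text_a, text_b]))
    a, b = vecs
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

page = "optimizing title tag and keywords for search engine ranking"
query = "search engine optimization"
print(round(topic_similarity(page, query), 3))
```

A higher score would suggest the page’s vocabulary sits in the same topic neighborhood as the query, which is the rough intuition behind “super-optimizing” content relevance rather than just repeating a keyword.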
2. From Reviews to Ratings – 2010 saw the battle between Google and Yelp, the launch of Google’s new loco-organic…errr…O-pack…errr…Integrated Places Results, and most recently, the launch of Google Hotpot. Google’s been interested in reviews for a while, grabbing what they can from Yelp, CitySearch, Urbanspoon, and the like. But with Hotpot, Google has signaled that they don’t want to rely on other sites for this information; they want to source it themselves. Enter ratings, the 2011 of reviews. The user interface of Hotpot takes most of the friction out of the reviewing process, relying solely on a star rating, a quick tip, and in some cases, smiley/frown sentiment indicators. Google has really only promoted Hotpot in the Portland area, but expect this to be a big deal in 2011, possibly making its way to the front page of local SERPs.
3. Checking-in to UGC – Google loves user-generated content. It’s fresh, it’s (usually) crawlable, and it shows them that a site has visitors who are active and engaged. With the 2010 explosion of location-based services like Foursquare and Gowalla, you can be sure that Google is looking to these platforms in their never-ending quest to add more and more context to local search. You can already see that Google is indexing “Tips” from Foursquare in Google Places, and I think it’s only a matter of time (if it’s not happening already) before Google begins using check-in velocity & tip activity as factors that influence local search rankings.
4. Goin’ Mobile – You thought 2010 was the Year of the Smartphone? You haven’t seen 2011 yet! Even though smartphone adoption in the United States rose to 28% of all cell phone subscribers in 2010, up from 10% just two years ago (source: Nielsen), I think we’ll see growth to at least 45% in 2011. And with the increase in smartphone usage comes an increase in mobile search. Not only does this give Google more information on where you go and where you search from, but also more context for those on-the-go searches, providing even stronger signals for local search rankings and a more personalized experience than ever before. And if you’ve ever searched on Google from your mobile phone, you’ll probably agree it’s not the greatest experience. Expect a big UI upgrade for Google mobile in the first half of 2011.
5. I Link, Therefore I Am – Gone are the days when you could win rankings for competitive keywords with footer-stuffed inbound links carrying exact-match anchor text from irrelevant, high-PageRank sites. Nowadays, links need to be earned from topically relevant websites, with varied anchor text, at a consistent pace over time. But there are still loopholes in Google’s system and obvious ways that links are manipulated yet still rewarded. This is where I think sentiment analysis can play a big role. Google Places and other review sites have been using sentiment analysis to filter large quantities of UGC for quite some time. And while still relatively basic, the technology behind sentiment analysis is improving to the point where it could be run on indexed content to determine the context around a link. Combined with content relevance and other inbound-linking factors, sentiment analysis has the potential to increase the quality and relevancy of Google’s search results, ultimately creating a better experience for the everyday searcher in 2011. And maybe, just maybe, we can avoid another DecorMyEyes fiasco.
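To make the idea concrete: the simplest form of link-context sentiment analysis just scores the words surrounding an anchor against a sentiment lexicon, so a link earned from “avoid this terrible store” counts very differently from one in “I highly recommend.” Here’s a toy sketch of that intuition (purely illustrative; whatever Google actually does, if anything, is unknown, and real systems use far richer models than a hand-made word list):

```python
# Toy link-context sentiment scorer. Illustrative only -- a hand-made
# lexicon and word window, not any search engine's actual method.
import re

POSITIVE = {"great", "recommend", "helpful", "excellent", "love"}
NEGATIVE = {"scam", "terrible", "avoid", "awful", "complaint"}

def link_context_sentiment(text, anchor, window=8):
    """Crude sentiment score for the words within `window` of each
    occurrence of `anchor` in `text` (positive hits minus negative hits)."""
    words = re.findall(r"[a-z']+", text.lower())
    anchor_words = anchor.lower().split()
    score = 0
    for i in range(len(words)):
        if words[i:i + len(anchor_words)] == anchor_words:
            before = words[max(0, i - window):i]
            after = words[i + len(anchor_words):i + len(anchor_words) + window]
            context = before + after
            score += sum(w in POSITIVE for w in context)
            score -= sum(w in NEGATIVE for w in context)
    return score

review = "Avoid this terrible store DecorMyEyes at all costs, total scam"
print(link_context_sentiment(review, "DecorMyEyes"))  # → -3
```

In a ranking system, a strongly negative context score like this could discount (or even invert) the value of the link, which is exactly the loophole the DecorMyEyes episode exposed.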