Spring Cleaning Your Website: 5 Minimum Effort, Maximum Effect Resolutions

By Monique Pouget

This Sunday, most of the country (except those pesky Zonies) will celebrate one of my least-favorite holidays: Daylight Saving Time. While I’m mostly bummed about losing an hour of sleep, there is a bright side: spring is a great time to clean out what you have and reflect on what’s to come.

So, we’re kicking off “Spring Forward” month on the blog with a little website R&R. Rather than listing everything that goes into a full site audit, the goal of this post is to showcase the areas of your site that could use some spring cleaning. Who doesn’t love minimum effort, maximum effect, am I right? Let’s get started.

1. Metadata

Metadata has grown way beyond title and description tags (and, of course, the long-ago-dismissed meta keywords tag). It now encompasses directives to search engines like rel=canonical, Google+-only tags, the gradually-catching-on schema.org tags (like the popular authorship markup), and social media tags like Facebook’s Open Graph markup.

In the spirit of spring cleaning, let’s focus on the low-hanging fruit that deserves renewed attention: title and description tags, and more recently, authorship markup.

Keyword research is usually one of the first things we focus on when working on a new campaign. What types of terms does the client want to be found for? How does this compare to the words their customers are using to discuss similar products or services? Which terms are the most competitive and where are the opportunities for this website? Keyword research influences anything and everything, from complex content strategies to basic metadata.

Title tags used to be defined by the number of characters search engines displayed in results (fewer than 70 characters, to be exact). More recently, some studies have shown that search engines don’t strictly follow a character count, and that pixel width is what actually matters. Narrow characters like “I”, “l”, “t” and “r” take up less room than wide ones like “W”, “N”, “R” and “E”, so a title full of narrow letters can squeeze in more characters before it gets truncated.

While we’d never recommend using only thin letters to squeeze more characters in (but I’d like to see you try), this is an interesting departure from the 10 Commandments of SEO (do those exist?). Instead, keep this in mind when creating title and description tags, and don’t let character limits get in the way of an awesome meta tag. Or even better, check out this thorough list of 18 meta tags every webpage should have in 2013.
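If you want to eyeball how your own titles stack up, here’s a back-of-the-napkin sketch in Python. The per-character pixel widths are made-up illustrative values, not Google’s real font metrics, so treat the output as a way to compare titles against each other rather than as a hard cutoff:

```python
# Back-of-the-napkin title width estimator. These per-character pixel widths
# are made-up illustrative values, NOT Google's real font metrics; use the
# output to compare titles relative to each other, not as a hard limit.
NARROW = set("iIjlft.,;:'!|")
WIDE = set("mwMW@%")

def estimate_pixels(title, narrow=5, regular=9, wide=14):
    """Approximate rendered width: thin glyphs count less, wide glyphs more."""
    return sum(narrow if c in NARROW else wide if c in WIDE else regular
               for c in title)

for title in ("WINDOW WASHERS OF WORCESTER | Home",
              "little italian trattorias in illinois | Home"):
    print(f"{len(title):>2} chars, ~{estimate_pixels(title)}px: {title}")
```

Run it on two titles of similar character counts and you’ll see the pixel estimates diverge, which is the whole point of the pixel-width argument above.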

March is a great time to rethink your current meta tags. Have your focus keywords shifted, meaning it’s time to revamp or start fresh? What’s your on-page report card grade? Are you one of the few sites implementing authorship correctly? You can confirm this in Webmaster Tools under Optimization > Structured Data, or test URLs individually with the Google Structured Data Testing Tool.

(Side note: Schema markup for breweries?! Rad.)
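And if you’d like to spot-check titles and descriptions at scale while you’re at it, here’s a rough sketch. The URLs are placeholders, and the regexes are deliberately simple; a real audit should lean on a proper HTML parser:

```python
# Rough meta-tag audit: flag missing titles/descriptions and duplicate titles
# across a set of pages. PAGES holds placeholder URLs; swap in your own.
import re
from collections import defaultdict

import requests  # third-party: pip install requests

PAGES = ["http://example.com/", "http://example.com/about"]

titles = defaultdict(list)
for url in PAGES:
    html = requests.get(url, timeout=10).text
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    # Assumes name="..." comes before content="..."; adjust for your markup.
    desc = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.I | re.S)
    if title:
        titles[title.group(1).strip()].append(url)
    else:
        print(f"MISSING TITLE: {url}")
    if not desc:
        print(f"MISSING DESCRIPTION: {url}")

# Duplicate titles are a classic sign that pages need fresh metadata.
for text, urls in titles.items():
    if len(urls) > 1:
        print(f"DUPLICATE TITLE {text!r} on: {', '.join(urls)}")
```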

tl;dr: No matter how you slice it, cleaning out your metadata and adding structured data to your site can lead to some serious SEO benefits.

2. Site Performance

At SearchFest last month, there were several presentations that focused on site performance and speed, and how this affects both user experience and rankings. In fact, in 2010, Google even announced that site speed was a ranking factor, making it impossible for us to ignore as marketers, right?

Slow-loading pages see higher bounce rates and shorter visits, while fast-loading pages can be crawled and indexed quickly, meaning more traffic opportunities in the SERPs and, ultimately, more conversions.

So wake up, couch potato! Here are a few tools to help you test your site performance:

  • Web Performance Best Practices is a nice resource from Google that breaks down the different aspects of page load optimization.
  • PageSpeed Insights is a simple and insightful site performance tool that Max showed me this week. You can enter any URL, and the tool will return details about your site with helpful suggestions to make your pages faster. It breaks everything down by priority, shows you what needs to happen on your pages, and provides a PageSpeed score on a scale of 0 to 100. Suggestions range from “leverage browser caching” to “serve scaled images.” At the very least, take care of the “High Priority” items. We saw a client go from a mid-60s score to a low 90s after addressing all the “High Priority” and most of the “Medium Priority” recommendations. Thanks, Big G! (If you’d rather pull scores programmatically, see the API sketch after this list.)

  • WebPageTest is another page testing tool, and it’s one of the more popular and detailed resources. Added benefits include testing from multiple geographic locations, in different browsers, over different connection types, and on mobile devices. I find the breakdown between first visit and repeat visit interesting, and I like the way WPT makes simple suggestions for those who aren’t as technical.
  • Google Analytics also has a site and page speed section (Content > Site Speed > Overview). It shows you which landing pages are the slowest, which campaigns correspond to faster page loads, and how page load time varies across geographies and browsers. By default, this feature only samples 1% of your site visitors, and the browser timing data it relies on isn’t collected from every browser, Safari included (goodbye, iPhone data!), but here’s a handy workaround for increasing sample size.
  • Want to see GA site performance data in one spot? Check out this sweet Site Performance dashboard for GA that I found in Modesto Siotos’ site speed optimization post, which is an incredible resource in itself.
  • Want to tie site speed to the bottom line? The Conversion Loss Calculator estimates the revenue you stand to gain by cutting page load time, which resonates with clients.
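As promised above: if you’d rather pull PageSpeed data programmatically than check one URL at a time, the API makes that easy. A minimal sketch, assuming you’ve grabbed an API key from the Google APIs console and installed the third-party requests library (the v5 endpoint shown here is the current one; Google has reversioned this API over time, so check the docs):

```python
# Minimal sketch: pull a PageSpeed score for a URL via the API.
# YOUR_API_KEY is a placeholder (get one in the Google APIs console).
import requests  # third-party: pip install requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(url, api_key, strategy="desktop"):
    """Return the performance score on the familiar 0-100 scale."""
    resp = requests.get(API, params={"url": url, "key": api_key,
                                     "strategy": strategy}, timeout=60)
    data = resp.json()
    # Lighthouse reports the category score as 0-1; scale it up.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

print(pagespeed_score("http://example.com/", "YOUR_API_KEY"))
```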

So, how can we speed up site performance? Well, it’s easier said than done, but where there’s a will, there’s a way. First off, evaluate what Google recommends in the PageSpeed Insights tool mentioned above. Start with high priority fixes and work with developers if the changes are beyond your technical scope. There’s also a super thorough white paper from the folks at iCrossing about site performance that’s well worth a read or two.
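To make one of the most common high-priority fixes, “leverage browser caching,” concrete, here’s what it boils down to server-side. In production this lives in your Apache or nginx config; the stdlib Python server below is just a stand-in to show the header involved:

```python
# What "leverage browser caching" boils down to: send Cache-Control headers
# on static assets so repeat visitors read them from local cache. Python's
# stdlib server is a stand-in for your real web server (Apache/nginx) config.
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

CACHEABLE = (".css", ".js", ".png", ".jpg", ".gif", ".woff")

class CachingHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        if self.path.endswith(CACHEABLE):
            # Let browsers keep static assets for a week.
            self.send_header("Cache-Control", "public, max-age=604800")
        super().end_headers()

if __name__ == "__main__":
    ThreadingHTTPServer(("", 8000), CachingHandler).serve_forever()
```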

tl;dr: Fast-loading pages increase traffic, pageviews, and conversions, all while making users happy. Optimizing your site for speed is a low-hanging-fruit opportunity that you should seize.

3. Readable and Indexable Content

Search engines use robots to crawl your site’s pages, and since we spend so much time and money on quality content, it’s important that they can actually read and index your website. Maybe your content wasn’t readable in the first place, or maybe your site growth has led to content not being viewable by Google. Either way, something’s gotta change!

To check for readable content, first compare cached or text-only versions of pages to the live web page. Are you seeing the same content? Next, turn off JavaScript and CSS to make sure Google “sees” everything you want Google to index. There are several ways to do this, but we like using the Firefox Web Developer Extension to toggle JS and CSS on and off.
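For a quick programmatic version of the same check, fetch the raw HTML the way a crawler would. The URL and phrase below are placeholders; the point is that requests doesn’t execute JavaScript, so whatever it can’t find in the raw HTML, a non-JS-rendering bot probably can’t see either:

```python
# Crawler's-eye view: fetch the raw HTML without executing any JavaScript,
# then check whether a key phrase survives. URL and phrase are placeholders.
import requests  # third-party: pip install requests

def visible_without_js(url, phrase):
    html = requests.get(url, timeout=10).text
    return phrase.lower() in html.lower()

# If this prints False, a non-JS-rendering bot probably can't see the copy.
print(visible_without_js("http://example.com/", "Example Domain"))
```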

In terms of indexed content, a quick site:example.com search returns the estimated number of indexed pages, but if you want to dig deeper, Webmaster Tools is worth getting familiar with. “Index Status” (GWT > Health > Index Status) shows you the total number of indexed and crawled pages, which pages are blocked by robots, and even pages that have been removed for legal reasons or at webmasters’ request. If your indexed and crawled pages are steadily increasing, it’s a sign that Google can regularly access and index your content. If you see a sudden drop or a ton of duplicate pages, check to make sure your server isn’t down and figure out why search engines are having trouble accessing your content. Or you know, call us.
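One rough companion check (our own habit, not a GWT feature): count the URLs your XML sitemap advertises and compare that number to the indexed total in Index Status. A big gap in either direction is worth investigating. A minimal sketch, with a placeholder sitemap URL:

```python
# Count the URLs your sitemap advertises, then compare against the indexed
# total in Webmaster Tools. The sitemap URL is a placeholder.
import xml.etree.ElementTree as ET

import requests  # third-party: pip install requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_url_count(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    return len(root.findall("sm:url/sm:loc", NS))

print(sitemap_url_count("http://example.com/sitemap.xml"))
```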

tl;dr: Why bother having amazing content if search engines can’t understand it? Make sure your site pages are readable and indexable by the bots, and check for any warning signs in Webmaster Tools.

4. Crawl Errors

If the bots are having difficulties navigating your content (read: it’s not readable or indexable), they’ll give you details (aka HTTP status codes) about those URLs in the “Crawl Errors” section of Webmaster Tools. What’s worse, Googlebot might try to access your website via a link to a missing page (the dreaded 404 error), which says to Google, “Go away!”, and it does. Googlebot stops crawling your site and indexing your juicy content, potentially costing you rankings for the now-missing page and for other pages (including new pages) that aren’t getting a fresh crawl.
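Before (or alongside) the tools listed a bit further down, you can spot-check status codes yourself. A minimal sketch, with a placeholder list standing in for your own URLs:

```python
# DIY status-code spot check. URLS is a placeholder list of your own pages;
# allow_redirects=False surfaces 301s/302s instead of silently following them.
import requests  # third-party: pip install requests

URLS = ["http://example.com/", "http://example.com/old-page"]

for url in URLS:
    try:
        # Some servers mishandle HEAD; swap in requests.get if needed.
        code = requests.head(url, allow_redirects=False, timeout=10).status_code
    except requests.RequestException as exc:
        code = f"ERR:{exc.__class__.__name__}"
    print(f"{code}  {url}")
```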

Of course, the ultimate travesty is that a user (yep, a human being, aka “the customer”) might land on an error page and just leave (that’s NOT a score for the home team).

Google does a good job of breaking down the different types of crawl errors. Along with Webmaster Tools, here are some other tools to help you investigate:

  • SEOmoz Crawl Test helps you zero in on duplicate content, redirects, rel-canonicals and more. It will explain the HTTP Status Codes, and it breaks everything down in a CSV. Warning: This report could take a few days to populate, so plan accordingly.
  • Ayima Redirect Checker is an awesome extension that flags 301s, 302s, 404s and 500 HTTP Status Codes.
  • Screaming Frog has a bajillion uses (see this amazing guide from SEER Interactive for proof), including crawling sites and subdirectories, looking for redirects, and more.

tl;dr: Crawl errors aren’t good for users or search engines. With so many crawl tools available, you have no excuse!

5. Keyword Cannibalization

Contrary to popular belief, keyword cannibalization has nothing to do with fava beans and a nice chianti. Instead, it means that search engines think multiple pages are about a single term or phrase. This forces Google to choose between the various pages and pick one that it feels is the best fit.

To see which pages Google is confused about, head on over to Webmaster Tools (again!) and check out Traffic > Search Queries. Click the “Filters” button to add a keyword of choice. This happened with one of our real estate clients: they have a regional page for Baltimore, but four different property-specific pages were competing for the term “luxury apartments baltimore.”

The idea was to figure out which page we wanted to rank, then make the appropriate changes to the competing pages. For this site, the goal is to rank the regional pages, so we’ve started “turning the Baltimore dial down” on the property pages so that search engines see the regional page as the main page.
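If you’d rather catch cannibalization in bulk than filter one keyword at a time, here’s a sketch that works off a search-query export. The filename and the “query”/“page” column names are assumptions; rename them to match whatever your export actually contains:

```python
# Flag queries that map to more than one landing page in a search-query
# export. The filename and the "query"/"page" column names are assumptions;
# adjust them to match your actual export.
import csv
from collections import defaultdict

pages_by_query = defaultdict(set)
with open("search_queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        pages_by_query[row["query"].strip().lower()].add(row["page"])

for query, pages in sorted(pages_by_query.items()):
    if len(pages) > 1:
        print(f"{query!r} has {len(pages)} competing pages:")
        for page in sorted(pages):
            print(f"  {page}")
```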

So, what’s a good way to curb keyword cannibalization for your site? First of all, have a plan! We make a keyword map for each client that breaks down top pages and the keywords we’ll use for each page. Avoid stuffing your top keyword into every page, and focus on unique, valuable variations that link back to the main target page for your keyword.

Using 301s and rel=canonical for duplicate pages also helps, and if a blog tag page seems to be out-ranking a service page, consider deindexing the tag page.

Note: Keyword cannibalization is different from duplicate content. With keyword cannibalization, Google is showing multiple URLs in the index (and in search results, if you’re lucky), and Google “thinks” these pages are all about the same query. So, Google potentially dilutes the site’s rankings across all the competing URLs, cannibalizing the rankings any one page could have earned. Duplicate content differs in that Google has determined that more than one page has the same content. In this circumstance, Google will decide which is the “parent” page and potentially remove the other URLs from the index. This is when rel=canonical is helpful for telling Google: “Hey G! We know this page is a dupe of another, so please reference the other ‘canonical’ page and don’t penalize us.”
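If you want to verify which canonical URL a page actually declares, here’s a minimal sketch. The URL is a placeholder, and the regex assumes rel="canonical" appears before href in the link tag; a real audit should use a proper HTML parser instead:

```python
# Check which canonical URL a page declares, if any. The URL is a placeholder,
# and the regex assumes rel="canonical" comes before href; a real audit
# should use a proper HTML parser.
import re

import requests  # third-party: pip install requests

def declared_canonical(url):
    html = requests.get(url, timeout=10).text
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html, re.I)
    return match.group(1) if match else None

url = "http://example.com/some-dupe-page"
print(f"{url} -> canonical: {declared_canonical(url) or '(none declared)'}")
```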

tl;dr: Trying to rank every page on your site for your top keyword is not a great strategy. Instead, prioritize your keywords and map out which pages you will target with these terms.

Bonus: Site Audit Posts We Love!

If you’re looking to deep clean your website, check out these awesome site audit resources:

Full site audits are intimidating, but there is a lot of opportunity in small website changes that lead to maximum results. Spring is a great time to reevaluate these basic areas of website performance optimization, and look forward to more qualified traffic and happier users. What’s not to love?

Monique Pouget

Monique Pouget heads up Marketing at ThunderActive. She also likes polka dots, sandwiches, and beards.

Say hello on Twitter and Google+.

  • Adrian (http://www.adrianvender.com)

    Great post! Especially the site speed stuff.

    Also, as a ‘pesky Zonie’ I’m glad that I don’t have to deal with changing our clocks. We respect the fact that time never really ‘fell back’ in the first place. :)

    • Monique Pouget (http://www.thunderseo.com/team/monique-pouget/)

      Thanks, Adrian! I’m glad you enjoyed it. I learned a lot about site speed and performance while researching this post, and there is definitely a lot of low-hanging fruit. Small changes make a big difference, and there are so many free tools to test with!

      And yes, maybe you Zonies are on to something; I love the idea of never losing an hour of sleep!