Jazmin Hupp


Data-Driven Search Engine Optimization

I attended Data-Driven SEO, taught by Hamlet Batista at General Assembly. Hamlet's first claim to fame was gaming the search engines to rank on the first page for "Viagra," making seven figures in commissions from 2002 to 2005. More recently, he has been the technical editor for The Art of SEO.

Pretty Code Don't Matter

Code that validates perfectly sounds like a great idea, but if you run validation tests on the most popular websites, you'll see errors all over the place. Search engines don't care whether your code is perfect; they care whether an audience wants it.

The Basics

All these steps must be done in order.

  1. Crawling - Google must find the site

  2. Indexation - Google must index the site

  3. How The Results Are Presented

  4. What You Do With The Traffic

You Can't Control Rankings, Control Your Traffic

Rankings depend on many factors, many of which are outside your control. Instead:

  • Increase the number of pages that are driving search traffic to your website or

  • Keep the same number of pages but increase how much traffic each page drives

Easy & Fast: Improve How Google Shows Your Search Listing

  • Use Google AdWords to figure out what the best marketing message is for your audience

  • Change target landing pages to match those marketing messages

  • Test the updated pages against your previous click-through rate, and roll back any negative changes

  • Write descriptive titles and meta descriptions, make them read like ads, and incorporate keywords intelligently (a rough audit sketch follows this list)
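A quick way to act on that last point is to audit what your pages currently show before rewriting anything. This is a rough sketch of my own, not something from the class; it assumes the requests and beautifulsoup4 packages, placeholder URLs, and rough length targets you should adjust to taste:

```python
# audit_snippets.py - rough audit of titles and meta descriptions.
# Assumes: pip install requests beautifulsoup4; the URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products",
]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta["content"].strip() if meta and meta.get("content") else ""

    # Rough length checks so the snippet isn't truncated on the results page.
    flags = []
    if not title or len(title) > 60:
        flags.append("title missing or long")
    if not description or len(description) > 160:
        flags.append("description missing or long")

    print(f"{url}\n  title: {title!r}\n  description: {description!r}\n  flags: {flags or 'ok'}\n")
```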

Rankings

You can cheat the rankings temporarily with black-hat tactics, but they only work for so long. Instead, focus on keyword battles you can win: target keywords that will drive the right users rather than overly generic ones.

Indexation

The more pages your site has, and the more of them that get indexed, the more traffic you'll receive. You can estimate how many pages Google has indexed by searching for "site:yoursitename.com" in Google. The more reliable method is to submit an XML sitemap to Google. To improve your indexation...

  • Consolidate duplicate content: search engines will try to ignore it

  • Address canonicalization issues: permanently redirect traffic to either the www or non-www version of your domain (it doesn't matter which, just be consistent; see the sketch after this list)

  • Make sure each page has unique and useful content

  • Interlink intelligently by finding pages that don't have many links pointing to them and creating a blog entry that references them (since your blog attracts more links, it can spread the link love to less-indexed pages)
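For the canonicalization bullet, the fix is a permanent (301) redirect so every request resolves to one host. Here is a minimal sketch, using Flask purely for illustration and a placeholder hostname; any server or CDN rule that issues a 301 accomplishes the same thing:

```python
# canonical_host.py - redirect every request to a single canonical host with a 301.
# Flask is used only for illustration; the hostname below is a placeholder.
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "www.example.com"  # pick one host (www or not) and stick with it

@app.before_request
def force_canonical_host():
    if request.host != CANONICAL_HOST:
        url = request.url.replace(request.host, CANONICAL_HOST, 1)
        return redirect(url, code=301)  # permanent, so search engines consolidate

if __name__ == "__main__":
    app.run()
```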

Crawling

  • To see if a page has been crawled, type "cache:www.yourpageURL" into Google; the cached copy shows when that individual page was last crawled. If a page hasn't been crawled for a long time, it may be that Google is penalizing you.

  • Analyze your traffic logs to see how often the search bots hit each page and which errors they encounter (see the log-parsing sketch after this list)

  • Make sure to create a comprehensive XML sitemap with all your UNIQUE content (see the sitemap sketch after this list)

  • Use text-based site navigation to assist the crawlers

  • Try to avoid dynamic URLs (and avoid JavaScript-based site navigation)

  • Avoid pages with only images and Flash content; if you have a video, include a text transcript of it
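For the log-analysis bullet, counting how often Googlebot hits each URL and how many of those hits error out is usually enough to start. A rough sketch, assuming a combined-format access log at a placeholder path:

```python
# crawl_log_report.py - count Googlebot hits and errors per URL from an access log.
# Assumes the common/combined log format; the log path is a placeholder.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # placeholder path
LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

hits, errors = Counter(), Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:  # only interested in the search crawler
            continue
        match = LINE.search(line)
        if not match:
            continue
        path, status = match.group("path"), int(match.group("status"))
        hits[path] += 1
        if status >= 400:
            errors[path] += 1

print("Most-crawled pages:")
for path, count in hits.most_common(10):
    print(f"  {count:5d}  {path}  ({errors[path]} errors)")
```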
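For the sitemap bullet, the file itself is just XML listing the canonical URL of each unique page. A bare-bones sketch using only the standard library, with placeholder URLs; most publishing platforms can generate this for you:

```python
# build_sitemap.py - write a bare-bones XML sitemap for a handful of pages.
# The URLs are placeholders; a real site would pull these from its own page inventory.
import xml.etree.ElementTree as ET

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/blog/first-post",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print("Wrote sitemap.xml with", len(PAGES), "URLs")
```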

Link Acquisition

  • Build relationships with press and bloggers

  • Create and promote enticing content

  • Contact everyone on your mailing list and ask them to link to you from their sites

  • Adapt viral content ideas from sites like BuzzFeed to your niche

Photo by Jerry Paffendor