
I am at the SMX London conference today, and thought I would share the rough-and-ready notes I took during the course of the day. I haven't edited them much, but I hope you find them useful anyway!

My intention was for this to be live blogging… but the Internet ain't working so good, so it is a bit late coming…

Social Link Building

Lisa Myers: Verve Search

Top message: “Social SEO is the same old job as SEO.  We are just creating ‘votes’ to websites”

There is hard evidence that social mentions are important from an SEO perspective. The rel=author tag is a very powerful social signal: the concept of "personality rank". Use Google+ if for nothing else but building authority as an author.
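
As a rough sketch of the authorship markup involved (the profile URL and byline here are placeholders): link from your article byline to your Google+ profile, and make sure the profile's "Contributor to" section links back to the site.

    <!-- article byline linking to the author's Google+ profile -->
    <a rel="author" href="https://plus.google.com/100000000000000000000">Author Name</a>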

There is evidence of the impact of Google+ on organic rankings: see this article on Tasty Placement.

Social signals may be important, but they are ephemeral, or constrained by time: a boost of 200 social links overnight has only a temporary impact.

Click-through rate is a signal, and Search, plus Your World showing images in the search results will improve CTR.

Lisa’s slides can be found here

James Carson: Bauer Media: @MrJamesCarson

Top tip: your social media strategy should focus on pursuing influencers rather than on raw numbers like follower counts.

A process for using social media for link building:

1. Network selection: for example, Facebook for big brands, Twitter for technology and news, etc.
2. List influencers: identify who you are targeting using tools like Google+ search and Followerwonk.
3. Ripple and curate: create "ego bait" and competwitions (tweet to enter the competition).
4. Make waves: leverage celebrity culture, or other major players.
5. Practice your timing: do your own testing to find the best time to use social media.
6. Optimise for EdgeRank: consider the type of content (post, status update, photo) + engagement rate (comments, reposting) + affinity (spreading via friends) + time decay (newness); see the note below.
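
For reference, Facebook has publicly described EdgeRank as, roughly, a sum over all the edges (interactions) attached to a story:

    EdgeRank ≈ Σ (affinity × edge weight × time decay)

so newer content, in richer formats, from people who interact with you, tends to surface.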

Case study: anatomy of a successful Facebook post (585 likes and 212 shares).

This successful post has four key characteristics:

  1. Mentions something new
  2. Asks a question
  3. Has a link
  4. Includes a great image

Technical SEO Issues

Maile Ohye:  It’s Maile Time!

Maile from Google gave an exceptionally useful update on managing changes to your website, particularly when changing the structure of your site or tidying up duplicate content.

New: to encourage reindexing of existing URLs with new content, submit a sitemap with an accurate last-updated date in the <lastmod> field. When the sitemap is processed, these pages will be prioritised and recrawled more quickly.
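
A minimal sketch of such a sitemap entry (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/widgets/blue-widget</loc>
        <!-- the date the content actually changed -->
        <lastmod>2012-05-15</lastmod>
      </url>
    </urlset>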

Consider using separate sitemaps for tracking refreshes of content: for example, one sitemap for a product category, and a different sitemap for 301 redirects or other deduplication activities.
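
One way to split these out is a sitemap index pointing at separate files (the filenames here are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- regular content, tracked for freshness -->
      <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
      <!-- URLs being 301-redirected or otherwise deduplicated -->
      <sitemap><loc>http://www.example.com/sitemap-cleanup.xml</loc></sitemap>
    </sitemapindex>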

When cleaning up duplicate content, remember the objective is to reduce the number of pages in the index. It is obvious, but…

Previously, sitemaps only contained the main (canonical) pages you wanted indexed, but now you can create separate sitemaps to clean up your duplicates. These sitemaps will generate errors, but this is in the normal course of events and nothing to worry about.

And a good process for cleaning up duplicate content:

Eliminate the stuff on your site you know is of low quality. We all know we have junk, so make it noindex. Do not get rid of the content: you still want it to be crawled, just not to appear in the index.
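
A minimal sketch of the robots meta tag for this crawl-but-don't-index case:

    <!-- in the <head> of the low-quality page: stays crawlable, drops out of the index -->
    <meta name="robots" content="noindex, follow">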

Only link to quality pages on your site. Use the internal links report in Webmaster Tools to find links to old content.

Consolidate known duplicates using 301 redirects
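
As a sketch, assuming an Apache server (the paths are placeholders), a 301 can be set up in .htaccess:

    # permanently redirect a known duplicate to the canonical URL
    Redirect 301 /old-duplicate-page http://www.example.com/canonical-page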

Create a schedule of items to prioritise for deduplication. Start with Webmaster Tools: check your search queries and expand the keywords to see the full (potentially duplicate) URLs, and check which URLs might be competing for the same keyword.

Also look at your top pages for duplicate URLs – these pages are ranking well and may be competing for the same keywords.

And finally, check the recommendations under HTML improvements in Webmaster Tools – this will show duplicate title tags, which should also be a priority.

Vanessa Fox:  “Don’t Give That Man the Microphone!”

Vanessa did a great job managing the chaotic questioning from the floor, and kept the discussions focused and on track.

A few takeaways:

Once Google knows a URL it will always continue to crawl it, even if you use a 301 redirect, etc.

Google has a prioritised list of pages it crawls (home page, pages with a fresh <lastmod>, high PageRank), but lower-priority pages will still get crawled and indexed at some point.

Noindex should be your last resort; you really should use rel=canonical first.
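
A minimal sketch of the canonical link element (the URL is a placeholder):

    <!-- in the <head> of each duplicate variant, pointing at the preferred URL -->
    <link rel="canonical" href="http://www.example.com/product/blue-widget">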

Google loves keyword-based parameters and indexes them, so don't rewrite them. It is better to have item=keyword rather than item=12345: better for users, better for click-throughs.

Pagination tag: Google sees all the content in every page of the series as a single entity. Use rel="next" and rel="prev". Most commonly page 1 will rank, but it could be any page in the series.
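
A minimal sketch for page 2 of a paginated series (the URLs are placeholders):

    <!-- in the <head> of page 2 -->
    <link rel="prev" href="http://www.example.com/category?page=1">
    <link rel="next" href="http://www.example.com/category?page=3">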

Ian Galpin @g1smd

Fixing canonical issues is very likely to have an impact on Analytics data – for example, the number of page views may go down because pages have been de-duped. As a result, Analytics will be much more accurate.

Q&A

 

What should we do with Google Site Search results, in other words, search on your own site?

Google does not want them in the index. Current guidelines say to disallow them in robots.txt, but it is even better to noindex those search results pages.
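
Both options, sketched with a hypothetical /search path – note that a page blocked in robots.txt can never be crawled to see a noindex tag, so pick one approach:

    # robots.txt: stop internal search results being crawled
    User-agent: *
    Disallow: /search

or, on the search results template itself:

    <meta name="robots" content="noindex">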

What about housekeeping pages, like T&Cs, that might be ranking well? How do we deal with them?

First, figure out why that page is ranking – for example, because of inbound links. Add links from the T&Cs page to the home page. If you don't need that page indexed, then noindex it. But if the home page has a big problem, nothing might appear at all rather than having the T&Cs page pop up.

King Content, and Panda

Ken Dobell: DAC Group

Ken gave a compelling argument for not chasing the algorithm, but instead focusing on generating useful content that meets your users' needs. Consider generating different types of content for different channels: organic, local, paid, mobile.

Google's implementation of semantic search is starting to appear in the wild: for example, see Engadget's sighting of the Howard Carter search results.

Simon Penson:  Zazzle @simonpenson

Understanding Site Penalties

Top tip: these updates are not unusual big changes; these updates are going to become business as usual for Google.

Black-and-white animals like Penguin and Panda are not cute and cuddly. Together, they add up to a powerful web spam update.

Some key concepts guiding penalties:

  1. Anchor text as a strategy is dead; it is replaced by relevance to the page. Anchor text as a signal is either tuned down or turned off, and of course there is a penalty for over-optimised text.
  2. Understand your link profile: how many identical links share the same anchor text?
  3. Understand your link profile: how many PR 0 links do you have?
  4. Link profile: where are your links placed? Links in short paragraphs, or groups of links on a page, are an area of concern. Remove sitewide links; remove unrelated links.
  5. Backlink acquisition graph: aim for steady, predictable growth in links, with no peaks and troughs.

Stephen Croome @FirstConversion

Stephen gave a very useful overview of how to recover from a Panda slap, using Prezzybox as a practical Case Study.  Thanks for the excellent example!

Steps to take to recover from the Panda update:

1. Get a good monitoring system: AdvancedWebRanking, Analytics, GWMT, Twitter, email, SEOmoz, SearchMetrics.

2. Use the data to help make difficult decisions about what needs to get purged: which big chunks of pages don't generate traffic? That is low-quality content that can be removed.

3. Clean up the site index by dealing with extraneous URLs: canonical cleanup.

4. Delete or rehome orphan pages and pages with few internal links – the pages you forgot about. An XML sitemap helps to find them.

5. Clean up navigation, architecture and internal linking.

6. Throw away product feeds that don't drive traffic – they produce duplicate content pages – and rewrite the feed pages that actually get traffic and money.

7. Throw away product categories with no depth of products.

8. Add unique text (around 200 words) to the top of product pages: better content makes each page more distinct. Use semantics: use Insights for Search.

9. Add UGC to increase relevance and uniqueness, for example Facebook comments around the product, and incentivise social content.

SEO and Social Media Power Tools

Michael King (@iPullRank) gave an amazing summary of about a million tools. Fortunately, he has already given us access to his slides:  http://iacq.co/toolspullrank2.  Thanks, Michael, for such a fantastic presentation!

Dixon Jones from MajesticSEO gave an overview of the benefits of Receptional as a rank-checking tool. It is a white-label solution for reporting, it can be country-specific, the providers have to fix it as part of their business model, it is scalable, and it tracks rankings over time.

Dixon also talked about Google Analytics alternatives, primarily so a company can keep its own data:

  • Yahoo Web Analytics
  • Piwik, which is open source
  • SEM Rush

SEM Rush provides useful competitor information, primarily for larger organisations: where they rank, their keywords, paid and organic. Very useful for investigating new markets.

Hydra:

Grab other presentations here:

http://www.slideshare.net/kevgibbo/how-much-seo-juice-do-you-get-from-google

https://seogadget.co.uk/amazing-seo-tools-for-excel/

http://www.slideshare.net/aleydasolis/hardcore-local-seo-tactics-smx-london-by-aleyda-solis
