What I learned at the Web 2.0 Expo
This started out as an internal document that I was going to share with the marketing team here, but in the tradition of openness and Web 2.0, I am going to share it with the world. This has two positive effects: one, this is solid information that should be shared, and two, I write MUCH better when I know the audience is larger than the inside of our corporate firewall.
Normally I would not go to a conference like this. I love Web 2.0, but my company, for better or worse, is pretty firmly entrenched in the traditional software business model. But this conference was in nearby San Francisco, and I was handed a full conference pass courtesy of my friends at Zoho and Cloudave.com. (Thank you!) It was easy enough to hop on BART in Concord and arrive 40 minutes later, two blocks from the convention center. Besides, I already had the required goatee and iPhone, so I fit right in with this crowd.
My first session was SEO: From Soup to Nuts, presented by Stephan Spencer, founder, president, and CEO of Netconcepts. I was a little apprehensive at first because this was a three-hour session with 121 slides. This guy had better be good, because keeping me in my seat for three hours is a tall order. Stephan Spencer was not just good, he was great. The three hours seemed to pass quicker than a $10.00 lap dance.
This is what I learned:
- 86% of clicks on Google are from organic search
- Organic search delivers qualified leads; these people are actively looking for you
- There is an implied endorsement with high natural search rankings vs paid ranking
- A number one natural ranking gets a 4:1 click-through compared to number two
- If your website is not optimized for search, you are leaving money on the table. He presented this formula:
(number of people searching for your keywords) x (engine market share (Google is 70%)) x (expected click-through rate) x (average transaction amount). Just about any company can plug their numbers into this formula to see how much money they could be generating with SEO.
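To make that formula concrete, here is a quick back-of-the-envelope calculation in Python. Every number below is hypothetical, just to show how the terms multiply out:

```python
# Back-of-the-envelope SEO revenue estimate using Spencer's formula.
# All figures below are made up for illustration only.
monthly_searches = 9900        # people searching for your keywords per month
engine_market_share = 0.70     # Google's approximate share of search traffic
click_through_rate = 0.05      # expected CTR for your ranking position
average_transaction = 250.00   # average sale, in dollars

potential_revenue = (monthly_searches * engine_market_share
                     * click_through_rate * average_transaction)
print(f"Potential monthly revenue: ${potential_revenue:,.2f}")
# -> Potential monthly revenue: $86,625.00
```

Swap in your own search volume, CTR, and transaction size to get a rough sense of what ranking well could be worth.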
Keyword research is IMPORTANT. Keywords need to be relevant to your business AND reflect what actual people are searching for, not just what you think they are using. Here are some tools for checking keywords and doing research:
- Google external keyword tool
- Quintura will build a word cloud from related searches of specific keywords
- Yahoo Search Assist works with the Yahoo Search bar to give you live keyword ranking based on your keyword criteria
- Google Trends
- Google Insights for Search
- Trellian Keyword Discovery – Paid but has a free trial
- When thinking about keywords and key phrases for your business, try to think like your customer, not like an industry insider. Use the tools above to see what real people are searching for, and then use those terms as your keywords.
- The longer and more specific the search phrase a potential customer types in, the closer they are to making a buying decision.
- Consider using common misspellings as keywords
- Do your keyword research to come up with a list of phrases that people are looking for, and then create a web page, article, or post with that phrase as the title. For example, using the Google Keyword Tool I see that the phrase "data center design" was used 9,900 times in February. I could write a post about our software and how it relates to data center design and give it the title "Using D-Tools for data center design". That article would rank very high in future searches for data center design.
- Once you figure out all of the keywords that are relevant to your business, use those as tags for everything you do on the web. Apply them liberally on Facebook, LinkedIn, Twitter, YouTube…
SEO is a moving target, with lots of change all the time, but the tried and true tactics still work:
Steps to High Rankings
- The better your PageRank, the deeper and more often your site will be crawled by Google. PageRank is important; to get better PageRank, get more "quality" inbound links.
- The proper use of anchor text. In the past I would describe a link to something else by hyperlinking the "here" in "click here". That results in a solid search ranking on the word "here". In the future, if I am writing about something like the D-Tools free trial and I want to include a link to it, I will hyperlink the entire string, like this: Download the D-Tools free trial. Google and other search engines will index the link as well as the text and give it more weight because it is anchor text.
Get Your Site Fully Indexed
- Make sure to create and submit an XML site map to Google and other search engines. This will make sure all relevant pages are indexed, and it should auto-update as changes are made.
- Pages can’t rank if they aren’t indexed
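An XML site map is just a list of your URLs in a simple schema. Here is a minimal sketch of generating one with Python's standard library; the example URLs are made up, and a real generator would pull them from your CMS or crawl your own site:

```python
import xml.etree.ElementTree as ET

# Hypothetical pages on the site; a real generator would list every page
# you want indexed.
urls = [
    "http://example.com/",
    "http://example.com/free-trial",
    "http://example.com/blog/data-center-design",
]

# The standard sitemap namespace defined by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for page in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = page

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

You would save the output as sitemap.xml at your site root and submit it through the search engines' webmaster tools.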
Indexation challenges typically stem from:
- Overly-complex URLs
- Content duplication. This is a problem with WordPress because the same content can have multiple URLs: category, author, archive, and tag pages can all point to the same article. I need to do some more research on exactly how this works. I "think" an XML site map will tell Google the category path and ignore the other paths, but I am not sure if these are redirected.
- Non-canonicalization (www vs. non-www). This has something to do with 301 vs. 302 redirects. It gets kind of complex, but my understanding is that a 301 (permanent) redirect passes link value to the canonical URL while a 302 (temporary) redirect does not, so 302 = bad and 301 = good.
- Make sure your “404 File not found” page returns a hard 404 header status code
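The www vs. non-www issue boils down to mapping every variant of a URL to one canonical form, and then serving that mapping as a 301 redirect. Here is a rough sketch of the canonicalization logic in Python; the choice of the non-www form and the trailing slash are just illustrative conventions, not anything Spencer prescribed:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Map www/non-www and empty-path variants of a URL to one canonical
    form. On a real site this mapping would be served as a 301 redirect
    by the web server, so search engines consolidate PageRank on one URL."""
    scheme, host, path, query, fragment = urlsplit(url)
    if host.startswith("www."):   # pick the non-www form as canonical
        host = host[len("www."):]
    if not path:                  # http://example.com -> add trailing slash
        path = "/"
    return urlunsplit((scheme, host, path, query, fragment))

print(canonicalize("http://www.example.com"))   # -> http://example.com/
print(canonicalize("http://example.com/page"))  # -> http://example.com/page
```

The key point is that the function is many-to-one: every variant an engine might discover resolves to the same address.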
Some other tips to help indexing and crawling:
- Use hyphens "-", not underscores "_", to separate words in URLs. Underscores are still not understood as word separators by search engines.
- Never use a Flash home page; search engines do not fully understand what to do with Flash, and your site could appear to them as a blank page.
- Avoid complex URLs
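Those URL rules are easy to automate when you publish. Here is a rough slug-generating sketch; the cleanup rules are just common conventions I am assuming, not anything from the session:

```python
import re

def slugify(title):
    """Turn a page title into a simple, hyphen-separated URL slug:
    lowercase, alphanumerics only, hyphens (never underscores) between
    words."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse everything else to "-"
    return slug.strip("-")

print(slugify("Using D-Tools for Data Center Design"))
# -> using-d-tools-for-data-center-design
```

Most blog platforms do this for you, but it is worth checking that yours is not quietly using underscores.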
Get Your Pages Visible
- If you want to see what a search engine sees, go to seobrowser.com. If you want to see something funny, take a look at what "the most influential Flash site of the decade" looks like to a search engine by pasting this URL into the address form: http://v3.2a-archive.com/flashindex.htm The sound of one hand clapping.
- The title tag is the most important copy on the page. It should be meaningful and keyword rich without being spammy. For example, before this seminar the title on this blog's home page was "News and information about D-Tools". It now says "News and information about the D-Tools design, engineering, estimating and management software platform for low voltage system integration contractors".
- Use a Google site search to check for "untitled document" on your site. The correct number should be 0
- Use Google Webmaster Tools to check for duplicate titles on your site. Each page should have a unique title
- The home page is the most important page on the site
- Every page of the site should have a different “song” (keyword theme) that “sings” to the search engine
- Incorporate keywords into title tags, hyperlinks (anchor text), and headings, especially H1 and H2 tags high up on the page where they are given more "weight"
- Forget about Meta Keywords and Meta Descriptions for SEO
- Have text for site navigation. This is somewhat of a conundrum because graphic buttons convert better than text buttons, but search engines don't see graphics. As a good example of how bad this is, take a look at our D-Tools corporate site. Notice the nice graphics on the right? All that great hyperlinked info like the free trial, webinars, and design award winners is INVISIBLE to search engines. To show how bad this is, I did a Google search for d-tools free trial, and our page does not show up until page 5 of the search results
I can guarantee you that our web guy is going to lose sleep over this and will be figuring out an elegant way to incorporate text links as well as graphics into our main site very soon.
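The title checks above can be scripted too. Here is a quick sketch that extracts title tags with Python's standard-library HTML parser and flags untitled pages and duplicate titles; the page HTML is hard-coded for illustration, where a real audit would fetch each URL on the site:

```python
from html.parser import HTMLParser
from collections import Counter

class TitleExtractor(HTMLParser):
    """Collect the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def get_title(html):
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip()

# Hypothetical pages; in practice you would fetch every URL on your site.
pages = {
    "/":  "<html><head><title>Untitled Document</title></head></html>",
    "/a": "<html><head><title>Widgets</title></head></html>",
    "/b": "<html><head><title>Widgets</title></head></html>",
}
titles = {url: get_title(html) for url, html in pages.items()}
untitled = [u for u, t in titles.items() if t.lower() == "untitled document"]
duplicates = [t for t, n in Counter(titles.values()).items() if n > 1]
print("Untitled pages:", untitled)      # -> ['/']
print("Duplicate titles:", duplicates)  # -> ['Widgets']
```

Running something like this across a whole site gives you the same answers as the manual site: searches, without the clicking.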
Some other notes:
- There are a lot of metrics you can track, but doing so requires a much deeper understanding of what is going on, really more than I want to get personally involved with
SEO Best Practices
- Target relevant keywords
- Don’t stuff keywords or replicate pages
- Create useful content
- Don’t conceal, manipulate, or over-optimize content
- Links should be relevant (no scheming!)
- Observe copyright/trademark law & Google’s guidelines
- And sometimes the best practices are just avoiding the worst practices …
Some SEO Worst Practices to avoid
- Hidden or small text
- Keyword stuffing
- Targeting obviously irrelevant keywords
- Automated submitting, resubmitting, deep submitting
- Competitor names in meta tags
- Duplicate pages with minimal or no changes
- Machine generated content
Not bad practices, but not good for SEO rankings:
- Splash pages, content-less home page, Flash intros
- Title tags the same across the site
- Error pages in the search results (e.g., “Session expired”)
- “Click here” links
- Superfluous text like “Welcome to” at beginning of titles
- Spreading site across multiple domains (usually for load balancing).
- Content too many levels deep
Conduct an SEO Audit
- Is your site fully indexed?
- Are your pages fully optimized?
- Could you be acquiring more PageRank?
- Are you spending your PageRank wisely?
- Are you maximizing your clickthrough rates?
- Are you measuring the right things?
- Are you applying “best practices” in SEO & avoiding all the “worst practices?”
This article is about 3X longer than it should be, so I am going to quit now. Much thanks and great respect to Stephan Spencer and Heather Lutze from Lutze Consulting for the great presentations at the Web 2.0 Expo. I hope I did not mangle your content too much.