
109 Quick SEO Tips Even Mom Would Love


Even Mom Could Cook With These Tips!

By Richard V. Burckhardt

Everyone loves a good tip, right?

Here are 109 quick tips for search engine optimization that even your mother could use to get cooking. Well, not my mother, but you get my point. Most novices with some web design and beginner SEO knowledge should be able to take these to the bank without any problem.

(Note: This list of tips is an update to the original post 55 Quick SEO Tips Even Your Mother Would Love. I am not blogging much these days, but hope this is of use to you.)

1. If you absolutely MUST use JavaScript drop-down menus, image maps or image links, be sure to put text links somewhere on the page for the spiders to follow.

2. Content is king, so be sure to have good, well-written and unique content that will focus on your primary keyword or keyword phrase.

3. If content is king, then links are queen. Build a network of quality backlinks using your keyword phrase as the link. Remember, if there is no good, logical reason for that site to link to you, you don’t want the link.

4. Don’t be obsessed with PageRank. It is just one itsy-bitsy part of the ranking algorithm. A site with lower PR can actually outrank one with a higher PR.

5. Be sure you have a unique, keyword-focused Title tag on every page of your site. And, if you MUST have the name of your company in it, put it at the end. Unless you are a major brand that is a household name, your business name will probably get few searches.

6. Fresh content can help improve your rankings. Add new, useful content to your pages on a regular basis. Content freshness adds relevancy to your site in the eyes of the search engines.

7. Be sure links to your site and within your site use your keyword phrase. In other words, if your target is “blue widgets” then link to “blue widgets” instead of a “Click here” link.

8. Focus on search phrases, not single keywords, and put your location in your text (“our Palm Springs store” not “our store”) to help you get found in local searches.

9. Don’t design your web site without considering SEO. Make sure your web designer understands your expectations for organic SEO. Doing a retrofit on your shiny new graphics-based site after it is built won’t cut it.

10. Use keywords and keyword phrases appropriately in text links, image ALT attributes and even your domain name.


Optimization for Social Media Integration

SEO and Social Media optimization are pretty much integrated these days.

As promised, though delayed a bit, here is the follow-up to SEO & Social Media Integration with some tips for optimizing your social media and integrating it into your overall SEO mix. Naturally, the more social links you have, the more you’ll be noticed (Hey, I’m here!), linked to and crawled: natural SEO in any book.

  • Complete your profiles and put your web site or blog URL in ALL of your profiles. Including your web site link should be a no-brainer, but LOTS of folks forget this. Social content is definitely showing up in search results. For instance, a search for my name in Google comes up with several of my profiles (LinkedIn, WebPro News, Plaxo, etc.) as I write this. Include jobs, use keywords, and never turn down interview or podcast opportunities that can be included.
  • Check industry trends through services like Google Alerts. Jump into breaking subjects with blog and social media posts.
  • Promote blogs, social media and RSS feeds of those who link to you through your own RSS feeds. This has a viral effect that sends more links to you.
  • Develop a series of “How to” videos and post them on video sites like YouTube (now 25% of Google searches). People LOVE these and will embed the videos or link to them. Post them on your own blog and in the social sites, too.
  • Content is king on Twitter, just like in standard, run of the mill SEO, so use keywords and hashtags (like the #bsg tag for Battlestar Galactica fans).
  • Utilize Facebook fan pages. These can have unlimited followers and can be optimized.
  • Subscriptions are gold! Social Media Content = Subscriptions + Links!
  • Transcribe your podcasts and post them on your blog.  Include keywords and links.
  • Create widgets that will pull your RSS feeds from blogs, social media and news feeds. As a rule, links in RSS feeds are direct links without redirection or the dreaded “nofollow” attribute! (A bare-bones feed-reading sketch follows this list.)
  • Re-optimize blog posts after their “shelf” life. Rework them and monetize them for breaking subjects. Any given blog post URL or page has history and links, so update them to keep current, ranking and posted in the social media.
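
On the widget idea above, here is a bare-bones Python sketch that pulls the latest items from an RSS feed and emits plain, direct links. The feed URL is a placeholder, and a real widget would format and cache the output; treat this as an illustration of the idea, not a finished widget.

from urllib.request import urlopen
import xml.etree.ElementTree as ET

# Hypothetical feed URL -- point this at your own blog's RSS feed.
feed_url = "http://www.domain.com/feed/"

tree = ET.parse(urlopen(feed_url))
for item in tree.findall(".//item")[:5]:
    title = item.findtext("title", default="")
    link = item.findtext("link", default="")
    # Plain, direct link -- no redirect and no nofollow.
    print('<a href="%s">%s</a>' % (link, title))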

OK, as always, these just scratch the surface, but you can see that optimizing for SEO and social media are pretty much one and the same these days. What you do with one totally affects the other. Anyone who thinks they can be an SEO and ignore social media is, well, NOT an SEO.

Next up: Optimization for Social Media Integration: News


Matt Cutts at PubCon South

Live from Austin PubCon, March 12, 2009:

Google software engineer and spam cop Matt Cutts, also the search giant’s spokesperson to the SEO world, gave the keynote address for the second day of PubCon South with the announcement of the Google Friend Connect API, which will integrate accounts like OpenID, Google, Yahoo or AOL Instant Messenger to make leaving comments on blogs easier.

It’s not just for blogs, but can be used for forums and other content management systems, too. The idea is to make it easier to leave comments without having to constantly type in user names and passwords. Ultimately this will also combat spam.

Matt said that open-source plugins for WordPress, Drupal and phpBB will be released later today. Since these are open source, he says, improvements to the plugins from the SEO world are welcome.

The API will allow you to join web sites so that you don’t have to log in each time to leave comments.

This was Matt’s eighth PubCon, including the very first.

Matt’s blog for SEO and related issues is http://www.mattcutts.com/blog/. More information on the Google Friend Connect API will be available on his blog later today.


100 Quick SEO Tips Even Mom Would Love

Even Mom Could Cook With These Tips!


Everyone loves a good tip, right?

 

Here are 100 quick tips for search engine optimization that even your mother could use to get cooking. Well, not my mother, but you get my point. Most novices with some web design and beginner SEO knowledge should be able to take these to the bank without any problem.

(Note: This list of tips is an update to the original post 55 Quick SEO Tips Even Your Mother Would Love. I am republishing this expanded version for the attendees of a local Palm Springs SEO training class that I am doing. It is also available as a downloadable e-book.)

1. If you absolutely MUST use JavaScript drop-down menus, image maps or image links, be sure to put text links somewhere on the page for the spiders to follow.

2. Content is king, so be sure to have good, well-written and unique content that will focus on your primary keyword or keyword phrase.

3. If content is king, then links are queen. Build a network of quality backlinks using your keyword phrase as the link. Remember, if there is no good, logical reason for that site to link to you, you don’t want the link.

4. Don’t be obsessed with PageRank. It is just one itsy-bitsy part of the ranking algorithm. A site with lower PR can actually outrank one with a higher PR.

5. Be sure you have a unique, keyword-focused Title tag on every page of your site. And, if you MUST have the name of your company in it, put it at the end. Unless you are a major brand that is a household name, your business name will probably get few searches.
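
If you want to spot-check a handful of pages for duplicate titles, a quick Python sketch along these lines will do it. The URL list is a placeholder and the regex extraction is just for illustration, not a crawler.

import re
from urllib.request import urlopen
from collections import defaultdict

# Hypothetical URL list -- replace with pages from your own site.
urls = ["http://www.domain.com/", "http://www.domain.com/about.html"]

titles = defaultdict(list)
for url in urls:
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    titles[(match.group(1).strip() if match else "(no title)")].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print("Duplicate title:", title, "->", ", ".join(pages))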

6. Fresh content can help improve your rankings. Add new, useful content to your pages on a regular basis. Content freshness adds relevancy to your site in the eyes of the search engines.

7. Be sure links to your site and within your site use your keyword phrase. In other words, if your target is “blue widgets” then link to “blue widgets” instead of a “Click here” link.

8. Focus on search phrases, not single keywords, and put your location in your text (“our Palm Springs store” not “our store”) to help you get found in local searches.

9. Don’t design your web site without considering SEO. Make sure your web designer understands your expectations for organic SEO. Doing a retrofit on your shiny new graphics-based site after it is built won’t cut it.

10. Use keywords and keyword phrases appropriately in text links, image ALT attributes and even your domain name.

11. Check for canonicalization issues – www and non-www domains. Decide which you want to use and 301 redirect the other to it. In other words, if http://www.domain.com is your preference, then http://domain.com should redirect to it.
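
A quick way to confirm the redirect is in place is to request the non-www home page and look at the status code and Location header. Here is a minimal Python sketch; domain.com is a placeholder, as in the tip above.

import http.client

# Hypothetical domain -- substitute your own.
conn = http.client.HTTPConnection("domain.com")
conn.request("HEAD", "/")
resp = conn.getresponse()

print("Status:", resp.status)                     # expect 301 if the redirect is in place
print("Location:", resp.getheader("Location"))    # expect http://www.domain.com/
conn.close()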

12. Check the link to your home page throughout your site. Is index.html appended to your domain name? If so, you’re splitting your links. Outside links go to http://www.domain.com and internal links go to http://www.domain.com/index.html. Ditch the index.html or default.php or whatever the page is and always link back to your domain.

13. Frames, Flash and AJAX all share a common problem – you can’t link to a single page. It’s either all or nothing. Don’t use Frames at all and use Flash and AJAX sparingly for best SEO results.

14. Your URL file extension doesn’t matter. You can use .html, .htm, .asp, .php, etc. and it won’t make a difference as far as your SEO is concerned.

15. Got a new web site you want spidered? Submitting through Google’s regular submission form can take weeks. The quickest way to get your site spidered is by getting a link to it through another quality site.

16. If your site content doesn’t change often, your site needs a blog because search spiders like fresh text. Blog at least three times a week with good, fresh content to feed those little crawlers.

17. When link building, think quality, not quantity. One single, good, authoritative link can do a lot more for you than a dozen poor quality links, which can actually hurt you.

18. Search engines want natural language content. Don’t try to stuff your text with keywords. It won’t work. Search engines look at how many times a term is in your content and if it is abnormally high, will count this against you rather than for you.

19. Not only should your links use keyword anchor text, but the text around the links should also be related to your keywords. In other words, surround the link with descriptive text.

20. If you are on a shared server, do a blacklist check to be sure you’re not sharing an IP with a spammer or banned site. Their negative notoriety could affect your own rankings.
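
One common way to do a quick check is a DNS lookup against a public blacklist such as Spamhaus ZEN. The sketch below is a minimal illustration: the IP address is a placeholder, and ZEN is just one of many lists you might query.

import socket

# Hypothetical shared-server IP -- use the address your host has assigned you.
ip = "203.0.113.45"
query = ".".join(reversed(ip.split("."))) + ".zen.spamhaus.org"

try:
    socket.gethostbyname(query)                  # a listed IP resolves
    print(ip, "is listed on the Spamhaus ZEN blacklist")
except socket.gaierror:                          # an unlisted IP does not resolve
    print(ip, "is not listed")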

21. Be aware that by using services that block domain ownership information when you register a domain, Google might see you as a potential spammer.

22. When optimizing your blog posts, optimize your post title tag independently from your blog title.

23. The bottom line in SEO is Text, Links, Popularity and Reputation.

24. Make sure your site is easy to use. This can influence your link building ability and popularity and, thus, your ranking.

25. Give link love, Get link love. Don’t be stingy with linking out. That will encourage others to link to you.



SEO Tools & Tips

In my day-to-day SEO for my clients and for myself, I’ve come across a number of tools of the trade, some really good, some, well, not so good. Here are a few items in my geek toolbox that I use daily and highly recommend.

1. Keyword Tool – I have tried them all and this is the one I always go back to for my keyword research. Not only does it give you variations on the keyword phrase you are searching for, but also provides the WordTracker count and daily estimated searches on Google, Yahoo and MSN along with shortcuts to various tools like Google Trends, Keyword Discovery and several other online tools. And, you can export the thing as a CSV file. Way to go Aaron!

2. Check Server Headers Tool – Quick and easy way to check on whether your URL is being seen and followed properly by the spiders. For instance, I recently installed a WordPress plugin which appeared to work fine in a browser, but when I checked the page URLs that it produced here, I found that those pages were producing 404 errors, meaning the web surfer could see the pages, but the spiders couldn’t. Naturally, I ditched the plugin. The site also includes a batch URL processing capability (up to 25 URLs at once).
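
If you ever want to script a similar check yourself, here is a rough Python sketch. The URLs are placeholders, and it only reports the status code, nothing like the full report the tool gives you.

from urllib.request import Request, urlopen
from urllib.error import HTTPError

# Hypothetical URLs -- swap in the pages you want to verify.
urls = ["http://www.domain.com/", "http://www.domain.com/some-page/"]

for url in urls:
    try:
        resp = urlopen(Request(url, method="HEAD"))
        print(url, resp.status)                  # 200 is what you want to see
    except HTTPError as err:
        print(url, err.code)                     # e.g. 404 -- fine to a surfer's eyes, dead to spiders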

3. Web Page Analyzer – This online tool checks the speed of your site and lets you know what the download time would be at various connection speeds. Granted, most folks have broadband these days but you still don’t want a page to take several minutes to load on a 56k dial up connection. The test gives you suggestions on ways to speed up your site for visitors and spiders. Both will go away if your site is too slow.
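
For a crude do-it-yourself version of the same idea, this sketch times one download and estimates the transfer time at a few nominal line speeds. The URL and the speed table are just my examples, not part of the tool above.

import time
from urllib.request import urlopen

url = "http://www.domain.com/"                   # hypothetical page to test

start = time.time()
size = len(urlopen(url).read())
elapsed = time.time() - start

print("Downloaded %d bytes in %.2f seconds" % (size, elapsed))

# Very rough estimates at nominal line speeds (bits per second).
for label, bps in [("56k dial-up", 56000), ("1 Mbps DSL", 1000000), ("10 Mbps cable", 10000000)]:
    print("%s: about %.1f seconds" % (label, size * 8 / bps))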

4. Yahoo Site Explorer – Yes, Google gives you some information on sites that link to you, but not like Yahoo’s Site Explorer, which is easy to use and just requires a Yahoo login. You can filter inbound links to see internal or external linking, number of pages Yahoo sees and more.

5. Spider Simulator – Just one of many free online tools offered by this site, I jump here when I need a quick look at what the spiders are seeing. A more comprehensive spider simulator report is available in the iBusinessPromoter client software on my PC, but this online utility serves my purpose most of the time.

6. Tweetscan – These days keeping up with what is said about you and your clients is a must. I use Tweetscan to search for references to me or my clients in Twitter for reputation management, goodwill and networking opportunities.

7. SearchStatus – This is a Firefox plugin that, among other things, allows you to highlight and see nofollow links. This comes in real handy when checking backlinks or sculpting the links on your own site. The plugin includes utilities to check backlinks, Alexa rankings and so forth, but I primarily use the nofollow highlight feature.

8. MyBlogLog – Although the community aspects of the social site are free, I do use one paid service that this Yahoo-owned site offers – statistics. For about $25 per year, I can get almost real-time traffic stats coming off of web sites. I can see my site traffic nearly as it happens, where surfers are coming from and where they are going. From this, I can see if there is a trend or if something is wrong on a site now, not tomorrow when my Google Analytics stats are refreshed. I mentioned this service in my post on Web Analytics. This is the only non-free tool I mention in this list, but it’s such a bargain, I had to include it.

9. Google Chrome – Though not technically a tool, Google’s first attempt at a web browser has one feature that keeps it open on one of my monitors all day – the ability to log into different Google accounts in different tabs. I keep my domain e-mail, which is hosted through Google Apps, in one tab and Google Webmaster Tools and Google Analytics for my work accounts in other tabs. Now, if Chrome would just pick up some cool plugins!

10. Google Webmaster Tools – For something that I paid little attention to when first released, Google Webmaster Tools is now also open on one of my monitors all day. It just keeps getting better. From tracking down dead URLs on my sites to testing a robots.txt file, I can locate site issues that I wouldn’t otherwise know about. Though far from perfect, it’s just about the most valuable online tool I use these days.


Robots.txt: Powerful but Picky!

I suspect most of us set up our robots.txt file as basically a one-size-fits-all for the spiders. We’ll instruct all spiders to crawl or not to crawl the same files. For instance, a simple robots.txt file covering all spiders would look something like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /ar/
Disallow: /el/
Disallow: /ja/
SITEMAP: http://www.domain.com/sitemap.xml

This tells all bots (that’s the * after User-agent) to stay away from four directories and also provides the location for the domain sitemap.

But, what if you want to give Google special instructions? You’d think it would be a simple matter of telling Google to do something since you’ve used the * wild card to tell all spiders to avoid certain files or directories. Unfortunately, it’s not that easy. Let’s say you add these lines to your robots.txt file to keep ONLY Google out of your /info-pages/ directory:

User-agent: googlebot
Disallow: /info-pages/

User-agent: *
Disallow: /cgi-bin/
Disallow: /ar/
Disallow: /el/
Disallow: /ja/
SITEMAP: http://www.domain.com/sitemap.xml

You would think that Google would understand that it should stay out of the /info-pages/ directory and then since the * was used in the next User-agent statement, it would also avoid those designated directories just like all of the other bots.

Danger, Will Robinson!

Sorry, but it doesn’t work that way. In this case, Google will avoid the /info-pages/ directory as instructed in its specific category in the robots.txt file and ignore all other instructions found in the file. It would still crawl all of those other directories. If you want to give Google (or any other bot) special instructions, they have to be complete. In this case, you would need to add all of the other directories to the Google section to keep that bot out of the /info-pages/ directory AND the other four directories along with pointing out where the domain’s sitemap is located. This is what the complete robots.txt file would look like:

User-agent: googlebot
Disallow: /info-pages/
Disallow: /cgi-bin/
Disallow: /ar/
Disallow: /el/
Disallow: /ja/
SITEMAP: http://www.domain.com/sitemap.xml

User-agent: *
Disallow: /cgi-bin/
Disallow: /ar/
Disallow: /el/
Disallow: /ja/
SITEMAP: http://www.domain.com/sitemap.xml

Quick robots.txt lesson: The robots.txt file has to be very specific. If you set up a category for a certain bot, it ONLY pays attention to the instructions for it in THAT category. All others are ignored.
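
You can sanity-check this behavior locally with Python’s standard-library robots.txt parser, which follows the same group-selection convention. This is a shortened version of the example above; real engines may differ in details, so treat it as an illustration rather than a guarantee.

import urllib.robotparser

rules = """\
User-agent: googlebot
Disallow: /info-pages/

User-agent: *
Disallow: /cgi-bin/
Disallow: /ar/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot only obeys its own group, so /cgi-bin/ is NOT blocked for it.
print(rp.can_fetch("googlebot", "http://www.domain.com/info-pages/page.html"))   # False
print(rp.can_fetch("googlebot", "http://www.domain.com/cgi-bin/test.cgi"))       # True
print(rp.can_fetch("SomeOtherBot", "http://www.domain.com/cgi-bin/test.cgi"))    # False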

For more information, see Robots Exclusion Standard.


Content Development Basics

I was recently contacted by a graphics house about some SEO suggestions for their web site. I took a look and found a gorgeous site with lots of great images and graphics, which is to be expected. After all, that’s what they do.

Problem is, there was basically no text. I asked about that and was told “We used ALT attributes on all of the images. Shouldn’t that work?”

I tried to explain that there is only so much that the ALT attribute can do for optimizing a web site. A combination of good ALT descriptions and keyword rich image names might make for good results in Google image search, but probably won’t help as far as general search rankings.

Spiders still need text, so get content! The more relevant text, the happier they will be and the more opportunities a site has to rank for general and long-tail searches.
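
If you want a rough idea of what a spider actually has to work with on a page like that, a quick sketch such as this one (my own illustration, not something the client used) strips the markup and counts whatever words are left. A graphics-only page usually comes back with a surprisingly small number.

from html.parser import HTMLParser
from urllib.request import urlopen

class TextOnly(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip = False
        self.words = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False
    def handle_data(self, data):
        if not self.skip:
            self.words.extend(data.split())

url = "http://www.domain.com/"                   # hypothetical graphics-heavy page
parser = TextOnly()
parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
print(len(parser.words), "words of indexable text")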

So, in addition to the other obvious basic SEO suggestions (like NOT using the exact same TITLE on every page), I suggested they add some descriptive text and dropped them these content development tips.

1. Fill in the gaps – Look for keywords in tools like the AdWords Keyword Tool or the SEO Book Keyword Suggestion Tool for keyphrases and gaps you need to fill. These tools will help you find keywords and phrases relevant to consumer research cycles.

2. Blog about it – Got great images or graphics content to promote? Blog about them. Blogging is a great way to create descriptive text that can help build up rankings for sites that are restricted to a template or, like this example, want to keep their main site all graphics or Flash. A blog adds great flexibility for adding text and links.

3. Pay for quality – If you are outsourcing your writing, make sure you get good content, not just keyword rich spam. SEO copywriting is for persuasion, not just filling a page with keywords. It’s more than writing, so expect quality to cost some money.

4. Convince the boss – Let’s say your boss just doesn’t understand why content is important. He/she thinks the site is beautiful, so who needs it? Take baby steps. Try tweaking the TITLE on each of the pages to reflect the targeted keywords. Show progress and explain why it worked. Win the war with small battles.

5. Explain – Give a good, clear explanation about what can be done to increase rankings through content development, but don’t lecture. Sure, you might be the resident expert, but NO ONE likes a lecture.

6. Forget magic keyword density – There is no magic keyword density for a page. That’s so 1998! Make your text natural language with your primary keyword phrase included two or three times on a page, ideally above the fold. That’s it.
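
If you want a sanity check rather than a density target, something this simple will do. The file name and phrase below are placeholders.

import re

text = open("page-copy.txt", encoding="utf-8").read()   # hypothetical draft of your page copy
phrase = "blue widgets"                                  # your primary keyword phrase

count = len(re.findall(re.escape(phrase), text, re.IGNORECASE))
print("%r appears %d times in %d words" % (phrase, count, len(text.split())))
# Two or three natural mentions is plenty; copy that reads like a keyword list is too much.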

7. Keep an editorial calendar – As you are creating your content, keep track of what has been done and what is planned. This is a mainstay for print publications and works quite well for bloggers, too.

8. Use keywords – Some folks have dropped the keywords meta tag. Yahoo and MSN still look at that tag, though, so go ahead and use it for best practices. You’ve got to start out with keyword research anyway, so pop them into the keyword meta tag. It’s easy and can’t hurt.

9. Get help – Got writer’s block? Check out SEOCopywriting.com or Robin Nobles’ Idea Motivator for tips, hints and suggestions online.

10. Help the user – Keep in mind that above all, it’s about helping the user to find content. It’s the user, not you or Google, the user!

For related articles, see the entire SEO 101 series.


SEO 101: Web Analytics

As I mentioned in my post about off-page optimization factors, keeping track of site traffic and visitors is extremely important. You really need to understand where your traffic is coming from, what keywords are driving the traffic and why so that you can optimize your site. It can be complex and confusing, though, so what is a beginner SEO to do?

  • Check with your host. Most hosting companies offer at least some sort of bare bones log-based web analytics as part of your package. Many times this consists of something like AWStats or Webalizer, which are pretty standard and offer stats that are probably sufficient for very small sites. Study these and get familiar with some of the nooks and crannies, like where your traffic is coming from and what keywords are driving the traffic.
  • Go real time. If you haven’t heard of Yahoo’s MyBlogLog, it’s an online social site that’s especially targeted to blogs, but other sites are welcome. It’s big with SEOs. In addition to all of the social networking and community building opportunities, you can pay for their statistics service ($25 per year) and see real time traffic information for all of the pages to your blog/site. All you have to do is paste some tracking code within the BODY tags of your template or pages. The information is incredible – where your traffic is coming from today, what they are clicking on within your site and what outbound links they are clicking on. Reports can be run for various time periods. It’s a hidden feature that you need to check out.
  • Get a full-blown analytics package. If you’re looking for free and don’t mind Google having access to your data, sign up for Google Analytics. It’s a slick, feature-rich analytics program with most of the bells and whistles beginner SEOs could want. In fact, there is a learning curve in trying to find all of the features and figure out what they mean. Like with MyBlogLog, you have to insert tracking code on pages you want Google Analytics to follow. If you run an ecommerce site, it can even track conversions with some advanced set up.
  • Do it yourself. If you don’t like the idea of Google or anyone else having access to your stats, you could run log-based analytics software on your own. This is time-consuming and, as your site grows, can become impractical because log files can be huge. You might have to download your log files and run the software to analyze them or install analytics software on a dedicated web server. One free option is WebLog Expert Lite, which also offers paid versions with more features. Running log-based web analytics software used to be the norm. I’m only offering this as an option to those who are really paranoid about their data. (A bare-bones example of this approach follows this list.) By the way, Google also offers a log-based solution called Urchin, but it’s definitely not free.
  • Go commercial. There are zillions of commercial web analytics packages available with all sorts of wiz bang features. The problem with wiz bang is that many of us wind up banging our heads against the wall trying to figure out the wiz. From experience, I highly recommend spending time trying out trial versions of any analytics product you are considering. See if you understand how they work. Find out how available support will be for you. Some of these companies charge you a ton for the product, give you a few months of support and then want a contract for continued support and updates. Be absolutely sure about what you are buying into. One company I know of spent thousands on one of the top log-based analytics packages, couldn’t get it running properly for months, then couldn’t understand the interface once they got it running, had numerous tech and support issues and finally abandoned it altogether, losing several thousand dollars in the process. Don’t let yourself fall into that trap. Understand what you are getting.
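
On the do-it-yourself option above, here is a bare-bones sketch of the idea, assuming a standard combined-format access log; it simply tallies referrers so you can see where visitors (and, if you dig into the query strings, which search terms) are coming from. The log file name is a placeholder.

import re
from collections import Counter

# Combined log format: ... "request" status bytes "referer" "user-agent"
line_re = re.compile(r'"[^"]*" \d{3} \S+ "(?P<referer>[^"]*)" "[^"]*"')

referrers = Counter()
with open("access.log", encoding="utf-8", errors="ignore") as log:   # hypothetical log file
    for line in log:
        m = line_re.search(line)
        if m and m.group("referer") not in ("-", ""):
            referrers[m.group("referer")] += 1

for ref, hits in referrers.most_common(10):
    print(hits, ref)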

What do most SEOs favor? An informal, very unscientific poll of my LinkedIn contacts came back with Google Analytics as the definite top choice. Again, this was a very small sample and by no means authoritative, but it does seem that Google’s freebie has its fans in the search marketing community. On the commercial side, Clicktracks and Mint were also mentioned. (Note: you’ll find people who both love and hate all of these, so test, test, test before making a final decision).

By the way, it’s worth mentioning that a log-based tracking system will track every action on your site – clicks, server calls, spidering, whatever. If you want to use analytics that depend on tracking code on your pages, be sure you have the code on ALL pages. Anything without the tracking code will be invisible to your analysis software or service.

Keep in mind that these suggestions are for newbie SEOs and not for you advanced folks out there. Some of these will seem simple to power users, but someone who has never studied web analytics in the past should find these recommendations easier options for starting out.


Reciprocal Linking for Ranking is Anything But Dead

Over the past couple of years I’ve heard the mantra that the value of reciprocal linking is diminishing daily, to the point where it’s no longer worth the time and effort.

Even Google’s Matt Cutts has said, “As Google changes algorithms over time, excessive reciprocal links will probably carry less weight.”

In fact, one of my own quick search engine optimization tips is: The acid test for a potential link is if there is a natural, logical reason for that site to link to you. If not, then you don’t want the link.

If Google’s recent rankings are any evidence, then that mantra is dead wrong and Matt, it ain’t working!

Over the past few months I have noticed that fairly new sites with thousands of reciprocal links, frequently using keyword phrases for anchor text, have come out of nowhere to rank extremely well, sometimes dominating their space. Some are just using power reciprocal linking. Others are combining thousands of reciprocal links with another supposedly dead black hat technique, triangular linking, sometimes called a mini-net.

For this article, I’ll use an example of a site using purely reciprocal links to power it.

Here’s one site that didn’t show up in Google Trends until about March and is now ranking #2 for “sunglasses” in the Google SERPs.

Google Trends for reciprocal link driven site

The site itself is pleasant enough, but until recently, the only way to contact whoever is running it was using an e-mail form. No address or location information is given, nor is any information about who owns it, just that it is incorporated in Toronto. All I can tell from a domain check is that it was registered with GoDaddy.com and the I.P. is in Albany, New York. They don’t appear to want you to have much information about them. Only recently have they added a telephone number so that orders can be placed by phone.

Not what I would call a trusted, authority site.

What appears to be driving the rankings for this site is the sheer volume of backlinks to it, mostly from reciprocal linking. The site includes a link page that lists hundreds of their link buddies, almost none of them theme-related. The links are from every variety, size and flavor of web site, blog and directory out there.

So much for the value of link theme.

Here’s what Yahoo! Site Explorer sees:

Backlinks for this reciprocal link driven site

See that correctly? This site has 184,079 links to it! By comparison, I did the same backlink check for the Coca-Cola web site, a trusted site with a long history and authority. It only has 87,971 backlinks.

Clearly, reciprocal links are still working and well for many sites that otherwise would be left in the dust by longer established sites with more history and backlinks with theme focus.

I still don’t recommend this magnitude of reciprocal linking, though. Google is supposed to consider massive link trading to be spam, even though it currently appears to be ignoring its own statement:

“A spike may indicate either a topical phenomenon (e.g., a hot topic) or an attempt to spam search engine 125 by, for example, trading or purchasing links.”

Apparently, if you can get enough links of any kind, you can still power your way to the top in Google.

At least for the moment.


Google “Do you mean…?” Results Baffling

Lately I’ve noticed that the “Do you mean keyword?” results in Google vary quite a bit from a toolbar or search box search. For instance, let’s take a search for “rayban sunglasses” as an example. Most folks search for “ray ban sunglasses” (with the brand name as two words), but a large number of searchers still look for the brand name as a single word.

Here’s what I get as I write this:

Image of search for rayban sunglasses

You’ll see that the Ray Ban sunglasses catalog page for FramesDirect.com comes up #3. Nice, but notice the “Do you mean ray ban sunglasses?” link at the top.

The Google Do You Mean link

Click on that and you’ll get this:

Google ray ban sunglasses result

Still a nice #3 ranking, right? Well, maybe not. Do the exact same search for “ray ban sunglasses” in the Google search bar:

Google search bar

This is what happens. A totally different result:

Google results from search bar

The FramesDirect.com page drops to position 6.

At first, I thought this might simply be a case of different data servers serving up different results, in much the same way two search bar queries run back to back can differ. But it appears to be consistent. Each and every time I click on a “Do you mean…?” link I get one result and then a totally different one from a search bar query.

Is Google favoring pages in the “Do you mean…” links for some reason? I tried this in several different browsers (Firefox, IE7, Opera, Safari), not signed into Google and with cleared caches, and I get the same results every time.

Curious…
