Harnessing internet search - Part 2. By Stewart Twynham

3rd Jan 2007
In this second part of three articles, Stewart Twynham offers three important rules of good web design, and reveals the fact that Google is actually blind.

Size Matters
One caveat to the article that follows is that size matters. There are plenty of examples of hugely successful sites which break or bend the rules below - often because it’s too damned expensive for them to change anything. They win hands down by their sheer size - the size of their site or the millions of pages that link back to them. Smaller businesses with smaller websites cannot afford to break these rules in the same way.
Rule 1: Comply with standards
Standards compliance is poorly understood by many web developers - especially those involved in the software side of websites, and almost always in “home grown” sites. Standards compliance, put simply, means designing your website’s underlying code (for example, HTML or XHTML) to meet certain agreed international standards.
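To illustrate what "compliant" actually means, here is a minimal sketch of a standards-compliant XHTML 1.0 Strict page - the document type declaration and namespace at the top are what the validator checks against (the page title is invented for the example):

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
  <head>
    <title>Example Company - Accountancy Services</title>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
  </head>
  <body>
    <p>Page content goes here - every tag properly closed and nested.</p>
  </body>
</html>
```

Pages without that declaration, or with unclosed or wrongly-nested tags, will fail validation.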

Compliant websites perform better across multiple browsers and platforms, and there is plenty of anecdotal evidence suggesting that this affects how search engines deal with websites. Even if it didn’t have an impact, standards compliance is a precursor to the next two rules, which are themselves essential if you want great results.

You can test your site’s standards compliance right now by visiting the W3C website validation service.

Rule 2: Structured, quality content is King
Traditionally, content has always been King: a site packed full of information will always do better than a handful of largely empty pages. Today, the quality of that content, along with the semantic structure of the page, is also important.

We all know that Google can spell. There is now growing evidence that Google is also measuring the quality of web pages - including spelling, grammar, punctuation, sentence length, and even the distribution of keywords - making blatant “spam” attempts futile.

Semantic structure, on the other hand, is how the various headings, paragraphs, meta tags, anchors, and other components of your page are all arranged.

Search engines use semantic structure to determine the relative importance of the words on the page, and hence how they will appear in searches. Without it, you’re already losing valuable points.

Many websites feature no semantic structure at all! Typical problems include a complete lack of headings (for example, "h1" and "h2"), poor use of the title element and of anchors, and a lack of supporting information on images, tables, forms, etc.
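As a simple sketch of the difference (the filenames and headings here are invented), compare the same content marked up without and with semantic structure:

```html
<!-- Without structure: a search engine sees only undifferentiated text -->
<font size="5"><b>Our Services</b></font>
<img src="audit.gif" />

<!-- With structure: headings and alt text tell the search engine
     which words matter and what the image represents -->
<h1>Our Services</h1>
<h2>Audit and Assurance</h2>
<img src="audit.gif" alt="Audit team reviewing a client's accounts" />
```

Both versions may look identical in a browser - only the second tells a search engine anything useful.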

If your web designers cannot develop a site which follows a sound semantic structure, then find someone who can. If your web designers can, then take heed of any advice they give you regarding the structure and layout of your site’s pages and templates - it’s that important.

Avoid Duplication!
Many websites contain content which looks very similar to search engines - and Google will largely ignore and/or penalise content that looks almost the same. You can test this by typing site:www.yourdomain.com into Google - duplication has been detected if only a handful of pages are displayed, followed by:

In order to show you the most relevant results, we have omitted some entries very similar to the X already displayed. If you like, you can repeat the search with the omitted results included.

Rule 3: Accessibility
Often misunderstood, misquoted, and seriously abused, accessibility is probably the single most important aspect of good website design for the future.

In principle, accessibility is primarily targeted at disabled users - blind people, those who cannot use a mouse, or even those who simply cannot read your website’s tiny bright yellow text on a flashing pink background.

Meeting the needs of disabled users is clearly essential if you want to include the whole of your target market. For those who still doubt its relevance, I would also like to point out that one of the most disabled users on the Internet is actually Google, after all:

  • Google cannot read Flash animation.
  • Google cannot see what images mean (including images used to make up headings on web pages).
  • Google doesn’t know how to use a Java or JavaScript menu.
  • Google cannot make sense of tables used to lay out pages, as this generally puts everything in the wrong order.

Google, in fact, sees most web pages in exactly the same way that a blind person would using a screen or Braille reader. Makes you think, doesn’t it?
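The point about reading order is easiest to see with a sketch (the cell contents are invented): a screen reader - or Google - reads table cells left to right, top to bottom, so a table-based layout serves up the sidebar before the content your visitor actually came for.

```html
<!-- Table-based layout: the sidebar cell is read out in full
     before the main article is ever reached -->
<table>
  <tr>
    <td>Special offers, adverts, navigation links...</td>
    <td>The article your visitor actually wants to read.</td>
  </tr>
</table>
```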

When building your website, there are clearly things you should avoid:

  • Flash animated menus
  • Content embedded within Flash
  • JavaScript menus
  • Content embedded in graphics
  • Using frames.
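As one example of an accessible alternative, a JavaScript or Flash menu can usually be replaced with a plain HTML list (the page names below are invented), which Google can follow and which CSS can style to look every bit as polished:

```html
<ul id="navigation">
  <li><a href="index.html">Home</a></li>
  <li><a href="services.html">Services</a></li>
  <li><a href="contact.html">Contact Us</a></li>
</ul>
```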

Don’t worry - websites can still look great and be accessible - but you should expect to pay a little more if you want the very best results.

The subject of Accessibility is too vast even for several books, but helpfully, you can test your own website’s accessibility at WebXACT.

Next Week
I will look at the last three important rules of good web design, including a small change which recently increased one client’s “hits” by over 400% in just two months.

Stewart Twynham
Bawden Quinn Associates Ltd

Related articles

  • Harnessing internet search - Part 1
  • Web security Part 1: How safe is your site?
  • Part 2: Anatomy of a hack: It only takes a few minutes
  • Web security Part 3 - How to secure your site
  • Information security Expert Guides

    Replies (3)

    By dahowlett
    03rd Jan 2007 14:06

    Do what you can do
    Many of the important pre-built components for modern web sites are NOT XHTML compliant. It will take years to get to that point. Most of the errors I find are not important to search but relate to the way strict 1.0 compliance is enforced.

    I have no problems hitting Google's front page on current professional issues. Much more to do with tags, keywords etc. And I know I have a bunch of XHTML errors. Where I have concentrated effort is in ridding the site of Javascript errors and CSS compliance.

    Javascript menus? Who cares? If you're aggregating content and using JS to provide navigation among that content (as I do) then it doesn't matter that Google doesn't read JS.

    Once you've driven traffic to your site, then individuals will find ways of consuming that information which are not dependent upon Google. Provided the layout makes sense and the content is useful. Which must always be the caveat.

    I'd argue the reason most professional sites fail is because they don't pay attention to delivering differentiated content. Nearly all I've seen contain little more than 'cookie cutter' information you can get anywhere. Neither are many of the UK sites I've visited client focused. They're all about 'me.' Who cares?

    I think your argument about size is incorrect. We're now seeing examples where relatively small, tightly focused sites attract quality audiences. Size may matter to Google but it sure as heck doesn't to someone practicing in Galashiels. What matters is focus and quality. Google search understands that.

    By Stewart Twynham
    05th Jan 2007 21:55

    Re: Do what you can do
    Dennis, one thing worth adding is the common American phrase "your mileage may vary". Clearly, someone offering professional services in Galashiels will have an easier job than someone offering the same services in London!

    On the subject of Javascript menus - it does very much depend on the site, but I've had to sort *several* sites out where Google only saw a single (front) page and *nothing else* - all because the navigation was written entirely in Javascript!

    I agree with the "differentiated content" comment - and would go further to argue that this impacts Google as much as the person browsing. One particular example is that many businesses resell products and services - the same products and services that are also resold on 10,000 other sites. It's easy to be lazy and simply copy and paste the same descriptions - and to Google this looks like 10,000 sites with very similar content. Differentiation and unique content is good.

    Finally, the point about size wasn't to imply that small sites *can't* compete - on the contrary - it's that very large sites (eBay, BBC News, etc.) may not play by the same rules because of their sheer size. Small sites have a different advantage - something called keyword density - i.e. the small amount of content they have is more focussed.

    Stewart

    By Stewart Twynham
    06th Jan 2007 08:12

    XHTML
    One further thing to add - there is NO lack of XHTML 1.0 (and 1.1) support out there - most decent Content Management Systems (CMS) have been supporting XHTML for *years* (as we have).

    Agreed, it may be beyond the capabilities of a home-grown site if you're not a full-time web designer, but the excuses we hear all the time from "professional" web design companies - that standards compliance "isn't important" or is "too difficult" - are simply the height of laziness!

    I'm sure you wouldn't send out invoices with 17.3% VAT on them, or put the wrong National Insurance number on a tax return, and say "oh well, it's near enough, if no-one notices it'll be OK!"?

    Stewart
