
Thursday, March 19, 2009

The Theory of Relative URLs

How large is Pi? E=mc2 or e=mc2?

Have you ever tried to type E=mc2 properly? Notice how difficult it is to find the funny little superscript 2; in fact, how many people even know how to go about finding that 2? Similarly, while Pi can be rounded to 3.14, the number is infinitely long, and even remembering it to two decimal places takes more effort than remembering “Pi” itself. However, this is not a science or maths lesson. The point I’m trying to make here is that “Pi” is easier to remember than 3.14…, and far easier than working out how to find that 2.

Similarly, we can compare the following URLs:

www.widgets.com/purple-widgets.htm
www.widgets.com/itempage.htm?id=123

At first glance we would assume that one is a static URL and the other a dynamic URL. Both of these URLs could point to the exact same item, but which one are you more likely to remember? Just by looking at the URLs above you can already guess which of those pages relates to purple widgets.

We know that the anchor text in a link carries a lot of weight when it comes to gaining a top rank for a specific keyword. Indeed, anchor text alone can get a site ranked for a search term that is never mentioned on that page. This has been used and abused in the past; link bombs, such as the “miserable failure” Google Bomb, prove just how valuable the anchor text in a link can be. And since many links created on other websites simply display the URL itself, such as “widgets.com”, you can already see the benefit of having keywords in your URL.

The search engines continue to preach that you should be optimising your site for real people and not the bots that visit the website. With this in mind I wouldn’t be surprised to find that www.widgets.com/purple-widgets.htm would be ranked higher simply because it is a simpler URL and surely a lot easier to remember than a messy dynamic one. This could just be wishful thinking on my part, or is it? When running a few searches I found that 7 out of the top 10 results had keywords in the URL.

Search engines prefer the use of hyphens in domain names because they can produce more accurate search results when they are able to divide and recognise the separate keywords in your URL. After all, if it’s easier for us to read purple-widgets than purplewidgets, why shouldn’t it be the same for a bot?

Many would then assume that the underscore “_” is treated the same as a hyphen. This is not true. It would appear that because the underscore character is often used in programming languages, it is treated as a character in its own right. A hyphen, on the other hand, simply joins words together; it is read as a simple join between two words, nothing more.

It is also worth mentioning that the URL is listed in the search results themselves. While it is just a small single line of text, the URL may give the searcher a little more faith that the page listed is actually what they are looking for. So with a neatly put together title, a gripping description and a URL that matches both, you might just find that the URL even aids in generating traffic.

Useful Tips:

1. When picking a domain name that people will link to, use your targeted search phrase.
2. When creating directories and files use your targeted keywords.
3. Individual words in the URL should be separated, as the search engines might not recognize them when joined (although stemming seems to have seriously improved in the major search engines - smaller engines still look for exact matches), i.e. purplewidget.htm should be purple-widget.htm
4. When separating words in a file or directory name, use a hyphen rather than an underscore (a hyphen is also easier to see, as an underscore disappears when the link is underlined).

As you can see, the search engines and visitors alike have very similar needs when it comes to making sense of your website. Google have been on a crusade for as long as I can remember, trying to get webmasters to design websites that are aimed at a human audience. Perhaps this is a prime example of good structure that works for both human and bot, or perhaps it is just a coincidence. But while we hope that the search engines return ever more accurate results, this could indeed be a step in the right direction.

Which brings me back to the original question: E=mc2 or e=mc2 ?
Remember to always pick the one that will be easier for the end user to understand, be it human or robot. It would appear that the two are a lot closer than many might think.

Tuesday, March 17, 2009

Is Your Website a Unicycle?

Is your website a unicycle, a vehicle that requires much training and skill before it can be used? While there are so many “beautiful” websites online, some simply don’t make sense. Have you ever found yourself on a website that seems quite impossible to use? Even worse, have you landed on a website after doing a search only to wonder why you are there at all?

Site usability is possibly one of the more important factors of a top performing website. While so many will argue that the site is nothing without a genuine web presence, I will argue that some websites rely purely on offline marketing. At the end of the day, if your website is impossible to use, nobody will be able to (or even want to) use it. Points to ponder when designing your website:

1. Navigation
2. Login/Signup
3. Onsite search
4. Flash and other multimedia
5. Bookmarks/Favourites
6. Contact

1. Navigation
This may seem like an obvious point, but as most visitors will land on your homepage first, are they able to navigate from there to the section of the website that best relates to their needs? Simple text navigation will also make it easier for the search engines to index the individual pages of your website (where have you heard “design the website for a human visitor” before?).
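
As a rough sketch (the page names here are invented for the example), a simple text navigation block needs nothing more than an ordinary list of links, which both a visitor and a bot can follow without any plugins or scripts:

<ul>
  <li><a href="/index.htm">Home</a></li>
  <li><a href="/purple-widgets.htm">Purple Widgets</a></li>
  <li><a href="/contact.htm">Contact Us</a></li>
</ul>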

2. Logins and Signups
Does your website require visitors to log in? Do you want new visitors to sign up for your newsletter (or other services)? If so, is it possible to do so from the homepage? While you may not want to place a login form on the homepage itself, a link to a login page will suffice. Again, the key is to keep it simple and clear as to what you expect of the visitor.

3. Onsite Search
This is crucial for any website that offers a large quantity of information or products. Can you imagine trying to find an item among 2,000 by going through a product list 10 items at a time? I didn’t think so. Offer your visitors what they are looking for by adding a simple search to your website; this should help speed things along. Many websites have a quick search option towards the top right-hand corner of the homepage (sometimes this spans the entire website in all the headers). Keep it simple, visible and obvious, and make sure that the average visitor knows that this is a search function.
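
A search box needn’t be complicated either. As a simple sketch (the /search page and the “q” parameter are placeholders for whatever your own site actually uses), something like this in the header will do:

<form action="/search" method="get">
  <input type="text" name="q" />
  <input type="submit" value="Search" />
</form>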

4. Flash and other Multimedia
Okay, so Flash is a pet hate of mine, but the same could be said of all multimedia that simply clutters a website. Remember that while multimedia and other interactive elements can at times seem really cool or even a good idea, some visitors don’t have up-to-date browsers or plugins. That said, sometimes the best way of doing something is through the use of these tools. Make sure that they are placed on well marked pages with an explanation of what they are about. This way, if the visitor is unable to view the content, they at least know what it is about and why they can’t view it. Otherwise they will simply think that the website doesn’t work and leave. After all, what use is a website that is broken?

5. Bookmarks and Favourites
If you want returning visitors (who doesn’t?) then it is usually a very good idea to offer a “bookmark this page” or “add to favourites” button. I’m pretty sure we are all in agreement that traffic is valuable, so there is no excuse for letting it get away. The “favicon” is a useful way of setting your website apart from the others: once the site has been added as a favourite, this icon appears next to your website’s name. It is an ideal spot to promote your logo and brand.
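
Adding the favicon itself is a one-liner. Assuming you have saved a small icon file as favicon.ico in the root of your site, a single tag in the page header points browsers to it:

<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />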

6. Contact
Even after making the site as foolproof as possible, there will still be occasions where, despite all that planning, something comes along that you hadn’t factored in. When this occurs, make it as easy as possible for the visitor to contact you, be it by making your contact details (phone, email and fax) available on each page, or by placing a quick contact form that is accessible from every page. Again, you’ve worked hard to drive the traffic to your website; don’t let it simply get away.

Remember, simple is best; leave no room for confusion about what each function does. Signups, logins and searches should be clearly marked so as not to confuse the visitor. Make it as easy as possible for your visitors to find what they are looking for. With a well structured website you will notice that the conversion from visitor to customer increases; at the very least, the number of questions about where to find something or how to use the website will decrease. Your website is, after all, supposed to make your life easier as well as save you time.

SEO Design

There are many aspects to consider when putting a design together, most of which are either second nature to the seasoned developer or overlooked completely by the novice. As we all know, what looks good doesn’t necessarily work well, and vice versa.

Neatness of Code:
Code should be neat. Simple. No, it doesn't have to conform to W3C standards (Google doesn't even conform, and it is estimated that only 3% of all sites actually do; as a side note, half the sites that claim to conform don't either). With so few sites conforming to these standards, how could Google (or any other engine for that matter) offer decent results if it ignored 97% of the internet? Keeping code clean also includes keeping all generic information, such as style sheets and scripts, in separate files.

Robot Tags and Text File:
Often you may not wish for the search engine bots to index certain pages. You can easily add a robots meta tag with the noindex value to such a page. For whole directories you can simply add entries to your robots.txt file, located in the root of your website (http://www.website.com/robots.txt). Why is this important for design? Crawl rate and indexing are a concern for all departments. Remember I mentioned that all generic information should be kept in separate files? This way you can simply block those directories with the robots.txt file, and the search engines will spend their time indexing your actual content pages rather than reading your styling code.
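
As a quick sketch (the directory names are only examples), the robots.txt entries for blocking your styling and script directories would look something like this:

User-agent: *
Disallow: /css/
Disallow: /scripts/

And to keep a single page out of the index, a robots meta tag goes in that page’s head section:

<meta name="robots" content="noindex" />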

Content:
Do not replicate content on multiple pages; it’s a waste of time and effort, and it dilutes keyword value. While the duplicate content “penalty” is a myth, duplication does confuse the search engine as to which page is more important or even which page came first. Imagine someone giving you the key to a Ferrari and then telling you it was the key to the red Ferrari parked outside. Now imagine there are 10 red Ferraris parked outside! Which one does the key fit? If there is only one Ferrari the choice is easy. Usually the page which is indexed first is the one that is credited with being unique; the other pages simply dilute their keywords and purpose. Personally, I’ve always tried to aim for one keyword per page, which also lends itself to long tail combinations working on a single page.

While it is commonly accepted that the major search engines ignore boilerplate content (such as standard navigation, headers and footers), it has since been suggested that you can point out which sections Google should ignore. This doesn't seem to be in mainstream use just yet and I am sure that this won't make much of a difference as it remains open to abuse - as with so many other on-page factors.

URLS:
URLs, or URIs, can make a difference when it comes to ranking. As mentioned before, people may link to a page (home or internal) using the actual URL as the link text, and since anchor text is vital for ranking a page, it makes sense to include keywords in your URL. Long gone are the days when URLs had to be dynamic, littered with strange characters and session IDs (a massive source of duplicated pages).

www.website.com/page?ID=123
www.website.com/location/
www.website.com/Town-Name/

In addition to duplicating pages, session IDs and multiple variables can also cause a search engine spider to become trapped in deep pages. Once trapped, a spider will leave the website, and this may result in your more important pages not being indexed. We can now specify the preferred URL of a page through a specific tag in the page header; in this instance the search engines (Google & Ask.com) will ignore session variables (or any others you may have generated) and only index the page under the URL you specify.
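
The tag in question is presumably the rel="canonical" link element the major engines announced in early 2009. As a sketch, a page reached through something messy like www.website.com/page?ID=123&sessionid=xyz (the session parameter here is made up) could declare its preferred address in its head section like so:

<link rel="canonical" href="http://www.website.com/location/" />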

Links:
The easiest way for human and bot to get from page to page is through links, but not all links were created equal. Links hidden in Flash, images or scripts may look good to the human but be impossible for the search engine bot to read. Content remains king, and while community (social media) has recently been crowned queen, it is the text link that remains everyone’s servant. On your own website you can use the anchor text of your choice to describe the page you are linking to.

From another website, if a link is a vote, then the anchor text tells you what the vote is for. Because so many webmasters, bloggers and publishers link to pages using the URL as the link text, it becomes quite clear just how valuable it can be to include your desired keywords in your URL. However, no matter how hard you try, you will always have broken links pointing to your site. This could be due to a typo or because you've moved the page (or restructured the website), in which case a custom 404 page is crucial. When rebuilding a website and changing your URL structure, it is advisable to 301 (permanently redirect) each old URL to the corresponding new one.
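
How the 301 is done depends on your server. On an Apache server, for example, a single line in the .htaccess file (the old and new page names below are invented) will send both visitors and bots on to the new address:

Redirect 301 /old-page.htm http://www.website.com/new-page.htm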

Forms and Restricted Pages:
Don’t hide your great content behind forms and other kinds of login. Robots can’t fill these in and won’t be able to reach those pages; simply put, they won’t know that the content exists. There are ways around this, but why make it difficult for the robots, or for humans who are becoming more and more reluctant to part with personal info on the web?

Sitemaps:
An XML sitemap is for the robots (it is often simply referred to as a Google Sitemap). If you have many pages, consider breaking it down into themes. At present I prefer to set up a static XML sitemap for the pages that won't change and a dynamic XML sitemap for listings, products and other pages that change on a regular basis.
An HTML or plain text sitemap is for humans, and it can be a perfect place to get all those keywords in, either in the link itself or next to it. It is also an easy way for a visitor to find something listed on the website. Make sure that this page is easily accessible from the homepage.
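
For reference, a minimal XML sitemap follows the standard sitemaps.org format; the URL and date below are just examples using the same fictional domain as earlier:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.website.com/location/</loc>
    <lastmod>2009-03-17</lastmod>
  </url>
</urlset>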

Summary
It is reported that Google has over 200 criteria points when it comes to ranking a website. Many of those aren’t part of the design. But a few that are include:
  • Keep code to the minimal required

  • Minimise the use of code that search engines can’t read (hide it when possible)

  • Unique content - keep navigation consistent

  • Use descriptive URLs

  • Keep URLs unique

  • Descriptive internal linking

  • Use text links to reach all of your pages

  • Custom 404 page

  • Don’t hide great content behind forms and login pages

  • Use XML Sitemaps for the search engines

  • Use a descriptive HTML sitemap

Sunday, March 15, 2009

A Design for AI

While creating a website can be either a simple or a complex procedure, it is always advisable to start simple and add on from there. Once you have a basic design it is a lot easier to add advanced functionality.

Create a standard design that runs through the website; this is usually done by using base templates or include files. The search engines will read each such file once per visit, which means that once the bot has cached the file it won’t need to reload it each time it views a page. More importantly, this also prevents those lines of code being replicated and eating into the unique content on each page, although it is now suggested that the major search engines can recognise boilerplate content and filter it out for the most part.

CSS is valuable to human visitors because it styles a page quickly and improves load speeds, and this advantage also carries over to the bots. It has long been speculated that the quicker a page loads, the more likely the bot is to continue indexing your website. It would almost seem that a time limit is imposed on each visit: the more pages the bot can index in that time frame, the better for the site.

CSS has also widely taken over from the old frameset style of design used in what seems like a totally different age. Frames are a bad way to design, as the frameset page only holds details of where and how the frames are applied – there is no useful content. CSS enables you to place what you want exactly where you want it, no frames required.

Navigation is crucial for deep indexing. The search engines love text: they can follow a hyperlink to any page, and the anchor text in these links gives a very quick title for the page being linked to. Use the anchor text wisely; if you are linking to a page about koalas, let the anchor text say “koala”. By following this format you will help identify the page as being about koalas.
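
In other words (the file path is just an example), the link itself would read:

<a href="/marsupials/koala.htm">Koala</a>

rather than something generic like “click here”.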

Many people like to use Flash for navigation. While this often looks pretty, the search engines are unable to follow objects embedded in Flash files. More often than not, it would seem you could create a very similar effect by using CSS.

Links, links, links… This seems to be one of those things that everyone has on their mind constantly. From a design point of view it is important to remember a few simple things:

  • Don’t put too many links on a single page, as this weakens each link’s strength.

  • Make sure that the links are text based and are clearly labelled; this will let the search engine know about the page it is linking to.

  • Try to link to all important pages from your homepage as this will help the bot create a hierarchy. If at all possible try to make each page accessible in just 2 clicks from the homepage.

  • Link to a static text sitemap from the homepage – this helps keep every page within 2 clicks of the homepage. Also remember that when the bot finds this page, it also finds every page on your website.

There is a lot more to linking but this is keeping it simple.

Breadcrumbs (no, I’m still not talking about the type that Hansel and Gretel used – but close enough) are useful once again for creating links to pages with related text as the anchor. In the case of our koala, the pages that most likely led the visitor to where they are would be something like: animals – marsupials – koala. While we find ourselves on the koala page there is no need to link back to that page, but you can link “animals” to the animals page and “marsupials” to the marsupials page. Again you get to make the best use of anchor text.
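
A breadcrumb trail along those lines is nothing more than a short row of text links; the paths below are made up for the example:

<a href="/animals/">Animals</a> – <a href="/animals/marsupials/">Marsupials</a> – Koala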

Keep your pages uncluttered. This refers not only to the content but also to the code of a page. By keeping the number of lines of code to a minimum you will be able to increase the download speed of the page. By doing this you will also prevent your content being diluted with what the bot may find to be little more than garble.

The content on your pages should be unique and specific. Going back to our friend the koala, many will know that koalas eat eucalyptus, but it would be more beneficial to make short reference to this and then create a separate page for each related topic. While it won’t do any harm to mention eucalyptus on the page, try to keep the info on that page predominantly about the koala. If the visitor is looking for koalas then give them koalas, but by all means link to the eucalyptus page. Again you can make use of anchor text in the link to eucalyptus.

As you can see, by keeping it simple you can present the website in a manner that the bots will really eat up while also focusing on an exceptional visitor experience. The search engine engineers have tried their best to create a bot that will spot the website or page that is most relevant for a particular search based on what would be best for a human visitor. In many cases what benefits one party benefits both.

Tuesday, March 10, 2009

A Design for Life

Creating a website is either a fairly simple or a very complex exercise, depending on your needs. The best way to start is always to go simple. Create a standard design that you will use throughout the website. The reason for creating a stock standard look and feel is so that the visitor very quickly gets used to the navigation on your website and is able to find exactly what it is they are looking for. With a standard design it is also a lot easier to help drive home the branding of your service or product.

CSS, that’s Cascading Style Sheets, is a great way to keep the look and feel of a website consistent. Rather than adding styles to each individual page, you can create separate style sheet files and simply include them in your pages. While the benefits are too many to list here, I see the greatest benefit of using CSS being that you can make a single change to a single file and the change will be global: you update one file and all your pages are updated in a single move.
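
Including a separate style sheet is a single line in the head of each page (styles.css simply being whatever you have named your file):

<link rel="stylesheet" type="text/css" href="/styles.css" />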

Include files are wonderful. Depending on the language you are using to create your website, much of the boilerplate design can be set in a single file and simply included. Much like CSS, the biggest benefit of doing this is being able to make a single change and have it update every page on your website. These files could hold the search function or a sign-up script; they needn’t be full pages of information but rather standard snippets of code. Imagine you changed the subscription email address and had to update a script on 1,000 pages, when you could simply update one include file and every page would instantly be updated.
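
As one sketch of the idea, on an Apache server with Server Side Includes enabled a shared header could be pulled into every page with a single line; languages such as PHP or ASP have their own equivalents:

<!--#include virtual="/includes/header.html" -->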

Navigation is vital. While it may seem sensible to categorise things the way that you have, remember that not everybody knows the finer details of your business. Imagine you had a website that was about mammals, separated into distinct sections (such as feline, canine, bears and so on). Not everyone would know that a koala is in fact not a bear despite often being referred to as a koala bear. For the record it’s a marsupial.

If you have many products or articles that people may need to look through, then a search function is crucial. Even if I couldn’t find the koala under “bears”, using the search would eventually bring up a link to the koala page.

Returning to the concept of keeping things simple, it is notable that all of this could be incorporated into a single page.

Links, links, links… This seems to be one of those things that everyone has on their mind constantly. From a design point of view it is important to remember a few simple things:
  • Don’t put too many links on a single page.

  • Make sure that the links are clear and indicate where they lead to.

  • Try to link to all important pages from your homepage.

  • Link to a sitemap from the homepage – one that visitors can easily navigate to where they need to be.

There is a lot more to linking but this is keeping it simple.

Breadcrumbs (I’m not talking about the type that Hansel and Gretel used – but close enough) are very useful in letting the visitor know where they are and how they got there. In the case of our koala, the pages that most likely led them to where they are would be something like: animals – marsupials – koala

It is useful to link each of those back to the page that it represents (the koala page needn’t be linked, as the visitor is already there), so link “marsupials” to the marsupials page and “animals” to the animals page.

Keep your pages uncluttered. This refers not only to the content but also to the code of a page. By keeping the number of lines of code to a minimum you will be able to increase the download speed of the page. This includes optimising the images on your web pages, not only for optimal display but also for the smallest possible file size. Again, this will help speed up download times.

The content on your pages should make sense. Going back to our friend the koala, many will know that koalas eat eucalyptus, but wouldn’t it make more sense to create a separate page for each of these topics? While it won’t do any harm to mention eucalyptus on the page, try to keep the info on that page predominantly about the koala. If the visitor is looking for koalas then give them koalas, but by all means link to the eucalyptus page.

As you can tell, by keeping it simple you can give the average visitor a pleasing browsing experience. By building a good solid page, the visitor will know where they are as well as why they are there.