Tuesday, November 3, 2009

The Curious Case of Argleton

Tin Foil Hat Time!!!

The curious case of "Argleton, Lancashire". Is it or isn't it? That seems to be the question on everyone's lips right about now. According to Google Maps Argleton does exist. Pity it looks like it's just a field in the middle of nowhere - or more accurately the middle of Aughton and Aughton Park.

Could this be Google trying to catch potential spammers by having anyone advertising at this location immediately flagged? Could this have simply been an error by those at Tele Atlas? I think there may be more to this than meets the eye.

But regardless, I tend to get a little nervous when Google starts to do strange things. What really is their motive? What really is their ultimate goal? I've got my tin foil hat, do you?

Friday, October 30, 2009

Halloween

Well no doubt the various search providers will update their search pages to reflect the Halloween theme in the near future. Yahoo! seem to be the first to have done so with a link to http://events.yahoo.com/halloween/2009/index.php already listed on the home page.

No doubt the others will follow in the coming hours. Wonder who'll be next?

Tuesday, October 27, 2009

Multiple Sites, Bad? Good?

I came across this nice write-up on multiple sites by Michael Martinez. Because I wasn't logged in, I thought I'd add my own $0.02 over here.

I agree 100%!

Okay, I'll go back a bit. Is having multiple sites for a business a bad thing or a good thing? Well, many would argue off the bat that multiple sites instantly equal black-hat tactics, that multiple domains are a spammer's delight.

I'd argue that point, and think Michael does a good job of pointing out why, once again, it's not what you do but how you do it. There are always bad ways of doing anything. Cloaking, is it good? Bad? How about IP delivery? Is that good or bad? It's the implementation and intent that's really the question.

I've often suggested several smaller sites instead of one site for new clients. The benefits of properly interlinking these sites are immediately noticeable. In addition, sometimes you can really break a service, or group of services, into separate sites, each then becoming more focused while adding to the value of the whole.

While I'd generally only suggest this for new sites/clients, avoiding it with older, more established sites, it's certainly a tactic that works.

Tuesday, September 22, 2009

Google Are Hiring

It would seem that Google are once again hiring and looking for the brightest and most talented individuals to join their company. While they claim to be the greatest company to work for, I'll reserve judgment for when I have the chance...

So according to the ads placed at M.I.T., the image posted contains a phone number to call. If you can crack it... let me know ;)

Wednesday, September 2, 2009

No, Your Website Should Not Be Number One!

How many times have you been asked why a website is not at "number one in Google", only to have a look at it and very quickly realise that the site has some very simple yet huge flaws? I guess the easiest answer to this one is, "while your product may be the best (or at least you think it is), if you don't explain this to Google (I'll include the other engines too) it'll never know."

If the site is made up entirely of Flash (and no, it's still not being indexed properly), even if it is beautiful, you'll not be found. If the search engines aren't able to find the individual pages... you'll not be found. If you think you've been clever and copied a competitor - because they're number one... you'll not be found, ever!

Ah, so this brings me to the next question: how do I get to number one in Google? My stock standard answer to this one is, "If I knew that answer Google would pay me an awful lot to say nothing!"

But the reality is the answer is much longer. We do know how to get to the top of Google. It's the sum of several factors and then some. But while some still punt the magic bullet that is search engine submissions and guaranteed number one spots, the rest of us will just have to keep on telling folk that their site simply doesn't deserve to be number one in Google.

But... I can certainly show you how you can get a lot closer.

Friday, August 28, 2009

Google and More Antitrust Issues

The Italian government is currently investigating Google for antitrust violations. Several newspapers in Italy are claiming that when they told Google not to list their content on Google News, Google also delisted their results from the rest of the Google search engine.

Google have been called a scraper, among other things. But it looks like the news industry have had a lot more to say on the matter than many other industries. Could that be because print media is dying off? It seems that they won't go without a fight.

While it hasn't been proven that Google did actually delist these sites, I don't see what the problem is really. Much like Microsoft having to remove IE from Windows. Why? Don't get me wrong, I'm really not a fan of either of these companies, but it's their index/OS, so let them do what they want. If you don't like it, don't support them. As an end user it's that simple. If they really are a monopoly, then let the government take care of it... (yeah, I know, we can't really rely on them)

While I don't really see the problem, I'm going to hope that Google do feel the hard end of that stick. If only to feel what it's like when everyone wants your blood - kinda like they did to Microsoft (payback's a bitch - ain't it?).

Tuesday, August 25, 2009

Content is Dead... Long live Content!!

Well the saying has all but been worn out now - Content is King! Okay, so there I've said it. But, it remains as true today as ever... if not more so (yes, you can have truer statements).

While previously the actual content had to be very keyword focused, this is not quite the case any more. Sure, the content needs to be good, but no longer do you have to focus on keyword density, stuffing your page until it reads like a badly translated DIY instruction document.
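
For anyone who's never actually measured it: keyword density is just occurrences of the phrase divided by the total word count. A minimal Python sketch (my own toy example, nobody's official formula):

    import re

    def keyword_density(text, phrase):
        # density = occurrences of the phrase / total word count
        words = re.findall(r"[a-z0-9']+", text.lower())
        hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
        return hits / len(words) if words else 0.0

    page = "Cheap widgets! Our cheap widgets are the cheapest widgets around."
    print(f"{keyword_density(page, 'cheap widgets'):.1%}")  # 20.0% - stuffed!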

SEO is Rocket Science hits on the fact that contextual links are the best. We already knew that (well, we did, didn't we?), proving that while links are important, it's the context or the content that surrounds them that is really important.

Some may argue that links are the most important ranking feature. Others the content of a website. I would say that they are as reliant on each other as they are important. Good quality content will generate links and links to useless content won't guarantee rankings.

Tuesday, July 7, 2009

Google OS

Well, it's hot news right now, trending at number one on Twitter: Google have announced that they are going to be introducing the Google Chrome OS. Yay... or perhaps not?

*Okay, a quick disclaimer, I'm not a Google Fanboy. I do believe them to be evil and slowly we're all letting them take over the world.*

Okay, so that said. This is fantastic news for those in the netbook industry. Not only do these products need to be cheaper, they need to run on fewer resources. In theory then, why not simply build a system that requires most of the hard processing to be done by another machine? The internet allows exactly that to happen. With the multitude of other free offerings from Google (like Maps, Docs and Gmail to name a few) you already have a good deal of what you need by simply logging into your Google account. Could this be an on-boot setting?

Okay, all that excitement. Should Microsoft be worried? I wouldn't think so right now. While they've not exactly set the world alight with their netbook offerings (I believe this to be the most competitive market at this time), I think they're more interested in what Google will offer than fearful of it.

One part that kind of makes me wonder however is, "How will Google make money out of this deal?" Well, if you're doing all of your computing online then you'll need to have a steady stream of ads, right? Google are an advertising company; ads are the bottom line and how they show a profit.

I'd turn this about and argue that they can't really create a true OS. If they did, they would once again have to offer choices to the user. Choice of programs, perhaps OpenOffice? Choice of browser, perhaps Internet Explorer? But more importantly - and even while you have a browser you'll always (we hope) have this choice - the choice of search engine, perhaps Ask.com. You see, if you automatically log into Google, all of those products offered by Google offer Google Search (that's a lot of Google).

Oh... and isn't locking your browser into your OS a bit of a problem? Okay, I know that there are many legal points there that I no doubt just don't get... and personally I think Microsoft should have won that one. I don't see Google offering a full OS for that reason.

There are tons of pros and cons at this time for both Windows (Gasp! yes... that's right, pros too) and another type of OS. But if I were Microsoft I would simply keep working on Windows 7 and IE8. Bing has already impressed me, so I'm really expecting much from them this time round - the first time in a very, very... long time!

The battle may not yet have begun, but have the Chrome Wars just started?

Monday, July 6, 2009

What's in a Name?

So much of SEO is about targeting the right keywords. This holds true for any form of marketing be it online or offline.

Russia's Gazprom and Nigeria's state-operated NNPC formed the company - pronounced "nye-gaz"

Then I came across an Epic FAIL. What happens when you mix Russian and Nigerian companies to supply Europe with gas? Nigerian Gas? Nope. Let's all welcome in Nigaz. Does nobody do their homework? Did the Nigerian folk approve of that? Doesn't Russia have a single person familiar with American slang? Perhaps they should bring back the KGB? At least they would have avoided this one.

Well... I think we've all heard the old saying, "there's no such thing as bad publicity." But I think in this case we'll find an exception. While negative publicity has built brands like Ozzy Osbourne and Marilyn Manson, it all but ended Michael Jackson (yes, all you hating hypocrites who are now his fans once again!). Sometimes you want to be seen as corporate, business-like, simple. I think Nigaz were trying for that but failed miserably.

Well, there's not that much online for Nigaz at this time, although most references are negative and on news sites. This leads to so many other questions about reputation management (but that's a whole series of posts).

The lesson in this one? Research is key, know your market. Always make sure that the keywords you are chasing are going to draw positive traffic.

Wednesday, July 1, 2009

Blog Spam

I could have gone on for ages about those SEO India posters, but why do the hard legwork when someone already has?

Check out Ninja Commenting on Hobo. I think that post pretty much says it all.

Generally the blog is a good combination of sarcasm and info.

Wednesday, June 24, 2009

Cuil, Remember Them?

Hey, finally some news on Cuil. You remember Cuil, don't you? The Google killers with the largest database of indexed pages in the world. Oh... no? Don't worry, I'd forgotten about them entirely too. Especially after the recent re-branding of MSN and Live search to Bing (which I am quite fond of).

Well it seems that Cuil have decided that it's time to innovate. As reported by Matt McGee on Search Engine Land - it would seem that Cuil have now added Maplines to their search results. This makes for a much more interesting results page as displayed in a search for George Orwell.

While Google keep the monopoly on simple search, could it be that the other providers have decided that if you can't beat them, start a new game? Ask.com brought out their 3D search (which has faded away), Microsoft eventually consolidated everything into Bing, which is a lot more interactive, even offering a blurb on sites, and now Cuil change it up a little. I can't help but feel sorry for Yahoo! as they really are lagging now (dead duck or just lame, I wonder). I don't see people changing their search habits anytime soon, but the internet changes pretty quickly, so what will Google do to counter this?

PageRank Update?

SEO Crowd: "Hey there's a PageRank update!!!"
Me: "Like I care!"

hehehehe... yeah, okay, so it's always amusing to see the freak-out and the absolute fascination with PageRank. God forbid someone's rank drops...

All for a digit between 0 and 10... but what about that grey bar?!

Anyhow... we'll see the outcries later this week no doubt.

Outlook 2010 to be Broken?

Now this is an interesting turn of events. I've never been much of a fan of Outlook (I've preferred the Express version - it's less, but less clutter as well), but it looks like they're changing it up all over again. Microsoft are due to change Outlook from rendering emails with an HTML engine to rendering them with Word. This will basically break HTML email in Outlook. I imagine this will keep the absolute end user more than pleased with the product (let's face it, they don't care what's under the hood) as they won't really notice a difference - 'cept perhaps download speeds. Word does tend to make things a little bulkier.

I heard designers and developers that have become dependent on HTML format for email groaning in the background, cursing Microsoft once again for making their lives difficult.

I'm really not a Microsoft fan, but I've come to accept that most people use their products so they <rant rel="for another day">are THE standard</rant>. That said, I feel that those developers that have been moaning for so long about standards and the fact that IE is bundled into the OS are really the ones to blame. Microsoft are slowly being forced to remove IE from their offerings, in this case Windows 7 (yeah, I know they could offer multiple browsers, but do you advertise for your competition? Didn't think so!)

Basically Microsoft have been forced to change how they make use of their own software. So if you can't use something that's built into the OS (because now there is NO browser), you have to use something that's shipped with the package. Office, I believe, ships with Outlook and Word. Word allows formatting, so use it as your base editor. Will this affect their customer loyalty? No... most end users have no idea that there are options other than MS Office or IE (believe it or not), so while the masses continue to use Microsoft products it is the rest of us that have to adapt and conform.

Anyone who ever insisted that Microsoft remove IE from their OS, take a bow.

Well done, moaning developers and designers... Congrats to all of you in the EU forcing the antitrust issue. You've finally got what you want. If anything this will simply boost sales of Word (and subsequently Office). While I hear so many moans and groans, I can't help but think you all got just what you deserve.

Monday, June 22, 2009

Online Publications

While advertising was always the bread and butter for the newspapers, it would seem that the ads themselves are now the focus of online publications. The accompanying "Online Journalism" diagram best outlines this. Perhaps newspapers aren't dying as many seem to be pointing out; perhaps they are literally selling their existence. I know that far too many pages these days are nothing more than ads. The recent upgrade of News24 has gone a long way to reinforce this thought.

Whatever happened to the "sponsored by" with a logo? Too many banners have caused banner blindness. What is an optimal click-through rate on these banner ads? You have to stop and wonder. With fewer click-throughs the advertiser is paying more and more for a lesser result. I wonder if any of these publications would ever (could ever) move over to a cost-per-click model. This would surely offer the best value, or would it go to prove that their overstock of ads simply doesn't work - in this case for the publisher?
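
Just to put some hypothetical numbers on that thought, here's a quick Python sketch comparing what an advertiser pays per impression (CPM) versus per click (CPC) as banner blindness drags the click-through rate down. Every figure below is invented:

    # All numbers invented for illustration
    impressions = 100_000
    cpm = 10.0   # advertiser pays per 1,000 impressions
    cpc = 0.50   # what they'd pay per click instead

    for ctr in (0.02, 0.005, 0.001):  # click-through rates: 2%, 0.5%, 0.1%
        cpm_revenue = impressions / 1000 * cpm
        cpc_revenue = impressions * ctr * cpc
        print(f"CTR {ctr:.1%}: CPM model R{cpm_revenue:.2f} vs CPC model R{cpc_revenue:.2f}")

On a flat CPM deal the publisher gets paid whether anyone clicks or not; on CPC, blind banners earn next to nothing - which is rather the point.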

I say put the shoe on the other foot.

Thursday, June 18, 2009

Social Media: Twitter

Tired of spammers? Tired of the get-rich schemes? Well, it seems that these tactics continue to work; after all, if they didn't, they would stop doing it, right?

I found this on twitter SEOSumo: Social Media Douchebag.

It begs the question though, just how many people are out there pushing their secret to success on the numerous social platforms? How often are we hit with the twitter follower that is just a pretty girl (thanks Zaibatsu) punting their "get 2,516 twitter followers in just 2 days" tweet and nothing else? Then there's the whole debate on whether to use the long sales letter or even video. Don't we all just hate those automated direct messages?

I guess it's nice to find something with a little bit of humor on a Friday morning. Oh well... in the meantime I think I'll just refer these folk to www.socialmediadouchebag.net

Sunday, June 14, 2009

SEO Glossary

Age - First appearance of site in Archive.org, or first appearance in search engines. Not to be confused with domain age, which is the registration date of the domain name. Older sites have more credibility, but for SEO purposes the "age" clock starts when a site is cached by a search engine.

Algorithm - A very complex series of rules used by a search engine to determine rankings. The Google Algorithm uses up to 200 different factors to determine web rankings.

Analytics - Most often, this is a reference to Google Analytics, a free way to measure your site traffic. Other analytics programs include ClickTracks, WebTrends, and Omniture.

Anchor Text - Linked text on a web page. Example: This is anchor text. Anchor text is important because search engines use it to determine what the destination page is about. Therefore, anchor text must be topical and relevant.

Backlinks - The number of links from other websites to your website. Google Webmaster Tools will give you the most accurate picture of your own links, and a search in Yahoo under link:yourcompetitorsitehere.com will tell you how many links Yahoo is listing for that site.

Ban - A severe search engine penalty that takes you completely out of the index. Normally caused by using black hat techniques.

Black Hat - In reference to search engine optimization, a technique that is unethical in the eyes of a search engine, and can get you de-listed.

Bounce Rate - The percentage of people who come to a web page from another site (or search engine) and leave without visiting any other pages. A high bounce rate is believed to negatively affect search engine rankings over time. Most often measured using Google Analytics.

Cache - The search engine's stored data about your site. This information can be weeks or months out of date, depending on your crawl rate. When you make SEO changes to your site, it won't be applied until the site gets re-cached and re-indexed. To see your cache in Google, type in cache: followed by your website.

Content - All text on your website readable to the search engine. Usually this is in reference to the body text on your pages.

Conversion - A visit to your site that results in an action being completed by the user. This can be a form fill-out, purchase, or phone call.

Conversion Rate - The number of conversions divided by the number of visitors. Higher conversion rates are always preferred. In Google Analytics, this can be considered "Goal" conversion.
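
Both this entry and the Bounce Rate one above boil down to simple division. A quick Python sketch with invented numbers:

    visits = 2_000
    single_page_visits = 1_200  # left without viewing a second page
    conversions = 50            # form fill-outs, purchases, phone calls

    print(f"Bounce rate: {single_page_visits / visits:.1%}")   # 60.0%
    print(f"Conversion rate: {conversions / visits:.1%}")      # 2.5%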

Crawl Rate (Frequency) - The interval between search engine robot visits to your site. Generally, sites with frequent changes and more interesting (to a robot) content get visited more often. Pages with higher PageRank also get visited more often.

Description - A metatag that allows for a brief description of the page's content. All description tags on a site should be unique, and less than 256 characters.
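
If you want to check your own pages, here's a rough standard-library Python sketch that pulls the description out of a snippet of HTML and flags it against that limit (real pages are messier than this, of course):

    from html.parser import HTMLParser

    class DescriptionCheck(HTMLParser):
        description = None
        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "meta" and a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    parser = DescriptionCheck()
    parser.feed('<meta name="description" content="Hand-made widgets, shipped nationwide.">')
    if parser.description is not None:
        verdict = "OK" if len(parser.description) < 256 else "too long"
        print(len(parser.description), verdict)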

Directory - A website that lists other websites in categories.

Duplicate Content - Content that is substantially similar to content on other sites or on multiple pages of your own site. Non-original content is generally ignored by search engines, and referred to as a "duplicate content penalty" when it impacts your site. Duplicate content is often cached but not presented in normal search results.

External Link - A link to another site or online resource from your site.

Google Sitemap - An XML sitemap that lists pages on your website that you want Google to find. The same protocol is used by Yahoo and MSN. Several sources online will create a sitemap for you. Not to be confused with a sitemap that lists all the pages on your website.

Filter - A reduction in search engine ranking for a number of possible reasons. Filters are different from penalties, in that when the item tripping the "filter" is removed, results should bounce back.

Indexing - When a search engine applies your site results and links to its current index. Web pages can be cached for some time before the cached results are applied to the index.

Internal Link - Links from pages on your site to other pages on your site. How pages link to each other is known as Navigation.

Keyword Blurring - Using the same keywords on multiple web pages. This keeps the search engine from picking a "best" page for the keyword, so multiple pages may have lower positions than a single page devoted to the topic.

Keyword Stuffing - Using multiple keyword repetition on a web page. Search engines prefer text and keyword use that is more readable and user-friendly.

Keyword Tool - Any tool that helps determine keyword demand. Wordtracker and the Google Keyword Tool are two popular sources.

Keyword Research - Strategic research into the demand for keywords relevant to a website's topic. Good keyword research also uncovers synonyms and search terms that may improve site traffic.

Link Juice - A way of explaining the relative power of any link to another page on the same site or an external web page, based on the power of the referring page and the number of other links on that page. For example, a powerful page with a single outbound link to your site would have more "link juice" than the same page with a link to you among 49 other links. Link Juice Illustrated.
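
The arithmetic behind that example, sketched in Python. This is the naive model only - the real calculation involves damping factors and plenty of unknowns:

    def link_juice(page_strength, outbound_links):
        # naive model: a page's strength is split evenly across its links
        return page_strength / outbound_links

    print(link_juice(100, 1))   # the sole link on a strong page: 100.0
    print(link_juice(100, 50))  # one link among 49 others: 2.0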

Links - In the world of SEO, "links" is most commonly a way of referring to inbound links to your website, given that Google bases a great deal of its rankings on other sites that link to yours. The value of links is highly variable, and links from sites trusted by search engines are more powerful than links from low quality sites.

Link Popularity - An overall measurement of a website or web page's link value, as determined by links from outside sources and links from other pages, which may themselves be getting good inbound links.

Long Tail - A keyword that contains a long search phrase. Long tail keywords usually have a lower search volume but a higher conversion rate, because the people who type them in have a very specific idea about what they want.

Metatags - Page code not normally visible to a site visitor which describes the content of the page. The Meta Title, Keywords, and Description tags are the most common, but metatags can contain many different fields of data not important to search engines.

Navigation - The way links are configured on a website to allow people to get to other pages. Search engines like to follow navigation and use it to determine the relative importance of pages on a site.

PageRank - (1) A numerical representation applied by Google showing the link value of any given page. This is completely determined by links from other websites and internal links. It is not a representation of the relevance of the site. PageRank is shown on a logarithmic scale of 0 to 10, and higher numbers may require millions of links. This can be found using the Google Toolbar. (2) The algorithm at Google, not completely known to the public, that determines part of how links impact rankings.
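
For the curious, the published PageRank idea boils down to a simple iteration. Here's a toy Python version over a four-page web (the toolbar digit is then roughly a logarithm of a score like this):

    # Toy PageRank: links[p] = pages that p links to
    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    damping = 0.85

    for _ in range(50):  # power iteration until the scores settle
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new

    print({p: round(r, 3) for p, r in sorted(rank.items())})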

Pay-Per-Click (PPC) - Paid search engine advertisements that appear next to search results. PPC can be very expensive, but can be executed within hours, while SEO can take months.

Penalty - A change in search engine rankings caused by breaking one or more "rules" of search engine ethics. A search engine "filter" is a less strict penalty, but a "penalty" can be applied for a longer time period and is generally a sign that you are believed to be deliberately violating webmaster guidelines for search engines.

Ranking - A keyword position on a search engine, anywhere from #1 to somewhere in the billions. Usually you want your site to show on the first page for your keywords.

Ranking Report - A listing that shows positions on search engines (usually Google, MSN/Bing, and Yahoo) for a list of preferred keywords. Monthly ranking reports will show you your progress over time.

Reinclusion Request - A request to a search engine that a site be reexamined for inclusion back into listings. This is most commonly done when a site has been penalized or banned.

Relevance - The key to good SEO. More relevant sites are preferred by search engines because they confirm the search engine user's trust in the ability of the engine to deliver results. SEO practices help format a site in such a way that the engine can understand its relevance.

Robot - An automated program that visits your website.

Robots.txt - A file on your website that can either allow robots or restrict them. Robots files can be useful when you want duplicate pages to be ignored, or search engines are crawling unnecessary pages.
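
Python's standard library will even read these files for you. A quick sketch (the URLs are placeholders):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()  # fetches and parses the live file
    print(rp.can_fetch("Googlebot", "http://www.example.com/private/page.html"))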

Sandbox (AKA Sandbox Penalty or Google Sandbox) - An artificially low ranking due to having a new website. The existence of the sandbox penalty is debated, but generally a new site will get lower rankings. Search engines use this to prevent junk sites from getting rankings. There are ways to get out of the "sandbox" by being relevant, but customers with new sites are still advised that search engines may take some time to show good rankings.

Search Volume - How many times (usually per month) that a keyword search is made in a given search engine, or all engines. High search volume indicates a competitive keyword which may be more profitable.

Short Tail - A one or two word search term like "auto parts" that gets a high search volume, but is not very specific. A "long tail" version of the same term would be "used auto parts free shipping."

Spider - Essentially a search engine robot that "crawls" your website for information.

SEM - Search Engine Marketing. This most often refers to Pay-Per-Click initiatives, but can also include SEO as part of an online marketing strategy.

SEO - Search Engine Optimization, or the practice of getting websites ranked on search engines through a variety of specialized methods.

SERP - Search Engine Results Page. The list of websites that you get when you make a search on a search engine.

Silo - A way of structuring categories on your website and individual web pages. Normally all the pages and navigation links in a silo are relevant to each other, and the "silo" structure helps improve rankings by structuring similar items into easily navigated categories. This benefits search engines and site users.

SPAM - In search engine parlance, Spam is not junk email but site content and linking practices that are keyword stuffed, automated, or created to get undeserved rankings for search terms.

Submission - The act of submitting a site to search engines or directories. For new sites, submission is still useful, but any site cached in a search engine would not need to be re-submitted.

Title - Also known as the meta title, the title of each web page appears at the top of the browser window. It tells search engines about the topic of each page. A well written title can have the fastest impact on search engine rankings if all other factors are good.

Webmaster Tools - Google Webmaster Tools is a free program that will help the average user understand how Google sees the website, if there are any problems, and if the site is penalized. Highly recommended to any webmaster.

White Hat - Search Engine Optimization techniques that are approved by search engines.

XML Sitemap - A "Google Sitemap" or a list of pages that you want search engines to find. This normally gets placed in your root directory in an XML format and named "sitemap.xml." The sitemap contains information about pages, their relative priority, and how often they are updated.

Borrowed from WebProNews. I really wish that it didn't take so long to load a single page.

Wednesday, June 3, 2009

Is Google Evil?

"Do no evil"

"In business, evil refers to unfair or unethical business practices. Firms that have a monopoly are often able to maintain the monopoly using tactics that are deemed unfair, and monopolies have the power to set prices at levels which are not socially efficient. Some people therefore consider monopolies to be evil. Economists do not generally consider monopolies to be 'evil' though they recognize that certain business practices by monopolies are often not in the public interest." - Wikipedia

Are Google a monopoly? I'd say yes. By definition (on Wikipedia - so take it with a pinch of salt) that would mean that Google are indeed Evil.

Google's informal motto has long been "don't be evil": promoting free access to information for all, arguing that ISPs shouldn't limit access to information. While I'm all for a free and fair internet and information at my fingertips, I am concerned at what the cost of "free" is. Power corrupts... absolute power corrupts absolutely.

"There are no free lunches!"

Or so I'm told. So if they aren't charging me money for this service, what is the cost really? I came across an interesting article on "The Plot to Kill Google" - the real Google killers (so much for Cuil - anyone heard of them since their launch?). However, many do have compelling arguments as to why Google shouldn't be trusted.

While all of that is a decent argument, I guess one should be pleased that someone is trying to monitor Google's activities.

On the other hand, it could be worse: Google Press Release
Okay, so those are just for humor's sake. But imagine if they were actually true. Which is more evil? I guess we'll just have to keep an eye on the almighty Google for now.

By the way, all information was found searching with Ask.com

Tuesday, June 2, 2009

(Google) Bombing the Presidents

Well, Google may have defused that bomb, but it looks like bing.com are more than happy to keep it locked and loaded. Danny Sullivan of Search Engine Land pointed out that for some reason the "miserable failure" bomb seems to be alive and well (or is that lit and fused?) in recent times.

Google run their algorithms from time to time to locate and defuse these bombs, but what of the other engines? Yahoo! never seemed to really get rid of it entirely and MSN was pretty much on the same path. At present a search for "miserable failure" on Ask.com reveals George Bush to be the number one contender, while surprisingly his right-hand man is now Barack Obama (does Ask know something we don't?). Okay, so there is more to this than meets the eye.

The White House implemented 301 redirects a little while ago to send most of that failure link-love back to its intended destination (that of past President GW Bush). It seems that Google have managed to once again find and block this bomb. MSN (now combined with Live as Bing.com) seem to have re-indexed the site and are now reporting the GW Bush bio as a "miserable failure", followed by the Wikipedia report on this event and surprisingly the new (and only one that I know of) bio for B Obama. Yahoo!'s results match those of Ask.com (although they have indexed the new URL). Interestingly enough Ask.com still list the old gwbbio.html file as the URL for GW Bush despite the site having implemented 301 redirects (clicking on the link takes you to /georgewbush/).
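
If you want to see for yourself what an engine is served when it requests one of these redirected URLs, here's a standard-library Python sketch that reports the first hop instead of silently following it. The URL below is a placeholder - swap in whichever page you're curious about:

    import urllib.error
    import urllib.request

    class NoFollow(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # returning None makes urllib raise instead of following

    def first_hop(url):
        opener = urllib.request.build_opener(NoFollow)
        try:
            return opener.open(url, timeout=10).getcode(), None
        except urllib.error.HTTPError as e:
            return e.code, e.headers.get("Location")

    status, target = first_hop("http://www.example.com/old-bio.html")
    print(status, target)  # a healthy move reports 301 plus the new URL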

Okay, so what does this really tell us? That GW Bush and Obama are miserable failures? Well, that will forever remain a long debate.

From an observation point I'd say that this would suggest that the miserable failures in this case are the search engines. Agreed, they may simply be returning facts based on what the public perceive; however, this simply goes to prove how easily these giants can be manipulated to this day.

Google seem to have worked around this one, and I have a sneaking suspicion that when Matt Cutts moved from the domain www.mattcutts.com to www.dullest.com he was testing just how well they handle the 301. Perhaps Obama and Bush both owe Matt and his team a thanks on that one. On the other side of this we can gather that most of the search engines still need to work on how they deal with 301s.

Could this leave a door open to spammers to "Google bomb" a page, then simply 301 the link love to another page?

Wednesday, May 27, 2009

Sub-Domains or Directories?

Many have asked the question, "Which is better for indexing or ranking, a directory tree or sub-domains?"

Well, Matt Cutts of Google put in his 2c worth. It would seem that he generally prefers the use of directories. Personally I would agree. While I believe that much of ranking does include the reading of most factors from left to right (so starting at the beginning of the domain), I think it is just simpler to work with directories. In addition, much of what we do in the development phase of a website is done using relative links, making sub-domains even more difficult to manage.

From an end-user or marketing side I might be tempted to run sub-domains, much like Google does with http://maps.google.com. By doing this you clearly state that this is a particular product offered by the company. The joy of this is that the developers may be restricted to working on only one project without having an impact on the rest of the site.

I believe that for ranking purposes sub-domains have had their fair share of success in the past, but as with all things, spammers arrived and ruined that. I imagine that these days the major search engines most likely run some kind of stemmed mash-up of the URL's first however-many characters to determine validity, looking for one or two keywords that relate to the page to form some kind of relevancy. Yes, the URL is important, but most likely no more so than many other factors.

Anyhow, I guess this will remain a debate for quite some time. Which do you go for? I guess really it's all preference or need. I still opt for the directory structure myself, mostly as I can almost form sentences as I go along, making sense to humans and bots alike.

Tuesday, May 26, 2009

New Developments

Well, some would argue that one site is better than 2 (or 3 or 4... etc). I lean the other way more often than not, arguing that each keyword should have its own website. You can't really get any more focused than that, but at what point does it become overkill?

We recently broke one of our larger clients' websites into 3 distinct divisions. This generally would have made fantastic sense, but while 2 of the new sites are working well (the previous domain and the smallest domain), the last site is being hampered. Unfortunately the old domain still ranks for those phrases that are now on another site. The redirect seems to be working at this time, so perhaps it is just a case of having Google follow and re-index the new domain. Time will tell I suppose.

I'll keep monitoring the progress and report back. I guess this is why this should have been the plan from the get-go.

Wednesday, April 29, 2009

SEO Since 1999

While I'm not usually one to scrape content from another site, simply linking to this wouldn't do. So, here's a fantastic personal take on SEO and I guess the internet in general since 1999. Enjoy...

Written by James Svoboda of Realicity Search Marketing

Monday, April 20th, 2009: Today I celebrate the completion of my first decade in search. I have been waiting for this day with some trepidation for the past 6 months or so. I am not really sure why or even what this anniversary really means. Does it mean that I am some sort of expert? Well, anything is possible. Or does it mean that I have wasted the past decade with little to show in an industry that my friends and family can hardly even understand? Hmm, I hope not. In truth, it means I not only have a job that I like, but a career that I love.

I represent one of a hundred or maybe even a thousand tenured SEO professionals in the world who have spent their time doing and not blogging about doing. The only notoriety that we typically gain is the word of mouth referrals from our happy clients. To some of us, these clients have become more like friends. These friends make what we do just that much more rewarding. I cannot even imagine doing anything else and came to the realization several years ago that I am going to spend the rest of my life with a career centered around the web. I guess only time will tell what that Tuesday morning in 1999 really started.

History & Change

I had a great history teacher in the 7th grade who regularly told us that “we need to learn from our past so we do not repeat our mistakes in the future.” I am sure he was not the first, or the last, to use a saying like this, but remembering it makes me wonder if someday public schools will teach students about the history of the internet along with the European Renaissance and Industrial Revolution. And since you can now earn a degree in internet marketing from several accredited universities and colleges like Rasmussen or Full Sail, it might only be a matter of time before someone like Rand or Danny starts teaching a course on the History of SEO at U DUB.

In the past few months I have been doing a lot of thinking about the changes that have occurred on the web since I started. I managed to get some of them down in a list and looking it over reminds me of so much, probably in the same way as cracking open an old high school yearbook or visiting with an old lifelong friend who you have not seen in years. Maybe it will invoke some old memories and you can reminisce too.

Search Engines

AltaVista had been the Google of that era. There were other somewhat popular search portals like Lycos, HotBot, DirectHit, Northern Light, Excite, and of course Yahoo, AOL and MSN had their loyal following. I guess AltaVista’s popularity is why Overture/Yahoo eventually bought it. I just don’t know why they ruined it.

I can even remember when we would resubmit each of our clients’ pages to MSN every month because their submission page fed into Inktomi (which provided the search results for MSN, Yahoo, HotBot, Lycos and a few others) and you would receive a rankings boost for “New Pages.” Inktomi was great for us for a while. They eventually came out with a Paid Inclusion program where you could purchase a yearly inclusion for each page. No more resubmitting each month, yea! You received such a ranking and traffic boost that we started purchasing inclusion for all of our clients without charging them, because it produced much better results that made us look great, and our clients loved us for it. That program was great until Yahoo bought Inktomi and changed the paid inclusion program from a yearly fee to its current form - a yearly fee plus a PPC fee - called Search Submit. We pretty much had to drop all of our clients from the program at the time, which resulted in all of the pages that paid for inclusion getting dropped from their index (and along with them traffic, rankings and sales).

In those days the search engines were spending tons on TV commercials in order to capture a larger search share. There were the Lycos Dog commercials, AltaVista had Pamela Anderson, the Excite ads were kind of edgy, and Yahoo had some good ones like the Comb Over and Raise the Dead. However, my all time favorites were the HotBot Investment Tips and Political Scandals commercials in what I call the “Old Links” series. I think I found this series particularly funny because I am an SEO and can remember when you could do a search and within the first 3 pages find a listing or two that would lead to a 404 page. Also, the distinguished gentlemen they used were great! The staff at Search Engine Land also put together a search engine commercial montage that you should check out.

Eventually, many of the original search engines failed to turn a profit and were slowly being purchased for their search market share, mostly by Yahoo, CMGI or InterActiveCorp. The Searching Graveyard also has a few interesting tidbits on some oldies like Magellan, Deja and OpenText.

Meta Search

DogPile, MetaCrawler and WebCrawler were the three that I was first introduced to, and in fact all are still online today. What I really loved about Meta Search Engines was not their ability to pull results from a dozen different engines at the same time, but just the fact that their entire premise was based on how bad each individual search engine’s results could be. Even search results for top keywords were hit and miss, and quite often we were forced to jump from one engine to another in order to find relevancy. I think this was primarily due to the fact that algorithms of the day were heavily based on on-page factors like keyword density, and links were not a major ranking factor in most of the popular engines.

Web Directories

Wow, have these changed! We used to make getting listed in the Yahoo Directory and the Open Directory - DMOZ a priority because of the traffic they generated. Now any actual visitors that originate from them are probably either a link builder, a competitor or an editor.

Disney-owned Go.com, originally called Infoseek, launched a directory called the GoGuides that was pretty popular. You could join to become an editor and then list your clients that were sorted alphabetically and ranked by 1, 2 or 3 stars. When Disney shut down this directory, a few of the editors spun off a directory of their own called, of course, GoGuides. GoGuides.org is still online today with pretty much the same format that they brought with them from the original Go.com directory.

At one point Looksmart.com was a valuable directory that powered some results for MSN, Inktomi and several others. You could initially pay a $199 review fee for a permanent lifetime listing. This was something that we only did sometimes for our clients because the traffic coming out of Looksmart was not always enough to warrant the one-time fee. They eventually changed their ad model to a Pay Per Click format and at the same time informed all of their previous customers who had purchased a lifetime listing that they were converting all of them to the new PPC program. They then funded your account with your original submission fee and you were charged a $0.15 per visitor fee until your account reached zero and you were de-listed. A class-action lawsuit was brought against them and I believe they settled out of court. MSN was their big distribution partner at the beginning of their PPC/directory end. Once they lost MSN their stock dropped, their directory was eventually taken down, and they were never really the same again.

The original Snap.com portal also built a web directory that was of brief significance and provided good traffic. This one was probably around for the shortest amount of time of any of the Dot Com directories.

Pay Per Click

The first PPC search engine that I managed a campaign for was GoTo.com. They were simple, with a “Visible” bidding process that could let you bid based on current bids without a quality score algorithm to consider. Originally you could only bid on each keyword by itself, or what is today called Exact Match. You always knew which keywords your traffic was coming from. The only downside for some is that you had to do some really deep and creative keyword research to reach your PPC potential. The GoTo search term suggestion tool (keyword research) was the best and most reliable keyword tool I have ever come across. You could punch in any term and know that you were going to get 50 results that were a derivative of your words. No synonyms or related terms to confuse you and make you wonder what keyword you actually just searched for. If you were given less than 50 results on the page, then you had reached the end of that search and it was time to try a different version. They only showed you results that had 25 or more searches from the previous month, and if you were given any results that did not end at or close to 25, you had to keep searching for additional derivatives. It was also very easy to find more versions as each keyword result was linked so you could click on it and run another search.

The GoTo tool only had 2 little drawbacks. The first was that it combined singular and plural versions of most keywords except a few like link & links, service & services, company & companies, and a few others. At one point I think I had a list of about a dozen versions that would need to be checked individually. The second was that it was not an intuitive tool to use if you were researching keywords for a campaign about a topic that you did not really know anything about, like a few software and business consulting clients I worked with early on. I owe much of my keyword research skillz to GoTo. Thanks.

At one point GoTo.com sued Go.com for infringement based on Go’s logo looking too similar to theirs. Go had a green stop light logo and GoTo had the full green, red and yellow light. GoTo won a bunch of money and made some big headlines as the then little engine that sued Disney and won. Eventually GoTo changed their name to Overture and then Yahoo bought them up and changed their name again by rebranding them as Yahoo Search Marketing. All of those name and logo changes must have really pissed off the legal staff at Disney that lost the infringement case!

There once was another little PPC engine that started out by providing ads for About.com. It was called Sprinks. That version of Sprinks did not last very long, as Google soon bought them up in order to compete with Yahoo/Overture. They were quickly absorbed into Google and either stripped of their PPC technology or rebranded as Google AdWords.

There were also several other little PPC engines that tried to make it, like FindWhat - now Miva, Bay9, Kanoodle, 7Search.com, and I even remember one called 3Apes.com. Some of these are still around, some have been rebranded or bought by a more profitable company, and some have died and gone to PPC heaven. I hear it is nice there, with absolutely no click fraud to speak of. Of all these, only 2 have so completely rubbed me the wrong way that today I have to check myself before getting into a profanity-laced tirade when I hear their names. Bay9.com was a company that shorted an affiliate payment on a site that I placed their ads on. Their rep was great and said all the nice things and then never would answer or return calls again. This might be one of the many reasons why Bay9 is no longer in business. FindWhat (now Miva) is the other one. They were the first PPC engine where I could pinpoint click fraud, call my rep, get the run-around about adding funds back into my account from the “bad” partner’s account, and have it happen again and again. That is another one that refused to return my calls after a while.

Before GoTo there was... I am not sure, as I can’t recall their name now. During my first year I spent a good amount of time doing keyword research based on GoTo.com keyword traffic stats. I remember my first introduction to GoTo when I was told their short history. It was also during this conversation that I remember hearing about a PPC predecessor to GoTo. Not a large one, but a predecessor nonetheless. I do not remember much about this site/engine/platform, so if you have any additional bytes of knowledge, feel free to share them. I only really remember that they did not last long. They were not perceived very well because they were charging for something that people thought should be free, and at that time the internet was free and nobody should cage it or have to pay for it!

Things were much different back then. The DOT COM had not yet bubbled or burst, banner ads were the main way to pay for search traffic, and I am pretty sure most people still thought the World Wide Web was flat and could only be accessed through a phone line.

Oh, and MSN eventually created their own PPC platform called adCenter.

Affiliate Programs

The first affiliate program to generate real buzz online was the Amazon.com associate program. They had a product to match just about every website out there. If you were a dog breeder, you could link to Amazon pages about books on dog training, breeding, or just about anything else you could think of. I remember seeing Amazon’s founder, Jeff Bezos, on the Tonight Show when he explained how he originally boxed books for shipment while kneeling on the floor. His story was about investing in how they did things like buying tables so they did not have to kneel any longer. Up until that point Amazon had yet to turn a profit, and he did not shy away from talking about how they were continually reinvesting profits back into the company. At the time I kept thinking that they were either going to crash so hard that many of the Amazon staff would end up jumping out of windows, or they were going to become an amazing success story. I was actually very happy for them when they reported their first quarter of profit.

After Amazon came Commission Junction. CJ was an affiliate marketplace that, like Amazon, had just about something for every website. If you were a dog breeder, you could probably find a few pet supply stores to link to and make some commissions. The great thing about them was that their affiliates paid much better than Amazon. The biggest drawback that I saw with Commission Junction was the fees that they charged for being the broker were pretty high. This eventually led to lower commission rates for the affiliates as the web stores and CJ tried to maintain their revenue. I am not sure where they stand in today’s affiliate world because I left Commission Junction as soon as Google launched AdSense and have had no reason to return.

Google AdSense has not changed much over the years. They are still easy for publishers to use, pay very well for each visitor, and they still have very few ad formats for being such an advertising powerhouse.

UGC & Social

At times I have a tendency to group these two together, mostly because in the early days websites that allowed User Generated Content or User Edited Content (UEC) also included some form of a social experience and community. Back then directories were my UGC & social experience. I was an editor at DMOZ and both versions of GoGuides. I added and edited directory content, read the newsletters and the editor forums, and made an effort to better my “community.” There were several Webmaster and SEO forums, but I did not spend too much time in them as they seemed to be filled with hacks that passed misinformation. I was never interested in chat or IM, and I don’t even know what ICQ stands for. (Well, now I do.)

I remember when About.com first became popular. They had just changed their name from The Mining Co. and there was a buzz around it in a similar fashion to Wikipedia when it first hit the scene. I once checked out the available topics section and then decided not to try and join. I just didn’t see how I could find enough time to develop that much content. I guess About didn’t see how they could either, and that is why they opened it up to Guides.

CitySearch has been around for a while and I once used them religiously, but they are now a sinking ship. Yelp discovered the true value in UGC that CitySearch had discarded in favor of advertising revenue: free business listings. It's a simple platform for reviews that is open to almost anyone, and they encourage offline community involvement and have user profiles with photos for recognition, community status, list building and bookmarking, review tracking and a way for people to connect with each other. They really have thought of just about everything that CitySearch chose not to.

Analytics & Reporting

Webtrends log file analyzer was the first program that I ever used for client reports. Its reports were similar to some of the free solutions around, like Webalizer and AWStats. It was great for identifying the search engines and keywords that we were generating traffic from, and it also provided a little bit of data like browsers and operating systems.

We eventually gave in to client requests for ranking reports and started using WebPosition Gold. We knew that traffic was what really mattered, but we found by providing these reports that our clients rarely questioned campaign progress and traffic numbers. I think it was because they had a little easier time understanding a top 10 position as a measurement as compared to a report with hundreds of different keywords and visits. WebPosition is now owned by Webtrends and has remained mostly unchanged for years.

Google Analytics became a mainstay of client reports. Its reporting features, conversion tracking and visual elements are great, especially for the price! I still run a ranking report once a month just to save time of checking top keywords myself and to help watch for movement on whole keyword segments. Like almost all of Google’s non-search technology, GA was once part of another company called Urchin and was absorbed into the Google collective.

Algorithms & Updates

AltaVista’s Black Monday happened only 6 short months after I started in SEO. I came into the office on what seemed like an ordinary Monday to what eerily felt like the first day of the stock market crash that led to the Great Depression. Most of our clients' traffic had disappeared from our biggest source at the time. We normally would check a few client log files to see how traffic went over the weekend, but that day we checked all of them. There was a tiny hint or feeling of doom in the room. We were almost panicked. What would our clients do if we could not figure this out soon? Would they leave us? What about company revenue? Had it tanked enough to affect me? Eventually it all worked out, but what an ordeal!

Google’s Florida was another tough one. It happened right before the main holidays. It really screwed most of our clients, many of whom had ecommerce sites that really needed to sell over the holiday season. Florida took longer to overcome than AltaVista’s Black Monday and not just from a ranking and traffic standpoint, but from the loss of clients as well.

It was not long after Florida that Yahoo decided to change their Paid Inclusion program from the yearly fee to the yearly fee plus cost per visit. I guess since Google’s updates were given names in alphabetical order just like hurricanes, it is only fitting that we were hit hard by those two in succession and felt like we were living through hurricane season.

See You in 10

Now that I have spent a little time looking back, pondering search, my career and how the internet has evolved, I can’t help but wonder as to the changes that are coming up and in particular what’s next for search. Maybe in another 10 years or so I will write a follow up and take an even longer stroll down search memory lane. I plan on being around then. SEOmoz will probably still be here. Hopefully you will too.

Cheers!

Monday, April 20, 2009

Jeeves Returns

Well, at least to Ask in the UK. Okay, so I was never the biggest fan of the butler, mostly because I thought it was a poor way to promote a search engine. From a marketing perspective, why "ask Jeeves" if you could simply "Ask it!"? Doesn't it make more sense to "ask it" rather than "google it"? Well, it would have a good few years back.

Well, I guess that's why Ask never got me to do their marketing. Here in South Africa half the searching public don't even know that there ever was a Jeeves; for others Google is the internet (much like that blue icon is the internet). While still no fan of the Jeeves concept, I have to admit it's a blast from the past to see an "old friend."

Will Jeeves make a real return? I don't know, but I'd doubt it right now. While he might work for the British public, I don't see him making a full-time return to the US, or even the rest of the world for that matter. Jeeves was retired for a good few reasons, and I don't know what has changed between then and now. But then again, perhaps the public really do want to put a real face to search? Would newer users prefer to have a familiar face to credit their answers?

Thursday, April 16, 2009

Google AdWords

Well, Google have done it again: upgraded another great service. The only problem I see, however, is that with each and every update the service comes to a crawl.

Google AdWords have now had yet another update. The interface, while more appealing and easier on the eye, has gone the same way as Analytics. The only problem is that it now takes about five times longer to load for me. I think Google have forgotten that we in South Africa just don't have the connection speeds that Europe or North America do. A pity, because when running multiple accounts it seems that things simply aren't working at times.

Okay the "previous interface" is available, for now. And while the "New Interface(Beta)" is still in Beta, why do they default to it. I guess with the enforced changes and updates of other popular sites such as Facebook. But possibly the most annoying thing for me at this time... the horizontal scroll! I've always hated it, always will! Okay so it overlaps just slightly, I still need to scroll to confirm the average position of the ads.

Google, please allow those of us with poor connections to view the older versions of these services. Yes, I know we may be missing out on some functionality, but at the end of the day I'd rather have limited functionality than none at all.

Monday, April 6, 2009

PageRank Update!!!

Like I care...

hehehehe...

Yeah, I'm not a fan of the PageRank craze. It looks like Google have rolled out another PageRank update. So far it would seem that many of our sites have seen a bump up in PageRank, and none have come down as yet. I guess from a cosmetic view I'm doing something right.

Thursday, April 2, 2009

Moving a Home / Website

I've been in the process of packing up to move for the last few weeks. Personally, I hate moving; I've done it so many times and it's rarely, if ever, an easy exercise. This got me thinking: moving home and moving a website are equally difficult, or cumbersome at best.

Depending on the move you plan, you may have to pack everything up and transport it to a new location. When moving home you need to make sure that all your furniture and other valuables are safely stored for easy transport. When moving a site you'd need to make sure that you've backed up your site correctly (with a second backup just in case) so that you can easily load it to a new server.

Then there's the unpacking stage. Is your new home big enough for all your furniture? Will you struggle to fit it all into the rooms? Are there extra rooms that will allow growth, if that is your plan? In much the same way, assuming you don't simply have flat static HTML files, does your new host support the language your site is written in? Do you have any server-specific scripts that will need additional server support, or will you have to rewrite a few (mail scripts tend to be the bane of my life on this one)?

Usually when moving, location-specific information about you will change: your mailing address, home phone number and actual physical address. When moving a website this also holds true. While many details will remain the same, do you plan to update the website? Possibly change the URL structure? Or even the programming language of your website? If so, this will leave the older pages lost in cyberspace. If your domain name has changed and you still own the old domain, redirect traffic to the relevant page on your new site. If you've kept the old domain but have decided to change the URL structure, redirect the old pages to the relevant new ones - in much the same way that you would for your snail-mail postage.
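
As a sketch, on an Apache server that's a one-line permanent redirect per page in the old site's .htaccess file (the paths and domain here are made up for illustration):

Redirect 301 /old-page.htm http://www.newdomain.com/new-page.htm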

Just as you would (and should) let the authorities, banks and others that need to find you know about your move by updating your personal info, let the search bots find your new pages and domain by alerting them to the move. This will speed up the re-indexing that will need to take place, as well as preserve any link strength that you may have earned so far.

April Fools

Well, the 1st of April has come and gone. Many tried to play pranks, some tried to avoid them, and no doubt many are still claiming that they weren't fooled but were simply playing along ;) right...

Sometimes, however, fact really is stranger than fiction. But not with the addition of CADIE. This panda was going to revolutionize many Google products by quietly doing all the mundane chores that you don't like, such as removing red-eye in Picasa or answering email on your behalf. The actual homepage for CADIE was amusing; it looked like something from GeoCities, or pretty much anything designed (if you can call it that?) in the late '90s or early 2000s. Bright, in your face and flashing every little available pixel.

On the flip side... Google have now announced that they are due to cut some 200 jobs. I guess it's not always fun and games - even at Google.

Barry Schwartz of Search Engine Roundtable, an avid Mac fan, even found time to set up the BSOD for Windows visitors. Ha... Though I can say I've been working on this machine for almost two years and have never had a BSOD - I can't believe it myself ;)

Wednesday, April 1, 2009

Loved this SEO Quiz

I just loved this SEO quiz. You can't help but pick the funny answers... which as it would seem tend to be the correct ones ;)

What Stage of SEO Career Are You at?

My Result: Expert
You can't afford reading each message in your email box. You can't remember who 25% of your Facebook friends are. You don't comment at SEO blogs as often as you used to - you can hardly manage to read them now.

You have a strange feeling you have no time for anything. No, you still don't understand how the heck Tamar is doing that and you start secretly suspect there are several Tamars out there.

You read SEJ daily :)


Yeah, I'm sending the link love back too :P

Monday, March 30, 2009

Social Media

Okay, so I was eventually going to get round to this one at some time. Here's the real twist... I know nothing about social media. Okay, that's a bit of a lie/exaggeration; I know enough to know that I don't really have a clue. So I did what everyone else would do in my shoes... called a consultant.

I guess this post really starts some two years ago when I first joined Prop Data. Online marketing has come a long way from simply calling yourself an SEO by stuffing keywords; these days it's a full-time commitment to the betterment of the internet (yeah, I'm fighting the crusade for the good guys). Okay, so I admit to spamming on occasion, but who hasn't? When I joined the company I knew that many things were changing online; web 2.0 wasn't just making websites easier to maintain and update but also making them a lot more interactive. But while visitors could interact with the site, it was still just code; visitors needed the interaction to work back to them. Enter the age of blogs and social media in general.

This brings us back to last Friday. Having joined Twitter a good long while ago, I've been closely following other folk in the SEO, SEM and social media circles. While there are precious few in South Africa who claim to follow these trends, one chap, Mike Stopforth, has put himself out there. I replied to a tweet he sent out a few months back offering a free consultation, and he agreed to join us and speak to us. This being Prop Data, the team were all overworked and understaffed - the usual. We changed the format up a little and it became an open discussion between Mike, the sales guys and myself.

The discussion was great. Mike broke things down into some very simple points and did a great job of highlighting what to consider and the questions to ask before venturing forward on any social project. I guess many of these points we already knew; it was just a case of putting them into perspective. After all, there rarely is a point in doing something simply for the sake of doing it. Focus and results are the main point, as they always should be. Sometimes having a blog isn't a good idea when a fan page would make a lot more sense. Not everybody likes a particular product, but it may have many fans. Simple point, but I'd never thought of it that way. What can I say, I don't know social media.

I guess like so many other things IRL (that's "In Real Life" ;) it's not what you do, but how you do it. Social Media is an animal. You have to feed it and nurture it, if you don't it will turn and bite you. Those wishing to engage in Social Media, "Are you ready for that kind of commitment?"

Friday, March 27, 2009

Arrrgh... It's a Treasure, uh, Sitemap

There are two types of sitemaps that you might employ on a website. First, the HTML version, which is intended to offer the average visitor an overview of your website. The second is a machine-readable XML sitemap (commonly referred to as a Google Sitemap) intended to inform a search bot of the pages found on your website. I'd always recommend the use of both - in various forms.

HTML sitemaps are a wonderful resource. If a visitor to the website can't find what they are looking for, this is the easiest way to point it out. Many also fail to realise that this is a prime spot to put text links to pages with your keywords as the anchor text. Why not add a little additional information next to each link too? Suddenly the whole page becomes a fantastic resource for both human and bot.
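
A rough sketch of what I mean (the pages and wording here are hypothetical):

<ul>
<li><a href="purple-widgets.htm">Purple Widgets</a> - our full range of purple widgets</li>
<li><a href="widget-repairs.htm">Widget Repairs</a> - repairs and servicing for all makes of widget</li>
</ul>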

XML sitemaps are great for dynamic websites. Many of the websites we have worked on list a great number of products or other kinds of items. Scripts can be run to dynamically regenerate this sitemap as listings are changed or updated. These sitemaps can be submitted directly to the search engines, informing them that this is a true reflection of your site.
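
A minimal XML sitemap, following the sitemaps.org protocol, looks something like this (the URL and values are made up):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.widgets.com/purple-widgets.htm</loc>
<lastmod>2009-03-27</lastmod>
<changefreq>weekly</changefreq>
</url>
</urlset>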

I've tended to break my XML sitemaps down into smaller maps, often separating the static pages from the dynamic ones. On our really large sites I have even broken the dynamic sitemaps into categories. Mostly, you don't want to offer the search engines a sitemap of thousands of pages at a time; a thousand pages or so per map is my rough limit.
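
The smaller maps can then be tied together with a sitemap index file, also part of the sitemaps.org protocol (the file names here are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>http://www.widgets.com/sitemap-static.xml</loc>
</sitemap>
<sitemap>
<loc>http://www.widgets.com/sitemap-products.xml</loc>
</sitemap>
</sitemapindex>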

Submit your XML sitemaps to the search engines and be sure to link to your HTML sitemap from your homepage. This will ensure that the sitemap is easily crawled by the search engines and will subsequently lead to the other pages being easily indexed.

Tuesday, March 24, 2009

Google Code

Telling Google what to index might not be a figment of webmasters' imaginations for that much longer. I recently came across a few lines of code explaining that you can tell Google not to index parts of your page. This could prove to be quite useful.

Don't index word:
fish <!--googleoff: index-->shark
<!--googleon: index-->mackerel

Don't use link text to describe target page:
<!--googleoff: anchor--><a href="sharks_rugby.html">
shark</a> <!--googleon: anchor-->

Don't use for snippet:
<!--googleoff: snippet-->Come to the fair!
<!--googleon: snippet-->

Don't index any:
<!--googleoff: all-->Come to the fair!
<!--googleon: all-->

Now, I'm not sure if any of this works just yet. I'm still testing, but I imagine that Google will for the most part ignore these comments. We know that the Google bot pretty much tries to read all the code on a page, including scripts. But if the snippet comment works, at least we might be able to control which description is shown.

Watch this space...

Too Cuil for You

Well, it's been some time since the Google killer Cuil was launched, and I've not heard much since. Launched to much fanfare and expectation, I think this has to be the largest flop seen in years. I wonder just how much time and money was put into this development? I wonder if any of the investors will get anything back?

As I've not used this search engine since launch (mostly because I found it to be useless then), I can't comment too much on the accuracy of the search results, but I do know that the images displayed still don't quite match up. Nice try though. I think Google, Ask and even Live have better image results blended into their universal search.

Perhaps in time they will be able to make sense of "the largest directory of indexed pages". But for now the results seem to be outdated, irrelevant and at times just wrong. Was this what we expected of the ex-Googlers? Perhaps this is a prime example of why they are ex-Google folk?

Sunday, March 22, 2009

What's in a Name

The title tag has to be, in my humble opinion, the most important on-page factor when it comes to high rankings in the search engine results pages. Found in the head tag of a standard HTML page, the title is the first place that you can start placing your keywords. Surprisingly, some pages don't define this tag. Worse yet, some overlook it and omit it altogether. Here is a basic example of where the title tag fits into an average HTML page.

<html>
<head>
<title>The title goes here</title>
</head>
<body>
Web page content goes here.
</body>
</html>

Here are five points I always consider when constructing a title.

Limit the length of the title.
Google currently displays approximately 63 characters of a title; the total number of characters displayed varies from engine to engine. While it is not the end of the world to exceed this by a slight margin (I don't believe there are any penalties for having a long title), remember that the search engines will cut off anything beyond what they display. This would leave you with a "…" instead of a complete title.

The title tag can be useful for branding your traffic.
By adding your website or company name to the title tag you can build brand awareness and increase returning direct traffic. While many suggest doing this, I would only recommend adding your company or website name to the end of your title tag. I don't think the order in which your keywords are placed in the title tag makes much difference, but I suggest you keep your keywords towards the beginning of the tag, as it reads more easily. Once again, don't forget that the title tag is the first thing the search engines display for each result.

Divide your title tag.
When branding your site, break the title tag so that it becomes obvious which part is the page title and which is the site name/title. I find that by using the pipe break "|" (that's the funny symbol above the "\" key) I am able to do this quite neatly. This is also a great way to keep your titles consistent. For example:
<title>Keyword Phrase Goes Here | Some Company</title>
Instead of:
<title>Keyword Phrase Goes Here and Blends into the Name of the Company</title>
As you can see it makes it a lot clearer when considering which part of the title labels the page and which part labels the website.

NEVER, I repeat never, repeat your title.
Each page should have a unique title. By giving each page a unique title you are telling the search engines that each page is indeed unique. It's for exactly the same reason that you don't give every file the same name (well, apart from the most obvious reason, which is that you just can't!): it is easy to distinguish the contents of a file by simply scanning its name. The same principle applies to web pages. This also goes a long way towards establishing the priority of each page. If every page had the same title, which page would be ranked as more relevant than the next?

Keywords in the title.
I have spent a lot of time optimising websites for real estate agents. While their stock standard pages have targeted keywords in the title, headings and content, it becomes a little more challenging to do the same for each listing. This is usually where the developers come into play. With a little effort the Title can be dynamically created. In my case, it drastically changed the titles I could offer from something such as:
<title>Property Listed for sale or to let by Estate Agent</title>
To a far more specific title that really does describe the listing perfectly:
<title>House for Sale in Suburb, Area | Estate Agent</title>
Okay, so I usually go a little further than that, but as you can see the title not only makes perfect sense and describes the page but also is keyword rich for the search phrase “house for sale in suburb” or even area in this case. While this works well for this kind of website, the principles can be applied to any other dynamically created web page.

I have noticed that time and time again the search engines return results with the search phrase in the title. I think we can all agree that if a web page has been titled correctly then the page will be accurately described. However search engines will discount a title that is no more than a list of spammed keywords. I think we’ve all heard the mantra, create pages for real people not robots, too many times. I would prefer to change that statement:
Make well structured, informative web pages that are relevant to what you are doing.

When you apply the above, search engines have little option but to regard each page highly and rank it accordingly. While there are so many other factors to consider when optimising a page I believe the title to be a crucial element.

Thursday, March 19, 2009

The Theory of Relative URL's

E=mc² or E=mc2?

Have you ever tried to type E=mc²? Notice how difficult it is to find the funny little ². In fact, how many people even know how to go about finding that ²? Similarly, while Pi can be rounded to 3.14, the number is infinitely long, and even remembering it to two decimal places is hard enough. However, this is not a science or maths lesson. The point I'm trying to make here is that "Pi" is easier to remember than 3.14… or than how to find the ².

Similarly, we can compare the following URLs:

www.widgets.com/purple-widgets.htm
www.widgets.com/itempage.htm?id=123

At first glance we would assume that one is a static URL and the other a dynamic URL. Both of these URLs could point to the exact same item, but which one are you more likely to remember? Just by looking at the above URLs you can already guess which page relates to purple widgets.

We know that the anchor text in a link carries much weight when it comes to gaining a top rank for a specific keyword. Indeed anchor text alone can get a site ranked for a search term that is never mentioned on that page. This has been used and abused in the past. Link bombs, such as the “miserable failure” Google Bomb, serve to prove just how valuable anchor text in a link can be. While many links created on websites are displayed as “widgets.com” you can already see the benefit of having keywords in your URL.

The search engines continue to preach that you should be optimising your site for real people and not the bots that visit the website. With this in mind, I wouldn't be surprised to find that www.widgets.com/purple-widgets.htm would be ranked higher simply because it is a simpler URL and surely a lot easier to remember than a messy dynamic one. This could just be wishful thinking on my part, or is it? When running a few searches I found that 7 out of the top 10 results had keywords in the URL.

Search engines prefer the use of hyphens in domain names because they can produce more accurate search results by being able to divide and recognize specified keywords in your URL. After all if it’s easier for us to read purple-widgets than it is to read purplewidgets why shouldn’t it be the same for a bot?

Many would then assume that the underscore "_" works the same as a hyphen. This is not true. It would appear that, as the underscore character is often used in programming languages, it is treated as a character in its own right. A hyphen, as we all know, simply joins words together; it is read as a simple join between two words, nothing more.

It is also worth mentioning that the URL is listed in the actual search results themselves. While just a small single line of text, the URL may give the searcher a little more faith that the page listed is actually what they are looking for. So with a neatly put-together title, a gripping description and a URL that matches both, you might just find that the URL can even aid in generating traffic.

Useful Tips:

1. When picking a domain name that people will link to, use your targeted search phrase.
2. When creating directories and files use your targeted keywords.
3. Individual words in the URL should be separated, as the search engines might not recognize them when joined (although stemming seems to have seriously improved in the major search engines - smaller engines still look for exact matches), i.e. purplewidget.htm should be purple-widget.htm
4. When separating words in a file or directory, use a hyphen rather than an underscore (this is easier to see as an underscore can’t be seen if the link is underlined).

As you can see, the search engines and visitors alike have very similar needs when it comes to making sense of your website. Google have been on a crusade for as long as I can remember, trying to get webmasters to design websites that are aimed at a human audience. Perhaps this is a prime example of good structure that works for both human and bot. Perhaps this is just a coincidence. But while we hope that the search engines return more accurate search results, this could indeed be a step in the right direction.

Which brings me back to the original question: E=mc² or E=mc2?
Remember to always pick the one that will be easier for the end user to understand, be it human or robot. It would appear that the two are a lot closer than many may think.

Tuesday, March 17, 2009

Is Your Website a Unicycle?

Is your website a unicycle - a vehicle that requires much training and skill before it can be used? While there are so many "beautiful" websites online, some simply don't make sense. Have you ever found yourself on a website that seems quite impossible to use? Even worse, landed on a website after doing a search only to wonder why you are there at all?

Site usability is possibly one of the more important factors of a top performing website. While so many will argue that the site is nothing without a genuine web presence, I will argue that some websites rely purely on offline marketing. At the end of the day, if your website is impossible to use, nobody will be able to (or even want to) use it. Points to ponder when designing your website:

1. Navigation
2. Login/Signup
3. Onsite search
4. Flash and other multimedia
5. Bookmarks/Favourites
6. Contact

1. Navigation
This may seem like an obvious point, but as most visitors are likely to land on your homepage, are they able to navigate to the section of the website that best relates to their needs? Simple text navigation will also make it easier for the search engines to index the individual pages of your website (where have you heard "design the website for a human visitor" before?).

2. Logins and Signups
Does your website require visitors to log in? Do you want new visitors to sign up for your newsletter (or other services)? If so, is it possible to do so from the homepage? While you may not want to place a login on the homepage, a link to a login page will suffice. Again, the key is to keep it simple and clear as to what you expect of the visitor.

3. Onsite Search
This is crucial for any website that offers a large quantity of information or products. Can you imagine trying to find an item among 2,000 by going through a product list 10 items at a time? I didn't think so. Offer your visitors what they are looking for by adding a simple search to your website; this should help speed things along. Many websites have a quick search option towards the top right-hand corner of the homepage (sometimes this spans the entire website in all the headers). Keep it simple, visible and obvious, and make sure that the average visitor knows that this is a search function.
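
A bare-bones search box is all it takes (the form action and field name here are hypothetical and depend on your own search script):

<form action="/search" method="get">
<input type="text" name="q" />
<input type="submit" value="Search" />
</form>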

4. Flash and other Multimedia
Okay, so Flash is a pet hate of mine, but the same could be said of any multimedia that simply clutters a website. Remember that while multimedia and other interactive elements can at times seem really cool, or even a good idea, some visitors don't have advanced, up-to-date browsers. That said, sometimes the best way of doing something is through the use of these tools. Make sure that they are placed on well-marked pages with an explanation of what they are about. This way, if visitors are unable to view the contents, they at least know what it is about and why they can't view it. Otherwise they will simply think that the website doesn't work and leave. After all, what use is a website that is broken?

5. Bookmarks and Favourites
If you want returning visitors (who doesn't?) then it is usually a very good idea to offer a "bookmark this page" or "add to favourites" button. I'm pretty sure we are all in agreement that traffic is valuable, so there is no excuse for letting it get away. The favicon is a useful way of separating your website from the others; once bookmarked, this icon will be found next to your website's name. This is an ideal spot to promote your logo and brand.
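
Adding a favicon is a single line in the head of the page (assuming the icon file sits in the root of your site):

<link rel="shortcut icon" href="/favicon.ico" />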

6. Contact
Even after making the site as foolproof as possible, there will still be occasions where, even with all that planning, something will come along that you hadn't factored in. When this occurs, make it as easy as possible for the visitor to contact you - be it by making your contact details (phone, email and fax) available on each page, or by placing a quick contact form that is accessible from each page. Again, you've worked hard to drive the traffic to your website; don't let it simply get away.

Remember, simple is best; leave no room for mistaken functions. Signups, logins and searches should be clearly marked so as not to confuse the visitor. Make it as easy as possible for your visitors to find what they are looking for. With a well-structured website you will notice that the conversion from visitor to customer will increase. At worst, the questions on where to find something or how to use the website will decrease. Your website is, after all, supposed to make your life easier as well as save you time.

SEO Design

There are many aspects to consider when putting a design together, most of which are either second nature to the seasoned developer or overlooked completely by the novice. Although, as we all know, what looks good doesn't necessarily work well, and vice versa.

Neatness of Code:
Code should be neat. Simple. No, it doesn't have to conform to W3C standards (Google's own pages don't conform, and it is estimated that only 3% of all sites actually do - as a side note, half the sites that claim to conform don't either). With so few sites conforming to these standards, how could Google (or any other engine for that matter) offer decent results if it negated 97% of the internet? Keeping code clean includes keeping all generic information, such as style sheets and scripts, in separate files.

Robot Tags and Text File:
Often you may not wish for the search engine bots to index certain pages. You can easily add a noindex robots meta tag to such a page. For whole directories you can simply add entries to your robots.txt file, located in the root of your website (http://www.website.com/robots.txt). Why is this important for design? Well, crawl rate and indexing are a concern for all departments. Remember I mentioned that all generic information should be kept separate? This way you can simply block those directories with the robots.txt file, and the search engines will spend their time on your actual content pages rather than your styling code.
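
As a sketch, the page-level tag goes in the head of any page you want kept out of the index:

<meta name="robots" content="noindex" />

And in robots.txt, whole directories can be blocked (the directory names here are made up):

User-agent: *
Disallow: /css/
Disallow: /scripts/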

Content:
Do not replicate content on multiple pages; it's a waste of time and effort, and it dilutes keyword value. While the duplicate content "penalty" is a myth, duplication does confuse the search engine as to which page is more important, or even which page came first. Imagine someone giving you the key to a Ferrari and telling you it is for the red Ferrari parked outside. Now imagine there are 10 red Ferraris parked outside! Which one does the key fit? If there is only one Ferrari, the choice is easy. Usually the page which is indexed first is the one credited with being unique; the other pages simply dilute their keywords and purpose. Personally, I've always tried to aim for one keyword per page; this also lends itself to long-tail combinations working on a single page.

While it is commonly accepted that the major search engines ignore boilerplate content (such as standard navigation, headers and footers), it has since been suggested that you can point out which sections Google should ignore. This doesn't seem to be in mainstream use just yet and I am sure that this won't make much of a difference as it remains open to abuse - as with so many other on-page factors.

URLs:
URLs, or URIs, can make a difference when it comes to ranking. As mentioned before, people may link to a page (home or internal) using the actual URL, and since anchor text is vital for ranking a page, it makes sense to include keywords in your URL. Long gone are the days when URLs had to be dynamic, with strange characters and session IDs (a massive source of duplicated pages).

www.website.com/page?ID=123
www.website.com/location/
www.website.com/Town-Name/

In addition to duplicating pages, session IDs and multiple variables can also cause a search engine spider to become trapped in deep pages. Once trapped, a spider will leave the website, and this may result in your more important pages not being indexed. We can now specify the preferred URL of a page through the use of the canonical link tag in the page header. In this instance the search engines (Google & Ask.com) will ignore session variables (or others you may have generated) and only index the page as you specify.
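
The tag itself is a single line in the head of the page, pointing at the clean version of the URL (the address here is hypothetical):

<link rel="canonical" href="http://www.website.com/location/" />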

Links:
The easiest way for human and bot to get from page to page is through links, but not all links were created equal. Links hidden in Flash, images or scripts may look good to the human but be impossible for the search engine bot to read. Content remains king, and while community (social media) has recently been crowned queen, it is the text link that remains everyone's servant. On your own website you can use your desired anchor text to describe the page you are linking to.
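
That is as simple as (hypothetical page and phrase):

<a href="purple-widgets.htm">purple widgets</a>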

From another website, if a link to a site is a vote, then the anchor text tells you what the vote is for. Because so many webmasters, bloggers and publishers link to pages using the URL as the link text, it becomes quite clear just how valuable it can be to include your desired keywords in your URL. However, no matter how hard you try, you will always have broken links pointing to your site. This could be due to a typo, or because you've moved a page (or restructured the website), in which case a custom 404 page is crucial. When rebuilding a website and changing your URL structure, it is advisable to 301 (permanently redirect) each old URL to the corresponding new one.
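
On an Apache server, both can be handled from the .htaccess file (a sketch; the file names are made up):

ErrorDocument 404 /404.htm
Redirect 301 /old-structure/page.htm http://www.website.com/new-structure/page.htm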

Forms and Restricted Pages:
Don't hide your great content behind forms and logins. Robots can't fill these in and won't be able to reach those pages; simply put, they won't know the content exists. There are ways around this, but why make it difficult for the robots, or even for humans, who are becoming more and more reluctant to part with personal info on the web?

Sitemaps:
An XML sitemap is for robots (often simply referred to as a Google Sitemap). If you have many pages, consider breaking these down into themes. At present I prefer to set up a static XML sitemap for the pages that won't change and a dynamic XML sitemap for listings, products and anything else that changes on a regular basis.
An HTML or plain-text sitemap is for humans, and can be a perfect place to get all those keywords in, either in the link itself or next to it. It is also an easy way for a visitor to find something listed on the website. Make sure that this page is easily accessible from the homepage.

Summary
It is reported that Google uses over 200 criteria when it comes to ranking a website. Many of those aren't part of the design, but a few that are include:
  • Keep code to the minimum required
  • Minimise the use of code that search engines can't read (hide it in separate files when possible)
  • Unique content - keep navigation consistent
  • Use descriptive URLs
  • Keep URLs unique
  • Use descriptive internal linking
  • Use text links to reach all of your pages
  • Provide a custom 404 page
  • Don't hide great content behind forms and login pages
  • Use XML sitemaps for the search engines
  • Use a descriptive HTML sitemap