Wednesday, June 24, 2009

Cuil, Remember Them?

Hey, finally some news on Cuil. You remember Cuil, don't you? The Google killers with the largest database of indexed pages in the world. Oh... no? Don't worry, I'd forgotten about them entirely too, especially after the recent re-branding of MSN and Live search to Bing (which I'm quite fond of).

Well, it seems that Cuil have decided that it's time to innovate. As reported by Matt McGee on Search Engine Land, Cuil have now added Maplines to their search results. This makes for a much more interesting results page, as displayed in a search for George Orwell.

While Google keep their monopoly on simple search, could it be that the other providers have decided that if you can't beat them, you should start a new game? Ask.com brought out their 3D search (which has faded away), Microsoft eventually consolidated everything into Bing (which is a lot more interactive, even offering a blurb on sites), and now Cuil change it up a little. I can't help but feel sorry for Yahoo!, as they really are lagging now (dead duck or just lame, I wonder). I don't see people changing their search habits anytime soon, but the internet changes pretty quickly. What will Google do to counter this?

PageRank Update?

SEO Crowd: "Hey, there's a PageRank update!!!"
Me: "Like I care!"

Hehehehe... yeah, okay, so it's always amusing to see the freak-out and the absolute fascination with PageRank. God forbid someone's rank drops...

All for a digit between 0 and 10... but what about that grey bar?!

Anyhow... we'll see the outcries later this week no doubt.

Outlook 2010 to be Broken?

Now this is an interesting turn of events. I've never been much of a fan of Outlook (I've preferred the Express version - it's less, but less clutter as well), but it looks like they're changing it up all over again. Microsoft are due to render HTML email with Word's engine rather than a browser's, which will basically break HTML email in Outlook. I imagine this will keep the absolute end user more than pleased with the product (let's face it, they don't care what's under the hood) as they won't really notice a difference - 'cept perhaps download speeds. Word does tend to make things a little bulkier.

I can hear the designers and developers who have become dependent on HTML email groaning in the background, cursing Microsoft once again for making their lives difficult.

I'm really not a Microsoft fan, but I've come to accept that most people use their products, so they <rant rel="for another day">are THE standard</rant>. That said, I feel that the developers who have been moaning for so long about standards and the fact that IE is bundled into the OS are really the ones to blame. Microsoft are slowly being forced to remove IE from their offerings, in this case Windows 7 (yeah, I know they could offer multiple browsers, but do you advertise for your competition? Didn't think so!).

Basically, Microsoft have been forced to change how they make use of their own software. If you can't use something that's built into the OS (because now there is NO browser), you have to use something that's shipped with the package. Office, I believe, ships with Outlook and Word. Word allows formatting, so Microsoft use it as the base editor. Will this affect their customer loyalty? No... most end users have no idea that there are options other than MS Office or IE (believe it or not), so while the masses continue to use Microsoft products, it is the rest of us who have to adapt and conform.

Anyone who ever insisted that Microsoft remove IE from their OS, take a bow.

Well done, moaning developers and designers... and congrats to all of you in the EU forcing the anti-trust issue. You've finally got what you wanted. If anything, this will simply boost sales of Word (and subsequently Office). While I hear so many moans and groans, I can't help but think you all got just what you deserved.

Monday, June 22, 2009

Online Publications

While advertising was always the bread and butter for newspapers, it would seem that these bells and whistles are now the focus of online publications. The accompanying Online Journalism diagram best outlines this. Perhaps newspapers aren't dying, as many seem to be pointing out; perhaps they are literally selling their existence. I know that far too many pages these days are nothing more than ads. The recent upgrade of News24 has gone a long way to reinforce this thought.

Whatever happened to the "sponsored by" with a logo? Too many banners have caused banner blindness. What is an optimal click-through rate on these banner ads? You have to stop and wonder. With fewer click-throughs, the advertiser is paying more and more for a lesser result. I wonder if any of these publications would ever (could ever) move over to a cost-per-click model. Would this offer better value, or would it go to prove that their glut of ads simply doesn't work - in this case for the publisher?

I say put the shoe on the other foot.

Thursday, June 18, 2009

Social Media: Twitter

Tired of spammers? Tired of the get-rich-quick schemes? Well, it seems that these tactics continue to work; after all, if they didn't, the spammers would stop, right?

I found this on Twitter via SEOSumo: Social Media Douchebag.

It raises the question, though: just how many people are out there pushing their secret to success on the numerous social platforms? How often are we hit with the Twitter follower that is just a pretty girl (thanks Zaibatsu) punting their "get 2,516 Twitter followers in just 2 days" tweet and nothing else? Then there's the whole debate on whether to use the long sales letter or even video. And don't we all just hate those automated direct messages?

I guess it's nice to find something with a little bit of humor on a Friday morning. Oh well... in the meantime I think I'll just refer these folk to www.socialmediadoucebag.net

Sunday, June 14, 2009

SEO Glossary

Age - First appearance of site in Archive.org, or first appearance in search engines. Not to be confused with domain age, which is the registration date of the domain name. Older sites have more credibility, but for SEO purposes the "age" clock starts when a site is cached by a search engine.

Algorithm - A very complex series of rules used by a search engine to determine rankings. The Google Algorithm uses up to 200 different factors to determine web rankings.

Analytics - Most often, this is a reference to Google Analytics, a free way to measure your site traffic. Other analytics programs include ClickTracks, WebTrends, and Omniture.

Anchor Text - Linked text on a web page. Example: This is anchor text. Anchor text is important because search engines use it to determine what the destination page is about. Therefore, anchor text must be topical and relevant.

Backlinks - Links from other websites to your website. Google Webmaster Tools will give you the most accurate picture of your own links, and a search in Yahoo for link:yourcompetitorsitehere.com will tell you how many links Yahoo is listing for that site.

Ban - A severe search engine penalty that takes you completely out of the index. Normally caused by using black hat techniques.

Black Hat - In reference to search engine optimization, a technique that is unethical in the eyes of a search engine, and can get you de-listed.

Bounce Rate - The percentage of visitors who come to a web page from another site (or search engine) and leave without visiting any other pages. A high bounce rate is believed to negatively affect search engine rankings over time. Most often measured using Google Analytics.

Cache - The search engine's stored copy of your site. This information can be weeks or months out of date, depending on your crawl rate. When you make SEO changes to your site, they won't be reflected until the site gets re-cached and re-indexed. To see your cache in Google, type cache: followed by your website address.

Content - All text on your website readable to the search engine. Usually this is in reference to the body text on your pages.

Conversion - A visit to your site that results in an action being completed by the user. This can be a form fill-out, purchase, or phone call.

Conversion Rate - The number of conversions divided by the number of visitors. Higher conversion rates are always preferred. In Google Analytics, this can be considered "Goal" conversion.
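To put a number on that, here's a minimal Python sketch of the calculation as defined above; the figures are made up purely for illustration.

```python
# Conversion rate = conversions / visitors, usually quoted as a percentage.
def conversion_rate(conversions, visitors):
    if visitors == 0:
        return 0.0
    return conversions / visitors * 100

# e.g. 25 goal completions from 1,000 visits
print(conversion_rate(25, 1000))  # 2.5 (%)
```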

Crawl Rate (Frequency) - The interval between search engine robot visits to your site. Generally, sites with frequent changes and more interesting (to a robot) content get visited more often. Pages with higher PageRank also get visited more often.

Description - A metatag that allows for a brief description of the page's content. All description tags on a site should be unique, and less than 256 characters.

Directory - A website that lists other websites in categories.

Duplicate Content - Content that is substantially similar to content on other sites or on multiple pages of your own site. Non-original content is generally ignored by search engines; the effect is referred to as a "duplicate content penalty" when it impacts your site. Duplicate content is often cached but not presented in normal search results.

External Link - A link to another site or online resource from your site.

Google Sitemap - An XML sitemap that lists pages on your website that you want Google to find. The same protocol is used by Yahoo and MSN. Several sources online will create a sitemap for you. Not to be confused with an HTML sitemap page that lists all the pages on your website for visitors.

Filter - A reduction in search engine ranking for any of a number of possible reasons. Filters are different from penalties in that once the item tripping the "filter" is removed, results should bounce back.

Indexing - When a search engine adds your site's pages and links to its current index. Web pages can be cached for some time before the cached results are applied to the index.

Internal Link - Links from pages on your site to other pages on your site. How pages link to each other is known as Navigation.

Keyword Blurring - Using the same keywords on multiple web pages. This keeps the search engine from picking a "best" page for the keyword, so multiple pages may have lower positions than a single page devoted to the topic.

Keyword Stuffing - Using multiple keyword repetition on a web page. Search engines prefer text and keyword use that is more readable and user-friendly.

Keyword Tool - Any tool that helps determine keyword demand. Wordtracker and the Google Keyword Tool are two popular sources.

Keyword Research - Strategic research into the demand for keywords relevant to a website's topic. Good keyword research also uncovers synonyms and search terms that may improve site traffic.

Link Juice - A way of describing the relative power of any link to another page on the same site or an external web page, based on the power of the referring page and the number of other links on that page. For example, a powerful page with a single outbound link to your site would pass more "link juice" than the same page with a link to you among 49 other links.
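To make the one-link-versus-fifty-links example concrete, here's a rough Python sketch. The 85% pass-through factor and the "strength" units are assumptions for illustration only, not a published formula.

```python
# Link juice sketch: a referring page's value is divided among its outbound
# links, so the same page passes far more per link when it links out sparingly.
def juice_per_link(page_strength, outbound_links, pass_through=0.85):
    return page_strength * pass_through / outbound_links

strong_page = 10.0
print(juice_per_link(strong_page, 1))   # sole link to you: 8.5 "units"
print(juice_per_link(strong_page, 50))  # one link among 49 others: 0.17 "units"
```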

Links - In the world of SEO, "links" is most commonly a way of referring to inbound links to your website, given that Google bases a great deal of its rankings on other sites that link to yours. The value of links is highly variable, and links from sites trusted by search engines are more powerful than links from low quality sites.

Link Popularity - An overall measurement of a website or web page's link value, as determined by links from outside sources and links from other pages, which may themselves be getting good inbound links.

Long Tail - A keyword that contains a long search phrase. Long tail keywords usually have a lower search volume but a higher conversion rate, because the people who type them in have a very specific idea about what they want.

Metatags - Page code not normally visible to a site visitor which describes the content of the page. The Meta Title, Keywords, and Description tags are the most common, but metatags can contain many different fields of data not important to search engines.

Navigation - The way links are configured on a website to allow people to get to other pages. Search engines like to follow navigation and use it to determine the relative importance of pages on a site.

PageRank - (1) A numerical representation applied by Google showing the link value of any given page. This is determined entirely by links from other websites and internal links; it is not a representation of the relevance of the site. PageRank uses a logarithmic scale from 0 to 10, and higher numbers may require millions of links. It can be found using the Google Toolbar. (2) The algorithm at Google, not completely known to the public, that determines part of how links impact rankings.
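Google's actual implementation isn't public, but the underlying idea can be sketched as a toy power iteration over a tiny link graph. This is illustrative only; the toolbar's logarithmic 0-to-10 bucketing is a separate, undocumented step.

```python
# Toy PageRank: each page's score is fed by the pages linking to it,
# divided by how many links those referring pages carry.
def toy_pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        rank = {
            p: (1 - damping) / len(pages)
               + damping * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return rank

# A tiny three-page graph: most links point back to "a", so it ranks highest.
print(toy_pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))
```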

Pay-Per-Click (PPC) - Paid search engine advertisements that appear next to search results. PPC can be very expensive, but can be executed within hours, while SEO can take months.

Penalty - A change in search engine rankings caused by breaking one or more "rules" of search engine ethics. A search engine "filter" is a less strict penalty, but a "penalty" can be applied for a longer time period and is generally a sign that you are believed to be deliberately violating webmaster guidelines for search engines.

Ranking - A keyword position on a search engine, anywhere from #1 to somewhere in the billions. Usually you want your site to show on the first page for your keywords.

Ranking Report - A listing that shows positions on search engines (usually Google, MSN/Bing, and Yahoo) for a list of preferred keywords. Monthly ranking reports will show you your progress over time.

Reinclusion Request - A request to a search engine that a site be reexamined for inclusion back into listings. This is most commonly done when a site has been penalized or banned.

Relevance - The key to good SEO. More relevant sites are preferred by search engines because they confirm the search engine user's trust in the ability of the engine to deliver results. SEO practices help format a site in such a way that the engine can understand its relevance.

Robot - An automated program that visits your website.

Robots.txt - A file on your website that can either allow robots or restrict them. Robots files can be useful when you want duplicate pages to be ignored, or search engines are crawling unnecessary pages.
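As an illustration, here's a hypothetical robots.txt checked with Python's standard-library parser; the domain and paths are made up, and the rules are just examples of the "ignore duplicates / skip unnecessary pages" use cases mentioned above.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block duplicate print pages and internal search
# results, allow everything else, and point robots at the XML sitemap.
rules = """
User-agent: *
Disallow: /print/
Disallow: /search/
Allow: /
Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "http://www.example.com/print/page1.html"))  # False
print(parser.can_fetch("*", "http://www.example.com/about/"))            # True
```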

Sandbox (AKA Sandbox Penalty or Google Sandbox) - An artificially low ranking due to having a new website. The existence of the sandbox penalty is debated, but generally a new site will get lower rankings. Search engines use this to prevent junk sites from getting rankings. There are ways to get out of the "sandbox" by being relevant, but customers with new sites are still advised that search engines may take some time to show good rankings.

Search Volume - How many times (usually per month) that a keyword search is made in a given search engine, or all engines. High search volume indicates a competitive keyword which may be more profitable.

Short Tail - A one or two word search term like "auto parts" that gets a high search volume, but is not very specific. A "long tail" version of the same term would be "used auto parts free shipping."

Spider - Essentially a search engine robot that "crawls" your website for information.

SEM - Search Engine Marketing. This most often refers to Pay-Per-Click initiatives, but can also include SEO as part of an online marketing strategy.

SEO - Search Engine Optimization, or the practice of getting websites ranked on search engines through a variety of specialized methods.

SERP - Search Engine Results Page. The list of websites that you get when you make a search on a search engine.

Silo - A way of structuring categories on your website and individual web pages. Normally all the pages and navigation links in a silo are relevant to each other, and the "silo" structure helps improve rankings by structuring similar items into easily navigated categories. This benefits search engines and site users.

SPAM - In search engine parlance, Spam is not junk email but site content and linking practices that are keyword stuffed, automated, or created to get undeserved rankings for search terms.

Submission - The act of submitting a site to search engines or directories. For new sites, submission is still useful, but any site cached in a search engine would not need to be re-submitted.

Title - Also known as the meta title, the title of each web page appears at the top of the browser window. It tells search engines about the topic of each page. A well written title can have the fastest impact on search engine rankings if all other factors are good.

Webmaster Tools - Google Webmaster Tools is a free program that will help the average user understand how Google sees the website, if there are any problems, and if the site is penalized. Highly recommended to any webmaster.

White Hat - Search Engine Optimization techniques that are approved by search engines.

XML Sitemap - A "Google Sitemap" or a list of pages that you want search engines to find. This normally gets placed in your root directory in an XML format and named "sitemap.xml." The sitemap contains information about pages, their relative priority, and how often they are updated.
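As a rough illustration of the format, here's a minimal Python sketch that writes a sitemap.xml using only the standard library; the URLs, priorities and change frequencies are hypothetical placeholders.

```python
import xml.etree.ElementTree as ET

# Hypothetical pages: (URL, relative priority, how often they change)
pages = [
    ("http://www.example.com/",          "1.0", "daily"),
    ("http://www.example.com/articles/", "0.8", "weekly"),
    ("http://www.example.com/about/",    "0.5", "monthly"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, priority, changefreq in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "priority").text = priority
    ET.SubElement(url, "changefreq").text = changefreq

# Drop the result in your web root as sitemap.xml.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```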

This glossary is borrowed from WebProNews. I really wish that it didn't take so long to load a single page.

Wednesday, June 3, 2009

Is Google Evil?

"Do no evil"

"In business, evil refers to unfair or unethical business practices. Firms that have a monopoly are often able to maintain the monopoly using tactics that are deemed unfair, and monopolies have the power to set prices at levels which are not socially efficient. Some people therefore consider monopolies to be evil. Economists do not generally consider monopolies to be 'evil' though they recognize that certain business practices by monopolies are often not in the public interest." - Wikipedia

Are Google a monopoly? I'd say yes. By that definition (from Wikipedia, so take it with a pinch of salt), Google are indeed evil.

Google have often been said to "do no evil": promoting free access to information for all, arguing that ISPs shouldn't limit access to information. While I'm all for a free and fair internet and information at my fingertips, I am concerned about what the cost of "free" really is. Power corrupts... absolute power corrupts absolutely.

"There are no free lunches!"

Or so I'm told. So if they aren't charging me money for this service, what is the real cost? I came across an interesting article on "The Plot to Kill Google" - the real Google killers (so much for Cuil - anyone heard of them since their launch?). However, many do have compelling arguments as to why Google shouldn't be trusted.

While all of that makes for a decent argument, I guess one should be pleased that someone is trying to monitor Google's activities.

On the other hand, it could be worse: Google Press Release
Okay, so those are just for humor's sake. But imagine if they were actually true. Which is more evil? I guess we'll just have to keep an eye on the almighty Google for now.

By the way, all information was found searching with Ask.com

Tuesday, June 2, 2009

(Google) Bombing the Presidents

Well, Google may have defused that bomb, but it looks like bing.com are more than happy to keep it locked and loaded. Danny Sullivan of Search Engine Land pointed out that, for some reason, the "miserable failure" bomb seems to be alive and well (or is that lit and fused?) again in recent times.

Google run their algorithms from time to time to locate and defuse these bombs, but what of the other engines? Yahoo! never seemed to get rid of it entirely, and MSN was pretty much on the same path. At present, a search for "miserable failure" on Ask.com reveals George Bush to be the number one contender, while surprisingly his right-hand man is now Barack Obama (does Ask know something we don't?). Okay, so there is more to this than meets the eye.

The White House implemented 301 redirects a little while ago to send most of that failure link-love back to its intended destination (that of past President GW Bush). It seems that Google have managed to once again find and block this bomb. MSN (now combined with Live as Bing.com) seem to have re-indexed the site and are now reporting the GW Bush bio as a "miserable failure", followed by the Wikipedia report on this event and, surprisingly, the new (and only one that I know of) bio for B Obama. Yahoo!'s results match those of Ask.com (although they have indexed the new URL). Interestingly enough, Ask.com still list the old gwbbio.html file as the URL for GW Bush despite the site having implemented 301 redirects (clicking on the link takes you to /georgewbush/).
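For anyone curious what a 301 actually looks like from the server's side, here's a minimal Python sketch of a permanent redirect. The old and new paths are loose stand-ins based on the URLs mentioned above, not the White House's actual configuration.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of retired URLs to their new homes.
REDIRECTS = {
    "/gwbbio.html": "/georgewbush/",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            # 301 = "moved permanently": search engines are expected to pass
            # the old page's link value on to the Location target.
            self.send_response(301)
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Hello from the new page.")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```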

Okay, so what does this really tell us? That GW Bush and Obama are miserable failures? Well, that will forever remain a long debate.

From an observational point of view, I'd say that the miserable failures in this case are the search engines themselves. Agreed, they may simply be returning results based on what the public perceive; however, this goes to prove how easily these giants can still be manipulated to this day.

Google seem to have worked around this one, and I have a sneaking suspicion that when Matt Cutts moved from the domain www.mattcutts.com to www.dullest.com he was testing just how well they handle the 301. Perhaps Obama and Bush both owe Matt and his team a thanks on that one. On the other side of this, we can gather that most of the search engines still need to work on how they deal with 301s.

Could this leave a door open to spammers to "Google bomb" a page, then simply 301 the link love to another page?