Algorithms or human judgement? Give me logic

  • December 23, 2006
  • James Skemp
  • Internet

Earlier today it was reported in the news that Wikipedia founder Jimmy Wales is working with Amazon on a search engine.  This new engine will supposedly use human judgement to weight results.  Naturally, sources are saying that this engine's competition is Google.  While Web 2.0 has some good things going for it, this is one of the silliest ideas I've heard in a long time.

First: DMOZ, anyone?  Want your site to show up high in the DMOZ directory? Bribe a DMOZ editor.  It's not hard.

Want your site or service to be posted on Digg?  No problem; your money is good with Digg posters.

Maybe people are intrinsically good (I like to think that they are neither good nor evil/bad in their original state), but thousands and thousands of years of evolution and human history have resulted in man being less than perfect.

With people already trying to cheat the mathematical algorithms that the best SEs (Search Engines) use, how is a human-based search engine going to do any better?  What we're really talking about, when we talk about human-judged search results, is a directory.

Putting aside the cheating and lying, the Wikipedia is a great example of this - probably one of the best out there (that I know of - comment if you know of anything potentially better).  The content, taken with a grain of salt, includes links to external sources.  For technical/Web-based items, you can bet I'll hit the Wiki, as there's probably going to be at least one link on any article that points to the essential information.

Anything more than this and you have to account for man's selfishness, pride, and greed.  As with the Wiki, certain individuals will need 'super' powers - the ability to lock items down - and users will need to be educated that the service should be taken with a grain of salt.

Of course, you have to do that with search engines now, since black-hat SEOs and the rest are interested in few things as much as the green, but when it comes to billions of sites, give me an algorithm-based search engine any day.

After all, how else are we to find the sites for our directory in the first place?  User submission?

On a related note ...

One thing that I would like to see in future versions of search engines is the ability to have some control over how my sites are listed.  Using sitemaps, I can already tell Google that I own such-and-such domains, but it would be nice to give my results some credibility.
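For reference, the sitemap mechanism is simple: an XML file listing your URLs, plus a ping to tell the engine where it lives.  Below is a minimal sketch in (modern) Python; example.com is a hypothetical stand-in, and the ping endpoint is the one Google documented when Sitemaps launched (it has since been retired), so treat this as illustration rather than a current API.

```python
# Minimal sketch: write a bare-bones sitemap and ping Google about it.
# example.com is a hypothetical stand-in; the ping endpoint below is
# the one Google documented circa 2005-2006 and has since been retired.
import urllib.error
import urllib.parse
import urllib.request

URLS = [
    "http://example.com/",
    "http://example.com/articles/",
]

# A sitemap is just an XML list of <url><loc>...</loc></url> entries.
entries = "\n".join("  <url><loc>%s</loc></url>" % url for url in URLS)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries
    + "\n</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)

# Tell the engine where the sitemap lives.  A successful ping only means
# it was received; ownership of the domain is still verified separately.
ping = (
    "http://www.google.com/webmasters/sitemaps/ping?sitemap="
    + urllib.parse.quote("http://example.com/sitemap.xml", safe="")
)

try:
    with urllib.request.urlopen(ping) as response:
        print(response.status)
except urllib.error.URLError as err:
    print("ping failed:", err)  # expected today, since the endpoint is gone
```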

"This site's Webmaster has registered with us, and here's their information."  (Information could be basic contact information, a Web page address, or what-have-you.)  I don't know, but it seems like a semi-good idea.  If the site is 'trouble', then the search engine could be notified of this.  You could potentially tie this into user reviews of the site as well.

Of course, there are a number of sites out there that do this, but that's the problem - there are a number of sites that do this.  (Again, why trust people in general when there are at least half a dozen sites for every possible idea out there?  Why not consolidate, and allow branching that's linked to the original site/group?)  Why build another site when, given sufficient numbers, you can make an existing site better?
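To make that registration idea concrete, here's a purely hypothetical sketch of the kind of record a search engine might keep per registered site.  Every name and field here is invented for illustration, not any real engine's schema.

```python
# Hypothetical sketch of a per-site registration record; all field
# names are invented for illustration, not any real engine's schema.
from dataclasses import dataclass, field


@dataclass
class SiteRegistration:
    domain: str                 # e.g. "example.com", verified via ownership check
    webmaster_contact: str      # basic contact info or a Web page address
    trouble_reports: int = 0    # times the site has been flagged as 'trouble'
    user_reviews: list = field(default_factory=list)  # tie-in for user reviews

    def is_credible(self, max_reports: int = 3) -> bool:
        # A registered site stays "credible" until it accumulates too
        # many trouble reports; the threshold here is arbitrary.
        return self.trouble_reports < max_reports


# Usage: a registered site's listing could carry a credibility marker,
# which trouble reports or user reviews would adjust over time.
site = SiteRegistration("example.com", "webmaster@example.com")
print(site.is_credible())  # True
```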


A unicorn, perhaps.