New Search Engine aims to replace Google
Dec 31, 2011 – Search engines are an integral part of our online experience. However, it wasn't always like that. Early attempts to create something resembling a search engine resulted in what we know today as web directories. The practice became outdated very quickly; there was simply too much content for directories to remain practical. In December 1993, JumpStation created an interface for user queries and released a full-featured web robot. It was the first resource discovery tool that combined the three essential features of a web search engine as we know them today: crawling, indexing, and searching.
Yahoo!, Magellan, Lycos, Infoseek, and Excite were the brightest stars in the Internet investing frenzy of the late 1990s. A decade later, only one of those early players survived. A revolutionary algorithm by Larry Page and Sergey Brin, the founders of Google, weeded the competition down to three major players: Google, Yahoo!, and Bing. The change was inevitable. Before Google came onto the scene, search engines based their results on the number of keywords in the content. A clever webmaster could stuff a page with a desired keyword and quickly climb up the search results. Google’s approach was different.
In the pre-Google era, if you saw a website you liked and wanted to remember the address, you had no reliable place to put it except your own website. Larry and Sergey realized that popular resources are mentioned frequently by users of the world wide web. Back then, if a website had many links pointing to it, it usually meant the website was actually good. This is how the PageRank algorithm was born. Its essence was simple: it ranked search results by the number and weight of their back links. A website with a large number of links pointing to it had greater value in Google’s eyes than a keyword-stuffed website with no links.
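The idea can be illustrated with a minimal sketch of the classic PageRank power iteration. This is a toy model with an invented four-page web, not Google's actual implementation, which has long used many additional, undisclosed factors:

```python
# Minimal PageRank sketch. The graph below is invented for illustration;
# this is not Google's production algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # A page passes its rank, split evenly, to each link target.
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Toy web: three pages all link to "hub", so "hub" ranks highest.
toy_web = {
    "hub": ["a"],
    "a": ["hub"],
    "b": ["hub"],
    "c": ["hub"],
}
ranks = pagerank(toy_web)
print(max(ranks, key=ranks.get))  # prints "hub"
```

Note that rank flows *through* links: a link from an already well-linked page is worth more than a link from an obscure one, which is what made PageRank harder to game than raw keyword counts.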
Although PageRank was more difficult to game, webmasters were quick to adapt. Instead of keyword stuffing, many sites focused on off-page search engine optimization, which included exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming. By 2011, Google had incorporated a wide range of undisclosed factors into its ranking algorithms to reduce the impact of link manipulation. However, as it stands now in the battle between creative webmasters and Google, the latter is losing.
Have you noticed that search results no longer suffice for your needs? That you spend much more time searching for trivial things? That Google fails to deliver when it comes to long-tail keywords? You’re not alone. What was once an excellent algorithm has become less and less so with the growth of the Internet. A change is needed, and there is a new player on the scene who might be able to provide a better Internet experience to all of us.
Blekko is a web search engine whose goal is to provide better search results than those offered by Google. The site, launched to the public on November 1, 2010, raised $24 million in venture capital from such individuals as Netscape founder Marc Andreessen and Ron Conway, as well as from U.S. Venture Partners and CMEA Capital. The company aims to provide useful search results without the extraneous links often provided by Google. Individuals who enter searches for such frequently searched categories as cars, finance, health and hotels will receive results prescreened by Blekko editors who will use what The New York Times described as “Wikipedia-style policing” to weed out pages created by content farms and focus on results from professionals.
Blekko uses slashtags to provide results for common searches – an innovative concept that restricts the set of search results to those matching the specified characteristics. Do you want to find the latest gossip on Kim Kardashian? Use “Kim Kardashian /gossip” in your query – as simple as that.
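Conceptually, a slashtag acts as a whitelist filter laid over an ordinary result list. Here is a minimal sketch of that idea; the tag names, site lists, and function names are made up for illustration and are not Blekko's actual data or code:

```python
# Hypothetical slashtag filter. The tag-to-host mapping is invented
# for illustration; Blekko's real lists were curated by its editors.
SLASHTAGS = {
    "gossip": {"tmz.com", "people.com"},
    "health": {"mayoclinic.org", "nih.gov"},
}

def parse_query(query):
    """Split 'kim kardashian /gossip' into (search terms, slashtag)."""
    terms, tag = [], None
    for token in query.split():
        if token.startswith("/"):
            tag = token[1:]
        else:
            terms.append(token)
    return " ".join(terms), tag

def apply_slashtag(results, tag):
    """results: list of (host, url) pairs; keep only whitelisted hosts."""
    allowed = SLASHTAGS.get(tag)
    if allowed is None:
        return results  # unknown or missing tag: no filtering
    return [(host, url) for host, url in results if host in allowed]

terms, tag = parse_query("kim kardashian /gossip")
hits = [("tmz.com", "tmz.com/story"), ("spamfarm.biz", "spamfarm.biz/kim")]
print(apply_slashtag(hits, tag))  # only the tmz.com result survives
```

The key design point is that filtering happens against a human-curated list rather than an algorithmic score, which is exactly what lets it shut out content farms.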
Blekko’s queries related to personal health are limited to a prescreened list of 76 sites that Blekko editors have determined to be trustworthy, excluding many sites that rank highly in Google searches. As of Blekko’s launch date, its 8,000 beta editors had developed 3,000 slashtags corresponding to the site’s most frequent searches. The company hopes to use editors to develop prepared lists of the 50 sites that best match its 100,000 most frequent search targets. Additional tools built into Blekko allow users to see the IP address that a website is running on and let registered users label a site as spam.
Google is still the dominant player, processing 82.80% of web search queries, and Blekko’s battle for market share won’t be easy. However, the Internet is still a very young and dynamic environment where anything can happen in a relatively short time. Twitter, which came out of nowhere to become the dominant micro-blogging service on the web, is living proof of that.
Blekko it – and you might find what you’re looking for.