Search neutrality
Search neutrality is the principle that search engines should have no editorial policies other than that their results be comprehensive, impartial, and based solely on relevance. This means that when a user enters a query, the engine should return the most relevant results found in the provider's domain, without reordering results, excluding results, or otherwise manipulating them to fit a particular bias.
Search neutrality is related to network neutrality in that both aim to keep any one organization from limiting or altering a user's access to services on the Internet. Search neutrality aims to keep a search engine's organic results free from manipulation, while network neutrality aims to keep those who provide and govern access to the Internet from limiting the availability of any given content.
Background
The term "search neutrality" in context of the internet appears as early as March 2009 in an academic paper by Andrew Odlyzko titled, "Network Neutrality, Search Neutrality, and the Never-ending Conflict between Efficiency and Fairness in Markets". In this paper, Odlykzo predicts that if net neutrality were to be accepted as a legal or regulatory principle, then the questions surrounding search neutrality would be the next controversies. Indeed, in December 2009 the New York Times published an opinion letter by Foundem co-founder and lead complainant in an anti-trust complaint against Google, Adam Raff, which likely brought the term to the broader public. According to Raff in his opinion letter, search neutrality ought to be "the principle that search engines should have no editorial policies other than that their results be comprehensive, impartial and based solely on relevance". On October 11, 2009, Adam and his wife Shivaun launched SearchNeutrality.org, an initiative dedicated to promoting investigations against Google's search engine practices. There, the Raffs note that they chose to frame their issue with Google as "search neutrality" in order to benefit from the focus and interest on net neutrality.In contrast to net neutrality, answers to such questions, as "what is search neutrality?" or "what are appropriate legislative or regulatory principles to protect search neutrality?", appear to have less consensus. The idea that neutrality means equal treatment, regardless of the content, comes from debates on net neutrality. Neutrality in search is complicated by the fact that search engines, by design and in implementation, are not intended to be neutral or impartial. Rather, search engines and other information retrieval applications are designed to collect and store information, receive a query from a user, search for and filter relevant information based on that query, and then present the user with only a subset of those results, which are ranked from most relevant to least relevant. "Relevance" is a form of bias used to favor some results and rank those favored results. Relevance is defined in the search engine so that a user is satisfied with the results and is therefore subject to the user's preferences. And because relevance is so subjective, putting search neutrality into practice has been so contentious.
Search neutrality became a concern after search engines, most notably Google, were accused of search bias by other companies. Competitors claim that search engines systematically favor some sites over others in their lists of results, disrupting the objective results users believe they are getting.
The call for search neutrality extends beyond traditional search engines. Sites like Amazon.com and Facebook have also been accused of skewing results: Amazon's search results are influenced by companies that pay for higher placement, while Facebook has filtered its news feed to conduct social experiments.
"Vertical search" spam penalties
In order to find information on the Web, most users make use of search engines, which crawl the web, index it, and show a list of results ordered by relevance. The use of search engines to access information through the web has become a key factor for online businesses, which depend on the flow of users visiting their pages. One of these companies is Foundem. Foundem provides a "vertical search" service that compares products available in online markets in the U.K. Many people see these "vertical search" sites as spam. Beginning in 2006 and for the three and a half years following, Foundem's traffic and business dropped significantly due to what it asserts was a penalty deliberately applied by Google. It is unclear, however, whether the penalty was instead self-inflicted through Foundem's use of iframe HTML tags to embed content from other websites. At the time the penalties were allegedly imposed, it was unclear whether web crawlers would index content embedded via iframe tags without extra modifications. The former SEO director of OMD UK, Jaamit Durrani, among others, offered this alternative explanation, stating that "Two of the major issues that Foundem had in summer was content in iFrames and content requiring javascript to load – both of which I looked at in August, and they were definitely in place. Both are huge barriers to search visibility in my book. They have been fixed somewhere between then and the lifting of the supposed 'penalty'. I don't think that's a coincidence."

Most of Foundem's accusations claim that Google deliberately applies penalties to other vertical search engines because they represent competition. Foundem is backed by a Microsoft proxy group, the "Initiative for Competitive Online Marketplace".
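The iframe issue described above can be illustrated concretely. A naive indexer parsing a page sees only the page's own text plus the iframe's src attribute; the embedded content lives in a separate document that must be fetched on its own. The following sketch uses Python's standard library, and the page markup is hypothetical (not Foundem's actual code):

```python
from html.parser import HTMLParser

# Minimal sketch of why iframe-embedded content can be invisible to a naive
# indexer: the product listings live in a separate document behind the
# iframe's src, so an indexer that never fetches that URL never sees them.
# The markup below is hypothetical, invented for this illustration.

PAGE = """
<html><body>
  <h1>Compare prices</h1>
  <iframe src="https://example.com/product-listings"></iframe>
</body></html>
"""

class TextExtractor(HTMLParser):
    """Collects the page's own visible text and any iframe sources."""

    def __init__(self):
        super().__init__()
        self.text = []
        self.iframe_srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            self.iframe_srcs.append(dict(attrs).get("src"))

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

parser = TextExtractor()
parser.feed(PAGE)
print("Indexable text:", parser.text)            # only the <h1> heading
print("Needs a second fetch:", parser.iframe_srcs)
```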
The Foundem case chronology
The following table details Foundem's chronology of events as found on their website:

| Date | Event |
| --- | --- |
| June 2006 | Foundem's Google search penalty begins. Foundem starts an arduous campaign to have the penalty lifted. |
| August 2006 | Foundem's AdWords penalty begins. Foundem starts an arduous campaign to have the penalty lifted. |
| August 2007 | Teleconference with a Google AdWords Quality Team representative. |
| September 2007 | Foundem is "whitelisted" for AdWords. |
| January 2009 | Foundem starts a "public" campaign to raise awareness of this new breed of penalty and manual whitelisting. |
| April 2009 | First meeting with ICOMP. |
| October 2009 | Teleconference with a Google Search Quality Team representative, beginning a detailed dialogue between Foundem and Google. |
| December 2009 | Foundem is "whitelisted" for Google natural search. |
Other cases
Google's large market share has made the company a target for search neutrality litigation via antitrust laws. In February 2010, Google published an article on the Google Public Policy blog expressing its concern for fair competition after other companies in the UK joined Foundem's cause, also claiming to have been unfairly penalized by Google.

The FTC's investigation into allegations of search bias
After two years of looking into claims that Google "manipulated its search algorithms to harm vertical websites and unfairly promote its own competing vertical properties," the Federal Trade Commission voted unanimously to end the antitrust portion of its investigation without filing a formal complaint against Google. The FTC concluded that Google's "practice of favoring its own content in the presentation of search results" did not violate U.S. antitrust laws. The FTC further determined that even though competitors might be negatively affected by Google's changing algorithms, Google changed its algorithms not to hurt competitors but as product improvements to benefit consumers.

Arguments
There are a number of arguments for and against search neutrality.

Pros
- Those who advocate search neutrality argue that results would be biased not toward sites with more advertising but toward the sites most relevant to the user.
- Search neutrality encourages sites to produce quality content rather than paying to rank higher in organic results.
- It restrains search engines from supporting only their best advertisers.
- Search engines would preserve traffic to sites that depend on visitors by keeping their results comprehensive, impartial, and based solely on relevance.
- It allows for organized, logical ordering of search results by an objective, automatic algorithm, while disallowing underhanded ranking of results on an individual basis.
- Personalized search results might suppress information that disagrees with users' worldviews, isolating them in their own cultural or ideological "filter bubbles".
Cons
- Forcing search engines to treat all websites equally would remove their curated, "biased" view of the Internet, yet a biased view is exactly what users are seeking. By performing a search, the user seeks what that search engine perceives as the "best" result for their query. Enforced search neutrality would, essentially, remove this bias. Users continually return to a specific search engine because they find its "biased" or "subjective" results fit their needs.
- Search neutrality risks making search engines stagnant. If site A is first on a SERP one month and tenth the next, search neutrality advocates cry "foul play," when in reality it is often the page's loss in popularity, relevance, or quality of content that caused the drop. The case against Google brought by the owners of Foundem exemplifies this phenomenon, and regulation could limit a search engine's ability to adjust rankings based on its own metrics.
- Proponents of search neutrality desire transparency in search engines' ranking algorithms. Requiring transparent algorithms raises two concerns. First, these algorithms are the companies' private intellectual property and should not be forced into the open; this would be similar to forcing a soda manufacturer to publish its recipes. Second, opening the algorithm would allow spammers to exploit and directly target how it functions, circumventing the metrics that keep spammed websites from the top of a SERP.
- Removing a search engine's ability to directly manipulate rankings limits its ability to penalize dishonest websites that use black-hat techniques to improve their rankings. Any site that finds a way to circumvent the algorithm would benefit from the search engine's inability to manually decrease its ranking, allowing a spam site to hold a high ranking for extended periods.
Related issues