Chapter 4: Web Search
Search neutrality is the principle that search engines should have no editorial policies other than that their results be comprehensive, impartial, and based solely on relevance. When a user queries a search engine, the engine should return the most relevant results found in the provider’s domain (those sites the engine has knowledge of), without reordering the results (except to rank them by relevance), excluding results, or otherwise biasing what is returned.
Search neutrality is related to network neutrality in that both aim to keep any one organization from limiting or altering a user’s access to services on the Internet. Search neutrality aims to keep a search engine’s organic results (results returned because of their relevance to the search terms, as opposed to results sponsored by advertising) free from manipulation, while network neutrality aims to keep those who provide and govern access to the Internet from limiting the availability of resources needed to reach any given content.
Search neutrality became a concern after search engines, most notably Google, were accused of search bias. Competitors and other companies claim that search engines systematically favor some sites (and some kinds of sites) over others in their results, distorting the objective results users believe they are getting. The call for search neutrality goes beyond traditional search engines: sites like Amazon.com and Facebook have also been accused of skewing results. Amazon’s results are influenced by companies that pay for higher placement, while Facebook has filtered its news feed to conduct social experiments.
“Vertical search” spam penalties
To find information on the Web, most users rely on search engines, which crawl the web, index it, and return a list of results ordered by relevance. The flow of users arriving through search has become a key factor for online businesses, which depend on visitors reaching their pages. One such company is Foundem, which provides a “vertical search” service comparing products available on online markets in the U.K. Many people see such “vertical search” sites as spam. Beginning in 2006, and for the three and a half years that followed, Foundem’s traffic and business dropped significantly because of a “penalty” applied by Google. Adam Raff, co-founder of Foundem, coined the term “search neutrality” in a December 2009 op-ed in The New York Times, published after Google removed the penalty. In the same year, Foundem launched SearchNeutrality.org, a website dedicated to promoting investigations against Google. Most of Foundem’s accusations claim that Google penalizes other vertical search engines simply because they are competitors. Foundem is backed by a Microsoft proxy group, the ‘Initiative for Competitive Online Marketplace’.
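The crawl–index–rank pipeline described above can be sketched with a toy in-memory corpus and a simple TF-IDF relevance score. This is a minimal illustration, not how any real engine works: the URLs, page text, and query below are invented, and production engines combine hundreds of signals beyond term frequency.

```python
import math
from collections import Counter, defaultdict

# Toy corpus standing in for crawled pages (URLs and text are illustrative).
pages = {
    "shop-a.example/kettles": "electric kettle price comparison best kettle deals",
    "shop-b.example/toasters": "toaster reviews and toaster price comparison",
    "blog.example/tea": "how to brew tea with an electric kettle",
}

# Index step: build an inverted index mapping each term to the
# documents containing it, with per-document term frequencies.
index = defaultdict(dict)
for url, text in pages.items():
    for term, tf in Counter(text.split()).items():
        index[term][url] = tf

def search(query):
    """Rank pages by a simple TF-IDF relevance score for the query terms."""
    n_docs = len(pages)
    scores = defaultdict(float)
    for term in query.split():
        postings = index.get(term, {})
        if not postings:
            continue
        idf = math.log(n_docs / len(postings))  # rarer terms weigh more
        for url, tf in postings.items():
            scores[url] += tf * idf
    # "Neutral" ordering: by relevance score alone, ties broken alphabetically.
    return sorted(scores, key=lambda u: (-scores[u], u))

print(search("electric kettle"))
```

A “neutral” engine in the sense discussed in this chapter would order results by such a relevance score alone; the disputes described below arise when results are reordered, demoted, or excluded on other grounds.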
There are a number of arguments for and against search neutrality.
- Those who advocate search neutrality argue that the results would not be biased towards sites with more advertising, but towards sites most relevant to the user.
- Search neutrality encourages sites to publish higher-quality content rather than pay to rank higher in organic results.
- It restrains search engines from supporting only their best advertisers.
- Search engines would allow traffic to sites that depend on visitors, keeping their results comprehensive, impartial, and based solely on relevance.
- Allows for organized, logical ordering of search results by an objective, automatic algorithm, while disallowing underhanded ranking of results on an individual basis.
- Personalized search results might suppress information that disagrees with users’ worldviews, isolating them in their own cultural or ideological “filter bubbles”.
- Forcing search engines to treat all websites equally would remove the “biased” view of the Internet that users are actually seeking: by performing a search, the user asks for what that particular engine perceives as the “best” results for the query. Enforced search neutrality would, essentially, remove this bias. Users keep returning to a specific search engine precisely because its “biased” or “subjective” results fit their needs.
- Search neutrality could cause search engines to become stagnant. If site A is first on a SERP (search engine results page) one month and tenth the next, search neutrality advocates cry “foul play,” when in reality it is often the page’s loss of popularity, relevance, or quality content that caused the move. The case against Google brought by the owners of Foundem exemplifies this phenomenon, and regulation could limit a search engine’s ability to adjust rankings based on its own metrics.
- Proponents of search neutrality desire transparency in search engines’ ranking algorithms. Requiring transparent algorithms raises two concerns. First, these algorithms are the companies’ private intellectual property and should not be forced into the open; this would be similar to forcing a soda manufacturer to publish its recipes. Second, opening up an algorithm would allow spammers to exploit and directly target how it functions, circumventing the metrics that keep spammed websites off the top of a SERP.
- Removing a search engine’s ability to directly manipulate rankings limits its ability to penalize dishonest websites that use black-hat techniques to improve their rankings. Any site that found a way to circumvent the algorithm would benefit from the search engine’s inability to manually decrease its ranking, allowing a spam site to hold a high ranking for extended periods of time.