Based on a study published last week by Google researchers, it may soon be possible to verify the truth of a "fact" or the "trustworthiness" of a website with a simple Internet search. In the research paper, a group of computer scientists proposes a method for ranking search results not by reputation (essentially, how many other pages link to a web page) but by scoring the factual accuracy of the pages themselves.
As authors Xin Luna Dong, Evgeniy Gabrilovich, Kevin Murphy, Van Dang, Wilko Horn, Camillo Lugaresi, Shaohua Sun and Wei Zhang note in the abstract of their paper: "The quality of web sources has been traditionally evaluated using exogenous signals such as the hyperlink structure of the graph. We propose a new approach that relies on endogenous signals, namely, the correctness of factual information provided by the source. A source that has few false facts is considered to be trustworthy."
More on Google’s new truth-verifying technology
The new software uses Google’s Knowledge Vault, a huge database of verified facts that Google has created. In this system, facts that the web unanimously agrees on are treated as a reasonable proxy for truth. Pages that contain contradictory information, on the other hand, are bumped down the rankings: the more “untruths” a page contains, the lower its ranking.
Google gets the verified data from services such as Freebase, Wikipedia and the CIA World Factbook, and then creates “knowledge triples” that consist of a subject, a relationship, and the attribute (the value of the fact).
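To make the idea concrete, a knowledge triple can be pictured as a simple three-part record. The sketch below is purely illustrative; the field names and the example fact are assumptions for explanation, not taken from Google's actual data model:

```python
from typing import NamedTuple

class KnowledgeTriple(NamedTuple):
    """An illustrative (subject, relationship, attribute) fact record."""
    subject: str       # the entity the fact is about
    relationship: str  # the kind of fact being stated
    attribute: str     # the value of the fact

# A hypothetical example fact: "Paris is the capital of France"
triple = KnowledgeTriple("Paris", "capital_of", "France")
print(triple.subject, triple.relationship, triple.attribute)
```

Because triples are small, uniform records, millions of them can be stored in a set or database and compared very cheaply, which is what makes fact-checking at web scale plausible.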
So in order to verify a fact, all Google has to do is check it against the knowledge triples in the Knowledge Vault. To assess the accuracy of a web page or website, the algorithm extracts the site’s knowledge triples and determines how many of them disagree with the Knowledge Vault’s facts.
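That checking step can be sketched in a few lines. This is a minimal toy illustration of the idea described above, not Google's actual algorithm (the paper's real model handles extraction noise probabilistically); the function name, sample vault, and sample page are all assumed for the example:

```python
def trust_score(page_triples, knowledge_vault):
    """Toy trust score: the fraction of a page's triples found in the vault.

    page_triples: list of (subject, relationship, attribute) tuples
    knowledge_vault: set of (subject, relationship, attribute) tuples
    """
    if not page_triples:
        return 0.0
    matches = sum(1 for t in page_triples if t in knowledge_vault)
    return matches / len(page_triples)

# Hypothetical vault of verified facts
vault = {
    ("Paris", "capital_of", "France"),
    ("Barack Obama", "nationality", "USA"),
}

# Triples extracted from a hypothetical web page; one contradicts the vault
page = [
    ("Paris", "capital_of", "France"),
    ("Paris", "capital_of", "Germany"),
]

print(trust_score(page, vault))  # → 0.5
```

A page whose triples all agree with the vault scores 1.0; the more contradictions it contains, the lower the score, mirroring the ranking demotion the article describes.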
Of note, there are already quite a few apps available to help web surfers find out the truth. For example, LazyTruth is a browser extension that skims inboxes to identify known fake or hoax emails. Emergent (from the Tow Center for Digital Journalism at Columbia University) analyzes rumors from various sites, then verifies or debunks them by cross-referencing other sources.
Ranking websites by accuracy would be a major change
This paper hints that Google’s search ranking algorithm could eventually include accuracy among the factors used to order results. That would be a major development, given that another study using a random sampling of pages found that only 20 of 85 factually correct sites were ranked highly under Google’s current search method. Adding an accuracy or trustworthiness factor to its search engine would mean more reliable and accurate information for the millions who use Google search every day.