European court: Google must yield on personal info
Google and other search engines were thrust into an unwanted new role yesterday – caretaker of people’s reputations – when Europe’s highest court ruled that individuals should have some say over what information pops up when their names are Googled.
The landmark ruling by the Court of Justice of the European Union will force search engines to decide when to censor computer users’ search results across the 28-nation bloc of more than 500 million people.
The decision – which cannot be appealed – was celebrated by some as a victory for privacy rights in the internet age. Others warned it could lead to online censorship.
The ruling applies to EU citizens and all search engines in Europe, including Yahoo and Microsoft’s Bing.
It has no immediate impact on the way Google and other search engines display their results in the United States or other countries outside Europe.
But it could create logistical headaches for such companies by forcing them to make judgment calls about the fairness of information published on other websites.
In its ruling, the EU court said search engines must listen and sometimes comply when people ask for the removal of links to newspaper articles or other sites containing outdated or otherwise objectionable information.
Google Inc. has long maintained that people with such complaints should take it up with the websites that posted the material.
“This is a disappointing ruling for search engines and online publishers in general,” the Mountain View, Calif., company said in a statement.
Though Europe is one of Google’s biggest markets, the decision isn’t expected to have much effect on the company’s earnings. That’s because it has no direct bearing on the online ads that Google places alongside its search results.
Investors evidently weren’t worried. Google’s most widely traded class of stock gained $3.11 to close at $541.54 yesterday.
It’s unclear exactly how the European court envisions Google and others handling complaints.
Google, though, has dealt with similar situations in the past.
The company already censors some of its search results in several countries to comply with local laws. For instance, Google and other search engines are banned from displaying links to Nazi paraphernalia and certain hate speech in Germany and France.
The company also has set up a process so people can have their images blurred if they appear in Google’s street-level photographic maps.
What Google and other search engines have sought to avoid is acting as the arbiters of what kind of information to include in their searches.
These companies rely on mathematical formulas, or algorithms, and automated “crawlers” that roam the internet indexing web pages, so that results can be returned automatically in response to search requests.
“There’s not much guidance for Google on how to figure out how and when they are supposed to comply with take-down requests – they just know they have to weigh the public interest,” said Joel Reidenberg, a Fordham University law professor now visiting Princeton University.
The case was referred to the European court by Spain’s National Court, which asked for advice in the case of Mario Costeja, a Spaniard who found that a search on his name turned up links to a notice that his property was due to be auctioned because of an unpaid welfare debt. The notice had been published in a Spanish newspaper in 1998 and was picked up by Google’s crawlers when the newspaper digitized its archive.
Costeja argued that the debt had long since been settled, and he asked the Spanish privacy agency to have the reference removed. In 2010, the agency agreed, but Google refused and took the matter to court, saying it should not be asked to censor material that had been legally published by the newspaper.
“It’s a great relief to be shown that you were right when you have fought for your ideas. It’s a joy,” Costeja said.
He said that “ordinary people will know where they have to go” to complain about bad or old information that turns up on a Google search.
Costeja’s case will now return to Spain for final judgment. There are about 200 others in the Spanish court system, some of which may still prove difficult to decide. For instance, one involves a plastic surgeon who wants mentions of a botched operation removed from Google’s results.
Debates over the “right to be forgotten” – to have negative information erased after a period of time – have surfaced across the world as tech users struggle to reconcile the forgive-and-forget nature of human relations with the unforgiving permanence of the internet.
Though the idea of such a right has generally been well-received in Europe, many in the U.S. have criticized it as a disguised form of censorship that could, for example, allow ex-convicts to delete references to their crimes or politicians to airbrush their records.
Alejandro Tourino, a Spanish lawyer who specializes in mass media issues, said the ruling was a first of its kind and “quite a blow for Google.”
“It is a most important ruling and the first time European authorities have ruled on the ‘right to be forgotten,’ ” said Tourino, who has worked for the Associated Press in several legal cases and is the author of The Right to be Forgotten and Privacy on the Internet.
Some limited forms of a “right to be forgotten” exist in the U.S. and elsewhere – for example, records of crimes committed by minors and certain bankruptcy filings are usually expunged or sealed in some way. In those cases, however, the burden falls on the publisher of the information, usually a government agency – not on search engines.
Viviane Reding, the EU’s top justice official, said in a Facebook posting that the ruling confirmed that “data belongs to the individual” and that unless there is a good reason to retain data, “an individual should be empowered by law to request erasure.”
However, Javier Ruiz, policy director at Open Rights Group, a British-based organization, cautioned that authorities have to be careful in how they move forward.
“We need to take into account individuals’ right to privacy,” he said. “But if search engines are forced to remove links to legitimate content that is already in the public domain . . . it could lead to online censorship.”