

Should Google be responsible for what people do with its maps?

If I had to explain this column in terms of an academic paper, I would call it “a meditation on the moral consequences of information technology.”

I like to think of myself as spanning the border between science fiction and science fact. The best science fiction explores the moral consequences of technology that we don’t have yet.

How would we act if we could spy on our neighbors any time we want? What would society look like if we could manipulate the genetic heritage of our children?

Now imagine yourself living in a science-fiction universe where you could get the answer to any question by typing it into a machine or view historical maps of the world on a perfect virtual globe.

That science-fiction concept became fact in 2009, and it has created moral problems that even Ray Bradbury couldn’t have predicted.

On Monday, we published an Associated Press story by Jay Alabaster. Google Earth has recently added historical maps of Japan to its library, and this information is creating a unique moral problem for citizens of modern-day Japan.

To American ears, the availability of historical maps sounds like a dry academic issue, the kind of news you skip over unless you’re a historian or a geography professor.

But Google is facing angry accusations of prejudice and responding to inquiries from the Justice Ministry because the publication of these ancient maps is enabling discrimination in the present.

Alabaster’s article explains: “The maps date back to the country’s feudal era, when shoguns ruled and a strict caste system was in place. At the bottom of the hierarchy were a class called the ‘burakumin,’ ethnically identical to other Japanese but forced to live in isolation because they did jobs associated with death, such as working with leather, butchering animals and digging graves.”

The caste system is long dead, but people who live in these areas still face discrimination based on where they live or where their ancestors lived.

Alabaster quotes an anonymous source who works for a large, well-known Japanese company: “If we suspect that an applicant is a burakumin, we always do a background check to find out.”

Proximity to these areas has been known to reduce property values, and residents have been the targets of graffiti and racial taunts.

This information has been circulating on bulletin boards for a while now, but Google’s maps have revealed new blocks of “dirty” addresses and made geographic discrimination easier for anyone who cares to look – a textbook case of a moral dilemma created by information technology.

If someone uses the information provided by Google to harass, demean or harm a person from one of these areas, is Google responsible for the crime?

This information has been available for decades, but you had to visit libraries and do tedious academic work to use it. It’s not the existence of information but the improved presentation of information that has created this problem.

Google is making it easier for bad people to do bad things, but is that really Google’s fault? I think of Google as a kind of common carrier. If criminals use telephone lines to harass people and plan crimes, can you really blame the phone company?

If thieves get blueprints from the library and use them to plan a theft, is it fair to blame the library?

I like to use gun control analogies here. A gun is a weapon, but it can’t fire itself. Google makes information available, but it can’t control how people use it.

Maybe it’s a privacy issue? Individuals can remove themselves from phone directories and put themselves on do-not-call lists. But how do you hide yourself from something as general as a map?

As of this writing, Google has pulled the offending maps from its software and replaced them with versions that blank out data from the sensitive neighborhoods, but I’m not convinced the company did the right thing.

Information is a resource. It can’t be “good” or “evil” by itself; no harm is done until a person acts on it. I think we need to hold individuals responsible for their actions and protect the providers who make these tools available.

Written by Michael B. Duff

May 8, 2009 at 18:20

Posted in Columns

Tagged with Google