Recently, over dinner, I was explaining the benefits of geo-fencing to a family friend, Katie, who is a mother of two young kids.
A geo-fence, for the uninitiated, is a virtual boundary surrounding a geographic region. When a person carrying a mobile phone crosses a geo-fence boundary, a notification is automatically issued to that phone. That notification can very well be a relevant, targeted, location-aware ad or a promotional coupon. Note that the consumer needs to opt in and install an app on the mobile phone for this scenario to work.
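The core check behind this mechanism is simple: is the phone's current coordinate inside the fence? As a minimal sketch (the function names, the store coordinates, and the 0.5 km radius are all illustrative assumptions, not any particular vendor's API), a circular geo-fence reduces to a great-circle distance test:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    R = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def inside_geofence(lat, lon, center_lat, center_lon, radius_km):
    """True if (lat, lon) falls within a circular geo-fence."""
    return haversine_km(lat, lon, center_lat, center_lon) <= radius_km

# Hypothetical 0.5 km fence around a store (coordinates are made up)
STORE = (44.9778, -93.2650)
print(inside_geofence(44.9780, -93.2648, *STORE, radius_km=0.5))  # a few meters away: True
print(inside_geofence(45.1000, -93.2650, *STORE, radius_km=0.5))  # kilometers away: False
```

In a real deployment the app would evaluate this test (or, more commonly, hand it off to the platform's geofencing service) as location updates arrive, firing the notification only on a boundary crossing rather than on every fix.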
I was explaining to Katie that if she is willing to opt in, the next time she is shopping at Target for her kids, she will get a relevant ad or coupon for items on her shopping list. All she has to do is agree that her location will be tracked, if and only if, she is within the geo-fence. Outside the geo-fence, she will not be tracked.
In my mind, it was a good trade-off: Katie sacrifices some of her personal data, including location (aka sensitive personal data, or SPD), and gets goods at a discounted price. However, Katie had strong reservations about being tracked. What if they don’t stop tracking outside the geo-fence? What if criminal elements hack into the system and track her all the time, putting her young family at risk? And she brought up examples of recent break-ins into Yahoo email, identity theft, etc.
The same evening, after reaching home, I started reading Atul Gawande’s fascinating essay about health-care reform in the US. In this essay, Atul introduced me to the notion of “wicked problems.” Paraphrasing Atul –
In 1973, two social scientists, Horst Rittel and Melvin Webber, defined a class of problems they called “wicked problems.” Wicked problems are messy, ill-defined, more complex than we fully grasp, and open to multiple interpretations based on one’s point of view. They are problems such as poverty, obesity, where to put a new highway—or how to make sure that people have adequate health care.
They are the opposite of “tame problems,” which can be crisply defined, completely understood, and fixed through technical solutions. Tame problems are not necessarily simple—they include putting a man on the moon or devising a cure for diabetes. They are, however, solvable. Solutions to tame problems either work or they don’t.
Solutions to wicked problems, by contrast, are only better or worse. Trade-offs are unavoidable. Unanticipated complications and benefits are both common. And opportunities to learn by trial and error are limited. You can’t try a new highway over here and over there; you put it where you put it. But new issues will arise. Adjustments will be required. No solution to a wicked problem is ever permanent or wholly satisfying, which leaves every solution open to easy polemical attack.
The question is – do we believe that the privacy issue is a wicked problem or a tame one?
Irfan Khan, CTO of Sybase, in his excellent blog entry argues that –
I believe that consumers are very concerned about their SPD. But I’d argue they’re more concerned about it falling into unintended hands. That is, people who have good relationships with businesses are happy to share SPD with them because the services and goods they receive are improved in the process. But they want those companies to keep a tight lid on that information. They want bullet-proof security of their SPD. When security fails, that’s when their privacy concerns are heightened. If companies could protect their data 100 percent of the time, that is, if they never suffered a security breach, I’d wager privacy issues would all but disappear among the vast majority of consumers.
Or, in other words, technology comes to the rescue and this becomes a solvable, “tame” problem.
Given that a significant portion of security breaches originate inside an organization rather than outside, reaching the 100 percent goal seems like an elusive target. Apart from protecting the data, there are other fundamental issues with how a company uses the data to make business decisions. For instance, if you were to share your private genetic data with your HMO (like Kaiser) with the goal of improving quality of care, and the HMO determines that you are genetically predisposed to a chronic disease (e.g., diabetes), would it increase your medical insurance premium? Perhaps this would be regulated by law. What if this data is shared with your life insurance company? Will it drop your coverage or increase your premium?
In conclusion, the user-privacy (or SPD) issue is messy, ill-defined, and more complex than we can fully grasp. Trade-offs are unavoidable when it comes to solutions. Unanticipated complications and benefits are both common. No solution is ever permanent or wholly satisfying, which leaves every solution open to easy polemical attack. In other words, it’s a wicked problem.