Data Privacy – Need for an Economic Perspective

Privacy and… sheep

The standard scenario considered in most data privacy discussions is one in which data describing people is collected, essentially for free, by an individual (or an organization), who then processes that data to obtain knowledge valuable to them. This knowledge often becomes an asset to which a monetary value can be assigned. Exaggerating only slightly, a parallel can be drawn with the shearing of sheep: the sheep (people) give away, often unknowingly, their woolen fleece (the data describing them). The fleece is then processed by scouring, spinning and weaving (data cleaning, data engineering, model building), and all the participants in the process – except the sheep – reap the profit, either monetary or as value added to their operations. The sheep go on growing more fleece, until the next shearing.

Economics – ABC

This parallel makes us think about approaching data privacy from an economic perspective. We believe that such a view may allow us to better understand privacy as a subject of exchange, and therefore as a good in the economic sense. It will also help us capture the multiple dimensions of data privacy: the value of the data to its owner as well as to the party that wants to use it, the cost (or effort) of attacking the data in a privacy breach, the cost of protecting the data against attacks, etc.

Economics has a long track record of providing quantitative models of exchanges in a social context, models that can be reasoned with to measure and predict specific concepts. As a discipline, economics attempts to provide us with an understanding of the actions of agents in the process of exchanging goods. This exchange is aimed at the consumption of goods in order to satisfy needs. The exchange may involve money or measurable or virtual goods, and may result in delayed or immediate consumption. The detailed description of an exchange is called the organization of a market or, for short, a market. It is important to note that a complex market is defined not only by agreed-upon rules stating the terms and conditions of the exchange process, but also involves consent on social constraints – goals to be achieved and threats perceived as socially undesirable. For our purposes we identify society with the institutions it has evolved to make decisions (state, government, etc.), which we call the regulator.

A data market

We propose to use the concept of a market in analyzing data privacy. With this approach, agents exchange their data (data about themselves that they own, in the sense of consenting to the collection and use of the data describing them for specific purposes) for a “price”. This leads to the concept of a data market, and we believe it can address some aspects of data privacy. Such markets are the force governing the collection and use of the “traces of economic activities, of searches of information, of connections to people and of their spatial movements” (see the Introduction to this issue of the Privacy Observatory Magazine). In our view there is no evidence that the spontaneous rise of such markets will address the societal constraints one wants to observe in data privacy, so we propose to turn to market design for a solution.

Given this market context, the following questions seem critical:

  1. Who are the parties to the transactions, and what are the exchange mechanisms?
  2. What are product characteristics?
  3. How is the product – knowledge acquired from the aggregated data – to be valued?
  4. Could there be a fair and efficient market (regulation)?

We have looked in detail into questions 1–3 in our earlier paper [Matwin, Szapiro 10], which relates them to the existing literature devoted to these issues. Here we summarize our preliminary answers to those questions and then consider question 4.

For the sake of simplicity we consider two types of agents: individual data owners and data aggregators. The two kinds of agents interact in Aggregate Data Privacy Markets (ADPMs). Members of the first group exchange (give up control of) their personal data for a price. Specifically, individuals (agents) trade their data with people or organizations interested in aggregating the data in order to build models from the data aggregates. These models are used in knowledge tools (e.g. recommender systems, marketing profiles, risk profiles, etc.) that add value to the company’s business. We therefore have a free market organization, i.e. the terms and conditions of transactions are left to the parties. In most practical situations the price paid by the aggregators is zero, since data owners perceive the future implications of this exchange as insignificant. A consequence of the zero price is the phenomenon known in economics as free riding: data aggregators obtain a valuable good (information about individual clients needed for model building) for free.
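
The following is a minimal numeric sketch of this imbalance. Every quantity in it (the aggregator’s value per record, the owners’ perceived and later-realized privacy costs) is an assumption made purely for illustration, not an estimate: at a zero price the aggregator captures the entire surplus, while the owners end up, after the fact, with a negative one.

    # Toy simulation of a zero-price ADPM; every number here is an
    # illustrative assumption, not a measurement.
    import random

    random.seed(0)
    N = 1000                  # individual data owners
    value_per_record = 0.50   # aggregator's (assumed) value of one record
    price = 0.0               # price actually paid to owners

    # At transaction time, owners perceive the future privacy cost as ~0,
    # so at price 0 every owner still hands over her data.
    perceived_cost = [0.0] * N
    # The cost realized later (spam, higher premiums) is positive.
    realized_cost = [random.uniform(0.0, 0.2) for _ in range(N)]

    sellers = [i for i in range(N) if price >= perceived_cost[i]]
    aggregator_surplus = len(sellers) * (value_per_record - price)
    owner_surplus = sum(price - realized_cost[i] for i in sellers)

    print(f"records collected:  {len(sellers)}")            # 1000
    print(f"aggregator surplus: {aggregator_surplus:.2f}")  # 500.00: free riding
    print(f"owners' surplus:    {owner_surplus:.2f}")       # ~ -100: ex-post privacy deficit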

Data need more than pure market forces

In the data privacy context, another undesirable consequence of the market is the potential social exclusion that may result from using aggregated data. In extreme situations such exclusion amounts to discrimination. Profiles obtained from the data may be used to exclude society members from services (e.g. health insurance) or opportunities (credit, jobs) – often implicitly and incorrectly. Lack of acceptance of social exclusion is an example of a social norm that is advanced not only by the regulator, but also by individuals. A regulator will usually not accept mechanisms leading to social exclusion.

This market perspective on data privacy immediately faces a serious difficulty – a deficit of satisfaction may not be perceived by consumers during the transaction, but only later. An agent may freely give her data for a marketing study, only to be bombarded by product offers. Or she may voluntarily participate in a medical study, only to be ranked as a high-risk individual to whom higher insurance premiums apply. This may create a deficit of privacy, perceived after the data “exchange”. From an economic point of view, the deficit of privacy thus creates a need to heal this pain, and actions that meet that need are considered. Each action has an intrinsic value, and its utility results from computing the resulting profit (i.e. how far it satisfies the need to decrease the privacy deficit). In this framework decisions result from personal and societal preferences (values), which can be strengthened by material or other explicit incentives (laws) and by social sanctions or rewards (norms). Norms and incentives can strengthen or undermine each other. The preferences of agents are defined by their utilities, known only to them.

Theoretically, in ADPMs we assume that individuals maximize a utility involving the evaluation of their own privacy, the profit from incentives to give it up, the consequences and costs of the actions involved in giving or refusing their data, etc. Practically, it may turn out that conveying true private information is not rational for an agent – it leads to a lower exchange outcome (and thus a lower level of satisfaction of needs). This means that the resulting allocation of goods may not be socially optimal, i.e. it does not maximize the joint (summed) utility of all agents, and another allocation could be better. In ADPMs we face exactly this situation – data aggregators are not interested in informing data owners of their potential privacy deficits and the ensuing risks. This results in socially non-optimal allocations (e.g. people without medical insurance). Furthermore, there is an asymmetry of information: just as the sheep are not aware of the value of their wool, the individual agents are usually not aware of the value of their data to the aggregators, who, however, are well aware of the value of the aggregated data. The existing market is therefore deficient, and there is a need to decrease both the information asymmetry and the free riding on ADPMs.
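
To make the non-optimality concrete, here is a minimal sketch in the same toy setting as above (all numbers again assumed for illustration): when owners know their true privacy costs, only value-creating records are traded; under information asymmetry every record is traded, and the summed utility of all agents is strictly lower.

    # Toy welfare comparison under illustrative assumptions.
    import random

    random.seed(1)
    N = 1000
    v = 0.5                                              # value per record
    cost = [random.uniform(0.0, 2.0) for _ in range(N)]  # true privacy costs

    # Socially optimal allocation: a record is collected only when the
    # aggregator's value exceeds the owner's true privacy cost.
    welfare_informed = sum(v - c for c in cost if v > c)

    # Under information asymmetry owners perceive their cost as ~0,
    # so every record is collected regardless of its true cost.
    welfare_asymmetric = sum(v - c for c in cost)

    print(f"summed utility, informed owners:   {welfare_informed:.1f}")    # ~ +62
    print(f"summed utility, asymmetric market: {welfare_asymmetric:.1f}")  # ~ -500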

Early ideas on how to “fix” the data market

In the context of ADPMs, socially desirable behaviors can be achieved by rewarding/punishing data aggregators for revealing/hiding information regarding the future use of data. One extreme proposal suggested the involvement of state-run financial institutions in enforcing agreements between data owners and data aggregators on future profits from the sale of aggregated data [Laudon 96]. In an economic approach, the motivating rewards and punishments would be material. From a psychological perspective, incentives result from shaping collective identity or social perceptions of correct behavior. A legal view would price violations of regulations according to their economic and social valuation.

Solutions extending classical pure market mechanisms have been proposed in the economic literature (e.g. the recent work of Bénabou and Tirole [Bénabou and Tirole 11]). While they do not venture into the area of privacy, we believe that their proposal to introduce norms into markets may be interesting for building “improved” data privacy markets. They show how building norms and incentive mechanisms, taking into account the different behaviors of market participants, may lead to optimized markets. The creation of these mechanisms is delegated to the regulator, who defines incentives – used in the rules by which individuals compare actions – and thereby creates a framework for individual rationality. Optimality of the mechanism is achieved by choosing incentives such that the maximization of individual utilities also maximizes the aggregate outcome. Bénabou and Tirole provide a stylized analytical model involving formal and informal interactions, and allowing a quantitative analysis that leads to interpretable conclusions.
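
The sketch below is emphatically not Bénabou and Tirole’s model; it is a deliberately crude stand-in, continuing the toy ADPM above, with an assumed disclosure cost and a hypothetical fine for hiding future uses of the data. The regulator grid-searches over the fine, the aggregator best-responds to each candidate, and the chosen fine is the one at which the aggregator’s self-interested choice (to disclose) also maximizes the summed utility.

    # Toy regulator sketch (illustrative assumptions throughout; this is
    # NOT Bénabou and Tirole's actual model).
    import random

    random.seed(2)
    N = 1000
    v = 0.5
    cost = [random.uniform(0.0, 2.0) for _ in range(N)]
    disclosure_cost = 20.0   # assumed cost of revealing future uses of the data

    def aggregator_discloses(fine):
        """Aggregator best-responds: disclose iff it is more profitable."""
        n_opt_in = sum(1 for c in cost if c < v)   # informed owners who still sell
        profit_disclose = v * n_opt_in - disclosure_cost
        profit_hide = v * N - fine                 # hiding is caught and fined
        return profit_disclose > profit_hide

    def welfare(fine):
        """Summed utility of all agents; the fine is a transfer and cancels."""
        if aggregator_discloses(fine):
            return sum(v - c for c in cost if c < v) - disclosure_cost
        return sum(v - c for c in cost)

    fines = range(0, 1001, 25)
    best_fine = max(fines, key=welfare)   # first maximizer = smallest such fine
    print(f"welfare-maximizing fine: {best_fine}")
    print(f"aggregator discloses:    {aggregator_discloses(best_fine)}")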

To summarize, we advocate an economic model of data privacy. We believe that market mechanisms can be used to understand (and improve) the exchanges involved in situations where data privacy concerns arise. We recognize the limitations of the “pure” market approach and propose to address them by enhancing the market model with social norms and incentive mechanisms. This may lead to the fair participation of the sheep in the benefits their wool brings to society.

References

  • [Bénabou and Tirole 11] Bénabou, R., Tirole, J., “Laws and Norms”, http://econ.as.nyu.edu/docs/IO/16878/Benabou_20101019.pdf
  • [Laudon 96] Laudon, K., “Markets and Privacy”, Communications of the ACM 39(9), pp. 92–104, 1996.
  • [Matwin, Szapiro 10] Matwin, S., Szapiro, T., “Data Privacy: From Technology to Economics”, Springer Studies in Computational Intelligence, vol. 263, pp. 43–74, 2010.