I study what influences and motivates individual behavior.
I am interested in questions related to privacy, fairness, and information markets that have implications for managers and policymakers.
In general, I'm inspired by experiments that uncover how people's beliefs and preferences influence their decisions.
People are often unaware that their personal data can serve as valuable inputs for economic activities in secondary data markets. However, whether the secondary monetization of personal data shapes privacy preferences remains unclear. I examine whether privacy decisions are motivated by the data recipient's ability to benefit from trading individuals' data with a third party. I implement a large, online laboratory experiment involving personally identifiable psychometric data, with real data-sharing consequences and monetary benefits. I find that individuals decrease their willingness to share data---both in terms of their likelihood of participating in the data market and the prices demanded for such participation---when the recipient's ability to monetize the data through secondary trade is salient. The chosen price elicitation rules out strategic responses to updated beliefs about the recipient's gain from the trade. I find that increased data exposure (to more recipients) does not explain the significant, revealed disutility from secondary monetization. These findings are also robust to controlling for differences in risk exposure between data recipients and third parties.
This study explores the value-enhancing effects of the psychological ownership of information by studying a genre of rule-based algorithms: a type of information good containing a finite-step, computer-implementable procedure to solve a well-defined problem. I study how the implicit values of algorithms elicited from the perspective of an algorithm provider differ from the values revealed from the perspective of a user. Whether a person is a developer, provider, or user is assigned in a controlled experiment containing a real-consequence algorithm market. To support predictions about the value-enhancing effects of psychological ownership over information goods, I provide a conceptual framework for how the ownership individuals feel toward an algorithm is either strengthened or weakened by their role in an algorithm licensing decision. I find empirical evidence that both developers and non-developers assigned to be providers of algorithms reveal higher reservation prices for licensing their algorithms than potential users are willing to pay. When non-developer providers and users are provided with metainformation about an algorithm's likelihood of success, valuation gaps do not appear among algorithms with at least a 50 percent chance of success; however, valuation gaps remain among lower-quality algorithms.
with Marcel Preuss, Germán Reyes, and Jason Somerville
This paper examines how people redistribute income when there is uncertainty about the role luck plays in determining opportunities and outcomes. We introduce a portable experimental method that generates exogenous variation in the probability that real workers' earnings are due to luck, while varying whether luck interacts with effort in the earning process. We then elicit redistribution decisions from a nationally representative U.S. sample who observe worker outcomes and whether luck magnified workers' effort (``lucky opportunities'') or determined workers' income directly (``lucky outcomes''). We find that participants redistribute less and are less reactive to changes in the importance of luck in environments with lucky opportunities. We show that individuals rely on a simple heuristic when assessing the impact of unequal opportunities, which leads them to underappreciate the extent to which small differences in opportunities can have a large impact on outcomes. Our findings have implications for models that seek to understand and predict attitudes toward redistribution, while helping to explain the gap between lab evidence on preferences for redistribution and real-world inequality trends.
A common conclusion among researchers is that there is a "privacy paradox" between individuals' privacy attitudes and privacy behavior.
Whether general beliefs and normative stances about privacy regimes are related to real data-sharing activities (i.e., one's value of data privacy) remains an under-explored question,
especially because the methods used to elicit these two preference measurements are often only distantly related in practice.
The descriptive results of this online survey show that attitudes about data security, control rights, and data sharing can be aligned with individuals' data-sharing activities.
Users who exhibited weak privacy behavior also displayed relatively weaker attitudes (and vice versa) in a setting where stated attitudes are elicited after both the provision of information about data-sharing consequences and actual privacy decisions.
Moreover, individuals who were privacy-seeking due to second parties’ data exploitation activities were less likely to state an expectation that these activities occur when businesses collect their personal data.
These results suggest that
(1) concern about data privacy,
(2) demand for control rights and restrictions on the free movement of data, and
(3) awareness about data exploitation
can align with a person's actual privacy behavior.
with Avinash Collis and Ananya Sen
Data brokers sell access to and usage of the personal information they harvest about individuals and facilitate the exchange of personal data utilized throughout the digital economy. However, little is known about people's valuation, perceptions, and demand for their privacy from data brokers. We conducted a survey experiment with 4,000 U.S.-representative individuals and provided information about how brokers harvest user data from two broad categories of data transfers people participate in: those with government agencies and those with commercial entities. We gathered individuals' beliefs about data exposure and their revealed preferences for privacy from brokers. Perceptions about exposure to brokers shift in a direction that suggests Americans largely underestimate their data exposure---particularly the size of the broker market that harvests government records---and these beliefs are malleable to information interventions. Despite shifting their beliefs, individuals' willingness to pay to delete data from brokers is unaffected by our interventions at the aggregate level. However, we find significant variation in how effective information interventions are at influencing valuations for privacy based on individuals' characteristics, such as whether they have made prior data-deletion requests and whether they have filed a change-of-address request with USPS. These results highlight how individuals can be systematically underinformed about how much privacy they have, yet their value of privacy preservation depends on their idiosyncratic, instrumental values of privacy.