TY - JOUR
A1 - Leng, Yan
A1 - Dimmery, Drew
T1 - Calibration of Heterogeneous Treatment Effects in Randomized Experiments
JF - Information Systems Research
N2 - Machine learning is commonly used to estimate the heterogeneous treatment effects (HTEs) in randomized experiments. Using large-scale randomized experiments on the Facebook and Criteo platforms, we observe substantial discrepancies between machine learning-based treatment effect estimates and difference-in-means estimates directly from the randomized experiment. This paper provides a two-step framework for practitioners and researchers to diagnose and rectify this discrepancy. We first introduce a diagnostic tool to assess whether bias exists in the model-based estimates from machine learning. If bias exists, we then offer a model-agnostic method to calibrate any HTE estimates to known, unbiased, subgroup difference-in-means estimates, ensuring that the sign and magnitude of the subgroup estimates approximate the model-free benchmarks. This calibration method requires no additional data and can be scaled for large data sets. To highlight potential sources of bias, we theoretically show that this bias can result from regularization and further use synthetic simulation to show biases result from misspecification and high-dimensional features. We demonstrate the efficacy of our calibration method using extensive synthetic simulations and two real-world randomized experiments. We further demonstrate the practical value of this calibration in three typical policy-making settings: a prescriptive, budget-constrained optimization framework; a setting seeking to maximize multiple performance indicators; and a multitreatment uplift modeling setting.
Y1 - 2024
U6 - https://doi.org/10.1287/isre.2021.0343
ER -
TY - JOUR
A1 - Allcott, Hunt
A1 - Gentzkow, Matthew
A1 - Mason, Winter
A1 - Wilkins, Arjun
A1 - Barberá, Pablo
A1 - Brown, Taylor
A1 - Cisneros, Juan Carlos
A1 - Crespo-Tenorio, Adriana
A1 - Dimmery, Drew
A1 - Freelon, Deen
A1 - González-Bailón, Sandra
A1 - Guess, Andrew M.
A1 - Kim, Young Mie
A1 - Lazer, David
A1 - Malhotra, Neil
A1 - Moehler, Devra
A1 - Nair-Desai, Sameer
A1 - Nait El Barj, Houda
A1 - Nyhan, Brendan
A1 - Paixao de Queiroz, Ana Carolina
A1 - Pan, Jennifer
A1 - Settle, Jaime
A1 - Thorson, Emily
A1 - Tromble, Rebekah
A1 - Velasco Rivera, Carlos
A1 - Wittenbrink, Benjamin
A1 - Wojcieszak, Magdalena
A1 - Zahedian, Saam
A1 - Franco, Annie
A1 - Kiewiet de Jonge, Chad
A1 - Stroud, Natalie Jomini
A1 - Tucker, Joshua A.
T1 - The effects of Facebook and Instagram on the 2020 election: A deactivation experiment
JF - Proceedings of the National Academy of Sciences
N2 - We study the effect of Facebook and Instagram access on political beliefs, attitudes, and behavior by randomizing a subset of 19,857 Facebook users and 15,585 Instagram users to deactivate their accounts for 6 wk before the 2020 U.S. election. We report four key findings. First, both Facebook and Instagram deactivation reduced an index of political participation (driven mainly by reduced participation online). Second, Facebook deactivation had no significant effect on an index of knowledge, but secondary analyses suggest that it reduced knowledge of general news while possibly also decreasing belief in misinformation circulating online. Third, Facebook deactivation may have reduced self-reported net votes for Trump, though this effect does not meet our preregistered significance threshold. Finally, the effects of both Facebook and Instagram deactivation on affective and issue polarization, perceived legitimacy of the election, candidate favorability, and voter turnout were all precisely estimated and close to zero.
Y1 - 2024
U6 - https://doi.org/10.1073/pnas.2321584121
VL - 121
IS - 21
ER -