Findings

What Can You Do?

Kevin Lewis

December 30, 2024

Why Has Construction Productivity Stagnated? The Role of Land-Use Regulation
Leonardo D'Amico et al.
NBER Working Paper, November 2024

Abstract:
We document a Kuznets curve for construction productivity in 20th-century America. Homes built per construction worker remained stagnant between 1900 and 1940, boomed after World War II, and then plummeted after 1970. The productivity boom from 1940 to 1970 shows that nothing makes technological progress inherently impossible in construction. What stopped it? We present a model in which local land-use controls limit the size of building projects. This constraint reduces the equilibrium size of construction companies, reducing both scale economies and incentives to invest in innovation. Our model shows that, in a competitive industry, such inefficient reductions in firm size and technology investment are a distinctive consequence of restrictive project regulation, while classic regulatory barriers to entry increase firm size. The model is consistent with an extensive series of key facts about the nature of the construction sector. The post-1970 productivity decline coincides with increases in our best proxies for land-use regulation. The size of development projects is small today and has declined over time. The size of construction firms is also quite small, especially relative to other goods-producing firms, and smaller builders are less productive. Areas with stricter land-use regulation have particularly small and unproductive construction establishments. Patenting activity in construction stagnated and diverged from other sectors. A back-of-the-envelope calculation indicates that, if half of the observed link between establishment size and productivity is causal, America’s residential construction firms would be approximately 60 percent more productive if their size distribution matched that of manufacturing.
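The back-of-the-envelope logic is easy to trace. A minimal sketch (not the authors' calculation; the elasticity and size gap below are illustrative placeholders, not the paper's estimates): scale half of an assumed size-productivity elasticity by the log gap in establishment size between manufacturing and construction.

```python
# Illustrative back-of-the-envelope: implied productivity gain if
# construction's size distribution matched manufacturing's. The
# elasticity (beta) and size gap are hypothetical, NOT the paper's.
import math

beta = 0.31               # hypothetical elasticity of productivity w.r.t. size
size_gap = math.log(20)   # hypothetical: manufacturing plants ~20x larger
causal_share = 0.5        # abstract's assumption: half the link is causal

gain = math.exp(causal_share * beta * size_gap) - 1
print(f"implied productivity gain: {gain:.0%}")  # ~59% with these inputs
```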


Market versus policy responses to novel occupational risks
Robert Cramer, Elissa Philip Gentry & Kip Viscusi
Journal of Empirical Legal Studies, December 2024, Pages 716-756

Abstract:
The unprecedented occupational risks posed by the COVID-19 pandemic prompted employers to boost wages and federal authorities to propose hazard pay policies. This article estimates a market-based compensating differential for workers facing elevated risks through contact with the public using CPS employment data for 2019–2020 and occupational characteristic data from the US Department of Labor's Occupational Information Network. The estimated premium for exposure was roughly $820 overall and $1000 for essential workers. These premiums fall short of those proposed -- but not enacted -- by the federal government and are more commensurate with estimates of the value of a statistical life than were the federal proposals.
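The comparison to the value of a statistical life (VSL) follows from the standard compensating-differential identity: implied VSL = annual wage premium / annual excess fatality risk. A minimal sketch, with a hypothetical risk increment (the abstract reports only the premium):

```python
# Mapping a compensating differential into an implied VSL. The excess
# risk figure is hypothetical; only the ~$820 premium is from the paper.
premium = 820                # estimated annual premium for public contact
excess_risk = 0.8 / 10_000   # hypothetical excess annual fatality risk
print(f"implied VSL: ${premium / excess_risk / 1e6:.1f} million")
# ~$10 million with these inputs, in the range of standard VSL estimates
```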


The readability of contracts: Big data analysis
Yonathan Arbel
Journal of Empirical Legal Studies, December 2024, Pages 927-978

Abstract:
The plain language movement waged a silent revolution in the last generation, passing nearly 800 laws nationwide with little public debate. The movement asserted that it could scientifically show that there is a widespread readability crisis in legal documents, particularly contracts, which are unreadable to most adults. This article presents the largest empirical analysis of these claims to date, utilizing a dataset of 2 million contracts spanning multiple decades and industries and applying machine learning techniques. The study challenges fundamental tenets of the plain language movement. Contrary to prevailing beliefs, consumer agreements have median reading scores almost indistinguishable from those of daily news articles. A critical evaluation further exposes that readability tools endorsed by the movement are shoddy and manipulable and can produce grade-level differences of up to 4.6 years for identical texts. Moreover, the movement's core belief that Americans cannot read past the level of an eighth grader is exposed as an unsubstantiated myth. These findings fundamentally challenge the premises and effectiveness of one of the central consumer protection policies. These results call for a radical rethinking of legal access strategies, suggesting a shift from superficial readability metrics to addressing substantive issues in market dynamics and focusing on truly vulnerable populations. More broadly, this case study serves as a cautionary tale about the propagation of myths in legal scholarship and the potential for well-intentioned reform movements to divert attention and resources from more effective interventions.
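The manipulability claim is straightforward to reproduce. A minimal sketch (my own, not the article's code): the Flesch-Kincaid grade formula divides words by sentences, so repunctuating identical words shifts the reported grade level.

```python
# Why readability formulas are manipulable: the Flesch-Kincaid grade
# depends on sentence segmentation, so the same words punctuated
# differently score at different "grade levels".
import re

def syllables(word: str) -> int:
    # Crude vowel-group heuristic; real tools use different heuristics,
    # which is itself part of the inconsistency the article documents.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

short = "The party breaches. We may cancel. We may also seek damages."
long_ = "The party breaches; we may cancel; we may also seek damages."
print(fk_grade(short), fk_grade(long_))  # same words, ~3 grades apart
```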


Estimating industry conduct using promotion data
Christian Michel, Jose Manuel Paz y Miño & Stefan Weiergraeber
RAND Journal of Economics, Winter 2024, Pages 658-683

Abstract:
We estimate the evolution of competition in the ready-to-eat cereal industry. To separately identify detailed patterns of industry conduct from unobserved marginal cost shocks, we construct novel instruments that interact data on rival firms' promotions with measures of products' relative isolation in the characteristics space. We find strong evidence for partial price coordination among cereal manufacturers at the beginning of our sample. After a merger in 1993, conduct becomes more competitive and, on average, consistent with multiproduct Nash pricing. The last part of our sample is characterized by even more aggressive pricing, implying median wholesale margins of less than 5%.
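A minimal sketch of the flavor of such an instrument (not the authors' code; all columns and numbers are hypothetical): weight each rival's promotion by how far its product sits from yours in characteristics space.

```python
# Hypothetical illustration: interact rival promotions with a product's
# isolation in characteristics space, one market of four products.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "firm":  ["A", "A", "B", "B"],
    "sugar": [0.10, 0.30, 0.12, 0.35],  # product characteristic
    "promo": [1, 0, 0, 1],              # promotion indicator
})

x = df["sugar"].to_numpy()
dist = np.abs(x[:, None] - x[None, :])                      # pairwise isolation
rival = df["firm"].to_numpy()[:, None] != df["firm"].to_numpy()[None, :]
# Sum rival promotions weighted by distance: promotions by isolated
# rivals shift margins differently than promotions by close substitutes.
df["iv"] = (dist * rival * df["promo"].to_numpy()[None, :]).sum(axis=1)
print(df)
```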


Complainer's Dilemma
Greg Leo & Jennifer Pate
Journal of Public Economic Theory, forthcoming

Abstract:
Technological innovations have made complaining easier. Often, when it is easy to complain, only problems that meet a high threshold of complaints are addressed. We present a novel model of the strategic environment facing complainers and demonstrate that the properties of the resulting games' equilibria justify the existence of high complaint thresholds. By setting the thresholds appropriately, an administrator can prevent complaints that are not worth addressing. Policies that minimize the cost of complaining while requiring a large threshold are universally more efficient for large constituencies. Our results regarding the equilibrium for large constituencies are facilitated by the application of the Lambert-W function, demonstrating how this tool can be employed to analyze games with a large number of players. We motivate the model using a rich data set of complaints from New York City.
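For readers unfamiliar with the tool: the Lambert-W function inverts equations of the form x·e^x = a, which arise when participation probabilities compound exponentially in the number of players. A minimal illustration (not the paper's model):

```python
# The Lambert-W function solves x * exp(x) = a in closed form, a useful
# tool for equilibrium conditions in games with many players.
import numpy as np
from scipy.special import lambertw

a = 2.5
x = lambertw(a).real      # principal branch W0
print(x, x * np.exp(x))   # the second value recovers a
```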


Regulating Explainable Artificial Intelligence (XAI) May Harm Consumers
Behnam Mohammadi et al.
Marketing Science, forthcoming

Abstract:
The most recent artificial intelligence (AI) algorithms lack interpretability. Explainable artificial intelligence (XAI) aims to address this by explaining AI decisions to customers. Although it is commonly believed that the requirement of fully transparent XAI enhances consumer surplus, our paper challenges this view. We present a game-theoretic model where a policymaker maximizes consumer surplus in a duopoly market with heterogeneous customer preferences. Our model integrates AI accuracy, explanation depth, and method. We find that partial explanations can be an equilibrium in an unregulated setting. Furthermore, we identify scenarios where customers’ and firms’ desires for full explanation are misaligned. In these cases, regulating full explanations may not be socially optimal and could worsen the outcomes for firms and consumers. Flexible XAI policies outperform both full transparency and unregulated extremes.


The Biodiversity Protection Discount
Golnaz Bahrami, Matthew Gustafson & Eva Steiner
Pennsylvania State University Working Paper, December 2024

Abstract:
Land use restrictions are the preferred policy tool to halt the dramatic decline in global biodiversity, but their economic costs are unknown. We estimate an average discount of 45% in the value of protected land in the U.S. This discount is driven by restrictions to development. It is larger in locations where developable land is more scarce and where political regimes are more committed to conservation. We quantify the costs of existing biodiversity protections at $820 billion or 3% of aggregate U.S. land value. These costs should be weighed against the benefits of biodiversity to determine optimal conservation policy.
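The abstract's figures imply a simple consistency check: if $820 billion is 3% of aggregate U.S. land value, that aggregate is roughly $27 trillion.

```python
# Back-of-the-envelope check implied by the abstract's own numbers.
cost = 820e9   # estimated cost of existing biodiversity protections
share = 0.03   # stated share of aggregate U.S. land value
print(f"implied aggregate land value: ${cost / share / 1e12:.1f} trillion")
```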


Do Sports Bettors Need Consumer Protection? Evidence from a Field Experiment
Matthew Brown, Nick Grasley & Mariana Guido
Stanford Working Paper, October 2024

Abstract:
Corrective policy in sports betting markets is motivated by concerns that demand may be distorted by behavioral bias. We conduct a field experiment with frequent sports bettors to measure the impact of two biases, overoptimism about financial returns and self-control problems, on the demand for sports betting. We find widespread overoptimism about financial returns. The average participant predicts that they will break even, but in fact loses 7.5 cents for every dollar wagered. We also find evidence of significant self-control problems, though these are smaller than overoptimism. We estimate a model of biased betting and use it to evaluate several corrective policies. Our estimates imply that the surplus-maximizing corrective excise tax on sports betting is twice as large as prevailing tax rates. We estimate substantial heterogeneity in bias across bettors, which implies that targeted interventions that directly eliminate bias could improve on a tax. However, eliminating bias is challenging: we show that two bias-correction interventions favored by the gambling industry are not effective.

