“Big Data” Goes to the Movies: Uncovering Latent Coherence among Movies and its Implications for Movie Performance and Studio Strategy (working paper)
Streaming platforms like Amazon Prime Video and Netflix have transformed the motion picture industry by changing the way audiences view content and by altering who produces content and how. Recently, movie studios have been launching their own video streaming platforms to compete with existing streaming platforms. But they are at a disadvantage: they do not possess the vast user data of Amazon or Netflix, which informs everything from content production to advertising. How can studios successfully compete? This paper uncovers a novel factor based on publicly available user data and shows that it matters for movie performance, seeking to inform studio strategy. The paper also highlights the comparative advantage that streaming platforms have over traditional studios due to user data, and sheds light on what studios can learn from publicly available data to compete with streaming platforms.
Storm Crowds: Evidence from Zooniverse on Crowd Contribution Design (working paper, with Joshua Gans)
What is the impact of crowdsourcing platform design on crowds' willingness to contribute? The proliferation of platforms with distributed content production, such as Wikipedia and Zooniverse, has led to scholarly interest in understanding why individuals contribute to them. Few studies, however, have focused on the impact of platforms' architectural design on contributions. One relevant architectural component is the extent to which platform designs allow incomplete, or partial, contributions by the crowd, a concept we refer to as the design's "tolerance to incompleteness." This paper explores the relationship between this design element and crowds' willingness to contribute. Using a quasi-experimental approach, we test our predictions with data from the citizen science platform Zooniverse. Findings show that in designs with lower tolerance to incompleteness, contributors make fewer total edits but more complete edits. Moreover, there is a trade-off between the quantity and quality of complete edits: the quality of complete edits is lower in designs with lower tolerance to incompleteness. Our findings are relevant to the management and design of crowdsourcing platforms characterized by simple, independent, and well-structured tasks.