‘Not Ready for Prime Time’: Biometrics and Biopolitics in the (Un)Making of California’s Facial Recognition Ban
Asvatha Babu and Saif Shahin
AI for Everyone? Critical Perspectives
Facial recognition is one of the most common—and contentious—applications of artificial intelligence. In October 2019, the US state of California, home to Silicon Valley, passed Assembly Bill 1215 (AB 1215), banning police from using facial recognition technology on body cameras. This article traces the trajectory of AB 1215 as a social discourse, from its first reading in February until it was signed into law, through a close study of legislative documents, industry reports, civil society releases, and media coverage. We specifically investigate how and why the ban, initially intended to be permanent, was reduced to just three years, and identify the key social actors shaping the discourse. We argue that opposition to algorithmic governance needs to focus on its potential to exacerbate social discrimination rather than on the shortcomings of the technology itself.
Dial M for Money: Transnational Narratives of Mobile Money in the Global South
Saif Shahin, Mohammad Ala-Uddin, Tarishi Verma and Frankline Matanji
Routledge Handbook of Media and Communication in the Global South
Mobile money transfer technology has diffused widely in recent years, especially in the Global South. Our study adopts a constructivist approach to understanding mobile money as a ‘technosocial’ artifact, which is shaped by social forces even as it reshapes society over time. By analyzing local news discourses from Kenya, Bangladesh, and India between 2011 and 2017 (N=20,104), we explain how mobile money becomes enmeshed in the economic, political, and cultural fabric of a society. We employ unsupervised machine learning to distinguish three transnational narratives that give meaning to mobile money—particular ways of regarding the technology that become widely available ‘truths’. Further, we examine the influence of social ideologies such as technological determinism, modernization, nationalism, and neoliberalism as well as the role of the state, the market, and mass media in this meaning-making process.
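The abstract above does not name the unsupervised algorithm used to distinguish the three transnational narratives. As an illustration only, the sketch below shows one common family of such techniques—latent semantic analysis via truncated SVD on a term-document matrix—applied to a toy corpus standing in for mobile money news coverage; the documents, theme count, and all identifiers are invented for the example.

```python
import numpy as np

# Toy corpus standing in for news articles about mobile money.
docs = [
    "mobile money transforms banking for the unbanked poor",
    "mobile money drives national development and growth",
    "regulators warn mobile money enables fraud and crime",
    "banking access grows as mobile money reaches the poor",
    "government touts mobile money as engine of development",
    "fraud cases rise as criminals exploit mobile money",
]

# Build a term-document count matrix.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
X = np.zeros((len(docs), len(vocab)))
for r, d in enumerate(docs):
    for w in d.split():
        X[r, index[w]] += 1

# Truncated SVD on the centered matrix: rows of Vt are latent themes.
U, S, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
k = 3  # number of latent "narratives" to keep (assumed for the toy example)
doc_scores = U[:, :k] * S[:k]  # each document's loading on each theme

# The top-weighted terms in each latent dimension sketch that narrative.
for t in range(k):
    top = np.argsort(-np.abs(Vt[t]))[:4]
    print(f"theme {t}:", [vocab[i] for i in top])
```

In practice, a study of this kind would run a probabilistic model (e.g., topic modeling) on tens of thousands of articles; the matrix decomposition here is only meant to make the idea of recovering latent narratives from word co-occurrence concrete.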
Big Data and the Illusion of Choice: Comparing the Evolution of India’s Aadhaar and China’s Social Credit System as Technosocial Discourses
Saif Shahin and Pei Zheng
Social Science Computer Review
India and China have launched enormous projects aimed at collecting vital personal information regarding their billion-plus populations and building the world’s biggest data sets in the process. However, both Aadhaar in India and the Social Credit System in China are controversial and raise a plethora of political and ethical concerns. The governments claim that participation in these projects is voluntary, even as they link vital services to citizens registering with these projects. In this study, we analyze how the news media in India and China—crucial data intermediaries that shape public perceptions of data and technological practices—have framed these projects since their inception. Topic modeling suggests that news coverage in both nations disregards the public interest and focuses largely on how businesses can benefit from these projects. The media, institutionally and ideologically linked with governments and corporations, show little concern about the privacy violations and mass surveillance that these projects could enable. We argue that this renders citizens structurally incapable of making a meaningful “choice” about whether or not to participate in such projects. Implications for various stakeholders are discussed.
Facing up to Facebook: How Digital Activism, Independent Regulation, and Mass Media Foiled a Neoliberal Threat to Net Neutrality
Information, Communication & Society
This study traces how Facebook-promoted internet.org/Free Basics, despite initial acclaim, was eventually rejected in India – and how net neutrality came to be codified in the process. Topic modeling of articles (N = 1752) published over two-and-a-half years in 100 media outlets pinpoints the critical junctures in time at which the public discourse changed its trajectory. Critical discourse analysis of different phases of the discourse then identifies the causal factors and contingent conditions that produced the new policy. The study advances an understanding of technologies as social constructs and technological change as a social process, shaped by the dynamic interaction of a complex array of social actors coming together at critical junctures. It also draws attention to how discourse, produced by social actors in contingent conditions, recursively shapes the dominant ideology and structures these interactions. In addition, the study demonstrates how algorithmic and interpretive research techniques can be combined for longitudinal analysis of textual data sets.
Analysis of Messy Data
International Encyclopedia of Communication Research Methods
Raw data collected through surveys, experiments, coding of textual artifacts, or other quantitative means may not meet the assumptions upon which statistical analyses rely. The presence of univariate or multivariate outliers, skewness or kurtosis in a distribution, and heteroscedasticity or multicollinearity among variables may compromise data analysis. Scholars have devised a variety of techniques to discern and address such problems.
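The diagnostics named above—outliers, skewness, kurtosis, and multicollinearity—can each be screened for numerically. The sketch below illustrates common screening heuristics on simulated data; the threshold values (|z| > 3 for outliers, r > .80 for collinearity) are conventional rules of thumb, not prescriptions from the entry.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated survey scale (n=200) with two planted outliers.
x = rng.normal(50, 10, 200)
x[:2] = [120, -40]

# Univariate outliers: |z| > 3 is a common screening threshold.
z = (x - x.mean()) / x.std(ddof=1)
outliers = np.where(np.abs(z) > 3)[0]

# Skewness and excess kurtosis via standardized moments.
m = x - x.mean()
skew = (m**3).mean() / (m**2).mean() ** 1.5
kurt = (m**4).mean() / (m**2).mean() ** 2 - 3  # heavy tails push this above 0

# Multicollinearity screen: pairwise correlations above ~.80 flag trouble.
y = 0.9 * x + rng.normal(0, 3, 200)  # a near-duplicate predictor
r = np.corrcoef(x, y)[0, 1]

print(f"outliers at rows {outliers.tolist()}, "
      f"skew={skew:.2f}, excess kurtosis={kurt:.2f}, r={r:.2f}")
```

Remedies then follow from the diagnosis—winsorizing or trimming outliers, transforming skewed variables, or dropping/combining collinear predictors.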
Right to be Forgotten: How National Identity, Political Orientation, and Capitalist Ideology Structured a Trans-Atlantic Debate on Information Access and Control
Journalism & Mass Communication Quarterly
This study examines U.S. and British media coverage of the “right to be forgotten” in the light of their legal approaches and public attitudes toward privacy. Algorithmic and qualitative textual analysis techniques are combined to uncover the ideologies and interests that structure the discourse and shape its outcome. The analysis reveals that U.S. media, irrespective of their perceived “liberal” or “conservative” orientation, treat users’ online privacy as subservient to the business interests of technology companies—in line with the country’s lax legal approach. The coverage is more diverse in Britain, where the legal concept of privacy is also more stringent.
When Scale Meets Depth: Integrating Natural Language Processing and Textual Analysis for Studying Digital Corpora
Communication Methods and Measures
As computer-assisted research of voluminous datasets becomes more pervasive, so does the criticism of its epistemological, methodological, and ethical/normative inadequacies. This article proposes a hybrid approach that combines the scale of computational methods with the depth of qualitative analysis. It uses simple natural language processing algorithms to extract purposive samples from large textual corpora, which can then be analyzed using interpretive techniques. This approach helps research become more theoretically grounded and contextually sensitive—two major failings of typical “Big Data” studies. Simultaneously, it allows qualitative scholars to examine datasets that are otherwise too large to study manually and also bring more rigor to the process of sampling. The method is illustrated with two case studies, one looking at the inaugural addresses of U.S. presidents and the other investigating the news coverage of two shootings at an army camp in Texas.
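The core move described above—using simple NLP to pull a purposive sample from a large corpus for subsequent close reading—can be sketched with a basic relevance-scoring pass. The corpus, query terms, and cutoff below are all invented for illustration; the article's own case studies used presidential inaugurals and news coverage of the Texas army-camp shootings.

```python
# Toy corpus standing in for a large news archive.
corpus = [
    "the president's inaugural address stressed unity and hope",
    "local team wins the championship in overtime thriller",
    "shooting at the army base leaves investigators searching for motive",
    "markets rallied after the federal reserve held interest rates",
    "witnesses describe chaos during the shooting at fort hood",
    "recipe of the week: slow-cooked brisket with root vegetables",
]

# Hypothetical query terms defining the purposive sample.
query = {"shooting", "army", "fort", "base"}

def relevance(doc: str) -> float:
    """Score a document by the share of its tokens matching the query."""
    tokens = doc.split()
    hits = sum(1 for t in tokens if t in query)
    return hits / len(tokens)

# Rank by relevance and keep only matching documents for close reading.
ranked = sorted(corpus, key=relevance, reverse=True)
sample = [d for d in ranked if relevance(d) > 0]
print(sample)
```

The point of the hybrid approach is that the algorithm only narrows the haystack in a transparent, replicable way; the interpretive work on the resulting sample remains qualitative.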
A Critical Axiology for Big Data Studies
Big Data is having a huge impact on journalism and communication studies. At the same time, it has raised a plethora of social concerns ranging from mass surveillance to the legitimization of prejudices such as racism. This article develops an agenda for critical Big Data research. It discusses what the purpose of such research should be, what pitfalls it should guard against, and the possibility of adapting Big Data methods to conduct empirical research from a critical standpoint. Such a research program will not only enable critical scholarship to meaningfully challenge Big Data as a hegemonic tool, but will also make it possible for scholars to draw upon Big Data resources to address a range of social issues in previously impossible ways. The article calls for methodological innovation in combining emerging Big Data techniques with critical/qualitative methods of research, such as ethnography and discourse analysis, in ways that allow them to complement each other.