Jeffrey Lockhart

Recent Scholarship

Who Authors Social Science? Demographics and the Production of Knowledge

Social Currents

Author demographics are of key epistemic importance in science—shaping the approaches to and contents of research—especially in social scientific knowledge production, yet we know very little about who produces social scientific publications. We fielded an original demographic survey of nearly 20,000 sociology, economics, and communication authors in the Web of Science from 2016–2020. Our results include not only details about gender and race/ethnicity but also the first descriptive statistics on social science authors’ sexuality, disability, parental education, and employment characteristics. We find authorship in the social sciences looks very different from other measures of disciplinary membership like who holds PhDs or faculty positions. For example, half of the authors in each discipline’s journals say that they are not a member of the discipline in which they published. Moreover, social science authors are considerably less diverse than other measures of disciplinary membership. In sociology, women constitute a majority of PhDs, faculty, and American Sociological Association members; by contrast, men make up a majority of sociology’s authors. Additionally, we include a wide array of descriptive statistics across a range of demographic characteristics, which will be of interest to inequality scholars, science scholars, and social scientists engaged in diversifying their disciplines.


Lockhart, Jeffrey W., Molly M. King, and Christin Munsch. 2024. “Who Authors Social Science? Demographics and the Production of Knowledge.” Social Currents 0(0). https://doi.org/10.1177/23294965241246805. (Preprint) (SI)

Gender, Sex, and the Constraints of Machine Learning Methods

Oxford Handbook of the Sociology of Machine Learning

This chapter discusses the wide array of ways that gender and sex interact with machine learning (ML) and the artificial intelligence technologies that rely on it. Some of these interactions are intentional; others are unintentional or even against practitioners’ concerted efforts. Some are born out of the allure of a seemingly simple variable that is aligned with the technical needs of ML. Often, gender lurks without invitation, because these methods mine data for associations, and gendered associations are ubiquitous. In a growing body of work, scholars are using ML to actively interrogate measurements and theories of gender and sex. ML brings with it new paradigms of quantitative reasoning that hold the potential to either reinscribe or revolutionize gender in both technical systems and scientific knowledge.

Lockhart, Jeffrey W. 2023. “Gender, Sex, and the Constraints of Machine Learning Methods,” in Christian Borch and Juan Pablo Pardo-Guerra (eds.), The Oxford Handbook of the Sociology of Machine Learning. (Preprint)

A gender hypothesis of sex disparities in adverse drug events

Social Science & Medicine

Pharmacovigilance databases contain more reports of adverse drug events (ADEs) for women than for men. This disparity is frequently attributed to sex-linked biological factors. We offer an alternative Gender Hypothesis, positing that gendered social factors are central to the production of aggregate sex disparities in ADE reports. We describe four pathways through which gender may influence observed sex disparities in pharmacovigilance databases: healthcare utilization; bias and discrimination in the clinic; experience of a drug event as adverse; and pre-existing social and structural determinants of health. We then use data from the U.S. FDA Adverse Event Reporting System (FAERS) to explore how the Gender Hypothesis might generate novel predictions and explanations of sex disparities in ADEs in existing widely referenced datasets. Analyzing more than 37 million records of ADEs between 2014 and 2022, we find that patient-reported ADEs show a larger female skew than healthcare provider-reported ADEs and that the sex disparity is markedly smaller for outcomes involving death or hospitalization. We also find that the sex disparity varies greatly across types of ADEs: for example, cosmetically salient ADEs are skewed heavily female, while sexual dysfunction ADEs are skewed male. Together, we interpret these findings as evidence of the promise of the Gender Hypothesis for identifying intervenable mechanisms and pathways contributing to sex disparities in ADEs. Rigorous application of the Gender Hypothesis to additional datasets and in future research studies could yield new insights into the causes of sex disparities in ADEs.

Lee, Katharine M. N., Tamara Rushovich, Annika Gompers, Marion Boulicault, Steven Worthington, Jeffrey W. Lockhart, and Sarah S. Richardson. 2023. “A Gender Hypothesis of Sex Disparities in Adverse Drug Events.” Social Science & Medicine 339:116384.

The Gay Right: A Framework for Understanding Right Wing LGBT Organizations

Journal of Homosexuality

While there has been considerable interest in debates about right wing ideas in LGBT movements—military service, marriage, nationalism, white supremacy—there has been comparatively little attention to self-proclaimed right wing LGBT organizations, what I call the “gay right.” Social theory to date offers a fragmented set of theoretical tools to explain them, including homonationalism, post-gay identity, additive intersectionality, and systems justification theory. I propose a two-axis framework to unify these theories and map wide-ranging diversity within the gay right. This framework is based on both a review of existing theories and an analysis of 38 gay right organizations in 14 countries. I illustrate the framework with an extended analysis of three gay right organizations which share a context—the contemporary UK—but have very different politics.

Lockhart, Jeffrey W. 2023. “The Gay Right: A Framework for Understanding Right Wing LGBT Organizations.” Journal of Homosexuality. (Preprint)

Adverse Drug Events by Sex After Adjusting for Baseline Rates of Drug Use

JAMA Network Open

Although they have known limitations, spontaneous reporting pharmacovigilance databases like the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) and World Health Organization VigiBase are widely cited as evidence for claims that women experience adverse drug events (ADEs) at rates as high as twice those of men. Pharmacokinetics and pharmacodynamics are typically used to explain these sex differences; however, many factors could influence the distribution of ADE reports by sex, including well-known disparities in the rates at which men and women use prescribed drugs. This study examined ADEs reported by sex in the FAERS database after adjusting for drug use by men and women.

Rushovich, Tamara*, Annika Gompers*, Jeffrey W. Lockhart*, Ife Omidiran, Steven Worthington, Sarah S. Richardson, and Katharine M. N. Lee (* denotes equal authorship). 2023. “Adverse Drug Events by Sex After Adjusting for Baseline Rates of Drug Use.” JAMA Network Open 6(8):e2329074.

Name-Based Demographic Inference and the Unequal Distribution of Misrecognition

Nature Human Behaviour

Academics and companies increasingly draw on large datasets to understand the social world, and name-based demographic ascription tools are widespread for imputing information like gender and race that are often missing from these large datasets. These approaches have drawn criticism on ethical, empirical, and theoretical grounds. Employing a survey of all authors listed on articles in sociology, economics, and communications journals in the Web of Science between 2015 and 2020, we compared self-identified demographics with name-based imputations of gender and race/ethnicity for 19,924 scholars across four gender ascription tools and four race/ethnicity ascription tools. We find substantial inequalities in how these tools misgender and misrecognize the race/ethnicity of authors, distributing erroneous ascriptions unevenly among other demographic traits. Because of the empirical and ethical consequences of these errors, scholars need to be cautious with the use of demographic imputation. We recommend five principles for the responsible use of name-based demographic inference.

Lockhart, Jeffrey W., Molly M. King, and Christin Munsch. 2023. “Name-Based Demographic Inference and the Unequal Distribution of Misrecognition.” Nature Human Behaviour 0(0):xx-xx. (Open Access) (Preprint)

Because the Machine Can Discriminate: How Machine Learning Serves and Transforms Biological Explanations of Human Difference

Big Data & Society

Research on scientific/intellectual movements, and social movements generally, tends to focus on resources and conditions outside the substance of the movements, such as funding and publication opportunities or the prestige and networks of movement actors. Drawing on Pinch’s (2008) theory of technologies as institutions, I argue that research methods can also serve as resources for scientific movements by institutionalizing their ideas in research practice. I demonstrate the argument with the case of neuroscience, where the adoption of machine learning changed how scientists think about measurement and modeling of group difference. This provided an opportunity for members of the sex difference movement by offering a ‘truly categorical’ quantitative methodology that aligned more closely with their understanding of male and female brains and bodies as categorically distinct. The result was a flurry of publications and symbiotic relationships with other researchers that rescued a scientific movement which had been growing increasingly untenable under the prior methodological regime of univariate, frequentist analyses. I call for increased sociological attention to the inner workings of technologies that we typically black box in light of their potential consequences for the social world. I also suggest that machine learning in particular might have wide-reaching implications for how we conceive of human groups beyond sex, including race, sexuality, criminality, and political position, where scientists are just beginning to adopt its methods.

Lockhart, Jeffrey W. 2023. “Because the Machine Can Discriminate: How Machine Learning Serves and Transforms Biological Explanations of Human Difference.” Big Data & Society 10(1):1-14. (Open Access)


Paradigms of Sex Research and Women in STEM

Gender & Society

Scientists’ identities and social locations influence their work, but the content of scientific work can also influence scientists. Theory from feminist science studies, autoethnographic accounts, interviews, and experiments indicate that the substance of scientific research can have profound effects on how scientists are treated by colleagues and their sense of belonging in science. I bring together these disparate literatures under the framework of professional cultures. Drawing on the Survey of Earned Doctorates and the Web of Science, I use computational social science tools to argue that the way scientists write about sex in their research influences the future gender ratio of PhDs awarded across 53 subfields of the life sciences over a span of 47 years. Specifically, I show that a critical paradigm of “feminist biology” that seeks to de-essentialize sex and gender corresponds to increases in women’s graduation rates, whereas “sex difference” research—sometimes called “neurosexism” because of its emphasis on essential, categorical differences—corresponds to decreases in women’s graduation rates in most fields.

Lockhart, Jeffrey W. 2021. “Paradigms of Sex Research and Women in STEM.” Gender & Society 35(3):449-475. (Preprint & Supplemental Materials)

Diagnosing Gender Bias in Image Recognition Systems

Socius

Image recognition systems offer the promise to learn from images at scale without requiring expert knowledge. However, past research suggests that machine learning systems often produce biased output. In this article, we evaluate potential gender biases of commercial image recognition platforms using photographs of U.S. members of Congress and a large number of Twitter images posted by these politicians. Our crowdsourced validation shows that commercial image recognition systems can produce labels that are correct and biased at the same time as they selectively report a subset of many possible true labels. We find that images of women received three times more annotations related to physical appearance. Moreover, women in images are recognized at substantially lower rates in comparison with men. We discuss how encoded biases such as these affect the visibility of women, reinforce harmful gender stereotypes, and limit the validity of the insights that can be gathered from such data.

Schwemmer, Carsten, Carly Knight, Emily Bello-Pardo, Stan Oklobdzija, Martijn Schoonvelde, and Jeffrey W. Lockhart. 2020. “Diagnosing Gender Bias in Image Recognition Systems.” Socius. (Replication materials)
