Summer Institute in Advanced Research Methods for Science, Technology, Engineering, and Mathematics Education Research

June 2024 Conference

Quantitative Methods for Assessing the Potential to Disrupt Disparities in and through STEM Education

Conference Program at a Glance

Day 1 (Wednesday, June 5): Mini-Conference on measurement of STEM identity, inequity, and intersectionality

8:45 – 9:15      Welcome Address and Opening Remarks

9:15 – 10:15    Keynote Address

Measurement as a driving force toward education equity -- Michael Rodriguez, University of Minnesota

Michael Rodriguez — Professor and Dean, College of Education & Human Development, University of Minnesota

Abstract:

Measurement has long been a tool of education reform. For some, it is a core element of a cycle of teaching, learning, and assessment. For others, it is a way to sort, select, and reward. Such purposes have not always promoted individual or collective goals, or the greater good. Limitations in social and political structures, racism and segregation, and misinformation limit our progress in every domain. Global trends in higher education require us to shift our attention—measurement can be used as a lever to clarify where we are and to support our efforts to set and achieve collective goals. In some respects, it will take a shift in what and how we measure, but to a larger extent, it will take a shift in mindsets to reorient ourselves to secure valid interpretation and use of such measures.

10:30 – 12:00  Symposium 1: Intersectionality and its implications in measurement

“A Comparative Guide to Quantitative Intersectional Modeling” -- Ben Van Dusen, Iowa State University

Ben Van Dusen — Associate Professor, Iowa State University, School of Education
— co-authors: Heidi Cian, Jayson Nissen, and Lucy Arellano

Abstract:

Drawing on intersectional theory, we examine the application of the Multilevel Analysis of Individual Heterogeneity and Discriminatory Accuracy (MAIHDA) within science education research (Evans et al., 2018). MAIHDA is a novel analytical technique designed to create quantitative intersectional models that emerged from the health sciences (Evans, 2015). To improve model predictions across social identity groups, MAIHDA nests individuals within their social identities (Evans et al., 2020; Keller et al., 2023). By combining the main effects with the variance terms for each social identity group, MAIHDA can create more accurate and precise predictions than fixed-effect models that address intersectionality with interaction terms. Our study compares MAIHDA and the more commonly used fixed effects models, with a dual focus on evaluating model alignment with intersectional theory and the ability to predict group performance precisely, thereby allowing for the representation of additional social groups. By doing so, we engage with and contribute to the evolving landscape of quantitative intersectional research.

This manuscript analyzes the performance data of 9,672 physics students in 315 courses at 41 institutions collected through the LASSO platform. We analyzed student performance using a hierarchical fixed effect model and a cross-classified MAIHDA model to investigate the interplay of racism and sexism and their collective impact on educational achievements within physics. Our comparative analysis examines the precision of the modeling techniques in representing educational outcomes while highlighting MAIHDA’s capacity to include a more granular representation of student identities. Preliminary findings suggest that MAIHDA surpasses fixed effects modeling in predictive accuracy and demonstrates a significant capability to integrate a broader spectrum of intersectional identities (Van Dusen et al., in press).

This paper will guide researchers to understand and apply MAIHDA in their studies that explore inequities through a lens of intersectionality. To support this effort, we will provide practical advice and R code for implementing MAIHDA with common features of educational data (e.g., nesting structures and missing data).
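Although the authors will provide R code, the core idea MAIHDA adds over fixed-effects interaction models can be illustrated language-agnostically. The sketch below (in Python, with hypothetical strata, scores, and variance components; not the authors' implementation) shows partial pooling: small intersectional strata borrow strength from the overall mean rather than being estimated in isolation.

```python
# Illustrative sketch of MAIHDA's partial pooling of intersectional strata.
# Strata, scores, and variance components below are hypothetical.

def partial_pool(stratum_scores, between_var, within_var):
    """Empirical-Bayes shrinkage of each stratum mean toward the grand mean."""
    all_scores = [s for scores in stratum_scores.values() for s in scores]
    grand_mean = sum(all_scores) / len(all_scores)
    pooled = {}
    for stratum, scores in stratum_scores.items():
        n = len(scores)
        raw_mean = sum(scores) / n
        # Reliability weight: large strata keep their own mean,
        # small strata are pulled toward the grand mean.
        w = between_var / (between_var + within_var / n)
        pooled[stratum] = grand_mean + w * (raw_mean - grand_mean)
    return pooled

data = {
    ("woman", "Black"): [55.0, 60.0],            # small stratum: strong shrinkage
    ("man", "white"): [70.0, 72.0, 68.0, 71.0,
                       69.0, 70.0, 73.0, 67.0],  # large stratum: little shrinkage
}
est = partial_pool(data, between_var=25.0, within_var=100.0)
```

With only two observations, the small stratum's mean is pulled two-thirds of the way toward the grand mean, while the large stratum barely moves; a fixed-effects model with interaction terms would instead report each raw stratum mean, however noisy.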

“Intersectional Measurement Invariance on the CogAT 7: A Bi-Factor Model Exploration” -- Joni Lakin, University of Alabama/Qingzhou Shi, Northwestern University

Joni Lakin — Professor of Educational Studies, University of Alabama
Qingzhou Shi — Postdoctoral scholar, Spatial Intelligence and Learning Center, Northwestern University

Abstract:

Measurement invariance is essential for valid comparisons across diverse groups in cognitive assessments. Traditional approaches often assess invariance across single social identities, such as gender, but may overlook intersectional identities. This study evaluates the Cognitive Abilities Test Form 7 (CogAT 7) for measurement invariance across intersectionally defined groups by gender, race, and socioeconomic status (SES), employing Alignment Structural Equation Modeling (ASEM) within a bi-factor framework. The sample comprises 5,537 third-grade students from the 2010 national standardization of CogAT 7. Results demonstrate robust internal consistency across batteries and satisfactory fit indices. Measurement invariance tests reveal only 8% of parameters as non-invariant, indicating substantial invariance across the intersectional groups. Additionally, factor mean comparisons suggest performance variations among intersectional groups, with notable differences influenced by SES and gender. This study highlights the critical need for intersectional analyses in cognitive assessments to promote equitable measurement and illustrates the effectiveness of ASEM in handling complex model structures.

“Honoring Heterogeneity when Measuring the Politics of Identity” -- Phillip Boda, University of Illinois Chicago

Phillip A. Boda — Assistant Professor, University of Illinois Chicago

Abstract:

Measuring difference traditionally assumes that the schemas developed within a group’s psyche, bounded by social constructions of identity, are similar. Proponents argue that these schemas can and should be estimated in the aggregate to explore the role identity plays in research design, the implementation of design interventions, and the ways we evaluate a study’s efficacy. This argument follows identity politics theories and guides broader psychometric premises that self-selected words (e.g., ‘disabled’, ‘white’, ‘female’) can represent the multi-level, multi-dimensional, and multi-life experiences of humans. Beyond ‘identity politics’ lies a different frame: the ‘politics of identity’. That is, fixed categories are insufficient to aggregate people’s lived experiences, and relying on them denies individuals the opportunity to self-identify. A politics of identity approach recognizes that groups of people have similar types of experiences and then provides nuance to the ways their identities may intersect to create more nuanced dimensions of those experiences. Most importantly, an engaged politics of identity is concerned with authority, agency, power, and choice. For this presentation, I honor heterogeneity in identity measurement by allowing participants to self-identify the impact of identity from their own point of view, opening a space to explore differential experience as embodied in the diverse ways of self-identification. Using survey data from three different studies, I will report preliminary results that explore this argument’s premise through open-ended responses added to progressive demographics. The implications of this research reflect the pertinent need to move beyond homogeneity in how we view the estimation of survey measurement and the importance of identity nuance.

12:15 – 1:30    Lunch-time Keynote Address

“Embracing the Many Ways of Being Human in Measuring Complex STEM Education Constructs” -- Bruno D. Zumbo, University of British Columbia

Bruno D. Zumbo — Distinguished University Scholar & Professor Tier 1 Canada Research Chair in Psychometrics and Measurement, University of British Columbia (Vancouver, Canada)

Abstract:

This presentation explores my theoretical and methodological advancements in measurement research, specifically focusing on the intricate task of measuring complex STEM (Science, Technology, Engineering, and Mathematics) education constructs, such as STEM identity, within the broader context of educational inequality. I will provide a brief overview of the central theoretical and methodological innovations emerging from my explanation-focused view of measurement and test validation research. This framework is embedded within an ecological model of item responding and test performance, placing a centrality on test consequences, values, and what I term ‘the many ways of being human’ (Zumbo, 2023). By directing research efforts toward understanding the variation in STEM identity using this comprehensive framework, I argue that researchers and educators may be better equipped to address inequalities in STEM education effectively.

1:45 – 3:15      Symposium 2: Methodological Considerations in Studying Complex Social Constructs

“Explanatory Differential Item Functioning” -- Sun-Joo Cho, Vanderbilt University

Sun-Joo Cho — Professor, Vanderbilt University’s Peabody College, Psychometrics

Abstract:

In educational and psychological assessments, ensuring fairness across diverse subgroups is critical. Differential Item Functioning (DIF) occurs when individuals from different subgroups with the same level of a construct have different probabilities of answering an item correctly, raising concerns about measurement bias. In this talk, I will introduce the concepts of DIF and provide an overview of DIF research. Special emphasis will be placed on methods for explaining DIF, such as mixture item response models and machine learning techniques. These approaches enhance our understanding by pinpointing underlying variables and subgroup differences that contribute to DIF. Through this overview, the talk aims to highlight the importance of addressing DIF to ensure that scores reflect true differences in a primary construct rather than a secondary construct. I hope that attendees will gain insights into recent advancements in statistical methodologies and practical strategies for mitigating DIF in their own evaluative practices.
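As a concrete anchor for the DIF concepts above, here is a minimal sketch (with hypothetical counts; not from the talk) of one classical screening statistic, the Mantel-Haenszel common odds ratio. The model-based and machine-learning methods the talk covers go well beyond this, but the logic of comparing subgroups matched on the construct is the same.

```python
# Mantel-Haenszel DIF screening: within each stratum of matched total score,
# compare reference vs. focal examinees' odds of answering the item correctly.
# Counts below are hypothetical.

def mantel_haenszel_or(strata):
    """strata: list of (ref_correct, ref_incorrect, foc_correct, foc_incorrect)."""
    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        num += a * d / n   # reference-correct x focal-incorrect
        den += b * c / n   # reference-incorrect x focal-correct
    return num / den       # ~1.0 suggests no DIF at matched ability levels

strata = [
    (30, 20, 20, 30),  # low total-score stratum
    (40, 10, 30, 20),  # high total-score stratum
]
or_mh = mantel_haenszel_or(strata)  # > 1 favors the reference group
```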

“Issues to Consider when Applying Quantitative Methods Critically” -- Michael Russell, Boston College

Michael Russell — Professor, Measurement, Evaluation, Statistics, & Assessment. Lynch School of Education and Human Development, Boston College

Abstract:

This session explores a few issues to consider when applying quantitative methods to examine social programs, structures, and systems. Among the topics explored are: the use of race, gender, economic status, and other socially structured demographic variables in quantitative analyses; statistical terminology and the production of deficit narratives; alignment between theory, variable selection, and modeling techniques; and applying Intersectionality Theory to quantitative analyses.

3:30 – 4:10      Round-Table Discussions

“Debating the QuantCrit Triangulation” -- Darnell Leatherwood

Darnell Leatherwood — Lecturer, University of Chicago, National Institutes of Health (NIH) Postdoctoral Research Fellow, Johns Hopkins University


Abstract:

At the 2024 annual meeting of the American Education Research Association (AERA), the Critical Quantitative Methodologies Special Interest Group (SIG #186) hosted its inaugural session, titled “Exploring New Horizons: Critical Quantitative Methodologies in Education.” During this session, a framework was put forward to help scholars interested in using Quantitative Critical Race Theory (QuantCrit) think seriously about how to best position themselves to fully activate the tenets of QuantCrit. The QuantCrit Triangulation posits that a scholar must excel in three areas if they are to buttress their capacities to apply the tenets of QuantCrit. These areas are: (a) a firm understanding of quantitative/statistical methods, (b) a willingness to critique and develop measures, and (c) a deep and unapologetic concern for people. By delineating the three areas of the QuantCrit Triangulation, this author aims to encourage colleagues to connect the rigorous application of quantitative methods with the tenets of QuantCrit, strengthening the scientific basis of what we can say and do, and advancing towards justice, with a focus on measurement.

“Measuring Black Undergraduate Women’s Intersectional Experiences in Engineering/Computing: Preliminary Scale Development Insights” -- Krystal Williams, University of Georgia/William Walker, Louise McBee Institute of Higher Education

Krystal L. Williams — Assistant Professor, University of Georgia, Louise McBee Institute of Higher Education, Director of The Education Policy and Equity Research Collective (Ed_PERC)
William Walker — Doctoral student, Louise McBee Institute of Higher Education, Presidential Graduate Fellow
— co-authors: Imani Callan, Ijaz Ahmad, Lilly Kate Steiner, Shriya Rasale

Abstract:

Black women are underrepresented in engineering/computing, and scholars have begun exploring their experiences in these fields to better understand barriers to/facilitators of successful outcomes (e.g., Williams, 2024). Many studies have employed qualitative methods and borrowed from concepts such as intersectionality (Crenshaw, 1991) to understand these phenomena. While these qualitative insights provide a rich foundation for understanding key challenges, few studies have tapped into Black women’s experiences using quantitative approaches. Moreover, there is a prevailing gap in the literature concerning how best to represent Black women’s intersectional experiences in engineering/computing from a measurement perspective, using scales that are normed on Black women. Given this gap, the present research is informed by Kimberlé Crenshaw’s (1991) intersectionality framework and seeks to design a scale for Black undergraduate women in engineering/computing to measure their intersectional experiences within three domains—structural, political, and representational. Preliminary insights concerning these overarching aims will be discussed.

“Applying Cognitive Diagnostic Models to Math and Physics Concept Inventories” -- Vy Le, Iowa State University / Ben Van Dusen, Iowa State University

Ben Van Dusen — Associate Professor, Iowa State University, School of Education
Vy Le — Ph.D. Student, Iowa State University, Science Education
— co-author: Jayson Nissen


Abstract:

In physics education research, instructors and researchers often use research-based assessments (RBAs) to assess students’ skills and knowledge. In this paper, we support the development of a mechanics cognitive diagnostic to test and implement effective and equitable pedagogies for physics instruction. Adaptive assessments using cognitive diagnostic models provide significant advantages over the fixed-length RBAs commonly used in physics education research. As part of a broader project to develop a cognitive diagnostic assessment for introductory mechanics within an evidence-centered design framework, we identified and tested student models of four skills that cross content areas in introductory physics: apply vectors, conceptual relationships, algebra, and visualizations. We developed the student models in three steps. First, we based the model on learning objectives from instructors. Second, we coded the items on RBAs using the student models. Lastly, we tested and refined this coding using a common cognitive diagnostic model, the deterministic inputs, noisy “and” gate (DINA) model. The data included over 22,000 students who completed either the Force Concept Inventory, Force and Motion Conceptual Evaluation, Energy and Momentum Conceptual Survey, Calculus Concept Assessment, Calculus Concept Inventory, or Pre-calculus Concept Assessment on the LASSO platform. The results indicated a good to adequate fit for the student models, with high accuracies for classifying students on many of the skills. The items from these RBAs do not cover all of the skills in enough detail; however, they will form a useful initial item bank for the development of the mechanics cognitive diagnostic.
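The DINA model named above has a simple closed form for the probability of a correct response. The sketch below (hypothetical Q-matrix row, skill profiles, and slip/guess parameters; not the project's code) shows the conjunctive "and-gate" logic: an examinee answers at the non-slip rate only if they master every skill the item requires, and at the guessing rate otherwise.

```python
# DINA ("deterministic inputs, noisy and-gate") item response probability.
# Q-matrix row, skill profiles, and slip/guess values here are hypothetical.

def dina_prob_correct(skills, q_row, slip, guess):
    """P(correct) given a mastery profile, the item's required skills, slip, guess."""
    # eta = True only if the examinee masters every skill the item requires
    eta = all(m for m, req in zip(skills, q_row) if req)
    return (1 - slip) if eta else guess

# Suppose an item requires skills 0 and 2 of four (e.g., vectors and algebra).
q_row = (1, 0, 1, 0)
p_master = dina_prob_correct((1, 0, 1, 1), q_row, slip=0.1, guess=0.2)     # 1 - slip
p_nonmaster = dina_prob_correct((1, 1, 0, 1), q_row, slip=0.1, guess=0.2)  # guess
```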

“The Use of Artificial Intelligence in Coding Data” -- Yasemin Copur-Gencturk, University of Southern California

Yasemin Copur-Gencturk — Associate Professor of Education, University of Southern California


Abstract:

Despite the agreed-on importance of teachers’ knowledge in teaching and student learning, supporting empirical evidence is limited. Such challenges have been attributed to measurement-related issues, particularly the limited capacity of multiple-choice questions to capture the depth of and nuances in teachers’ knowledge.

Therefore, one solution to address this problem is to capture teachers’ knowledge through open-ended responses. Scholars measuring teachers’ knowledge through open-ended responses have indeed proven they are able to capture the nuances in teachers’ knowledge. However, measuring teachers’ knowledge through open-text responses requires substantial resources, given that raters must be trained to code the data accurately. This task is even more challenging considering the ambiguity of answers, mathematical logic, and human subjectivity, which often lead to divergent gradings given by different human scorers.

In this work, we are testing several AI-based approaches designed to automatically score mathematics teachers’ open-ended responses through human-like debate and peer evaluation. One of these approaches is the multi-agent framework, which includes multiple communication agents who evaluate teachers’ answers individually and then discuss them to seek a resolution. This process encourages a more thorough interpretation of the mathematical language and reduces misconceptions among individual LLM-powered evaluators. We utilized existing data (collected from 200 middle school teachers) that were coded by humans to explore the extent to which these different approaches were able to code the data reliably.

In this session, I will report findings from preliminary analyses of a variety of AI-based approaches we used to code the data and the percentage of agreement between human and AI coders. The feedback I need from participants and experts is related to the validity checks we need to establish with AI coders. For instance, we are using the percentage of agreement between AI and human raters as well as kappa statistics, but are there other approaches we can use? In addition, the reason we are planning to use AI in coding large-scale data is to avoid needing the data coded by humans. This choice brings up issues regarding how to establish when the AI code is indeed valid when human coders are not used. In these situations, what are the established guidelines we can follow, and what are the issues we need to consider?
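For reference, the two agreement indices mentioned above (percentage of agreement and the kappa statistic) can be computed directly between a human coder and an AI coder on the same items. This is a minimal sketch with hypothetical codes, not the project's scoring pipeline.

```python
# Percent agreement and Cohen's kappa between two coders on nominal codes.
# The human/AI code sequences below are hypothetical.

def percent_agreement(human, ai):
    return sum(h == a for h, a in zip(human, ai)) / len(human)

def cohens_kappa(human, ai):
    """Chance-corrected agreement between two coders."""
    n = len(human)
    p_obs = percent_agreement(human, ai)
    labels = set(human) | set(ai)
    # Expected agreement if both coders labeled at random with their own margins
    p_exp = sum((human.count(c) / n) * (ai.count(c) / n) for c in labels)
    return (p_obs - p_exp) / (1 - p_exp)

human = ["correct", "partial", "correct", "incorrect", "correct", "partial"]
ai    = ["correct", "partial", "correct", "correct",   "correct", "incorrect"]
agree = percent_agreement(human, ai)
kappa = cohens_kappa(human, ai)
```

Kappa discounts the agreement the two coders would reach by chance alone, which is why it is routinely reported alongside raw percent agreement.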

4:10 – 4:30      Closing Remarks

Day 2 (Thursday, June 6): Mini-Conference on identifying leverages for reducing disparity in and through STEM education

8:45 – 9:00      Welcome and Opening Remarks

9:00 – 10:15    Panel: What is (in)equitable STEM education? Implications for the search for mechanisms
Panelists: Adam Gamoran, Odis Johnson, Steve Raudenbush

10:30 – 11:45  Methodology Symposium 1: Causal decomposition analysis—causes, covariates, and confounders

“How to identify contributing factors to social disparities using causal decomposition and sensitivity analyses” -- Soojin Park, University of California, Riverside

Soojin Park — Assistant Professor of Quantitative Methods, School of Education at the University of California, Riverside

Abstract:

Large disparities in educational, economic, and health outcomes persist in the US across social groups, such as those based on race, gender, and sexuality. Traditional approaches to decomposition (e.g., difference-in-coefficients or Oaxaca-Blinder decomposition) have been utilized to identify risk factors, such as educational attainment, that explain these disparities. However, these approaches have limitations, as they fail to define clear causal estimands (effects of interest), do not clarify the assumptions that permit a causal interpretation of the effects (e.g., no omitted confounding), and lack a clear way to validate findings against possible violations of those assumptions. The development of causal decomposition analysis (Jackson & VanderWeele, 2018) has defined causal estimands within a counterfactual framework. My recent work on methods for disparity research (Park, Kang, Lee, & Ma, 2023) developed a sensitivity analysis that assesses the robustness of findings against a reasonable amount of omitted confounding. As a result, stronger causal interpretations of effects can be made. In this presentation, I will discuss the application of causal decomposition methods and sensitivity analysis using an educational example. R code will be provided.

Related Reading:

Womack, T. A., Palardy, G. J., & Park, S. (2024). The Role of Opportunity to Learn and School Socioeconomic Composition in Reducing Racial and Gendered Disparities in Mathematics Achievement. Journal of Research on Educational Effectiveness, 1–27. https://doi.org/10.1080/19345747.2024.2342850

Park, S., Kang, S., Lee, C., & Ma, S. (2023). Sensitivity analysis for causal decomposition analysis: Assessing robustness toward omitted variable bias. Journal of Causal Inference, 11(1). https://doi.org/10.1515/jci-2022-0031 R code: https://github.com/soojinpark33/Math_Disparities

“Overview of Causal Decomposition Analysis: an Analytic Approach for Informing Interventions to Reduce Disparity” -- John Jackson, Johns Hopkins Bloomberg School of Public Health

John Jackson — Associate Professor, Departments of Epidemiology, Biostatistics, and Mental Health at the Johns Hopkins Bloomberg School of Public Health

Abstract:

In this talk I will outline how to incorporate equity value judgements into analytic approaches that identify leverage points for reducing disparities (what I call causal decomposition analysis). I will focus on how covariate adjustments, both in defining disparities and in equalizing potential determinants of disparities in outcomes (causal decompositions), ultimately convey value judgements about what one believes is fair and equitable in the distribution of health and its determinants, and how these ideas become reflected in the estimation process. Along the way, I will discuss various principles to guide the choice of covariates for meaningfully defining disparities and decompositions while adjusting for other covariates to account for confounding.

12:00 – 1:50    Lunch-Time Presentations: Identifying mechanisms towards STEM

“The Impact of Female College Instructors: Experimental Evidence from Engineering Colleges in India” -- Prashant Loyalka, Stanford University

Prashant Loyalka — Associate Professor and Senior Fellow, Graduate School of Education and Freeman Spogli Institute for International Studies, Stanford University

Abstract:

Using a novel dataset that combines survey, assessment, and administrative information, as well as an experimental design in which students are randomly assigned to faculty within courses, we investigate the effects of female instructors on gender gaps in STEM – specifically academic performance, anxiety levels, and beliefs about ability differences. We find that female instructors improve the academic performance and reduce the anxiety levels of female students. Female instructors also promote more gender-equitable beliefs about ability among all students, particularly males. The findings of our paper contribute to the discourse on the importance of female role models in STEM education and have implications for fostering inclusive learning environments.

“Promoting intraminority solidarity by manipulating framings of racism” -- Sapna Cheryan, University of Washington

Sapna Cheryan — Professor of Psychology, University of Washington

Abstract:

Racism has most commonly been studied in psychology as perpetrated by White Americans. Less work has investigated how minoritized groups think about their own prejudicial attitudes and behaviors. In the current proposal, we investigate how different framings of racism influence Asian Americans’ intraminority solidarity with Black Americans. We hypothesize that framing racism against Black Americans as the result of anti-Blackness will increase Asian Americans’ sense of solidarity with Black Americans compared to framing racism as the result of White supremacy. In experimental studies, we investigate 1) the existence of two dominant framings of racism in the U.S.: White supremacy and anti-Blackness, and 2) the influence of these framings on Asian Americans’ solidarity with Black Americans and sense of responsibility for anti-Black racism. Systematically investigating two framings of racism gives us insight into intraminority dynamics and when and why groups less frequently studied in social psychology, such as Asian Americans, perpetuate anti-Black racism.

“Building the Capacity Necessary to Implement Equity-Focused K-12 Computer Science Education Policy” -- Stefanie Marshall, Michigan State University

Stefanie Marshall — Assistant Professor, Michigan State University, Science Education, Leadership, Policy
— co-authors: Ain Grooms, Josh Childs, SJ Hemmerich, Grace Tukurah

Abstract:

The future of equitable Computer Science Education (CSEd) is largely dependent upon state CSEd leaders having the capacity necessary to support local schools and districts in their implementation of state-level policies. Despite state-level efforts (e.g., CSEd development and curriculum standards adoption, CSEd courses counting for graduation requirements), the needs of many students and their communities are not being met. Specifically, meeting the needs of students from traditionally marginalized backgrounds has been an ongoing issue for schools and districts to consistently address. The importance of high-quality CSEd instruction includes preparing students to enter an increasingly technology-driven workforce and culture, equipping students to critically learn about the harms and benefits caused by an increasingly digitalized society (Vakil, 2018), and providing training that could be transferable to future workforce endeavors. The rapid growth of CSEd in schools across the United States has led to new opportunities for educational leaders to strategically implement equitable CSEd policies and practices; however, there is limited capacity (e.g., time, resources) to do so. This study explores how state CSEd leaders are building the capacity necessary to support the implementation of equity-oriented K-12 computer science education policies.

“Reducing Disparities in Secondary Science Course Pathways and STEM Persistence” -- Vandeen Campbell, Rutgers University

Vandeen Campbell — Assistant Research Professor and Associate Director, The Joseph C. Cornwall Center for Metropolitan Studies, Rutgers University – Newark

Abstract:

This is a project for which I am seeking funding. I am hoping to study, through a critical quantitative lens, what it takes to reduce segregation-based disparities in access to core and advanced science learning opportunities and disparities in persistence in STEM majors for cohorts of high school students in New Jersey (NJ). The strategic goal is to advance knowledge on broadening participation in STEM fields.

The STEM education research issues of interest are secondary science course-taking pathways and persisting in a STEM major in college. The objective is to conceptualize exposure/isolation and relative diversity segregation measures in the Stanford Education Data Archives (SEDA) through a critical quantitative lens; study how segregation-based disparities in science course-taking pathways can be reduced; and study how reducing the disparity in science course-taking pathways can reduce the disparity in persistence in STEM majors.

This study will expand the body of research studying the effects of secondary-level science course-taking on postsecondary outcomes. It will extend the knowledge base by applying a critical quantitative lens in testing how reducing disparities in science course pathways can reduce the disparities in a key measure of access to STEM career fields – persisting in a STEM major. Disparity analysis models how changing the shares of a particular group receiving an intervention would reduce the disparity. The idea is to study whether the racial/ethnic disparities in the distribution of students persisting in a STEM major can be mediated when shares of students from schools with greater Black, Latinx, and socioeconomic isolation or less diversity are taking the predictive science course pathways equivalent to the distribution in schools with greater balance in terms of exposure and diversity.

In conducting disparity analysis through a critical quantitative lens, both quantifying the disparities and analyzing how they can be closed, the study has the potential to give the field meaningful targets for shifting science course-taking toward broadening participation in STEM. Findings from the study can place historically persistent issues in more solvable terms.

“The Math is Not Mathing: The Impact of Historical Residential Segregation on Latinx Mathematics Course Taking in High School” -- Elizabeth Rivera Rodas, Montclair State University

Elizabeth Rivera Rodas — Associate Professor, Quantitative Methods and Sociology of Education, Department of Educational Foundations, Montclair State University’s College for Education and Engaged Learning

Abstract:

This study investigates one of the structural processes that affect the low enrollment of Latinx students in STEM postsecondary degrees – course availability. Districts and schools control which courses they offer and therefore, access to high quality advanced-level mathematics courses is partially under the control of education agencies.

These districts and schools are a product of their neighborhoods and residential segregation. Historically divested neighborhoods have lower advanced level course availability (Owens, 2020), and schools in gentrifying neighborhoods have been found to offer special and advanced programs to attract the gentry (Stillman, 2012). The central research question of this project is: “How does the structural barrier of advanced-level mathematics course availability due to residential segregation in high school contribute to the disproportionately low enrollment of STEM postsecondary degrees of Latinx students?” The structural barrier of course unavailability due to historic residential segregation not only affects Latinx course taking in high school, but also restricts the upward mobility of the Latinx community as this barrier negatively affects postsecondary enrollment.

2:00 – 3:30      Methodology Symposium 2: Causal decomposition analysis—mechanisms, approaches, and dilemmas

“Disparity Analysis: A Tale of Two Approaches” -- Xiang Zhou, Harvard University

Xiang Zhou — Professor of Sociology, Harvard University

Abstract:

To understand the patterns and trends of various forms of inequality, quantitative social science research has typically relied on statistical models linking the conditional mean of an outcome of interest to a range of explanatory factors. A prime example of this approach is the widely used Kitagawa-Oaxaca-Blinder (KOB) method. By fitting two linear models separately for an advantaged group and a disadvantaged group, the KOB method decomposes the between-group outcome disparity into two parts, a part explained by group differences in a set of background characteristics and an unexplained part often dubbed “residual inequality.” In this paper, we explicate, contrast, and extend two distinct approaches to studying group disparities, which we term the descriptive approach, as epitomized by the KOB method and its variants, and the prescriptive approach, which focuses on how a disparity of interest would change under a hypothetical intervention to one or more manipulable treatments. For the descriptive approach, we propose a generalized KOB decomposition that considers multiple (sets of) explanatory variables sequentially. For the prescriptive approach, we introduce a variety of stylized interventions, such as lottery-type and affirmative-action-type interventions that close between-group gaps in treatment. We illustrate the two approaches to disparity analysis by assessing the Black-White gap in college completion, how it is statistically explained by racial differences in demographic and socioeconomic background, family structure, academic performance and behavior, and college selectivity, and the extent to which it would be reduced under hypothetical reallocations of college-goers from different racial/economic backgrounds into different tiers of college — reallocations that could be targeted by race- or class-conscious admissions policies.
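The descriptive KOB logic described above can be made concrete with a single covariate. This is an illustrative sketch with hypothetical data and closed-form simple regression, not the paper's generalized sequential decomposition.

```python
# Two-fold Kitagawa-Oaxaca-Blinder decomposition with one covariate.
# Groups, outcomes, and covariate values below are hypothetical.

def ols_simple(x, y):
    """Closed-form simple regression; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)
    return my - beta * mx, beta

def kob_two_fold(xa, ya, xb, yb):
    """Split mean(ya) - mean(yb) into explained + unexplained parts,
    using group B's coefficients as the reference structure."""
    a0_a, b_a = ols_simple(xa, ya)
    a0_b, b_b = ols_simple(xb, yb)
    mxa, mxb = sum(xa) / len(xa), sum(xb) / len(xb)
    explained = b_b * (mxa - mxb)                    # endowment differences
    unexplained = (a0_a - a0_b) + (b_a - b_b) * mxa  # "residual inequality"
    return explained, unexplained

xa, ya = [1, 2, 3, 4], [3, 5, 7, 9]  # advantaged group
xb, yb = [0, 1, 2, 3], [1, 2, 3, 4]  # disadvantaged group
explained, unexplained = kob_two_fold(xa, ya, xb, yb)
```

Because each fitted line passes through its group means, the two parts sum exactly to the raw mean gap; the generalized and prescriptive approaches in the paper build on and move beyond this identity.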

“Causal Decomposition of Group Disparities: Three Mechanisms” -- Ang Yu, University of Wisconsin-Madison / Felix Elwert, University of Wisconsin-Madison

Ang Yu — PhD candidate in Sociology, University of Wisconsin-Madison
Felix Elwert — Professor of Sociology & Biostatistics, University of Wisconsin-Madison

Abstract:

We introduce a new non-parametric causal decomposition approach that identifies the mechanisms by which a treatment variable contributes to group-based outcome disparities. In contrast to prior approaches that distinguish only one or two mechanisms, our approach distinguishes three: differential within-group sorting into treatment, differential prevalence of treatment across groups, and differential effects of treatment across groups. Our approach can be used for the retrospective causal explanation of existing disparities and for the prospective planning of interventions to change disparities. We develop multiply robust and semi-parametrically efficient estimators for all components of our decomposition. We illustrate our approach by analyzing the mechanisms by which college graduation causally contributes to intergenerational income persistence.
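The three mechanisms named in the abstract can be illustrated with a toy potential-outcomes simulation. This is a simplified sketch under strong assumptions (binary treatment, Y = Y0 + D·tau, treatment randomized within group so the sorting term vanishes), not the authors' non-parametric estimator; all names and parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Group A has both higher treatment prevalence and a larger treatment effect.
p = {"A": 0.6, "B": 0.3}      # P(D = 1 | group): differential prevalence
tau = {"A": 2.0, "B": 1.5}    # treatment effect by group: differential effects

means = {}
for g in ("A", "B"):
    y0 = rng.normal(0, 1, n)          # same baseline outcome in both groups
    d = rng.random(n) < p[g]          # randomized within group -> no sorting
    means[g] = (y0 + d * tau[g]).mean()

gap = means["A"] - means["B"]
prevalence = (p["A"] - p["B"]) * tau["B"]   # who gets treated, at B's effect
effect = p["A"] * (tau["A"] - tau["B"])     # how much treatment helps, at A's rate
# With randomized within-group treatment, the baseline and sorting components
# are ~0, so the prevalence and effect terms account for the whole gap (0.75).
print(abs(gap - (prevalence + effect)) < 0.05)  # True
```

In the full approach a third, sorting component appears when treatment take-up is correlated with baseline potential outcomes within a group, e.g., when more advantaged members of a group are the ones who graduate from college.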

“Real-World Implications of a Methodological Dilemma: Endogenous Confounding in Causal Decomposition Analysis” -- Ha-Joon Chung, Princeton University / Guanglei Hong, University of Chicago

Ha-Joon Chung — PhD Candidate in Sociology and Social Policy, Princeton University
Guanglei Hong — Professor in the Comparative Human Development Department and the Committee on Education at the University of Chicago

Abstract:

Should a set of hypothetical interventions eliminate the Black-White gap in malleable factors such as schooling, how much racial disparity in subsequent youth development would be reduced and how much would remain? To address this question, a causal decomposition analysis must credibly identify the causal impact of intervening on the malleable factors. However, major confounders such as parental SES are intermediate outcomes of systemic racism experienced over generations that place Black households and communities at a great disadvantage. For this reason, an intervention that attempts to eliminate the Black-White gap in schooling within levels of parental SES or other endogenous confounders has two major problems: (1) Such an attempt is infeasible when only one but not both racial groups are present within some extreme levels of parental SES. (2) Even if both groups are present within every level of parental SES, the attempt would nonetheless fail to eliminate the marginal Black-White gap in schooling due to the endogeneity of parental SES. We propose a novel solution that replaces the original scale of an endogenous confounder with a scale preserving one’s within-group relative standing. A semiparametric weighting strategy then emulates an equity-oriented intervention. Our analysis of the NLSY 1997 data reveals the real-world implications of the methodological dilemma posed by endogenous confounding.
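The proposed rescaling, replacing an endogenous confounder such as parental SES with a scale preserving within-group relative standing, can be sketched as a within-group percentile rank. The function name and synthetic data below are illustrative assumptions, and the subsequent semiparametric weighting strategy is not reproduced here.

```python
import numpy as np

def within_group_rank(x, group):
    """Percentile rank of x computed separately within each group, so the
    rescaled confounder has the same uniform distribution in every group."""
    x, group = np.asarray(x, float), np.asarray(group)
    out = np.empty_like(x)
    for g in np.unique(group):
        m = group == g
        # Rank in (0, 1]: position of each value within its own group.
        order = x[m].argsort().argsort()
        out[m] = (order + 1) / m.sum()
    return out

rng = np.random.default_rng(2)
# Two groups with very different SES distributions (limited common support).
ses = np.concatenate([rng.normal(2.0, 1.0, 1000),    # group A
                      rng.normal(-1.0, 1.0, 1000)])  # group B
grp = np.array(["A"] * 1000 + ["B"] * 1000)

r = within_group_rank(ses, grp)
# On the rank scale both groups share the same support and mean standing,
# so conditioning on relative standing no longer requires cross-group
# overlap on the original SES scale.
print(round(float(r[grp == "A"].mean()), 2), round(float(r[grp == "B"].mean()), 2))
```

The design choice is that comparing a Black and a White youth at the same within-group SES percentile sidesteps both problems the abstract raises: empty cells at extreme SES levels, and the fact that equalizing schooling within levels of raw SES cannot close the marginal gap when SES itself is an intermediate outcome of systemic racism.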

3:45 – 4:15      Round-Table Discussions hosted by Fellows

4:20 – 4:50      Round-Table Discussions hosted by methodologists

5:00 – 5:30      Closing Remarks and Reflections