Scholarship in Disability, Technology, and Politics

My research spans multiple disciplines, including Science & Technology Studies, Critical Disability Studies, Human-Computer Interaction, Design Methods, and Media Studies. Through the interconnected theoretical foundations of these disciplines, I have developed critical practices in qualitative reflexive methods which center multiply marginalized experiences in the production of knowledge.  In the spirit of Chela Sandoval’s “oppositional consciousness” (Sandoval, 2000), I critically examine the ways in which human subjects research in science and technology constructs disability as deficit, lack, and tragedy to overcome or eliminate. As a disabled researcher, I envision transformative research practices which build and expand intracommunal solidarities and strive toward liberation for disabled people. My research agenda adapts to emergent and urgent needs the disability community faces when navigating our complex sociotechnical landscape. I center disabled epistemologies as a cultural site of disruptive invention for justice and equity in science, technology, and society.

Team Members

Rua

Associated Funding

  • Unfunded

Calls for Participation

None at this time

Associated Publications

  • Williams, R. M. (2025). Disability Theory in HCI: Research Reform through Community, Learning, and Play. Proceedings of the 2025 Conference on Research on Equitable and Sustained Participation in Engineering, Computing, and Technology, 160–166. https://doi.org/10.1145/3704637.3734761
    • Despite a growing body of scholarship integrating Disability Studies into Human-Computer Interaction research, access to mentorship in this interdisciplinary approach to technology research remains limited. I detail an ongoing distributed mentoring program for disabled HCI academics supporting their research in disability, policy, and ethics. I then describe the Disability Theory in HCI Workshop Series (Fall 2024), which brought scholars together to learn broader concepts of disability theory, including disabled public scholarship. Such programs fill a critical gap in mentorship and development of cross-disciplinary research in HCI as the field continues to build capacity and interest for this paradigm-shifting work.

  • Williams, R. M. (2024, June 5). The Real Dickheads: Investigating the Source of Patient-Physician Conflict in the United States. Just Tech Platform, MediaWell, Social Science Research Council. https://just-tech.ssrc.org/articles/the-real-dickheads-investigating-the-source-of-patient-physician-conflict-in-the-united-states/
    • In October 2018, the Twitter hashtag #DoctorsAreDickheads began trending after YouTube blogger Stevie Boebi released a video discussing her years of experiencing medical gaslighting and dismissal, as well as her eventual diagnosis of Ehlers-Danlos Syndrome. The hashtag was started by Wren Frey, formerly known as K. Sauder, in response to and in solidarity with Boebi’s experiences, to foster a wider conversation about these issues.[1] Other social media denizens (predominantly women, people of color, and people with psychiatric conditions) used the hashtag to express their own frustrated histories of medical neglect and abuse. The hashtag has been part of a larger discussion of implicit and explicit bias[2] amongst physicians that prevents them from providing adequate medical care to patients with marginalized identities.[3] The discourse empowered by #DoctorsAreDickheads exposes the ways that our cultural rhetoric of normative health has material consequences for those most vulnerable to (cis)sexism, fatphobia, and stigma against mental illness.

  • Jackson, L., & Williams, R. M. (2024, April 4). The Wheelchair to Warfare Pipeline: How Disabled People Get Exploited to Build the Technology of War. The New Republic. https://newrepublic.com/article/179391/wheelchair-warfare-pipeline-disability-technology
    • The cutting-edge products that Big Tech and the Pentagon are developing could be rebuilding an untold number of lives. Instead, they’re being sent to the battlefield to ruin more.

  • Williams, R. M. (2023, March). On Being an Outlier: Bias in a Culture of Optimization. Gegenüber | Goethe Institut, Synthetic Truth. https://www.goethe.de/prj/geg/en/thm/tru/25453339.html
    • Proponents of AI promise incredible benefits — but at what cost? Sometimes, we mistake AI as a threat for the far-off future, but our financial, judicial, and medical systems already rely on algorithms. Dr. Rua Williams reflects on the unexpected impact of AI technologies on marginalized groups.

  • Williams, R. M. (2022). “Only the Old and Sick Will Die”—Reproducing ‘Eugenic Visuality’ in COVID-19 Data Visualization. 2022 IEEE International Symposium on Technology and Society (ISTAS), 1–5. https://doi.org/10.1109/ISTAS55053.2022.10227111
    • COVID-19 illness and death have disproportionately impacted marginalized groups the world over. In the United States, Black and Indigenous people have endured the largest risk of death. Disabled and chronically ill people have continued to isolate as their peers “return to normal”, bearing sole liability for their own safety in a society that deems their lives not worth the “sacrifice” of public health measures. While public and institutional policy makers bear personal responsibility for “survival of the fittest” approaches to public health, data science and visualization have contributed to and legitimized many of these eugenic policy decisions through design tropes I characterize as ‘eugenic visuality’. In this paper, I explore how inadequacies and obscurities in COVID-19 data visualization have contributed to and sustained public narratives that devalue marginalized lives for the comfort of white-supremacist and capitalist social norms. While I focus on visualizations and statements provided by the CDC, the implications extend beyond any individual or institution to our collective preconceptions and values. Namely, unexamined biases and unquestioned norms are embedded in data science and visualization, constraining how data is represented and interpreted. These assumptions limit how data can be leveraged in the pursuit of just social policy. Therefore, I propose guiding principles for a Just Visuality in data science and representation, supported by the work of disabled activists and scholars of color.

  • Jackson, L., Haagaard, A., & Williams, R. M. (2022, April 9). Disability Dongle. CASTAC Platypus Blog. https://blog.castac.org/2022/04/disability-dongle/
    • Disability Dongles are contemporary fairy tales that appeal to the abled imagination by presenting a heroic designer-protagonist whose prototype provides a techno-utopian (re)solution to the design problem. Disability Dongle rhetoric instills in students the value of a quick fix over structural change, thus preventing them from seeking out, participating in, and contributing to existing inquiry. By labeling these material-discursive phenomena—the designed artifacts and the discourse through which their meaning is constituted—we work to shift the focus from their misguided concern about our bodies to their under-analyzed intentions and ambitions.

  • Gibson, A., & Williams, R. M. (2022). Who’s in Charge? Information Technology and Disability Justice in the United States (Just Tech Platform) [Field Report]. Social Science Research Council. https://just-tech.ssrc.org/field-reviews/whos-in-charge-information-technology-and-disability-justice-in-the-united-states/
    • Disabled people in the United States are surrounded, defined, and, to some degree, controlled by data, technology, and information—from medical technology and therapies to educational systems to social and government services and policies that shape their lives. The extent to which they can access and use technologies to accomplish their own goals is less clear. This review discusses access to data and technology for people with disabilities, focusing on agency and digital transinstitutionalization—the extension of institutional frameworks, such as surveillance and control, from state hospitals into community settings via data-driven technologies. We amplify academic scholarship and public discussion on disability access and accessibility. We also challenge the idea that disabled people have “access” to technology in contexts where they do not control technology, such as healthcare, internet-enabled smart homes and communities, and the workplace. Whenever possible, we highlight the work of openly disabled researchers, authors, thinkers, and advocates across multiple fields who write about disability and technology and work toward equity for disabled people.

  • Williams, R. M. (2021). I, Misfit: Empty Fortresses, Social Robots, and Peculiar Relations in Autism Research. Techné: Research in Philosophy and Technology, 25(3), 451–478. https://doi.org/10.5840/techne20211019147
    • I draw upon Critical Disability Studies and Race Critical Code Studies to apply an oppositional reading of applied robotics in autism intervention. Roboticists identify care work as a prime legitimizing application for their creations. Popular imagination of robotics in therapeutic or rehabilitative contexts figures the robot as nurse or orderly. Likewise, the dominant narrative tropes of autism are robotic—misfit androids, denizens of the uncanny valley. Diagnostic measures reinforce tropes of autistic uncanniness: monotonous speech, jerky movements, and systematic, over-logical minds. Today, robots are pitched as therapeutic tools to intervene in the social (under)development of autistic children; robots with monotonous voices, jerky, dis-coordinated movements, unsettling affect, and behavior predicated on a system of finite state logic. I present eerie and uneasy connections between the discredited works on autism and selfhood by Bettelheim and contemporary rehabilitative robotics research and imagine possibilities for robotics to divest from legacies of enslavement and policing.

  • Williams, R. M., & Boyd, L. E. (2019). Prefigurative Politics and Passionate Witnessing. The 21st International ACM SIGACCESS Conference on Computers and Accessibility – ASSETS ’19, 262–266. https://doi.org/10.1145/3308561.3355617
    • SIGACCESS and SIGCHI are being shaped by new members as they integrate feminist theory, critical race theory, postcolonial studies, and critical disability studies into the field. What happens when we begin to hold each other accountable to the impact our work has on marginalized users and communities? How do we develop a community of collaborative personal growth and institutional transformation? In this experience report, the authors retell the tangled events that brought them together. Through the passionate witnessing of critical analysis and critique, what might have been a contentious adversarial relationship became a potent partnership devoted to pushing the field toward a transformative future. From our story, we present reflections for the “Prefigurative Politics” of socially conscious computing.

  • Williams, R. M., Boyd, L., & Gilbert, J. E. (2023). Counterventions: A reparative reflection on interventionist HCI. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–11. https://doi.org/10.1145/3544548.3581480
    • Research in HCI applied to clinical interventions relies on normative assumptions about which bodies and minds are healthy, valuable, and desirable. To disrupt this normalizing drive in HCI, we define a “counterventional approach” to intervention technology design informed by critical scholarship and community perspectives. This approach is meant to unsettle normative assumptions of intervention as urgent, necessary, and curative. We begin with a historical overview of intervention in HCI and its critics. Then, through reparative readings of past HCI projects in autism intervention, we illustrate the emergent principles of a counterventional approach and how it may manifest research outcomes that are fundamentally divergent from dominant approaches. We then explicate characteristics of “counterventions” – projects that aim to contest dominant sociotechnical paradigms through privileging community and participants in research inquiry, interaction design, and analysis of outcomes. These divergent research imaginaries have transformative implications for how interventionist HCI might be conducted in future.

  • Williams, R. M., & Gilbert, J. E. (2019). Cyborg Perspectives on Computing Research Reform. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems – CHI EA ’19, 1–11. https://doi.org/10.1145/3290607.3310421
    • Recent exposures of extant and potentially discriminatory impacts of technological advancement have prompted members of the computing research field to reflect on their duty to actively predict and mitigate negative consequences of their work. In 2018, Hecht et al. proposed changes to the peer-review process attending to the computing research community’s responsibility for impacts on society. In requiring researchers and reviewers to expressly consider the positive and negative consequences of each study, the hope is that our community can earnestly shape more ethical innovation and inquiry. We question whether most researchers have sufficient historical context and awareness of activist movements to recognize crucial impacts to marginalized populations. Drawing from the work of feminist theorists and critical disability scholars, we present case studies in leveraging “situated knowledges” in the analysis of research ethics.

  • Williams, R. M., & Gilbert, J. E. (2020). Perseverations of the academy: A survey of wearable technologies applied to autism intervention. International Journal of Human-Computer Studies, 143, 102485. https://doi.org/10.1016/j.ijhcs.2020.102485
    • The Combating Autism Act of 2006 and its reauthorization in 2014 produced unprecedented interest in autism research. Computer Science researchers have devoted considerable attention to applying wearable technologies to existing autism interventions, as well as producing new forms of intervention. Many of these applications base their approach in popular conceptions of autism, leading the work to focus predominantly on social skills training. This survey reviews existing research inquiries and produces alternative research directions informed by emerging research in the fields of psychology, neurology, education, and critical disability studies. Wearable technologies may be uniquely suited to support and empower autistic people in sensorimotor integration, emotional regulation, executive function, communication, and other underrepresented domains of this misunderstood disability.

  • Williams, R. M., Smarr, S., Prioleau, D., & Gilbert, J. E. (2021). Oh No, Not Another Trolley! On the Need for a Co-Liberative Consciousness in CS Pedagogy. IEEE Transactions on Technology and Society, 1–1. https://doi.org/10.1109/TTS.2021.3084913
    • Due to growing concerns for the disproportionate dangers artificial intelligence (AI) advances pose to marginalized groups, proposals for procedural solutions to ethics in AI abound. It is time to consider that some systems may be inherently unethical, even violent, whether or not they are fair. In this article, we deploy a feminist critical discourse analysis of long-format responses to ethical scenarios from computing science undergraduate students. We find that even among students who had a strong understanding of social justice and the power of AI to exacerbate existing inequities, most students contextualize these problems as the product of biased datasets and human mis/trust factors, rather than as problems of design and purpose. Further, while many students recognized racism and classism at play in the potential negative impacts of AI systems, most students failed to recognize ableism as a driving social force for inequity. As computing science faculty, we must recognize that our students graduate to become the researchers and developers of future technosocial systems. Pedagogically, we need more than procedural fixes to systemic inequities. We are not going to program our way into justice. We must learn to say no to building violent things.

  • Williams, R. M. (2021). Six Ways of Looking at Fractal Mechanics. Catalyst: Feminism, Theory, Technoscience, 7(2). https://doi.org/10.28968/cftt.v7i2.33181
    • In this creative nonfiction essay, I traverse through permutations of “fractal mechanics” as a means of processing experiences of oppression and imagining revolutionary futures. I introduce fractal mechanics as a method for thinking through how “the institution,” broadly understood, travels and transmutes from physical structure localized in place to a set of internalized rule sets that bind themselves to transinstitutionalized “host bodies”—a NeoLiberation. Through a series of vignettes illustrating violent experiences of “inclusion,” I explore how the institution is reproduced in neoliberal constructions of inclusion, liberation, and justice. I then integrate critiques of liberation within neoliberal frames with crip imaginings of justice-in-relation to explicate how fractal mechanics can be understood not only as a method of oppression but also a method for revolution. I close with a series of imaginaries that encourage us to prefigure, or dream, a fractal politic of intercommunal connection.

  • Williams, R. M. (2023). All Robots Are Disabled. In Social Robots in Social Institutions (pp. 229–238). IOS Press. https://doi.org/10.3233/FAIA220622
    • All robots are disabled. I mean this in every possible sense. The anthropomorphic robot is always viewed through a deficit lens—a catalog of its failures to ascend from the uncanny valley. The assembly line robot is regarded with a precarious caution and skepticism—it is only noticed when it fails, becomes convalescent, a disruption to capitalist efficiency. Our relationship to robotic agents is constantly tensored by our desire that they transcend human limitations and our frustration that they remain inelegant and demand our vigilance and maintenance. Everywhere the robot is, the politics of disablement follow. Nowhere is this more evident than within the bodies of common cyborgs—disabled people whose ontology is mediated through the interface of organic matter and technology. Through case studies in human-robot relations (in the home, at work, and within the body), this piece explores how disability stigma informs our sociality with robotic agents and sustains exploitative social systems. I then deploy an alternative reading of these relations, informed by disabled cyborg scholarship and experiences, to propose cripped human-robot relations that prefigure liberation from the legacies of colonial, plantation, and carceral social relations.

  • Williams, R. M., & Gilbert, J. E. (2019). “Nothing about us without us”: Transforming participatory research and ethics in human systems engineering. In R. Roscoe & E. Chiou (Eds.), Advancing Diversity Inclusion and Social Justice Through Human Systems Engineering. Taylor & Francis Group.
    • Human systems engineering (HSE) is a broad-reaching, multidisciplinary field that investigates the interaction between systems and human factors to more effectively design materials, tools, and interfaces. HSE is inherently user-focused and human-centered. In a recent HSE-related publication containing 46 conference proceedings, 20 papers included one or more references to “human-centered,” “human factors,” “user-centered,” “user engagement,” “user acceptance,” or similar terms in the title (i.e., Ahram, Karwowski, & Taiar, 2018). However, human-centered and user-centered do not always mean “participatory,” and as this chapter will reveal, participatory design is not de facto ethical. Researchers must look beyond the comfort of defined procedure to critically evaluate the flow of power in their work.
