Penn HCI is founded and led by Assistant Professors Andrew Head and Danaë Metaxa. We are currently recruiting postdocs, graduate students, and undergraduate researchers to continue growing our group.
Assistant Professor, University of Pennsylvania
Assistant Professor, University of Pennsylvania
PhD Student, University of Pennsylvania
Danaë spoke with Professor Duncan J. Watts as part of the Annenberg Conversations on Gender Seminar series.
In January 2022, the Computer and Information Science department will welcome Andrew Head as an Assistant Professor. Andrew, who will be starting the Penn HCI (Human-Computer Interaction) Group with fellow new hire Danaë Metaxa, focuses on helping others express their work fluidly and efficiently.
When asked what makes them passionate about their work, Danaë Metaxa describes an intrinsic calling to attend to the needs of those neglected by scientific design and application.
View recent publications and filter by topic, author, year, and more.
Danaë Metaxa, Michelle A. Gan, Su Goh, Jeff Hancock, and James A. Landay
Algorithmically-mediated content is both a product and producer of dominant social narratives, and it has the potential to impact users’ beliefs and behaviors. We present two studies on the content and impact of gender and racial representation in image search results for common occupations. In Study 1, we compare 2020 workforce gender and racial composition to that reflected in image search. We find evidence of underrepresentation on both dimensions: women are underrepresented in search at a rate of 42% women for a field with 50% women; people of color are underrepresented with 16% in search compared to an occupation with 22% people of color (the latter being proportional to the U.S. workforce). We also compare our gender representation data with that collected in 2015 by Kay et al., finding little improvement in the last half-decade. In Study 2, we study people’s impressions of occupations and sense of belonging in a given field when shown search results with different proportions of women and people of color. We find that both axes of representation as well as people’s own racial and gender identities impact their experience of image search results. We conclude by emphasizing the need for designers and auditors of algorithms to consider the disparate impacts of algorithmic content on users of marginalized identities.
Dongyeop Kang, Andrew Head, Risham Sidhu, Kyle Lo, Daniel S. Weld, and Marti A. Hearst
The task of definition detection is important for scholarly papers, because papers often make use of technical terminology that may be unfamiliar to readers. Despite prior work on definition detection, current approaches are far from being accurate enough to use in real-world applications. In this paper, we first perform in-depth error analysis of the current best performing definition detection system and discover major causes of errors. Based on this analysis, we develop a new definition detection system, HEDDEx, that utilizes syntactic features, transformer encoders, and heuristic filters, and evaluate it on a standard sentence-level benchmark. Because current benchmarks evaluate randomly sampled sentences, we propose an alternative evaluation that assesses every sentence within a document. This allows for evaluating recall in addition to precision. HEDDEx outperforms the leading system on both the sentence-level and the document-level tasks, by 12.7 F1 points and 14.4 F1 points, respectively. We note that performance on the high-recall document-level task is much lower than in the standard evaluation approach, due to the need to incorporate document structure as features. We discuss remaining challenges in document-level definition detection, ideas for improvements, and potential issues for the development of reading aid applications.
Andrew Head, Kyle Lo, Dongyeop Kang, Raymond Fok, Sam Skjonsberg, Daniel S. Weld, and Marti A. Hearst
Despite the central importance of research papers to scientific progress, they can be difficult to read. Comprehension is often stymied when the information needed to understand a passage resides somewhere else—in another section, or in another paper. In this work, we envision how interfaces can bring definitions of technical terms and symbols to readers when and where they need them most. We introduce ScholarPhi, an augmented reading interface with four novel features: (1) tooltips that surface position-sensitive definitions from elsewhere in a paper, (2) a filter over the paper that “declutters” it to reveal how the term or symbol is used across the paper, (3) automatic equation diagrams that expose multiple definitions in parallel, and (4) an automatically generated glossary of important terms and symbols. A usability study showed that the tool helps researchers of all experience levels read papers. Furthermore, researchers were eager to have ScholarPhi’s definitions available to support their everyday reading.
In this special topics course, you will learn the essentials of human-computer interaction (HCI). Over the course of a semester, you will learn how to design interactive systems that satisfy and delight users by undertaking the human-centered design process, from ideation to prototyping, implementation, and assessment with human users. You will learn key tools in the HCI research toolkit, including need-finding, user studies, visual design, cognitive models, demo'ing, ethical considerations, and writing HCI research papers. This course also provides a primer on several emerging research areas in HCI, including human-AI interaction, interactive programming tools, and education technology. To hone your craft as an HCI researcher, during this course you will undertake a small group HCI research project, which you will choose in consultation with the instructor. The final submission will be an extended abstract that could be submitted to an HCI conference (exceptional projects will be encouraged to submit for publication, and projects from past offerings of this course have been published!).
The design, implementation, and evaluation of user interfaces. User-centered design and task analysis. Conceptual models and interface metaphors. Usability inspection and evaluation methods. Analysis of user study data. Input methods (keyboard, pointing, touch, tangible) and input models. Visual design principles. Interface prototyping and implementation methodologies and tools. Students will develop a user interface for a specific task and target user group in teams.
In this graduate seminar, we will explore a growing body of work at the intersection of technology and social justice. A range of areas fall under this umbrella, including tech ethics, design justice, and algorithmic fairness, as well as work on equity, bias, diversity, and representation in computer science and related disciplines. Students will read and discuss a wide range of this work through both critical and generative lenses.