My work spans human-centered computing, accessibility, and emerging interaction techniques—driven by a consistent goal: to design intuitive, inclusive, and research-informed user experiences that solve real-world problems.
I was primarily responsible for refining and applying new methods to clean and analyze user-research data in Python. I also conducted a critical analysis of existing educational tools and proposed an AI-informed design framework, along with a set of metrics, to help balance control between teachers and students in the classroom.
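As a minimal sketch of what that cleaning pipeline might look like (assuming the session data was exported to CSV; the file and column names here are purely illustrative, not the project's actual schema):

```python
import pandas as pd

# Hypothetical export of user-research session data; column names are illustrative.
df = pd.read_csv("session_responses.csv")

# Drop exact duplicate submissions and rows missing a participant ID.
df = df.drop_duplicates().dropna(subset=["participant_id"])

# Normalize free-text condition labels (e.g., " Teacher-led " vs "teacher-led").
df["condition"] = df["condition"].str.strip().str.lower()

# Coerce Likert-style responses to numeric and keep only values on the 1-5 scale.
df["satisfaction"] = pd.to_numeric(df["satisfaction"], errors="coerce")
df = df[df["satisfaction"].between(1, 5)]

# Per-condition summary statistics to feed into later analysis.
print(df.groupby("condition")["satisfaction"].agg(["count", "mean", "std"]))
```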
Drawing on both qualitative and quantitative methods, I synthesized insights from 60+ scholarly articles on accessibility, inclusive design, and VR technologies to shape our research direction and methodology; this synthesis directly informed the design goals and constraints for our interface prototypes.
I visualized upper-body gestures for 26 VR commands and collected 312 user-defined gestures from users with diverse motor abilities, helping to define inclusive gesture vocabularies for accessible interaction design.
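One common way such elicitation data is consolidated into a gesture vocabulary is the agreement rate (Vatavu & Wobbrock, 2015). The sketch below shows that standard computation for a single command, assuming each proposal has been labeled by gesture type; the labels and the 12-participant example are hypothetical and not necessarily the analysis used in this project.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate for one command (Vatavu & Wobbrock, 2015):
    AR = sum over identical-gesture groups of C(|group|, 2) / C(n, 2)."""
    n = len(proposals)
    if n < 2:
        return 0.0
    pairs = lambda k: k * (k - 1) / 2  # number of unordered pairs among k items
    grouped = Counter(proposals)
    return sum(pairs(count) for count in grouped.values()) / pairs(n)

# Hypothetical gesture labels proposed for one VR command by 12 participants.
select_gestures = ["point", "point", "grab", "point", "tap", "grab",
                   "point", "point", "tap", "point", "grab", "point"]
print(f"Agreement rate for 'select': {agreement_rate(select_gestures):.2f}")
```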
I recruited 24+ participants and conducted 48 user study sessions combining semi-structured interviews, field observations, and prototype-based usability testing. These sessions captured both behavioral and attitudinal data, and several involved task-based testing on a functional AR prototype. I applied a pre-post comparison framework to evaluate improvements in usability, efficiency, and user satisfaction following key design interventions. Using MAXQDA, I analyzed both the qualitative and quantitative data to uncover performance patterns and inform inclusive interaction design.
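A pre-post comparison of this kind is often analyzed with paired tests on the same participants' scores before and after the intervention. The sketch below illustrates that idea with invented example numbers (not the study's data), using SciPy's paired t-test and Wilcoxon signed-rank test plus a simple paired effect size.

```python
import numpy as np
from scipy import stats

# Hypothetical paired usability scores (e.g., SUS-style, 0-100) for the same
# participants before and after a design intervention; values are illustrative only.
pre_usability  = np.array([62.5, 55.0, 70.0, 47.5, 65.0, 60.0, 72.5, 50.0])
post_usability = np.array([75.0, 67.5, 77.5, 60.0, 70.0, 72.5, 85.0, 62.5])

# Paired t-test for the pre-post comparison; Wilcoxon signed-rank as a
# non-parametric check when normality is doubtful.
t_stat, p_t = stats.ttest_rel(post_usability, pre_usability)
w_stat, p_w = stats.wilcoxon(post_usability, pre_usability)

# Paired effect size: Cohen's d computed on the per-participant differences.
diff = post_usability - pre_usability
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"paired t-test: t={t_stat:.2f}, p={p_t:.3f}")
print(f"Wilcoxon:      W={w_stat:.1f}, p={p_w:.3f}")
print(f"Cohen's d (paired) = {cohens_d:.2f}")
```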