I am an Assistant Professor in Interactive Media in the Department of Computer Graphics Technology (CGT), Purdue Polytechnic Institute, Purdue University. At Purdue, I lead the Design & Engineering for Making (DΞ4M) Lab. I develop enabling tools, techniques, and devices that mediate and enhance human interaction with physical and virtual objects and environments. My research combines computational design, digital fabrication, mechanical engineering, and computer science to facilitate the making of future tools and interfaces for integrated experiences at scales ranging from the personal to the environmental.
Before joining Purdue, I obtained my Ph.D. in Computer Science & Engineering from the University of Washington. I have also worked at HP Labs, Microsoft Research (Redmond), and the Keio-NUS CUTE Center. My research has been published at top HCI/UbiComp venues such as CHI, UIST, IMWUT, TEI, and ASSETS, and has received multiple paper awards.
Dear prospective students: Thank you for your interest!
PhD applicants: I am looking to hire one PhD student for the 2024-2025 academic year. For information about applications and admissions for the Spring and Fall semesters, please apply through the Purdue Polytechnic Institute.
Graduate, undergraduate, and high school students: Please fill out this form if you are interested in working in the DΞ4M Lab.
Fact: I do research, but I also like to make things for myself, my dear friends, and the communities I love.
Teaching at Purdue:
Fall 2023: CGT 512 - Foundational Readings of UX Design
Spring 2023: CGT 532 - UX Design Graduate Studio II: Cross-Channel
Fall 2022: CGT 116 - Geometric Modeling for Visualization and Communication
Zeyu Yan, Hsuanling Lee, Liang He, and Huaishu Peng
We present a pipeline for printing interactive and always-on magnetophoretic displays using affordable Fused Deposition Modeling (FDM) 3D printers. Using our pipeline, an end-user can convert the surface of a 3D shape into a matrix of voxels. The generated model can be sent to an FDM 3D printer equipped with an additional syringe-based injector. During the printing process, an oil and iron powder-based liquid mixture is injected into each voxel cell, allowing the appearance of the once-printed object to be editable with external magnetic sources. To achieve this, we made modifications to the 3D printer hardware and the firmware. We also developed a 3D editor to prepare printable models. We demonstrate our pipeline with a variety of examples, including a printed Stanford bunny with customizable appearances, a small espresso mug that can be used as a post-it note surface, a board game figurine with a computationally updated display, and a collection of flexible wearable accessories with editable visuals.
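For readers curious about the voxel-conversion step, here is a rough, self-contained Python sketch of how a triangle mesh surface could be rasterized into voxel cells. It is illustrative only: the function names and the sampling strategy are my own shorthand, not the actual 3D editor code from the paper.

# Hypothetical sketch of the "surface -> voxel matrix" step described above.
# Assumes a triangle mesh given as numpy arrays; not the authors' editor code.
import numpy as np

def voxelize_surface(vertices, faces, cell_size=2.0, samples_per_edge=8):
    """Return the set of (i, j, k) voxel cells touched by the mesh surface."""
    occupied = set()
    origin = vertices.min(axis=0)              # grid anchored at the bounding-box corner
    u = np.linspace(0.0, 1.0, samples_per_edge)
    for tri in faces:
        a, b, c = vertices[tri]
        # densely sample barycentric points on the triangle
        for s in u:
            for t in u:
                if s + t > 1.0:
                    continue
                p = a + s * (b - a) + t * (c - a)
                ijk = tuple(np.floor((p - origin) / cell_size).astype(int))
                occupied.add(ijk)
    return occupied

# Each occupied cell would then become one hollow pocket in the print,
# filled by the syringe injector with the iron-powder/oil mixture.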
Liang He, Xia Su, Huaishu Peng, Jeffrey I. Lipton, and Jon E. Froehlich
We present Kinergy—an interactive design tool for creating self-propelled motion by harnessing the energy stored in 3D printable springs. To produce controllable output motions, we introduce 3D printable kinetic units, a set of parameterizable designs that encapsulate 3D printable springs, compliant locks, and transmission mechanisms for three non-periodic motions—instant translation, instant rotation, continuous translation—and four periodic motions—continuous rotation, reciprocation, oscillation, intermittent rotation. Kinergy allows the user to create motion-enabled 3D models by embedding kinetic units, customize output motion characteristics by parameterizing embedded springs and kinematic elements, control energy by operating the specialized lock, and preview the resulting motion in an interactive environment. We demonstrate the potential of our techniques via example applications from spring-loaded cars to kinetic sculptures and close with a discussion of key challenges such as geometric constraints.
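As a flavor of the spring math behind kinetic units, the sketch below uses the textbook stiffness formula for a helical spring and the elastic energy stored at a given deflection. The material value is a rough assumption for printed PLA, not a number from the paper.

# Textbook helical-spring math, in the spirit of Kinergy's parameterizable springs.
# The PLA shear modulus below is a rough assumption, not a value from the paper.

def spring_stiffness(shear_modulus, wire_d, coil_d, n_active):
    """Stiffness k (N/mm) of a helical spring: k = G * d^4 / (8 * D^3 * n)."""
    return shear_modulus * wire_d**4 / (8.0 * coil_d**3 * n_active)

def stored_energy(k, deflection):
    """Elastic energy E = 0.5 * k * x^2 available to drive the output motion."""
    return 0.5 * k * deflection**2

k = spring_stiffness(shear_modulus=1200.0, wire_d=2.0, coil_d=12.0, n_active=8)  # MPa, mm
energy = stored_energy(k, deflection=15.0)  # N*mm released when the lock is opened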
Hongnan Lin, Liang He, Fangli Song, Yifan Li, Tingyu Cheng, Clement Zheng, Wei Wang, and Hyunjoo Oh
This paper presents FlexHaptics, a design method for creating custom haptic input interfaces. Our approach leverages planar compliant structures whose force-deformation relationship can be altered by adjusting the geometries. Embedded with such structures, a FlexHaptics module exerts a fine-tunable haptic effect (i.e., resistance, detent, or bounce) along a movement path (i.e., linear, rotary, or ortho-planar). These modules can work separately or combine into an interface with complex movement paths and haptic effects. To enable the parametric design of FlexHaptic modules, we provide a design editor that converts user-specified haptic properties into underlying mechanical structures of haptic modules. We validate our approach and demonstrate the potential of FlexHaptic modules through six application examples, including a slider control for a painting application and a piano keyboard interface on touchscreens, a tactile low vision timer, VR game controllers, and a compound input device of a joystick and a two-step button.
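The snippet below is a toy force-displacement model of the three haptic effects named above (resistance, detent, bounce) along a 1-D path. It is only meant to illustrate what "fine-tunable haptic effect" means; FlexHaptics realizes such curves through compliant geometry, not closed-form functions like these.

# Toy force-displacement curves for the three effect types; purely illustrative.
import numpy as np

def haptic_force(x, effect="detent", k=0.4, detent_pitch=5.0, detent_depth=1.5):
    """Reaction force (N) along a 1-D movement path x (mm)."""
    x = np.asarray(x, dtype=float)
    if effect == "resistance":        # near-constant drag along the whole travel
        return np.full_like(x, k)
    if effect == "detent":            # periodic bumps the user clicks through
        return detent_depth * np.sin(2 * np.pi * x / detent_pitch)
    if effect == "bounce":            # spring-back that grows with displacement
        return k * x
    raise ValueError(f"unknown effect: {effect}")

profile = haptic_force(np.linspace(0, 20, 200), effect="detent")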
Liang He, Jarrid A Wittkopf, Ji Won Jun, Kris Erickson, and Rafael 'Tico' Ballagas
Integrating electronics with highly custom 3D designs for the physical fabrication of interactive prototypes is traditionally cumbersome and requires numerous iterations of manual assembly and debugging. With the new capabilities of 3D printers, combining electronic design and 3D modeling workflows can lower the barrier for achieving interactive functionality or iterating on the overall design. We present ModElec—an interactive design tool that enables the coordinated expression of electronic and physical design intent by allowing designers to integrate 3D-printable circuits with 3D forms. With ModElec, the user can arrange electronic parts in a 3D body, modify the model design with embedded circuits updated, and preview the auto-generated 3D traces that can be directly printed with a multi-material-based 3D printer.
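One small ingredient behind auto-generated traces is routing a conductive path through the free space of the printed body. The sketch below shows a plain breadth-first search over a voxelized volume; it is a simplified stand-in, not ModElec's actual router.

# Hypothetical sketch: shortest-path routing of one trace through the free cells
# of a voxel grid representing the printed body. ModElec's router is richer.
from collections import deque

def route_trace(free, start, goal):
    """BFS over a set of free (x, y, z) cells; returns a cell path or None."""
    steps = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        for dx, dy, dz in steps:
            nxt = (cell[0] + dx, cell[1] + dy, cell[2] + dz)
            if nxt in free and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None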
Liang He, Huaishu Peng, Michelle Lin, Ravikanth Konjeti, François Guimbretière, and Jon E. Froehlich
We present Ondulé—an interactive design tool that allows novices to create parameterizable deformation behaviors in 3D-printable models using helical springs and embedded joints. Informed by spring theory and our empirical mechanical experiments, we introduce spring and joint-based design techniques that support a range of parameterizable deformation behaviors, including compress, extend, twist, bend, and various combinations. To enable users to design and add these deformations to their models, we introduce a custom design tool for Rhino. With the tool, users can convert selected geometries into springs, customize spring stiffness, and parameterize their design with mechanical constraints for desired behaviors.
Liang He, Gierad Laput, Eric Brockmeyer, and Jon E. Froehlich
We present SqueezaPulse, a technique for embedding interactivity into fabricated objects using soft, passive, low-cost bellow-like structures. When a soft cavity is squeezed, air pulses travel along a flexible pipe and into a uniquely designed corrugated tube that shapes the airflow into predictable sound signatures. A microphone captures and identifies these air pulses, enabling interactivity. Informed by the underlying acoustic theory, we describe an informal examination of the effect of different 3D-printed corrugations on air signatures and our resulting SqueezaPulse implementation. To demonstrate and evaluate the potential of SqueezaPulse, we present four prototype applications and a small, lab-based user study (N=9). Our evaluations show that our approach is accurate across users and robust to external noise.
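The recognition idea can be sketched compactly: each corrugated tube produces a characteristic frequency signature, so an incoming pulse can be matched against per-tube spectral templates. The code below is an assumption-laden illustration (the nearest-template rule and parameters are mine), not the implementation from the paper.

# Illustrative classifier: compare a pulse's magnitude spectrum to per-tube templates.
import numpy as np

def spectrum(signal, n_fft=4096):
    """Normalized magnitude spectrum of one squeeze pulse."""
    mag = np.abs(np.fft.rfft(signal, n=n_fft))
    return mag / (np.linalg.norm(mag) + 1e-9)

def classify_pulse(signal, templates):
    """Pick the tube whose template spectrum correlates best with the pulse."""
    s = spectrum(signal)
    return max(templates, key=lambda name: float(np.dot(s, templates[name])))

# templates = {"tube_a": spectrum(recorded_pulse_a), "tube_b": spectrum(recorded_pulse_b)}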
Majeed Kazemitabaar, Jason McPeak, Alexander Jiao, Liang He, Thomas Outing, and Jon E. Froehlich
Wearable construction toolkits have shown promise in broadening participation in computing and empowering users to create personally meaningful computational designs. However, these kits present a high barrier of entry for some users, particularly young children (K-6). In this paper, we introduce MakerWear, a new wearable construction kit for children that uses a tangible, modular approach to wearable creation. We describe our participatory design process, the iterative development of MakerWear, and results from single- and multi-session workshops with 32 children (ages 5-12; M=8.3 years). Our findings reveal how children engage in wearable design, what they make (and want to make), and what challenges they face. As a secondary analysis, we also explore age-related differences.
Best Paper Award at CHI'17 | Best LBW Paper Award at CHI'16
Beryl Plimmer, Liang He, Tariq Zaman, Kasun Karunanayaka, Alvin W. Yeo, Garen Jengan, Rachel Blagojevic, and Ellen Yi-Luen Do
The Penan people of Malaysian Borneo were traditionally nomads of the rainforest. They would leave messages in the jungle for each other by shaping natural objects into language tokens and arranging these symbols in specific ways -- much like words in a sentence. With settlement, the language is being lost as it is not being used by the younger generation. Here we report a tangible system designed to help the Penan preserve their unique object writing language. The key features of the system are that its tangibles are made of real objects, it works in the wild, and new tangibles can be fabricated and added to the system by the users. Our evaluations show that the system is engaging and encourages intergenerational knowledge transfer, and thus has the potential to help preserve this language.
Honorable Mention Award at CHI'15
Xuhai Xu, Jiahao Li, Tianyi Yuan, Liang He, Xin Liu, Yukang Yan, Yuntao Wang, Yuanchun Shi, Jennifer Mankoff, and Anind K Dey.
We present HulaMove, a novel interaction technique that leverages the movement of the waist as a new eyes-free and hands-free input method for both the physical world and the virtual world. We first conducted a user study (N=12) to understand users’ ability to control their waist. We found that users could easily discriminate eight shifting directions and two rotating orientations, and quickly confirm actions by returning to the original position (quick return). We developed a design space with eight gestures for waist interaction based on the results and implemented an IMU-based real-time system. Using a hierarchical machine learning model, our system could recognize waist gestures at an accuracy of 97.5%. Finally, we conducted a second user study (N=12) for usability testing in both real-world scenarios and virtual reality settings.
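For a rough sense of what a hierarchical IMU gesture classifier looks like, here is a sketch with a coarse stage (shift / rotate / none) followed by per-class fine classifiers. The window features and random-forest models are assumptions made for illustration; the paper's pipeline differs in its details.

# Two-stage (hierarchical) IMU gesture classifier sketch; not the paper's model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(imu_window):
    """imu_window: (T, 6) array of accel + gyro samples -> fixed-length features."""
    return np.concatenate([imu_window.mean(axis=0),
                           imu_window.std(axis=0),
                           imu_window.min(axis=0),
                           imu_window.max(axis=0)])

class HierarchicalGestureModel:
    def __init__(self):
        self.coarse = RandomForestClassifier(n_estimators=100)  # shift / rotate / none
        self.fine = {}                                          # one classifier per coarse class

    def fit(self, windows, coarse_labels, fine_labels):
        X = np.stack([window_features(w) for w in windows])
        self.coarse.fit(X, coarse_labels)
        for c in set(coarse_labels):
            idx = [i for i, lab in enumerate(coarse_labels) if lab == c]
            clf = RandomForestClassifier(n_estimators=100)
            clf.fit(X[idx], [fine_labels[i] for i in idx])
            self.fine[c] = clf

    def predict(self, window):
        x = window_features(window).reshape(1, -1)
        c = self.coarse.predict(x)[0]
        return c, self.fine[c].predict(x)[0]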
Liang He, Ruolin Wang, and Xuhai Xu
Blind and visually impaired (BVI) people can fetch objects in an acquainted environment by touching objects or relying on their memory. However, in a complex and less familiar situation, those strategies become less useful or even lead to danger (e.g., touching hazardous obstacles). We present PneuFetch, a light haptic cue-based wearable device that supports blind and visually impaired (BVI) people in fetching nearby objects in an unfamiliar environment. In our design, we generate friendly, non-intrusive, and gentle presses and drags to deliver direction and distance cues on a BVI user's wrist and forearm. As a proof of concept, we discuss our PneuFetch wearable prototype, contrast it with past work, and describe a preliminary user study.
Venkatesh Potluri, Liang He, Christine Chen, Jon E. Froehlich, and Jennifer Mankoff
Blind and visually impaired (BVI) individuals are increasingly creating visual content online; however, there is a lack of tools that allow these individuals to modify the visual attributes of the content and verify the validity of those modifications. We discuss the design and preliminary exploration of a multi-modal and accessible approach for BVI developers to edit visual layouts of webpages while maintaining visual aesthetics. The system includes three parts: an accessible canvas, a code editor, and a controller that checks if the updates violate design guidelines.
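A concrete example of the kind of guideline check such a controller could run is the WCAG contrast-ratio test between text and background colors, sketched below. The luminance and ratio formulas are the standard WCAG 2.x ones; treating contrast as the representative guideline here is my simplification.

# WCAG 2.x contrast check, used here as one illustrative design-guideline rule.
def relative_luminance(rgb):
    """rgb: (r, g, b) in 0-255, per the WCAG sRGB relative-luminance definition."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted([relative_luminance(fg), relative_luminance(bg)], reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def violates_contrast(fg, bg, minimum=4.5):   # 4.5:1 is the WCAG AA threshold for body text
    return contrast_ratio(fg, bg) < minimum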
Liang He, Zijian Wan, Leah Findlater, and Jon E. Froehlich
Tactile overlays with audio annotations can increase the accessibility of touchscreens for blind users; however, preparing these overlays is complex and labor intensive. We introduce TacTILE, a novel toolchain to more easily create tactile overlays with audio annotations for arbitrary touchscreen graphics (e.g., graphs, pictures, maps). The workflow includes: (i) an annotation tool to add audio to graphical elements, (ii) a fabrication process that generates 3D-printed tactile overlays, and (iii) a custom app for the user to explore graphics with these overlays. We close with a pilot study with one blind participant who explores three examples (floor plan, photo, and chart), and a discussion of future work.
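The exploration app's core lookup can be sketched simply: map a touch point to the audio annotation of the tactile element under the finger. Rectangular annotation regions are an assumption made for brevity, not the app's actual data model.

# Minimal sketch of the touch-to-annotation lookup in the exploration app.
from dataclasses import dataclass

@dataclass
class Annotation:
    label: str          # text spoken or audio clip played when the region is touched
    x: float            # region origin in screen coordinates (px)
    y: float
    width: float
    height: float

def annotation_at(touch_x, touch_y, annotations):
    """Return the annotation whose region contains the touch point, if any."""
    for a in annotations:
        if a.x <= touch_x <= a.x + a.width and a.y <= touch_y <= a.y + a.height:
            return a
    return None

# e.g., annotation_at(120, 340, floor_plan_annotations) -> the room label to speak aloud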
Kelvin Cheng, Liang He, Xiaojun Meng, David A. Shamma, Dung Nguyen, and Anbarasan T.
CozyMaps is a multi-display system that uses several tablet devices and a shared large display to support real-time collocated collaboration on a shared map. This paper builds on existing work and introduces rich user interactions by proposing awareness, notification, and view sharing techniques that enable seamless information sharing and integration in map-based applications. In our exploratory study, participants were satisfied with these newly proposed interactions, and we found that view sharing techniques should be location-focused rather than user-focused. Our results provide implications for the design of interaction techniques in collaborative multi-display map systems.
Liang He, Cheng Xu, Ding Xu, and Ryan Brill
A common approach to creating haptic cues is moving the contact surface with electromechanical actuators such as vibrating electric motors, piezoelectric motors, or voice coils. While these actuators can be configured to effectively convey rich information, their high-frequency movements can elicit negative responses after lengthy exposure. We present PneuHaptic, a pneumatically actuated arm-worn haptic interface. The system triggers a range of tactile sensations on the arm by alternately pressurizing and depressurizing a series of custom-molded silicone chambers. We detail the implementation of our functional prototype and explore the possibilities for interaction enabled by the system.
DE4M Lab Logo Design
Makeability Lab Logo Design
HiLab at UCLA Logo Design (co-designed with Yang Zhang)
ASSETS 2022 Logo Design
UIST 2019 Logo Design
CHI 2019 Student Volunteer T-shirt Design
CHI 2014 Student Volunteer T-shirt Design
Design and development of FabGalaxy, a visualization tool for the survey "Personal Fabrication Research in HCI and Graphics: An Overview of Related Work," maintained by the HCI Engineering Group at MIT CSAIL
09/2023: Invited to the SIGGRAPH 2024 Emerging Technologies Jury Committee
08/2023: One poster paper accepted to UIST 2023
08/2023: One poster paper accepted to ASSETS 2023
07/2023: Talk at Tsinghua University (hosted by Yukang Yan)
06/2023: Invited as an external mentor for NSF REU HDF 2023
06/2023: One paper conditionally accepted to UIST 2023
05/2023: Received recognition for excellent reviews for UIST 2023
05/2023: Talk at Zhejiang University (hosted by Guanyun Wang)
05/2023: Invited to the CHI 2024 Program Subcommittee
05/2023: Talk at Duke Kunshan University (hosted by Xin Tong)
05/2023: Received recognition for excellent reviews for DIS 2023
04/2023: Talk at the MIT HCI Seminar (hosted by Arvind Satyanarayan)
01/2023: Invited to serve on the Program Committee for ASSETS 2023
01/2023: Invited to serve as an AC for DIS 2023 Papers and Pictorials
01/2023: Invited to serve as Posters & Demos Co-chair for ASSETS 2023
01/2023: Started serving as Proceedings Co-chair for UIST 2023
01/2023: Received recognition for excellent reviews for CHI 2023
12/2022: Received recognition for excellent reviews for IMWUT
12/2022: Invited as a guest editor for the journal CCF TPCI
11/2022: Invited guest talk on robotics & assistive technology at Purdue
11/2022: Invited guest talk on "Prototyping" at the University of Delaware
11/2022: Invited to serve on the IDC 2023 Program Committee
10/2022: Attending SCF 2022 and UIST 2022: presenting sPrintr and Kinergy and chairing one talk session
09/2022: One demo paper about mobile 3D printing accepted to SCF 2022
06/2022: DURI research proposal (as the PI) accepted!
06/2022: One paper conditionally accepted to UIST 2022
06/2022: Invited panelist at the DIS 2022 AMA
06/2022: Received special recognition for excellent reviews for UIST 2022
05/2022: Attending CHI 2022 in person in NOLA
04/2022: Mentoring college students from underrepresented groups in the CSNext Workshop at UW
03/2022: Invited talk at Georgia Tech
02/2022: Invited to serve on the ASSETS 2022 Program Committee
02/2022: Proposal on "Towards More Personal Health Sensing" accepted to a CHI 2022 SIG
01/2022: Invited talk on "Beyond Shape" at the Hasso Plattner Institute
12/2021: Invited talk on "Beyond Shape" at the University of Maryland, College Park
11/2021: Invited talk on ModElec at the CSE Colloquium, UW
11/2021: FlexHaptics conditionally accepted to CHI 2022
11/2021: Received special recognition for excellent reviews for CHI 2022
10/2021: Serving as Proceedings Co-chair for UIST 2022
10/2021: Serving as Web and Graphic Design Co-chair for ASSETS 2022
10/2021: Served as a session chair at UIST 2021
10/2021: ModElec recommended for acceptance to IMWUT with minor revisions (first submission round)
07/2021: Invited talk on "Beyond Shape" at the University of Calgary
06/2021: Received the Bob Bandes Memorial Student Teaching Award (Honorable Mention) for 2020-2021
05/2021: Received special recognition for excellent reviews for UIST 2021
12/2020: Passed the PhD general exam (thesis proposal)
12/2020: One paper conditionally accepted to CHI 2021
12/2020: Invited talk on 3D-printed electronics at the HP 3D Print Lab Technical Forum
10/2020: Presenting thesis work at the UIST 2020 Doctoral Symposium
07/2020: Position paper accepted to the UIST 2020 Doctoral Symposium
07/2020: Received special recognition for excellent reviews for UIST 2020
05/2020: Accepted to the UW DUB Doctoral Colloquium 2020
03/2020: Received special recognition for excellent reviews for CHI 2020
03/2020: Received a UW GPSS travel grant and a Graduate School Conference Travel Award
02/2020: PneuFetch accepted to CHI 2020 Late-Breaking Work (LBW)
12/2019: Talk at ISCAS, Beijing
09/2019: Invited HCI Lunch Talk at Stanford
08/2019: UITalk accepted as an ASSETS 2019 poster
07/2019: Invited to the UW MSR Summer Institute
07/2019: Started a research internship at HP Labs
06/2019: Ondulé accepted to UIST 2019
2022: Invited Talk at Georgia Tech.
2022: Invited Talk on "Beyond Shape" at the Hasso Plattner Institute.
2021: Invited Talk on "Beyond Shape" at the University of Maryland, College Park.
2021: Invited Talk on "Beyond Shape" at the University of Calgary.
2021: Lightning Talk. IWHEC 2021 Affiliated Forum.
2020: Talk. HP 3D Print Lab.
2020: Presentation. UIST '20 Doctoral Symposium.
2020: PhD Talk. DUB Doctoral Colloquium, UW.
2019: Talk. ISCAS, Beijing.
2019: HCI Lunch Talk. Stanford.
2019: Lightning Talk. UW CSE/MSR Summer Institute.
2019: Lecture "Heuristic Evaluation". CSE 440A (Introduction to HCI), UW.
2018: "Video Making". CSE SkillShare Workshop, UW.
2018: Computational Fabrication. UW CSE Colloquia.
2018: Industry Affiliates Research Day, UW, Seattle.
2018: Workshop "3D Modeling with Fusion 360". CSE 590A (Ubiquitous Computing), UW.
2018: Lecture "Intro to Laser Cutting". HCID 521 (Prototyping Studio), UW.
2016: Tech+Design: Interaction Design for a Purpose. UMD, College Park.
2016: HCIL's Annual Symposium. UMD, College Park.