LI∆NG HΞ

Assistant Professor (He/His)
Department of Computer Graphics Technology
Polytechnic Institute, Purdue University
Office: Knoy 331, West Lafayette | lianghe@purdue.edu

I am an Assistant Professor in Interactive Media in the Department of Computer Graphics Technology (CGT), Polytechnic Institute, Purdue University. At Purdue, I lead the Design & Engineering for Making (DΞ4M) Lab. I develop enabling tools, techniques, and devices that mediate and enhance human interaction with physical and virtual objects and environments. My research combines computational design, digital fabrication, mechanical engineering, and computer science to facilitate the making of future tools and interfaces for integrated experiences, from personal to environmental scales.

Before joining Purdue, I obtained my Ph.D. in Computer Science & Engineering from the University of Washington. I have also worked at HP Labs, Microsoft Research (Redmond), and the Keio-NUS CUTE Center. My research has been published at top HCI/UbiComp venues such as CHI, UIST, IMWUT, TEI, and ASSETS, and has received multiple awards.

Dear prospective students: Thank you for your interest!

PhD applicants: I am looking to hire one PhD student for the 2024-2025 academic year. Applications for Spring and Fall admission should be submitted through the Purdue Polytechnic Institute.

Graduate, undergraduate, and high school students: Please fill out this form if you are interested in working in the DΞ4M Lab.

Fact: I do research, but I also like to make things for myself, my dear friends, and my beloved communities.

Teaching at Purdue:

Fall 2023: CGT 512 - Foundational Readings of UX Design

Spring 2023: CGT 532 - UX Design Graduate Studio II: Cross-Channel

Fall 2022: CGT 116 - Geometric Modeling for Visualization and Communication

NEWS

09/23: Invited to the SIGGRAPH 2024 Emerging Technologies Jury

08/23: One poster paper accepted to UIST 2023

08/23: One poster paper accepted to ASSETS 2023

07/23: Talk at Tsinghua University (hosted by Yukang Yan)

06/23: Invited as an external mentor for NSF REU HDF 2023

06/23: One paper conditionally accepted to UIST 2023

05/23: Received recognition for excellent reviews for UIST 2023

05/23: Talk at Zhejiang University (hosted by Guanyun Wang)

05/23: Invited to the CHI 2024 Program Subcommittee

05/23: Talk at Duke Kunshan University (hosted by Xin Tong)

05/23: Received recognition for excellent reviews for DIS 2023

04/23: Talk at the MIT HCI Seminar (hosted by Arvind Satyanarayan)

01/23: Invited to serve on the Program Committee for ASSETS'23

01/23: Invited to serve as an AC for DIS'23 Papers and Pictorials

#1 - Major Research Foci: Digital Fabrication / Tangible Interaction / Making
3D Printing Magnetophoretic Displays (UIST 2023)

Zeyu Yan, Hsuanling Lee, Liang He, and Huaishu Peng

We present a pipeline for printing interactive and always-on magnetophoretic displays using affordable Fused Deposition Modeling (FDM) 3D printers. Using our pipeline, an end-user can convert the surface of a 3D shape into a matrix of voxels. The generated model can be sent to an FDM 3D printer equipped with an additional syringe-based injector. During the printing process, an oil and iron powder-based liquid mixture is injected into each voxel cell, allowing the appearance of the once-printed object to be editable with external magnetic sources. To achieve this, we made modifications to the 3D printer hardware and firmware. We also developed a 3D editor to prepare printable models. We demonstrate our pipeline with a variety of examples, including a printed Stanford bunny with customizable appearances, a small espresso mug that can be used as a post-it note surface, a board game figurine with a computationally updated display, and a collection of flexible wearable accessories with editable visuals.

To appear at UIST 2023 [Code repository will be released soon]
Kinergy: Creating 3D Printable Motion using Embedded Kinetic Energy (UIST 2022)

Liang He, Xia Su, Huaishu Peng, Jeffrey I. Lipton, and Jon E. Froehlich

We present Kinergy—an interactive design tool for creating self-propelled motion by harnessing the energy stored in 3D printable springs. To produce controllable output motions, we introduce 3D printable kinetic units, a set of parameterizable designs that encapsulate 3D printable springs, compliant locks, and transmission mechanisms for three non-periodic motions—instant translation, instant rotation, continuous translation—and four periodic motions—continuous rotation, reciprocation, oscillation, intermittent rotation. Kinergy allows the user to create motion-enabled 3D models by embedding kinetic units, customize output motion characteristics by parameterizing embedded springs and kinematic elements, control energy by operating the specialized lock, and preview the resulting motion in an interactive environment. We demonstrate the potential of our techniques via example applications from spring-loaded cars to kinetic sculptures and close with a discussion of key challenges such as geometric constraints.

FlexHaptics: A Design Method for Passive Haptic Inputs Using Planar Compliant Structures (CHI 2022)

Hongnan Lin, Liang He, Fangli Song, Yifan Li, Tingyu Cheng, Clement Zheng, Wei Wang, and Hyunjoo Oh

This paper presents FlexHaptics, a design method for creating custom haptic input interfaces. Our approach leverages planar compliant structures whose force-deformation relationship can be altered by adjusting the geometries. Embedded with such structures, a FlexHaptics module exerts a fine-tunable haptic effect (i.e., resistance, detent, or bounce) along a movement path (i.e., linear, rotary, or ortho-planar). These modules can work separately or combine into an interface with complex movement paths and haptic effects. To enable the parametric design of FlexHaptic modules, we provide a design editor that converts user-specified haptic properties into underlying mechanical structures of haptic modules. We validate our approach and demonstrate the potential of FlexHaptic modules through six application examples, including a slider control for a painting application and a piano keyboard interface on touchscreens, a tactile low vision timer, VR game controllers, and a compound input device of a joystick and a two-step button.

ModElec: A Design Tool for Prototyping Physical Computing Devices Using Conductive 3D Printing (IMWUT 2021)

Liang He, Jarrid A Wittkopf, Ji Won Jun, Kris Erickson, and Rafael 'Tico' Ballagas

Integrating electronics with highly custom 3D designs for the physical fabrication of interactive prototypes is traditionally cumbersome and requires numerous iterations of manual assembly and debugging. With the new capabilities of 3D printers, combining electronic design and 3D modeling workflows can lower the barrier for achieving interactive functionality or iterating on the overall design. We present ModElec—an interactive design tool that enables the coordinated expression of electronic and physical design intent by allowing designers to integrate 3D-printable circuits with 3D forms. With ModElec, the user can arrange electronic parts in a 3D body, modify the model design with embedded circuits updated, and preview the auto-generated 3D traces that can be directly printed with a multi-material-based 3D printer.

Ondulé: Designing and Controlling 3D Printable Springs (UIST 2019)

Liang He, Huaishu Peng, Michelle Lin, Ravikanth Konjeti, François Guimbretière, and Jon E. Froehlich

We present Ondulé—an interactive design tool that allows novices to create parameterizable deformation behaviors in 3D-printable models using helical springs and embedded joints. Informed by spring theory and our empirical mechanical experiments, we introduce spring and joint-based design techniques that support a range of parameterizable deformation behaviors, including compress, extend, twist, bend, and various combinations. To enable users to design and add these deformations to their models, we introduce a custom design tool for Rhino. With the tool, users can convert selected geometries into springs, customize spring stiffness, and parameterize their design with mechanical constraints for desired behaviors.

SqueezaPulse: Adding Interactive Input to Fabricated Objects (TEI 2017)

Liang He, Gierad Laput, Eric Brockmeyer, and Jon E. Froehlich

We present SqueezaPulse, a technique for embedding interactivity into fabricated objects using soft, passive, low-cost bellow-like structures. When a soft cavity is squeezed, air pulses travel along a flexible pipe and into a uniquely designed corrugated tube that shapes the airflow into predictable sound signatures. A microphone captures and identifies these air pulses, enabling interactivity. Informed by the underlying acoustic theory, we describe an informal examination of the effect of different 3D-printed corrugations on air signatures and our resulting SqueezaPulse implementation. To demonstrate and evaluate the potential of SqueezaPulse, we present four prototype applications and a small, lab-based user study (N=9). Our evaluations show that our approach is accurate across users and robust to external noise.

MakerWear: A Tangible Approach to Interactive Wearable Creation (CHI 2017)

Majeed Kazemitabaar, Jason McPeak, Alexander Jiao, Liang He, Thomas Outing, and Jon E. Froehlich

Wearable construction toolkits have shown promise in broadening participation in computing and empowering users to create personally meaningful computational designs. However, these kits present a high barrier of entry for some users, particularly young children (K-6). In this paper, we introduce MakerWear, a new wearable construction kit for children that uses a tangible, modular approach to wearable creation. We describe our participatory design process, the iterative development of MakerWear, and results from single- and multi-session workshops with 32 children (ages 5-12; M=8.3 years). Our findings reveal how children engage in wearable design, what they make (and want to make), and what challenges they face. As a secondary analysis, we also explore age-related differences.    

Best Paper Award at CHI'17 | Best LBW Paper Award at CHI'16

New Interaction Tools for Preserving an Old Language (CHI 2015)

Beryl Plimmer, Liang He, Tariq Zaman, Kasun Karunanayaka, Alvin W. Yeo, Garen Jengan, Rachel Blagojevic, and Ellen Yi-Luen Do

The Penan people of Malaysian Borneo were traditionally nomads of the rainforest. They would leave messages in the jungle for each other by shaping natural objects into language tokens and arranging these symbols in specific ways -- much like words in a sentence. With settlement, the language is being lost as it is no longer used by the younger generation. We report here a tangible system designed to help the Penan preserve their unique object writing language. The key features of the system are that its tangibles are made of real objects; it works in the wild; and new tangibles can be fabricated and added to the system by the users. Our evaluations show that the system is engaging and encourages intergenerational knowledge transfer, and thus has the potential to help preserve this language.

Honorable Mention Award at CHI'15

#2 - Other Research Threads: Interaction Techniques / Haptic and Tactile Interfaces
HulaMove: Using Commodity IMU for Waist Interaction (CHI 2021)

Xuhai Xu, Jiahao Li, Tianyi Yuan, Liang He, Xin Liu, Yukang Yan, Yuntao Wang, Yuanchun Shi, Jennifer Mankoff, and Anind K. Dey

We present HulaMove, a novel interaction technique that leverages the movement of the waist as a new eyes-free and hands-free input method for both the physical world and the virtual world. We first conducted a user study (N=12) to understand users’ ability to control their waist. We found that users could easily discriminate eight shifting directions and two rotating orientations, and quickly confirm actions by returning to the original position (quick return). We developed a design space with eight gestures for waist interaction based on the results and implemented an IMU-based real-time system. Using a hierarchical machine learning model, our system could recognize waist gestures at an accuracy of 97.5%. Finally, we conducted a second user study (N=12) for usability testing in both real-world scenarios and virtual reality settings.

PneuFetch: Supporting BVI People to Fetch Nearby Objects (CHI 2020)

Liang He, Ruolin Wang, and Xuhai Xu

Blind and visually impaired (BVI) people can fetch objects in an acquainted environment by touching objects or relying on their memory. However, in a complex and less familiar situation, those strategies become less useful or even lead to danger (e.g., touching hazardous obstacles). We present PneuFetch, a light haptic cue-based wearable device that supports blind and visually impaired (BVI) people in fetching nearby objects in an unfamiliar environment. In our design, we generate friendly, non-intrusive, and gentle presses and drags to deliver direction and distance cues on a BVI user's wrist and forearm. As a proof of concept, we discuss our PneuFetch wearable prototype, contrast it with past work, and describe a preliminary user study.

A Multi-Modal Approach for BVI Developers to Edit Webpages (ASSETS 2019)

Venkatesh Potluri, Liang He, Christine Chen, Jon E. Froehlich, and Jennifer Mankoff

Blind and visually impaired (BVI) individuals are increasingly creating visual content online; however, there is a lack of tools that allow these individuals to modify the visual attributes of the content and verify the validity of those modifications. We discuss the design and preliminary exploration of a multi-modal and accessible approach for BVI developers to edit visual layouts of webpages while maintaining visual aesthetics. The system includes three parts: an accessible canvas, a code editor, and a controller that checks if the updates violate design guidelines.

TacTILE: A Toolchain for Creating Accessible Graphics with 3D-Printed Overlays and Auditory Annotations (ASSETS 2017)

Liang He, Zijian Wan, Leah Findlater, and Jon E. Froehlich

Tactile overlays with audio annotations can increase the accessibility of touchscreens for blind users; however, preparing these overlays is complex and labor intensive. We introduce TacTILE, a novel toolchain to more easily create tactile overlays with audio annotations for arbitrary touchscreen graphics (e.g., graphs, pictures, maps). The workflow includes: (i) an annotation tool to add audio to graphical elements, (ii) a fabrication process that generates 3D-printed tactile overlays, and (iii) a custom app for the user to explore graphics with these overlays. We close with a pilot study with one blind participant who explores three examples (floor plan, photo, and chart), and a discussion of future work.

CozyMaps: Real-time Collaboration With Multiple Displays (MobileHCI 2015)

Kelvin Cheng, Liang He, Xiaojun Meng, David A. Shamma, Dung Nguyen, and Anbarasan T.

With the use of several tablet devices and a shared large display, CozyMaps is a multi-display system that supports real-time collocated collaboration on a shared map. This paper builds on existing work and introduces rich user interactions by proposing awareness, notification, and view sharing techniques to enable seamless information sharing and integration in map-based applications. In our exploratory study, participants were satisfied with these newly proposed interactions. We found that view sharing techniques should be location-focused rather than user-focused. Our results provide implications for the design of interaction techniques in collaborative multi-display map systems.

PneuHaptic: Delivering Haptic Cues with a Pneumatic Armband (ISWC 2015)

Liang He, Cheng Xu, Ding Xu, and Ryan Brill

A common approach to creating haptic cues is moving the contact surface with electromechanical actuators such as vibrating electric motors, piezoelectric motors, or voice coils. While these actuators can be configured to effectively convey rich information, their high-frequency movements can elicit negative responses after lengthy exposure. PneuHaptic is a pneumatically actuated, arm-worn haptic interface. The system triggers a range of tactile sensations on the arm by alternately pressurizing and depressurizing a series of custom-molded silicone chambers. We detail the implementation of our functional prototype and explore the possibilities for interaction enabled by the system.

Making

I occasionally create random interactive installations and knickknacks, like the ramblings of a paranoid. My ego revives through the construction of the visual and physical forms of my abstract ideas.
When I have time and genuine motivation, I also do visual design and development work for my dear friends and my beloved communities. Before I started my PhD, I co-founded three startups (two in China and one in the US), where my primary role was chief UI/UX designer.

DE4M Lab Logo Design

Makeability Lab Logo Design

HiLab at UCLA Logo Design (co-designed with Yang Zhang)

ASSETS 2022 Logo Design

UIST 2019 Logo Design

CHI 2019 Student Volunteer T-shirt Design

CHI 2014 Student Volunteer T-shirt Design

Design and development of FabGalaxy - a visualization tool for Personal Fabrication Research in HCI and Graphics: An Overview of Related Work, which is maintained by the HCI Engineering Group at MIT CSAIL