
Liang HE

Assistant Professor, Computer Graphics Technology Department
Affiliated Faculty, Applied AI Research Center (AARC)
Director, Design & Engineering for Making (DΞ4M) Lab
Polytechnic Institute, Purdue University

I am an Assistant Professor in Interactive Media at the Department of Computer Graphics Technology (CGT) and an affiliated faculty member at the Applied AI Research Center (AARC), Polytechnic Institute at Purdue University. At Purdue, I lead the Design & Engineering for Making (DΞ4M) Lab. Before joining Purdue, I obtained my Ph.D. in Computer Science & Engineering from the University of Washington, advised by Jon E. Froehlich. I also worked at HP Labs, Microsoft Research (Redmond), and the Keio-NUS CUTE Center.

My research interests lie in the field of human-computer interaction (HCI), including digital fabrication, tactile and haptic interfaces, tangible interaction, accessibility, and physical intelligence. I develop enabling tools, techniques, and devices that mediate and enhance human interaction with physical and virtual objects and environments. My work has been published at top HCI/UbiComp venues such as ACM CHI, UIST, IMWUT, TEI, and ASSETS, and has received multiple awards.

In my research, I focus on:

developing computational design tools and techniques to augment physical object properties with interactivity to enhance hand- and body-based interactions (e.g., wearables, assistive devices).
building AI-assisted intelligent systems and workflows that empower end-users who have minimal expertise to perform specialized tasks (e.g., 3D modeling, circuit prototyping, learning).
exploring electro-mechanical and material-based approaches to enable the integration of advanced physical intelligence in objects and environments for contextualized user needs (e.g., medical intervention, HRI).

RESEARCH VISION: From Shape to Physical Intelligence

This research vision is rooted in the concept of "Beyond Shape," distilled from my understanding of and reflection on digital fabrication research during my PhD.
"Beyond Shape" talk at MIT CSAIL HCI Seminar (04/18/2023)
Video credit: Arvind Satyanarayan
Research agenda diagram
As we enter an era in which computing and interaction are everywhere in both the digital and physical worlds, we must explore future agents with embedded physical intelligence—devices, tools, and interfaces that mechanically respond to user needs, integrate electrical functionalities like sensing and actuation, and physically adapt to humans and environments. To meet contextualized user demands and provide personalized experiences, we also need to democratize the approaches to encoding physical intelligence in custom agents through low-barrier tools and workflows. See our work toward this vision at the Design & Engineering for Making (DΞ4M) Lab.
NEWS

10/24: Received Best Poster Award at UIST '24
06/24: Received review recognitions for UIST '24
06/24: One paper conditionally accepted to UIST
06/24: Invited to CHI '25 Program Subcommittee
05/24: One poster accepted to SIGGRAPH '24
04/24: Received IPAI Postdoc Research Award
04/24: Received PPI Faculty Research Award
03/24: Invited to SIGGRAPH '24 Posters PC
03/24: Received review recognition for DIS '24
03/24: Invited to ASSETS '24 Program Committee
02/24: Invited to serve as an AC for UIST '24
01/24: Received review recognition for CHI '24
01/24: Invited to CHI '24 SDC Committee
12/23: Invited to serve as an AC for DIS '24
11/23: Invited to co-chair SIC for UIST '24
11/23: Received review recognition for CHI '24

STUDENTS

PhD Committee
Siqi Guo, CGT (2024 - )
Ali Baigelenov, CGT (2023 - )
Min Soo Choi, CGT (2023 - )
Dixuan Cui, CGT (graduated in 2023; now Assist. Prof. at Sam Houston State University)

COURSES

CGT581: Interactive Prototyping & Fabrication
CGT27108: UX Design Learning Studio (Screen)
CGT512: Foundational Readings of UX Design
CGT27208: UX Design Learning Studio (Cross-Channel)
CGT532: UX Design Graduate Studio (Cross-Channel)
CGT116: Geometric Modeling for Vis & Communication

SERVICES

Program Committees
CHI 2025, TEI 2025, UIST 2024, SIGGRAPH 2024 Poster Jury, SIGGRAPH 2024 E-Tech Jury, ASSETS 2024, DIS 2024, CHI 2024, ASSETS 2023, DIS 2023, IDC 2023, ASSETS 2022, IDC 2021 (WiP), CHI 2021 (LBW), CHI 2020 (LBW), CHI 2019 (LBW)

Organizing Committees
UIST 2024 (Student Innovation Contest), ASSETS 2023 (Posters & Demos, Experience Reports Chair), UIST 2023 (Proceedings Chair), UIST 2022 (Proceedings Chair), ASSETS 2022 (Web and Graphic Design Chair), UIST 2019 (Design and Web Chair)

Paper Reviews (250+)
(14 special recognitions for excellent reviews)

- CHI 2016/17/18/19/20/21/22/23/24/25
- UIST 2019/20/21/22/23/24
- DIS 2020/21/23/24
- SCF 2020/21/23
- ASSETS 2022/23/24
- TEI 2017/18
- IEEE VR 2023/24
- IDC 2023
- IMWUT 2023
- CSCW 2021
- WAC 2019
- MobileHCI 2016
- SIGGRAPH 2024
- ISMAR 2024

MAKING

I occasionally create random interactive installations and knickknacks, like the ramblings of a paranoid. My ego revives through constructing the visual and physical forms of my abstract ideas.

Visual Identity Designs


HiLab at UCLA Logo (co-designed with Yang Zhang)

UIST 2019 Logo

Accepted CHI '19 SV T-Shirt

Accepted CHI '14 SV T-Shirt
OUTREACH


Curated a custom hardware kit—the Gen-M Kit—containing over 80 programmable modules provided by Seeed Studio, and distributed the kits to eight student teams around the world.



Developed FabGalaxy—a visualization tool for Personal Fabrication Research in HCI and Graphics: An Overview of Related Work—which is maintained by the HCI Engineering Group at MIT CSAIL.

ACM UIST 2024
MobiPrint: A Mobile 3D Printer for Environment-Scale Design and Fabrication
Daniel Campos Zamora, Liang He, and Jon E. Froehlich
MobiPrint provides a multi-stage fabrication pipeline: first, the robotic 3D printer automatically scans and maps an indoor space; second, a custom design tool converts the map into an interactive CAD canvas for editing and placing models in the physical world; finally, the MobiPrint robot prints the object directly on the ground at the defined location.
ACM UIST 2024     Best Poster Award at UIST'24
Fluxable: A Tool for Making 3D Printable Sensors and Actuators
Hsuanling Lee, Yujie Shan, Hongchao Mao, and Liang He
We present Fluxable, a tool that converts arbitrary 3D models into deformable sensors and actuators with integrated helix-and-lattice structures, which comprise a hollow helical channel, lattice paddings, and a wireframe surface structure.
ACM UIST 2023
3D Printing Magnetophoretic Displays
Zeyu Yan, Hsuanling Lee, Liang He, and Huaishu Peng
We present a pipeline for printing interactive and always-on magnetophoretic displays using affordable FDM 3D printers. Using our pipeline, an end-user can convert the surface of a 3D shape into a matrix of voxels. The generated model can be sent to a modified 3D printer equipped with an additional syringe-based injector. To achieve this, we made modifications to the 3D printer hardware and the firmware. We also developed a 3D editor to prepare printable models.
ACM UIST 2022
Kinergy: Creating 3D Printable Motion using Embedded Kinetic Energy
Liang He, Xia Su, Huaishu Peng, Jeffrey I. Lipton, and Jon E. Froehlich
We present Kinergy—an interactive design tool for creating self-propelled motion by harnessing the energy stored in 3D printable springs. To produce controllable output motions, we introduce 3D printable kinetic units, a set of parameterizable designs that encapsulate 3D printable springs, compliant locks, and transmission mechanisms for three non-periodic motions and four periodic motions.
ACM CHI 2022
FlexHaptics: A Design Method for Passive Haptic Inputs Using Planar Compliant Structures
Hongnan Lin, Liang He, Fangli Song, Yifan Li, Tingyu Cheng, Clement Zheng, Wei Wang, and Hyunjoo Oh
We present FlexHaptics, a design method that leverages planar compliant structures to create custom haptic input interfaces. Using a parametric design editor, FlexHaptics modules can work separately or be combined into an interface with complex movement paths and haptic effects.
ACM UbiComp 2022 / IMWUT, December 2021
ModElec: A Design Tool for Prototyping Physical Computing Devices Using Conductive 3D Printing
Liang He, Jarrid A. Wittkopf, Ji Won Jun, Kris Erickson, and Rafael 'Tico' Ballagas
ModElec is an interactive design tool that enables the coordinated expression of electronic and physical design intent by allowing designers to integrate 3D-printable circuits with 3D forms. With ModElec, the user can arrange electronic parts in a 3D body, modify the model while the embedded circuits update accordingly, and preview the auto-generated 3D traces, which are printed with a multi-material 3D printer.
ACM CHI 2021
HulaMove: Using Commodity IMU for Waist Interaction
Xuhai Xu, Jiahao Li, Tianyi Yuan, Liang He, Xin Liu, Yukang Yan, Yuntao Wang, Yuanchun Shi, Jennifer Mankoff, and Anind K. Dey
We present HulaMove, a novel interaction technique that leverages the movement of the waist as a new eyes-free and hands-free input method for both the physical world and the virtual world. We developed a design space with eight gestures for waist interaction and implemented an IMU-based real-time system. Using a hierarchical machine learning model, our system could reach an accuracy of 97.5%.
ACM UIST 2019
Ondulé: Designing and Controlling 3D Printable Springs
Liang He, Huaishu Peng, Michelle Lin, Ravikanth Konjeti, François Guimbretière, and Jon E. Froehlich
We present Ondulé, an interactive design tool that allows novices to create parameterizable deformation behaviors in 3D printable models using helical springs and embedded joints. We introduce design techniques that support parameterizable individual and compound deformation behaviors. To enable users to design deformable models, we introduce a custom design tool for Rhino.
ACM CHI 2017     Best Paper Award at CHI'17 |   Best LBW Paper Award at CHI'16
MakerWear: A Tangible Approach to Interactive Wearable Creation
Majeed Kazemitabaar, Jason McPeak, Alexander Jiao, Liang He, Thomas Outing, and Jon E. Froehlich
We introduce MakerWear, a new wearable construction kit for children that uses a tangible, modular approach to wearable creation. We describe our participatory design process, the iterative development of MakerWear, and results from single- and multi-session workshops with 32 children (ages 5-12; M=8.3 years).
ACM TEI 2017
SqueezaPulse: Adding Interactive Input to Fabricated Objects
Liang He, Gierad Laput, Eric Brockmeyer, and Jon E. Froehlich
We present SqueezaPulse, a technique for embedding interactivity into fabricated objects using soft, passive, low-cost bellow-like structures. When a soft cavity is squeezed, air pulses travel along a flexible pipe and into a uniquely designed corrugated tube that shapes the airflow into predictable sound signatures. A microphone captures and identifies these air pulses, enabling interactivity.
ACM CHI 2015     Honorable Mentions Award at CHI'15
New Interaction Tools for Preserving an Old Language
Beryl Plimmer, Liang He, Tariq Zaman, Kasun Karunanayaka, Alvin W. Yeo, Garen Jengan, Rachel Blagojevic, and Ellen Yi-Luen Do
We introduce a tangible system designed to help the Penan preserve their unique object writing language. The key features of the system are that: its tangibles are made of real objects; it works in the wild; and new tangibles can be fabricated and added to the system by the users. Our evaluations show that the system is engaging and encourages intergenerational knowledge transfer.
ACM MobileHCI 2015
CozyMaps: Real-time Collaboration With Multiple Displays
Kelvin Cheng, Liang He, Xiaojun Meng, David A. Shamma, Dung Nguyen, and Anbarasan Thangapalam
CozyMaps is a multi-display system that supports real-time collocated collaboration on a shared map. We introduce rich user interactions by proposing awareness, notification, and view sharing techniques to enable seamless information sharing and integration in map-based applications. Our exploratory study demonstrated that users are satisfied with these new proposed interactions.
ACM ISWC 2015
PneuHaptic: Delivering Haptic Cues with a Pneumatic Armband
Liang He, Cheng Xu, Ding Xu, and Ryan Brill
PneuHaptic is a pneumatically-actuated arm-worn haptic interface. The system triggers a range of tactile sensations on the arm by alternately pressurizing and depressurizing a series of custom molded silicone chambers. We detail the implementation of our functional prototype and explore the possibilities for interaction enabled by the system.