
GitHub: github.com/PANSLAB

Sense for Less:
Physics-Informed Cyber-Physical Sensing Augmentation
Data acquisition quality directly affects the representativeness of the acquired information as well as the model accuracy in applications of real-world deployed cyber-physical systems. We combine physical and data-driven knowledge to design metrics and methods that assess signal and dataset quality for particular sensing tasks. These task-oriented sensing quality assessments are used for:
- Fair dataset quality comparison for system performance evaluation and dataset sharing.
- Collaborative sensing system adaptation to optimize data quality.
- System auto-configuration to enhance CPS scalability.
Related talks:
Keynote: the 2nd ACM Workshop on Device-Free Human Sensing (video)
Publications:
Yue Zhang, Carlos Ruiz, Shubham Rohal, and Shijia Pan. "CPA: Cyber-Physical Augmentation for Vibration Sensing in Autonomous Retails." In The 24th International Workshop on Mobile Computing Systems and Applications (HotMobile '23), February 22-23, 2023, Newport Beach, CA, USA. ACM, New York, NY, USA.
Zhizhang Hu, Yue Zhang, and Shijia Pan. "Footstep-Induced Floor Vibration Dataset: Reusability and Transferability Analysis". In Proceedings of the 19th ACM Conference on Embedded Networked Sensor Systems, pp. 546-551. 2021. (DATA '21). paper, code.
Gang Wang, Shijia Pan, and Susu Xu. "Decoupling the Unfairness Propagation Chain in Crowd Sensing and Learning Systems for Spatio-temporal Urban Monitoring." Accepted by BuildSys 2021.
Yue Zhang, Zhizhang Hu, Susu Xu, Shijia Pan. AutoQual: Task-Oriented Structural Vibration Sensing Quality Assessment Leveraging Co-Located Mobile Sensing Context. CCF Transactions on Pervasive Computing and Interaction, paper.
Tong Yu, Yue Zhang, Zhizhang Hu, Susu Xu, Shijia Pan. Vibration-Based Indoor Human Sensing Quality Reinforcement via Thompson Sampling. In Proceedings of the 1st ACM Workshop on Cyber-Physical Human Sensing, part of CPS-IoT Week 2021. (Invited paper)
Lixing He, Carlos Ruiz, Mostafa Mirshekari, Shijia Pan. SCSV2: Physics-informed Self-Configuration Sensing through Vision and Vibration Context Modeling. In the 3rd Workshop on Combining Physical and Data-Driven Knowledge in Ubiquitous Computing, September 2020 paper.
Yue Zhang, Lin Zhang, Hae Young Noh, Pei Zhang, and Shijia Pan. A Signal Quality Assessment Metric for Vibration-based Human Sensing Data Acquisition. In the 2nd Workshop on Data Acquisition to Analysis. November 10, 2019, New York, NY, USA paper.
Workshop:
DATA: Acquisition To Analysis 2018 2019 2020 2021
Published Dataset:
Footstep-induced Floor Vibration: Multiple Variation Factors

Continual Multimodal Learning for CPS-IoT
The heterogeneity of cyber-physical systems brings challenges and opportunities for various applications. Our goal is to explore new methods to combine the complementary characteristics of multiple sensing modalities for accurate fine-grained learning with limited (labeled) data.
Application 1: Elderly In-home Long-term Monitoring
Monitoring older adults' walking patterns and analyzing their fall risk are essential for fall prevention. Prior technologies such as computer vision or audio sensing raise privacy concerns in long-term home monitoring scenarios. We look into an alternative solution through structural (e.g., floor) vibration (infrastructural sensing) and wearables (mobile sensing). By utilizing their complementary sensing properties and shared context, we can obtain high-fidelity data for learning and modeling older adults' information.
Application 2: Autonomous Retail
We utilize the complementary characteristics of multiple sensing modalities, including computer vision, depth, vibration/acoustics, and conductivity, to achieve accurate autonomous retail and store inventory management.
Publications:
Zhizhang Hu, Yue Zhang, Tong Yu, and Shijia Pan. "VMA: Domain Variance- and Modality-Aware Model Transfer for Fine-Grained Occupant Activity Recognition". In the Proceedings of the 21st ACM/IEEE International Conference on Information Processing in Sensor Networks, Milan, Italy, May 4-6, 2022, pp. 247-258.
Zhizhang Hu. PhD Forum Abstract: Inferring Finer-grained Human Information with Multi-modal Cross-granularity Learning, In the SenSys 2020 PhD Forum.
João Falcão, Carlos Ruiz, Shijia Pan, Hae Young Noh, and Pei Zhang. "FAIM: Vision and Weight Sensing Fusion Framework for Autonomous Inventory Monitoring in Convenience Stores." Frontiers in Built Environment 6 (2020): 175.
Zhizhang Hu, Tong Yu, Yue Zhang, Shijia Pan. Fine-grained Activities Recognition with Coarse-grained Labeled Multi-modal Data. In the 2nd Workshop on Continual and Multimodal Learning for Internet of Things, September 2020.
Carlos Ruiz, Shijia Pan, Adeola Bannis, Ming-Po Chang, Hae Young Noh, and Pei Zhang. "IDIoT: Towards Ubiquitous Identification of IoT Devices through Visual and Inertial Orientation Matching During Human Activity." In 2020 IEEE/ACM Fifth International Conference on Internet-of-Things Design and Implementation (IoTDI), pp. 40-52. IEEE, 2020.
Carlos Ruiz, João Falcão, Shijia Pan, Hae Young Noh, and Pei Zhang. "AIM3S: Autonomous Inventory Monitoring through Multi-Modal Sensing for Cashier-Less Convenience Stores." In Proceedings of the 6th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation, pp. 135-144. 2019.
Workshop:
CML-IoT 2019 2020 2021
AutoCheckout Competition 2020
Published Datasets:
Activity of Daily Living: ADL
AutoCheckout Competition: AutoCheck

Vibrations of Everything:
Turning Things into Sensors
Physical and physiological information is essential for indoor human monitoring applications. To capture such information non-intrusively, we utilize and design sensors that capture the vibration induced by people and body parts to infer a variety of activities (over different scales, both infrastructural and wearable) in the physical world. We explore new opportunities to utilize complementary information over different scales (on-body, room, city, etc.) to achieve accurate inference.
Application 1: Vibration-enabled On-body HCI
We enable a novel form of human-computer interaction based on unvoiced jaw movement tracking. We study the neurological and anatomical structure of the human cheek and jaw to design JawSense so that jaw movement can be reliably captured despite strong noise from human artifacts. It senses the muscle deformation and vibration caused by unvoiced speaking to decode the unvoiced phonemes spoken by the user.
Application 2: Oral Health Monitoring via Teeth Vibration
We build an in-mouth sensing platform in the form of a retainer and mouthguard to capture the patient's in-mouth physical activities for fine-grained health prediction. When a person bites or grinds their teeth, the interaction between the upper and lower teeth induces vibration. The piezo-film-based sensor embedded in the retainer, covering the molars and canines, captures these teeth-contact-induced vibration signals to extract the person's functional occlusion information.

Application 3: Floor Vibration-based Pig Monitoring
Pigs' behavior and health conditions directly affect farms' profit as well as meat product quality. However, wearables are often easily destroyed by pigs, and cameras often suffer from occlusion. We look into the pigpen floor vibration induced by pigs and place vibration sensors underneath the floor slab of pig pens for pig activity monitoring.

Application 4: Contact-less Sleep Stage Monitoring
Sleep disorders impair people's health. To accurately analyze patients' sleep quality, it is important to monitor their sleep stages in their natural state. Prior methods, such as polysomnography (PSG) and Fitbit, are often intrusive and have the potential to change the user's daily sleep routine. We non-intrusively identify sleep stages through bed-frame vibrations. Our system detects patients' movements during their sleep and estimates their sleep stages via the movement-induced vibration on the bed frame.
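A minimal sketch of the movement-detection step is shown below, assuming a sampled bed-frame vibration trace; the function name, sampling rate, window length, and threshold factor are illustrative assumptions rather than the published system's parameters.

```python
import numpy as np

def detect_movement_events(vibration, fs=500, win_s=0.5, k=3.0):
    """Flag movement events in a bed-frame vibration trace (hypothetical sketch).

    vibration : 1-D numpy array of vibration samples
    fs        : sampling rate in Hz (assumed value)
    win_s     : analysis window length in seconds (assumed value)
    k         : threshold factor over the median window energy (assumed value)
    Returns a list of (start_s, end_s) intervals with elevated short-time energy.
    """
    win = int(fs * win_s)
    n_win = len(vibration) // win
    # Short-time energy of each non-overlapping window
    energy = np.array([np.sum(vibration[i * win:(i + 1) * win] ** 2)
                       for i in range(n_win)])
    active = energy > k * np.median(energy)

    # Merge consecutive active windows into movement events
    events, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            events.append((start * win_s, i * win_s))
            start = None
    if start is not None:
        events.append((start * win_s, n_win * win_s))
    return events
```

Per-event statistics such as count, spacing, and intensity could then feed a sleep-stage classifier; the actual features and models are described in the DFHS'19 paper listed below.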
Publications:
Shijia Pan, Dong Yoon Lee, Jun Ho Lee, and Phuc Nguyen. "TeethVib: Monitoring Teeth Functional Occlusion Through Retainer Vibration Sensing." Accepted by CHASE 2021.
Amelie Bonde, Jesse R Codling, Kanittha Naruethep, Yiwen Dong, Wachirawich Siripaktanakon, Sripong Ariyadech, Akkarit Sangpetch, Orathai Sangpetch, Shijia Pan, Hae Young Noh, Pei Zhang. "PigNet: Failure-Tolerant Pig Activity Monitoring System Using Structural Vibration". Accepted by IPSN 2021.
Zhizhang Hu, Yue Zhang, and Shijia Pan. Poster Abstract: Vibration-based Indoor Occupant Gait Monitoring with Robot Vacuum Cleaners. In IoTDI 2021. link.
Prerna Khanna, Tanmay Srivastava, Shijia Pan, Shubham Jain, and Phuc Nguyen. JawSense: Recognizing Unvoiced Sound using a Low-cost Ear-worn System. In the Twenty-second International Workshop on Mobile Computing Systems and Applications (HotMobile 2021).
Shijia Pan and Phuc Nguyen. Opportunities in the Cross-Scale Collaborative Human Sensing of ‘Developing’ Device-Free and Wearable Systems. In The 2nd ACM Workshop on Device-Free Human Sensing (DFHS’20), November 15, 2020, Virtual Event, Japan. ACM, New York, NY, USA.
Zhizhang Hu, Emre Sezgin, Simon Lin, Pei Zhang, Hae Young Noh, and Shijia Pan. Device-free Sleep Stage Recognition through Bed Frame Vibration Sensing. In the 1st ACM International Workshop on Device-Free Human Sensing, November 10, 2019, New York, NY, USA.
Sripong Ariyadech, Amelie Bonde, Orathai Sangpetch, Woranun Woramontri, Wachirawich Siripaktanakon, Shijia Pan, Akkarit Sangpetch, Hae Young Noh, and Pei Zhang. "Dependable Sensing System for Pig Farming." In 2019 IEEE Global Conference on Internet of Things (GCIoT), pp. 1-7. IEEE, 2019.
Published Datasets:
Amelie Bonde, Shijia Pan, Orathai Sangpetch, Akkarit Sangpetch, Woranun Woramontri, and Pei Zhang. "Structural vibration sensing to evaluate animal activity on a pig farm." In the 1st Workshop on Data Acquisition to Analysis. pp. 25-26. 2018.
Yiwen Dong, Shijia Pan, Tong Yu, Mostafa Mirshekari, Jonathon Fagert, Amelie Bonde, Ole J. Mengshoel, Pei Zhang, and Hae Young Noh. 2021. The FootprintID Dataset: Footstep-Induced Structural Vibration Data for Person Identification with 8 Different Walking Speeds. Zenodo, DOI: https://doi.org/10.5281/zenodo.4691144
We sincerely thank our sponsors for supporting our research!
[2023-2024] Hellman Fellows Award (Sole-PI)
"Unobtrusive Older Adults In-Home Monitoring via Miura-Ori Origami-Inspired Configurable Elastic Surface Sensing"
[2023-2025] UC Merced Climate Action Seed Competition (Lead-PI)
"Enhancing Central Valley Climate Resilience against Wildfire via AIoT-Enabled Augmented Air Quality Monitoring".
[2023-2024] CITRIS Seed Funding Program Award (Co-PI)
"Activity Monitoring to Improve Caregiver Connection and Care for Older Adults Living Alone with Alzheimer’s Disease".
[2022-2023] AiFi Inc. Gift (Sole-PI)
"Continual Multimodal Inference for Autonomous Retail".
[2022-2023] Academic Senate Faculty Research Grants Program (Co-PI)
"Smart Kiosk for Raising Awareness of Recycling in the UC Merced Community".
[2021-2022] Academic Senate Faculty Research Grants Program (Sole-PI)
"TeethVib: Detecting Teeth Ill-Fitting Through Mouth-Guard Vibration Sensing".
[2020-2022] CITRIS Seed Funding Program Award (Co-PI)
"Data-Driven Fall Prevention Intervention for Older Adults".