Sense for Less:
Physics-Informed Cyber-Physical Sensing Augmentation
Data acquisition quality directly affects the representativeness of the information as well as the model accuracy in real-world deployed cyber-physical systems. We combine physical and data-driven knowledge to design metrics and methods that assess signal and dataset quality for particular sensing tasks. These task-oriented sensing quality assessments are used for:
Fair dataset quality comparison for system performance evaluation and dataset sharing.
Collaborative sensing system adaptation to optimize data quality.
System auto-configuration to enhance CPS scalability.
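For concreteness, one simple building block of such an assessment is a signal-to-noise estimate. The sketch below is illustrative only (it is not the metric from the papers listed here): it scores a vibration trace against a noise floor estimated from its quiet leading samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr_quality(x, noise_win=100):
    # Energy of the whole recording relative to an estimated noise floor,
    # in dB; assumes the first `noise_win` samples contain no events.
    noise_power = np.mean(x[:noise_win] ** 2) + 1e-12
    return 10 * np.log10(np.mean(x ** 2) / noise_power)

t = np.linspace(0, 1, 1000)
noise = 0.01 * rng.standard_normal(1000)          # noise-only trace
event = noise + np.sin(2 * np.pi * 5 * t) * np.r_[np.zeros(100), np.ones(900)]
```

A trace containing a clear excitation scores well above a noise-only trace, which is the kind of ordering a task-oriented quality metric must preserve.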
Keynote: The 2nd ACM Workshop on Device-Free Human Sensing (video)
Lixing He, Carlos Ruiz, Mostafa Mirshekari, Shijia Pan. SCSV2: Physics-informed Self-Configuration Sensing through Vision and Vibration Context Modeling. In the 3rd Workshop on Combining Physical and Data-Driven Knowledge in Ubiquitous Computing, September 2020.
Yue Zhang, Susu Xu, Laixi Shi, Shijia Pan. Poster Abstract: Using Mobile Sensing to Enable the Signal Quality Assessment for Infrastructure Sensing Systems. In the 21st Annual International Workshop on Mobile Computing Systems and Applications (ACM HotMobile 2020).
Yue Zhang, Lin Zhang, Hae Young Noh, Pei Zhang, and Shijia Pan. A Signal Quality Assessment Metric for Vibration-based Human Sensing Data Acquisition. In the 2nd Workshop on Data Acquisition to Analysis, November 10, 2019, New York, NY, USA.
Continual Multimodal Learning for CPS-IoT
The heterogeneity of cyber-physical systems brings challenges and opportunities for various applications. Our goal is to explore new methods to combine the complementary characteristics of multiple sensing modalities for accurate fine-grained learning.
Application 1: Elderly In-home Long-term Monitoring
Monitoring older adults' walking patterns and analyzing their fall risk is essential for fall prevention. Prior technologies such as computer vision or audio sensing raise privacy concerns for long-term in-home monitoring. We look into an alternative solution that combines structural (e.g., floor) vibration (infrastructural sensing) and wearables (mobile sensing). By utilizing their complementary sensing properties and shared context, we can obtain high-fidelity data for learning and modeling older adults' walking information.
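As a toy illustration of exploiting shared context across modalities, the sketch below pairs step events detected by a floor vibration sensor with step events from a wearable whenever their timestamps agree within a tolerance. The function name and tolerance value are hypothetical choices for this sketch; the fusion in the actual systems is more involved.

```python
def match_events(vib_times, wear_times, tol=0.2):
    # Greedily pair each vibration step event with the nearest unused
    # wearable step event within `tol` seconds (timestamps assumed to
    # share a clock). Unmatched events are simply dropped here.
    pairs, used = [], set()
    for tv in vib_times:
        best, best_dt = None, tol
        for j, tw in enumerate(wear_times):
            if j in used:
                continue
            dt = abs(tv - tw)
            if dt <= best_dt:
                best, best_dt = j, dt
        if best is not None:
            used.add(best)
            pairs.append((tv, wear_times[best]))
    return pairs
```

Matched pairs provide cross-modal labels for free: the wearable confirms who stepped, while the floor vibration carries structure-side gait features.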
Application 2: Autonomous Retail
We utilize the complementary characteristics of multiple sensing modalities including computer vision, weight, and location information of the items to achieve accurate autonomous retail and store inventory management.
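To illustrate why weight is a strong complementary cue, here is a minimal sketch of matching a shelf's weight drop to candidate items. The item names, weights, and tolerance are hypothetical; real systems such as FAIM and AIM3S fuse this with vision and shelf location to resolve the remaining ambiguity.

```python
def infer_item(weight_drop_g, catalog, tol_g=5.0):
    # Items whose unit weight matches the observed shelf weight drop
    # within tolerance; vision and location cues would break ties.
    return [name for name, w in catalog.items()
            if abs(weight_drop_g - w) <= tol_g]

# Hypothetical catalog of unit weights in grams.
catalog = {"soda_can": 355.0, "chips": 70.0, "candy_bar": 52.0}
```

A single weight reading often narrows the candidates to one item; when it does not (similar weights, multiple items removed), the other modalities carry the decision.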
Zhizhang Hu. PhD Forum Abstract: Inferring Finer-grained Human Information with Multi-modal Cross-granularity Learning. In the SenSys 2020 PhD Forum.
João Falcão, Carlos Ruiz, Shijia Pan, Hae Young Noh, and Pei Zhang. "FAIM: Vision and Weight Sensing Fusion Framework for Autonomous Inventory Monitoring in Convenience Stores." Frontiers in Built Environment 6 (2020): 175.
Zhizhang Hu, Tong Yu, Yue Zhang, Shijia Pan. Fine-grained Activities Recognition with Coarse-grained Labeled Multi-modal Data. In the 2nd Workshop on Continual and Multimodal Learning for Internet of Things, September 2020.
Carlos Ruiz, João Falcão, Shijia Pan, Hae Young Noh, and Pei Zhang. "AIM3S: Autonomous Inventory Monitoring through Multi-Modal Sensing for Cashier-Less Convenience Stores." In Proceedings of the 6th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation, pp. 135-144. 2019.
AutoCheckout Competition 2020 (AutoCheck)
Activities of Daily Living (ADL)
Cross-scale Human Monitoring via Infrastructure and Wearable Collaborative Sensing
Collaborator: UT Arlington
Physical and physiological information is essential for indoor human monitoring applications. Various non-intrusive 'developing' sensing modalities -- both infrastructural and wearable -- have been explored. We explore new opportunities to utilize complementary information across different scales (on-body, room, city, etc.) to achieve accurate inference.
Shijia Pan and Phuc Nguyen. Opportunities in the Cross-Scale Collaborative Human Sensing of ‘Developing’ Device-Free and Wearable Systems. In The 2nd ACM Workshop on Device-Free Human Sensing (DFHS’20), November 15, 2020, Virtual Event, Japan. ACM, New York, NY, USA.
Structures as Sensors:
Floor Vibration-based Pig Monitoring
Collaborator: CMU, CMKL, Betagro
Pigs' behavior and health conditions directly affect farms' profit as well as meat product quality. However, wearables are easily destroyed by pigs, and cameras often suffer from occlusion. We instead look into the structural vibration induced by pigs, placing vibration sensors underneath the floor slab of pig pens for pig activity monitoring.
Ariyadech, Sripong, Amelie Bonde, Orathai Sangpetch, Woranun Woramontri, Wachirawich Siripaktanakon, Shijia Pan, Akkarit Sangpetch, Hae Young Noh, and Pei Zhang. "Dependable Sensing System for Pig Farming." In 2019 IEEE Global Conference on Internet of Things (GCIoT), pp. 1-7. IEEE, 2019.
Amelie Bonde, Shijia Pan, Orathai Sangpetch, Akkarit Sangpetch, Woranun Woramontri, and Pei Zhang. "Structural vibration sensing to evaluate animal activity on a pig farm." In the 1st Workshop on Data Acquisition to Analysis. pp. 25-26. 2018.
Sleep Stage Monitoring through Contact-less Sensing
Collaborator: NCH, CMU
Sleep disorders impair people's health. To accurately analyze patients' sleep quality, it is important to monitor their sleep stages in their natural state. Prior methods, such as polysomnography (PSG) and Fitbit, are often intrusive and have the potential to change the user's daily sleep routine. We non-intrusively identify sleep stages through bed-frame vibrations: our system detects patients' movements during sleep and estimates sleep stages from the movement-induced vibrations of the bed frame.
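The movement-detection step can be sketched as simple energy thresholding on the vibration stream. The window length and threshold factor below are illustrative choices for this sketch, not the values used in the paper, and the sleep-stage estimation on top of the detected movements is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def detect_movements(x, fs, win_s=0.5, k=3.0):
    # Flag windows whose RMS exceeds k times the median window RMS --
    # a simplified stand-in for the movement detector: the quiet bed
    # frame sets the median, so bursts from body movement stand out.
    win = int(win_s * fs)
    rms = np.array([np.sqrt(np.mean(x[i * win:(i + 1) * win] ** 2))
                    for i in range(len(x) // win)])
    return np.nonzero(rms > k * np.median(rms))[0]

x = 0.01 * rng.standard_normal(1000)   # quiet bed-frame vibration at 100 Hz
x[300:350] += 1.0                      # a burst from a body movement
```

The timing and density of such detected movement windows over the night are the kind of features a downstream classifier can map to sleep stages.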
Zhizhang Hu, Emre Sezgin, Simon Lin, Pei Zhang, Hae Young Noh, and Shijia Pan. Device-free Sleep Stage Recognition through Bed Frame Vibration Sensing. In the 1st ACM International Workshop on Device-Free Human Sensing, November 10, 2019, New York, NY, USA.