The PiCASSo Lab
Welcome to the PiCASSo Lab. The Pervasive Computing and Smart Sensing Lab designs and develops innovative sensing systems at the intersection of smart urban communities and medical cyber-physical systems. We leverage pervasive sensing devices, including but not limited to smartphones, cameras, and wearables, to develop a multimodal data integration and interpretation framework.
Research
Silent Speech Systems: Beyond Voice Interaction for Next Gen Devices
Tanmay is working on developing intuitive, privacy-preserving, and accessible interaction methods for next-generation computing devices through…

JawSense: Recognizing Unvoiced Sound using a Low-cost Ear-worn System
JawSense explores a new wearable system enabling a novel form of human-computer interaction based on unvoiced jaw movement tracking. JawSense allows…

LiftRight: Quantifying performance measures for fitness domains
LiftRight is a low-cost abstraction that captures upper body dynamics and computes several performance measures accurately.
News
- April 2026: Our paper LiftSafe: Predicting Unsafe Repetitions in Strength Training using a Single Wearable Sensor has been accepted to IEEE/ACM CHASE 2026!
- February 2026: Great news! Our paper ViFiCon: Vision and Wireless Association Via Self-Supervised Contrastive Learning has been accepted to the 9th Multimodal Learning and Applications Workshop at CVPR 2026!
- February 2026: Thrilled to share that our paper FeudalNav: A Simple Framework for Visual Navigation has been accepted to CVPR 2026 Findings!
- December 2025: Our paper Beyond-Voice: Leveraging Articulatory Motion for Next-gen AI Assistants has been conditionally accepted to the 27th International Workshop on Mobile Computing Systems and Applications (HotMobile) 2026!
- August 2025: Taseen and Amartya have joined the lab; welcome to the team!
