The post-pandemic era has witnessed a major shift towards adopting online or hybrid modes of education.
While decentralized classrooms offer a wide range of conveniences, they make it significantly more challenging to track learners' cognitive involvement in class, contributing to high dropout rates and increased distraction.
Moreover, accessing online lectures typically requires traditional touch-based interfaces, limiting their accessibility for individuals with diverse abilities.
In this set of works, the challenge of automated attention estimation in online meetings, online classes, Massive Open Online Courses (MOOCs), and prerecorded videos has been addressed by developing ubiquitous assistive systems for analyzing cognitive features.
Further, novel interactive assistive systems facilitate seamless touch-free interactions between users and devices.
Key Features
- Cost-effective: The systems work with commodity devices and require no additional sensors or hardware.
- Inclusive and Accessible: The systems can be used by individuals with diverse abilities and clinical conditions such as motor impairments, dactylitis, and sarcopenia.
- Non-intrusive and Privacy-Preserving: The proposed approaches do not create distractions or impose additional cognitive load, and they preserve users' privacy.
- Multimodal Analysis and Critical Evaluation: The systems critically analyze various modalities, such as gaze, speech, head/nose motion, and blinks, and leverage their respective advantages and suitability, making the systems highly accurate. These systems are evaluated with real users under robust experimental setups.
- Diverse: The systems explore various types and levels of cognitive and interactive processes and address the challenges specific to each.
Pragma Kar
IIIT-Delhi, India
Samiran Chattopadhyay
Techno India University, India
Sandip Chakraborty
IIT Kharagpur, India
Publications
- Pragma Kar, Shyamvanshikumar Singh, Avijit Mandal, Samiran Chattopadhyay, and Sandip Chakraborty. 2023. ExpresSense: Exploring a Standalone Smartphone to Sense Engagement of Users from Facial Expressions Using Acoustic Sensing. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI '23). Association for Computing Machinery, New York, NY, USA, Article 265, 1–18. https://doi.org/10.1145/3544548.3581235
- Pragma Kar, Samiran Chattopadhyay, and Sandip Chakraborty. 2022. Bifurcating Cognitive Attention from Visual Concentration: Utilizing Cooperative Audiovisual Sensing for Demarcating Inattentive Online Meeting Participants. Proc. ACM Hum.-Comput. Interact. 6, CSCW2, Article 498 (November 2022), 34 pages. https://doi.org/10.1145/3555656
- Pragma Kar, Krishna Mishra, Sudipro Ghosh, Sandip Chakraborty, and Samiran Chattopadhyay. 2021. Nosype: A Novel Nose-tip Tracking-based Text Entry System for Smartphone Users with Clinical Disabilities for Touch-based Typing. In Proceedings of the 23rd International Conference on Mobile Human-Computer Interaction (MobileHCI '21). Association for Computing Machinery, New York, NY, USA, Article 26, 1–16. https://doi.org/10.1145/3447526.3472054
- Pragma Kar, Krishna Mishra, Sudipro Ghosh, Sandip Chakraborty, and Samiran Chattopadhyay. 2021. Exploratory Analysis of Nose-gesture for Smartphone Aided Typing for Users with Clinical Conditions. In 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), pp. 380–383. https://doi.org/10.1109/PerComWorkshops51409.2021.9430933
- Pragma Kar, Samiran Chattopadhyay, and Sandip Chakraborty. 2020. Gestatten: Estimation of User's Attention in Mobile MOOCs From Eye Gaze and Gaze Gesture Tracking. Proc. ACM Hum.-Comput. Interact. 4, EICS, Article 72 (June 2020), 32 pages. https://doi.org/10.1145/3394974
Funding and Support
For questions and general feedback, contact:
Pragma Kar