Human-Centered Immersive eXperience Lab (HIX-Lab)
Our lab is located in the Atrium Building on the KSU Marietta Campus.
Room: J-220A
Email: rislam11@kennesaw.edu
Welcome!
Welcome to the Human-Centered Immersive eXperience Lab (HIX-Lab) at Kennesaw State University. Our lab conducts research in virtual, augmented, and mixed reality, focusing on intelligent immersive systems, immersive user experience, accessibility, and human-computer interaction. Our goal is to pursue research and development projects that apply a human-centered approach to problems in healthcare, rehabilitation, education, and training, with results showcased at prestigious conferences such as IEEE VR, ISMAR, and CVPR. Our research spans a wide range of topics, including but not limited to:
User experience design for immersive environments.
Interaction techniques for virtual, augmented, and mixed reality.
Intelligent and contextually aware immersive systems.
Applications of virtual, augmented, and mixed reality in healthcare, rehabilitation, education, and training.
Thank you for visiting our lab website. Feel free to contact us if you want to know more about us!
Recent Updates:
New Funding Alert! Our lab has been awarded a DARPA ICS project titled “VeriPro: Verified Probabilistic Cognitive Reasoning for Tactical Mixed Reality Systems,” with a funding amount of $400,000 (August 2024 - Present).
Welcome New Team Members! Two new graduate students have joined HIX-Lab: Nasim (MS-CS student) and Ridwan (CS-PhD student). We are excited to have them on board!
Summer Research Fellow! We’re thrilled to announce that Dr. Islam (Director of HIX-Lab) received the Summer Research Fellowship at KSU for Summer 2024.
Publication News! Our latest paper, “Investigating Personalization Techniques for Improved Cybersickness Prediction in Virtual Reality Environments,” has been accepted for publication in the IEEE Transactions on Visualization and Computer Graphics (TVCG) track for IEEE VR 2024 (acceptance rate ~10%).
Funding Alert! We’ve also secured funding from DEVCOM Army Research Lab for the project “Towards Personalized Automatic Cybersickness Prediction and Reduction for Army Personnel,” with a total grant of $465,000 (December 2023 - Present). Our lab’s portion is $151,000.
Keynote Speaker! We delivered a keynote on “Towards Intelligent and Data-driven Adaptive Extended Reality” at the Data4XR workshop during IEEE VR 2023.
Our paper “LiteVR: Interpretable and Lightweight Cybersickness Detection using Explainable AI” has been accepted at IEEE VR 2023. Congratulations to the collaborators! (video presentation) (Accepted for Conference, Acceptance Rate: 21%)
Invited Talk: “Automatic Cybersickness Detection using Multimodal Data” at the Workshop on Enhancing User Comfort, Health and Safety in VR and AR at IEEE ISMAR 2022.
R. Islam, K. Desai, and J. Quarles, “Towards Forecasting the Onset of Cybersickness by Fusing Physiological, Head-tracking, and Eye-tracking with Multimodal Deep Fusion Network,” 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (Accepted for Conference, Acceptance Rate: 21%)
R. Kundu, R. Islam, and K. Hoque, “TruVR: Trustworthy Cybersickness Detection using Explainable Machine Learning,” 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (Accepted for Conference, Acceptance Rate: 22%)
Our paper “Cybersickness Prediction from Integrated HMD’s Sensors: A Multimodal Deep Fusion Approach using Eye-tracking and Head-tracking Data” was accepted at the 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (video link)
Current Research Projects:
Cybersickness Personalization for US Army Tactical Use Case
This project tackles the critical issue of cybersickness among army personnel during Extended Reality (XR) exposures, particularly with systems like the Army's IVAS headset.
Funded By: DEVCOM Army Research Lab
Total Funding: $152,000 for 36 months.
VeriPro: Verified Probabilistic Cognitive Reasoning for Tactical Mixed Reality Systems
In this project, our research team will employ various cognitive modeling techniques to detect and defend against cognitive attacks in tactical mixed reality systems.
Funded By: Defense Advanced Research Projects Agency (DARPA)
Total Funding: $400,000 for 36 months.
XR-Cane for People with Vision Impairments
In this project, our research team will design and develop innovative interaction models using extended reality systems to deliver multimodal feedback, enhancing navigation safety and effectiveness for individuals with visual impairments.