Our Lab is located in the Atrium Building at KSU Marietta Campus
Room: J-220A and R2 331
Email: rislam11@kennesaw.edu
Welcome to the Human-Centered Immersive eXperience Lab (HIX-Lab) at Kennesaw State University! Our research focus is virtual, augmented, and mixed reality, with a strong emphasis on intelligent immersive systems, immersive user experiences, accessibility, and human-computer interaction. We take a human-centered approach to design and develop innovative solutions addressing real-world challenges in healthcare, rehabilitation, education, and training. Our research is regularly featured at prestigious international conferences such as IEEE VR, ISMAR, and CVPR.
Our key research areas include, but are not limited to:
Intelligent and Contextually Aware Immersive Systems.
Human-Centered Design for Immersive Environments.
Novel Interaction Techniques in Virtual, Augmented, and Mixed Reality.
Applied Research in Virtual, Augmented, and Mixed Reality Technologies for healthcare, rehabilitation, education, and training.
Thank you for visiting our website. Feel free to reach out if you'd like to learn more about our work!
Congratulations to Md. Jahirul Islam for successfully completing his MS thesis defense.
New Funding Alert! Our lab has been awarded a DARPA ICS project titled “VeriPro: Verified Probabilistic Cognitive Reasoning for Tactical Mixed Reality Systems,” with a funding amount of $400,000 (August 2024 - Present).
Welcome New Team Members! Two new graduate students have joined the HIX-Lab: Nasim (MS-CS student) and Ridwan (PhD-CS student). We are excited to have them on board!
Summer Research Fellow! We’re thrilled to announce that Dr. Islam (Director of the HIX-Lab) received the Summer Research Fellowship at KSU for Summer 2024.
Publication News! Our latest paper, “Investigating Personalization Techniques for Improved Cybersickness Prediction in Virtual Reality Environments,” has been accepted for publication in the IEEE Transactions on Visualization and Computer Graphics (TVCG) track for IEEE VR 2024 (acceptance rate ~10%).
Funding Alert! We’ve also secured funding from DEVCOM Army Research Lab for the project “Towards Personalized Automatic Cybersickness Prediction and Reduction for Army Personnel,” with a total grant of $465,000 (December 2023 - Present). Our lab’s portion is $151,000.
Keynote Speaker! We delivered a keynote on “Towards Intelligent and Data-driven Adaptive Extended Reality” at the Data4XR workshop during IEEE VR 2023.
Publication News! Our paper “LiteVR: Interpretable and Lightweight Cybersickness Detection using Explainable AI” was accepted at IEEE VR 2023 (Accepted for Conference, Acceptance Rate: 21%). Congratulations to the collaborators! (video presentation)
Invited Talk: “Automatic Cybersickness Detection using Multimodal Data,” presented at the Workshop on Enhancing User Comfort, Health and Safety in VR and AR at IEEE ISMAR 2022.
R. Islam, K. Desai, and J. Quarles, “Towards Forecasting the Onset of Cybersickness by Fusing Physiological, Head-tracking, and Eye-tracking with Multimodal Deep Fusion Network,” 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (Accepted for Conference, Acceptance Rate: 21%)
R. Kundu, R. Islam, and K. Hoque, “TruVR: Trustworthy Cybersickness Detection using Explainable Machine Learning,” 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (Accepted for Conference, Acceptance Rate: 22%)
Our paper “Cybersickness Prediction from Integrated HMD’s Sensors: A Multimodal Deep Fusion Approach using Eye-tracking and Head-tracking Data” was accepted at the 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (video link)
This project tackles the critical issue of cybersickness among army personnel during Extended Reality (XR) exposures, particularly with systems like the Army's IVAS headset.
Funded By: DEVCOM Army Research Lab
Total Funding: $152,000 for 36 months.
In this project, our research team will employ various cognitive modeling techniques to detect and defend against cognitive attacks in tactical Mixed Reality systems.
Funded By: Defense Advanced Research Projects Agency (DARPA)
Total Funding: $400,000 for 36 months.
In this project, our research team will design and develop innovative interaction models using extended reality systems to deliver multimodal feedback, enhancing navigation safety and effectiveness for individuals with visual impairments.