Multimodal Diagnosis Prediction Through Neuro-Symbolic Integration
DOI: https://doi.org/10.47392/IRJAEM.2026.0062

Keywords: Alzheimer’s Disease, Neuro-Symbolic AI, Explainable AI, Logical Neural Networks, Multimodal Learning, Mild Cognitive Impairment

Abstract
Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that gradually impairs memory and cognitive function. Early diagnosis, particularly at the Mild Cognitive Impairment (MCI) stage, is essential for enabling timely intervention and effective disease management. Although deep learning–based diagnostic systems achieve high classification accuracy using MRI data, their black-box nature limits interpretability and reduces clinical trust. Most existing frameworks focus primarily on single-modality imaging data, neglecting the integration of complementary clinical and genetic information. Recent neuro-symbolic approaches, such as NeuroSymAD and Logical Neural Networks (LNNs), have introduced explainable reasoning by combining neural perception with symbolic logic. However, these systems remain limited in multimodal integration and fine-grained staging of Alzheimer’s disease. To address these limitations, this paper proposes a Multimodal Diagnosis Prediction framework through Neuro-Symbolic Integration. The proposed system integrates MRI images, clinical assessments, and genetic biomarkers within an interpretable architecture that combines deep neural networks with symbolic reasoning. The framework generates both diagnostic predictions and transparent rule-based explanations, enhancing accuracy, interpretability, and clinical reliability for Alzheimer’s disease detection.
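The combination the abstract describes — neural scores from each modality fused into a prediction, with symbolic rules supplying a human-readable explanation — can be illustrated with a minimal sketch. This is not the paper's actual architecture or rule base; the feature names (`mri_score`, `mmse`, APOE-e4 carrier status), weights, and thresholds are all illustrative assumptions.

```python
# Hedged sketch of neuro-symbolic multimodal diagnosis: a fusion score from
# three modalities plus simple symbolic rules that explain the prediction.
# All weights, thresholds, and feature names are illustrative assumptions.

def fuse(mri_score, clinical, genetic_risk):
    """Combine per-modality evidence into one risk score in [0, 1].

    mri_score:    assumed neural-network atrophy score in [0, 1]
    clinical:     dict with an 'mmse' cognitive test score (0-30)
    genetic_risk: 1.0 if an APOE-e4 allele is present, else 0.0
    """
    mmse_deficit = max(0.0, (30 - clinical["mmse"]) / 30)
    return 0.5 * mri_score + 0.3 * mmse_deficit + 0.2 * genetic_risk

def diagnose(mri_score, clinical, genetic_risk):
    """Return (label, fired_rules): a staged prediction plus its explanation."""
    risk = fuse(mri_score, clinical, genetic_risk)
    rules = []  # symbolic layer: each fired rule is a transparent justification
    if clinical["mmse"] < 24:
        rules.append("MMSE < 24 suggests cognitive impairment")
    if mri_score > 0.6:
        rules.append("MRI atrophy score > 0.6 suggests neurodegeneration")
    if genetic_risk > 0:
        rules.append("APOE-e4 carrier raises genetic risk")
    if risk > 0.6:
        label = "AD"
    elif risk > 0.35:
        label = "MCI"
    else:
        label = "CN"  # cognitively normal
    return label, rules
```

In a full system the hand-set thresholds would be learnable truth bounds in a Logical Neural Network, and the fusion would operate on learned embeddings rather than scalar scores; the sketch only shows how a prediction and a rule-based explanation can be produced together.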
License
Copyright (c) 2026 International Research Journal on Advanced Engineering and Management (IRJAEM)

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.