Multimodal Diagnosis Prediction Through Neuro-Symbolic Integration

Authors

  • Dr. R.G. Suresh Kumar, Professor & HoD, Dept. of CSE, Rajiv Gandhi College of Engineering & Technology, Puducherry
  • Kaviya K, PG Scholar, Dept. of CSE, Rajiv Gandhi College of Engineering & Technology, Puducherry

DOI:

https://doi.org/10.47392/IRJAEM.2026.0062

Keywords:

Alzheimer’s Disease, Neuro-Symbolic AI, Explainable AI, Logical Neural Networks, Multimodal Learning, Mild Cognitive Impairment

Abstract

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that gradually impairs memory and cognitive function. Early diagnosis, particularly at the Mild Cognitive Impairment (MCI) stage, is essential for enabling timely intervention and effective disease management. Although deep learning–based diagnostic systems achieve high classification accuracy using MRI data, their black-box nature limits interpretability and reduces clinical trust. Most existing frameworks focus primarily on single-modality imaging data, neglecting the integration of complementary clinical and genetic information. Recent neuro-symbolic approaches, such as NeuroSymAD and Logical Neural Networks (LNNs), have introduced explainable reasoning by combining neural perception with symbolic logic. However, these systems remain limited in multimodal integration and fine-grained staging of Alzheimer’s disease. To address these limitations, this paper proposes a Multimodal Diagnosis Prediction framework through Neuro-Symbolic Integration. The proposed system integrates MRI images, clinical assessments, and genetic biomarkers within an interpretable architecture that combines deep neural networks with symbolic reasoning. The framework generates both diagnostic predictions and transparent rule-based explanations, enhancing accuracy, interpretability, and clinical reliability for Alzheimer’s disease detection.
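The abstract describes a pipeline that fuses neural predictions over multimodal inputs with symbolic, rule-based reasoning. The paper's actual architecture (deep networks plus Logical Neural Networks) is not reproduced here, so the following is only a minimal toy sketch of the general neuro-symbolic pattern: a stand-in "neural" risk score over MRI, clinical (MMSE), and genetic (APOE-ε4) features, with hypothetical symbolic rules that both decide the stage and supply a human-readable explanation. All feature names, thresholds, weights, and rules below are illustrative assumptions, not values from the paper.

```python
# Toy neuro-symbolic sketch (illustrative only; not the paper's model).
# Features, weights, and rule thresholds are invented for demonstration.

def neural_score(atrophy, mmse, apoe_e4_alleles):
    """Stand-in for the multimodal neural branch: maps MRI atrophy (0-1),
    MMSE score (0-30), and APOE-e4 allele count (0-2) to a risk in [0, 1]."""
    mmse_risk = max(0.0, (30 - mmse) / 30)          # lower MMSE -> higher risk
    genetic_risk = min(apoe_e4_alleles / 2, 1.0)    # 0, 1, or 2 e4 alleles
    return 0.5 * atrophy + 0.3 * mmse_risk + 0.2 * genetic_risk

# Hypothetical symbolic layer: (rule name, predicate, verdict).
RULES = [
    ("R1: severe atrophy AND low MMSE -> AD",
     lambda f: f["atrophy"] > 0.7 and f["mmse"] < 21, "AD"),
    ("R2: mild atrophy AND borderline MMSE -> MCI",
     lambda f: 0.3 < f["atrophy"] <= 0.7 and 21 <= f["mmse"] < 27, "MCI"),
]

def diagnose(atrophy, mmse, apoe_e4_alleles):
    """Return (stage, neural risk score, explanation string)."""
    score = neural_score(atrophy, mmse, apoe_e4_alleles)
    feats = {"atrophy": atrophy, "mmse": mmse}
    for name, predicate, verdict in RULES:
        if predicate(feats):                 # a fired rule explains the verdict
            return verdict, score, name
    # Fall back to thresholding the neural score when no rule fires.
    stage = "AD" if score > 0.6 else ("MCI" if score > 0.3 else "CN")
    return stage, score, "neural score only (no symbolic rule fired)"
```

For example, `diagnose(0.8, 18, 2)` fires rule R1 and returns an AD verdict together with the rule text as its explanation, which is the kind of transparent, rule-based output the abstract attributes to the proposed framework.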


Published

2026-03-18