https://doi.org/10.1051/epjconf/202532801059
ChestXFusionNet: A Multimodal Deep Learning Framework for Predicting Chest Diseases from X-ray Images and Clinical Data
Sandip University, School of Computer Science and Engineering, Nashik, India
* Corresponding author: nivedita.shimbre@gmail.com
Published online: 18 June 2025
The growing availability of multimodal healthcare data has opened new avenues for developing intelligent systems capable of early and accurate disease prediction. This study presents the design and development of a robust multimodal deep learning system for predicting chest diseases from clinical data and chest X-ray images. Our framework, ChestXFusionNet, employs a 2D CNN to extract features from chest X-ray images and a 1D deep network to process auxiliary patient data (e.g., age, gender, view position); the two feature streams are then fused for comprehensive chest disease prediction. To improve the performance of the proposed model, a blended activation map of the input image is generated and passed to the network. Leveraging the NIH Chest X-ray dataset, which comprises over 100,000 frontal-view chest radiographs annotated with 14 disease labels, the system combines image-based learning (via transfer learning) with metadata-driven learning for binary classification (finding vs. no finding). Rigorous experimentation and evaluation demonstrate that the system classifies chest X-rays as finding or no finding with an accuracy of 92%. This work contributes to the field of medical diagnosis by offering a scalable, computer-driven framework that supports radiologists in clinical decision-making and paves the way for future multimodal AI solutions in healthcare. The system demonstrates the value of integrating heterogeneous data types to build intelligent, generalizable disease prediction models.
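To make the two-branch fusion design concrete, the following is a minimal PyTorch sketch of the architecture the abstract describes: a 2D CNN image branch, a 1D metadata branch, and concatenation-based fusion feeding a binary classifier. The specific backbone (ResNet-18), layer widths, and metadata encoding here are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

class ChestXFusionNetSketch(nn.Module):
    """Two-branch fusion model: a 2D CNN image branch and a 1D metadata
    branch, concatenated for binary classification (finding vs. no finding)."""

    def __init__(self, num_meta_features: int = 3):
        super().__init__()
        # Image branch: an ImageNet-pretrained ResNet-18 as a stand-in
        # transfer-learning backbone (the paper's exact backbone is not
        # specified in the abstract).
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        backbone.fc = nn.Identity()            # keep the 512-d image features
        self.image_branch = backbone

        # Metadata branch: a small MLP over encoded age, gender, view position.
        self.meta_branch = nn.Sequential(
            nn.Linear(num_meta_features, 32),
            nn.ReLU(),
            nn.Linear(32, 32),
            nn.ReLU(),
        )

        # Fusion head: concatenate both feature vectors, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(512 + 32, 64),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(64, 1),                  # single logit: finding / no finding
        )

    def forward(self, image: torch.Tensor, meta: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(image)    # (B, 512)
        meta_feat = self.meta_branch(meta)     # (B, 32)
        fused = torch.cat([img_feat, meta_feat], dim=1)
        return self.classifier(fused)          # raw logits for BCEWithLogitsLoss

# Example forward pass on dummy inputs shaped like NIH Chest X-ray data.
model = ChestXFusionNetSketch()
images = torch.randn(4, 3, 224, 224)           # batch of 4 RGB-converted X-rays
metadata = torch.randn(4, 3)                   # age, gender, view position (encoded)
logits = model(images, metadata)
print(logits.shape)                            # torch.Size([4, 1])
```

Concatenation is the simplest fusion strategy; the paper's activation blending step would preprocess the image input before it reaches such a network.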
© The Authors, published by EDP Sciences, 2025
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.