TUBERCULOSIS DETECTION USING CNN AND VISUALISATION USING EXTENDED REALITY
Abstract
Tuberculosis (TB) continues to pose a significant global health challenge, particularly in resource-limited settings where timely and accurate diagnosis is crucial for effective treatment and containment. The World Health Organization (WHO) emphasizes the urgent need for innovative diagnostic strategies to address this persistent infectious disease, as detailed in their latest global tuberculosis report [1]. Traditional diagnostic methods, such as sputum microscopy and culture, often suffer from limitations in sensitivity and processing time, leading to diagnostic delays and increased transmission risks. This paper introduces an innovative approach that synergistically combines deep learning, specifically Convolutional Neural Networks (CNNs), with Extended Reality (XR) visualization to improve the efficiency and accuracy of TB detection from chest X-ray (CXR) images.
We propose a robust CNN architecture, leveraging transfer learning from pre-trained models, to automate the identification of radiographic abnormalities indicative of pulmonary TB. The network is trained and validated on a large, diverse dataset of CXR images, encompassing both healthy and TB-infected cases, and employs data augmentation strategies to enhance model generalization and robustness. As noted in recent studies on deep learning and medical image analysis, CNNs have shown remarkable capabilities in automating the detection of various diseases, including TB, from medical images [2]. The output of the CNN, which includes localization maps highlighting potential TB lesions, is then integrated into an immersive XR environment. This XR visualization tool allows clinicians to interact with and examine the CXR images in a three-dimensional (3D) space, offering a more intuitive and comprehensive understanding of radiographic findings.
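The transfer-learning setup described above can be sketched as follows. This is a minimal illustration, not the authors' actual architecture: the choice of MobileNetV2 as the pre-trained backbone, the single sigmoid output for binary TB/normal classification, and the specific augmentation operations are all assumptions for demonstration purposes.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative data-augmentation pipeline (operations are assumed, not from the paper).
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.05),
    layers.RandomZoom(0.1),
])

def build_tb_classifier(input_shape=(224, 224, 3), weights="imagenet"):
    """Binary TB-vs-normal CXR classifier via transfer learning (hypothetical sketch)."""
    # Pre-trained backbone with its classification head removed.
    base = tf.keras.applications.MobileNetV2(
        input_shape=input_shape, include_top=False, weights=weights)
    base.trainable = False  # freeze pre-trained weights for initial training

    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),  # P(TB) for the input CXR
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
    return model
```

In practice the localization maps mentioned above would be derived from a model like this with a class-activation technique such as Grad-CAM applied to the backbone's final convolutional layer.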
The integration of XR technologies, which provide immersive visualization capabilities, has the potential to transform medical data interpretation [3]. By transforming the CNN's output into a volumetric representation, we enable clinicians to navigate through the lung parenchyma, identify subtle abnormalities, and assess the extent of disease involvement with greater precision. This approach effectively addresses the limitations of traditional two-dimensional (2D) CXR interpretation, which often suffers from overlapping anatomical structures and restricted depth perception. The enhanced visualization capabilities provided by XR can lead to more rapid and accurate diagnoses, ultimately contributing to better patient outcomes and reduced TB transmission. We evaluate the performance of the CNN model using standard diagnostic metrics, including accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curve (AUC). The effectiveness of the XR visualization tool is assessed through user studies, where clinicians evaluate its usability, diagnostic accuracy, and impact on their confidence in TB detection. This integrated approach, combining the power of deep learning with the immersive capabilities of XR, offers a promising solution for improving TB detection in clinical practice, particularly in resource-constrained settings where access to expert radiologists is limited.
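The diagnostic metrics cited above follow directly from the binary confusion matrix. A minimal sketch, independent of any particular model (the function name and example labels are illustrative, not from the paper):

```python
def diagnostic_metrics(y_true, y_pred):
    """Accuracy, sensitivity, and specificity from binary labels (1 = TB, 0 = healthy)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    return {
        "accuracy": (tp + tn) / len(y_true),
        # Sensitivity: fraction of TB cases correctly flagged (recall on positives).
        "sensitivity": tp / (tp + fn) if (tp + fn) else 0.0,
        # Specificity: fraction of healthy cases correctly cleared (recall on negatives).
        "specificity": tn / (tn + fp) if (tn + fp) else 0.0,
    }

# Example on a toy set of 8 cases:
m = diagnostic_metrics([1, 1, 1, 0, 0, 0, 1, 0],
                       [1, 1, 0, 0, 0, 1, 1, 0])
# → {'accuracy': 0.75, 'sensitivity': 0.75, 'specificity': 0.75}
```

The AUC additionally requires the model's continuous scores rather than hard labels, since it sweeps the decision threshold across the receiver operating characteristic curve.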
Authors
Mrs. A. Aafiya Thahaseen, Dr. R. Nithiavathy, Mohamed Sulaiman M, Shaik Zuber B, Mohammed Raffek A, Navaneetha Krishnan T