Knowledge Distillation-Based TinyML Model for Breast Cancer Detection Using Real and Wasserstein GAN-Generated Microwave Imaging Data.

Tiny Machine Learning (TinyML) offers a transformative approach to efficient, intelligent cancer diagnostics on edge devices. However, designing optimized TinyML models for breast cancer diagnosis using microwave imaging datasets remains a challenge due to domain-specific customization requirements. To address this, we propose a novel knowledge distillation framework in which a teacher model, built from residual convolutional neural networks, transfers its knowledge to a lightweight student model, enhancing efficiency while maintaining high accuracy. The teacher model achieves 95.42% accuracy on test data. Through knowledge distillation, the student model attains 95.32% accuracy while achieving a 96% reduction in model size, significantly enhancing computational efficiency without compromising performance; without distillation, the same student model reaches only 86.5% test accuracy. A major contribution of this work is the generation of a high-quality synthetic breast cancer dataset to address the scarcity of microwave imaging data. We employ a Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) to synthesize realistic breast cancer scans, ensuring reliability through validation with a one-class Support Vector Machine. By training all models on a combination of real and synthetic data, our approach enhances robustness, making TinyML-powered breast cancer detection more viable for real-world deployment.

Clinical Relevance: This work enables efficient, accurate breast cancer detection on edge devices, supporting early diagnosis in clinical settings.
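The abstract names two standard building blocks: a Hinton-style knowledge distillation loss for the teacher-student transfer, and the WGAN-GP gradient penalty for stable synthetic-image generation. As a rough, hedged illustration of those standard formulations (not the authors' implementation), a PyTorch-style sketch might look like the following; the temperature T, the alpha weighting, and the function names are assumptions for illustration only, not values taken from the paper.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        # Soft-target KL term (teacher -> student), scaled by T^2 as in the
        # standard distillation objective, blended with ordinary cross-entropy
        # on the ground-truth labels. T and alpha are illustrative choices.
        soft_targets = F.softmax(teacher_logits / T, dim=1)
        log_student = F.log_softmax(student_logits / T, dim=1)
        kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
        ce_term = F.cross_entropy(student_logits, labels)
        return alpha * kd_term + (1.0 - alpha) * ce_term

    def gradient_penalty(critic, real_imgs, fake_imgs, device="cpu"):
        # WGAN-GP penalty: drive the critic's gradient norm toward 1 on random
        # interpolations between real and generated scans.
        batch = real_imgs.size(0)
        eps = torch.rand(batch, 1, 1, 1, device=device)
        mixed = (eps * real_imgs + (1 - eps) * fake_imgs).requires_grad_(True)
        scores = critic(mixed)
        grads = torch.autograd.grad(outputs=scores, inputs=mixed,
                                    grad_outputs=torch.ones_like(scores),
                                    create_graph=True)[0]
        return ((grads.view(batch, -1).norm(2, dim=1) - 1.0) ** 2).mean()

In a setup like the one described, the gradient penalty would be added to the critic loss with a weighting coefficient (commonly 10 in the WGAN-GP literature), and the distillation loss would train the lightweight student against the frozen residual-CNN teacher; the paper's exact hyperparameters and architectures are not given in this abstract.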
Topics: Cancer, Access, Care/Management, Advocacy

Authors

Khalid, Dagli, Obafemi-Ajayi, Wunsch