Deep learning for gender estimation using hand radiographs: a comparative evaluation of CNN models

dc.contributor.authorUlubaba, Hilal Er
dc.contributor.authorAtik, Ipek
dc.contributor.authorCiftci, Rukiye
dc.contributor.authorEken, Ozgur
dc.contributor.authorAldhahi, Monira I.
dc.date.accessioned2026-04-04T13:33:10Z
dc.date.available2026-04-04T13:33:10Z
dc.date.issued2025
dc.departmentİnönü Üniversitesi
dc.description.abstractBackground: Accurate gender estimation plays a crucial role in forensic identification, especially in mass disasters or cases involving fragmented or decomposed remains where traditional skeletal landmarks are unavailable. This study aimed to develop a deep learning-based model for gender classification using hand radiographs, offering a rapid and objective alternative to conventional methods. Methods: We analyzed 470 left-hand X-ray images from adults aged 18 to 65 years using four convolutional neural network (CNN) architectures: ResNet-18, ResNet-50, InceptionV3, and EfficientNet-B0. Following image preprocessing and data augmentation, models were trained and validated using standard classification metrics: accuracy, precision, recall, and F1 score. Data augmentation included random rotation, horizontal flipping, and brightness adjustments to enhance model generalization. Results: Among the tested models, ResNet-50 achieved the highest classification accuracy (93.2%), with precision of 92.4%, recall of 93.3%, and F1 score of 92.5%. While the other models demonstrated acceptable performance, ResNet-50 consistently outperformed them across all metrics. These findings suggest that CNNs can reliably extract sexually dimorphic features from hand radiographs. Conclusions: Deep learning approaches, particularly ResNet-50, provide a robust, scalable, and efficient solution for gender prediction from hand X-ray images. This method may serve as a valuable tool in forensic scenarios where speed and reliability are critical. Future research should validate these findings across diverse populations and incorporate explainable AI techniques to enhance interpretability.
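The abstract reports accuracy, precision, recall, and F1 score for each CNN. As a minimal illustrative sketch (not the authors' code), the snippet below computes these four metrics for a binary gender classifier from predicted and true labels; the function name and the choice of positive class are assumptions for illustration only.

```python
# Hedged sketch: standard binary-classification metrics as reported in
# the study (accuracy, precision, recall, F1). Pure Python, no external
# dependencies; label conventions here are illustrative assumptions.

def binary_metrics(y_true, y_pred, positive=1):
    """Return (accuracy, precision, recall, f1) for binary label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)

    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return accuracy, precision, recall, f1
```

Note that precision and recall depend on which class is treated as positive; the abstract does not state the positive-class convention used, so per-class or macro-averaged reporting would both be consistent with the figures given.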
dc.description.sponsorshipPrincess Nourah bint Abdulrahman University [PNURSP2025R286]; Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia
dc.description.sponsorshipWe would like to thank Princess Nourah bint Abdulrahman University for supporting this project through Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R286), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
dc.identifier.doi10.1186/s12880-025-01809-8
dc.identifier.issn1471-2342
dc.identifier.issue1
dc.identifier.orcid0000-0003-2124-4525
dc.identifier.orcid0000-0002-5488-3158
dc.identifier.pmid40597748
dc.identifier.scopus2-s2.0-105009738955
dc.identifier.scopusqualityN/A
dc.identifier.urihttps://doi.org/10.1186/s12880-025-01809-8
dc.identifier.urihttps://hdl.handle.net/11616/108975
dc.identifier.volume25
dc.identifier.wosWOS:001522890200004
dc.identifier.wosqualityQ1
dc.indekslendigikaynakWeb of Science
dc.indekslendigikaynakScopus
dc.indekslendigikaynakPubMed
dc.language.isoen
dc.publisherBMC
dc.relation.ispartofBMC Medical Imaging
dc.relation.publicationcategoryArticle - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rightsinfo:eu-repo/semantics/openAccess
dc.snmzKA_WOS_20250329
dc.subjectGender prediction
dc.subjectHand radiograph
dc.subjectDeep learning
dc.subjectConvolutional neural network (CNN)
dc.subjectForensic identification
dc.subjectResNet-50
dc.subjectArtificial intelligence
dc.subjectMass disaster response
dc.titleDeep learning for gender estimation using hand radiographs: a comparative evaluation of CNN models
dc.typeArticle

Files