Human gender estimation from CT images of skull using deep feature selection and feature fusion

dc.authorscopusid: 57441648700
dc.authorscopusid: 56779958100
dc.authorscopusid: 57873129600
dc.authorscopusid: 56416374000
dc.authorscopusid: 55326012400
dc.authorscopusid: 57194337237
dc.contributor.author: Çiftçi R.
dc.contributor.author: Dönmez E.
dc.contributor.author: Kurtoğlu A.
dc.contributor.author: Eken Ö.
dc.contributor.author: Samee N.A.
dc.contributor.author: Alkanhel R.I.
dc.date.accessioned: 2024-08-04T20:03:30Z
dc.date.available: 2024-08-04T20:03:30Z
dc.date.issued: 2024
dc.department: İnönü Üniversitesi [en_US]
dc.description.abstract: This study estimates gender from skull computed tomography (CT) images, given the central role of gender determination in human identification. The dataset comprises cranial CT images of 218 male and 203 female subjects, a total of 421 individuals aged 25 to 65 years. Convolutional neural network (CNN) models, a prominent class of deep learning algorithms, are used to extract deep features from the skull CT images. Applying the deep learning pipeline directly to the image data yields an accuracy of 96.4%, with a precision of 96.1% for male subjects and 96.8% for female subjects. Performance also depends on the number of selected features: with 100, 300, and 500 selected features, and with all 1000 features (no feature selection), the corresponding precision rates are 95.0%, 95.5%, 96.2%, and 96.4%. Gender estimation from radiographic images reduces measurement discrepancies between experts and provides faster estimates. The findings indicate that the choice of CNN model, the configuration of the classifier, and the feature selection strategy are the main factors determining the performance of the proposed method. © The Author(s) 2024. [en_US]
dc.description.sponsorship: Princess Nourah Bint Abdulrahman University, PNU: PNURSP2024R323 [en_US]
dc.description.sponsorship: This research was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2024R323), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia. [en_US]
dc.identifier.doi: 10.1038/s41598-024-65521-3
dc.identifier.issn: 2045-2322
dc.identifier.issue: 1 [en_US]
dc.identifier.pmid: 39043755 [en_US]
dc.identifier.scopus: 2-s2.0-85199308314 [en_US]
dc.identifier.scopusquality: Q1 [en_US]
dc.identifier.uri: https://doi.org/10.1038/s41598-024-65521-3
dc.identifier.uri: https://hdl.handle.net/11616/91876
dc.identifier.volume: 14 [en_US]
dc.indekslendigikaynak: Scopus [en_US]
dc.indekslendigikaynak: PubMed [en_US]
dc.language.iso: en [en_US]
dc.publisher: Nature Research [en_US]
dc.relation.ispartof: Scientific Reports [en_US]
dc.relation.publicationcategory: Article - International Refereed Journal - Institutional Faculty Member [en_US]
dc.rights: info:eu-repo/semantics/openAccess [en_US]
dc.subject: Convolutional neural networks (CNN) [en_US]
dc.subject: Deep learning algorithms [en_US]
dc.subject: Gender prognostication [en_US]
dc.subject: Precision gender estimation [en_US]
dc.subject: Skull computed tomography [en_US]
dc.title: Human gender estimation from CT images of skull using deep feature selection and feature fusion [en_US]
dc.type: Article [en_US]
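The abstract describes a pipeline of deep feature extraction with a CNN, selection of the top 100, 300, or 500 features (or all 1000 without selection), and gender classification. The short Python sketch below only illustrates that general approach under assumed components, not the authors' actual implementation: the ResNet-50 backbone, the ANOVA F-test selector (scikit-learn's SelectKBest), the linear SVM classifier, and the 2048-dimensional feature vector are all assumptions, and the feature-fusion step named in the title (e.g., concatenating features from several backbones) is omitted for brevity.

# Illustrative sketch, not the authors' code: extract deep features from skull CT
# slices with a pretrained CNN, keep the k most discriminative features, and
# classify gender with a linear SVM. Backbone, selector, and classifier are assumptions.
import numpy as np
import torch
from torchvision import models
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score
from sklearn.svm import SVC


def build_feature_extractor():
    """Pretrained ResNet-50 with its classification head removed (2048-d output)."""
    backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()  # keep the pooled embedding instead of class logits
    backbone.eval()
    return backbone


@torch.no_grad()
def extract_features(extractor, images):
    """images: float tensor of shape (N, 3, 224, 224), already preprocessed/normalized."""
    return extractor(images).cpu().numpy()


def classify_with_k_features(features, labels, k):
    """Select the k highest-scoring features (ANOVA F-test), then fit and score an SVM."""
    x_train, x_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, stratify=labels, random_state=0)
    selector = SelectKBest(f_classif, k=k).fit(x_train, y_train)
    clf = SVC(kernel="linear").fit(selector.transform(x_train), y_train)
    y_pred = clf.predict(selector.transform(x_test))
    return accuracy_score(y_test, y_pred), precision_score(y_test, y_pred, zero_division=0)


if __name__ == "__main__":
    # Random stand-ins for preprocessed CT slices; labels: 0 = female, 1 = male.
    images = torch.rand(32, 3, 224, 224)
    labels = np.random.randint(0, 2, size=32)
    feats = extract_features(build_feature_extractor(), images)
    for k in (100, 300, 500):  # feature counts mirroring those reported in the abstract
        acc, prec = classify_with_k_features(feats, labels, k=k)
        print(f"k={k}: accuracy={acc:.3f}, precision={prec:.3f}")

In the actual study, each feature count would be evaluated on the real CT dataset with a proper validation protocol rather than the random placeholder tensors used here.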
