Comparison of the Stochastic Gradient Descent Based Optimization Techniques

dc.authoridTalu, Muhammed Fatih/0000-0003-1166-8404
dc.authorwosidTalu, Muhammed Fatih/W-2834-2017
dc.contributor.authorYazan, Ersan
dc.contributor.authorTalu, M. Fatih
dc.date.accessioned2024-08-04T20:44:14Z
dc.date.available2024-08-04T20:44:14Z
dc.date.issued2017
dc.departmentİnönü Üniversitesien_US
dc.description2017 International Artificial Intelligence and Data Processing Symposium (IDAP) -- SEP 16-17, 2017 -- Malatya, TURKEYen_US
dc.description.abstractThe stochastic gradient descent (SGD) method is a popular optimization technique based on updating each parameter θ_k in the direction of the partial derivative ∂J(θ)/∂θ_k in order to minimize/maximize the cost function J(θ). This technique is frequently used in current deep learning methods such as convolutional neural networks and autoencoders. In this study, five different SGD-based approaches (Momentum, Adagrad, Adadelta, RMSprop and Adam) used for updating the θ parameters were investigated. Using selected test functions, the advantages and disadvantages of each approach are compared in terms of the number of oscillations, the parameter update rate and the minimum cost reached. The comparison results are shown graphically.en_US
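The update rule described in the abstract and the five variants the paper compares can be illustrated with a minimal sketch. The quadratic cost function, learning rate, step count and decay constants below are illustrative assumptions only; the paper's actual test functions and hyperparameter settings are not given in this record.

```python
import numpy as np

# Stand-in cost function J(theta): a simple quadratic bowl
# (the paper's specific test functions are not listed in this record).
def J(theta):
    return 0.5 * np.sum(theta ** 2)

def grad_J(theta):
    return theta  # dJ/d(theta_k) = theta_k for the quadratic above

def sgd_variants(theta0, lr=0.1, steps=100, eps=1e-8):
    """Run the five update rules from the same starting point and
    return the final cost reached by each (assumed hyperparameters)."""
    results = {}
    for method in ["momentum", "adagrad", "adadelta", "rmsprop", "adam"]:
        theta = theta0.copy()
        v = np.zeros_like(theta)  # velocity / first moment
        s = np.zeros_like(theta)  # squared-gradient accumulator / second moment
        d = np.zeros_like(theta)  # Adadelta: accumulated squared updates
        for t in range(1, steps + 1):
            g = grad_J(theta)
            if method == "momentum":
                v = 0.9 * v + lr * g
                theta -= v
            elif method == "adagrad":
                s += g ** 2
                theta -= lr * g / (np.sqrt(s) + eps)
            elif method == "adadelta":
                s = 0.95 * s + 0.05 * g ** 2
                update = np.sqrt(d + eps) / np.sqrt(s + eps) * g
                theta -= update
                d = 0.95 * d + 0.05 * update ** 2
            elif method == "rmsprop":
                s = 0.9 * s + 0.1 * g ** 2
                theta -= lr * g / (np.sqrt(s) + eps)
            elif method == "adam":
                v = 0.9 * v + 0.1 * g          # beta1 = 0.9
                s = 0.999 * s + 0.001 * g ** 2  # beta2 = 0.999
                v_hat = v / (1 - 0.9 ** t)      # bias correction
                s_hat = s / (1 - 0.999 ** t)
                theta -= lr * v_hat / (np.sqrt(s_hat) + eps)
        results[method] = J(theta)
    return results

print(sgd_variants(np.array([3.0, -2.0])))
```

Running the sketch on the quadratic reports the minimum cost each variant reaches after the same number of updates, which mirrors one of the comparison criteria named in the abstract (minimum cost reached); oscillation counts and update rates would be read off the parameter trajectories instead.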
dc.description.sponsorshipIEEE Turkey Sect, Anatolian Scien_US
dc.identifier.isbn978-1-5386-1880-6
dc.identifier.scopus2-s2.0-85039922503en_US
dc.identifier.scopusqualityN/Aen_US
dc.identifier.urihttps://hdl.handle.net/11616/98097
dc.identifier.wosWOS:000426868700139en_US
dc.identifier.wosqualityN/Aen_US
dc.indekslendigikaynakWeb of Scienceen_US
dc.indekslendigikaynakScopusen_US
dc.language.isotren_US
dc.publisherIEEEen_US
dc.relation.ispartof2017 International Artificial Intelligence and Data Processing Symposium (IDAP)en_US
dc.relation.publicationcategoryConference Item - International - Institutional Faculty Memberen_US
dc.rightsinfo:eu-repo/semantics/closedAccessen_US
dc.subjectGradient Descenten_US
dc.subjectMomentumen_US
dc.subjectAdagraden_US
dc.subjectAdadeltaen_US
dc.subjectRmspropen_US
dc.subjectAdamen_US
dc.titleComparison of the Stochastic Gradient Descent Based Optimization Techniquesen_US
dc.typeConference Objecten_US

Files