
Browsing by Author "Acikgoz, Hakan"

Now showing 1 - 1 of 1
  • Item
    Feature fusion-based hand gesture classification with time-domain descriptors and multi-level deep attention network
    (Elsevier, 2025) Alcin, Omer Faruk; Korkmaz, Deniz; Acikgoz, Hakan
In conventional human-robot interaction (HRI), it is difficult to provide adaptability with systems located on the human body. Surface electromyography (sEMG) signals have the potential to provide this adaptability in HRI by directly representing movements, so classifying hand gestures from sEMG can be an effective solution to the growing needs of these applications. In this paper, a hybrid, multi-scale convolutional neural network (CNN) model is proposed for efficient sEMG-based classification of human hand gestures. The proposed method begins with a feature extraction stage built on time-domain descriptors: spectral moments, sparseness, irregularity factor, Teager-Kaiser energy, Shannon entropy, Katz fractal dimension, Higuchi's fractal dimension, and waveform length. The extracted features are then converted to RGB images. The designed network is built on multi-scale convolutional blocks with residual learning and convolutional blocks incorporating the Convolutional Block Attention Module (CBAM), which improves performance by focusing on channel and spatial features. Furthermore, a pyramid non-pooling local block is used at the end of the network to learn more powerful features and their correlations. Five comprehensive publicly available datasets are evaluated in the experiments, and the results are compared with benchmark CNN models and network variations using different attention mechanisms. In the comparative evaluations, the CBAM variant achieves classification accuracies between 84.62% and 97.56%, while the other attention mechanisms yield accuracies between 82.88% and 97.17%. The experiments show that the proposed method delivers more accurate and robust classification than the other variations and benchmark models.
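Several of the time-domain descriptors named in the abstract are standard sEMG features with well-known definitions. The sketch below shows minimal NumPy implementations of four of them (waveform length, Teager-Kaiser energy, Shannon entropy, and Katz fractal dimension) applied to a single signal window; the window length, the histogram bin count, and the random test signal are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def waveform_length(x):
    # WL: cumulative length of the waveform, i.e. the sum of
    # absolute differences between successive samples.
    return np.sum(np.abs(np.diff(x)))

def teager_kaiser_energy(x):
    # Mean of the discrete Teager-Kaiser energy operator:
    # psi[n] = x[n]^2 - x[n-1] * x[n+1]
    return np.mean(x[1:-1] ** 2 - x[:-2] * x[2:])

def shannon_entropy(x, bins=32):
    # Shannon entropy (bits) of the amplitude histogram.
    # The bin count is an assumed parameter.
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def katz_fractal_dimension(x):
    # Katz FD: log10(n) / (log10(n) + log10(d / L)), where L is the
    # total curve length and d is the maximum distance from the
    # first sample.
    L = np.sum(np.abs(np.diff(x)))
    d = np.max(np.abs(x - x[0]))
    n = len(x) - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

# Example: descriptors computed over one 200-sample window
# (random noise stands in for a real sEMG recording).
rng = np.random.default_rng(0)
window = rng.standard_normal(200)
features = np.array([waveform_length(window),
                     teager_kaiser_energy(window),
                     shannon_entropy(window),
                     katz_fractal_dimension(window)])
print(features)
```

In the paper's pipeline such per-window feature vectors are subsequently arranged into RGB images before being fed to the CNN; that conversion step is not reproduced here.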


This site is protected under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


İnönü Üniversitesi, Battalgazi, Malatya, TÜRKİYE
If you notice any errors in the content, please let us know

DSpace 7.6.1, Powered by İdeal DSpace

DSpace software copyright © 2002-2026 LYRASIS
