
Browsing by Author "SUGIURA, YUTA"

Now showing 1 - 2 of 2
    Automatic Labeling of Training Data by Vowel Recognition for Mouth Shape Recognition with Optical Sensors Embedded in Head-Mounted Display
    (The Eurographics Association, 2019) Nakamura, Fumihiko; Suzuki, Katsuhiro; Masai, Katsutoshi; Itoh, Yuta; Sugiura, Yuta; Sugimoto, Maki; Kakehi, Yasuaki and Hiyama, Atsushi
    Facial expressions enrich communication via avatars. However, in common immersive virtual reality (VR) systems, facial occlusions by head-mounted displays (HMD) lead to difficulties in capturing users' faces. In particular, the mouth plays an important role in facial expressions because it is essential for rich interaction. In this paper, we propose a technique that classifies mouth shapes into six classes using optical sensors embedded in HMD and gives labels automatically to the training dataset by vowel recognition. We experiment with five subjects to compare the recognition rates of machine learning under manual and automated labeling conditions. Results show that our method achieves average classification accuracy of 99.9% and 96.3% under manual and automated labeling conditions, respectively. These findings indicate that automated labeling is competitive relative to manual labeling, although the former's classification accuracy is slightly higher than that of the latter. Furthermore, we develop an application that reflects the mouth shape on avatars. This application blends six mouth shapes and then applies the blended mouth shapes to avatars.
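The labeling scheme described above, pairing each optical-sensor frame with the vowel recognized at the same moment and training a classifier on those automatic labels, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sensor data here is synthetic, the vowel-recognizer output is simulated by the `labels` array, and a simple nearest-centroid classifier stands in for the authors' machine-learning model.

```python
# Hypothetical sketch of automated labeling for mouth-shape recognition.
# Sensor frames and vowel-recognizer labels are simulated; the paper's
# actual sensor layout and classifier are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
CLASSES = ["a", "i", "u", "e", "o", "closed"]  # six mouth-shape classes

# Synthetic stand-in for HMD optical-sensor readings (16 channels per frame):
# one cluster of 50 noisy frames per mouth shape.
centers = rng.normal(size=(len(CLASSES), 16))
frames = np.concatenate([c + 0.1 * rng.normal(size=(50, 16)) for c in centers])

# Automated labels: the vowel recognizer's output aligned with each frame
# (simulated here as the ground-truth class index).
labels = np.repeat(np.arange(len(CLASSES)), 50)

# Train a nearest-centroid classifier on the auto-labeled data.
centroids = np.stack([frames[labels == k].mean(axis=0)
                      for k in range(len(CLASSES))])

def classify(frame):
    """Return the index of the closest class centroid."""
    return int(np.argmin(np.linalg.norm(centroids - frame, axis=1)))

# Training-set accuracy under automated labeling.
accuracy = float(np.mean([classify(f) == y for f, y in zip(frames, labels)]))
```

The point of the sketch is the pipeline shape: because the vowel recognizer supplies labels, no manual annotation pass is needed before training.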
    Evaluation of Trajectory Presentation of Conducting Motions Using Tactile Sensation for the Visually Impaired
(The Eurographics Association, 2022) Ueda, Yuto; Sugiura, Yuta; Teo, Theophilus; Kondo, Ryota
    Visually impaired people may have difficulty participating in orchestral or other performance activities because they cannot see the beat represented by the conductor's hand movements. Therefore, we proposed a method of presenting conducting motions tactilely using vibration actuators. By using tactile apparent movement, the trajectory of conducting motions can be presented. We conducted comparative experiments measuring the reaction time between the correct beat timing and the beat timing predicted by participants.
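Tactile apparent movement, the technique named in the abstract, drives adjacent actuators with overlapping onsets so the user perceives a continuous stroke rather than discrete buzzes. A minimal scheduling sketch is below; the timing values are illustrative defaults, not parameters from the paper.

```python
# Hypothetical sketch of a tactile apparent-movement schedule.
# Duration and SOA (stimulus onset asynchrony) values are illustrative.

def apparent_movement_schedule(n_actuators, duration_ms=200, soa_ms=80):
    """Return (actuator_index, onset_ms, offset_ms) tuples for a stroke
    across a row of actuators. Choosing soa_ms < duration_ms makes
    successive stimuli overlap, which produces the movement illusion."""
    return [(i, i * soa_ms, i * soa_ms + duration_ms)
            for i in range(n_actuators)]

schedule = apparent_movement_schedule(4)
# Each actuator starts before its predecessor stops (80 ms < 200 ms),
# so the vibration appears to travel smoothly along the array.
```

Sampling points along a conductor's hand trajectory and mapping each to the nearest actuator would yield the input to such a schedule.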

Eurographics Association © 2013-2025  |  System hosted at Graz University of Technology      
DSpace software copyright © 2002-2025 LYRASIS
