Howdy, I'm Yuki Yamato!

I'm a full-stack web/mobile/IoT/interface developer.

This is my face.

About Me


I'm a Master's student at University of Tsukuba and an engineer.

I'm majoring in Computer Science, with a focus on User Interfaces and Human-Computer Interaction. I have proposed interfaces that enable new interactions and implemented prototypes/proofs of concept.
As an engineer, I have completed both a long-term internship and several short-term internships.
Of course, I also develop projects on my own.

My Programming Skills


5 years
Internship | GWT, GCP

5 years
Internship, Startup, Research, Individual | Git, GitHub Actions

5 years
Research, Startup | scikit-learn, TensorFlow, Flask, FastAPI, SQLAlchemy

5 years
Internship | React, Next.js, Express, Slack App, VS Code Extension, Chrome Extension

3 years
Startup, Research | Chat Application, Eye-Tracking

4 years
Internship, Startup, Research | Chat Application, Google Maps, Firebase, Bluetooth Serial

1 year
Individual | Tutorial

4 years
Startup | Dart, Firebase, Cross Platform Development, PWA

1 year
Internship | Server Side, AWS

My Career


Accenture Japan Ltd.


Experienced the full engineering process, from analysis and requirements definition to construction, through team-based application development.

  • Type: Short-term internship
  • Position: engineer
  • Term: 4 days, November 2020
  • Topic: Data Analysis, Team Development, Group Work, JavaScript, Python

DMM.com LLC.


Experienced development in many areas, such as mobile apps, business-improvement tools, and CI.

  • Type: Short-term internship
  • Position: engineer
  • Term: 2 weeks, September 2020
  • Topic: Node.js, Go, AWS, Slack API, Shell Script, Kotlin, CircleCI

GA technologies, Co., Ltd.


Designed, implemented, and evaluated new models and Python scripts to estimate the sales price of each house, applying machine learning and systems engineering.

  • Type: Hackathon
  • Position: engineer
  • Term: 3 days, August 2019
  • Topic: Machine Learning, Python, Kaggle, GCP

SIOS Technology, Inc. (Gluegent, Inc.)


Developed and tested prototypes with cloud services such as GCP, AWS, and Azure. Designed and developed a workflow management application on Slack.

  • Type: Long-term internship
  • Position: engineer
  • Term: July 2017 - Present
  • Topic: Java, Node.js, React, GCP, Azure, AWS, Git

Recruit Holdings Co., Ltd.


Experienced the flow from issue creation to release through development of Zexy's iOS app.

  • Type: Short-term internship
  • Position: engineer
  • Term: 1.5 months, October - November 2020
  • Topic: iOS, Swift, Objective-C, Scrum, Adobe SiteCatalyst

Star Prince, LLC.


Developed chat applications for Android/iOS using Java/Objective-C. Experienced both frontend (mobile) and backend (Flask/FastAPI) development in a mid-sized team.

  • Type: Part-time job
  • Position: engineer
  • Term: April 2020 - Present
  • Topic: Java, Objective-C, Python, Android, iOS, Flask, FastAPI, AWS, Git

Yahoo Japan Corporation


Collaborated with senior engineers on a chat-bot management tool built with React, and offered insights. Experienced the Lean XP development methodology on the project, which shortens development times and leads to better designs.

  • Type: Short-term internship
  • Position: engineer
  • Term: 2 weeks, September 2019
  • Topic: React, Node.js, Git, Lean XP, Agile

My Research


Hand Gesture Interaction with a Low-Resolution Infrared Image Sensor on an Inner Wrist

We propose a hand gesture interaction method using a low-resolution infrared image sensor on an inner wrist. We attach the image sensor to the belt of a wrist-worn device (e.g. smartwatch, activity tracker) on the palmar side and recognize gestures of the opposite hand using machine learning techniques. Since the low-resolution sensor is placed on the inner wrist, the user can naturally control its direction to reduce privacy invasion. Our method can recognize four types of hand gestures: static hand poses, dynamic hand gestures, the movement of the finger, and the relative position of the hand. We developed a prototype using an 8x8 low-resolution infrared image sensor so that it does not invade the privacy of surrounding people. We conducted three experiments to evaluate the recognition accuracies of our prototype. We also introduce a map application that we implemented using our designed gestures of both hands.

  • Type: User Interface
  • Term: April 2019 - Present
  • Topic: Mobile Interaction, Hand Gesture, Smartwatch, Infrared Sensor, Machine Learning
Yuki Yamato, Yutaro Suzuki, Kodai Sekimori, Buntarou Shizuki, and Shin Takahashi. 2020. Hand Gesture Interaction with a Low-Resolution Infrared Image Sensor on an Inner Wrist. In Proceedings of the International Conference on Advanced Visual Interfaces (AVI '20). Association for Computing Machinery, New York, NY, USA, Article 58, 1–5. DOI: https://doi.org/10.1145/3399715.3399858
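As a rough illustration of this kind of recognition pipeline, here is a minimal scikit-learn sketch that classifies flattened 8x8 infrared frames into four gesture classes. The synthetic data, the SVM classifier, and the class names are assumptions for illustration only, not the paper's actual setup:

```python
# Minimal sketch: classifying 8x8 infrared frames into hand-gesture classes.
# The data here is synthetic; a real pipeline would use frames captured from
# the wrist-worn sensor. The SVM classifier is an illustrative choice, not
# necessarily what the paper used.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
GESTURES = ["static_pose", "dynamic_gesture", "finger_move", "hand_position"]

# Fake dataset: 200 frames per gesture, each an 8x8 thermal image
# flattened to a 64-dimensional feature vector.
X = np.vstack([
    rng.normal(loc=i, scale=0.5, size=(200, 64)) for i in range(len(GESTURES))
])
y = np.repeat(np.arange(len(GESTURES)), 200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = SVC(kernel="rbf").fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

In practice the features would come from sensor frames (and likely per-frame normalization), but the flatten-and-classify shape of the problem is the same.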

A Viewpoint Control Method for 360° Media Using Helmet Touch Interface

We have developed a helmet touch interface for viewpoint control of 360° media. The user of this interface can control the camera in 360° media by touching the surface of the helmet. Touch is detected by two micro-controllers and 54 capacitive touch sensor points mounted on the interface surface.

  • Type: User Interface
  • Term: April 2019 - October 2019
  • Topic: Virtual Reality, Viewpoint Control, Touch Interface
Takumi Kitagawa, Yuki Yamato, Buntarou Shizuki, and Shin Takahashi. A Viewpoint Control Method for 360° Media Using Helmet Touch Interface. In Symposium on Spatial User Interaction (SUI '19), Christoph W. Borst, Arun K. Kulshreshth, Gerd Bruder, Stefania Serafin, Christian Sandor, Kyle Johnsen, Jinwei Ye, Daniel Roth, and Sungchul Jung (Eds.). ACM, New York, NY, USA, Article 33, 2 pages, DOI: https://doi.org/10.1145/3357251.3360008
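Conceptually, each touched sensor point maps to a camera rotation. The sketch below illustrates one such mapping; the 9x6 grid layout and the angle ranges are assumptions for illustration, not the actual hardware design:

```python
# Sketch: map one of 54 capacitive touch points on a helmet surface to a
# camera yaw/pitch for 360-degree media. Assumes the points form a 9x6 grid
# (columns wrapping around the head, rows from brow to crown); the real
# sensor layout on the helmet may differ.
COLS, ROWS = 9, 6

def touch_to_view(point_index: int) -> tuple[float, float]:
    """Convert a touch point index (0..53) to (yaw, pitch) in degrees."""
    if not 0 <= point_index < COLS * ROWS:
        raise ValueError("touch point out of range")
    col, row = point_index % COLS, point_index // COLS
    yaw = (col / (COLS - 1)) * 360.0 - 180.0   # -180..180 around the head
    pitch = (row / (ROWS - 1)) * 90.0          # 0 (horizon) .. 90 (straight up)
    return yaw, pitch
```

For example, `touch_to_view(0)` returns `(-180.0, 0.0)`, i.e. looking backward at the horizon under this assumed layout.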

FGFlick: Augmenting Single-Finger Input Vocabulary for Smartphones with Simultaneous Finger and Gaze Flicks

FGFlick is an interaction technique featuring simultaneous single-finger and gaze input: the user flicks on the smartphone while moving their gaze linearly. FGFlick thus augments the single-finger input vocabulary. In an evaluation of the FGFlick gestures, we achieved a success rate of 84.0%.

  • Type: User Interface
  • Term: April 2021 - September 2021
  • Topic: Single-finger input vocabulary, Flick gesture, Gaze input, Interaction techniques, Mobile devices
Yuki Yamato, Yutaro Suzuki, and Shin Takahashi. FGFlick: Augmenting Single-Finger Input Vocabulary for Smartphones with Simultaneous Finger and Gaze Flicks. In Proceedings of 18th IFIP TC 13 International Conference on Human-Computer Interaction – INTERACT 2021, LNCS 12936, pp. 421–425, Springer, Bari, Italy, August 30 – September 3, 2021. DOI: https://doi.org/10.1007/978-3-030-85607-6_50

A Mouth Gesture Interface Featuring a Mutual-Capacitance Sensor Embedded in a Surgical Mask

We developed a mouth gesture interface featuring a mutual-capacitance sensor embedded in a surgical mask. This wearable, hands-free interface recognizes non-verbal mouth gestures; because the mask hides the mouth, others cannot tell what the user is doing with their device. We confirm the feasibility of our approach and demonstrate the accuracy of mouth shape recognition. We also present two applications: mouth shapes can be used to zoom in or out, or to select an application from a menu.

  • Type: User Interface
  • Term: April 2019 - Present
  • Topic: Mask-Type Interface, Mutual-Capacitance-Sensor, Mouth Gesture, Hands-Free, Machine Learning
Yutaro Suzuki, Kodai Sekimori, Yuki Yamato, Yusuke Yamasaki, Buntarou Shizuki, and Shin Takahashi. A Mouth Gesture Interface Featuring a Mutual-Capacitance Sensor Embedded in a Surgical Mask. In Proceedings of The 22nd International Conference on Human-Computer Interaction (HCI International 2020). Springer. Multimodal and Natural Interaction, 154–165. https://doi.org/10.1007/978-3-030-49062-1_10

Contact Me


© 2021 by ukitomato All Rights Reserved.