蘇兆懷 Chao-Huai Su

About Me

He is a master's student at CMLAB. He received his bachelor's degree and master's degree in Information Management from National Taiwan University in 2011 and 2013, respectively.

His research focuses on Human-Computer Interaction, especially mobile interaction, graphical user interfaces, on-body interaction, and information visualization.


Curriculum Vitae

Publications

  • NailDisplay: Bringing an Always-Available Visual Display to Fingertips
    Chao-Huai Su, Liwei Chan, Chien-Ting Weng, Rong-Hao Liang, Kai-Yin Cheng, Bing-Yu Chen
    ACM SIGCHI 2013 (Best Paper Award), Paris, France
  • GaussSense: Attachable Stylus Sensing Using Magnetic Sensor Grid
    ACM UIST 2012, MA, USA
  • SonarWatch: Appropriating the Forearm as a Slider Bar
    Rong-Hao Liang, Shu-Yang Lin, Chao-Huai Su, Kai-Yin Cheng, Bing-Yu Chen, De-Nian Yang
    ACM SIGGRAPH Asia E-Tech 2011, Hong Kong, China
  • PUB - Point Upon Body: Exploring Eyes-Free Interaction and Methods on an Arm
    Shu-Yang Lin, Chao-Huai Su, Kai-Yin Cheng, Rong-Hao Liang, Tzu-Hao Kuo, Bing-Yu Chen
    ACM UIST 2011, Santa Barbara, California, USA

Education

  • Graduate Institute of Information Management, NTU
    MBA in Information Management, 2010-2013
  • Department of Information Management, NTU
    B.S. in Information Management, 2007-2010
    GPA: 3.86/4.0

Work Experience

  • HTC Inc.
    Software Engineer (Intern), Oct. 2011 - June 2012
  • Aiming High for a Low-Carbon Taiwan Expo, National Science Council
    Software Engineer, Mar. 2011 - Apr. 2011

Skills

  • Programming: C/C++/Java/C#/Objective-C
  • Graphics: DirectX/OpenGL/OpenCV/Kinect SDK/OpenNI/CUDA
  • Languages: Chinese (native), English (good)

Research

NailDisplay: Bringing an Always-Available Visual Display to Fingertips

This work presents NailDisplay, a novel, always-available nail-mounted display. The display augments the finger with always-available visual feedback thanks to its fast accessibility, and it binds user controls with the display: what you control is what you see (through the display). Potential benefits of NailDisplay are demonstrated in three applications, ranging from pure display to combinations of display and user controls. In the first application, NailDisplay reveals what is occluded under a finger touch, making it a practical way to operate small UI elements. In the second, NailDisplay complements an imaginary interface, helping users learn an interface defined, for example, on their own arms and letting them confirm it when their memory of it becomes unclear. In the third, NailDisplay is integrated with rich finger interactions, such as swiping in the air. We also report user feedback gathered from an exploratory user study.
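As a rough illustration of the first application (revealing the screen content occluded under a finger touch), the Python sketch below crops the framebuffer region around the touch point and resamples it to a small nail-sized display. The display resolution, the helper names, and the framebuffer access are illustrative assumptions, not the implementation described in the paper.

    # Hypothetical sketch: show the screen region hidden under the fingertip
    # on a nail-mounted display. Resolution and helpers are assumptions.
    import numpy as np

    NAIL_W, NAIL_H = 64, 32          # assumed nail-display resolution in pixels

    def occluded_region(framebuffer: np.ndarray, touch_x: int, touch_y: int,
                        radius: int = 20) -> np.ndarray:
        """Crop the square patch of the screen hidden under the finger,
        centered on the touch point and clamped to the screen bounds."""
        h, w = framebuffer.shape[:2]
        x0, x1 = max(0, touch_x - radius), min(w, touch_x + radius)
        y0, y1 = max(0, touch_y - radius), min(h, touch_y + radius)
        return framebuffer[y0:y1, x0:x1]

    def to_nail_display(patch: np.ndarray) -> np.ndarray:
        """Nearest-neighbour resample of the occluded patch to the (assumed)
        nail-display resolution so small UI elements remain visible."""
        ys = np.linspace(0, patch.shape[0] - 1, NAIL_H).astype(int)
        xs = np.linspace(0, patch.shape[1] - 1, NAIL_W).astype(int)
        return patch[np.ix_(ys, xs)]

    # Example: a synthetic 480x800 grayscale screen and a touch at (400, 240)
    screen = np.random.randint(0, 256, size=(480, 800), dtype=np.uint8)
    nail_image = to_nail_display(occluded_region(screen, touch_x=400, touch_y=240))
    print(nail_image.shape)          # -> (32, 64)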

GaussSense: Attachable Stylus Sensing Using Magnetic Sensor Grid

GaussSense is a back-of-device sensing technique that enables stylus input on arbitrary surfaces by exploiting magnetism. A 2 mm-thick Hall-sensor grid is developed to sense magnets embedded in the stylus. The system can sense the magnetic field emitted from the stylus whenever it is within 2 cm of any non-ferromagnetic surface, so attaching the sensor grid behind an arbitrary thin surface allows stylus input to be recognized by analyzing the distribution of the applied magnetic field. Attaching the sensor grid to the back of a touchscreen device and incorporating magnets into the corresponding stylus enables the system to 1) distinguish touch events caused by a finger from those caused by the stylus, 2) sense the tilt angle of the stylus and the pressure with which it is applied, and 3) detect where the stylus hovers over the screen. A pilot study showed that users were satisfied with the novel sketching experiences enabled by the system.
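The sketch below illustrates, under simplified assumptions, how a single frame from a Hall-sensor grid could be used to separate finger touches from magnet-equipped stylus input and to estimate the stylus position and a pressure proxy. The grid size, the threshold value, and the function names are hypothetical and do not reproduce the published pipeline.

    # Hypothetical sketch of back-of-device magnetic sensing in the spirit of
    # GaussSense. Thresholds and calibration values are illustrative only.
    import numpy as np

    FIELD_THRESHOLD = 5.0   # assumed noise floor of the Hall-sensor grid (arbitrary units)

    def classify_and_locate(grid: np.ndarray):
        """Given a 2-D array of Hall-sensor magnitudes, decide whether the
        contact comes from a magnet-equipped stylus or a bare finger; for a
        stylus, estimate its position and a rough pressure proxy.

        Returns ("finger", None) or ("stylus", (row, col, pressure))."""
        peak = grid.max()
        if peak < FIELD_THRESHOLD:
            # No significant magnetic field: the touch (if any) came from a finger.
            return "finger", None

        # The field-weighted centroid approximates where the stylus tip sits
        # over the grid; a stronger peak suggests the tip is pressed closer
        # to the surface, which serves as a crude pressure proxy here.
        rows, cols = np.indices(grid.shape)
        total = grid.sum()
        r = (rows * grid).sum() / total
        c = (cols * grid).sum() / total
        pressure = peak / FIELD_THRESHOLD
        return "stylus", (r, c, pressure)

    # Example: a synthetic 8x8 frame with a field peak near row 3, column 5
    frame = np.zeros((8, 8))
    frame[3, 5] = 40.0
    frame[3, 4] = frame[2, 5] = frame[4, 5] = 15.0
    print(classify_and_locate(frame))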

SonarWatch: Appropriating the Forearm as a Slider Bar


PUB - Point Upon Body: Exploring Eyes-Free Interaction and Methods on an Arm

This paper presents PUB (Point Upon Body), a novel interaction system for exploring eyes-free interaction in personal space: users tap on their own arms and receive haptic feedback from their skin. Two user studies determine how precisely users can interact with their forearms and how they behave when operating in their arm space. The results show that, with iterative practice, users can divide the space between wrist and elbow into at most six points, and that each user's division pattern is distinct from those of other users. Based on design principles drawn from these observations, the PUB system demonstrates how interaction design can benefit from the findings. Two scenarios, remote display control and mobile device control, are demonstrated using an ultrasonic device attached to the user's wrist that detects the tapped positions.
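As a simplified sketch of the calibration-and-lookup idea (not the actual PUB implementation), the code below averages per-point distance readings from a wrist-mounted ultrasonic sensor during calibration and then maps a new tap to the nearest calibrated point. All names and numbers are illustrative assumptions.

    # Hypothetical sketch: map a wrist-to-tap distance reading to one of the
    # user's personally calibrated forearm points (at most six per user).

    def calibrate(samples_per_point):
        """samples_per_point: list of lists of distance readings (cm) collected
        while the user repeatedly taps each of their chosen forearm points.
        Returns one representative (mean) distance per point."""
        return [sum(s) / len(s) for s in samples_per_point]

    def classify_tap(distance_cm, calibrated_points):
        """Return the index of the calibrated point closest to the new reading."""
        return min(range(len(calibrated_points)),
                   key=lambda i: abs(calibrated_points[i] - distance_cm))

    # Example: a user who calibrated four points between wrist and elbow
    points = calibrate([[4.8, 5.1], [9.9, 10.2], [15.0, 14.7], [20.1, 19.8]])
    print(classify_tap(14.2, points))   # -> 2 (the third point from the wrist)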