About this item:

Author Notes:

Email: tiffany.chen@gatech.edu

Conceived and designed the experiments: TC TB JLM MH LT CK.

Performed the experiments: TC TB.

Analyzed the data: TC TB JLM JB MH LT CK.

Contributed reagents/materials/analysis tools: MH LT CK.

Wrote the paper: TC TB JLM JB MH LT CK.

The authors would like to thank Yun Seong Song, Stacie A. Chvatal, Andrew B. Sawers, and all the expert dancers who participated in this study.

We also thank William R. Delaune for his advice on statistical analysis.

We gratefully acknowledge support from NSF Emerging Frontiers in Research and Innovation (EFRI) 1137229 and the NSF Graduate Research Fellowship Program (GRFP).

The authors have declared that no competing interests exist.

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Research Funding:

Funding was provided by the NSF Graduate Research Fellowship Program (GRFP) and by National Science Foundation EFRI-M3C Award #1137229, "Partnered Rehabilitative Movement: Cooperative Human-Robot Interactions for Motor Assistance, Learning, and Communication" (MH, LT, CK).

Keywords:

  • Science & Technology
  • Multidisciplinary Sciences
  • Science & Technology - Other Topics
  • Parkinson's disease
  • Subacute stroke
  • Interface
  • Movement
  • Gait

Evaluation by Expert Dancers of a Robot That Performs Partnered Stepping via Haptic Interaction

Journal Title:

PLoS ONE

Volume:

Volume 10, Number 5

Publisher:

Public Library of Science, Pages e0125179

Type of Work:

Article | Final Publisher PDF

Abstract:

Our long-term goal is to enable a robot to engage in partner dance for use in rehabilitation therapy, assessment, diagnosis, and scientific investigations of two-person whole-body motor coordination. Partner dance has been shown to improve balance and gait in people with Parkinson's disease and in older adults, which motivates our work. During partner dance, dance couples rely heavily on haptic interaction to convey motor intent such as speed and direction. In this paper, we investigate the potential for a wheeled mobile robot with a human-like upper-body to perform partnered stepping with people based on the forces applied to its end effectors. Blindfolded expert dancers (N=10) performed a forward/backward walking step to a recorded drum beat while holding the robot's end effectors. We varied the admittance gain of the robot's mobile base controller and the stiffness of the robot's arms. The robot followed the participants with low lag (M=224, SD=194 ms) across all trials. High admittance gain and high arm stiffness conditions resulted in significantly improved performance with respect to subjective and objective measures. Biomechanical measures such as the human hand to human sternum distance, center-of-mass of leader to center-of-mass of follower (CoM-CoM) distance, and interaction forces correlated with the expert dancers' subjective ratings of their interactions with the robot, which were internally consistent (Cronbach's α=0.92). In response to a final questionnaire, 1/10 expert dancers strongly agreed, 5/10 agreed, and 1/10 disagreed with the statement "The robot was a good follower." 2/10 strongly agreed, 3/10 agreed, and 2/10 disagreed with the statement "The robot was fun to dance with." The remaining participants were neutral with respect to these two questions.
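The abstract describes an admittance-style controller: the robot senses the forces the human leader applies at its end effectors and converts them into base motion, with the admittance gain setting how responsive the base is. As a rough illustration of that force-to-velocity mapping, here is a minimal sketch; the gain, smoothing, and saturation values are illustrative assumptions, not the parameters used in the study.

```python
def admittance_step(force_n, gain, v_prev, alpha=0.8, v_max=1.0):
    """Map a sensed interaction force (N) to a commanded base
    velocity (m/s), with simple smoothing and saturation.

    Illustrative sketch only; the paper's controller details differ.
    """
    v_raw = gain * force_n                      # admittance: velocity per unit force
    v = alpha * v_prev + (1 - alpha) * v_raw    # low-pass filter the command
    return max(-v_max, min(v_max, v))           # saturate for safety

# A sustained 5 N push forward ramps the base velocity toward
# gain * force = 0.5 m/s; a higher gain would make the base more
# responsive to the leader, consistent with the improved performance
# reported for the high-admittance-gain condition.
v = 0.0
for f in [5.0, 5.0, 5.0]:
    v = admittance_step(f, gain=0.1, v_prev=v)
print(round(v, 3))  # → 0.244
```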

Copyright information:

© 2015, Public Library of Science. All rights reserved.

This is an Open Access work distributed under the terms of the Creative Commons CC0 1.0 Universal Public Domain Dedication (http://creativecommons.org/publicdomain/zero/1.0/).
