---
_id: '17275'
abstract:
- lang: eng
  text: How to teach actions to a robot, and how a robot learns those actions, is
    an important issue in designing robot learning systems. Inspired by human parent-infant
    interaction, we hypothesize that a robot equipped with infant-like abilities
    can take advantage of proper parental teaching. Parents are known to significantly
    alter their infant-directed actions compared to adult-directed ones, e.g. by
    making more pauses between movements, which is assumed to aid infants’ understanding
    of the actions. As a first step, we analyzed parental actions using a primal
    attention model. The model, based on visual saliency, can detect likely important
    locations in a scene without employing any knowledge about the actions or the
    environment. Our statistical analysis revealed that the model was able to extract
    meaningful structures of the actions, e.g. their initial and final states and
    the significant state changes within them, which were highlighted by parental
    action modifications. We further discuss the design of an infant-like robot that
    can induce parent-like teaching, and present a human-robot interaction experiment
    evaluating our robot simulation equipped with the saliency model.
author:
- first_name: Yukie
  full_name: Nagai, Yukie
  last_name: Nagai
- first_name: Claudia
  full_name: Muhl, Claudia
  last_name: Muhl
- first_name: Katharina
  full_name: Rohlfing, Katharina
  id: '50352'
  last_name: Rohlfing
citation:
  ama: 'Nagai Y, Muhl C, Rohlfing K. Toward Designing a Robot that Learns Actions
    from Parental Demonstrations. In: <i>The 2008 IEEE International Conference on
    Robotics and Automation</i>. ; 2008:3545-3550. doi:<a href="https://doi.org/10.1109/robot.2008.4543753">10.1109/robot.2008.4543753</a>'
  apa: Nagai, Y., Muhl, C., &#38; Rohlfing, K. (2008). Toward Designing a Robot that
    Learns Actions from Parental Demonstrations. <i>The 2008 IEEE International Conference
    on Robotics and Automation</i>, 3545–3550. <a href="https://doi.org/10.1109/robot.2008.4543753">https://doi.org/10.1109/robot.2008.4543753</a>
  bibtex: '@inproceedings{Nagai_Muhl_Rohlfing_2008, title={Toward Designing a Robot
    that Learns Actions from Parental Demonstrations}, DOI={<a href="https://doi.org/10.1109/robot.2008.4543753">10.1109/robot.2008.4543753</a>},
    booktitle={The 2008 IEEE International Conference on Robotics and Automation},
    author={Nagai, Yukie and Muhl, Claudia and Rohlfing, Katharina}, year={2008},
    pages={3545–3550} }'
  chicago: Nagai, Yukie, Claudia Muhl, and Katharina Rohlfing. “Toward Designing a
    Robot That Learns Actions from Parental Demonstrations.” In <i>The 2008 IEEE International
    Conference on Robotics and Automation</i>, 3545–50, 2008. <a href="https://doi.org/10.1109/robot.2008.4543753">https://doi.org/10.1109/robot.2008.4543753</a>.
  ieee: 'Y. Nagai, C. Muhl, and K. Rohlfing, “Toward Designing a Robot that Learns
    Actions from Parental Demonstrations,” in <i>The 2008 IEEE International Conference
    on Robotics and Automation</i>, 2008, pp. 3545–3550, doi: <a href="https://doi.org/10.1109/robot.2008.4543753">10.1109/robot.2008.4543753</a>.'
  mla: Nagai, Yukie, et al. “Toward Designing a Robot That Learns Actions from Parental
    Demonstrations.” <i>The 2008 IEEE International Conference on Robotics and Automation</i>,
    2008, pp. 3545–50, doi:<a href="https://doi.org/10.1109/robot.2008.4543753">10.1109/robot.2008.4543753</a>.
  short: 'Y. Nagai, C. Muhl, K. Rohlfing, in: The 2008 IEEE International Conference
    on Robotics and Automation, 2008, pp. 3545–3550.'
date_created: 2020-06-24T13:02:46Z
date_updated: 2023-02-01T13:07:30Z
department:
- _id: '749'
doi: 10.1109/robot.2008.4543753
keyword:
- icra08
language:
- iso: eng
page: 3545-3550
publication: The 2008 IEEE International Conference on Robotics and Automation
status: public
title: Toward Designing a Robot that Learns Actions from Parental Demonstrations
type: conference
user_id: '14931'
year: '2008'
...
