---
_id: '43437'
abstract:
- lang: eng
  text: '<jats:p>In virtual reality (VR), participants may not always have hands,
    bodies, eyes, or even voices—using VR helmets and two controllers, participants
    control an avatar through virtual worlds that do not necessarily obey familiar
    laws of physics; moreover, the avatar’s bodily characteristics may not neatly
    match our bodies in the physical world. Despite these limitations and specificities,
    humans get things done through collaboration and the creative use of the environment.
    While multiuser interactive VR is attracting greater numbers of participants,
    there are currently few attempts to analyze the in situ interaction systematically.
    This paper proposes a video-analytic detail-oriented methodological framework
    for studying virtual reality interaction. Using multimodal conversation analysis,
    the paper investigates a nonverbal, embodied, two-person interaction: two players
    in a survival game strive to gesturally resolve a misunderstanding regarding an
    in-game mechanic—however, both of their microphones are turned off for the duration
    of play. The players’ inability to resort to complex language to resolve this
    issue results in a dense sequence of back-and-forth activity involving gestures,
    object manipulation, gaze, and body work. Most crucially, timing and modified
    repetitions of previously produced actions turn out to be the key to overcoming
    both technical and communicative challenges. The paper analyzes these action sequences,
    demonstrates how they generate intended outcomes, and proposes a vocabulary to
    speak about these types of interaction more generally. The findings demonstrate
    the viability of multimodal analysis of VR interaction, shed light on unique challenges
    of analyzing interaction in virtual reality, and generate broader methodological
    insights about the study of nonverbal action.</jats:p>'
article_type: original
author:
- first_name: Nils
  full_name: Klowait, Nils
  id: '98454'
  last_name: Klowait
  orcid: 0000-0002-7347-099X
citation:
  ama: Klowait N. On the Multimodal Resolution of a Search Sequence in Virtual Reality.
    <i>Human Behavior and Emerging Technologies</i>. 2023;2023:1-15. doi:<a href="https://doi.org/10.1155/2023/8417012">10.1155/2023/8417012</a>
  apa: Klowait, N. (2023). On the Multimodal Resolution of a Search Sequence in Virtual
    Reality. <i>Human Behavior and Emerging Technologies</i>, <i>2023</i>, 1–15. <a
    href="https://doi.org/10.1155/2023/8417012">https://doi.org/10.1155/2023/8417012</a>
  bibtex: '@article{Klowait_2023, title={On the Multimodal Resolution of a Search
    Sequence in Virtual Reality}, volume={2023}, DOI={<a href="https://doi.org/10.1155/2023/8417012">10.1155/2023/8417012</a>},
    journal={Human Behavior and Emerging Technologies}, publisher={Hindawi Limited},
    author={Klowait, Nils}, year={2023}, pages={1–15} }'
  chicago: 'Klowait, Nils. “On the Multimodal Resolution of a Search Sequence in Virtual
    Reality.” <i>Human Behavior and Emerging Technologies</i> 2023 (2023): 1–15. <a
    href="https://doi.org/10.1155/2023/8417012">https://doi.org/10.1155/2023/8417012</a>.'
  ieee: 'N. Klowait, “On the Multimodal Resolution of a Search Sequence in Virtual
    Reality,” <i>Human Behavior and Emerging Technologies</i>, vol. 2023, pp. 1–15,
    2023, doi: <a href="https://doi.org/10.1155/2023/8417012">10.1155/2023/8417012</a>.'
  mla: Klowait, Nils. “On the Multimodal Resolution of a Search Sequence in Virtual
    Reality.” <i>Human Behavior and Emerging Technologies</i>, vol. 2023, Hindawi
    Limited, 2023, pp. 1–15, doi:<a href="https://doi.org/10.1155/2023/8417012">10.1155/2023/8417012</a>.
  short: N. Klowait, Human Behavior and Emerging Technologies 2023 (2023) 1–15.
date_created: 2023-04-06T10:57:28Z
date_updated: 2024-03-26T09:40:53Z
ddc:
- '300'
department:
- _id: '9'
doi: 10.1155/2023/8417012
file:
- access_level: closed
  content_type: application/pdf
  creator: nklowait
  date_created: 2023-04-06T11:00:01Z
  date_updated: 2023-04-06T11:00:01Z
  file_id: '43438'
  file_name: Klowait_2023a.pdf
  file_size: 2877385
  relation: main_file
  success: 1
file_date_updated: 2023-04-06T11:00:01Z
funded_apc: '1'
has_accepted_license: '1'
intvolume: '2023'
keyword:
- Human-Computer Interaction
- General Social Sciences
- Social Psychology
- Virtual Reality
- Multimodality
- Nonverbal Interaction
- Search Sequence
- Gesture
- Co-Operative Action
- Goodwin
- Ethnomethodology
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.1155/2023/8417012
oa: '1'
page: 1-15
project:
- _id: '119'
  name: 'TRR 318 - Ö: TRR 318 - Project Area Ö'
publication: Human Behavior and Emerging Technologies
publication_identifier:
  issn:
  - 2578-1863
publication_status: published
publisher: Hindawi Limited
quality_controlled: '1'
status: public
title: On the Multimodal Resolution of a Search Sequence in Virtual Reality
type: journal_article
user_id: '98454'
volume: 2023
year: '2023'
...
---
_id: '29840'
abstract:
- lang: eng
  text: Due to the proliferation of Virtual Reality (VR) technology, VR is finding
    new applications in various domains, such as stock trading. Here, traders invest
    in stocks intending to increase their profit. For this purpose, in conventional
    stock trading, traders usually make use of 2D applications on desktop or laptop
    devices. This leads to several drawbacks, such as poor visibility due to the
    limited 2D representation, cumbersome indirect interaction via mouse and
    keyboard, and restricted support for collaboration between traders. To overcome
    these issues, we have developed a novel collaborative, virtual environment for
    stock trading, which enables stock traders to view financial information and trade
    stocks with other collaborators. The main results of a usability study indicate
    that the VR environment, compared to conventional stock trading, shows no significant
    advantages in efficiency and effectiveness; however, we observed increased
    user satisfaction and better collaboration.
author:
- first_name: Enes
  full_name: Yigitbas, Enes
  id: '8447'
  last_name: Yigitbas
  orcid: 0000-0002-5967-833X
- first_name: Sebastian
  full_name: Gottschalk, Sebastian
  id: '47208'
  last_name: Gottschalk
- first_name: Alexander
  full_name: Nowosad, Alexander
  last_name: Nowosad
- first_name: Gregor
  full_name: Engels, Gregor
  id: '107'
  last_name: Engels
citation:
  ama: 'Yigitbas E, Gottschalk S, Nowosad A, Engels G. Development and Evaluation
    of a Collaborative Stock Trading Environment in Virtual Reality. In: <i>Proceedings
    of the 17th International Conference on Wirtschaftsinformatik</i>. AIS; 2022.'
  apa: Yigitbas, E., Gottschalk, S., Nowosad, A., &#38; Engels, G. (2022). Development
    and Evaluation of a Collaborative Stock Trading Environment in Virtual Reality.
    <i>Proceedings of the 17th International Conference on Wirtschaftsinformatik</i>.
    17th International Conference on Wirtschaftsinformatik, Nuremberg.
  bibtex: '@inproceedings{Yigitbas_Gottschalk_Nowosad_Engels_2022, title={Development
    and Evaluation of a Collaborative Stock Trading Environment in Virtual Reality},
    booktitle={Proceedings of the 17th International Conference on Wirtschaftsinformatik},
    publisher={AIS}, author={Yigitbas, Enes and Gottschalk, Sebastian and Nowosad,
    Alexander and Engels, Gregor}, year={2022} }'
  chicago: Yigitbas, Enes, Sebastian Gottschalk, Alexander Nowosad, and Gregor Engels.
    “Development and Evaluation of a Collaborative Stock Trading Environment in Virtual
    Reality.” In <i>Proceedings of the 17th International Conference on Wirtschaftsinformatik</i>.
    AIS, 2022.
  ieee: E. Yigitbas, S. Gottschalk, A. Nowosad, and G. Engels, “Development and Evaluation
    of a Collaborative Stock Trading Environment in Virtual Reality,” presented at
    the 17th International Conference on Wirtschaftsinformatik, Nuremberg, 2022.
  mla: Yigitbas, Enes, et al. “Development and Evaluation of a Collaborative Stock
    Trading Environment in Virtual Reality.” <i>Proceedings of the 17th International
    Conference on Wirtschaftsinformatik</i>, AIS, 2022.
  short: 'E. Yigitbas, S. Gottschalk, A. Nowosad, G. Engels, in: Proceedings of the
    17th International Conference on Wirtschaftsinformatik, AIS, 2022.'
conference:
  end_date: 2022-02-23
  location: Nuremberg
  name: 17th International Conference on Wirtschaftsinformatik
  start_date: 2022-02-21
date_created: 2022-02-15T07:24:50Z
date_updated: 2022-02-15T07:25:45Z
department:
- _id: '66'
- _id: '534'
keyword:
- virtual reality
- stock trading
- collaboration
- usability
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://aisel.aisnet.org/wi2022/hci/hci/17/
oa: '1'
publication: Proceedings of the 17th International Conference on Wirtschaftsinformatik
publisher: AIS
status: public
title: Development and Evaluation of a Collaborative Stock Trading Environment in
  Virtual Reality
type: conference
user_id: '47208'
year: '2022'
...
---
_id: '34127'
abstract:
- lang: eng
  text: "Although the idea of Augmented Reality (AR) and Virtual Reality (VR) is
    as old as the spread of the first game consoles and computers, the topic has
    only gained importance through technological progress and the associated drop
    in prices [1]. Demanding AR and VR applications can now already run on
    commercially available smartphones and tablets [1][2]. This opens up new
    possibilities for teaching, e.g. the visualization of spatial representations,
    the fostering of students' spatial imagination, and the teaching of abstract
    and therefore hard-to-grasp concepts in the natural sciences [3].\r\nNumerous
    studies already show that when AR is used effectively in teaching, not only
    the learners' interest but also their concentration can be increased [3][4].
    A prerequisite, however, is that features conducive to learning are first
    identified and examined for their effectiveness in a VR or AR environment
    [5].\r\nThe compulsory courses of an electrical engineering degree at
    Paderborn University include three interdisciplinary laboratory practicals
    that deepen the theoretical content of the lectures. A major problem here is
    the operation of the electrical engineering laboratory equipment. Both
    students and the supervising laboratory engineers criticize that first
    contact with the devices only takes place during the practical itself. To
    counter this problem, a learning environment is to be developed in which
    students can learn to handle the laboratory equipment independently of time
    and place.\r\nThis contribution therefore examines the potential of VR and AR
    technology on mobile devices for acquiring and deepening practical skills in
    handling electrical engineering laboratory equipment in preparation for
    practical work in the lab. It shows the particular differences and advantages
    of the two technologies and, in particular, what the learner's (inter)action
    within a VR or AR environment can look like.\r\nBuilding on the potentials
    identified here and on established learning-theoretical and
    cognitive-psychological theories of knowledge acquisition, a subsequent work
    will develop a concept for the design of a VR and an AR environment for a
    laboratory practical. In doing so, motivational-psychological aspects, e.g.
    established gamification concepts, will be analyzed that can be used in such
    an environment to further foster learning motivation, among other things."
author:
- first_name: Mesut
  full_name: Alptekin, Mesut
  id: '11763'
  last_name: Alptekin
- first_name: Katrin
  full_name: Temmen, Katrin
  id: '30086'
  last_name: Temmen
citation:
  ama: Alptekin M, Temmen K. Möglichkeiten und Grenzen von Virtual- und Augmented Reality
    im Laborpraktikum. <i>Digitalisierung in der Techniklehre - ihr Beitrag zum Profil
    technischer Bildung</i>. 2017;12:91-98.
  apa: Alptekin, M., &#38; Temmen, K. (2017). Möglichkeiten und Grenzen von Virtual- und
    Augmented Reality im Laborpraktikum. <i>Digitalisierung in der Techniklehre -
    ihr Beitrag zum Profil technischer Bildung</i>, <i>12</i>, 91–98.
  bibtex: '@article{Alptekin_Temmen_2017, title={Möglichkeiten und Grenzen von Virtual- und
    Augmented Reality im Laborpraktikum}, volume={12}, journal={Digitalisierung in
    der Techniklehre - ihr Beitrag zum Profil technischer Bildung}, publisher={Gudrun
    Kammasch, Henning Klaffke, Sönke Knutzen (Hrsg.)}, author={Alptekin, Mesut and
    Temmen, Katrin}, year={2017}, pages={91–98} }'
  chicago: 'Alptekin, Mesut, and Katrin Temmen. “Möglichkeiten und Grenzen von Virtual- und
    Augmented Reality im Laborpraktikum.” <i>Digitalisierung in der Techniklehre -
    ihr Beitrag zum Profil technischer Bildung</i> 12 (2017): 91–98.'
  ieee: M. Alptekin and K. Temmen, “Möglichkeiten und Grenzen von Virtual- und Augmented
    Reality im Laborpraktikum,” <i>Digitalisierung in der Techniklehre - ihr Beitrag
    zum Profil technischer Bildung</i>, vol. 12, pp. 91–98, 2017.
  mla: Alptekin, Mesut, and Katrin Temmen. “Möglichkeiten und Grenzen von Virtual- und
    Augmented Reality im Laborpraktikum.” <i>Digitalisierung in der Techniklehre -
    ihr Beitrag zum Profil technischer Bildung</i>, vol. 12, Gudrun Kammasch, Henning
    Klaffke, Sönke Knutzen (Hrsg.), 2017, pp. 91–98.
  short: M. Alptekin, K. Temmen, Digitalisierung in der Techniklehre - ihr Beitrag
    zum Profil technischer Bildung 12 (2017) 91–98.
conference:
  end_date: 2017-05-13
  location: Technische Universität Ilmenau
  name: Ingenieur-Pädagogische Wissenschaftsgesellschaft (IPW), 12. Regionaltagung
    2017
  start_date: 2017-05-11
date_created: 2022-11-22T12:03:07Z
date_updated: 2023-01-18T14:32:14Z
department:
- _id: '34'
- _id: '300'
intvolume: '12'
keyword:
- Virtual Reality
- Augmented Reality
- Laborpraktika
- Ingenieurdidaktik
- Labordidaktik
language:
- iso: ger
page: 91-98
publication: Digitalisierung in der Techniklehre - ihr Beitrag zum Profil technischer
  Bildung
publication_identifier:
  isbn:
  - 978-3-9818728-1-1
publication_status: published
publisher: Gudrun Kammasch, Henning Klaffke, Sönke Knutzen (Hrsg.)
status: public
title: Möglichkeiten und Grenzen von Virtual- und Augmented Reality im Laborpraktikum
type: journal_article
user_id: '11763'
volume: 12
year: '2017'
...
---
_id: '39493'
abstract:
- lang: eng
  text: This article presents the animated visual 3D programming language SAM (Solid
    Agents in Motion) for parallel systems specification and animation. A SAM program
    is a set of interacting agents synchronously exchanging messages. The agent's
    behaviour is specified by means of production rules with a condition and a sequence
    of actions each. Actions are linearly ordered and execute when a rule matches.
    In SAM, main syntactic objects like agents, rules, and messages are 3D. These
    objects can have an abstract and a concrete, solid 3D presentation. While the
    abstract representation is for programming and debugging, the concrete representation
    is for animated 3D end-user presentations. After outlining the concepts of SAM,
    this article gives two programming examples of 3D micro worlds and an overview
    of the programming environment.
author:
- first_name: Christian
  full_name: Geiger, Christian
  last_name: Geiger
- first_name: Wolfgang
  full_name: Müller, Wolfgang
  id: '16243'
  last_name: Müller
- first_name: W.
  full_name: Rosenbach, W.
  last_name: Rosenbach
citation:
  ama: 'Geiger C, Müller W, Rosenbach W. SAM - An Animated 3D Programming Language.
    In: <i>Proceedings of the IEEE Symposium on Visual Languages</i>. ; 1998. doi:<a
    href="https://doi.org/10.1109/VL.1998.706167">10.1109/VL.1998.706167</a>'
  apa: Geiger, C., Müller, W., &#38; Rosenbach, W. (1998). SAM - An Animated 3D Programming
    Language. <i>Proceedings of the IEEE Symposium on Visual Languages</i>. 1998 IEEE
    Symposium on Visual Languages, Halifax, Canada. <a href="https://doi.org/10.1109/VL.1998.706167">https://doi.org/10.1109/VL.1998.706167</a>
  bibtex: '@inproceedings{Geiger_Müller_Rosenbach_1998, place={Halifax, Canada}, title={SAM
    - An Animated 3D Programming Language}, DOI={<a href="https://doi.org/10.1109/VL.1998.706167">10.1109/VL.1998.706167</a>},
    booktitle={Proceedings of the IEEE Symposium on Visual Languages}, author={Geiger,
    Christian and Müller, Wolfgang and Rosenbach, W.}, year={1998} }'
  chicago: Geiger, Christian, Wolfgang Müller, and W. Rosenbach. “SAM - An Animated
    3D Programming Language.” In <i>Proceedings of the IEEE Symposium on Visual Languages</i>.
    Halifax, Canada, 1998. <a href="https://doi.org/10.1109/VL.1998.706167">https://doi.org/10.1109/VL.1998.706167</a>.
  ieee: 'C. Geiger, W. Müller, and W. Rosenbach, “SAM - An Animated 3D Programming
    Language,” presented at the 1998 IEEE Symposium on Visual Languages, Halifax,
    Canada, 1998, doi: <a href="https://doi.org/10.1109/VL.1998.706167">10.1109/VL.1998.706167</a>.'
  mla: Geiger, Christian, et al. “SAM - An Animated 3D Programming Language.” <i>Proceedings
    of the IEEE Symposium on Visual Languages</i>, 1998, doi:<a href="https://doi.org/10.1109/VL.1998.706167">10.1109/VL.1998.706167</a>.
  short: 'C. Geiger, W. Müller, W. Rosenbach, in: Proceedings of the IEEE Symposium
    on Visual Languages, Halifax, Canada, 1998.'
conference:
  location: Halifax, Canada
  name: 1998 IEEE Symposium on Visual Languages
date_created: 2023-01-24T11:39:30Z
date_updated: 2023-01-24T11:39:35Z
department:
- _id: '672'
doi: 10.1109/VL.1998.706167
keyword:
- Animation
- Computer languages
- Solids
- Concrete
- Application software
- Virtual reality
- Programming profession
- Switches
- Visualization
- Debugging
language:
- iso: eng
place: Halifax, Canada
publication: Proceedings of the IEEE Symposium on Visual Languages
publication_identifier:
  isbn:
  - 0-8186-8712-6
status: public
title: SAM - An Animated 3D Programming Language
type: conference
user_id: '5786'
year: '1998'
...
---
_id: '39505'
abstract:
- lang: eng
  text: '3D-graphics are becoming popular in a steadily increasing number of areas
    such as entertainment, scientific visualization, simulation, and virtual reality.
    Despite this rapid growth, the generation of animated 3D scenes is by no means
    trivial. Since animated 3D objects evolve over time, the authors denote these objects
    as 4D. The article presents a novel approach to the rapid prototyping of 4D models.
    They introduce the AAL (Animated Agent Layer) system. AAL is an interpreter-based
    approach covering a textual (AAL-PR) as well as a visual command language (AAL-VL)
    for the specification of the dynamics in 4D scenes. AAL provides support for different
    levels of abstraction: primitives, structured objects, animated objects, and animated
    (autonomous) agents.'
author:
- first_name: M.
  full_name: Dücker, M.
  last_name: Dücker
- first_name: Christian
  full_name: Geiger, Christian
  last_name: Geiger
- first_name: R.
  full_name: Hunstock, R.
  last_name: Hunstock
- first_name: Georg
  full_name: Lehrenfeld, Georg
  last_name: Lehrenfeld
- first_name: Wolfgang
  full_name: Müller, Wolfgang
  id: '16243'
  last_name: Müller
citation:
  ama: 'Dücker M, Geiger C, Hunstock R, Lehrenfeld G, Müller W. Visual-Textual Prototyping
    of 4D Scenes. In: <i>Proceedings of the 1997 IEEE Symposium on Visual Languages</i>.
    ; 1997. doi:<a href="https://doi.org/10.1109/VL.1997.626601">10.1109/VL.1997.626601</a>'
  apa: Dücker, M., Geiger, C., Hunstock, R., Lehrenfeld, G., &#38; Müller, W. (1997).
    Visual-Textual Prototyping of 4D Scenes. <i>Proceedings of the 1997 IEEE Symposium
    on Visual Languages</i>. 1997 IEEE Symposium on Visual Languages. <a href="https://doi.org/10.1109/VL.1997.626601">https://doi.org/10.1109/VL.1997.626601</a>
  bibtex: '@inproceedings{Dücker_Geiger_Hunstock_Lehrenfeld_Müller_1997, place={Capri,
    Italy}, title={Visual-Textual Prototyping of 4D Scenes}, DOI={<a href="https://doi.org/10.1109/VL.1997.626601">10.1109/VL.1997.626601</a>},
    booktitle={Proceedings of the 1997 IEEE Symposium on Visual Languages}, author={Dücker,
    M. and Geiger, Christian and Hunstock, R. and Lehrenfeld, Georg and Müller, Wolfgang},
    year={1997} }'
  chicago: Dücker, M., Christian Geiger, R. Hunstock, Georg Lehrenfeld, and Wolfgang
    Müller. “Visual-Textual Prototyping of 4D Scenes.” In <i>Proceedings of the 1997
    IEEE Symposium on Visual Languages</i>. Capri, Italy, 1997. <a href="https://doi.org/10.1109/VL.1997.626601">https://doi.org/10.1109/VL.1997.626601</a>.
  ieee: 'M. Dücker, C. Geiger, R. Hunstock, G. Lehrenfeld, and W. Müller, “Visual-Textual
    Prototyping of 4D Scenes,” presented at the 1997 IEEE Symposium on Visual Languages,
    1997, doi: <a href="https://doi.org/10.1109/VL.1997.626601">10.1109/VL.1997.626601</a>.'
  mla: Dücker, M., et al. “Visual-Textual Prototyping of 4D Scenes.” <i>Proceedings
    of the 1997 IEEE Symposium on Visual Languages</i>, 1997, doi:<a href="https://doi.org/10.1109/VL.1997.626601">10.1109/VL.1997.626601</a>.
  short: 'M. Dücker, C. Geiger, R. Hunstock, G. Lehrenfeld, W. Müller, in: Proceedings
    of the 1997 IEEE Symposium on Visual Languages, Capri, Italy, 1997.'
conference:
  name: 1997 IEEE Symposium on Visual Languages
date_created: 2023-01-24T11:48:57Z
date_updated: 2023-01-24T11:49:01Z
department:
- _id: '672'
doi: 10.1109/VL.1997.626601
keyword:
- Prototypes
- Layout
- Animation
- Command languages
- Application software
- Libraries
- Virtual reality
- Computer graphics
- Hardware
- Context modeling
language:
- iso: eng
place: Capri, Italy
publication: Proceedings of the 1997 IEEE Symposium on Visual Languages
publication_identifier:
  isbn:
  - 0-8186-8144-6
status: public
title: Visual-Textual Prototyping of 4D Scenes
type: conference
user_id: '5786'
year: '1997'
...
