[{"publication":"Frontiers in Psychology","type":"journal_article","abstract":[{"text":"When humans interact with artificial intelligence (AI), one desideratum is appropriate trust. Typically, appropriate trust encompasses that humans trust AI except for instances in which they either explicitly notice AI errors or are suspicious that errors could be present. So far, appropriate trust or related notions have mainly been investigated by assessing trust and reliance. In this contribution, we argue that these assessments are insufficient to measure the complex aim of appropriate trust and the related notion of healthy distrust. We introduce and test the perspective of covert visual attention as an additional indicator for appropriate trust and draw conceptual connections to the notion of healthy distrust. To test the validity of our conceptualization, we formalize visual attention using the Theory of Visual Attention and measure its properties that are potentially relevant to appropriate trust and healthy distrust in an image classification task. Based on temporal-order judgment performance, we estimate participants' attentional capacity and attentional weight toward correct and incorrect mock-up AI classifications. We observe that misclassifications reduce attentional capacity compared to correct classifications. However, our results do not indicate that this reduction is beneficial for a subsequent judgment of the classifications. The attentional weighting is not affected by the classifications' correctness but by the difficulty of categorizing the stimuli themselves. 
We discuss these results, their implications, and the limited potential for using visual attention as an indicator of appropriate trust and healthy distrust.","lang":"eng"}],"status":"public","_id":"63611","project":[{"_id":"124","name":"TRR 318 ; TP C01: Gesundes Misstrauen in Erklärungen"}],"department":[{"_id":"424"},{"_id":"660"}],"user_id":"92810","keyword":["appropriate trust","healthy distrust","visual attention","Theory of Visual Attention","human-AI interaction","Bayesian cognitive model","image classification"],"article_type":"original","article_number":"1694367","language":[{"iso":"eng"}],"publication_identifier":{"issn":["1664-1078"]},"publication_status":"published","year":"2026","intvolume":"        16","citation":{"bibtex":"@article{Peters_Biermeier_Scharlau_2026, title={Assessing healthy distrust in human-AI interaction: interpreting changes in visual attention}, volume={16}, DOI={<a href=\"https://doi.org/10.3389/fpsyg.2025.1694367\">10.3389/fpsyg.2025.1694367</a>}, number={1694367}, journal={Frontiers in Psychology}, publisher={Frontiers Media SA}, author={Peters, Tobias Martin and Biermeier, Kai and Scharlau, Ingrid}, year={2026} }","mla":"Peters, Tobias Martin, et al. “Assessing Healthy Distrust in Human-AI Interaction: Interpreting Changes in Visual Attention.” <i>Frontiers in Psychology</i>, vol. 16, 1694367, Frontiers Media SA, 2026, doi:<a href=\"https://doi.org/10.3389/fpsyg.2025.1694367\">10.3389/fpsyg.2025.1694367</a>.","short":"T.M. Peters, K. Biermeier, I. Scharlau, Frontiers in Psychology 16 (2026).","apa":"Peters, T. M., Biermeier, K., &#38; Scharlau, I. (2026). Assessing healthy distrust in human-AI interaction: interpreting changes in visual attention. <i>Frontiers in Psychology</i>, <i>16</i>, Article 1694367. <a href=\"https://doi.org/10.3389/fpsyg.2025.1694367\">https://doi.org/10.3389/fpsyg.2025.1694367</a>","ama":"Peters TM, Biermeier K, Scharlau I. 
Assessing healthy distrust in human-AI interaction: interpreting changes in visual attention. <i>Frontiers in Psychology</i>. 2026;16. doi:<a href=\"https://doi.org/10.3389/fpsyg.2025.1694367\">10.3389/fpsyg.2025.1694367</a>","chicago":"Peters, Tobias Martin, Kai Biermeier, and Ingrid Scharlau. “Assessing Healthy Distrust in Human-AI Interaction: Interpreting Changes in Visual Attention.” <i>Frontiers in Psychology</i> 16 (2026). <a href=\"https://doi.org/10.3389/fpsyg.2025.1694367\">https://doi.org/10.3389/fpsyg.2025.1694367</a>.","ieee":"T. M. Peters, K. Biermeier, and I. Scharlau, “Assessing healthy distrust in human-AI interaction: interpreting changes in visual attention,” <i>Frontiers in Psychology</i>, vol. 16, Art. no. 1694367, 2026, doi: <a href=\"https://doi.org/10.3389/fpsyg.2025.1694367\">10.3389/fpsyg.2025.1694367</a>."},"publisher":"Frontiers Media SA","date_updated":"2026-01-14T14:29:03Z","volume":16,"author":[{"last_name":"Peters","orcid":"0009-0008-5193-6243","full_name":"Peters, Tobias Martin","id":"92810","first_name":"Tobias Martin"},{"first_name":"Kai","id":"55908","full_name":"Biermeier, Kai","orcid":"0000-0002-2879-2359","last_name":"Biermeier"},{"first_name":"Ingrid","full_name":"Scharlau, Ingrid","id":"451","orcid":"0000-0003-2364-9489","last_name":"Scharlau"}],"date_created":"2026-01-14T14:21:59Z","title":"Assessing healthy distrust in human-AI interaction: interpreting changes in visual attention","doi":"10.3389/fpsyg.2025.1694367"},{"keyword":["Attention","Action","Repairs","Task model","HRI","Eyemovement"],"language":[{"iso":"eng"}],"abstract":[{"lang":"eng","text":"This study investigated how action histories – unfolding sequences of actions with objects – provide a context for both attentional allocation and linguistic repair strategies. 
Building on theories of enactive cognition and sensorimotor contingency theory, we experimentally manipulated action sequences (action history) to create either simple or rich “situational models,” and investigated how these models interact with attention and are reflected in linguistic processes during human–robot interaction. Participants (N = 30) engaged in a controlled object placement task with a humanoid robot, where the action (manner) information was either provided or omitted. The omission elicited repair behaviors in participants that were the focus of our investigation. For rich models (competing action possibilities), participants demonstrated: a) increased attentional reorientation, reflecting active engagement with the situational model, and b) preference for restricted repairs, targeting the specific source of trouble in action selection. Conversely, a simple situational model led to more generalized attention patterns and open repair strategies, suggesting weaker constraints on internal processing. These findings highlight how situational structures emerge externally to scaffold internal cognitive processes, with action histories serving as a crucial context for the interface between perception, action, and language. 
We discuss how to implement such a tight loop in the assistance of a system."}],"publication":"IEEE International Conference on Development and Learning (ICDL)","title":"Manners Matter: Action history guides attention and repair choices during interaction","date_created":"2025-09-24T12:32:52Z","year":"2025","quality_controlled":"1","project":[{"name":"TRR 318; TP A05: Echtzeitmessung der Aufmerksamkeit im Mensch-Roboter-Erklärdialog","_id":"115"}],"_id":"61432","user_id":"91018","department":[{"_id":"749"},{"_id":"660"}],"status":"public","type":"conference","main_file_link":[{"open_access":"1","url":"https://doi.org/10.31234/osf.io/yn2we_v1"}],"conference":{"location":"Prague","end_date":"2025-09-19","start_date":"2025-09-15","name":"IEEE International Conference on Development and Learning (ICDL)"},"doi":"10.31234/osf.io/yn2we_v1","date_updated":"2025-09-24T12:39:25Z","oa":"1","author":[{"first_name":"Amit","id":"91018","full_name":"Singh, Amit","orcid":"0000-0002-7789-1521","last_name":"Singh"},{"full_name":"Rohlfing, Katharina J.","id":"50352","orcid":"0000-0002-5676-8233","last_name":"Rohlfing","first_name":"Katharina J."}],"place":" Prague","citation":{"apa":"Singh, A., &#38; Rohlfing, K. J. (2025). Manners Matter: Action history guides attention and repair choices during interaction. <i>IEEE International Conference on Development and Learning (ICDL)</i>. IEEE International Conference on Development and Learning (ICDL), Prague. <a href=\"https://doi.org/10.31234/osf.io/yn2we_v1\">https://doi.org/10.31234/osf.io/yn2we_v1</a>","short":"A. Singh, K.J. 
Rohlfing, in: IEEE International Conference on Development and Learning (ICDL),  Prague, 2025.","bibtex":"@inproceedings{Singh_Rohlfing_2025, place={ Prague}, title={Manners Matter: Action history guides attention and repair choices during interaction}, DOI={<a href=\"https://doi.org/10.31234/osf.io/yn2we_v1\">10.31234/osf.io/yn2we_v1</a>}, booktitle={IEEE International Conference on Development and Learning (ICDL)}, author={Singh, Amit and Rohlfing, Katharina J.}, year={2025} }","mla":"Singh, Amit, and Katharina J. Rohlfing. “Manners Matter: Action History Guides Attention and Repair Choices during Interaction.” <i>IEEE International Conference on Development and Learning (ICDL)</i>, 2025, doi:<a href=\"https://doi.org/10.31234/osf.io/yn2we_v1\">10.31234/osf.io/yn2we_v1</a>.","ieee":"A. Singh and K. J. Rohlfing, “Manners Matter: Action history guides attention and repair choices during interaction,” presented at the IEEE International Conference on Development and Learning (ICDL), Prague, 2025, doi: <a href=\"https://doi.org/10.31234/osf.io/yn2we_v1\">10.31234/osf.io/yn2we_v1</a>.","chicago":"Singh, Amit, and Katharina J. Rohlfing. “Manners Matter: Action History Guides Attention and Repair Choices during Interaction.” In <i>IEEE International Conference on Development and Learning (ICDL)</i>.  Prague, 2025. <a href=\"https://doi.org/10.31234/osf.io/yn2we_v1\">https://doi.org/10.31234/osf.io/yn2we_v1</a>.","ama":"Singh A, Rohlfing KJ. Manners Matter: Action history guides attention and repair choices during interaction. In: <i>IEEE International Conference on Development and Learning (ICDL)</i>. ; 2025. 
doi:<a href=\"https://doi.org/10.31234/osf.io/yn2we_v1\">10.31234/osf.io/yn2we_v1</a>"},"publication_status":"published"},{"main_file_link":[{"url":"https://dl.acm.org/doi/pdf/10.1145/3652037.3652050","open_access":"1"}],"doi":"10.1145/3652037.3652050","title":"Measuring Visual Attention Capacity Across xReality","author":[{"id":"55908","full_name":"Biermeier, Kai","last_name":"Biermeier","orcid":"0000-0002-2879-2359","first_name":"Kai"},{"id":"451","full_name":"Scharlau, Ingrid","last_name":"Scharlau","orcid":"0000-0003-2364-9489","first_name":"Ingrid"},{"first_name":"Enes","orcid":"0000-0002-5967-833X","last_name":"Yigitbas","full_name":"Yigitbas, Enes","id":"8447"}],"date_created":"2024-05-02T10:28:03Z","publisher":"ACM","oa":"1","date_updated":"2024-07-08T08:32:21Z","citation":{"apa":"Biermeier, K., Scharlau, I., &#38; Yigitbas, E. (2024). Measuring Visual Attention Capacity Across xReality. <i>Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)</i>. <a href=\"https://doi.org/10.1145/3652037.3652050\">https://doi.org/10.1145/3652037.3652050</a>","mla":"Biermeier, Kai, et al. “Measuring Visual Attention Capacity Across XReality.” <i>Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)</i>, ACM, 2024, doi:<a href=\"https://doi.org/10.1145/3652037.3652050\">10.1145/3652037.3652050</a>.","bibtex":"@inproceedings{Biermeier_Scharlau_Yigitbas_2024, title={Measuring Visual Attention Capacity Across xReality}, DOI={<a href=\"https://doi.org/10.1145/3652037.3652050\">10.1145/3652037.3652050</a>}, booktitle={Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)}, publisher={ACM}, author={Biermeier, Kai and Scharlau, Ingrid and Yigitbas, Enes}, year={2024} }","short":"K. Biermeier, I. Scharlau, E. 
Yigitbas, in: Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024), ACM, 2024.","ama":"Biermeier K, Scharlau I, Yigitbas E. Measuring Visual Attention Capacity Across xReality. In: <i>Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)</i>. ACM; 2024. doi:<a href=\"https://doi.org/10.1145/3652037.3652050\">10.1145/3652037.3652050</a>","chicago":"Biermeier, Kai, Ingrid Scharlau, and Enes Yigitbas. “Measuring Visual Attention Capacity Across XReality.” In <i>Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)</i>. ACM, 2024. <a href=\"https://doi.org/10.1145/3652037.3652050\">https://doi.org/10.1145/3652037.3652050</a>.","ieee":"K. Biermeier, I. Scharlau, and E. Yigitbas, “Measuring Visual Attention Capacity Across xReality,” 2024, doi: <a href=\"https://doi.org/10.1145/3652037.3652050\">10.1145/3652037.3652050</a>."},"year":"2024","language":[{"iso":"eng"}],"keyword":["Visual Attention","TVA","Cognitive Modelling","Bayesian Modelling","AR","VR"],"user_id":"55908","department":[{"_id":"66"},{"_id":"534"},{"_id":"424"}],"_id":"53816","status":"public","abstract":[{"lang":"eng","text":"Augmented Reality (AR) and Virtual Reality (VR) technologies have been applied very broadly in the recent past. While prior work emphasizes the potential of these technologies in various application domains, the process of visual attention in and across the contexts of AR/VR environments is not exhaustively explored yet. By now, visual attention in AR/VR environments has mainly been studied by means of overt attention (i.e. saccadic eye movements), self-report, and process-related visual attention proxies (like reaction time). 
In this work, we analyze covert visual attention based on the (psychological) Theory of Visual Attention (TVA), which allows us to quantify theory-based interpretable properties of the visual attention process. For example, the TVA allows us to measure the overall processing speed. We instantiate this TVA-based framework with a 30-participant explorative within-subjects study. The results show a decisive difference in visual attention between Reality (i.e. the neutral condition) and Virtual Reality and a weak difference between Reality and Augmented Reality. We discuss the consequences of our findings and provide ideas for future studies."}],"type":"conference","publication":"Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)"},{"type":"conference","popular_science":"1","status":"public","project":[{"grant_number":"438445824","_id":"115","name":"TRR 318 - A05: TRR 318 - Echtzeitmessung der Aufmerksamkeit im Mensch-Roboter-Erklärdialog (Teilprojekt A05)"}],"_id":"46067","user_id":"91018","department":[{"_id":"749"},{"_id":"660"}],"publication_status":"published","related_material":{"record":[{"relation":"contains","id":"46067","status":"public"}]},"place":"Sydney, Australia","citation":{"chicago":"Singh, Amit, and Katharina J. Rohlfing. “Contrastiveness in the Context of Action Demonstration: An Eye-Tracking Study on Its Effects on Action Perception and Action Recall.” In <i>Proceedings of the Annual Meeting of the Cognitive Science Society 45 (45)</i>. Sydney, Australia: Cognitive Science Society, 2023.","ieee":"A. Singh and K. J. Rohlfing, “Contrastiveness in the context of action demonstration: an eye-tracking study on its effects on action perception and action recall,” presented at the 45th Annual Conference of the Cognitive Science Society, Sydney, 2023.","ama":"Singh A, Rohlfing KJ. 
Contrastiveness in the context of action demonstration: an eye-tracking study on its effects on action perception and action recall. In: <i>Proceedings of the Annual Meeting of the Cognitive Science Society 45 (45)</i>. Cognitive Science Society; 2023.","bibtex":"@inproceedings{Singh_Rohlfing_2023, place={Sydney, Australia}, title={Contrastiveness in the context of action demonstration: an eye-tracking study on its effects on action perception and action recall}, booktitle={Proceedings of the Annual Meeting of the Cognitive Science Society 45 (45)}, publisher={Cognitive Science Society}, author={Singh, Amit and Rohlfing, Katharina J.}, year={2023} }","mla":"Singh, Amit, and Katharina J. Rohlfing. “Contrastiveness in the Context of Action Demonstration: An Eye-Tracking Study on Its Effects on Action Perception and Action Recall.” <i>Proceedings of the Annual Meeting of the Cognitive Science Society 45 (45)</i>, Cognitive Science Society, 2023.","short":"A. Singh, K.J. Rohlfing, in: Proceedings of the Annual Meeting of the Cognitive Science Society 45 (45), Cognitive Science Society, Sydney, Australia, 2023.","apa":"Singh, A., &#38; Rohlfing, K. J. (2023). Contrastiveness in the context of action demonstration: an eye-tracking study on its effects on action perception and action recall. <i>Proceedings of the Annual Meeting of the Cognitive Science Society 45 (45)</i>. 
45th Annual Conference of the Cognitive Science Society, Sydney."},"date_updated":"2023-09-27T13:51:42Z","oa":"1","author":[{"last_name":"Singh","orcid":"0000-0002-7789-1521","id":"91018","full_name":"Singh, Amit","first_name":"Amit"},{"first_name":"Katharina J.","last_name":"Rohlfing","id":"50352","full_name":"Rohlfing, Katharina J."}],"main_file_link":[{"url":"https://escholarship.org/uc/item/2w94t4cv","open_access":"1"}],"conference":{"location":"Sydney","name":"45th Annual Conference of the Cognitive Science Society"},"publication":"Proceedings of the Annual Meeting of the Cognitive Science Society 45 (45)","abstract":[{"lang":"eng","text":"<p>The study investigates two different ways of guiding the addressee of an explanation (an explainee) through action demonstration: contrastive and non-contrastive. Their effect was tested on attention to specific action elements (goal) as well as on event memory. In an eye-tracking experiment, participants were shown different motion videos that were either contrastive or non-contrastive with respect to the segments of movement presentation. Given that everyday action demonstration is often multimodal, the stimuli were created with respect to their visual and verbal presentation. For visual presentation, a video combined two movements in a contrastive (e.g., Up-motion following a Down-motion) or non-contrastive way (e.g., two Up-motions following each other). For verbal presentation, each video was combined with a sequence of instruction descriptions in the form of negative (i.e., contrastive) or assertive (i.e., non-contrastive) guidance. 
It was found that a) attention to the event goal increased for this condition in the later time window, and b) participants’ recall of the event was facilitated when a visually contrastive motion was combined with a verbal contrast.</p>"}],"keyword":["Attention","negation","contrastive  guidance","eye-movements","action understanding","event representation"],"language":[{"iso":"eng"}],"quality_controlled":"1","year":"2023","publisher":"Cognitive Science Society","date_created":"2023-07-15T12:16:42Z","title":"Contrastiveness in the context of action demonstration: an eye-tracking study on its effects on action perception and action recall"},{"author":[{"first_name":"Jan-Peter","full_name":"Kucklick, Jan-Peter","id":"77066","last_name":"Kucklick"}],"date_created":"2021-11-17T07:08:15Z","date_updated":"2022-01-06T06:57:40Z","oa":"1","main_file_link":[{"url":"https://scholarspace.manoa.hawaii.edu/bitstream/10125/79519/0149.pdf","open_access":"1"}],"conference":{"location":"Virtual","end_date":"2022-01-07","start_date":"2022-01-03","name":"Hawaii International Conference on System Science (HICSS)"},"title":"Visual Interpretability of Image-based Real Estate Appraisal","citation":{"ieee":"J.-P. Kucklick, “Visual Interpretability of Image-based Real Estate Appraisal,” presented at the Hawaii International Conference on System Science (HICSS), Virtual, 2022.","chicago":"Kucklick, Jan-Peter. “Visual Interpretability of Image-Based Real Estate Appraisal.” In <i>55th Annual Hawaii International Conference on System Sciences (HICSS-55)</i>, 2022.","ama":"Kucklick J-P. Visual Interpretability of Image-based Real Estate Appraisal. In: <i>55th Annual Hawaii International Conference on System Sciences (HICSS-55)</i>. ; 2022.","apa":"Kucklick, J.-P. (2022). Visual Interpretability of Image-based Real Estate Appraisal. <i>55th Annual Hawaii International Conference on System Sciences (HICSS-55)</i>. Hawaii International Conference on System Science (HICSS), Virtual.","short":"J.-P. 
Kucklick, in: 55th Annual Hawaii International Conference on System Sciences (HICSS-55), 2022.","bibtex":"@inproceedings{Kucklick_2022, title={Visual Interpretability of Image-based Real Estate Appraisal}, booktitle={55th Annual Hawaii International Conference on System Sciences (HICSS-55)}, author={Kucklick, Jan-Peter}, year={2022} }","mla":"Kucklick, Jan-Peter. “Visual Interpretability of Image-Based Real Estate Appraisal.” <i>55th Annual Hawaii International Conference on System Sciences (HICSS-55)</i>, 2022."},"year":"2022","user_id":"77066","department":[{"_id":"195"},{"_id":"196"}],"_id":"27506","language":[{"iso":"eng"}],"keyword":["Explainable Artificial Intelligence (XAI)","Regression Activation Maps","Real Estate Appraisal","Convolutional Block Attention Module","Computer Vision"],"type":"conference","publication":"55th Annual Hawaii International Conference on System Sciences (HICSS-55)","status":"public","abstract":[{"text":"Explainability for machine learning gets more and more important in high-stakes decisions like real estate appraisal. While traditional hedonic house pricing models are fed with hard information based on housing attributes, recently also soft information has been incorporated to increase the predictive performance. This soft information can be extracted from image data by complex models like Convolutional Neural Networks (CNNs). However, these are intransparent which excludes their use for high-stakes financial decisions. To overcome this limitation, we examine if a two-stage modeling approach can provide explainability. We combine visual interpretability by Regression Activation Maps (RAM) for the CNN and a linear regression for the overall prediction. 
Our experiments are based on 62,000 family homes in Philadelphia, and the results indicate that the CNN learns aspects related to vegetation and quality aspects of the house from exterior images, improving the predictive accuracy of real estate appraisal by up to 5.4%.","lang":"eng"}]},{"doi":"10.3758/s13414-017-1325-6","volume":79,"author":[{"full_name":"Krüger, Alexander","last_name":"Krüger","first_name":"Alexander"},{"first_name":"Jan","full_name":"Tünnermann, Jan","last_name":"Tünnermann"},{"full_name":"Scharlau, Ingrid","id":"451","last_name":"Scharlau","orcid":"0000-0003-2364-9489","first_name":"Ingrid"}],"date_updated":"2022-06-06T14:08:05Z","page":"1593 - 1614","intvolume":"        79","citation":{"ieee":"A. Krüger, J. Tünnermann, and I. Scharlau, “Measuring and modeling salience with the theory of visual attention.,” <i>Attention, Perception, &#38; Psychophysics</i>, vol. 79, no. 6, pp. 1593–1614, 2017, doi: <a href=\"https://doi.org/10.3758/s13414-017-1325-6\">10.3758/s13414-017-1325-6</a>.","chicago":"Krüger, Alexander, Jan Tünnermann, and Ingrid Scharlau. “Measuring and Modeling Salience with the Theory of Visual Attention.” <i>Attention, Perception, &#38; Psychophysics</i> 79, no. 6 (2017): 1593–1614. <a href=\"https://doi.org/10.3758/s13414-017-1325-6\">https://doi.org/10.3758/s13414-017-1325-6</a>.","ama":"Krüger A, Tünnermann J, Scharlau I. Measuring and modeling salience with the theory of visual attention. <i>Attention, Perception, &#38; Psychophysics</i>. 2017;79(6):1593-1614. 
doi:<a href=\"https://doi.org/10.3758/s13414-017-1325-6\">10.3758/s13414-017-1325-6</a>","bibtex":"@article{Krüger_Tünnermann_Scharlau_2017, title={Measuring and modeling salience with the theory of visual attention.}, volume={79}, DOI={<a href=\"https://doi.org/10.3758/s13414-017-1325-6\">10.3758/s13414-017-1325-6</a>}, number={6}, journal={Attention, Perception, &#38; Psychophysics}, author={Krüger, Alexander and Tünnermann, Jan and Scharlau, Ingrid}, year={2017}, pages={1593–1614} }","short":"A. Krüger, J. Tünnermann, I. Scharlau, Attention, Perception, &#38; Psychophysics 79 (2017) 1593–1614.","mla":"Krüger, Alexander, et al. “Measuring and Modeling Salience with the Theory of Visual Attention.” <i>Attention, Perception, &#38; Psychophysics</i>, vol. 79, no. 6, 2017, pp. 1593–614, doi:<a href=\"https://doi.org/10.3758/s13414-017-1325-6\">10.3758/s13414-017-1325-6</a>.","apa":"Krüger, A., Tünnermann, J., &#38; Scharlau, I. (2017). Measuring and modeling salience with the theory of visual attention. <i>Attention, Perception, &#38; Psychophysics</i>, <i>79</i>(6), 1593–1614. 
<a href=\"https://doi.org/10.3758/s13414-017-1325-6\">https://doi.org/10.3758/s13414-017-1325-6</a>"},"publication_identifier":{"issn":["1943-3921"]},"publication_status":"published","article_type":"original","department":[{"_id":"424"}],"user_id":"42165","_id":"6075","status":"public","type":"journal_article","title":"Measuring and modeling salience with the theory of visual attention.","date_created":"2018-12-10T07:05:04Z","year":"2017","issue":"6","language":[{"iso":"eng"}],"keyword":["Salience","Visual attention","Bayesian inference","Theory of visual attention","Computational modeling","Inference","Object Recognition","Theories","Visual Perception","Visual Attention","Luminance","Perceptual Orientation","Statistical Probability","Stimulus Salience","Computational Modeling"],"abstract":[{"text":"For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA’s scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. 
Furthermore, if contrasts from the two dimensions are comb","lang":"eng"}],"publication":"Attention, Perception, & Psychophysics"},{"type":"journal_article","status":"public","department":[{"_id":"424"}],"user_id":"42165","_id":"6071","funded_apc":"1","publication_identifier":{"issn":["1895-1171"]},"publication_status":"published","page":"20 - 38","intvolume":"        12","citation":{"apa":"Krüger, A., Tünnermann, J., &#38; Scharlau, I. (2016). Fast and conspicuous? Quantifying salience with the theory of visual attention. <i>Advances in Cognitive Psychology</i>, <i>12</i>(1), 20–38. <a href=\"https://doi.org/10.5709/acp-0184-1\">https://doi.org/10.5709/acp-0184-1</a>","bibtex":"@article{Krüger_Tünnermann_Scharlau_2016, title={Fast and conspicuous? Quantifying salience with the theory of visual attention.}, volume={12}, DOI={<a href=\"https://doi.org/10.5709/acp-0184-1\">10.5709/acp-0184-1</a>}, number={1}, journal={Advances in Cognitive Psychology}, author={Krüger, Alexander and Tünnermann, Jan and Scharlau, Ingrid}, year={2016}, pages={20–38} }","short":"A. Krüger, J. Tünnermann, I. Scharlau, Advances in Cognitive Psychology 12 (2016) 20–38.","mla":"Krüger, Alexander, et al. “Fast and Conspicuous? Quantifying Salience with the Theory of Visual Attention.” <i>Advances in Cognitive Psychology</i>, vol. 12, no. 1, 2016, pp. 20–38, doi:<a href=\"https://doi.org/10.5709/acp-0184-1\">10.5709/acp-0184-1</a>.","chicago":"Krüger, Alexander, Jan Tünnermann, and Ingrid Scharlau. “Fast and Conspicuous? Quantifying Salience with the Theory of Visual Attention.” <i>Advances in Cognitive Psychology</i> 12, no. 1 (2016): 20–38. <a href=\"https://doi.org/10.5709/acp-0184-1\">https://doi.org/10.5709/acp-0184-1</a>.","ieee":"A. Krüger, J. Tünnermann, and I. Scharlau, “Fast and conspicuous? Quantifying salience with the theory of visual attention.,” <i>Advances in Cognitive Psychology</i>, vol. 12, no. 1, pp. 
20–38, 2016, doi: <a href=\"https://doi.org/10.5709/acp-0184-1\">10.5709/acp-0184-1</a>.","ama":"Krüger A, Tünnermann J, Scharlau I. Fast and conspicuous? Quantifying salience with the theory of visual attention. <i>Advances in Cognitive Psychology</i>. 2016;12(1):20-38. doi:<a href=\"https://doi.org/10.5709/acp-0184-1\">10.5709/acp-0184-1</a>"},"volume":12,"author":[{"first_name":"Alexander","last_name":"Krüger","full_name":"Krüger, Alexander"},{"last_name":"Tünnermann","full_name":"Tünnermann, Jan","first_name":"Jan"},{"last_name":"Scharlau","orcid":"0000-0003-2364-9489","id":"451","full_name":"Scharlau, Ingrid","first_name":"Ingrid"}],"date_updated":"2022-06-06T16:21:09Z","oa":"1","doi":"10.5709/acp-0184-1","main_file_link":[{"open_access":"1","url":"http://ac-psych.org/en/download-pdf/volume/12/issue/1/id/185"}],"publication":"Advances in Cognitive Psychology","abstract":[{"text":"Particular differences between an object and its surrounding cause salience, guide attention, and improve performance in various tasks. While much research has been dedicated to identifying which feature dimensions contribute to salience, much less regard has been paid to the quantitative strength of the salience caused by feature differences. Only a few studies systematically related salience effects to a common salience measure, and they are partly outdated in the light of new findings on the time course of salience effects. We propose Bundesen’s Theory of Visual Attention (TVA) as a theoretical basis for measuring salience and introduce an empirical and modeling approach to link this theory to data retrieved from temporal-order judgments. With this procedure, TVA becomes applicable to a broad range of salience-related stimulus material. Three experiments with orientation pop-out displays demonstrate the feasibility of the method. 
A 4th experiment substantiates its applicability t","lang":"eng"}],"language":[{"iso":"eng"}],"keyword":["salience","visual attention","Bayesian inference","theory of visual attention","computational modeling","Visual Attention","Computational Modeling","Inference","Judgment","Statistical Probability"],"issue":"1","year":"2016","date_created":"2018-12-10T07:04:15Z","title":"Fast and conspicuous? Quantifying salience with the theory of visual attention."},{"keyword":["cueing","temporal-order judgements","theory of visual attention (TVA)","peripheral cue","processing speed","stimulus encoding","prior entry","Attention","Cues","Face Perception","Judgment"],"language":[{"iso":"eng"}],"_id":"6080","department":[{"_id":"424"}],"user_id":"42165","abstract":[{"text":"Peripheral visual cues lead to large shifts in psychometric distributions of temporal-order judgments. In one view, such shifts are attributed to attention speeding up processing of the cued stimulus, so-called prior entry. However, sometimes these shifts are so large that it is unlikely that they are caused by attention alone. Here we tested the prevalent alternative explanation that the cue is sometimes confused with the target on a perceptual level, bolstering the shift of the psychometric function. We applied a novel model of cued temporal-order judgments, derived from Bundesen’s Theory of Visual Attention.We found that cue–target confusions indeed contribute to shifting psychometric functions. However, cue-induced changes in the processing rates of the target stimuli play an important role, too. At smaller cueing intervals, the cue increased the processing speed of the target. At larger intervals, inhibition of return was predominant. 
Earlier studies of cued TOJs were insensitive","lang":"eng"}],"status":"public","publication":"Frontiers in Psychology","type":"journal_article","title":"Peripheral visual cues: Their fate in processing and effects on attention and temporal-order perception.","doi":"10.3389/fpsyg.2016.01442","main_file_link":[{"url":"https://www.frontiersin.org/articles/10.3389/fpsyg.2016.01442/full","open_access":"1"}],"date_updated":"2022-06-06T16:29:50Z","oa":"1","volume":7,"author":[{"full_name":"Tünnermann, Jan","last_name":"Tünnermann","first_name":"Jan"},{"id":"451","full_name":"Scharlau, Ingrid","orcid":"0000-0003-2364-9489","last_name":"Scharlau","first_name":"Ingrid"}],"date_created":"2018-12-10T07:06:09Z","year":"2016","intvolume":"         7","citation":{"ama":"Tünnermann J, Scharlau I. Peripheral visual cues: Their fate in processing and effects on attention and temporal-order perception. <i>Frontiers in Psychology</i>. 2016;7. doi:<a href=\"https://doi.org/10.3389/fpsyg.2016.01442\">10.3389/fpsyg.2016.01442</a>","chicago":"Tünnermann, Jan, and Ingrid Scharlau. “Peripheral Visual Cues: Their Fate in Processing and Effects on Attention and Temporal-Order Perception.” <i>Frontiers in Psychology</i> 7 (2016). <a href=\"https://doi.org/10.3389/fpsyg.2016.01442\">https://doi.org/10.3389/fpsyg.2016.01442</a>.","ieee":"J. Tünnermann and I. Scharlau, “Peripheral visual cues: Their fate in processing and effects on attention and temporal-order perception.,” <i>Frontiers in Psychology</i>, vol. 7, 2016, doi: <a href=\"https://doi.org/10.3389/fpsyg.2016.01442\">10.3389/fpsyg.2016.01442</a>.","apa":"Tünnermann, J., &#38; Scharlau, I. (2016). Peripheral visual cues: Their fate in processing and effects on attention and temporal-order perception. <i>Frontiers in Psychology</i>, <i>7</i>. 
<a href=\"https://doi.org/10.3389/fpsyg.2016.01442\">https://doi.org/10.3389/fpsyg.2016.01442</a>","bibtex":"@article{Tünnermann_Scharlau_2016, title={Peripheral visual cues: Their fate in processing and effects on attention and temporal-order perception.}, volume={7}, DOI={<a href=\"https://doi.org/10.3389/fpsyg.2016.01442\">10.3389/fpsyg.2016.01442</a>}, journal={Frontiers in Psychology}, author={Tünnermann, Jan and Scharlau, Ingrid}, year={2016} }","short":"J. Tünnermann, I. Scharlau, Frontiers in Psychology 7 (2016).","mla":"Tünnermann, Jan, and Ingrid Scharlau. “Peripheral Visual Cues: Their Fate in Processing and Effects on Attention and Temporal-Order Perception.” <i>Frontiers in Psychology</i>, vol. 7, 2016, doi:<a href=\"https://doi.org/10.3389/fpsyg.2016.01442\">10.3389/fpsyg.2016.01442</a>."},"publication_identifier":{"issn":["1664-1078"]},"publication_status":"published"},{"language":[{"iso":"eng"}],"keyword":["unattended stimuli","attention speed","cognitive processing","Attention","Humans","Judgment","Mental Recall","Visual Perception","Stimulus Parameters","Visual Perception","Visual Attention","Cognitive Processes","Velocity"],"abstract":[{"text":"Selective visual attention improves performance in many tasks. Among others, it leads to 'prior entry'—earlier perception of an attended compared to an unattended stimulus. Whether this phenomenon is purely based on an increase of the processing rate of the attended stimulus or if a decrease in the processing rate of the unattended stimulus also contributes to the effect is, up to now, unanswered. Here we describe a novel approach to this question based on Bundesen’s Theory of Visual Attention, which we use to overcome the limitations of earlier prior-entry assessment with temporal order judgments (TOJs) that only allow relative statements regarding the processing speed of attended and unattended stimuli. 
Prevalent models of prior entry in TOJs either indirectly predict a pure acceleration or cannot model the difference between acceleration and deceleration. In a paradigm that combines a letter-identification task with TOJs, we show that indeed acceleration of the attended and deceler","lang":"eng"}],"publication":"Journal of Vision","title":"Does attention speed up processing? Decreases and increases of processing rates in visual prior entry.","date_created":"2018-12-10T07:01:56Z","year":"2015","issue":"3","user_id":"42165","department":[{"_id":"424"}],"_id":"6066","status":"public","type":"journal_article","main_file_link":[{"url":"https://jov.arvojournals.org/article.aspx?articleid=2213282","open_access":"1"}],"doi":"10.1167/15.3.1","author":[{"last_name":"Tünnermann","full_name":"Tünnermann, Jan","first_name":"Jan"},{"last_name":"Petersen","full_name":"Petersen, Anders","first_name":"Anders"},{"id":"451","full_name":"Scharlau, Ingrid","orcid":"0000-0003-2364-9489","last_name":"Scharlau","first_name":"Ingrid"}],"volume":15,"oa":"1","date_updated":"2022-06-06T16:31:07Z","citation":{"bibtex":"@article{Tünnermann_Petersen_Scharlau_2015, title={Does attention speed up processing? Decreases and increases of processing rates in visual prior entry.}, volume={15}, DOI={<a href=\"https://doi.org/10.1167/15.3.1\">10.1167/15.3.1</a>}, number={3}, journal={Journal of Vision}, author={Tünnermann, Jan and Petersen, Anders and Scharlau, Ingrid}, year={2015} }","mla":"Tünnermann, Jan, et al. “Does Attention Speed up Processing? Decreases and Increases of Processing Rates in Visual Prior Entry.” <i>Journal of Vision</i>, vol. 15, no. 3, 2015, doi:<a href=\"https://doi.org/10.1167/15.3.1\">10.1167/15.3.1</a>.","short":"J. Tünnermann, A. Petersen, I. Scharlau, Journal of Vision 15 (2015).","apa":"Tünnermann, J., Petersen, A., &#38; Scharlau, I. (2015). Does attention speed up processing? Decreases and increases of processing rates in visual prior entry. 
<i>Journal of Vision</i>, <i>15</i>(3). <a href=\"https://doi.org/10.1167/15.3.1\">https://doi.org/10.1167/15.3.1</a>","ama":"Tünnermann J, Petersen A, Scharlau I. Does attention speed up processing? Decreases and increases of processing rates in visual prior entry. <i>Journal of Vision</i>. 2015;15(3). doi:<a href=\"https://doi.org/10.1167/15.3.1\">10.1167/15.3.1</a>","ieee":"J. Tünnermann, A. Petersen, and I. Scharlau, “Does attention speed up processing? Decreases and increases of processing rates in visual prior entry.,” <i>Journal of Vision</i>, vol. 15, no. 3, 2015, doi: <a href=\"https://doi.org/10.1167/15.3.1\">10.1167/15.3.1</a>.","chicago":"Tünnermann, Jan, Anders Petersen, and Ingrid Scharlau. “Does Attention Speed up Processing? Decreases and Increases of Processing Rates in Visual Prior Entry.” <i>Journal of Vision</i> 15, no. 3 (2015). <a href=\"https://doi.org/10.1167/15.3.1\">https://doi.org/10.1167/15.3.1</a>."},"intvolume":"        15","publication_status":"published","publication_identifier":{"issn":["1534-7362"]}},{"language":[{"iso":"eng"}],"keyword":["Human behavior","Attention"],"department":[{"_id":"749"}],"user_id":"14931","_id":"17197","status":"public","abstract":[{"text":"According to natural pedagogy theory, infants are sensitive to particular ostensive cues that communicate to them that they are being addressed and that they can expect to learn referential information. We demonstrate that 6-month-old infants follow others' gaze direction in situations that are highly attention-grabbing. This occurs irrespective of whether these situations include communicative intent and ostensive cues (a model looks directly into the child's eyes prior to shifting gaze to an object) or not (a model shivers while looking down prior to shifting gaze to an object). In contrast, in less attention-grabbing contexts in which the model simply looks down prior to shifting gaze to an object, no effect is found. 
These findings demonstrate that one of the central pillars of natural pedagogy is false. Sensitivity to gaze following in infancy is not restricted to contexts in which ostensive cues are conveyed.","lang":"eng"}],"publication":"Scientific Reports","type":"journal_article","doi":"10.1038/srep05304","title":"Is ostension any more than attention?","volume":4,"date_created":"2020-06-24T13:01:15Z","author":[{"first_name":"Joanna","last_name":"Szufnarowska","full_name":"Szufnarowska, Joanna"},{"last_name":"Rohlfing","id":"50352","full_name":"Rohlfing, Katharina","first_name":"Katharina"},{"first_name":"Christine","last_name":"Fawcett","full_name":"Fawcett, Christine"},{"first_name":"Gustaf","last_name":"Gredebäck","full_name":"Gredebäck, Gustaf"}],"publisher":"Nature Publishing Group","date_updated":"2023-02-01T16:10:11Z","intvolume":"         4","citation":{"chicago":"Szufnarowska, Joanna, Katharina Rohlfing, Christine Fawcett, and Gustaf Gredebäck. “Is Ostension Any More than Attention?” <i>Scientific Reports</i> 4, no. 1 (2014). <a href=\"https://doi.org/10.1038/srep05304\">https://doi.org/10.1038/srep05304</a>.","ieee":"J. Szufnarowska, K. Rohlfing, C. Fawcett, and G. Gredebäck, “Is ostension any more than attention?,” <i>Scientific Reports</i>, vol. 4, no. 1, 2014, doi: <a href=\"https://doi.org/10.1038/srep05304\">10.1038/srep05304</a>.","ama":"Szufnarowska J, Rohlfing K, Fawcett C, Gredebäck G. Is ostension any more than attention? <i>Scientific Reports</i>. 2014;4(1). doi:<a href=\"https://doi.org/10.1038/srep05304\">10.1038/srep05304</a>","apa":"Szufnarowska, J., Rohlfing, K., Fawcett, C., &#38; Gredebäck, G. (2014). Is ostension any more than attention? <i>Scientific Reports</i>, <i>4</i>(1). <a href=\"https://doi.org/10.1038/srep05304\">https://doi.org/10.1038/srep05304</a>","mla":"Szufnarowska, Joanna, et al. “Is Ostension Any More than Attention?” <i>Scientific Reports</i>, vol. 4, no. 
1, Nature Publishing Group, 2014, doi:<a href=\"https://doi.org/10.1038/srep05304\">10.1038/srep05304</a>.","bibtex":"@article{Szufnarowska_Rohlfing_Christine_Gustaf_2014, title={Is ostension any more than attention?}, volume={4}, DOI={<a href=\"https://doi.org/10.1038/srep05304\">10.1038/srep05304</a>}, number={1}, journal={Scientific Reports}, publisher={Nature Publishing Group}, author={Szufnarowska, Joanna and Rohlfing, Katharina and Fawcett, Christine and Gredebäck, Gustaf}, year={2014} }","short":"J. Szufnarowska, K. Rohlfing, C. Fawcett, G. Gredebäck, Scientific Reports 4 (2014)."},"year":"2014","issue":"1","publication_identifier":{"issn":["2045-2322"]}},{"date_updated":"2023-02-01T16:12:50Z","publisher":"John Benjamins Publishing Company","volume":14,"date_created":"2020-06-24T13:01:23Z","author":[{"last_name":"Nomikou","full_name":"Nomikou, Iris","first_name":"Iris"},{"id":"50352","full_name":"Rohlfing, Katharina","last_name":"Rohlfing","first_name":"Katharina"},{"first_name":"Joanna","full_name":"Szufnarowska, Joanna","last_name":"Szufnarowska"}],"title":"Educating attention: recruiting, maintaining, and framing eye contact in early natural mother-infant interactions","doi":"10.1075/is.14.2.05nom","publication_identifier":{"issn":["1572-0381"]},"issue":"2","year":"2013","page":"240-267","intvolume":"        14","citation":{"bibtex":"@article{Nomikou_Rohlfing_Szufnarowska_2013, title={Educating attention: recruiting, maintaining, and framing eye contact in early natural mother-infant interactions}, volume={14}, DOI={<a href=\"https://doi.org/10.1075/is.14.2.05nom\">10.1075/is.14.2.05nom</a>}, number={2}, journal={Interaction Studies}, publisher={John Benjamins Publishing Company}, author={Nomikou, Iris and Rohlfing, Katharina and Szufnarowska, Joanna}, year={2013}, pages={240–267} }","short":"I. Nomikou, K. Rohlfing, J. Szufnarowska, Interaction Studies 14 (2013) 240–267.","mla":"Nomikou, Iris, et al. 
“Educating Attention: Recruiting, Maintaining, and Framing Eye Contact in Early Natural Mother-Infant Interactions.” <i>Interaction Studies</i>, vol. 14, no. 2, John Benjamins Publishing Company, 2013, pp. 240–67, doi:<a href=\"https://doi.org/10.1075/is.14.2.05nom\">10.1075/is.14.2.05nom</a>.","apa":"Nomikou, I., Rohlfing, K., &#38; Szufnarowska, J. (2013). Educating attention: recruiting, maintaining, and framing eye contact in early natural mother-infant interactions. <i>Interaction Studies</i>, <i>14</i>(2), 240–267. <a href=\"https://doi.org/10.1075/is.14.2.05nom\">https://doi.org/10.1075/is.14.2.05nom</a>","ieee":"I. Nomikou, K. Rohlfing, and J. Szufnarowska, “Educating attention: recruiting, maintaining, and framing eye contact in early natural mother-infant interactions,” <i>Interaction Studies</i>, vol. 14, no. 2, pp. 240–267, 2013, doi: <a href=\"https://doi.org/10.1075/is.14.2.05nom\">10.1075/is.14.2.05nom</a>.","chicago":"Nomikou, Iris, Katharina Rohlfing, and Joanna Szufnarowska. “Educating Attention: Recruiting, Maintaining, and Framing Eye Contact in Early Natural Mother-Infant Interactions.” <i>Interaction Studies</i> 14, no. 2 (2013): 240–67. <a href=\"https://doi.org/10.1075/is.14.2.05nom\">https://doi.org/10.1075/is.14.2.05nom</a>.","ama":"Nomikou I, Rohlfing K, Szufnarowska J. Educating attention: recruiting, maintaining, and framing eye contact in early natural mother-infant interactions. <i>Interaction Studies</i>. 2013;14(2):240-267. doi:<a href=\"https://doi.org/10.1075/is.14.2.05nom\">10.1075/is.14.2.05nom</a>"},"_id":"17204","department":[{"_id":"749"}],"user_id":"14931","keyword":["interactional adaptation","multimodal input","social learning","ecology of attention","eye contact"],"language":[{"iso":"eng"}],"publication":"Interaction Studies","type":"journal_article","abstract":[{"lang":"eng","text":"In a longitudinal naturalistic study, we observed German mothers interacting with their infants when they were 3 and 6 months old. 
Pursuing the idea that infants’ attention is socialized in everyday interactions, we explored whether eye contact is reinforced selectively by behavioral modification in the input provided to infants. Applying a microanalytical approach focusing on the sequential organization of interaction, we explored how the mother draws the infant’s attention to herself and how she tries to maintain attention when the infant is looking at her. Results showed that eye contact is reinforced by specific infant-directed practices: interrogatives and conversational openings, multimodal stimulation, repetition, and imitation. In addition, these practices are contingent on the infant’s own behavior. By comparing the two data points (3 and 6 months), we showed how the education of attention evolves hand-in-hand with the developing capacities of the infant."}],"status":"public"},{"keyword":["attentional blink","attentional enhancement","lag-1 sparing","prior entry","temporal cueing","visual attention","rapid serial presentation","Adolescent","Adult","Attention","Attentional Blink","Color Perception","Cues","Female","Humans","Male","Neuropsychological Tests","Pattern Recognition","Visual","Time Factors","Visual Perception","Young Adult","Cues","Serial Recall","Visual Attention","Eyeblink Reflex"],"funded_apc":"1","language":[{"iso":"eng"}],"_id":"6081","user_id":"42165","department":[{"_id":"424"}],"abstract":[{"text":"The law of prior entry states that attended objects come to consciousness more quickly than unattended ones. This has been well established in spatial cueing paradigms, where two task-relevant stimuli are presented near-simultaneously at two different locations. Here, we suggest that prior entry also plays a pivotal role in temporal attention paradigms, where stimuli appear at the same location but at distinct moments in time, in rapid serial presentation (RSVP). 
Specifically, we hypothesize that prior entry can explain temporal order reversals in reporting two targets from RSVP. In support of this, three experiments show that cueing attention toward either of the targets has a strong influence on order errors. We conclude that prior entry provides a viable explanation of the way in which relevant information is prioritized in RSVP.","lang":"eng"}],"status":"public","type":"journal_article","publication":"Journal of Experimental Psychology: Human Perception and Performance","title":"Prior entry and temporal attention: Cueing affects order errors in RSVP.","date_updated":"2022-06-06T16:35:40Z","author":[{"first_name":"Frederic","full_name":"Hilkenmeier, Frederic","last_name":"Hilkenmeier"},{"full_name":"Olivers, Christian N. L.","last_name":"Olivers","first_name":"Christian N. L."},{"full_name":"Scharlau, Ingrid","id":"451","last_name":"Scharlau","orcid":"0000-0003-2364-9489","first_name":"Ingrid"}],"date_created":"2018-12-10T07:06:20Z","volume":38,"year":"2012","citation":{"ieee":"F. Hilkenmeier, C. N. L. Olivers, and I. Scharlau, “Prior entry and temporal attention: Cueing affects order errors in RSVP.,” <i>Journal of Experimental Psychology: Human Perception and Performance</i>, vol. 38, no. 1, pp. 180–190, 2012.","chicago":"Hilkenmeier, Frederic, Christian N. L. Olivers, and Ingrid Scharlau. “Prior Entry and Temporal Attention: Cueing Affects Order Errors in RSVP.” <i>Journal of Experimental Psychology: Human Perception and Performance</i> 38, no. 1 (2012): 180–90.","apa":"Hilkenmeier, F., Olivers, C. N. L., &#38; Scharlau, I. (2012). Prior entry and temporal attention: Cueing affects order errors in RSVP. <i>Journal of Experimental Psychology: Human Perception and Performance</i>, <i>38</i>(1), 180–190.","ama":"Hilkenmeier F, Olivers CNL, Scharlau I. Prior entry and temporal attention: Cueing affects order errors in RSVP. 
<i>Journal of Experimental Psychology: Human Perception and Performance</i>. 2012;38(1):180-190.","short":"F. Hilkenmeier, C.N.L. Olivers, I. Scharlau, Journal of Experimental Psychology: Human Perception and Performance 38 (2012) 180–190.","bibtex":"@article{Hilkenmeier_Olivers_Scharlau_2012, title={Prior entry and temporal attention: Cueing affects order errors in RSVP.}, volume={38}, number={1}, journal={Journal of Experimental Psychology: Human Perception and Performance}, author={Hilkenmeier, Frederic and Olivers, Christian N. L. and Scharlau, Ingrid}, year={2012}, pages={180–190} }","mla":"Hilkenmeier, Frederic, et al. “Prior Entry and Temporal Attention: Cueing Affects Order Errors in RSVP.” <i>Journal of Experimental Psychology: Human Perception and Performance</i>, vol. 38, no. 1, 2012, pp. 180–90."},"page":"180 - 190","intvolume":"        38","publication_status":"published","publication_identifier":{"issn":["0096-1523"]},"issue":"1"},{"issue":"1","publication_identifier":{"issn":["0001-6918"]},"publication_status":"published","intvolume":"       139","page":"54 - 64","citation":{"chicago":"Weiß, Katharina, and Ingrid Scharlau. “At the Mercy of Prior Entry: Prior Entry Induced by Invisible Primes Is Not Susceptible to Current Intentions.” <i>Acta Psychologica</i> 139, no. 1 (2012): 54–64.","ieee":"K. Weiß and I. Scharlau, “At the mercy of prior entry: Prior entry induced by invisible primes is not susceptible to current intentions.,” <i>Acta Psychologica</i>, vol. 139, no. 1, pp. 54–64, 2012.","ama":"Weiß K, Scharlau I. At the mercy of prior entry: Prior entry induced by invisible primes is not susceptible to current intentions. <i>Acta Psychologica</i>. 2012;139(1):54-64.","apa":"Weiß, K., &#38; Scharlau, I. (2012). At the mercy of prior entry: Prior entry induced by invisible primes is not susceptible to current intentions. 
<i>Acta Psychologica</i>, <i>139</i>(1), 54–64.","bibtex":"@article{Weiß_Scharlau_2012, title={At the mercy of prior entry: Prior entry induced by invisible primes is not susceptible to current intentions.}, volume={139}, number={1}, journal={Acta Psychologica}, author={Weiß, Katharina and Scharlau, Ingrid}, year={2012}, pages={54–64} }","mla":"Weiß, Katharina, and Ingrid Scharlau. “At the Mercy of Prior Entry: Prior Entry Induced by Invisible Primes Is Not Susceptible to Current Intentions.” <i>Acta Psychologica</i>, vol. 139, no. 1, 2012, pp. 54–64.","short":"K. Weiß, I. Scharlau, Acta Psychologica 139 (2012) 54–64."},"year":"2012","volume":139,"author":[{"first_name":"Katharina","last_name":"Weiß","full_name":"Weiß, Katharina"},{"first_name":"Ingrid","full_name":"Scharlau, Ingrid","id":"451","last_name":"Scharlau","orcid":"0000-0003-2364-9489"}],"date_created":"2018-12-10T07:01:19Z","date_updated":"2022-06-06T16:41:22Z","title":"At the mercy of prior entry: Prior entry induced by invisible primes is not susceptible to current intentions.","publication":"Acta Psychologica","type":"journal_article","status":"public","abstract":[{"lang":"eng","text":"If one of two events is attended to, it will be perceived earlier than a simultaneously occurring unattended event. For 150 years, this effect has been ascribed to the facilitating influence of attention, also known as prior entry. Yet, the attentional origin of prior-entry effects has been repeatedly doubted. One criticism is that prior-entry effects might be due to biased decision processes that would mimic a temporal advantage for attended stimuli. Although most obvious biases have already been excluded experimentally (e.g. judgment criteria, response compatibility) and prior-entry effects have been shown to persist (Shore, Spence, & Klein, 2001), many other biases are conceivable, which makes it difficult to put the debate to an end. 
Thus, we approach this problem the other way around by asking whether prior-entry effects can be biased voluntarily. Observers were informed about prior entry and instructed to reduce it as far as possible. For this aim they received continuous feedback"}],"department":[{"_id":"424"}],"user_id":"42165","_id":"6064","language":[{"iso":"eng"}],"funded_apc":"1","keyword":["intentions","events","attention","decision processes","Adult","Attention","Choice Behavior","Cues","Female","Humans","Intention","Judgment","Male","Middle Aged","Reaction Time","Time Perception","Visual Perception","Attention","Decision Making","Experiences (Events)","Intention"]},{"publication_status":"published","publication_identifier":{"issn":["1943-3921"]},"issue":"2","year":"2012","citation":{"ama":"Priess H-W, Scharlau I, Becker SI, Ansorge U. Spatial mislocalization as a consequence of sequential coding of stimuli. <i>Attention, Perception, &#38; Psychophysics</i>. 2012;74(2):365-378.","chicago":"Priess, Heinz-Werner, Ingrid Scharlau, Stefanie I. Becker, and Ulrich Ansorge. “Spatial Mislocalization as a Consequence of Sequential Coding of Stimuli.” <i>Attention, Perception, &#38; Psychophysics</i> 74, no. 2 (2012): 365–78.","ieee":"H.-W. Priess, I. Scharlau, S. I. Becker, and U. Ansorge, “Spatial mislocalization as a consequence of sequential coding of stimuli.,” <i>Attention, Perception, &#38; Psychophysics</i>, vol. 74, no. 2, pp. 365–378, 2012.","short":"H.-W. Priess, I. Scharlau, S.I. Becker, U. Ansorge, Attention, Perception, &#38; Psychophysics 74 (2012) 365–378.","mla":"Priess, Heinz-Werner, et al. “Spatial Mislocalization as a Consequence of Sequential Coding of Stimuli.” <i>Attention, Perception, &#38; Psychophysics</i>, vol. 74, no. 2, 2012, pp. 
365–78.","bibtex":"@article{Priess_Scharlau_Becker_Ansorge_2012, title={Spatial mislocalization as a consequence of sequential coding of stimuli.}, volume={74}, number={2}, journal={Attention, Perception, &#38; Psychophysics}, author={Priess, Heinz-Werner and Scharlau, Ingrid and Becker, Stefanie I. and Ansorge, Ulrich}, year={2012}, pages={365–378} }","apa":"Priess, H.-W., Scharlau, I., Becker, S. I., &#38; Ansorge, U. (2012). Spatial mislocalization as a consequence of sequential coding of stimuli. <i>Attention, Perception, &#38; Psychophysics</i>, <i>74</i>(2), 365–378."},"page":"365 - 378","intvolume":"        74","date_updated":"2022-06-06T16:38:04Z","author":[{"last_name":"Priess","full_name":"Priess, Heinz-Werner","first_name":"Heinz-Werner"},{"first_name":"Ingrid","last_name":"Scharlau","orcid":"0000-0003-2364-9489","id":"451","full_name":"Scharlau, Ingrid"},{"first_name":"Stefanie I.","full_name":"Becker, Stefanie I.","last_name":"Becker"},{"first_name":"Ulrich","full_name":"Ansorge, Ulrich","last_name":"Ansorge"}],"date_created":"2018-12-10T07:07:08Z","volume":74,"title":"Spatial mislocalization as a consequence of sequential coding of stimuli.","type":"journal_article","publication":"Attention, Perception, & Psychophysics","abstract":[{"text":"In three experiments, we tested whether sequentially coding two visual stimuli can create a spatial misperception of a visual moving stimulus. In Experiment 1, we showed that a spatial misperception, the flash-lag effect, is accompanied by a similar temporal misperception of first perceiving the flash and only then a change of the moving stimulus, when in fact the two events were exactly simultaneous. In Experiment 2, we demonstrated that when the spatial misperception of a flash-lag effect is absent, the temporal misperception is also absent. 
In Experiment 3, we extended these findings and showed that if the stimulus conditions require coding first a flash and subsequently a nearby moving stimulus, a spatial flash-lag effect is found, with the position of the moving stimulus being misperceived as shifted in the direction of its motion, whereas this spatial misperception is reversed so that the moving stimulus is misperceived as shifted in a direction opposite to its motion when the c","lang":"eng"}],"status":"public","_id":"6085","user_id":"42165","department":[{"_id":"424"}],"keyword":["spatial mislocalization","sequential coding","stimulus parameters","Attention","Discrimination (Psychology)","Humans","Judgment","Motion Perception","Optical Illusions","Orientation","Pattern Recognition","Visual","Psychophysics","Space Perception","Cognitive Processes","Motion Perception","Perceptual Localization","Spatial Perception","Stimulus Parameters","Consequence"],"funded_apc":"1","language":[{"iso":"eng"}]},{"abstract":[{"lang":"eng","text":"When two targets are presented in rapid succession, the first target (T1) is usually identified, but the second target (T2) is often missed. A remarkable exception to this 'attentional blink' occurs when T2 immediately follows the first T1, at lag 1. It is then often spared but reported in the wrong order—that is, before T1. These order reversals have led to the hypothesis that 'lag 1 sparing' occurs because the two targets merge into a single episodic representation. Here, we report evidence consistent with an alternative theory: T2 receives more attention than T1, leading to prior entry into working memory. Two experiments showed that the more T2 performance exceeded that for T1, the more order reversals were made. Furthermore, precuing T1 led to a shift in performance benefits from T2 to T1 and to an equivalent reduction in order reversals. We conclude that it is not necessary to assume episodic integration to explain lag 1 sparing or the accompanying order reversals. 
"}],"publication":"Attention, Perception, & Psychophysics","language":[{"iso":"eng"}],"keyword":["attentional blink","order reversals","prior entry","working memory","visual attention","attentional performance","Adolescent","Adult","Attention","Attentional Blink","Color Perception","Cues","Discrimination (Psychology)","Female","Humans","Male","Memory","Short-Term","Pattern Recognition","Visual","Psychophysics","Reaction Time","Reversal Learning","Sensory Gating","Serial Learning","Young Adult","Eyeblink Reflex","Stimulus Change","Stimulus Parameters","Visual Attention","Attentional Blink","Short Term Memory"],"year":"2011","issue":"1","title":"Prior entry explains order reversals in the attentional blink.","date_created":"2018-12-10T07:06:31Z","status":"public","type":"journal_article","funded_apc":"1","user_id":"42165","department":[{"_id":"424"}],"_id":"6082","citation":{"short":"C.N.L. Olivers, F. Hilkenmeier, I. Scharlau, Attention, Perception, &#38; Psychophysics 73 (2011) 53–67.","mla":"Olivers, Christian N. L., et al. “Prior Entry Explains Order Reversals in the Attentional Blink.” <i>Attention, Perception, &#38; Psychophysics</i>, vol. 73, no. 1, 2011, pp. 53–67.","bibtex":"@article{Olivers_Hilkenmeier_Scharlau_2011, title={Prior entry explains order reversals in the attentional blink.}, volume={73}, number={1}, journal={Attention, Perception, &#38; Psychophysics}, author={Olivers, Christian N. L. and Hilkenmeier, Frederic and Scharlau, Ingrid}, year={2011}, pages={53–67} }","apa":"Olivers, C. N. L., Hilkenmeier, F., &#38; Scharlau, I. (2011). Prior entry explains order reversals in the attentional blink. <i>Attention, Perception, &#38; Psychophysics</i>, <i>73</i>(1), 53–67.","chicago":"Olivers, Christian N. L., Frederic Hilkenmeier, and Ingrid Scharlau. “Prior Entry Explains Order Reversals in the Attentional Blink.” <i>Attention, Perception, &#38; Psychophysics</i> 73, no. 1 (2011): 53–67.","ieee":"C. N. L. Olivers, F. Hilkenmeier, and I. 
Scharlau, “Prior entry explains order reversals in the attentional blink.,” <i>Attention, Perception, &#38; Psychophysics</i>, vol. 73, no. 1, pp. 53–67, 2011.","ama":"Olivers CNL, Hilkenmeier F, Scharlau I. Prior entry explains order reversals in the attentional blink. <i>Attention, Perception, &#38; Psychophysics</i>. 2011;73(1):53-67."},"intvolume":"        73","page":"53 - 67","publication_status":"published","publication_identifier":{"issn":["1943-3921"]},"main_file_link":[{"url":"https://kw.uni-paderborn.de/fileadmin/fakultaet/Institute/psychologie/Kognitive_Psychologie/Publikationen/Olivers_etal__2011__AP_PProofs.pdf","open_access":"1"}],"author":[{"full_name":"Olivers, Christian N. L.","last_name":"Olivers","first_name":"Christian N. L."},{"first_name":"Frederic","full_name":"Hilkenmeier, Frederic","last_name":"Hilkenmeier"},{"id":"451","full_name":"Scharlau, Ingrid","orcid":"0000-0003-2364-9489","last_name":"Scharlau","first_name":"Ingrid"}],"volume":73,"date_updated":"2022-06-07T00:16:50Z","oa":"1"},{"issue":"2","year":"2011","date_created":"2018-12-10T07:06:56Z","title":"Simultaneity and temporal order perception: Different sides of the same coin? Evidence from a visual prior-entry study.","publication":"The Quarterly Journal of Experimental Psychology","abstract":[{"text":"Attended stimuli are perceived as occurring earlier than unattended stimuli. This phenomenon of prior entry is usually identified by a shift in the point of subjective simultaneity (PSS) in temporal order judgements (TOJs). According to its traditional psychophysical interpretation, the PSS coincides with the perception of simultaneity. This assumption is, however, questionable. Technically, the PSS represents the temporal interval between two stimuli at which the two alternative TOJs are equally likely. Thus it also seems possible that observers perceive not simultaneity, but uncertainty of temporal order. 
This possibility is supported by prior-entry studies, which find that perception of simultaneity is not very likely at the PSS. The present study tested the percept at the PSS in prior entry, using peripheral cues to orient attention. We found that manipulating attention caused varying temporal perceptions around the PSS. On some occasions observers perceived the two stimuli as sim","lang":"eng"}],"keyword":["temporal order perception","simultaneity","temporal order judgment","attention","visual perception","Adolescent","Adult","Attention","Cues","Discrimination (Psychology)","Female","Humans","Judgment","Male","Models","Psychological","Photic Stimulation","Reaction Time","Time Factors","Uncertainty","Visual Perception","Young Adult","Attention","Judgment","Stimulus Similarity","Time Perception","Visual Discrimination","Temporal Order (Judgment)"],"language":[{"iso":"eng"}],"publication_status":"published","publication_identifier":{"issn":["1747-0218"]},"citation":{"mla":"Weiß, Katharina, and Ingrid Scharlau. “Simultaneity and Temporal Order Perception: Different Sides of the Same Coin? Evidence from a Visual Prior-Entry Study.” <i>The Quarterly Journal of Experimental Psychology</i>, vol. 64, no. 2, 2011, pp. 394–416.","short":"K. Weiß, I. Scharlau, The Quarterly Journal of Experimental Psychology 64 (2011) 394–416.","bibtex":"@article{Weiß_Scharlau_2011, title={Simultaneity and temporal order perception: Different sides of the same coin? Evidence from a visual prior-entry study.}, volume={64}, number={2}, journal={The Quarterly Journal of Experimental Psychology}, author={Weiß, Katharina and Scharlau, Ingrid}, year={2011}, pages={394–416} }","ama":"Weiß K, Scharlau I. Simultaneity and temporal order perception: Different sides of the same coin? Evidence from a visual prior-entry study. <i>The Quarterly Journal of Experimental Psychology</i>. 2011;64(2):394-416.","apa":"Weiß, K., &#38; Scharlau, I. (2011). 
Simultaneity and temporal order perception: Different sides of the same coin? Evidence from a visual prior-entry study. <i>The Quarterly Journal of Experimental Psychology</i>, <i>64</i>(2), 394–416.","ieee":"K. Weiß and I. Scharlau, “Simultaneity and temporal order perception: Different sides of the same coin? Evidence from a visual prior-entry study.,” <i>The Quarterly Journal of Experimental Psychology</i>, vol. 64, no. 2, pp. 394–416, 2011.","chicago":"Weiß, Katharina, and Ingrid Scharlau. “Simultaneity and Temporal Order Perception: Different Sides of the Same Coin? Evidence from a Visual Prior-Entry Study.” <i>The Quarterly Journal of Experimental Psychology</i> 64, no. 2 (2011): 394–416."},"intvolume":"        64","page":"394 - 416","oa":"1","date_updated":"2022-06-07T00:17:26Z","author":[{"first_name":"Katharina","full_name":"Weiß, Katharina","last_name":"Weiß"},{"full_name":"Scharlau, Ingrid","id":"451","orcid":"0000-0003-2364-9489","last_name":"Scharlau","first_name":"Ingrid"}],"volume":64,"main_file_link":[{"url":"https://kw.uni-paderborn.de/fileadmin/fakultaet/Institute/psychologie/Kognitive_Psychologie/Publikationen/WeissScharlau2010.pdf","open_access":"1"}],"type":"journal_article","status":"public","_id":"6084","user_id":"42165","department":[{"_id":"424"}],"funded_apc":"1"},{"user_id":"42165","department":[{"_id":"424"}],"_id":"6090","language":[{"iso":"eng"}],"keyword":["visual selection","attention","information","visual field","brain","Attention","Humans","Models","Psychological","Visual Perception","Volition","Brain","Visual Field","Visual Perception","Visual Attention","Information"],"type":"journal_article","publication":"Acta Psychologica","status":"public","abstract":[{"lang":"eng","text":"Comments on an article by Jan Theeuwes (see record [rid]2010-20897-002[/rid]). Theeuwes summarizes an impressive number of studies demonstrating interference by irrelevant visual singletons in computer experiments with humans. 
Theeuwes assumes that this salience-driven capture of attention is fast and occurs within 150 ms since singleton onset, during the feed-forward phase of visual processing. In contrast to Theeuwes, we think that top–down contingent capture is the rule and explains initial and fast attention capture effects in the first feed-forward phase of visual processing. During a later phase and under some conditions exogenous capture of attention possibly follows. At the same time, we propose that the evidence presented by Theeuwes fails to support exogenous orienting because it fails to exclude a top–down contingent capture explanation. We present our arguments in two sections. One major source of evidence for top–down controlled attentional capture during the feed-forward"}],"date_created":"2018-12-10T07:08:08Z","author":[{"last_name":"Ansorge","full_name":"Ansorge, Ulrich","first_name":"Ulrich"},{"first_name":"Gernot","last_name":"Horstmann","full_name":"Horstmann, Gernot"},{"last_name":"Scharlau","orcid":"0000-0003-2364-9489","id":"451","full_name":"Scharlau, Ingrid","first_name":"Ingrid"}],"volume":135,"date_updated":"2022-06-07T00:17:51Z","oa":"1","main_file_link":[{"url":"https://kw.uni-paderborn.de/fileadmin/fakultaet/Institute/psychologie/Kognitive_Psychologie/Publikationen/AHSActa2011.pdf","open_access":"1"}],"title":"Top–down contingent attentional capture during feed-forward visual processing.","issue":"2","publication_status":"published","publication_identifier":{"issn":["0001-6918"]},"citation":{"bibtex":"@article{Ansorge_Horstmann_Scharlau_2010, title={Top–down contingent attentional capture during feed-forward visual processing.}, volume={135}, number={2}, journal={Acta Psychologica}, author={Ansorge, Ulrich and Horstmann, Gernot and Scharlau, Ingrid}, year={2010}, pages={123–126} }","short":"U. Ansorge, G. Horstmann, I. Scharlau, Acta Psychologica 135 (2010) 123–126.","mla":"Ansorge, Ulrich, et al. 
“Top–down Contingent Attentional Capture during Feed-Forward Visual Processing.” <i>Acta Psychologica</i>, vol. 135, no. 2, 2010, pp. 123–26.","apa":"Ansorge, U., Horstmann, G., &#38; Scharlau, I. (2010). Top–down contingent attentional capture during feed-forward visual processing. <i>Acta Psychologica</i>, <i>135</i>(2), 123–126.","ama":"Ansorge U, Horstmann G, Scharlau I. Top–down contingent attentional capture during feed-forward visual processing. <i>Acta Psychologica</i>. 2010;135(2):123-126.","chicago":"Ansorge, Ulrich, Gernot Horstmann, and Ingrid Scharlau. “Top–down Contingent Attentional Capture during Feed-Forward Visual Processing.” <i>Acta Psychologica</i> 135, no. 2 (2010): 123–26.","ieee":"U. Ansorge, G. Horstmann, and I. Scharlau, “Top–down contingent attentional capture during feed-forward visual processing.,” <i>Acta Psychologica</i>, vol. 135, no. 2, pp. 123–126, 2010."},"page":"123 - 126","intvolume":"       135","year":"2010"},{"status":"public","type":"journal_article","funded_apc":"1","_id":"6083","department":[{"_id":"424"}],"user_id":"42165","intvolume":"        22","page":"1222 - 1234","citation":{"short":"F. Hilkenmeier, I. Scharlau, European Journal of Cognitive Psychology 22 (2010) 1222–1234.","bibtex":"@article{Hilkenmeier_Scharlau_2010, title={Rapid allocation of temporal attention in the attentional blink paradigm.}, volume={22}, number={8}, journal={European Journal of Cognitive Psychology}, author={Hilkenmeier, Frederic and Scharlau, Ingrid}, year={2010}, pages={1222–1234} }","mla":"Hilkenmeier, Frederic, and Ingrid Scharlau. “Rapid Allocation of Temporal Attention in the Attentional Blink Paradigm.” <i>European Journal of Cognitive Psychology</i>, vol. 22, no. 8, 2010, pp. 1222–34.","apa":"Hilkenmeier, F., &#38; Scharlau, I. (2010). Rapid allocation of temporal attention in the attentional blink paradigm. 
<i>European Journal of Cognitive Psychology</i>, <i>22</i>(8), 1222–1234.","chicago":"Hilkenmeier, Frederic, and Ingrid Scharlau. “Rapid Allocation of Temporal Attention in the Attentional Blink Paradigm.” <i>European Journal of Cognitive Psychology</i> 22, no. 8 (2010): 1222–34.","ieee":"F. Hilkenmeier and I. Scharlau, “Rapid allocation of temporal attention in the attentional blink paradigm.,” <i>European Journal of Cognitive Psychology</i>, vol. 22, no. 8, pp. 1222–1234, 2010.","ama":"Hilkenmeier F, Scharlau I. Rapid allocation of temporal attention in the attentional blink paradigm. <i>European Journal of Cognitive Psychology</i>. 2010;22(8):1222-1234."},"publication_identifier":{"issn":["0954-1446"]},"publication_status":"published","main_file_link":[{"open_access":"1","url":"https://kw.uni-paderborn.de/fileadmin/fakultaet/Institute/psychologie/Kognitive_Psychologie/Publikationen/HilkenmeierScharlau2010.pdf"}],"oa":"1","date_updated":"2022-06-07T00:18:16Z","volume":22,"author":[{"first_name":"Frederic","last_name":"Hilkenmeier","full_name":"Hilkenmeier, Frederic"},{"orcid":"0000-0003-2364-9489","last_name":"Scharlau","id":"451","full_name":"Scharlau, Ingrid","first_name":"Ingrid"}],"abstract":[{"lang":"eng","text":"How fast can information of a first target (T1) in a rapid serial visual presentation be used for top-down allocation of attention in time? A valid cue about the temporal position of a second target (T2) was integrated into T1. The data show that 100 ms after T1 onset, T2 was identified better than without cue, raising the conditional T2 performance. T1 apparently triggers a facilitative effect of attention, known from other paradigms such as peripheral cueing. 
"}],"publication":"European Journal of Cognitive Psychology","keyword":["temporal attention","attentional blink paradigm","first target information","top-down allocation","rapid serial visual presentation","Stimulus Presentation Methods","Visual Stimulation","Visual Attention"],"language":[{"iso":"eng"}],"year":"2010","issue":"8","title":"Rapid allocation of temporal attention in the attentional blink paradigm.","date_created":"2018-12-10T07:06:43Z"},{"title":"Early Top-Down Influences in Control of Attention: Evidence from the Attentional Blink","main_file_link":[{"open_access":"1","url":"https://kw.uni-paderborn.de/fileadmin/fakultaet/Institute/psychologie/Kognitive_Psychologie/Publikationen/KI09_Hilkenmeier_TD_AB.pdf"}],"oa":"1","date_updated":"2022-06-07T00:18:37Z","date_created":"2021-12-15T13:09:25Z","author":[{"full_name":"Hilkenmeier, Frederic","last_name":"Hilkenmeier","first_name":"Frederic"},{"first_name":"Jan","full_name":"Tünnermann, Jan","last_name":"Tünnermann"},{"first_name":"Ingrid","orcid":"0000-0003-2364-9489","last_name":"Scharlau","full_name":"Scharlau, Ingrid","id":"451"}],"year":"2009","citation":{"apa":"Hilkenmeier, F., Tünnermann, J., &#38; Scharlau, I. (2009). Early Top-Down Influences in Control of Attention: Evidence from the Attentional Blink. <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>","mla":"Hilkenmeier, Frederic, et al. “Early Top-Down Influences in Control of Attention: Evidence from the Attentional Blink.” <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>, 2009.","short":"F. Hilkenmeier, J. Tünnermann, I. Scharlau, KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence. 
(2009).","bibtex":"@article{Hilkenmeier_Tünnermann_Scharlau_2009, title={Early Top-Down Influences in Control of Attention: Evidence from the Attentional Blink}, journal={KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.}, author={Hilkenmeier, Frederic and Tünnermann, Jan and Scharlau, Ingrid}, year={2009} }","ieee":"F. Hilkenmeier, J. Tünnermann, and I. Scharlau, “Early Top-Down Influences in Control of Attention: Evidence from the Attentional Blink,” <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>, 2009.","chicago":"Hilkenmeier, Frederic, Jan Tünnermann, and Ingrid Scharlau. “Early Top-Down Influences in Control of Attention: Evidence from the Attentional Blink.” <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>, 2009.","ama":"Hilkenmeier F, Tünnermann J, Scharlau I. Early Top-Down Influences in Control of Attention: Evidence from the Attentional Blink. <i>KI 2009: Advances in Artificial Intelligence Proceedings of the 32nd Annual Conference on Artificial Intelligence</i>. Published online 2009."},"publication_status":"published","keyword":["visuo-spatial attention","top-down control","task relevance","artificial visual attention","attentional blink"],"language":[{"iso":"eng"}],"funded_apc":"1","_id":"28964","department":[{"_id":"424"}],"user_id":"42165","abstract":[{"lang":"eng","text":"The relevance of top-down information in the deployment of attention has more and more been emphasized in cognitive psychology. We present recent findings about the dynamic of these processes and also demonstrate that task relevance can be adjusted rapidly by incoming bottom-up information. This adjustment substantially increases performance in a subsequent task. 
Implications for artificial visual models are discussed."}],"status":"public","publication":"KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.","type":"journal_article"},{"keyword":["visuo-spatial attention","prior entry","selection for action","selection for perception"],"language":[{"iso":"eng"}],"funded_apc":"1","_id":"28955","user_id":"42165","department":[{"_id":"424"}],"abstract":[{"text":"Attention speeds up information processing. Although this finding has a long history in experimental psychology, it has found less regard in computational models of visual attention. In psychological research, two frameworks explain the function of attention.Selection for perception emphasizes that perception- or consciousness-related processing presupposes selection of relevant information, whereas selection for action emphasizes that action constraints make selection necessary. In the present study, we ask whether or how far attention, as measured by the speed-up of information processing, is based on selection for perception or selection for action. The accelerating effect was primarily based on selection for perception, but there was also a substantial effect of selection for action.","lang":"eng"}],"status":"public","type":"journal_article","publication":"KI 2009: Advances in Artificial Intelligence. 
Proceedings of the 32nd Annual Conference on Artificial Intelligence.","title":"Attention Speeds Up Visual Information Processing: Selection for Perception or Selection for Action?","main_file_link":[{"url":"https://kw.uni-paderborn.de/fileadmin/fakultaet/Institute/psychologie/Kognitive_Psychologie/Publikationen/Attention_speeds_up_visual_information_processing4.pdf"}],"date_updated":"2022-06-07T00:19:30Z","date_created":"2021-12-15T12:53:50Z","author":[{"full_name":"Weiß, Katharina","last_name":"Weiß","first_name":"Katharina"},{"full_name":"Scharlau, Ingrid","id":"451","last_name":"Scharlau","orcid":"0000-0003-2364-9489","first_name":"Ingrid"}],"year":"2009","citation":{"mla":"Weiß, Katharina, and Ingrid Scharlau. “Attention Speeds Up Visual Information Processing: Selection for Perception or Selection for Action?” <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>, 2009.","bibtex":"@article{Weiß_Scharlau_2009, title={Attention Speeds Up Visual Information Processing: Selection for Perception or Selection for Action?}, journal={KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.}, author={Weiß, Katharina and Scharlau, Ingrid}, year={2009} }","short":"K. Weiß, I. Scharlau, KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence. (2009).","apa":"Weiß, K., &#38; Scharlau, I. (2009). Attention Speeds Up Visual Information Processing: Selection for Perception or Selection for Action? <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>","ama":"Weiß K, Scharlau I. Attention Speeds Up Visual Information Processing: Selection for Perception or Selection for Action? <i>KI 2009: Advances in Artificial Intelligence Proceedings of the 32nd Annual Conference on Artificial Intelligence</i>. 
Published online 2009.","ieee":"K. Weiß and I. Scharlau, “Attention Speeds Up Visual Information Processing: Selection for Perception or Selection for Action?,” <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>, 2009.","chicago":"Weiß, Katharina, and Ingrid Scharlau. “Attention Speeds Up Visual Information Processing: Selection for Perception or Selection for Action?” <i>KI 2009: Advances in Artificial Intelligence. Proceedings of the 32nd Annual Conference on Artificial Intelligence.</i>, 2009."}}]
