Pozniak, C., Hemforth, B., & Scheepers, C. (2018). Cross-domain priming from mathematics to relative-clause attachment: A visual-world study in French. Frontiers in Psychology, 9:2056. DOI: 10.3389/fpsyg.2018.02056
Thompson, D., Ferreira, F., & Scheepers, C. (2018). One step at a time: Representational overlap between active voice, be-passive, and get-passive forms in English. Journal of Cognition, 1(1): 35, pp. 1–24, DOI: https://doi.org/10.5334/joc.36.
Jicol, C., Proulx, M. J., Pollick, F. E., & Petrini, K. (2018). Long-term music training modulates the recalibration of audiovisual simultaneity. Experimental Brain Research, 1–12.
Abstract: To overcome differences in physical transmission time and neural processing, the brain adaptively recalibrates the point of simultaneity between auditory and visual signals by adapting to audiovisual asynchronies. Here, we examine whether the prolonged recalibration process of passively sensed visual and auditory signals is affected by naturally occurring multisensory training known to enhance audiovisual perceptual accuracy. Hence, we asked groups of drummers, non-drummer musicians, and non-musicians to judge the audiovisual simultaneity of musical and non-musical audiovisual events, before and after adaptation with two fixed audiovisual asynchronies. We found that the recalibration for the musicians and drummers was in the opposite direction (sound leading vision) to that of non-musicians (vision leading sound), and changed with both increased music training and increased perceptual accuracy (i.e. ability to detect asynchrony). Our findings demonstrate that long-term musical training reshapes the way humans adaptively recalibrate simultaneity between auditory and visual signals.
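The quantity that recalibration shifts in studies like this is the point of subjective simultaneity (PSS). The paper's exact analysis pipeline is not given in the post; as a hedged illustration only, a common approach fits a Gaussian to the proportion of "simultaneous" responses across stimulus-onset asynchronies (SOAs) and takes the fitted peak as the PSS. The data and function names below are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, pss, width):
    """Bell-shaped simultaneity curve; the fitted peak location is the PSS."""
    return amp * np.exp(-((soa - pss) ** 2) / (2 * width ** 2))

# Hypothetical data: SOA in ms (negative = audio leads) vs. proportion
# of "simultaneous" responses. These numbers are made up for illustration.
soas = np.array([-300.0, -200.0, -100.0, 0.0, 100.0, 200.0, 300.0])
p_simultaneous = np.array([0.10, 0.35, 0.80, 0.95, 0.70, 0.30, 0.05])

params, _ = curve_fit(gaussian, soas, p_simultaneous, p0=[1.0, 0.0, 100.0])
amp, pss, width = params
print(f"Estimated PSS: {pss:.1f} ms")
```

A recalibration effect would show up as a shift in the fitted PSS between the pre- and post-adaptation sessions, with the direction of the shift (sound-leading vs. vision-leading) differing between groups as the abstract describes.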
Until 6 July 2018, there is free access to the paper/chapter at:
Pollick, F. E., Vicary, S., Noble, K., Kim, N., Jang, S., & Stevens, C. J. (2018). Exploring collective experience in watching dance through intersubject correlation and functional connectivity of fMRI brain activity. Progress in Brain Research. https://doi.org/10.1016/bs.pbr.2018.03.016
Abstract: How the brain contends with naturalistic viewing conditions when it must cope with concurrent streams of diverse sensory inputs and internally generated thoughts is still largely an open question. In this study, we used fMRI to record brain activity while a group of 18 participants watched an edited dance duet accompanied by a soundtrack. After scanning, participants performed a short behavioral task to identify neural correlates of dance segments that could later be recalled. Intersubject correlation (ISC) analysis was used to identify the brain regions correlated among observers, and the results of this ISC map were used to define a set of regions for subsequent analysis of functional connectivity. The resulting network was found to be composed of eight subnetworks and the significance of these subnetworks is discussed. While most subnetworks could be explained by sensory and motor processes, two subnetworks appeared related more to complex cognition. These results inform our understanding of the neural basis of common experience in watching dance and open new directions for the study of complex cognition.
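At its core, the intersubject correlation (ISC) analysis mentioned above measures how similarly a brain region responds across viewers: for each region, the time series from every pair of subjects is correlated and the pairwise correlations are averaged. This toy sketch with synthetic data (not the paper's actual fMRI pipeline) illustrates the computation:

```python
import numpy as np
from itertools import combinations

def intersubject_correlation(data):
    """Mean pairwise Pearson correlation across subjects.

    data: array of shape (n_subjects, n_timepoints) for a single region.
    """
    pairs = combinations(range(data.shape[0]), 2)
    corrs = [np.corrcoef(data[i], data[j])[0, 1] for i, j in pairs]
    return float(np.mean(corrs))

# Synthetic example: 18 "subjects" (matching the study's sample size)
# sharing a stimulus-driven signal plus independent subject-specific noise.
rng = np.random.default_rng(0)
shared = rng.standard_normal(200)
subjects = shared + 0.5 * rng.standard_normal((18, 200))

isc = intersubject_correlation(subjects)
print(f"ISC = {isc:.2f}")
```

Regions where the stimulus drives a common response across viewers yield high ISC, which is why the ISC map can then seed the functional-connectivity analysis described in the abstract.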
The overarching aim of this PhD project is to develop a library of naturalistic emotional movements generated by expert dancers, and then implement and test the communicative value of these movements in artificial agents in naturalistic social settings. This studentship is richly interdisciplinary in nature, drawing from the social sciences, performing arts and engineering to tackle a major challenge that falls under the remit of the RCUK Digital Economy theme: namely, to improve artificial agents’ social acceptance and usability by providing them with emotionally expressive behaviours that are instantly readable by human interaction partners. This project comprises three main studies, with the first two primarily involving social sciences research (with performing arts elements as well), and the third study combining knowledge generated from the social sciences and performing arts with computing science. For the first third of the project, the student will work closely with the Scottish National Ballet and motion tracking technology to create and validate a rich library of emotions expressed via bodily movement. Next, the student will develop expertise with quantitative and qualitative behavioural methods (including eye tracking and self-report measures of affective valence), as well as working with different participant samples (expert and naïve dancers), to further identify how emotion is expressed via bodily movements, and which elements of a body in motion convey the most meaningful information about a mover’s emotion. The final third of the project applies the insights gained from the first two parts to the computing science and robotics world, implementing them in the movements and behaviour of physically present robots and virtual avatars.
Together, the project provides an ideal and exciting opportunity to train a PhD student who is equipped with the theoretical and technical skills to work between the social sciences, arts, and technology.
ESRC Studentship advertisement
ESRC Collaborative Award: Using Virtual Reality Technology to Explore the Inner Perceptual World of Autism
Autism, a common neuro-developmental condition, affects at least 1% of the UK population. Autism is partly characterized by sensory difficulties, such as over- or under-responsiveness to certain types of lighting and everyday noises, and an almost obsessive desire for particular types of sensory stimulation, known as “sensory seeking” behaviour. To date, most research on sensory aspects of autism has used parent/caregiver reports, combined with a smaller amount of self-report data from those able to speak for themselves and further data from lab-based experiments. Despite the fascinating insights these data provide, however, we have yet to fully appreciate precisely what is going on in the “inner perceptual world” of autism, although it is clear that it differs qualitatively from what typical individuals experience.
Abstract: Understanding the mechanisms and consequences of attributing socialness to artificial agents has important implications for how we can use technology to lead more productive and fulfilling lives. Here, we integrate recent findings on the factors that shape behavioral and brain mechanisms that support social interactions between humans and artificial agents. We review how visual features of an agent, as well as knowledge factors within the human observer, shape attributions across dimensions of socialness. We explore how anthropomorphism and dehumanization further influence how we perceive and interact with artificial agents. Based on these findings, we argue that the cognitive reconstruction within the human observer is likely to be far more crucial in shaping our interactions with artificial agents than previously thought, while the artificial agent’s visual features are possibly of lesser importance. We combine these findings to provide an integrative theoretical account based on the “like me” hypothesis, and discuss the key role played by the Theory‐of‐Mind network, especially the temporal parietal junction, in the shift from mechanistic to social attributions. We conclude by highlighting outstanding questions on the impact of long‐term interactions with artificial agents on the behavioral and brain mechanisms of attributing socialness to these agents.
Frank Pollick will present his new research with Yashar Moshfeghi at the World Wide Web Conference in Lyon. This is part of the ongoing research collaboration joining together cognitive neuroscience and information retrieval/data science approaches to understand search.
Moshfeghi, Y., & Pollick, F. E. (2018, April). Search Process as Transitions Between Neural States. In Proceedings of the 2018 World Wide Web Conference on World Wide Web (pp. 1683-1692). International World Wide Web Conferences Steering Committee.
Search is one of the most performed activities on the World Wide Web. Various conceptual models postulate that the search process can be broken down into distinct emotional and cognitive states of searchers while they engage in a search process. These models significantly contribute to our understanding of the search process. However, they are typically based on self-report measures, such as surveys and questionnaires, and therefore only indirectly monitor the brain activity that supports such a process. With this work, we take one step further and directly measure the brain activity involved in a search process. To do so, we break down a search process into five time periods: a realisation of Information Need, Query Formulation, Query Submission, Relevance Judgment and Satisfaction Judgment. We then investigate the brain activity between these time periods. Using functional Magnetic Resonance Imaging (fMRI), we monitored the brain activity of twenty-four participants during a search process that involved answering questions carefully selected from the TREC-8 and TREC 2001 Q/A Tracks. This novel analysis, which focuses on transitions rather than states, reveals the contrasting brain activity between time periods, enabling the identification of the distinct parts of the search process as the user moves through them. This work therefore provides an important first step in representing the search process based on the transitions between neural states. Discovering more precisely how brain activity relates to different parts of the search process will enable the development of brain-computer interactions that better support search and search interactions, a goal that we believe our study and conclusions advance.
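The abstract's key move is contrasting brain activity between consecutive time periods rather than characterising each period in isolation. As a purely illustrative sketch under assumed data (the paper's actual fMRI analysis is far richer), one could compare mean regional activity across participants for each adjacent pair of periods with a paired t-test:

```python
import numpy as np
from scipy.stats import ttest_rel

periods = ["Information Need", "Query Formulation", "Query Submission",
           "Relevance Judgment", "Satisfaction Judgment"]

def transition_contrasts(activity):
    """Paired t-test on activity between each pair of consecutive periods.

    activity: array of shape (n_participants, n_periods) for one region.
    Returns a list of (t, p) tuples, one per transition.
    """
    return [ttest_rel(activity[:, i + 1], activity[:, i])
            for i in range(activity.shape[1] - 1)]

# Hypothetical per-period mean activity for 24 participants in one region.
rng = np.random.default_rng(1)
activity = rng.standard_normal((24, len(periods))) + np.linspace(0, 1, len(periods))

for (a, b), (t, p) in zip(zip(periods, periods[1:]), transition_contrasts(activity)):
    print(f"{a} -> {b}: t = {t:.2f}, p = {p:.3f}")
```

The five periods yield four transitions, and it is the pattern of significant contrasts across these transitions that lets distinct parts of the search process be told apart.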
The PsyTeachR team was selected for a £1000 College Team Teaching Excellence Award. The team consists of School of Psychology members Heather Cleland Woods, Helena Paterson and Niamh Stack, and cSCAN members Dale Barr, Phil McAleer and Lisa DeBruine. The panel praised the Teaching Reproducible Psychology with R workshop, co-creation of materials with students, and online approaches. You can see some of these materials at the team website.