Neural and Behavioural Insights into Predictive Processing in Action and Perception
Chair(s): Ody, Edward (University of Marburg, Germany), He, Yifei (University of Marburg)
Presenter(s): Mukamel, Roy (Tel Aviv University), Buaron, Batel (Tel Aviv University), Widmann, Andreas (University of Leipzig), Neszmélyi, Bence (University of Wuerzburg), Ody, Edward (University of Marburg), Wang, Peng (University of Greifswald)
Predicting the sensory consequences of actions is fundamental to maintaining meaningful interaction with the outside world, allowing us to select appropriate actions, distinguish self-generated from externally generated sensations, retain a sense of agency, and interact with others. This symposium brings together current research examining predictive processing in action and perception, exploring how motor-based expectations influence neural and behavioural responses to self-generated and socially elicited sensations. We will present a range of perspectives covering EEG, MRI, EMG and behavioural methods.
Roy Mukamel and Batel Buaron (Tel Aviv) will begin by discussing how actions are bound to their sensory outcomes, examining how temporal expectations and predictions contribute to agency and the neural processing of self-generated stimuli. Andreas Widmann (Leipzig) will follow with research on how action intentions modulate early auditory sensory processing, demonstrating that top-down predictions influence prediction error responses to unexpected self-generated sounds. Bence Neszmélyi (Würzburg) will then explore the role of predictability in social and non-social action effects, investigating whether anticipatory representations of social responses emerge in effector systems before an action is performed. Edward Ody (Marburg) will present findings on how motor prediction sharpens the neural representation of action outcomes, revealing that active movement enhances early visual processing independently of prior expectations. Finally, Peng Wang (Greifswald) will discuss studies demonstrating how neural oscillations align with movement frequencies, particularly under conditions requiring visuomotor adaptation, shedding light on how rhythmic cortical activity supports sensory-motor integration.
Together, these diverse contributions provide insights into how motor-based prediction influences perception.
Neural Mechanisms of Motor and Sensory Predictive Signals
Mukamel, Roy; Buaron, Batel
Tel Aviv University, Israel
Performance of goal-directed actions requires integrating motor commands with their expected outcomes and discriminating external sensory events from those evoked by the agent. Voluntary actions have been shown to modulate sensory evoked neural responses relative to responses evoked by identical stimuli from external sources. A prominent theory suggests that outcome predictions are sent from motor to relevant sensory regions and modulate their neural state. However, predictive signals are not unique to actions and can be associated with non-motor sources. In a set of two studies, we examined whether motor and auditory signals predicting a visual outcome share common mechanisms. In an EEG study (n=30), participants learned the coupling between cues (button-press/sound) and ensuing pictures. Visual evoked responses (P100) were smaller in the visuomotor vs. audiovisual condition, even on the first repetition of learning, when no specific visual prediction could yet be formed. Additionally, no interaction between cue type and learning stage was found, suggesting that the effect of experience on P100 amplitude is similar across cue types. We further examined the anatomical distribution of motor and auditory predictive signals using fMRI (n=14). Participants watched visual stimuli preceded by a button-press/sound. We found that activity in visual cortex for identical visual stimuli was sensitive to cue type. Furthermore, button presses, but not sounds, influenced visual cortex even in the absence of visual stimulation. Together, these studies suggest that both motor and sensory predictive cues affect sensory regions, in addition to a global influence unique to actions that is irrespective of the coupled visual outcome.
Action Intention Shapes the Early Sensory Processing of Auditory Action Effects
Widmann, Andreas1; Korka, Betina2; Dercksen, Tjerk T.3; Schröger, Erich1
1Wilhelm Wundt Institute for Psychology, Leipzig University, Germany; 2Zander Labs, Munich, Germany; 3Leibniz Institute for Neurobiology, Magdeburg, Germany
I will summarise a series of recent studies showing that action intention can modulate early auditory sensory processing through top-down predictions in the context of self-generated sounds. Signatures of auditory prediction error (e.g., N1, Mismatch Negativity [MMN]) can be observed in event-related brain potentials (ERPs) in response to auditory input that violates bottom-up established predictions based on sensory regularities. (1) We observed similar prediction errors in response to unexpected action effects that violated top-down intention-based predictions (but not sensory regularities). (2) Moreover, prediction errors in response to violations of sensory regularities were abolished when the violating action effects were consistent with action intention. This suggests that intention-based predictions can override predictions based on sensory regularities. (3) Prediction error responses to unexpected violations of action-effect associations were only observed when participants were asked to produce a sound sequence by pressing buttons, but not when they were asked to produce a sequence of button presses. Thus, the association of an action with a specific action-effect is not sufficient, but action intention is required. (4) Finally, early prediction error responses to the unexpected omission of action effects provide evidence that indeed intention-related top-down predictions are reflected in the observed modulations of early sensory processing of sounds. Taken together, these findings provide a comprehensive picture of the interplay between bottom-up and top-down prediction and the primacy of intention over sensory regularities in action-effect processing.
Motor Prediction Sharpens Early Visual Representations of Action Outcomes Independent of Prior Expectations
Ody, Edward; Kircher, Tilo; Straube, Benjamin; He, Yifei
University of Marburg, Germany
According to forward model theories of motor control, sensory consequences of actions are predicted based on an efference copy of the motor command. Correct predictions result in modulation of action feedback, while incorrect predictions result in prediction errors. An alternative view, based on Bayesian models, suggests that incoming sensory feedback is predicted based on the accumulation of prior evidence. Here, in an EEG study (N = 24), we examined whether motor prediction sharpens the neural representation of action outcomes independently of prior expectations. In separate blocks, participants either actively triggered Gabor patches with a button press or passively observed them. The patches had a 50% probability of being left- or right-oriented, and the presentation order was randomised. To maintain attention, participants were asked to respond to vertical catch trials (4 of 60 per block). We ran a time-resolved decoding analysis by training a classifier (SVM, 5-fold) at each time point to decode the orientation separately for the active and passive conditions. Both conditions showed above-chance decoding shortly after (~100 ms) the onset of the Gabor patch. However, the active condition showed significantly higher decoding accuracy than the passive condition, suggesting a sharper representation of grating orientation in early visual processing. This result demonstrates that forward model motor prediction contributes to sharpening the representation of action outcomes, even when those outcomes cannot be predicted based on the accumulation of prior knowledge.
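The time-resolved decoding approach described in this abstract can be sketched as follows. This is a minimal illustration on synthetic data, assuming NumPy and scikit-learn; the array shapes, the injected class-dependent signal, and the helper name `decode_timecourse` are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 80, 32, 50

# Synthetic EEG epochs: trials x channels x time points
y = rng.integers(0, 2, n_trials)           # left/right orientation labels
X = rng.normal(size=(n_trials, n_channels, n_times))
# Inject a class-dependent signal from time point 20 onward
# (standing in for post-stimulus orientation information)
X[:, :5, 20:] += (y[:, None, None] * 2 - 1) * 1.0

def decode_timecourse(X, y, cv=5):
    """Mean 5-fold cross-validated SVM accuracy at each time point."""
    return np.array([
        cross_val_score(SVC(kernel="linear"), X[:, :, t], y, cv=cv).mean()
        for t in range(X.shape[2])
    ])

acc = decode_timecourse(X, y)
# Accuracy hovers near chance (0.5) before the injected signal
# and rises well above chance afterwards.
```

In the study itself, such a time course would be computed separately for the active and passive conditions and the two accuracy curves compared statistically; comparing curves point-by-point (with correction for multiple comparisons) is the standard way to locate when the representations diverge.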
Aligning Brain Rhythms with Action: A Case For Neural Entrainment During Visuomotor Conflict
Wang, Peng; Limanowski, Jakub
University of Greifswald, Germany
Cortical oscillations have been linked to key processes in sensorimotor integration and motor control, and they have been shown to align with environmental or behavioural rhythms in many cases ("entrainment"). Here, I will present the results of MEG and EEG studies that used virtual reality to investigate the oscillatory correlates of adaptive visuomotor control. In both studies, participants performed a continuous rhythmic hand-target matching task; i.e., via a data glove worn on their unseen real hand (RH), they controlled a virtual hand (VH) to match a visual target oscillation. We manipulated visuomotor congruence by adding delays to the VH movements. In the MEG study, participants focused on either the RH or VH to match a 0.5 Hz target oscillation under delayed or synchronous visual movement feedback. In the EEG study, we varied both the target frequencies (0.3 vs 0.5 Hz) and the visual feedback delays (3/20 vs 1/4 cycle). Across experiments, we observed strong induced low-frequency neural oscillations at the key task frequencies, suggesting neural entrainment to behaviourally relevant rhythms. Furthermore, when participants tracked under delayed visual movement feedback, oscillations in the beta range phase-locked with the task frequencies. Beta power thus did not seem to exclusively encode somatomotor or visual signals, but rather their (nonlinear) integration. Together, these results align with the proposed key role of beta oscillations in behavioural control and suggest their potential interaction with bodily rhythms, especially during adaptation to visuomotor conflicts.
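Entrainment to a task rhythm is commonly quantified as phase consistency across trials at the rhythm's frequency. The sketch below illustrates one standard such metric, inter-trial phase coherence (ITC), on synthetic data using only NumPy; the sampling rate, trial structure, and the use of ITC specifically are illustrative assumptions, not necessarily the analysis used in these studies.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, dur, f_task = 50.0, 20.0, 0.5        # sampling rate (Hz), duration (s), task frequency (Hz)
t = np.arange(0, dur, 1 / fs)
n_trials = 40

# Synthetic trials: a 0.5 Hz component phase-locked to the movement
# rhythm, buried in trial-unique noise
trials = np.sin(2 * np.pi * f_task * t) + rng.normal(0, 1.0, (n_trials, t.size))

def itc(trials, fs, freq):
    """Inter-trial phase coherence at one frequency via the FFT.

    ITC is the length of the mean unit phase vector across trials:
    near 1 when phases align, near 0 for random phases.
    """
    spec = np.fft.rfft(trials, axis=1)
    freqs = np.fft.rfftfreq(trials.shape[1], 1 / fs)
    k = np.argmin(np.abs(freqs - freq))   # nearest frequency bin
    phases = spec[:, k] / np.abs(spec[:, k])
    return np.abs(phases.mean())

# ITC is high at the entrained task frequency and low elsewhere
itc_task = itc(trials, fs, f_task)
itc_ctrl = itc(trials, fs, 2.0)
```

For the beta-band result, the analogous computation would be applied to the low-frequency envelope of beta power rather than to the raw signal, testing whether beta amplitude fluctuations themselves phase-lock to the task rhythm.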