Use this identifier to reference this record: https://hdl.handle.net/1822/82760

Title: Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation
Author(s): Lapenta, Olivia Morgan
Keller, Peter E.
Nozaradan, Sylvie
Varlet, Manuel
Keywords: Frequency tagging
Motor tracking
Multisensory integration
Movement synchronisation
Steady-state evoked potentials
Date: 15-Feb-2023
Publisher: Springer Nature
Journal: Experimental Brain Research
Citation: Lapenta, O. M., Keller, P. E., Nozaradan, S., & Varlet, M. (2023). Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation. Experimental Brain Research. Springer Nature. https://doi.org/10.1007/s00221-023-06569-x
Abstract: Human movement synchronisation with moving objects strongly relies on visual input. However, auditory information also plays an important role, since real environments are intrinsically multimodal. We used electroencephalography (EEG) frequency tagging to investigate the selective neural processing and integration of visual and auditory information during motor tracking, and tested the effects of spatial and temporal congruency between audiovisual modalities. EEG was recorded while participants tracked with their index finger a red flickering dot (rate fV = 15 Hz) oscillating horizontally on a screen. The simultaneous auditory stimulus was modulated in pitch (rate fA = 32 Hz) and lateralised between left and right audio channels to induce the perception of a periodic displacement of the sound source. Audiovisual congruency was manipulated in terms of space in Experiment 1 (no motion, same direction, or opposite direction), and timing in Experiment 2 (no delay, medium delay, or large delay). In both experiments, significant EEG responses were elicited at the fV and fA tagging frequencies. It was also hypothesised that intermodulation products corresponding to the nonlinear integration of visual and auditory stimuli would be elicited at frequencies fV ± fA, due to audiovisual integration, especially in congruent conditions. However, these components were not observed. Moreover, synchronisation and EEG results were not influenced by the congruency manipulations, which invites further exploration of the conditions that may modulate audiovisual processing and the motor tracking of moving objects.
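The logic behind the intermodulation hypothesis can be illustrated with a minimal sketch (not the study's actual analysis pipeline; sampling rate, amplitudes, and the multiplicative interaction term are illustrative assumptions): a signal containing the two tagging frequencies plus a nonlinear (product) interaction shows spectral peaks not only at fV and fA but also at the intermodulation frequencies fV ± fA.

```python
import numpy as np

# Illustrative simulation of EEG frequency tagging (assumed parameters).
fs = 512                       # sampling rate in Hz (assumption)
t = np.arange(0, 4, 1 / fs)    # 4 s of simulated signal
fV, fA = 15.0, 32.0            # visual and auditory tagging rates from the study

visual = np.sin(2 * np.pi * fV * t)
audio = np.sin(2 * np.pi * fA * t)
# A multiplicative term models nonlinear integration: since
# sin(a)*sin(b) = 0.5*[cos(a-b) - cos(a+b)], it creates energy at fA - fV and fA + fV.
signal = visual + audio + 0.5 * visual * audio

spectrum = np.abs(np.fft.rfft(signal)) / len(t)   # single-sided amplitude (x0.5)
freqs = np.fft.rfftfreq(len(t), 1 / fs)           # 0.25 Hz resolution here

# Peaks appear at the tagged rates and at the intermodulation products.
for f in [fV, fA, fA - fV, fA + fV]:
    idx = int(round(f / (freqs[1] - freqs[0])))
    print(f"{f:5.1f} Hz amplitude: {spectrum[idx]:.3f}")
```

In the study, the absence of peaks at fV ± fA (17 Hz and 47 Hz) was taken as a lack of evidence for this kind of nonlinear audiovisual integration under the tested conditions.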
Type: Article
Description: All data are held in a public repository, available in the OSF database (URL access: https://osf.io/2jr48/?view_only=17e3f6f57651418c980832e00d818072).
URI: https://hdl.handle.net/1822/82760
DOI: 10.1007/s00221-023-06569-x
ISSN: 0014-4819
e-ISSN: 1432-1106
Publisher's version: https://link.springer.com/content/pdf/10.1007/s00221-023-06569-x.pdf
Peer reviewed: yes
Access: Open access
Appears in collections: CIPsi - Articles (Papers)

Files in this record:
File | Description | Size | Format
Lapenta_et_al-2023-EBR.pdf | Article PDF | 2.69 MB | Adobe PDF

This work is licensed under a Creative Commons License.
