Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task.


Bibliographic Details
Main Authors: Maria Vender, Diego Gabriel Krivochen, Arianna Compostella, Beth Phillips, Denis Delfitto, Douglas Saddy
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2020-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0232687
id doaj-a7a2389c7c1b459790f2279a9bd30458
record_format Article
spelling doaj-a7a2389c7c1b459790f2279a9bd30458 2021-03-03T21:47:25Z
Public Library of Science (PLoS). PLoS ONE, ISSN 1932-6203, vol. 15, no. 5, e0232687, 2020-01-01. doi:10.1371/journal.pone.0232687
Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task.
Maria Vender, Diego Gabriel Krivochen, Arianna Compostella, Beth Phillips, Denis Delfitto, Douglas Saddy
collection DOAJ
language English
format Article
sources DOAJ
author Maria Vender
Diego Gabriel Krivochen
Arianna Compostella
Beth Phillips
Denis Delfitto
Douglas Saddy
author_sort Maria Vender
title Disentangling sequential from hierarchical learning in Artificial Grammar Learning: Evidence from a modified Simon Task.
publisher Public Library of Science (PLoS)
series PLoS ONE
issn 1932-6203
publishDate 2020-01-01
description In this paper we probe the interaction between sequential and hierarchical learning by investigating implicit learning in a group of school-aged children. We administered a serial reaction time task, in the form of a modified Simon Task in which the stimuli were organised following the rules of two distinct artificial grammars, specifically Lindenmayer systems: the Fibonacci grammar (Fib) and the Skip grammar (a modification of the former). The choice of grammars is determined by the goal of this study, which is to investigate how sensitivity to structure emerges in the course of exposure to an input whose surface transitional properties (by hypothesis) bootstrap structure. Studies conducted to date have mainly been designed to investigate low-level superficial regularities, learnable in purely statistical terms, whereas hierarchical learning has not yet been effectively investigated. The possibility of directly pinpointing the interplay between sequential and hierarchical learning is instead at the core of our study: we presented children with two grammars, Fib and Skip, which share the same transitional regularities, thus providing identical opportunities for sequential learning, while crucially differing in their hierarchical structure. More particularly, there are specific points in the sequence (k-points) which, despite giving rise to the same transitional regularities in the two grammars, support hierarchical reconstruction in Fib but not in Skip. In our protocol, children were simply asked to perform a traditional Simon Task, and they were completely unaware of the real purpose of the task. Results indicate that sequential learning occurred in both grammars, as shown by the decrease in reaction times throughout the task, while differences were found in sensitivity to k-points: these, we contend, play a role in hierarchical reconstruction in Fib, whereas they are devoid of structural significance in Skip.
In particular, we found that children were faster at k-points in sequences produced by Fib, thus providing an entirely new kind of evidence for the hypothesis that implicit learning involves an early activation of strategies of hierarchical reconstruction, based on a straightforward interplay with the statistically based computation of transitional regularities over sequences of symbols.
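To make the notion of a Lindenmayer system concrete, here is a minimal sketch of a parallel-rewriting expander in Python. It uses the standard Fibonacci-word rules (0 → 01, 1 → 0) purely as an illustration: the exact Fib and Skip rule sets used in the study, and the definition of k-points, are specified in the article itself, and the names `expand` and `FIB_RULES` are ours, not the authors'.

```python
from collections import Counter

# Minimal Lindenmayer-system rewriter. The rules below are the standard
# Fibonacci-word rules (0 -> 01, 1 -> 0), shown only as an illustration;
# the study's actual Fib and Skip grammars are defined in the article.

def expand(axiom: str, rules: dict, generations: int) -> str:
    """Rewrite every symbol in parallel, once per generation."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

FIB_RULES = {"0": "01", "1": "0"}

for n in range(6):
    # String lengths follow the Fibonacci numbers: 1, 2, 3, 5, 8, 13, ...
    print(n, expand("0", FIB_RULES, n))

# Surface transitional regularities: count the bigrams of a long expansion.
# "11" never occurs in the Fibonacci word; distributional constraints of
# this kind are what purely sequential (statistical) learning can pick up.
s = expand("0", FIB_RULES, 12)
print(Counter(s[i:i + 2] for i in range(len(s) - 1)))
```

Because every symbol is rewritten in parallel at each generation, the grammar's hierarchical (derivational) structure is implicit in the surface string, which is what makes it possible to ask whether learners recover structure from transitional regularities alone.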
url https://doi.org/10.1371/journal.pone.0232687