How Many Mechanisms Are Needed to Analyze Speech? A Connectionist Simulation of Structural Rule Learning in Artificial Language Acquisition
dc.contributor.author | Laakso, Aarre | en_US |
dc.contributor.author | Calvo, Paco | en_US |
dc.date.accessioned | 2011-11-10T15:33:39Z | |
dc.date.available | 2012-11-02T18:56:40Z | en_US |
dc.date.issued | 2011-09 | en_US |
dc.identifier.citation | Laakso, Aarre; Calvo, Paco (2011). "How Many Mechanisms Are Needed to Analyze Speech? A Connectionist Simulation of Structural Rule Learning in Artificial Language Acquisition." Cognitive Science 35(7). <http://hdl.handle.net/2027.42/86908> | en_US |
dc.identifier.issn | 0364-0213 | en_US |
dc.identifier.issn | 1551-6709 | en_US |
dc.identifier.uri | https://hdl.handle.net/2027.42/86908 | |
dc.description.abstract | Some empirical evidence in the artificial language acquisition literature has been taken to suggest that statistical learning mechanisms are insufficient for extracting structural information from an artificial language. According to the more than one mechanism (MOM) hypothesis, at least two mechanisms are required in order to acquire language from speech: (a) a statistical mechanism for speech segmentation; and (b) an additional rule‐following mechanism in order to induce grammatical regularities. In this article, we present a set of neural network studies demonstrating that a single statistical mechanism can mimic the apparent discovery of structural regularities, beyond the segmentation of speech. We argue that our results undermine one argument for the MOM hypothesis. | en_US |
dc.publisher | Blackwell Publishing Ltd | en_US |
dc.publisher | Wiley Periodicals, Inc. | en_US |
dc.subject.other | Artificial Grammar Learning | en_US |
dc.subject.other | Speech Processing | en_US |
dc.subject.other | Language Acquisition | en_US |
dc.subject.other | More Than One Mechanism Hypothesis | en_US |
dc.subject.other | Statistical Learning | en_US |
dc.subject.other | Connectionism | en_US |
dc.title | How Many Mechanisms Are Needed to Analyze Speech? A Connectionist Simulation of Structural Rule Learning in Artificial Language Acquisition | en_US |
dc.type | Article | en_US |
dc.rights.robots | IndexNoFollow | en_US |
dc.subject.hlbsecondlevel | Neurosciences | en_US |
dc.subject.hlbtoplevel | Health Sciences | en_US |
dc.description.peerreviewed | Peer Reviewed | en_US |
dc.contributor.affiliationum | Department of Behavioral Sciences, University of Michigan‐Dearborn | en_US |
dc.contributor.affiliationother | Philosophy Department, University of Murcia | en_US |
dc.description.bitstreamurl | http://deepblue.lib.umich.edu/bitstream/2027.42/86908/1/j.1551-6709.2011.01191.x.pdf | |
dc.identifier.doi | 10.1111/j.1551-6709.2011.01191.x | en_US |
dc.identifier.source | Cognitive Science | en_US |
dc.identifier.citedreference | Altmann, G. T. ( 2002 ). Learning and development in neural networks – the importance of prior experience. Cognition, 85 ( 2 ), B43 – B50. | en_US |
dc.identifier.citedreference | Bishop, C. M. ( 1995 ). Neural networks for pattern recognition. Oxford, England: Oxford University Press. | en_US |
dc.identifier.citedreference | Boden, M., & Wiles, J. ( 2000 ). Context‐free and context‐sensitive dynamics in recurrent neural networks. Connection Science: Journal of Neural Computing, Artificial Intelligence & Cognitive Research, 12 ( 3–4 ), 197 – 210. | en_US |
dc.identifier.citedreference | Bonatti, L. L., Peña, M., Nespor, M., & Mehler, J. ( 2006 ). How to hit Scylla without avoiding Charybdis: Comment on Perruchet, Tyler, Galland, and Peereman (2004). Journal of Experimental Psychology: General, 135 ( 2 ), 314 – 321. | en_US |
dc.identifier.citedreference | Chalup, S. K., & Blair, A. D. ( 2003 ). Incremental training of first order recurrent neural networks to predict a context‐sensitive language. Neural Networks, 16 ( 7 ), 955 – 972. | en_US |
dc.identifier.citedreference | Chomsky, N. ( 1980 ). Rules and representations. Oxford, England: Basil Blackwell. | en_US |
dc.identifier.citedreference | Christiansen, M. H., & Chater, N. ( 1999 ). Connectionist natural language processing: The state of the art. Cognitive Science, 23, 417 – 437. | en_US |
dc.identifier.citedreference | Christiansen, M., & Curtin, S. ( 1999 ). Transfer of learning: Rule acquisition or statistical learning? Trends in Cognitive Sciences, 3 ( 8 ), 289 – 290. | en_US |
dc.identifier.citedreference | Elman, J. L. ( 1990 ). Finding structure in time. Cognitive Science, 14 ( 2 ), 179 – 211. | en_US |
dc.identifier.citedreference | Endress, A. D., & Bonatti, L. L. ( 2007 ). Rapid learning of syllable classes from a perceptually continuous speech stream. Cognition, 105 ( 2 ), 247 – 299. | en_US |
dc.identifier.citedreference | Endress, A. D., & Mehler, J. ( 2009 ). Primitive computations in speech processing. The Quarterly Journal of Experimental Psychology, 62 ( 11 ), 2187 – 2209. | en_US |
dc.identifier.citedreference | Fodor, J. A., & Pylyshyn, Z. W. ( 1988 ). Connectionism and cognitive architecture: A critical analysis. Cognition, 28, 3 – 71. | en_US |
dc.identifier.citedreference | Gerken, L. A., Wilson, R., & Lewis, W. ( 2005 ). Infants can use distributional cues to form syntactic categories. Journal of Child Language, 32, 249 – 268. | en_US |
dc.identifier.citedreference | Gómez, R. L., & Maye, J. ( 2005 ). The developmental trajectory of nonadjacent dependency learning. Infancy, 7 ( 2 ), 183 – 206. | en_US |
dc.identifier.citedreference | Hare, M., Elman, J. L., & Daugherty, K. G. ( 1995 ). Default generalisation in connectionist networks. Language & Cognitive Processes, 10 ( 6 ), 601 – 630. | en_US |
dc.identifier.citedreference | Henson, R. N. A. ( 1998 ). Short‐term memory for serial order: The start‐end model. Cognitive Psychology, 36 ( 2 ), 73 – 137. | en_US |
dc.identifier.citedreference | Laakso, A., & Calvo, P. ( 2008 ). A connectionist simulation of structural rule learning in language acquisition. In B. C. Love, K. McRae, & V. M. Sloutsky (Eds.), Proceedings of the 30th Annual Meeting of the Cognitive Science Society (pp. 709 – 714 ). Austin, TX: Cognitive Science Society. | en_US |
dc.identifier.citedreference | Marcus, G. F. ( 2001 ). The algebraic mind: Integrating connectionism and cognitive science. Cambridge, MA: MIT Press. | en_US |
dc.identifier.citedreference | Marcus, G. F., Vijayan, S., Bandi Rao, S., & Vishton, P. M. ( 1999 ). Rule learning by seven‐month‐old infants. Science, 283, 77 – 80. | en_US |
dc.identifier.citedreference | McClelland, J. L., & Rumelhart, D. E. (Eds.). ( 1986 ). Parallel distributed processing: Explorations in the microstructure of cognition (Vol. 2: Psychological and biological models). Cambridge, MA: MIT Press. | en_US |
dc.identifier.citedreference | Monaghan, P., Chater, N., & Christiansen, M. H. ( 2005 ). The differential role of phonological and distributional cues in grammatical categorisation. Cognition, 96 ( 2 ), 143 – 182. | en_US |
dc.identifier.citedreference | Newport, E. L., & Aslin, R. N. ( 2000 ). Innately constrained learning: Blending old and new approaches to language acquisition. In S. C. Howell, S. A. Fish, & T. Keith‐Lucas (Eds.), BUCLD 24: Proceedings of the 24th Annual Boston University Conference on Language Development (pp. 1 – 21 ). Boston: Cascadilla Press. | en_US |
dc.identifier.citedreference | Newport, E. L., & Aslin, R. N. ( 2004 ). Learning at a distance: I. Statistical learning of non‐adjacent dependencies. Cognitive Psychology, 48 ( 2 ), 127 – 162. | en_US |
dc.identifier.citedreference | Onnis, L., Monaghan, P., Richmond, K., & Chater, N. ( 2005 ). Phonology impacts segmentation in online speech processing. Journal of Memory and Language, 53 ( 2 ), 225 – 237. | en_US |
dc.identifier.citedreference | Pelucchi, B., Hay, J. F., & Saffran, J. R. ( 2009 ). Learning in reverse: Eight‐month‐old infants track backward transitional probabilities. Cognition, 113, 244 – 247. | en_US |
dc.identifier.citedreference | Peña, M., Bonatti, L. L., Nespor, M., & Mehler, J. ( 2002a ). Signal‐driven computations in speech processing. Science, 298 ( 5593 ), 604 – 607. | en_US |
dc.identifier.citedreference | Peña, M., Bonatti, L. L., Nespor, M., & Mehler, J. ( 2002b ). Signal‐driven computations in speech processing (online supplement on materials and methods). Science, 298 ( 5593 ). Available at: http://www.sciencemag.org/cgi/content/full/1072901/DC1 | en_US |
dc.identifier.citedreference | Perruchet, P., & Desaulty, S. ( 2008 ). A role for backward transitional probabilities in word segmentation? Memory & Cognition, 36 ( 7 ), 1299 – 1305. | en_US |
dc.identifier.citedreference | Perruchet, P., Peereman, R., & Tyler, M. D. ( 2006 ). Do we need algebraic‐like computations? A reply to Bonatti, Peña, Nespor, and Mehler (2006). Journal of Experimental Psychology: General, 135 ( 2 ), 322 – 326. | en_US |
dc.identifier.citedreference | Perruchet, P., & Tillmann, B. ( 2010 ). Exploiting multiple sources of information in learning an artificial language: Human data and modeling. Cognitive Science, 34 ( 2 ), 255 – 285. | en_US |
dc.identifier.citedreference | Perruchet, P., Tyler, M. D., Galland, N., & Peereman, R. ( 2004 ). Learning nonadjacent dependencies: No need for algebraic‐like computations. Journal of Experimental Psychology: General, 133 ( 4 ), 573 – 583. | en_US |
dc.identifier.citedreference | Perruchet, P., & Vinter, A. ( 1998 ). PARSER: A model of word segmentation. Journal of Memory and Language, 39 ( 2 ), 246 – 263. | en_US |
dc.identifier.citedreference | Pinker, S., & Ullman, M. T. ( 2002 ). The past and future of the past tense. Trends in Cognitive Sciences, 6 ( 11 ), 456 – 463. | en_US |
dc.identifier.citedreference | Plunkett, K., & Juola, P. ( 1999 ). A connectionist model of English past tense and plural morphology. Cognitive Science, 23 ( 4 ), 463 – 490. | en_US |
dc.identifier.citedreference | Ramscar, M. ( 2002 ). The role of meaning in inflection: Why the past tense does not require a rule. Cognitive Psychology, 45 ( 1 ), 45 – 94. | en_US |
dc.identifier.citedreference | Redington, M., Chater, N., & Finch, S. P. ( 1998 ). Distributional information: A powerful cue for acquiring syntactic categories. Cognitive Science, 22 ( 4 ), 425 – 469. | en_US |
dc.identifier.citedreference | Rodriguez, P. ( 2001 ). Simple recurrent networks learn context‐free and context‐sensitive languages by counting. Neural Computation, 13 ( 9 ), 2093 – 2118. | en_US |
dc.identifier.citedreference | Saffran, J. R., Aslin, R. N., & Newport, E. L. ( 1996 ). Statistical learning by 8‐month‐old infants. Science, 274 ( 5294 ), 1926 – 1928. | en_US |
dc.identifier.citedreference | Saffran, J. R., Newport, E. L., & Aslin, R. N. ( 1996 ). Word segmentation: The role of distributional cues. Journal of Memory & Language, 35 ( 4 ), 606 – 621. | en_US |
dc.identifier.citedreference | Seidenberg, M. S. ( 1997 ). Language acquisition and use: Learning and applying probabilistic constraints. Science, 275 ( 5306 ), 1599 – 1603. | en_US |
dc.identifier.citedreference | Seidenberg, M. S., & Elman, J. L. ( 1999a ). Do infants learn grammar with algebra or statistics? Science, 284, 435 – 436. | en_US |
dc.identifier.citedreference | Seidenberg, M. S., & Elman, J. L. ( 1999b ). Networks are not ‘hidden rules’. Trends in Cognitive Sciences, 3 ( 8 ), 288 – 289. | en_US |
dc.identifier.citedreference | Seidenberg, M. S., MacDonald, M. C., & Saffran, J. R. ( 2002 ). Does grammar start where statistics stop? Science, 298, 553 – 554. | en_US |
dc.identifier.citedreference | Tversky, A. ( 1977 ). Features of similarity. Psychological Review, 84 ( 4 ), 327 – 352. | en_US |
dc.owningcollname | Interdisciplinary and Peer-Reviewed |