
Age and the critical period hypothesis

In the field of second language acquisition (SLA), how specific aspects of learning a non-native
language (L2) may be affected by when the process begins is referred to as the age factor.
Because of the way age intersects with a range of social, affective, educational, and experiential
variables, clarifying its relationship with learning rate and/or success is a major challenge.

There is a popular belief that children as L2 learners are superior to adults (Scovel 2000), that
is, the younger the learner, the quicker the learning process and the better the outcomes.
Nevertheless, a closer examination of the ways in which age combines with other variables
reveals a more complex picture, with both favourable and unfavourable age-related differences
being associated with early- and late-starting L2 learners (Johnstone 2002).

The critical period hypothesis (CPH) is a particularly relevant case in point. This is the claim
that there is, indeed, an optimal period for language acquisition, ending at puberty. However, in
its original formulation (Lenneberg 1967), evidence for its existence was based on the relearning
of impaired L1 skills, rather than the learning of a second language under normal circumstances.

Furthermore, although the age factor is an uncontroversial research variable extending from birth
to death (Cook 1995), and the CPH is a narrowly focused proposal subject to recurrent debate,
ironically, it is the latter that tends to dominate SLA discussions (García Lecumberri and
Gallardo 2003), resulting in a number of competing conceptualizations. Thus, in the current
literature on the subject (Bialystok 1997; Richards and Schmidt 2002; Abello-Contesse et al.
2006), references can be found to (i) multiple critical periods (each based on a specific language
component, such as age six for L2 phonology), (ii) the non-existence of one or more critical
periods for L2 versus L1 acquisition, (iii) a sensitive yet not critical period, and (iv) a gradual
and continual decline from childhood to adulthood.

It therefore needs to be recognized that there is a marked contrast between the CPH as an issue of
continuing dispute in SLA, on the one hand, and, on the other, the popular view that it is an
invariable law, equally applicable to any L2 acquisition context or situation. In fact, research
indicates that age effects of all kinds depend largely on the actual opportunities for learning
which are available within overall contexts of L2 acquisition and particular learning situations,
notably the extent to which initial exposure is substantial and sustained (Lightbown 2000).

Thus, most classroom-based studies have shown not only a lack of direct correlation between an
earlier start and more successful/rapid L2 development but also a strong tendency for older
children and teenagers to be more efficient learners. For example, in research conducted in the
context of conventional school programmes, Cenoz (2003) and Muñoz (2006) have shown that
learners whose exposure to the L2 began at age 11 consistently displayed higher levels of
proficiency than those for whom it began at 4 or 8. Furthermore, comparable limitations have
been reported for young learners in school settings involving innovative, immersion-type
programmes, where exposure to the target language is significantly increased through subject-
matter teaching in the L2 (Genesee 1992; Abello-Contesse 2006). In sum, as Harley and Wang
(1997) have argued, more mature learners are usually capable of making faster initial progress in
acquiring the grammatical and lexical components of an L2 due to their higher level of cognitive
development and greater analytical abilities.

In terms of language pedagogy, it can therefore be concluded that (i) there is no single "magic age" for L2 learning, (ii) both older and younger learners are able to achieve advanced levels of
proficiency in an L2, and (iii) the general and specific characteristics of the learning environment
are also likely to be variables of equal or greater importance.

Critical period

In developmental psychology and developmental biology, a critical period is a maturational stage in the lifespan of an organism during which the nervous system is especially sensitive to certain environmental stimuli. If, for some reason, the organism does not receive the appropriate stimulus during this "critical period" to learn a given skill or trait, it may be difficult, ultimately less successful, or even impossible, to develop some functions later in life. Functions that are indispensable to an organism's survival, such as vision, are particularly likely to develop during critical periods. The term "critical period" also relates to the ability to acquire a first language: researchers have found that people who pass the "critical period" do not acquire a first language fluently.[1]

Some researchers differentiate between 'critical' and 'sensitive' periods, defining 'sensitive' periods as more extended periods, after which learning is still possible.[2] Other researchers consider these the same phenomenon.[3]

For example, the critical period for the development of a human child's binocular vision is
thought to be between three and eight months, with sensitivity to damage extending up to at least
three years of age. Further critical periods have been identified for the development of hearing[4]
and the vestibular system.[1] There are critical periods during early postnatal development in
which imprinting can occur, such as when a greylag goose becomes attached to a parent figure
within the first 36 hours after hatching. A young chaffinch must hear an adult singing before it
sexually matures, or it never properly learns the highly intricate song.

Confirming the existence of a critical period for a particular ability requires evidence that there is
a point after which the associated behavior is no longer correlated with age, and ability stays at
the same level. Some experimental research into critical periods has involved depriving animals
of stimuli at different stages of development, while other studies have looked at children
deprived of certain experiences due to illness (such as temporary blindness), or social isolation
(such as feral children). Many of the studies investigating a critical period for language
acquisition have focused on deaf children of hearing parents.

Linguistics
Main article: Critical period hypothesis

First language acquisition


The Critical Period Hypothesis (CPH) states that the first few years of life constitute the time
during which language develops readily and after which (sometime between age 5 and puberty)
language acquisition is much more difficult and ultimately less successful.[5] The hypothesis that
language is acquired during a critical period was first proposed by neurologists Wilder Penfield
and Lamar Roberts in 1959 and popularized by linguist Eric H. Lenneberg in 1967. Lenneberg
argued for the hypothesis based on evidence that children who experience brain injury early in
life develop far better language skills than adults with similar injuries.

The two most famous cases of children who failed to acquire language after the critical period
are Genie and the feral child Victor of Aveyron.[6] However, the tragic circumstances of these
cases and the moral and ethical impermissibility of replicating them make it difficult to draw
conclusions about them. The children may have been cognitively disabled from infancy, or their
inability to develop language may have resulted from the profound neglect and abuse they
suffered.[5]

Many subsequent researchers have further developed the CPH, most notably Elissa Newport and
Rachel Mayberry. Studies conducted by these researchers demonstrated that profoundly deaf
individuals who are not exposed to a sign language as children never achieve full proficiency,
even after 30 years of daily use.[7] While the effect is most profound for individuals who receive
no sign language input until after the age of 12, even those deaf people who began learning a
sign language at age 5 were significantly less fluent than native deaf signers (whose exposure to
a sign language began at birth). Early language exposure also affects the ability to learn a second
language later in life: profoundly deaf individuals with early language exposure achieve
comparable levels of proficiency in a second language to hearing individuals with early language
exposure. In contrast, deaf individuals without early language exposure perform far worse.[8]

Steven Pinker discusses the CPH in his book, The Language Instinct. According to Pinker, language must be viewed as a concept rather than as a specific language, because sounds, grammar, meaning, vocabulary, and social norms all play an important role in the acquisition of language.[9]
Physiological changes in the brain are also conceivable causes for the end of the critical period for language acquisition.[10] Just as language acquisition is crucial during this phase, infant–parent attachment is crucial for the infant's social development. An infant learns to trust and feel safe with a parent, but an infant raised in an orphanage may not form the same attachment with a caregiver. Research shows that infants who were unable to develop this attachment had major difficulty maintaining close relationships and exhibited maladaptive behaviors with adoptive parents.[1]

Other evidence comes from neuropsychology, where it is known that adults well beyond the critical period are more likely than children to suffer permanent language impairment from brain damage, a difference believed to reflect the greater resiliency of neural reorganization in youth.[5]

Second language acquisition


The theory[11] has often been extended to a critical period for second language acquisition (SLA), prompting researchers in the field on both sides of the debate, supportive and unsupportive of the CPH, to explore it.[12] However, the nature of this phenomenon has been one of the most fiercely debated issues in psycholinguistics and cognitive science in general for decades.

Certainly, older learners of a second language rarely achieve the native-like fluency that younger
learners display, despite often progressing faster than children in the initial stages. This is
generally accepted as evidence supporting the CPH. Incorporating Penfield's idea that "younger equals better," David Singleton (1995) states that in learning a second language there are many exceptions, noting that five percent of adult bilinguals master a second language even though they begin learning it when they are well into adulthood, long after any critical period has presumably come to a close. The critical period hypothesis holds that first language
acquisition must occur before cerebral lateralization completes, at about the age of puberty. One
prediction of this hypothesis is that second language acquisition is relatively fast, successful, and
qualitatively similar to first language acquisition only if it occurs before the age of puberty.[13] To gain a better understanding of SLA, it is essential to consider linguistic, cognitive, and social factors rather than age alone, since all of them shape the learner's language acquisition.[12]

Vision

In mammals, the neurons in the brain that process vision develop after birth based on
signals from the eyes. A landmark experiment by David H. Hubel and Torsten Wiesel (1963)
showed that cats that had one eye sewn shut from birth to three months of age (monocular
deprivation) only fully developed vision in the open eye. They showed that columns in the
primary visual cortex receiving inputs from the other eye took over the areas that would
normally receive input from the deprived eye. In general, electrophysiological analyses of axons and neurons in the lateral geniculate nucleus showed that visual receptive field properties were comparable to those of adult cats. However, the layers of cortex that were deprived had less activity and
fewer responses were isolated. The kittens had abnormally small ocular dominance columns (part
of the brain that processes sight) connected to the closed eye, and abnormally large, wide
columns connected to the open eye. Because the critical period had elapsed, it was impossible for the kittens to alter and develop vision in the closed eye. This did not happen to
adult cats even when one eye was sewn shut for a year because they had fully developed their
vision during their critical period. Later experiments in monkeys found similar results.[14]

In a follow-up experiment, Hubel and Wiesel (1963) explored the cortical responses present in
kittens after binocular deprivation; they found it difficult to find any active cells in the cortex,
and the responses they did get were either slow-moving or fast-fatiguing. Furthermore, the cells
that did respond selected for edges and bars with distinct orientation preferences. Nevertheless,
these kittens developed normal binocularity. Hubel and Wiesel were the first to explain the mechanism, known as orientation selectivity, in the mammalian visual cortex. Orientation tuning, a model that originated with their work, is a concept in which receptive fields of neurons in the LGN excite a cortical simple cell and are arranged in rows. This model was important because it was
able to describe a critical period for the proper development of normal ocular dominance
columns in the lateral geniculate nucleus, and thus able to explain the effects of monocular
deprivation during this critical period. The critical period for cats is about three months and for
monkeys, about six months.[15]

In a similar experiment, Antonini and Stryker (1993) examined the anatomical changes that can
be observed after monocular deprivation. They compared geniculocortical axonal arbors in
monocularly deprived animals in the long term (4 weeks) and the short term (6–7 days) during the critical period established by Hubel and Wiesel (1993). They found that in the long term,
monocular deprivation causes reduced branching at the end of neurons, while the amount of
afferents allocated to the nondeprived eye increased. Even in the short term, Antonini and
Stryker (1993) found that geniculocortical neurons were similarly affected. This supports the
aforementioned concept of a critical period for proper neural development for vision in the
cortex.[16]

In humans, some babies are born blind in one or both eyes, for example, due to cataracts. Even when their vision is restored later by treatment, their sight does not function as normally as that of someone who had binocular vision from birth or whose vision was surgically restored shortly after birth. Therefore, it is important to treat babies born blind promptly if their condition is treatable.
Expression of the protein Lynx1 has been associated with the normal end of the critical period
for synaptic plasticity in the visual system.[17]

Imprinting

In psychology, imprinting is any type of rapid learning that occurs in a particular life stage. While this rapid learning is independent of the behavioral outcome, it nonetheless establishes it and can affect behavioral responses to different stimuli. Konrad Lorenz is well known for his classic
studies of filial imprinting in greylag geese. From 1935 to 1938, he presented himself to groups of newly hatched goslings and took note of how he was instantly accepted, followed, and called to as if he had hatched them himself. As the first moving object the goslings encountered, Lorenz studied how quickly they were able to form such an irreversible bond. Through his work he demonstrated that this bond developed only during a brief critical period of about a few hours after hatching.[18] Lorenz also discovered a long-lasting effect of his studies: a shift in the species' sexual imprinting as a result of imprinting upon a foster mother of a second species. Certain species, when raised by a second one, develop and retain imprinted preferences, approaching the species that raised them rather than their own, if given a choice.[19]

Imprinting serves as the distinguishing factor between one's own mother and other mother figures. The mother and the infant both identify with each other; this is a strong bonding moment for humans. It provides a sort of model or guide to adult behaviors in addition to other factors
such as nurture, protection in infancy, guidance, and nourishment. The imprinting process,
Lorenz also found, brought about a sense of familiarity for the young animals. When such a
strong bond is formed at such an early stage, it creates a sense of security and comfort for the
subject and actually encourages the imprinting behavior.

Pheromones play a key role in the imprinting process: they trigger a biochemical response in the recipient, leading to a confirmed identification of the other individual. If direct contact between mother and infant is not maintained during the critical imprinting period, the mother goose may reject the infant because she is unfamiliar with her newborn's scent. If that happens, the infant's life is in jeopardy unless it is claimed by a substitute mother, and if it fails to imprint, that would trigger psychological trauma, possibly leading to awkward social behavior in later life.[20] In humans, a newborn during the critical period identifies with its mother's and other people's scents, since smell is one of its most developed senses at that stage of life. The newborn uses this pheromone identification to seek out the people it identifies with in times of distress, hunger, and discomfort, as a survival skill.[21] Inferences could be made
for newborns based upon Lorenz's studies. When imprinting on their mothers, newborns look to
them for nourishment, a sense of security, and comfort. Human newborns are among the most helpless newborns known, and with very little they can do for themselves, the most they can do is form bonds with the people they can depend on to provide the essentials
needed. Imprinting is a crucial factor of the critical period because it facilitates the newborn's
abilities to form bonds with other people, from infancy to adulthood.

Auditory processing
Many studies have supported a correlation between the type of auditory stimuli present in the early postnatal environment and the topographical and structural development of the auditory system.[4]

First reports on critical periods came from deaf children and animals that received a cochlear
implant to restore hearing. Approximately at the same time, both an electroencephalographic
study by Sharma, Dorman and Spahr [22] and an in-vivo investigation of the cortical plasticity in
deaf cats by Kral and colleagues [23] demonstrated that the adaptation to the cochlear implant is
subject to an early, developmental sensitive period. The closure of sensitive periods likely
involves a multitude of processes that in their combination make it difficult to reopen these
behaviorally.[4] The understanding of the mechanisms behind critical periods has consequences
for medical therapy of hearing loss.[24] M. Merzenich and colleagues showed that during an early
critical period, noise exposure can affect the frequency organization of the auditory cortex.[25]

Recent studies have examined the possibility of a critical period for thalamocortical connectivity
in the auditory system. For example, Zhou and Merzenich (2008) studied the effects of noise on
development in the primary auditory cortex in rats. In their study, rats were exposed to pulsed
noise during the critical period and the effect on cortical processing was measured. Rats that
were exposed to pulsed noise during the critical period had cortical neurons that were less able to
respond to repeated stimuli; the early auditory environment interrupted normal structural
organization during development.

In a related study, Barkat, Polley and Hensch (2011) looked at how exposure to different sound
frequencies influences the development of the tonotopic map in the primary auditory cortex and
the ventral medial geniculate body. In this experiment, mice were reared either in normal
environments or in the presence of 7 kHz tones during early postnatal days. They found that mice
that were exposed to an abnormal auditory environment during a critical period (P11–P15) had an
atypical tonotopic map in the primary auditory cortex.[26] These studies support the notion that
exposure to certain sounds within the critical period can influence the development of tonotopic
maps and the response properties of neurons. Critical periods are important for the brain to develop its functions from its pattern of connectivity. In general, the early auditory
environment influences the structural development and response specificity of the primary
auditory cortex.[27]

Musical ability
Absolute pitch manifests itself almost always before adolescence and rarely if ever among
individuals who are first exposed to music after mid-childhood, suggesting that exposure to
music or similar phenomena (e.g., tonal languages) in early to mid-childhood is a necessary
condition for its development or refinement. Studies that ask musicians and non-musicians to
sing or hum well-known popular songs that have definitive recordings (and hence are sung in
standardized keys) show that, on average, participants sing within a semitone of the standardized key but that, outside the small subset of participants with absolute pitch, there is broad variation (the "bell curve" reflecting the degree of approximation to the standard key is broad and flat). These results suggest that almost all humans have some innate aptitude for absolute pitch recognition, though other factors may enhance or limit the level of that aptitude. Taken together with the chronological observations above, the results also suggest that early to mid-childhood exposure to environments whose interpretation depends on
pitch is a developmental "trigger" for whatever aptitude an individual possesses.
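The "within a semitone" measure used in such studies can be computed from frequencies with the standard equal-temperament relation, in which one semitone corresponds to a frequency ratio of 2^(1/12). A minimal sketch (the function name and example frequencies are illustrative, not taken from the studies cited):

```python
import math

def semitone_deviation(sung_hz: float, reference_hz: float) -> float:
    """Signed distance in equal-tempered semitones between a sung pitch
    and the reference pitch of the standardized key."""
    return 12 * math.log2(sung_hz / reference_hz)

# A4 is 440 Hz; singing at about 466.16 Hz (B-flat 4) is one semitone sharp.
print(round(semitone_deviation(466.16, 440.0), 2))  # → 1.0
```

A participant whose sung pitch yields a deviation with absolute value below 1.0 would fall inside the "within a semitone" band described above.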

Vestibular system
In the vestibular system, neurons are undeveloped at neuronal birth and mature during the critical period of the first 2–3 postnatal weeks. Hence, disruption of maturation during this period can cause changes in normal balance and movement through space. Animals with abnormal vestibular development tend to have irregular motor skills.[28] Studies have consistently shown that animals with genetic vestibular deficiencies during this critical period have altered vestibular phenotypes, most likely as a result of insufficient input from the semicircular canals and dopaminergic abnormalities. Moreover, exposure to abnormal vestibular stimuli during the
critical period is associated with irregular motor development. Children with hypofunctioning
vestibular receptors frequently have delayed motor development. The results of the studies done
on ferrets and rats reinforced the idea that the vestibular system is very important to motor
development during the initial neonatal period. If the vestibular receptors are present during the
initial six months to a year when the infant is learning to sit and stand, then the child may
develop motor control and balance normally.[29]

The vestibulo-ocular reflex (VOR) is a reflex eye movement that stabilizes images on the retina
during head movement. It produces an eye movement in the direction opposite to head
movement, thus preserving the image at the center of the visual field. Studies in fish and amphibians revealed a sensitivity in their VOR: the animals were launched into space flight for 9–10 days, some with developing VORs and others with already developed reflexes. The fish with developing reflexes developed an upward bend in their tails; the altered gravity resulted in a shift of orientation. Those whose reflexes had already matured were insensitive to the microgravity exposure.[30]

Memory
Recent studies also support the possibility of a critical period for the development of neurons that
mediate memory processing. Experimental evidence supports the notion that young neurons in the adult dentate gyrus have a critical period (about 1–3 weeks after neuronal birth) during which they are integral to memory formation.[31] Although the exact reasoning behind this observation is uncertain, studies suggest that the functional properties of neurons at this age make them most appropriate for this purpose; these neurons (1) remain hyperactive during the formation of memories; (2) are more excitable; and (3) are more easily depolarized due to GABAergic effects.
It is also possible that hyperplasticity makes the neurons more useful in memory formation. If
these young neurons had more plasticity than adult neurons in the same context, they could be
more influential in smaller numbers.[31] The role of these neurons in the adult dentate gyrus in
memory processing is further supported by the fact that behavioral experiments have shown that
an intact dentate gyrus is integral to hippocampal memory formation.[31] It is speculated that the
dentate gyrus acts as a relay station for information relating to memory storage. The likelihood of a
critical period could change the way we view memory processing because it would ultimately
mean that the collection of neurons present is constantly being replenished as new neurons
replace old ones. If a critical period does indeed exist, it could mean that (1) diverse populations of neurons representing events that occur soon after one another may connect those events temporally in memory formation and processing; or (2) these different populations of neurons may distinguish between similar events, independent of temporal position; or (3) separate populations may mediate the formation of new memories when the same events occur frequently.[31]
