
Language Teaching Research 14(4) 517–539
© The Author(s) 2010
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/1362168810375372
http://ltr.sagepub.com

Seeing eye to eye? The academic writing needs of graduate and undergraduate students from students’ and instructors’ perspectives

Li-Shih Huang
University of Victoria

Abstract
This article reports on findings from a research project designed to assess undergraduate and graduate
students’ language-learning needs in the context of a new academic language support center at a
Canadian university. A total of 432 students of English as an additional language and 93 instructors
responded to the questionnaires, which asked them to provide importance ratings of academic language
skills, to assess their own or their students’ skill status, and to respond to open-ended questions. This
article reports on data collected from the writing section of the study. The findings indicated that there
is much overlap in the skill items identified as ‘very important’ between graduate and undergraduate
students and instructors. Students’ self-assessments and instructors’ assessments of their students
differed dramatically, however. In addition to important pedagogical implications, this study suggests a
need to be cautious when interpreting needs assessment results because what instructors or students
consider an important skill to possess may not be what students need to develop.

Keywords
needs assessment, academic language-learning needs, English for academic purposes, L2 writing

I  Introduction
English for Academic Purposes (EAP) begins ‘with the learner and the situation, whereas
general English begins with the language’ (Hamp-Lyons, 2001, p. 126). This quotation
suggests that understanding student needs is a good starting point for designing an EAP
course. Scant research has focused on multiple language skill domains from both the
instructors’ and students’ perspectives, and none has combined both graduate and under-
graduate students’ and instructors’ perceptions of the importance and the status of stu-
dents’ skills in the four language domains (i.e. speaking, writing, reading, and listening).

Corresponding author:
Li-Shih Huang, Department of Linguistics, University of Victoria, 3800 Finnerty Road, Victoria, BC V8P 5C2, Canada
Email: lshuang@uvic.ca

This article reports on the writing section1 of a project on students’ academic language
needs at the undergraduate and graduate levels from both students’ and instructors’
(including both faculty and sessionals) perspectives across disciplines.

II  Background
1  Needs analysis
In the 1960s, literature- and language-trained instructors of English began to face the
task of teaching students English for the purpose of studying the sciences. This needs
analysis phase did not signal the coming of age of English for specific purposes (ESP)
until the 1980s and onwards, however. According to West (1994), the term ‘needs’ is often considered an
umbrella term, allowing different interpretations. A literature search reveals a wide array of
terms associated with the concepts of needs and needs analysis, for example: objective and
subjective needs (Brindley, 1989); perceived and felt needs (Berwick, 1989); target-situation
analyses or goal-oriented and learning needs (Hutchinson & Waters, 1987); process-oriented
and product-oriented needs (Brindley, 1989); necessities, wants, and lacks (Hutchinson &
Waters, 1987); and the list goes on. Researchers generally concur, however, that collecting
and applying information on learners’ needs is a defining feature of ESP, and within ESP,
EAP (e.g. Johns & Dudley-Evans, 1991; Jordan, 1997; Flowerdew & Peacock, 2001).
Needs analysis has been ‘a key instrument in course design’ in ESP (West, 1994, p. 2).
Given that EAP has emerged from the larger field of ESP, it is not surprising that ‘needs
analysis is fundamental to an EAP approach to course design’ (Hamp-Lyons, 2001,
p. 127). Discussions of both ESP and EAP have emphasized the important role that needs
analysis plays as a basis and a starting point for identifying learners’ language needs and
for curriculum design, text selection, tasks design, and materials development (e.g. Long
& Crookes, 1992; Benesch, 1996).

2  A brief overview of previous research


Since the 1980s, numerous studies have examined English as an additional language
learners’ academic language needs. Earlier needs analyses in EAP focused mainly on
academic literacy skills (e.g. Kroll, 1979; Bridgeman & Carlson, 1984; Horowitz, 1986;
Leki & Carson, 1994, 1997), general language skills (e.g. Ostler, 1980; Geoghegan,
1983), or exclusively on writing (e.g. Horowitz, 1986; Casanave & Hubbard, 1992;
Jenkins et al., 1993) or aural/oral communication (e.g. Ferris & Tagg, 1996a, 1996b; Ferris
1998). Earlier ESP and EAP literature was also enriched by various descriptions of meth-
odologies and research findings related to learners’ needs, including:

• surveys of learners’ background and goals (e.g. Frodesen, 1995; Cumming, 2005);
• consultations with teachers on course requirements (e.g. Johns, 1981; Bridgeman
& Carlson, 1983);
• collections of writing assignments (e.g. Horowitz, 1986; Braine, 1995); and
• observations of the language demands of learning situations (e.g. Jacobson, 1987;
McKenna, 1987).

Previous studies have focused on students’ general and domain-specific needs from
students’ (e.g. Ostler, 1980; Leki & Carson, 1994; Ferris, 1998), instructors’ (Johns,
1981; Casanave & Hubbard, 1992; Jenkins et al., 1993; Ferris & Tagg, 1996a), or both
(e.g. Rosenfeld et al., 2001; Zhu & Flaitz, 2005) perspectives.
One might expect to find extensive needs analysis research at the heart of the current
learner-centred and communicative approaches. However, beyond acknowledgements of the
fundamental role that needs assessment plays in education and training, the development
of a theory of needs analysis (e.g. Berwick, 1989; Brindley, 1989), Munby’s (1978)
comprehensive lists and taxonomies of communicative needs, and various needs analysis
frameworks (e.g. Waters & Vilches, 2001), studies of the academic language needs of
graduate and undergraduate students across disciplines are mostly outdated, and the
well-acknowledged importance of needs assessment warrants a serious, renewed effort to
conduct further studies.
Without such information, instructors, curriculum developers, and materials developers may
have to rely on personal perceptions, experiences, or intuitions about students’ needs when
planning courses (Tarone & Yule, 1989; Barkhuizen, 1998; Spratt, 1999).

3  Institutional context and project background


The current project was undertaken at a medium-sized, comprehensive university in
British Columbia, Canada. Approximately 8% of the university’s 16,500 undergraduates
and 13% of its 2,400 graduate students are students of English as an additional language.
The English Language Proficiency Working Group, formed in fall 2006, is dedicated to
dealing with English language proficiency issues and policies on campus. After creating
a new academic language support center in fall 2007, the university organized a research
project to begin assessing its undergraduate and graduate students’ perceived academic
language needs across disciplines. The results from this project are intended to inform
the initial development of EAP programs and workshops tailored to the university’s
graduate and undergraduate students’ needs.

III  Research questions


The present study examines students’ and instructors’ perceptions of the importance of
skills required for successful course completion in their degree programs, and also
explores their self-assessment of their own or their students’ competency on those skill
items. Specifically, this article reports on the following research questions:

1  Importance of language skills


a  Graduate students’ perspectives:  What academic writing skills do graduate students
across disciplines and in specific disciplines judge as important for satisfactory comple-
tion of the courses they have taken thus far in their programs?

b  Undergraduate students’ perspectives:  What academic writing skills do undergraduate
students across disciplines and in specific disciplines judge as important for satisfactory
completion of the courses they have taken thus far in their programs?

c  Instructors’ perspectives:  What academic writing skills do instructors judge as important
for satisfactory completion of coursework across disciplines and in specific disciplines at
the undergraduate and graduate levels?

d  Instructors vis-à-vis graduate students and undergraduate students:  What overlaps and
differences are there among the skills in the writing domain that instructors and students
judge as important for satisfactory completion of coursework (graduate instructors vs.
graduate students and undergraduate instructors vs. undergraduate students)?

2  Status of language skills


a  Graduate students’ perspective: What academic writing skills do graduate students
identify as needing additional development?

b  Undergraduate students’ perspective:  What academic writing skills do undergraduate
students identify as needing additional development?

c  Instructors’ perspective:  What academic writing skills do instructors who teach under-
graduate and graduate students identify as needing additional development among their
students at the undergraduate and graduate levels?

d  Instructors vis-à-vis graduate students and undergraduate students:  What overlaps and
differences are there among the skills in the writing domain that instructors who teach
undergraduate and graduate students and students at the undergraduate and graduate lev-
els identify as needing additional development?

3  Importance of language skills vis-à-vis status of language skills


What is the relationship between the importance ratings and the skill status ratings in
the writing domain assigned by instructors who teach undergraduate and graduate
students and the ratings assigned by the undergraduate and graduate students’ groups,
respectively?

IV  Methods
1  Data collection
In spring 2008, a separate email message was sent to instructors and students, inviting
them to voluntarily complete an online questionnaire. Of the 610 potential undergraduate
respondents who accessed the questionnaire, 337 completed it, for a completion rate of
55.2%. For the graduate group, of the 139 who accessed the questionnaire, 95 completed
all questions, for a completion rate of 68.3%. Of the 116 potential instructors who
accessed the questionnaire, 64 completed it, for a completion rate of 55.2%.

2  Respondent demographics
Demographic characteristics of each group of respondents are presented in Table 1. A total
of 525 students and instructors voluntarily completed an online survey. Among them,
there were 152 participants from the humanities division, 223 from the social sciences

Table 1  Demographic profile of respondents (N = 525)

N Percentage

Undergraduate (N = 337):
Gender
   Female 245 72.7
   Male   92 27.3
Division
   Humanities   98 29.1
   Social Sciences 137 40.7
   Physical Sciences   51 15.1
   Life Sciences   51 15.1
Year of study
   1   49 14.5
   2   56 16.6
   3   95 28.2
   4 137 40.7
Graduate (N = 95):
Gender
   Female   75 78.9
   Male   20 21.1
Division
   Humanities   24 25.3
   Social Sciences   46 48.4
   Physical Sciences    8   8.4
   Life Sciences   17 17.9
Degree Program
   Masters   67 70.5
   Ph.D.   28 29.5
Instructors (N = 93):
Gender
   Female   53 57.0
   Male   40 43.0
Years of teaching
   1–10   36 38.7
   11–20   13 14.0
   20+   44 47.3
Division
   Humanities   30 36.1
   Social Sciences   40 41.0
   Physical and Life Sciences   23 23.0
Note: English as a first language students are not included in the report. Percentages may not sum to 100 due
to rounding.

division, 72 from the physical sciences division, and 78 from the life sciences division.
The student respondents had various language and cultural backgrounds (Arabic, Cantonese,
Danish, Dutch, Farsi, Flemish, French, German, Hebrew, Hindi, Icelandic, Japanese,
Mandarin, Polish, Portuguese, Russian, Spanish, Taiwanese, and Vietnamese).

3  Survey instrument
The instrument was adapted from Rosenfeld et al.’s (2001) research, which was the first
to assess the reading, writing, speaking, and listening tasks important for academic success
at the undergraduate and graduate levels.2 The instrument in the present study was modi-
fied in the following ways:

• unlike Rosenfeld et al.’s (2001) survey, the survey for this study had three sections,
which addressed: respondent background, four language skills, and open-ended
questions;
• in the four language skills sections, the present study extends Rosenfeld et al.’s
study to include assessments of students’ skills from each respondent group’s
perspective;
• discipline-specific writing tasks, such as the writing of thesis proposals, theses, and
dissertations, and speaking tasks specific to the level of study, such as conference
presentations, were included in the graduate version of the survey, as well as the version
of the survey to be completed by instructors who teach graduate students; and
• the survey’s background section was modified to reflect the university context
where the data were collected.

Excluding the questions in the background and the open-ended questions sections, the
total number of ratable task statements was 45 for the graduate student group’s version
and 43 for the undergraduate student group’s version. The instructor group’s version
included both the undergraduate and graduate sections, and the respondents could choose
to respond to either or both sections. All respondents were requested to rate the impor-
tance of the skill items, from ‘I/They do not need to perform this task’ (‘0’) to ‘extremely
important’ (‘5’), and to assess their own or their students’ skill status with respect to those
skill items from ‘I/They do not know how’ (‘1’) to ‘I am/They are competent at this’ (‘4’).
Open-ended questions included, for example: ‘Would you judge that you are in need of
additional language training in order to develop the skills you need for satisfactory com-
pletion of the courses in your programs? If so, what might they be? (Please be specific).’
A pilot test involving volunteer participants from each respondent group was con-
ducted before administering the full-scale survey, and the instruments also underwent
internal reliability tests. Cronbach’s alpha statistics (Cronbach, 1970) were computed for
each section to assess the questionnaires’ internal consistency reliability. This statistical
method provides the most commonly reported reliability estimate in the language
assessment literature, and it provides a sound lower-bound estimate of the reliability of a
set of questionnaire or test results. The results showed alpha values ranging from 0.74 to
0.94 for each section of the questionnaires and from 0.85 to 0.94 for the writing section
completed by each respondent group (Table 2). These figures indicate that the questionnaires
have satisfactory reliability (i.e. alpha values of 0.7 or greater) (Dörnyei, 2003).
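As a rough illustration of the reliability statistic used here, Cronbach's alpha can be computed directly from a respondents-by-items matrix of ratings. The sketch below uses an invented five-respondent, three-item matrix, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented importance ratings on a 0-5 scale (5 respondents x 3 items)
ratings = np.array([
    [4, 5, 4],
    [3, 3, 3],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
])
print(round(cronbach_alpha(ratings), 2))  # well above the 0.7 threshold
```

Values of 0.7 or greater, as cited in the text, are conventionally read as satisfactory internal consistency.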

Table 2  Results from the Cronbach’s alpha statistics for the questionnaires’ internal
consistency reliability

Respondent group               Importance of language      Language skill
                               skills section              status section

                               R    W    S    L            R    W    S    L
Undergraduate students 0.84 0.89 0.89 0.89 0.86 0.88 0.90 0.90
Undergraduate instructors 0.81 0.94 0.91 0.87 0.88 0.92 0.89 0.91
Graduate students 0.87 0.90 0.92 0.91 0.90 0.91 0.94 0.92
Graduate instructors 0.74 0.85 0.88 0.90 0.88 0.92 0.91 0.89
Note: R = reading; W = writing; S = speaking; L = listening.

4  Data analyses
Several types of analyses were conducted at multiple levels of aggregation, using SPSS
Version 15.0. Analyses were conducted for each group of respondents (i.e. undergraduate
students, graduate students, and instructors). These group-level analyses were followed
by subgroup analyses that included divisions (i.e. humanities, social sciences, physical
sciences, and life sciences), language skills (i.e. reading, writing, speaking, and listen-
ing), degree programs for graduate-level respondents (i.e. PhD and Master’s), and year
of study for undergraduate-level respondents (i.e. years one to four).
Means, standard deviations, and standard errors were computed for each task state-
ment in the Importance Rating Scale and the Skill Status Rating Scale for all participant
groups. A mean rating of 4.00 (‘very important’) or higher was selected to distinguish the
most important skills from those of lesser importance.3 The cut-off point of 4.00 for the
importance ratings was selected for two main reasons:

• those tasks that were rated as ‘very important’ could be easily identified for con-
sideration in course or workshop development; and
• all standards may be subject to debate, and the argument could be made that
respondents may not perceive the weight of each semantic label and its accompa-
nying integer in the same way, and thus no claim can be made that a mean of 4.00
or higher reflects any kind of ‘absolute’. Still, a mean rating of 4.00 or higher
(‘very important’) provides a firmer basis and clearer reference point for consider-
ing academic language skills’ relatedness than a cut-off point of 3.00 (‘important’)
or higher.

As for the language skill status section, mean ratings below 3.00 indicate areas where
respondents reported needing help in skill development.
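The two cut-offs amount to a simple filter over the per-item means. The sketch below uses invented item names and values, not the study's results, to show how the filters combine:

```python
# Hypothetical mean ratings per skill item (names and values are illustrative).
importance_means = {
    "discipline-specific writing": 4.68,
    "use appropriate transitions": 4.28,
    "write to a particular audience": 3.90,
}
status_means = {
    "discipline-specific writing": 2.87,
    "use appropriate transitions": 3.10,
    "write to a particular audience": 3.40,
}

very_important = {s for s, m in importance_means.items() if m >= 4.00}
needs_development = {s for s, m in status_means.items() if m < 3.00}

# Items both rated 'very important' and self-assessed below 'competent'
priorities = sorted(very_important & needs_development)
print(priorities)
```

Items surviving both filters are the natural candidates for course or workshop development.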
To assess differences among divisions in each respondent group’s importance ratings
for the four skills, data were subjected to a one-way analysis of variance (ANOVA). If
significant differences were found, then the Newman-Keuls follow-up test was employed
for post-hoc mean comparisons to estimate the level of differences between means
(Tabachnick & Fidell, 1989). To discover the level of agreement in ratings, logistic regres-
sion analyses were used. Logistic regression makes it possible to predict a discrete
outcome, such as group membership. In instances where the independent variables are
categorical, or a mix of continuous and categorical, logistic regression is preferred (cf.
discriminant analysis) (Howell, 2003; Spicer, 2004). The analysis identifies items that
discriminate between groups as being statistically significant; these items are ones on
which the respondent groups did not agree.
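The group-membership idea can be sketched with a minimal logistic regression fitted by gradient descent. The ratings below are invented; the study used SPSS, and scikit-learn or statsmodels would be the usual Python tools:

```python
import numpy as np

def fit_logistic(x: np.ndarray, y: np.ndarray, lr: float = 0.1, steps: int = 5000) -> np.ndarray:
    """Fit intercept and slope for P(group = 1 | rating) by gradient descent."""
    X = np.column_stack([np.ones_like(x, dtype=float), x])
    w = np.zeros(2)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)       # gradient of mean log-loss
    return w

def predict_group(w: np.ndarray, x: np.ndarray) -> np.ndarray:
    X = np.column_stack([np.ones_like(x, dtype=float), x])
    return (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(int)

# Invented importance ratings for one item: 0 = student, 1 = instructor.
item_ratings = np.array([2.0, 2.0, 3.0, 3.0, 4.0, 4.0, 5.0, 5.0])
group        = np.array([0,   0,   0,   0,   1,   1,   1,   1])
w = fit_logistic(item_ratings, group)
```

An item whose ratings discriminate the groups this cleanly would come out statistically significant, i.e. the groups did not agree on it.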
In combination with logistic regression, a chi-square goodness-of-fit test and cross-
tabs were used to discover the extent of agreement between students’ and instructors’
ratings. The analyses enable the evaluation of each group’s response distribution.
Finally, correlational analyses were used to examine the relationship between how
respondents rated the importance of language skills and their own or their students’ skill
status. The analyses were conducted on several levels (i.e. respondent groups, divisions,
language skill domains, and individual skill items). Note that, for the skill importance
scale, the higher the rating, the higher the importance; for the skill status rating, the lower
the rating, the greater the perceived need for help. Therefore, for example, a negative
correlation between skill importance ratings and their respective status skill ratings
means that, for those items rated as more important, there was a greater perceived need
for help on those items. Non-significant correlations mean that the responses were so
variable that no significant agreement or disagreement was detected. An alpha level of
.05 was used to determine significance.
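Because the two scales run in opposite directions, the sign convention can be illustrated directly (the per-item means below are invented):

```python
import numpy as np

# Invented per-item means: importance (higher = more important) and
# status (lower = greater perceived need for help).
importance = np.array([4.7, 4.4, 4.2, 3.9, 3.5])
status     = np.array([2.8, 2.9, 3.1, 3.3, 3.6])

r = np.corrcoef(importance, status)[0, 1]
# r < 0 here: items rated as more important received lower status ratings,
# i.e. a greater perceived need for help on exactly those items.
```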

V  Results
1  Importance of language skills
a  Graduate students’ perspectives: Graduate students judged 37 of the 45 ratable skill
statements on the questionnaire to be very important for satisfactory completion of the
courses they have taken thus far in their programs. Among those, 10 (27%) were related
to the writing domain (Table 3).
Analyses across divisions showed variation in individual skill items. When individual
skill items within the writing domain were reviewed, graduate students across divisions
shared only one common skill as being among the top-five most important: disciplinary
writing that involves major research papers, thesis proposals, grant proposals, and theses.
For both master’s and doctoral-level graduate students, only writing and reading were
rated above 4.00 (i.e. very important), with means ranging between 4.01 and 4.37. Over-
all, analyses of graduate students’ responses according to divisions and degree programs
showed that all respondents identified writing (followed by reading, speaking, and listen-
ing) as the most important language domain, with means ranging from 4.32 to 4.38. The
ANOVA indicated significant differences in the writing domain across divisions
(F(2, 94) = 8.26, df = 2, p = .001), with nonsignificant results from the post-hoc tests (p > .05).

b  Undergraduate students’ perspectives:  Undergraduate students judged 17 of the 43 ratable
skill statements on their questionnaires to be very important for satisfactory completion of
the courses they have taken thus far in their programs. Among those, six items (35%) were

Table 3  Task statements in the writing domain rated above 4.0 (‘very important’) by graduate
students across divisions

Task statement M SE SD

Demonstrate competence in discipline-specific writing 4.68 .07   .67
tasks (e.g. research papers, thesis proposals, grant proposals,
theses).
Organize writing in order to convey major and supporting 4.66 .07   .70
ideas.
Use relevant reasons and examples to support a position 4.61 .07   .71
or an idea.
Demonstrate a command of standard written English, 4.47 .09   .92
including grammar, phrasing, effective sentence structure,
spelling, and punctuation.
Produce writing that effectively summarizes and 4.36 .09   .92
paraphrases the works and words of others.
Demonstrate a facility with a range of vocabulary 4.34 .09   .92
appropriate to the topic.
Produce a sufficient quantity of written text appropriate 4.33 .09   .88
to the assignment and time constraints.
Use appropriate transitions to connect ideas and 4.28 .09   .89
information.
Write in response to an assignment and stay on topic 4.22 .10 1.02
without digressions or redundancies.
Use background knowledge, reference or non-text materials, 4.17 .11 1.09
personal viewpoints, and other sources appropriately to
support and analyse ideas and refine arguments.
Note: N = 95.

Table 4  Task statements in the writing domain rated above 4.0 (‘very important’) by
undergraduate students across divisions

Task statement M SE SD

Demonstrate a command of standard written English, including 4.38 .05   .95
grammar, phrasing, effective sentence structure, spelling, and
punctuation.
Use relevant reasons and examples to support a position or an 4.29 .04   .89
idea.
Organize writing in order to convey major and supporting ideas. 4.23 .05   .93
Produce a sufficient quantity of written text appropriate to the 4.12 .05 1.01
assignment and time constraints.
Write in response to an assignment and stay on topic without 4.07 .05   .96
digressions or redundancies.
Demonstrate a facility with a range of vocabulary appropriate to 4.06 .05   .98
the topic.
Note: N = 337.

related to writing (Table 4). Like the graduate students’ responses, undergraduate students’
responses showed variation across divisions. The only skill item in the writing domain
shared by respondents from the humanities and social sciences divisions was related to the
ability to demonstrate a command of standard written English (humanities: M = 4.61,
SD = .74; social sciences: M = 4.40, SD = .93).
In terms of the overall importance rankings of the four language domains, like the
graduate respondents, undergraduate respondents ranked writing (followed by reading,
speaking, and listening) as the most important domain (M = 4.05, SD = .71). The
analyses by year of study also showed that respondents from each year of study ranked
writing4 as the most important language skill domain. The ANOVA indicated significant
differences across all four language skill domains (p < .001). When the data were anal-
ysed by respondent’s year of study, however, the results from the post-hoc tests were
nonsignificant (p > .05).

c  Graduate instructors’ perspectives:  Instructors who teach graduate students rated 40 of
the 45 skill statements as very important. Among those, 10 (25%) were related to writing
(Table 5). Overall, skills in the writing domain were among the top skills identified by
the instructors. When individual skill items were examined according to divisions, results
showed variation as well as commonality. All respondents across divisions rated the
same three individual skill items in the writing domain as among the top-five most
important: organize writing in order to convey major and supporting ideas, demonstrate
competence in discipline-specific writing tasks, and demonstrate a command of standard

Table 5  Task statements in the writing domain rated above 4.0 (‘very important’) by graduate
instructors

Task statement M SE SD

Demonstrate competence in discipline-specific writing tasks (e.g. research 4.81 .07   .39
papers, thesis proposals, grant proposals, theses).
Organize writing in order to convey major and supporting ideas. 4.81 .09   .53
Use relevant reasons and examples to support a position or an idea. 4.72 .12   .72
Demonstrate a command of standard written English, including grammar, 4.69 .10   .59
phrasing, effective sentence structure, spelling, and punctuation.
Produce writing that effectively summarizes and paraphrases the works 4.56 .12   .71
and words of others.
Use background knowledge, reference or non-text materials, personal 4.55 .14   .81
viewpoints, and other sources appropriately to support and analyse ideas
and refine arguments.
Write in response to an assignment and stay on topic without digressions 4.50 .17   .98
or redundancies.
Use appropriate transitions to connect ideas and information. 4.50 .12   .71
Demonstrate a facility with a range of vocabulary appropriate to the topic. 4.34 .17   .97
Show awareness of audience needs and write to a particular audience or 4.34 .18 1.03
reader.
Note: N = 32.

Table 6  Task statements in the writing domain rated above 4.0 (‘very important’) by
undergraduate instructors
Task statement M SE SD

Write in response to an assignment and stay on topic without digressions 4.38 .12   .95
or redundancies.
Organize writing in order to convey major and supporting ideas. 4.26 .15 1.14
Use relevant reasons and examples to support a position or an idea. 4.22 .12   .97
Demonstrate a command of standard written English, including grammar, 4.17 .13 1.06
phrasing, effective sentence structure, spelling, and punctuation.
Use background knowledge, reference or non-text materials, personal 4.07 .14 1.10
viewpoints, and other sources appropriately to support and analyse ideas
and refine arguments.
Produce a sufficient quantity of written text appropriate to the assignment 4.02 .15 1.13
and time constraints.
Note: N = 61.

written English. As for ranking the relative importance of the four skill domains, all
graduate respondents rated writing (followed by speaking, reading, and listening) as the
most important (M = 4.53, SD = .50). However, the ANOVA and post-hoc tests indicated
that the differences among the four skills were statistically nonsignificant.

d  Undergraduate instructors’ perspective:  Those teaching undergraduate students rated
22 of the 43 skills as very important for students’ satisfactory completion of their courses.
Among those, six (27%) were in the writing domain (Table 6). When individual skill
items were examined according to divisions, results indicated that only one was among
the top skills in the writing domain: organizing writing in order to convey major and sup-
porting ideas (M = 4.38, SD = .95). Analyses across divisions showed both variation and
overlap in skills that were considered very important. Overall, writing was the second most
important skill domain (M = 3.93, SD = .93), preceded by reading and followed by speaking
and listening. The ANOVA indicated that the
writing domain (F(2, 60) = 5.50, df = 2, p = .006) was ranked significantly differently
among respondents according to their divisions. Results from the post-hoc tests were
nonsignificant (p > .05).

e  Graduate students vs. graduate instructors:  When comparing individual skill items that
graduate students and graduate instructors rated as ‘very important’, 94.5% of those iden-
tified by the graduate group overlapped with those identified by instructors. Only one
writing skill that graduate students identified was not shared by instructors: produce a suf-
ficient quantity of written text appropriate to the assignment and time constraints (M = 4.33,
SD = .88). Five skill items that instructors identified as ‘very important’ were not shared
by graduate students. Among those, one was in the writing domain: show awareness of
audience needs and write to a particular audience and reader (M = 4.34, SD = .50).
Results from the logistic regressions also showed that in the writing domain, two
items were statistically significant between the two groups:

1. show awareness of audience needs and write to a particular audience or reader; and
2. produce a sufficient quantity of written text appropriate to the assignment and time
constraints (p < .05).

The proportion of graduate instructors who rated (1) above as ‘extremely important’ was
62.5% (vs. 41.1% of students). Graduate students rated (2) above higher than graduate
instructors did. (A total of 84.1% of graduate students rated the item ‘very important’ or
‘extremely important’ vs. 65.7% of the graduate instructors.) Overall, however, the dif-
ference in proportions for the two writing items was not significant (p > .05).

f  Undergraduate students vs. undergraduate instructors:  When comparing the individual
skill items that undergraduate students and undergraduate instructors rated as ‘very
important’, 82% of those identified by the undergraduate group and 64% of those identi-
fied by the undergraduate instructor group overlapped. When the responses were exam-
ined according to divisions, only one skill item in the writing domain identified by the
student group was not shared by the instructor group: demonstrating a facility with a
range of vocabulary appropriate to the topic (M = 4.06, SD = .98).
Results from the logistic regressions showed that, in the writing domain, two items
indicated significant rating disagreement between two groups (p < .05):

1. write in response to an assignment and stay on topic without digressions or
redundancies; and
2. demonstrate a facility with a range of vocabulary appropriate to the topic.

The cross-tab and Chi-square analyses showed that 60% of undergraduate instructors
rated (1) above as ‘extremely important’ (vs. 38.3% of students); the difference in propor-
tions was significant, χ2(4, N = 384) = 12.40, p < .05. For (2) above, the difference in
proportions was not significant (p > .05), with both the undergraduate student and the
undergraduate instructor groups having the highest proportion of ratings in the ‘extremely
important’ category (40.3% vs. 40.7%, respectively), followed by ‘very important’ (35.2%
vs. 27.1%) and ‘important’ (16.7% vs. 20.3%).

2  Language skill status


a  Graduate students’ self-assessment:  Overall, graduate students identified only one item
(i.e. demonstrate competence in discipline-specific writing tasks) as being in need of
help (M = 2.87, SD = .80). Across divisions, among the top-five skill items that respon-
dents reported as needing support, one item in the writing domain was shared by respon-
dents from the physical and life sciences divisions: the ability to use appropriate
transitions to connect ideas and information (physical sciences: M = 2.75, SD = 1.28; life
sciences: M = 2.88, SD = .69). This item was in addition to the need to develop disciplin-
ary writing skills, which was shared by respondents from all four divisions. Overall,
when examining the responses by both divisions and degree programs, graduate students
ranked writing (followed by speaking, reading, and listening) as the most important skill
domain that needs further development. The ANOVA indicated that the ratings of the
status of the writing domain were significantly different across divisions (F(1, 94) = 8.26,
df = 2, p < .05). By degree program, only writing (F(1, 94) = 5.67, df = 1, p < .05) was
also significantly different between graduate students at the Master’s and PhD levels.

b  Undergraduate students’ self-assessment:  Undergraduate respondents self-rated all
skill statements with a mean of above 3.00, which indicated that, overall, undergraduate
students did not see themselves in need of developing any skills. A look at the data across
divisions showed that, among the top-five items that undergraduate respondents rated as
needing help in skill development, there were both variations and similarities, but there
were no skills from the writing domain. However, in terms of ranking language skills by
divisions and by year of study, overall, writing (followed by reading, speaking, and lis-
tening) was considered as needing more help than the other skill domains. The ANOVA
indicated that respondents’ language skill status in writing differed significantly across
divisions, F(1, 336) = 4.09, df = 3, p < .05, while results from post-hoc tests indicated that
differences among the four skills were not statistically significant (p > .05).
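The one-way ANOVAs and post-hoc tests reported in this section can be sketched as follows. The rating vectors and division labels are hypothetical stand-ins for the divisional survey data, not the study's actual responses.

```python
# Sketch of a one-way ANOVA comparing mean writing skill-status ratings
# across four divisions, followed by Tukey HSD post-hoc comparisons.
# The rating vectors and group sizes are hypothetical stand-ins.
import numpy as np
from scipy.stats import f_oneway, tukey_hsd

rng = np.random.default_rng(0)
humanities = rng.normal(3.4, 0.7, 90).clip(1, 5)
social_sciences = rng.normal(3.2, 0.7, 90).clip(1, 5)
physical_sciences = rng.normal(3.0, 0.7, 80).clip(1, 5)
life_sciences = rng.normal(3.1, 0.7, 80).clip(1, 5)

F, p = f_oneway(humanities, social_sciences, physical_sciences, life_sciences)
print(f"F = {F:.2f}, p = {p:.3f}")

# Pairwise post-hoc comparisons, analogous to the tests reported above.
print(tukey_hsd(humanities, social_sciences, physical_sciences, life_sciences))
```

The omnibus F-test asks whether any division means differ; the Tukey HSD output then shows which specific pairs, if any, account for the difference.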

c  Graduate instructors’ assessment:  Graduate instructors rated 18 skill statements below
3.00. Among them, nine (50%) were in the writing domain. The top-ten skill items that
were identified as representing areas where graduate students need the most help also
included several skill items in the writing domain, such as: organize writing in order to
convey major and supporting ideas (M = 2.69, SD = .82); show awareness of audience
needs and write to a particular audience or reader (M = 2.72, SD = .77); use appropriate
transitions to connect ideas and information (M = 2.84, SD = .72); produce writing that
effectively summarizes and paraphrases the works and words of others (M = 2.94,
SD = .75); demonstrate competence in discipline-specific writing tasks (M = 2.94, SD = .66);
and use sources appropriately to support and refine arguments (M = 2.97, SD = .74). Over-
all, the instructors who teach graduate students rated graduate students’ writing and speak-
ing skills as needing help and development, with speaking ranked first (M = 2.49, SD = .46),
followed by writing (M = 2.90, SD = .57). The ANOVA and post-hoc tests, however, indi-
cated that differences among the four skills were not statistically significant (p > .05).

d  Undergraduate instructors’ assessment:  Instructors who teach undergraduate students
rated 35 skill statements below 3.00. Among them, 10 (28.6%) were in the writing
domain, and three items were among the top-ten skill statements identified by instructors
as representing areas where students need the most help: produce writing that effectively
summarizes and paraphrases the works and words of others (M = 2.38, SD = .78); orga-
nize writing in order to convey major and supporting ideas (M = 2.28, SD = .81); and
demonstrate a command of standard written English, including grammar, phrasing,
effective sentence structure, spelling, and punctuation (M = 2.31, SD = .85). Overall, the
instructors rated all four skill domains as representing areas where students need help,
with speaking ranked first (M = 2.19, SD = .56), followed by writing (M = 2.50, SD =
.63). The ANOVA and post-hoc tests indicated that differences among the ratings of the
four skills were not statistically significant (p > .05).

e  Graduate students’ vs. graduate instructors’ assessments:  As previously presented, graduate instructors identified 18 skill items below 3.0, whereas the graduate student group
reported needing help with only one item, discipline-specific writing. The results from
the logistic regressions comparing the two groups’ response patterns indicated that, of
the nine items showing significant divergence in ratings (p < .05), four were in the writ-
ing domain. The proportion of graduate instructors who assessed their students as need-
ing help with showing an awareness of audience needs and writing to a particular
audience or reader was 40.6% (vs. 12.6% of students). Similarly, regarding the ability to
use appropriate transitions to connect ideas and information, the proportion of graduate
students who self-rated as ‘quite competent’ was 44.2% (vs. 14.9% of instructors). Con-
cerning the ability to demonstrate a command of standard written English, 46.9% of the
responses by the graduate instructor group indicated a need for help (vs. 12.6% of stu-
dents). Finally, the proportion of graduate instructors who rated their students as needing
help with developing their ability to organize writing in order to convey major and supporting ideas was 40.7% (vs. 10.6% of students).
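A minimal sketch of the kind of logistic regression used above to compare the two groups' response patterns: group membership (student vs. instructor) is regressed on the rating a respondent gave a skill item. The data are hypothetical, and the fitting shortcut (BFGS on the negative log-likelihood, with standard errors from the approximate inverse Hessian) is only an approximation of what standard statistical software reports, not the study's actual procedure.

```python
# Group membership (0 = student, 1 = instructor) regressed on the 1-5
# rating each respondent gave one skill item. Data are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
student_ratings = rng.choice([2, 3, 4, 5], size=120, p=[0.15, 0.35, 0.35, 0.15])
instructor_ratings = rng.choice([2, 3, 4, 5], size=40, p=[0.05, 0.15, 0.35, 0.45])

x = np.concatenate([student_ratings, instructor_ratings]).astype(float)
y = np.concatenate([np.zeros(120), np.ones(40)])

def neg_log_lik(beta):
    eta = beta[0] + beta[1] * x
    # np.logaddexp(0, eta) is a numerically stable log(1 + exp(eta)).
    return np.sum(np.logaddexp(0.0, eta) - y * eta)

fit = minimize(neg_log_lik, x0=np.zeros(2), method="BFGS")
b0, b1 = fit.x

# Wald test on the rating coefficient, using standard errors from the
# approximate inverse Hessian that BFGS accumulates.
se = np.sqrt(np.diag(fit.hess_inv))
z = b1 / se[1]
p_value = 2 * norm.sf(abs(z))
print(f"slope = {b1:.2f}, z = {z:.2f}, p = {p_value:.4f}")
```

A significant positive slope here would indicate that instructors systematically give higher (or, for status items, different) ratings than students, which is the kind of divergence the analyses above detect.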

f  Undergraduate students’ vs. undergraduate instructors’ assessments:  As previously presented, instructors who teach undergraduate students rated 35 skill items below 3.0,
whereas the undergraduate student group did not consider themselves in need of any
language skill development. In addition, the results from the logistic regressions showed
that, among the 13 items about which the two groups disagreed, two were in the writing
domain (p < .001). Specifically, 63.8% of undergraduate instructors rated their students’
ability to organize writing in order to convey major and supporting ideas as needing help
(vs. 12.2% of students). Similarly, for the skill item of being able to demonstrate a com-
mand of standard written English, 56.6% of instructors regarded their students as need-
ing help in skill development (vs. 14.9% of students).

g  Graduate and undergraduate respondents’ writing-related comments:  The data collected
from the open-ended questions were consistent with instructors’ ratings of their students’
skill status, which indicates that much help is needed in the writing domain. As one
undergraduate instructor pointed out: ‘Using new information in writing is extremely
problematic. First- and second-year students are not good at synthesizing new informa-
tion and applying it to their written work’ (T32). Another commented: ‘The majority of
students in my classes are deficient in written communication skills. Only about 10% of
the students in my classes are competent in these written communication skills at a basic
level of undergraduate academic writing’ (T11).
For the student respondent groups, of the 293 comments that undergraduate student
respondents provided, 25.9% specifically addressed the need to improve their writing
skills. For the graduate students, 55.7% of the 122 comments provided were specifically
related to writing. Those writing-related comments from the graduate group mainly con-
cerned discipline-specific writing tasks, with comments such as, ‘I need help with writ-
ing literature reviews and thesis proposals’ (GS91); ‘I am struggling to find my own
voice in writing rather than just putting together a summary of other peoples’ ideas’
(GS6); and ‘I don’t have a clear idea of the expectations for different stages of disserta-
tion, or the overall structure’ (GI45). For the undergraduate student group, the writing-
related comments dealt more with the surface-level issues than discourse-level features
related to essays and reports for course requirements (e.g. grammar, sentence structure,
organization). For example, ‘I need help using APA style for writing assignments’
(UGS4); ‘Writing techniques, my verb tenses are inconsistent’ (UGS226); and ‘I need
help understanding proper grammar for written assignments’ (UGS277).

3  Correlational analyses of importance of language skills ratings vis-à-vis language skill status ratings

Overall, the results showed highly significant, positive correlations between the importance ratings and the status ratings within each of the four skill domains (p < .001) and also among all four skill domains (p < .05).
For the graduate instructor group, the correlations were positive in the writing domain,
whereas for the undergraduate instructor group, the results indicated a generally nega-
tive, but nonsignificant correlation (p > .05). Results from correlational analyses to
examine the relationship between the individual skill importance items and the skill sta-
tus items indicated that, overall, no skill items in the writing domain were statistically
significant (p > .05). For the graduate student group, only four skill importance items and
their respective skill statuses were statistically significantly correlated. Of these, one was
in the writing domain: produce a sufficient quantity of written text appropriate to the
assignment and time constraints (r(94) = –.275, p < .05). For the undergraduate instruc-
tor group, of the six negative correlations between the skill importance ratings and their
respective skill status ratings, three were in the writing domain. The more important the
undergraduate instructors judged those three items, the more they perceived their stu-
dents as in need of help with them: write in response to an assignment and stay on topic
without digressions or redundancies, r(61) = –.296, p < .05; show awareness of audience
needs and write to a particular audience or reader, r(61) = –.444, p < .05; and produce a
sufficient quantity of written text appropriate to the assignment and time constraints,
r(61) = –.361, p < .05.
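The item-level correlational analysis can be sketched as a Pearson correlation between one respondent group's importance ratings and its status ratings for a single skill item. The data below are hypothetical and merely simulate the negative pattern reported for the undergraduate instructor group (e.g. r(61) = –.444).

```python
# Pearson's r between importance and skill-status ratings for one item.
# Data are hypothetical, constructed so that higher perceived importance
# pairs with lower perceived competence, as reported above.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
importance = rng.integers(3, 6, size=63).astype(float)  # 3..5 on the scale
# Higher perceived importance paired with lower perceived competence.
status = np.clip(4.5 - 0.5 * importance + rng.normal(0, 0.6, 63), 1, 5)

r, p = pearsonr(importance, status)
print(f"r({len(importance) - 2}) = {r:.3f}, p = {p:.4f}")
```

The degrees of freedom printed as r(df) follow the convention used in the article, i.e. n − 2.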

VI  Discussion and implications


Overall, the results indicated that there is much overlap of the skill items identified as
‘very important’ between graduate students and graduate instructors and also between
undergraduate students and undergraduate instructors. The relative agreement between
the students and instructors suggests that students have a clear idea of what language skills
their instructors consider important for satisfactorily completing their degree programs.
A comparison of the present study’s findings from the importance ratings section with
those of Rosenfeld et al.’s (2001) study indicated both commonalities and differences.
While common skill items from the writing domain were identified across the studies
(e.g. organizing writing in order to convey major and supporting ideas), all respondent
groups in the present study identified more writing-related items than those identified in
Rosenfeld et al.’s study. Skill items that were not shared between the two studies were related
to content (e.g. writing in response to an assignment and staying on topic), development
(i.e. producing a sufficient quantity of written text appropriate to the assignment and time
constraints; using relevant reasons and examples to support a position or idea), and
language (i.e. demonstrating a facility with a range of vocabulary appropriate to the
topic). This comparison suggests skill items that may be applicable across contexts as
well as items that may be respondent or context dependent. More importantly, it points
to a need for caution when implementing findings from needs analysis research into cur-
riculum or language support program designs.
In line with numerous previous studies, writing was considered the most important skill domain, or a major problem, for students (e.g. Bridgeman & Carlson, 1984; Horowitz, 1986; Jenkins et al., 1993; Leki & Carson, 1994; Rosenfeld et al., 2001; Zhu & Flaitz, 2005). Traditionally, there has been
a greater emphasis placed on writing than on skills in the other language domains. A
review of the services provided by English language support centers (or ‘writing’ centers) at the post-secondary and graduate levels, as well as of journal articles, also indicates that far more attention has been paid to the development of writing skills in academic settings than to the other language skill domains. The overemphasis on writing, which often comes at the expense of other skills or overlooks the connections between skills in the writing domain and those in the other language domains, once again points to the need to be cautious when interpreting and incorporating findings from needs analysis research into curriculum designs.
It is important to point out here that, unlike most findings from self-assessment sur-
veys, in which respondents have cited reading as the least difficult skill (see Jordan,
2005), results from the present study indicated otherwise. As recent research has indi-
cated, activities such as reading that require focused attention seem to be in decline (e.g.
Hirvela, 2005). Both instructor and student respondents across divisions and levels
pointed out issues with reading, specifically critical reading. For example: ‘In 100 and
200 level classes, at least half the students need help with reading and thinking critically’
(T26). Another instructor commented: ‘At least half of my students need to work on
reading skills – on basic comprehension of written material’ (T24). Students expressed
similar issues with reading: ‘I have difficulty finding the main points when reading texts’
(UG78); ‘picking out important information from readings is a big one’ (UG255); and ‘I
find reading critically very difficult’ (UG276). Results from the respondents at both the
undergraduate and graduate levels in the present study indicated that skills in the reading
domain were more important than skills in the speaking and/or listening domains, and
that they assessed their own reading skills as being more in need of improvement. Speak-
ing (e.g. making class presentations, participating in class discussions, making comparisons and contrasts, or synthesizing information) and especially writing (e.g. using
reference materials and other sources appropriately to support and refine arguments,
producing writing that effectively summarizes and paraphrases the works and words of
others) often require the ability to perform many of the skill items in the reading domain.
As one graduate student confessed: ‘I have trouble making connections and synthesizing
information from other sources and sometimes [with] thinking critically about questions
I should be asking about the material’ (G35). As such, courses that integrate skills, such as reading-based writing, which develop both critical reading skills and writing skills, deserve
consideration for inclusion in the curriculum.
In terms of students’ self-assessments and instructors’ assessments of their students’
skill status, the results indicated a great divergence. Even though the study did not
involve a direct matching of instructors and students, numerous studies have highlighted
a similar mismatch between students’ perceived needs and expectations and those of
instructors (e.g. Thorp, 1991; Sherman, 1992; Zhu & Flaitz, 2005). In addition, findings
from the students’ self-assessment data do not seem to be consistent with those of previ-
ous studies, which have suggested that students have a clear idea of what their problems
are (e.g. Ferris, 1998). The differences in perceptions of skill status may be attributable
to respondents’ knowledge and experience of the different items on the questionnaire or
possibly their varied language backgrounds. In addition, not all students may be at the
stage where they can accurately self-diagnose their competency or challenges in aca-
demic settings (Freeman & Huang, 2005) or what is required of them to perform compe-
tently in their program of study (Ferris, 1998). By the same token, instructors also may
not be the best judges of the ways in which their students are struggling. The following
comment depicts the teacher–learner gap that pervaded students’ comments:

My teachers have described my writing as long-winded, too long, and too broad. For this rea-
son, I rarely do well in essay-format assignments, and my marks are almost always in the B
range … I find that most comments are very vague. I am also always perplexed when I have
to write (synthesize) and am not provided with any examples of what is expected … (UGS36)

Unlike other studies (e.g. Ferris, 1998; Wang & Bakken, 2004), which found that learn-
ers lacked confidence in their communication skills, both undergraduate and graduate
students in this study self-assessed at a much higher perceived competency level than
instructors assessed them in almost all cases. As Horwitz (1987) pointed out, what learn-
ers believe about what they need to learn strongly influences their receptiveness to learn-
ing. One may ask: Why is it difficult for learners to accurately self-diagnose, as indicated
in the present study? The answer may be that learners are only aware of what is already
in their consciousness. In other words, learners cannot identify challenges that are in the
zone of unconscious incompetence.6 Learners select, accurately or inaccurately, an area
based on their own self-assessment within their zones of conscious incompetence.
Instructors need to use the knowledge gained from a diagnostic/needs assessment session
as the starting point to guide learners toward discovering and exploring skills that lie
within the zone of unconscious incompetence.
Several implications can be derived from this study. First, whether such assessments
reflect how students perceive themselves or how they think the researcher thinks they
ought to be is a question that deserves further probing, because, as mentioned in the
results section, the data gathered from the open-ended questions indicated that students
do perceive that they need help with their writing. The specific, varied areas where
students perceived that support is needed, based on their responses to the open-ended
questions, and where instructors indicated a need in their skill status ratings (e.g. surface-
level features, such as mechanics, and discourse features, such as development and
organization of ideas for the undergraduate students; and content-related or genre-based
features, such as writing to a particular audience or reader and discipline-specific writing
skills for the graduate group) nevertheless seems to be consistent with findings from
students’ ratings.
Second, the correlational analyses indicated that the skills learners identify as impor-
tant may not be the same as those that they perceive as needing help. This points to the
issue of how learner needs are assessed and interpreted, and how the results are reflected
in instruction. Most earlier surveys enquired about tasks that students perceived as ones
that they must perform at university (e.g. Rosenfeld et al., 2001). Such surveys may enable
researchers to identify essential tasks, but such tasks do not necessarily correspond to
skills students perceive as needing help. In addition, multiple perspectives, both emic and etic, and multiple sources will help generate a more complete picture of students’ perceived language needs and indicate how best to prioritize teaching and learning activities.
Conducting needs analysis is essential, but taking into account a diversity of perspectives
and sources will help maximize support, especially when resources are limited.
Third, despite the differences between instructors and students in terms of skill status, the findings highlight three points that deserve consideration when designing instruction and pedagogical materials: (a) instructors should consider incorporating into instruction the individual skill items in the writing domain that respondents at the graduate and
undergraduate levels and across divisions identified as important and as in need of devel-
opment; (b) because most respondents indicated that they need support at both the discourse (e.g. organization and development of ideas) and local (e.g. grammar, phrasing, effective sentence structure, spelling, and punctuation) levels of writing, support services should continue to focus on writing issues at different levels; and (c) for graduate students, content-related writing issues, such as using relevant reasons and examples to support a position or an idea, showing an awareness of audience needs, and writing to a particular audience or reader, as well as disciplinary writing that involves major research papers, thesis proposals, grant proposals, theses, and journal articles, were identified by both the graduate student and instructor groups across divisions as important and in need of help. While both
students and instructors identified discipline-specific writing as an area where help is
needed, the statistically nonsignificant findings from analyses of differences in language
needs across divisions suggest that certain task-based skills, such as summarizing and
paraphrasing, using appropriate transitions, writing critical reviews using relevant reasons
and examples to support a position, and so on, may be taught to students across divisions.

VII  Limitations and future research


This study’s results should be considered in light of some potential limitations. One con-
cern is the unknown reliability of self-report data: Some respondents may provide care-
fully considered responses; some may quickly rush through the questionnaire giving
token responses; and others may provide inauthentic answers thought to be more ‘socially
desirable’. There is also the issue of how respondents from different language backgrounds may respond differently to Likert response format items (refer to
Schaffer & Riordan, 2003). With a sufficient number of respondents, future research
could examine this aspect of individual differences in the response patterns of respon-
dents from various language backgrounds as well as include objective measures to vali-
date the findings.
Despite the criticism that the reliability of any type of needs assessment is questionable because learners’ ‘true’ needs are difficult to specify, few would deny that there is high value in what learners can teach us about themselves through surveys; nor are there any such things as infallible instruments or bias-free judgments and ratings. The resulting divergence in teachers’ and learners’ ratings of skill importance and of perceived
competency, rated on the basis of personal beliefs and expectations, points to the need for
such effort. Furthermore, given that the project is designed to capture students’ perceived
language skills needs, using self-reports seems justified at this early stage in the univer-
sity’s language support services’ development and allows the gathering of information
that is not available from one-time production or language proficiency data alone.
It is also recognized that, although much information can be derived from needs
assessments, this study alone cannot provide all the information needed about the univer-
sity’s graduate and undergraduate students’ academic language-learning needs, and no
assessment can quantify perfectly a specific program’s or workshop’s requirements. The
use of questionnaires also inevitably makes the questions more broadly based, rather
than focusing on specific learning situations. Furthermore, even though the response rate
for each participant group falls into the ‘magic sampling fraction’ (Dörnyei, 2003, p. 74),
as Brown (2001) pointed out, findings from those respondents who completed the questionnaires may be generalizable only to a particular ‘eager-beaver’ or ‘gung-ho’ group of participants, rather than to the targeted population (p. 85). Given these potential limitations, the results
suggest the probable perceived language needs and statuses, and are not conclusive.
Needs analysis is an ongoing and endless process, because no single analysis is perfect, as
numerous researchers have pointed out (e.g. Nunan, 1988; Brown, 1995, 2001). Future
research that addresses these limitations is necessary to understand more completely
undergraduate and graduate students’ academic language needs and the relationship
between their perceived needs and competency as measured by both objective and sub-
jective needs gathered from multiple sources of information.

VIII  Conclusions
In any endeavor to discover learners’ needs and create a more learner-centered curriculum,
whether the needs are ascertained through questionnaires, observations, a collection of
documentation, or through any other rigorously scientific methods, there is always a dan-
ger in considering learners a homogeneous group. Some may claim that conducting a
needs analysis across multiple institutions, as Rosenfeld et al. (2001) did, is a strength, but
one may also argue that efforts to seek findings’ generalizability may be fruitless because
needs analysis is, by definition, context-dependent and context-specific, taking into account
the very different linguistic cultures and the variety of institutional environments.
As best-selling author Malcolm Gladwell (Gladwell, n.d.) said, ‘There is no perfect
spaghetti sauce, only perfect spaghetti sauces.’ If we, as curriculum designers, materials
writers, and teachers, intend to put the learner at the center of the learning experience,
our duty is to offer targeted, varied workshops that meet their individual and discipline-
specific needs. An ongoing questioning of learners’ needs helps instructors begin their
instruction where the learners are and the knowledge gained will enable instructors to
prioritize what they teach.

Acknowledgements
Thanks to Jim Anglin and Teresa Dawson for the support that made this study possible; to the
Educational Testing Service for permission to use Rosenfeld et al.’s survey; to Gweneth Doane, Dave
Mckercher, and Ray Letts, who assisted with permissions and distribution of the surveys; to Adam
Steffanick for his assistance with the project; and most of all, to the participants in this study. Thanks
also to two anonymous reviewers for their helpful comments about an earlier version of this article,
which was presented at the 43rd Annual TESOL Convention, Denver, CO (March 15–28, 2009).

Notes
1 The decision to focus on the writing section was made because (1) the findings indicated skills
in the writing domain were perceived as both very important and as in need of help from both
instructors’ and students’ perspectives; and (2) space limitation does not permit a full coverage
of both skill importance and skill status from all four language domains reported by three
respondent groups.
2 For a detailed account of the development of the survey instruments, refer to Rosenfeld et al.
(2001).
3 Opinions vary on whether the numbers from a Likert response format should be treated as
ordinal or interval. It is uncertain whether the intervals between the semantic distances on the
scale are equal, and, hence, some have asserted that those items are ordinal; others, however,
have argued that the Likert response format, such as the one used in the present study, is not
a canonical ordinal scale. The argument is that, instead, the wording of response levels is
anchored with consecutive integers that have the interval property and are accompanied by
a visual analog scale with equal spacing of response levels clearly indicated, or that the line
between ordinal and interval can be ‘fuzzy’ (e.g. Abelson, 1995). As such, there is a strong
argument for treating the type of responses used in the present study as interval-level data that
can be analysed parametrically if the data are close to a normal distribution. As also pointed out
in Likert’s (1932) original paper, the labels used conceptually provided equidistant intervals
and described data on the scale as yielding a distribution resembling a normal distribution.
Therefore, departures from this property may reflect the character of respondents in a particular
study. In addition, the Cronbach alphas conducted across each section of all three versions of
the survey for each respondent group indicated high internal consistency reliabilities. The high
Cronbach alphas indicate a high consistency among the items in terms of interval equality
and distribution normality of item responses. This suggests that the items were consistent in
assessing the respondent groups’ ratings of importance and skill status. Furthermore, since
the study is exploratory rather than hypothesis confirming or rejecting, it is justified to make
assumptions about the questionnaire responses being of equal intervals, which support the use
of such parametric tests as ANOVA and Pearson correlational analyses. As also firmly stated
by Carifio and Perla (2007), ‘The F-test was incredibly robust to violations of the interval data
assumption (as well as moderate skewing)’ and ‘One does not have to lose statistical power
and sensitivity by using non-parametric statistical tests in its place when analysing Likert scale
data’ (pp. 110–111).
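The Cronbach's alpha computation referred to in this note can be sketched as follows, using a hypothetical matrix of Likert responses; the survey data themselves are not published with the article, so the values are illustrative only.

```python
# Sketch of Cronbach's alpha for one section of a Likert survey
# (rows = respondents, columns = items). Data are hypothetical.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(3)
trait = rng.normal(3.5, 0.8, size=(200, 1))  # shared latent rating per respondent
responses = np.clip(np.rint(trait + rng.normal(0, 0.5, size=(200, 8))), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Because the eight simulated items share a common latent rating, alpha comes out high, mirroring the high internal consistency reported for the survey sections.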
4 Respondents from years two and above, who constituted 85.5% of all undergraduates who
completed the questionnaire, also ranked writing as the most important, followed by reading,
speaking, and listening. First-year undergraduate students likewise ranked writing as the
most important skill domain, but considered speaking skills more important than reading or
listening skills.
5 The ratings for the remaining five skill items were still within the ‘important’ (3.00) and ‘very
important’ (4.00) range. They ranged from 3.59 to 3.97 and included items from all four skill
domains.
6 In the field of communication, Howell (1979, 1982) described levels of competence and the
transitions that occur when moving from one level to the next. ‘Unconscious incompetence’
and ‘conscious incompetence’ describe the first two levels, where learners begin without any
perception that they are making communication mistakes and then enter the phase where they
become aware of their incompetence but do not know what they need to do (see also Morell
et al., 2002).

References

Abelson, R.P. (1995). Statistics as principled argument. Hillsdale, NJ: Lawrence Erlbaum.
Barkhuizen, G.P. (1998). Discovering learners’ perceptions of ESL classroom teaching/learning activities in a South African context. TESOL Quarterly, 32, 85–108.
Benesch, S. (1996). Needs analysis and curriculum development in EAP: An example of a critical
approach. TESOL Quarterly, 30, 723–38.
Berwick, R. (1989). Needs assessment in language programming. In R.K. Johnson (Ed.) The sec-
ond language curriculum (pp. 48–62). Cambridge: Cambridge University Press.
Braine, G. (1995). Writing in the natural sciences and engineering. In D. Belcher & G. Braine
(Eds.), Academic writing in a second language: Essays on research and pedagogy (pp. 113–34).
Norwood, NJ: Ablex.
Bridgeman, B. & Carlson, S. (1983). A survey of academic writing tasks required of graduate and
undergraduate foreign students. TOEFL research report no. 15. Princeton, NJ: Educational
Testing Service.
Bridgeman, B. & Carlson, S. (1984). Survey of academic writing tasks. Written Communication,
1, 247–80.
Brindley, G. (1989). The role of needs analysis in adult ESL programme design. In R.K. Johnson
(Ed.) The second language curriculum (pp. 63–78). Cambridge: Cambridge University Press.
Brown, J.D. (1995). The elements of language curriculum. New York: Heinle & Heinle.
Brown, J.D. (2001). Using surveys in language programs. Cambridge: Cambridge University
Press.
Carifio, J. & Perla, R.J. (2007). Ten common misunderstandings, misconceptions, persistent myths
and urban legends about Likert scales and Likert response formats and their antidotes. Journal
of Social Sciences, 3, 106–16.
Casanave, C. & Hubbard, P. (1992). The writing assignments and writing problems of doctoral
students: Faculty perceptions, pedagogical issues, and needed research. English for Specific
Purposes, 11, 33–49.
Cronbach, L.J. (1970). Essentials of psychological testing. 3rd edition. New York: Harper & Row.
Cumming, A. (Ed.). (2005). Goals for academic writing: ESL students and their instructors.
Amsterdam: John Benjamins.
Dörnyei, Z. (2003). Questionnaires in second language research: Construction, administration,
and processing. Mahwah, NJ: Lawrence Erlbaum.
Dudley-Evans, T. & St John, M. (1998). Developments in English for specific purposes. Cambridge:
Cambridge University Press.
Ferris, D. (1998). Students’ views of academic aural/oral skills: A comparative needs analysis.
TESOL Quarterly, 32, 289–317.
Ferris, D. & Tagg, T. (1996a). Academic oral communication needs of EAP learners: What subject-
matter instructors actually require. TESOL Quarterly, 30, 31–58.
Ferris, D. & Tagg, T. (1996b). Academic listening/speaking tasks for ESL students: Problems,
suggestions, and implications. TESOL Quarterly, 30, 297–320.
Flowerdew, J. & Peacock, M. (2001). Research perspectives on English for academic purposes.
Cambridge: Cambridge University Press.
Freeman, J. & Huang, L.-S. (2005). Destabilizing the construct of competence. Paper presented at
the TESL Ontario Conference, Toronto, Ontario, Canada.
Frodesen, J. (1995). Negotiating the syllabus: A learning-centered, interactive approach to ESL
graduate writing course design. In D. Belcher & G. Braine (Eds.), Academic writing in a
second language: Essays on research and pedagogy (pp. 331–50). Norwood, NJ: Ablex.
Geoghegan, G. (1983). Non-native speakers of English at Cambridge University. Cambridge: Bell
Educational Trust.
Gladwell, M. (n.d.). What we can learn from spaghetti sauce. Retrieved from: http://www.ted.com/
index.php/talks/malcolm_gladwell_on_spaghetti_sauce.html (July 2010).
Hamp-Lyons, L. (2001). English for academic purposes. In R. Carter & D. Nunan (Eds.), The
Cambridge guide to teaching English to speakers of other languages. Cambridge: Cambridge
University Press.
Hirvela, A. (2005). Computer-based reading and writing across the curriculum: Two case studies
of L2 writers. Computers and Composition, 22, 337–56.
Horowitz, D. (1986). What professors actually require: Academic tasks for the ESL classroom.
TESOL Quarterly, 20, 445–62.
Horwitz, E.K. (1987). Student beliefs about language learning. In A. Wenden & J. Rubin (Eds.),
Learner strategies in language learning (pp. 119–29). Englewood Cliffs, NJ: Prentice Hall.
Howell, D.C. (2003). Fundamental statistics for the behavioral sciences. Belmont, CA: Wadsworth
Publishing.
Howell, W.S. (1979). Theoretical directions for intercultural communication. In M.K. Asante, E.
Newmark, & C.A. Black (Eds.), Handbook of intercultural communication (pp. 23–41).
Beverly Hills, CA: Sage.
Howell, W.S. (1982). The empathic communicator. Belmont, CA: Wadsworth.
Hutchinson, T. & Waters, A. (1987). English for specific purposes: A learning-centred approach.
Cambridge: Cambridge University Press.
Jacobson, W.H. (1987). An assessment of the communication needs of non-native speakers of
English in an undergraduate physics lab. English for Specific Purposes, 6, 173–86.
Jenkins, S., Jordan, M.K., & Weiland, P.O. (1993). The role of writing in graduate engineering
education: A survey of faculty beliefs and practices. English for Specific Purposes, 12, 51–67.
Johns, A.M. (1981). Necessary English: A faculty survey. TESOL Quarterly, 15, 51–57.
Johns, A.M. & Dudley-Evans, T. (1991). English for specific purposes: International in scope,
specific in purpose. TESOL Quarterly, 25, 297–314.
Jordan, R.R. (1997/2005). English for academic purposes: A guide and resource book for teachers.
Cambridge: Cambridge University Press.
Kroll, B. (1979). A survey of the writing needs of foreign and American college freshmen. English
Language Teaching Journal, 33, 219–27.
Leki, I. & Carson, J. (1994). Students’ perceptions of EAP writing instruction and writing needs
across the disciplines. TESOL Quarterly, 28, 81–101.
Leki, I. & Carson, J. (1997). Completely different worlds: EAP and the writing experiences of ESL
students in university courses. TESOL Quarterly, 31, 39–69.
Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 140,
1–55.
Long, M.H. & Crookes, G. (1992). Three approaches to task-based syllabus design. TESOL
Quarterly, 26, 55–98.
McKenna, E. (1987). Good-enough reading: Momentum and accuracy in the reading of complex
fiction. Research in the Teaching of English, 31, 428–58.
Morell, V.W., Sharp, P.C., & Crandall, S.J. (2002). Creating student awareness to improve cultural
competence: Creating the critical incident. Medical Teacher, 24, 532–34.
Munby, J. (1978). Communicative syllabus design. London: Cambridge University Press.
Nunan, D. (1988). The learner-centered curriculum. Cambridge: Cambridge University Press.
Ostler, S.E. (1980). A survey of academic needs for advanced ESL. TESOL Quarterly, 14, 489–502.
Rosenfeld, M., Leung, S., & Oltman, P. (2001). The reading, writing, speaking, and listening tasks
important for academic success at the undergraduate and graduate levels. TOEFL monograph
21. Princeton, NJ: Educational Testing Service.
Schaffer, B. & Riordan, C.M. (2003). A review of cross-cultural methodologies for organizational
research: A best practices approach. Organizational Research Methods, 6, 169–215.
Sherman, J. (1992). Your own thoughts in your own words. ELT Journal, 46, 190–98.
Spicer, J. (2004). Making sense of multivariate data analysis: An intuitive approach. Thousand
Oaks, CA: Sage.
Spratt, M. (1999). How good are we at knowing what learners like? System, 27, 141–55.
Tabachnick, B.G. & Fidell, L.S. (1989). Using multivariate statistics. New York: Harper & Row.
Tarone, E. & Yule, G. (1989). Focus on the language learner. Oxford: Oxford University Press.
Thorp, D. (1991). Confused encounters: Differing expectations in the EAP classroom. ELT
Journal, 45, 108–18.
Wang, M. & Bakken, L.L. (2004). An academic writing needs assessment of English-as-a-second-
language clinical investigators. The Journal of Continuing Education in the Health
Professions, 24, 181–89.
Waters, A. & Vilches, M. (2001). Implementing ELT innovations: A needs analysis framework.
ELT Journal, 55, 133–41.
West, R. (1994). Needs analysis in language teaching. Language Teaching, 27, 1–19.
Zhu, W. & Flaitz, J. (2005). Using focus group methodology to understand international students’
academic language needs: A comparison of perspectives. TESL-EJ, 8. Retrieved from: http://
writing.berkeley.edu/TESL-EJ/ej32/a3.html (June 2010).