
ACCOUNTING PROGRAM RESEARCH RANKINGS BY TOPICAL AREA AND

METHODOLOGY

Joshua G. Coyne
Brigham Young University
Scott L. Summers
Brigham Young University
Brady Williams
Brigham Young University
David A. Wood
Brigham Young University
515 TNRB
Provo, UT 84602
Telephone: (801) 422-8642
Fax: (801) 422-0621
davidwood@byu.edu

July 2009
We express our thanks to Derek Oler, Mitch Oler, and Chris Skousen for sharing data with us. We
thank George Foster, Jeff Hoopes, Robert Jensen, Steve Kachelmeier, Jagan Krishnan, Jeff
McMullin, Eric Press, Kari Olsen, Chad Simon, Jason Smith, Nate Stephens, Stanley Veliotis,
and participants at the 2008 BYU Accounting Research Symposium for helpful comments and
suggestions.

Electronic copy available at: http://ssrn.com/abstract=1337755


ABSTRACT:
This paper makes two novel contributions to ranking accounting research programs constructed
from publication counts in top journals (AOS, Auditing, BRIA, CAR, JAE, JAR, JATA, JIS,
JMAR, RAST, and TAR). In contrast to previous studies, we recognize the mobility of
intellectual assets tied to the human capital of accounting researchers and therefore base our
rankings on the researchers' current affiliations, rather than their affiliations at the time of
publication. Also, we categorize each article written by topical area (auditing, financial,
managerial, accounting information systems, tax, and other) and by methodology (analytical,
archival, experimental, and other) and provide separate accounting program rankings by topical
area and by methodology. These two innovations provide a rich, centralized information resource
for decision-makers, both institutional and individual, in choosing how to allocate time,
resources, and expertise.

Key Words: Accounting Research Rankings, Accounting Research Methodology, Accounting
Research Topical Areas
JEL Descriptors: M4, M40, M41, M42, M49
Data Availability: Requests for data may be made to the authors.


INTRODUCTION
Academic accountants make some of their most important career decisions in
an environment that is distinctly lacking in centralized, high-quality, quantitative information.
Those deciding where to pursue a Ph.D. or seek employment have traditionally relied on
informal inquiries about the reputations of institutions and their faculties combined with coarse
measures of quality and surveys of perceptions. Given that most academic accountants specialize
in a particular sub-discipline of accounting, decisions based on overarching reputations, singular
rankings, and broad surveys suffer from a lack of specialized, granular information that could
better inform decision-makers. We contribute to knowledge in the field of accounting academics
by providing granular, quantitative information to these decision-makers.
We propose a unique design for measuring the intellectual assets held by accounting
research programs that will enhance the current body of ranking literature in two ways. First, we
assume that the intellectual assets of a researcher stay with that researcher when moving from
institution to institution.1 Second, recognizing specialty areas (both topical and methodological)
within the accounting discipline (a similar concept currently exists in other business school
disciplines, e.g., management), we report rankings by methodology and topical area that allow
institutions to be recognized for their expertise in specialty areas.2 Since the most influential
accounting journals publish a disproportionately high number of articles in the financial specialty
area (Bonner et al. 2006), parties interested in expertise in other specialty areas may make
suboptimal decisions by relying on ratings heavily influenced by the financial accounting specialty.

1 Crediting a publication to the author's current institution allows a university's research ranking to change based on
the addition or loss of a distinguished researcher. Creating rankings in this way measures the impact of the
intellectual assets (contributions) of the individual researcher by tying those assets to the present institution rather
than ascribing intellectual assets to an entity incapable of reasoning: a university.
2 Previous research (cf. Bazley et al. 1975; Andrews and McKenzie 1978; Windall 1981; Zivney and Thomas 1985;
Hasselback and Reinstein 1995; Fogarty 1995; Trieschmann et al. 2000; Brown 2003; Chan et al. 2007) has ranked
research programs, but not by topic (accounting information systems (AIS)/auditing/financial/managerial/tax) or
methodology (archival/analytical/experimental). We note that Brown and Laksmana (2004) break rankings of Ph.D.
programs into two categories: financial and non-financial.
The creation of accounting research program rankings by specialty area allows universities to
compare and contrast their programs with programs that are focusing on the same specialties,
individual academics to recognize pockets of specialty where they might choose to network, and
future academics to evaluate the depth and breadth of research coming from universities where
they might pursue their terminal degree.
These rankings significantly increase the amount of available information that could be
used by multiple decision-makers, including:
Prospective Ph.D. students. Each student or accounting professional who decides to
pursue a Ph.D. in accounting immediately faces the challenge of finding a university with
programs, faculty, and expectations that match the applicant's needs, wants, and career goals.
This decision is often multifaceted and complex. To help in this process, we provide program
ranking information broken down by topical area and research methodology which are two
crucial factors for applicants to consider as they evaluate which programs are best able to support
their research interests.
Ph.D. graduates. Rankings decomposed by discipline and based on current location of
human capital will also benefit Ph.D. candidates as they graduate and enter the job market. They
will be able to use these rankings to target positions at universities that fit their career goals. This
study identifies top programs in each specialty area, which is especially valuable if these
programs do not register highly in past ranking studies.
Research institutions. Accounting department heads, business school administrators,
and university leadership may find these results useful in establishing legitimacy, both internal
and external. This study recognizes those schools that have been and are making a concerted

effort to specialize and improve their research reputation. A school that has never placed highly
in general rankings may be able to use these rankings to demonstrate credibility in certain
specialty areas or methodologies. This credibility can help justify internal funding for materials,
technology, or additional research and help attract external funding.
Professional organizations. In July 2008, the AICPA announced the introduction of a
$15 million fund designed to send experienced practitioners back to school to earn Ph.D.s and
help fill the shortage of audit and tax faculty (AICPA 2008). The fund is made up of donations
from many of the largest accounting firms and many state accounting associations and targets
audit and tax training in Ph.D. programs. The rankings provided in
this paper will highlight programs that specialize in research related to audit and tax. In this way,
the effectiveness of the fund could be enhanced by allowing these individuals to target programs
where they will get the best audit- and tax-specific training and by helping fund administrators to
know where to direct additional funds.
SAMPLE DESCRIPTION
To create our rankings, we index all peer-reviewed articles in Accounting, Organizations
and Society (AOS); Auditing: A Journal of Practice & Theory (Auditing); Behavioral Research
in Accounting (BRIA); Contemporary Accounting Research (CAR); Journal of Accounting &
Economics (JAE); Journal of Accounting Information Systems (JIS); Journal of Accounting
Research (JAR); Journal of Management Accounting Research (JMAR); Journal of the
American Taxation Association (JATA); Review of Accounting Studies (RAST); and The
Accounting Review (TAR).3 We chose these journals because previous research has shown that

3 We do not include articles that were invited by the editor or conference discussant papers (such as JAR or CAR
conference discussion papers) since these articles are not required to go through the peer-review process. Also, we
exclude articles written directly to a professional audience and educational cases.

six of these journals (AOS, CAR, JAE, JAR, RAST, and TAR) are considered the highest rated
accounting journals (cf. Glover et al. 2006; Bonner et al. 2006; Lowensohn and Samelson 2006).
Studies have also provided evidence that these journals may not provide representative
coverage of accounting methodologies and topical areas (Bonner et al. 2006). Using the results
of a survey of 517 academics from various American Accounting Association (AAA) sections
(Lowensohn and Samelson 2006), we selected the journals perceived to be the best by
methodology (behavioral) and topical area (tax, managerial, and AIS).4 By this process we add
BRIA, JATA, JMAR, and JIS. We add Auditing to this list as it is regularly considered to be the
top journal for publishing audit research aside from those already mentioned. Including these
additional journals should provide greater coverage of topical areas and methodologies that are
not adequately represented in the traditional top six journals.
Our rankings do not explicitly recognize top-tier contributions of researchers in
supporting disciplines (e.g., finance, economics, psychology). We made this choice because
of our interest in identifying top accounting research programs and because of the time-intensive
nature of creating these rankings. While contributions in the top-tier of supporting journals are
important and contribute to the academic prestige of the researcher, we believe they are less
relevant to identifying accounting expertise than an evaluation of research published in
accounting journals.5

4 Although the survey included the topical areas of government and non-profit, we do not employ these topics in our
rankings, so we did not include them in the journal selection process.
5 Glover et al. (2006) examine the publication records of faculty promoted at the top 75 research schools. In
unreported analyses, the correlation between publishing in the top 3 accounting journals (TAR, JAR, and JAE) and
publishing in other top business journals is 0.86 when considered at the school portfolio level. This suggests that
while our results will not provide a complete picture of the articles published by accounting scholars, they are
unlikely to be biased by excluding articles published in other top business journals.

METHODOLOGY
To create our rankings, we index all articles published in the aforementioned journals
between 1990 and 2009 and categorize them based on topic and methodology. Because of the
time-intensive nature of creating these rankings, we limit our analysis to a 20-year
window, which effectively covers three tenure cycles. We note that authors who were prolific
researchers before 1990 but have not continued to actively research since 1990 likely have fewer
current intellectual research assets to share with colleagues.6
We categorize each article by methodological category: analytical, archival,
experimental, or other; however, our methodological categories are not mutually exclusive.7 For
example, Hodder et al. (2008) employ an experiment as well as archival tests in their paper. We
categorize this article as both archival and experimental for purposes of our rankings. We define
our methodological classifications as follows:
Analytical: studies whose analyses and conclusions are based on the act of formally
modeling theories or substantiating ideas in mathematical terms. These studies use analytic
devices to predict, explain, or give substance to theory.
Archival: studies whose analyses and conclusions are based on objective data collected
from repositories. Also included are studies in which the researchers, or another third party,
collected the research data and in which the data has objective amounts such as net income,
sales, fees, etc. (i.e., the researcher creates an objective repository of data).
Experimental: studies whose analyses and conclusions are based on data the researcher
gathered by administering treatments to subjects. Usually these studies employ random
assignment; however, if the researcher selected different populations in an attempt to
manipulate a variable (e.g., participants of different experience levels were selected for
participation), we also consider these experimental in nature.

6 We explore the importance of currency in more depth later in the paper.
7 Although the decision to allow for multiple methodological and/or topical categorizations per article causes some
articles to be counted multiple times, we argue that this more accurately captures the authors' potential to contribute
in multiple areas.
Other: studies that did not fit into one of the other methodological categories. The
methodologies in these studies vary significantly and include such things as surveys, case
studies, field studies, simulations, persuasive arguments, etc.
Similar to our categorization by methodology, our categorization by topical area allows
for multiple categories per article. If an article sheds light on multiple topical areas, it is
categorized as providing a contribution to each area (e.g., Menon and Williams [2008] examine
management turnover [managerial] following auditor resignations [audit]). In categorizing
articles by topical area, we employ the following definitions:
AIS: studies which address issues related to the systems and the users of systems that
collect, store, and generate accounting information. Users are defined broadly to include those
involved in collection, storage, or use of accounting information or even the implementation of
the system. These systems may be electronic or not. Research streams include, but are not
limited to, design science, ontological investigations, expert systems, decision aids, support
systems, processing assurance, security, controls, system usability, and system performance.
Auditing: studies in which the topical content involves an audit topic. These studies vary
widely and include, but are not limited to, the study of the audit environment (external and
internal), auditor decision making, auditor independence, the effects of auditing on the financial
reporting process, and auditor fees.
Financial: studies that address the topical content of financial accounting, financial
markets, and decision making based on financial accounting information.

Managerial: studies that examine issues regarding budgeting, compensation, incentives,
decision-making, and the allocation of resources within an enterprise.
Tax: studies that examine issues related to taxpayer decision-making, tax allocations, tax
computations, structuring of accounting transactions to meet tax goals, tax incentives, or market
reactions to tax disclosures.
Other: studies that do not fit into one of the other topical areas. The topical areas in these
studies vary significantly and include such things as education, methodologies, law, psychology,
history, the accounting profession, work environment, etc.
We use data previously categorized by Oler et al. (2008) as a starting point for
categorizing articles appearing in AOS, CAR, JAE, JAR, RAST, and TAR. For these data,
one of the authors on this project reviewed each article categorization made by the Oler et al.
(2008) team and made changes as deemed appropriate to fit our categorization scheme. For the
other journals, two of the authors on this project categorized each article. All discrepancies in
ratings were resolved through discussion.
After categorizing each article, we identified each author's current school affiliation by
first searching in the 2008 Hasselback directory (Hasselback 2008). We then visited the website
of the university listed in the Hasselback directory and verified that the professor was listed as
being employed at the institution. If the author was not listed in the Hasselback directory, or if
we could not find them on the website of the institution listed by the Hasselback directory, we
searched the internet for the author and recorded the author's current university affiliation.8 If
professors were listed as holding joint appointments or were listed as visiting scholars, we
credited the home school for those publications. We created initial rankings after performing
this step; subsequently, for all schools that were listed in the top 50 of any of these initial
rankings, we revisited the school's faculty website, verified that the authors listed belong to that
school, and searched for any professors listed on the school website that had not been
categorized in our database. If we could not find a professor's affiliation after performing all
these steps, we considered that professor to be no longer employed in academia and, therefore,
we gave no credit to any institution for that individual's research.9

8 To conduct our internet search, we searched for the researcher's name or their name and special key words (e.g.,
accounting, university, etc.). If we found initial evidence of a professor at a university (e.g., a paper listed on SSRN),
we then visited that university website to verify the faculty member was employed at the school.
To create our rankings, we gave each author full credit for each article published in these
journals (i.e., for coauthored papers, all institutions of the authors received credit for the
publication and if multiple authors were from the same institution, the institution received credit
for each author).10 We then summed the number of total publications for each school by
methodology and by topical area. Finally, we ranked schools by the total productivity of the
faculty currently at that school.11
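The full-credit counting scheme described above can be sketched in a few lines. This is an illustrative sketch only: the school names, article IDs, and category tags below are hypothetical and are not drawn from our sample.

```python
from collections import defaultdict

# Hypothetical author-level records: (article_id, author's current school,
# list of category tags). All values here are made up for illustration.
author_records = [
    ("A1", "School X", ["archival"]),
    ("A1", "School X", ["archival"]),                  # second coauthor, same school
    ("A1", "School Y", ["archival"]),                  # third coauthor, different school
    ("A2", "School Y", ["archival", "experimental"]),  # article in two categories
]

# Full credit: every author of an article earns one count for his or her
# current institution, so two coauthors at the same school yield two counts,
# and an article tagged with two categories is counted once in each category.
counts = defaultdict(int)  # (school, category) -> publication count
for article_id, school, categories in author_records:
    for category in categories:
        counts[(school, category)] += 1

def rank(category):
    """Order schools by total count within one category, highest first."""
    totals = defaultdict(int)
    for (school, cat), n in counts.items():
        if cat == category:
            totals[school] += n
    return sorted(totals.items(), key=lambda kv: -kv[1])

print(rank("archival"))      # School X and School Y each hold 2 archival credits
print(rank("experimental"))  # only School Y holds an experimental credit
```

Note how article A1 contributes two counts to School X because two of its coauthors currently sit there, mirroring the full-credit rule stated in the text.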
We take four additional steps to maximize the usefulness of this data. First, for all
rankings we provide the number of distinct professors who contribute to each ranking. Given our
methodology, schools that have larger faculties are more likely to be ranked higher because they
employ more individuals who have the possibility of publishing articles.12

9 We gave credit to a school for authors outside of accounting who publish in accounting journals if we could locate
their current school affiliation as described in the text.
10 We chose to give each author full credit because we view each author as likely to have increased their intellectual
assets by working on the project. We also did not want to introduce noise or bias by attempting to create a subjective
weighting scheme of the value of different journal articles. If high-quality outcome data become available in the
future from which reliable and theoretically justified weightings could be created, then future researchers should
reexamine these rankings using those weights. To our knowledge, however, no high-quality weighting based on
empirical data currently exists.
11 If we discovered that a professor had retired, was emeritus, or had died, we did not include them in the rankings.
12 We do not scale our rankings by faculty size because our objective is ranking the intellectual assets available at
institutions rather than ranking the average productivity of faculty. Further, scaling by faculty size is
problematic due to the difficulty of determining faculty size, especially in specialty areas. Possible
ways to scale the data by size include scaling by the size of the department, the number of authors who published the
articles, or the number of professors who research in an area (or use that methodology). We noticed as we
categorized articles that many schools do not have a separate accounting department or combine the accounting
department with finance, information systems, or the entire business school. In addition, many accounting
academics work in administrative positions, making it difficult to choose whether to include them in the
department. These problems make scaling by the number in the department problematic and subjective. Scaling by
the number of authors who published articles is problematic in that one person could publish a high number of
articles and thereby cause that school to score very highly despite being the only active researcher there. We do not
believe this type of ranking would be of greatest usefulness to the accounting academy. Finally, scaling by the
number of professors who research in an area is problematic because many researchers work in multiple
methodologies and there is no clear way to count the number of professors working in a particular area.

Second, we provide three types of consolidated rankings: by topical area, by
methodology, and by both topical area and methodology. This consolidation allows for a
discussion of which institutions are well-versed or well-rounded in all specialty areas. The
consolidated rankings are created by averaging the topical area rankings, the methodology
rankings, or both. This is in contrast to consolidated rankings based on total publication counts,
which introduce weighting problems because some areas are disproportionately represented in
journals (Bonner et al. 2006). These rankings recognize schools that are able to do well in all or
virtually all methodological and topical areas and are likely of special interest to prospective
Ph.D. students who may not know exactly what they will want to research and would like to
attend a school that supports broad topical areas and/or broad methodologies.

Third, we report rankings based on three different time windows: the full time window
(20 years), the previous 12 years, and the previous 6 years. Providing rankings over shorter
windows allows users to infer trends. For example, if a school is very highly ranked in
the full time window but not in the previous 6-year window, it may suggest that the school
employs an aging faculty who are winding down their research careers. Conversely, a school that
is ranked very highly in the 6-year window but not in the full window may have promising young
scholars who are highly productive but have not been employed long enough to
produce a large quantity of research.

Finally, we create a website with additional functionality to increase the usefulness of the
rankings (see the Appendix for additional discussion of the website).
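The consolidated rankings, which average per-category ranks rather than summing raw counts, can be sketched as follows. The per-category ranks and school names below are hypothetical and are not taken from our tables.

```python
# Hypothetical per-category ranks (1 = best) for three schools; all values
# are made up for illustration only.
topical_ranks = {
    "School X": {"audit": 1, "financial": 5, "tax": 3},
    "School Y": {"audit": 4, "financial": 1, "tax": 2},
    "School Z": {"audit": 2, "financial": 9, "tax": 1},
}

def consolidate(ranks_by_school):
    """Average each school's rank across categories and sort, best first.

    Averaging ranks rather than raw publication counts keeps a heavily
    published area such as financial from dominating the consolidated list.
    """
    averages = {
        school: sum(ranks.values()) / len(ranks)
        for school, ranks in ranks_by_school.items()
    }
    return sorted(averages.items(), key=lambda kv: kv[1])

print(consolidate(topical_ranks))
# School Y averages 2.33, School X 3.00, School Z 4.00
```

Under this scheme School Y leads the consolidated list despite being fourth in audit, because breadth across all categories, not dominance in one, drives the average.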
RESULTS
Table 1 presents descriptive statistics of the sample. Panel A shows the percentage of
articles by topical area for each journal. It is apparent that journals have very different
tastes in the topics of the articles they publish. Of the traditional big 3 accounting journals (TAR,
JAR, and JAE), TAR publishes the broadest topical scope of articles. AOS and BRIA are the only
non-specialty-topic journals that publish a higher percentage of articles in an area other than
financial (AOS publishes more managerial than any other topical area and BRIA publishes more
other research and auditing than any other topical area). Also of note is the almost complete
lack of publication of AIS research in any journal other than JIS. We note that Table 1 does not
consider the quantity of different types of articles submitted to the journals; therefore, we cannot
conclude from this table that there is an editorial or reviewer bias against certain topical areas or
methodologies.13
Panel B shows the percentage of articles by methodology for each different journal. With
the exception of AOS, BRIA, JIS, and JMAR, archival research is the dominant methodology
published. BRIA publishes a higher percentage of experimental research than other
methodologies, and JIS, JMAR, and AOS publish a higher percentage of "other" methodologies
than analytical, archival, or experimental. Although these descriptive statistics provide evidence
that all research methodologies can be published somewhere, they also show that specific journals

13 As an example of the importance of considering the rate of submission before determining bias, the 2007
Contemporary Accounting Research Editor's Report reveals that only 5 of 258 submissions to the journal were in
the area of tax. Thus, even if CAR published all of these articles, it would still show a low percentage of published
tax studies in a presentation similar to Table 1. The results in Table 1 therefore do not necessarily suggest
editor/reviewer bias but may be explained by unknown submission rates to the journals.


may have defined methodological and/or topical area tastes in terms of research they have
published in the past.
Panel C of Table 1 shows the percentage of articles by methodology for each topical area.
Managerial research has the widest distribution of methodologies, as each methodology is used
at least 16 percent of the time in managerial publications. Financial has the narrowest
distribution, as archival is used 76 percent of the time and the next most used methodology
is analytical, used 12 percent of the time. Audit research uses a relatively equal blend of archival,
experimental, and other methodologies but lags behind in employing the analytical methodology.
Tax is reasonably diverse in terms of methodology, as the least-used methodology, "other,"
appears in 10 percent of publications. Finally, AIS relies primarily on experimental and "other"
methodologies
to address research questions.
(Insert Table 1 about here)
Table 2 presents the rankings of universities broken down by topical area. We list the top
40 schools for each topical area and present three rankings: over the previous 6 years, the
previous 12 years, and the full time window (rankings are sorted by the 6-year column). We list
the topical areas alphabetically.
There are several interesting things to note from the rankings other than just the rank
ordering of the universities. The trend of universities' rankings from 6 years to 20 years is
valuable information. For example, Florida International in the audit rankings is
ranked first over the 6-year window but eighth over the 20-year window. This suggests that
Florida International has been very active in the recent past and is the top producer of audit
research in the last 6 years.


Analyzing publication trends also reveals interesting findings when looking at an
entire topical area's rankings. For example, the top 10 schools in financial over the 6-year
window were all in the top 25 over the full window. In managerial and auditing, five and
seven schools, respectively, in the top 10 during the 6-year window were not in the top 25 over
the full window. This suggests that rankings change significantly more in some topical areas
than in others.
These rankings are also useful to non-US schools. Note in the managerial rankings that
11 of the top 40 schools in the 6-year window are international schools. In the audit rankings, 11
of the top 40 schools are international schools as well. These rankings help to give credibility to
these institutions in terms of their ability to produce top quality research in given topical areas.
Also of interest in these rankings is the number of faculty whose published articles have
contributed to a given ranking. For example, in the managerial rankings, Stanford is rated first
over the last 19 years even though only 5 different authors published managerial articles. The
second ranked university, Michigan State, has twice as many authors. This information could be
used by potential Ph.D. students (current doctoral students) in targeting which school to attend
(work for). Whereas Stanford appears to have fewer researchers publishing managerial research,
these researchers are publishing a very high volume of articles. Michigan State has more
researchers, but they do not appear to be producing as quickly as those at Stanford.
(Insert Table 2 about here)
Table 3 is very similar to Table 2 except that it presents rankings by research
methodology rather than by topical area. Users may interpret Table 3 in the same fashion as
Table 2.


(Insert Table 3 about here)


Table 4 presents four different rankings that provide information about which schools
provide the greatest breadth of research expertise. In Table 4 we provide the results of averaging
the topical area rankings, averaging the methodology rankings, or averaging the topical area and
methodology rankings. Schools that focus on one or two topical areas or on a single
methodology will not rank as highly in these rankings. In addition, we provide rankings based on
raw total counts instead of breaking rankings down by topical area and methodology. This
ranking is comparable to most previous accounting program rankings.
As would be expected, large schools fare particularly well in these rankings. These
schools likely have great breadth because their size allows professors to specialize their teaching
and thus their research in areas other than financial accounting. As with Table 2 and Table 3, we
provide rankings over different time horizons so users can make informed decisions using these
rankings.
(Insert Table 4 about here)
CONCLUSIONS
This study ranks all accounting research programs by considering publication counts in
top accounting journals. These rankings differ from most prior rankings in two important ways.
First, we provide separate research rankings by topical area (AIS, auditing, financial, managerial,
and tax) and by methodology (analytical, archival, and experimental). Second, we give
institutions credit for all research published by professors currently employed at the institution
rather than giving institutions credit for publications of faculty who published at the university
but no longer work there. These rankings should be useful to decision-makers in multiple


settings (e.g., prospective Ph.D. students, doctoral students, faculty, accounting departments,
business schools, and universities).
This study is not without limitations. We highlight the most important limitations and
caution decision-makers to consider how these limitations may impact their decision-making
setting. First, using counts to rank accounting research programs treats all articles as making
equal contributions to the literature. Counts do not take into consideration level of impact of a
particular article. Thus, an institution that produces a few highly innovative,
paradigm-altering articles may not rank as highly in these rankings as an institution that focuses
on producing a large quantity of research publications. Whether one of these strategies is better
in terms of producing accounting knowledge is debatable, and this research does not provide
evidence for either side of this debate.
Second, we consider a basket of accounting journals that likely vary in terms of perceived
and actual quality. We do not attempt to weight articles published in different journals as being
worth more or less than other articles due to the subjective nature of determining weightings. We
carefully selected journals, choosing only those of perceived high quality (Lowensohn and
Samelson 2006; Herron and Hall 2004; Chan et al. 2009) while balancing this with the
publication patterns that previous researchers have noted some journals exhibit (Bonner et al.
2006).
Third, we do not explicitly take into account faculty size in determining our rankings; we
also employ a methodology that does not explicitly consider an institution's ability to influence
researchers' ability to publish through access to such things as more and/or better databases,
more talented research assistants, reduced teaching loads, or other similar
characteristics that likely improve research productivity. However, our methodology does

14

indirectly capture this ability as the most prolific researchers are likely aware of these
institutional advantages and more likely to work at schools that offer these advantages.
Finally, because we recognize the mobility of human capital, an excellent research school
recently raided of talent may receive a low ranking. A school that has established the culture and
financial means of a top-tier research program will likely be able to attract high-quality
researchers even after being raided. Thus, some schools may appear artificially low in our
rankings because, at the time of this study, they had not yet rebuilt their faculty.
Despite these limitations, we view this study as providing an important incremental
contribution to the prior research-ranking literature. In addition, we believe this study will be
highly useful to the academy and to the professional community of accountants.


REFERENCES
Andrews, W. T., and P. B. McKenzie. 1978. Leading accounting departments revisited. The
Accounting Review (January): 135-138.
Bazley, J. D., and L. A. Nikolai. 1975. A comparison of published accounting research and
qualities of accounting faculty and doctoral programs. The Accounting Review (July):
605-610.
Bonner, S. E., J. W. Hesford, W. A. Van der Stede, and S. M. Young. 2006. The most influential
journals in academic accounting. Accounting, Organizations and Society 31: 663-685.
Brown, L. D. 2003. Ranking journals using social science research network downloads. Review
of Quantitative Finance and Accounting 20 (3): 291-307.
_______, and J. C. Gardner. 1985. Applying citation analysis to evaluate the research
contributions of accounting faculty and doctoral programs. The Accounting Review 60
(2): 262-277.
_______, and I. Laksmana. 2004. Ranking accounting Ph.D. programs and faculties using social
science research network downloads. Review of Quantitative Finance and Accounting 22
(3): 249-266.
Chan, K. C., C. R. Chen, and L. T. W. Cheng. 2005. Ranking research productivity in accounting
for Asia-Pacific universities. Review of Quantitative Finance and Accounting 24: 47-64.
_______, _______, and _______. 2007. Global ranking of accounting programmes and the elite
effect in accounting research. Accounting & Finance 47 (2): 187-220.
_______, _______, and _______. 2009. Ranking accounting journals using dissertation citation
analysis: A research note. Accounting, Organizations and Society (in press).
Contemporary Accounting Research. 2007. 2007 Editor's Report. Available online at
http://www.caaa.ca/CAR/EditorRpt/index.html.
Ellison, G. 2002. Evolving standards for academic publishing: A q-r theory. Journal of Political
Economy 110 (5): 1053-1079.
Fogarty, T. 1995. A ranking to end all rankings: A meta-analysis and critique of studies ranking
academic accounting departments. Accounting Perspectives 1: 1-22.
Glover, S. M., D. F. Prawitt, and D. A. Wood. 2006. Publication records of faculty promoted at
the top 75 accounting research programs. Issues in Accounting Education 21 (3): 195-218.


Hasselback, J. R. 2008. Accounting Faculty Directory 2008-2009. Upper Saddle River, NJ:
Pearson Prentice Hall.
_______, and A. Reinstein. 1995. A proposal for measuring the scholarly productivity of
accounting faculty. Issues in Accounting Education (Fall): 269-306.
Herron, T. L., and T. W. Hall. 2004. Faculty perceptions of journals: Quality and publishing
feasibility. Journal of Accounting Education 22 (3): 175-210.
Hodder, L., P. E. Hopkins, and D. A. Wood. 2008. The effects of financial statement and
informational complexity on analysts' cash flow forecasts. The Accounting Review 83 (4):
915-956.
Lowensohn, S., and D. P. Samelson. 2006. An examination of faculty perceptions of academic
journal quality within five specialized areas of accounting research. Issues in Accounting
Education 21 (3): 219-239.
Menon, K., and D. D. Williams. 2008. Management turnover following auditor resignations.
Contemporary Accounting Research 25 (2): 567-604.
Oler, D. K., M. J. Oler, and C. J. Skousen. 2009. Characterizing accounting research. Working
Paper, Indiana University, Virginia Polytechnic Institute & State University, Utah State
University.
Reinstein, A., and J. R. Hasselback. 1997. A literature review of articles assessing the
productivity of accounting faculty. Journal of Accounting Education 15 (3): 425-455.
Swanson, E. 2004. Publishing in the majors: A comparison of accounting, finance, management,
and marketing. Contemporary Accounting Research 21 (1): 223-255.
_______, C. J. Wolfe, and A. Zardkoohi. 2007. Concentration in publishing at top-tier business
journals: Evidence and potential explanations. Contemporary Accounting Research 24
(4): 1255-1289.
Trieschmann, J. S., A. R. Dennis, G. B. Northcraft, and A. W. Niemi. 2000. Serving multiple
constituencies in the business school: MBA program versus research performance.
Academy of Management Journal
Windall, F. W. 1981. Publishing for a varied public: An empirical study. The Accounting Review
(July): 653-658.
Zivney, T. L., and A. G. Thomas. 1985. A comprehensive examination of accounting faculty
publishing. Issues in Accounting Education (Spring): 1-25.


TABLE 1
Descriptive Statistics

Panel A: Percentage of articles by topical area published in different journals

Journal     AIS   Audit  Financial  Managerial  Tax   Other
AOS          1%    13%      16%        39%       2%    33%
Auditing     3%    97%      19%         1%       0%    13%
BRIA         1%    35%      13%        19%       5%    37%
CAR          1%    28%      54%        15%       8%    12%
JAE          0%     6%      78%        16%       8%     5%
JAR          1%    18%      70%        13%       7%     3%
JIS        100%    18%      10%         5%       0%     0%
JMAR         0%     1%       1%        98%       1%     3%
JATA         0%     1%      28%         4%      96%     7%
RAST         0%     6%      79%        16%       4%     0%
TAR          1%    21%      58%        18%      10%     6%

Panel B: Percentage of articles by methodology published in different journals

Journal   Analytical  Archival  Experimental  Other
AOS            1%        10%        13%        77%
Auditing       3%        38%        33%        30%
BRIA           2%         1%        59%        38%
CAR           20%        52%        18%        12%
JAE           13%        83%         1%         3%
JAR           21%        64%        13%         3%
JIS            3%        12%        39%        47%
JMAR          19%        16%        16%        51%
JATA          14%        53%        21%        18%
RAST          36%        62%         3%         0%
TAR           14%        63%        21%         4%

Panel C: Percentage of articles by methodology per topical area

Topical Area  Analytical  Archival  Experimental  Other
AIS                4%        12%        36%        48%
Audit              8%        32%        38%        23%
Financial         12%        76%         7%         5%
Managerial        22%        22%        16%        40%
Tax               13%        58%        19%        10%
Other              4%        22%        16%        59%

Panel A and Panel B percentages do not sum to 100 percent because the topical area and
methodology categorizations are not mutually exclusive (e.g., an article can be both financial
and audit, or can use both experimental and archival methodologies).


TABLE 2
Rankings of Accounting Institutions by Topical Area

AIS
University            6 Yrs    12 Yrs   All
Rutgers               1 #3     2 #4     2 #5
Florida St            1 #2     9 #2     9 #3
No Carol St           3 #3     10 #3    19 #3
Cen Fla               4 #2     4 #2     4 #3
So Illinois           4 #3     4 #4     5 #6
Tx Tech               4 #3     6 #4     8 #4
Portland St           4 #3     10 #3    9 #4
Cal St Long Bch       4 #1     10 #2    19 #2
No Arizona            4 #2     17 #2    14 #2
Auburn                4 #2     17 #2    19 #3
Tulsa                 4 #3     17 #3    28 #3
Ghent U               4 #3     17 #3    28 #3
Bentley               13 #2    1 #4     1 #4
Ariz St               13 #1    6 #3     3 #5
Michigan St           13 #2    10 #2    9 #3
Texas A&M             13 #1    10 #1    14 #2
Georgia St            13 #2    10 #3    14 #3
Arkansas              13 #1    10 #1    19 #1
Okla St               13 #1    26 #1    19 #2
Tennessee             13 #2    26 #2    28 #3
Houston-Cl L          13 #2    26 #2    40 #2
No Colo               13 #2    26 #2    40 #2
Emory                 13 #1    26 #1    40 #1
Maastricht            13 #2    26 #2    40 #2
Akron                 13 #1    26 #1    40 #1
Kent St               13 #2    26 #2    40 #2
Hawaii-Manoa          13 #1    26 #1    40 #1
Queensland            13 #2    26 #2    40 #2
Missouri              29 #1    3 #3     5 #3
So Florida            29 #1    6 #4     7 #4
Kansas                29 #1    17 #3    19 #4
Virg Comm             29 #1    17 #2    28 #2
Delaware              29 #1    26 #1    40 #1
Temple                29 #1    26 #2    40 #2
Utah                  29 #1    47 #1    14 #1
Iowa State            29 #1    47 #1    28 #1
Denver U              29 #1    47 #1    28 #3
Brock U               29 #1    47 #1    40 #2
Cal St Northridge     29 #1    47 #1    40 #1
Fla Atlantic          29 #1    47 #1    68 #1

Audit
University                        6 Yrs    12 Yrs   All
Fla Internat                      1 #4     2 #5     8 #5
Illinois at Urb. Cham.            2 #9     5 #11    4 #13
Northeastern                      3 #6     3 #6     3 #6
Rutgers                           4 #6     11 #7    15 #9
Bentley                           5 #5     4 #6     6 #6
New So Wales                      5 #6     6 #10    9 #11
Missouri                          5 #2     7 #2     15 #2
Nanyang Tech                      8 #4     7 #6     15 #6
Texas A&M                         9 #3     16 #4    10 #9
Kentucky                          9 #3     21 #3    15 #4
Queens                            9 #4     24 #5    39 #5
Fla Atlantic                      9 #6     36 #7    32 #9
Hong Kong Uni. of S&T             9 #2     36 #2    50 #3
Brigham Young U                   14 #4    7 #8     13 #8
Tennessee                         14 #5    7 #5     13 #5
Tx-Austin                         14 #3    13 #6    2 #10
Indiana Indianapolis              14 #2    14 #4    20 #5
HongKong PolyTechnic University   14 #4    14 #7    24 #7
Temple                            14 #3    21 #4    34 #4
So Carol                          14 #4    24 #4    39 #5
Wisconsin                         21 #3    1 #6     5 #9
Alabama                           21 #3    12 #4    24 #4
Kansas                            21 #3    16 #4    24 #4
So Calif                          21 #4    19 #6    6 #9
Florida                           21 #2    19 #4    10 #5
Alberta                           21 #3    30 #4    34 #6
Mass                              21 #3    30 #3    50 #3
Georgia                           21 #4    36 #4    29 #5
Auckland                          21 #4    50 #4    81 #4
Athens                            21 #4    59 #4    100 #4
Georgia St                        31 #4    24 #5    15 #6
Indiana Bloomington               31 #3    24 #5    34 #8
Toronto                           31 #2    30 #2    29 #5
Virginia Tech                     31 #2    30 #4    44 #5
Chinese HK U                      31 #2    30 #5    50 #5
Auburn                            31 #3    42 #3    58 #5
No Texas                          31 #3    50 #3    44 #6
Nat Taiwan U                      31 #4    50 #4    68 #5
Cornell                           31 #2    59 #2    12 #3
SUNY-Bingham                      31 #3    59 #4    100 #4

Financial
University               6 Yrs    12 Yrs   All
Chicago                  1 #18    3 #19    5 #21
U of Washington          2 #10    1 #12    1 #18
Stanford                 3 #13    2 #14    2 #15
Tx-Austin                4 #13    5 #18    5 #20
Duke                     4 #10    7 #10    10 #10
MIT                      6 #10    13 #13   15 #13
Penn                     7 #11    4 #15    3 #17
New York U               8 #11    6 #14    4 #15
Penn St                  9 #9     9 #11    11 #12
So Calif                 9 #9     12 #15   8 #19
Texas A&M                9 #9     14 #12   9 #15
Iowa                     9 #11    20 #11   24 #13
Arizona                  9 #10    20 #11   24 #13
London Bus               14 #6    29 #8    31 #8
Cornell                  15 #6    8 #9     16 #9
Indiana Bloomington      16 #10   15 #13   13 #16
No Carol                 17 #9    10 #10   7 #11
Northwestern             17 #10   10 #12   11 #12
Michigan                 17 #9    17 #11   16 #14
Toronto                  17 #8    23 #9    27 #11
CUNY-Baruch              17 #11   25 #15   21 #22
Illinois at Urb. Cham.   17 #8    31 #9    32 #13
Ohio St                  23 #9    15 #15   19 #17
Rochester                23 #7    41 #7    36 #9
Columbia                 25 #8    19 #11   14 #11
Wisconsin                25 #6    35 #9    34 #9
Utah                     27 #6    30 #8    41 #8
Missouri                 27 #8    37 #9    45 #10
UCLA                     29 #5    17 #7    18 #9
Michigan St              29 #7    22 #12   26 #17
Ariz St                  29 #7    32 #9    20 #15
Hong Kong Uni. of S&T    29 #7    32 #10   39 #11
Harvard                  33 #8    23 #14   29 #17
Berkeley                 33 #6    25 #6    21 #7
Yale                     33 #4    37 #5    44 #5
Lancaster                33 #8    47 #8    60 #8
Miami                    33 #5    48 #5    64 #5
Minnesota                38 #6    32 #8    32 #9
Houston                  38 #7    37 #9    34 #10
Georgia                  40 #5    28 #9    28 #10

Managerial
University                   6 Yrs    12 Yrs   All
Stanford                     1 #5     2 #5     1 #5
Michigan St                  2 #6     1 #9     2 #10
London SchEcon               3 #5     5 #8     7 #10
Ohio St                      4 #4     5 #7     4 #10
Tilburg U                    4 #5     13 #6    29 #6
UCLA                         4 #5     17 #5    27 #6
Indiana Bloomington          7 #5     4 #6     7 #6
Berkeley                     7 #4     10 #5    10 #7
Pittsburgh                   7 #4     13 #6    17 #6
Tx-Austin                    7 #6     20 #7    37 #7
University of Navarra IESE   7 #2     20 #2    40 #2
Michigan                     12 #4    5 #6     11 #6
So Carol                     13 #5    29 #6    46 #6
So Calif                     14 #4    8 #8     4 #8
Melbourne                    14 #3    10 #4    7 #7
Rice                         14 #1    12 #2    21 #2
Temple                       14 #2    13 #3    6 #4
Penn St                      14 #4    17 #7    21 #9
Miami                        14 #4    17 #4    29 #5
Monash U                     14 #1    20 #2    14 #3
Illinois at Urb. Cham.       14 #4    26 #6    37 #8
Emory                        14 #2    26 #3    40 #4
Duke                         14 #2    29 #4    29 #5
Georgia St                   14 #2    29 #4    46 #5
Car Mellon                   14 #3    37 #3    20 #7
Yale                         14 #1    37 #3    40 #4
Maastricht                   14 #2    37 #2    55 #3
Leuven                       14 #3    37 #3    63 #3
Louisiana Tech               14 #3    51 #3    78 #3
MIT                          14 #3    51 #3    78 #3
Twente                       14 #2    62 #2    95 #2
Miss St                      14 #2    62 #2    95 #2
Penn                         33 #2    3 #7     3 #9
Columbia                     33 #1    8 #4     14 #4
Manchester                   33 #2    20 #7    21 #9
Iowa                         33 #2    29 #3    12 #5
Queens                       33 #2    29 #3    21 #6
Utah                         33 #3    29 #4    46 #4
Chicago                      33 #3    37 #5    37 #5
Warwick                      33 #1    37 #3    40 #3

Tax
University            6 Yrs    12 Yrs   All
Arizona               1 #3     1 #3     2 #3
Dartmouth             2 #2     5 #2     5 #2
Chicago               3 #3     6 #4     6 #5
Iowa                  3 #3     8 #3     16 #4
Texas A&M             5 #2     2 #8     4 #9
Tx-Austin             5 #2     4 #3     2 #5
Oregon                5 #2     10 #4    11 #4
Georgia               8 #2     11 #3    16 #3
Geo Mason             8 #2     11 #4    24 #4
So Carol              8 #2     19 #2    24 #4
U of Washington       8 #1     19 #1    24 #2
Michigan              8 #1     19 #1    34 #2
Missouri              8 #4     34 #4    53 #4
No Carol              14 #2    2 #5     1 #6
Brigham Young U       14 #3    7 #6     9 #6
Conn                  14 #3    8 #3     8 #3
Columbia              14 #2    11 #7    21 #7
Virginia Tech         14 #3    14 #3    14 #3
Tx Tech               14 #3    15 #4    16 #5
Lingnan U             14 #3    15 #3    29 #3
Notre Dame            14 #2    19 #2    29 #3
Alabama               14 #1    19 #2    34 #3
Indiana Bloomington   14 #2    28 #4    14 #8
Virg-Grad             14 #2    28 #2    46 #2
Idaho State           14 #1    34 #1    40 #1
Kansas                14 #3    34 #3    46 #4
Laval                 14 #3    34 #3    53 #3
Cen Fla               28 #1    15 #4    34 #4
Michigan St           28 #2    19 #4    9 #8
Houston               28 #1    19 #1    16 #3
Florida St            28 #2    28 #3    40 #4
Rochester             28 #1    34 #2    53 #2
Arkansas              28 #2    34 #4    53 #4
Cal Davis             28 #1    34 #1    53 #1
Ariz St               28 #2    44 #3    13 #7
No Texas              28 #2    44 #2    65 #2
Tilburg U             28 #1    44 #1    65 #1
Pittsburgh            28 #2    58 #2    53 #2
Wyoming               28 #2    58 #2    84 #2
Miami                 28 #1    58 #1    84 #1

Rankings are presented for the top 40 schools based on the 6-year window. The number of
authors who contributed to each school's ranking is also presented (the number after #). If there
were ties at the cutoff, the school presented is the one ranked highest in the 12-year window,
then in the all-year window, then first alphabetically. The time windows comprise all articles
published in the previous 6, 12, or 19 years.
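The tie-breaking rule described in the note above amounts to a lexicographic sort. A minimal sketch (Python, with invented school names and counts) of how the presentation order at a cutoff could be computed:

```python
# Hypothetical publication counts per school in each window: (6-yr, 12-yr, all-year).
counts = {
    "School A": (5, 9, 12),
    "School B": (5, 11, 14),
    "School C": (4, 8, 10),
}

def presentation_order(counts):
    # Sort by 6-yr count, breaking ties by 12-yr, then all-year, then name.
    return sorted(counts, key=lambda s: (-counts[s][0], -counts[s][1], -counts[s][2], s))
```

Here School A and School B tie on the 6-year count, so School B is listed first on the strength of its 12-year count.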


TABLE 3
Rankings of Accounting Institutions by Research Methodology

Analytical
University                6 Yrs    12 Yrs   All
Stanford                  1 #4     1 #5     1 #7
Ohio St                   2 #3     2 #5     4 #5
Berkeley                  2 #4     4 #4     10 #5
Dartmouth                 4 #3     5 #3     8 #3
UCLA                      4 #4     6 #5     6 #6
Yale                      4 #1     13 #2    13 #4
Car Mellon                7 #3     9 #4     11 #5
Minnesota                 7 #3     10 #4    6 #7
Indiana Indianapolis      7 #2     10 #3    15 #3
Houston                   7 #5     12 #5    8 #5
Illinois at Urb. Cham.    7 #4     13 #4    18 #5
Columbia                  12 #2    2 #5     5 #6
Penn                      12 #4    6 #4     2 #4
Northwestern              12 #2    6 #4     2 #6
Chicago                   12 #4    16 #4    27 #5
Tilburg U                 12 #2    16 #3    32 #3
British Colu              17 #4    23 #5    38 #5
Duke                      17 #3    23 #3    39 #3
Tel Aviv Un               17 #1    34 #1    46 #1
Ariz St                   20 #1    13 #3    11 #5
Purdue West Lafayette     20 #3    23 #4    27 #4
Tx-Austin                 20 #2    28 #2    32 #2
Georgetown                20 #1    37 #1    58 #1
Florida                   24 #2    16 #2    15 #2
Illinois at Chicago       24 #2    16 #4    20 #6
Penn St                   24 #2    16 #2    22 #3
Michigan                  24 #2    23 #4    14 #6
Toronto                   24 #2    28 #2    20 #6
New York U                24 #2    28 #4    32 #4
Hong Kong Uni. of S&T     24 #2    28 #3    43 #3
CUNY-Baruch               24 #2    34 #2    24 #3
Pittsburgh                24 #2    37 #2    27 #3
Alberta                   24 #1    37 #2    39 #2
Princeton                 24 #1    37 #2    58 #2
Utah                      24 #2    49 #2    46 #2
Nat Taiwan U              24 #2    49 #2    66 #2
Clark U                   24 #1    49 #1    66 #1
Fla Atlantic              24 #2    49 #2    66 #2
Aarhus Universitet        39 #1    21 #2    27 #2
Sungkyunkwan University   39 #1    21 #1    32 #1

Archival
University                        6 Yrs    12 Yrs   All
Chicago                           1 #15    3 #16    3 #19
MIT                               2 #10    15 #12   14 #12
Stanford                          3 #11    1 #12    1 #13
Penn                              3 #10    2 #14    7 #16
U of Washington                   3 #7     4 #10    4 #15
Texas A&M                         3 #12    5 #15    5 #18
Duke                              3 #10    6 #10    10 #10
Arizona                           3 #11    13 #12   13 #15
So Calif                          9 #11    8 #13    5 #17
Iowa                              10 #13   16 #13   17 #13
New York U                        11 #11   10 #15   8 #15
Penn St                           12 #9    10 #13   10 #14
London Bus                        13 #5    34 #6    38 #6
No Carol                          14 #9    6 #10    2 #12
Michigan                          14 #9    12 #13   16 #14
Tx-Austin                         14 #9    14 #12   12 #13
CUNY-Baruch                       14 #12   23 #15   19 #22
Michigan St                       18 #9    9 #16    9 #20
Fla Internat                      18 #6    20 #8    34 #8
Hong Kong Uni. of S&T             18 #7    31 #9    37 #10
Indiana Bloomington               21 #10   18 #12   22 #13
Toronto                           21 #9    22 #9    26 #9
Missouri                          21 #8    23 #10   34 #11
Rochester                         24 #7    37 #7    39 #9
Wisconsin                         25 #7    17 #12   21 #13
Utah                              26 #7    27 #9    41 #9
Temple                            27 #7    18 #10   17 #11
Harvard                           27 #10   20 #17   25 #19
Columbia                          27 #7    25 #9    22 #10
Ohio St                           27 #9    30 #11   34 #12
Northwestern                      31 #8    25 #10   28 #11
UCLA                              31 #5    27 #6    28 #8
Miami                             31 #6    37 #6    60 #6
So Methodist                      34 #7    36 #9    28 #12
Georgia                           35 #6    31 #9    20 #10
Tx-Dallas                         35 #6    31 #12   27 #12
Berkeley                          35 #4    35 #5    24 #6
HongKong PolyTechnic University   35 #6    37 #9    50 #10
Illinois at Urb. Cham.            35 #7    59 #8    50 #11
Geo Wash                          40 #7    51 #9    41 #12

Experimental
University               6 Yrs    12 Yrs   All
Tx-Austin                1 #6     3 #10    1 #12
Cornell                  1 #5     5 #5     3 #6
Indiana Bloomington      3 #6     2 #8     4 #10
Northeastern             3 #7     3 #8     5 #8
Illinois at Urb. Cham.   5 #6     6 #12    7 #13
Michigan St              5 #6     8 #6     8 #7
So Carol                 5 #5     11 #5    9 #6
Brigham Young U          8 #6     1 #11    6 #11
Alabama                  8 #4     7 #5     12 #5
Nanyang Tech             10 #4    8 #6     14 #6
Ariz St                  10 #4    11 #8    2 #10
Emory                    10 #5    13 #6    23 #6
Georgia St               13 #6    16 #6    19 #7
U of Washington          14 #3    13 #5    12 #7
Bentley                  15 #2    10 #3    9 #4
Florida St               15 #4    22 #7    30 #7
Kansas                   15 #3    22 #3    34 #3
Texas A&M                15 #2    26 #3    29 #6
Kentucky                 19 #3    16 #4    9 #6
Mass                     19 #3    18 #4    27 #4
So Illinois              19 #3    26 #4    33 #4
Georgia Tech             19 #3    31 #4    27 #4
Auburn                   19 #3    33 #3    47 #4
Cen Fla                  24 #3    15 #4    21 #7
Pittsburgh               24 #4    26 #4    15 #5
Georgia                  24 #4    36 #5    43 #5
Leuven                   24 #3    36 #3    58 #3
York U (Canada)          24 #3    36 #3    58 #3
Tilburg U                24 #2    45 #2    67 #2
New So Wales             30 #1    19 #6    16 #7
Oklahoma                 30 #2    19 #6    16 #6
Wisconsin                30 #3    19 #6    16 #7
Virginia Tech            30 #2    22 #4    25 #5
Tx Tech                  30 #3    22 #5    31 #5
Melbourne                30 #1    26 #3    36 #3
Missouri                 30 #2    26 #3    36 #3
So Florida               30 #3    36 #5    47 #6
Iowa State               30 #2    55 #3    34 #6
No Carol St              30 #3    55 #3    36 #7
Louisiana Tech           30 #1    55 #1    67 #2

Rankings are presented for the top 40 schools based on the 6-year window. The number of
authors who contributed to each school's ranking is also presented (the number after #). If there
were ties at the cutoff, the school presented is the one ranked highest in the 12-year window,
then in the all-year window, then first alphabetically. The time windows comprise all articles
published in the previous 6, 12, or 19 years.


TABLE 4
Rankings of Accounting Institutions by Raw Total Article Counts and by Averaging Rankings of Topical Areas, Research
Methodologies, or Topic and Methodology Combined

Average of Topic
University                        6 Yrs    12 Yrs   All
Texas A&M                         1 #3     1 #5     1 #6
Tx-Austin                         2 #5     4 #6     4 #7
Michigan St                       3 #4     2 #5     3 #6
Indiana Bloomington               4 #4     6 #5     8 #6
So Calif                          5 #4     3 #5     2 #6
Stanford                          6 #4     14 #4    15 #5
So Carol                          7 #3     5 #4     7 #4
Missouri                          8 #3     7 #4     9 #4
Georgia St                        9 #3     10 #3    11 #4
Iowa                              10 #3    17 #4    20 #4
Emory                             11 #3    9 #3     13 #4
Illinois at Urb. Cham.            12 #4    8 #5     6 #6
Michigan                          13 #3    13 #4    17 #4
Chicago                           13 #5    18 #6    24 #6
U of Washington                   15 #3    15 #4    16 #5
Temple                            16 #2    11 #3    10 #3
Ariz St                           17 #3    12 #4    5 #5
MIT                               18 #3    28 #3    47 #4
Utah                              19 #2    21 #3    19 #3
Miami                             20 #2    30 #2    52 #2
Fla Internat                      20 #2    39 #2    58 #2
Arizona                           22 #3    23 #4    22 #4
Cornell                           23 #2    37 #2    34 #3
Penn                              24 #3    22 #4    21 #5
Toronto                           24 #3    29 #3    26 #3
No Carol                          26 #3    24 #3    28 #4
Penn St                           27 #3    19 #4    18 #5
Wisconsin                         28 #2    16 #3    12 #4
Richmond                          29 #2    45 #2    64 #2
HongKong PolyTechnic University   30 #2    43 #3    43 #4
Laval                             30 #2    54 #2    63 #2
Ohio St                           32 #3    20 #4    14 #5
Rice                              33 #1    41 #2    54 #2
Queens                            34 #2    31 #2    37 #3
Pittsburgh                        35 #2    42 #2    39 #3
Oklahoma                          36 #1    25 #2    23 #2
Duke                              37 #3    32 #3    36 #4
Hong Kong Uni. of S&T             38 #2    52 #2    59 #3
Alabama                           39 #1    33 #2    40 #3
Melbourne                         40 #2    35 #2    32 #3

Average of Method
University                        6 Yrs    12 Yrs   All
Tx-Austin                         1 #6     1 #7     1 #8
Illinois at Urb. Cham.            2 #6     2 #7     2 #8
U of Washington                   3 #4     3 #4     2 #6
Pittsburgh                        4 #4     7 #4     12 #4
Ariz St                           5 #4     5 #5     4 #7
Texas A&M                         6 #5     14 #5    20 #6
Wisconsin                         7 #4     6 #5     6 #6
Ohio St                           8 #4     4 #5     5 #6
Michigan St                       9 #5     9 #6     9 #7
Indiana Bloomington               10 #5    10 #6    8 #7
Florida                           11 #2    10 #2    10 #3
Alberta                           11 #4    12 #4    13 #5
Arizona                           13 #4    19 #4    18 #5
Missouri                          14 #3    13 #4    14 #4
Stanford                          15 #5    28 #5    42 #6
Florida St                        16 #3    16 #4    16 #5
Cornell                           16 #3    20 #4    17 #4
Georgia                           18 #3    27 #4    39 #4
Chicago                           19 #6    17 #7    23 #7
Miami                             20 #3    32 #3    52 #3
Brigham Young U                   21 #4    24 #5    34 #6
Penn                              21 #5    32 #5    46 #6
HongKong PolyTechnic University   23 #3    38 #4    33 #4
Duke                              24 #4    14 #5    11 #5
Georgia St                        24 #3    21 #4    21 #4
Tilburg U                         26 #2    35 #2    47 #3
Nanyang Tech                      27 #3    25 #3    32 #4
Emory                             27 #4    29 #4    40 #4
Oklahoma                          27 #2    34 #3    26 #4
So Calif                          30 #4    8 #6     7 #7
Kansas                            30 #2    22 #2    22 #3
Fla Atlantic                      32 #3    55 #3    56 #3
UCLA                              33 #3    47 #3    43 #4
New York U                        33 #4    50 #5    61 #6
Penn St                           35 #4    22 #4    19 #5
Notre Dame                        36 #2    30 #3    24 #4
Berkeley                          36 #3    53 #3    60 #3
CUNY-Baruch                       38 #5    30 #5    29 #7
Michigan                          38 #4    50 #5    44 #5
Columbia                          40 #3    45 #4    57 #4

Average of Topic and Method
University                        6 Yrs    12 Yrs   All
Tx-Austin                         1 #5     1 #6     1 #7
Texas A&M                         2 #4     2 #5     4 #6
Michigan St                       3 #4     3 #6     3 #7
Indiana Bloomington               4 #4     7 #5     8 #6
Illinois at Urb. Cham.            5 #5     4 #6     6 #7
U of Washington                   6 #3     6 #4     9 #5
Stanford                          7 #4     16 #5    19 #5
Missouri                          8 #3     9 #4     11 #4
So Calif                          9 #4     5 #5     2 #6
Ariz St                           9 #3     8 #4     5 #6
Georgia St                        11 #3    12 #3    12 #4
Emory                             12 #3    13 #4    16 #4
Chicago                           13 #6    15 #6    21 #6
So Carol                          14 #3    14 #4    17 #4
Wisconsin                         15 #3    11 #4    10 #5
Arizona                           16 #4    19 #4    15 #4
Michigan                          16 #3    20 #4    22 #5
Cornell                           18 #2    32 #3    30 #3
Ohio St                           19 #3    10 #4    7 #5
Pittsburgh                        19 #2    24 #3    29 #4
Miami                             19 #2    34 #2    49 #2
Penn                              22 #4    21 #5    28 #5
Iowa                              23 #4    29 #4    25 #4
Utah                              24 #3    22 #3    18 #3
Penn St                           25 #3    17 #4    14 #5
Alberta                           26 #2    33 #3    41 #4
Toronto                           27 #3    25 #3    20 #4
HongKong PolyTechnic University   28 #3    41 #3    39 #4
MIT                               29 #3    44 #4    59 #4
Duke                              30 #3    23 #4    27 #4
Oklahoma                          31 #2    27 #2    24 #3
Temple                            32 #2    18 #3    13 #4
Nanyang Tech                      33 #2    43 #3    57 #3
Fla Internat                      34 #2    58 #2    69 #2
No Carol                          35 #3    36 #3    37 #4
Florida St                        36 #3    28 #3    32 #4
Georgia                           37 #3    49 #3    36 #4
Hong Kong Uni. of S&T             37 #2    54 #3    61 #3
UCLA                              39 #2    56 #3    56 #3
Columbia                          40 #2    47 #4    46 #4

Total
University                        6 Yrs    12 Yrs   All
Stanford                          1 #13    1 #14    1 #16
Tx-Austin                         2 #16    4 #21    3 #25
Chicago                           2 #19    6 #22    9 #26
U of Washington                   4 #10    4 #12    7 #18
So Calif                          5 #13    6 #20    2 #24
Texas A&M                         5 #12    9 #16    8 #23
Penn                              7 #13    2 #17    4 #18
Michigan St                       7 #13    3 #19    6 #27
Duke                              7 #12    11 #14   21 #15
Illinois at Urb. Cham.            7 #18    12 #21   18 #24
Indiana Bloomington               11 #14   8 #17    9 #22
Arizona                           12 #14   17 #16   20 #17
MIT                               12 #10   25 #13   36 #13
Penn St                           14 #12   10 #14   12 #16
Cornell                           14 #8    14 #11   14 #13
New York U                        16 #12   19 #15   15 #16
Iowa                              16 #13   27 #14   23 #17
Michigan                          18 #11   14 #15   21 #16
Ohio St                           19 #12   14 #17   13 #23
Missouri                          19 #10   25 #12   41 #13
Toronto                           19 #11   29 #11   29 #16
Fla Internat                      22 #7    34 #9    55 #9
CUNY-Baruch                       22 #13   35 #18   29 #23
London Bus                        22 #6    52 #9    55 #10
No Carol                          25 #9    13 #10   11 #12
Wisconsin                         25 #10   17 #15   15 #16
Columbia                          25 #10   21 #14   19 #14
Ariz St                           25 #11   23 #18   4 #25
UCLA                              25 #7    27 #8    25 #11
Northwestern                      30 #11   23 #13   17 #20
Alberta                           30 #10   35 #15   36 #16
So Carol                          30 #10   43 #11   53 #12
Hong Kong Uni. of S&T             30 #7    48 #10   67 #11
Berkeley                          34 #7    30 #7    27 #10
Utah                              34 #9    39 #12   51 #12
Brigham Young U                   36 #9    19 #17   24 #19
Bentley                           36 #7    22 #9    33 #9
Nanyang Tech                      36 #7    41 #10   67 #10
Florida St                        36 #10   45 #13   53 #16
Georgia St                        36 #9    48 #11   39 #15

Rankings are presented for the top 40 schools based on the 6-year window. The number of authors who contributed to each
school's ranking is also presented (the number after #). If there were ties at the cutoff, the school presented is the one
ranked highest in the 12-year window, then in the all-year window, then first alphabetically. The time windows comprise all
articles published in the previous 6, 12, or 19 years.
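Table 4's "average of topic" columns order schools by the mean of their topical-area ranks. A small illustration in Python, with invented school names and ranks (lower is better):

```python
# Hypothetical per-topic ranks for two schools (lower rank is better).
topic_ranks = {
    "School A": {"audit": 3, "financial": 1, "tax": 2},
    "School B": {"audit": 1, "financial": 4, "tax": 2},
}

def average_rank_order(topic_ranks):
    # Order schools by the mean of their topical-area ranks.
    means = {s: sum(r.values()) / len(r) for s, r in topic_ranks.items()}
    return sorted(means, key=lambda s: means[s])
```

School A averages 2.0 across the three topics while School B averages about 2.33, so School A is listed first even though School B leads in auditing.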


APPENDIX 1
Description of Companion Website
To enhance the usefulness of the descriptive data presented in this paper, we have
developed a companion website. The website is located at URL (to be filled in once review of
paper is finished). The website provides the following:

- A complete listing for each ranking presented in this paper (not just the top 40).
- Periodically updated rankings as new journal issues are published and professors change
  locations.
- The ability to select which journals to include in the computation of rankings.
- The ability to select the year range for the computation of rankings.
- The ability to create rankings scaled by the number of authors at each institution.
- The creation of "fantasy" rankings: users can add or remove individual professors from
  schools and examine the impact on the rankings of adding or deleting a particular
  researcher at a university.
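As a sketch of how the last two options might work (all school names, counts, and rosters here are hypothetical), per-capita scaling divides a school's article count by its faculty size, and a "fantasy" move simply re-attributes one researcher before recounting:

```python
# Hypothetical data: raw publication counts and faculty sizes per school.
counts = {"School A": 30, "School B": 24}
faculty = {"School A": 20, "School B": 10}

def scaled_counts(counts, faculty):
    # Articles per listed faculty member, as the website's scaling option might compute.
    return {s: counts[s] / faculty[s] for s in counts}

def fantasy_move(rosters, prof, src, dst):
    # Move one researcher between schools to see the effect on subsequent rankings.
    rosters = {s: set(m) for s, m in rosters.items()}
    rosters[src].discard(prof)
    rosters[dst].add(prof)
    return rosters
```

With these made-up numbers, School B overtakes School A on a per-capita basis (2.4 vs. 1.5 articles per faculty member) despite its lower raw count.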

