
International Journal of Quality & Reliability Management

Benchmarking and quality improvement: A quality benchmarking deployment approach

Hsiu-Li Chen

To cite this document: Hsiu-Li Chen, (2002), ``Benchmarking and quality improvement'', International Journal of Quality & Reliability Management, Vol. 19 Iss 6, pp. 757-773
Permanent link to this document: http://dx.doi.org/10.1108/02656710210429609

Benchmarking and quality improvement
A quality benchmarking deployment approach

Hsiu-Li Chen
Department of International Business, Ming Chuan University, Taipei, Taiwan, ROC and Chung-Hua Institution for Economic Research, Taipei, Taiwan, ROC

Received June 2001
Revised December 2001

Keywords Benchmarking, Quality improvement, Airports


Abstract As part of total quality management (TQM), benchmarking has become a competitive technique adopted by many successful companies. A benchmark is the value of a parameter used as a reference point for comparing the effectiveness of various processes, within one corporation or against another, and the information obtained is used to improve those processes. In this paper, we propose a quantitative model that links performance indicators with the benchmarking process to help a company establish competitive benchmarking. In recent years, firms have increasingly taken customer orientation into account in their strategic and operational decisions. Therefore, this study attempts to build the benchmarking from the ``voice'' of the customer. The comprehensive methodology we propose here is called the quality benchmarking deployment (QBD) technique. In this empirical study, we examined the CKS International Airport and found that the ``convenience of transport facilities connecting to the outside'', the ``interior design and layout'', and the ``information service of the airport'' should be priorities for improvement when performing benchmarking activities. Airport benchmarking could provide the CKS International Airport authority with a long-term vision and a valuable strategic planning tool in airport service.

I. Introduction
Since Xerox developed a strong benchmarking foundation through its ``leadership through quality'' program in 1979, benchmarking management has become a competitive technique adopted by many companies, such as IBM, Motorola, AT&T, 3M, DuPont, and Digital, hoping to minimize unit production cost and product defects so as to improve productivity and meet customer needs. The CEO of Xerox, David Kearns, defined the benchmarking approach as ``to continuously improve the product and service in order to compete with the best one and the leadership in the industry''. Over the past ten years, the benchmarking process has been recognized in the USA as the most important tool for improving product quality and is acknowledged by the Malcolm Baldrige National Quality Award (MBNQA).
Benchmarking and performance evaluations are components of modern
management practices and parts of total quality management (TQM). However,
the benchmarking process differs from performance evaluation. Performance evaluation is a tool used for measuring productivity, cost efficiency, and operational advantages, and it has traditionally been conducted on a historical basis. The benchmarking process is usually conducted on a competitive basis and uses the value of some parameters as a reference point in comparisons. It can be used to compare performance within one corporation (internal) or among different companies in an industry (external). Stonehouse et al. (2000) suggested
that a successful benchmarking process relies on:
. commitment from the managers of the organization;
. acceptance of the need for improvement;
. willingness to accept others' perspectives;
. a supportive community;
. continuous development of competence; and
. a constructive vision, mission and clear objectives.

Typically, the benchmarking process comprises four phases, namely, planning, analysis, integration and action (Dodwell and Zhang, 2000). The first two
phases fall under the external applications (amongst companies) and the last
two, internal applications (in each company). Planning lays the foundation and
is critical to implementing a successful benchmarking project. It involves the
complete understanding of the existing internal processes and measurements.
The analysis phase involves analyzing the benchmarking data to identify and
understand the practices which best contribute to the subject's strengths.
Hence, the company can determine the current performance gap. In the
integration phase, the company develops goals and integrates them into the
benchmarking process to obtain significant performance improvements.
Finally, in the action phase, the company carries out the actions needed to achieve the goals decided in the integration phase. Previous studies failed to set up a quantitative measurement to address
the benchmarking process. Therefore, the first objective of this paper is to
propose a tool to help the company to establish its competitive benchmarking.
The management, accounting and production literatures on benchmarking
are mainly focused on discussions about the methods of performance
measurement (Griffin and Hauser, 1993; Toni et al., 1995; Horsky and Nelson,
1996; Govers, 1996; Evans et al., 1997) or the types of benchmarking process
(Lucertini et al., 1995; Dodwell and Zhang, 2000). Only a few are dedicated to
linking the benchmarking with the value of a set of performance indicators
(Banker, 1997). The second objective of this paper is to link performance
indicators with the benchmarking system so as to deepen understanding of the importance of a company's performance in establishing the benchmark.
In recent years, firms have increasingly taken customer orientation into account in their strategic and operational decisions. Specifically, at each stage of product development and production, the company requires greater detail on customer needs than traditional market research provides. However, to date no research on benchmarking built around the consumer's voice appears to have been published. Therefore, the
third objective of this paper is to build the benchmarking from the ``voice'' of the customer. The comprehensive methodology we propose is the quality benchmarking deployment (QBD) technique.
Section II of this paper presents the conceptual framework of this study
wherein the ``house of quality'' theory and its application in the QBD process
are described. Section III reports the empirical results for the CKS International Airport, Taiwan's aviation center, using the QBD technique. The final section presents our conclusions.

II. Benchmarking process and QBD technique


1. Benchmarking process
As mentioned earlier, the benchmarking process is composed of four phases:
planning, analysis, integration, and action phases. In the planning phase,
management should reach agreement on such issues as: identifying what is to be benchmarked, identifying competitors, and developing methods for data
collection. In the analysis phase, the key questions are: what is the performance of the best competing organization? How do we determine current performance, and how do we benefit from the ``benchmark'' organization? The
integration phase is to develop goals and integrate them with the
benchmarking approach in order to achieve performance improvement. This
phase addresses such questions as: has top management accepted this finding?
Have the goals been completely communicated to all parties involved? While
the action phase follows the third phase, a continuous examination of the goals
is crucial. In this research, we especially focus on the first two phases, planning
and analysis. The integration and action phases are not the focus of this paper.
These two phases involve actions that are internal to the benchmarking
process, which should be implemented within the organization.
The task of linking the planning and analysis phases is to measure the gap
between the company's products and those of the best company in the industry.
To accomplish this task, to achieve the company's objectives and missions, and to make appropriate benchmarking selections within the limits of its capital resources and capabilities, the QBD methodology is developed from the ``house of quality'' technique, which is used to organize the qualitative and quantitative data gathered and to identify an organization's competitive benchmarking. The conceptual framework is shown in Figure 1.

2. QBD technique: the house of quality


As mentioned earlier, the central theme of establishing benchmarking is built upon customer needs, and the house of quality technique is considered the most useful quantitative tool for translating the ``voice'' of the customer directly into product development and improvement elements. Mizuno and Akao (1996) noted that the so-called ``quality function'' is an activity of continuing quality improvement. Therefore, from product design and development to product disposal, the product quality function is clearly established at each stage. As such, the quality function deployment (QFD) technique systematically works to
fulfill this purpose.

Figure 1. The benchmarking process model

In contrast with QFD, the QBD puts emphasis on
competitor's analysis and reveals our own position with respect to competitors.
The quality-planning table is provided here to capture this spirit. It reports the priority of each customer need, the upgrading rate of our product, and the target quality-planning level relative to the best company in the industry. Using this information, one can calculate the weighted rank of each customer need. This allows an
organization to set up the priority of its product or service improvement and
compete with rivals based on customers' preferences.
As shown in Figure 2, the house of quality consists of four main parts:
customer needs, design attributes, relationship matrix, and competitive
benchmarking. In the following, the QBD tool will be proposed and how it
could be used to establish the company's benchmarking will be shown.
First, the benchmarking company should identify and structure its customers' needs; these are called the ``requirement qualities''. This is the most critical part of the benchmarking process. Generally, customer needs may come
in various forms, such as basic needs, articulated needs, and surprise needs. It
is difficult for the product development team to deal with a list of 100-300
customer needs, examining them item by item. Therefore, we may use the KJ
method to structure the customer needs into a hierarchy of primary, secondary, and tertiary needs. In doing so, we are able to work out the requirement quality deployment table.

Figure 2. The QBD technique from the house of quality
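To make the structure concrete, the short sketch below represents such a hierarchy as a nested mapping, using a few of the airport requirement qualities that appear later in Section III; the Python representation and the flattening loop are illustrative assumptions of ours, not part of the original KJ procedure.

```python
# A minimal sketch of a KJ-style needs hierarchy (primary -> secondary -> tertiary),
# illustrated with a few of the airport requirement qualities from Section III.
# The dictionary layout is an assumption made for illustration only.
needs_hierarchy = {
    "Core functional area": {                        # primary (strategic) need
        "Access interface and processing": [         # secondary (tactical) need
            "Flow of passengers/area of terminals",  # tertiary (operational) needs
            "Time spent from check-in to boarding",
        ],
        "Airfield": [
            "Quality of taxiways and runways",
        ],
    },
    "Support area": {
        "Infrastructure": [
            "Stops and lines of flights",
        ],
    },
}

# Flatten the hierarchy into the rows of a requirement quality deployment table.
for primary, secondaries in needs_hierarchy.items():
    for secondary, tertiaries in secondaries.items():
        for tertiary in tertiaries:
            print(f"{primary} | {secondary} | {tertiary}")
```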
Second, once the requirement quality deployment table is completed, the
downstream steps will be determined more and more specifically. On the right
hand side of the ``house'' is the competitive benchmarking which comes from
the customers' perceptions of performance. In this step, we first collect customers' satisfaction ratings of the products among competing brands and obtain the difference in the performance measures between our product and the best one in the industry. In addition, some customer needs have higher priorities for customers than others; therefore, the relative importance of the various requirements should be ranked. These priorities will help the company to make decisions that balance the cost of fulfilling a customer need with the desirability of fulfilling that need. Finally, we are able to design a
target quality level to be referenced by the company to redesign or improve
their products or service. According to our position with respect to the
competitors, in this step, we establish the quality planning table.
Third, one should translate the customer needs into technical specifications
in order to obtain design requirements. Because the design attributes must
reflect a valid measurement of customer requirements, the joint consideration
of marketing orientation and engineering orientation is required. Therefore, a
cross table is provided to serve this purpose[1]. In doing so, one can continue to
calculate the weighted importance of these ``quality elements'' (design
attributes)[2]. Specifically, it can be represented as the following equation:
W_j = \sum_i ( RI_i \times A_{ij} / \sum_j A_{ij} )    (1)
where Wj denotes the weighted importance of the jth quality element, RIi denotes the relative importance of the ith requirement quality, and Aij denotes the interaction degree between the ith requirement quality and the jth quality element. By comparing the magnitudes of the Wj, we may target those elements with higher values and redesign or improve the product or service accordingly. In this step, the quality table is established.
Finally, the weighted importance (Wj) provides abundant information: it reflects the needs of the customers and their importance, incorporates a competitiveness assessment of our own product, and weights each design attribute by its interaction with the customer needs. Therefore, the relative values of the weighted importance represent the priorities according to which the company should first improve or redesign the quality of the product or service that lags behind the best-performing organization; hence, the benchmarking is built.
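As a worked illustration of equation (1), the following sketch computes the weighted importance Wj from a small relationship matrix, with strong, medium and weak relationships mapped to five, three and one points as described in Note 1. The two requirement qualities, three quality elements and their relative importance values are hypothetical and chosen only for illustration.

```python
# A minimal sketch of the weighted-importance calculation in equation (1),
# W_j = sum_i( RI_i * A_ij / sum_j A_ij ), as reconstructed above.
# All names and numbers below are hypothetical; strong/medium/weak
# relationships are mapped to 5/3/1 points as described in Note 1.

SYMBOL_POINTS = {"strong": 5, "medium": 3, "weak": 1, None: 0}

# Relative importance RI_i of each requirement quality (e.g. from equation (2)).
relative_importance = {"req_A": 2.78, "req_B": 6.97}

# Relationship matrix: requirement quality -> {quality element: symbol}.
relationship = {
    "req_A": {"elem_1": "strong", "elem_2": "weak",   "elem_3": None},
    "req_B": {"elem_1": "medium", "elem_2": "strong", "elem_3": "weak"},
}

elements = ["elem_1", "elem_2", "elem_3"]
weighted_importance = {}
for j in elements:
    w_j = 0.0
    for i, row in relationship.items():
        a_ij = SYMBOL_POINTS[row[j]]
        row_sum = sum(SYMBOL_POINTS[s] for s in row.values())  # sum_j A_ij for requirement i
        if row_sum:
            w_j += relative_importance[i] * a_ij / row_sum
    weighted_importance[j] = w_j

# Rank the quality elements; the highest W_j values are the first candidates
# for benchmarking and improvement.
for j, w in sorted(weighted_importance.items(), key=lambda kv: -kv[1]):
    print(f"{j}: W_j = {w:.2f}")
```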

III. Empirical results


Recently, benchmarking has been used by many different sectors of the economy as a tool to control cost and product quality. In this empirical study, we examine the
benchmarking process for an airport. Airports worldwide have experienced
pressure to significantly change their management technique in order to compete
globally. Moreover, the airport operations have been gradually privatized, such as
those in the USA, Australia, Hong Kong, Singapore, and New Zealand. As long as
one of the airport's major objectives is to generate a profit, management will be
motivated to develop services and products to fit its customers and will not be
prone to make decisions solely on a political basis. In Taiwan, the Council for Economic Planning and Development (CEPD) has developed a program to push the Chiang Kai Shek (CKS) International Airport to establish an Asia Aviation Operation Center. Moreover, Taipei is at the heart of Taiwan's powerful electronics manufacturing industry and strongly links Asian markets with the North American market. The CKS airport has now been selected by UPS as the carrier's main Asian air express hub. Therefore, using the QBD technique to evaluate the CKS International Airport's performance and to build its benchmarking system should provide useful management information for the CEPD and its participants.

1. Planning phase
What is to be benchmarked? The service of the CKS International Airport has
been selected as our benchmarking target. This service includes both air cargo and passenger transport.
Identifying the competitors. From the reports of Airports Council International
(see Table I), Hong Kong International Airport, Singapore International Airport,
Shanghai International Airport, Manila International Airport, and Tokyo Narita International Airport are set as the competitive airports, namely those that are business competitors of, or geographically close to, the CKS airport.
Table I. Asia's ten busiest airports in terms of passengers and air cargo

Airport          Passengers (1999)   Rank   Air cargo (1999; TEU)   Rank
Tokyo HND        54,338,212           1     724,318                   8
Seoul            33,371,074           2     1,655,344                 3
Hong Kong CLK    29,733,470           3     1,998,838                 1
Bangkok          27,289,863           4     809,302                   7
Singapore        26,064,645           5     1,522,984                 4
Tokyo NRT        25,667,634           6     1,841,572                 2
Sydney           21,542,000           7     527,027                  10
Osaka            19,848,635           8     864,318                   6
Beijing          18,190,852           9     462,338                  11
Taipei CKS       16,368,914          10     1,055,370                 5

Sources: Airports Council International (1999); http://www.caaacct.gov.tw/1997/indexc.htm

Data collection: sampling and the questionnaire. After face-to-face interviews with passengers, civil aviation departments, forwarding agents, aviation experts, and management scholars, we structured their linguistic responses into signal meanings. Using the KJ method, we deployed the
customer needs into a hierarchy of primary, secondary, and tertiary levels.
Next, we performed another survey through the questionnaire based on the
signal meanings. We organized the questionnaire into two parts. Part one is the
basic data we recorded about the background of the respondents. Part two is
the performance evaluations of airports including CKS International Airport,
Hong Kong International Airport, Singapore International Airport, Shanghai
International Airport, Manila International Airport, and Tokyo Narita
International Airport. Performance was measured on a ten-point semantic differential scale anchored by opposing descriptions, where 1 represents very dissatisfied and 10 very satisfied. The sample included, but was not limited to, airline companies, forwarders, scholars, and passengers.

2. Quality benchmarking deployment: house of quality


Requirement quality deployment. The primary level of the hierarchy of
customer needs is set to be the strategic direction for the aviation center, which
contains the core functional area, the support areas, and the multi-functional
area. Each primary need is elaborated into three-to-ten secondary needs, also
known as tactical needs or requirement qualities which indicate the specific
tasks. For example, if core functional area is the primary need, then the
secondary needs tell the team how the customer judges the core functional area.
Access interface and processing, flight interface, and airfield are the secondary needs. Also classified as secondary needs are: infrastructure,
related industries, human resources, financial resources, and management
technology, which belong to the supporting area. BOT development and sea-air
cargo linkage belong to the multi-function area[3]. The tertiary needs are also
called ``the operational needs'' and they provide detailed information for the
engineering and R&D departments to develop the product or service to satisfy
the secondary needs (see Figure 3 for details).
Figure 3. Quality table
Customer perceptions: planning-quality table. To the right of the ``house'' is the planning-quality table. In this table we see which parts of the product are
needed most, how well those needs are met, and whether there are any gaps
between the best product and our own product.
In order to assess performance of the CKS International Airport against
some external standard, we compare the requirement qualities of the CKS
International Airport against the selected competitive airports. In Table II, the
Pi values indicated under the ``priority rating'' represent priorities of the
customer needs[4]. The SCi values indicated under the ``satisfaction level of the CKS airport'' represent the satisfaction level given to the CKS International Airport. The STi values indicated under the ``satisfaction level of the benchmarked airport'' denote the satisfaction level of the benchmarked airport that performs best among the competitive airports. The largest number among Pi, SCi, and STi is then recorded as the ``planning level'' (Pli), which is the target value for the CKS International Airport to achieve.
The upgrading rate in the satisfaction level (Ui) is obtained by dividing the ``planning level'' value by the satisfaction level of the CKS airport (Ui = Pli/SCi). The higher the
Ui , the worse the performance of the CKS airport. These ``benchmarked''
comparisons allow management to identify the best practices and compare
their company's performance with leading edge performers. As shown in
Table II, the CKS airport is lagging far behind the target airport in items of
``stops and lines of flight'', ``types of transport facilities connecting outside'',
``operation revenue/total operation cost'', and ``areas and quantity of parking
lots'', which means these items should be specially controlled or redesigned.
However, since these items are parts of the supporting area and multi-
functional area, it is apparent that the linkage of the operations and service of
the CKS International Airport with its related industries is deficient in forming
a complete working hub. The final column of Table II shows the relative
importance among the ``requirement qualities'', which is obtained by using the
following equation:
RI_i = ( P_i \times U_i ) / \sum_i ( P_i \times U_i )    (2)

where Pi is the priority rating of the need i, and Ui is the upgrading rate of the
need i.
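The planning-quality calculations described above (the planning level Pli as the largest of Pi, SCi and STi, the upgrading rate Ui = Pli/SCi, and the relative importance RIi from equation (2)) can be sketched as follows. The two input rows are taken from Table II for illustration only; because equation (2) normalizes over all tertiary needs, the RIi values printed here will not reproduce the published column.

```python
# A minimal sketch of the planning-quality table computations:
#   Pl_i = max(P_i, SC_i, ST_i)          planning level (target value)
#   U_i  = Pl_i / SC_i                   upgrading rate of the satisfaction level
#   RI_i = P_i*U_i / sum_i(P_i*U_i)      relative importance, equation (2)
# The two example rows below are taken from Table II; the full paper sums over
# all tertiary needs, so the RI_i printed here will differ from the published column.

rows = {
    # need: (P_i, SC_i, ST_i)
    "111 Flow of passengers/area of terminals": (6, 6, 8),
    "221 Types of transport facilities with outside": (10, 6, 8),
}

planning = {}
for need, (p, sc, st) in rows.items():
    pl = max(p, sc, st)          # planning level
    u = pl / sc                  # upgrading rate
    planning[need] = (p, sc, st, pl, u)

total = sum(p * u for (p, _, _, _, u) in planning.values())
for need, (p, sc, st, pl, u) in planning.items():
    ri = p * u / total           # relative importance (share of P_i * U_i)
    print(f"{need}: Pl={pl}, U={u:.2f}, RI={ri:.2%}")
```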
Relationship matrix: quality table. Figure 3 shows the recorded interactions between the requirement qualities and the quality elements. The values of relative importance (RIi) are put in the first column of the quality table. Using equation (1), we can obtain the weighted importance Wj according to the weight (symbol) assigned to each cell.

Table II. Quality-planning table
(Pi = priority rating of the customer needs; SCi = satisfaction of the CKS airport; STi = satisfaction of the benchmarked airport; Pli = planning level; Ui = upgrading rate of the satisfaction level; RIi = relative importance rating)

Tertiary needs (grouped by primary and secondary needs)                Pi   SCi  STi  Pli  Ui    RIi

1 Core functional area
11 Access interface and processing
  111 Flow of passengers/area of terminals                              6    6    8    8   1.33  2.78
  112 Time spent from check-in to boarding                              6    7    8    8   1.14  2.38
  113 Average distance of the passengers                                6    5    8    8   1.60  3.35
  114 Total seats/average passengers per day                            6    5    7    7   1.40  2.93
  115 Convenience of neighboring hotels, dining and shopping facilities 6    6    8    8   1.33  2.78
  116 Directional signs for the airport and facilities                  8    7    8    8   1.14  3.18
  117 Degree of automation                                              8    7    8    8   1.14  3.18
  118 Convenience of transit flights                                    7    6    8    8   1.33  3.24
  119 Passport and visa checks                                          6    7    8    8   1.14  2.38
  1110 Attitude and efficiency of customs officers                      8    7    8    8   1.14  3.18
  1111 Airport levy or excise tax                                       6    7    7    7   1.00  2.09
  1112 Luggage conveyance capability                                    6    7    8    8   1.14  2.38
12 Flight interface
  121 Adequacy of connecting buses and conveyer for passengers          7    7    8    8   1.14  2.78
  122 Comfort of air bridges                                            6    7    8    8   1.14  2.38
13 Airfield
  131 Quality of taxiways and runways                                   1    6    9    9   1.50  0.52
  132 Adequacy of aviation guiding and assistance facilities            1    7    9    9   1.29  0.45
  133 Clearness of guiding lines and signs                              1    7    9    9   1.29  0.45
  134 Distribution and utilization of air bridges                       1    8    9    9   1.13  0.39
  135 Adequacy of apron                                                 1    4    6    6   1.50  0.52
  136 Electricity, communication and lighting                           1    6    6    6   1.00  0.35
  137 Efficiency of navigational matters and maintenance personnel      1    8    9    9   1.13  0.39
  138 Efficiency of passenger affairs                                   1    7    8    8   1.14  0.40
  139 Distribution of areas for ground service vehicles                 1    6    6    6   1.00  0.35
  1310 Supplies and logistics service                                   1    7    8    8   1.14  0.40
  1311 Operation of ground service vehicles                             1    6    6    6   1.00  0.35
2 Support area
21 Infrastructure
  211 Geographical advantages                                          10    7    8   10   1.43  4.98
  212 Frequency of flights                                              5    5    8    8   1.60  2.79
  213 Stops and lines of flights                                       10   10   10   10   1.00  6.97
  214 Total size of airport                                             7    7    8    8   1.14  2.78
22 Related industry
  221 Types of transport facilities with outside                       10    6    8   10   1.67  5.82
  222 Areas and quantity of parking lots                               10    9   10   10   1.11  3.87
  223 Sizes of neighboring industrial, warehousing and processing industrial parks and zones  3  5  5  5  1.00  1.05
  224 Scale of tourism and service industries                           6    7    8    8   1.14  2.38
23 Human resource
  231 Aviation guiding capability of control tower staffs               1    7    8    8   1.14  0.40
  232 Capability of maintenance and safety control sectors              1    8    9    9   1.13  0.39
  233 Total annual number of passengers/total number of airport authority staffs  1  5  7  7  1.40  0.49
  234 Years of experience of airport staffs                             1    5    5    5   1.00  0.35
24 Financial support
  241 Landing fees                                                      1    5    8    8   1.60  0.56
  242 Economic status; per capita GNP                                   4    6    8    8   1.33  1.85
  243 Financial liberalization and stability                            5    7    9    9   1.29  2.25
  244 Operation revenue/total operation cost                            9    7    7    9   1.29  4.05
25 Management technology
  251 Owned and operated by private or public                           4    3    7    7   2.33  3.25
  252 Operation time per day                                            7    7    7    7   1.00  2.44
  253 Efficiency of customs                                             7    7    8    8   1.14  2.78
  254 Communication between airport authority and cargo owners, shippers  4  5    8    8   1.60  2.23
3 Multi-functional area
  311 Development by BOT                                                4    7    8    8   1.14  1.59
  312 Sea-air cargo linkage                                             4    4    9    9   2.25  3.14

3. Analysis phase
In this phase we have to answer three questions:
(1) What is the performance level of the best organization?
(2) How do we determine the current performance level of the CKS airport?
(3) How do we benefit from the benchmarking?
Table II could answer the first two questions. To answer the third question, we
will analyze the benchmarking practices of the CKS International Airport.
The benchmarking. Reviewing the requirement qualities, as shown in Table II, the CKS International Airport should make improvements in those items in which it lags far behind the ``target airport'', that is, in these indicators: ``stops and lines of flights'', ``types of transport facilities with outside'', ``geographical advantages'', ``operation revenue/total operation cost'', and ``areas and quantity of parking lots'', most of which belong to the support area.
As shown in Figures 3 and 4, the ``convenience of transport facilities with outside'', ``interior design and layout of the building'', ``sufficiency of stops and lines of flights'', ``information service of the airport'', and ``sufficiency of the related institution'' have higher weighted importance. These five quality elements are therefore selected as the first priorities to be benchmarked by the CKS airport. This result also shows that the weaknesses of the CKS airport lie in the dimensions of service and infrastructure. Therefore, the CKS International Airport authority should consider these two areas carefully. However, if capability and resources allow, the CKS aviation center may improve the other dimensions, process and security, according to their weighted importance. Airport benchmarking may help the CKS International Airport authority set up a long-term vision and the best strategic planning for the airport service.

Figure 4. The priority of benchmarking elements

IV. Conclusion
Currently, benchmarking has been propelled to become a high-priority practice among business administration professionals. According to The Benchmarking Exchange (TBE), the annual ranking of business processes shows that benchmarking rose to the top in 1998. Benchmarking has been widely applied in such business processes as corporate mission statements, employee development plans, and organizational restructuring.
The benchmarking process is one part of total quality management, and can
be described as a four-phase continuum: planning, analysis, integration and
action. There is much literature on the future of the benchmarking process;
however, a technical instrument for specifying benchmarking for a company
has yet to be developed. In this study, we develop such an instrument, which
we call ``the quality benchmarking deployment'' (QBD) method to help
management compare their product and service with the best in the industry.
In the empirical study, we examine the CKS International Airport and obtain
the following findings. First, Table II shows that the CKS International Airport
should make improvements in those items in which it lags far behind the target airport, that is, in these areas: ``stops and lines of flights'', ``types of transport
facilities with outside'', ``geographical advantages'', ``operation revenue/total
operation cost'', and ``areas and quantity of parking lots''. Second, Figure 3 tells
us to select the ``convenience of transport facilities with outside'', ``interior
design and layout of the building'', ``sufficiency of stops and lines of flights'',
``information service of the airport'', and ``sufficiency of the related institution''
as the first priorities to be benchmarked. Therefore, the administrators of the
CKS International Airport should strive to pay more attention to these design
elements. Finally, if capability and resources allow, the CKS aviation center may improve the other dimensions, process and security, according to
their weighted importance. Airport benchmarking will provide the CKS
International Airport authority with a long-term vision and a valuable strategic
planning tool for the airport service.

Notes
1. Different degrees of interaction are denoted by symbols. If the customer need and the design attribute have a strong relationship, it is recorded with the strong-relationship symbol and given five points; a medium relationship is given three points; and a weak relationship, one point; no symbol or points are shown for zero interaction. These corresponding relationships are determined by selected experts, who meet, discuss, communicate, and brainstorm until everyone agrees on the results.
2. A popular method is the so-called independent matching point approach. For each cell in the table, a weight is obtained by multiplying the relative importance of the requirement quality by the points (i.e. weight) assigned to the relationship matrix symbol in that cell. We then aggregate the weights for each column, thus providing the weighted importance of each design attribute (quality element) in fulfilling the collective customer needs.
3. In this area the secondary needs are not available.
4. These priorities are based on the frequency with which the interview respondents gave these signal meanings. The higher the frequency, the more important the signal meaning.

References
Airports Council International (1999), World Airport Traffic Report Calendar Year, Airports
Council International, Geneva.
Banker, R.D. (1997), ``Discussion: involuntary benchmarking and quality improvement: the effect
of mandated public disclosure on hospitals'', Journal of Accounting, Auditing & Finance,
Vol. 12 No. 3, pp. 347-52.
Dodwell, D. and Zhang, A. (2000), Air Cargo at Hong Kong's Service: An Analysis of Current and
Future Roles, and Policy Priorities, Cathay Pacific Airways Supported.
Evans, J.H., Hwang, Y., Nagarajan, N. and Shastri, K. (1997), ``Involuntary benchmarking and
quality improvement: the effect of mandated public disclosure on hospitals'', Journal of
Accounting, Auditing & Finance, Vol. 12 No. 3, pp. 315-46.
Griffin, A. and Hauser, J.R. (1993), ``The voice of the customer'', Marketing Science, Vol. 12,
pp. 1-27.
Govers, C.P.M. (1996), ``What and how about quality function deployment (QFD)'', International
Journal of Production Economics, Vol. 46-47, pp. 575-85.
Horsky, D. and Nelson, P. (1996), ``Evaluation of sales force size and productivity through
efficient frontier benchmarking'', Marketing Science, Vol. 15, pp. 301-20.
Lucertini, M., Nicolo, F. and Telmon, D. (1995), ``Integration of benchmarking and benchmarking
of integration'', International Journal of Production Economics, Vol. 38, pp. 59-71.
Mizuno, S. and Akao, Y. (1996), Quality Function Deployment, 5th ed., Asian Productivity Organization, Tokyo.
Toni, D.A., Nassimbeni, G. and Tonchia, S. (1995), ``An instrument for quality performance
measurement'', International Journal of Production Economics, Vol. 38, pp. 199-207.

Further reading
Bossert, J.L. (1991), Quality Function Deployment: A Practitioner's Approach, ASQC Quality Press Inc., New York, NY.
Bryan, D.L. and O'Kelly, M.E. (1999), ``Hub-and-spoke networks in air transportation: an
analytical review'', Journal of Regional Science, Vol. 39 No. 2, pp. 275-95.
Bureau of Industry Economics (1994), Australia, Airport Performance Indicators, Canberra.
Daugherty, P.J., Germain, R. and Droge, C. (1995), ``Predicting EDI technology adoption in
logistics management: the influence of context and structure'', Logistics and
Transportation Review, Vol. 31, pp. 309-24.
Doganis, R. and Graham, A. (1987), ``Airport management: the role of performance indicators'',
Transport Studies Group Research Report, Vol. 13.
Doganis, R. and Graham, A. (1995), The Economic Performance of European Airports, Department of Air Transport, Cranfield University, Cranfield.
Dopuch, N. and Gupta, M. (1997), ``Estimation of benchmark performance standards: an application to public school expenditures'', Journal of Accounting & Economics, Vol. 23, pp. 141-61.
Gillen, D. and Lall, A. (1997), ``Developing measures of airport productivity and performance: an application of data envelopment analysis'', Transportation Research Part E: Logistics and Transportation Review, Vol. 33 No. 4, pp. 261-73.
Gillen, D. and Noori, H. (1995), ``A performance measuring matrix for capturing the impact of
AMT'', International Journal of Productivity Research, Vol. 33 No. 7, pp. 2037-48.
Gillen, D. and Waters, W.G. (1997), ``Introduction: airport performance measurement and airport pricing'', Transportation Research Part E: Logistics and Transportation Review, Vol. 33 No. 4, pp. 245-7.
Hooper, P.G. and Hensher, D.A. (1997), ``Measuring total factor productivity of airports: an index number approach'', Transportation Research Part E: Logistics and Transportation Review, Vol. 33 No. 4, pp. 249-59.
Horonjeff, R. and McKelvey, F.X. (1994), Planning and Design of Airports, McGraw-Hill, New York, NY.
Infrastructure Management Group, Inc. (1994), Benchmarking the Indianapolis Airport Authority,
Bethesda, MD.
Mohamed, Z. and Youssef, M.A. (1995), ``Quality function deployment: a main pillar for successful total quality management and product development'', International Journal of Quality & Reliability Management, Vol. 12 No. 6.
Rolstadås, A. (1995), Performance Management: A Business Process Benchmarking Approach, Chapman & Hall, London.
Seneviratne, P.N. and Martel, N. (1991), ``Variables influencing performance of air terminal
buildings'', Transportation Planning and Technology, Vol. 16, pp. 3-28.
Stonehouse, G., Hamill, J. and Purdie, T. (2000), Global and Transnational Business Strategy
and Management, Wiley Ltd, Chichester.