www.emeraldinsight.com/2046-6099.htm
SASBE 2,2
A review of building/
infrastructure sustainability
reporting tools (SRTs)
Renard Y. J. Siew, Maria C. A. Balatbat and David G. Carmichael
The University of New South Wales, Sydney, New South Wales, Australia
Abstract
Purpose – Buildings/infrastructure are recognised to have a significant impact on the environment
and the community, and hence there is pressure on industry practitioners to incorporate environmental
and social considerations in addition to the traditional goals of cost, time and quality. The development of
sustainability reporting tools (SRTs) to assist in the management of green building/infrastructure
projects is pivotal in informing on progress in sustainability practices. However, the rapid growth
of SRTs in the last decade, with differing criteria and methodologies, has created complications
for stakeholders.
Design/methodology/approach – The paper provides a comprehensive review of tools to guide
practitioners, property investors, policy makers and developers towards making informed choices in
green building/infrastructure projects. Comparative analyses, benefits and limitations of these tools
are discussed in the paper.
Findings – Some of the findings from the analysis of SRTs include: an emphasis on environmental
issues; scoring which does not account for uncertainty or variability in assessors' perceptions; a lack of
published reasoning behind the allocation of scores; inadequate definition of scales to permit
differentiation among projects; and the existence of non-scientific benchmarks.
Originality/value – The paper departs from earlier reviews to include a discussion on infrastructure
SRTs, life cycle tools, and issues broader than the environment. Changes and additions, subsequent to
earlier reviews, have been made to SRTs, making the updated review provided here useful.
Keywords Sustainable development, Infrastructure, Buildings, Rating tools, Sustainability criteria,
Sustainability indicators, Reporting
Paper type General review
1. Introduction
Sustainable development has been internationally agreed as a key goal for policy
makers to guide development at global, national and local levels (Singh et al., 2009).
The World Economic Forum (2011, p. 11) identifies the building sector as an area which
needs to be addressed because it accounts for 40% of the world's energy use and 40% of
carbon output, and consumes 20% of available water. The large use of electricity in
buildings has been identified as one of the main culprits for high emissions across the
globe. The Centre for International Economics Canberra and Sydney (2007) reports
that 23 per cent of total greenhouse gas emissions in Australia come from
the energy demand of the building sector, while the US Green Building Council
(USGBC, 2011) claims that residential and commercial buildings together account for
39 per cent of total emissions in the USA, more than in any other country except China.
The increased recognition that buildings are substantial carbon dioxide (CO2)
emitters (Reed et al., 2009; Ürge-Vorsatz and Novikova, 2008; Buchanan and Honey,
1994; Levermore, 2008), and contribute significantly to climate change, puts pressure
on construction industry practitioners to incorporate sustainability goals alongside
the traditional project goals of cost, time and quality (Fernandez-Sanchez and
Rodríguez-López, 2010). Translating sustainability goals into action at the project level
is complicated by the individual characteristics of countries, their cultures, climates
and types of buildings (Ugwu and Haupt, 2007).
Against this background, there is a widely recognised need to identify metrics and
tools that would help articulate the extent to which current activities are either
sustainable or not sustainable (Singh et al., 2009). This has been the key motivator
for the development and increased popularity of sustainability reporting tools
(SRTs) in the building sector and the civil engineering infrastructure sector.
Infrastructure includes transport (roads and bridges, bus and cycle ways, footpaths,
railways), water (sewage and drainage, water storage and supply), energy
(transmission and distribution) and communication (transmission and distribution)
among others (AGIC, 2012). This paper provides a review of available tools used
to assess and report sustainability in the infrastructure and building sectors. The tools
are commonly used in their country of origin, particularly if this is legislated, but are
also adopted in other countries.
Cole (1999) suggests that SRTs, used with the intent to evaluate green
performance, usually share a few common characteristics: an emphasis on the
assessment of resource use and ecological loadings; the assessment of design
intentions and potential through prediction rather than actual real-world performance;
the use of performance scoring as an additive process; and a performance summary,
certificate or label. Cole (2005) adds that SRTs are not only a means to facilitate
the reduction of environmental impacts, but are also increasingly being used as a basis
for risk and real estate valuations in obtaining development approval from the
banking industry. SRTs provide industry standard guidelines and allow comparability
across projects. For building owners and operators, using SRTs demonstrates
commitment to corporate social responsibility (CSR), and permits staying ahead of
future government regulations (Green Building Council of Australia (GBCA), 2012b).
Developing an ideal SRT is challenging because it needs to be able to satisfy all
stakeholders' concerns (Ding, 2008).
SRTs have been in existence for the last decade within a number of countries, and
were introduced in an effort to better understand the sustainability level of buildings and
infrastructure. While it may be argued that different climates and cultures, and the
different nature of buildings/infrastructure for each country may warrant unique
reporting tools, the rapid growth of SRTs has made sustainability comparisons more
complicated for stakeholders, for example, property investors (Reed et al., 2009), who rely
on such tools to make informed investment decisions. According to Nguyen and Altan
(2011), although there are many registered building SRTs, only a few of them are widely
acknowledged. Infrastructure SRTs and life cycle tools are less commonly discussed.
This paper departs from other reviews (Ding, 2008; Reed et al., 2009; Mitchell, 2010;
Berardi, 2012; Sev, 2011), to include a discussion on infrastructure SRTs, life cycle tools,
and issues broader than the environment. Berardi (2012) provides a review of building
SRTs in three categories: total quality assessment, life cycle analysis (LCA) and energy
consumption evaluation. However, because of the wide scope adopted, the review on life
cycle tools is not extensive. Changes subsequent to the above reviews have been made to
some SRTs, for example, Leadership in Energy and Environmental Design (LEED),
making the updated review provided here useful. As well, this paper summarises the
empirical evidence on the benefits of engaging in SRTs, and provides a critique of the tools.
The Global Reporting Initiative (GRI) was set up with the intention of providing an
international sustainability reporting framework (Global Reporting Initiative (GRI), 2011).
Under this framework, specific reporting guidelines for the construction and real
estate sector are available. However, since the context provided by the GRI guidelines
is more applicable at a corporate level, it is not reviewed here. The scope of this paper is
SRTs for building/infrastructure projects.
The structure of the paper is as follows. The following sections explore the nature
of major building SRTs, infrastructure SRTs and life cycle tools applicable to both
buildings and infrastructure. A critique of these tools and suggestions for future
research are given.
This paper acknowledges that the multiplicity of terms in SRTs can be confusing to
the reader. As such, it is important to clarify some of this terminology upfront.
Typically, for most SRTs, there are hierarchical levels of sustainability criteria. To
ensure consistency, the top (highest) level will be referred to here as criteria, and the
next (lower) level as subcriteria.
The review provided here will be of interest to a range of stakeholders:
construction industry practitioners, real estate investors and developers involved in
making decisions about green building/infrastructure projects. As well, it will serve
as a useful reference for the development of the next generation of SRTs.
2. SRTs for buildings
A review of some of the major tools applicable to buildings is given. This is followed,
after a similar review for infrastructure and life cycle tools, by a critique.
Building Research Establishment's Environmental Assessment Method (BREEAM)
BREEAM, established in 1990, was first launched in the UK with office buildings
in mind (Bonham-Carter, 2010; Sharifi and Murayama, 2013) but later expanded
in scope to also include specific schemes for residential housing and neighbourhoods.
It is perceived to be one of the world's foremost environmental reporting tools for
buildings (Crawley and Aho, 1999). Scores are awarded to ten criteria (management,
health and well-being, energy, transport, water, materials, waste, land use and ecology,
pollution and innovation) according to performance, and summed to produce an
overall score. This score is then matched to an award: pass, good, very good, excellent
or outstanding.
Table I highlights both the criteria and subcriteria in BREEAM. Scores are awarded
upon meeting the agreed performance targets for each of the subcriteria.
The award benchmarks for new buildings, refurbishments and, where applicable,
fit-out projects, are presented in Table II. The BREEAM tool offers a set of
weightings to be taken into account as part of the assessment process (see Table III)
(BREEAM, 2012).
LEED
LEED was developed by the USGBC in 2000. Since its inception, LEED has grown to
encompass more than 14,000 projects in the USA and more than 30 countries (Nguyen
and Altan, 2011). This tool promotes sustainable building and development practices
through a suite of reporting, and recognises projects which are committed to better
environmental and health performance (LEED, 2012). Two major building typologies
covered by LEED are:
(1) New Construction and Major Renovations v2009. The criteria and scores
(included in parentheses) available for each criterion are as follows: sustainable
sites (26), water efficiency (10), energy and atmosphere (35), indoor
environmental quality (IEQ) (15), innovation in design (6), regional priority
(4) and materials and resources (14) (LEED, 2009a).
(2) Existing Buildings: Operations and Maintenance v2009.

Table I. BREEAM criteria and subcriteria
Management: commissioning; construction site impacts; security.
Health and well-being: daylight; occupant thermal comfort; acoustics; indoor air and water quality; lighting.
Energy: CO2 emissions; low or zero carbon technologies; energy sub-metering; energy efficient building tools.
Transport: public transport network connectivity; pedestrian and cyclist facilities; access to amenities; travel plans and information.
Water: water consumption; leak detection; water re-use and recycling.
Materials: embodied life cycle impact of materials; materials re-use; responsible sourcing; designing for robustness.
Waste: construction waste; recycled aggregates; recycling facilities.
Land use and ecology: site selection; protection of ecological features; mitigation/enhancement of ecological value.
Pollution: refrigerant use and leakage; flood risk; NOx emissions; watercourse pollution; external light and noise pollution.
Innovation: new design and construction methods not formally recognised.

Table II. BREEAM awards (BREEAM, 2012)
Unclassified: <30%; Pass: ≥30%; Good: ≥45%; Very Good: ≥55%; Excellent: ≥70%; Outstanding: ≥85%.

Table III. BREEAM's criteria weightings (%)
Criterion: new builds, extensions and major refurbishments | building fit-out only (where applicable to scheme)
Management: 12 | 13
Health and well-being: 15 | 17
Energy: 19 | 21
Transport: 8 | 9
Water: 6 | 7
Materials: 12.5 | 14
Waste: 7.5 | 8
Land use and ecology: 10 | na
Pollution: 10 | 11
Innovation: 10 | 10

For both typologies, scores are accumulated using a base of 100 (innovation in design
and regional priority are added separately), and rated according to a scale as shown
in Table IV. There are embedded prerequisites within each criterion (except for
sustainable sites under Existing Buildings: Operations and Maintenance v2009) which
must be met before a score is awarded. LEED for Neighbourhood Development (2009)
is the latest USGBC reporting tool; it incorporates site selection, design and
construction elements (Hurley and Horne, 2006), taking into account both landscape
and regional contexts (Sharifi and Murayama, 2013).
Green Star
Green Star, developed by the Green Building Council of Australia (GBCA), is a
comprehensive voluntary building SRT. It was initially developed to accommodate the
need for buildings operating in hot climatic areas (Roderick et al., 2009; Tronchin and
Fabbri, 2008). It incorporates ideas from other tools, such as BREEAM and LEED, and
other environmental criteria specific to the Australian environment (Lockwood, 2006).
Green Star covers the nine criteria shown in Table V, where scores are awarded if
targets are met.
A single, overall score is calculated in a series of steps. First, a score is determined for
each criterion. Then, given weightings are applied, and all the weighted
criterion scores are summed. Innovation points can be obtained by either engaging with
innovative strategies and technologies or exceeding the Green Star benchmark;
these are added to the weighted criterion scores. The resulting overall score
is then matched to an award (see Table VI). The GBCA only certifies buildings
with 4, 5 or 6 Green Stars.
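The scoring steps above can be sketched in code. The award bands follow Table VI; the criterion names, raw scores and weightings are illustrative placeholders rather than actual Green Star data.

```python
# Hypothetical sketch of Green Star-style weighted additive scoring.
# Criterion scores and weightings below are illustrative only.

def overall_score(criterion_scores, weightings, innovation_points=0):
    """Weighted sum of criterion scores (each 0-100, weights in %) plus innovation points."""
    weighted = sum(criterion_scores[c] * weightings[c] / 100.0 for c in criterion_scores)
    return weighted + innovation_points

def award(score):
    """Map an overall score to a Green Star award band (Table VI)."""
    bands = [(75, "6 star"), (60, "5 star"), (45, "4 star"),
             (30, "3 star"), (20, "2 star"), (10, "1 star")]
    for threshold, label in bands:
        if score >= threshold:
            return label
    return "No award"

scores = {"Energy": 70, "Water": 60, "IEQ": 50}    # illustrative raw scores
weights = {"Energy": 50, "Water": 30, "IEQ": 20}   # illustrative weights, sum to 100
total = overall_score(scores, weights, innovation_points=2)
print(total, award(total))  # → 65.0 5 star
```

A deterministic mapping of this kind is what the later critique of SRT scoring refers to: each criterion contributes a single point value, with no representation of assessor variability.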
Table IV. LEED awards
Certified: 40-49; Silver: 50-59; Gold: 60-79; Platinum: 80 and above.
Source: LEED (2012)

Table V. Green Star criteria
Management; energy; water; land use and ecology; indoor environment quality (IEQ); transport; materials; emissions; innovation.

Table VI. Green Star awards
1 star: 10-19 (minimum practice); 2 star: 20-29 (average practice); 3 star: 30-44 (good practice); 4 star: 45-59 (best practice); 5 star: 60-74 (Australian excellence); 6 star: ≥75 (world leadership).
Figure 1. The BEE graph: Q (quality) is plotted against L (load) on 0-100 axes, with boundary lines at BEE = 3.0, 1.5, 1.0 and 0.5 delimiting the classes S, A, B+ and B.
Table VII. HK-BEAM criteria weightings
Weighting (%), existing buildings: 18; 12; 30; 15; 25
Weighting (%), new buildings: 25; 8; 35; 12; 20
Table VIII. HK-BEAM awards
Award: overall (%) | SA (%) | EU (%) | IEQ (%)
Platinum: 75 | 70 | 70 | 70
Gold: 65 | 60 | 60 | 60
Silver: 55 | 50 | 50 | 50
Bronze: 40 | 40 | 40 | 40
Table IX. NABERS office buildings
Coverage: Tenancy (tenanted space); Base building (central building services and common areas); Whole building (a combination of the above).
Energy STAR
Energy STAR derives from the US Environmental Protection Agency and the US
Department of Energy. Essentially, it is a tool used to track and benchmark a building's
energy performance. An energy performance scale is developed based on (Energy
STAR, 2011):
. a statistical analysis of national building energy consumption survey data; and
. from the results of the statistical analysis, the development of a model to predict the
energy use of a certain type of building, accounting for its location and type of
operation.
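A benchmarking scale of this kind can be sketched as follows; the regression coefficients, peer distribution and scoring rule are invented assumptions for illustration, not the actual Energy STAR model.

```python
# Illustrative sketch of regression-based energy benchmarking
# (not the actual Energy STAR model; all coefficients are invented).

def predicted_eui(floor_area_m2, weekly_hours):
    """Toy regression: predicted energy use intensity (kWh/m2/yr)."""
    return 120.0 + 0.5 * weekly_hours + 0.001 * floor_area_m2

def percentile_score(actual_eui, predicted, peer_ratios):
    """Score 1-100: share of peers whose actual/predicted ratio is worse (higher)."""
    ratio = actual_eui / predicted
    worse = sum(1 for r in peer_ratios if r > ratio)
    return max(1, round(100 * worse / len(peer_ratios)))

peers = [0.6, 0.8, 0.9, 1.0, 1.1, 1.2, 1.4, 1.6]   # illustrative peer ratios
pred = predicted_eui(10_000, 60)                    # 120 + 30 + 10 = 160 kWh/m2/yr
print(percentile_score(128.0, pred, peers))         # ratio 0.8 → 75
```

The design point is that the benchmark is relative: a building's score depends on how its actual-to-predicted ratio sits within the surveyed peer population, not on an absolute threshold.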
Table X. NABERS awards
1 star: 199; 2 star: 167; 3 star: 135; 4 star: 103; 5 star: 71.

Table XI. Differences between Green Star and NABERS
Item: Green Star | NABERS
Environmental impact assessment: potential | actual
Scope: design | performance
Phase: design phase | in operation/use
Owner: GBCA | Department of Environment, Climate Change and Water NSW
Coverage: office, retail, healthcare, education, industrial | office, homes, hotel, shopping centres
Certifiable awards: 4, 5 or 6 stars |
Legislation: accreditation is on a voluntary basis |
ASPIRE
Figure 2. ASPIRE's framework: four themes and their sub-areas. Environment: air, land, water, biodiversity, energy, materials. Society: stakeholders, culture, population, services, health, vulnerability. Economics: equity, livelihoods, macro, viability. Institutions: structures, skills, policies, reporting.
Non-formal SRTs
Publications to date on infrastructure reporting have focused on the development of
sustainability criteria. For example, Fernandez-Sanchez and Rodríguez-López (2010)
recommend more than 80 criteria for infrastructure projects in Spain. From their study,
they find that 11 of the criteria voted for by stakeholders in the top 30 (based on the
analytic hierarchy process) largely involve economic and social issues. These criteria
(with relative importance as a percentage in parentheses) include: health and safety
(3.85 per cent), necessity of work-urgency of work (3.77 per cent), life cycle cost (3.72 per
cent), economical cost/economical benefit (3.24 per cent), project declaration of general
interest (2.96 per cent), public participation and control on the project (2.59 per cent),
barrier effect of the project (2.38 per cent), project governance and strategic
management (2.26 per cent), accessibility for human biodiversity (2.26 per cent), respect for
local customs (2.05 per cent) and increase in economic value (1.42 per cent). Shen et al.
(2007) suggest a sustainability project performance checklist across a project's life
cycle: inception, design, construction, operation and demolition. Ugwu and Haupt
(2007) identify key performance indicators (KPIs) and assessment methods for
infrastructure sustainability from a South African construction industry perspective.
Sahely et al. (2005) propose sustainability criteria for urban infrastructure, focusing on
key interactions and feedback mechanisms between infrastructure and wider
environmental, social and economic concerns. Morrissey et al. (2012) critically appraise
project impacts from an ecological limits sense.
Table XII. AGIC's three reporting types
Design: awarded based on the inclusion of design elements and construction requirements.
As-built: includes sustainability performance measured during construction and built into the infrastructure asset.
Operation: given after 24 months of operation, based on the measured green performance of operating infrastructure.

Table XIII. AGIC's awards
Not eligible: <25; Commended: 25 to <50; Excellent: 50 to <75; Leading: 75-100.
LCA
In LCA, the environmental impacts of activities and raw materials over a building's life
cycle (manufacture, transportation, deconstruction and recycling) are assessed. The
four phases in LCA are: goal and scope, life cycle inventory, life cycle impact
assessment and improvement (Hsu, 2010).
In setting a goal and scope, a person may wish to investigate, for example, which
structural design has a lesser environmental impact, or how a structure can be
further improved to lessen its impact on the environment. Life cycle inventory
involves collecting inflow and outflow data and modelling. Life cycle impact involves
selecting, classifying and characterising the impact on the environment. The final
phase of improvement involves revisiting the earlier phases. This is necessary
in order to identify the most important aspects of impact assessment, check
the validity of results and redo aspects of the LCA that need more work
(Hsu, 2010, p. 15).
Mroueh et al. (2001) use LCA for road and earth construction. Mithraratne and Vale
(2004) take into account embodied and operating energy requirements as well as life
cycle costs over the useful life of a house. Many countries have also developed specific
LCA tools, for example, BEES in the USA, BOUSTEAD and ENVEST in England,
Ecoinvent in Switzerland and GaBi in Germany (see Appendix Table AI).
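As a minimal illustration of the goal/scope and inventory phases described above, the sketch below totals embodied CO2 from a bill of materials; the emission factors and quantities are invented for illustration and do not come from any LCA database.

```python
# Minimal life cycle inventory/impact sketch: embodied CO2 of a design.
# Emission factors (kg CO2 per kg material) are illustrative, not real data.

EMISSION_FACTORS = {"concrete": 0.13, "steel": 1.85, "timber": 0.45}

def embodied_co2(bill_of_materials):
    """Sum material masses (kg) multiplied by their per-kg CO2 factors."""
    return sum(mass * EMISSION_FACTORS[m] for m, mass in bill_of_materials.items())

# Goal/scope: compare two structural designs on embodied CO2.
design_a = {"concrete": 200_000, "steel": 15_000}
design_b = {"concrete": 150_000, "steel": 10_000, "timber": 20_000}

print(embodied_co2(design_a), embodied_co2(design_b))
```

A full LCA would extend the inventory beyond manufacture to transportation, deconstruction and recycling, and the improvement phase would revisit these inputs after checking the validity of the results.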
IO analysis
IO analysis, as used in macroeconomic studies of monetary flows, has been adapted to
environmental impact analysis (Piluso et al., 2008). IO tables (Xu et al., 2010) have rows
representing outputs, and columns representing inputs. From an IO table, a matrix of
IO coefficients can be derived. These IO coefficients (also known as technical
coefficients) represent the amount of input required to produce one unit of output
(see Xu et al., 2010 for a mathematical formulation). For sustainability analysis,
typically a simplified IO matrix is adopted (Piluso et al., 2008; Xu et al., 2010) and the
technical coefficients could potentially help answer questions such as how much CO2
has been emitted in the production of one tonne of steel (Born, 1996). Norman et al.
(2006) use IO and LCA combined to estimate the energy use and greenhouse gas
emissions associated with the manufacture of construction materials for infrastructure,
buildings and transportation.
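The technical-coefficients idea can be sketched with a small numerical example under invented two-sector data: the total output needed to deliver one unit of final demand is obtained by solving the Leontief system, and direct emission intensities are then applied to it. None of the figures are real.

```python
# Sketch of environmental input-output analysis (two sectors, invented data).
# A[i][j]: input from sector i needed per unit output of sector j (technical coefficients).
A = [[0.1, 0.3],
     [0.2, 0.1]]
direct_co2 = [2.0, 0.5]   # tonnes CO2 emitted per unit of each sector's own output
demand = [0.0, 1.0]       # final demand: one unit of sector 2's product (e.g. steel)

# Total output x solves (I - A) x = demand; invert the 2x2 system directly.
a, b = 1 - A[0][0], -A[0][1]
c, d = -A[1][0], 1 - A[1][1]
det = a * d - b * c
x = [(d * demand[0] - b * demand[1]) / det,
     (-c * demand[0] + a * demand[1]) / det]

# Embodied CO2 in the delivered unit includes all upstream production.
total_co2 = sum(e * xi for e, xi in zip(direct_co2, x))
print(round(total_co2, 3))  # → 1.4
```

Note that the embodied figure (1.4 t) exceeds sector 2's direct intensity (0.5 t) because the Leontief solution picks up the chain of upstream inputs, which is exactly the question posed above about the CO2 emitted in producing one tonne of steel.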
MFA
MFA is used to characterise the flow of materials, products and substances in a defined
system (Huang and Tsu, 2003; Kahhat and Williams, 2012). It applies a conservation
law: total inputs must equal total outputs plus any net accumulation (EUROSTAT,
2001). EUROSTAT (2001, pp. 20-24) suggests distinguishing between material
flows: direct vs indirect; used vs unused; and domestic vs rest of the world.
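The conservation law can be sketched as a simple balance check; all flow figures below are illustrative.

```python
# Sketch of a material flow accounting (MFA) balance check.
# Flows in tonnes; the figures are illustrative only.

def mass_balance_gap(inputs, outputs, net_accumulation):
    """Conservation law: total inputs = total outputs + net accumulation."""
    return sum(inputs.values()) - sum(outputs.values()) - net_accumulation

inputs = {"domestic extraction": 120.0, "imports": 30.0}
outputs = {"exports": 25.0, "emissions and waste": 65.0}
stock_growth = 60.0   # materials accumulating in buildings/infrastructure

print(mass_balance_gap(inputs, outputs, stock_growth))  # → 0.0
```

A non-zero gap signals a flow that was missed or double-counted somewhere in the defined system boundary.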
PaLATE
PaLATE is a spreadsheet-based tool used in the assessment of environmental and
economic impacts for pavements and roads. The tool depends on knowing the design,
roadway cost, construction materials and transportation information (both mode and
distances), as well as any road maintenance involved. Among the
environmental effects covered by PaLATE are energy consumption, CO2
emissions, NOx emissions, PM10 emissions, CO emissions and leachate (Horvarth
et al., 2007).
Table XIV. Similarities across reporting tools
Item: BREEAM | LEED | CASBEE | Green Star | HK-BEAM
Stages of project: design, review and construction | design, review and construction | preliminary design, construction design and post design | design and operation | design review and as-built
Presentation: percentage of credits achieved (%) | total score | graphical representation (BEE graph) | total score | percentage of credits achieved (%)
Reporting at the design stage exists for all tools. In terms of the final presentation,
BREEAM and HK-BEAM reporting is in the form of a percentage of credits achieved,
CASBEE's reporting is done in the form of a BEE graph as illustrated in Figure 1, while
the reporting in LEED and Green Star is done in terms of a total score.
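CASBEE's BEE presentation can be sketched as a ratio and classification. The BEE = Q/L definition and the boundary values follow the BEE graph of Figure 1; the residual class below BEE = 0.5 is labelled C here as an assumption, and the input scores are illustrative.

```python
# Sketch of CASBEE's Building Environmental Efficiency (BEE) classification.
# BEE = Q (environmental quality score) / L (environmental load score),
# with class boundaries as drawn on the BEE graph (Figure 1).

def bee_class(q, load):
    bee = q / load
    if bee >= 3.0:
        return "S"
    if bee >= 1.5:
        return "A"
    if bee >= 1.0:
        return "B+"
    return "B" if bee >= 0.5 else "C"

print(bee_class(q=75, load=25))  # BEE = 3.0 → S
```

Unlike an additive total score, the ratio form rewards simultaneously raising quality and lowering load, which is why CASBEE's presentation is graphical rather than a single summed number.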
Nguyen and Altan (2011) compare several attributes of these five building SRTs by
scoring them based on: popularity and influence, availability, user-friendliness,
applicability, data collecting process, development and quality of results presentation.
The outcomes of their assessment (Table XV) show that BREEAM and LEED may be
better in terms of applicability and popularity. CASBEE, on the other hand, gives the
highest methodology score, possibly due to its more rigorous nature.
The other strengths of these mainstream SRTs are summarised as follows.
BREEAM:
.
LEED:
. criteria are publicly reviewed by more than 15,000 members and are arguably
more transparent compared to BREEAM (LEED, 2012);
. credits are allocated for the heat island effect (trees as shades, or the specification
of high solar reflectance materials) (LEED, 2009a);
. allows for credit interpretation requests (CIR) in the event that developers have an
alternative means of meeting a credit point.
CASBEE:
.
CASBEE:
.
Table XV. Overall score of reporting tools
Attribute: BREEAM | LEED | CASBEE | Green Star | HK-BEAM
Popularity (/10): 10 | 10 | 6 | 5 | 5
Availability (/10): 7 | 7 | 7 | 8 | 8
Methodology (/15): 11 | 10 | 13 | 9 | 11
Applicability (/20): 13 | 13 | 11.5 | 10 | 9
Data-collecting process (/10): 7 | 7 | 6 | 9 | 8
Accuracy (/10): 8 | 7 | 9 | 5 | 5
User-friendliness (/10): 8 | 10 | 6 | 8 | 8
Development (/10): 8 | 8 | 7 | 8 | 8
Presentation (/5): 3 | 3 | 4 | 3 | 4
Final score (/100): 75 | 75 | 69.5 | 65 | 66
Source: Nguyen and Altan (2011)
HK-BEAM:
. mandatory adoption of BEAM Plus Version 1.2 from 1 January 2013, which
allows for standardisation and better comparability (HKGBC and BEAM Society,
2012);
. allows for CIR in the event that developers have an alternative means of
meeting a credit point.
Green Star:
.
Allows for compliance interpretation request (CIR) in the event that developers
have an alternative means of meeting a credit point.
These building SRTs have also been criticised for a lack of attention to life cycle
perspectives. Bowyer et al. (2006) claim that there is no requirement for consideration of
life cycle inventory data in LEED; Scheuer and Keoleian (2002) find LEED to be an
unreliable sustainability assessment tool when looked at from a life cycle perspective.
However, a majority of SRTs have now started or are in the process of incorporating
life cycle thinking.
Other concerns regarding SRTs have been raised. Baird (2009) argues for the
inclusion of user performance criteria, claiming that buildings which perform poorly
from a user's point of view are unlikely to be sustainable. Chew and Das (2007, p. 10)
highlight that one issue with SRTs is that scores are lost for credits beyond
the scope of a project: for example, sustainable site development or provisions related
to fuel-efficient vehicles are not feasible in the case of a commercial building on a tight
downtown site with a well-established public transport system. Fard (2012) as well as
Fenner and Ryce (2008) argue that point-hunting or green-washing may become an
issue where building owners are only concerned about gaining the required points
for certification without actually addressing pertinent issues relating to energy
efficiency and resource preservation. Saunders (2008) notes that different standards are
used in different SRTs, and this makes it difficult to make comparisons between tools.
Based on a normalised set of conditions, Saunders (2008) claims that LEED (USA) uses
a less rigorous and, to a certain extent, lower building code standard compared with
Green Star (Australia) or BREEAM (UK).
An analysis of 14 SRTs for buildings/infrastructure, carried out by this paper's
authors, is shown in Appendix Tables AII-AIV, and covers: criteria and subcriteria;
the nature of the scoring used; and identification of the international standards embedded.
Note that, for brevity, only the major tools are explained in detail in
this paper; more information about the other listed tools is available on the respective
web sites indicated in Appendix Table AIII. The findings are summarised here:
First, from Appendix Table AII, it is observed that SRTs have a strong
environmental focus, where a majority of them have adopted subcriteria in areas such
as energy, water, waste, land use and ecology, as well as materials. Researchers (Fenner
and Ryce, 2008; Mateus and Braganca, 2011; Watson et al., 2013; Ding, 2008; Todd
et al., 2001) have also highlighted this point across different SRTs. Many SRTs
(CASBEE, EPRA, BCA Green Mark for Districts, Estidama Pearl Community, Green
Globe, Sustainable Design Scorecard, DGNB-Seal, Protocol ITACA) do not explicitly
mention the incorporation of financial aspects in their assessment (Watson et al., 2013).
Only ASPIRE is found to encompass more balanced triple bottom line criteria of
environmental, social and economic. Ding (2008, p. 456) argues for the inclusion of
economic criteria, claiming that even though a project may be environmentally sound
it could be very expensive to develop. While it might be reasoned that economic
concerns are not downplayed by SRTs because they have been taken into
account under environmental impacts (e.g. cost savings that emanate from energy
reduction; see LEED, 2009a), this does not appear to be the case for the majority of
SRTs. CASBEE, for example, makes no explicit mention of cost issues in its
assessment. As well, from Appendix Table AIV, it is clear that ISO 14001, concerning
environmental management, is the most commonly embedded standard across SRTs
such as BREEAM, Green Star, AGIC, BCA Green Mark for Districts, Estidama Pearl
Community, Green Globe, HK-Beam and DGNB Seal. Only one SRT out of the 14
analysed, namely HK-BEAM, has included OHSAS 18001, a standard in health
and safety.
Second, from Appendix Table AIII, it can be observed that all SRTs analysed have
adopted deterministic scoring which does not account for variability or uncertainty in
value judgments. Uncertainty in judgments needs to be accounted for because of the
existence of subjective criteria and measurement scales. Some examples follow.
In Green Star, the building user's guide criterion provides that 1 point is awarded for an
easy-to-use guide that includes information relevant to users, occupants and tenants.
However, different people will have different understandings of what constitutes
easy-to-use, and they may not arrive at the same score. The conditional requirement
set under the energy criterion states that, before points can be awarded against
greenhouse gas emissions, the project's predicted greenhouse gas emissions must not
exceed 110 kg CO2/m2/annum as determined according to energy modelling tools
(referring to either the Australian Building Greenhouse Rating Validation Protocol or
the Green Star energy calculator). Uncertainty lies in these energy modelling tools, and
calculations may differ depending on how optimistic or pessimistic the inputs of
the phase of the project, whether it is at pre-design or post-design phase. The
environmental design initiative criterion in Green Star states that 1 point is awarded if
an initiative in a project viably addresses a valid environmental concern outside
the scope of the tool; opinions may differ as to what constitutes a valid environmental
concern. LEED incorporates post-occupancy thermal comfort surveys which are
based on the value judgement of users and are carried out in a deterministic manner
(LEED, 2012). Insua and French (1991), Wolters and Mareschal (1995) and Hyde et al.
(2004) demonstrate that uncertainty in input parameters needs to be incorporated
into decision-making processes due to its influence on the ranking of alternatives. In
summary, SRTs largely ignore uncertainty in value judgements and behavioural
issues, which have the potential to affect a building's overall performance. Fenner and
Ryce (2008) share a similar view, arguing that assessors' opinions tend to differ and
that inconsistencies are therefore unavoidable in sustainability assessments.
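The effect of such judgement uncertainty on a deterministic result can be illustrated with a small Monte Carlo sketch; the criterion means, spreads and award thresholds below are invented for illustration and are not taken from any actual SRT.

```python
# Monte Carlo sketch: how assessor-judgement uncertainty can change an award.
# Mean scores, spreads and thresholds are illustrative, not from a real SRT.
import random

random.seed(1)

def award(total):
    if total >= 75: return "6 star"
    if total >= 60: return "5 star"
    if total >= 45: return "4 star"
    return "below certification"

# Each criterion: (mean weighted points, assessor spread).
criteria = {"energy": (30, 4), "water": (15, 3), "IEQ": (16, 3)}

counts = {}
for _ in range(10_000):
    total = sum(random.gauss(mean, sd) for mean, sd in criteria.values())
    a = award(total)
    counts[a] = counts.get(a, 0) + 1

# A deterministic tool reports a single award for the mean total of 61;
# sampling shows how often assessor variability flips the outcome.
print(award(sum(m for m, _ in criteria.values())), counts)
```

With the mean total (61) sitting just above the 5-star threshold, a substantial fraction of the sampled assessments falls into the band below, which is precisely the variability that deterministic scoring conceals.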
Third, not all SRTs consider criteria weighting. From Appendix Table AIII, seven
out of 14 SRTs (ASPIRE, AGIC, BCA Green Mark for Districts, Estidama Pearl
Community, Green Globe, Sustainable Design Scorecard and Protocol ITACA) do not
explicitly mention criteria weighting. There is currently no consensus to guide the
derivation of the weightings. CASBEE, for example, derives its weighting from a
survey of building owners, operators and designers. Green Star's weightings, in turn,
draw on a national survey conducted by the Green Building Council, which informed
the development of the tool and assisted in assessing regional variation.
Understanding how the weightings are derived, and who the stakeholders consulted
are, is important because this will have a bearing on the overall assessment of buildings/
infrastructure. Future harmonisation efforts should consider this aspect in an attempt
to develop an overarching standard for weightings or, at the very least, there needs
to be specific mention or justification of the process involved in deriving these
weightings.
Fourth, there are issues over benchmarks and relative comparisons, based on the
scoring used by the tools. Sharifi and Murayama (2013) argue that the benchmarks set
are non-scientific. Mitchell (2010) claims that Green Star has been criticised for being
too idealistic, showing hallmarks of something developed by architects rather than
people with practical experience in the commercial building industry. As an example,
for Green Star under the IEQ criterion, 1 point is awarded when a daylight factor of 2
per cent is achieved over 90 per cent of the nominated area and 1 point is awarded
when high-frequency ballasts are installed in fluorescent luminaires over a minimum
of 95 per cent of the nominated area. The question remains as to how these standards
are set. There is no empirical evidence to justify that achieving a daylight factor of 2
per cent over the 90 per cent nominated area is actually beneficial to stakeholders.
Fenner and Ryce (2008) highlight that building SRTs rely heavily on designers to
estimate the amount of energy and resources consumed by building occupiers.
Fifth, there is a lack of published reasoning behind the scores allocated for each
criterion, suggesting that users may be applying these tools without really
understanding what lies behind them. Berardi (2012) and Ding (2008) claim that the
reasons behind the selection of criteria, allocation of scores and weights are not explicit.
SRTs are designed based on opinions, as opposed to a rigorous analysis of building/
infrastructure effects on the environment, economy and society (Fard, 2012; Fowler and
Rauch, 2006; Rumsey and McLellan, 2005 cited in Berardi, 2012; Udall and Schendler,
2005). For example, under the ecological value criterion for Green Star, 4 credit points
are available when the site has no threatened or vulnerable species, no reduction of
native vegetation cover or if the ecological value of the site is not diminished. No
further explanation is provided as to why these criteria are proposed or the reason
behind the allocation of 4 credit points. It is questionable whether rewarding credit
points for such criteria in this manner leads to better environmental outcomes: a
project developer may have coincidentally acquired land that happens to meet all of
the above criteria, without applying any additional effort. It would be helpful if more
detailed rationalisation and explanation accompanied the proposed criteria, including
justification of why certain criteria are allocated more credit points than others.
Table XVI. Comparison between Green Star and NABERS awards (energy and water)
for building occupants A-G; awards range from 2 star to 5.5 star, with several ratings
not available (na).
- Expanding the list of criteria to include more measurable social and economic
issues; current SRTs for buildings are predominantly focused on the environment.
- Currently, there are tools that report potential environmental impact and tools
that report actual performance; future research could look into harmonising the
two to reduce confusion. Considerable work will be required around the
integration process, and Mitchell (2010) suggests that a single agency
responsible for both tools would speed up this process.
References
Adegbile, M.B.O. (2013), Assessment and adaptation of an appropriate green building rating
system for Nigeria, Journal of Environment and Earth Science, Vol. 3 No. 1, pp. 1-11.
AGIC (2012), Australian Green Infrastructure Council IS Rating Scheme, Australian Green
Infrastructure Council (AGIC), Sydney, available at: www.agic.net.au/ISratingscheme1.htm
(accessed 7 January 2013).
Baird, G. (2009), Incorporating user performance criteria into building sustainability rating
tools (BSRTs) for buildings in operation, Sustainability, Vol. 1 No. 4, pp. 1069-1086.
BASF (2012), Life cycle Analyzer, BASF, Ludwigshafen, available at: www.basf.com/group/
corporate/en/function/conversions:/publish/content/news-and-media-relations/newsreleases/downloads/2012/P210_Life_Cycle_Analizer_e.pdf (accessed 27 January 2013).
Berardi, U. (2012), Sustainability assessment in construction sector: rating systems and rated
buildings, Sustainable Development, Vol. 20 No. 6, pp. 411-424.
Bonham-Carter, C. (2010), Sustainable communities in the UK, cited in Sharifi and Murayama
(2013).
Born, P. (1996), Input-output analysis: input of energy, CO2 and work to produce goods, Journal
of Policy Modelling, Vol. 18 No. 2, pp. 217-221.
Bowyer, J., Howe, J., Fernholz, K. and Lindburg, A. (2006), Designation of Environmentally
Preferable Building Materials: Fundamental Change Needed Within LEED, Dovetail
Partners Inc, Minneapolis, MN, available at: www.dovetailinc.org/files/Dovetail
LEED0606.pdf (accessed 27 January 2013).
BREEAM (2012), What is BREEAM? Building Research Establishment (BRE), Watford, available
at: www.breeam.org/about.jsp?id=66 (accessed 20 November 2012).
Buchanan, A.H. and Honey, B.G. (1994), Energy and carbon dioxide implications of building
construction, Energy and Buildings, Vol. 20 No. 3, pp. 205-217.
CASBEE (2002), The Assessment Method Employed by CASBEE, Japan Green Build Council
and Japan Sustainable Building Consortium, available at: www.ibec.or.jp/CASBEE/
english/methodE.htm (accessed 20 November 2012).
Centre for International Economics Canberra and Sydney (2007), Embodied carbon metrics will
avoid higher than desired carbon content and additional costs, cited in Davis Langdon,
available at: www.davislangdon.com/ANZ/Sectors/Sustainability/ECM/ (accessed 24
March 2012).
Chan, P. and Chu, C. (2010), HK-BEAM (Hong Kong Building Environmental Assessment
Method): Assessing Healthy Buildings, BEAM Society, Hong Kong, available at:
www.mixtechnology.com/files/download/HK_BEAM.pdf (accessed 6 January 2012).
Chew, M.Y.L. and Das, S. (2007), Building grading systems: a review of the state-of-the-art,
Architectural Science Review, Vol. 51 No. 1, pp. 3-13.
Cole, R.J. (1999), Building environmental assessment methods: clarifying intentions, Building
Research and Information, Vol. 27 Nos 4-5, pp. 230-246.
Cole, R.J. (2005), Building environmental assessment methods: redefining intentions and roles,
Building Research and Information, Vol. 35 No. 5, pp. 455-467.
Crawley, D. and Aho, I. (1999), Building environmental assessment methods: applications and
development trends, Building Research and Information, Vol. 27 Nos 4-5, pp. 300-308.
Department of Climate Change and Energy Efficiency (2010), NaTHERS, Department of Climate
Change and Energy Efficiency, Canberra, available at: www.climatechange.gov.au/what-you-need-to-know/buildings/homes/~/media/publications/buildings/nationwide-home-energy-rating-scheme.pdf (accessed 23 February 2013).
Ding, G.K.C. (2008), Sustainable construction the role of environmental assessment tools,
Journal of Environmental Management, Vol. 86 No. 3, pp. 451-464.
EIA (2012), 2012 Commercial Buildings Energy Consumption Survey (CBECS), US Department
of Energy, Washington, DC, available at: www.eia.gov/survey/form/eia_871/2012/
2012%20CBECS%20Form%20871A%20Building%20Questionnaire.pdf (accessed 10
January 2013).
Eichholtz, P., Kok, N. and Quigley, J.M. (2009), Doing well by doing good?, working paper,
Green Office Buildings, Centre for the Study of Energy Markets (CSEM), Berkeley, CA,
August.
Energy STAR (2011), Energy STAR Performance Ratings Technical Manual, US Environmental
Protection Agency and US Department of Energy, available at: www.energystar.gov/ia/
business/evaluate_performance/General_Overview_tech_methodology.pdf (accessed 7
December 2011).
EUROSTAT (2001), Economy-Wide Material Flow Accounts and Derived Indicators: A
Methodological Guide, European Commission, Luxembourg, available at: http://
epp.eurostat.ec.europa.eu/portal/page/portal/environmental_accounts/documents/3.pdf
(accessed 27 January 2013).
Fard, N.H. (2012), Energy-based sustainability rating system for buildings: case study of
Canada, Masters thesis of Applied Science, The University of British Columbia,
Okanagan.
Fenner, R.A. and Ryce, T. (2008), A comparative analysis of two building rating systems, part I:
evaluation, Engineering Sustainability, Vol. 161 No. 1, pp. 55-63.
Fernandez-Sanchez, G. and Rodríguez-López, F. (2010), A methodology to identify sustainability
indicators in construction project management application to infrastructure projects in
Spain, Ecological Indicators, Vol. 10 No. 6, pp. 1193-1201.
Fowler, K.M. and Rauch, E.M. (2006), Sustainable Building Rating Systems Summary, Pacific
Northwest National Laboratory, US Department of Energy, Battelle Washington, DC.
Fuerst, F. (2009), Building momentum: an analysis of investment trends in LEED and
Energy STAR-certified properties, Journal of Retail and Leisure Property, Vol. 8 No. 4,
pp. 285-297.
Fuerst, F. and McAllister, P. (2008), Pricing sustainability: an empirical investigation of the value
impacts of green building certification, working paper presented at ARES, cited in Miller
et al. (2008).
Global Reporting Initiative (GRI) (2011), Sustainability Reporting Guidelines v3.1, GRI,
Amsterdam, available at: www.globalreporting.org/resourcelibrary/G3.1-Sustainability-Reporting-Guidelines.pdf (accessed 24 January 2013).
Green Building Council of Australia (GBCA) (2012a), Green Star-Rating Tools, Green Building
Council of Australia (GBCA), Sydney, available at: www.gbca.org.au/green-Star/rating-tools/ (accessed 20 November 2012).
Green Building Council of Australia (GBCA) (2012b), Green Star-Performance Benefits, Green
Building Council of Australia (GBCA), Sydney, available at: www.gbca.org.au/green-Star/
green-Star-performance/benefits/ (accessed 3 January 2012).
Green Building Council of Australia (GBCA) (2012c), Green Star Project Directory, Green
Building Council of Australia (GBCA), Sydney, available at: www.gbca.org.au/project-directory.asp (accessed 12 December 2012).
Gryc, H. (2012), Sustainable Structural Engineering-Gaining Greater Benefits to Communities in
Developing Nations, Structures Congress, ASCE, Chicago, IL, available at: http://
ascelibrary.org/doi/pdf/10.1061/9780784412367.178 (accessed 20 November 2012).
Heerwagen, J. (2000), Green buildings, organizational success and occupant productivity,
Building Research and Information, Vol. 28 Nos 5-6, pp. 353-367.
Hes, D. (2007), Effectiveness of green building rating tools: a review of performance, The
International Journal of Environmental, Cultural, Economic & Social Sustainability, Vol. 3
No. 4, pp. 143-152.
HKGBC and BEAM Society (2012), BEAM Plus Existing Buildings Version 1.2, BEAM Society,
Hong Kong, available at: www.beamsociety.org.hk/files/BEAM_Plus_For_Existing_
Buildings_Version_1_2.pdf (accessed 6 January 2013).
Horvarth, A., Pacca, S., Masanet, E. and Canapa, R. (2007), PaLATE, University of California,
Berkeley, CA, available at: www.ce.berkeley.edu/~horvath/palate.html (accessed 27 December 2012).
Hsu, S.L. (2010), Life cycle assessment of materials and construction in commercial structures:
variability and limitations, Masters thesis, Massachusetts Institute of Technology,
Cambridge, MA.
Huang, S.-L. and Hsu, W.-L. (2003), Materials flow analysis and emergy evaluation of Taipei's
urban construction, Landscape and Urban Planning, Vol. 63 No. 2, pp. 61-74.
Hurley, J. and Horne, R. (2006), Review and analysis of tools for the implementation and
assessment of sustainable urban development, Environment Institute of Australia and
New Zealand (EIANZ) report, Adelaide.
Hyde, K.M., Maier, H.R. and Colby, C.B. (2004), Reliability-based approach to multi-criteria
decision analysis for water resources, Journal of Water Resources Planning and
Management, Vol. 130 No. 6, pp. 429-438.
Insua, D.R. and French, S. (1991), A framework for sensitivity analysis in discrete
multi-objective decision making, European Journal of Operational Research, Vol. 54 No. 2,
pp. 176-190.
Kahhat, R. and Williams, E. (2012), Materials flow analysis of e-waste: domestic flows and
exports of used computers from the United States, Resources, Conservation and Recycling,
Vol. 67, pp. 67-74.
Kordjamshidi, M., King, S. and Prasad, D. (2006), Why are rating schemes always wrong?
Regulatory frameworks for passive design and energy efficiency, 23rd Conference on
Passive and Low Energy Architecture, Geneva, 6-8 September, available at: www.unige.ch/
cuepe/html/plea2006/Vol2/PLEA2006_PAPER176.pdf (accessed 23 February 2013).
Lee, Y.S. and Guerin, D.A. (2010), Indoor environmental quality differences between office
types in LEED-certified buildings in the US, Building and Environment, Vol. 45 No. 5,
pp. 1104-1112.
Lee, W.L., Yik, F.W.H. and Burnett, J. (2007), Assessing energy performance in the latest
versions of Hong Kong building environmental assessment method (HK-BEAM), Energy
and Buildings, Vol. 39 No. 3, pp. 343-354.
LEED (2009a), LEED for new construction and major renovations v2009, available at: http://
new.usgbc.org/credits/new-construction/v2009 (accessed 26 January 2013).
LEED (2009b), LEED for existing buildings: operations and maintenance v2009, available at:
http://new.usgbc.org/credits/existing-buildings/v2009 (accessed 26 January 2013).
LEED (2012), What is LEED? US Green Building Council, Washington, DC, available at: https://
new.usgbc.org/leed (accessed 20 November 2012).
Levermore, G.J. (2008), A review of the IPCC assessment report four: part I: the IPCC process
and greenhouse gas emission trends from buildings worldwide, Building Services
Engineering Research and Technology, Vol. 29 No. 4, pp. 349-361.
Lockwood, C. (2006), Building the green way, Tool kit, Harvard Business Review, Harvard, MA,
pp. 1-9, available at: http://ecologicdesignlab.com/files/Eco-Urban/VIII.1_HBR_building_
green_way.pdf (accessed 12 December 2012).
Lozano, R. and Huisingh, D. (2011), Inter-linking issues and dimensions in sustainability
reporting, Journal of Cleaner Production, Vol. 19 Nos 2-3, pp. 99-107.
Mateus, R. and Braganca, L. (2011), Sustainability assessment and rating of buildings:
developing the methodology SBToolPTH, Building and Environment, Vol. 46 No. 10,
pp. 1962-1971.
Miller, N., Spivey, J. and Florance, A. (2008), Does green pay off?, Journal of Real Estate
Portfolio Management, Vol. 14 No. 4, pp. 385-399.
Mitchell, L.M. (2010), Green Star and NABERS: learning from the Australian experience with
green building rating tools in Bose, R.K. (Ed.), Energy Efficient Cities: Assessment Tools
and Benchmarking Practices, The International Bank for Reconstruction and
Development/The World Bank, Washington, DC, pp. 93-124.
Mithraratne, N. and Vale, B. (2004), Life cycle analysis model for New Zealand houses, Building
and Environment, Vol. 39 No. 4, pp. 483-492.
Morrissey, J., Iyer-Raniga, U., McLaughin, P. and Mills, A. (2012), A strategic appraisal
framework for ecologically sustainable urban infrastructure, Environmental Impact
Assessment Review, Vol. 33 No. 1, pp. 55-65.
Mroueh, U.-M., Eskola, P. and Laine-Ylijoki, J. (2001), Life-cycle impacts of the use of industrial
by-products in road and earth construction, Waste Management, Vol. 21 No. 3,
pp. 271-277.
NABERS (2011), Preparing for NABERS Office Rating Application, NSW Office of Environment
and Heritage, Sydney, available at: www.nabers.gov.au/public/WebPages/DocumentHandler.ashx?docType=3&id=15&attId=0 (accessed 6 January 2013).
NABERS (2012), Rating Register, NSW Office of Environment and Heritage, Sydney, available at:
www.nabers.gov.au/public/WebPages/ContentStandard.aspx?module=30&template=3&id=310&side=Ratingsfrom20July.htm (accessed 7 January 2013).
Newell, G., MacFarlane, J. and Kok, N. (2011), Building Better Returns: A Study of the Financial
Performance of Green Office Buildings in Australia, Australian Property Institute and
Property Funds Association, University of Western Sydney and the University of
Maastricht in conjunction with Jones Lang LaSalle and CBRE, Australian Property
Institute, Sydney, available at: www.api.org.au/assets/media_library/000/000/219/original.pdf?1315793106 (accessed 5 January 2013).
Newsham, G.R., Mancini, S. and Birt, B.J. (2009), Do LEED-certified buildings save energy? Yes,
but, Energy and Buildings, Vol. 41 No. 8, pp. 897-905.
New South Wales (NSW) Government (2013), Environmental Planning and Assessment
Regulation 2000, New South Wales (NSW) Government, Sydney, available at:
www.legislation.nsw.gov.au/maintop/view/inforce/subordleg+557+2000+cd+0+N
(accessed 24 February 2013).
Nguyen, B.K. and Altan, H. (2011), Comparative review of five sustainable rating systems,
Procedia Engineering, Vol. 21, pp. 376-386.
Norman, J., MacLean, H. and Kennedy, C. (2006), Comparing high and low residential density:
life-cycle analysis of energy use and greenhouse gas emissions, Journal of Urban
Planning and Development, Vol. 132 No. 1, pp. 10-21.
NSW Government Planning and Infrastructure (2013), Using the BASIX Assessment Tool, NSW
Government Planning and Infrastructure, Sydney, available at: www.basix.nsw.gov.au/
basixcms/getting-started/using-the-basix-assessment-tool.html (accessed 23 February
2013).
Piluso, C., Huang, Y. and Lou, H.L. (2008), Ecological input-output analysis-based sustainability
analysis of industrial systems, Industrial & Engineering Chemistry Research, Vol. 47
No. 6, pp. 1955-1966.
Reed, R., Bilos, A., Wilkinson, S. and Schulte, K.-W. (2009), International comparison of
sustainable rating tools, Journal of Sustainable Real Estate, Vol. 1 No. 1, pp. 1-22.
Ries, R., Bilec, M.M., Gokhan, N.M. and Needy, K.L. (2006), The economic benefits of green
buildings: a comprehensive case study, The Engineering Economist, Vol. 51 No. 3,
pp. 259-295.
Roderick, Y., McEwan, D., Wheatley, C. and Alonso, C. (2009), Comparison of energy
performance assessment between LEED, BREEAM and Green Star, 11th International
IBPSA Conference, Glasgow, July 27-30.
Rumsey, P. and McLellan, J.F. (2005), The green edge: the green imperative, Environmental
Design and Construction, pp. 55-56, cited in Berardi (2012).
Sahely, H.R., Kennedy, C.A. and Adams, B.J. (2005), Developing sustainability criteria for urban
infrastructure systems, Canadian Journal of Civil Engineering, Vol. 32 No. 1, pp. 72-85.
Saunders, T. (2008), A Discussion Document Comparing International Environmental Assessment
Methods for Buildings, Building Research Establishment (BRE), Watford, available at:
www.dgbc.nl/images/uploads/rapport_vergelijking.pdf (accessed 29 July 2012).
Scheuer, C.W. and Keoleian, G.A. (2002), Evaluation of LEED Using Life Cycle Assessment
Methods, National Institute of Standards and Technology (NIST), Gaithersburg, MD,
available at: www.fire.nist.gov/bfrlpubs/build02/PDF/b02170.pdf (accessed 27 January
2013).
Sev, A. (2011), A comparative analysis of building environmental assessment tools and
suggestions for regional adaptations, Civil Engineering and Environmental Systems,
Vol. 28 No. 3, pp. 231-245.
Sharifi, A. and Murayama, A. (2013), A critical review of seven selected neighbourhood
sustainability assessment tools, Environmental Impact Assessment Review, Vol. 38,
pp. 73-87.
Shen, L.-Y., Hao, J.L., Tam, V.W.-Y. and Yao, H. (2007), A checklist for assessing sustainability
performance of construction projects, Journal of Civil Engineering and Management,
Vol. XIII No. 4, pp. 273-281.
Singh, R.K., Murty, H.R., Gupta, S.K. and Dikshit, A.K. (2009), An overview of sustainability
assessment methodologies, Ecological Indicators, Vol. 9 No. 2, pp. 189-212.
Todd, J.A., Crawley, D., Geissler, S. and Lindsey, G. (2001), Comparative assessment of
environmental performance tools and the role of the green building challenge, Building
Research and Information, Vol. 29 No. 5, pp. 324-335.
Torcellini, P., Pless, S., Deru, M., Griffith, B., Long, N. and Judkoff, R. (2006), Lessons learned
from case studies of six high-performance buildings, Technical Report No. NREL/TP-550-37542, National Renewable Energy Laboratory, Golden, CO, June.
Tronchin, L. and Fabbri, K. (2008), Energy performance building evaluation in Mediterranean
countries: comparison between software simulations and operating rating simulation,
Energy and Buildings, Vol. 40 No. 7, pp. 1176-1187.
Udall, R. and Schendler, A. (2005), LEED is Broken; Let's Fix It, iGreenBuild.Com, San Mateo,
CA, available at: www.igreenbuild.com/cd_1706.aspx (accessed 27 January 2013).
Ugwu, O.O. and Haupt, T.C. (2007), Key performance indicators and assessment methods for
infrastructure sustainability a South African construction industry perspective,
Building and Environment, Vol. 42 No. 2, pp. 665-680.
Ürge-Vorsatz, D. and Novikova, A. (2008), Potential and costs of carbon dioxide mitigation in
the world's buildings, Energy Policy, Vol. 36 No. 2, pp. 642-661.
USGBC (2011), Buildings and Climate Change, US Green Building Council, Washington, DC,
available at: www.documents.dgs.ca.gov/dgs/pio/facts/LA%20workshop/climate.pdf
(accessed 20 June 2012).
Vijayan, A. and Kumar, A. (2005), A review of tools to assess the sustainability in building
construction, Environmental Progress, Vol. 24 No. 2, pp. 125-132.
Watson, P., Jones, D. and Mitchell, P. (2013), Are Australian building eco-assessment tools
meeting stakeholder decision making needs?, available at: www.construction-innovation.
info/images/pdfs/Research_library/ResearchLibraryB/RefereedConferencePapers/Are_
Australian_building_eco-assessement_tools.pdf (accessed 3 May 2013).
Williamson, T.J., O'Shea, S. and Menadue, V. (2001), NatHERS: science and non-science, 35th
ANZAScA Conference, Wellington, November, available at: www.pc.gov.au/__data/assets/
pdf_file/0017/45116/sub028attachment1.pdf (accessed 24 February 2013).
Wiley, J.A., Benefield, J.D. and Johnson, K.H. (2010), Green design and the market for
commercial office space, Journal of Real Estate Finance and Economics, Vol. 41 No. 2,
pp. 228-243.
Wolters, W.T.M. and Mareschal, B. (1995), Novel types of sensitivity analysis for
additive MCDM methods, European Journal of Operational Research, Vol. 81 No. 2,
pp. 281-290.
World Economic Forum (2011), A Profitable and Resource Efficient Future: Catalysing Retrofit
Finance and Investing in Commercial Real Estate, World Economic Forum, Geneva,
October, available at: www3.weforum.org/docs/WEF_IU_CatalysingRetrofitFinanceInvestingCommercialRealEstate_Report_2011.pdf (accessed 6 January 2013).
Xu, M., Allenby, B. and Kim, J. (2010), Input-Output Analysis for Sustainability, Centre for
Sustainable Engineering, Tempe, AZ, available at: www.ce.cmu.edu/~cse/5aug07%20Allenby%20IO.pdf (accessed 27 January 2013).
Further reading
AS/NZS:4801 (2001), Occupational Health and Safety Management Systems Specification With
Guidance for Use, Standards Australia, Sydney, available at: http://infostore.saiglobal.com/store2/Details.aspx?ProductID=386329 (accessed 16 October 2012).
BSI (2013), OHSAS 18001 Occupational Health and Safety, BSI, London, available at: www.
bsigroup.com.au/en-au/Assessment-and-Certification-services/Management-systems/
Standards-and-schemes/OHSAS-18001/ (accessed 24 January 2013).
EMAS (2013), What is EMAS?, European Commission, Brussels, available at: http://ec.
europa.eu/environment/emas/index_en.htm (accessed 23 January 2013).
ISO9001 (2008), Quality Management Systems Requirements, ISO, Geneva, available
at: www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=46486
(accessed 16 December 2011).
ISO14001 (2004), Environmental Management Systems Requirements with Guidance for Use,
ISO, Geneva, available at: www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.
htm?csnumber=31807 (accessed 16 December 2011).
Appendix

Table AI. LCA tools

LCA tool    Developer                                               Web link
BOUSTEAD    Boustead Consulting UK                                  www.boustead-consulting.co.uk/
ENVEST      Edge Environment                                        http://edgeenvironment.com.au/envest/
Ecoinvent   Ecoinvent Centre                                        www.ecoinvent.org/database/
GaBi        PE International                                        www.gabi-software.com/australia/software/gabi-software/gabi-5/
BEES        National Institute of Standards and Technology (NIST)   http://ws680.nist.gov/bees/default.aspx

Table AII. Broad comparison of criteria across a selection of SRTs. The table maps
BREEAM, LEED, Green Star, CASBEE, ASPIRE, BCA Green Mark for Districts, EPRA,
Estidama Pearl Community, Green Globe, Sustainable Design Scorecard, BEAM,
DGNB-Seal, Protocol ITACA and AGIC against environmental subcriteria (energy;
water; waste; pollution/emissions/air; land use and ecology; biodiversity; materials),
social subcriteria (management, i.e. integrated process, sustainable procurement, etc.;
health and well-being/IEQ) and economic subcriteria (innovation; equity of economic
opportunity; livelihood opportunity; macroeconomic effects).

Table AIII. Analysis of building/infrastructure reporting tools. For each SRT the table
records the owner, web link, nature of the SRT (deterministic scoring for criteria;
weighting for criteria) and comments: BREEAM (Building Research Establishment,
UK; http://breeam.org), LEED (US Green Building Council; www.usgbc.org), Green Star
(Green Building Council Australia; www.gbca.org.au/green-Star), CASBEE (Japan
Green Build Council and Japan Sustainable Building Consortium; www.ibec.or.jp),
ASPIRE (ARUP and EAP; http://engineersagainstpoverty.com/major-initiatives/aspire.cfm),
BCA Green Mark for Districts (Building and Construction Authority, Singapore;
www.bca.gov.sg/GreenMark/green_mark_buildings.htm), AGIC (Australian Green
Infrastructure Council; www.agic.net.au/AGICschem.htm), EPRA (www.epra.wa.gov.au/),
Estidama Pearl Community (www.estidama.org/), Green Globe (www.greenglobe.org/),
Sustainable Design Scorecard (www.portphilip.vic.gov.au), HK-BEAM
(www.mixtechnology.com/files/download/HK_BEAM.pdf), DGNB-Seal and Protocol
ITACA (ITACA; www.irbdirekt.de/daten/iconda/CIB9084.pdf).

Table AIV. International standards embedded. The table indicates which of ISO 14001,
ISO 9001, AS/NZS 4804, EMAS and OHSAS 18001 are embedded in each of the SRTs
listed in Table AII.