cedure for Ministers instructs ministers to give fair consideration and due
weight to informed and impartial advice from civil servants, but clearly
there is some feeling that all is not working well. Yet this diagnosis of ill-
health has not led to proposals for assessing when that advice is, or is not,
well informed and of good quality.
The New Zealand government has gone further. It has calculated the cost
of policy advice ($NZ 245 million in the 1992–93 financial year), thus
estimating what ministers are purchasing (Boston et al. 1996, p. 129). The
government has devised a performance management framework for exam-
ining quality with seven defining characteristics: purpose, logic, accuracy,
options, consultation, practicality and presentation. Ministers are asked
every three months to indicate their level of satisfaction, as an input to the
review of the performance of departmental secretaries by the State Services
Commission (Hunn 1994, pp. 31–2). But the authors of the initiative
acknowledge that ministers may not always be the best people to judge
quality and that they may have a particular problem in those cases when
the advice was necessary, but not welcome. In addition the evaluation of
advice was part of the process of assessing official performance; policy
advice was not the principal subject of inquiry. Even after this attempt to
calculate where there is the best performance, observers can only comment
that some departments have acquired 'a well-deserved reputation for producing excellent work', whereas others 'have been much less successful' (Boston et al. 1996, p. 133).
That the problem is hard is not in debate. That difficulty creates a
pressing requirement to assess the methodology and impact of those experi-
ments that have been undertaken. In that spirit this article examines an
experiment in Australia where a series of Policy Management Reviews
(PMRs) were undertaken to assess the performance of central agencies in
the provision of advice and, more broadly, in developing that advice. This
article describes the Australian process, explains some of the dilemmas, and
suggests the valuable if limited benefits that can flow from this method
of evaluating policy advice.
vided? there was little evidence to justify the confidence that it was of a
high standard. The central agencies therefore had to develop a means of
assessing policy advice in order to conform to the same standards of evalu-
ation imposed on line departments.
A working party made up from officials of the four central agencies
(Prime Minister and Cabinet, Treasury, Finance and the Public Service
Commission) was established to examine the problem. It first explored the
experience of other nations:
The overseas trip was interesting and useful in teasing out very different
cultural and philosophical approaches to whether, and if so, how the
business of policy advising should be evaluated. With the French it really
was the Gallic shrug, as though to say that we really were quite mad
in wanting to ask any of these questions. Whitehall was somewhat differ-
ent. Paraphrasing Dr Johnson, the British saw it as not a question of
whether you ought to do policy evaluation well. Instead it was a ques-
tion of why on earth would you want to evaluate policy advice at all
given that, self-evidently, anybody in the Whitehall advising policy game
must be doing it well.
The US was quite different again, and useful. Though structurally and
constitutionally distinct from the system that we are accustomed to here,
there was a great deal more yeast and openness in terms of questioning
the mores, the value and the appropriate application of policies (Waller 1996, p. 70).
Yet if some nations were sceptical of the value of the exercise, Australian
central agencies felt they had no choice if they were to answer the questions
being asked by Parliament. The problem was therefore not whether to
evaluate, but how to do so.
The Task Force's report, Performance Assessment of Policy Work (PAPW
1992), was judicious and careful. (An abbreviated version was soon pub-
lished: Waller 1992). It explained the process for developing good advice
and acknowledged the contingent nature of evaluating good policy advice
which involves:
- taking a difficult and sometimes poorly understood problem or issue and structuring it so that it can be thought about in a systematic way;
- gathering the minimum necessary information and applying the appropriate analytical framework;
- formulating effective options, addressing, where necessary, mechanisms for implementation and evaluation; and
- communicating the results of the work to the government in a timely and understandable way (PAPW 1992, pp. 8–9).
The object of the evaluation of policy advice was seen as more than merely consideration of the final paper to the minister; it incorporated the whole process of understanding the problems, co-ordinating the responses and formulating the advice: in essence, the whole policy cycle from issue identification through to implementation and evaluation.
The report noted the need to consider three aspects: inputs (staff, man-
agement practices, information), process and outputs. It acknowledged the
first two were easier to analyse than the utility of the advice. Assessing that
utility, the report pointed out, involves assessing the effectiveness of policy
advice, firstly in terms of its effect on the recipient of that advice, secondly
whether that person adopts the advice and advocates it and thirdly its effect
on the decision-making process. This need to understand causal links
between the quality of the advice and the outcome is made harder by the
problem of judging impact, and deciding whether short-term or long-term
perspectives are the more appropriate time horizons.
Good advice is therefore both procedural (getting the content right) and presentational (telling it as it is). As the report says:
policy advice is an art or a craft, rather than a science. What the policy
adviser is required to do is reach a professional judgement about both
the underlying situation and the appropriate course for policy. These
judgements must be honest, disciplined and rigorous and be transparent
to those to whom the advice is directed (PAPW 1992, p. 9).
Good advice is what ministers should hear. To avoid giving unpalatable
information because that advice may be unwelcome is an exercise not in
politicization but in servility.
The working party emphasized that policy advising must be confidential, to protect the relationship between ministers and their advisers. Any review must respect this confidentiality, which restricts both the available options for evaluation and the people who might be able to undertake them.
The working party report (but not the version later published) then can-
vassed options. It admitted that informal methods of analysis (ministerial advisers, a better-informed private sector, interest groups and governmental research institutes) provide some analysis through the process of contesting and debating policy alternatives. If the public service once had a
near monopoly on policy advice, this is no longer true. Ministers are, any-
way, always able to make informal assessments of policy advisers. Some
political assessment has existed, if not in a systematic way, through parlia-
mentary committees. On behalf of Parliament the Australian Audit Office
could assess programme efficiency and report to the Public Accounts Com-
mittee. But these methods were informal or tangential to the central issue;
none of them provided any formal consideration of the procedures and
problems of formulating advice.
If a more formal process was required, the report argued that the choice
was either an internal peer assessment, or a Policy Management Review
(PMR) undertaken by an external evaluator with specific terms of reference.
Either process would need to capture the complexity and time pressures
involved in developing policy advice, and to identify suitable areas of pol-
CONTINUING DILEMMA
There is now a familiar series of dilemmas facing those who undertake
PMRs. Some relate to context, to the well-defined professional environment
within which senior officials must work. For instance, Dr Michael Keating,
then the Secretary of the Department of the Prime Minister and Cabinet,
argues that the key ethical obligation of public service advisers is:
to ensure that decisions are fully informed and ministers are not misled.
To do this the Public Service has to be able to give its advice frankly,
without fear and favour. In particular, it has a responsibility to draw
on its professional knowledge and accumulated experience to point out any possibly unpalatable implications of particular problems which might
otherwise have remained unforeseen or been glossed over (Keating
1995).
A further problem is the political context. While the obligation is to pro-
vide advice that is frank and fair, to what extent should, or can, advice
reflect the interests of the government? Dr Keating noted with approval a
colleague's comment that: 'for a public servant to be aware of the government's mandate, philosophic approach, and of the political constraints is not to be political. It is part of being professional to ensure that the advice will be relevant' (Keating 1996, p. 65). There is, of course, a fine balance
between determining what might be the political context and introducing
partisan elements. Nevertheless the task cannot be avoided.
A further professional requirement is the basis on which officials can
make their recommendation. Policy advice 'need not masquerade as the
general or public interest, as though the department is somehow the guard-
ian of that vague doctrine (Keating 1995). Rather it has to be based on a
distinct (even if contested) description of the problem, rigorous analysis
and logical options and recommendations. The last may be particularly sig-
options for the Cabinet. The review was to give specific emphasis to
the role of the Department of Prime Minister and Cabinet in the
activities of the committee.
(2) Review of Treasury policy advice on the government's response to
the report of the parliamentary committee of inquiry into banking.
Prepared by Fred Argy for the Treasury, May 1993. This report has
not been released.
(3) The Evaluation of the Forward Estimate Strategy Papers for the 1993–94 Budget, prepared by Graham Glenn for the Department of Finance
in April 1994. This PMR examined the preparation of the forward
estimate strategy papers and their impact on ministers. The 1993–94
Budget Paper had sought to give greater emphasis to strategic direc-
tion than in earlier years by developing plausible new options for
containing outlay growth. The review examined in detail two strategy
papers, on environmental programmes and child care, from the 48
prepared by the department. The former related to cross-portfolio programmes, the latter to a specific, well-understood growth programme.
(4) Review of Finance's Role in Promoting and Using Evaluation, a paper
prepared by Dr John Uhr, Federalism Research Centre, Australian
National University for the Department of Finance in June 1993. The
review examined the performance of two interdepartmental commit-
tees, in which Finance was a key participant, to test the use of pro-
gramme evaluation. The two cases were aged care services for Aboriginal and Torres Strait Islander people, and cost recovery practices in
the Commonwealth management of Australian Fisheries.
(5) Review of Commonwealth-State Reform Processes, prepared for the
Department of Prime Minister and Cabinet by Professor Patrick
Weller, Centre for Australian Public Sector Management, Griffith Uni-
versity, June 1995. The PMR considered the preparation of agenda
and advice for the Council of Australian Governments (an
intergovernmental body made up of the Prime Minister and state
premiers) in relation to micro-economic reform. The PMR took six
areas of reform and reviewed reasons why some succeeded more
than others; in particular it considered the impact of the Department
of Prime Minister and Cabinet on facilitating the process. It drew
conclusions about the strategies that worked the best and was pub-
lished soon after completion (Weller 1996).
SOME LESSONS
To this point all PMRs have been undertaken by those promoting the idea,
the central agencies (primarily the Departments of Prime Minister and Cabi-
net and Finance). Perhaps they had a greater need to explain their value-
adding participation in the advisory process. The initiation of PMRs by
these central agencies has a necessary consequence: the evaluation is look-
ing for the contribution made by the central agency, which is often as much
concerned with co-ordination and process as with outcome. In some of the
cases the central agency may have acted as the driver and catalyst for policy
development; the Department of Prime Minister and Cabinet, for instance,
was the initiator of competition policy and for many of the changes to
federal-state relations. Yet much of the later detailed work has to be done
in line departments. In other cases, when the policy initiative lies with line
departments, central agencies will assist to shape proposals, set some of
the rules and provide some input, whether based on political realism or
whole-of-government perspectives. For that reason, it is perhaps harder to
identify exactly what central departments add in policy terms. PMRs on
central agencies are therefore starting with cases that will be harder and
more complex than those where policy advice is primarily limited to a line
department. It would be easy to extend them to those more traditional
areas, where causal links may be easier to identify. It might also be useful,
as one observer noted, to examine a disaster in policy advice, to see what
went wrong (if any agency is prepared to participate in such a soul-
baring exercise).
Each PMR evaluator tried to draw lessons from the analysis. The PMR on carers identified areas that should have been done better, and proposed that the department in future ensure certain policy conditions were met.
PMR on forward estimates concluded that, since these papers had little
impact on ministers, a less costly process might be developed if the primary
benefit of the demands for the forward estimates was to act as an internal
catalyst for work on expenditure growth. The report on Commonwealth–State Reform Processes noted conditions that led to successful reforms and the lessons which might be drawn. Two PMRs provided checklists for policy advisers, as possible guides for action or as aides-mémoire when asking what else might need to be done.
The PMRs have therefore tended to report in two parts: to explain what
happened, and to provide lessons that might be adopted for future practice,
and for training. None of the findings have perhaps been startling for our
understanding of the way policy-making works. They reflect and develop
the wisdom, rather than shake it; but they do provide a means of examining processes in a structured way that is rare at the frenetic pace of central government.
The recipients have been satisfied that some benefits have accrued. Dr Keating wrote to department heads in May 1995 arguing that 'I see potential for more widespread application of the options for performance assessment of policy work, including the use of the PMR methodology, particularly as a means of expanding the pool of information available to policy managers' (quoted in Gregory 1996, p. 160). The Secretary of the Department of Finance concluded that, while their early caution was valid, 'the results of PMR undertaken to date show that the technique has promise, including for the policy advising of line agencies' (Sedgwick 1996a, p. 89).
CONCLUSIONS
The Australian experience represents a sustained attempt to come to terms
with the problems of assessing policy advice. The contingent nature of
advice is recognized, the problems of identifying the multiple strands
accepted. The PMRs are designed to analyse the process of developing
policy advice for ministers, while acknowledging a need to protect the
confidentiality of the relationship.
The PMR process does not pretend to provide a complete answer, nor
does it replace the need for more formal evaluations of government pro-
grammes. Rather a PMR can provide insight into the process of providing
advice, and more particularly on the capacities of co-ordinating agencies
in providing inputs.
The existing PMRs were all undertaken when a government had been in
power for some time; this may be a significant factor. Procedures may have
been well established and consequently lessons could be applied to well-
established routines. But the basic questions about quality will not change
with the election of a new government.
The Australian PMR outcomes suggest the number of evaluations that
can usefully be completed may be finite. PMRs provide ideas and give
guides to action. Every line department could undertake analyses and draw
lessons. But, since precisely the same circumstances will not recur, it is
these lessons, more than the analysis of the particular case, that will be of
the greatest benefit. In addition, as the initial report noted, the availability
of suitable evaluators is likely to be limited.
Yet policy evaluations are worth developing. The cost of PMRs is fairly
small compared to the potential benefits in an area where too little is known
and too much is assumed. As long as policy advice is a recognized skill,
its quality should be assessed. PMRs are an initial step worth taking.
REFERENCES
Boston, J., J. Martin, J. Pallot and P. Walsh. 1996. Public management: the New Zealand experience. Auckland:
Oxford University Press.
Colebatch, H. 1996. Testing the policy capacity of budgetary agencies: a comment on Uhr, Australian Journal
of Public Administration 55, 4.
Crawford, Sir John. 1954. The role of the Permanent Head, Public Administration (Sydney) 13, 3.
Dunleavy, P. 1995. Policy disasters: explaining the UK's record, Public Policy and Administration 10, 2.
Gregory, P. 1996. Policy management reviews, in Uhr and Mackay 1996 (see below).
Hunn, D.K. 1994. Measuring performance in policy advice: a New Zealand perspective, in Performance
Measurement in Government. OECD Public Management Occasional Papers, No. 5. Paris: OECD.
Keating, M. 1995. Public service values. 1995 Peter Wilenski Memorial Lectures, National Press Club.
Keating, M. 1996. Defining the policy advising function, in Uhr and Mackay 1996 (see below).
Nicholson, J. 1996. Measures for monitoring policy advice, in Uhr and Mackay 1996 (see below).
PAPW. 1992. Performance assessment of policy work, Report of Working Party, Canberra.
Sedgwick, S. 1996a. Lessons from Finance, in Uhr and Mackay 1996 (see below).
Sedgwick, S. 1996b. Discussion, in Uhr and Mackay 1996 (see below).
Uhr, J. 1996a. Lessons from an external reviewer, in Uhr and Mackay 1996 (see below).
Uhr, J. 1996b. Testing the policy capacities of budgetary agencies: lessons from finance, Australian Journal of Public Administration 55, 4.
Uhr, J. and K. Mackay. 1996. Evaluating policy advice: learning from Commonwealth experiences. Canberra:
Federalism Research Centre and Department of Finance.
Waller, M. 1992. Evaluating policy advice, Australian Journal of Public Administration 51, 4.
Waller, M. 1996. The changing environment of policy making, in Uhr and Mackay 1996 (see above).
Weller, P. 1996. Commonwealth-state reform processes: a policy management review, Australian Journal
of Public Administration 55, 1.
Wildavsky, A. 1973. If planning is everything, maybe it's nothing, Policy Sciences 4.
Wildavsky, A. 1979. Speaking truth to power. Boston: Little, Brown.