
Dr Martin Dunn


Submission on Policy Development in the Australian Public Service attached.

Policy Development in the Australian Public Service

This submission relates to policy development within the public sector. Policy development is one of the major functions of the public service, but in my observation performance in this area often leaves much to be desired.

The Westminster conventions cloud much of the understanding of this process. In principle, Ministers make policy decisions and the advice they receive from public servants is not made public. Ministers thus take both the credit and the opprobrium for the performance of public service advice. My arguments are based mostly on my experience writing and reviewing policy within the public sector over the last 30 years, for several governments. As part of my doctoral thesis, I interviewed senior executives in the Treasury and the Department of Finance on their roles in the policy process. While I have specific examples in mind for the issues I identify, as a current public servant I cannot list them in a public submission.

While some good policy work is done, too often it suffers from a range of failures:

Unclear policy objectives. Often the focus is on achieving some specified deliverable, rather than understanding the policy problem and the potential role of government.

Failure to examine a suitable range of alternative solutions – often the arguments focus on a preconceived solution.

Failure to assess the solution in terms of value for money. While the Public Governance, Performance and Accountability Act 2013 requires decisions to be made in terms of “efficient, effective, economical and ethical use or management of public resources”, there is often no attempt to directly assess which options are the most efficient (deliver the best outcome for a given level of funding).

Lack of rigorous use of data in assessing the options. While there has been much discussion of “evidence-based policy” in recent years, it has not translated into practice.

Lack of a systematic review of policy. Some agencies have contestability units intended to challenge internal policies, but these address only a few policy areas. The central agencies (the Department of the Prime Minister and Cabinet, the Treasury and the Department of Finance) act as gatekeepers: they can prevent poorly developed Cabinet submissions from coming forward and can provide adverse coordination comments on final Cabinet submissions – but they are not formally charged with providing contestability. Often the issues on which they will block a Cabinet submission are narrowly technical.

Broadly, this problem has two underlying causes, one cultural and one based in systems and skills. The two are not completely independent. A culture that accepts weak policy as normal will not develop the necessary skills. In the same way, an organisation with weak skills that still has its policies accepted will develop a culture that regards this as normal.

Part of the cause of poor policy relates to the relationship between the elected government and the public service. Too often ministers do not clearly signal to the public service that they want rigorous policy advice, even if that means challenging some of their pet projects. Nor is it clear that Ministers themselves understand the characteristics of good policy, or realise that they are not getting the level of advice that they should.

On the other hand, public servants appear happy to focus on delivering the ministers’ policies and not rocking the boat. The result is that public servants end up delivering advice that reflects their judgement of what the minister wants to hear. This has the obvious problem of having public servants offer advice that does not reflect their judgement as to the best policy solution. It also has two other problems. First, even senior public servants are often not good at understanding what ministers really want – too often their model of politics is a caricature. Second, it unconsciously signals to ministers that public service policy skills are weak and hence that not much should be expected of them.

Politicians and public servants need to set high standards for themselves and each other to avoid lowest common denominator policy. As with any cultural facet, consolidating a new culture needs leadership example, constant communications, rewarding desired behaviours and rejecting substandard behaviours.

As I noted above, my observation is that the quality of policy work is uneven across the public service. I attribute this to:

the lack of a clear model of better policy development;

devaluing policy skills, particularly domain knowledge and quantitative skills, in favour of “generalist” policy officers;

a rather ad hoc approach taken to seeking to benefit from external expertise and experience;

uneven training and staff development in policy skills; and

a prevailing model for approving policy based on organisational hierarchies (occasionally supplemented by committees).

As a model of policy work, I suggest the New Zealand Policy Improvement Frameworks (https://www.dpmc.govt.nz/our-programmes/policy-project/policy-improvement-frameworks). There is no Australian equivalent. The closest the Australian Government has come is found in parts of the Australian National Audit Office Better Practice Guides, which have now mostly been withdrawn without replacement.

On policy skills, there is a general lack of emphasis on deep domain knowledge and quantitative skills (with some exceptions). Yet both are needed for good policy. Any choice between options is greatly aided by quantification. Often where agencies recognise the need for these skills, the solution is seeking the advice of a consultancy. However, without knowledge in this field it is difficult to assess the quality of the consultant’s report and this approach leads to further diminution of public service skills.

As a case study to illustrate the shortfalls in quantitative analysis:

Probably the most critical quantitative skill is cost-benefit analysis. And the gold standard in cost-benefit analysis is the calculation of a net present value (NPV). Information on how to conduct this analysis should be readily available and applied as a matter of course.

Despite this, guidance is patchy. Some dated material exists on the Department of Finance website, watermarked “archived”, while the Department of the Prime Minister and Cabinet’s Office of Best Practice Regulation (OBPR) offers some more current guidance. Notably, the focus of OBPR’s work is the cost impact of government policy on industry.

The critical issue in calculating NPV is the choice of discount rate. Too high, and the analysis is prejudiced against investments that can generate longer-term savings; too low, and it encourages too much investment. The choice should be informed by market conditions. OBPR suggests a rate of 7 per cent, with sensitivity testing at 3 and 10 per cent. OBPR’s primary reference to justify this choice dates from 2010. Given persistent low interest rates since then, the rate should have been revisited.
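To make the sensitivity concrete, the following is a minimal sketch of an NPV calculation at the three OBPR rates. The project figures are entirely hypothetical (an upfront outlay that generates a long stream of savings) and are chosen only to show how the sign of the NPV can flip with the discount rate:

```python
# Hypothetical illustration (figures not drawn from any agency guidance):
# a project costing $100m upfront that saves $8m per year for 30 years,
# evaluated at the OBPR central rate (7%) and sensitivity bounds (3%, 10%).

def npv(rate, cashflows):
    """Net present value; cashflows[t] is the cash flow occurring at year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

cashflows = [-100.0] + [8.0] * 30  # $m: outlay at year 0, savings in years 1-30

for rate in (0.03, 0.07, 0.10):
    print(f"discount rate {rate:.0%}: NPV = {npv(rate, cashflows):+.1f} $m")
```

At 3 per cent this hypothetical project is clearly worthwhile; at 10 per cent it is clearly not. A long-lived savings stream is precisely the kind of investment that an unrevised, too-high discount rate prejudices against.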

The National Public Private Partnership Guidelines argue that the Capital Asset Pricing Model (CAPM) should be used to determine the appropriate discount rate for public private partnerships (PPP). This generates higher discount rates for riskier projects. There is a case for using the CAPM, but in the Commonwealth most PPPs do not generate net revenue (unlike what might be the case for a toll road). Hence, using CAPM rates distorts the analysis by reducing the net cost of riskier investments.
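The distortion described above can be sketched with hypothetical numbers. Where a PPP generates no net revenue, the cash flows to government are all costs, so discounting a riskier project at a higher CAPM rate makes it look cheaper in present-value terms, not more expensive:

```python
# Hypothetical illustration: two PPP options with identical nominal costs to
# government ($10m/year in availability payments for 25 years). Under CAPM
# logic the riskier option attracts a higher discount rate, which *lowers*
# its present cost -- the distortion described for no-net-revenue projects.

def present_cost(rate, annual_cost, years):
    """Present value of a constant annual cost paid at the end of each year."""
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

safe  = present_cost(0.05, 10.0, 25)  # lower-risk project, lower CAPM rate
risky = present_cost(0.09, 10.0, 25)  # higher-risk project, higher CAPM rate

print(f"present cost, safe option:  {safe:.1f} $m")
print(f"present cost, risky option: {risky:.1f} $m")  # appears cheaper despite identical payments
```

The rates here are illustrative only; the point is the direction of the effect, not its magnitude.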

While new policy proposals going to Cabinet need to be costed by the Department of Finance or the Treasury, this is generally an assessment of budget impact. It does not look at proposals from a management accounting perspective.

Recruitment and promotion practices in the APS do not place value on quantitative skills. Selection for the Senior Executive Service (SES) is based on the skills and behaviours described in the Senior Executive Leadership Capability Framework (introduced 1999), while for more junior grades the similar Integrated Leadership System (introduced 2004) is used. Both (correctly) include communication skills, but neither includes quantitative skills as a basic requirement. In contrast, the OECD Programme for the International Assessment of Adult Competencies assesses both literacy and numeracy.

Use of external expertise and experience, such as might be found in academia, business, state or overseas governments, is rather ad hoc. Generally, these contributions are considered optional – or something that the consultant can research.

While there are a few policy courses around, few emphasise quantitative skills. The APSC’s recent work on data literacy is a commendable initiative, but awareness of this work does not seem to have penetrated far into the public service.

Finally, the dominant mode for policy approval is the hierarchy. The problem with this approach is that senior people usually have a better understanding of the organisational context, may or may not have relevant technical expertise, but are often short on time. Committees can bring in other interested parties who may be able to allocate staff time to analyse the proposals, but often they will be focussed on the parts of the policy that affect their roles – rather than its overall suitability. Some parts of the public service have contestability units, but generally most policy is not formally reviewed. Contestability can be achieved at various levels, ranging from reviewing the assumptions with a sceptical eye through to formal Team A / Team B processes where a separate team is established to address the same problem.

I would recommend:

adopting the New Zealand Policy Improvement Frameworks and equivalent accountabilities and networks;

making cost-benefit analysis an obligatory part of policy proposals going to government, and providing updated guidance;

developing a set of courses designed to promote quantitative policy skills in the public sector;

establishing an expectation that academic research and other governments’ experience are routinely considered in policy work – and prioritising the development of networks;

revising the Senior Executive Leadership Capability Framework and Integrated Leadership System to include mandatory numeracy and data analysis skills; and

formally establishing mechanisms to provide contestability for major policy initiatives, including all Cabinet submissions.

Dr Martin Dunn

