Submission to the APS review
Thank you for the opportunity to make this submission. It is based on a long career in the APS and the public services of other jurisdictions, and now on experience as a private sector provider of services.
Outsourcing, contestability and internal staffing
As a person who lost an APS job because it was outsourced, and as someone who now provides outsourced services, I feel I have a balanced view on this. There are subject areas, from anthropology to zoology, where expertise is needed only occasionally, and it is usually better to source advice externally than to maintain permanent staff. There are also cases where unexpected short-term peaks in demand need to be met. Nevertheless, there is probably excessive and wasteful use of external consultants in situations where permanent or temporary APS staff would be more cost-effective.
Two main drivers of this overuse of consultants are apparent. One is that advice is somehow more valued when it comes from outside. This has been put to me as a reason for being hired (‘we need an outside view’), yet I fail to understand how my work suddenly became more valuable on my leaving the public service. This reliance on the consultant crutch represents a failure of leadership and a lack of respect for internal sources of advice. The other main driver is the staffing limit. This is a perverse constraint: not only does it waste money by driving resources to expensive consultants rather than more economical internal staff, it also creates an incentive to hire staff at higher APS levels, since a capped headcount encourages agencies to fill each of their limited positions at a more senior grade.
On occasions where it is appropriate to hire external consultants, a lack of experience in the tendering process sometimes leads to impossible timescales for preparing complex tenders, and sometimes to difficult-to-meet conditions that are not necessary for achieving the task. There sometimes seems to be a naïve faith in the market: that if we put up an RFT, then no matter how impractical it is, it can be met. True, there is almost always some provider willing to accept the brief, but when the task definition is poor the results are often disappointing.
Seniority levels in Canberra and the regions
Over the course of my career there has been continual grade inflation, so that, for example, the responsibilities of an APS 6 are now much lower than those of the equivalent position 30 years ago. There also seems to be a disconnect between seniority levels in the regions and in Canberra. I have seen cases where a team leader at the APS 5 level in a region, with significant responsibilities, was promoted to an APS 6 in Canberra, where their main public service contribution was photocopying, stapling and collating, leading to demotivation. Again, reliance on overall running costs as the constraint, rather than staffing levels, together with serious consideration of the PGPA requirement for cost-effective expenditure, should lead to more recruitment at lower levels. Even in Canberra, there are high-quality young people, some without university degrees (perhaps studying part time), who would be willing to start their careers at the now almost defunct APS 1 or APS 2 levels.
Taking performance measurement seriously
The recommendations of the Independent Review of the PGPA Act and Rule with respect to improvements to performance reporting are welcome. However, they do not go far enough to give performance measurement a strong role in improving performance. At present, there appear to be no consequences for an organisation that fails to achieve its targets. In the private sector, failure to achieve guideline results will lead to a significant decline in share price and to questions about the careers of the senior executives responsible. In the Commonwealth, there is not even much visibility of the extent to which an organisation meets, or fails to meet, its targets.

This is partly due to the large number of measures produced, many of which are of questionable value. Resource Management Guide No. 134, Annual performance statements for Commonwealth entities, is not explicit in this area, using terms such as ‘entities can report the actual performance results achieved in the period’. The guide also states that the requirement to analyse performance ‘may be addressed through an entity-wide overview of performance or, where relevant, an entity may discuss specific issues on a case-by-case basis.’ In the succeeding paragraphs, entities are invited to provide context (a cynic might say excuses) through an analysis of factors that have restricted the delivery of their purposes. A requirement for more explicit reporting of failures to achieve specified targets, for example in the early pages of the annual report, together with descriptions of planned action to address the gaps, would be helpful.
Taking evaluation seriously
Evaluation as a discipline has waxed and waned since the RCAGA. At present, APS capability is patchy: some Departments do it well, others are weaker. Even where Departments are good performers overall, the practice of commissioning and using quality evaluations is not strongly embedded and can vary substantially between Divisions. For example, the wider lessons from evaluations commissioned by specific Branches or Divisions can be lost to the Department as a whole, because the results are inconvenient or because of poor knowledge management. Sometimes evaluations are commissioned without a clear view of their purpose, that is, of what future decisions could be affected by their results.
Except in the larger Departments that use evaluations frequently, there is little expertise even to commission evaluations well. This can lead to requests for tender that unrealistically demand complex evaluations within a very short timescale.
The need for performance reporting has been a central tenet of the PGPA Act. However, the link between annual performance reporting and more comprehensive evaluations is not well established. The two disciplines can contribute to each other: performance reporting can help define problems that an evaluation can address more comprehensively, and evaluations can establish sound principles for what should be monitored in the future.
Some strategies to address this patchiness, and to learn from the better performers, include:
Returning to the former requirement that Cabinet Submissions should include or reference an evaluation plan, and that such submissions should include information from relevant prior evaluations.
Emphasising in selection criteria for staff, especially SES staff, that learning from experience, one's own and others', is an essential attribute, reflected in the proper commissioning and use of evaluations.
Instituting in each agency a governance mechanism for evaluations. (This is currently done well in some agencies.) This might include having a high-level evaluation committee, or, depending on the nature of the agency, including monitoring of the commissioning, execution and use of evaluations as one of the responsibilities of the audit committee.
Strengthening arrangements for cooperation between agencies, such as sharing evaluation units so that those who do evaluations only occasionally can benefit from the experience of others.
Recognising innovations in the States
One theme of the review is innovation. In many cases, innovation is built on knowledge of what is done elsewhere. My observation is that APS officers are often rather dismissive of practices adopted in the States and Territories, taking the view that of course the APS knows best and there is nothing to learn from State practices. However, there are instances, especially in service delivery, where the States have rather more experience than the Commonwealth. In the areas discussed in the previous paragraphs (performance measurement and evaluation), several States have developed good policies and practices that the APS should at least review.
Thank you for considering this submission, and please get in touch if you wish to discuss further.