
Australian Evaluation Society


Australian Evaluation Society Limited (AES)

ACN 606 044 624

2nd Submission to the

INDEPENDENT REVIEW OF THE AUSTRALIAN PUBLIC SERVICE

May 2019


The Secretariat

Independent Review of the Australian Public Service

Department of the Prime Minister and Cabinet

PO Box 6500

Canberra ACT 2600

Dear Secretariat,

The Australian Evaluation Society (AES) would like to thank the Review Panel for the opportunity to make a second submission to the Independent Review of the Australian Public Service (APS).

The AES is pleased to see that the value of evaluation is being recognised, and that the Review is actively testing its current thinking regarding this (as well as all other aspects of a future APS). Quality evaluation practices and nuanced evaluative thinking can contribute to achieving each and all of the Review’s goals in view of the challenges and opportunities outlined in the Priorities for Change report. We see evaluation as a central strategy to support the ongoing development of the APS.

The AES proposes a “fit-for-purpose” APS would include an institutional infrastructure for evaluation. This infrastructure would consolidate the authorising environment for evaluation and sustain the good governance of evidence in decision-making. It would house strong senior leadership for performance and prioritise investment in performance-related skills (for both generalists and specialists). It would develop capability and capacity around culturally safe and appropriate practices - noting with the development of the AES Reconciliation Action Plan, the AES has demonstrated and will continue to build the evaluation capacity of indigenous evaluators and evaluation in the indigenous context. It would be developed through consultative processes and continuously supported by drawing on internal and external resources and expertise.

The AES offers this submission as a formal response from the Board on behalf of AES members. If the Review panel wishes to discuss or inquire about any aspect of this submission, the AES is available to do so. Please contact Bill Wallace, Chief Executive Officer, at bill.wallace@aes.asn.au.

President

Australian Evaluation Society

May 2019

EXECUTIVE SUMMARY

In its previous submission to the Review, the AES noted that important reforms were underway in the APS context following the introduction of the Public Governance, Performance and Accountability Act 2013 (PGPA Act) and its associated enhanced Commonwealth Performance Framework (CPF), which have the potential to enhance APS capability, culture and operations. However, their successful medium-to-long term implementation and embedding - including the effective conduct and use of evaluation by and within the APS - are facing challenges. As such, the AES proposed:

investment in better systems to support the administration of policy and programs, including the collection of more relevant and reliable data to support APS staff

an increase in the evidentiary and performance literacy amongst APS staff, and having sufficient numbers of staff with specialist technical expertise in data analysis, research, and evaluation

encouraging a culture of performance management, including incentives for managers to engage with risk, to innovate and to accept the potential ‘to fail’

establishing institutional infrastructure such as an evaluator-general or similar, and having a chief evaluator at Senior Executive Service (SES) level appointed in each agency.

The AES is encouraged to note that aspects of these proposals are reflected in the Review’s Priorities for Change paper. In this document, and in response to the Review’s priorities, we refine elements of our first submission and develop new ideas.

The key points in this submission are summarised below.

Priority 1 – Institutionally embed evaluation and strengthen governance systems around how evidence is generated and used in decision-making through:

establishing clear senior leadership to enable an authorising environment for performance (including evaluation)

institutionalising evaluation through the introduction of a ‘networked hub-and-spoke’ model, with a whole-of-Australian-Government centralised function operating collaboratively with centralised evaluation functions in each department.

Such a model should be informed by internal and external consultation, co-designed, and formally evaluated.

Priority 2 – Develop operational systems for this institutional infrastructure which are linked by this governance system for performance (including evaluation), through:

realising the potential of a fuller application of evaluation across the policy and program lifecycle

providing additional detail on the structures that would empower collaboration, more networked systems and common processes—and ensuring these structures are conceptually aligned with a ‘theory of change’ which relates APS cultural transformation to enhanced performance and outcomes

embedding collaborative design and feedback processes in the way the APS operates.

Priority 3 – Invest in workforce capabilities to increase generalist and specialist core competencies for evaluation, through:

pursuing the idea of an APS Academy, ensuring its design is informed by consultations with APS entities to build on strengths and learn from good practices, including in relation to external partnerships

leveraging existing research/work in the evaluation sector (and others) about professional competencies to inform the development of a professions model

developing capability and capacity around culturally safe and appropriate policy, design and performance (including evaluation) practices.

Priority 4 – Continue to engage the expertise of professional sectors and the wider Australian community, particularly Indigenous Australians, to inform the design and delivery of future APS arrangements and their evaluative measurement, through:

emphasising the value of engagement in APS practice, and the role that accessible information, transparent consultation and tangible partnerships can play in building citizen participation and trust, improving service quality and fostering high expectations for APS performance

engaging with professional, other government and non-government sectors to develop public sector expertise and learn from applied responses to ethical issues in evaluation

committing to consultation and reciprocity regarding the implications of the release of performance monitoring data and evaluation findings.

We would highlight that (for brevity) the contents of this second submission reflect at a high level the detailed thinking of the AES and its members on a range of issues, many of which involve technical matters and have policy implications.

The AES is available to directly discuss or facilitate further conversations with the Review panel on these issues in order to support the Review’s efforts to develop effective strategies to implement and sustain change.

INTRODUCTION

Evaluation encompasses the systematic collection and analysis of information to answer questions, usually about the effectiveness, efficiency and/or appropriateness of an ongoing or completed activity, project, program or policy. Evaluation professionals adopt a broad range of formal approaches, social science methods and stakeholder engagement activities to provide fit-for-purpose evidence.

Evaluation is often used at the end of a policy or program cycle (referred to as summative or impact evaluation). However, it can also be used to assess whole-of-government performance, provide information for continuous improvement, and it is a powerful tool in design and implementation (referred to as formative evaluation). Indeed, evaluative inquiry can be undertaken across the policy and program life-cycle to:

enhance public sector planning and operations, and inform budgetary decisions

help identify and measure the need for a policy or program, or understand best practice

clarify and strengthen policy and program conceptualisation and design (including what the expected activities, outputs and outcomes are, when these are expected to occur and in what sequence, and what data is needed to measure these)

support implementation by assessing reach, dose, fidelity, context (process) and identifying opportunities for improvement during roll-out

inform ongoing program management and accountability/measurement by identifying and producing sound data and indicators

identify the outcomes, impacts, effectiveness, efficiency and lessons learned of the policy and program (and in turn, inform budget allocations).

This conceptualisation of evaluation underpins this submission to the Review.

Responding to the Review’s three key questions for each of the four Priorities for Change, this submission comprises the following parts:

Executive Summary

Introduction

Response to Priority 1 – Strengthen the Culture, Governance and Leadership Model

Response to Priority 2 – Build a Flexible APS Operating Model

Response to Priority 3 – Invest in Capability and Talent Development

Response to Priority 4 – Develop Stronger Internal and External Partnerships

Conclusion

Appendices – Examples of good practice relevant to evaluation within the APS

Priority 1 – STRENGTHEN THE CULTURE, GOVERNANCE AND LEADERSHIP MODEL

REVIEW PROPOSALS

Common purpose that unites and inspires the APS

Secretaries Board driving outcomes across government and APS performance

A defined head of service and head of people

Clarity and confidence in the appointment and expectations of secretaries

Genuine transparency and accountability for delivering outcomes for Australians

The AES agrees that culture and leadership practices are a key dynamic for the APS generally, and particularly for evaluation-related activity. A body of literature indicates that evaluation often thrives when there is clear senior support (i.e. an institutional ‘champion’) and diminishes when there is not.

In view of this, the following reflects AES responses most relevant to proposals 2 and 5.

How can we strengthen these proposals?

Clear senior leadership to enable an authorising environment for performance

Articulating how (and why) the Secretaries Board should take a prominent role in realising the intent of the PGPA Act 2013, and in actively supporting the practical implementation of the associated enhanced Commonwealth Performance Framework. Together, the Act and the Framework provide the foundation for an authorising environment for evaluation and other performance activities.

Such an authorising environment would:

support continuous improvements to performance monitoring and reporting, such as information and metrics that meaningfully inform the public about outcomes and drive lasting change

clearly signal performance as a priority and so help embed a serious commitment (including for resourcing) to strengthening monitoring and evaluation within and across the APS

drive a focus on enhancing ‘performance literacy’ at all levels, and in the future across all professions, across the APS.

Establishing central expectations for achieving outcomes would address some of the cultural issues identified in the ANZSOG paper, and lead to stronger performance leadership at all SES levels. This would also benefit from clear messaging from the Secretaries Board down, signalling:

the APS culture should be inclusive of, and reward curiosity and experimentation

the APS culture should be resilient in recognising that performance measurement activities (such as evaluation) will from time to time provide ‘uncomfortable truths’

the APS culture should meet challenges with a transparent, learning and improvement response, rather than a risk-averse, blame or compliance response.

What are we missing?

Institutionalising evaluation with a ‘networked hub-and-spoke’ evaluation functions model

A clearer articulation of the institutional structure for evaluation, both across and within the APS, to enable evaluation to be sufficiently systemised and embedded.

AES submission in-focus: a networked hub-and-spoke model

Feedback from AES members (including current and former APS officers) suggests that a ‘networked hub-and-spoke’ model be adopted, where:

At a whole-of-Australian Government level, evaluation is overseen centrally by a stand-alone entity or one that is located within a central agency. It should:

be independent in nature and function (headed by a statutory officer who reports to the Parliament)

have responsibilities for:

broad evaluation policy and practice direction

ensuring that evaluation policy and practice embraces an equity and cultural lens

provision/acquisition of technical guidance and resources

monitoring, evaluating and reporting on the state of evaluation culture and practice within the APS (and its contribution to broader APS performance)

coordinating evaluations that involve multiple entities to ascertain whole-of-government impacts

providing expertise on evaluation capability and capacity building

building the cultural capability of the APS workforce

fostering cross-Commonwealth evaluation relationships and networking between entities (e.g. forums, leadership groups meetings, workshops).

Entities are responsible for evaluations within their portfolio, supported by a centralised evaluation function within each entity. There are a number of good practice models currently operating within the APS (see Appendices), involving features such as:

a level of independence from policy/program areas (while working collaboratively with them)

multi-disciplinary evaluation team(s)

strong connections to design, implementation and performance functions

strong focus on using evaluation for continuous improvement processes

leadership by a designated Chief Evaluator/Scientist/Economist, often SES level

high-level governance involving senior APS officers and external experts.

Factors to consider in collaboratively designing the proposed model:

In terms of evaluation policy, the central evaluation function should consider adopting an interdisciplinary model of evaluation that does not privilege a particular evaluative or methodological approach, or type of evidence.

This reflects the complex environment in which the APS operates. To enable the APS to engage with and respond to this complexity, evaluations and other evidence need to go beyond ‘did it work’ to answer questions about ‘what works, for whom, under what circumstances and when’, positioning the APS to adapt to different situations and cultural contexts.

Such an approach is consistent with the enhanced Commonwealth Performance Framework, which provides an overarching structure and consistent framework but does not take a prescriptive approach as to how entities actually assemble performance information - entities retain the flexibility to adopt approaches and methods that are fit-for-purpose for their particular context.

The most appropriate location of a whole-of-government centralised function is best determined once key aspects of its mission, scope and design are confirmed, and by taking into account:

legislative mechanisms required to provide for sufficient independence

the resourcing that will be required (human, technical, IT)

the culture of a host organisation.

A whole-of-government centralised function and entity-level units would need to have clearly defined relationships with other institutional actors, both those in place now and those which emerge as part of post-Review reforms such as the APS Academy and the entity with responsibility for overseeing the ‘professions’ model. To this end, the Review could consider the merits of the central evaluation function:

having a key role in informing the design of any evaluation stream under the professions model, including on issues such as competencies (for which the central function could lead policy).

being engaged for advice on the content and delivery of evaluation-related professional development by the APS Academy (including courses offered, their content and frequency).

Overall, to ensure the ‘networked hub-and-spoke’ model is designed effectively, with an equity and cultural lens, and implemented in a practical way with the support of internal stakeholders and external partners who are operationally affected, the next steps to develop a fit-for-purpose approach should involve consultations:

internally within the APS (e.g. SES officers, internal evaluation practitioners)

externally (e.g. providers of evaluation services, grant recipients where evaluation is required, Aboriginal community controlled organisations).

Enhance current processes

There are also ‘quick wins’ that can help to embed evaluation activity in a way that supports APS performance by enhancing current policies and processes, for example:

include in Cabinet Submission templates a requirement for an evaluation strategy to be included with all new policy proposals (NPPs), with a monetary or complexity/risk threshold identified for doing so

amend Budget process operational rules to require evaluation activities to be costed and funded under the NPP process.

How do we ensure lasting change?

Model and utilise best practice to design, monitor and evaluate any new institutional structures

In striving for cultural change in the APS, a theory of change should be collaboratively developed to clarify (and ensure there is a sound rationale for) the leadership and governance mechanisms that are expected to drive the changes being sought in APS operations, workforce and partnerships.

The very process of articulating (and at times, challenging) the links between values, behaviour and incentives would result in an overall stronger approach. It would also support any future efforts to understand and assess the impact of cultural change processes.

Specifically, this networked hub-and-spoke model (or any other model that arises) should be continuously monitored and then formally evaluated (e.g. post-implementation review and impact evaluation) to identify intended and unintended consequences and understand ‘what works, for whom, when and under what circumstances’.

In the first instance, this requires that a program logic is developed along with a theory of change (which links to the wider reform theory of change) to articulate the rationale for the model and a strong case for why it matters, as well as providing the foundation for a monitoring and performance measurement framework.

An impact evaluation could build upon the program logic, theory of change, performance framework and post-implementation review.

A range of methodologies could be considered to understand changes resulting from this reform; examples include (but are not limited to) pre/post staff surveys, interviews and focus groups (including to develop case studies), feedback from external stakeholders, and a meta-evaluation of the quality of evaluation reports produced under the new model.

Embedded monitoring and feedback processes for performance information across the APS

Support sound, fit-for-purpose performance measures generally across the APS through:

an entity (possibly the Auditor-General) being funded to audit all performance measures on an annual basis

the Department of Finance being resourced to regularly use existing provisions to examine the data supporting performance measures.

Obtaining bipartisan agreement on the performance management framework for Secretaries, which would include a mandatory component for non-financial performance reporting, including (where appropriate) commissioning independent evaluations.

Priority 2 – BUILD A FLEXIBLE APS OPERATING MODEL

REVIEW PROPOSALS

Dynamic ways of working and structures to empower collaboration

Strategic allocation of funds and resources to outcomes and essential investment

Networked and enabling systems and common processes

The AES supports a more flexible APS operating model as outlined by the Review. This would contribute to an improved authorising environment for undertaking quality evaluation. This in turn would facilitate lessons learnt from evaluations being translated into APS operational and service delivery improvements, positioning the APS to be more informed, adaptive and agile.

The following comments respond generally to all three proposals under this priority.

How can we strengthen these proposals?

Additional emphasis on evaluation’s contribution to performance and its application across the policy and program lifecycle

More fully describing evaluation activity as part of ongoing performance and improvement across the policy cycle (rather than as a discrete activity) and as an activity that is powerful at an agency/portfolio level (rather than just a program level) would strengthen this proposal by:

better reflecting the utility of evaluation in improving APS internal and external operations, in turn assisting the Review to establish the rationale for why evaluation should be more strongly coordinated and integrated across the APS, with leadership from the most senior levels (as noted in our response to Priority 1).

helping to create ‘buy in’ to the operational systems, processes, resources and timeframes required to effectively engage evaluation expertise at all stages e.g. developing program logics for NPPs, and building monitoring and evaluation plans (including a review of existing and new data requirements) into project plans.

highlighting how evaluation can assist the APS to achieve outcomes for Australians by providing assessments of joint enterprises (e.g. across agencies or with other jurisdictions), and in turn fostering a more ‘demand-driven’ commitment to better and more transparent evaluations.

What are we missing?

Additional detail on the structures that would empower collaboration, more networked systems and common processes to benefit APS operations and outcomes.

Additional detail on a potential model for institutionalising evaluation and how it would be linked to APS leadership and governance, as outlined under Priority 1.

The AES notes that the Review is still exploring the ‘best approach to funding for APS capital investments and sustainable departmental capital allocation models’. One of the missing links to this end is how evaluation does, and has further potential to, provide essential information for budgetary decisions. As noted in the AES’s first submission, the current and longer-term operational capacity of the APS would be stronger with:

additional resourcing in operational systems, including for program administration, monitoring and outcomes reporting, to maximise the effectiveness and impact of evaluation, performance management and reporting.

a resourcing strategy that:

provides the APS and its external partners with greater clarity for forward planning of evaluation

enables multi-year evaluation activity to take place that can achieve effective economies of scale

operates with sufficient agency-wide or cross-portfolio coordination when required, to obtain greatest value.

How do we ensure lasting change?

Embedding collaborative design and feedback processes

Underpinning the design and operationalisation of the proposed reforms with collaborative design and feedback processes, involving a human-centred approach that is also agile to change, would ensure reforms remain ‘fit-for-purpose’.

Collaborative design is particularly important with regards to the proposals for common digital platforms, consolidating and harmonising IT systems.

Specifically, any IT reform process should invest in quality consultation with APS staff, both back-end and front-end users of data systems, to fully understand the range of data needs and requirements (so such systems are built not just for day-to-day administration, but also for monitoring, evaluation, performance reporting and strategic planning purposes).

Linking, in theory and practice, the benefits of the proposed operational changes to a simultaneous process of cultural change would also help sustain efforts. The elements of culture that may help drive the required changes are ones which inspire:

appropriate levels of risk taking

curiosity to embrace new ways of working.

Priority 3 – INVEST IN CAPABILITY AND TALENT DEVELOPMENT

REVIEW PROPOSALS

Professionalised functions to deepen expertise

Empowered managers accountable for developing people and teams

Strategic recruitment, development and mobility

C21st delivery, regulation and policy capabilities

Policy advice that integrates social, economic, security and international perspectives

Overall, the AES sees real potential in the professions model to address a number of the challenges facing the APS, including but not limited to evaluation capacity and its capability to contribute to 21st century program and policy responses. Indeed, there is already a live conversation within the Australian and international evaluation communities about whether, and if so how, to professionalise evaluation, with a range of local responses to the matter.

Informed by this context, as well as an emphasis on operational and resourcing issues affecting the development of these capabilities (somewhat different in focus to the ANZSOG paper’s emphasis on culture), the following comments reflect the AES responses that are most directly relevant to proposals 1 and 4. These ‘bottom up’ approaches are an essential complement to the ‘top down’ leadership strategies outlined in Priority 1.

How can we strengthen these proposals?

Further develop the concept of an APS Academy

Inform the development and design of an APS Academy through:

consulting with entities and examining successful models of internally delivered professional development courses

examining where entities have successfully partnered with external organisations – professional, tertiary and consultancy – to increase specialist capacity and capability (see also comments under Priority 4)

identifying and leveraging existing external resources e.g. websites (such as BetterEvaluation), training modules, courses and packages

Adopt a broad, fit-for-purpose perspective of evaluation

Rather than refer to ‘consistent methodologies’ (pg. 41), instead refer to ‘situational-appropriate methodologies’. This change would:

meaningfully reflect that evaluation uses a broad range of approaches and methods, developed appropriately in response to the varied contexts in which it is practised.

better align with how the enhanced Commonwealth Performance Framework (CPF) recognises a range of approaches and methods may need to be considered.

Stronger articulation of the links between evaluation practice and broader aspects of APS work e.g.:

New Policy Proposals

Policy and program design

Implementation

Monitoring and performance

Continuous improvement processes

Recognition that cross-disciplinary and multi-functional evaluation teams can enhance learning, and build external relationships, as part of team- and organisational-level (not just personnel-level) evaluation capabilities (see comments under Priority 2 and 4)

What are we missing?

An approach to professionalisation that incorporates generalist and specialist competencies—with recognition of resources needed to lift capabilities to meet the 21st century challenges

A layered conceptualisation of evaluation capability is missing: one that encourages an understanding that evaluation is an essential activity through which APS officers can make a positive contribution to APS performance and outcomes. Specifically, one that incorporates:

Generalist core competencies for all APS officers: the ability to manage and use evaluation and other evidence to inform decision-making, to manage under uncertainty, and to understand and use performance information should be named as 21st century core skills for all professions. This could be enabled by:

integrating evaluative thinking into policy and leadership training, rather than only providing evaluation training for evaluators as a ‘specialist’ field

including demonstrated performance capabilities (data analysis, research, evaluation) or ‘performance literacy’ and ‘evaluative thinking’ more broadly, in the selection criteria for SES positions.

Specialist core competencies for identified ‘professions’ (e.g. legal, audit, procurement, corporate, communications/public affairs, data, research, evaluation)

In particular, this would involve an investment in skills to address what the AES perceives (based on member feedback and stakeholder consultations) as a shortage of specialist capability and capacity in discrete evidentiary (data analysis, synthesis, evaluation) skill sets.

AES submission in-focus: Leverage existing work to inform the ‘professions’ model – linking Priority 3 to Priority 4

A number of professional bodies have competencies for their members and/or their profession as a whole: these could offer a useful foundation for developing each specialist stream within the APS – as well as opportunities for partnerships.

In terms of core competencies for evaluators (or evaluation teams), a number of evaluation professional bodies have sought to identify and codify these.

In Australia, the AES has its Evaluators Professional Learning Competency Framework (see www.aes.asn.au/evaluator-competencies.html).

The AES is currently undertaking significant work on evaluator professionalisation through its ‘Pathways’ project (www.aes.asn.au/resources/pathways-to-professionalisation.html) and would welcome the opportunity to discuss this work and the most current thinking among its members.

More broadly, the APS could work with various professional and member organisations to co-design and co-deliver relevant aspects of the professions model (and any associated training via the APS Academy), in areas such as finance, accounting, human resources, communications/public affairs, public administration, statistical analysis, research and evaluation.

Developing capability and capacity around culturally safe and appropriate policy, design and performance practices

The APS needs to develop the capability and capacity to undertake collaborative, culturally safe and credible monitoring and appropriate evaluation processes. The Review’s proposals are currently silent on this.

This need reflects (in part) that there are relatively few Indigenous Australians who are evaluation practitioners, policy makers or commissioners of evaluation, both inside and outside the APS.

It also reflects a need for evaluation practitioners, policy makers and commissioners of evaluation to have the opportunity and a commitment to access cultural ethics and awareness and safety training.

There also needs to be acknowledgment that there are resource implications associated with culturally safe evaluations.

Efforts need to be made both internally and externally to address this – in collaboration and partnership with First Nations communities (see Priority 4).

How do we ensure lasting change?

Seek to generate and capitalise on positive experiences of evaluation for generalist APS staff

Positive experiences of evaluation can encourage further engagement in evaluation and skills acquisition by generalist staff.

Internal evaluation practitioners have observed instances such as internal post-implementation reviews being undertaken by program staff, resulting in those staff valuing the experience, gaining performance-related skills and knowledge, and seeing benefits in evaluation activities.

Effective implementation of proposed key institutional infrastructure

Ensuring clear linkages and coordination through co-designing and implementing the professions model, the APS Academy and any other institutional (e.g. networked hub-and-spoke) structures. For evaluation in particular, change would be sustained by:

providing core training for generalists and specialists (mechanisms can include partnering with organisations already providing workshops/training, staff attending intensive courses, or staff being supported to obtain relevant tertiary qualifications)

commissioning ongoing research about APS capacity and evidence of evaluation culture (mechanisms could include surveys and qualitative studies); such work could be undertaken within the context of an overarching review of evaluation-related reforms, as proposed under Priority 1.

Priority 4 – DEVELOP STRONGER INTERNAL AND EXTERNAL PARTNERSHIPS

REVIEW PROPOSALS

Seamless services and local solutions designed and delivered with other jurisdictions and partners

An open APS accountable to sharing information and engaging widely

Strategic, service-wide approaches to procurement to deliver better value and outcomes

Ministers supported through easier access to APS expertise, and formal recognition of the distinct role of ministerial advisors

Evaluation bears the hallmarks of an activity that is best done in partnership with evaluation practitioners, front-line workers, content-matter experts, service users and communities. When there is an environment of trust and collaboration between these internal and external partners, evaluation procurement is more informed, data collection and stakeholder engagement are more effective, and recommendations are more actionable. As referenced in Priority 1, a ‘culture of openness’ that can facilitate this trust and collaboration needs to be committed to and driven through APS-wide governance structures.

Accordingly, the AES welcomes the Review’s focus on the APS developing stronger partnerships here, and provides the following suggestions to enrich the proposals, with a focus on ideas set out in proposal 2.

How can we strengthen these proposals?

Highlight the value of engagement in evaluation practice, and the democratic function that accessible information and transparent consultation can play in enhancing APS performance

Setting out a stronger ambition to partner with communities to foster a ‘bottom-up’ demand for evidence-based performance information from government would strengthen these proposals:

This includes, vitally, partnerships with First Nations communities, and evaluators from or recognised by these communities.

It also includes generating a whole-of-Australian-Government commitment to cultural ethics approval for all evaluations in Indigenous contexts, and refining these ethics processes to enable more timely reviews.

As an example, the AES has for several years provided an opportunity for Indigenous evaluation practitioners to attend the AES Conference through its Indigenous Conference Support Grants program. This has been led by its Cultural Capacity and Diversity Committee and supported by financial partnerships between the AES, its members and PM&C.

The Strengthening Evaluation Practices and Strategies (STEPS) in Indigenous settings Project identified funding practices that are responsive to First Nations needs and priorities as pivotal to community-engaged program planning, and to evaluation findings that honour and benefit First Nations communities. Building and maintaining relationships with community was central to the cultural integrity of the overall evaluation.

Positioning the use of citizen surveys related to experiences/satisfaction with the APS (Proposal 2) as part of a suite of wider engagement mechanisms. Strategies to boost the accessibility of survey findings to all Australians would also be warranted.

Engaging strategically with the evaluation sector around the conduct of procurement to better understand ‘what it takes’ to commission and conduct high quality evaluations i.e.

how to make informed procurement decisions (including timeframes and budgets)

how to collaboratively manage external evaluation consultancy projects in a way that helps the APS (and, indeed, the public) to get better value for money

how to design and implement culturally sensitive commissioning processes that engage First Nations’ leaders.

Engaging with professional sectors to learn from applied responses to ethical issues

In ensuring the APS standards of ethics and integrity are reflected in arrangements with external providers, the proposals would be strengthened by the Review:

acknowledging that many professions also already have particular codes or values that their members are required to abide by

considering how these existing codes and the APS Values interact (noting that in many instances these are likely to align).

The AES Code of Conduct and Guidelines for the Ethical Conduct of Evaluations may provide the Reviewers with a useful example (see www.aes.asn.au/resources.html).

The Australian Institute of Aboriginal and Torres Strait Islander Studies (AIATSIS) Guidelines for Ethical Research in Australian Indigenous Studies may also be of assistance (https://aiatsis.gov.au/sites/default/files/docs/research-and-guides/ethics/gerais.pdf).

What are we missing?

Identification of possible partnership models and pathways

See the box ‘AES submission in-focus: Leverage existing work to inform the professions model’ under Priority 3.

What is missing, if the APS is to achieve policy capabilities suitable for the 21st century, is a reflection of the advanced evaluation approaches that the APS will need to be able to conduct and/or commission. To this end, the APS may be well served by:

establishing temporary research partnerships between policy/program areas, scientific institutions and university-based researchers.

consolidating relationships with non-government organisations that deliver and evaluate APS-funded/supported services, as these relationships may bear fruit in supporting the APS to develop more citizen-centric approaches to evaluation.

Commitment to consultation regarding the implications of the release of data, research and evaluation

The AES has engaged with its membership (including public service practitioners and external consultants) at a number of conferences and events on this matter.

There has been support for wider publication of, or greater accessibility to, evaluation reports, as this can promote efficient knowledge sharing about ‘what works’, in addition to public accountability. Members also note that in some APS departments this is already occurring routinely (especially for impact evaluations). With this in mind, there is scope for further consideration in the APS context about:

what mechanisms and resources may be required to support wider access and more regular publication—all of which will have implications for how activities may be procured, conducted, reported and used (by primary audience and other parties)

what financial and reputational implications there could be for APS partners (e.g. external evaluators) given that projects occur in time and resource-limited environments, sometimes with incomplete information or limited access to all relevant stakeholders.

How do we ensure lasting change?

Effect changes in leadership, operating models, skills, culture and practice

Developing stronger internal and external relationships is both contingent on, as well as an enabler of, many of the proposals being put forward by the Review, as well as those in this response.

Looking to increase opportunities to partner internally and externally requires a combination of policy, leadership, structures, knowledge and skills. Over time this should lead to an APS culture and practice of internal and external partnerships and collaboration.

Some AES members have offered the broader reflection that, if the APS is successful in strengthening public trust in the institution (and achieving this is likely to require engaging citizens in transparent feedback and reflection processes), then the public may over time become more engaged in evidence-based policy, design, implementation and performance.

CONCLUSION

The AES thanks the Review panel for the opportunity to make a second submission. Core elements of our submission are built on the following AES positions, informed by consultation with our members.

Evaluation can contribute across the policy and program lifecycle – determining need and/or best practice, informing design, supporting implementation, delivery and monitoring, and determining impact.

It draws upon a broad range of approaches and methodologies and should be seen as part of, and contributing to, the broader performance arrangements within the APS.

There is a need for senior leadership and support of evaluation (and performance generally) to create an enabling culture.

Embedding evaluation through institutional infrastructure arrangements, such as a ‘networked hub-and-spoke’ model with a whole-of-APS central evaluation function working in collaboration with centralised functions embedded within each Australian Government entity.

Investing in institutional arrangements and operational systems that support effective performance.

Investing in capability and capacity, both for generalist staff in terms of performance literacy and enabling culturally safe evaluation practices, as well as specialist evidentiary and performance staff, using strategies such as an APS Academy and ‘professions’ model.

Seeking to partner and draw upon the expertise that exists within and outside the APS both in the short term to help inform the design of future arrangements, as well as the medium to longer term measurement of these to ensure the APS meets its objectives and goals.

Many of the points raised briefly in this submission have deeper technical and policy detail underlying them, and around which the AES and its members have advanced thinking. The AES is happy to discuss these directly or facilitate further consultations to support the Review process.

Evaluation and review processes can find it challenging to identify practical recommendations. Change and its effects in complex systems can also be difficult to measure, and change is sometimes incremental in nature.

With this in mind, we would encourage those charged with taking the Review forward to use ‘tools of the trade’ such as program logics and theory of change to plausibly articulate why certain activities are anticipated to contribute to various outcomes. Further, thinking evaluatively about the ‘early warning signs’ and ‘signs of success’ for the Review could help refine design and implementation, and establish a framework for assessing reform outcomes.

APPENDICES – EXAMPLES OF BEST PRACTICE WITHIN THE APS

Office of the Chief Economist – Department of Industry, Innovation and Science (DIIS)

To ‘close the loop’ between policy development, implementation and evaluation, evaluative activities have been embedded across internal and Business Grants Hub programs. A central Evaluation Unit is located within the Office of the Chief Economist (OCE), providing a level of independence from policy and program areas, and is responsible for delivering and supporting a range of evaluative activities.

A suite of evaluation tools and processes has been developed to build an evaluation culture within the department, all of which could be replicated and adopted across the APS. They include:

a published departmental Evaluation Strategy

a departmental-wide Evaluation Plan with scheduled evaluation activities

an SES level ‘Evaluation Champion’

the ‘Evaluation Ready’ process to establish an evaluation strategy for programs early in their lifecycle

providing data collection advice

Centralised Evaluation Unit responsible for conducting and managing evaluations

Evaluation Fair.

Further information can be found in the DIIS submission and the DIIS Evaluation Strategy.

Office of Development Effectiveness - Department of Foreign Affairs and Trade

The Office of Development Effectiveness (ODE) is a unit within the Department of Foreign Affairs and Trade that monitors the quality and assesses the impact of the Australian aid program.

ODE’s work spans three main areas:

undertaking performance and quality analysis to test and quality assure the department’s internal aid performance assessment systems.

supporting, conducting and reviewing program evaluations of Australian aid investments.

conducting ODE evaluations with a policy, program, sectoral or thematic focus.

ODE’s work is subject to the external oversight of the Independent Evaluation Committee (IEC), an external, advisory body. Its objective is to strengthen the quality, credibility and independence of ODE’s work program, which includes:

program and strategic evaluations

performance and quality analysis

independent analysis of the department’s assessments contained in the annual Performance of Australian Aid report.

The IEC comprises three independent members, including the Chair, along with a DFAT Deputy Secretary. Together they contribute extensive development expertise, evaluation knowledge and high-level public and private sector experience.

Maturity model: developing the maturity of evaluation in Indigenous Affairs, Prime Minister and Cabinet

The Department of the Prime Minister and Cabinet utilised a maturity development model, which informed its thinking in developing the Evaluation Framework for the Indigenous Advancement Strategy.

The scope of the Evaluation Framework includes a role in guiding “the conduct and development of a stronger approach to evaluation.”

One of the goals of the Framework is to “promote dialogue and deliberation to further develop the maturity of evaluation over time”.

The concept of gradual development of the maturity of the evaluation system (i.e., the organisational practices such as governance and the attitudes of staff to transparency, for example), informs the approach to building a culture of evaluative thinking.

The Framework states:

“To move towards best practices in evaluation, the Framework will implement three concurrent streams of complementary activities to support continual learning and development covering: collaboration, capability and knowledge.”

“A key part of building a culture of evaluative thinking through these activities will be dialogue and deliberation about best practice in evaluation to support development of the maturity of evaluation over time.”

The Evaluation Framework for the Indigenous Advancement Strategy is available at:

https://pmc.gov.au/resource-centre/indigenous-affairs/indigenous-advancement-strategy-evaluation-framework

The Indigenous Evaluation Committee (the Committee) is a key part of the Evaluation Framework. Through the provision of independent strategic and technical advice, the Committee supports the improvement of evaluation practices of the Indigenous Affairs Group in line with the Framework’s principles of relevance, credibility, robustness and appropriateness. One of the responsibilities of the Committee is to endorse the Annual Evaluation Work Plan (see https://www.pmc.gov.au/indigenous-affairs/evaluations/indigenous-evaluation-committee).

Centralised Evaluation model, Department of Social Services

The Department of Social Services has centralised its impact evaluation functions within its Policy Office in order to drive, support and coordinate evaluation for DSS, and to provide a consistent source of evaluation advice. The DSS Evaluation Unit is independent of but works collaboratively with policy and program areas. The Evaluation Unit:

provides costings for the evaluation component of New Policy Proposals (NPPs) to include funds to monitor and evaluate outcomes of the policy or program initiative

delivers an ‘Evaluation Readiness Service’ to support policy and program areas to develop theories of change, program logics and performance measurement frameworks

procures and manages all DSS trials and impact evaluations

advises on all aspects of evaluation practice

improves evaluation capacity and capability through both informal and formal channels.