
Australasian Evaluation Society




Australasian Evaluation Society Ltd (AES)

ACN 606 044 624

Submission to the Independent Review of the Australian Public Service


July 2018


The Australasian Evaluation Society (AES) would like to thank the Independent Reviewers for the opportunity to make a submission to the Independent Review of the Australian Public Service (APS).

With around 950 members, the AES is an Australasian organisation for people involved in evaluation, including practitioners, managers, commissioners, teachers and students of evaluation, and other interested individuals both within and external to the APS.

It aims to improve the theory, practice and use of evaluation through the provision of conferences, professional development workshops and seminars, communities of practice, a Code of Ethics, the Guidelines for the Ethical Conduct of Evaluations, the AES Evaluators’ Professional Learning Competency Framework, and the peer-reviewed Evaluation Journal of Australasia.

The AES submission is offered as a formal response from the Board on behalf of AES members. If the Independent Reviewers wish to discuss or inquire about any aspect of this submission, the AES is available to do so. Please contact Bill Wallace, Chief Executive Officer, at bill.wallace@aes.asn.au.

Liz Smith

Vice President

Australasian Evaluation Society

July 2018


The AES recognises that the current context for the Australian Public Service (APS) is a complex one: bounded by legislation and statutory authorities established by the Parliament, operating in fiscally constrained times, and tasked with responding to numerous and invariably challenging global, technological and public policy developments.

A key requirement of the APS is to be “…efficient and effective in serving the Government, the Parliament and the Australian public.” Important reforms are already underway with the introduction of the Public Governance, Performance and Accountability Act 2013 (PGPA) and its associated Enhanced Commonwealth Performance Framework (ECPF).

The PGPA and ECPF, together with a number of recent and historic inquiries and reviews, have identified evaluation as a key activity that can enable the APS to manage, or even capitalise on, these developments, improve citizens’ experience of government, and deliver better services.

The AES is of the view that:

these reforms have the potential for enhancing the capability, culture and operating model of the APS in order for it to meet its mission now and in the future

their successful medium-to-long term implementation and embedding—including the effective conduct and use of evaluation by and within the APS—face challenges due to capability and capacity issues

these challenges risk the Commonwealth not realising value for money from its current investment in evaluation, now or going forward

recent findings from Commissions and Inquiries that more evaluations should be undertaken by the Australian Government have a basis in evidence and should be seriously considered.

The APS will face different challenges and national and international priorities in the future. This Review should consider options for developing appropriate organisational infrastructure and support systems for evaluation and policy evidence that are capable of informing policy decision-making and demonstrating the effectiveness of the APS in meeting these challenges. These could include:

investment in better systems to support the administration of policy and programs, including the collection of relevant and reliable data to support APS staff

increasing the levels of evidentiary (including research, evaluation and data) and performance literacy amongst APS staff, and ensuring sufficient numbers of staff with specialist technical expertise in data, research and evaluation

encouraging a culture of performance management, including incentives for managers to engage with risk, innovate and accept the potential ‘to fail’

institutional infrastructure such as the establishment of an Evaluator-General, and having a Chief Evaluator at SES level appointed in each agency.


This submission is primarily responding to the following elements of the Independent Review’s Terms of Reference:

delivering high quality policy advice, regulatory oversight, programs and services

acquiring and maintaining the necessary skills and expertise to fulfil its responsibilities

improving citizens’ experience of government and delivering fair outcomes for them.

This submission comprises the following parts:

current context for the APS

evaluation and its role in contributing to appropriate, effective and efficient public administration

current capacity of the APS to undertake evaluation

possible strategies.


In making this submission, the AES recognises that the context in which Government, and particularly the APS, operates is a highly complex one.

Some of this context is established by the Parliament, particularly through key legislation such as the Public Service Act 1999, which requires ‘…an apolitical public service that is efficient and effective in serving the Government, the Parliament and the Australian public’, and the Public Governance, Performance and Accountability Act 2013, which requires “…high standards of governance, performance and accountability”. It is also bounded by key institutions such as the Australian Public Service Commission and the Australian National Audit Office.

Other aspects are established by the broader environment, which presents significant macro-policy issues – for example, the last decade has seen Australian Governments consistently wrestle with deficit budgets during a period in which the first intergenerational reports had predicted a structural surplus. This has led to significant debate on the size, nature and role of the Australian Government, as evidenced by work such as the National Commission of Audit, the Productivity Commission’s Reforms to Human Services Inquiry and this Review.

At an individual Portfolio level, there are numerous complex and often inter-related issues being responded to such as national security, education, intergenerational change, workplace relations, infrastructure, human services, energy, community cohesion, international relations, and economic growth.

With hindsight, the almost decade-old prediction of the ‘Ahead of the Game’ report that the APS would face organisational challenges such as “rising citizen expectations of government, rapid technological change, tight fiscal pressures, increasing pressure to deliver in restricted timeframes and a tightening labour market that will place greater pressure on the APS to attract and retain the best employees” has proven quite prescient.


Evaluation encompasses the systematic collection and analysis of information to answer questions, usually about the effectiveness, efficiency and/or appropriateness of an ongoing or completed activity, project, program or policy.

Evaluation answers questions about whether government objectives have been achieved and the extent to which program activities have contributed to their purpose. Through careful data collection (qualitative and/or quantitative) and analysis, evaluation incorporates monitoring and complementary descriptive performance information to make assessments, form judgements about success, and inform decisions about future programming, strategic decision-making and resource allocation.

While evaluation is often used at the end of an activity or program (commonly referred to as summative or impact evaluation), it is also a powerful tool in program design and implementation (referred to as formative evaluation). Evaluation professionals use formal methodologies to provide useful empirical evidence about public entities (such as programs, products, or performance) in decision-making contexts that are inherently political and involve multiple stakeholders, where resources are seldom sufficient and where time-pressures are salient.

Evaluative inquiry can therefore be undertaken across the policy and program life-cycle to:

help identify and measure the need for a policy or program and to understand best practice

clarify and strengthen policy and program conceptualisation and design (including what the expected key activities, outputs and outcomes are, when these are expected to occur and in what sequence, and what data is needed to measure these)

support implementation by testing fidelity (process) and identifying opportunities for improvement during roll-out

inform ongoing program management and accountability/measurement by identifying and producing sound data and indicators

identify the outcomes, impacts, effectiveness, efficiency and lessons learned of a policy or program.

When it operates across the program and policy life-cycle—and particularly when planned strategically across an agency or portfolio—evaluation makes a significant contribution to an entity’s performance framework, contributing to the development of its underlying architecture, as well as contributing to the delivery of knowledge, evidence and performance information. This enables entities to ascertain and report on the level to which they are achieving their purpose.

Evaluation’s role in an effective public sector has long-standing recognition. The ‘Baume Report’ (1979), more formally the Report of the Senate Standing Committee on Social Welfare (SSCSW), ‘Through a Glass, Darkly: Evaluation in Australian Health and Welfare Services’, asserted that “in order to achieve an efficient, effective, rational and equitable health and welfare system, it is necessary to conduct ongoing evaluation.”

More recently, the introduction of the Public Governance, Performance and Accountability Act 2013 (PGPA) has identified a key role for evaluation in enabling effective governance, performance, reporting and accountability of Commonwealth entities. This was emphasised from the outset in the PGPA Act’s explanatory memorandum:

“…and future elements of the CFAR reforms, will seek to link the key elements of resource management so that there is a clear cycle of planning, measuring, evaluating and reporting of results to the Parliament, Ministers and the public. The PGPA Bill does this by: explicitly recognising the high-level stages of the resource management cycle; recognising the value of clearly articulating key priorities and objectives; requiring every Commonwealth entity to develop corporate plans; introducing a framework for measuring and assessing performance, including requiring effective monitoring and evaluation; and maintaining the rigorous audit arrangements currently in place.”

In addition to the Department of Finance (through its Resource Management Guides and public presentations) and the broader literature, this contribution has also been highlighted in international initiatives such as the Sustainable Development Goals (SDGs) and the 2030 Agenda for Sustainable Development, which stress the importance of national-led evaluations.

Evaluation has evolved into a profession that requires a specialist skillset that is separate from other disciplines. This is reflected by many national and international associations establishing competency frameworks. In Australia, the AES Evaluators’ Professional Learning Competency Framework identifies seven domains of competence:

Evaluative Attitude and Professional Practice

Evaluation Theory

Culture, Stakeholders and Context

Research Methods and Systematic Inquiry

Project Management

Interpersonal Skills

Evaluation Activities.

These domains contain 100 individual skills.

Evaluation is conducted both internally by APS staff and externally for the Australian Government by consultants. Impact evaluations have invariably been undertaken externally for several decades. The reasons for this are usually a combination of perceived independence, accessing additional technical skills (capability) and staff resources (capacity), as well as being a more cost-effective option. The level of investment in an externally commissioned evaluation can vary, from $0.1m to $10m in administered funding.

A critical question is whether the current level of investment in evaluation by the Australian Government is enough. The National Commission of Audit (2014), the Productivity Commission (2016), and most recently, the PGPA Act 2013 and Rule Independent Review (2018) have called for investment in more evaluative activity.


The manner in which evaluation is undertaken across the APS has changed over time. In the 1980s it was somewhat ad hoc; in the 1990s a centralised model overseen by the Department of Finance evolved; and in the 2000s it moved to a devolved approach where performance management was the responsibility of individual Ministers and agency heads. While the recent development of ‘grants hubs’ at the Commonwealth level also provides an element of whole-of-Commonwealth-government evaluation service, the devolved model remains in place.

The introduction of the PGPA Act 2013 in many ways established the current context in which evaluation is undertaken within and by the APS. The AES has commended the progress made to date to introduce the PGPA Act and Enhanced Commonwealth Performance Framework (ECPF) to improve performance governance and accountability reporting to the Parliament and the public. It has also acknowledged that the implementation of the reforms has been a complex and, in some instances, challenging task, and may continue to be so in the near future.

The AES argues that government performance frameworks can potentially serve a number of complementary purposes including:

supporting Parliamentary oversight and accountability

informing strategic policymaking and driving better outcomes

guiding resource allocation decisions and identifying potential cost savings

improving the implementation of programs and enhancing coordination across departments

engaging civil society in clarifying service expectations and reviewing performance.

Feedback from AES members for both this submission and its recent submission to the PGPA Act 2013 and Rule Independent Review suggests that the reforms have had a positive impact within government. In a number of APS entities, it has led to a greater focus on outcomes at both the program and broader policy level.

This feedback also indicates that there are a number of areas for further development in terms of the APS being able to commission, fund, project manage, and utilise evaluations that provide a sound, rigorous evidence base on which to provide high quality advice to government on policy, programs and services.

Embedding reform has been an ongoing challenge for governments at all levels in Australia. In its submission to the PGPA Act 2013 and Rule Independent Review, the AES noted that challenges remain in embedding this reform, and these are also relevant to the terms of reference of this Review. These and other challenges are outlined below:

Insufficient incentives for entities to fully engage in the spirit and substance of current performance reforms

AES members have reported contrasting responses in terms of resourcing, effort and commitment from entities. At one end, there are cases of increased development in information technology and reporting architecture, increased resourcing to the evaluation function, and a clearer understanding of the role and linkages from evaluation practice through performance and information management, to achieving accountability via being able to tell a performance story. At the other end, there are indications of agencies that have reduced their effort and investment in evaluation and performance reporting.

The draft Report of the Independent Review of the PGPA Act and Rule found that “Accountable authorities should also drive a wider use of policy evaluation approaches by government departments to improve the quality of performance reporting. Academics suggested to us that the use of independent evaluation of government programs and services could be increased and was more frequent in the 1990s than it is now.”

Good performance management and reporting (as noted by the Department of Finance) also means engaging with risk – including being willing to innovate and even ‘to fail’. Engaging with risk is not straightforward in an environment where political dynamics and considerations exist. A question raised by a number of AES members is one of incentives: namely, whether existing APS leadership incentives are incompatible with “performance leadership”. The Independent Reviewers have noted “The tone is set at the top…strong and sustained leadership on improving performance monitoring, reporting and evaluation regimes is needed to improve performance reporting in entities.”

The AES supports the draft Report’s third recommendation that “The Secretaries Board should take initiatives to improve the quality of performance reporting, including through the greater use of evaluation, focussing on strategies to improve the way entities measure the impact of government programs.”

Evaluation findings being incorporated into Performance Measurement and Reporting

At the 2017 AES International Evaluation Conference, the Department of Finance noted that evaluations and their findings were not yet being sufficiently presented in Corporate Plans or Annual Performance Statements, and asked the evaluation community how this could be addressed. This is also the perspective of a number of AES members. Some dynamics contributing to this have been outlined above. Others include:

limited understanding that valid performance information comprises both quantitative and qualitative indicators. While the PGPA Act and ECPF have promoted a renewed interest and focus on outcomes in a number of entities, often first instincts are to measure these quantitatively. Even when both are being considered, they are often seen to be distinct streams, and there is a need to move towards adoption of a more ‘mixed-methods’ approach where they are utilised in a combined manner.

the changes sought via the PGPA Act and ECPF are not insignificant, and it may be that, even where Departments are moving positively towards these objectives, the time required to do so is longer than first anticipated.

A key aim of the PGPA was to strengthen and enhance cross-agency and portfolio reporting on shared purposes and outcomes. In our view this is still a ‘work in progress’ and will benefit from additional mechanisms or models that support cross-government partnerships.

Additional resourcing required to effectively undertake evaluation, performance management and reporting

Prior to the introduction of the ECPF, a number of stakeholders (both internal and external to Government) foreshadowed that it would have significant implications for entities’ resourcing—particularly in terms of capabilities and capacities. This was consistent with findings from the Capability Reviews, which suggested that ‘Managing Performance’ was a development area for over half of those assessed in 2012–13, as were ‘Plan, Resource and Prioritise’, ‘Outcome-focused Strategy’ and ‘Develop People’.

The AES view has been that Department of Finance is to be commended for the work it has undertaken to support the introduction of the reforms at a time of fiscal challenges and restraints. However, indications from AES members and participants at recent AES conferences suggest that a lack of additional resourcing has had the following impacts:

Maturity of data collection, management and reporting systems

There have been some positive developments in the availability of administrative data and their management and reporting systems, but it is still common for evaluation practitioners (both internal and external) to find that limitations in the capabilities of data and administrative management information systems—including linkages to relevant jurisdictional data—compromise the ability of both practitioners and APS staff to support performance management, measurement and evaluative inquiry.

Lack of staff performance management literacy, evaluation and evidentiary expertise

As noted earlier, AES members have reported variations in entities’ resourcing, effort and commitment. Some describe the APS’s capability to undertake, commission and utilise evaluation as ‘patchy’, with some entities doing it well, but others less so. Members report that evaluations are often commissioned with no reference to either the PGPA Act or the ECPF, or to how findings are expected or required to contribute to performance reporting. This raises questions about the level of awareness amongst APS staff of non-financial accountability and reporting requirements.

There is also a broader question as to whether there is a sufficient critical mass of APS staff with expertise in research, evaluation and performance measurement. This is not a recent question – senior public servants have previously expressed the view that key skill sets in research, analysis and evaluation within the APS are in short supply. This may reflect both a capability and a capacity issue, possibly arising from the trend over several decades – across all levels of government – for staff to be ‘generalists’ who can be deployed to undertake the wide range of tasks often required of public servants, while specialist expertise is purchased or procured. While this has provided the APS with flexibility, it may also have come at some cost.

AES members, both internal and external practitioners, report that this manifests itself all too often in evaluation briefs or Requests for Quote that are poorly constructed and articulated (i.e. without a clear view of their purpose), are unrealistic in scope, timing and resourcing (e.g. demanding, complex evaluations required within very short timeframes and limited budgets), and convey little sense of what constitutes quality. Even if the evaluation is conducted externally, this can result in fraught project and contract management, and in the quality of the evaluation findings being compromised.

Ultimately, it creates risk in regard to the Australian Government obtaining value for money from the evaluations it commissions – both the immediate value for money of an individual evaluation project (i.e. how sound and rigorous the evidence, information, knowledge and insights it delivers are) and the broader benefits that could be derived from it effectively informing decisions that lead to improved policies and programs (e.g. through better returns on expenditure or improved efficiencies).


The AES suggests the following strategies for the Independent Reviewers’ consideration:

Investment in Capacity and Capability building

The AES notes the importance placed on embedding evaluation and its practices across the APS, while observing that currently they are limited to technical and specialist areas of agencies. The AES supports the Department of Finance’s practice of issuing Resource Management Guides to elaborate more generally on the principles of the PGPA Act.

However, this practice risks the Guides being read only by key staff and being subject to their interpretations of the requirements. The intention of the PGPA Act is to change APS practice and embed its requirements into the future and, as such, additional structure is needed to bring this about. Additional investment may be required in creating greater awareness of the PGPA Act and ECPF amongst APS staff, and in particular skill sets and knowledge, e.g.:

Use of program and ‘purpose' logic and theory

Developing performance measurement frameworks

Being able to tell a performance story

How to include useful qualitative analysis in different levels of reporting (e.g. Australian Government Solicitor and DSS - https://www.dss.gov.au/publications-articles/corporate-publications/annual-reports/dss-annual-report-2016-17 )

The AES, a number of its members and a range of other organisations and institutions provide resources and training in these areas. There may be value in the APS, either via the Department of Finance or the Australian Public Service Commission, looking to develop strategic relationships and partnerships with such entities to deliver relevant training and resources to APS staff.

The practice of evaluation would also be enhanced by entities having sufficient numbers of dedicated evaluation specialists within their departments. In addition to supporting generalist staff across the policy and program lifecycle in design, review, performance management, monitoring and reporting, such staff would also enable their entity to be an ‘informed purchaser’ of evaluation services and ensure value for money.

The AES recognises that this would require the Australian Government to incur additional staffing expenditure at a time when budget constraints remain. However, it also notes the findings of the Building the Education Revolution Implementation Taskforce, which found a relationship between the jurisdictions that retained internal expertise and capability and those that obtained the best value for money from the measure, and concluded that a diminution of internal capacity may represent a false economy.

Reviewing incentives for staff and the role of performance leadership

A common question that arises with reform and change management processes is whether change needs to be driven by a ‘top down’ process, a ‘bottom up’ process, or a combined approach. For the PGPA Act and the ECPF, it may require a combined approach. Consideration should be given to providing incentives for APS staff at all levels and in all roles to engage with performance measurement and reporting – such as earned autonomy or a differential approach to regulation. Additionally, given the role that senior officers have both in influencing organisational culture and in approving input into Corporate Plans and Annual Performance Statements, there may also be benefit in seeking to foster a culture of performance leadership at that level. The AES notes Recommendation 1 from the Independent Review of the PGPA Act and Rule Consultation Draft, which calls for the Secretaries Board to periodically assess progress by Commonwealth entities.

Encourage and facilitate partnership models that focus on policy evaluation and research

To enhance shared cross-portfolio evaluations and increase evaluation capacity more generally, an example of an effective partnership model for policy evaluation-research can be found in the UK’s Centre for the Evaluation of Complexity Across the Nexus (CECAN). Co-funded by research and policy departments (the Economic and Social Research Council; Natural Environment Research Council; Department for Environment, Food & Rural Affairs; Department for Business, Energy & Industrial Strategy; Food Standards Agency; and Environment Agency), CECAN undertakes ‘real-life’ evaluation projects and delivers a program of evaluation methods workshops, training courses in evaluation tools, and specialist seminars by international experts, to encourage knowledge sharing and capacity building amongst those working in UK policy making.

Introduction of Chief Evaluators

The AES notes and supports Professor Andrew Podger’s suggestion, in his latest submission to the Independent Review of the PGPA Act and Rule, for establishing a position of Chief Evaluation Officer in each agency. This would be consistent with similar senior management functions (e.g. Chief Operating Officer, Chief Financial Officer and Chief Information Officer) that have emerged to support agency Secretaries, in recognition of the workload and complexities involved in managing a present-day APS department and being effective and responsive to Ministers and the Government.

In larger agencies, these positions are often at the current SES Band 3 level. The AES notes that the Department of Industry, Innovation and Science’s comprehensive agency Evaluation Strategy was created under the responsibilities of its Band 3 Chief Economist. Similarly, the Agriculture Division head led an increase in performance measurement capability within that area of the then Department of Natural Resources and Environment in Victoria during the late 1990s/early 2000s.

The AES would suggest that such a position should be at least at the SES Band 1 level in smaller entities, and higher in larger ones, to carry sufficient influence and authority.

Introduction of an Evaluator-General

The difficulty of assessing the non-financial performance of government policies and programs should not be underestimated. In addition to the above options, the AES also supports in principle the concept proposed by Dr Nicholas Gruen for an institution-based strategy: an Office of the Evaluator-General, an independent office that would meet the performance information needs of Parliament by building evaluation capital across government entities and ensuring that evaluation and performance reporting are effectively undertaken. Dr Gruen proposes that its characteristics would include:

being part of the ‘independent executive’, like the Auditor-General and the Productivity Commission

providing resources and institutional support for a level of evaluation expertise to be cultivated within the public sector with clear career pathways through evaluation to the highest levels of the public service

monitoring and evaluation would be designed and operated in the field by officers of the Evaluator-General, in collaboration with the responsible department/agency

monitoring and evaluation outputs would be available first and foremost to service deliverers to assist them to optimise their performance, with the Evaluator-General making these public with appropriate comment and analysis.

The AES notes a caution about establishing such an Office: creating another layer of bureaucracy may work against the intent, particularly for smaller agencies, and the proposal would benefit from consultation with stakeholders before a final form and design is established. If implemented effectively, such an office would ensure that reforms, policies and programs are designed to be evaluable and that their effectiveness can be assessed, while contributing to the non-financial performance statements required under the PGPA Act. It could also serve as an invaluable resource to all agencies and complement internal evaluation resources (in keeping with external and internal audit practices).


The AES thanks the Independent Reviewers for the opportunity to make a submission to the Review.

We suggest that the effectiveness of the APS is best demonstrated through formal evaluation, which requires long-term attention to program outcomes. Many APS programs (in health, for example) take years to produce these outcomes and demonstrate their effectiveness, and so need sustained management attention to program delivery and impact. Short-term reform will not produce this effectiveness, which requires longitudinal management perspectives and embedded agency practices.

It will therefore be important for this Review to move beyond another APS reform cycle and instead embed permanent evaluation practice within APS agencies. We acknowledge that the Australian Government continues to experience fiscal challenges and that there is a need for restraint; nevertheless, sound investment in the Australian Public Service through better administrative systems, staff capacity and capability, and institutional infrastructure will contribute to the APS being fit-for-purpose for the future.



References
Alexander, E. and Thodey, D. (2018). Public Governance, Performance and Accountability Act 2013 and Rule Independent Review Consultation Draft. Available at: https://www.finance.gov.au/sites/all/themes/pgpa_independent_review/report/PGPA_Act_and_Rule_-_Draft_Report.pdf

Australasian Evaluation Society (2013a). Code of Ethics. Available at: https://www.aes.asn.au/images/stories/files/membership/AES_Code_of_Ethics_web.pdf

Australasian Evaluation Society (2013b). Evaluators Professional Learning Competency Framework. Available at: https://www.aes.asn.au/images/stories/files/Professional%20Learning/AES_Evaluators_Competency_Framework.pdf

Australasian Evaluation Society (2013c). Guidelines for the Ethical Conduct of Evaluations. Available at: https://www.aes.asn.au/images/stories/files/membership/AES_Guidelines_web_v2.pdf

Banks, G. (2009). Challenges of Evidence Based Policy-Making. Productivity Commission, Australian Public Service Commission. Available at: http://www.pc.gov.au/news-media/speeches/cs20090204/20090204-evidence-based-policy.pdf

Commonwealth of Australia (1979). Through a Glass, Darkly: Evaluation in Australian Health and Welfare Services. Senate Standing Committee on Social Welfare (SSCSW). Available at: https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Significant_Reports/socialwelfarectte/welfareservices/index

Commonwealth of Australia (2010). Ahead of the Game – Blueprint for the Reform of Australian Government Administration. Attorney-General’s Department. Available at: http://apo.org.au/system/files/20863/apo-nid20863-24401.pdf

Commonwealth of Australia (2011). Building the Education Revolution Implementation Taskforce: Final Report. Department of Education, Employment and Workplace Relations. Available at: http://pandora.nla.gov.au/pan/128244/20110727-1626/www.bertaskforce.gov.au/documents/publications/BERIT_final_report.pdf

Commonwealth of Australia (2013). Public Governance, Performance and Accountability Act 2013, No. 123, 2013. Available at: https://www.legislation.gov.au/Details/C2013A00123

Commonwealth of Australia (2014). Towards Responsible Government – The Report of the National Commission of Audit Phase Two. Department of Finance. Available at: https://www.ncoa.gov.au/report/phase-two/executive-summary

Commonwealth of Australia (2016). Overcoming Indigenous Disadvantage Key Indicators 2016 Report. Productivity Commission. Available at: http://www.pc.gov.au/research/ongoing/overcoming-indigenous-disadvantage/2016/report-documents/oid-2016-overcoming-indigenous-disadvantage-key-indicators-2016-report.pdf

Commonwealth of Australia (2017). Evaluation Strategy 2017-2021. Department of Industry, Innovation and Science, Canberra. Available at: https://www.industry.gov.au/sites/g/files/net3906/f/May%202018/document/pdf/department_of_industry_innovation_and_science_evaluation_strategy_2017-2021.pdf

Department of Industry, Innovation and Science (2018). Response to the Independent Review of the PGPA Act and Rule – Consultation Draft Report (June 2018). Available at: https://www.finance.gov.au/sites/all/themes/pgpa_independent_review/draft-submissions/DOIIS.pdf

Gruen, N. (2016). Why Australia Needs an Evaluator-General. The Mandarin (9 May 2016). Available at: https://www.themandarin.com.au/64566-nicholas-gruen-evaluator-general-part-two/

McDonald, B., Rogers, P. and Kefford, B. (2003). Teaching people to fish? Building the evaluation capability of public sector organisations. Evaluation. Sage Publications, London

Podger, A. (2018). Comments on the Consultation Draft Report of the Independent Review of the PGPA Act 2013 and Rule. Available at: https://www.finance.gov.au/sites/all/themes/pgpa_independent_review/draft-submissions/professor-andrew-podger.pdf

Tune, D. (2010). Evaluation: Renewed Strategic Emphasis – Presentation to the Canberra Evaluation Forum. Department of Finance. Available at: https://www.finance.gov.au/presentations/


Australasian Evaluation Society – www.aes.asn.au

Evaluation of Complexity Across the Nexus – https://cecan.ac.uk