
Submission to the APS Review

Office of the Chief Economist

August 2018

For further information on this submission please contact:

David Turvey

Acting Chief Economist, Economic and Analytical Services Division (as at 7 August 2018)

Department of Industry, Innovation and Science

GPO Box 9839

Canberra ACT 2601

Phone: +61 2 6102 9929

Email: David.Turvey@industry.gov.au

Website: www.industry.gov.au/OCE

Submission to the APS Review from the Office of the Chief Economist, Department of Industry, Innovation and Science

The Office of the Chief Economist in the Department of Industry, Innovation and Science (DIIS) is pleased to offer its views and experience to the Australian Public Service (APS) Review, specifically in relation to the following aspect of the APS Review Terms of Reference:

‘How the APS monitors and measures performance and how it ensures the transparent and most effective use of taxpayers’ money in delivering outcomes’.

The first part of our submission discusses the importance of establishing evaluative capability within government departments, based on our experience in this department. The second part makes a similar case for embedding good data management practices. Together, the two parts illustrate practices that could usefully be embedded across all departments and ultimately the APS as a whole.

Introduction

The capture, analysis and reporting of relevant data and intelligence on program and policy performance is critical for any high-performing organisation, and the APS is no different. The appropriate use of past and future (predictive) performance information, gained through good program management practice and the systematic use of evaluation, can contribute substantially to improvement and innovation across the APS. Applying a data and evaluative lens means continually asking ourselves: ‘How are we performing? How do we know what works? Are we doing the right things?’

The Enhanced Commonwealth Performance Framework under the Public Governance, Performance and Accountability Act 2013 (PGPA Act) has reframed expectations around performance reporting and accountability, and encouraged better evaluation strategies. We adopted a new Evaluation Strategy in 2015 and reviewed it in 2017. The review found that the Department is a leader in the APS but still has room to improve, particularly in ensuring that all staff understand the importance of evaluation and that evaluation findings influence policy and program design more directly.

The Department recognised that better harnessing its data can improve policy development, program management and corporate services. Our Data Management Strategy for 2016-18, and the accompanying benefits realisation strategy, set the roadmap for establishing the foundational capabilities needed to turn our ambition of becoming a data-driven organisation into reality.

Our experience in implementing this framework across the Department may present useful lessons for other Commonwealth departments and agencies.

Evaluation

Through its efforts to ‘close the loop’ between policy development, implementation and evaluation, the DIIS Evaluation Unit has embedded evaluative activities across internal and Business Grants Hub programs.

Our central Evaluation Unit is located within the Office of the Chief Economist (OCE), which provides a level of independence from policy and program areas, and is responsible for delivering and supporting a range of evaluative activities, as illustrated in Appendix 1. The following table outlines the suite of evaluation tools and processes that DIIS has developed and uses to build an evaluation culture within the department. All of these could be replicated and adopted across the APS:

Table: Tools and capabilities that have helped DIIS build and maintain an evaluation culture

• Published departmental Evaluation Strategy: sets out how DIIS addresses its responsibilities under the Enhanced Commonwealth Performance Framework and the PGPA Act.

• Evaluation Plan with scheduled evaluation activities: sends a definitive message that evaluation is resourced, supported and expected.

• Senior executive level evaluation champion: high-level leadership and championing of evaluation drives cultural change.

• Evaluation Ready process, establishing an evaluation strategy for programs early in their lifecycle: ensures evaluability of all programs, with timely and appropriate data collection strategies.

• Data collection advice: relevant and trusted data is collected at the right time, for the right purpose, and supports performance monitoring and evaluation.

• Centralised Evaluation Unit responsible for conducting and managing evaluations: sends a definitive message that evaluation is resourced and supported.

• Evaluation Fair: enables cultural change by showcasing practical evaluative tools and resources across the department.

In order to bring about real change in evaluation capacity, the purpose of evaluation needs to be meaningfully integrated into organisational policies and procedures. At DIIS, fostering evaluative thinking across policy and program staff is a key success factor. It helps ensure that staff who develop and implement programs can identify and articulate program outcomes (not just outputs), and that they are engaged in developing the indicators and data sources for future evaluation of their programs. In parallel, the department treats the actual conduct of evaluations as a specialised role, to ensure robust evidence is acquired using rigorous methodologies.

The DIIS Evaluation Strategy established a robust framework for the conduct of evaluations under the PGPA Act and introduced the process of Evaluation Ready to ensure programs are prepared for evaluation as early as possible in their lifecycle with measures, indicators and data sources identified to answer evaluation questions. The Strategy is the foundation of our efforts to incorporate evaluative thinking across varied policy and program areas.

Evaluation Ready is a tool that both promotes an evaluation culture and enhances evaluation capability within the department. The key deliverable of the Evaluation Ready process is an evaluation strategy for the program, comprising a program logic, evaluation questions, data requirements and an evaluation schedule. Because the process is undertaken as close as possible to the program design stage, its outputs can inform program documentation, including application forms and reporting templates, ensuring sufficient data is collected (including baselines) for future evaluations.
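
To make the shape of this deliverable concrete, the following is a minimal sketch of the information an Evaluation Ready strategy captures. The field names and example values are illustrative assumptions, not the department's actual template:

```python
from dataclasses import dataclass

# Illustrative sketch only: field names and values are hypothetical,
# not the department's actual Evaluation Ready template.

@dataclass
class EvaluationStrategy:
    program_name: str
    program_logic: list[str]             # inputs -> activities -> outputs -> outcomes
    evaluation_questions: list[str]      # what the evaluation must answer
    data_requirements: list[str]         # sources identified at design, incl. baselines
    evaluation_schedule: dict[str, str]  # evaluation stage -> planned timing

strategy = EvaluationStrategy(
    program_name="Example Grants Program",
    program_logic=["funding", "grants awarded", "projects delivered", "outcomes achieved"],
    evaluation_questions=["Did the program achieve its intended outcomes?"],
    data_requirements=["applicant baseline data", "recipient progress reports"],
    evaluation_schedule={"process evaluation": "year 1", "impact evaluation": "year 4"},
)
```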

The Evaluation Strategy supports the department’s Evaluation Plan, which schedules evaluation activities for all relevant departmental programs across a four-year cycle. The Strategy categorises program evaluations into three tiers to determine evaluation effort and resourcing. This tiering system prioritises program evaluations based on:

• total funding allocated for the program

• internal priority (importance to DIIS and the Australian Government’s goals)

• external priority (importance to external stakeholders)

• overall risk rating of the program

• track record (previous evaluation, the strength of performance monitoring and lessons learnt).

Evaluations of the Department’s programs are planned over a rolling four-year cycle according to this tiering system.
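
As a minimal sketch of how such a tiering decision might be expressed in practice, consider the following. The 1-5 scales, equal weighting and tier thresholds are assumptions for illustration, not the department's actual rules:

```python
# Hypothetical tiering sketch: the criteria come from the Evaluation Strategy,
# but the 1-5 scales, equal weights and tier thresholds are assumptions.

def evaluation_tier(funding: int, internal_priority: int,
                    external_priority: int, risk: int, track_record: int) -> int:
    """Each criterion is scored 1 (low) to 5 (high); track_record is scored
    so that weak prior evaluation/monitoring scores high (more effort needed)."""
    score = funding + internal_priority + external_priority + risk + track_record
    if score >= 20:
        return 1  # tier 1: largest evaluation effort and resourcing
    if score >= 12:
        return 2  # tier 2: moderate evaluation effort
    return 3      # tier 3: lightest-touch evaluation

print(evaluation_tier(funding=5, internal_priority=4, external_priority=4,
                      risk=4, track_record=3))  # -> 1
```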

The Evaluation Strategy also includes an ambition to publish evaluation findings wherever possible. This promotes transparency and rigour, and allows external review and feedback to improve our evaluation methods. Clearly there will be circumstances where publishing evaluation findings is not appropriate.

The OCE has recently trialled a process of supporting staff to apply an evaluative lens when preparing New Policy Proposals (NPPs) for government consideration. This includes encouraging policy analysts to articulate the program’s theory of change, be clear about intended outcomes, and plan, as part of the policy proposal, when and how evaluation will be conducted. This should help ensure that adequate funding is available at the appropriate time for effective evaluation. There may be opportunities to make this approach more common across government through the budget process operational rules and NPP templates.

The Evaluation Unit’s experience working with the evaluation areas of other departments and agencies has revealed a wide range of models, approaches and methods. Departments have developed a community of practice around evaluation activity, supported by active engagement with the Australasian Evaluation Society. However, there may be benefits from greater co-ordination of evaluation activities across government, including promulgating best practice, promoting comparability of evaluation results where possible, and developing more consistent approaches to evaluating cross-government policy initiatives. One model that has been suggested is establishing an Evaluator-General, or equivalent evaluation function, similar to the Auditor-General, but there are other ways that greater co-ordination could be achieved.

Data Governance and Management

Purpose

The Data Management and Analytics Taskforce (DatMAT) was established in 2016 to catalyse a shift in how the department discovers, accesses, shares and uses data, guided by the Data Management Strategy for 2016-18 and its accompanying benefits realisation strategy.

The objective of this first tranche of transformational change activities was to achieve a department-wide and standards-led regime for data management and governance. This included:

• making it easier for staff to find and access the data relevant to their needs

• ensuring data is trusted, documented and understood, with data stewards managing the data and available to answer users’ questions

• enabling staff to analyse data using tools they are comfortable with, while understanding the data and using it appropriately.

What have we achieved?

DatMAT has worked iteratively with many partners across the department to implement the Data Management Strategy and develop new capabilities in data management, governance, reporting and policy, which are effecting material change in the department’s data culture. This new culture is evident in the growing use of the department’s 400+ data assets through DataHub, the ongoing consumption of easy-to-use data analytical products, new data management and governance practices, the recruitment of data stewards, and data communities of practice where staff share new data skills and lessons learned. DatMAT has transitioned the department from a state where data was largely unmanaged to one where the majority of high-value data is well managed and defined.

Drawing on our lessons from designing and implementing a data strategy for a large APS organisation, the table below lists the key tools and capabilities required to build and maintain good data practices and culture:

Table: Tools and capabilities that have helped DIIS build and maintain good data practice and culture

• Data Governance Framework: good data management relies on a strong data governance base, including agreed policies and processes, accountabilities, formal decision structures and enforcement of rules in data management.

• Data Certification: the Data Certificate was adapted from the Open Data Institute and is considered best practice for well-managed published datasets, bringing together a set of legal, practical, technical and social elements. Certification is one of a number of initiatives helping us to achieve our vision and comprises three levels, Bronze, Silver and Gold, which signal increasing levels of trust in and reliability of datasets (see the sketch after this table).

• Data Collection Advice: a service designed to assist stakeholders across the department to understand their data requirements and design data collection points in a standard way. The Enterprise Data Model is referenced to ensure data collection is standardised across all the department’s key data items.

• Dataset Register: a catalogue of all the datasets, both internal and external, available to the department.

• Enterprise Data Model: provides an integrated view of the data produced and consumed across the entire department. The EDM represents a single integrated definition of data, unbiased by any system or application, and is an excellent vehicle for structuring the department’s data standards. The model consists of enterprise-wide subject areas, fundamental entities and their relationships, and unified terms and definitions.

• Data Capability Frameworks: data analysis is one of the department’s five core capability areas. The frameworks outline the minimum expectations for a number of data-related roles, and provide direction to staff looking to expand their data skills, including internal and external training options as well as opportunities to develop skills on the job.

• Data training: face-to-face training options include data literacy, foundation data skills, data management, and strategic data for executives. The department also supports staff to develop advanced analytics skills by providing licences to online providers and sponsoring staff to complete masters-level university qualifications in data analysis and data science.

• Data communities of practice: include the Data Analytics Guild and several user groups for specific data tools. These groups offer staff a chance to share knowledge, experiences and code in a collegiate and collaborative environment.

• New Data Strategy 2018-20: sets the vision and goals for how DIIS will enhance its data culture, with a renewed focus on people capabilities and cultural change.
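
To illustrate the certification idea described in the table above, here is a minimal sketch of how the Bronze, Silver and Gold levels might build on one another. The specific checks attached to each level are assumptions for illustration, not the department's actual audit criteria:

```python
from enum import Enum

# Illustrative sketch of the Bronze/Silver/Gold certification idea; the
# checks tied to each level are assumptions, not the real audit criteria.

class CertLevel(Enum):
    NONE = 0
    BRONZE = 1
    SILVER = 2
    GOLD = 3

def certify(registered: bool, has_glossary: bool, has_sharing_rules: bool,
            audited: bool) -> CertLevel:
    """Each level builds on the one below, signalling increasing trust."""
    if not registered:
        return CertLevel.NONE
    if not (has_glossary and has_sharing_rules):
        return CertLevel.BRONZE
    return CertLevel.GOLD if audited else CertLevel.SILVER

print(certify(registered=True, has_glossary=True,
              has_sharing_rules=True, audited=False).name)  # SILVER
```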

Efficient and effective program management needs trusted data and insights

We are responsible for data governance within DIIS, and we optimise program data collections to ensure they are fit for the purposes of program management, reporting and evaluation. We have embedded data management practices across Business Grants Hub programs in a standardised way.

We have also coupled the data certification process with Evaluation Ready to ensure program datasets attain the Silver certification level from the outset (‘Born Silver’), a hallmark of good program management.

The process includes:

• Dataset registration: the dataset is registered in the department’s Dataset Register, along with an assigned Data Steward.

• Data glossary: provides descriptions of the contents of the dataset so users can understand what has been collected.

• Data sharing rules: a clear rights statement detailing sharing rules, applicable legislation and copyright for the dataset.

• Data certification: data governance assurance, where datasets are audited against accepted governance and management standards.
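
As a minimal sketch, the record produced by this process might look like the following. The field names and values are hypothetical, not the department's actual schema:

```python
# Hypothetical sketch of a 'Born Silver' registration record combining the
# four steps above; names and values are illustrative, not the real schema.

dataset_record = {
    "name": "Example grants program applications",
    "data_steward": "steward@example.gov.au",    # assigned at registration
    "glossary": {"abn": "Australian Business Number of the applicant"},
    "sharing_rules": {
        "rights_statement": "Internal use only",
        "legislation": ["Privacy Act 1988"],     # legislation impacting sharing
    },
    "certification": "Silver",  # audited against governance standards at birth
}
```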

Key Links:

Department of Industry, Innovation and Science, Evaluation Strategy 2015-2019

Department of Industry, Innovation and Science, Evaluation Strategy Post-Commencement Review, December 2017

Department of Industry, Innovation and Science, Evaluation Strategy 2017-2021

Appendix 1 – Services offered by the DIIS Evaluation Unit