Summary of the Joint Audit and Evaluation of the Impact Canada Initiative – Clean Technology Stream

About the Program

The Impact Canada Initiative (ICI) is a whole-of-government effort, led by the Privy Council Office (PCO), which promotes the use of innovative funding models designed to help departments accelerate the adoption of outcomes-based approaches to deliver results to Canadians. Natural Resources Canada (NRCan)’s Clean Technology Stream (CTS) is one of the first program streams developed under this new initiative. Led by the Energy Technology Sector (ETS), the objective of the program is to co-develop and launch a series of prize challenges to achieve breakthroughs in clean technology innovation. Since 2017, the program has launched five challenges and one Indigenous innovation initiative. Each of these has its own mission and unique set of goals, performance targets, incentive structures, timelines, collaborators, participants, outreach strategies, and evaluation criteria. This program was allocated $75 million (M) from 2017-18 to 2020-21 (extended to 2021-22 in response to COVID-19).

What the Engagement Found

Relevance

The engagement found that the ICI-CTS is relevant. The program aligns with federal government and NRCan priorities, roles and responsibilities, and each of the six selected focus areas meets the key criteria set out for its prioritization. More broadly, the ICI-CTS supports mandate commitments to increase experimentation and is structured to give proponents greater flexibility to propose innovative solutions to problems for which solutions are not apparent.

Governance

The engagement found that governance structures, processes and mechanisms for the ICI-CTS were defined, established and operating effectively. In the initial stages of program design and development, key intradepartmental players interpreted the ICI-CTS Terms and Conditions differently because of the unfamiliar policy context; these differences have since been addressed through an ongoing collaborative approach.

Processes and Controls

Overall, we found that processes and controls are in place to support compliance with relevant departmental guidance and the Treasury Board (TB) Policy on Transfer Payments. Key financial operational controls have been designed and are operating effectively most of the time. However, there was an opportunity to improve the retention of documentation related to key decision processes. Proactive disclosure was not always completed within the required reporting period, and the program’s process for tracking, managing and resolving recipient audit recommendations is not clearly defined.

Program Design

Overall, we found that the design of focus areas under the ICI-CTS was consistent with principles of Impact Canada and best practices in challenge models. While communications strategies could have been more innovative and better resourced, the program attracted a sufficient number of quality proposals.

Stakeholder engagement in co-design was recognized as a key strength for the program. The program was responsive to co-design feedback and, while there was some room for improvement, the resulting funding flexibilities and non-financial proponent supports built into the design of each focus area facilitated participation from diverse solvers. NRCan also made good use of collaborators to co-develop and/or deliver activities under the ICI-CTS.

Measuring Expected and Unexpected Outcomes

Impact Canada uses innovative approaches that necessitate new methods and strategies for measuring program outcomes and impacts. While a notional performance framework was included with program approvals, NRCan is still working on its own and with PCO to enhance this framework and to develop challenge-specific outcomes and related methodologies. Nevertheless, available evidence suggests that the program has made some progress on each of its immediate outcomes – increased collaboration and investment in clean tech, mobilization of new talent (including in underrepresented groups), and enhanced public awareness.

It is too soon to report on progress towards intermediate and ultimate outcomes. However, a key difference in the design of challenge-based models is that if the result is not achieved, the funding is not awarded. We found that finalists are progressing towards the outcomes specific to their projects and are achieving expected project milestones.

Lessons Learned

The challenge model is a new tool that provides an alternative to traditional funding programs, but decisions on when it is appropriate to use this model should be carefully considered in light of the outcomes to be achieved and their context. NRCan has learned a great deal through experimentation in each focus area; we expect NRCan to apply these lessons from the ICI-CTS in its own design of future prize challenges, whether under the umbrella of Impact Canada or as part of a separate initiative.

ABOUT THE ENGAGEMENT

The joint engagement examined the design and delivery of the ICI-CTS from its announcement in Budget 2017 to March 2021, with updates reflecting the impact of COVID-19. The objectives of the joint engagement were:

  • To assess the extent to which the program’s design and delivery facilitates the effective and efficient achievement of immediate results and complies with relevant authorities.
  • To identify and document lessons learned from the experimental approaches under the Impact Canada – Clean Technology Stream that can be applied to future interventions.

The scope of the engagement included an examination of outcomes for the program stream as a whole and of the activities and outputs for each of its six focus areas. While recognizing the importance of the outcomes targeted by each challenge and initiative, the joint engagement focused on expected results for the program as a whole. Although it considered their role in facilitating design and delivery by NRCan, the engagement’s scope excluded the performance of PCO and any other external collaborators.

NRCan’s Audit and Evaluation Branch (AEB) conducted this engagement in accordance with the TB Policy on Results (2016), the Institute of Internal Auditors’ International Standards for the Professional Practice of Internal Auditing, and the Government of Canada’s Policy on Internal Audit. Details on specific engagement questions, methods and limitations are found in the full report.

In future, NRCan’s AEB will look for evidence that lessons learned have been considered at least in the proposed design of relevant programs and activities. Key lessons learned include:

Design

  • Engage stakeholders in co-design of the prize challenge as early as possible in the design phase. If possible, future interventions should undertake engagement and/or consultation in advance of committing to a focus area.
  • Allow for the maximum possible flexibility in the fiscal arrangements, including contingency plans to deal with the unexpected. If possible, include a carry-forward provision in funding authorities to simplify use of flexibilities in allocating funds across fiscal years.
  • Engage corporate services early to enable them to fully understand the program and its objectives.
  • Ensure that senior management is closely engaged to help overcome resistance to innovative programming, and that expectations for the program’s implementation are clear and aligned with the intended design and delivery of the proposed initiatives.
  • Recognizing the limits to co-creation with external entities, carefully select collaborators based on who is best positioned to support results and benefits to Canadians and ensure that their role is clearly defined, understood, and does not introduce a real or perceived conflict of interest.

Delivery

  • Ensure that dedicated resources are available (within the program team and/or corporate services) to fully deliver on innovative communications objectives as a core element in challenge design.
  • Develop clear selection criteria and clearly communicate to applicants how they will be assessed against these pre-determined criteria, and ensure that documentation of decisions related to selection of applicant proposals is complete. 
  • Ensure that in cases where intermediaries are used, required documentation identified per the funding agreements is collected and retained to fulfil NRCan’s oversight role.
  • Ensure that approvals of proactive disclosures of grants and contribution information are completed in a timely manner.

Impact

  • Dedicate sufficient resources to performance measurement, to assess the effectiveness (impact) and efficiency of the prize challenge.
  • As early as possible in challenge design or implementation, consider ways to effectively contribute to concrete follow-on strategies towards the further advancement of government priorities.

Recommendations, Management Response and Action Plan
  • The ADM ETS should review and document lessons learned from the Impact Canada – Clean Technology Stream, including those that may emerge in final prize award and any post-program follow-on, and make these available within NRCan, including with corporate services, and across government such that they can inform the design and delivery of future prize challenges.

Management agrees. The Office of Energy Research and Development (OERD) will further formalize and standardize its processes for documenting lessons learned from Impact Canada Cleantech Challenges. OERD will also explore options and develop a plan for sharing lessons from Impact Canada Cleantech with NRCan senior management and other sectors.

Following the awarding of prizes and program wrap-up, OERD will develop a lessons learned summary document, which will be updated as lessons emerge. OERD will continue to share lessons with the PCO Impact and Innovation Unit to support the development of “Challenge Resource Materials” that are shared and presented to other federal departments and the public (e.g., PCO case studies, the PCO Challenge Guide and PCO advice on challenge juries). OERD will also continue with its practice of giving presentations on Impact Canada within NRCan and to other departments upon request.

Position responsible: DG, OERD on behalf of ADM, ETS

Date to achieve:

  • December 30, 2021 to develop a plan for documenting and sharing lessons.
  • September 30, 2022 to develop a lessons learned document, to be updated as lessons emerge.

  • The ADM ETS should review and finalize the performance measurement strategy for the ICI-CTS. This strategy should include:
    • Logic model (or theory of change), indicators, data collection methods, data sources, and potential counterfactuals.
    • Metrics for the program stream as a whole, as well as for each challenge or initiative, to demonstrate the achievement of outcomes.
    • As applicable, metrics to inform GBA+ considerations.
    • A plan for the timing and means by which reporting on program results and impacts will be communicated to senior management.

Management agrees. OERD will update and finalize the core Impact Canada Cleantech performance measurement strategy, including:

  • Finalize the draft Impact Canada Cleantech logic model, indicators, data collection methods and data sources, and, where appropriate and possible, identify suitable counterfactuals. OERD acknowledges that, due to the nature of transformative innovation and RD&D, it will also draw on principles from innovation literature and developmental evaluations, integrating new outcomes and indicators and updating assumptions based on emergent findings.
  • Work is underway to update and finalize core metrics for individual challenges and the overall program.
  • Work is underway within OERD to integrate GBA+ metrics into program data collection and monitoring. OERD will continue to collaborate within NRCan, and with PCO and Statistics Canada to collect and monitor GBA+ and equity, diversity and inclusion data where possible.  

Position responsible: DG, OERD on behalf of ADM, ETS

Date to achieve:

  • March 31, 2022 for finalization of the performance measurement strategy.
  • Ongoing performance measurement and monitoring until 2029 (at minimum).

  • The ADM ETS should ensure that the risk-based recipient audit planning and methodology is strengthened to include tracking and monitoring of the status and results of recipient audit recommendations.

Management agrees. OERD will strengthen its risk-based recipient audit plan and methodology to ensure that it better documents and tracks the status and results of audits at the Branch (OERD) level rather than the Division level. OERD has already undertaken a review of its risk-based audit processes and updated its methods for planning, selecting and monitoring audits to ensure that all OERD programs implement NRCan’s Guide on Recipient Auditing in a consistent manner (e.g., a consistent sampling method). OERD piloted its updated risk-based recipient audit planning and selection method in Q2 2021 and is currently refining methods and developing new guidance documents and tools for monitoring OERD’s annual audit plan. Implementation of processes and tools for monitoring audit recommendations and compliance will occur in Q3 and Q4 of 2021-22, and will include optimizing the use of NRCan’s AMI database for tracking audit results and proponents’ audit history.

Position responsible: DG, OERD on behalf of ADM, ETS

Date to achieve: March 31, 2022

 
