Several of the national projects managed by RePaDD had the chance to participate in a two-day forum in Canberra, including a Parliamentary Friends event. The National Palliative Care Projects are funded by the Department of Health and Aged Care. They work towards improving the quality of, and access to, critical support and treatment for people with life-limiting illnesses and their loved ones.
The forum was an amazing chance to meet and learn from the other projects, but it was also an opportunity to consider the role of the projects and how we plan and measure what we see as success. The CareSearch team had been asked to prepare a presentation for the forum on incorporating evaluation and thinking about impact. Below are some of the key ideas discussed in the presentation:
- The American Evaluation Association defines evaluation as “a systematic process to determine merit, worth, value or significance”. Importantly, they also note that evaluation can be used for different purposes. So, as researchers and as grant recipients, we need to be clear on why we are undertaking evaluation. Do we want to know if the program was delivered, or to what extent it met its deliverables? Is our interest in what happened because of the program, or in what we can learn from how we delivered it so that we can do better?
- Program logic can be a helpful tool as it makes us describe the resources and activities that comprise the program, and the changes that we expect to result from them. It can visually represent the relationships between the program inputs, goals and activities, operational and organisational resources, the techniques and practices, and the expected outputs and effects. This can help us check whether what is happening is what we expected, and highlight any assumptions that are not shared within the team.
- The important thing is to understand what you think shows evidence of the worth or value of the project. It is also important to know if things do not work and perhaps to understand why they did not.
- Impact assessment focuses on the effects of the intervention. In broadest terms, impacts address the contribution that the research or activity makes to the economy, society, environment, or culture.
- Governments are increasingly interested in evaluation and impact. The Department of the Prime Minister and Cabinet even has an Office of Impact Analysis. While its framework relates to policy planning, it reminds us that context matters. Funders are trying to understand whether what they are doing is valuable and addresses the problem for which they provided funding.
- An activity workplan not only details the activities that need to be delivered but also includes performance indicators to assess what happened because of those activities. These are measures of success, often including a quantitative component.
- There are challenges in impact assessment. [3,4] Do we know what we are trying to measure? Can we collect meaningful data without disrupting the ability to deliver the work program efficiently? Do we need ethics approval, and how will that affect our timelines?
- As well as challenges, there are opportunities at the program level. There is a chance for us to use collective power. Can we aggregate information and measures at the project level to get a better picture of whole-of-sector engagement and impact? We could consider using common tools, measures, and resources where possible to enable aggregation or further analysis. Each project works with both common and unique groups; this is an opportunity to better understand variance across groups and settings.
- There is also the opportunity for us to think about legacy, sustainability, and scalability. First, we need to make sure we do not lose knowledge of what has been done and whether it was successful. Second, we should look at how we sustain successful changes to keep capturing the benefits of the investment to date. Finally, we need to look at what has sufficient value that we should look at scaling broadly across the sector or more specifically within particular needs or population groups.
For all research, being clear about the desired impacts is important, as it should inform our study design planning, our choice of measures, and our consideration of the population. For funders, having proof that projects make a difference is fundamental to pursuing sustainability, further funding, and scalability.
RePaDD members lead, or are part of consortia delivering, several National Palliative Care Projects:
- CareSearch including palliAGED 2023-2026: Flinders led (Investigators: Professor Jennifer Tieman, Dr Raechel Damarell)
- End of Life Essentials 2023-2026: Flinders led (Investigators: Associate Professor Kim Devery, Dr Caroline Phelan)
- End of Life Directions in Aged Care (ELDAC) 2023-2026: QUT led (Investigator: Professor Patsy Yates), Flinders Co-Lead (Investigators: Professor Jennifer Tieman, Dr Priyanka Vandersman)
- The Advance Project 2023-2024: HammondCare led (Investigator: Professor Josephine Clayton), Flinders Consortium Member (Investigator: Professor Jennifer Tieman)
- National Palliative Care Coordination Program 2023-2026: UoW led (Investigator: Associate Professor Barb Daveson), Flinders Consortium Member (Investigator: Associate Professor Deidre Morgan)
- CarerHelp 2023-2026: St Vincent's led (Investigators: Professor Peter Hudson, Associate Professor Mark Boughey), Flinders Consortium Member (Investigator: Professor Jennifer Tieman)
1. American Evaluation Association (AEA). What is evaluation? Washington, DC: AEA.
2. Australian Institute of Family Studies (AIFS). How to develop a program logic for planning and evaluation [Internet]. 2016 [cited 2023 Dec 12].
3. Greenhalgh T, Raftery J, Hanney S, Glover M. Research impact: a narrative review. BMC Med. 2016 May 23;14:78.
4. London School of Economics (LSE). LSE Impact Blog (Filter for impact) [Internet]. 2023 [cited 2023 Dec 12].