Evaluation Systems Award
Department of Education and Training Victoria Evaluation Practice
The Department of Education and Training Victoria has strengthened its internal evaluation capacity through a comprehensive system. Launched in early 2021, the Evaluation Practice now employs over 20 staff and has achieved considerable success. For example, the Evaluation Practice team has led nine large-scale evaluations, synthesised sense-making and cross-initiative insights at the system level, enabled greater access to and use of internal DET data to ensure more robust findings, influenced state-administered education surveys to further increase the quality of data available, and ensured appropriate resourcing is provided for evaluation across the agency.
This provides a good model of an internal capacity building system which could provide practical examples for other agencies and organisations.
Dr Julian King, Value for Investment approach
Value for Investment is an innovative, exemplary evaluation system that integrates theory and practice from the evaluation and economic disciplines to inform judgements and decision making. Initially developed through Julian King’s doctoral research, it is a flexible and collaborative approach that can be applied to all domains and program types. The letters of support demonstrate that it has elevated the thinking and approaches of a wide group of practitioners across varied domains of evaluation practice, including organisations and projects such as Oxford Policy Management (an international development consultancy), the Indigenous Land and Sea Corporation, the Australia Indonesia Partnership for Economic Development, the International Atomic Energy Agency, the Australian Council for Educational Research and Hikitia Consultants (a Māori evaluator). It has been used internationally, including in New Zealand, Australia, Mozambique, Pakistan and the UK. It is highly practical and adaptable, and promotes the use of mixed methods in value for money assessment for sense-making and decision making. It shifts the focus of value for money from costs and inputs to holistic, whole-of-program and whole-of-system considerations. Through the use of rubrics and multiple criteria to assess value for money, it provides a robust mechanism for considering value and promotes the use of transparent evaluative reasoning. The Value for Investment integrated evaluation system is a significant contribution to evaluation theory and practice and is shared on an open-access basis through Oxford Policy Management’s value for money guide, Julian King’s published PhD dissertation, journal articles in the American Journal of Evaluation, the Evaluation Journal of Australasia, New Zealand’s evaluation journal Evaluation Matters, and Evaluation and Program Planning, as well as open-access evaluation reports and blogs.
It is also currently taught in the University of Melbourne’s Master of Evaluation program as a component of the course ‘Evaluation and Value for Money’ and in workshops through evaluation societies internationally. The Value for Investment system is an exemplary integrated evaluation system that has been implemented successfully in a variety of contexts and is a worthy recipient of the AES Evaluation Systems Award.
Public Sector Evaluation Award
Te Ihuwaka | Education Evaluation Centre, Education Review Office – for Evaluation of learning in residential care
The Education Review Office is the public service department of New Zealand charged with reviewing and publicly reporting on the quality of education and care of students in all New Zealand schools and early childhood services. This evaluation study looked at the quality of education for students in Oranga Tamariki Care and Protection and Youth Justice residences and how it can be improved. Children and young people who are placed in Oranga Tamariki residential care are among the most at risk of poor outcomes later in life. This was the first nationwide, system-level evaluation undertaken by the ERO. The study developed an excellent theory of change based on a published systems framework and used this effectively to develop the evaluation methodology, which was robust and of a very high quality. The study developed a unique set of indicators of education performance for the residences, using a 4-point rubric to define the quality of practice, which will be useful in future evaluation studies and service delivery. The rubric was developed in consultation with the education providers to ensure their support for its use. Given the target population of school-aged children, the evaluation team developed processes to give voice to these students, including an appropriate reporting process. The evaluation team recognised the need to strongly engage stakeholders in the design and implementation of the study. Importantly, the team communicated regularly with stakeholders on the study findings and developed a range of hard-copy and online publications, including the substantive report, a summary document, and four guides for students; parents and whanau (community members); leaders and teachers; and social workers. The evaluation team used a range of innovative approaches to enhance utilisation of the study findings and products.
As a result, the study has already had a number of impacts, both in developing improved evaluation practice and in informing decisions about the future of the residential care program.
Victorian Department of Health (Centre for Evaluation and Research Evidence), Review of the North Richmond Medically Supervised Injecting Room
This study was incorporated in a review of the first medically supervised injecting room set up in Melbourne. It took place within a challenging and politically charged environment and over several years. The study addressed a number of these challenges by allocating significant time and resources to consulting with the wide range of stakeholders involved. In addition, there were significant ethical issues, which the study dealt with by applying the AES Code of Ethics and seeking approval through an NHMRC ethics committee. A wide range of data collection methods was used to gather evidence to address both formative and summative questions. Members of the review team highly praised the evaluation team's work, stating that the team developed and tested a multi-faceted framework for the review, drew on local and international evidence, and conducted a complex project that provided the review panel with clear evidence-based findings on which to base recommendations. In addition, the Director of the Injecting Room Project praised the team’s work, stating the team helped build, “… a credible foundation for government to make decisions on the next phase of this life-saving trial. Importantly and immediately, the Victorian government accepted recommendations to continue the trial and expand it to include a second supervised injecting centre. The work of the Centre has helped inform other changes (that) have either already been made to improve the trial, such as the development of the approach to local community engagement in North Richmond, or are being considered (e.g. in future legislation)”.
DFAT Office of Development Effectiveness – for the Evaluation of the Australian NGO Cooperation Program
This evaluation displays a highly professional and rigorous approach to public sector evaluation planning, and subsequent practice in data collection, analysis and synthesis, as well as determination of findings and recommendations. It is notable in the participatory manner in which it included industry stakeholders in the major stages of conduct of the evaluation and built acceptance and ownership in both its process and product, including useful recommendations. Importantly, with respect to the Award's assessment criteria, these qualities gave considerable impetus to new policy and program initiatives in line with the recommendations made.
Research and Evaluation Unit, New Zealand Inland Revenue (IRD) – for the Kiwisaver Evaluation
KiwiSaver is a voluntary, work-based New Zealand savings scheme designed to help people prepare for their retirement. The KiwiSaver Evaluation Programme was a longitudinal, multi-agency, multi-disciplinary evaluation led by the IRD. It included monitoring usage, consideration of the effects on the market through a panel of providers, as well as research into the attitudes, awareness and behaviours of members and non-members. In addition, a value-for-money assessment of KiwiSaver was completed. The diverse studies enabled a holistic narrative of the program to be built.
The collaborative evaluation included a memorandum of understanding to establish data sharing arrangements between the participating government agencies, enabling the dataset to be used to inform individual evaluation and research projects as well as the programme overall. Better practice governance was also demonstrated through the composition and level of involvement of the Steering Group and the use of peer review. The evaluation was designed to allow for program changes over time, and the design included innovative approaches using existing data sources, for example probabilistic linking. The evaluation has been acknowledged both in New Zealand and internationally for its comprehensive assessment of a retirement savings scheme. For example, Collard and Moore, in their review of international pension reform for the UK's Department for Work and Pensions, heavily referenced the evaluation reports, stating that "pension reform seems only to have been evaluated in New Zealand, where the Inland Revenue has established a multi-strand programme of evaluation to assess the impact of the KiwiSaver initiative". Inland Revenue has been invited to participate in international delegations as a result of the evaluative work, and has been approached by the London School of Economics and The Open University to participate in a joint research study. The evaluation has demonstrably contributed to the evidence base in this policy area and is being used within New Zealand and internationally.
The Office of Development Effectiveness (ODE), Department of Foreign Affairs & Trade and ARTD Consultants – for Evaluation of Australian Volunteers for International Development
This award recognises evaluation work conducted within the Australasian public sector that has been used to effect real and measurable change in policies or programs. This was a well-designed, robust evaluation executed in close collaboration with ODE at all stages. Adherence to professional standards was demonstrated and supported by peer reviewers. The complexity of data collection across different language groups was dealt with in a way that involved all the stakeholders. ODE has been provided with a report that clearly recommends ways in which the program may be improved, with the evidence base to support the recommendations. The release of the full report by ODE has also created the opportunity for further discussion by others interested in the field, which will add to the way in which the information is used and to the changes to the program and its implementation by DFAT, already underway.
David Matthews, Alison Hickman, Rebecca Karlson and Paul Bennett and Lt Colonel David Garside (commissioner) – for Defence Campaign Assessment
(known as Award for Excellence in Evaluation in the Australian and New Zealand Public Sectors)
This evaluation project was well constructed both to introduce and transfer M&E capability to DSTO staff and to achieve ongoing use of the evaluation results through the reporting systems developed and integrated to support and enhance management practices. The Framework developed is now recognised by NATO, the Australian Civil-Military Centre and the UK Stabilisation Unit. It demonstrated the application of program logic to Defence plans, country-specific metrics, an information portal and heat-map visualisations, all of which contributed to the development of evaluation methods and practice as well as building evaluation capacity within the organisation. There was evidence of trust and rapport built up between the evaluators and stakeholders, evidenced also by the agreement to produce two reports, one from the stakeholders and another from the evaluators.
EJA Publication Award
Central Lands Council for 'Checking up to keep on track: An Aboriginal-led approach to monitoring well-being'
Authors: Linda Kelly, Mary Whiteside, Hayley Barich and Komla Tsey
The paper addresses an important issue—the need to develop monitoring systems that privilege Aboriginal-led visions of community development in northern Australia. The authors were careful to ensure that their findings were framed using an appropriate theoretical framework for wellbeing. The findings provide valuable lessons for others working with Aboriginal communities where externally imagined assumptions about what is good for communities end up being imposed on them, rather than being developed from the ground up. The paper is well written, topical and provides a valuable point of reference for evaluators to draw from.
Ruth McCausland for "‘I’m sorry but I can’t take a photo of someone’s capacity being built’: Reflections on evaluation of Indigenous policy and programmes”
The paper presents clear, robust research on evaluation, with evidence presented and a number of 'tools' (e.g. a rubric with a worked example) that practitioners could use in their work. Limitations and areas for further work are also articulated, contributing to the evaluation evidence base. The article makes an important contribution and is relevant to the key readership of the journal. The paper is very contemporary and adds value to the evidence base in an area where evidence is sadly lacking.
Emerging New Talent Award
Allison Clarke, CPE University of Melbourne
During her evaluation career to date, Allison has successfully used skills and abilities from her previous experience in other sectors to conduct high quality evaluations, strengthen the evaluation field and support other emerging evaluators with whom she works. Allison is clearly an important part of her organisation, which itself is an integral part of the evaluation landscape in Australia. The diversity of Allison's experience presented in the nomination shows that she is able to successfully apply good evaluation practice to several sectors and contexts. The very positive feedback of stakeholders who have worked with Allison is a testament to her professional standards and high quality of evaluation work. This diversity is impressive for a career of only a few years. In addition, the use of evaluation outputs which Allison has helped to produce demonstrates the high quality and relevance of the work. Delivering such high quality and varied results in addition to other organisational and evaluation capacity building activities is an excellent achievement.
The organisational and capacity strengthening activities that Allison has been involved with, particularly in response to the COVID-19 pandemic, are important and show Allison’s leadership skills and ability to use these skills to strengthen the sector. In particular, Allison’s ability to build and maintain professional relationships with both internal and external stakeholders is impressive. A particular strength is Allison’s work on Aboriginal cultural safety including contributions to her Centre’s uptake of the AES Cultural Safety Framework and evaluation activities which have produced positive results for stakeholders. Allison has demonstrated a significant contribution to strengthening the sector and great potential for doing more of this in the future.
Amanda Mottershead, Toolkit to Engage Young and Emerging Evaluators (YEEs) in Voluntary Organizations for Professional Evaluation (VOPEs) Governance and Leadership
Amanda Mottershead has been selected for the Award based on a substantial piece of work she undertook in a difficult environment, with good references to how issues such as professionalism and ethics were managed. The process of working internationally across a large number of voluntary associations is extremely difficult and Amanda managed this with the skills of a much more experienced evaluator. The final product, a Toolkit to Engage Young and Emerging Evaluators in VOPEs, is comprehensive, engaging, practical and well designed. It is a resource that will be used across the world to encourage young evaluators to become more engaged in the profession. This is a goal we can all support.
Alexandra (Lex) Ellinson
The judges were impressed by the high level of professional competence achieved by Lex, which has been developed within four years of evaluation experience. She holds a Senior Consultant role at ARTD and is an active contributor to the AES as a member and secretary of the Advocacy and Alliances Committee, as well as a member of the Realist SIG.
In the projects Lex has worked on or led, she has demonstrated a breadth of knowledge and skill essential for a high quality evaluator. Her evaluation reports demonstrate ability to engage with different methodologies including quasi-experimental designs, cost analysis, qualitative methods and monitoring methodologies. Lex has colleagues who seek her out for discussions about ethics, particularly in the areas of cultural sensitivity and people experiencing trauma. The nomination provides evidence of exemplary stakeholder engagement, particularly in areas involving youth and very positive working relationships with Aboriginal community elders.
Clancy is a Postdoctoral Research Fellow at the Telethon Kids Institute and also leads I.D.E.A. Global Consulting. The judges were struck by the breadth of studies Clancy has worked on since completing her PhD in 2012. She is currently leading an RCT study on rheumatic heart disease in the Northern Territory and has previously worked on projects in the Pacific and Asia. A key feature of her current work is the prominent incorporation of evaluation, qualitative methods and a theory of change in an RCT study. Clancy has demonstrated her commitment to sharing her knowledge and learning through active participation in the AES 2015 conference (and has had two abstracts accepted for the 2016 AES Conference).
Ruth commenced at the Centre for Program Evaluation, The University of Melbourne, in 2012. During her time there she has overseen multiple complex evaluations in her role as senior project manager, contributed to advances in theory through her research, and presented and published multiple papers.
In the nomination her colleagues have called attention to her talent, her zeal, her support and her commitment to evaluation. Her clients praise her excellent communication style, willingness to learn and attention to quality which have helped to deliver sustainable improvements in their own evaluation capability. Ruth has commenced study for a PhD with a topic that aims to contribute to the field of evaluation by investigating appropriate ways to measure social change.
Delyth's achievements as a new talent are recognised through her client work and her contribution to the body of evaluation knowledge through journal articles and conferences. The Award judges noted her creativity and passion for evaluation, for example through problem solving that connects with end users and service beneficiaries.
James Curtis (2010)
Jessica Kenway (2009)
Bradley Shrimpton (2007)
Award for Enhancing the Social Good
Centre for Health Service Development, Australian Health Services Research Institute, University of Wollongong – for Evaluation of the Pathways to Community Living Initiative
The evaluation makes a strong contribution to the social good by building up knowledge and support for the deinstitutionalisation process for those with severe and persistent mental illness (SPMI). The evaluation informed and supported the development of new service models for appropriate care in the community. It is noted that community-based settings aid recovery of those with SPMI, and historically limited support for this context has reinforced inequalities experienced by those with long-term mental health challenges. The overall quality of the evaluation was excellent. The team adopted a useful formative approach with a strong commitment to add value and contribute to the ongoing refinement of the program. The evaluation team's focus on the significant role of clinicians and the need to build support and reflective practice within rehabilitation psychiatry is impressive. Similarly, theoretical and methodological attention to the drivers of transformation in a complex health system was important to identify and maintain focus on areas critical to change. Attention to including consumer and carer perspectives was pivotal in building a valid, credible, and useful evaluation.
ARTD and the Kinchela Boys Home Aboriginal Corporation (KBHAC) – KBHAC Practice Framework “The More I Talk, The Stronger I Get”
ARTD and KBHAC have developed a practice framework that brings together best practice trauma-informed approaches, is strongly evidence based and harnesses the voice of survivors to shape the work. Importantly, it articulates what survivors, descendants and program participants can expect from KBHAC. The framework is based on preliminary work that ARTD did with KBHAC in developing an outcomes framework to map KBHAC’s programs and services and identify intended outcomes and success measures of these programs and services for monitoring and evaluation purposes.
The work excelled against all of the criteria of excellence related to the award. It employed developmental and empowerment evaluation approaches to a high standard. ARTD’s consultation and project design was highly collaborative, informed by Aboriginal-led co-design practice and the recognition of KBHAC’s knowledge and expertise and the importance of ground-up, creative and iterative planning and design of project priorities. The KBHAC Practice Framework has been received with excitement and positivity by the KBHAC community. Staff have reported finding the tool useful in guiding their work and working with clients, showing it as a way of communicating KBHAC’s way of working. KBHAC staff have also found the workshops and processes for developing the frameworks to be valuable and reported feeling more confident and inspired in their work.
There is a lot of literature on the effects of trauma and intergenerational trauma, and many stories, reports and education materials on the Stolen Generations. There is a gap, however, in documenting approaches that make a difference to the healing journey of those directly and indirectly affected by this multigenerational trauma. This is the gap that the practice framework fills.
The award is provided given the excellent use of evaluative thinking to develop a framework that is useful to the practice of a social justice focused organisation with great potential for impact across the sector and at community level. This is the first Practice Framework designed with and for Stolen Generations survivors, descendants and families. It aims to help restore and reconstruct the identity, dignity and integrity of Aboriginal men who were forcibly removed from their families and placed in the Kinchela Boys Home (KBH). The framework recognises that Aboriginal people and survivors are best placed to develop the healing frameworks to address the effects of the multigenerational trauma that adversely impacts on their lives. KBHAC’s work guided by the framework is at the forefront of healing work being done in Australia. Their work is critical to reckoning with Australia’s recent history of genocide, pain and injustice suffered by Aboriginal and Torres Strait Islander families and communities.
Indigenous Evaluation Award
Curijo Research Monitoring and Evaluation Team
This was an innovative submission for the AES Indigenous Evaluation Award. The submission was not for the conduct of a particular evaluation, nor for the commonly recognised types of evaluative thinking process documentation such as a monitoring and evaluation plan. This Award recognises the company’s holistic approach to developing its Research, Monitoring and Evaluation Framework and how it has shaped the organisational settings. We have awarded the nomination as an example of an excellent evaluation product that has been prepared and used by a privately owned Indigenous Australian monitoring and evaluation company. The company is courageously embarking on influencing the quality and competitiveness of the Indigenous evaluation market through a culturally specific research, monitoring and evaluation framework. It employs a model of economic independence to build upon the growing national commitment to improve the quality of Indigenous-related programs and services through better evaluation. Curijo’s commitment has translated into measured organisational staffing structures, staff training in philosophical and practical Indigenous methodological approaches, as well as the provision of community support beyond the requirements of the commercial contract. The internal application of the model within the business is still developing, but it is excellent to see an Indigenous research and evaluation company that has developed a holistic, cross-cultural approach to centring Indigenous perspectives, priorities and knowledges. This company is making a significant contribution to promoting the accountability and learning functions of Indigenous evaluation, as a means to realising beneficial impacts and self-determination for the First Nations peoples of Australia.
Aboriginal Evidence Building Partnership Pilot
ARTD Consultants Simon Jordan, Sue Leahy, Ruby Leahy Gatfield, Kieran Sobels, Christine Eastman, Stephanie Quail, Imogen Williams, Holly Kovac
Aboriginal Evidence Building Team, Their Futures Matter: Nattlie Smith, Sharon Macleod
Coffs Harbour Aboriginal Community Centre Inc (Abcare): Garry Mathews, Greg Bennett, Trent Matthews
Tirkandi Inaburra Cultural and Development Centre: Matt Watts, Damien Thorne; Beverley Tucker, Department of Communities and Justice; Troy Mott, Department of Education
This evaluation was conducted by a team committed to enabling Indigenous self-determination and empowerment from the outset. This was due in large part to the Indigenous leadership within the team combined with the skill and experience of the other team members in this context.
The evaluation demonstrated how collaborative and participatory approaches, together with a strong partnership model, enabled and paid respect to the roles and cultural imperatives of two organisational partners involved, as well as to the intended objectives of the NSW government agency involved. Overall, the judges concluded that this Award winner has made a strategic contribution to fostering a culture of evidence building and evaluation capacity for the ongoing monitoring and review of the Aboriginal services sector in NSW. The judges are confident that the work will lead to increased benefits for the two organisations’ service recipients, who are Aboriginal families whose children are at risk of removal, and Aboriginal boys aged 12–15 years who are seeking to develop their resilience and life skills.
Gill Potaka-Osborne and Lynley Cvitanovic of Whakauae Research Services; Maaki Tuatini and Roberta Williams of Te Oranganui Trust; and Raetihi Pah – for the Te Puawai O Te Ahi Kaa Evaluation
This New Zealand evaluation was conducted by a team with strong representation and expertise in Indigenous evaluation, as well as technical aspects related to a health program supported by the New Zealand government. The evaluation used a collaborative capacity building approach and participatory methodologies to fully involve Indigenous communities in the design and conduct of the evaluation.
The evaluation captures the true essence of what Indigenous evaluation is about and what it should reflect – done by, with and for Indigenous people. It gives Indigenous people a powerful voice and platform to determine their future and destiny by encapsulating the community beliefs and traditions. These are highly valued and prized as an asset.
What the judges found most exciting about the evaluation is that it utilised Indigenous perspectives, languages and world views in each aspect of the evaluation, from concept design to stakeholder engagement. The evaluation thereby provides a model for practitioners in Australasia to do evaluations with Indigenous people by drawing on Indigenous people’s knowledge and insights.
Hokianga Hapū Community Water Supplies; Marara Rogers-Koroheke & Hone Taimona (Hokianga Health Enterprise Trust), Andrea Clark (Social Foci Limited), Jeff Foote & Maria Hepi (Institute of Environmental Science and Research Limited); John Wigglesworth (Hokianga Health Enterprise Trust) – for Tukua te Wai Kia Rere: Evaluation of the Hokianga Drinking Water Programme
Nau mai e te wai whakaora i te punawai Ariki
Matapuna mai i konā i te pupuke i te kukune Pīataata mai i konā i te tapu ahu mai i te rangi Hirere mai i konā i te mana ahu mai i te whenua Wai tapu nō ngā tūpuna, wai whakaora nō ngā matua Ka puta i te whei ao, ka puta ki te ao mārama
The judges found Tukua te Wai Ki Rere to be outstanding in a number of respects, representing all that Indigenous people value highly in terms of evaluation practice. As was noted by one member of the assessing committee, 'the level of community participation involved in this project and evaluation was significant and the accountability and ownership back to the community was high'.
The judges commended the Tukua te Wai Ki Rere evaluators for their partnership with local hapū members, as this partnership approach appeared to demonstrate real benefits, both for the community-based evaluators and for the community more widely. There was significant involvement of Indigenous people in the evaluation process and capacity building was evident. The process used was open and transparent, and gave ownership back to these communities. One of the judges noted that 'there was clear evidence throughout that Indigenous people were leading the process, working in partnership [with the evaluators] to find solutions for better outcomes'.
This evaluation and the process described in the Tukua te Wai Ki Rere report will contribute to Indigenous evaluation knowledge: what constitutes best practice in community development, and how to involve and engage with Indigenous people using cultural structures that already exist in communities to deliver benefits.
By using this approach and existing culturally appropriate structures, other community development initiatives can be implemented to deliver broader benefits back to these communities. The level of capacity building in evaluation was high and the process transparent to all stakeholders, including the communities involved. The evaluation and process described would be a good case study from which to learn about what constitutes a culturally safe evaluation.
Lauren Siegman, String Theory – for the Straight Talk evaluation for Oxfam Australia
This evaluation is an example of exemplary cross-cultural evaluation practice. The design was strengths-based, practical and participatory. Team members, including participants, were included in the design, analysis and sense making. Of particular note, team members said they were 'listened to' and came away with a positive perspective about evaluation, and they are now more receptive to being involved in evaluation processes in the future. Oxfam reports it has begun a program redesign, and the evaluation findings will contribute directly to the reshaping of the program.
Cultural and Indigenous Research Centre Australia (CIRCA) – for the Evaluation of the NSW Aboriginal Child and Family Centres
This was a three-year evaluation of nine Aboriginal Child and Family Centres (ACFCs) in NSW. The evaluation incorporated aspects of responsive, democratic and participatory approaches. The nomination included evidence of a high level of meaningful involvement by Aboriginal stakeholders at all stages of the project, with stakeholders reporting they were more confident about using monitoring and evaluation in their work practice. The assessors were impressed by how the evaluation team addressed challenges experienced by the centres and within the Aboriginal communities in establishing the centres, such as difficult relationships between stakeholders. In these circumstances the evaluation team showed sensitivity and professionalism and provided opportunities for stakeholders to share, reflect and debrief their experiences.
Menzies School of Health Research – for the Sentinel Sites Evaluation of the Indigenous Chronic Disease Package
This was a large-scale formative evaluation of the Australian Government's Indigenous Chronic Disease Package, a very significant and complicated national initiative to improve the health of Aboriginal people as part of the government's Closing the Gap commitment to address the extreme disadvantage of Aboriginal people in life expectancy, health, education and employment. The evaluation had to combine rigorous measurement of service delivery outcomes with appropriate and respectful evaluation practice within very different Indigenous communities around Australia. The sentinel sites design addressed this through a place-based approach using 24 sentinel sites across Australia with varying degrees of intensity of data collection and analysis, including eight in-depth case study sites. The evaluation took a cyclical approach that involved local and national stakeholders in cycles of reflection and feedback. The design drew on utilisation-focused and realist evaluation theory. The available evidence indicated that the evaluation had a positive influence on improving the package, particularly in addressing variations in implementation between sites, recognising that addressing this at a policy level was critical to improving outcomes for the most vulnerable. Indigenous people were engaged in the evaluation at each stage, from design through to dissemination. The evaluation demonstrated effective practice in Indigenous community settings, with elements of flexibility, community control and ownership, and inclusiveness.
Annie Holden, ImpaxSIA Consulting – for Visual Participatory Evaluation (VPE) Cape York Aboriginal Australian Academy (CYAAA)
This nomination was considered to display a high degree of professionalism, being sensitive to and respectful of Indigenous ways of knowing. It demonstrated an inclusive approach which included significant capacity building for an important Indigenous organisation. A modified narrative approach was used that included aspects of the Most Significant Change technique. Digital technology was used as a communication channel, blended with structured dialogue approaches that are inclusive in nature. The evaluation contributed to knowledge about the activities and outcomes of the project. Indigenous and wider community engagement, and the positive outcomes reported, demonstrated successful engagement with what had been a disengaged population.
Department of Finance and Deregulation, Office of Evaluation and Audit (Indigenous Programs) with Anne Markiewicz, Anne Markiewicz and Associates Pty Ltd (2008)
Indigenous Project Team, Ombudsman Victoria (2007)
Evaluation Study or Project Award
Queensland Government’s Office of the Commonwealth Games, Department of Innovation, Tourism Industry Development, and the Commonwealth Games; and the evaluation team of Mark Douglas, Robert Grimshaw, Nicolette Pavlovski, Sean Conway, Kelly Reynolds, Joanne Ryan and Meghan Purcell – for the Evaluation and Monitoring Framework for the Embracing 2018 Legacy Program
The Queensland Government’s Embracing 2018 Legacy Program aims to ensure the Queensland community realises lasting benefits from hosting the 2018 Commonwealth Games. The reach of this event extends beyond the Games host and event cities to provide measurable outcomes for Queensland and Australia. An evaluation and monitoring framework for this program has evolved since 2013 and is designed to guide implementation of the Embracing 2018 Legacy Program and assess its outcomes over a 10-year period.
The assessors were impressed by the evaluation’s sound use of evaluation theory and approaches, including the ability to incorporate emergent findings into a results framework and ensure ongoing connections between projects. There was evidence of strong and sustainable connections being developed; for example, with the evaluation team building on work started in Glasgow with the 2014 Commonwealth Games, and their commitment to developing a framework that can be used by future hosts of the Commonwealth Games. The assessors noted the evaluation team’s commitment to publicly sharing the methods behind their framework.
Alison Wallace, Linda Kurti, Nicki Hutley, Caroline Tomiczek, Alex Batchen (Urbis) and NSW Treasury – for the Evaluation of the Newpin Social Benefit Bond Program
The Evaluation of the Newpin Social Benefit Bond addresses the complex needs of a growing number of children placed in out-of-home care. The evaluation was commissioned by NSW Treasury and conducted by Urbis.
The evaluation covered the process, outcomes and cost-effectiveness of the service. It used a mixed methods design and incorporated extensive consultation with parents, the service provider, staff and program administrators. Outcomes were assessed through analysis of child protection data for the intervention and a control group.
The products of the Evaluation of the Newpin Social Benefit Bond are excellent and would be useful learning tools for evaluation students and practitioners. They are published online by the NSW Office of Social Impact Investment. The project demonstrated the usefulness of high-quality evaluation for improving social policy and delivering good outcomes for some of the most disadvantaged people in our community.
Wei Leng Kwok, WLKConsulting – for Generating Equality and Respect: Evaluation of a world-first model for preventing violence against women in Victoria, Australia. A study in engagement for learning and evaluation capacity building
The Victorian Health Promotion Foundation (VicHealth) selected two community partners to implement a multi-faceted programme of action in three different settings at the same time. Throughout all phases of the program, VicHealth worked alongside them as a third and equal partner. Evaluation was built into the program to run in parallel with the prevention activities.
A theory- and practice-based rationale for a participatory, learning-oriented approach guided the evaluation. Evaluation capacity building was an integral part of the evaluation process. The evaluation belonged to the whole team, who co-authored the full evaluation report.
The evaluation is of high quality and has the potential to make a major contribution to evaluation in its field of practice. The nomination provided strong and credible evidence from both stakeholders and recipients of the relevance and effectiveness of the evaluation. The rationale and application of the approach were clearly described and evidenced. Overall, it is an evaluation that epitomises the values of the AES.
Social Policy Research Centre, University of NSW – for the Keep Them Safe Evaluation
This is a significant example of an outcome evaluation of the child protection system in NSW. It used a wide range of methods to assess multiple dimensions of a complex service system at a state level. These methods included spatial analysis, cost-effectiveness analysis, case study research, analysis of case records, service and client interviews, and meta-evaluation. The use of these methods, and the subsequent incorporation of multiple forms of data, was undertaken in a highly professional manner. The report identified improvements required for the longer-term monitoring of child protection outcomes, including the further development of state-level KPIs. The evaluation reflects good practice and has the potential to inform future evaluation approaches to complex service delivery systems.
The Centre for Program Evaluation at the University of Melbourne, in partnership with Shelby Consulting and Murdoch University – for The Evaluation of the [Western Australian] Independent Public Schools Initiative
This study was a large and complex piece of policy evaluation. It was carefully designed, with a sound conceptual underpinning, and well implemented. It has produced valuable information of high credibility that has been widely disseminated in both the evaluation and education sectors, making a significant contribution to the practice and (potentially) the use of evaluation within Australasia.
The evaluation was conducted in a highly professional manner, with a number of cutting edge features. It faced the challenge of being conducted in a highly politicised environment, with community concerns that the project breached standards of equity, making genuine stakeholder involvement difficult. It was a large-scale, multi-site, multi-level evaluation that entailed the collection and collation of a mass of data which needed sophisticated analysis and triangulation to produce sound, transparent evaluative syntheses and policy-relevant findings.
Victorian Bushfire Case Management Service, sponsored by Department of Human Services, Victoria and conducted by a team from Urbis including Claire Grealy, Duncan Rintoul, Jessie Connell, Ginette Anile, Christine Healy, Gail Winkworth and Kristen Murray
The nomination was for an evaluation of a service provided in an emergency. The evaluation exemplified the highest quality of evaluation practice in its use and application of evaluation theory, principles, methods and practice. It was conducted while the service was in operation, requiring high sensitivity, confidentiality and speed. The evaluation design was innovative and well crafted, using rapid-response principles adapted to the needs of the program. Multiple methods and data sources were used to provide triangulation and to test the data. The quality assurance mechanisms were carefully considered and developed to support the evaluation’s objectives. The evaluation reports and literature review have made a significant contribution to the evaluation of crisis case management. The evaluation developed models to improve evaluation practice and scholarship, and added to the evidence base of what works in emergency case management. The recommendations in the three reports were immediately applicable and were operationalised with minimal burden to the large workforce.
Mathea Roorda, Heather Nunns, Ieti Lima, Amton Mwarksurmes, Senorita Laukau, Kateata Binoka, Talonga Pita, Alison Gray, Charlotte Bedford – for Evalue Research and Sankar Ramasamy, Department of Labour, Wellington (2010)
Rosemarie Tweedie, Mary Carey, Kim Stewart and the Baptist Community Services (NSW and ACT) (2009)
Simon Smith, Julie McGeary, the Victorian Department of Primary Industries; Martin Andrew, Lili Pechey, Don Burnside, Todd Richie from URS Australia (2008)
The Consortium for the Strategy 2000–2004 Evaluation at CIRCLE, RMIT (Team included Patricia Rogers, Sue Funnell, John Scougall, Keryn Hassall, Peter Tyler, Gerald Elsworth, Sue Kimberley, and Kaye Stevens) (2007)
Virginia Lum Mow, VL Educational Research and Development; and Retraining Unit, NSW Department of Education and Training (2006)
Wendy Searle, Tania Slater, Trish Knaggs, Janet November and Christopher Clark, Ministry of Justice, New Zealand (2005)
Review and Evaluation Unit of the Queensland Police Service (Team included Robert Lake, Angela Richardson, Diana Beere, Ruth Beach and Joe Nucifora) (2003)
ARTD Consultants (Chris Milne, Marie Delaney, Klas Johansson and Marita Merlene) (2002)
Paul Aylward, South Australian Community Health Research Unit (2002)
Rick Cummings, Murdoch University and Kath Stephenson, Estill and Associates (2001)
Robert Curnow, Community Changes (2000)
Pamela Williams, KPMG (1999)
Team from NZ Ministry of Justice (Team included Alison Chetwin, Steve Dunstan, Miriama Scott, Jennifer Leigh, Mark McCallum) (1998)
Peter Bycroft and Ellen Vasilauskas, Corporate Diagnostics Pty Ltd (1998)
Team from Australian Bureau of Transport and Communications Economics (Team included Joe Motha, Bogey Musldlak, Seu Cheng, Catarina Williams) (1997)
Technical Quality Evaluation Team, New Zealand Department of Inland Revenue (Team included Prue Oxley, Heather Turner, Valmai Copeland, Fiona Hoult, Colin Usherwood and Robyn Pullar) (1996)
Evaluating Taxpayer Audit Program of the New Zealand Department of Inland Revenue (Team lead by Alan Pinder) (1994)
Evaluation Policy and Systems Award
The Victorian Department of Health and Human Services (DHHS) Centre for Evaluation and Research
The DHHS Centre for Evaluation and Research led the development and implementation of evaluation policy and systems in this large Victorian state government department. It played a pivotal role in transforming the evaluation culture of the department through policy guidance, advice and support, design and delivery, training and knowledge translation.
The Centre for Evaluation and Research demonstrated excellence in applying theory-based principles and approaches to developing and implementing evaluation policy and systems. The judges were impressed with the high-quality evaluation resources and pragmatic support that departmental staff have access to, including tools and tailored advice. The nomination included good evidence of stakeholder satisfaction with the team’s work. The judges were also impressed with the significant contribution of the team to transferring knowledge across the department and broader community and public sectors. Overall, the judges considered this Award winner as an exemplary case of evaluation policy and systems contributing to service delivery and improving community wellbeing, including among those most vulnerable to the risks of poorer outcomes.
Jessica Kenway, Bluebird Consultants – for the Australia Africa Community Engagement Scheme: M&E Framework and M&E Systems Review
The M&E system was developed by Jessica Kenway over five years from 2011 as part of a significant international development program, AACES, a partnership funded by the Australian Department of Foreign Affairs and Trade (DFAT) and implemented by 10 Australian NGOs and their in-Africa partner organisations across 11 countries and three sectors. Jessica used a consultative process at two levels to design and develop an M&E system for the overall program, and supported the participating NGOs to develop M&E frameworks for their individual projects. She reviewed the project-level M&E systems one year into implementation to clarify outcome measures and confirm collection of baseline data, which was a valuable innovation. The system was developed through a partnership approach consistent with the program’s principles, and built a culture of peer review and M&E capacity in the 10 NGOs. The M&E system was well resourced and addressed learning as well as accountability needs.
The nomination substantiated the claims for each of the criteria for excellence. An external review of AACES by the funder DFAT in 2016 provided systematic evidence of effective implementation and impacts of the M&E systems. An independent review of M&E systems of Australian aid projects in Africa by Coffey International rated the AACES system as the highest quality. Testimonials from key stakeholders in DFAT and the 10 NGOs collectively demonstrated stakeholder satisfaction, and fully supported the claims on the use and impact of the system.
Nan Wehipeihana, Kate McKegg and Kataraina Pipi of Research Evaluation Consultancy Limited (a member of the Kinnect Group), evaluators, and Veronica Thompson (Sport New Zealand) commissioner – for He Oranga Poutama: What we have learned? A report on the developmental evaluation of He Oranga Poutama
This project responded to a significant shift in government policy, from increasing the numbers of individual Māori in sport to introducing into sport the collective cultural contribution as Māori. The developmental/deep-culture approach is explained well, and characterises this work as having collaboratively created systemic change, 'making the road by walking it'. It offers a particularly strong exemplar of developmental evaluation used to build a new knowledge system, while also paying attention to the more usual programme-level monitoring and data collection tools aligned to the framework in the after-the-event audit space.
Monitoring and Evaluation Framework, Mongolia Australia Scholarship Program developed by Ian Patrick and Associates (Ian Patrick), AusAID Scholarships Section (Amy Haddad) and Coffey International Development
The judges noted that the framework was comprehensive without being too complex to implement and in the words of a referee 'provided a strong basis for ... [the client] to collect data and to verify performance for the entire program'.
Areas that stand out as exemplary and examples of good practice in monitoring and evaluation frameworks include:
- an inclusive and collaborative approach
- a high level of professionalism
- the application of leading-edge practices – this framework used the OECD Development Assistance Committee's criteria and reflected contemporary thinking in development evaluation
- clarity around the ethical principles underlying the approach and a demonstration of how they were applied.
ARTD Consultants, Klas Johansson, Janet Kidson and Joanne Battersby from Ageing, Disability and Home Care, NSW Department of Human Services (2010)
The Centre for Health Policy, School of Population Health, University of Melbourne and the Department of Health and Ageing (2009)
Western Sydney Region, NSW Department of Education and Training (2008)
Karen Goltz, Health Promotion/Public Health, Department of Human Services; Yoland Wadsworth, Ani Wierenga and Gai Wilson c/ Youth Research Centre, Melbourne University; and The Victorian Department of Human Services, North and West Metropolitan Region (2007)
Jessica Dart and Rick Davies (2006)
Bruce Davidson, Noosa council and Ellen Vasiliauskas, d-sipher (2005)
Anna L Johnson and the Strategic Review Evaluation and Research Branch, Queensland Department of Communities (2004)
Julie Rolfe, Victorian Premier’s Drug Prevention Council and John Pilla, BearingPoint Australia (2003)
Zita Unger, Evaluation Solutions (2001)
Jennifer Leigh, Ministry of Social Policy, Wellington (2000)
Australian Taxation Office/ Corporate Diagnostics Pty Ltd and Evaluation (1999)
Auditing Services Ltd (formerly the Evaluation Department, Queen Mary Hospital, Wellington) (1999)
Team from Olympics Roads and Transport Authority for the Royal Easter Show Transport Evaluation (1998)
DASFLEET, Australian Department of Administrative Services and Corporate Diagnostics Pty Ltd (1996)
Terrence Measham, Director, Powerhouse Museum & Carol Scott, Co-ordinator of the Powerhouse Museum's Evaluation & Visitors Research Unit (1995)
Evaluation Management Team, NSW Public Works in conjunction with staff from Canberra University (1994)
Community Development Award
Palmerston/Tiwi Island Communities for Children (C4C) Participatory Evaluation, conducted by the Communities for Children (C4C) Local Committee, Pandanus Evaluation & Planning Services (Nea Harrison) and the Australian Red Cross (Rachel Dunne)
The Award judges noted that the work was well-thought through, extremely thorough and comprehensive. A local committee, the elders and young women were all actively involved in the project in a strong collaboration with the facilitator of the evaluation. The nomination indicated an understanding of how an evaluation design contributes to community development goals and how the use of community development processes enables the accomplishment of the evaluation.
Areas that stand out as exemplary and examples of good evaluative practice in this evaluation include:
- culturally appropriate evaluation design and methodology
- strong community engagement at all stages of the evaluation from inception to conclusion, including closure and reporting back to the community
- a developmental approach working sensitively and in sympathy with the local community
- outcomes resulting in sustainable benefits for local participants (e.g. capacity building)
- use of advanced techniques – logic framework, quality rubrics, the community report.
The End of Project Evaluation: Bobonaro Early Childhood Care and Development Project, sponsored by World Vision and conducted by John Donnelly
This evaluation made a strong contribution to capacity building in Timor Leste; it is an exemplar of a community development evaluation that other evaluators could use to inform and improve their own practice.
The evaluator worked closely with the local communities using methods that were respectful, collaborative and inclusive. The methods were derived from leading edge Appreciative Inquiry approaches and Participatory Rural Appraisal techniques. Multiple data sources were used and brought together in an excellent example of triangulation. The report was clear and concise. The limitations of the evaluation were clearly articulated as caveats early in the report.
The evaluation demonstrated a transparent process, with all of the conclusions drawn firmly from the available evidence.
John Donnelly, Donnelly Consultants (2010)
Brad Shrimpton, Centre for Program Evaluation and Mandy McKenzie, Domestic Violence Resource Centre, Melbourne (2005)
Evaluation Publication Award (Caulley Tulloch Award)
Samantha Abbato – for 'The case for evaluating process and worth: evaluation of a programme for carers and people with dementia'
This is a book chapter published in the most recent volume of the prestigious series Advances in Program Evaluation, edited by Trisha Greenhalgh and Saville Kushner. Its thesis is the utility of the case study approach as a major component of a mixed-methods evaluation. What makes the chapter worthy of the award is the author's careful analysis and demonstration of the role of case studies in mixed-methods evaluations, including the way she contrasts them with the quantitative methods more frequently used in the sector covered by the project (community health/dementia). This chapter will be particularly valuable for people new to evaluation, or for those coming from a quantitative background who wish to gain an understanding of the role of case study research in evaluation. The nomination explicitly and convincingly addresses the specified filters and the seven criteria of excellence.
Coffey International Development, Jennifer Rush and AusAID (2009)
Glenys Jones and her team, Parks and Wildlife Service, Tasmania (2005)
Bron McDonald, Patricia Rogers and Bruce Kefford (2003)
Scott Bayley (2001)
Patricia Rogers and Gary Hough (1996)
Yoland Wadsworth, Merinda Epstein and Maggie McGuiness (1995).
Outstanding Contribution to Evaluation Award
Jess is a recognised leader in evaluation with over 25 years of experience in the collaborative design and evaluation of programs that seek to bring about a more equitable and just society. The judges were impressed by this nominee’s sustained application of authentic inclusivity, and her high level of ethical standards so clearly evidenced through her practice. The judges noted this nominee’s skill in combining deep evaluation knowledge and theoretical understanding with straightforward communication. She has undertaken more than 30 external evaluations and overseen over 120 evaluations in countries around the world. The judges noted the high level of professionalism and stakeholder satisfaction evidenced in this nominee’s work. As an ‘evaluation entrepreneur’, the judges were impressed by this nominee’s capacity to constantly scan the horizon for where evaluation is headed, and to create fresh approaches and techniques. The judges were impressed by her pioneering work on no fewer than five innovative approaches.
The judges were impressed by this nominee’s other contributions to evaluation knowledge. She is a contributor to evaluation textbooks, an author of refereed journal articles including publications in the American Journal of Evaluation and New Directions for Evaluation, and a prolific trainer who has reached more than 1,000 participants, including many AES members.
Jess has been an AES member since 1997 and a Board member since 2014 in the role of treasurer; and was involved in the aes18 International Evaluation Conference as convenor. The judges acknowledged this nominee’s significant contribution to the AES.
Anne has made a sustained contribution to evaluation and the AES over a considerable period and has been instrumental in the formation of the Evaluation Society in PNG. She has contributed to theoretical debate around evaluation, to training and the delivery of workshops, and service to the AES as a member of the Victorian Branch committee, as secretary to the AES National Committee and most recently as Vice President of the AES. Anne is committed to social justice issues, which underlie her work, as well as to capacity and capability building. The approach she takes combines theory with practice in an accessible manner ensuring that those from a variety of positions of knowledge are able to incorporate new understandings in their work.
Anthea Rutter and Zita Unger