
Pre-conference workshop program: Sunday 3 September 2017

>>> DOWNLOAD a printable conference workshop program

View the post-conference (Thursday) workshop program here.

8–9am REGISTRATION
9am–12:30pm WORKSHOP PROGRAM
 

Increasing organisational knowledge capital: a contribution from policy evaluation (half day) – BEGINNER

Rick Cummings, John Owen

> Details
> Register

Communicating evaluation effectively (full day) – BEGINNER/INTERMEDIATE

Kathryn Newcomer

> Details
> Register

Developing monitoring and evaluation frameworks (full day) – BEGINNER/INTERMEDIATE

Anne Markiewicz

> Details
> Register

Planning an evaluation that works (full day) – BEGINNER

Nea Harrison, Carol Watson

> Details
> Register

How to be a successful evaluation consultant (full day) – BEGINNER/INTERMEDIATE

Glenn O'Neil

> Details
> Register

Realist methods for evaluation of large scale and complex programs (full day) – ADVANCED

Gill Westhorp

> Details
> Register

Human centred design for evaluators (full day) – BEGINNER TO ADVANCED

Jess Dart, Chris Vanstone

> Details
> Register

12:30–1:30pm LUNCH
1:30–5pm WORKSHOP PROGRAM

Social network analysis for evaluators (half day) – BEGINNER

Dan Healy, Matt Healey

> Details
> Register

Applying trauma theory in evaluation—from theory to practice (half day) – INTERMEDIATE/COMMISSIONERS OF EVALUATION

Richard Weston, Lisa Hillan

> Details
> Register

Kathryn Newcomer continued

Anne Markiewicz continued

Nea Harrison, Carol Watson continued

Glenn O'Neil continued

Gill Westhorp continued

Jess Dart, Chris Vanstone continued

WORKSHOP DETAILS

Increasing organisational knowledge capital: a contribution from policy evaluation

presented by Rick Cummings, Murdoch University, Perth; John Owen, The University of Melbourne, Melbourne

HALF DAY (AM) – BEGINNER

Government stakeholders are increasingly requiring public sector agencies to evaluate their policies and related implementation strategies. This presents new challenges because much of the theory and practice of evaluation is program-based. This workshop explores a framework for building evaluation into the policy development cycle from the start, so as to plan a high-quality evaluation study.

The workshop has been developed to cater for individuals involved in policy development and/or implementation, as well as for those who need to commission or plan an evaluation of a policy or strategy. Participants do not need prior experience in evaluation. By the end of this course participants will be able to:

  • identify the key steps in formulating an evaluation of a public policy or strategy
  • understand elements of an evaluation plan of an actual policy.

In addition, they will have practised the skills required to respond to an evaluation brief for an actual government policy.

The teaching strategy in this workshop builds on the expertise and experience of participants through discussion of key issues in policy evaluation. This developmental process continues through small-group work focused on a particular public policy.

About the presenters
Rick Cummings is a Senior Research Fellow at Murdoch University in Perth. His interests are in the areas of evaluation utilisation and policy evaluation. He has over 30 years’ experience in evaluation, particularly in education, health and crime prevention. Rick currently teaches policy research and evaluation in the Sir Walter Murdoch Graduate School of Public Policy and International Affairs. He is a Fellow of the Australasian Evaluation Society.

John Owen is Principal Fellow at the Centre for Program Evaluation at the University of Melbourne. He is interested in providing useful evaluation-based knowledge to policy and program decision makers. His book Program Evaluation: Forms and Approaches provides an integrated framework for evaluation theory and practice that has received favourable reviews worldwide. In addition to traditional evaluation approaches designed to determine impact, this framework includes approaches that locate evaluative action more towards the front end of policy and program interventions. He is a Fellow of the Australasian Evaluation Society.

> back to overview > register


Social network analysis for evaluators

presented by Dan Healy and Matt Healey, First Person Consulting, Melbourne

HALF DAY (PM) – BEGINNER

Social network analysis (SNA) is a particularly useful approach for capturing the structure and interactions of networks between individuals, groups and organisations. SNA provides a framework for collecting and analysing data and displaying it through network maps. While networks can be incredibly complex and dynamic, well designed network analysis can reveal features of networks that are otherwise difficult to see.

The purpose of this workshop is to introduce SNA to attendees as an evaluative technique, the theory that underpins it and the ways it is implemented. This is an introductory level workshop with no prior knowledge needed. By the end of the workshop, participants will be able to:

  • understand the role and value SNA can play as part of the evaluator’s toolkit
  • critically assess whether SNA is appropriate in different contexts
  • design and implement data collection
  • understand different types of analysis
  • interpret results and network maps
  • explore the ways in which SNA results can be used and communicated to stakeholders.

The workshop will be divided into five topics that will guide the session:

  1. understanding the theory and scoping the SNA
  2. designing and implementing data collection
  3. data setup
  4. data analysis and visualisation
  5. communicating the results.

This workshop is designed to be highly practical, with participants walked through each stage of the SNA process. They will be provided with data sets to use during the session and will learn the skills to conduct an SNA. The presenters will also facilitate engaging (and entertaining) peer-to-peer activities designed to encourage discussion of how SNA is relevant to participants’ own work.
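As a flavour of what this looks like in practice, here is a minimal, illustrative sketch (not workshop material; the data set is hypothetical and the choice of the open-source Python library networkx is an assumption) that builds a small advice-seeking network from an edge list, reports its density, and ranks actors by in-degree centrality:

    # Illustrative sketch only, not workshop material.
    # Hypothetical survey data: each pair records who sought advice from whom.
    import networkx as nx

    edges = [
        ("Alice", "Bob"), ("Alice", "Carol"), ("Bob", "Carol"),
        ("Dana", "Carol"), ("Evan", "Dana"), ("Evan", "Alice"),
    ]

    G = nx.DiGraph()  # directed graph: each edge points from the seeker to the person nominated
    G.add_edges_from(edges)

    # Whole-network measures
    print("Actors:", G.number_of_nodes(), "Ties:", G.number_of_edges())
    print("Density:", round(nx.density(G), 2))

    # In-degree centrality highlights the most frequently nominated actors --
    # one simple way of reading 'who is central' off a network map.
    for person, score in sorted(nx.in_degree_centrality(G).items(),
                                key=lambda kv: -kv[1]):
        print(f"{person}: {score:.2f}")

Even a small example like this surfaces the questions the workshop covers: how the data were collected, which measure of centrality is appropriate, and how the resulting network map can be communicated to stakeholders.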

About the presenters
Dan Healy is a Senior Consultant and co-founder of First Person Consulting. His work includes a range of evaluation and social research projects in the areas of community engagement, public health, sustainability and natural resource management. This has included work with local, state and Commonwealth government, industry and research bodies, and non-government organisations. Professionally, Dan has led teams in all stages of evaluation including: planning, design, qualitative and quantitative data collection and analysis, reporting and presentation of findings. Dan has implemented SNA over the last six years across a variety of content areas, such as public health, emergency management and natural resource management. SNA has been used as a key tool in many of these evaluation and research projects, with network maps and analysis used for planning, engagement, reporting and sharing results. Dan has training in SNA and is continually engaged in extending his knowledge via participating in conferences and special interest groups. Dan has a keen interest in systems-based approaches to complex issues and making the best use of data to produce insights that are useful and presented in an understandable way.

Matt Healey is a Consultant and Co-Founder at First Person Consulting (FPC). He has worked on social research and evaluation projects across a variety of areas including public health and health promotion, sustainability and climate change, natural resource management and waste management and litter. His clients span all levels of government, not-for-profits and private enterprises. Matt has a reputation as an energetic and adaptable consultant and presenter, known for consistently producing and providing products and services that are both engaging and useful. His current interests include (among other things) human centred design and other design processes, innovation and the role of evaluation in supporting these. Matt is Secretary of the Australasian Evaluation Society (AES) Special Interest Group (SIG) on Design and Evaluation, which recognises the links between design and evaluation and the importance and possibilities of these fields complementing each other.

> back to overview > register


Applying trauma theory in evaluation—from theory to practice

presented by Richard Weston and Lisa Hillan, Healing Foundation, Barton (ACT)

HALF DAY (PM) – INTERMEDIATE/COMMISSIONERS OF EVALUATION

The purpose of this workshop is to introduce participants to

  • applying trauma theory in evaluation processes 
  • designing evaluation methodologies that preference Indigenous knowledge.

In Aboriginal and Torres Strait Islander organisations there is an increasing recognition that many individuals, families and communities have extensive histories of trauma that, left unaddressed, will get in the way of achieving health and wellbeing. Since 2009, the Healing Foundation has been working nationally to establish an evidence base for trauma and healing from an Aboriginal and Torres Strait Islander perspective.

Culturally informed modes of evaluation practice and theories of change are important elements of policy design and monitoring and evaluation. This workshop will assist participants to:

  • understand the impact of trauma in Aboriginal and Torres Strait Islander communities
  • describe how trauma and healing informed practice is used to frame and design evaluation strategies including co-design processes
  • interact with the Healing Foundation theory of change and the key domains required for sustaining trauma informed healing.

Participants will learn how to:

  • apply trauma theory in evaluation processes, including safe practice
  • explain the purpose of empowerment in evaluation as a trauma recovery tool for Aboriginal and Torres Strait Islander people
  • outline the benefits of participatory evaluation strategies in elevating Indigenous knowledge systems.

Since 2009, the Healing Foundation has been building a theory of change for healing, drawing on over 30 internal and external evaluations. Raising evaluation standards is a significant undertaking in most settings due to the combined challenge of building both evidence for a theory of change and strong evaluation frameworks that tie in culturally appropriate data collection methods. The Healing Foundation has partnered with over eight external organisations in evaluating and creating knowledge about healing and trauma.

About the presenters
One of the conference keynote speakers, Richard Weston is Chief Executive Officer of the Healing Foundation. Lisa Hillan is Director, Knowledge Creation at the Healing Foundation. Richard and Lisa have been at the forefront of developing the evidence base for healing complex trauma, working with Aboriginal and Torres Strait Islander communities across Australia.

> back to overview > register


Communicating evaluation effectively

presented by Kathryn Newcomer, Professor and Director, The Trachtenberg School of Public Policy and Public Administration, The George Washington University, Washington, DC, USA; President, American Evaluation Association

FULL DAY – BEGINNER/INTERMEDIATE

The use and usefulness of evaluation work in the governmental setting is highly affected by the effectiveness of reporting strategies and tools. Care in crafting both the style and content of findings and recommendations is critical to ensure that stakeholders understand and value the information provided to them. Skill in presenting sufficient information—yet not overwhelming the audience—is essential to raise the likelihood that potential users of the information will be convinced of both the relevance and the credibility of the evidence provided to them. This workshop will provide guidance and practical tips on communicating evaluation findings in governmental settings that are affected by political considerations. Attention will be given to the selection of appropriate reporting strategies/formats for different audiences and to the preparation of: effective executive summaries; clear analytical summaries of quantitative and qualitative data; user-friendly tables and figures; discussion of limitations to measurement validity, generalizability, causal inferences, statistical conclusion validity and data reliability; and useful recommendations. The class will include some group exercises and cases.

Participants will develop skills in:

  • planning during evaluation processes to communicate effectively with stakeholders
  • considering how ‘evidence’ may be transmitted to inform decision-making
  • conveying the methodological integrity of evaluation work
  • formulating feasible and actionable recommendations
  • communicating effectively about evaluation work in summaries, scope and methods, major findings, and quantitative and qualitative data analyses.

About the presenter
Kathryn Newcomer is the Director of the Trachtenberg School of Public Policy and Public Administration at the George Washington University where she teaches graduate level courses on public and non-profit program evaluation, and research design. She is a Fellow of the National Academy of Public Administration, and currently serves on the Comptroller General’s Educators’ Advisory Panel. She served as an elected member of the Board of Directors of the American Evaluation Association (AEA) (2012–2015), and currently serves as AEA president (2017). Kathryn served as President of the National Association of Schools of Public Affairs and Administration (NASPAA) for 2006–2007.
Kathryn has published five books, including The Handbook of Practical Program Evaluation (4th edition 2015) and Transformational Leadership: Leading Change in Public and Nonprofit Agencies (June 2008), a volume of New Directions for Public Program Evaluation, Using Performance Measurement to Improve Public and Nonprofit Programs (1997), and over 60 articles in journals including the Public Administration Review and the American Journal of Evaluation. She routinely conducts program evaluations for U.S. federal government agencies and non-profit organisations, and conducts training on program evaluation in the U.S. and internationally. She has received several awards for teaching, most recently the Duncombe Excellence in Doctoral Education Award from NASPAA (2016). Kathryn earned a B.S. in secondary education and an M.A. in Political Science from the University of Kansas, and her Ph.D. in political science from the University of Iowa.

> back to overview > register


Developing monitoring and evaluation frameworks

presented by Anne Markiewicz, Anne Markiewicz and Associates, Alphington (VIC)

FULL DAY – BEGINNER/INTERMEDIATE

The development and implementation of Monitoring and Evaluation Frameworks at strategy, program and project levels are important processes to adopt in order to provide an indication of results achieved and to resource organisational learning. The Monitoring and Evaluation Framework defines the parameters of routine monitoring and periodic evaluation that will take place over the life of a program or an initiative. The workshop provides participants with useful, step-by-step practical guidance for developing a Monitoring and Evaluation Framework, supported by relevant background and theory. It presents a clear and staged conceptual model, discusses design and implementation issues, and considers barriers or impediments and strategies for addressing them. Participants will learn the format and approach for developing a Monitoring and Evaluation Framework and the range of techniques and skills involved in its design and implementation, and will develop an appreciation of the parameters of the tasks involved and how to approach them.

Participants will learn:

  • the value and purpose of investing in and developing Monitoring and Evaluation Frameworks
  • the participatory approach and processes involved in developing such frameworks
  • the steps and stages involved and the suggested ‘Table of Contents’ for constructing a Monitoring and Evaluation Framework.

The trainer will alternate between use of a PowerPoint presentation and small-group interactive work. The workshop follows a case-study approach and involves participants in the development of a Monitoring and Evaluation Framework for the case study. In this way, the approach to training is participatory and hands-on while still conveying sufficient theory and context.

About the presenter
Anne Markiewicz is the Director of Anne Markiewicz and Associates, a consultancy that specialises in developing Monitoring and Evaluation Frameworks and in delivering training, mentoring and capacity building in monitoring and evaluation. Anne is the co-author of the textbook Developing Monitoring and Evaluation Frameworks (Sage 2016). She has extensive experience in the design and implementation of monitoring and evaluation frameworks for a wide range of different initiatives, building the capacity of organisations to plan for monitoring and evaluation. Anne has been an evaluator for over 20 years, has received a number of awards for excellence in evaluation from the Australasian Evaluation Society, and is a Fellow of the Society. Anne has delivered this training program extensively in Australia, the Pacific, the USA and the UK.

> back to overview > register


Planning an evaluation that works

presented by Nea Harrison, Pandanus Evaluation, Darwin; Carol Watson, Carol D Watson, Planning and Evaluation Services, Hobart

FULL DAY – BEGINNER

This practical one-day workshop provides an introduction to key evaluation concepts, methods and planning processes. It will enable participants to get started in evaluating their own programs and projects in collaboration with key stakeholders and beneficiaries.

By the end of the workshop participants will have:

  • considered the critical issues in planning and conducting an evaluation and reporting results, including the importance of context, values and ethics
  • considered key stakeholders and their information needs
  • developed a program logic model
  • developed evaluation questions
  • identified indicators of success
  • considered appropriate data collection and reporting methods
  • explored ways to ensure evaluation findings are useful and used.

The workshop draws on utilisation-focused, participatory and empowerment evaluation theories and methods. It introduces participants to evaluation concepts and steps participants through the participatory planning processes involved in developing an evaluation plan.

The workshop will be conducted in a participatory manner using adult learning principles. Workshop facilitators will use a range of workshop techniques such as short presentations, including illustrative examples from practice, and large and small group discussions that draw on the knowledge and experiences of the group. Participants will work through a case study in small groups to develop a program logic and plan. Participants will be provided with supporting handouts, planning worksheets and links to online evaluation websites and resources. They will take away the necessary tools to get them started on an evaluation plan for their own program/project.

The workshop is aimed at people who are new to evaluation and want to learn about the process of planning a rigorous and useful program evaluation, as well as those with some evaluation experience who wish to deepen their knowledge of the participatory process of planning a systematic program evaluation. Feedback has indicated that the workshop is engaging, clear and logical, and makes the process of developing and implementing a sound evaluation plan meaningful and achievable.

About the presenters
Nea Harrison conducts high quality evaluation, participatory planning and evaluation capacity development work for a wide range of government, non-government and community based agencies throughout Australia and internationally. She has a background in research, management and program and policy development in the education, health and social service sectors. Nea is Director of Pandanus Evaluation and is the M&E Adviser to the Papua New Guinea Pacific Women Shaping Pacific Development Program. Nea was awarded the 2012 Australasian Evaluation Society (AES) Community Development Evaluation Award for Excellence for the participatory evaluation conducted in partnership with the Australian Red Cross (NT) Communities for Children Program and the Palmerston and Tiwi Islands Communities for Children Local Committees.

Carol Watson is an experienced researcher, service and program planner and evaluator. She has a 30-year history working in the areas of Aboriginal health, alcohol and other drugs, public health and mental health through her various roles in the NT Department of Health, as a senior researcher for the Cooperative Research Centre for Aboriginal Health and as a consultant (Carol D Watson Planning and Evaluation Services). She has also been involved in developing and writing resources for health and community services practitioners covering alcohol issues and solutions, public health topics and workplace development and support.

> back to overview > register


How to be a successful evaluation consultant

presented by Glenn O’Neil, Owl RE, Geneva, Switzerland

FULL DAY – BEGINNER/INTERMEDIATE

Being a successful evaluation consultant is more of an art than a science. But just as you can learn how to paint, sculpt or draw, you can also learn how to be a successful evaluation consultant! What are the qualities of a successful evaluation consultant? How do you kick-start your career? How can you find, maintain and expand your evaluation consultancy? Glenn O’Neil will provide key insights and know-how based on his 15 years as a successful international evaluation consultant. As founder of the evaluation consultancy Owl RE, Glenn has carried out over 100 evaluations in 50 countries for some 40 international organisations, NGOs and governments.

The workshop will provide a solid understanding of strategies, skills and tactics for participants to become successful evaluation consultants. Key learning outcomes are:

  • understand whether or not you are suited to being an evaluation consultant
  • learn the skills needed to market and maintain an evaluation consultancy
  • practise strategies and tactics to build an evaluation consultancy.

This workshop is aimed at people interested in becoming an evaluation consultant and those who have recently started their consultancy career. It may also be of interest to evaluation commissioners/managers who manage evaluation consultants. There are no prerequisites.

The one day workshop will be organised around six modules:

  1. The basic qualities of an evaluation consultant
  2. Setting up as an evaluation consultant
  3. Finding work—‘Push’/‘Pull’ strategies 
  4. Delivering evaluation services effectively
  5. Retaining business—customer management and beyond
  6. Diversifying and growing an evaluation consultancy.

A mix of presentations, case studies and practical exercises will be used. Participants will be provided with the trainer’s own 20-page guide.

About the presenter
Glenn O’Neil is the founder of the evaluation consultancy Owl RE, based in Geneva, Switzerland. Over 15 years, Glenn has built up his boutique consultancy, carrying out over 100 evaluations in over 50 countries for some 40 international organisations, NGOs, governments and foundations, with a specialisation in the communications, advocacy and humanitarian areas. Recent clients have included Doctors without Borders, CERN, the Scout Movement, the Global Environment Facility, UNAIDS, WIPO, Oxfam, EU agencies and the International Red Cross. Glenn is also an experienced teacher, facilitator and trainer in media, communications and evaluation management and methods. Glenn has a PhD in research and evaluation methodology from the London School of Economics and Political Science and an Executive Masters in Communications Management from the University of Lugano, Switzerland. Glenn is Swiss–Australian.

> back to overview > register


Realist methods for evaluation of large scale and complex programs

presented by Gill Westhorp, Charles Darwin University, Darwin

FULL DAY – ADVANCED

This workshop will support evaluators and researchers to use realist methods in large scale and complex evaluations. Because realist evaluation (RE) was initially developed using smaller-scale programs, methods need to be modified for large scale programs, while remaining consistent with underlying methodological principles. The workshop will discuss the value of and the dilemmas involved in using realist methods for large and complex programs, and demonstrate strategies to address the dilemmas. Knowledge, techniques and practices to be addressed include:

  • the rationale for using realist evaluation in large scale and complex programs
  • different options and constructs of ‘program’ theory for use in large scale programs
  • different ways of conceptualising ‘program mechanisms’ and why they are necessary for large scale or complex programs
  • use of formal theory in evaluations of large scale and complex programs
  • the nature of data required for realist evaluations of large scale and complex programs.

By the conclusion of the workshop, participants should be able to:

  • determine whether original realist methods are appropriate in their own evaluations, and if not, what modifications may be needed
  • distinguish between the types of theory that may be necessary for their own evaluations and describe how they might be developed to be consistent with realist principles
  • discuss the implications for evaluation design, including data requirements, for realist evaluations of large scale and complex programs.

The workshop will include a combination of whole group discussions, presentation of examples from large scale and complex realist evaluations, and exercises focused on applying key ideas to the participants’ own evaluation work. Learning resources will include the RAMESES standards, training materials for realist evaluation and references for further reading.

About the presenter
One of the conference keynote speakers, Gill Westhorp is Professorial Research Fellow at Charles Darwin University, leading the Realist Research Evaluation and Learning Initiative. Gill is an internationally recognised practitioner and innovator of realist evaluation methods. Much of her current work is in the international development, climate change, health and community services sectors.

> back to overview > register


Human centred design for evaluators

presented by Jess Dart, Clear Horizon Consulting, Melbourne; Chris Vanstone, Chief Innovation Officer, The Australian Centre for Social Innovation, Adelaide

FULL DAY – BEGINNER TO ADVANCED

Design can be defined as a purposeful and systematic process to create positive change. Human-centred design and derivations of it travel under many names, including user-centred design, co-design and ‘design thinking’. The phrases ‘co-design’ and ‘design thinking’ are sweeping our institutions and offering the promise of more effective and user-oriented social services and policy. In September 2015, the Harvard Business Review ran an article called ‘Design thinking comes of age’, describing how design thinking led to everything from the creation of the Apple mouse through to re-imagining Peru’s economy. The 2016 American Evaluation Association conference embraced it as a theme and was entitled ‘Evaluation and Design’. Here in Australia many organisations are using design thinking, from social enterprises to government departments, particularly in the social services.

Design thinking has much to offer evaluators. We can simply use it to design more user-oriented monitoring and evaluation systems—a modification which can greatly increase the usefulness of M&E. We can take a step further and partner with social innovators and support their service design with evaluative thinking and evidence. Human-centred design shares a common mission with evaluation: both disciplines aim to help create effective policies and programs that truly make a difference to our society.

This workshop will introduce concepts and some tools of human-centred design. We start by introducing the human-centred design process and share examples of how the Australian Centre for Social Innovation uses this process to design new service offerings for Australian families and disadvantaged people. We will then taste some of the tools through a hands-on case study about designing a measurement system for an organisation. We will gain an overview of the overall design process and learn some ‘discover’ stage methods like card sorting. The workshop will include a mix of presentation and small-group experiential process.

All evaluators are welcome, from beginners to more advanced evaluators. In terms of human centred design, we assume no prior knowledge.

About the presenters
Chris Vanstone started his career designing biscuits and razors; for the last 17 years he has been working to bring the rigour of product development to tackling social issues in the UK and in Australia, working with government, NGOs, business and philanthropy. Chris was a founding member of the UK Design Council’s RED team, Participle and The Australian Centre for Social Innovation. At TACSI he was design lead on the teams that developed TACSI’s solutions Family by Family and Weavers, both winners of Australian International Design Awards. Chris is currently leading work to develop an approach to measurement and evaluation across all of TACSI’s work: developing new services and policy, building innovation capability and catalysing systems innovation.

Recipient of the National Evaluation Development Award, Jess Dart is a recognised leader with over 25 years of involvement in evaluating and designing large scale social change programs in Australia and overseas. Jess is passionate about developing and designing real-world methods and processes for both program design and evaluation. After completing her PhD in evaluation, she co-authored the Most Significant Change (MSC) Guide with Dr Rick Davies, which has now been translated into 12 different languages.

Chris will lead the sessions on the design process, and Jess will help translate this for an evaluation audience.

> back to overview > register