Evaluator Perspective—Scott Bayley


An interview conducted by Anthea Rutter for the Evaluation Journal of Australasia, 2021, Vol. 21(4), 226–230.

Scott Bayley has over 25 years of experience in evaluation and became a Fellow of the Australasian Evaluation Society in 2016. He holds a master’s degree in Public Policy majoring in evaluation and social measurement. Prior to developing his own evaluation consultancy, Scott held senior positions with Oxford Policy Management, the Australian Department of Foreign Affairs and Trade, the Asian Development Bank, the Victorian Auditor General’s Office, and was a senior evaluator with the Aboriginal and Torres Strait Islander Commission. Scott has also served as an advisor to the United Nations Secretariat on performance measurement; the Evaluation Capacity Development Group in Geneva; UNICEF; and the world’s peak evaluation body, the International Organization for Cooperation in Evaluation.

Scott has published over 70 evaluation reports across numerous fields including Indigenous affairs, international development, health, education, the environment, infrastructure, capacity building, and governance. He is experienced in applying a broad range of data collection methods, from document analysis, interviews and focus groups to surveys and benchmarking the costs of service delivery, and he is skilled in both qualitative and quantitative data analysis. He has advanced knowledge of impact evaluation methods, performance measurement, and theory-based evaluation and learning.

Scott Bayley has provided a range of services to the AES Board and its regions over more than 20 years. He has been a member of the Society since 1994, was the founder and chair of the AES cross-cultural interest group (1992–94), and served a period as an assistant editor of the AES journal, to which he has also contributed a number of book reviews. Scott served on the AES Board from 2009 to 2013, during which time he was convenor of the International Relations Committee (2009–2012) and a member of the Strategic Engagement and the Finance and Audit committees. During this period, he was also the AES's representative on the International Organization for Cooperation in Evaluation (2009–2012). He served on the Professional Training and Development Committee during 2002–2004, was the national regional coordinator for Australia over the same period, and most recently was a member of the Government Engagement Working Group in 2014–2015.

Over his 20 years of membership, Scott served on the WA regional committee for 7 years, from 1994 to 2001. During his time in WA, he was the Regional Representative in 2000 and worked on the 1999 AES Conference committee, with responsibility for workshops. He also served on the Victorian regional committee from 2003 to 2004 and has served on the ACT committee from 2013 to the present.

Delivering an evaluation training workshop in Jakarta for the Department of Foreign Affairs and Trade, 2017.

How did you become involved in evaluation?

My crisis of confidence! In the late 1980s, I was a program manager in the Northern Territory Health and Community Services Department when I began to have serious doubts about whether our programs were making a positive difference. I had been taught evidence-based decision making, yet when I asked myself whether we were making a difference, I could not answer, and it really bothered me.

A work colleague suggested I might be interested in reading a new book by Patton on utilisation-focused evaluation. I was immediately hooked, and I knew then that I wanted to work as an evaluator! I subsequently did courses in research methods, evaluation and statistics at Darwin University (now Charles Darwin University). I then joined ATSIC in Canberra (1991–1992), where I worked as an evaluator in Indigenous affairs, and I have been working in evaluation ever since.

Later on, after I had a bit more practical experience, I did my master’s degree at Murdoch University in Perth and studied evaluation with Ralph Straton.

An interview is one way of collecting data.

What kind of evaluation are you involved with, and has that changed over time?

One thing I have noticed is that the AES membership has changed. When I first joined, it was all academics and government staff; now we have many more NGOs and private consultants. A great many more Australians are now working in international development, which was quite rare when I first got into evaluation. Another change is the range of new impact evaluation methods that have emerged over the last 10 years. I have also noticed that 25 years ago, various programs were considered almost impossible to evaluate: the environment, community development, Indigenous programs and policy advice, to name a few. These topics were considered too complex and hard to evaluate; now we routinely do such evaluations. I think the boundaries and work of practising evaluators have evolved significantly over time.

Scott is awarded an Australasian Evaluation Society Fellowship in 2016.

I am interested in theories of change, which are very important in international development, and I have completed many evaluations in this area, on topics ranging from building organisational capacity with volunteers and the social benefits of infrastructure improvements through to financial management reforms and complaint-handling systems in public sector agencies.

The real value of a theory of change approach is that it brings a systematic conceptual structure to the question of what we should be measuring. Of course, it also has limitations: the approach can be very data intensive, and it is weak at ruling out non-program explanations for any observed changes.

My other main interest is evaluation capacity building (ECB). I worked on ECB in China, Vietnam, Cambodia and Laos for 4 years with the Asian Development Bank. The international experience with ECB is now quite clear: we can focus our capacity building efforts on leadership's demand for, and ability to make use of, evaluative feedback; on institutional infrastructure (evaluation policies, resources, staff skills, IT systems, etc.); or on the supply of evaluation feedback.

The international lesson is that demand is where we need to focus our capacity building efforts; supply-side strategies (producing more evaluation reports) simply do not work.

In Paris presenting to the OECD on performance measurement, 2015.

What have been the major influences that have helped to define your evaluation practice?

A number of factors have helped define my evaluation practice. Firstly, the following people have been a great influence: Gordon Robertson, Des Pearson, Patrick Batho, Ralph Straton, Darryl Cauley, David Andrich, Ron Penney, John Owen, Robert Ho, Rick Cummings, Burt Perrin and Ray Rist. I have been exceptionally lucky in that regard. Collectively, these mentors shaped my views about what it means to deliver an evaluation that is technically sound and fit for purpose while facilitating the engagement of stakeholders and ensuring that their voices are heard by decision makers. Another influence has been my academic background in social research methods and, later, in public policy analysis.

In New Zealand for the PACER Plus evaluation for the New Zealand Ministry of Foreign Affairs and Trade, 2019.

I am interested in impact evaluation methods, particularly critical multiplism (CM), which is not well known in Australia. It was developed by Cook and Shadish and is based on a particular view of reality: the world is complex, and we can never know it perfectly. The best that we can do as evaluators is to study it from multiple perspectives using multiple methods. CM also treats causality as probabilistic rather than deterministic. Not every smoker gets cancer, but a significant proportion do, and hence we can say smoking causes cancer. To test causal relationships, CM uses three criteria first proposed by John Stuart Mill in 1843. To conclude that program A causes outcome B, you need to establish an association between A and B, show that A occurs in time before B, and rule out alternative explanations for the relationship between A and B. If, and only if, we can credibly satisfy all three tests can we conclude that program A causes outcome B.

The real value of CM is that it asks us to focus on the evidence we need for making causal inferences, rather than getting bogged down in unproductive debates about experiments versus case studies versus surveys, etc.

I would like to see government focusing on processes for good policy formulation and evaluation, and AES members should be helping with this so that more informed decisions can be made.

What have been the major challenges to your practice?

Initially, developing my own skills was a big challenge; evaluation is such a big field with so much to learn! Undertaking cross-cultural evaluations is very complex: there are many potential dimensions to performance, and some of them are not immediately obvious. Speaking truth to power is an issue all evaluators face at some point in their career. I have had some tense discussions in Australia when evaluating the economic impact of the Melbourne Grand Prix, the privatisation of a prison, mental health services for suicidal youth, the contracting of NGOs for service delivery, and the policy advising process in state government agencies. All were highly controversial evaluations that ultimately helped stakeholders to engage with the issues and make more informed decisions. I have also noticed that the commitment of both state and Commonwealth governments to evaluation waxes and wanes over time; this is very short-sighted, and the public deserves better. We should be aiming to use public monies for best effect.

There is an opportunity for the AES to form more alliances and partnerships, particularly with external agencies such as IPAA and ANZSOG.

What have been the highlights of your career?

I have worked on a wide variety of challenging evaluation topics: the delivery of health and community services in rural Australia, a cost–benefit study of the Melbourne Grand Prix, cement production in China, the effectiveness of petrol sniffing programs for remote Indigenous youth, financial management reforms in Mongolia, quality assurance of Australia's international aid program, and complaint-handling systems in government departments.

I have also been fortunate to have a number of highly skilled advisors, people who went out of their way to coach and mentor me (the key people are listed above).

Another important highlight was being recognised with an AES Fellowship. It is an honour and a privilege to be recognised by one's peers.

Can you share one critical piece of advice from your evaluation experience?

One of my biases is that coming up with answers to evaluation questions is generally not that difficult. The hard part is identifying good questions to ask: questions that stakeholders care about, questions that reduce uncertainty, and questions that support learning, collaborative relationships and better program results.

Scott Bayley is Managing Director of Scott Bayley Evaluation Services and a former Principal Consultant for Monitoring, Evaluation and Learning at Oxford Policy Management (OPM) for the Asia Pacific region.

Scott is currently a self-employed evaluation consultant trading as Scott Bayley Evaluation Services.

He specialises in formulating measurement and reporting frameworks, leading impact evaluations, building evaluation capacity, providing decision support to program staff and senior executives, and utilising performance feedback to drive the continuous improvement of programs while strengthening relationships with stakeholders.
