
Evaluation

Know your outcomes and make better decisions

The purpose of program evaluation is to inform decision making at the policy and operational levels. 

It requires a wide range of skills, including group skills, management ability, political skills and sensitivity to multiple stakeholders.

Evaluators need refined judgement, self-knowledge, empathy and a commitment to impartiality.

Evaluation differs from conventional research by describing:

  • What the program has achieved

  • How well it has contributed to the goal, met the objectives and undertaken the strategies

  • What worked well and what didn’t, and why

  • Whether there were any unintended outcomes, and

  • What can be learnt from the program to improve practice and inform other programs

There are many different types of evaluation, including formative, summative, process, impact and outcome evaluations:

  • Formative evaluation – used to guide the future direction of a project

  • Summative evaluation – provides a summary of what has occurred

  • Process evaluation – examines how the program is being implemented so that expected outcomes are achieved

  • Impact evaluation – assesses the overall effects of the program

  • Outcome evaluation – investigates whether the program caused demonstrable effects on the target outcomes.

Determinants of success

Evaluation Theory

Deploy appropriate theory, evaluative knowledge and reasoning to the project

Project Management

Project management skills to effectively scope, manage and complete the evaluation

Culture & Context

Be responsive to the cultural, social and political context in which the evaluation is conducted

Interpersonal Skills

Skills to communicate effectively with clients, consumers and other stakeholders in the evaluation.

Methods & Inquiry

Knowledge and skills of the evaluator to conduct systematic inquiry to collect reliable data

Professional Practice

Demonstrate relevant knowledge, skills and attitudes, including integrity and flexibility

Our commitment to you:

  1. Methodologies will be rigorous in design, data collection and analysis, adhering to the highest standards of validity and reliability appropriate to the intended use, to increase the accuracy and credibility of the information produced.

  2. Design, conduct and reporting will respect the rights, dignity and entitlements of those affected by the project/research/investment.

  3. Judgements and recommendations that are made as a result of the project will be based on sound data, modelling, evidence and complete information.

  4. Unexpected and significant problems identified during the project will be reported as soon as possible.

  5. Findings will be presented as clearly and simply as accuracy allows so that you and other stakeholders can easily understand the process and results. Oral and written reports will be direct, comprehensive and honest in the disclosure of findings and the limitations.

Our experience:

  • Evaluation of needs assessment and case management tools for DIAC-funded service delivery programs, including Humanitarian Settlement Services (HSS) (Department of Immigration and Citizenship)

  • Evaluation of New Perinatal Infant Mental Health Teams (NSW Health)

  • Evaluation of program resources and training materials, including delivery formats, developed for the implementation of stage 2 of the Diabetes Medication Assistance Service (Commonwealth Department of Health and Ageing)

  • Evaluation of Selected Victorian Child Youth and Mental Health Reform Initiatives Stage 1: Preliminary Investigation (Victorian Department of Health)

  • Evaluation of St John of God Health Care community-based early intervention perinatal and infant mental health services (Raphael Services)

  • Evaluation of the COAG Supporting Measures Relating to Needle & Syringe Programs in Victoria (Department of Human Services – Victoria)

  • Evaluation of the GLBTI Youth Suicide Prevention Initiative (Department of Health Victoria)

  • Evaluation of the National Perinatal Depression Initiative (Department of Health)

  • Evaluation of the Positive Futures Initiative (Department of Communities, Queensland)

  • Evaluation of the SA Returning Home Program (SA Health)

  • Evaluation of the services provided by Quit SA (Drug and Alcohol Services South Australia)

  • Evaluation of the Western Metropolitan Perinatal Emotional Health Program Trial (Department of Health, Victoria)

  • Evaluation of the young persons in residential aged care program in NSW (Department of Human Services, NSW)

  • Evaluative Research on Outcomes for People with an Intellectual or Cognitive Disability who Exhibit Severely Challenging Behaviours for Disability Services Queensland

  • Reducing Risky Drinking Evaluation (Department of Human Services – Victoria)

  • Evaluation of the In-Home Telemonitoring Trial (Department of Veterans' Affairs)

LETStalk

We would love to talk to you about your evaluation needs.

As a consultant, Darren has considerable experience in evaluation design, program logic, stakeholder management and consultation. He has delivered evaluation projects of varying size and scale, including:

  • Local and council/shire-based programs

  • Statewide services

  • National projects, programs and reforms
