This step is about making decisions based on your evaluation, and telling others about it.
Evaluations are an opportunity to learn about strengths or gaps in an initiative — including how it can be more effective in future. But you won’t get any value from what you’ve learned about the initiative if nothing is done with your conclusions.
Consider carefully how you communicate your evaluation’s conclusions and any decisions about the future of the initiative. Careful planning makes sure:
- Everyone has the level of detail and information they need to understand the situation and inform their decisions.
- Results are communicated with everyone’s interests and sensitivities in mind.
By the end of this step you will have:
- Used your information to make or recommend decisions about the initiative.
- Considered your audiences.
- Been transparent.
- Shared your conclusions.
Use your information to make or recommend decisions about the initiative
Let’s recap. At the start of the evaluation you picked a ‘purpose’. We also looked at what decisions you might make at the end of the evaluation.
Return to this now. Use the questions below to think about how you might make a decision based on your conclusions:
If your purpose was to consider whether to continue or stop the initiative:
- Did the initiative have its intended impact?
  - If yes, is this still considered a valuable impact for your school / organisation?
  - If no, consider why. Could the issues be resolved by changing the initiative?
- If aspects of the initiative were more effective than others, or there was a clear bottleneck (e.g. few students benefited because teachers didn’t know of the initiative), consider changes to the initiative.
If your purpose was to consider whether the initiative should be changed, and if so, how:
- For scaling up: Can you generalise the results for a larger group of students / teachers?
- For changing the target population: Could those beyond the target group get the same level of benefits?
- For looking to incorporate partners: Could working with a school, business or other organisation add further value to students?
If your purpose was to demonstrate the value of the initiative to a funder or potential partner:
- Did the initiative have its intended impact?
- Why is this impact still valuable? Could it be achieved without the initiative? If not, why not?
- Why does it require additional support to increase its impact?
For example:
- Evaluation question: How effective is a Year 6 STEM education initiative for improving student achievement for all students?
- Related decision: Should we implement the initiative across the whole school?
- Conclusions / supporting evidence:
  - Several classes tried and enjoyed the initiative.
  - Low-performing students did better. Mid to high-performing students did as expected without the initiative.
  - The resource and time cost is low.
- Recommendation on decision:
  - Rather than extend the initiative to all students, we recommend the initiative be run in classes where there is a wide range of abilities. This will encourage equity of achievement.
  - We reached this decision because the initiative improved the engagement of all students for a low cost. The improvement of lower-performing students is still something we are interested in, even though it is not the intended aim of this initiative.
If you decide to exit the initiative, there are some important considerations:
- Be ethical: You might be shutting down an initiative that is valuable to people. Consider what can be put in place to keep supporting them once the initiative is removed.
- Be sensible: Make sure you transition out of the initiative with a plan. Give people early warnings about changes to come.
Consider your audiences
Now you’ve got your conclusions, it’s time to communicate them to others.
Start by thinking about the different audiences you need to communicate with, and what they might use the conclusions for. Use the table below as a guide.
| Audience | Interests | Level of detail required | Potential communication channel |
|---|---|---|---|
| Decision-makers (principals / school leaders / industry CEO / senior staff) | Want information for decision-making | High: Enough detail to be able to make an informed decision about the initiative’s future | Face-to-face meeting to run through conclusions, recommendations / decisions and why |
| Partners | Want to know the initiative’s effectiveness; want to assess whether the initiative is worthwhile to the organisation | High: Enough detail to be able to make an informed decision about the initiative’s future | Face-to-face meeting to run through conclusions, recommendations / decisions and why |
| Parents / students | Want to know whether the initiative will continue | Low: High-level results and recommendations | Options vary according to significance of decisions and parent / student interest |
| Teachers or other staff who participate in the initiative | Want to know the initiative’s effectiveness; might have invested a lot of time into the initiative and hold high expectations for a very positive evaluation | Moderate: Results and recommendations, with a bit more information on methodology | Staff meeting or information session and Q&A |
| Government authorities / education organisations | Spreading ideas about effective initiatives, including what made them effective and what evidence exists | Moderate: Enough detail to understand how the evaluation was run, and results to compare with other initiatives | Email evaluation report / results |
| School communities | Want to know the initiative’s effectiveness; want to know how to implement it in their own classrooms / school | Moderate: Results and recommendations, particularly recommendations around implementation and lessons learned | Update in usual communication channel (e.g. Facebook group) with potential follow-up conversations with those interested |
Be transparent
Evaluations are not always perfect. Sometimes you have to work with gaps in evidence, biases and assumptions to reach conclusions.
This does not mean your evaluation is faulty. You just need to note any issues alongside your conclusions, in case they affect how the conclusions are interpreted or what decisions others make.
Some key things to be transparent about:
- Your high-level methodology (e.g. you asked teachers how they found the professional development (PD), and assumed positive feedback would result in higher student engagement).
- Where the information comes from (e.g. from 20 interviews with teachers or based on 2 respondents to 20 surveys).
- Potential biases (e.g. you interviewed only high-achieving students who completed a unit, not other students or those who dropped it halfway through). See the Gather the evidence you need page for more information.
- Potential limitations (e.g. your school results reflect only improvements in knowledge achievement, rather than improvements in skills).
Don’t shy away from communicating the hard messages
Evaluations ask whether an initiative has achieved its intended outcomes. Sometimes the answer is no. Sometimes the news is positive, but only mildly so. Sometimes the honest conclusion an evaluation reaches is “it’s not yet clear”.
While it can be difficult to deliver unwanted news, there are some tips to make it easier for yourself and others:
- Be respectful. People might have invested a lot of time and resources into the initiative.
- Be prepared to explain why the conclusion was reached. People may want to test your methodology when they hear unexpected or unwanted conclusions. Have this information ready for them, and be open about any limitations.
- Take a ‘no surprises’ approach. If you sense early that something in the evaluation might be disappointing to certain people, give them advance warning.
Share your conclusions
It’s important to share information you glean from a STEM education initiative evaluation. To improve student engagement and achievement across Australia, we need to know what’s working, for whom and how. Sharing information about evaluations is the best way to do this.
Use the Template: Evaluation Report to help you summarise what you’ve learned from the initiative and the evaluation.