Wouldn’t it be wonderful if it were possible to dump a bunch of figures about your organisation’s work into a clever system, press ‘compute’, and be told what outcomes you’ve achieved?
From what I’ve learnt, it just doesn’t work like that.
In a recent report by the ever-reliable Idealware, “Understanding Software for Program Evaluation”, the authors make this caveat early on: working out what you’re achieving is not about the tools.
“…software is not a requirement for a successful [outcomes reporting] strategy, merely a way to make your process easier, and many organizations complete them with little to no technology to assist them. That’s entirely up to you.”
Most of the tools they describe in rich detail don’t stand alone. Using software for reporting typically relies on more than one layer: a core database, surveys for specific activities, and some means of presenting the results attractively.
The report uses five categories to describe the different ways software can support program evaluation:
- Central Hub of Program Data
- Auxiliary Data Systems
- Proactive Data Gathering
- Pulling Existing Data
- Reporting and Visualizing.
The report offers something of a pick ‘n mix approach that can help organisations grasp the full range of options.
As with other reports from our distant colleagues, not all the software described is available in Aotearoa. Even if the internet does theoretically allow us to download anything that’s available, it won’t necessarily fit here without adaptation. The reported $500,000 cost of customising Penelope – case management software from Canada – funded by Te Puni Kōkiri is a case in point.
Exploring what tools will help a specific organisation determine what’s working (as the team at Community Research like to describe this challenge) can only start once an organisation is clear about what it’s setting out to achieve.
So if there is no killer app, then where to start? At the risk of pre-empting more detailed work in this area relevant to Aotearoa (watch this space), I’ll point to what I think is a useful guide from the well-established Charities Evaluation Services (CES).
CES’s 2007 workbook “Using ICT to Improve your Monitoring and Evaluation” is still relevant, but technology doesn’t stand still, so it’s already somewhat dated.
Instead I’d suggest looking at “Assessing change: Developing and using outcomes monitoring tools” (2010), which places the role of technology in a wider context. As much attention is paid to framing questions around outcomes as to the tools themselves.
Not surprisingly, there isn’t an easy option: no single tool will collect your data and export your results. Fortunately, there is lots of good help.