Saturday, 14 January 2012

Review of Get B.U.S.Y 2008/2009 Evaluation Report

The Get B.U.S.Y Evaluation Report was prepared by the Centre for Community Based Research for the Boys and Girls Clubs of Canada in August 2009.  The Boys and Girls Clubs of Canada (BGCC) provides programs to children and youth that support physical, educational and social development.  The Centre for Community Based Research (CCBR) is a non-profit organization focused on social research to strengthen communities (Centre for Community Based Research, 2009, p. 3).  The evaluation looks specifically at the BGCC program Get B.U.S.Y, which focuses on healthy eating, physical activity, and youth leadership.  Ten Boys and Girls Clubs in various communities across Canada received funding to run the Get B.U.S.Y program, and nine of those clubs participated in the evaluation.

The CCBR began this evaluation process with a kind of needs assessment to set evaluation priorities.  They met with the clubs involved in the Get B.U.S.Y program and developed a Logic Model to establish activities, outputs and outcomes for the program.  The Get B.U.S.Y Logic Model (CCBR, 2009, p. 7) clearly outlines the aims of the program and illustrates the focus for evaluation, including increased knowledge of the benefits of physical activity and healthy eating, and youth pursuing their own goals within the program.
The evaluation conducted was formative, as the researchers sought to use the information gathered to inform the continued implementation of the Get B.U.S.Y program in other Boys and Girls Clubs, and also to guide future evaluations to be conducted by the CCBR for the Boys and Girls Clubs of Canada.  The evaluation is largely goal-based.  The Boys and Girls Clubs, with the CCBR, set out goals in the Logic Model they created, and the evaluation was used to assess the effectiveness of the program in attaining those goals.
The initial plan for the evaluation included 15 tools for collecting data, such as surveys and questionnaires for clients and staff, attendance forms, and program materials like journals and calendars.  Based on feedback from the participating Boys and Girls Clubs that they could not realistically administer all these tools, the CCBR simply used pre- and post-surveys for clients and a staff activity log (CCBR, 2009, pp. 8-9).
The Get B.U.S.Y Program Evaluation Report has many strengths.  The 23-page report followed proper formatting and was easy to navigate.  The report was very thorough in providing background information on the Boys and Girls Clubs of Canada, the Centre for Community Based Research and the Get B.U.S.Y program itself.  I could easily understand and envision what the program would look like in action.
The evaluation plan was clearly described.  The report states that the researchers had intended to provide 15 different tools to collect data.  I appreciate the inclusion of this fact because otherwise I may have questioned why they only used pre- and post-surveys and staff logs.  As a teacher, I understand the feeling of despair when additional paperwork is piled on, so I respect the researchers’ decision to scale back to three tools.
I believe that this program evaluation was conducted not only to assess the Get B.U.S.Y program but also, intentionally or not, to inform the practices of the CCBR going forward in conducting research and evaluation for the Boys and Girls Clubs of Canada.  In the analysis portion of the report, the researchers seem to indicate that the Get B.U.S.Y program effectively meets the program outcomes.  The real “learning” from this evaluation seems to be centered more on how the CCBR collects data from the Boys and Girls Clubs.  The report comments on the rewriting of questions to make them simpler and more kid (client) friendly, and on the realization that the completion of evaluation materials took up time that should have been spent on the program itself.  I don’t know if this kind of metacognitive evaluation is common in program evaluation, but as a learner I enjoyed reading their thought process.
I also liked the inclusion of statistical data in tables, and of quotes from staff and clients to provide a more personal view.
There are some weaknesses in this program evaluation report as well.  First, the report was written in an informal voice.  This made it a quick and simple read for me as an outsider, but the report is being presented to a high-profile, not-for-profit, Canada-wide organization, so I would have expected it to be more formal.
While I appreciated the description of the original plan with 15 evaluation tools, and the reflections in the report about issues with data collection (not all materials were returned), these made me question the professionalism of the CCBR.  Perhaps because I work with children, I understand what is possible, but I thought it was strange that only after the data collection was complete did the researchers think it was important to use simple words and questions to survey clients aged 8 to 17.
Lastly, I found that the evaluation report included many recommendations for future evaluation practices and considerations, but very few recommendations for the program itself.

Centre for Community Based Research. (2009).  Get B.U.S.Y Evaluation Report. Retrieved from


  1. Great work Teresa. You chose a focused, well-defined program evaluation to study. It is clearly laid out and easy to follow. It was also good to see the use of a logic model. I agree with your assessment of the strengths and weaknesses. The most glaring weakness is the lack of recommendations for the future of the program. I too found it strange that they provided advice for future evaluations but not really any specific suggestions for the program. If I were responsible for funding the program I would not be convinced of its effectiveness based on this report.

  2. Sorry. It should be Theresa. I hate when I make mistakes like that.