Earlier this month I participated in a forum about library makerspaces and how we evaluate engagement and measure success. Since my library’s equivalent of a makerspace is STEAM-focused, I used my time at the forum to reflect on how I have been evaluating the BOOMbox and related STEAM programming for youth.
When the BOOMbox opened in November of 2014, we had evaluation methods in place. We asked learners to complete a participation survey after every visit, and the survey questions aligned with the goals of the BOOMbox: peer learning and teaching, reaching new audiences, and facilitating hands-on learning experiences. We also asked staff to complete an end-of-shift report that captured success stories (how learners engaged in the space), whether any photos were taken, and any troubleshooting or hiccups. We then used all of this data to create a dashboard or report for each rotation, and we learned from each rotation’s report in order to iterate and improve learner experiences.
In the nearly five years the BOOMbox has been open, we have modified how we capture and discuss this data. For example, we no longer use participation surveys; instead, we have translated those questions into observable interactions that staff can comment on via the end-of-shift report. And for the most part, this works. We’re able to construct a narrative of each 15- to 16-week rotation that includes both qualitative and quantitative data.
But the forum encouraged me to rethink the BOOMbox goals and reset them in alignment with the library’s values and priorities per our latest strategic plan. This will, of course, be a process undertaken with the staff who facilitate learning experiences in the space. And I think it will be a worthwhile one, to ensure the vision of the BOOMbox and related programming matches, or runs parallel to, the vision of the library.
How do you evaluate your STEAM or STEM programs and spaces? What measurement tools do you use? And how do you tell the story of learning, and the power of learning, to your community?