Earlier this year, my colleagues and I presented a beta version of a novel assessment tool called [name redacted]. The tool was designed to collect and organize qualitative data to aid library staff and maker educators in assessing hands-on library programs (citation redacted). [Tool name] was developed over several studies with research-practice partnerships [citations redacted] and builds on previous work advancing assessment tools and literature for informal, maker-based learning (Chang et al., 2019; Gutwill et al., 2015; Wardrip & Brahms, 2015). Truly a tool made for and by library staff and educators, it was designed around the opportunities and challenges of practice that surfaced in those studies. For instance, responses from interviews conducted in 2019 and 2022, with participants from five libraries in the central United States, directly shaped the tool’s design, including the development of three slider features (Image 1): an engagement slider, a vibe slider, and a depth/richness slider.
For this paper, I returned to those interviews (n=38) to investigate the following: (a) How do library staff/maker educators understand “vibes”? (b) What role do they say vibes play in maker programs and in their facilitation? With these questions, I explored what “vibes” mean to these educators and how they conceptualize them, in order to better understand why vibes seem to matter. I adapted each research question into a short phrase and used these phrases as etic codes for holistic coding of the transcriptions. I then analyzed the data more deeply using pattern coding to identify common ideas and themes (Fugard & Potts, 2020; Saldaña, 2016).
Themes from the data indicate that (1) “vibes” can be prepared and predicted, as when educators described designing a learning environment to support a certain vibe; (2) when learners arrive, the vibe shifts, and the educator can sense it; and (3) vibes linger to be reflected on after a program ends and can be projected into future program planning and design. These themes suggest vibes are embryonic and conjectural when offered from educator to learner, but they are also treated as a useful summation of the program, helping educators draw conclusions about how “well” a program went. Vibes seem to play a large part in educators’ informal assessment of their programs, and they appear to operate as a micro-process (or many) of hypothesis and inference, not unlike the fundamental processes in assessment (Pellegrino, 2005). However, unlike typical assessments, vibes do not seem to be meaningfully reduced to quantifiable metrics, and they are used to describe, rather than evaluate, the tone(s) of a program.
Preliminary exploration of educators’ perceptions of vibes leaves many questions unanswered. For instance, more work is necessary to understand how educators make sense of vibes (though my collaborators in this session begin to scratch the surface of that work) and how educator attention to vibes supports, and is supported by, other assessments. Ultimately, vibes play an important role in the assessment of library maker programs, whether or not they are formally reported, and they require careful attention.