Research and Scholarship (Research and Program Assessment)
Recent Submissions
Item
“Try it, Make it Better, Perfect it”: Implementing a Statewide Textbook Affordability Initiative (2021-03)
Jaggars, Shanna Smith; Prieto, Kaity; Folk, Amanda L.; Rivera, Marcos D.; Hance, Elizabeth K.; Alexander, E.
This report focuses on the Ohio Open Ed Collaborative (“OOEC”), a statewide initiative that recruited and supported inter-institutional teams of faculty to develop open and affordable college course materials. Across its first two years, OOEC developed 23 freely available modular course packages that were aligned to statewide learning outcomes and were designed to be engaging and appropriate for students of diverse backgrounds attending two-year and four-year colleges and universities across the state. In this report, we focus on OOEC’s structure, its first two years of implementation, and early indicators of OOEC package adoption, to provide other regional or statewide collaboratives with concrete examples of good practice, as well as “lessons learned” and opportunities for improvement. We also examine motivators of and barriers to adoption among instructors across the state.

Item
COVID-19 Teaching and Learning Survey (The Ohio State University, 2020-06)
Jaggars, Shanna Smith; Rivera, Marcos; Hance, Elizabeth; Heckler, Andrew
This report presents descriptive results and open-ended comments from a survey of Ohio State faculty, undergraduate students, and graduate/professional students regarding their experiences with teaching and learning during the emergency transition to remote learning due to the COVID-19 pandemic in Spring 2020.

Item
Understanding students' satisfaction with OERs as course materials (Emerald Publishing Limited, 2018)
Jaggars, Shanna Smith; Folk, Amanda L.; Mullins, David
Purpose: The purpose of this paper is to introduce a survey instrument to measure three components of students’ perceptions of open and affordable course materials – quality, integration, and experience – and to discuss its reliability and predictive validity.
Design/methodology/approach: The authors distributed an end-of-semester online survey to students enrolled in sections of 12 courses that adopted OER in Fall 2016, and administered a survey within interviews with the instructors of those courses. The authors calculated descriptive statistics from the responses to the student survey and examined the inter-item and inter-rater reliability of the instrument. Finally, they explored correlations in the data gathered through both the student and faculty surveys.

Findings: The authors found that both students and faculty were generally pleased with the quality and experience of using open and affordable digital materials. The authors also found that the three survey subscales had strong inter-item reliability, and that the quality and experience subscales had predictive validity in terms of whether students would choose a traditional or digital text in future courses.

Originality/value: In addition to providing evidence of the full survey instrument’s reliability and predictive validity, factor analysis indicates that a short scale of quality and experience Likert items could be used by practitioners to effectively assess satisfaction with digital materials among traditionally aged undergraduate students.