3rd ANNUAL ASSESSMENT FAIRE
Date: April 6, 2018
Time: 10:00am - 1:00pm
Location: Golden Eagle Ballroom 3
Presenters from the 2nd Annual Assessment Faire on April 12, 2017
COLLEGE OF NATURAL AND SOCIAL SCIENCES
Ji Son, Ph.D.
Associate Professor, Department of Psychology
Assessing (And Building) Transfer
What good is learning if it is not useful for solving new problems? Dr. Son’s statistics exams are designed to mimic whole, authentic situations that students may encounter rather than stand-alone problems from textbooks. The exams are both difficult and practical because they are transfer exams, requiring students to apply learned statistical ideas in new situations. They are also useful for shaping students’ self-directed attention because (1) they illuminate for students the relevance of what they are learning, and (2) they point students to the skills they need to practice to develop a useful understanding of statistics.
COLLEGE OF HEALTH AND HUMAN SERVICES
Jessica Morales-Chicas, Ph.D.
Assistant Professor, Department of Child and Family Studies
Making Assessment Fun and Interactive using Kahoot!™
Assessment of student learning is critical in teaching, but how do we make it fun and interactive? In my courses, I incorporate an online, interactive, and free game called Kahoot!. Through this online game, I generate questions about course content that students see projected in real time. Students answer each question anonymously (e.g., using their phones) and compete for points based on accuracy. Collectively, we review and discuss the projected responses while addressing the misconceptions behind incorrect answers. Through this group-based, low-stakes testing tool, I assess how well students learned the material and adapt future lessons accordingly.
COLLEGE OF ENGINEERING, COMPUTER SCIENCE, AND TECHNOLOGY
Ni Li, Ph.D.
Assistant Professor, Department of Mechanical Engineering
Developing a Survey Instrument to Improve Students’ Learning
For EDSP 489, Dr. Li developed survey instruments for ME 3210: Kinematics of Mechanisms. Throughout the semester, students in ME 3210 are required to take different surveys: a pre-knowledge survey, a progressive survey, and a post-exam self-reflection survey. Dr. Li will demonstrate how she uses these different types of surveys to help students achieve the learning goals of the class and improve the effectiveness of their studying.
CHARTER COLLEGE OF EDUCATION
Elina Saeki, Ph.D.
Assistant Professor, Department of Special Education and Counseling
Evaluating Real Life Integration and Application of Content Knowledge
In COUN 5370: School Psychology Practicum, students select one PK-12th grade student at their fieldwork sites and implement an academic intervention, collect data, and analyze the data to measure their student’s academic progress. A grading rubric is used to evaluate each project component, ranging from 0 (unsatisfactory) to 3 (excellent). Dr. Saeki will discuss how the grading rubric is used to provide summative feedback to the students, as well as how data are used to analyze mastery of course objectives and program outcomes.
COLLEGE OF BUSINESS AND ECONOMICS
COLLEGE OF ARTS AND LETTERS
Nina O’Brien, Ph.D.
Assistant Professor, Department of Management
Distinguishing Public Presentation Confidence and Competence of Undergraduate Business Majors
The ability to deliver effective oral presentations is related both to student competence and to student confidence, but these two constructs are rarely distinguished. This study analyzed data on the public speaking confidence and competence of students enrolled in the upper-division business communication course to determine whether students improved in each area, as well as to better understand the relationship between confidence and competence. The results show that students developed greater confidence in public speaking over the course of the term but did not significantly improve in competence. Implications of these results and recommendations for curriculum design are advanced.
Michele Dunbar, Ph.D.
Associate Director of Institutional Research
Wayne Tikkanen, Ph.D.
Coordinator of General Education
Assessing GE in the Face of Changing Outcomes, Calendar, and Curriculum
Students’ perceptions of their GE learning outcomes achievement can help assess GE course contributions to student learning and the GE program. The GE Survey collects students’ ratings of their achievement on each GELO for a selected GE course they completed. The survey was refined for the 2016 and 2017 administrations, coinciding with pre- and post-curriculum and Q2S changes. We will present an overview of the 2017 survey results with 2016 comparisons. Results will show whether the changed GE curriculum has improved students’ ratings of GELO achievement, especially for the new civic learning and diversity outcomes. Effects of administrative changes and direct assessment will be discussed briefly.