Conference Session Descriptions
Filters:
Sessions with this icon will be broadcast live as part of the virtual conference.
Sessions with this icon will be recorded and made available online after the conference.
Session 3
3C A New Methodology for Piloting Emerging Technologies
Presenter:
Jeffrey Kaplan, ProctorFree

A pilot study should not make or break a deal, nor should it serve as a sales technique for a vendor to inch its way into a school. I will present a framework that uses the pilot as a crucible in which to mold, shape, and refine your partnership with a vendor. Over the course of working with dozens of schools, ProctorFree has developed a methodology that allows for growth and improvement, embraces iteration, and values transparency. In this session we will discuss an agenda for formative pilots and evaluate case studies of conventional, competitive, and extensive pilots. The structure we will dive deeply into is Plan, Connect, Implement, Improve, Evaluate, and Expand.

Conference Track: Assessment Design and Psychometrics

Session 4
4A Empowered by Psychometrics: Inside the Black Box of Computerized Adaptive Testing
Presenters:
Jim Wollack, University of Wisconsin-Madison
Sonya Sedivy, University of Wisconsin-Madison

The goal of the Empowered by Psychometrics series is to help campus testing offices transition into campus assessment centers that, in addition to providing test administration services, actively offer expertise to faculty, staff, administration, and the student body on issues related to test development and educational measurement. This year's session focuses on the psychometrics underlying computerized adaptive testing (CAT). In CAT, no two examinees take the same set of items. Furthermore, as a result of the adaptive algorithm, most examinees wind up with proportion-correct scores near 50%. In this session, participants will learn about the science behind CAT that allows examinees' test scores to be directly comparable to each other. Participants will also learn about different test delivery models, item selection algorithms, and security issues as they relate to CAT. Time will be provided at the end for audience questions.

Conference Track: Assessment Design and Psychometrics

Session 7
7B Including Embedded Assessments in the Learning and Assessment Landscape
Presenter:
Bryan Bradley, Brigham Young University (UT)

One of the drawbacks of formal exams is that the testing environment is divorced from the real-world context of the learning content. Embedded assessments provide opportunities for learners to be tested while they are actively engaged in the performance tasks generally used as learning activities. Assessment data are captured at specific points in the learners' activities and are used for formative feedback and grading. These data can be stored in the same databases courses use for grading and for evidence of learning-outcomes achievement.

In this presentation we will discuss scenarios that work well with embedded assessments. We will also discuss how testing center leaders can help instructors plan for and use the data provided by embedded assessments.

Conference Track: Assessment Design and Psychometrics

Session 12
12C Retesting: The Good, the Bad, the Ugly
Presenter:
Cindy L James, Thompson Rivers University (BC/Canada)

Does your educational institution allow retesting on entrance exams? How many students retest annually? Do students' scores improve on retests and, if so, by how much? Is one test retaken more often than others? Although retesting for admission or placement in higher education is quite common, published information of this kind appears to be lacking. To fill this void, retesting activity at one North American university was studied using five years of testing data. In this session, the results of this study will be presented, answering all of these questions and more. Although the findings are institution-specific, it is hoped that this session will spark discussion about retesting activities and policies at other educational institutions, starting with the basic question of whether retesting should be allowed and, if so, under what conditions.

Conference Track: Assessment Design and Psychometrics