International Summer School ‘Applied Psychometrics in Psychology and Education’
DATES: July 29 – August 4, 2018
VENUE: HSE’s ‘Kochubey Mansion’ Training Center, Saint Petersburg, Russia
FACULTY:
- Mary Pitoniak (PhD, Educational Testing Service)
- Carol Myford (PhD, University of Illinois at Chicago)
- Linda Cook (PhD, National Center for the Improvement of Educational Assessment)
- Lidia Dobria (PhD, Wilbur Wright College)
The School’s tracks will be taught in English.
ADMISSIONS CLOSE ON APRIL 30, 2018
The School participants can opt for one of the following study tracks:
The Standard Setting Course (Dr. Mary Pitoniak) is designed to provide participants with information about a wide range of considerations relevant to setting standards on educational assessments. These include how to choose a standard setting method, which methods are currently in use, and how to determine whether the cut scores set for an assessment yield valid interpretations within the context of a particular testing program. The fundamentals of standard setting will be presented, including the steps required in all methods. Vertically moderated standards and adjusting committee-recommended cut scores will also be discussed. The course will give thorough consideration to the validity of standard setting procedures and the resulting cut scores.
The Fairness in Educational Testing Course (Dr. Linda Cook) is dedicated to test fairness, a topic that has been of central importance to test developers, test takers, and test score users for many decades. The vision of fair assessment has evolved over time and has psychometric, societal, and legal foundations. Once this conceptual framework for discussing fairness has been developed, the course will focus on its key practical implications. Fairness in test design and development, as well as in test administration, will be discussed. The comparability of scores across different tests, modes of administration, grade levels, languages, and populations will also be considered.
The Course ‘Analyzing Rating Data Using Many-Facet Rasch Measurement and Multilevel Rater Modelling Approaches’ (Dr. Carol Myford and Dr. Lidia Dobria) introduces participants to two approaches to analyzing rating data: many-facet Rasch measurement (MFRM) and multilevel rater modelling (MRM). In performance assessment settings, raters who evaluate students’ performances or products may introduce errors (rater effects) into the assessment process. The approaches covered in the course help assessment administrators learn how the various ‘facets’ (e.g., students, raters, rating criteria) of their assessment systems are performing. This can be helpful in determining to what extent a system is under statistical control and in introducing meaningful changes to improve it. The course consists of two parts: the first focuses on using the MFRM approach to analyze rating data, while the second features the MRM approach.
PREREQUISITES:
- Spoken English
- Basic knowledge of Classical Test Theory and IRT
- Your own laptop
Track 2 enrollees are also expected to have prior experience with regression and analysis of variance, and with interpreting their results.
The Summer School will provide in-depth insights into the field and equip students with advanced skills for more objective measurement and assessment in education and psychology. Ample opportunities for professional and academic networking will be a valuable added benefit.
The cost of participation is 700 Euros. This fee includes accommodation at the picturesque historic ‘Kochubey Mansion’ Training Center (Saint Petersburg, Russia), meals, and handouts. The early-bird fee is 600 Euros and applies to payments made before April 1. Payment is due only after you receive confirmation of your participation in your chosen track (not upon registration). This payment DOES NOT cover transportation costs.
Our Past Summer Schools
Since their inception in 2014, IOE’s International Summer Schools in Psychometrics have covered a range of specialist topics and areas, such as Evidence Centered Design (Mark Zelman, World Bank consultant); Standards of Test Quality (Bas Hemker, CITO, Netherlands); Performance-based Assessment (Carol Myford, USA); and Differential Item Functioning Analysis (Thierry Rocher, France).
The Head of the Summer School is Elena Kardanova, Director of the IOE Center for Monitoring the Quality in Education and of the ‘Measurements in Education and Psychology’ Master’s program.
For questions and additional information, please contact the Summer School’s Organizing Committee:
- Alina Ivanova, email@example.com
- Inna Antipkina, firstname.lastname@example.org
- Denis Ferediakin, email@example.com