Raven’s Advanced Progressive Matrices
International Technical Manual

Copyright © 2011 NCS Pearson, Inc. All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the copyright owner.

Pearson, TalentLens, Watson-Glaser Critical Thinking Appraisal, and Raven’s Progressive Matrices are trademarks, in the U.S. and/or other countries, of Pearson Education, Inc., or its affiliate(s).

Portions of this work were previously published.

Produced in the United Kingdom

Contents

Chapter 1  Introduction
    Development of the 23-Item Form
    Internal Consistency Reliability
    Content Validity
    Convergent Validity
    Criterion-Related Validity
    Equivalency Information
    Global Applicability
    Development of Raven’s APM International Versions

Chapter 2  Australia/New Zealand (English)
    Translation/Adaptation Process
    Sampling Procedure
    Item/Test Difficulty
    Distribution of Scores
    Evidence of Reliability

Chapter 3  France (French)
    Translation/Adaptation Process
    Sampling Procedure
    Item/Test Difficulty
    Distribution of Scores
    Evidence of Reliability

Chapter 4  India (English)
    Translation/Adaptation Process
    Sampling Procedure
    Item/Test Difficulty
    Distribution of Scores
    Evidence of Reliability

Chapter 5  The Netherlands (Dutch)
    Translation/Adaptation Process
    Sampling Procedure
    Item/Test Difficulty
    Distribution of Scores
    Evidence of Reliability

Chapter 6  The UK (English)
    Translation/Adaptation Process
    Sampling Procedure
    Item/Test Difficulty
    Distribution of Scores
    Evidence of Reliability

Chapter 7  The US (English)
    Sampling Procedure
    Item/Test Difficulty
    Distribution of Scores
    Evidence of Reliability

Appendix A  Best Practices in Administering and Interpreting the APM
    Administrator’s Responsibilities
    Assessment Conditions
    Answering Questions
    Administration
    Understanding the Scores Reported
    Maintaining Security of Results and Materials
    Sources of Additional Best Practice Information
    Instructions for Administering the APM Online
    APM Short Test Administration Instructions – Paper and Pen

References

Tables
    1.1 Descriptive Statistics of the APM by Test Version and Administration Order
    1.2 Reliability Estimates by APM Test Version and Administration Order
    1.3 Descriptive Statistics and Internal Consistency Reliability Estimates for Raven’s APM Across Countries
    2.1 Demographic Information for the Australia/New Zealand Sample
    2.2 Raven’s APM Item Analysis Information for the Australia/New Zealand Sample
    2.3 Distribution of APM Scores in the Australia/New Zealand Sample
    2.4 Internal Consistency Reliability Estimates in the Australia/New Zealand Sample
    3.1 Demographic Information for the France Sample
    3.2 Raven’s APM Item Analysis Information for the France Sample
    3.3 Distribution of APM Scores in the France Sample
    3.4 Internal Consistency Reliability Estimates in the France Sample
    4.1 Demographic Information for the India Sample
    4.2 Raven’s APM Item Analysis Information for the India Sample
    4.3 Distribution of APM Scores in the India Sample
    4.4 Internal Consistency Reliability Estimates in the India Sample
    5.1 Demographic Information for the Netherlands Sample
    5.2 Raven’s APM Item Analysis Information for the Netherlands Sample
    5.3 Distribution of APM Scores in the Netherlands Sample
    5.4 Internal Consistency Reliability Estimates in the Netherlands Sample
    6.1 Demographic Information for the UK Sample
    6.2 Raven’s APM Item Analysis Information for the UK Sample
    6.3 Distribution of APM Scores in the UK Sample
    6.4 Internal Consistency Reliability Estimates in the UK Sample
    7.1 Demographic Information for the US Sample
    7.2 Raven’s APM Item Analysis Information for the US Sample
    7.3 Distribution of APM Scores in the US Sample
    7.4 Internal Consistency Reliability Estimates in the US Sample

Chapter 1  Introduction

The Raven’s Progressive Matrices have been used in many countries for decades as a measure of problem-solving and reasoning ability (Raven, Raven, & Court, 1998a). The various versions of the Raven’s Progressive Matrices have been studied in over 45 countries on samples totalling over 240,000 participants (Brouwers, Van de Vijver, & Van Hemert, 2009). This manual describes the adaptation/translation of the latest 23-item version of the Raven’s Advanced Progressive Matrices (APM) for the US, Australia/New Zealand, France, India, the Netherlands and the UK.

From an international perspective, several enhancements were made to facilitate cross-country score comparisons and to standardise the testing experience for administrators and participants. These enhancements include:

• Use of a uniform test format and common test content across countries
• Uniform scoring and reporting of scores across countries
• Availability of local manager norms for each country, based on a common definition of manager across countries
• Implementation of a common set of items and administration time across countries (i.e., 23 items; 40 minutes).

Description of the Raven’s Advanced Progressive Matrices

The Raven’s Advanced Progressive Matrices (APM) is a nonverbal assessment tool designed to measure an individual’s ability to perceive and think clearly, make meaning out of confusion, and formulate new concepts when faced with novel information. The APM score indicates a candidate’s potential for success in such positions as executive, director, general manager, or equivalent high-level technical or professional positions in an organisation. These categories of positions typically require high levels of clear and accurate thinking, problem identification, holistic situation assessment, and evaluation of tentative solutions for consistency with all available information.

Each item in the APM comprises a pattern of diagrammatic puzzles with one piece missing. The candidate’s task is to choose the correct missing piece from a series of possible answers.

Development of the Current 23-Item Form

The current revision of the APM was undertaken to provide customers with a shorter version of the assessment that maintains the essential nature of the construct being measured and the psychometric features of the assessment.
The APM is a power assessment rather than a speeded assessment, even though it has a time limit. Speeded assessments are typically composed of relatively easy items and rely on the number of correct responses within restrictive time limits to differentiate performance among candidates. In contrast, the APM items span a wide range of difficulty and are administered with a relatively generous time limit, which makes the APM a power assessment. The 42-minute administration time for the current APM (40 minutes for the 23 operational items in Part 1; 2 minutes for the 2 experimental items in Part 2) maintains the APM as an assessment of cognitive reasoning power rather than speed.

N.B. The paper-and-pencil format does not contain the Part 2 experimental items.

Classical Test Theory (CTT) and Item Response Theory (IRT) methodologies were used in the analyses of the APM data for item selection. Specifically, for each of the 36 items in the previous APM version, the following indices were examined to select items: item difficulty index (p value), corrected item-total correlation, IRT item discrimination (a) parameter, and IRT item difficulty (b) parameter. Because the APM was designed to differentiate among individuals with high mental ability, less discriminating items were dropped from the current version of the APM. (An illustrative computation of the classical indices is sketched following the Internal Consistency Reliability section below.)

For the current APM revision, data were used from 929 applicants and employees in a number of positions across various occupations. These individuals took the APM between May 2006 and October 2007. Five hundred and nine of these individuals provided responses about their current position levels (e.g., “Executive,” “Director,” “Manager,” and “Professional/Individual Contributor”). See the Appendix of the separate document ‘APM Development’ (2007) for more details regarding the composition of the sample.

Internal Consistency Reliability

The internal consistency reliability estimate (split-half) for the APM total raw score was .85 in the US standardisation sample (n = 929). This reliability estimate for the 23-item version of the APM indicates that the total raw score on the APM possesses good internal consistency reliability. Internal consistency reliability estimates for each country-specific Manager norm group in the global data-collection effort are summarised in each country-specific chapter within this manual.
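To make the preceding statistics concrete, the short Python sketch below shows, on simulated data, how the classical item indices (item difficulty p values and corrected item-total correlations) and an odd/even split-half reliability estimate with the Spearman-Brown correction can be computed. It is an illustrative sketch only, not Pearson’s analysis code; the variable names and the simulated response matrix are hypothetical, and the IRT a and b parameters mentioned above would be estimated separately with dedicated IRT software.

```python
# Illustrative sketch only (not Pearson's analysis code): classical item
# statistics and a split-half reliability estimate for a hypothetical
# 0/1-scored response matrix.
import numpy as np

def item_statistics(responses):
    """Return item difficulty (p values) and corrected item-total correlations.

    responses: (n_candidates, n_items) array of 0/1 item scores.
    """
    n_items = responses.shape[1]
    total = responses.sum(axis=1)
    p_values = responses.mean(axis=0)              # proportion correct per item
    item_total_r = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]             # "corrected": item excluded from the total
        item_total_r[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return p_values, item_total_r

def split_half_reliability(responses):
    """Odd/even split-half reliability with the Spearman-Brown correction."""
    odd = responses[:, 0::2].sum(axis=1)
    even = responses[:, 1::2].sum(axis=1)
    r_halves = np.corrcoef(odd, even)[0, 1]
    return 2 * r_halves / (1 + r_halves)           # Spearman-Brown prophecy formula

if __name__ == "__main__":
    # Simulated data for demonstration: 500 candidates, 23 items of varying difficulty.
    rng = np.random.default_rng(0)
    ability = rng.normal(size=(500, 1))
    difficulty = np.linspace(-2.0, 2.0, 23)
    p_correct = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
    responses = (rng.random((500, 23)) < p_correct).astype(int)

    p_values, item_total_r = item_statistics(responses)
    print("p values:", np.round(p_values, 2))
    print("corrected item-total r:", np.round(item_total_r, 2))
    print("split-half reliability:", round(split_half_reliability(responses), 2))
```

In practice the two halves could be formed from matched item pairs rather than a simple odd/even split; the Spearman-Brown step adjusts the half-test correlation up to the full test length.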
Content Validity

In an employment setting, evidence of content validity exists when an assessment includes a representative sample of the tasks, behaviours, knowledge, skills, abilities, or other characteristics necessary to perform the job. Evidence of the content-related validity of the APM should be established by demonstrating that the jobs for which the APM is to be used require the problem-solving skills measured by the assessment. Such evidence is typically documented through a thorough job analysis.

Convergent Validity

Evidence of convergent validity is provided when scores on an assessment relate to scores on other assessments that claim to measure similar traits or constructs. Years of previous studies on the APM support its convergent validity (Raven, Raven, & Court, 1998b). In a sample of 149 college applicants, APM scores correlated .56 with math scores on the American College Test (Koenig, Frey, & Detterman, 2007). Furthermore, in a study using 104 university students, Frey and Detterman (2004) reported that scores from the APM correlated .48 with scores on the Scholastic Assessment Test (SAT).

Evidence of convergent validity for the current version of the APM is supported by two findings. First, in the standardisation sample of 929 individuals, scores on the current APM correlated .98 with scores on the previous APM. Second, in a subset of 41 individuals from the standardisation sample, the revised APM scores correlated .54 with scores on the Watson-Glaser Critical Thinking Appraisal®—Short Form (Watson & Glaser, 2006).

Criterion-Related Validity

Criterion-related validity addresses the inference that individuals who score better on an assessment will be more successful on some criterion of interest (e.g., job performance). Criterion-related validity for general mental ability tests such as the APM is supported by validity generalisation. The principle of validity generalisation refers to the extent to which inferences from accumulated evidence of criterion-related validity from previous research can be generalised to a new situation. There is abundant evidence that measures of general mental ability, such as the APM, are significant predictors of overall performance across jobs. For example, in its publication on the Principles for the Validation and Use of Personnel Selection Procedures, SIOP (2003) notes that validity generalisation is well established for cognitive ability tests. Schmidt and Hunter (2004) provide evidence that general mental ability “predicts both occupational level attained and performance within one's chosen occupation and does so better than any other ability, trait, or disposition and better than job experience” (p. 162). Prien, Schippmann, and Prien (2003) observe that decades of research “present incontrovertible evidence supporting the use of cognitive ability across situations and occupations with varying job requirements” (p. 55). Many other studies provide evidence of the relationship between general mental ability and job performance (e.g., Kolz, McFarland, & Silverman, 1998; Kuncel, Hezlett, & Ones, 2004; Ree & Carretta, 1998; Salgado et al., 2003; Schmidt & Hunter, 1998; Schmidt & Hunter, 2004).

In addition to inferences based on validity generalisation, studies using the APM over the past 70 years provide evidence of its criterion-related validity. For example, in a validation study of assessment centres, Chan (1996) reported that scores on the Raven’s Progressive Matrices correlated with ratings of participants on “initiative/creativity” (r = .28, p < .05). Another group of researchers (Gonzalez,