ERIC ED447151: Putting Scholastic Aptitude Test Results on an Even Playing Field (7 pages, 2000)
DOCUMENT RESUME

ED 447 151                TM 031 965

AUTHOR       Costello, Ronald W.; Cox, Marge
TITLE        Putting Scholastic Aptitude Test Results on an Even Playing Field.
PUB DATE     2000-00-00
NOTE         5p.
PUB TYPE     Reports - Descriptive (141)
EDRS PRICE   MF01/PC01 Plus Postage.
DESCRIPTORS  *Achievement Gains; *College Entrance Examinations; Comparative Analysis; *High School Students; High Schools; Scores; *Test Results
IDENTIFIERS  Preliminary Scholastic Aptitude Test; *Scholastic Assessment Tests

ABSTRACT
It is very important, when considering how students from one area compare with those from another, to use similar student performance data. This is particularly evident when comparing the performance of students taking the Scholastic Aptitude Test (SAT) from an area where many students take the test to the performance of students from an area in which only a small percentage of students participate in the SAT. It is more appropriate and useful to use SAT results in a way that is similar for each student and that shows whether the student improves from one year to the next. To that end, the Noblesville, Indiana, schools used the Preliminary Scholastic Aptitude Test (PSAT) and SAT scores for the same student, converting the PSAT scores to a score similar to the SAT score. This comparison was more meaningful in terms of student growth and class growth. The next step is to use student performance data to improve performance. (SLD)

Reproductions supplied by EDRS are the best that can be made from the original document.

Putting Scholastic Aptitude Test Results on an Even Playing Field

By Ronald W. Costello
Assistant Superintendent, Noblesville Schools
[email protected]
Marge Cox
Media Coordinator, Noblesville Schools
[email protected]

David Grissmer, in a recent Indianapolis presentation, focused on the current RAND report titled Improving Student Achievement: What State NAEP Test Scores Tell Us. He talked about the importance of comparing similar student performance data when we want to know how students are performing. Let me try to explain what this might mean as we look at student performance data on the Scholastic Aptitude Test (SAT), which was recently released for the high school graduating class of 2000.

I will start with an athletic example for this comparison. Suppose you and I coached football teams with 50 players on each team. We both think the players on our team have more foot speed. We decide to settle the issue by having a foot race. The only condition I set is that I want to race my 2 fastest players against 30 of your players. If all 30 of your players are not faster than my 2 players, does this mean that my team has more foot speed than yours? No. Unfortunately, we use those same kinds of comparisons with test scores. Let me give you another example, using the recent SAT results for the students at Noblesville High School.

Top 10% of 2000 Graduating Class Better Than Any Other State

In Indiana, there is always a question about the large number of students taking the SAT. Is it fair to compare Indiana, where 60% of the students took the SAT in 2000, with North Dakota, where only 4% of the students took it? North Dakota had the highest scores on both the Verbal and Math parts of the SAT. I would like to show you how a smaller number of students would compare.
For Noblesville High School students, we combined the Verbal and Math SAT into a single score instead of the individual ones for each test because it is easier to show total comparisons. From Noblesville High School test results, I used the SAT scores from the top ten percent of the 2000 graduating class. These results are shown in the table below compared to the results from the states that ranked at the following levels: 1, 10, 20, 25, 30, and 40.

State Rank     Combined SAT    % Tested
1              1,197           4%
10             1,164           7%
20             1,138           8%
25             1,125           13%
30             1,111           18%
40             1,087           30%
National       1,019           44%
NHS Top 10%    1,257           10%
NHS            1,005           79%

Source: The College Board at http://www.collegeboard.org/press/senior00/html/table3.html

As you can see from the table, the top 10 percent of the students at Noblesville High School scored better than all states shown. It was only after you looked at the scores for about half of the states that you found more than 10 percent of the students were tested. Is it a correct comparison to say the top 10 percent of the students at Noblesville High School compare better than all 50 states? Is this comparison any different than the foot race comparison I made earlier?

I suspect that you would say these are not proper comparisons. I would agree with you, but I would state that this is not any less appropriate than how we are presently using student performance data. This brings us back to the original point that David Grissmer made: in order to get proper comparisons, you need to use similar data. I would like to give you a better way to use student SAT results so that the playing (or reporting) field is even.

What SAT results should we be considering?

We had been very pleased with our SAT results in Noblesville Schools over the last five years because we had a steady increase in SAT scores. We had worked very hard to bring about this increase by changing the academic offerings at our high school with more demanding academic courses and by increasing the number of students taking those courses.
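The top-10% comparison above can be sketched in a few lines. This is our illustration, not the authors' code; the function name and the class scores are made up, since the article does not include per-student data.

```python
# Sketch of the "top 10% vs. everyone" comparison, with hypothetical scores.
def top_decile_mean(scores):
    """Mean combined SAT score of the top 10% of a class (at least one student)."""
    ranked = sorted(scores, reverse=True)
    cutoff = max(1, len(ranked) // 10)  # size of the top 10% slice
    return sum(ranked[:cutoff]) / cutoff

# Hypothetical combined (Verbal + Math) scores for one class:
class_scores = [900, 950, 980, 1000, 1010, 1050, 1100, 1150, 1210, 1290]

# Comparing this number to a whole-state average is the foot-race fallacy:
# a hand-picked subgroup against an entire population.
print(top_decile_mean(class_scores))  # with 10 scores, just the single best: 1290.0
```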
In 2000, our average student scores declined from the previous four years. We decided to look more in depth at our results, and we wanted to share that information in terms of a way to look at performance data that is similar for each student and that shows whether the student improves from one year to the next. In reviewing our records, we found that we had scores on both the Preliminary Scholastic Aptitude Test (PSAT) and the SAT. For the class of 2000, we had 317 students take the PSAT as juniors (82%) versus 291 (79%) take the SAT as seniors. Although these two groups were not all the same students, such a high percentage of students fell within both testing groups that it seemed fair to compare their results. Shown in the table below are the PSAT and SAT results by class for the last six years at Noblesville High School.

Class    PSAT Combined    % Tested    SAT Combined    % Tested    Change    z Score
2000     937              82%         1,005           79%         +68        1.64
1999     1,012            60%         1,029           79%         +17       -1.51
1998     994              61%         1,033           75%         +39       -0.15
1997     988              53%         1,029           68%         +41       -0.03
1996     965              67%         1,007           70%         +42        0.03
1995     881              47%         923             72%         +42        0.03

Mean 41.5, Standard Deviation 16.2

PSAT scores were converted to a score similar to the SAT score by dropping the decimal point on the PSAT; for example, a 51.9 became a score of 519. This is a practice that school counselors have used for years to compare the two scores. The verbal and math scores were combined. Differences were compared by computing a mean and standard deviation for all six years and determining, through the use of a z score computation (Chase, 1976), the statistical difference between the test scores. In looking at the student performance from the PSAT testing to the SAT testing, we find that there is an improvement for all six years reviewed. The year that we had the highest improvement from PSAT to SAT was also the year (2000) that we had the least amount of growth (-1.51) from the class tested the previous year.
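The conversion and z-score computation described above can be reproduced from the table's Change column. This is a sketch of the method as we read it, not the authors' code; the function name `psat_to_sat` is our own, and we assume the reported standard deviation (16.2) is the sample standard deviation, which matches the table's z values.

```python
import statistics

def psat_to_sat(psat_section_score: float) -> int:
    """Counselors' rule of thumb: drop the PSAT decimal point (51.9 -> 519)."""
    return round(psat_section_score * 10)

# Combined PSAT-to-SAT gains by class, from the table above.
gains = {2000: 68, 1999: 17, 1998: 39, 1997: 41, 1996: 42, 1995: 42}

mean = statistics.mean(gains.values())    # 41.5
stdev = statistics.stdev(gains.values())  # sample SD, ~16.2

# z score of each class's gain relative to the six-year distribution.
z_scores = {year: round((g - mean) / stdev, 2) for year, g in gains.items()}
```

Running this reproduces the table's z column (e.g. 1.64 for 2000 and -1.51 for 1999), which suggests the published figures were computed this way.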
The year in which we had the least amount of growth from PSAT to SAT (1999) was a year in which we had our second highest total SAT scores (1,029). According to The College Board (2000), juniors taking the PSAT in October gained, on average, 10 points in verbal and 12 points in math (http://www.collegeboard.org/sadcbssenior/html/stat00e.html). The 68-point gain by the Class of 2000 was more than three times the national gain, as compared to the 17-point gain of the Class of 1999. Now those scores take on a different meaning because we are looking at similar information over time. Being able to understand and compare similar performance data for the same students is a first step if we are ever going to set expectations to improve student performance.

How should we use the test results?

The next step is being able to use the student performance data to improve performance, not just report it. There are a number of things that can be done, and some of these things are: identify particular math and verbal skills where students can improve based upon PSAT results; implement interventions between the PSAT and SAT to improve the specific math and verbal skills which were identified; and analyze SAT results to make sure that the interventions are achieving the desired improvement in those skills. We believe that the steps outlined above will allow us to develop more specific interventions, which will help to improve the performance of all students. If we do this, then is it really important whether the SAT scores of this class are as high as the previous class? No. Our goal for the information is to convince people that looking at similar student performance data over time is the most important thing that we can do to achieve the types of improvement in student performance which we all desire.

Conclusion

At the presentation by Dr. Grissmer, one of the questions was how all parties involved can work together for the improvement of student performance.
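The "more than three times the national gain" claim above is a quick arithmetic check, assuming the combined national gain is simply the verbal and math gains added together:

```python
# National average junior-to-senior gain, per The College Board (2000):
national_gain = 10 + 12          # verbal + math = 22 combined points
class_2000_gain = 68             # Noblesville Class of 2000
class_1999_gain = 17             # Noblesville Class of 1999

ratio = class_2000_gain / national_gain  # ~3.09, i.e. "more than three times"
```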
To answer that question, we must agree on how we measure student performance. There is no disagreement about students needing greater skills to be successful in the future, but there is considerable criticism that the students graduating from Indiana high schools do not measure up to their counterparts from other states. If we do more to improve student performance and agree on how to measure it, then we will know whether we are seeing improvement or not. Furthermore, it is important that we expect all students to improve, not just some.

If students came to school with the same ability, parental support, motivation, and the other external factors which affect how they perform, then it would be fair to compare students as if they were the same and the only factor that made a difference was the school in which they were enrolled. However, seldom are all of those factors equal. As we talk about having Indiana student performance measure up and creating within individual students the kinds of skills that they need to be successful in the future, there are many things that must be done. The first thing that must be addressed is the setting of expectations. As we have tried to show you here, the expectations shouldn't focus on just wanting to be able to say, "my students are better than yours," but on showing whether students are truly improving over time. The only way this can happen will be for a change in thinking from rank ordering and comparing to looking at student improvement. If improvement becomes our measure of student success, then we believe you will see better results in the overall performance of all students.

References

Chase, Clinton. Elementary Statistical Procedures. New York: McGraw-Hill, 1976.
Grissmer, David. Improving Student Achievement: What State NAEP Test Scores Tell Us. Santa Monica, CA: RAND, 2000.
The College Board, http://www.collegeboard.org/press/senior00/html/table3.html
