Monday, March 28, 2005

California High School Exit Exams, and the Bee

Back to the California High School Exit Exam.
On Monday, March 21, 2005, the Sacramento Bee editorial writers took a position in favor of the California High School Exit Exam scheduled for implementation in 2006.
This blog raised some questions about that editorial position, including an analysis of the report the editorial writers claim to have used. Now, with more time, the exit exam can be examined further. (Please see the blog entry of Monday, March 21.)
The editorial writers argued, "Retain the Exit Exam." They say,
"Now the most recent independent evaluation reaches a different conclusion: 'Keep the exit exam requirement in place for the Class of 2006 and beyond.' Legislators should heed that recommendation, too."
This is interesting. It is fundamental to writing, and fundamental to passing a high school exit exam, that when you cite a source, you provide the details of the source so that readers can consult the original. This would be considered a ninth-grade English requirement.
Upon raising the question with the Bee editorial page editors, I found out that “All the independent evaluations of the exit exam can be found on the website of the California Department of Education at
Section 60855 of the Education Code requires an independent evaluation of the quality and impact of the exit exam. The California Department of Education awarded a five-year contract for this evaluation to the Human Resources Research Organization beginning Jan. 2000.” Pia Lopez, Associate Editor.
I encourage you to read the report itself.
The above report, which was apparently the source of the Bee editorial, is worth consideration for all of us struggling with these exams. A series of questions was raised in the earlier blog entry of March 21. The report cited above also raises a number of vital issues, including the possibility that the tests are focusing the curriculum on the tests themselves. The report notes that the tests have not increased dropout rates. However, it is not clear whether the report is using the dropout rates of the CDE or the more accurate data developed by the Harvard Civil Rights Project. The report calls for the development of more programs of remediation and support for those not passing the tests. Each of these is an important issue for teachers.
Here are a number of additional serious questions about this testing not raised in the state’s evaluation. First, it is rather clear that the testing movement has led to a distinct narrowing of the curriculum and an emphasis on teaching what is on the test. Algebra teachers, for example, might think this is good, since most students are now drilling and repeating algebra. But such testing has pushed out art, science, and much of the social sciences, particularly those social sciences considering multicultural issues and identity. (See Valenzuela, 2004)
As we develop a theory of testing in general, and of high school exit testing in particular during this era of business-led “reform,” it would be good to learn from studies already completed and from states that have been using such tests.
The Executive Summary of the report State High School Exit Exams: A Maturing Reform, cited by the Bee, says,
"Currently, more than half (52 percent) of all public school students and even more (55 percent) minority public school students live in states requiring they pass the tests in order to graduate. By 2009, 7 in 10 public school students and 8 in 10 minority students nationwide will be affected, according to State High School Exit Exams: A Maturing Reform, the third in a series of annual studies conducted by the Center.

The report finds that with the right conditions, exit exams “probably have some positive effects on student motivation and achievement,” but also may encourage some students to pursue a general education diploma (GED) instead of a regular diploma, and that the tests may be linked to increased dropout rates for key groups of students and states with tougher exam systems."
They say,
“The evidence on the effects of exit exams is mixed and tentative. With adequate supports and the right policy context, exit exams probably have some positive effects on student motivation and achievement and on curriculum and instruction, at least for some groups of students. But there is also enough evidence of negative effects of these exams, such as encouraging some students to pursue a GED instead of a diploma, to suggest that policymakers are making tradeoffs when they adopt exit exam requirements.”
State High School Exit Exams: A Maturing Reform (2004), Center on Education Policy.

So, this major study finds the evidence mixed and tentative. Well, if it is tentative, perhaps we should look at the results in some other states prior to implementation in California.
Here is a report from New York. A group working in New York, calling itself the Teachers Network, shares these experiences in a document yet to be published:

Let’s look at New York, where there is a long history of high school exit exams.

New York State public school students must pass five standardized Regents examinations — Global History, American Government and History, English Language Arts, the Mathematics Course A exam and one of the Science exams — to graduate high school. It is hard to imagine a system of examinations with higher stakes. Yet year after year, one reads in the newspaper about some mishap with yet another exam: Physics one year, Math the next, ELA the third.

What is clear is that the New York State Education Department is producing these high-stakes examinations on the cheap, without the necessary development, due diligence, and care. Exams are written without the input of a wide, representative range of teachers, and questions are not fully field tested. It is irresponsible, plain and simple, to provide flawed instruments for such purposes. Insofar as Regents Examinations are going to play a major role in decisions over whether a student will graduate from high school, they must fit the principles we have laid down here.

But that is only part of the problem. The Regents seem to have mistaken standardization for the setting of high standards. We support without reservation the setting of high standards for graduation from high school, but we also believe that there should be more than one way to demonstrate that one has met those standards. Specifically, we support the development of an option whereby a school and a student could substitute academically rigorous and accredited forms of performance-based assessment for three of the Regents examinations — Global History, American Government and History, and Science.

While the English Language Arts and Mathematics A examinations are largely skills-based examinations, the Social Studies and Science examinations are knowledge-intensive subject tests. Given the sheer breadth of the Social Studies curricula, examinations of this nature effectively prevent Social Studies teachers from examining any topic in depth, and instruction suffers as the course becomes a mile wide and an inch deep. If schools and students had the choice of substituting rigorous performance-based assessments for the Social Studies and Science examinations, instruction in these subject areas would improve. Instruction would drive assessment, as it should, instead of assessment driving instruction.
See the Standards for Educational and Psychological Testing, developed jointly by the American Psychological Association, the American Educational Research Association, and the National Council on Measurement in Education, and published in a second edition by the APA in 1999, and the Code of Fair Testing Practices in Education, prepared by the Joint Committee on Testing Practices, which includes the three above organizations as well as the American Counseling Association, the American Speech-Language-Hearing Association, the National Association of School Psychologists, and the National Association of Test Directors.
Among the responsibilities of test users such as school districts and other educational authorities laid out in the Code of Fair Testing Practices in Education, one finds “Avoid using a single test score as the sole determinant of decisions about test takers. Interpret test scores in conjunction with other information about individuals.” [Ibid., p. 9] And again, in the American Psychological Association’s Statement on “Appropriate Use of High-Stakes Testing in Our Nation's Schools”: “Any decision about a student's continued education, such as retention, tracking, or graduation, should not be based on the results of a single test, but should include other relevant and valid information.”
Code of Fair Testing Practices in Education, p. 9: “Avoid using tests for purposes other than those recommended by the test developer unless there is evidence to support the intended use or interpretation.”
In the language of psychometrics, exams used to test whether or not a student has met or surpassed a particular set of standards should be criterion-referenced, not norm-referenced, exams. This means that the student is tested against a fixed performance standard of what she should know and be able to do, rather than against where she stands in relation to other students, which is the primary measure on norm-referenced exams. [Norm-referenced tests are designed to distribute test takers along a normal curve and thus fail a target percentage, usually around 16%, regardless of how much they know, and pass another target percentage, usually around 84%, regardless of how little they know.]
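For readers curious where the bracketed note's figures of roughly 16% and 84% come from: they correspond to the share of a normal curve falling more than one standard deviation below the mean, and the share falling below one standard deviation above it. A minimal sketch, using only the Python standard library (the helper name `normal_cdf` is ours, not from any testing standard), checks the arithmetic:

```python
from math import erf, sqrt

def normal_cdf(z):
    # Cumulative distribution function of the standard normal distribution,
    # expressed via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2
    return 0.5 * (1 + erf(z / sqrt(2)))

# Fraction of test takers scoring more than one standard deviation below the mean
below = normal_cdf(-1)   # about 0.159, i.e. roughly 16%
# Fraction scoring below one standard deviation above the mean
above = normal_cdf(1)    # about 0.841, i.e. roughly 84%
print(round(below * 100), round(above * 100))  # prints: 16 84
```

The point of the sketch is only that these cutoffs are properties of the curve, not of what any individual student knows, which is the note's complaint about norm-referenced grading.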
AERA Research Points: Essential Information for Education Policy, “Standards and Tests: Keeping Them Aligned,” Vol. 1, No. 1 (Spring 2003).
So, while the Sacramento Bee editorial board finds the issue clear, those of us working in schools find it much more complex. Readers are encouraged to use this blog to share their experiences with this testing.
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License.