Education Reporter
NUMBER 236 THE NEWSPAPER OF EDUCATION RIGHTS SEPTEMBER 2005

How Schools Cheat
On March 17, 2005, 15-year-old Delusa Allen was shot in the head while leaving Locke High School in Los Angeles; she was taken to intensive care and later died of her wounds. Four months before that, several kids were injured in a riot at the same school, and last year the district had to settle a lawsuit brought by a student who required eye surgery after he was beaten there. In 2000, 17-year-old Deangelo Anderson was shot just across the street from Locke; he lay dead on the sidewalk for hours before the coroner came to collect his body.

Violent crime is common at Locke. According to the Los Angeles Police Department, in the 2003-04 school year its students suffered three sex offenses, 17 robberies, 25 batteries and 11 assaults with a deadly weapon. And that's actually an improvement over some past years: In 2000-01 the school had 13 sex offenses, 43 robberies, 57 batteries, and 19 assaults with a deadly weapon.

Sounds unsafe, doesn't it? Not in the skewed world of official education statistics. Under the federal No Child Left Behind Act, states are supposed to designate hazardous schools as "persistently dangerous" and allow their students to transfer to safer institutions. But despite Locke's grim record, the state didn't think it qualified for the label.

Locke is not unique. In the 2003-04 school year only 26 of the nation's 91,000 public schools were labeled persistently dangerous. Forty-seven states and the District of Columbia proudly reported that they were home to not a single unsafe school. That would be news to the parents of James Richardson, a 17-year-old football player at Ballou Senior High in Southeast Washington, D.C., who was shot inside the school that very year. It would be news to quite a few people: The D.C. Office of the Inspector General reports that during that school year there were more than 1,700 "serious security incidents" in city schools, including 464 weapons offenses.

Most American schools are fairly safe, it's true, and the overall risk of being killed in one is less than one in 1.7 million. The data show a general decline in violence in American public schools: The National Center for Education Statistics' 2004 Indicators of School Crime and Safety shows that the crime victimization rate has been cut in half, declining from 48 violent victimizations per 1,000 students in 1992 to 24 in 2002, the last year for which there are complete statistics.

But that doesn't mean there has been a decline at every school. Most of the violence is concentrated in a few institutions. According to the National Center for Education Statistics, during the 1999-2000 school year 2% of U.S. schools (1,850) accounted for about 50% of serious violent incidents, and 7% of public schools (5,400) accounted for 75% of them. The "persistently dangerous" label exists to identify such institutions.

So why are only 26 schools in the country tagged with it?

The underreporting of dangerous schools is only a subset of a larger problem. The amount of information about schools presented to the general public is at an all-time high, but the information isn't always useful or accurate.

Thanks to the No Child Left Behind Act, now three years old, parents are seeing more and more data about school performance. Each school now has to issue an annual report card, with assessment results broken down by poverty, race, ethnicity, disability, and English-language proficiency. Schools also are supposed to accurately and completely report dropout rates and teacher qualifications. The quest for more and better information about school performance has been used as a justification to increase education spending at the local, state and national levels, with the federal Department of Education alone jacking up spending to nearly $60 billion for fiscal year 2005, up more than $7 billion since 2003.

But while federal and state legislators congratulate themselves for their newfound focus on school accountability, scant attention is being paid to the quality of the data they're using. Whether the topic is violence, test scores or dropout rates, school officials have found myriad methods to paint a prettier picture of their performance. These distortions hide the extent of schools' failures, deceive taxpayers about what our ever-increasing education budgets are buying, and keep kids locked in failing institutions. Meanwhile, Washington — which has set national standards requiring 100% of school children to reach proficiency in math and reading by 2014 — has been complicit in letting states avoid sanctions by fiddling with their definitions of proficiency.

Prospering cheaters 
Under No Child Left Behind, if schools fail to make adequate yearly progress on state tests for three consecutive years, students can use federal funds to transfer to higher-performing public or private schools, or to obtain supplemental education services from providers of their choice. In addition, schools that fail for four to five consecutive years may face state takeovers, have their staffs replaced, or be bid out to private management.

Wesley Elementary in Houston isn't a school you'd expect to be worried about those threats. From 1994 to 2003, Wesley won national accolades for teaching a majority of its low-income students how to read. Oprah Winfrey once featured it in a special segment on schools that "defy the odds," and in 2002 the Broad Foundation awarded the Houston Independent School District a $1 million prize for being the best urban school district in America, largely based on the performance of schools like Wesley.

It turned out that Oprah was more right than she realized: Wesley was defying the odds. A December 31, 2004 exposé by the Dallas Morning News found that in 2003 Wesley's 5th-graders performed in the top 10% in the state on the Texas Assessment of Knowledge and Skills (TAKS) reading exams. The very next year, as 6th-graders at Houston's M.C. Williams Middle School, the same students fell to the bottom 10%.

The newspaper obtained raw testing data for 7,700 Texas public schools for 2003 and 2004. It found severe statistical anomalies in nearly 400 of them. The Houston, Dallas and Fort Worth districts are now investigating dozens of their schools for possible cheating on the TAKS test. Fort Worth's most suspicious case was at A.M. Pate Elementary. In 2004, Pate 5th-graders finished in the top 5% of Texas students. In 2003, when those same students were 4th-graders, they had finished in the bottom 3%.

In the Winter 2004 issue of Education Next, University of Chicago economist Steven D. Levitt and Brian A. Jacob of Harvard's Kennedy School of Government explored the prevalence of cheating in public schools. Using data on test scores and student records from the Chicago public schools, Jacob and Levitt developed a statistical algorithm to identify classrooms where cheating was suspected. Jacob and Levitt's analysis looked for unexpected fluctuations in students' test scores and unusual patterns of answers for students within a classroom that might indicate skullduggery.
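The full Jacob-Levitt algorithm is more sophisticated, but its core idea, flagging classrooms whose average score jumps far above the norm one year and then falls back the next, can be sketched in a few lines of Python. Everything below (the function name, the cutoff of 1.5 standard deviations, the sample data) is hypothetical; it illustrates the kind of fluctuation the researchers look for rather than reproducing their actual method.

    # Rough sketch of the score-fluctuation test described above.
    # All names, numbers, and thresholds here are hypothetical.
    from statistics import mean, pstdev

    def flag_suspicious_classrooms(gains_by_classroom, threshold=1.5):
        """gains_by_classroom maps a classroom id to a pair of average
        score gains: (gain this year, gain the following year)."""
        this_year = [gains[0] for gains in gains_by_classroom.values()]
        mu, sigma = mean(this_year), pstdev(this_year)

        flagged = []
        for room, (gain_now, gain_next) in gains_by_classroom.items():
            z = (gain_now - mu) / sigma if sigma else 0.0
            # A spike well above the norm that evaporates the next year is
            # the pattern Jacob and Levitt associate with doctored scores.
            if z > threshold and gain_next < 0:
                flagged.append(room)
        return flagged

    # Hypothetical data: classroom "5B" gains 30 points, then loses 12.
    gains = {"5A": (4, 3), "5B": (30, -12), "5C": (2, 5), "5D": (6, 1), "5E": (3, 2)}
    print(flag_suspicious_classrooms(gains))   # prints ['5B']

Their second test, not sketched here, looks for the suspicious blocks of identical answers within a classroom mentioned above.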

They found that on any given test the scores of students in 3% to 6% of classrooms are doctored by teachers or administrators. They also found some evidence of a correlation of cheating within schools, suggesting some centralized effort by a counselor, test coordinator or principal. Jacob and Levitt argue that with the implementation of the No Child Left Behind Act, the incentives for teachers and administrators to manipulate the results from high-stakes tests will increase as schools begin to feel the consequences of low scores.

Texas' widespread cheating likely was a response both to high-stakes testing and to financial incentives for raising test scores. The Houston school district, for example, spends more than $7 million a year on performance bonuses that are largely tied to test scores. Those bonuses include up to $800 for teachers, $5,000 for principals, and $20,000 for higher-level administrators.

Texas is not the only state where schools have cheated on standardized tests. Teachers provided testing materials to students nearly a dozen times in 2003 in Nevada, for example. And Indiana has seen a raft of problems, including three Gary schools that were stripped of their accreditation in 2002 after hundreds of 10th-graders received answers for the Indiana Statewide Testing for Education Progress-Plus in advance. A teacher in Fort Wayne took a somewhat subtler approach in 2004, when school officials had to throw out her 3rd-grade class's scores after she gave away answers by emphasizing certain words on oral test questions. In January 2005 another Fort Wayne 3rd-grade teacher was suspended for tapping children on the shoulder to indicate a wrong answer.

Phantom dropouts 
If you want to make a school's performance look more impressive than it really is, you don't have to abet cheating on standardized tests. Instead you can misrepresent the dropout rate.

In 2003 the New York Times described an egregious example of this scam in Houston. Jerroll Tyler was severely truant from Houston's Sharpstown High School. When he showed up to take a math exam required for graduation, he was told he was no longer enrolled. He never returned.

So Tyler was surprised to learn, when the state audited his high school, that Sharpstown High had zero dropouts in 2002. According to the state audit of Houston's dropout data, Sharpstown reported that Tyler had enrolled in a charter school — an institution he had never visited, much less attended. The 2003 state audit of the Houston district examined records from 16 middle and high schools, and found that more than half of the 5,500 students who left in the 2002 school year should have been declared dropouts but were not.

The Manhattan Institute's Jay P. Greene argues, in his 2004 paper "Public School Graduation Rates in the United States," that "this problem is neither recent nor confined to the Houston school district... Official graduation rates going back many years have been highly misleading in New York City, Dallas, the state of California, the state of Washington, several Ohio school districts, and many other jurisdictions." Administrators, he explains, have strong incentives to count students who leave as anything other than dropouts. Next to test scores, graduation rates are an important measure of a school's performance: If parents and policy makers believe a school is producing a high number of graduates, they may not think reform is necessary. Greene writes that "when information on a student is ambiguous or missing, school and government officials are inclined to say that students moved away rather than say that they dropped out."

Greene and his associates have devised a more accurate method for calculating graduation rates. Simplifying a bit, it essentially counts the number of students enrolled in the 9th grade in a particular school or jurisdiction, makes adjustments for changes in the student population, and then counts the number of diplomas awarded when those same students leave high school. The percentage of original students who receive a diploma is the true graduation rate. Using Greene's methodology, the national high school graduation rate for 2002 was 71%.
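The arithmetic behind that cohort approach is simple enough to sketch in code. The Python function below is a minimal sketch: the figures are hypothetical, and a single catch-all adjustment factor stands in for Greene's more careful enrollment adjustments.

    # Sketch of a cohort graduation-rate calculation in the spirit of Greene's
    # method. The adjustment factor and the figures below are hypothetical;
    # the real method adjusts the cohort using enrollment data across grades.
    def cohort_graduation_rate(ninth_grade_enrollment, diplomas_awarded,
                               population_change_factor=1.0):
        """Share of an entering 9th-grade cohort that earns a diploma.
        population_change_factor rescales the cohort for net migration,
        e.g. 0.95 if the student population shrank 5% over four years."""
        adjusted_cohort = ninth_grade_enrollment * population_change_factor
        return diplomas_awarded / adjusted_cohort

    # Hypothetical district: 1,000 entering freshmen, 650 diplomas four years
    # later, student population essentially unchanged.
    print(round(cohort_graduation_rate(1000, 650), 2))   # prints 0.65

Counted this way, the kind of attrition Sharpstown's former assistant principal describes below would be hard to disguise as anything but dropouts.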

As Sharpstown High School's former assistant principal, Robert Kimball, told the New York Times, "We go from 1,000 Freshman [sic] to less than 300 Seniors with no dropouts. Amazing!"

The problem isn't limited to Texas. In March researchers at Harvard's Civil Rights Project released an analysis of state graduation rates for 2002, in which they derived their figures by counting the number of students who move from one grade to the next and then on to graduation. The report found serious discrepancies between the rates calculated by the Civil Rights Project and those offered by education departments in all 50 states. In California, for example, the state reported an 83% graduation rate, but the Harvard report found that only 71% of students made it through high school.

The Civil Rights Project's paper also found a high dropout rate among minorities, which California officials hide behind state averages. Almost half of the Latino and African-American students who should have graduated from California high schools in 2002 failed to complete their education. In the Los Angeles Unified School District, just 39% of Latinos and 47% of African-Americans graduated, compared with 67% of whites and 77% of Asians.

Moving goalposts on proficiency 
A subtler way to distort data is to report rising test scores when the real change is that more low-performing students have been excluded from taking the test. One egregious example of this practice took place in Florida, which grades schools from F to A based on their standardized test scores. Oak Ridge High School in Orlando boosted its grade from an F to a D in 2004 after purging its attendance rolls of 126 low-performing students.

The students were cut from school enrollment records without their parents' permission, a violation of state law. According to the Orlando Sentinel, about three-quarters of the students had at least one F in their classes, and 80% were 9th- or 10th-graders — a key group, because Florida counts only the scores of freshmen and sophomores for school grades. More than half of the students returned to Oak Ridge a few weeks after state testing.

The Sentinel also reported that in 2004 some 160 Florida schools assigned students to new schools just before standardized testing in a shell game to raise school grades. In Polk County, for example, 70% of the students who were reassigned to new schools scored poorly on Florida's Comprehensive Assessment Test, suggesting they were moved to avoid giving their old schools a bad grade.

Florida is not alone. Houston's Austin High used a strategy of holding back low-scoring 9th-graders and then promoting them directly to 11th grade to avoid the 10th-grade math exam.

States are also excluding a higher percentage of disabled students and students for whom English is a second language. And states often report that their test scores are going up when they've merely dumbed down their standards, either by lowering the percentage of correct responses needed to be labeled "proficient" or by changing the content of the tests to make them easier.

Lisa Snell is director of the Reason Foundation's education program. This article was originally published in Reason magazine (June 2005) and is reprinted with permission, in slightly abridged form.


 