NY Times - October 5, 2005
One Secret to Better Test Scores: Make State Reading Tests Easier
By MICHAEL WINERIP
PARENTS are delighted when state test scores go up. Obviously, their children are getting smarter and the teachers are doing better. Politicians are ecstatic; their school reforms must be working. Indeed, during his re-election campaign, Mayor Michael R. Bloomberg has repeatedly cited the rise in the city's 2005 fourth-grade test results (up 10 percentage points in English to 59 percent at grade level, and up 9 points in math to 77 percent) as proof that his school programs are a success. "Amazing results," he said, that "should put a smile on the face of everybody in the city."
However, those in the trenches, the teachers and principals, tend to view the scores differently. While they would rather be cheered than booed, they know how much is out of their control.
Take Frances Rosenstein, a respected veteran principal of Public School 159 in the Bronx. Ms. Rosenstein has every right to brag about her school's 2005 test scores. The percentage of her fourth graders who were at grade level in English was 40 points higher than in 2004.
How did she do it? New teachers? No, same teachers. New curriculum? No, same dual-language curriculum for a student body that is 96 percent Hispanic and poor (100 percent free lunches). New resources? Same.
So? "The state test was easier," she said. Ms. Rosenstein, who has been the principal for 13 years and began teaching in 1974, says the 2005 state English test was unusually easy and the 2004 test unusually hard. "I knew it the minute I opened the test booklets," she said.
The first reading excerpt in the 2004 test was 451 words. It was about a family traveling west on the Oregon Trail. There were six characters to keep track of (Levi, Austin, Pa, Mr. Morrison, Miss Amelia, Mr. Ezra Zikes). The story was written in 1850's western vernacular with phrases like "I reckon," "cut out the oxen from the herd," "check over the running gear" for the oxen, "set the stock to graze," "Pa's claim."
Ms. Rosenstein said such language was devastating for her urban Hispanic children. "They're talking about a 'train' and they mean wagon train," she said. "Our kids know the subway. I walked into a class and there was a girl crying. I took the test booklet and read it. I thought, 'Oh, my God, we're in trouble.' "
In contrast, the first reading in the 2005 test was 188 words about a day in the life of an otter. A typical sentence: "The river otter is a great swimmer." Ms. Rosenstein said: "The otter story was so easy, it gave our kids confidence. It was a great way for them to start the test."
She said the pattern continued throughout the two tests. In 2004, on the "hard test," the second passage was about the Netherlands thanking Canada for its support during World War II by sending 100,000 tulip bulbs to Ottawa. The third story was about a photographer, Joel Sartore, who embedded himself in Madidi National Park in Bolivia to get rare nature shots.
"These were very sophisticated pieces," Ms. Rosenstein said. "We teach our kids when reading to make a connection to themselves. These stories were foreign to their experience. You didn't have anything like this on the 2005 test."
In 2005, on the "easy test," the second passage was about hummingbirds. The third was about a boy who thought he won a real horse, but it was a china horse. The story was told mainly in dialogue that read like the old Dick and Jane primers:
" 'What's going on?' asked Beth.
'I just won a horse,' said Jamie."
"What a difference from the 2004 test," Ms. Rosenstein said. "I was so happy for the kids - they felt good after they took the 2005 test."
In an e-mail message, Jonathan Burman, a state education spokesman, said there was no cultural bias on the 2004 test. He said the 2004 and 2005 tests were extensively field-tested. "We found that the passages could be understood by all students, including urban students," he wrote.
He acknowledged that the 2004 test was harder but said the state compensated by using a tougher scale to score the 2005 test. "Students had to answer a few more questions correctly in 2005 and get more raw points in order to get the same scaled score as in 2004," he said. But even with the tougher scaling, scores still soared statewide, with 70.4 percent at grade level, up 8.2 percentage points from 2004, and with several cities - Yonkers, Syracuse, Rochester - posting increases even higher than New York City's.
Ms. Rosenstein does not believe the scaling made the two tests equivalent. "If a child can't follow the passages, a few points won't make a difference," she said. "They give up."
P.S. 159 has just 242 students from kindergarten to fifth grade, with 28 fourth graders taking the state test in a typical year. As a result, the performance of a handful of students can cause a big scoring swing. P.S. 159's test results followed the ups and downs statewide; they were just amplified. For example, on the 2004 "hard test," 62.2 percent of students statewide scored at grade level, down 2 points from 2003. At P.S. 159, 17.9 percent were at grade level, down 46 points from 2003.
BUT at a small school it is easier to examine the variables at play. For example, in all three years, as scores fluctuated, Yehonela Ortiz taught fourth grade. Her principal called her an outstanding teacher, a nine-year veteran who is bilingual.
Ms. Ortiz said she could not take credit for the big jump this year nor the blame for last year's big drop. "So many things go into it," she said. "They've had a lot of teachers since pre-K. I feel it's a collaboration of all the many teachers since."
A few years ago, 64 percent of her fourth graders scored at grade level in English, her best results. "It wasn't me," she said. It was a class that happened to have a large number of Hispanic parents speaking English at home. "They came to me more academic. I don't think it was anything we did."
She said that there were yearly fluctuations, but that test scores would generally rise over time because the state has been using the same format for seven years.
"We know the test now," Ms. Ortiz said. "We start preparing them in September. When I go through a lesson, I always connect it to what's in the exam. We know there's always letter-writing, so we give more of that. We know there's nonfiction, so we make sure we do it before the test." When she gives a writing assignment, she now sets a timer for 10 minutes, to simulate testing conditions.
Does it mean students are getting smarter and teachers better?
"I don't know," said Ms. Ortiz.
E-mail: edmike@nytimes.com