High School Gets An ‘A’ Despite Falling Test Scores

Payson High School Principal Anna Van Zile says the AIMS test switching to multiple choice questions in 2012 took a toll on overall scores statewide.

Photo by Andy Towle.

All a parent wants to know about their child’s school is this: Does the school do a good job? Will my child learn what they need to succeed in life?

Arizona State Superintendent John Huppenthal decided to distill that desire into a simple letter grade, and through a series of well-intentioned calculations, Payson High School (PHS) received an “A.”

But oversimplification can sometimes muddle the message.

For instance, while the high school’s letter grade from the state reflects gains by the school’s weakest students, the average scores for the whole school have actually declined slightly in math and reading and sharply in writing. Only science scores have shown a small increase.

Test data for the last few years offers a deeper look at how well PHS students are doing on standardized tests of core academic subjects.

Students take two different tests: the AIMS (Arizona Instrument to Measure Standards), which measures performance against state standards, and the Stanford 10, which compares a student to peers across the nation.

In the last three years, PHS students have dropped their average AIMS scores in reading, math and writing, while their science scores have increased.

In those same years, PHS students have increased their math scores on the Stanford 10 test, while their reading and language scores have dropped.

Arizona’s AIMS test results

The AIMS test started out as a high-stakes graduation requirement, but has since evolved as a result of the federal No Child Left Behind legislation. The Legislature originally imposed the AIMS test in 1999 to ensure that all high school graduates had mastered basic academic skills. Initial high failure rates prompted years of fiddling with the test.

The authors of AIMS sought to capture a student’s understanding of math, reading, writing and science, according to Arizona standards.

Arizona standards evolved in 1996 when educators developed common goals for learning, then invited national experts and Arizona citizens to comment.

Since that time, standards for reading, math, writing, science, and social studies have continued to evolve, especially after the federal No Child Left Behind legislation made federal money available for states that established comprehensive testing systems.

In high school, the state only requires 10th-graders to take the AIMS test. Students who fail in their sophomore year can take it again each year until they pass — or even in the summer after their senior year as a last chance to earn a diploma.

In 2010, the average math score for sophomores through seniors was 503. By 2012, that had dropped 3 percent to 486.

In reading in 2010, PHS students received an average score of 721. In 2012, that dropped by 2 percent to 704.

Writing saw a startling 30 percent drop, from 707 in 2010 to 495 in 2012. PHS Principal Anna Van Zile believes that was the year the AIMS writing portion went from straight writing to a combination of writing and multiple choice questions.

“The multiple choice questions really took its toll on overall scores across the state,” she said. “The questions were written to the six traits of writing, as I recall. And a lot of scores dropped because of it.”

Yet, science scores rose 2 percent, going from 479 in 2010 to 489 by 2012.
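The percentage changes cited above follow directly from the raw averages reported in this article. A quick sketch of the arithmetic, using only the figures cited here:

```python
# Percent change between two average test scores, using the 2010 and
# 2012 AIMS averages reported in this article for Payson High School.
def percent_change(old, new):
    return (new - old) / old * 100

scores = {
    "math": (503, 486),
    "reading": (721, 704),
    "writing": (707, 495),
    "science": (479, 489),
}

for subject, (y2010, y2012) in scores.items():
    print(f"{subject}: {percent_change(y2010, y2012):+.1f}%")
# math: -3.4%
# reading: -2.4%
# writing: -30.0%
# science: +2.1%
```

The writing figure confirms the roughly 30 percent drop the article describes; the others round to the cited 2 to 3 percent swings.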

Parents want to understand how their child’s school compares to state and national standards. The AIMS test compares results only with other schools in Arizona.

Last year, 60 percent of students statewide passed the AIMS math test, compared to 50 percent at Payson High School.

In reading, 80 percent of state students passed compared to 70 percent at PHS.

At the state level, 70 percent passed writing, compared to 69 percent in Payson.

In science, 42 percent passed statewide, compared to 40 percent locally.

The Arizona Department of Education breaks down the scores and percentage passed for each grade in high school. Because Payson has so few juniors and seniors taking the test, their results did not measurably affect the overall results. For the purposes of this article, only the state sophomore scores were used, while all of the PHS student scores were listed.

National comparisons on Stanford 10

In contrast to the AIMS test, the Stanford test compares students on a national level. Developed in 1926 by professors at Stanford University, the test has stood the test of time, evolving into its 10th version (the Stanford 10) as of 2010.

On the Arizona Department of Education Web site, the ADE compares the average scores for an entire school to national percentiles. Principals receive reports on individual students’ scores.

Arizona decided students in grades two through nine would take the reading, math and language portions of the Stanford 10 test. Rather than having the third through eighth grade students take two tests, the state combined the Stanford 10 test with the AIMS test.

The national percentile ranking of Payson High School students has risen in math, but dropped in reading and language since 2010.

In 2012, Payson High School’s freshmen scored in the 75th percentile in math. In 2010 they scored in the 71st percentile.

Last year in reading, PHS freshmen scored in the 58th percentile compared to the 66th percentile in 2010.

And in language, in 2012 the students scored in the 48th percentile, compared to the 55th percentile in 2010.

School Letter Grade

Reconciling the high school’s “A” ranking with dropping AIMS and Stanford 10 scores offers a seeming contradiction.

However, the grades reflect an attempt by the Department of Education to take into account the various starting points of students in different districts.

Studies show that students who have a secure home, are well rested and have good nutrition do better on tests than students who do not.

Moreover, students with affluent, college-educated parents normally score better on standardized tests.

An estimated 25 percent of Payson’s students are defined as homeless. Each day these children have no guarantee they will have a place to sleep or a meal to eat — not the best circumstances to take a high-pressure test.

So, the state determined “growth points” for each school to show improvement over time. The analysis attempts to determine whether students in a particular school make progress, regardless of their starting point.

Growth points are calculated using up to six years of AIMS and Stanford 10 scores. Analysts then place each student in a statewide peer group based on the scores they received on reading and math, said Nick Bishop, the Interim Director of Accountability in the ADE research department.

For example, a student scoring an average of 430 in math is thrown into a group of other students her age with similar scores from around the state. Using an algorithm and the most recent scores as a control, the state determines how much the student has grown throughout her years of schooling by comparing her to her peers.
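The peer-comparison step can be sketched roughly as follows. The scores, peer group and percentile formula here are invented for illustration; the article does not describe the ADE’s actual algorithm in detail:

```python
# Hypothetical sketch of a growth percentile: a student's score gain is
# ranked against peers who started from a similar prior score.
# All numbers below are invented for illustration.
def growth_percentile(student_gain, peer_gains):
    """Percent of peers whose score gain this student exceeded."""
    below = sum(1 for g in peer_gains if g < student_gain)
    return 100 * below / len(peer_gains)

# Peers who, like our example student, previously averaged about 430 in math.
peer_gains = [5, 12, -3, 20, 8, 15, 0, 25, 10, 18]  # invented gains

print(growth_percentile(22, peer_gains))  # 90.0
```

A student whose gain beats 9 of 10 similar-scoring peers lands at the 90th growth percentile, regardless of where she started — which is the point of the state’s approach.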

The grade given each school is decided using 50 percent from the current year’s test scores, 25 percent from overall student growth and 25 percent from the growth rate of the lowest performing students.

In effect, the state counts the improvement of the bottom 25 percent of performers twice.
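That weighting can be written out directly. The component values below are hypothetical placeholders on a 0-100 scale, since the article does not report PHS’s actual component scores:

```python
# Composite letter-grade points: 50% current-year test scores, 25%
# overall student growth, 25% growth of the lowest-performing quarter.
# The inputs here are hypothetical placeholders, not PHS's real figures.
def composite_points(current, overall_growth, bottom25_growth):
    return 0.50 * current + 0.25 * overall_growth + 0.25 * bottom25_growth

print(composite_points(70, 60, 79))  # 0.5*70 + 0.25*60 + 0.25*79 = 69.75
```

Because the bottom quarter’s growth carries its own 25 percent weight on top of its share of overall growth, strong gains by the weakest students can lift a school’s grade even while whole-school averages slip.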

PHS showed its greatest improvement in the median growth percentile of its bottom 25 percent of students.

Former PHS principal Kathe Ketchem had the opportunity to explain the results to the teachers. “In the median growth score, 79 percent of our students grew more than other kids in the state,” she said.

Ketchem told the teachers she believed the implementation of close, in-depth reading skills, differentiated teaching, tracking kids who were failing, and offering tutoring with one-on-one attention before and after school made the difference.

“In-depth reading brought up comprehension 90 percent,” said Ketchem.

Differentiated teaching, according to Carol Ann Tomlinson, author of several books on the subject, allows teachers to tailor their lessons to the level of each student in their classroom rather than a “one-size-fits-all” approach.

Research shows some students learn more quickly than others, some learn visually, and some learn by hearing. Some do better demonstrating they have learned a subject through projects or presentations, while others excel on tests.

Differentiation forces teachers to reconsider how they teach, changing the curriculum to meet the needs of the student instead of making the student fit the curriculum. Ketchem praised the teachers for going the extra mile.

And that is the reason for testing, grading and comparing results: helping kids improve and prepare for life.

