A school should do more than just give a good education to its "average student." It should also educate those who can barely keep up rather than just writing them off. And it should continually challenge even the best students rather than forcing them into a standardized McStudent role. For many years Ipswich schools had a district-wide program that tried to serve the special needs of the high end. But the ELP program shrank over time and was eventually cut from the budget entirely as a district-wide program. There was some discussion of bringing back a similar program, christened SPARK.
How well do nearby school districts educate not just their typical students but the ones at the high end? The following chart gives a first, very rough idea of how well different school districts educate their exceptional students, with positions higher on the chart being better than lower positions. Different schools have very different "average students," and this difference easily swamps the differences in how well each school educates the students who are exceptional within it. The statistical procedures used here control for ("hide" or "remove") the average-student differences to expose how well each school handles its exceptional students.
The left two columns are based on results of the fourth grade MCAS test and serve as an indicator for the elementary schools. The middle three columns are based on results of the eighth grade MCAS test and serve as an indicator for the middle school. The right two columns are based on results of the tenth grade MCAS test and serve as an indicator for the high school.
What this chart shows is hopefully a little more meaningful than a simple count of "Advanced" MCAS scores. Simply counting Advanced scores would make schools with high overall averages look like they were doing a good job with exceptional students even if they in fact squeezed everyone into the same McGood mold. This chart sidesteps that problem by showing, for each grade and subject area, the ratio of students who actually scored Advanced to the number of students expected to score Advanced based on the average and distribution of scores. One way to think of this procedure is that each count of Advanced scores was "adjusted" according to that district's overall average score. Of course, trying to adjust for all the different factors that distinguish one community from another using a single number (the actual average score) is almost certain to cause problems.
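To make the ratio concrete, here is a minimal sketch in Python of the calculation for a single hypothetical district. The Advanced cutoff of 260 on the scaled-score scale and every other number below are illustrative assumptions, not figures from the actual analysis.

```python
from scipy.stats import norm

# Illustrative sketch of the "exceptional index" described above.
# All numbers are made up; 260 is assumed to be the scaled-score
# cutoff for an Advanced MCAS score.
district_mean = 236.0     # hypothetical district average scaled score
column_stdev = 18.5       # best-fit standard deviation for the whole column
students = 150            # hypothetical number of students tested
actual_advanced = 14      # hypothetical count of Advanced scores

# Expected Advanced count if the district's scores were normally
# distributed around its own average with the column-wide spread.
expected_advanced = students * norm.sf(260, loc=district_mean, scale=column_stdev)

exceptional_index = actual_advanced / expected_advanced
print(round(exceptional_index, 2))  # >1.0 means more Advanced scores than expected
```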
Numbers greater than 1.0 (higher on the chart) mean the school had even more Advanced scores than expected from its regular program, and so is apparently doing a good job of educating its best students. Numbers less than 1.0 (lower on the chart) mean the school had fewer Advanced scores than expected from its regular program. Note that this reasoning includes two assumptions, either of which may not be true: that the MCAS does a good job of measuring what exceptional students should know, and that behind many high MCAS scores are exceptional students.
This is the first quantitative measure we have of how well different school districts educate exceptional students, and as such it is better than nothing. Keep in mind, though, that both the statistics and the reasoning behind it are quite dubious, and the results are not nearly as meaningful as one might wish. They're published at all only because --as questionable as they are-- they're one of the very few quantitative pieces of information we have. See the methodology section below for a list of issues.
| Exceptional Index (ratio of actual Advanced to expected Advanced) | Elementary School English | Elementary School Math | Middle School English | Middle School Math | Middle School History & Social Studies | High School English | High School Math |
|---|---|---|---|---|---|---|---|
| >1.0 -- More | 4th grade | 4th grade | 8th grade | 8th grade | 8th grade | 10th grade | 10th grade |
| <1.0 -- Fewer | stdev 11.5 | stdev 18.5 | stdev 9.5 | stdev 21.5 | stdev 13.5 | stdev 19.5 | stdev 31 |
| 2.5 | | | | | | | |
| 2.4 | | | | | | | |
| 2.3 | Rockport | | | | | | |
| 2.2 | | | | | | | |
| 2.1 | | | | | | | |
| 2.0 | | | | | | | |
| 1.9 | Danvers | | | | | | |
| 1.8 | | | | | | | |
| 1.7 | Swampscott | | | | | | |
| 1.6 | | | | | | | |
| 1.5 | Gloucester, Manchester-Essex | | | | | | |
| 1.4 | | | | | | | |
| 1.3 | Triton | | | | | | |
| 1.2 | Ipswich | Rockport, Marblehead | Swampscott, Danvers | Andover | Manchester-Essex, Hamilton-Wenham, Masconomet | Newburyport | |
| 1.1 | Middleton, Swampscott | Newburyport, Swampscott, Hamilton-Wenham | Marblehead, Ipswich, Danvers | Manchester-Essex | Marblehead | Swampscott, Andover, Masconomet, Marblehead, Hamilton-Wenham | |
| ↑ 1.0 ↓ | Topsfield | Triton, Andover, Lynnfield, Middleton, Danvers | Gloucester, Rockport, Andover, Lynnfield | Hamilton-Wenham, Rockport | Gloucester, Lynnfield, Triton, Swampscott, Georgetown | Lynnfield, Georgetown, Manchester-Essex | |
| 0.9 | Andover, Marblehead, Lynnfield, Newburyport | Boxford | Manchester-Essex, Hamilton-Wenham, Georgetown, Triton, Marblehead | Triton, Georgetown, Manchester-Essex, Masconomet | Andover, Masconomet | Rockport | Ipswich, Danvers |
| 0.8 | Georgetown, Danvers | Topsfield | Swampscott, Lynnfield | Newburyport, Danvers, Andover, Ipswich | Triton, Rockport | | |
| 0.7 | Georgetown, Manchester-Essex, Gloucester | Newburyport, Ipswich | Newburyport | Lynnfield | | | |
| 0.6 | Hamilton-Wenham | Ipswich | Gloucester | Gloucester | | | |
| 0.5 | Boxford | Masconomet | | | | | |
| 0.4 | Rockport | Marblehead, Hamilton-Wenham | | | | | |
| 0.3 | | | | | | | |
| 0.2 | | | | | | | |
| 0.1 | | | | | | | |
| No Data | Georgetown, Gloucester, Ipswich, Newburyport, Triton | | | | | | |
Raw MCAS score summaries from the spring 2001 test cycle were obtained from the Massachusetts Department of Education. (More recent results are available.) For the districts included in each column, the number of students at each performance level (Advanced, Proficient, Needs Improvement, Warning/Failing) was totaled, and an overall average was calculated. The distribution of scores was assumed to be Gaussian/normal and centered on the overall average, and the best-fit standard deviation was found.
To find the overall standard deviation for a whole column, the number of students at the lower three performance levels was summed, and the standard deviation that most closely reproduced this sum was selected. Standard deviations were expected to be somewhere between 15 and 25; the actual best-fit standard deviations vary from less than 10 to more than 30. The headings for each column include the standard deviation that best fit the data in that column.
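The standard deviation fit can be sketched roughly as follows. This is only a guess at the procedure just described, assuming an Advanced cutoff of 260 on the scaled-score scale and made-up column totals; the real cutoffs and counts would come from the Department of Education summaries.

```python
from scipy.stats import norm

# Sketch of the column-wide standard deviation fit. The performance-level
# counts, overall mean, and the Advanced cutoff of 260 are assumptions
# for illustration only.
ADVANCED_CUTOFF = 260

# Hypothetical totals for one column (all districts combined).
total_counts = {"advanced": 210, "proficient": 820, "needs_improvement": 610, "warning": 260}
overall_mean = 235.0
total_students = sum(total_counts.values())
below_advanced = total_students - total_counts["advanced"]

# Scan candidate standard deviations and keep the one whose normal
# distribution (centered on the overall mean) best reproduces the
# observed number of students scoring below Advanced.
best_sigma, best_error = None, float("inf")
for tenths in range(50, 400):              # 5.0 to 39.9 in steps of 0.1
    sigma = tenths / 10.0
    predicted_below = total_students * norm.cdf(ADVANCED_CUTOFF, loc=overall_mean, scale=sigma)
    error = abs(predicted_below - below_advanced)
    if error < best_error:
        best_sigma, best_error = sigma, error

print(best_sigma)
```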
For each district, a Gaussian/normal distribution, with the mean set to that district's average score and the standard deviation set to the best fit for the whole column, was used to predict how many students could be expected to score at the top (Advanced) performance level. One way to think of this is "sliding" the distribution back and forth so its peak lines up with each district's average. Schools with high averages were thus expected to have many Advanced scores. The net effect was to "adjust" for the average score so districts with higher averages could be fairly compared to districts with lower averages.
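Continuing the sketch above, the per-district prediction step might look like the following, again with made-up district figures and the same assumed cutoff; `column_sigma` stands in for the best-fit value found for the column.

```python
from scipy.stats import norm

# Sketch of the per-district prediction step. District names, averages,
# student counts, and Advanced counts are hypothetical.
ADVANCED_CUTOFF = 260
column_sigma = 20.4  # assumed best-fit standard deviation for this column

districts = {
    # name: (average scaled score, students tested, actual Advanced count)
    "District A": (241.0, 120, 31),
    "District B": (228.0, 300, 22),
}

for name, (mean, students, actual_advanced) in districts.items():
    # "Slide" the column-wide distribution so its peak sits on this
    # district's average, then count the tail above the Advanced cutoff.
    expected_advanced = students * norm.sf(ADVANCED_CUTOFF, loc=mean, scale=column_sigma)
    index = actual_advanced / expected_advanced
    print(f"{name}: expected {expected_advanced:.1f} Advanced, index {index:.2f}")
```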
There are quite a few statistical problems with both the data and the procedure, including:
There are also possible reasoning problems, including: