Practice:
Evaluate outcomes to show your program is making a difference
Key Action:
Design the most rigorous evaluation possible
SAMPLE MATERIAL: Process for Selecting Comparison Schools
Purpose:
When you evaluate magnet program outcomes, selection of appropriate comparison schools is essential to producing credible data analysis. The screenshots, tables, and narrative in this report outline the process one district used to select comparison schools for an MSAP rigorous evaluation. The document illustrates the steps the district took to narrow the pool of potential comparison schools to select the best candidates, including matching by demographics and eliminating those with similar program elements, or “treatment.”
Source:
Northwest Suburban Integration School District, MN, is an interdistrict consortium that received Magnet Schools Assistance Program (MSAP) funds in 2004 and 2007. The sample material is from the district’s 2007 MSAP rigorous evaluation documentation.
Northwest Suburban Integration School District - Selection of Comparable Schools for Rigorous Evaluation
The selection of comparable schools was based on the closest demographic match to the Magnet Schools Assistance Program (MSAP) magnet schools being evaluated. The schools closest in total enrollment, percentage of black students, percentage of students receiving free or reduced-price lunch, and percentage of limited English proficient (LEP) students made up the first round of candidate comparable schools. Each proposed comparable school was then contacted to find out whether it offered programs, such as International Baccalaureate, or other treatments that would give it characteristics similar to those the magnet schools were implementing. Schools with similar treatments were eliminated, and the next closest school was checked until a final selection could be made.
Step 1. Gathering the demographic information
October 1 enrollment data for each year is available from the Minnesota Department of Education (MDE) website; the source of the data is the Minnesota Automated Reporting Student System (MARSS). The data downloads were saved as Excel files. The October 1 enrollment file includes data from every school in the state by grade, ethnicity, and gender, along with total enrollment for each school. A second file, which also covers every school in the state and can be downloaded as an Excel file, reports enrollments by special population (free lunch, reduced-price lunch, special education, and limited English proficient) by school and grade. The free lunch and reduced-price lunch counts were added together into a single free or reduced-price lunch figure.
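The district did this work in Excel, but the same combination step could be scripted. Below is a minimal sketch in Python with pandas, assuming the two MDE downloads have been saved locally; the file names and column names are placeholders for illustration, not the actual MARSS layout.

```python
import pandas as pd

# Placeholder file names for the two MDE downloads described above.
enrollment = pd.read_excel("oct1_enrollment_by_school.xlsx")       # grade, ethnicity, gender, school totals
special_pop = pd.read_excel("special_population_enrollment.xlsx")  # free lunch, reduced lunch, SpEd, LEP

# Add free lunch and reduced-price lunch together, as in Step 1
# (column names are assumptions).
special_pop["free_reduced"] = special_pop["free_lunch"] + special_pop["reduced_lunch"]
```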
Step 2. Preparing the data for evaluation
Both data files contain a detailed list of enrollments by grade and a summary list by school. The October 1 enrollment summary file contains over 2,000 records, one for every school in the state, and the detailed enrollment files contain over 11,000 records. These files were reduced to only the metro schools with potentially similar demographics. The download of AYP records, which contains over 2,600 records, was likewise narrowed to just the metro schools. Records for out-of-state schools and all charter schools were removed from every file. Formulas were added to the October 1 enrollment file to calculate minority and black percentages for the remaining records, and to the special population enrollment file to calculate the percentages of limited English proficient students and of students receiving free or reduced-price lunch.
The two Excel files were then merged into one record per school so that the key factors (total enrollment, percentage of black students, percentage of LEP students, and percentage of students receiving free or reduced-price lunch) could easily be sorted and compared for the evaluation.
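Continuing the earlier sketch, the filtering, percentage formulas, and merge described in Step 2 might look like the following in pandas. The district identifiers, column names, and the decision to compute the LEP and lunch percentages after merging are assumptions made for illustration.

```python
# Hypothetical district IDs for the metro-area schools kept in the analysis.
metro_district_ids = {"0279", "0281", "0286"}  # placeholder values

# Keep only metro, non-charter schools (column names are assumptions).
metro = enrollment[
    enrollment["district_id"].isin(metro_district_ids)
    & (enrollment["school_type"] != "charter")
].copy()

# Matching factors calculated from the enrollment file.
metro["pct_minority"] = metro["total_minority"] / metro["total_enrollment"]
metro["pct_black"] = metro["black"] / metro["total_enrollment"]

# Merge with the special-population file to get one record per school,
# then calculate the LEP and free/reduced-price lunch percentages.
merged = metro.merge(
    special_pop[["school_id", "lep", "free_reduced"]], on="school_id", how="inner"
)
merged["pct_lep"] = merged["lep"] / merged["total_enrollment"]
merged["pct_free_reduced"] = merged["free_reduced"] / merged["total_enrollment"]
```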
Step 3. Using the data to select comparable schools

October 1 Enrollment Data - MDE (school names are protected)

| School | Grade | Total Minority | Total # Students | % Total Minority | Black | % Black | LEP K-12 | SpEd K-12 | % LEP K-12 | % SpEd K-12 | Total Free/Reduced | % Free/Reduced |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SCHOOL A | All Grades | 708 | 2259 | 31.34% | 392 | 17.35% | 233 | 197 | 10.31% | 8.72% | 567 | 25.10% |
| SCHOOL B | All Grades | 276 | 884 | 31.22% | 166 | 18.78% | 77 | 115 | 8.71% | 13.01% | 321 | 36.31% |
| SCHOOL C | All Grades | 1714 | 2030 | 84.43% | 386 | 19.01% | 968 | 268 | 47.68% | 13.20% | 1660 | 81.77% |
| SCHOOL D | All Grades | 431 | 1316 | 32.75% | 267 | 20.29% | 61 | 188 | 4.64% | 14.29% | 295 | 22.42% |
| SCHOOL E | All Grades | 628 | 1718 | 36.55% | 398 | 23.17% | 99 | 174 | 5.76% | 10.13% | 459 | 26.72% |
| SCHOOL F | All Grades | 930 | 1978 | 47.02% | 492 | 24.87% | 135 | 206 | 6.83% | 10.41% | 724 | 36.60% |
| SCHOOL G | All Grades | 829 | 1437 | 57.69% | 366 | 25.47% | 314 | 231 | 21.85% | 16.08% | 743 | 51.70% |
| SCHOOL H | All Grades | 986 | 1476 | 66.80% | 420 | 28.46% | 419 | 206 | 28.39% | 13.96% | 909 | 61.59% |
| SCHOOL I | All Grades | 764 | 1420 | 53.80% | 411 | 28.94% | 213 | 146 | 15.00% | 10.28% | 650 | 45.77% |
| SCHOOL J | All Grades | 1264 | 1607 | 78.66% | 490 | 30.49% | 489 | 240 | 30.43% | 14.93% | 1229 | 76.48% |
| SCHOOL K | All Grades | 535 | 962 | 55.61% | 308 | 32.02% | 157 | 146 | 16.32% | 15.18% | 530 | 55.09% |
| SCHOOL L | All Grades | 1430 | 1495 | 95.65% | 482 | 32.24% | 743 | 204 | 49.70% | 13.65% | 1372 | 91.77% |
| SCHOOL M | All Grades | 1403 | 2134 | 65.75% | 697 | 32.66% | 443 | 179 | 20.76% | 8.39% | 1168 | 54.73% |
| SCHOOL N | All Grades | 979 | 1943 | 50.39% | 663 | 34.12% | 72 | 257 | 3.71% | 13.23% | 790 | 40.66% |
| SCHOOL O | All Grades | 1012 | 1522 | 66.49% | 542 | 35.61% | 195 | 139 | 12.81% | 9.13% | 743 | 48.82% |
| SCHOOL P | All Grades | 732 | 1111 | 65.89% | 410 | 36.90% | 147 | 166 | 13.23% | 14.94% | 569 | 51.22% |
| SCHOOL Q | All Grades | 533 | 724 | 73.62% | 301 | 41.57% | 160 | 83 | 22.10% | 11.46% | 504 | 69.61% |
| SCHOOL R | All Grades | 757 | 868 | 87.21% | 376 | 43.32% | 343 | 186 | 39.52% | 21.43% | 752 | 86.64% |
| SCHOOL S | All Grades | 367 | 506 | 72.53% | 223 | 44.07% | 79 | 104 | 15.61% | 20.55% | 398 | 78.66% |
| SCHOOL U | All Grades | 1034 | 1202 | 86.02% | 535 | 44.51% | 192 | 147 | 15.97% | 12.23% | 888 | 73.88% |
| SCHOOL V | All Grades | 957 | 1126 | 84.99% | 510 | 45.29% | 463 | 184 | 41.12% | 16.34% | 900 | 79.93% |
| SCHOOL W | All Grades | 810 | 1053 | 76.92% | 542 | 51.47% | 215 | 159 | 20.42% | 15.10% | 627 | 59.54% |
| SCHOOL X | All Grades | 899 | 1020 | 88.14% | 567 | 55.59% | 397 | 144 | 38.92% | 14.12% | 853 | 83.63% |
| SCHOOL Y | All Grades | 298 | 447 | 66.67% | 252 | 56.38% | 0 | 51 | 0.00% | 11.41% | 239 | 53.47% |
| SCHOOL Z | All Grades | 610 | 634 | 96.21% | 417 | 65.77% | 122 | 161 | 19.24% | 25.39% | 510 | 80.44% |
This chart includes the schools whose demographics were most similar to those of the MSAP grant schools. In the original document, the magnet schools are highlighted in green and the potential comparable schools in yellow.
The selection of comparable schools was narrowed to those closest in demographics to the magnet schools, based first on black population, second on free or reduced-price lunch population, and third on LEP population, also taking total school size into account when possible (a code sketch of this prioritized narrowing follows the chart below). A sample chart of the Adequate Yearly Progress (AYP) report from the Minnesota Department of Education follows:

2008 AYP Results for Schools, 8/1/2008

| School Name | Title 1 in 2009 | Free & Reduced Lunch Percentage | AYP Status |
|---|---|---|---|
| School XYZ (Equivalency School) | NO | 19 | Making AYP |
| School K | NO | 55 | Not Making AYP |
| School B | NO | 36 | Not Making AYP |
| School E | NO | 26 | Not Making AYP |
| School P | NO | 51 | Not Making AYP |
| School O | NO | 48 | Not Making AYP |
| School Q | NO | 69 | Not Making AYP |
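The priority order described above (black population first, then free or reduced-price lunch, then LEP, then school size) can be expressed as a ranking of the merged records against one magnet school. The sketch below continues the earlier pandas example; sorting candidates by their absolute differences from the magnet school in that priority order is one reasonable reading of the district's narrative, not its documented formula.

```python
def rank_candidates(candidates, magnet):
    """Rank potential comparison schools by closeness to one magnet school,
    prioritizing % black, then % free/reduced lunch, then % LEP, then enrollment."""
    ranked = candidates.copy()
    ranked["diff_black"] = (ranked["pct_black"] - magnet["pct_black"]).abs()
    ranked["diff_frl"] = (ranked["pct_free_reduced"] - magnet["pct_free_reduced"]).abs()
    ranked["diff_lep"] = (ranked["pct_lep"] - magnet["pct_lep"]).abs()
    ranked["diff_size"] = (ranked["total_enrollment"] - magnet["total_enrollment"]).abs()
    # Sorting on the differences in priority order approximates the narrative;
    # rounding the percentage differences first would enforce the priorities more strictly.
    return ranked.sort_values(["diff_black", "diff_frl", "diff_lep", "diff_size"])
```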
Step 4. Final selection of comparable schools
The final step was to examine each proposed school to determine whether it was implementing a treatment similar to those of the MSAP magnet schools. If a proposed school was using any similar treatment, it was not selected, even if its demographics were a closer match than those of the other potential comparable schools.
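Combining the ranking with this treatment check gives the full selection loop: walk down the ranked list and take the closest school that does not already run a similar program. In the sketch below, `has_similar_treatment` is a hypothetical stand-in for the district's phone inquiry about International Baccalaureate or other magnet-like treatments.

```python
def select_comparison_school(ranked, has_similar_treatment):
    """Return the closest-matching school without a magnet-like treatment."""
    for _, school in ranked.iterrows():
        if not has_similar_treatment(school["school_id"]):
            return school
    return None  # no eligible comparison school among the candidates


# Usage sketch: rank candidates against one magnet school, then pick the
# closest candidate that passes the treatment check.
# best_match = select_comparison_school(rank_candidates(merged, magnet_row), phone_check)
```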