Development of Evaluation Tools for GIS: How Does GIS Affect Student Learning?

Sophia Linn (Fort Collins, Colorado, USA), Joseph Kerski (US Geological Survey, Denver, Colorado, USA) and Sarah (Sally) Wither (Orton Family Foundation Community Mapping Program, Steamboat Springs, Colorado, USA)

Although Geographic Information Systems (GIS) are viewed as increasingly valuable for educational use, few tools have been developed to assess the effects of GIS on student learning (Barstow et al., 1994; Kerski, 2003). A joint project of the Colorado Geographic Alliance (COGA) and the USGS focuses on the development of evaluation tools to determine what, if anything, students gain from using GIS in a classroom setting. Anecdotal evidence suggests that there are tangible benefits for students – whether in content understanding or in attitudes – but tools must be created to measure such effects. Teachers, principals, and other district administrators need to see measurable benefits from new technologies in order to determine whether investments in equipment and staff training are justified. How can results be measured unless there are reliable tools to measure them?

A grant from the Colorado Geography Education Fund provided support to initiate the development of such tools. As part of the grant, scholarships were awarded to 10 teachers to participate in a week-long, intensive GIS institute for educators during the summer of 2002. In addition to providing support for introductory training, the grant supported a follow-up session in which these teachers, along with others who had used GIS in their classrooms, gathered to reflect upon their observations of students using GIS in their classes. Seventeen geography and science teachers were guided in their discussion by an external evaluator from the University of Northern Colorado in an effort to develop evaluation instruments that would capture what students gain from their use of GIS.
A lively and sometimes impassioned discussion ensued that resulted in a rather substantive list of areas in which teachers ‘suspect’ students are being impacted by the technology. The group worked through a structured process in order to build a model of their collective thoughts, with the goal of describing and quantifying the end result of using GIS in the classroom. Throughout the discussion, participants reflected on their own classroom experiences while being guided and focused by
1038-2046/04/03 0000-00 $20.00/0 International Research in Geographical and Environmental Education
Forum
© 2005 S. Linn et al. Vol. 14, No. 3, 2005
Table 1 Quantitative rubric

Questions
3: Student asks geographic questions, such as 'where?' and 'why there?' on his/her own. Questions are related to the problem. Able to revise questions as new data comes in. Research questions are appropriate to the tools, time, and task.
2: Student asks geographic questions, such as 'where?' and 'why there?' with some adult prompting. Questions are usually related to the problem. The student is sometimes able to revise questions as new data comes in. Questions are appropriate to the tools, time, and task.
1: Student formulates questions only with teacher help. Questions are not revised to connect with new information. Questions are either too broad, too narrow, or irrelevant to the problem.

Information
3: Student determines data needed to answer the question. Student selects or creates primary, secondary, and tertiary sources of data appropriately. Student is able to justify data collection methods and the data chosen to be collected. Student observes and systematically records information. Student reads and interprets maps, and conducts interviews as needed to collect data.
2: Student determines data needed to answer the question with adult assistance. Student selects or creates primary, secondary, and tertiary sources of data appropriately. Student is able to explain data collection methods and the data chosen to be collected. Student usually observes and systematically records information. Student reads and interprets maps, and conducts interviews as needed to collect data with some adult help.
1: Student may be able to tell what information was gathered but does not understand how it relates to the research question. Student does not understand how the data were gathered. Student's data were neither systematically collected nor recorded. Student is unable to read or interpret geographic information.

Organising Data
3: Student organises and displays geographic data in ways that help with the analysis and interpretation of it. Organising the data may include using and developing maps, graphs, tables, and timelines with available tools.
2: Student organises and displays geographic data in ways that help with the analysis and interpretation of it, with some adult help. Organising the data may include using and developing maps, graphs, tables, and timelines with available tools.
1: Student is not able to organise geographic information in a way that contributes to analysis and interpretation. Student is not able to develop maps, graphs, tables or timelines with or without tools.

Interpretation and Analysis
3: Student is able to see patterns, relationships and connections among the collected and organised data. Student is able to describe the data and ask alternative questions about it. Student notes associations and similarities between areas, recognises patterns in the data, and draws inferences. Student uses simple statistics to identify trends, sequences, correlations, and relationships.
2: Student is able to see some of the patterns, relationships and connections among the collected and organised data. Student is usually able to describe the data and ask alternative questions about it. Student notes associations and similarities between areas, recognises patterns in the data, and draws inferences. Student may use simple statistics to identify trends, sequences, correlations, and relationships.
1: Student is unable to see patterns, relationships or connections among collected and well organised data. Student is not able to describe the data nor ask alternative questions. Student does not use simple statistics to identify trends, sequences, correlations, and relationships.

Conclusions or Solutions
3: Student formulates explanations or proposes solutions based on the data. Student is able to identify biases or flaws in the data and/or conclusion. Student supports conclusions with evidence in map form, table form, graphic forms, and timelines. Student connects current knowledge to new conclusions or solutions. Student makes inferences based on the information in geographic form.
2: Student can usually formulate explanations or propose solutions based on the data. Student is sometimes able to identify biases or flaws in the data and/or conclusion. Student usually supports conclusions with evidence in map form, table form, graphic forms, and timelines. Student can sometimes connect current knowledge to new conclusions or solutions. Student can sometimes make inferences based on the information in geographic form.
1: Student is unable to formulate explanations or solutions using the data. Student is unaware of biases or flaws in the data or the conclusion. Student is unable to support conclusions with data from maps, tables, graphs, or statistics. Student is not able to make connections between the new information and present information or knowledge. Student cannot make inferences based on the information in geographic form.
Table 1 (contd) Quantitative rubric

Presentation
3: Student uses maps, tables, graphs, timelines and statistics to support the conclusion or solution. Visuals are large enough to be read easily, simple and easy to understand, and pleasantly designed. If GIS is used, the presentation runs smoothly, the maps are clear, and queries are done quickly and efficiently. Presentation employs but does not depend upon the technology to support the conclusion. The student is able to continue with the presentation even if the technology fails. The argument is supported by the presentation of the question, evidence, addressing of faulty reasoning, poor data, explanation of data collection methods, and analysis and interpretation of data.
2: Student may use maps, tables, graphs, timelines and statistics to support the conclusion or solution. Usually, the visuals are large enough to be read easily, simple and easy to understand, and pleasantly designed. If GIS is used, the presentation usually runs smoothly, the maps are mostly clear, and queries can be done. Presentation may depend upon the technology to support the conclusion. The student is able to continue with the presentation, with difficulty, if the technology fails. The argument includes summaries of the question, evidence, addressing of faulty reasoning, poor data, explanation of data collection methods, and analysis and interpretation of data.
1: Student's presentation does not include maps, tables, graphs, timelines, or statistics to support the conclusions or solution. Visuals are absent or are poorly constructed. Student tends to depend on the technology (GIS) to make the presentation; if it fails, the student is unable to proceed. The argument is not supported by a presentation of the research process.
the facilitator. They discussed the utility of GIS and stressed that, as a classroom tool, GIS must not make teaching more cumbersome. Teachers were prompted to ask themselves: 'Does GIS facilitate or improve something? Does it make teaching more effective or efficient?' Teachers observed that GIS makes lessons more student-centred, that students using GIS use higher-order thinking skills, abstract thinking, inference, and prediction to a greater degree, and that they become more critical about what they are learning. The evaluator then asked the teachers, 'What does it look like when students do these things?' The teachers responded that the students are interpreting data, maps, and graphs, observing patterns at multiple scales, questioning the data, looking at issues holistically, and transferring their conceptual knowledge from local to global scales.

Next, the group examined whether the students could do these things without GIS. The general response was that while students can do many of these things with paper maps and tables, GIS enabled students to acquire data and create and alter maps more quickly. For example, instead of spending 80% of the time on data acquisition and representation and 20% on analysis and interpretation, the percentages were switched when using GIS. With GIS, students can spend the bulk of project time analysing data, asking questions, observing patterns, linkages, and trends, and making connections.

As the discussion continued, it became clear that the effects of GIS on students could be divided into three general categories: quantitative, as measured by a pre-determined rubric; qualitative, as observed in student behaviour; and student perceptions, as observed in student reflections through journals and interest surveys.
To measure the quantitative effect of GIS on students, a rubric was created that assesses the questions students pose, the information they gather, the way they organise data, how they interpret or analyse the data, how they come to their conclusions or solutions, and how they present their results (Table 1). To assess
Table 2 Student engagement chart

Rating scale: Very Seldom (66%) | Seldom (50%) | Less than normal (33%) | Normal (0) | More than normal (33%) | Frequently (50%) | Very Frequently (66%)

Students are . . .
- Making comments or asking questions about the project whenever they see you (hallways, before/after school, etc.)
- Asking relevant questions in class
- Working on their own
- Helping each other
- Asking relevant questions of each other
- Integrating new material with previous knowledge
- Monitoring their own work
- Solving task-related problems
- Discussing project/problem issues with you (teacher)
- Discussing project/problem issues with peers
- Coming to class
- Coming to class ready to work
- Staying on task during the class time on their own
- Taking risks
- Initiating action when the opportunity is available
- Suggesting new ideas and methods to answer questions/issues
- Coming late to class
- Staying after class to work on projects
- Following directions
- Passive
- Giving up easily
- Not trying hard
- Bored
- Interested

Compared to your regular class times with this group ('Normal' column), rate the classroom atmosphere on the listed student engagement indicators on the continuum from Very Seldom to Very Frequently during the time you are using GIS in the classroom.
the impact of GIS in the classroom, a quasi-experimental design is needed where students using GIS are compared to students using traditional methods and
tools. Kerski (2003) used this model to examine students who used GIS to study the regional geography of Africa, patterns of global oil production, consumption, and reserves, and demographic change in local neighbourhoods. Half of the students used GIS, and the other half investigated these same topics using paper maps, atlases, tabular data, and textbooks.

To assist teachers in gathering qualitative data about student engagement when using GIS in a classroom setting, a second tool was developed (Table 2). The indicators are derived from 'Alternative Approaches to Assessing Student Engagement Rates' (Chapman, 2003), which provides working definitions of student engagement. The findings can be analysed by creating an average of all the charts received and showing the averages on a blank chart. Percentage suggestions have been added to each column to help guide the teachers and the researchers as to what the words mean. Teachers could also fill out a chart for a class before using GIS and then complete an additional chart for the same class while using GIS. This method would provide pre-scores and post-scores for each area of student engagement. The gain scores could then be analysed using t-tests.

Porter (2002) justifies the use of teacher self-report surveys because teachers are willing to complete them. He has found that agreement between the information in a survey and in a teacher's log has correlations of 0.7 to 0.8, which is considered high (Porter & Smithson, 2001). The use of surveys is also cost-effective. Limitations of surveys include the fact that they are limited to what the researcher asks, they are subject to self-report bias, and they do not capture all of the instructional complexity that exists in a classroom. To evaluate the effects of GIS on students' perceptions and attitudes, student self-reporting techniques were discussed.
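As an illustration of the gain-score analysis described above, the paired t statistic can be computed directly from pre- and post-GIS engagement ratings. The ratings below are entirely hypothetical; real values would come from completed engagement charts, coded on the chart's suggested percentage scale with 'Normal' as 0.

```python
import math

# Hypothetical pre/post engagement ratings for one class, coded on the
# chart's suggested percentage scale (Very Seldom = -66 ... Normal = 0
# ... Very Frequently = +66); one value per engagement indicator.
pre = [0, -33, 0, 33, -33, 0, 0, 33]    # before using GIS
post = [33, 0, 33, 50, 0, 33, 33, 66]   # while using GIS

diffs = [b - a for a, b in zip(pre, post)]  # gain score per indicator
n = len(diffs)
mean_gain = sum(diffs) / n
# sample standard deviation of the gain scores
sd = math.sqrt(sum((d - mean_gain) ** 2 for d in diffs) / (n - 1))
t_stat = mean_gain / (sd / math.sqrt(n))  # paired t statistic, df = n - 1

print(f"mean gain = {mean_gain:.1f}, t = {t_stat:.2f} (df = {n - 1})")
# prints: mean gain = 31.0, t = 15.50 (df = 7)
```

Compared against a table of critical t values (or computed with a library routine such as scipy.stats.ttest_rel, which also reports a p-value), such a statistic would indicate whether the observed engagement gains are statistically significant.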
Self-reporting can be done in journal format, where students reflect upon various aspects of their experience using GIS. What have they learned? What have they enjoyed? What caused them frustration or difficulty? Do they enjoy using these tools? Teachers can periodically gather these journals and summarise student responses. Alternatively, interest surveys could be developed at the classroom level that would ask students these or similar questions.

This project enabled the creation of an initial array of tools that teachers can use to evaluate the utility of GIS. The fundamental purpose of evaluating new technologies such as GIS as teaching tools is for teachers to determine for themselves whether or not using GIS is worthwhile. Future research projects will seek to gather and compile data from teachers in order to find more conclusive evidence regarding the effectiveness of GIS as a classroom teaching tool.

Correspondence
Any correspondence should be directed to Sophia Linn, Fort Collins, Colorado, USA ([email protected]).

References
Barstow, D., Gerrard, M.D., Kapisovsky, P.M., Tinker, R.F. and Wojtkiewicz, V. (1994) Report from the First National Conference on the Educational Applications of Geographic Information Systems, 27–29 January 1994. Cambridge, MA: TERC Communications.
Chapman, E. (2003) Alternative approaches to assessing student engagement rates. Practical Assessment, Research and Evaluation. On WWW at http://edresearch.org/pare/.
Kerski, J. (2003) The implementation and effectiveness of geographic information systems technology and methods in secondary education. Journal of Geography 102 (3), 128–137.
Porter, A.C. (2002) Measuring the content of instruction: Uses in research and practice. Educational Researcher 31 (7), 3–14.
Porter, A.C. and Smithson, J.L. (2001) Are content standards being implemented in the classroom? A methodology and some tentative answers. In S.H. Fuhrman (ed.) From the Capital to the Classroom: Standards-based Reform in the States, 100th Yearbook of the National Society for the Study of Education, Part II (pp. 60–80). Chicago, IL: National Society for the Study of Education.