Keck-evaluation

From Earlham CS Department
Revision as of 16:24, 4 August 2006 by Watsolo (talk | contribs) (First draft of evaluation stuff)

Evaluation

The progress toward achieving the goals set forth in Part A will be evaluated in both a formative and a summative fashion. The formative process evaluations, conducted at various points throughout the grant period, will assist the faculty and academic departments involved in this interdisciplinary project in refining the project goals and objectives and in making any ongoing modifications or revisions. They will identify what is working during the initial implementation and point to areas needing further development. The summative evaluation will take place at the end of the grant period and will provide a measure of how well the program goals were met as well as direction for future growth.

We have identified several possible sources of both internal and external evaluation. Internal evaluations may be conducted by one or more members of the project, or by other faculty and staff within the science division who are not directly involved in developing or using these curricular modules. External evaluators may be drawn from program assessors who have worked previously with Earlham College or from organizations that provide consultancy services, such as the Council on Undergraduate Research (CUR). CUR can provide program evaluation sensitive to issues at predominantly undergraduate institutions, including evaluators experienced with interdisciplinary and interdepartmental programs.

The project leaders will meet with the evaluators to clarify and operationalize the stated and implicit goals of the project, and to select and develop instruments that evaluate them. Some key goals in this case might be:

• Bridge the gap between scientific research and science education by incorporating research modules into several lower and upper division courses

• Increase understanding of interdisciplinary use of field, laboratory, and computational methods to solve a particular problem

• Expand Earlham’s Environmental Studies program by incorporating the study of environmental issues into the core courses of several disciplines

• Connect community interest and expertise with collaborative science research to investigate a question of local concern

The evaluators might also help to conceptualize key issues or problems that could keep our program from meeting its stated objectives, specify particular criteria for success, and identify the data needed to determine how well the components of the program are meeting their objectives.

Delineating the project goals will assist us in developing both qualitative and quantitative measures for determining how well our goals are being met during both the formative and summative evaluation phases. Possible qualitative evaluations include:

• Focus groups: Informal small-group discussions, facilitated by the evaluator, conducted with students involved in one or more of the courses affected by this program as well as with faculty involved in the implementation.

• Open-ended surveys: The evaluator will collect answers to written questions without preset response categories. Surveys with an open-ended component would be given to all students enrolled in courses where a research-based interdisciplinary module was used, to faculty who developed and taught such courses, and to participants in any workshops in which the faculty involved in the Keck program presented the pedagogy and organization of this project.

• Semi-structured interviews: The evaluator might conduct semi-structured interviews with key personnel and a representative sample of students, allowing the evaluator some firsthand experience with the evaluated activities and a chance for in-depth exploration of particular issues.

• Peer evaluation: Efforts at developing interdisciplinary course modules incorporating computational modeling will be described in peer-reviewed publications, providing feedback from the reviewers as well as from others who read the articles.

Quantitative evaluations might include:

• Quantitative surveys: Pre- and post-surveys of student and workshop-participant attitudes toward, and confidence in, computational methodology, interdisciplinary collaboration, the relevance of environmental studies to particular disciplines, and interest in science and society.

• Institutional data: Assessment of pre- and post-grant levels of student participation in undergraduate research, likelihood of taking a second science class, and number of science majors.

• Incorporation in the curriculum: Measurement of the pre- and post-grant percentages of primarily field- or laboratory-based courses that incorporate computational modeling, and of computational courses that incorporate field or laboratory modeling.

• Incorporation in the co-curriculum: Comparison of the numbers of pre- and post-grant interdisciplinary projects, including collaborative research projects and grant-writing efforts.
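As a minimal sketch of how the pre/post survey comparison above might be summarized, the simplest quantitative measure is the shift in mean score between the pre- and post-surveys. The score values and variable names below are purely illustrative, not drawn from any actual instrument:

```python
# Hypothetical pre/post comparison of attitude-survey scores
# (e.g., confidence in computational methods on a 1-5 scale).
# All data here is illustrative.
from statistics import mean

pre_scores = [2, 3, 3, 2, 4, 3, 2, 3]    # before the course module
post_scores = [3, 4, 4, 3, 5, 4, 3, 4]   # after the course module

def mean_shift(pre, post):
    """Average change in survey score from pre- to post-survey."""
    return mean(post) - mean(pre)

# A positive shift indicates improved attitudes or confidence.
print(mean_shift(pre_scores, post_scores))
```

A real evaluation would of course pair each respondent's pre- and post-scores and apply an appropriate statistical test, but the mean shift conveys the basic idea of the measure.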

Final reports summarizing both the quantitative and qualitative data will be produced.