To provide real-world insights into the use of program evaluations in school districts, we spoke with Carla Stevens, the Assistant Superintendent for Research and Accountability for the Houston Independent School District. Below, we share highlights of our conversation, which covers the challenges that district administrators face in applying data, advice for overcoming those challenges, and actions that district offices can take to play a stronger role in program evaluation. These insights also complement the Hanover blog post Applying Student Data to District Decision-Making.
Hanover Research: In your work, have you found that there are certain components of applying data that are the most challenging for district administrators? Do you have any advice for overcoming these challenges?
Carla Stevens: In my opinion, one of the challenges for district administrators is that we often find ourselves in situations where we are data rich but information poor. It is difficult at times to apply the data we have to the decisions that need to be made in order to improve educational outcomes for our students. In the research department, we receive numerous requests for data and for program evaluations, but often the requests, when taken literally, won't actually yield the information that district administrators really need to make the best instructional decisions. It is important for research staff not just to respond to the request, but to work thoughtfully with the requestors to determine what questions they are trying to answer, or what issues they are trying to solve, so that we can provide the appropriate data using the most accurate methodology and analysis.
Another challenge is balancing the need for well-designed and thorough program evaluations with the time it takes to complete them. District administrators have to make decisions that affect budgets, and those budgets follow a specific calendar set by district or state policies. However, these budgetary calendars do not always align with when student outcome data become available and the time it takes to appropriately clean, match, and analyze the data. This often means that budget decisions are made before the results of the program evaluations are released, or that program evaluations have to rely on older data to meet the deadlines of the current budget cycle. This is a constant challenge. One way in which we have tried to meet the budget cycle demand is to provide interim reports with available participation data, which are then updated with a final report that informs the next budget cycle. We have also begun developing a program review process that is aligned with the budget cycle. This process relies on program staff to determine in advance whether they will need a formal program evaluation from internal or external research teams, and to work with the research department early in the process to obtain participation and implementation data as the program evolves. Results from formal evaluations are then rolled into this review process as they are completed, augmenting the participation and implementation data the program staff are collecting. This allows program staff to have at least some information about program implementation when decisions have to be made about continuing or cutting the program.
Hanover Research: Are there specific actions that district offices can take to play a stronger role in program evaluation and evaluating resource-spend?
Carla Stevens: To me, it is important that district staff have a basic understanding of the value that a well-designed and thorough program evaluation can provide in making programmatic decisions, such as improvements to program design, implementation, and training. However, to have a well-designed and thorough program evaluation, the researcher, whether internal or external, needs to be included in conversations about program design and implementation as early as possible. When program evaluation is an afterthought, and as a result is needed on a very short turnaround, the researcher's ability to provide an impactful evaluation is limited.
Another specific action, in addition to including the researcher/evaluator early in the process, is for the researcher/evaluator to maintain regular, active communication with district staff during the program evaluation, so that they too develop ownership of the report that is produced and see it as useful in their decision-making process. We include a section in our program evaluation reports for an "administrative response," where district staff indicate what they gleaned from the evaluation and how they will implement the recommendations, or what decisions they have already made about the program as a result of the report.
Hanover Research: Please share any additional thoughts on ways that district offices can advance research, evaluation, and measurement practices.
Carla Stevens: I am a firm believer in the value of applied educational research studies, effective and well-designed program evaluations, and data-driven instructional practices to improve teaching and learning for all of our students. To make this happen, it is critical that those who are conducting the research, program evaluations, and data analysis have a collaborative relationship with the district staff who are designing and implementing programs and interventions for students. It is when the instructional side of the district sees value in the research, and when the research team meets the needs of the instructional staff, that we can make the necessary and appropriate changes in the educational program we provide to students to close the achievement gap.
About Carla Stevens:
Carla Stevens is the Assistant Superintendent for Research and Accountability for the Houston Independent School District. Her responsibilities include oversight of the Department of Research and Accountability, serving as project director of the district's multiple federal Teacher Incentive Fund grants, and serving as district liaison for the Houston Education Research Consortium with Rice University's Kinder Institute. Ms. Stevens' areas of interest include state and federal accountability systems, student assessment, program evaluation, teacher performance pay models, and student performance within teacher appraisal systems.