After learning how to identify problems and pose good questions, find existing data, collect basic data yourself, and analyze and interpret data, we hope you feel more data-empowered and data-savvy. In this module, we discuss the important final stage of the data-driven decision-making cycle: creating ongoing, reflexive feedback loops when discussing data with colleagues, translating your own data findings into concrete action steps, evaluating any interventions, and incorporating data collection into your broader decision-making processes.
Objectives & Keywords
After completing this module, you will be able to:
- Define decision-making feedback loops
- Propose implications from data
- Share results with collaborators and stakeholders
- Translate data findings into action steps
- Evaluate the process and effect of intervention(s)
Feedback loop: The continual process of data-driven decision making: scanning the environment to develop a specific question, reviewing and/or collecting data, analyzing the data, using the implications to provide feedback on your question and help plan next steps, and repeating as necessary to foster improvement.
Implication: A conclusion drawn from the data that suggests how the findings can be applied in practice.
Intervention: Concrete action(s) taken to address problems identified by data through implementing programmatic or curricular change.
At the Classroom Level: Alex
Having discovered that his students’ problem-solving skills were closely linked to their mathematics preparation, Alex wondered what to do about it. This finding had implications for teaching practices. In an attempt to close the feedback loop during the in-service period before school started, Alex organized a series of brainstorming meetings between the science and math departments at his school. He used the analysis of the data he had collected in his own course to start a conversation about how the two departments could collaborate to ensure student success across both disciplines. Through these conversations it became clear that the results could be interpreted in two complementary ways. Alex originally thought that students’ poor performance on physics problem-solving tasks stemmed from a lack of mathematics skills, or from an inability to apply the raw math skills they already knew; further conversations with the mathematics teachers, however, suggested that science classes could also help lay the foundation for the critical thinking skills the mathematics teachers needed their students to have.
Together, the math and science faculty decided that, as a first step based on Alex’s analysis, they would add specific scientific applications to their 10th-grade mathematics courses. The goal was twofold: to help students bridge the gap between pure math and its applications in the sciences, and to encourage students to further develop their mathematics skills by showing them how their work in math would pay off in science.
Nearly three years to the day after Alex first decided to use data to better understand students’ problem-solving abilities, he sat down and repeated the same analyses, this time with the benefit of another year’s worth of data: data from his 11th-graders who, as 10th-graders, had been the first to take the new math curriculum enriched with scientific applications. He found that, while there was still a range of performance in both 10th and 11th grades, it was narrower, with more students performing at a high level than ever before. He shared his findings with his science and math colleagues, who were encouraged by the results. Together, they decided to continue teaching scientific applications in math courses. They committed to engage in an ongoing feedback loop: they would repeat the data collection process periodically to evaluate the intervention’s effect on students’ physics problem-solving skills and adjust their teaching interventions as needed.
At the Department Level: Beth
Beth walked down the hall toward the classroom used for faculty meetings without the usual sense of dread that occupied her thoughts prior to such meetings. Her colleagues were also looking forward to hearing the results, as many of them had participated in individual interviews with Beth and had rated the writing skills of the students in their literature courses.
As she had expected, her findings ignited some controversy. One colleague noted that students were not randomly placed into freshman composition courses, so the differences Beth found could simply reflect differences in high school preparation. Beth disagreed: first-year GPA and SAT writing scores were actually lower for students in the high-performing course. Before she could respond, however, another colleague began discussing implications of the data by suggesting ways they could engage in further study. Beth was delighted to find that the unproductive venting of previous meetings was being replaced by a valuable discussion of what actions the department could take to enhance student writing skills.
One faculty member suggested that even though Beth’s analysis could be strengthened, the findings were compelling enough to motivate taking action. This suggestion excited Beth as she witnessed her colleague closing the feedback loop by suggesting ways the data could be used to inform practice. They decided to work together to identify the characteristics of the most successful composition courses, which they would then codify as a series of “good practices” that composition instructors could use in preparing for their own courses. One characteristic that seemed consistent across the high-performing classes was a peer-editing component; once faculty members saw that this was present in all of the high-performing classes, they were eager to adopt it for their own courses.
A year after implementing the peer-editing intervention across the writing courses, Beth repeated her study. She found that the differences in student writing ability between courses were not nearly as stark as they had been the previous year. In fact, faculty ratings of students’ writing across all of the courses were better than those of the highest-performing course from the previous year.
At the Institutional Level: Cristina
The timing of Cristina’s analysis was quite fortuitous, as many of Cristina’s colleagues and institutional leaders were planning to attend an upcoming retreat. At the retreat, Cristina presented her findings, which helped everyone gain a better understanding of how their home institutions compared to peer institutions and statewide averages in STEM degree attainment over the past decade.
As the retreat group speculated about the forces behind the changes in STEM degree attainment, the discussion shifted back to the gateway STEM courses, and Cristina and her colleagues realized that these courses were the heart of the educational issue. It was interesting to look over STEM attainment rates at their institutions, but doing so did not help them understand how different factors, such as poor student performance in STEM gateway courses, affect STEM degree attainment. In addition, the data Cristina presented did not directly connect to decision making.
Cristina felt frustrated after the retreat because she had worked hard to collect the data on STEM degree attainment, but she was also glad that the discussion had clarified the most important question: How can we as an institution help improve student performance in STEM gateway courses? After the retreat, she and two colleagues decided to work together to collect additional data on student performance, engage STEM faculty and students in gateway courses via surveys and focus groups, and learn more about the specific factors affecting student performance. Their ultimate goal was to design curricular interventions that would facilitate better teaching and learning in the gateway courses. They hoped these interventions would lead more students, including those who might previously have opted out of a STEM discipline because of poor gateway course performance, to attain STEM degrees.
Cristina’s initial data analysis and interpretation did not turn out as she had anticipated. She realized she had investigated a question that, while interesting and helpful for context, did not directly address the most pressing problem at her institution. Her efforts did, however, help her refocus her question so that it would provide the most useful data for decision making. Cristina’s experience is quite common in the data-driven decision-making process. You, too, may find that you investigate a question and collect data that do not end up telling you what you need to know to take action. While this can be discouraging, it is a normal and important part of the learning process. It takes practice and patience to build your skills as a data-driven decision maker.
Module in Action
Moving from Analysis to Action
Simply analyzing the data is not enough. It is essential to describe the implications of the data and, more importantly, to identify the next steps in the feedback loop so that the data are used to improve practice.
Before You Begin
- Make sure you are investigating questions that are of interest to others. If there isn’t interest, it may be difficult to create buy-in or advocate for resources.
- Only explore questions that you believe have a good chance of getting addressed. Rather than collecting data in an area you cannot impact or influence, it may be worthwhile to evaluate something over which you have more control and can address through interventions.
- Create a concrete plan for all steps of the data collection and analysis process, including implementing interventions based on your results.
- Ensure that every step of your plan to address your question is executed carefully. If people disagree with the results, the first thing they will question is the quality of the data collection.
- Get buy-in. Make sure the formal and informal leaders are involved and included in your process, even if it is simply by providing them a progress report. This buy-in will help with incorporating change in the form of programmatic or teaching interventions.
After Your First Cycle of Collection and Analysis is Complete
- Share your results with colleagues and administrators.
- Using input from colleagues and administrators, identify implications for your results. Describe the various ways that your teaching and/or practice can be improved.
- Prioritize these implications, for example by importance or by ease of implementation.
- Develop an action plan, including any needed interventions. Outline the specific steps needed to address each implication.
- Describe the resources needed for each step in your action plan.
- Discuss your action plan with administrators to secure approval for moving forward and advocate for any resources needed.
- Carry out your action plan by implementing intervention(s).
Ongoing Data Collection
Now that you’ve reached this final module, you already know each step of the data-driven decision-making process. To systematize that process, you simply need to continue the feedback loop with a concrete plan for the present and immediate future.
Once you treat your plan as an ongoing process that you review on a regular basis, you will look forward to seeing whether your interventions have made an impact. Consider Alex’s and Beth’s strategies as examples. In your plan:
- Identify the question you would like to address, either based on a scan of your environment or on results and implications from previous data you’ve collected.
- Decide what type of data will be most helpful for your question. Note the manner in which you collect any data and from what populations.
- Determine potential data analysis techniques, given the type of data you are collecting.
- Brainstorm ideas for how you will share your results and implications, including presenting them to the individuals who would benefit most from the findings, who can discuss next steps with you, and who may collaborate in implementing interventions.
- Indicate a general timeline for any data collection, your data analysis, a written or verbal report of your findings, and the next potential question to address.
In Video 5.2, David Greiner reflects on the importance of ongoing data collection as a source of constant feedback about teaching and learning.
In another video, Dr. Eric Mazur shares the story of how using data-driven decision making led him to create and implement the innovative pedagogical technique of Peer Instruction.
Conclusion & Review
Module 5 guided you through the process of using data to take actions that improve teaching and learning. Considering data implications, sharing your results with stakeholders, engaging collaborators, and implementing interventions are essential components of data-driven decision making. In addition, you learned the importance of using data on an ongoing basis through reflexive feedback loops.
Review Questions
- What is a feedback loop, and why is it important?
- How will you share your findings with critical stakeholders and collaborators?
- What is one intervention you could implement that addresses your findings?
- What next steps will you take to create and implement your own data-driven decision-making plan?
Closing
This suite of modules has asked you to identify problems and pose questions, find data and evidence to address those questions, collect data, analyze and interpret data, and put the data to use as a data-driven decision maker. Most crucially, we want you to come away from your module experience better able to articulate what data-driven decision making is and how you can best incorporate it into the work you already do.
We cannot emphasize enough our belief in data’s power to drive positive classroom, department, and institutional change that benefits each and every educational stakeholder—students, teachers, administrators, educational leaders, and more—regardless of his or her role in our state’s educational system. It is our hope that you will take what you have learned from this suite of modules and put it into practice on a daily basis as well as spread the word about data-driven decision making and the positive effect it can have on education in Texas.