
Leadership at the Gap: Actionable Assessment for School Improvement


The author supporting a faculty team in using data to drive meaningful, actionable change.

Moving Beyond a Data-Driven Culture in Schools

Are we tired of the term “data-driven culture” yet? Like many educational buzzwords, this one has been tossed around its fair share. The fatigue we feel when we hear it, however, has less to do with a devaluation of data and more to do with the fact that schools use the term but don’t walk the walk.


I think we can all agree that meaningful data is a powerful tool. In fact, when I was doing my doctoral coursework, I was astounded and almost horrified to find out how much incredible data is at our fingertips as educators, yet how little we actually access and use.


We know from many leadership case studies that doing what you say you will do, or DWYSYWD (Kouzes and Posner, 2017), is one of the key leadership practices for fostering trust and leading an effective team. So, I’m here to help you figure out what doing what you say you will do means when you say your school is “data-driven.”


What Counts as Actionable Assessment in Schools?

As school leaders, we talk to teachers on repeat about assessment… meaningful, actionable, timely, summative, formative, and so on. We know that, for students, an assessment is only as good as the actions they take afterward. The same can be said for formative and summative assessments we conduct as a school. Below are examples of the assessments we conduct regularly as school leaders.


Formative - What’s happening right now, and how do we adjust?

  • School surveys of stakeholders

  • Classroom walkthroughs

  • Review of longitudinal data

  • Professional development feedback forms

  • Focus groups/listening forums


Summative - How did we do, and what does it mean going forward?

  • Standardized testing

  • Graduation rates

  • College admission tracking

  • End-of-program review

  • Annual enrollment/retention data

  • Accreditation

So, how can we make the data we have collected actionable?


Step 1- Start with Purpose in Actionable Assessment

Whether you are working alone or with a team or a subcommittee, ask yourself the following questions to get started:

  • What is the purpose of this data collection? 

  • What do we want to know?

  • What do we want to improve?


During my first year as principal at my current school, I received a number of complaints about the amount of homework middle schoolers were assigned. I assembled a group of grade-level team leaders, and we used the questions above to guide us. We decided we wanted to know how much homework was actually being assigned in each grade, which subjects it was assigned in, and what recent research indicates about the effectiveness and quantity of homework at this developmental phase. The end goal was to write a new policy.


Step 2- Identify the Right Data for Actionable Assessment

What information do you need to collect in order to answer your questions above? One of the biggest pitfalls of this process is the temptation to collect data that is related but not necessarily relevant to the purpose at hand.


In our effort to solve the homework crisis, we decided to survey teachers. One colleague suggested asking teachers how long they believed students should spend on homework each night. While interesting, we realized that the question would give us opinions rather than the concrete data we needed. Instead, we narrowed our focus to questions that would help us understand actual practice: What homework are you assigning? How long do you expect it to take? How often is it assigned each week?


The data you need might already be accessible to you, such as statistics that can be pulled from your learning management software, like attendance records or GPAs. In other cases, you might need to facilitate data collection. Conducting a stakeholder survey is a great way to get input and feedback; however, it needs to be done very carefully, because a poorly designed survey can quickly result in useless data. I highly recommend reaching out to a professional such as ACP who can help with data collection, survey design, and response analysis.


In my homework case study, we collected the following data: 

  • The past five years of course surveys in which students indicate how much homework they receive per class per night per subject

  • A teacher questionnaire about what homework they assign

  • Research studies or literature reviews about homework quality, quantity, and effectiveness for grades 6-12 (this followed its own protocol for reading the studies and generating themes and patterns)
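If you end up combining several years of course-survey results, even a small script can tally reported homework time by grade and subject before the review meeting. Here is a minimal sketch; the rows and field layout are illustrative, not the author’s actual survey data:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey rows: (grade, subject, reported minutes of homework per night)
responses = [
    (6, "Math", 30), (6, "Math", 45), (6, "Science", 20),
    (7, "Math", 60), (7, "English", 40), (7, "English", 50),
]

def average_minutes(rows):
    """Group reported minutes by (grade, subject) and average each group."""
    groups = defaultdict(list)
    for grade, subject, minutes in rows:
        groups[(grade, subject)].append(minutes)
    return {key: mean(vals) for key, vals in groups.items()}

summary = average_minutes(responses)
for (grade, subject), avg in sorted(summary.items()):
    print(f"Grade {grade} {subject}: {avg:.0f} min/night")
```

A summary like this keeps the review conversation anchored in actual reported practice rather than impressions.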


Step 3- Data Visualization & Review for Actionable Assessment

The best advice here is to adopt a school-wide data review protocol; at one school where I worked, we called this the Evidence to Action Protocol. The protocol should be formatted like a worksheet so notes can be taken throughout the process.


Small groups of three to five people work best for data review. Based on the data sets you have, you might have all the small groups review the same data set or assign different sets to each group. I usually identify a facilitator for each group to keep people on time and take notes.


Before the protocol, consider how you will present the data to participants. There are different ways to visualize data, which can have an impact on how it is analyzed. In the homework study, I presented participants with two sets of data: one in bar graphs and the other in thematically organized narrative comments from an open-response survey. They were so excited to read the open responses that they skipped right over the bar graphs. I should have given each group only one set of data to prevent this.


Most survey platforms, such as Google Surveys, can generate these visuals for you. For data you pull yourself from your LMS, I have typically used Tableau, but there are also a number of easy online tools (especially with generative AI).
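If you don’t have a visualization tool handy, even a quick script can turn tallies into a simple bar chart for a review meeting. A minimal sketch, where the homework-time bands and counts are illustrative placeholders rather than the study’s real figures:

```python
# Hypothetical counts of students per nightly-homework band
band_counts = {
    "Under 1 hour": 35,
    "1-2 hours": 88,
    "Over 2 hours": 227,
}

total = sum(band_counts.values())

def text_bar_chart(counts, width=30):
    """Render each band as a text bar scaled against the largest count."""
    biggest = max(counts.values())
    lines = []
    for band, n in counts.items():
        bar = "#" * round(width * n / biggest)
        pct = 100 * n / total
        lines.append(f"{band:<13} {bar} {pct:.0f}%")
    return "\n".join(lines)

print(text_bar_chart(band_counts))
```

Even a rough chart like this makes the skew in the distribution immediately visible to a small group.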


A Data Review Protocol

A good data protocol should have these three components:

  1. Observations: In this first stage, participants can make objective statements about what they see in the data, without judgment or interpretation. For example: “I see that 45% of students have more than one hour of homework per night” or “I see that 63 seniors graduated with a GPA of 3.0”. Every statement should be recorded.

  2. Interpretations: In this stage, people can draw conclusions, develop hypotheses, and make interpretations. Participants can also ask questions for further exploration. For example, “Last year, sixth graders reported having less homework than years prior in Science; this could be because we got a new Science teacher” or “If we extract the students who scored below a 2.5 GPA, how many of them exceeded the allowed school absences for the year?”

  3. Action Steps: In this stage, participants identify the highest-leverage next steps. You might ask: What happens next? How might we do this? How long will it take? What do we need? Who is involved? What other data might we need? How will we know if we are successful?


To make this protocol actionable in a faculty meeting or team setting, consider using a simple structure with a suggested timeframe for each stage. This keeps the conversation focused and ensures the work moves from observation to action. The protocol below can serve as a guide during your data review.


Step | Action | Time

Observations: What do we see? | Capture only what is directly observable in the data. | 5-10 min

Interpretations: What does it mean? | Identify patterns, trends, and possible explanations. | 10-15 min

Action Steps: What will we do? | Determine clear, actionable next steps, as well as ownership. | 10-15 min

Total time: 25-40 minutes


Step 4- Sharing Data to Support Actionable Assessment

If stakeholders have participated in your data collection, such as with a focus group, a survey, or teacher feedback, they will want to hear the results of the process. I typically select a few key pieces of information to share in a report: the sample size (e.g., 5 focus group participants or 350 student survey responses), standout numbers that either reaffirm what we believed to be true or are potentially surprising, and the action steps we have identified. For example, here is a paragraph from the homework case study report: 


“The majority of student respondents (65%) report having over 2 hours of homework each night. When asked to elaborate in the open-response questions, it was found that this was occurring primarily because students are distracted when the homework is on their laptops; they have multiple tabs open and are multitasking, scrolling, watching YouTube, etc. One action step that the committee identified is to develop an executive functioning lesson for students on how to complete work at home, from setting up an appropriate study space, to limiting access to unnecessary devices, and setting timers for focused work. Additionally, we will run an online session for parents to learn more about how to support their children with this.”


This step is particularly important for DWYSYWD (doing what you say you will do) to build trust and morale in our community. Stakeholders will feel like it was worth their time to participate in the data collection and that further action steps will occur.


When students are the stakeholders who participated, I do still recommend sharing out. However, this might take the shape of an infographic, a slideshow, or an assembly announcement.


If no stakeholders participated in your data collection, you can save the sharing until the action steps have a wider impact. For example, if you looked at ten years’ worth of attendance data to examine an increase in student absences, and then decided to rewrite your attendance policy, you can share the highlights of the data analysis when the policy goes out as a way to establish context and create a sense of urgency for the action you will undertake.


Step 5- Turning Data into Action Steps

Now the real work begins! As part of the action steps identified in the data protocol, school leaders must shift from discussion to disciplined implementation. This means assigning clear ownership, setting realistic timelines, and identifying the specific supports needed to carry the work forward.


Too often, action steps live in meeting notes but never make it into classrooms or student experiences. Start small and focused—pilot a change with a team, build in checkpoints to assess progress, and be willing to adjust based on what you learn along the way. Just as we expect students to revise their work based on feedback, schools must be willing to iterate on their practices.


A truly data-driven culture is not defined by how much data we collect, but by how consistently we act on it. You can use an action step road map to indicate what needs to happen, when, and with whom.


Action Step Road Map Template

Evidence | Action | Timeframe | People | Other notes or needs

EX: Majority of students taking over 2 hours to complete homework each night due to distractions | Develop an advisory lesson on at-home study spaces and habits | Developed by November, implemented in advisory at the start of semester 2 | Learning specialist to develop lesson; Principal to communicate with teachers, advisors, and families | n/a


From Data-Driven to Data-Responsive Schools

If we want to move beyond the buzzword of a “data-driven culture,” we have to commit to the harder, more meaningful work of becoming a data-responsive school that uses actionable assessment.


That means using data not as a report to file away, but as a catalyst for change—something that shapes decisions, informs practice, and ultimately improves student learning.


When we clarify our purpose, focus our data collection, engage in thoughtful analysis, and—most importantly—follow through with action, we model the very learning process we expect from our students. Doing what we say we will do is not just a leadership principle; it is how we build trust, momentum, and a culture of continuous improvement.


Source:

Kouzes, J. M., & Posner, B. Z. (2017). The leadership challenge: How to make extraordinary things happen in organizations (6th ed.). John Wiley & Sons.

