Data Analysis Meeting

 * Adelphia School: PLC Meeting 2010
 * Willow Grove: Kindergarten Team
 * Bad Collaboration
 * Learning by Doing, Chapter 5: Collaboration in PLC
 * Mitchell Elementary: Professional Learning Teams
 * Inspirational: Team Building
 * The Power of Teamwork: Inspired by the Blue Angels

As we speak about Professional Learning Communities, we know that excitement for student learning is palpable. It is impossible not to be jazzed when we speak about all students learning at high levels! The big question, though, when people dig deep and begin to look at the nuts and bolts of truly focusing on learning for all, is inevitably, “Is it worth it?” Test scores from the latest administration of the Kansas Assessment were recently presented to the USD 207 Fort Leavenworth Board of Education. Rather than simply taking the yearly AYP data and presenting it, we looked at our growth longitudinally and wanted to show that the tasks associated with living as a PLC really made a difference. We addressed the question of “Is it worth it?” from a practical position.

**Four Essential Questions**

Everything we do at Fort Leavenworth is guided by these four questions. From budget development to curriculum alignment to parent/teacher conferences, we bring everything back to:
 * 1) What do we want students to learn?
 * 2) How do we know the students learned it?
 * 3) What are we going to do when the students do not learn it?
 * 4) What are we going to do when the students already know it?

For example, we used these questions to change how we choose new textbooks. In a traditional textbook adoption, materials from various vendors are reviewed and a text is chosen that is thought to be a good match for the curriculum. After implementation, the questions used to evaluate the choice were, “How do you feel about the new book?” or “Is that textbook working for you?” In a PLC school, the four essential questions help us determine whether a new book is going to help our students learn. We don’t worry about whether the text is going to “work for” the teacher. The template we used during our latest curriculum text review is included here.

**Data Analysis**

As we’ve learned to work as a PLC, we have gone from being teachers who looked at data as something we received to teachers who use data as a tool to improve learning. Functioning as a Professional Learning Community meant we no longer took for granted that students had learned what a teacher presented. Rather, we learned to look at the data from common assessments as a guide for what each student needs to learn in order to master the essential outcomes. This took practice for us, as it was not how we were accustomed to working. We had been trained to teach, test, and continue on with the next unit, with the focus on TEACHING. Now we teach, assess, review the data, change our instruction to meet the needs of the student, and assess again. This shifted the focus to LEARNING! By returning to the key question of what we wanted students to learn, we were able to keep this new focus on learning, using results to guide us.

**Collaboration**

Working together came naturally to us. Working together INTERDEPENDENTLY did not. It was very easy to get together and “chat” about things like lesson plans, travel forms, and recess procedures. It was not until we began to look at our data that we realized, by accident, that some of us were better at some things and some of us were better at others! A boost to our collaboration came from Cindy Wepking, a then third-grade teacher. In a meeting, she commented that one of the norms for her group was to __leave their egos at the door__. When everyone did that, honest conversations started to happen and learning was paramount. Realizing that it was not a personal affront to anyone when you spoke about data was a huge AHA moment for our teachers! It gave professionals permission not to have to be supermen and superwomen in their classrooms, but to focus on what they can do better – together!

**Dedicated Time (Early Dismissal)**

Through the years of learning to live as a PLC, we had the challenge of finding time to collaborate. We worked on carving out time in our weekly schedules using specialty areas and teacher plan times. That worked initially. However, we found that 40-45 minutes once per week was not enough to accomplish what we had set out to do. Using the “Here’s What/So What/Now What” protocol, we met with our board of education to see if in fact we needed to move to an “early release” format.

__Here’s What:__ We identified what we knew as fact. We looked at data and, without judgment or analysis, we were able to see our scores.

__So What:__ After we had identified our data, we were able to step back and analyze it, predicting what would happen if the data stayed the same with no intervention. We concluded that if we continued along this path, our students would not continue to grow in their learning. Teachers needed more time to collaborate and take collective responsibility to help students make this growth.

__Now What:__ After engaging in the collection and analysis of the data and collectively studying best practice in improving schools, we recommended that students be dismissed two hours early on Fridays and that teachers use this time for collaboration in both vertical and horizontal teams. Three years later, we see that the data reflects that our analysis and plan were exactly what we needed.



Educational Leadership
February 2003 | Volume **60** | Number **5**

**Using Data to Improve Student Achievement** | Pages 22-24

First Things First: Demystifying Data Analysis
**To improve student achievement results, use data to focus on a few simple, specific goals.**

Mike Schmoker

I recently sat with a district administrator eager to understand her district's achievement results. Pages of data and statistical breakdowns covered the table. Looking somewhat helpless, she threw up her hands and asked me, "What do I do with all this?"

Many educators could empathize with this administrator. The experts' tendency to complicate the use and analysis of student achievement data often ensures that few educators avail themselves of data's simple, transparent power. The effective use of data depends on simplicity and economy.

First things first: Which data, well analyzed, can help us improve teaching and learning? We should always start by considering the needs of teachers, whose use of data has the most direct impact on student performance. Data can give them the answer to two important questions:
 * How many students are succeeding in the subjects I teach?
 * Within those subjects, what are the areas of strength or weakness?

The answers to these two questions set the stage for targeted, collaborative efforts that can pay immediate dividends in achievement gains.

Focusing Efforts
Answering the first question enables grade-level or subject-area teams of practitioners to establish high-leverage annual improvement goals—for example, moving the percentage of students passing a math or writing assessment from a baseline of 67 percent in 2003 to 72 percent in 2004. Abundant research and school evidence suggest that setting such goals may be the most significant act in the entire school improvement process, greatly increasing the odds of success (Little, 1987; McGonagill, 1992; Rosenholtz, 1991; Schmoker, 1999, 2001).

If we take pains to keep the goals simple and to avoid setting too many of them, they focus the attention and energies of everyone involved (Chang, Labovitz, & Rosansky, 1992; Drucker, 1992; Joyce, Wolf, & Calhoun, 1993). Such goals are quite different from the multiple, vague, ambiguous goal statements that populate many school improvement plans.

Turning Weakness into Strength
After the teacher team has set a goal, it can turn to the next important question: Within the identified subject or course, where do we need to direct our collective attention and expertise? In other words, where do the greatest number of students struggle or fail within the larger domains? For example, in English and language arts, students may have scored low in writing essays or in comprehending the main ideas in paragraphs. In mathematics, they may be weak in measurement or in number sense.

Every state or standardized assessment provides data on areas of strength and weakness, at least in certain core subjects. Data from district or school assessments, even gradebooks, can meaningfully supplement the large-scale assessments. After team members identify strengths and weaknesses, they can begin the real work of instructional improvement: the collaborative effort to share, produce, test, and refine lessons and strategies targeted to areas of low performance, where more effective instruction can make the greatest difference for students.

So What's the Problem?
Despite the importance of the two questions previously cited, practitioners can rarely answer them. For years, during which *data* and *goals* have been education by-words, I have asked hundreds of teachers whether they know their goals for that academic year and which of the subjects they teach have the lowest scores. The vast majority of teachers don't know. Even fewer can answer the question: What are the low-scoring areas within a subject or course you teach?

Nor could I. As a middle and high school English teacher, I hadn't the foggiest notion about these data—from state assessments or from my own records. This is the equivalent of a mechanic not knowing which part of the car needs repair.

Why don't most schools provide teachers with data reports that address these two central questions? Perhaps the straightforward improvement scheme described here seems too simple to us, addicted as we are to elaborate, complex programs and plans (Schmoker, 2002; Stigler & Hiebert, 1999).

Over-Analysis and Overload
The most important school improvement processes do not require sophisticated data analysis or special expertise. Teachers themselves can easily learn to conduct the analyses that will have the most significant impact on teaching and achievement.

The extended, district-level analyses and correlational studies some districts conduct can be fascinating stuff; they can even reveal opportunities for improvement. But they can also divert us from the primary purpose of analyzing data: improving instruction to achieve greater student success. Over-analysis can contribute to overload—the propensity to create long, detailed, "comprehensive" improvement plans and documents that few read or remember. Because we gather so much data and because they reveal so many opportunities for improvement, we set too many goals and launch too many initiatives, overtaxing our teachers and our systems (Fullan, 1996; Fullan & Stiegelbauer, 1991).

Formative Assessment Data and Short-Term Results
A simple template for a focused improvement plan with annual goals for improving students' state assessment scores would go a long way toward solving the overload problem (Schmoker, 2001), and would enable teams of professional educators to establish their own improvement priorities, simply and quickly, for the students they teach and for those in similar grades, courses, or subject areas.

Using the goals that they have established, teachers can meet regularly to improve their lessons and assess their progress using another important source: formative assessment data. Gathered every few weeks or at each grading period, formative data enable the team to gauge levels of success and to adjust their instructional efforts accordingly. Formative, collectively administered assessments allow teams to capture and celebrate short-term results, which are essential to success in any sphere (Collins, 2001; Kouzes & Posner, 1995; Schaffer, 1988). Even conventional classroom assessment data work for us here, but with a twist. We don't just record these data to assign grades each period; we now look at how many students succeeded on that quiz, that interpretive paragraph, or that applied math assessment, and we ask ourselves why. Teacher teams can now "assess to learn"—to improve their instruction (Stiggins, 2002).

A legion of researchers from education and industry have demonstrated that instructional improvement depends on just such simple, data-driven formats—teams identifying and addressing areas of difficulty and then developing, critiquing, testing, and upgrading efforts in light of ongoing results (Collins, 2001; Darling-Hammond, 1997; DuFour, 2002; Fullan, 2000; Reeves, 2000; Schaffer, 1988; Senge, 1990; Wiggins, 1994).
It all starts with the simplest kind of data analysis—with the foundation we have when all teachers know their goals and the specific areas where students most need help.

What About Other Data?
In right measure, other useful data can aid improvement. For instance, data on achievement differences among socio-economic groups, on students reading below grade level, and on teacher, student, and parent perceptions can all guide improvement.

But data analysis shouldn't result in overload and fragmentation; it shouldn't prevent teams of teachers from setting and knowing their own goals and from staying focused on key areas for improvement. Instead of overloading teachers, let's give them the data they need to conduct powerful, focused analyses and to generate a sustained stream of results for students.

Assessment Tracking Walls: a Focus for Professional Dialogue