# How Behaviour Data in Secondary Schools Can Hide System Problems

*By Brad Holmes • 8 April 2026 • 7 min read*

A trust leader reviews behaviour data across three schools. School A has 8 exclusions this term. School B has 3. School C has 12.

The immediate assumption: School C has a behaviour problem. Poor leadership, weak culture, or a cohort of challenging students. So the trust responds by supporting School C with additional behaviour expertise. Better behaviour frameworks. New systems.

But the real problem wasn’t behaviour management. It was system inconsistency.

Here’s what was actually happening:

- **School A** has a clear, consistent tutor time structure and consistent behaviour expectations across all form groups. Tutor time includes a daily homework check and behaviour reflections, built into a structured student planner that every form group uses the same way.
- **School B** has a tutor time structure that varies by tutor. Some run it tightly, some ad hoc. Behaviour expectations are interpreted differently by different staff.
- **School C** has no clear tutor time structure. Each tutor improvises. Behaviour expectations are unclear, and different students experience different standards depending on which form group they’re in.

School C doesn’t have a behaviour problem. It has a system problem. The exclusion rate partly reflects behaviour, but it also reflects the fact that when systems are unclear, behaviour is harder to predict and manage.
## How Systems Affect Behaviour Data

This is counterintuitive, so it’s worth being explicit.

A school with clear, consistent systems has:

- More predictable behaviour (students know what’s expected)
- Clearer escalation (issues are surfaced early through the system)
- More consistent consequences (students experience fairness)
- Lower crisis escalation (problems don’t build, because they’re addressed through the system)

A school with unclear, inconsistent systems has:

- Less predictable behaviour (students don’t know what’s expected)
- Inconsistent escalation (some issues are addressed, some are missed)
- Inconsistent consequences (it depends which staff member responds)
- Higher crisis escalation (problems build because they’re not addressed early)

The behaviour data reflects this. But a trust leader who doesn’t understand the system dynamics will misinterpret it. They’ll blame leadership (“School C’s headteacher is weak”) or culture (“School C has a behaviour problem cohort”) when the real issue is system design.

## What the Data Actually Says

Let’s be specific. A trust leader sees this data:

| School | Exclusions | Attendance | Homework Completion |
| --- | --- | --- | --- |
| A | 8 | 94% | 82% |
| B | 3 | 91% | 75% |
| C | 12 | 87% | 58% |

The pattern seems clear: C is struggling across the board. A is strong. But now add system data:

| School | Tutor Time Structure | Homework Tracking | Behaviour Expectations | Consistency |
| --- | --- | --- | --- | --- |
| A | Standardised, daily | Clear, visible | Consistent across staff | High |
| B | Varies by tutor | Portal, low adoption | Varies by staff | Medium |
| C | Ad hoc | Fragmented | Unclear | Low |

Suddenly the picture changes. School C’s poor behaviour data isn’t because of bad leadership or a bad cohort. It’s because the system is unclear. When students don’t know what’s expected, behaviour is harder to manage. When expectations vary, students test boundaries constantly. When escalation paths are unclear, small issues become big problems.

And School B’s lower homework completion isn’t because teachers aren’t trying. It’s because the homework tracking system is inconsistent.
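The confound becomes obvious when the two tables are joined. A minimal Python sketch, using the article’s illustrative figures (the `outcomes` and `systems` dictionaries below simply transcribe the tables above):

```python
# Illustrative figures transcribed from the two tables above.
outcomes = {
    "A": {"exclusions": 8, "attendance": 0.94, "homework": 0.82},
    "B": {"exclusions": 3, "attendance": 0.91, "homework": 0.58 + 0.17},
    "C": {"exclusions": 12, "attendance": 0.87, "homework": 0.58},
}
systems = {"A": "high", "B": "medium", "C": "low"}  # system consistency rating

# Join the outcome data with the system data, ordered from the least
# consistent system to the most consistent.
order = {"low": 0, "medium": 1, "high": 2}
for school in sorted(outcomes, key=lambda s: order[systems[s]]):
    o = outcomes[school]
    print(f"School {school}: consistency={systems[school]:<6} "
          f"exclusions={o['exclusions']:>2} homework={o['homework']:.0%}")
```

Read top to bottom, the lowest-consistency school carries the weakest homework and attendance data. Without the `systems` column, School C just looks like a behaviour problem; with it, the same numbers read as a system problem.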
## The Diagnostic Value of System Analysis

A trust leader who understands this can use behaviour data as a diagnostic tool:

- **High behaviour incidents + clear systems** = coaching/professional development problem. The system is good, but staff aren’t executing it well. Help the staff.
- **High behaviour incidents + unclear systems** = system design problem. The structure isn’t supporting positive behaviour. Fix the system first.
- **High behaviour incidents + unstable systems** = change management problem. The system changed recently and staff and students haven’t adapted. Support the transition.
- **Low behaviour incidents + clear systems** = great work. The system and the execution are both strong.
- **Low behaviour incidents + unclear systems** = anomaly. Something else is working really well. Identify and document it.

These different diagnoses require different responses. But a trust leader who doesn’t understand system dynamics will treat all high incident rates the same way: more behaviour support, when sometimes what’s needed is system clarity.

## How Consistency Changes the Narrative

When a trust implements system standardisation, behaviour data often shifts.

Before standardisation:

- School A: 8 exclusions
- School B: 3 exclusions
- School C: 12 exclusions

After standardisation (all schools running the same tutor time structure and behaviour expectations):

- School A: 7 exclusions (marginal change; the system was already clear)
- School B: 6 exclusions (a rise in recorded exclusions; a clearer system means escalation is now applied consistently rather than absorbed informally)
- School C: 8 exclusions (significant improvement; the system is now clearer)

The gap narrows: the spread across schools falls from 9 exclusions to 2. Why? Because the variation was partly driven by system variation, not behaviour differences. Some of School C’s issues resolve not through more intervention, but through clearer systems.

## What This Means for QA

This has important implications for how a trust assesses school quality.

In a non-standardised trust, behaviour data is confounded. You can’t tell whether differences are real or systemic.
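The diagnostic rules and the convergence figures above can be captured in a few lines of Python. This is a minimal sketch of the article’s decision table and gap arithmetic, not a tool; the function name `diagnose` and the category labels are illustrative:

```python
def diagnose(incidents: str, system: str) -> str:
    """Map (incident level, system state) to the article's diagnosis.

    incidents: "high" or "low"; system: "clear", "unclear", or "unstable".
    """
    rules = {
        ("high", "clear"): "coaching/professional development problem",
        ("high", "unclear"): "system design problem",
        ("high", "unstable"): "change management problem",
        ("low", "clear"): "great work",
        ("low", "unclear"): "anomaly: identify and document what is working",
    }
    return rules.get((incidents, system), "unknown combination")

# Convergence check on the illustrative exclusion figures.
before = {"A": 8, "B": 3, "C": 12}
after = {"A": 7, "B": 6, "C": 8}
gap_before = max(before.values()) - min(before.values())  # 12 - 3 = 9
gap_after = max(after.values()) - min(after.values())     # 8 - 6 = 2

print(diagnose("high", "unclear"))  # -> system design problem
print(f"Exclusion gap narrowed from {gap_before} to {gap_after}")
```

The point of writing the rules down this explicitly is that the same incident count maps to three different responses depending on the system state, which is exactly why treating every high incident rate with “more behaviour support” is the wrong default.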
In a standardised trust, behaviour data is clearer. If all schools run the same system and one has significantly higher incidents, you know it’s either:

- A cohort difference (real but manageable)
- Leadership or staff execution (needs coaching)
- A local issue (specific to that school’s context)

It’s not “the system varies and we’re not sure what we’re measuring.” This clarity allows better QA and better support. It’s also what Ofsted expects to see: how Ofsted evaluates trust-wide consistency sets out the specific questions inspectors ask about whether systems are coherent across schools.

## The Equity Angle

There’s also an equity argument here. In a non-standardised trust, a student’s experience of behaviour expectations depends on which school they attend. What’s a serious consequence in School A might be a verbal warning in School C. This is the same dynamic that drives up the hidden costs of inconsistent systems across a multi-academy trust; the financial case for standardisation and the equity case are often two sides of the same argument.

This isn’t just operationally inefficient. It’s inequitable: students experience different standards. Standardisation ensures that behaviour expectations are consistent. Every student knows what’s expected, regardless of which school they attend.

## Implementation: From Diagnosis to Action

A trust leader who suspects system inconsistency is driving behaviour variation should:

1. **Audit current systems.** What does tutor time look like in each school? How are behaviour expectations communicated? How is escalation handled?
2. **Compare data to systems.** Where are the correlations between system clarity and behaviour data?
3. **Identify the leverage point.** Which system change would have the biggest impact on behaviour variation? Usually, it’s tutor time structure. Consistent tutor time reduces behaviour issues more than any other single intervention.
If you’re unsure what that structure should look like, the case for standardising tutor time across your trust explains what belongs in the core framework and what can stay flexible.

4. **Implement standardisation.** Standardise the core system, train staff, and support the transition. For most trusts, this starts with a consistent student planner system across schools; the tool students and tutors use daily becomes the foundation for everything else.
5. **Measure change.** Track behaviour data after standardisation. You should see convergence.

## What Success Looks Like

After system standardisation:

- Behaviour data becomes more consistent across schools (gaps narrow)
- Leadership time spent on “why is this school different?” decreases
- Staff are more confident (they know what’s expected)
- Students experience consistency (fairness)
- Exclusions decrease overall

These aren’t just operational improvements. They’re equity improvements.

## The Question for Your Trust

When you look at behaviour data variation across schools, do you know what percentage is real behaviour difference versus system difference?

If the answer is “we’re not sure,” that’s a sign that system inconsistency is masking what’s actually happening.

The schools that manage behaviour best across a trust aren’t necessarily the ones with the most sophisticated behaviour policies. They’re the ones where systems are clear and consistent. And the trust that understands this can use behaviour data not just to assess performance, but to diagnose systemic issues and fix them at the root.

That’s the move from reactive to strategic.

---

**Brad Holmes**, School Planner Company

With over two decades of experience turning complex systems into simple, useful tools, Brad brings a strategist’s eye to school planning. He shares proven methods for organisation and productivity that help students, teachers, and parents stay focused and on track.