Breaking the Cycle

This is the fourth and final article in the current series. The first, “When Decisions Come First”, explored how choices are often made before any real evidence exists, leading to decision-led data-making. The second, “How Guesses Become Gospel”, examined how untested numbers and statements can turn into accepted truths through repetition. The third, “The Missing Mirror”, looked at the absence of evaluation, particularly during organisational reconfigurations. Without proper review, we never really know what works, and unfounded assumptions spread quietly from one initiative to the next, often with damaging results.

This final piece in the series asks how we can break that cycle. How can data, analysis, and evaluation be used properly, not as decoration but as tools for thinking? How can organisations, whether public, private, or charitable, become more honest and more effective without constantly reinventing the same mistakes?

To see the cycle clearly, it helps to start by naming what happens. In many organisations, a decision is made before the evidence is gathered. Someone at senior level proposes a new initiative, a merger, a system, a strategy refresh, often because of external pressure or political expectation. The data that follows is assembled to support the plan, not to test it. Targets are written into press releases before anyone has defined what success will mean.

Once the plan is public, the assumptions begin to harden. Reports talk about being “on track” and “within tolerance”. Any evidence that suggests problems is described as “an anomaly” or “a transitional issue”. By the time an evaluation might have been useful, the funding cycle has moved on, and the same team is planning the next reform.

The National Audit Office’s reports tell this story repeatedly. The NHS’s National Programme for IT, HS2, and large defence procurements have all followed the same curve: early optimism, rising cost, limited transparency, late recognition of problems. Each one began with persuasive modelling and confident promises, and each later required parliamentary scrutiny to expose what had gone wrong.

The charity and private sectors mirror the pattern in smaller ways. A large charity announces a new digital platform to “transform service delivery”. A retailer launches a rebrand to “put customers at the heart of everything”. A bank claims it will “build trust” through new technology. All of these phrases sound plausible. All are supported by data that looks precise but often hides more uncertainty than it shows.

The cycle survives because it feels efficient. There is comfort in movement, in the idea that doing something is better than standing still. Decision-makers like to look decisive. Funders and shareholders like progress. Staff like direction.

There are also structural reasons. Short-term funding cycles leave little space for learning. Political calendars reward visible action more than careful review. Private companies are measured by quarterly results, not by long-term understanding. And in charities, the fear of disappointing funders can make honesty about mixed results seem risky.

Psychology plays its part too. People like stories that make sense. Once an organisation has invested in a narrative of success, it becomes part of its identity. Questioning that story feels like betrayal. Teams stop asking whether the numbers are real and start repeating them because they sound reassuring.

This culture of confirmation runs deep. It rewards optimism and discourages reflection. It creates what researchers at the Institute for Government once called “policy amnesia”: the habit of forgetting what happened last time because it is awkward to admit.

Breaking the cycle means building systems that value learning over appearance. It starts with permission. People need to be allowed to say “we do not know yet”. That single sentence, spoken honestly, changes how decisions are made.

Some public bodies are already showing how it can work. The Office for National Statistics now publishes the uncertainty ranges for its major datasets so that users can see how confident (or not) we can be in each figure. The What Works Network, and its various centres, review and rate evidence, showing not only what is supported but how strong that support is. The Education Endowment Foundation goes further by publishing evaluations of education interventions even when they show no measurable benefit. Those negative results are treated as useful information, not as embarrassment.

Local government can do the same. Some councils now publish progress dashboards showing not just targets met but where delivery has slipped. Bristol’s One City Plan reports, for instance, include plain language updates that acknowledge when goals are delayed or need revising. That openness has encouraged public challenge and, in some cases, collaboration with external researchers to understand what is really happening.

In the charity world, there are slow shifts toward transparency. Some organisations now advocate for more honest reporting, urging funders to see learning as part of accountability. Some trusts now fund independent evaluations or require that grantees publish full results, not only highlights. It is gradual, but it shows that funders can encourage truth rather than polish.

In business, industries that cannot afford self-deception have long shown how evaluation can save lives. Aviation and pharmaceuticals are examples where post-incident review and clinical trial transparency are mandated. The lesson is simple: transparency and learning systems are not luxuries; they are infrastructure.

Culture changes when incentives do. At the moment, reward structures still favour confidence. A senior manager who delivers a high-profile project gets recognition even if the benefits are unclear. A cautious analyst who questions the assumptions is told they are slowing things down.

Boards, regulators, and funders could change that pattern easily. They could ask for evidence that lessons have been learned, not only that outputs have been delivered. They could publish review findings routinely. The Treasury’s Green Book already requires evaluation plans for public projects; what is missing is consistent enforcement and public follow-up.

Language also matters. In many workplaces, “delivery” and “innovation” are seen as progress, while “review” sounds like delay. If we continue to equate action with success, reflection will always lose. It might be time to treat evaluation as a creative process rather than a compliance task.

Technology can support learning, but only if used honestly. In the wrong hands, dashboards become performance theatre. Numbers are cherry-picked to show improvement. Graphs rise neatly, but they describe only what was measured, not what mattered.

Used well, data can be the opposite of defensive. The NHS has begun using near real-time analytics in several trusts to track patient flow and staffing needs. When paired with honest discussion about results, this data can inform live adjustments rather than just retrospective justification.

Transparency is spreading elsewhere too. The Charity Commission’s online data portal allows anyone to view financial and governance information for registered charities. Journalists and researchers have used that openness to check claims and highlight discrepancies. In the corporate sector, open reporting on carbon emissions and modern slavery statements is starting to provide similar scrutiny.

These developments show that transparency is not simply about compliance. It is a practical tool for truth-telling. It helps organisations see themselves clearly, and it gives the public a chance to do the same.

None of this works without cultural change. Healthy organisations make space for uncertainty. They do not confuse confidence with competence.

Doubt does not mean paralysis. It means curiosity. It means leaders being willing to ask questions that do not have neat answers. It means analysts being encouraged to publish uncomfortable findings rather than hide them in footnotes.

The most successful teams often show this behaviour naturally. They test small ideas, measure them carefully, and share what they find. They expect to be wrong sometimes. The Behavioural Insights Team, which grew out of the Cabinet Office, made this approach routine. It ran hundreds of small-scale trials on policy design, published the results, and adjusted its methods over time. Some of its experiments failed completely, but every one added to understanding.

This principle applies beyond government. Charities, councils, and businesses could all benefit from replacing top-down “transformation” with small, measured experiments. The question is not “did we win?” but “what did we learn?”.

Large-scale reform tends to lock in assumptions. Once money and reputation are committed, change becomes difficult. Smaller experiments allow organisations to test ideas before they harden.

A council might trial a new waste collection route for a few wards before city-wide rollout. A charity might run two different versions of a programme in parallel to compare outcomes. A business might pilot a shift pattern change with one team before applying it across the organisation.

If these trials are properly measured and published, the results accumulate into real knowledge. They replace stories with data. They create an archive of experience rather than a pile of forgotten PowerPoint slides.

Breaking the cycle is about more than accuracy. It is about accountability in its truest sense: being answerable not just to superiors or funders but to the public and to reality itself.

When organisations operate without reflection, they become storytellers rather than learners. They confuse effort with progress and optimism with truth. The public, employees, and donors eventually notice.

Accountability means asking, regularly and publicly, what happened next. Did the project deliver what was promised? Did it improve lives? Did it save what it claimed to save? Those questions should be built into every programme’s lifecycle, not left to auditors years later.

Open evaluation strengthens trust even when the results are mixed. People forgive mistakes more readily than deception. What corrodes faith in institutions is not failure itself but the appearance of denial.

Real progress is slow. It means publishing evidence before it is flattering. It means admitting uncertainty. It means choosing clarity over comfort. There are no quick victories in that process, but it is how institutions grow up.

The NHS, councils, charities, and businesses all face the same challenge: to prove that they can learn as well as act. That is the test of maturity.

The first article in this series showed how decisions often come before evidence. The second traced how uncertain figures become accepted fact. The third examined what happens when evaluation goes missing. This final piece looks forward. The cycle will continue for as long as people pretend not to see it.

Breaking it means slowing down long enough to ask the awkward questions. It means letting data inform rather than defend. It means treating evaluation not as a sign of weakness but as a mark of care.

When organisations start to work this way, assumptions stay flexible, analysis stays honest, and progress begins to mean something measurable. It is not glamorous work, but it is the only kind that lasts. The only kind that breaks the cycle.