The Truth is Out There
Part of this article was originally written in 2018 (Magna est Veritas - why truth in data matters), shortly after I stepped down from my role as National Director of Analysis at NHS Digital. At the time, I was in the early stages of setting up my own training and consultancy business, working independently for the first time after many years in national leadership. That version of the piece was part reflection, part manifesto. A way of setting out why truth matters in data, and what I hoped to stand for in the work I was about to begin.
Seven years on, it feels right to return to it. Some of the message remains the same, but a great deal has changed. The landscape we work in now is very different. We’ve lived through a global pandemic. We’ve watched public trust in institutions fluctuate. We’ve seen AI move from theory to practice in ways few imagined possible in 2018. Automation, large language models, and real-time decision systems have all arrived with speed and scale. At the same time, the UK has wrestled with economic turbulence, workforce pressures, and deep shifts in how public services are delivered and understood. In that context, the need for truth in data has not diminished. It has become sharper, more urgent, and perhaps even more fragile. So this updated version reflects not just where I am now, but where we are now.
I’ve always been curious about where our family name came from. Varlow isn’t a name you hear very often, and it regularly throws people. When I say it over the phone, I already know what’s going to happen. It’ll be written down as Barlow, or Farlow, or occasionally Varley. Even Google isn’t quite sure what to do with it. Type it into the search bar and it politely assumes I must have meant something else, something safer, something more likely.
There’s something in that. A small reflection of how systems prefer what they recognise. Whether it’s a call centre worker typing a name, or an algorithm scanning for search intent, there’s a quiet tendency to smooth things out. The unfamiliar gets reshaped to fit expectations. The rough edges are filed down.
That same habit crops up constantly in my work. I spend my time dealing with data, and the systems that depend on it. The pressure to neaten things is everywhere. Information is cleaned, summarised, adjusted, formatted, filtered, and at times, quietly rewritten. Sometimes the intention is good. Sometimes it isn’t. Either way, the truth is no longer what it was. And that’s where I find myself returning to something older. Something from the family crest.
A few years ago, I contacted someone with the same surname on Facebook. He kindly sent me a photo of the Varlow coat of arms. At the bottom, just beneath the shield, sits a short Latin phrase. Magna est Veritas. It means, "Great is truth." He’s no longer with us, but the phrase stuck. It became something more than a family motto. It became a lens I now use to test the work I do. It is also the reason I use a version of the family crest as the logo for my business.
Now, Latin has never been my strong suit. At school I didn’t study it, and for years the only two phrases I could remember were "Cogito ergo sum" and "In canis corpore transmuto". The first, of course, comes from Descartes. "I think, therefore I am" (a recent blog on AI concentrated on this quote). It is a phrase wrapped in logic and doubt, an attempt to define what can be known. The second comes from a fairly bad Disney film called "The Shaggy D.A.", and means, “Turn into a dog.” One is foundational to how we understand consciousness. The other has no relevance and lives in my head for reasons I can’t explain.
So I’ve chosen a different phrase to hold onto now. One that reflects what I try to do. Magna est Veritas feels more grounded in what matters. Not just in theory, but in work. In practice. In the decisions we make and the systems we build and the messages we share with others.
My career has taken me through the NHS and higher education, in both local and national roles. I’ve worked in analysis, research, policy, leadership, and performance. I’ve been in the room when difficult truths needed to be voiced, and also when they were softened, diluted, or delayed. I now work independently. I still provide training. I run courses, deliver workshops, and support professionals one-to-one. But I also advise. I work with organisations on their data structures, their use of automation, their approach to AI, and how they build governance into these systems so that what emerges is not only efficient, but dependable. And honest.
In late 2020, I was asked to help Gibraltar improve its COVID-19 reporting system. At the time, the entire operation was being managed by spreadsheet. Not one spreadsheet, but many. Spreadsheets that people cut and pasted into. Spreadsheets that took an eternity to open, if at all. Versions were flying around by email. Manual edits were being made by different people in different ways. The pressure to get the numbers right was enormous. It was clear that, for the data to be good quality, the technology behind the collection needed to change. So, I helped build a more structured system. Something that could be trusted. Something that didn’t rely on memory or frantic last-minute adjustments. The data went from chaotic to coherent, and the public and government were able to make decisions with confidence. For those interested, this was all done with Power Query, SQL Server links, and Power BI front-ends. It didn’t need millions thrown at it, just some consistency and some understanding of the data and data flows.
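The actual build used Power Query, SQL Server and Power BI, but the underlying principle, replacing ad-hoc cutting and pasting with a single validated flow, can be sketched in a few lines of Python. The column names, validation rules, and the `consolidate` helper below are hypothetical, purely for illustration; they are not the system that was built.

```python
import csv
import io

# Hypothetical required columns for a daily reporting extract.
REQUIRED = ["report_date", "new_cases", "tests_done"]

def validate_row(row):
    """A row passes only if every required field is present and counts are non-negative integers."""
    for field in REQUIRED:
        if field not in row or row[field] == "":
            return False
    try:
        return int(row["new_cases"]) >= 0 and int(row["tests_done"]) >= 0
    except ValueError:
        return False

def consolidate(sources):
    """Merge many per-team CSV extracts into one table.

    sources: dict mapping a source name to its CSV text.
    Returns (rows sorted by date, rejected rows). The last valid row
    seen for a given date wins, so re-submissions overwrite old figures.
    """
    merged = {}
    rejected = []
    for name, text in sources.items():
        for row in csv.DictReader(io.StringIO(text)):
            if validate_row(row):
                merged[row["report_date"]] = {f: row[f] for f in REQUIRED}
            else:
                rejected.append((name, row))
    return [merged[d] for d in sorted(merged)], rejected
```

The point of the sketch is not the code itself but the discipline: one agreed schema, explicit validation, and a single place where the numbers come together, so nothing depends on whoever pasted last.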
But while the technological solution was necessary, the experience reinforced something I already knew. If the data isn’t true, the consequences are real. No amount of technical brilliance can save a decision that is based on fiction. The foundation has to be honest. Otherwise, everything built on top of it is just noise.
The tools have changed since then, and they’re still changing. In 2025, AI systems generate forecasts and analysis in seconds. Automation handles performance reports that once took teams of people. Dashboards update in real time, and natural language models produce summaries that sound convincing. But speed does not guarantee truth. A quick answer is not always a correct one. A confident sentence is not always a reliable insight.
This is what concerns me. The ease with which we now accept what is generated. The faith placed in outputs that have never been properly questioned. The way a neat visual can distract from an uncomfortable reality. Organisations often tell themselves they are being data-driven. What they actually mean is that they have accepted whatever their reporting platform showed them at 9am on a Tuesday morning.
In my training, I return again and again to the same basic questions. What are you trying to find out? Do you have the right data to answer it? How good is that data? Is your method sound? Is the result understandable? Is it repeatable? These questions are not designed to trip people up. They are there to protect the decision-making process. And more importantly, to protect the people affected by those decisions.
Outside the classroom, in my consultancy work, I ask related but sometimes sharper questions. Who owns this data? Who has access? What assumptions are baked into the automation process? When you ask your AI model to make a judgment, do you understand what it's doing, or are you trusting the surface because the answer came back quickly?
We have developed incredible tools. But we often use them without pausing to check whether they are appropriate for the question at hand. We fall in love with visualisation. We become preoccupied with the colour scheme. We tweak the axis so the trend looks smoother. We move the thresholds so the category shifts. These changes seem small. But in the end, they move us away from the truth. And once we’re a few steps removed, it becomes very hard to find our way back.
This is where I find the old motto helpful. Magna est Veritas. It cuts through the presentation layer. It reminds me that the role I play is not simply to deliver a solution. It is to make sure that what I leave behind is honest. That it can be relied on. That someone else can pick it up six months later and understand where the numbers came from and what they mean.
When I look at the Varlow coat of arms, I don’t imagine some grand history. I don’t think in terms of lineage or pride. I think about the task in front of me. To protect something that slips away too easily. I’m not a fighter. But I am someone who will ask the question nobody else wants to. I am someone who will suggest we pause before publishing that graph. I am someone who will say, “This isn’t right.”
That, I think, is the role of someone who works with data in 2025. Whether you are training, consulting, modelling, or managing, the job is the same. You are holding the line against noise. You are slowing things down just enough for the truth to be seen properly. This is doubly important now we are dealing with AI.
So whatever role you’re in, and however much or little you deal with data, try to carry that phrase with you. Magna est Veritas. Not because it sounds impressive. But because it reminds us that in 2025 honesty in analysis is still possible. And still necessary.
Magna est Veritas. Great is Truth.