
Why AI Needs Humility — Lessons from the Darkest Data

  • Writer: William Beresford
  • 3 days ago
  • 2 min read

AI is everywhere. From customer engagement to predictive analytics, the conversation is dominated by opportunity, speed and scale. But my recent conversation with Freddie Lichtenstein of Camera Forensics reminded me that there’s another word we need to bring into the debate: humility.


Camera Forensics works with some of the world’s most sensitive data, supporting law enforcement and NGOs in the fight against child sexual abuse and exploitation. Their challenge is unique: they need to train AI models to detect harmful content — without ever being able to look at that content themselves. The images are illegal to view, store or process in normal environments. And yet the models must still work.


It’s a stark reminder that AI is not just about what’s possible, but about what’s responsible. Safety, in Freddie’s words, can’t be a patch you stick on afterwards. It has to be embedded from the start.


Lessons from the Darkest Data

For business leaders, marketers and data professionals, there are three key lessons:


1. Humility matters. AI should be designed with an awareness of its limits. Blind confidence in outputs is dangerous — whether in forensic investigations or marketing personalisation.


2. Safety drives innovation. By treating safety as a core design principle, you end up with better, more usable tools. Just as Camera Forensics reduces investigators’ exposure to traumatic content, businesses can reduce risks to customers and employees through thoughtful design.


3. Responsibility is shared. From app developers to app stores, from data teams to CMOs, accountability runs through the chain. If AI harms someone, the excuse that “we just built the tool” doesn’t stand up.


Marketing may feel a long way from the frontline of child protection. But the underlying truth is the same: we are custodians of data, and we carry responsibility for how it is used. Humility in AI is not weakness. It is the foundation of trust.


At Beyond, we believe in putting data to work — but in the right way. As we embrace AI in our organisations, we must ask not just “what can this do?” but also “what should this do?” That’s the real test of leadership in the age of intelligence.

