#002: The Trust Deficit
A new global study from the University of Melbourne and KPMG surveyed 48,000 people across 47 countries. The results?
“Only 46% of people say they trust AI systems.
That’s despite 66% using AI regularly, and 83% seeing its benefits.”
The disconnect isn’t about usefulness—it’s about risk:
Cybersecurity
Privacy
Bias
Lack of transparency
And above all: lack of accountability
Even more telling:
70% of respondents want stronger AI regulation.
Trust levels are significantly higher in emerging economies (three in five trust AI) than in advanced ones (just two in five). That gap reflects differences in expectations, context, and who is seen as responsible.
We see this trust deficit mirrored in boardrooms and meetings every week:
Organisations aren’t trying to build harmful systems.
But they’re often moving too fast, and trust becomes nobody’s job.
That’s why we’ve launched TRUST-AI.CO—a new home for practical, universal tools to help AI earn trust:
The TRUST-AI open standard
The AAA+ badge for trustworthy design and delivery
Clear pathways for teams to act ethically, not just aspire to it
Rooted in Third Way Consulting’s focus on clarity, human-centred design, and integrity, these tools help bridge the gap between good intentions and real-world confidence.
Trust isn’t a layer you add after deployment—it’s the foundation you build on.
So here’s the question:
What does trustworthy AI look like in your context?
If you're grappling with that, start here: trust-ai.co
Let’s make trust someone’s job again.
—B.