Not yet, at least.
And still, we seem to believe that more code, more data, more models will somehow deliver it.
That by automating complexity, we can sidestep the difficult, emotional work of building confidence in systems, in institutions, in each other. It’s too early to know what the impact of the blackout in Spain and Portugal will be, but it will force us to confront that question too.
Earlier this year, the European Commission unveiled its AI Continent Action Plan: a strategic, and frankly impressive, roadmap to position the EU at the forefront of artificial intelligence. It speaks of gigafactories, AI skills academies, new governance tools, and hundreds of billions in investment. It captures the imagination. And it is necessary: Europe must not lag behind in shaping the rules of this powerful, fast-moving field.
Impressive. Necessary. But also… incomplete.
Because trust doesn’t come from faster processors or bigger data lakes. It comes from the decisions we make when no one (or maybe, everyone) is watching. From how we treat the people on the margins of the system. From those we leave in the dark, in the cold, or in the scorching heat, because it happens here, too.
That’s what I talked about on the Decisions Now podcast, now featured on Energ’Ethic as partner content.
Yes, we spoke about AI and energy. But we also talked about the invisible frictions between people and processes. In the power dynamics behind smart meters. In the rise of data-driven decision-making. In the moment when the system doesn’t work quite as planned. In the way we handle ambiguity. In who gets to decide what “acceptable risk” looks like. And in the everyday absurdity of wondering whether your toaster knows more about your habits than your energy provider does.
These are not abstract concerns. They are daily realities for consumers, especially those navigating essential services like energy: woven into our homes, embedded in our fridges, reacting to our habits, predicting our usage. So, we talked about the risks of sleepwalking into systems we no longer understand, and the very real gap between digital possibility and public trust.
For all the investment in infrastructure, the regulatory precision, the policy foresight: trust does not live in the blueprint.
Fairness doesn’t scale unless it’s designed to
When I supported the European Economic and Social Committee (EESC) in preparing its opinion on energy digitalisation, we focused not only on technological ambition but on the social contract that surrounds it.
Because it is easy, too easy, to assume that digitalisation is neutral. That AI will naturally serve the public good if left to innovate freely. That smart meters and connected devices will empower consumers rather than overwhelm them.
But none of that is guaranteed.
And in a sector like energy, where consumer trust is already fragile, we cannot afford to assume that automation equals progress.
The transition must be human-centred, not just user-optimised.
Households need to be able to understand, question, and reject what is offered to them. And this is not just about literacy or design: it is about agency.
A transition built on silence is not a transition: it’s a bypass
Energy poverty, the digital divide, opaque pricing, aggressive default contracts… none of these are solved by algorithms. And many of them are made worse by the assumption that innovation is a proxy for fairness.
We need better tools, yes. But we also need to ask better questions.
Why are we digitising this process?
Who benefits from the optimisation?
Who carries the cost of error?
Because while the AI Continent Action Plan includes important measures on transparency, accountability, and inclusion, it does so in a context where platforms increasingly mediate public services, and where “consumer empowerment” often just means learning how to navigate a very sophisticated menu of options you never asked for.
But if opting out becomes impossible, trust becomes irrelevant.
So what do we do with this?
We start where we always should: with the people on the margins of the system.
Those who can’t afford the latest devices, who aren’t fluent in digital interfaces, who’ve had reasons not to trust energy providers in the past.
We design with humility.
We communicate with empathy.
And we recognise that trust isn’t something you can automate or retrofit. It is built slowly, often painfully, through openness, clarity, and the willingness to be wrong and to be held accountable.
🎧 You’ll find the Decisions Now episode on the Energ’Ethic stream.
It’s not a technical explainer. It’s a conversation about values—about what gets lost when we chase scale, and what we must hold onto if we want this transition to mean something beyond efficiency.
And if you listen closely, you might just hear the toaster.