The scene was a packed conference hall in Helsinki. Hundreds of nursing leaders and thinkers from across the globe had gathered. On stage sat an impressive line-up - Kathleen McGrow, Chief Nurse at Microsoft, and Amy McDonough, Managing Director of Strategic Health Solutions at Google Health. A moderator kept things moving, but lurking in the background was a rather large elephant.
This blog isn't a critique of their presentation - far from it. I've left enough stages replaying the points I missed and the questions I should have answered better to sympathise with any speaker. Instead, this is a plea: let's reframe the debate around artificial intelligence (AI) in health care.
We should be curious and excited by the possibilities of AI - but let's not fix our gaze only on the "bright and shiny" features. We must also focus on the less glamorous, but arguably more important, considerations from the start.
AI has the power to transform health care. That much is undeniable. Our health systems generate mountains of data - by some estimates, around 36% of all global data[1] - making our sector a natural focus for AI innovation.
For some professions, this is an existential shift. The working life of a radiologist, for instance, is changing beyond recognition.
On the surface, nursing might seem more secure. It's hard to imagine ChatGPT performing a complex dressing change or offering the nuanced reassurance a patient needs in the middle of the night. But perhaps that's just a lack of imagination on my part.
Amy and Kathleen emphasised the many positive potentials of AI: streamlining workflows, enhancing diagnostics, and freeing nurses for deeper patient engagement. They spoke about nurses not just as caregivers, but as innovators and advocates who can help shape health care systems with AI as a key tool. None of this is wrong - in fact, it's inspiring.
But then came the audience feedback. The largest word in the closing "word cloud" was not "innovation" or "hope", but "Orwell". Probably not the outcome Microsoft and Google were aiming for.
And that's when the elephant in the room became harder to ignore.
Not a real elephant, I should add - although I did once see one paraded around the football pitch of my local club (Wimbledon) in the 1990s. (That really is worth five minutes of your time on Google!)
No, this was the elephant of equity.
The AI we use today is being built on massive harvesting of human-generated data - surveys, research and information available online. But whose data?
Ironically, the home continents of actual elephants - Africa and Asia - are among the most underrepresented in the datasets AI depends on. These "data deserts" mean vast swathes of the world are absent from AI's design and testing.
This has consequences. The AI that will roll across our health care systems will inevitably be skewed to meet the needs of the wealthiest nations.
A vast proportion of the world - its poorest and most marginalised people - is not being included in the data used to build the AI we increasingly rely on. That should alarm us far more than it appears to.
As a world, we have travelled down a similar road before, and we still suffer its consequences in many areas of daily life. In her seminal book Invisible Women, Caroline Criado-Perez makes the case that technology often delivers less than it could because women have been systematically excluded from the design process.
The same 'elephantine' danger looms here - large, impossible to ignore, and potentially harmful. Without urgent action, many nations will be forced to operate in an AI-shaped world that did not include them in its creation and fails to meet their needs.
If the last five years of global health have taught us anything, it's that exclusion on this scale is never harmless. It ripples out, affecting the well-being of all.
Perhaps the most crucial point is this: we are the only generation of nurses who will witness the birth of AI in health care.
If we do not demand that its development includes all of humanity - our richest and our poorest, every nation and every place - future generations will rightly judge us harshly and, like elephants, they will not forget.
[1] Thomason, Jane. "Data, digital worlds, and the avatarization of health care." Global Health Journal 8.1 (2024): 1-3.