How to lead with empathy in the age of AI

Healthcare leaders must navigate the tension between innovation and compassion in the age of AI, writes Emma du Parcq, head of consulting, research and thought leadership at Roffey Park Institute

Following the launch of the National Commission on the Regulation of AI in Healthcare and the roll-out of AI diagnostic tools across selected hospital trusts, the healthcare sector is rapidly embracing AI-enabled clinical transformation.

While these technologies promise enhanced efficiencies and outcomes, they also raise new ethical and operational questions. Healthcare leaders must navigate the tension between innovation and compassion, ensuring that technology enhances, rather than erodes, the human touch in care delivery.

From stroke diagnostics to radiology and workflow optimisation, AI-assisted tools are demonstrating measurable improvements in efficiency and accuracy. A 2025 Feedough report found that 80% of UK hospitals now use some form of AI, with radiology departments leading the way.

Despite these gains, the integration of AI into clinical practice is far from seamless. Outdated IT infrastructure, lengthy procurement cycles and a lack of interoperability between systems often hinder progress, which is further compounded by cultural resistance and ethical concerns.

A University of Manchester review highlights how prioritising one ethical principle, such as data privacy, can inadvertently compromise another like beneficence.

For example, strict limits on sharing patient data to protect privacy might prevent clinicians from accessing information that could improve diagnoses or treatment outcomes. Leaders must navigate these trade-offs carefully, ensuring that AI enhances rather than undermines the core values of healthcare.

Staff attitudes also reflect this tension. While a Health Foundation survey found that 76% of NHS staff support AI for patient care, 65% worry it will make them feel more distant from patients.

Empathy and ethics

To overcome this challenge, healthcare leadership must evolve by balancing AI adoption with empathy, ethics and patient-centred care. This requires more than technical know-how; it demands a shift in mindset.

Regulatory frameworks are beginning to catch up, with initiatives such as the National Commission on the Regulation of AI in Healthcare bringing together clinicians, regulators and technology firms to create clearer guidance on safety, accountability and governance.

Yet regulation alone cannot address the cultural and emotional dimensions of AI adoption. Leaders must bridge the gap between innovation and compassion, guiding teams through uncertainty, fostering trust in new technologies and ensuring that progress aligns with the values of compassionate care.

Disciplines like organisational development play a critical role in this transformation. By embedding these strategies into AI implementation, healthcare organisations can help staff adapt.

This includes investing in training that builds digital literacy, creating safe spaces for dialogue and reflection, and involving clinicians in the co-design of AI tools.

These efforts not only ease transitions but also empower staff to take ownership of change. Recent insight from NHS England reinforces this, noting that staff engagement is a key predictor of successful AI implementation.

Leadership must also extend empathy to patients.

As AI becomes more visible in care pathways, patients need reassurance that their data is safe, their dignity respected and their care still personal. Clear communication, active listening and transparency are vital. Leaders must be prepared to explain how AI works, what it does and, crucially, what it doesn’t do.

Leaders who navigate this shift most effectively are those who intentionally build capability, not just knowledge.

Practical development tactics such as action-learning sets, reflective practice and scenario-based simulations give healthcare leaders space to explore real ethical dilemmas, practise difficult conversations and examine the impact of their decisions on patients and teams.

These approaches strengthen confidence, emotional intelligence and psychological safety, all of which are core conditions for leading responsibly with AI.

By investing in these kinds of development experiences, leaders can cultivate the humane, values-led leadership that ensures AI enhances care rather than distances it.

The future of compassionate technology

If healthcare leaders can strike this balance, the benefits could be transformative. AI has already demonstrated its potential to reduce diagnostic errors by 23% and cut interpretation time by 35%. These gains can translate into faster, more accurate care, reduced clinician burnout and better patient outcomes.

But the true value of AI lies not just in efficiency; it lies in its ability to support more compassionate, personalised care.

Leadership that embraces both innovation and empathy will ensure that AI enhances the human experience of care. Organisational development will be essential in helping staff navigate change, build confidence and maintain human connection at the heart of healthcare.
