Beyond the Bot, But Not Yet Beyond the Human

Reflections on McKinsey’s conversation on empathetic AI, and the deeper work still ahead

In a recent McKinsey Operations Practice discussion titled "Beyond the Bot: Building empathetic customer experiences with agentic AI," Eric Buesing, global head of customer service operations at McKinsey, and Gadi Shamia, CEO and cofounder of Replicant, explore how agentic AI is beginning to reshape customer experience.

The conversation reflects a moment many organizations are feeling right now. AI is moving out of experimentation and into operational reality. Automation is becoming more adaptive. Conversations with machines feel more fluid. And leaders are beginning to treat AI not as a tool but as infrastructure.

There is a quiet optimism in the discussion. A sense that something meaningful is changing.

Much of what they describe resonates. Yet reading closely reveals a boundary. The discussion approaches empathy as a defining quality of the next generation of AI, but stops just short of redefining what empathy itself might mean when intelligence is no longer exclusively human.

That distinction is subtle. It is also important.

What McKinsey’s Perspective Gets Right

The strongest parts of the conversation focus on the operational implications of agentic AI.

Buesing describes agentic systems as collections of smaller specialized agents working together toward goals. Shamia emphasizes that organizations must rethink processes rather than automate existing workflows. Both highlight that many AI initiatives stall because companies underestimate the organizational transformation required to scale.

Several themes emerge clearly:

  • AI adoption is becoming a CEO-level priority rather than a technical experiment.

  • Customer experience and operational design are inseparable.

  • Modular AI systems allow organizations to build adaptive workflows instead of rigid scripts.

  • Measurement, change management, and system integration determine whether AI succeeds.

These observations feel grounded in reality.

There is also a thoughtful acknowledgment that conversational experience matters deeply. Tone, responsiveness, contextual memory, and conversational flow influence whether users engage with AI or reject it. The conversation design layer is treated as central rather than cosmetic.

Taken together, McKinsey’s perspective presents a clear operational direction: organizations that redesign around intelligent systems will move beyond automation into genuinely adaptive service models.

This is valuable insight.

And it leads us to the hinge point.

How Empathy Is Framed in the Conversation

Throughout the discussion, empathy appears as a guiding aspiration for AI-driven experiences.

Empathy is described through behaviors such as:

  • remembering prior interactions

  • adapting tone and conversational style

  • responding consistently without frustration

  • creating interactions that feel attentive and natural

These qualities matter. For decades, automated systems struggled because they felt mechanical or indifferent. Designing AI that responds more fluidly represents meaningful progress.

But notice how empathy is defined.

It is framed primarily as an outcome of interaction design.

Empathy becomes something expressed through conversational behavior.

This approach treats empathy as a property of user experience.

And that framing deserves closer examination.

When Empathy Becomes UX

Designing systems that feel empathetic is not the same as designing systems that understand human experience.

The distinction is subtle because both approaches can produce similar surface outcomes. Conversations may feel smoother. Responses may appear thoughtful.

Yet the underlying models differ.

One approach defines empathy as performance:

  • adjust tone

  • remember context

  • respond appropriately

Another approach understands empathy as interpretation:

  • recognizing emotional context beyond language patterns

  • understanding ambiguity and vulnerability

  • preserving dignity even when optimizing efficiency

McKinsey’s conversation sits primarily within the first interpretation. Empathy becomes a design objective that enhances adoption and satisfaction.

This is not a criticism. It reflects the current stage of organizational thinking.

But it also reveals what remains unexplored.

What the Conversation Leaves Unsaid

As insightful as the operational perspective is, several deeper dimensions of empathy are not addressed.

Emotional context versus memory

Remembering previous interactions improves continuity. It does not necessarily mean the system understands the emotional meaning behind those interactions.

Efficiency versus care

Automation often emphasizes speed and resolution. Yet many human interactions require moments of presence rather than acceleration. Empathy cannot be reduced to throughput.

Humans as more than tasks

Customer interactions are rarely purely transactional. People bring identity, uncertainty, and emotional context into conversations. Designing solely for problem resolution risks flattening the human experience.

Legibility and transparency

Empathetic systems should help users understand what the intelligence is doing, how it operates, and where its limits lie. Without this clarity, empathy risks feeling simulated rather than genuine.

These gaps do not undermine McKinsey’s operational perspective. They point toward the next layer of thinking.

The Deeper Opportunity Emerging Beneath the Conversation

What McKinsey describes is an operational transformation framed through the language of experience design. That framing makes sense. Operational change is measurable and actionable.

Yet something more profound is beginning to surface.

AI is no longer only a tool executing tasks. It is becoming a participant in relational space.

When intelligence becomes agentic, empathy cannot remain a surface feature of conversational design. It becomes a structural property that shapes how the system interprets human reality.

Empathy shifts from tone to context. From optimization to integrity. From transaction to relationship.

This shift is still emerging. The conversation gestures toward it without fully naming it.

The Question Just Beyond the Horizon

The current discussion asks how AI can feel empathetic.

The next question asks what empathy means when intelligence itself is changing.

Instead of focusing on imitation, the inquiry expands toward deeper questions:

  • How do intelligent systems preserve dignity in interaction?

  • How can AI recognize emotional complexity without reducing it to patterns?

  • What responsibility does technology hold for the emotional consequences it creates?

These are design questions. Engineering questions. Ethical questions.

They represent the next stage of the work.

A Closing Reflection

McKinsey’s conversation captures an important moment. Organizations are moving beyond experimentation toward operational transformation. Agentic AI will reshape how service and experience are delivered.

But the most significant change may not be operational.

It may be conceptual.

We are moving from designing interfaces to designing relationships between forms of intelligence.

And relationships require more than responsiveness.

They require understanding.
