AI, Identity & Empathy
Human-Centered Intelligence for a Fragmented Age
Essays that examine how AI interprets identity, where it fails, and what it takes to build systems that understand people with accuracy, dignity, and emotional truth.
There is a special kind of nausea that hits when a founder calls people “the biggest constraint” and their new AI “the employee.” Podium’s Jerry 2.0 is sold as 24/7 help for local businesses, but the real pitch is simple: more revenue with fewer humans. This piece pulls apart why that story feels so slick and so scummy, and what it reveals about how tech leaders really see the rest of us.
Retail AI does not begin at the edge of the app. It begins inside the organization, in the way leaders treat employees who raise concerns about harm.
The Human Harm Layer is the mechanism that carries those internal habits into external decisions about customers. When a worker is told to “go into listen mode” after naming how a pricing model will make life more expensive for poor Black neighborhoods, that is not just a bad meeting. It is Emotional Metadata the system chooses to ignore.
The same logic shows up later in delivery platforms and dynamic pricing schemes that quietly charge the highest tax on convenience to the people with the least slack. This essay maps how that harm travels, how it becomes code, and what it will take for retailers to stop exporting workplace violence into customer experience.
When the noise finally stopped, the pattern came into focus. The employee had not failed to “handle stress.” They had been shaped by a system that rewarded harm, normalized confusion, and relied on their silence. Part Five dissects the archetypes that uphold that system and reveals the psychology beneath the collapse.
This installment examines what happens when a workplace treats a person as toggleable in its systems. The body does not forget the instability. It records it. Part Four traces how hypervigilance, panic, and exhaustion emerge not from weakness, but from prolonged exposure to organizational harm.
The most devastating harm in a workplace rarely arrives through shouting. It arrives through silence. Part Two examines the manager who held authority without responsibility, and how their refusal to claim the employee became its own form of injury — one that eroded confidence, blurred accountability, and revealed the deeper architecture of an unstable system.
When a manager refuses to claim responsibility, the harm doesn’t come through conflict. It comes through absence. Part Two exposes how avoidance, scattered feedback, and hidden workload ethics quietly dismantle a person’s stability.
Part One of a six-essay series exploring how unstable workplace structures and vague leadership expectations create slow, cumulative harm. This narrative is a conceptual examination, not a depiction of real events, showing how systems can quietly erode a person’s stability long before collapse becomes visible.
Instacart calls it experimentation. Consumer Reports and Groundwork Collaborative call it what it is: an AI pricing system that quietly tests how far it can push your grocery bill. This essay connects their findings to food deserts, price elasticity, and racialized access, and lays out the guardrails any serious retailer or regulator should demand before turning AI loose on the cost of eating.
At the Canadian border, an officer asked what I thought about AI. When I told him its darkest risk is erasure, he brought up something most people avoid. New hires sometimes rely on instinct more than training, and their decisions shape who gets searched or stopped. If AI learns from those patterns, the bias becomes logic. This piece explores how border automation can harden inequities that already exist, and why systems must be designed to truly see the people they serve.
Answer synthesis is positioned as the future of search, but most AI systems cannot see underrepresented communities clearly enough to summarize them. When a model replaces exploration with a single synthesized answer, nuance disappears and minority perspectives fall out of view. This piece examines how visibility collapses inside synthesis systems and what must change before this shift becomes the default.
Design Sprint Academy delivered one of the clearest AI frameworks I’ve seen. Their focus on structured decision-making, guided facilitation, and rapid learning is powerful. Through a retail lens, the work becomes even more interesting. Retail has two customers, deep emotional dynamics, and identity-shaped behaviors that must be understood from the first conversation. The result is a richer, more human way of framing AI value. Here is what I learned and how the framework expands inside a sector that never stops moving.
There are moments when your voice becomes impossible to ignore. This is the story of how pressure clarified my purpose, sharpened my leadership, and revealed the future I am built to lead in.
When Amazon bought Whole Foods, everyone predicted Instacart’s collapse. Instead, the grocery industry spent years chasing the wrong infrastructure—and is now retreating from automation, robotics, and self-distribution. Instacart’s flexible, store-proximate model didn’t just survive; it became the design pattern grocers are returning to.
AI doesn’t fail queer and BIPOC people because of “bias.” It fails because it cannot interpret our identities with accuracy, context, or cultural truth. This essay reveals the mechanisms behind that harm and introduces a new architecture to prevent identity collapse in the age of autonomous AI.
Amazon doesn’t have a convenience problem.
It doesn’t have a speed problem.
It doesn’t have an assortment problem.
Amazon has a discernment problem.
People don’t remember flawless. They remember feeling seen.
They remember the human who made the stress smaller, the journey lighter, the moment calmer. They remember the person who didn’t just process them, but acknowledged them.
Travel, at its core, is a human ritual. It is migration, reunion, escape, reinvention, obligation, hope. And any airline that forgets that truth — even unintentionally — begins to lose the one thing technology can’t replicate: emotional resonance.
This essay is a lead-in to my six-part series, A System Built On Silence. It follows a composite employee inside a composite organization and asks a simple question with uncomfortable answers: what does this system do to a human nervous system over time, especially when that human is not at the center of power? There is no redemption arc here, only a clear line between identity, harm, and why my future work on The Knox AI Empathy System™ will not pretend to be neutral.
This one was hard to write. Not because I’m mad. Because I still want the blanket.
For years, Target was the rare brand that made belonging feel casual. You’d walk in for toothpaste and walk out with a little affirmation tucked into your bag. Pride wasn’t seasonal — it was spatial. It lived in the aisles, woven into the colors, the tone, the quiet promise that this place sees you.
And then, 2023 happened. The backlash. The retreat. The silence.
When Target pulled its Pride collection, it didn’t just move product — it moved meaning. It told queer people what we’ve always half-suspected: our safety here was situational.
This isn’t about outrage. It’s about empathy debt — the emotional residue that builds when remorse never matures into reflection. You can’t pay it off with apologies. You pay it off with change.
For years, Nintendo has been the moral compass of interactive entertainment.
A company that taught the world that joy could be engineered, and that simplicity could feel transcendent.
They’ve spent decades proving that technology doesn’t have to exploit emotion to move people. It can honor it.
That’s why their recent stance on AI — a public commitment not to use it in game design — is so fascinating. Because in one sense, they’re right.
And in another, they’re missing their greatest opportunity yet.
I wrote AI, Craft & the Human Future of Retail because I’ve come to believe the real disruption of AI isn’t technical — it’s human. After years leading transformation work with Tredence and SymphonyAI, I watched automation accelerate faster than reflection, and intelligence scale faster than intention.
This paper is my attempt to restore that balance. It introduces The Knox AI Empathy System — a model and a mindset for designing organizations that think with care. It’s not about faster machines; it’s about wiser leaders.
Because in an age where intelligence is abundant, discernment has become the rarest skill of all.
A few weeks ago, OpenAI announced that users could now buy directly inside ChatGPT. No checkout page, no app handoff, no cart abandonment halfway through a funnel. Just a conversation, a suggestion, and a sale.
At first, it sounded like novelty — a new way to shop without leaving the chat. But beneath the surface, it marks something far deeper: the beginning of agentic commerce.
As I prepare to leave SymphonyAI and step into a new chapter, I’ve been reflecting on these quiet moments and the lessons they hold. This past week, I’ve been deeply moved by the kind words of colleagues and clients who’ve shared how my work impacted their own. These conversations reminded me of something we often forget: we are far more exceptional than we give ourselves credit for.
The best leaders aren’t the ones with the fanciest titles or the slickest presentations. They’re the ones who show up for their people—consistently, authentically, and with purpose. Next time you notice something isn’t quite right on your team, skip the analysis and just talk to the person. Be curious. Be honest. Be human. Great leadership isn’t about strategies or optics. It’s about showing up for the people who make it all happen.
AI has become a household name in the world of creativity. It’s helping artists design stunning visuals, musicians compose new melodies, and writers craft stories. But let’s be honest—many of us still feel a little uneasy about it. Can a machine really understand creativity? And even if it can help, what does that mean for the uniquely human magic we bring to the process?
The AIM Loyalty Framework (AI-Integrated Maturity) draws on the strengths of industry-leading models while addressing the unique needs of loyalty programs in retail, hospitality, finance, and beyond. It’s a tool designed to help businesses move beyond generic AI strategies and build programs that foster meaningful customer relationships, deliver measurable ROI, and set new standards for innovation.
Imagine planning a dream vacation where every recommendation feels custom-made for you, yet you never have to give up your personal data. Sounds too good to be true, doesn’t it? It’s not—thanks to the rise of Customer-Centric AI, a new approach to artificial intelligence that promises to deliver exceptional, tailored experiences while respecting privacy.
AI is reshaping CSR initiatives in ways that benefit both business outcomes and society at large. This blog post explores why retailers must embrace AI-driven social responsibility and dives into five innovative AI applications that can drive impactful change.
As the grocery industry faces increasing pressure to address sustainability concerns, Artificial Intelligence (AI) offers transformative potential. By leveraging AI to optimize operations, reduce spoilage, and make data-driven decisions, grocery retailers can substantially reduce food waste while improving profitability.
While traditional Point-of-Sale (POS) data has been useful for tracking transactions, customer data holds far greater potential for driving business growth and building long-term loyalty. The retail giants—Kroger, Amazon, and Walmart—have demonstrated the transformative power of leveraging customer data, not just to enhance the shopping experience but also to drive new revenue streams.
Retail is no longer just about selling products—it's about creating experiences, building connections, and staying ahead in a constantly evolving digital landscape. To thrive in this environment, retailers need more than just good products—they need a strategic framework to guide them through the complexities of digital transformation.